My father has a hologram of an antique microscope hanging on his wall. Not a little hand microscope, but a full-size desk model. A specimen had been mounted when the hologram was taken, so you can walk up to the eyepiece and see it. The effect is quite striking.
I had an idea for producing a hologram of the sort described in the OP that worked along completely different lines to most other possibilities I’ve come across. It had plenty of limitations, but some of these posts have given me a new idea.
The original idea is based on a technology that has been demonstrated in a simple form and is being developed - using sound waves to confine particles to regions of space. Researchers are working with ultrasound and other tricks to increase the resolution. I was thinking that if you simply did this with coloured dust, you'd have a hologram, albeit one that's thin and very fragile.
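As a rough sense of the resolution on offer (my own back-of-envelope sketch, not taken from any real levitator's specs): in a standing ultrasound wave, particles collect at pressure nodes spaced half a wavelength apart, so the node spacing sets the 'voxel' pitch of a dust display.

```python
# Back-of-envelope: trap points in a standing sound wave sit at
# pressure nodes, half a wavelength apart.
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 C

def node_spacing_mm(freq_hz):
    """Half-wavelength spacing between trap points, in millimetres."""
    wavelength = SPEED_OF_SOUND / freq_hz
    return (wavelength / 2) * 1000

for f in (40e3, 100e3, 1e6):
    print(f"{f / 1e3:.0f} kHz -> nodes every {node_spacing_mm(f):.2f} mm")
```

At the 40 kHz typical of hobbyist ultrasound transducers that works out to a few millimetres between trap points, which is why pushing the frequency up matters for resolution.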
However, Mangetout gave me an idea. Why not just use sound waves to confine material to produce the optics needed to make a projection appear to float in space and do the rest with normal light?
Are there any materials with a high enough refractive index to make a thin cloud into a useful lens? And does anyone know enough about sonic confinement methods to tell if the required shape for a lens could be produced?
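To put a number on the refractive-index worry, here's a hedged sketch using the lensmaker's equation for a symmetric thin biconvex lens, f = R / (2(n - 1)). The cloud index of 1.0001 below is an illustrative guess for a dilute suspension, not a measured value:

```python
# Thin symmetric biconvex lens: 1/f = (n - 1) * (1/R1 - 1/R2), with R2 = -R1,
# giving f = R / (2 * (n - 1)).

def focal_length(n, R):
    """Focal length (m) of a symmetric thin biconvex lens of index n,
    surface radius R (m)."""
    return R / (2 * (n - 1))

R = 0.1  # 10 cm surface radius
print(focal_length(1.5, R))     # solid glass: 0.1 m
print(focal_length(1.0001, R))  # dilute cloud (guessed index): 500 m
```

If the effective index of the cloud is only a hair above 1, the focal length blows up by the same factor, so the material would need to be either dense or extraordinarily high-index to make a cloud lens useful.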
Nonsense; everyone knows they are a sort of curled wholemeal roll with kibbled oats on the top; a cross between a hoagie and a bagel.
Any problem with the idea I put in the OP, for a low-resolution 'hologram'?
Two walls, lots of little guns in a grid facing each other.
Couldn't it make an image if it shot little bits of things that would react when they hit each other? (Floor and ceiling make more sense than walls, so you can walk around it.)
Like have tiny bits of sodium (grain-of-sand sized) pop up from the floor with drops of water falling from the ceiling in a calculated way, so they strike in the air at a point where the object has a lit pixel, and a little flash of light happens where they meet. Enough in a grid, fired in rows fast enough and timed right, should create a 3D 'black and white' image (change the chemicals as needed… two things that react to make light very quickly and brightly and burn away).
It would not be something you could put your hand through, but is there any problem with that?
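The timing part of this is at least calculable. A hedged sketch, ignoring air drag (the heights and launch speed below are made-up example numbers): a drop free-falls from the ceiling while a grain is launched straight up, and you stagger the two triggers so both reach the flash height at the same moment.

```python
import math

G = 9.81  # m/s^2

def drop_fall_time(H, h):
    """Time for a drop released at ceiling height H to free-fall to height h."""
    return math.sqrt(2 * (H - h) / G)

def grain_rise_time(v0, h):
    """Earliest time for a grain launched upward at speed v0 to reach height h."""
    disc = v0**2 - 2 * G * h
    if disc < 0:
        raise ValueError("launch speed too low to reach h")
    return (v0 - math.sqrt(disc)) / G

# Example: flash at h = 1.2 m under a 3 m ceiling, grain launched at 6 m/s.
H, h, v0 = 3.0, 1.2, 6.0
t_drop, t_grain = drop_fall_time(H, h), grain_rise_time(v0, h)
print(t_drop, t_grain)  # release the drop (t_drop - t_grain) s before the grain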
I’m most dubious about the nature of the holodeck, personally. Its ‘base state’ is something like a fairly small cube, inside which several people may be standing when the program begins. Fair enough. The problem is when the ‘space’ opens up and the various participants wander off in different directions – the holodeck must be keeping track of what each person is seeing, somehow project only the correct elements into the field of vision of another, decide which objects are tangible to whom, and have some incredibly complex algorithms to make sure that no-one halfway across a virtual ‘world’ from someone else will accidentally walk into them, or it’s actually manipulating the dimensions beyond normal 3D space. If the latter were true, then Star Trek tech would be a lot more advanced than it generally lets on, and the pokey crewman-quarters would be unnecessarily stingy. Tch.
VR-wise, it’s much simpler to stick a plug into someone’s head and, while they remain stationary, cause them to perceive elements from the program. And more energy-efficient, too. Gives new terror to the BSOD ‘phenomenon’, though…
One problem would be that there could only really be one ‘pixel’ active in each vertical column at any moment in time, but this could be overcome by rasterising.
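The rasterising fix can be sketched concretely (my own scheduling, nothing canonical): group voxels by their (x, y) column, then spread the voxels that share a column across successive frames, relying on persistence of vision to blend them.

```python
from collections import defaultdict

def rasterise(voxels):
    """voxels: iterable of (x, y, z) lit points. Returns a list of frames,
    where no two voxels in the same frame share an (x, y) column."""
    columns = defaultdict(list)
    for x, y, z in voxels:
        columns[(x, y)].append((x, y, z))
    frames, i = [], 0
    while True:
        frame = [col[i] for col in columns.values() if i < len(col)]
        if not frame:
            break
        frames.append(frame)
        i += 1
    return frames

frames = rasterise([(0, 0, 1), (0, 0, 2), (1, 0, 5)])
print(frames)  # column (0, 0) is split over two frames; (1, 0) fires once
```

The cost is brightness and frame rate: a column with k lit voxels only fires each of them once every k frames.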
Some particles interact with the medium through which they travel, and they can (most probably) be made to interact destructively with matter (not sure if air is dense enough) at a certain point by closely controlling their speed; faster particles get further before they crash into a nucleus (or something like that).
Why worry about all this "projecting into the air" nonsense? Just have little RGB laser guns on the holodeck walls that shoot through your pupil and scan directly onto your retina. There were VR glasses in Stephenson's Snow Crash that worked like this, and removing the glasses just requires more computer power to track your eye movements.
Of course, get the power requirements wrong and there’s a good chance of frying your optics.
Two problems: from a distance of more than a few centimetres, the only part of the retina that would be targetable by the laser is the part that, when stimulated, makes us see a bright dot at exactly the point of emission; so you can't (from a distance) project a dot on the retina that makes the viewer see a bright dot somewhere 'wrong'.
OK, so what? We're going to cover all the walls with these lasers, so the dots can originate in the right place… but then there is the somewhat significant problem of obstacles. The lasers all project their beams in such a way as to make me think that there is a person standing on a pedestal in the centre of the room, but when someone walks behind the pedestal, they block the light from the lasers and part of the image disappears. I can tell you from experience that this is quite disturbing visually; seeing a near object obscured by a far one messes with your head.