Using femtosecond lasers (a femtosecond is a quadrillionth of a second, and the lasers transmit bursts that last 30 to 270 femtoseconds), the team can make holograms that are safe to touch. The images are three-dimensional, rendered at rates up to 200,000 dots per second. The voxels are light emitted by plasma that's created when the laser's focused energy ionizes the air.
Abstract: We present a method of rendering aerial and volumetric graphics using femtosecond lasers. A high-intensity laser excites a physical matter to emit light at an arbitrary 3D position. Popular applications can then be explored especially since plasma induced by a femtosecond laser is safer than that generated by a nanosecond laser. There are two methods of rendering graphics with a femtosecond laser in air: producing holograms using spatial light modulation technology, and scanning of a laser beam by a galvano mirror. The holograms and workspace of the system proposed here occupy a volume of up to 1 cm^3; however, this size is scalable depending on the optical devices and their setup. This paper provides details of the principles, system setup, and experimental evaluation, and discussions on scalability, design space, and applications of this system. We tested two laser sources: an adjustable (30-100 fs) laser which projects up to 1,000 pulses per second at energy up to 7 mJ per pulse, and a 269-fs laser which projects up to 200,000 pulses per second at an energy up to 50 uJ per pulse. We confirmed that the spatiotemporal resolution of volumetric displays, implemented with these laser sources, is 4,000 and 200,000 dots per second. Although we focus on laser-induced plasma in air, the discussion presented here is also applicable to other rendering principles such as fluorescence and microbubble in solid/liquid materials.
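A quick back-of-envelope check of what those rates buy you as a display: the abstract gives spatiotemporal resolutions of 4,000 and 200,000 dots per second, so the voxel budget per displayed frame is that rate divided by the refresh rate. This minimal sketch assumes a 30 Hz refresh target (my assumption, not from the paper):

```python
def voxels_per_frame(dots_per_second: float, refresh_hz: float = 30.0) -> int:
    """Voxel budget per displayed frame: spatiotemporal rate / refresh rate.
    The 30 Hz default is an illustrative assumption, not from the paper."""
    return int(dots_per_second // refresh_hz)

# Rates from the paper's abstract:
print(voxels_per_frame(4_000))    # slower 1 kHz source: 133 voxels per frame
print(voxels_per_frame(200_000))  # 269-fs source: 6666 voxels per frame
```

So even the faster source gives only a few thousand plasma dots per frame, which matches the sparse, glowing point-cloud look of the published demos.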
According to the linked article above, it's a point of light made by ionizing the air at that spot:
The system can generate different optical effects, colors and images by using different kinds of materials for the projection particles. “They could be just about anything,” Smalley said. “Glass beads, diamonds, cellulose, tungsten — a wide variety of materials. What we've found most effective is a substance called black liquor, which is a byproduct of the paper manufacturing process. It's essentially just paper, cellulose."
Quote from: sanman on 08/19/2018 06:41 am
According to the linked article above, it's a point of light made by ionizing the air at that spot:

Your first link says the light is projected onto paper dust (cellulose) held in an optical trap, not ionised air emitting light. From seeker.com:

Quote
The system can generate different optical effects, colors and images by using different kinds of materials for the projection particles. “They could be just about anything,” Smalley said. “Glass beads, diamonds, cellulose, tungsten — a wide variety of materials. What we've found most effective is a substance called black liquor, which is a byproduct of the paper manufacturing process. It's essentially just paper, cellulose."
In the automotive industry, there's a trend moving away from physical controls (single-function knobs and buttons) and towards touch screens. While this enables more flexibility, it has a big drawback: it makes it more difficult to operate the controls blindly.

Controls in a car fall into two basic groups:
1. controls that have to be usable while driving (lights, wipers, AC, audio)
2. controls that are acceptable if they are only operable while stationary (setting user preferences, satnav, etc.)

For spacecraft something similar holds. For operating a robotic arm, you need dedicated physical controls because you can't afford to hunt for them. The same goes for maneuvering the spacecraft, emergency operations, etc.
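The two-group split above is essentially a lockout policy, and a minimal sketch shows how simply it can be expressed. All names here (`Control`, `may_operate`, the example controls) are hypothetical illustrations, not from any real vehicle system:

```python
from dataclasses import dataclass

@dataclass
class Control:
    name: str
    safety_critical: bool  # group 1: must remain usable while moving

def may_operate(control: Control, vehicle_moving: bool) -> bool:
    """Group-1 controls are always available; group-2 controls
    (preferences, satnav destination entry) lock out while moving."""
    return control.safety_critical or not vehicle_moving

wipers = Control("wipers", safety_critical=True)
satnav_setup = Control("satnav destination entry", safety_critical=False)

print(may_operate(wipers, vehicle_moving=True))        # True
print(may_operate(satnav_setup, vehicle_moving=True))  # False
```

The hard design question, of course, is not the lockout logic but deciding which bucket each control belongs in.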
Thomas Reardon, CEO and co-founder of CTRL-labs, on stage for the keynote at the O'Reilly Artificial Intelligence conference in NYC, shares a bold vision for the future of human-computer interaction and explains how the company uses non-invasive neural interfaces to unlock human potential.
This could enable cabin space to be freed up, by not having a large physical dashboard panel in front of the crew/occupants.

The display/control interface could be flexibly and dynamically reconfigurable to different dimensions, based on the particular need/application.

Because of its virtually projected nature, the display/control interface hardware could be made more miniaturized, compact, and rugged against the shock/vibration/stresses associated with liftoff and re-entry, or even depressurization, as opposed to a large physical display.

There could be more backup miniaturized hardware available for redundancy, in case the main interface broke down, as opposed to carrying extra physical monitor screens for redundancy.

Virtual projected screens can't be damaged the way physical monitor screens can by bumping into them, etc., and neither can they bang into you and cause injury.
My old employer has not quite yet "digitalized" the autopilot flight director controls (the AFDS), but it's probably coming.
I vaguely remember a concept for light thruster controls where a ball floats in a cube space, strongly pinned by magnetic forces, and a person grips the ball. Sensors detect the ball's movement in all six axes and interpret that as commands for the thrusters. This was intended for close-range manual docking/berthing work.
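The control mapping for such a device is straightforward to sketch: read the ball's small 6-axis displacement from its magnetic null point and map each axis to a thruster command, with a deadband so sensor noise doesn't cause the craft to drift. This is purely a hypothetical illustration of the idea; the names and the 0.05 threshold are invented:

```python
DEADBAND = 0.05  # normalized displacement below which no command is issued

def ball_to_commands(displacement):
    """displacement: six floats in [-1, 1] for (x, y, z, roll, pitch, yaw)
    relative to the ball's magnetically-held center position.
    Returns per-axis thruster commands, zeroed inside the deadband."""
    return [d if abs(d) > DEADBAND else 0.0 for d in displacement]

cmds = ball_to_commands([0.02, -0.3, 0.0, 0.5, -0.01, 0.1])
print(cmds)  # [0.0, -0.3, 0.0, 0.5, 0.0, 0.1] -- tiny x/pitch inputs ignored
```

A real implementation would add rate limiting and proportional scaling, but the deadband is the key to making a free-floating grip usable.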
Quote
I vaguely remember a concept for light thruster controls where a ball floats in a cube space strongly pinned by magnetic forces, and a person grips the ball. Sensors detect the ball moving in 6 axis movement, and interpreting that as commands for the thrusters. This was intended for close range manual docking/berthing work.

Sounds like a more expensive version of the 3D controllers currently in use in the CAD/3D modelling industry.
As Hobbes-22 points out, you can't operate the controls blindly. Additionally, in a high-g environment, getting your hands up to the controls could be problematic. Similar thing in a dynamic, vibrating environment. The see-through aspect of these controls can also cause problems: is that dot you're looking at part of the control, or something that drifted out of somewhere and stuck to the wall behind the projection area?

I see these kinds of controls as disasters waiting to happen. They may look cool in sci-fi, they may work in stationary installations, but they're just trouble waiting to happen in a moving vehicle.
Quote from: laszlo on 08/23/2018 11:43 pm
As Hobbes-22 points out, you can't operate the controls blindly. Additionally, in high-g environment, getting your hands up to the controls could be problematic. Similar thing in a dynamic vibrating environment. The see-through aspect of these controls can also cause problems. Is that dot you're looking at part of the control or something that drifted out of somewhere and stuck to the wall behind the projection area? I see these kinds of controls as disasters waiting to happen. They may look cool in sci-fi, they may work in stationary installations, but they're just trouble waiting to happen in a moving vehicle.

Watch the vid I posted above: you can control a screen, robot, or whatever with just the intention of moving your hand. Actually moving the hand is unnecessary. Mind you, the high-g environment you are referring to is very rare; I only know of one instance that came close.