The Tiger-Tight technology operates on a micro-topographic scale. Industrial diamonds embedded in an electroless nickel matrix penetrate and interlock with the mating surfaces, creating an extremely high retaining force and preventing loosening under vibration and shock.
Future configurations of the Falcon 9 are Off-Topic for the COTS-2+ mission thread, people. All of this back and forth is based on an image capture from the launch, which was both On-Topic and cool. There is a Falcon 9 V1.1 thread for this discussion, with many of these questions answered.
The location of the retroreflectors on the JEM was known years in advance. There was evidence of their effect on DragonEye years in advance, too. DragonEye does not "scan". It has a 2D detector array. Any change in the FOV (or FOR) would be electronic, not physical as one could do with a scanner.
During the first docking, the laser ranging system was thrown off by sunlight glinting from the space station. Musk says the problem has been fixed. "LIDAR is sort of a 3D laser scanner that scans something and then it comes up with a point cloud and tries to figure out what it’s looking at," he says. "The system tries to fit that with what it’s expecting to see, and then, using that, it figures out what the relative position and motion is between Dragon and the space station. And it uses that to plot an approach vector."

During that first approach in May, the Dragon was working from a model of the ISS that wasn’t totally accurate, as pieces have been added to and subtracted from the real-life station. "There was a reflector on the Japanese model that was extremely bright and it was showing up to a far greater degree than we expected," Musk says. SpaceX solved the problem on the fly by uploading some new code to narrow the field of view, similar to putting blinders on a horse so it doesn’t get distracted. That temporary fix has morphed into a permanent reprogramming. "We’ve improved the software on the LIDAR, on the image-recognition software, so if it encounters this again it would not have a problem," Musk says.
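The "blinders" fix described above can be sketched in a few lines: reject returns that fall outside a narrowed angular field of view before the point cloud is matched against the station model. This is a minimal illustration only; the function name, half-angle, and geometry are assumptions for the sketch, not SpaceX's actual flight code.

```python
import numpy as np

def narrow_fov(points, half_angle_deg):
    """Keep only points within half_angle_deg of the sensor boresight (+Z)."""
    points = np.asarray(points, dtype=float)
    ranges = np.linalg.norm(points, axis=1)
    # Angle between each return and the boresight axis.
    off_axis = np.degrees(np.arccos(points[:, 2] / ranges))
    return points[off_axis <= half_angle_deg]

# Example: a bright glint far off-axis (like the JEM reflector) is dropped,
# while returns near the boresight survive for pose fitting.
cloud = np.array([
    [0.0, 0.0, 50.0],   # target feature near boresight
    [2.0, 1.0, 60.0],   # another valid return (~2 deg off-axis)
    [40.0, 0.0, 30.0],  # glint ~53 deg off-axis
])
filtered = narrow_fov(cloud, half_angle_deg=10.0)
print(len(filtered))  # -> 2
```

In practice the equivalent change could also be made electronically at the detector, by ignoring pixels outside a region of interest, which matches the point made earlier that any FOV change in DragonEye would be electronic rather than mechanical.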
The timeline for reflector changes and DragonEye demos was as follows:

May 2008 - STS-124 delivers JEM (TCS reflectors 7 and 8)
July 2009 - STS-127 performs first DragonEye demo
Feb 2011 - STS-133 performs second DragonEye demo and delivers PMM (TCS reflector 9)

There are other reflectors, of course, but they were either present prior to STS-124, or were never visible to Dragon due to placement.

Now, the DragonEye demos flew the normal shuttle approach profile (Rbar approach to 180 m, flyaround to Vbar with arrival at ~130 m). The JEM reflectors have a shield facing the Vbar that prevented the shuttle from seeing those reflectors while on the Vbar. The anomaly on the COTS 2+ flight occurred within 180 m, so it may have been a range-dependent phenomenon.
Quote from: Comga on 08/28/2012 05:39 pm
The location of the retroreflectors on the JEM was known years in advance. There was evidence of their effect on DragonEye years in advance, too. DragonEye does not "scan". It has a 2D detector array. Any change in the FOV (or FOR) would be electronic, not physical as one could do with a scanner.

Don't shoot the messenger.

http://www.popularmechanics.com/science/space/news/coming-in-october-spacex-dragon-gets-down-to-work-11953752

Quote
During the first docking, the laser ranging system was thrown off by sunlight glinting from the space station. Musk says the problem has been fixed. "LIDAR is sort of a 3D laser scanner that scans something and then it comes up with a point cloud and tries to figure out what it’s looking at," he says.
Quote from: ugordan on 08/28/2012 05:50 pm
Quote from: Comga on 08/28/2012 05:39 pm
The location of the retroreflectors on the JEM was known years in advance. There was evidence of their effect on DragonEye years in advance, too. DragonEye does not "scan". It has a 2D detector array. Any change in the FOV (or FOR) would be electronic, not physical as one could do with a scanner.

Don't shoot the messenger.

http://www.popularmechanics.com/science/space/news/coming-in-october-spacex-dragon-gets-down-to-work-11953752

Quote
During the first docking, the laser ranging system was thrown off by sunlight glinting from the space station. Musk says the problem has been fixed. "LIDAR is sort of a 3D laser scanner that scans something and then it comes up with a point cloud and tries to figure out what it’s looking at," he says.

Isn't the point that it's the LASER that does the scanning to paint the target, and the 2D detector array just passively detects where the bright spot appears in the FOV?

cheers, Martin
I suspect that people are using too tight a definition of the word 'scan'. Simply because something physical had to move 100 years ago does not mean modern detector technology cannot reduce the searching part to a loop in software.

http://en.wikipedia.org/wiki/Image_scanning
The laser does NOT scan. It broadcasts a single pulse covering the entire 45-degree square cone. This is an imaging lidar, similar to one I worked on. I am familiar with the ASC detector, and most details are readily available on their web page.
Quote from: Comga on 08/29/2012 07:42 pm
The laser does NOT scan. It broadcasts a single pulse covering the entire 45-degree square cone. This is an imaging lidar, similar to one I worked on. I am familiar with the ASC detector, and most details are readily available on their web page.

Traditional LIDAR does scan, usually using rotating or pivoting mirror(s); I think the beam spreader in this design is confusing people. I imagine the range for this type of LIDAR is much shorter, but you get higher frame rates and less processing is needed to transform the point cloud into a working 3D surface?

-R C
(Who usually only works with preprocessed First and Last return and gridded bare earth DEM products, but has read about the process...)
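The frame-rate advantage mentioned here is easy to see with back-of-the-envelope arithmetic: a scanned lidar measures pixels one pulse at a time, while a flash/imaging lidar captures the whole frame from a single pulse. The pulse-repetition and frame rates below are illustrative assumptions, not specs for DragonEye or any real sensor.

```python
# Rough comparison of time-per-frame (illustrative numbers only).
PIXELS = 128 * 128          # frame size, like DragonEye's array
PULSE_RATE_HZ = 10_000      # assumed pulse repetition rate of a scanner
FLASH_FRAME_RATE_HZ = 10    # assumed flash-lidar frame rate

# A scanner needs one pulse per pixel, so a full frame takes:
scan_time_per_frame = PIXELS / PULSE_RATE_HZ
print(scan_time_per_frame)      # 1.6384 s per frame for the scanner
print(1 / FLASH_FRAME_RATE_HZ)  # 0.1 s per frame for the flash lidar
```

For a fast-moving target during rendezvous, the flash approach also avoids motion distortion within a frame, since every pixel samples the scene at the same instant.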
Yes. "Traditional" systems use a single, high-speed detector to detect the time of flight of a short laser pulse. DragonEye and Orion VNS use "imaging lidar" detectors: arrays of detectors (128×128 for DragonEye and 256×256 for VNS), each of which performs the function of the "traditional" single-element lidar detector.
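The per-pixel time-of-flight measurement described above reduces to the same arithmetic as a single-element lidar, applied across the whole array: range is half the round-trip time multiplied by the speed of light. A minimal sketch, with an invented function name and no relation to ASC's actual interface:

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def tof_to_range(tof_seconds):
    """Convert a 2D array of per-pixel round-trip times to ranges in meters."""
    return 0.5 * C * np.asarray(tof_seconds, dtype=float)

# A 128x128 frame, as on DragonEye: one flash pulse floods the scene and
# every pixel timestamps its own return. Here, a flat target at 150 m.
tof = np.full((128, 128), 2 * 150.0 / C)
ranges = tof_to_range(tof)
print(ranges.shape)  # (128, 128)
```

Each frame thus yields a full range image (and hence a point cloud) from a single pulse, which is what distinguishes the imaging lidar from the scanned, single-detector kind.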
I think the distinction is worth keeping - if the sensor uses light (like DragonEye), it's LIDAR; if it uses RF, it's radar. The use of "laser radar" should be discouraged.
Quote from: Jorge on 08/30/2012 10:07 pm
I think the distinction is worth keeping - if the sensor uses light (like DragonEye), it's LIDAR; if it uses RF, it's radar. The use of "laser radar" should be discouraged.

RF is just really long-waved light. HaHA, PHYSICS!

*runs off*
If we detect the return with an antenna, it's a wave and radar. If we detect the absorbed energy, it's a photon and lidar.
Quote from: Comga on 08/30/2012 07:46 pm
Yes. "Traditional" systems use a single, high-speed detector to detect the time of flight of a short laser pulse. DragonEye and Orion VNS use "imaging lidar" detectors: arrays of detectors (128×128 for DragonEye and 256×256 for VNS), each of which performs the function of the "traditional" single-element lidar detector.

Just jumping in here to ask Comga a quick question. After following the last couple of posts and then doing some Googling, is the DragonEye better described as a "3-D gated viewing laser radar" (as described here: http://www.stanfordcomputeroptics.com/a-3d-gated-viewing-laser-radar.html) instead of as a LIDAR? Or is the former simply a subset of the latter?
It can be noted that many of the technical references from refereed journals include the term "laser radar" in their titles. Apparently, Jorge's and my distinction of terminology is not universally shared in the industry. Oh well.