Just out of interest, if you have two craft in formation, how do they currently know, automatically, where they are in relation to each other, without GPS?
On the subject of transmitting the video to the Dragon for store and possibly forward... what's the free-space range of some nearly off-the-shelf 802.11ad equipment? That can deliver up to 7 Gbit/s, though officially it only supports up to 10 meters with beamforming. You might have to fall back to 802.11ac, which would limit you to a much slower rate... but still, assuming no interference (a big assumption), in the ballpark of regular gigabit Ethernet.

There's Li-Fi too; though it's actually slower, it would potentially have greater range. Or just slap some of the laser comms gear on it from CommX, though that may be too bulky. Trying to return the entire camera drone via re-entry on its own is certainly far harder than simply sending the data.
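For a rough feel of how range hits the link budget, here's a minimal free-space path loss (Friis) sketch comparing 802.11ad's 60 GHz band against 802.11ac at 5 GHz. The distances and bands are illustrative; in vacuum there's no 60 GHz oxygen absorption, so pure path loss is the relevant term, but real throughput would depend on antenna gain and transmit power, which aren't modeled here.

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free-space path loss in dB: 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)."""
    c = 299_792_458.0  # speed of light, m/s
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / c))

# 100 m separation between the drone and Dragon (an assumed figure):
loss_ad = fspl_db(100, 60e9)  # 802.11ad, 60 GHz -> ~108 dB
loss_ac = fspl_db(100, 5e9)   # 802.11ac, 5 GHz  -> ~86 dB
```

The ~22 dB difference is just the frequency term; 60 GHz loses more to path loss at any given distance, which is part of why 802.11ad leans so heavily on beamforming.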
Quote from: JamesH65 on 03/09/2017 10:04 am
Just out of interest, if you have two craft in formation, how do they currently know, automatically, where they are in relation to each other, without GPS?

Both would require INS and some sort of sensor suite: LIDAR, radar, multiple 3D cameras, etc.
Because of all the effort going into those things for autonomous vehicles, I'd think any of them could be an option. LIDAR and proximity radar are much cheaper now, and 3D imaging is very useful at ranges under 100 m like this. The Tesla folks may have more to contribute than SpaceX.
Quote from: Ludus on 03/10/2017 04:26 pm
Because of all the effort going into those things for autonomous vehicles, I'd think any of them could be an option. LIDAR and proximity radar are much cheaper now, and 3D imaging is very useful at ranges under 100 m like this. The Tesla folks may have more to contribute than SpaceX.

INS is needed regardless, and it would need some type of star tracker.
I'd want to target human-eye resolution, which is 150-200 µrad/pixel. Once you account for some overlap between sensors, the combined resolution of the camera ball will be over 400 megapixels. At 60 fps, that's 24 gigapixels/second. The moon flyby will last at least an hour, maybe two.

Good video compression will get that down to 0.1 bits/pixel, so 1-2 terabytes. Transmitted over the course of 4 days, that's 23-46 Mbit/s, which is a very, very fast link. Maybe you could use WiFi to transmit from the drone to the Dragon and let the Dragon bring the data home, but that link would have to stay up for four days over ever-increasing range, or more likely run at very high data rates for many hours. I agree that this is also a reasonable direction to go in, but I don't like it for a couple of reasons.
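The numbers above chain together straightforwardly; here's the arithmetic spelled out for the one-hour case (the pixel count, frame rate, and 0.1 bits/pixel compression figure are the post's assumptions, not measured values):

```python
# Assumptions from the post above.
pixels = 400e6            # combined camera-ball resolution, pixels
fps = 60                  # frames per second
bits_per_pixel = 0.1      # assumed compressed bitrate per pixel
flyby_s = 3600            # one-hour flyby (the post's lower bound)
downlink_s = 4 * 24 * 3600  # four days to send it home

raw_rate = pixels * fps                       # 2.4e10 pixels/s ("24 gigapixels/second")
compressed_bps = raw_rate * bits_per_pixel    # 2.4 Gbit/s during capture
total_bytes = compressed_bps * flyby_s / 8    # ~1.1 TB for one hour
link_bps = compressed_bps * flyby_s / downlink_s  # ~25 Mbit/s sustained
```

Doubling `flyby_s` to two hours doubles the totals, giving the 2 TB / 46 Mbit/s upper end quoted above.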
Many of the cameras will be pointing at open space. They will essentially be recording 'black' with a few pinpricks of starlight, and any reasonable video codec will collapse the stream from those cameras down to nearly nothing.
I'd expect the cameras that ARE pointing at something to compress pretty well too: Dragon will be essentially static, the moon will be moving, but slowly and very predictably, and there's no low-level noise from atmospheric effects to filter out.
Quote from: starsilk on 03/10/2017 07:45 pm
Many of the cameras will be pointing at open space. They will essentially be recording 'black' with a few pinpricks of starlight, and any reasonable video codec will collapse the stream from those cameras down to nearly nothing.

Yes, a starfield should be very compressible.

Quote
I'd expect the cameras that ARE pointing at something to compress pretty well too: Dragon will be essentially static, the moon will be moving, but slowly and very predictably, and there's no low-level noise from atmospheric effects to filter out.

What are the atmospheric effects that produce low-level noise? Most of the image noise I've run across is photon shot noise, which is a quantum-mechanical effect that has nothing to do with the atmosphere. The rest, significant only in low-SNR images, is readout noise from the electronics on the sensor.
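To make the shot-noise point concrete: photon arrivals are Poisson-distributed, so a pixel that collects a mean of N photoelectrons has noise sqrt(N) and SNR = sqrt(N), atmosphere or not. The well depth below is an illustrative number, not from the post:

```python
import math

def shot_noise_snr(photons):
    """SNR of a Poisson photon count: mean N over std-dev sqrt(N) = sqrt(N)."""
    return math.sqrt(photons)

# A well-exposed pixel collecting ~10,000 photoelectrons:
snr = shot_noise_snr(10_000)   # 100
snr_db = 20 * math.log10(snr)  # 40 dB
```

This is why shot noise dominates in bright scenes and readout noise only matters when the photon count (and hence sqrt(N)) is small.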