Quote from: smfarmer11 on 02/28/2017 04:19 pm
The Kestrel-in-the-trunk idea is a non-starter, mainly because of its use of cryogenic propellants. Better would be a SuperDraco with a higher expansion ratio for better Isp, plus the storability of hypergolics. However, this subject was discussed extensively in this thread: https://forum.nasaspaceflight.com/index.php?topic=40318.0

As for "junk in the trunk," I don't see it becoming a service module, but I do think it will likely carry one or even a few expendable, free-flying, camera-equipped cubesats or something similar that will separate, maneuver, and maintain somewhat close formation with the capsule as it swings by the moon, to provide the ultimate drone imagery of the Dragon with the lunar surface rotating underneath. Images will be transmitted to the capsule as part of the tourist package and for SpaceX promotional use. Such a thing could also be as simple as a few wifi GoPros or commercial 360-degree ball cameras ejected from under the nose cap.
The Kestrel-in-the-trunk idea is a non-starter, mainly because of its use of cryogenic propellants. Better would be a SuperDraco with a higher expansion ratio for better Isp, plus the storability of hypergolics. However, this subject was discussed extensively in this thread: https://forum.nasaspaceflight.com/index.php?topic=40318.0
Great minds think alike.

Quote from: Helodriver on 02/28/2017 05:09 pm
Quote from: smfarmer11 on 02/28/2017 04:19 pm
The Kestrel-in-the-trunk idea is a non-starter, mainly because of its use of cryogenic propellants. Better would be a SuperDraco with a higher expansion ratio for better Isp, plus the storability of hypergolics. However, this subject was discussed extensively in this thread: https://forum.nasaspaceflight.com/index.php?topic=40318.0

As for "junk in the trunk," I don't see it becoming a service module, but I do think it will likely carry one or even a few expendable, free-flying, camera-equipped cubesats or something similar that will separate, maneuver, and maintain somewhat close formation with the capsule as it swings by the moon, to provide the ultimate drone imagery of the Dragon with the lunar surface rotating underneath. Images will be transmitted to the capsule as part of the tourist package and for SpaceX promotional use. Such a thing could also be as simple as a few wifi GoPros or commercial 360-degree ball cameras ejected from under the nose cap.
You're thinking of something like this: https://phys.org/news/2016-10-selfie-microsatellite-captures-images-chinese.html
They have been flying prototypes on the ISS for years; see SPHERES. They pre-date VR/ball cams, but the guidance, communication, and propulsion (warm CO2) have gotten a lot of flying time. Creating an update with modern batteries and processing power would seem a logical evolution.
To simplify the thing, it could be a one-use item. It would be released at the optimum time to get the hero shot, and then left on its own. This would eliminate worries about it doing any damage. Alternately, it could be on an extensible arm (selfie stick) to capture images, like Curiosity has, and then fold back up. This eliminates the need for antennas and any RF interference worries.

Matthew
At one point, this was part of the CONOPS for ISS. It was a piece of hardware called AERCam. A proof of concept demonstration flew on STS-87. I can't find any concrete information for when or why it was cancelled, though.
Getting a shot of the Dragon re-entering the Earth's atmosphere would be Extra Credit. Like, lots of freakin extra credit. And I think this can be done.
A. After passing the moon, the camera's trajectory is going to diverge from the Dragon's because of midcourse corrections. And since the Dragon does not have thrusters in couples, any attitude change is going to result in a delta-V.
B. What is going to aim the camera at the Dragon?
C. Also, what is going to keep the camera from hitting the Dragon? There is still a chance of collision.
D. There is no point in doing this during entry. The plasma will block any imaging.
A: Trajectory changes after midcourse corrections seem expensive. The drone could diverge so far from Dragon that it comes down hundreds or thousands of miles away. That could make finding it expensive, and if it slams into the ground there's a good chance the flash storage and beacon are toast.

B: No need to aim, just avoid rotating too fast. The cameras point in all directions. During reentry, the cameras point in most directions, just not close to forward. How about the drone re-enters slightly behind the capsule, with a slightly smaller ballistic coefficient so it falls back as it goes? Give it some lateral velocity before entry, so the capsule stays off to the side a bit and still in view.

D: Here's a video looking out from Orion during reentry:
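A rough sanity check on the "smaller ballistic coefficient falls back" idea: drag deceleration is a = rho * v^2 / (2 * beta), so a lower-beta drone decelerates harder and drops behind. All numbers below are illustrative assumptions, not a real drone or capsule design:

```python
# Rough feel for how fast a lower-ballistic-coefficient drone falls behind
# the capsule during entry. All values are illustrative assumptions only.

RHO = 1e-4            # kg/m^3, air density around ~60 km altitude (assumed)
V = 11_000.0          # m/s, roughly lunar-return entry speed
BETA_CAPSULE = 400.0  # kg/m^2, capsule ballistic coefficient (assumed)
BETA_DRONE = 100.0    # kg/m^2, small camera ball (assumed)

def drag_decel(beta: float) -> float:
    """Drag deceleration a = rho * v^2 / (2 * beta), beta = m / (Cd * A)."""
    return RHO * V * V / (2 * beta)

relative = drag_decel(BETA_DRONE) - drag_decel(BETA_CAPSULE)
print(f"capsule: {drag_decel(BETA_CAPSULE):.1f} m/s^2, "
      f"drone: {drag_decel(BETA_DRONE):.1f} m/s^2, "
      f"relative: {relative:.1f} m/s^2")
```

Even at these made-up numbers the relative deceleration is tens of m/s^2, so the drone separates aft of the capsule within seconds, which is the geometry you'd want for the shot.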
A. Any maneuver of the Dragon is going to separate them. Even Dragon attitude changes will.
A Raspberry Pi would have enough oomph to do the math for orientation; it could also handle the camera and wifi, and you can plug in an accelerometer and GPS module for location.
For orientation towards the Dragon, wouldn't a regular ping from the Dragon be enough to choose which way to point? Maybe a slightly offset two-ping system to determine precise direction relative to the drone, then just aim towards the beacon...
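The two-ping (two-antenna) idea is basically time-difference-of-arrival direction finding. A minimal sketch of the geometry, with an assumed baseline and timing offset, shows the catch: at RF, the timing has to be resolved to well under a nanosecond for a drone-sized baseline:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def bearing_from_tdoa(delta_t: float, baseline: float) -> float:
    """Angle of arrival (radians from broadside) of a distant beacon, given
    the time difference of arrival delta_t (s) at two antennas separated by
    baseline (m). For a far source: path difference = baseline * sin(angle)."""
    path_diff = C * delta_t
    return math.asin(max(-1.0, min(1.0, path_diff / baseline)))

# Assumed drone-sized baseline of 0.5 m: a single nanosecond of timing
# offset already corresponds to ~0.3 m of path difference...
angle = bearing_from_tdoa(1e-9, 0.5)
print(f"1 ns offset on a 0.5 m baseline -> {math.degrees(angle):.1f} deg")
```

So a 1 ns error swings the estimated bearing by tens of degrees; the drone would need picosecond-class timing (or a phase-comparison scheme) to point a camera this way.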
B. Pointing
Maybe the drone can be made small enough that the capsule can eject one every 60 seconds during re-entry.

C. Separation
If the drones are designed to trail and laterally separate from the Dragon during re-entry, they will be behind the capsule when it pops its chutes. Yikes! (Awesome visual, of course, if collision can be avoided.)

D. Plasma optical density
http://mms.businesswire.com/bwapps/mediaserver/ViewMedia?mgid=173435&vid=5
Looking out the side just doesn't look like a problem. Obviously the cameras won't be staring through the heat shield. Note that the side-mounted cameras can look forward to some extent.
Jim,

B: The drones would be attached to the top of the Dragon and would eject from there. I have no clue how that might affect aerodynamics during launch; maybe they'd need an aero shield around them. The Dragon has a cover over the port used to dock to the ISS. Maybe some of that can go, given that it's not docking to anything. Or maybe the drones can fit behind a bulged version of that cover.
1. How does Dragon open that front hatch cover without power? Is it just a cable, like the trunk release on my car? It's got to close it again afterwards somehow, right? And the parachutes have to release somehow as well. So the drones might use some more of whatever the existing equipment up there is using.

2. I'm thinking the drone has no pressure tanks. Cameras, batteries, motors, flywheels, heat shield, and a radio beacon.
Quote from: JamesH65 on 03/08/2017 06:31 pm
Raspberry Pi would have enough oomph to do the math for orientation, could also handle the camera, wifi and you can plug in an accelerometer and GPS module for location.

Are you talking about just entry or the whole mission? An RPi would be useless, since GPS isn't going to help with orientation or location at the moon. Neither is the accelerometer. For entry, GPS is not going to help because of the plasma. At any rate, an accelerometer and GPS are not going to provide enough info to point the camera at the capsule.
1. Those services are wired specifically for those tasks. There aren't extra power or signal cables that can be tapped into. Hence, there is no "existing equipment".

2. What is it going to use for propulsion and attitude control? (The flywheel alone can't do it.)
1. Okay, so they wire the releases specifically for the drones. Duh. It's not like the Dragon going around the moon is going to be exactly the same as a Dragon going to the ISS. There are going to be a whole bunch of special accommodations.

2. There is no need for propulsion, aside from a tiny solid rocket motor which drives the drone away from the Dragon. That might be done pneumatically; I think SpaceX prefers pneumatics.

3. Attitude control is done with a flywheel. During reentry, it'll saturate in short order, but we only need it to keep attitude control for a minute or two during reentry, as we're going to be out of range after that amount of time anyway. Okay, maybe actual flywheels won't produce enough torque, and the drone will need control moment gyros. But those are more complicated and bigger.

The drone going around the moon can disengage a lot more slowly, maybe even how the Russians deploy satellites from Dnepr, where they let go of the satellite with no impulse and then back the bus away from it. It will see vanishingly small external torques from radiation pressure, and so should be able to go at least hours before saturating the flywheels.

As for GPS, it would probably be useful to have the drone attempt to get a GPS fix just before reentry, broadcast that on its beacon, and do the same while it's bobbing in the water. If the in-space transmission can be received, it'll help the recovery crew find it sooner.
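The "hours before saturating" claim checks out on the back of an envelope. Solar radiation pressure torque is (solar flux / c) * area * lever arm; with assumed numbers for a camera-ball-sized drone and a typical small reaction wheel, the saturation time comes out far longer than hours:

```python
# Rough check of how long solar radiation pressure alone would take to
# saturate a small reaction wheel. All numbers are assumptions for
# illustration, not a real drone design.

SOLAR_FLUX = 1361.0       # W/m^2 at 1 AU
C = 299_792_458.0         # m/s
AREA = 0.03               # m^2, frontal area of a ~20 cm camera ball (assumed)
CP_CM_OFFSET = 0.01       # m, centre-of-pressure / centre-of-mass offset (assumed)
WHEEL_MOMENTUM = 0.005    # N*m*s, small CubeSat-class wheel capacity (assumed)

torque = (SOLAR_FLUX / C) * AREA * CP_CM_OFFSET  # N*m, worst-case SRP torque
seconds_to_saturate = WHEEL_MOMENTUM / torque
print(f"SRP torque ~ {torque:.2e} N*m")
print(f"time to saturate ~ {seconds_to_saturate / 3600:.0f} hours")
```

With these assumptions the disturbance torque is on the order of nanonewton-metres, and the wheel holds out for weeks, so the lunar-flyby leg is no problem; only the reentry aero torques are demanding.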
They are not going to put it on the front. The whole thread started with putting them in the trunk. You just don't go and add something willy-nilly. There aren't services like power or data all around the exterior of the Dragon to separate an attached object. Also, there are the safety implications of a non-separation with the pressure tanks of the drone.
Just out of interest, if you have two craft in formation, how do they currently know, automatically, where they are in relation to each other, without GPS?
On the subject of transmitting the video to the Dragon for store-and-possibly-forward...

What's the free-space range of some nearly off-the-shelf 802.11ad equipment? That can deliver up to 7 Gbit/s. Officially it supports up to 10 meters with beamforming... you might have to go with 802.11ac, which would limit you to a much slower rate... but still, assuming no interference (a big assumption), still in the ballpark of regular gigabit ethernet.

There's Li-Fi too; though that's actually slower, it would perhaps have greater range. Or just slap on some of the laser comms gear from CommX, though that may be too bulky... Trying to return the entire camera drone via re-entry on its own is certainly way too hard compared to simply sending the data...
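In free space the range question is just a link budget: path loss follows the Friis equation, growing 20 dB per decade of distance, and 60 GHz (802.11ad) starts ~22 dB worse than 5 GHz (802.11ac) at any given range. A quick sketch (standard formula, no specific radio assumed):

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss (Friis equation), in dB."""
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            + 20 * math.log10(4 * math.pi / 299_792_458.0))

# 802.11ad sits at 60 GHz, 802.11ac at 5 GHz
for freq, label in [(60e9, "802.11ad (60 GHz)"), (5e9, "802.11ac (5 GHz)")]:
    for d in (10, 100, 1000):
        print(f"{label}: {d:>5} m -> {fspl_db(d, freq):.1f} dB path loss")
```

Without walls or rain, the quoted "10 meter" 802.11ad limit is mostly a transmit-power and antenna-gain choice; every extra 20 dB of antenna gain buys a factor of 10 in range, which is why steerable dishes (or lasers) win for a drone that's drifting away.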
Quote from: JamesH65 on 03/09/2017 10:04 am
Just out of interest, if you have two craft in formation, how do they currently know, automatically, where they were in relation to each other, without GPS.

Both would require an INS and some sort of sensor suite: LIDAR, RADAR, multiple 3D cameras, etc.
Because of all the effort going into those things for autonomous vehicles, I'd think any of them could be an option. LIDAR and proximity RADAR are much cheaper now. 3D imaging is very useful at less than 100 m, like this. Tesla folks may have more to contribute than SpaceX.
Quote from: Ludus on 03/10/2017 04:26 pm
Because of all the effort going into those things for autonomous vehicles, I'd think any of those things could be an option. LIDAR and proximity RADAR is much cheaper. 3D imaging is very useful at less than 100m like this. Tesla folks may have more to contribute than SpaceX.

An INS is needed regardless, and it would need some type of star tracker.
I'd want to target human-eye resolution, which is 150-200 urad/pixel. Once you account for some overlap between sensors, the combined resolution of the camera ball will be over 400 megapixels. At 60 fps, that's 24 gigapixels/second. The moon flyby will last at least an hour, maybe two. Good video compression will get that down to 0.1 bits/pixel, so 1-2 terabytes.

Transmitted over the course of 4 days, that's 23-46 Mb/s. That's a very, very fast link. Maybe you could use WiFi to transmit from the drone to the Dragon, and let the Dragon bring the data home, but that link would have to stay up for four days over ever-increasing range, or more likely run at very, very high data rates for many hours. I agree that this is also a reasonable direction to go in, but I don't like it for a couple of reasons.
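The arithmetic above can be checked in a few lines (using the stated assumptions: 150-200 urad/pixel, full-sphere coverage, 60 fps, 0.1 bits/pixel, 4-day downlink):

```python
import math

# Back-of-envelope check of the numbers above, using the stated assumptions
PIXEL_ANGLE = 175e-6        # rad/pixel, middle of the 150-200 urad range
FULL_SPHERE = 4 * math.pi   # steradians covered by the camera ball
FPS = 60
BITS_PER_PIXEL = 0.1        # "good video compression"
DOWNLINK_DAYS = 4

pixels = FULL_SPHERE / PIXEL_ANGLE**2
print(f"sphere coverage: {pixels / 1e6:.0f} megapixels")

for hours in (1, 2):
    bits = pixels * FPS * BITS_PER_PIXEL * hours * 3600
    terabytes = bits / 8 / 1e12
    mbps = bits / (DOWNLINK_DAYS * 86400) / 1e6
    print(f"{hours} h flyby: {terabytes:.1f} TB, {mbps:.0f} Mb/s over 4 days")
```

This reproduces the ~400 megapixel, 1-2 TB, and tens-of-Mb/s figures (to within rounding of the assumed pixel angle).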
Many of the cameras will be pointing at open space. They will essentially be recording black with a few pinpricks of starlight. Any reasonable video codec will collapse the streams from those cameras down to nearly nothing.
I'd expect the cameras that ARE pointing at something to compress pretty well too: Dragon will be essentially static, and the moon will be moving, but slowly and very predictably, with no low-level noise to filter out from atmospheric effects.
Quote from: starsilk on 03/10/2017 07:45 pm
many of the cameras will be pointing at open space. they will essentially be recording 'black' with a few pinpricks of starlight. any reasonable video codec will collapse the stream from those cameras down to nearly nothing.

Yes, a starfield should be very compressible.

Quote
I'd expect the cameras that ARE pointing at something to compress pretty well too - dragon will be essentially static, the moon will be moving, but slowly, and very predictably, and no low level noise to filter out from atmospheric effects.

What are the atmospheric effects that produce low-level noise? Most of the image noise that I've run across is photon shot noise, which is a quantum-mechanical thing having nothing to do with the atmosphere. The rest, significant only in low-SNR images, is readout noise from the electronics on the sensor.
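For anyone unfamiliar with shot noise: photon arrivals are Poisson-distributed, so a pixel collecting an average of N photons per frame has a standard deviation of sqrt(N), i.e. SNR = sqrt(N), atmosphere or not. A quick stdlib-only Monte Carlo check (photon count is an assumed example value):

```python
import math
import random

def poisson(mean: float) -> int:
    """Draw a Poisson sample via Knuth's multiplication method
    (fine for modest means; illustration only)."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

random.seed(42)
mean_photons = 100.0  # assumed photons per pixel per frame
samples = [poisson(mean_photons) for _ in range(20_000)]
avg = sum(samples) / len(samples)
std = math.sqrt(sum((s - avg) ** 2 for s in samples) / len(samples))
print(f"mean {avg:.1f}, std {std:.2f}, sqrt(N) = {mean_photons ** 0.5:.1f}")
```

The measured spread comes out near sqrt(100) = 10, which is why brighter (longer-exposure) pixels look cleaner: SNR only improves as the square root of the light collected.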