Quote from: ugordan on 05/01/2014 02:29 pm
Oh, I'm sure they have it quite well-documented. Just not in a format your typical rocket pr0n enthusiast likes. It's in the form of vehicle telemetry. That's gold, anything else is just gravy.

If you want to go that way, we don't need a video. And yet we have one.
Quote from: ugordan on 05/01/2014 02:29 pm
It still doesn't change my argument that any such solutions are just too much trouble for the amount of use they'll have eventually.

What trouble? Switching the camera for a mono version? Or reconfiguring the camera to transmit in grayscale? You make it sound like mono cameras are troublesome, but in fact they are EXACTLY the same; manufacturers usually supply the same camera in both mono and color versions.
Ugordan is right. Switching to monochrome would not save much bandwidth (if any). Don't argue if you don't understand how chroma (color) is subsampled relative to luma (brightness).
EDIT: a broadcast-style codec like this typically uses 4:2:0 chroma subsampling (4:2:2 and 4:4:4 carry even more chroma), meaning at best a 2:1 relation between luma and chroma samples. So the color information is more like 33% of the raw video data, not considering how inter-frame compression redistributes bits. Where did you get the 12.5%?
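To make the arithmetic above concrete, here is a small sketch of the raw-sample budget per frame under the common subsampling schemes, 8 bits per sample. The 704x480 resolution is just an illustrative assumption, not the actual downlink resolution.

```python
# Raw (pre-compression) sample budget for one frame under different
# chroma subsampling schemes. Two chroma planes (Cb, Cr) accompany luma.
def raw_bytes(width, height, scheme):
    luma = width * height
    chroma_per_plane = {
        "4:4:4": luma,       # full-resolution chroma
        "4:2:2": luma // 2,  # half horizontal chroma resolution
        "4:2:0": luma // 4,  # half horizontal and vertical chroma resolution
    }[scheme]
    return luma + 2 * chroma_per_plane

w, h = 704, 480              # hypothetical frame size
mono = w * h                 # a mono frame is luma only
for s in ("4:4:4", "4:2:2", "4:2:0"):
    total = raw_bytes(w, h, s)
    print(f"{s}: {total} bytes, chroma share = {1 - mono / total:.0%}")
```

For 4:2:0 the chroma share of the raw samples comes out to 33%, matching the 2:1 luma-to-chroma relation; actual compressed bitrate savings from dropping chroma would be smaller, since encoders already spend far fewer bits on chroma than on luma.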
Quote from: Adaptation on 05/01/2014 08:11 am
Quote
First we took a pass on the data to align every MPEG packet on a 188 byte boundary and set the packet start byte to 0x47.
I'm not seeing that. For instance, location 26EC is divisible by 188 (0xBC), but its value is 4F, not 47.

I can concur. In the repair1.ts file the sync bytes have not been "fixed" to 0x47. Maybe they uploaded the wrong file? Not that this does much: the rest of the header is usually broken as well. Also, in raw.ts there are 5 places where I had to add exactly 56 bytes in order for the headers to align on the 188-byte grid.

Anyway, back to trying to get a little more life out of this video.
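For anyone checking the capture themselves, a minimal sketch of the sync-byte inspection being discussed: MPEG-TS packets are 188 bytes and must start with 0x47, so you can count how many packets on the 188-byte grid still carry the sync byte, and force it where it was corrupted. The helper names here are mine, not from the repair scripts in the thread.

```python
# MPEG-TS uses fixed 188-byte packets, each starting with sync byte 0x47.
TS_PACKET = 188
SYNC = 0x47

def sync_stats(data):
    """Count packets on the 188-byte grid whose first byte is 0x47."""
    total = len(data) // TS_PACKET
    good = sum(1 for i in range(total) if data[i * TS_PACKET] == SYNC)
    return good, total

def force_sync(data):
    """Overwrite the first byte of every 188-byte packet with 0x47."""
    out = bytearray(data)
    for i in range(0, len(out) - TS_PACKET + 1, TS_PACKET):
        out[i] = SYNC
    return bytes(out)
```

Usage would be something like `good, total = sync_stats(open("raw.ts", "rb").read())`. Note this assumes the stream is already aligned; the 56-byte insertions mentioned above would have to be applied first, or the grid drifts and every subsequent sync check lands mid-packet.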
The problem is the noisy signal, not the codec. The reason for the noisy signal is signal strength, pure and simple, worsened by the distance from the rocket and the bad weather. The easiest solution is to install a more powerful transmitter.
The other option would be to double the frame rate to 30 fps. Then the differences between consecutive frames would be roughly halved, increasing the compression ratio.
@AJA, why a red filter? Maybe a luminance filter with an IR cut would make sense, but I can't see the reason to use a red filter.
Quote from: ugordan on 05/01/2014 02:29 pm
Quote from: IRobot on 05/01/2014 02:18 pm
12.5% is actually a lot. When receiving such an error-prone signal, it makes a lot of difference. It also reduces transmission power requirements (for the same frame rate), therefore more power available for transmission, therefore a better S/N ratio.
Maybe. *If* that's the only telemetry sent. Who's to say the video wasn't multiplexed along with all the other, high-rate vehicle telemetry, so the 12.5% for video is more of a noise in the total bandwidth budget?
Quote from: IRobot on 05/01/2014 02:18 pm
Also, true monochrome cameras are up to 3x more sensitive, meaning less (camera) noise to start with.
Seriously? At the codec quality settings they're using, with the camera dirt that's deposited on the way down, you're worried about camera noise?
Quote from: IRobot on 05/01/2014 02:18 pm
Quote from: ugordan on 05/01/2014 01:42 pm
Again, why go around making up solutions that would only be relevant for a couple of seconds before splashdown?
Because the way up is quite well documented. The way down is not.
Oh, I'm sure they have it quite well-documented. Just not in a format your typical rocket pr0n enthusiast likes. It's in the form of vehicle telemetry. That's gold, anything else is just gravy. It still doesn't change my argument that any such solutions are just too much trouble for the amount of use they'll have eventually.

I worked on a similar sort of "telemetry": we divided data into what's important (10%, in this case sensor data) and what's less important (90%, in this case the video feed). Both were CRC-ed, but the first got retransmitted if corrupted or lost; the second was not. Sort of TCP and UDP. On a worsening channel, the retransmissions of sensor data occupied more and more bits until no bits were left for video. I guess SpaceX does the same, so if we see SOME video, it means ALL the sensor data was received without gaps.
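The prioritization scheme described above can be sketched in a few lines: each transmission slot has a fixed bit budget, critical frames are sent (and re-sent) first, and video only gets whatever capacity is left over. All names and numbers here are illustrative, not SpaceX's actual downlink design.

```python
# Toy model of a priority multiplexer: critical sensor frames consume the
# bit budget first (they would be retransmitted until acknowledged); video
# frames are best-effort and simply dropped when no budget remains.
def allocate(budget, pending_sensor, video_queue):
    sent_sensor, sent_video = [], []
    for size in pending_sensor:        # critical data first
        if budget < size:
            break
        budget -= size
        sent_sensor.append(size)
    for size in video_queue:           # video fills leftover capacity only
        if budget < size:
            break                      # dropped, never retried
        budget -= size
        sent_video.append(size)
    return sent_sensor, sent_video
```

On a degrading link, `pending_sensor` grows with retransmissions and starves the video queue, which matches the observation that any received video implies the sensor data got through intact.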
Someone on YouTube made a decent effort at cleaning it up: I'm not sure how accurate it is, and I don't think the legs were extended at the beginning of the clip? Nonetheless, it looks neat.
Can someone please explain to me why SpaceX still hasn't released the footage they got from their airplane? Why are they being so secretive about it? Seeing is believing, so why not just release the footage and prove to everyone in the world that they really accomplished such an unprecedented feat? I mean, they did actually record it from their airplane, right?