Author Topic: SpaceX Falcon 9 v1.1 CRS-3 Splashdown Video Repair Task Thread  (Read 414575 times)

Offline rickyramjet

  • Member
  • Posts: 32
  • Liked: 3
The problem is the noisy signal, not the codec.  The reason for the noisy signal is signal strength, pure and simple, worsened by the distance from the rocket and the bad weather.  The easiest solution is to install a more powerful transmitter.

Offline Adaptation

  • Full Member
  • *
  • Posts: 135
  • Liked: 32
Quote
First we took a pass on the data to align every MPEG packet on a 188 byte boundary and set the packet start byte to 0x47.

I'm not seeing that.  For instance, location 0x26EC is divisible by 188 (0xBC), but its value is 4F, not 47.
I can concur. In the repair1.ts file the sync bytes have not been "fixed" to 0x47. Maybe they uploaded the wrong file? Not that this does much: the rest of the header is usually broken as well. Also, in raw.ts there are 5 places where I had to add exactly 56 bytes for the headers to align on the 188-byte grid.
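For anyone following along at home, the kind of alignment scan described above can be sketched in a few lines of Python. The helper names are mine, not from the actual repair scripts:

```python
# Sketch of the alignment scan described above: find the offset at
# which the 0x47 sync bytes best line up on a 188-byte grid.
PACKET_SIZE = 188
SYNC_BYTE = 0x47

def sync_score(data: bytes, offset: int) -> int:
    """Count full packets starting at `offset` that begin with 0x47."""
    hits = 0
    for pos in range(offset, len(data) - PACKET_SIZE + 1, PACKET_SIZE):
        if data[pos] == SYNC_BYTE:
            hits += 1
    return hits

def best_alignment(data: bytes) -> int:
    """Return the grid offset (0..187) with the most sync-byte hits."""
    return max(range(PACKET_SIZE), key=lambda off: sync_score(data, off))
```

In principle, running this over successive windows of raw.ts would reveal where those 56-byte shifts occur, since the winning offset would jump at each shift.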

Anyway. Back to trying to get a little more life out of this video.  ;)

Well some bits in the header should be able to be restored as well. 

Here is a prototype for the four header bytes.
0100 0111  (this is the G, or 0x47)
0Y0Y YYYY
YYYY YYYY
00X1 XXXX

Where 1's and 0's are values that should be set regardless of the contents of the packet.  X's contain data that may be determined by analysing the headers before and after this packet.  Y's contain data that could possibly be determined by analysing the data within the packet, since the identifiers associated with the codec are known and several invalid values can be excluded.

Without doing much analysis we can make some bitwise filters for the second and fourth bytes to fix five bits. 

0101 1111  to be ANDed with the second byte (clears the transport-error and priority flags, so the packet is not ignored and not given special priority)
0011 1111  to be ANDed with the fourth byte (declares the stream to be unencrypted)
0001 0000  to be ORed with the fourth byte (sets "packet contains payload" to true, which is possibly too big an assumption)

http://en.wikipedia.org/wiki/MPEG_transport_stream#Packet
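For reference, the three masks applied to one raw 188-byte packet look like this in Python (the function name is mine; the masks are exactly the ones above):

```python
# The bitwise repair described above, applied in place to a single
# 188-byte MPEG-TS packet (header layout per the linked spec).
def fix_header(packet: bytearray) -> None:
    packet[0] = 0x47          # sync byte: always 'G'
    packet[1] &= 0b0101_1111  # clear transport-error and priority flags
    packet[3] &= 0b0011_1111  # scrambling control -> 00 (unencrypted)
    packet[3] |= 0b0001_0000  # assume the packet carries a payload
```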
« Last Edit: 05/01/2014 06:06 PM by Adaptation »

Online IRobot

  • Full Member
  • ****
  • Posts: 818
  • Liked: 53
  • Portugal & Germany
Quote
The problem is the noisy signal, not the codec.  The reason for the noisy signal is signal strength, pure and simple, worsened by the distance from the rocket and the bad weather.  The easiest solution is to install a more powerful transmitter.
More or less. If you use a codec that reduces the information by 50% (for example by not using color) while keeping a frame rate of 15fps, you can actually send each frame twice (or send each transmission packet twice). But as that would require a deep software change, the other option would be to double the number of frames to 30fps. Differences between consecutive frames would then be roughly halved, increasing the compression ratio.

I'm no codec expert, so there are probably better ways to spend bandwidth on reducing transmission noise (data corruption). Still, a change in the codec is probably a lot easier than replacing the transmitter. A more powerful transmitter also uses more energy.

Online Lars_J

  • Senior Member
  • *****
  • Posts: 6106
  • Liked: 615
  • California
Someone on Youtube did a decent effort of cleaning it up:

I'm not sure how accurate it is, and I don't think the legs were extended in the beginning of the clip? Nonetheless it looks neat.

« Last Edit: 05/01/2014 06:51 PM by Lars_J »

Online IRobot

  • Full Member
  • ****
  • Posts: 818
  • Liked: 53
  • Portugal & Germany
That guy took the rocket image from the best frame and superimposed it on all frames... it offers a visual cue, but that's all.

Offline Adaptation

  • Full Member
  • *
  • Posts: 135
  • Liked: 32
Quote
the other option would be to double the number of frames, 30fps. Then differences between frames would be reduced to half, increasing compression ratio.

Doubling the framerate would do little to solve the problem with a modern codec.  You would need a full-frame codec like MJPEG for that to really work.  You could instead reduce the threshold for sending a keyframe, or send keyframes twice.  As they are using a fixed-bit-rate stream, there may be room for some of these tricks.

The best thing they could do is know better where the rocket will come down and have adequate downlink capability there. 

Higher transmit power is nice, but it only gets you so far: in free space, doubling the power only buys you about 40% more range, and you can only double the power so many times before the strategy gets out of hand.  But using, for instance, a 28 dBi directional antenna on the receiver gives you the same result as multiplying your transmit power roughly 600 times.  The only problem is you have to point it very accurately: if you're off by just 5 degrees you might only get 250x the receive power, and it drops steeply from there; a few more degrees and it's the same as sticking blinders on your receiver.
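The arithmetic behind that, as a rough sketch (free-space assumption; the function names are mine):

```python
import math

# Back-of-envelope link-budget arithmetic: antenna gain in dBi converts
# to a linear power factor, and since free-space received power falls
# off as 1/r^2, range scales with the square root of power.
def dbi_to_linear(gain_dbi: float) -> float:
    """Convert an antenna gain in dBi to a linear power multiplier."""
    return 10 ** (gain_dbi / 10)

def range_factor(power_factor: float) -> float:
    """Range multiplier obtained from a transmit-power multiplier."""
    return math.sqrt(power_factor)
```

dbi_to_linear(28) comes out near 630, the same ballpark as the figure above, while range_factor(2) is about 1.41: each doubling of power buys roughly 40% more range.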

AFAIK this launch did not aggressively attempt to land at a precise spot, because extra velocity was given to Dragon to assure the highest possible margins for mission success.
« Last Edit: 05/01/2014 07:03 PM by Adaptation »

Offline AJA

  • Full Member
  • ****
  • Posts: 604
  • Liked: 36
  • Per Aspera Ad Ares, Per Aspera Ad Astra
  • India
Quote
@AJA why red filter? Maybe a luminance filter with IR cut, but can't understand the reason to use the red filter.

Falcon's white. The legs too have a white border (scroll to find input~2's YT screenshot ITT). The ocean's blue. That means the RGB luminances of Falcon are, say, r1, g1, b1, whereas the ocean's are ~0, ~g2 (bodies of water do look greenish sometimes, don't they? Plus, plankton?), b2. The difference |b2 - b1| is, I would assume, the smallest of the three pairs and doesn't really help in establishing where the legs end and the water begins (in the image data). |g2 - g1| would probably be larger than the blue-channel difference, but the largest by far would be |r1 - 0| (I'm assuming that the ocean is black in the red channel, or very close to it at least). So the red channel allows you to differentiate.

While it may seem useless, and like a really poor version of a Grasshopper video if you're not able to tell whether Falcon is moving up and down in response to the waves... I'm counting on the fact that as waves break, the surf is going to be white... and will be visible in the image as well.

They may still want to cut out IR.. because the water might be radiating, and once the engine lights, it'll probably saturate the sensor.

I don't think this would require much modification at all. Unless they're using some special space-qualified camera with a custom chip, a custom form factor, etc., can't you just get a black-and-white camera and stick a red filter in front of it? If they keep the same data payload, they can trade two channels for more dynamic range...
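To make the channel argument concrete, here is a toy calculation. The RGB values are made up purely for illustration:

```python
# Toy numbers only: per-channel contrast between a white stage and
# blue-green water, following the argument above.
rocket = {"r": 230, "g": 230, "b": 230}  # white stage and legs
ocean = {"r": 15, "g": 90, "b": 140}     # dark in red, greenish-blue water

# Absolute luminance difference per channel; the biggest difference is
# the channel that best separates rocket from water.
contrast = {ch: abs(rocket[ch] - ocean[ch]) for ch in ("r", "g", "b")}
best_channel = max(contrast, key=contrast.get)
```

With these numbers the red channel wins by a wide margin, which is the whole point of the red filter.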

Online MP99

  • Armchair rocket scientist.
  • Senior Member
  • *****
  • Posts: 5674
  • Liked: 191
  • UK
12.5% is actually a lot. When receiving such error-prone signal, it makes a lot of difference. It also reduces transmission power requirements (for the same frame rate), therefore more power available for transmission, therefore better S/N ratio.

Maybe. *If* that's the only telemetry sent. Who's to say the video wasn't multiplexed along with all the other high-rate vehicle telemetry, so that the 12.5% for video is more like noise in the total bandwidth budget?

Also true monochrome cameras are up to 3x more sensitive, meaning less (camera) noise to start with.

Seriously? At the codec quality settings they're using, with camera dirt deposited on the way down, you're worried about camera noise?

Again, why go around making up solutions that would only be relevant for a couple of seconds before splashdown?
Because the way up is quite well documented. The way down is not.

Oh, I'm sure they have it quite well-documented. Just not in a format your typical rocket pr0n enthusiast likes. It's in the form of vehicle telemetry. That's gold, anything else is just gravy.

It still doesn't change my argument that any such solutions are just too much trouble for the amount of use they'll have eventually.
I worked on a sort of "telemetry" system; we divided the data into what's important (10%, in this case sensor data) and what's less important (90%, in this case the video feed). Both were CRC-ed, but the first got retransmitted if corrupted or lost; the second was not. Sort of TCP and UDP.

On a worsening channel, the retransmissions of sensor data occupied more and more bits until no bits were left for video.

I guess SpaceX does the same, so if we see SOME video, it means ALL sensor data was received without gaps.

You guys are worried about how many bits the chroma component takes, when some of the analysis says they included substantial fill-in packets to bump the data rate up to a fixed-rate transmission?

If they had infinite time to work on the transmission system, it would have been nice to optimise it with lots of redundancy data instead of fill-in "0xffffffff" packets. (Though maybe those "ffff"s make it easier to re-synchronise the stream once major errors start to bite?)

But, I suspect this is more of an off-the-shelf system that was stymied by weather conditions on the day.

Next launch / splashdown should have an easier time of it.

cheers, Martin

Offline eeergo

  • Phystronaut
  • Senior Member
  • *****
  • Posts: 3857
  • Liked: 35
  • Blacksburg, VA; Italy; Spain
Quote
Someone on Youtube did a decent effort of cleaning it up:

I'm not sure how accurate it is, and I don't think the legs were extended in the beginning of the clip? Nonetheless it looks neat.



The overlay this person made is actually quite misleading: at 0:14-0:15 there are some misplaced pixels from the engine exhaust that appear as yellow artifacts on the left of the image. In the original video they were not so apparent, since there was a lot of noise, but here the overlay takes the context away and they get quite distracting. It also makes the splashdown and subsequent tipping over of the stage very confusing to watch, since the legs should be submerged.
-DaviD-

Online Lars_J

  • Senior Member
  • *****
  • Posts: 6106
  • Liked: 615
  • California
Yes, the overlay works better for the first part of the video.

Offline michaelni

  • Member
  • Posts: 28
  • Liked: 23
The video seems to be Simple Profile Level 3 MPEG-4 video in an MPEG transport stream.
It seems none of the error-resilience features of MPEG-4 were used when encoding it, which is a pity. Had slices been used, the decoder could resume decoding a frame at the next slice start; had data partitioning been used, the more important low-resolution information and motion vectors would have been coded first in each slice, making errors less likely to damage them; and had RVLCs (reversible variable-length codes) been used, slices could have been decoded from both the start and the end, again limiting the impact of bit errors.
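As a small aid for anyone poking at the elementary stream, here is a sketch for locating MPEG-4 start codes. The function is mine, not from any actual repair tool; 0x000001 is the standard start-code prefix:

```python
# Locate MPEG-4 start codes (00 00 01 xx) in an elementary stream;
# e.g. code 0xB6 marks a VOP (frame) start.  Useful for finding frame
# boundaries when repairing a damaged stream.
def find_start_codes(data: bytes):
    """Yield (offset, code) for every byte-aligned 00 00 01 xx prefix."""
    pos = 0
    while True:
        pos = data.find(b"\x00\x00\x01", pos)
        if pos < 0 or pos + 3 >= len(data):
            return
        yield pos, data[pos + 3]
        pos += 3
```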

I know nothing about how the video was generated or how it was transmitted, but if there was some FEC in there then it should in principle be possible to re-run FEC decoding after manually fixing up all the MPEG-TS and MPEG-4 ES headers. Since such manual fixing would decrease the errors, the FEC would then have fewer errors to deal with and might in a few rare cases be able to fix a few more.
Also, if CRCs have been used, they can also be used to correct bit errors, as long as the number of errors each CRC would need to correct is sufficiently small; the exact number that can be corrected this way depends on the packet size and the CRC polynomial being used.
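A toy illustration of that CRC trick. The CRC-8 polynomial 0x07 here is an arbitrary choice for the sketch; nothing in it is from the actual downlink format:

```python
# For a small packet, a single flipped bit can be located by brute
# force: flip each bit in turn and keep the flip that makes the
# checksum match the expected value.
def crc8(data: bytes, poly: int = 0x07) -> int:
    """Bitwise MSB-first CRC-8 with zero initial value."""
    crc = 0
    for byte in data:
        crc ^= byte
        for _ in range(8):
            if crc & 0x80:
                crc = ((crc << 1) ^ poly) & 0xFF
            else:
                crc = (crc << 1) & 0xFF
    return crc

def correct_single_bit(data: bytes, expected_crc: int):
    """Return the packet with a single bit error fixed, or None."""
    if crc8(data) == expected_crc:
        return data
    for i in range(len(data) * 8):
        trial = bytearray(data)
        trial[i // 8] ^= 1 << (i % 8)
        if crc8(bytes(trial)) == expected_crc:
            return bytes(trial)
    return None  # more than one bit flipped, or the CRC itself is bad
```

This scales as one CRC computation per candidate bit, which is one reason the number of correctable errors depends on packet size, exactly as noted above.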

Offline arnezami

  • Full Member
  • **
  • Posts: 275
  • Liked: 258
Yeah. It's a real challenge. Pretty stuck here.

But I still got a few ideas I want to try...

Offline SVBarnard

  • Member
  • Posts: 30
  • Liked: 4
  • USA
Can someone please explain why SpaceX still hasn't released the footage they got from their airplane? Why are they being so secretive about it? I mean, seeing is believing, so why not just release the footage and prove to everyone in the world they really accomplished such an unprecedented feat?

I mean, they did actually record it from their airplane, right?

Offline luinil

  • Full Member
  • *
  • Posts: 105
  • Liked: 30
  • Tokyo
The airplane might not have been close enough to take video.

Remember, the weather was pretty heavy; NASA declined to send their plane.

Online Lars_J

  • Senior Member
  • *****
  • Posts: 6106
  • Liked: 615
  • California
Quote
Can someone please explain why SpaceX still hasn't released the footage they got from their airplane? Why are they being so secretive about it? I mean, seeing is believing, so why not just release the footage and prove to everyone in the world they really accomplished such an unprecedented feat?

I mean, they did actually record it from their airplane, right?

I suspect we will see more footage when SpaceX releases their usual mission highlights video.
