I never expected that ffmpeg could do something like that. If you have more than one stream in there, how should it decide which stream a packet belongs to? I understand that it needs the right PID, but the AF flag has to be set correctly too to get all of the data.
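For what it's worth, the PID/AF decision can be sketched roughly like this. This is a hedged illustration following the standard MPEG-TS packet header layout (sync byte, 13-bit PID, 2-bit adaptation field control); the example packet bytes are invented, not taken from the actual stream:

```python
# Sketch: how a demuxer decides which stream a 188-byte MPEG-TS packet
# belongs to (PID) and whether it carries an adaptation field (AF).
# Field offsets follow the MPEG-TS header layout; the packet is made up.

def parse_ts_header(packet: bytes):
    """Return (pid, has_adaptation_field, has_payload) for one TS packet."""
    if len(packet) != 188 or packet[0] != 0x47:  # 0x47 = TS sync byte
        raise ValueError("not a valid TS packet")
    pid = ((packet[1] & 0x1F) << 8) | packet[2]   # 13-bit PID
    afc = (packet[3] >> 4) & 0x3                  # adaptation_field_control
    return pid, afc in (2, 3), afc in (1, 3)

# Example: sync byte, PID 0x0100, AFC = 3 (adaptation field + payload)
pkt = bytes([0x47, 0x41, 0x00, 0x30]) + bytes(184)
print(parse_ts_header(pkt))  # (256, True, True)
```

So a demuxer can route packets by PID alone, but it still has to honor the AF control bits to know where the payload actually starts inside each packet.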
Quote from: maschnitz on 05/31/2014 08:04 pm
Quote from: moralec on 05/31/2014 04:25 pm
It doesn't seem like rocket science, but it does seem like a lot of work, especially on the online tool side (we don't want you to hate us IainCole). Ideas?
VCSes - version control systems - are really good at moralec's database problem. One idea is to shoehorn this into a VCS by using a VCS as storage. Here's one way to do that: force MMB lists into a VCS-friendly format, by
a) breaking up the MMBs into one per line [VCSes tend to work best on whole lines at a time] and
b) sorting the MMBs into a standard sort order, most likely the raster order most folks are already used to. [EDIT: I guess this isn't strictly necessary, but may make IainCole's life easier]
c) Then you need to always check in and out of the VCS via something that understands the format. IainCole's website, or some other website, for example, would be the only entity with access to the VCS. This website would be responsible for parsing raw MMBs and providing the MMBs back in an ffmpeg-friendly format.
d) Ideally this file format would have comments to allow for notes and simple alternatives.
The VCS would be responsible for full 'branches' [git/mercurial is good for this]. And then a flat file format might come in handy in other ways.

There are other ways: you could use raw RCS diffs to manage things; you could find a diff program that's really good at big long lines (most aren't); you could code up the delta checking by hand; you could simply store all the MMBs ever in groups, normalized in the database, for users to sort out; etc.

This was pretty much exactly my suggestion here: http://forum.nasaspaceflight.com/index.php?topic=34597.msg1207365#msg1207365

The added bonus being that the online editor is already using that file to load MMBs. People can either submit MMBs by pull request, or a few key people can be made collaborators on that git repo to add them in based off submissions to the wiki.

Edit: There's obviously some administrative overhead in making sure that the wiki / json file are in sync, but short of writing a new system to do the job of both things, I think it's the simplest option.
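To make points (a) and (b) concrete, the normalization step could look something like this. This is only a sketch: the MMB record fields (frame, row, col, payload) and the colon-separated line format are invented for illustration, not the actual format the editor uses:

```python
# Sketch of "one MMB per line, in a fixed sort order" so a VCS diffs
# the list cleanly. Record fields and line format are hypothetical.

def normalize_mmbs(mmbs):
    """Serialize MMBs one per line, sorted in raster order (frame, row, col)."""
    ordered = sorted(mmbs, key=lambda m: (m["frame"], m["row"], m["col"]))
    return "\n".join(
        f'{m["frame"]}:{m["row"]}:{m["col"]}:{m["payload"]}' for m in ordered
    )

mmbs = [
    {"frame": 1, "row": 0, "col": 1, "payload": "beef"},
    {"frame": 0, "row": 2, "col": 0, "payload": "cafe"},
]
print(normalize_mmbs(mmbs))
# 0:2:0:cafe
# 1:0:1:beef
```

Because the order is deterministic, two contributors editing different macroblocks produce diffs that touch different lines, which is exactly the case git merges well.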
OK, so I used to consider myself good with coding. I am a complete newbie when it comes to MPEG4, so I tried the online editor and it's still beyond mortal man without lots of explanation.

What I am proposing, to get more eyes working on this, is to focus on getting correct data packets, and then the rest of the crowd can start flipping bits in the payload. What I am hoping for (it may not be possible due to the compression) is to be able to break the payload down into specific parts and display them as such. Then, if the tool can allow us to concentrate on the payload part by part, mortal eyes may have a better chance of flipping bits to start fixing the rest of the picture. But the key is to get us to the point where we can focus on the parts of the payload, as opposed to trying to guess at the whole payload, as appears to be the case right now.

My other comment: as the online editor gets used by more people, you probably need to moderate inputs before committing them to a master database so that 1. we don't inadvertently screw something up and 2. we guard against some malicious event (not that I am expecting that, but stranger things have happened).

Standing by to help, but we have to get this a bit less technical.
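The core operation being described, flipping a single bit at a chosen position in a payload, is small enough to sketch. A minimal illustration (assuming MSB-first bit numbering, which is the usual convention for bitstreams; the editor's actual internals may differ):

```python
# Sketch: invert one bit at a given bit position inside a payload,
# so changes can be tried one part of the payload at a time.

def flip_bit(payload: bytes, bitpos: int) -> bytes:
    """Return a copy of payload with the bit at bitpos inverted (MSB-first)."""
    buf = bytearray(payload)
    buf[bitpos // 8] ^= 0x80 >> (bitpos % 8)  # XOR toggles exactly one bit
    return bytes(buf)

print(flip_bit(b"\x00\x00", 0).hex())  # '8000'
print(flip_bit(b"\xff", 7).hex())      # 'fe'
```

Flipping the same bit twice restores the original payload, which is what makes this kind of trial-and-error editing safe to undo.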
1 - Do you think we need better tutorials for using the editor? What would you consider better?
2 - Do you think the online editor should be more intuitive to use? What is not intuitive right now?
3 - Do you need more feedback when flipping single bits or changing bit positions? Do you need to see the actual bits? What kind of feedback, apart from the resulting frame?
4 - Do you need a better explanation of MPEG4? Do you want to see a concrete example based on the actual video data?
5 - Do you want simpler and more concrete tasks to do? It's now divided into roughly 300 parts. Should we divide it more? Or should it be divided differently?
I started by looking at the work done by other people and trying to figure out how they'd done it... moving the blocks around and nulling out others was self-explanatory, but I couldn't figure out how in the world people found the correct start positions for blocks in the I-frames when they're wrong, and honestly I'm still not sure about that part, or whether it's more than trial and error. So this is the one thing I would definitely like some more information on.
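If it is trial and error, the mechanical part of it, testing a candidate start position by re-aligning the bitstream, can at least be automated. A hedged sketch (the idea of shifting off leading bits and re-decoding is mine for illustration; judging whether the decoded block looks right is still done by eye):

```python
# Sketch: to test a candidate block start position, drop the first n bits
# of the data and repack the rest MSB-first, then feed it to the decoder.

def shift_bits_left(data: bytes, n: int) -> bytes:
    """Drop the first n bits of data and repack the remainder, zero-padded."""
    as_int = int.from_bytes(data, "big")
    total = len(data) * 8 - n
    if total <= 0:
        return b""
    as_int &= (1 << total) - 1          # drop the leading n bits
    out_len = (total + 7) // 8
    as_int <<= out_len * 8 - total      # left-align, pad the tail with zeros
    return as_int.to_bytes(out_len, "big")

print(shift_bits_left(b"\x0f\xf0", 4).hex())  # 'ff00'
```

Looping n over a small range of bit offsets and decoding each variant would turn "guess the start position" into "pick the least-broken frame out of eight".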
And to follow the wall of text, a bit of eye candy showing some of the repair process. I saved the image I was working on periodically, to check chroma values at a higher saturation and to make sure I wasn't missing too much, and I decided to keep the intermediary images and animate the process. There's no new data in the frame, just a lot of luminance and chroma adjustments. You can see the left-right, top-down progression and then a final pass on chroma issues; there are roughly 10 changes in each frame.
I recognize the data is compressed; what I keep struggling with is why there is all that padding you are finding in the stream.
Stunning!
Amazing! I'd note that you guys need to think about the next milestone point; then we should think about another updated video, and we'll get that out there like the previous one.

We're not going to have Elon tweeting every time, but when we get to the point we've done all we can, we can show the milestones.

When we do get to that "we've done all we can" point, then we should consider an article on site. Heck, we have Elon quotes now! I'll need a LOT of help to translate the technical to something readable by Joe Public, quote some of the team here, use the Elon quotes, etc.

Also, it was suggested to me we could also have a highly technical overview as a press release style write-up. I'm cool with all of that, obviously. Something to keep in mind.

PS Per the next milestone video. What we absolutely should do is a "where we are" before the ORBCOMM launch, as that launch is probably going to have a good video beamed back. Regardless, this is the historic first, so there's no value lost on this effort if ORBCOMM sends a good video back.
Quote from: arnezami on 06/02/2014 09:11 am
Stunning!
I think there's going to be some SpaceX people weeping for joy this morning when they see that. Just... wow.
Quote from: Chris Bergin on 06/02/2014 12:32 pm
PS Per the next milestone video. What we absolutely should do is a "where we are" before the ORBCOMM launch, as that launch is probably going to have a good video beamed back. Regardless, this is the historic first, so there's no value lost on this effort if ORBCOMM sends a good video back.
I think the Orbcomm video, even if it's solid, will be in darkness. So this video will remain unique even if the Orbcomm video is good.