Misled by the GoPro SmartRC, I ended up shooting a video instead of a timelapse on the Fusion. I went ahead and stitched the 100 GB of that video, sliding the Yaw slider in Fusion Studio as usual to center the image. However, I missed that Fusion Studio centers the rendered video on the initial camera angle, so the yaw correction is useless, and I now have a host of frames that all share the same magnetic/gyro heading, which obviously isn't always the direction of travel (which is what I want).
I’ve had a look through pano editing options, and most of them are either Adobe (not paying for that) or Hugin (cumbersome for many images).
Is there an easy/quick option of adjusting yaw/centering of a panoramic image?
Yeah, but setting it up for just this one-off, seemingly simple task seems like overkill.
I am not very good with Hugin, but I also can’t seem to find an option to just edit an already-stitched image.
For equirectangular projection the yaw (horizontal rotation) is actually very easy to edit. Just shift the image left or right, cut the overflow from one side and paste it onto the other. The horizontal coordinate is simply longitude, so to know how much to shift, divide the image width by 360 and multiply by the number of degrees you need. You can automate it in any programming language like PHP or C++, and I guess some tools like ImageMagick can do it too, though I’ve never tried.
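That shift-and-wrap trick can be sketched in a few lines of plain Python (working on pixel rows as plain lists so it needs no libraries; a real image would go through PIL or similar, but the arithmetic is the same):

```python
def yaw_to_pixels(width, degrees):
    # Equirectangular: the full image width spans 360 degrees of longitude.
    return int(round(width * degrees / 360.0)) % width

def shift_yaw(rows, degrees):
    """Shift every pixel row left by `degrees` of yaw, wrapping the
    cut-off columns around to the right edge (rows = list of lists)."""
    px = yaw_to_pixels(len(rows[0]), degrees)
    return [row[px:] + row[:px] for row in rows]
```

ImageMagick’s `-roll` option does the same wrap-around shift in one command, if you’d rather not script it yourself.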
If it’s only yaw, you could also edit the EXIF value ‘GPSImgDirection’? There is software out there for it… (I built my “editor” in Perl CGI; there are modules for that too.)
There is a way to edit yaw in EXIF/XMP, I believe, but that only changes the metadata, so I’m not sure Mapillary will pick it up.
The trouble with trying to automate this for all images is that the correct/incorrect yaw will be different for each of them.
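Since every frame needs its own target yaw, one way to get it per frame is to derive the direction of travel from consecutive GPS fixes, assuming you can pull a lat/lon for each frame out of the telemetry. A minimal sketch (the standard initial-bearing formula, plus the per-frame correction as the difference between that bearing and the frame’s stored heading):

```python
import math

def travel_bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from fix 1 to fix 2, in degrees
    clockwise from north, in the range [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = (math.cos(phi1) * math.sin(phi2)
         - math.sin(phi1) * math.cos(phi2) * math.cos(dlon))
    return math.degrees(math.atan2(x, y)) % 360.0

def yaw_correction(stored_heading, target_bearing):
    # Degrees of yaw to apply so the frame faces the direction of travel.
    return (target_bearing - stored_heading) % 360.0
```

Feed each correction into the pixel-shift arithmetic from earlier in the thread (image width / 360 × degrees) and the loop over frames becomes trivial.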
Having run the Hugin assistant, I see it appears to detect/read yaw values relative to the first image. The GoPro stitcher, from what I know, tries to correct the yaw of each frame so that it is the same relative to the first image. If I can get that relative yaw for each subsequent image and set it to 0 (which in theory would be 0 relative to the first image, the only one the stitcher helpfully corrected as needed), I could be on to a winner.
That would be a good solution, but I guess I would still need to get the relative yaw value of each image (they’re screengrabs), so probably back to Hugin, as it seemed to be reading the direction of something.
UPD: it looks like Hugin searches for matches between every pair of images in the sequence, instead of matching all images to the first one. That makes sense, but it also makes this option unusable, so I am back to square one.
The Optimise function of Hugin/PTGui appears to be what I need (it seems to look at relative yaw), but I can’t get any stitching software to accept that I’m feeding it individual, already-stitched panoramas.
UPD2: I stand corrected; Hugin does seem to offer an option to output separate remapped images “for post blending”. Still, I can’t figure out how to normalise the relative yaw with Stack align. It seems to calculate some sort of relative position values from control points, but they are not quite the values I need to (manually) input to get each image facing the direction of travel.