The smart part in the Mapillary software is SfM. I am doing some preliminary testing with 360 images I took with my LG360 and am working on an implementation for my village, but I want/need a good workflow before I really get to work on creating the imagery. In those tests I “stumbled” upon SfM (structure from motion). They use that tech to create the cool transitions between your images, and with it they also refine the GPS position used in the placing and animation structure.
The question/bug I wrote here is linked to your question: the SfM software essentially corrects the yaw! They just don’t show that on the map!
They also don’t use the corrected GPS position in the API data (my bug report).
Now, if they are able to correct the GPS position, I expect they could correct the pitch and roll as well… if they would/could, that would be perfect!
I feel like this is a limitation of the (web) viewer rather than the SfM stack behind Mapillary. It was designed with smartphone pictures in mind, which always have this FoV/heading match that makes it easy to show them in a natural order, looking the right (forward) way.
It appears 360 imagery is just left as-is in the viewer.
But then I wonder: how does SfM/object detection work on an image that hasn’t got any compass EXIF data (although I thought the Fusion has a magnetometer)? Does Mapillary know where the objects in the image are in relation to the camera/world if you manually fix the compass heading (e.g. normalise the sequence; does that even work for panoramas, @gpsmapper?)
This is exactly the question/topic I want to get more clarity on.
I do not understand how the Mapillary AI/SfM works in relation to Fusion panos.
I have tried normalization with another test sequence,
and it does have some positive effect on the default behavior of the FIRST panorama image you open from the sequence.
The initial FOV angle seems to be set to match the compass angle in the sequence, but it is kept fixed when you play the sequence or switch to another image from the same sequence (you can try it). BUT if you open another image from the normalized sequence (I was testing with direct links using the ‘share’ function), the FOV will be set to the compass angle.
TEST images from the sequence: Link 1 Link 2
As you can see, both are opened with the FOV matching the compass angle (direction of movement).
So, it seems that my feature request above is
a. doable
b. makes perfect sense
Still, I don’t understand how Mapillary finds the correct orientation of a panorama image in space. Both the ‘lens facing side’ setup + 90° manual yaw correction in Fusion Studio (first sequence above) AND ‘lens facing front’ (second sequence) + 90° manual yaw correction (resulting in the panorama centre on the side) are recognized correctly by Mapillary.
Keeping in mind that the Fusion does not write any extra data to EXIF, like heading or orientation, it looks like magic…
The heading of 360 images can sound confusing, but it’s actually simple. The heading Mapillary expects in a 360 image is the heading at the center of the image, just like in images with other projection types.
So all you need to check when you upload images is that the center of the image is pointing at the stored direction.
As for the GoPro Fusion: although it does have a magnetometer, we currently have no way to read it in our tools, so it’s best to just interpolate directions, i.e. derive the directions from the GPS points.
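For anyone wondering what interpolating directions from GPS points amounts to, here is a minimal sketch in Python (function names are mine for illustration, not Mapillary’s actual tooling): each image gets the great-circle bearing towards the next GPS fix, and the last image reuses the previous bearing.

```python
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(x, y)) % 360

def interpolate_directions(track):
    """Assign each GPS fix the bearing towards the next fix; the last one
    keeps the heading of the preceding leg."""
    headings = [bearing(*track[i], *track[i + 1]) for i in range(len(track) - 1)]
    if headings:
        headings.append(headings[-1])
    return headings
```

So as long as the centre of each uploaded pano faces the direction of travel, these bearings line up with what the viewer expects.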
@gpsmapper if the direction of your images was always correct even with different uploaded directions, it’s probably because SfM corrected the orientation values.
OK, it’s probably easier to accept this as a given, without going into details and trying to understand how it works.
@josealb
Do you think it would be possible to implement the ‘dynamic’ FOV for panorama sequences?
This is the sample normalized sequence:
The heading (compass angle) matches the direction of movement, and the panorama source images are centered accordingly.
When you open any of the images from this sequence using a direct link (e.g. links 1, 2 above), you see the panorama centered, facing the direction of movement. BUT when you use ‘playback’ or just change to another image from this sequence manually, the FOV remains fixed (= the FOV of the first image you opened) and does not change in line with the heading.
So, I think it would make a lot of sense to implement something like:
use the compass angle as the default FOV during sequence playback. If a panorama sequence is normalized correctly, you could then play it back with the FOV always set correctly and adjusted dynamically, i.e. keeping focus on the direction of movement.
Right now it is fixed: you can rotate the image to apply a different FOV, but during playback the angle stays fixed, so if the sequence changes direction, the direction of movement can go out of ‘focus’.
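A rough sketch of what such dynamic FOV tracking could look like, assuming the viewer exposes the current view yaw and each image’s compass heading in degrees (all names here are illustrative, not the actual viewer API): on every image change, rotate the view towards the new heading along the shortest arc, so the camera smoothly follows the direction of travel.

```python
def shortest_rotation(current_deg, target_deg):
    """Signed rotation in (-180, 180] that takes current_deg to target_deg."""
    delta = (target_deg - current_deg) % 360
    return delta - 360 if delta > 180 else delta

def follow_heading(view_yaw, image_heading, step=1.0):
    """Nudge the view yaw towards the new image heading by at most
    `step` degrees per frame, taking the shortest way around."""
    delta = shortest_rotation(view_yaw, image_heading)
    clamped = max(-step, min(step, delta))
    return (view_yaw + clamped) % 360
```

The shortest-arc logic matters: going from 350° to 10° should rotate +20°, not spin -340° the long way around.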
@gpsmapper, you are right, that is a known issue when playing back 360 sequences.
The good thing is you can use the mouse to rotate the images while playing.
The images do need to be 360-images… otherwise it’ll get messy I’m afraid
*) I needed to put in a time delay, as you can see in the numbers jumping all over the place on the right; upon node change the x and y values “jump” to the right value only after a moment…
I hope that it will not take ages to get something similar and elegant implemented in the ‘out-of-the-box’ viewer app.
But from my experience, usually when you hear something like
it means that it’s not really on the radar… and in the best case it’s parked in the queue with low priority…
I hope this would be an exception
Yeah, well, they are probably hammered with all of the dev work on the marketplace and the apps anyway.
I wonder if the web viewer is on GitHub; @eesger might be willing to send a PR?
@gpsmapper so, it looks like simply centering the equirectangular JPEG and then normalising on upload at least points the little cone the right way? The FOV indicator doesn’t seem to display any sort of “image heading” info, so the cone is the only thing to rely on when determining where the sequence is going.
I guess if you don’t fix the heading and upload something that is 90° off the actual camera heading, either the SfM magic will kick in as described by @josealb (presumably deriving the relative movement direction from GPS + SfM), or it will all be horribly broken (like one of those sequences where someone fixed the heading in the app the wrong way).
Will need to actually get round to testing this. Or maybe wait for the new Fusion…
Well, I went ahead and ordered a Fusion 360 today, and upgraded my computer’s video card yesterday so it can process the images. I’m also pondering creative mounting ideas. Imagine something jointly designed by Cecil B. DeMille and Rube Goldberg, and you’ll be close to what I’m thinking.
Frankly, reading through the posts about the camera, I was hesitant about buying this camera. I would like to mount it sideways, more to avoid the &*(^&#!! bug splatter on the lens than for wind drag (this is not an airplane, after all). But an easy answer to the question of what setting(s) to change to get the front of the rendered images pointing forward seems elusive.
It would seem to me that if the camera is pointed 90 degrees right of the true heading, the offset should be -90 (or possibly 270, I suppose). Am I missing something?
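That arithmetic checks out: -90 and 270 are the same yaw offset modulo 360. A tiny illustration, not tied to any particular GoPro or Mapillary tool (the function name is mine):

```python
def apply_yaw_offset(recorded_heading, offset_deg):
    """Corrected image-centre heading after applying a yaw offset, in [0, 360)."""
    return (recorded_heading + offset_deg) % 360

# Camera mounted 90 deg right of travel: the image centre reads travel + 90,
# so an offset of -90 (equivalently +270) recentres it on the direction of travel.
```

In other words, either -90 or 270 recentres the pano; they land on the same result.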
Ideally, one would think the stitching app would be able to do this when rendering the files. But one could also be wrong. I haven’t looked at the app because I don’t have the camera; and trying to figure out an image-processing app without images to process tends to be wasted time.
In summary, I really don’t care about the “why” so much as the “how.” What do I have to do to correct the images so they will be most useful? Is it something I should do before capture, in the app, in the uploader, or some combination thereof?
@GeekOnTheHill if I understood you correctly, you are referring to the yaw correction mentioned above. This is done in Fusion Studio pre-rendering, to point the centre of the image in the direction of travel (let’s all hope the Mapillary devs find a way to read the GoPro compass EXIF so we don’t have to do it…)
Finally got round to rendering a test sequence. I have to say Fusion Studio doesn’t make it terribly obvious how to export time-lapses as JPEGs.
Cropping out unwanted parts of the image was fairly easy and seems to have cut image size by 90% for 10% off the nadir. I cut myself out, and Mapillary’s stitching on the nadir is a bit funny now, but otherwise it seems OK.
As feared, a consumer 360 camera has disappointing image quality, so I plan to do 360 shots only for general direction awareness. The Pilot Era might be interesting with 8K, but prices for 360 cameras that shoot good images (and mostly don’t even have an on-board location chip) are too high; even used Insta Pros are around 2k, without the GPS dongle. I could rant about this for a while…
Otherwise, the Fusion seems like the best Mapillary 360 camera, almost plug and play
I ordered some super heavy-duty magnets ($4.99 each at Harbor Freight) for my Rube Goldberg vehicle mount, which exists only in my head at this point. Each magnet is rated at 95 pounds, but I’ll lose a few pounds with the anti-mar covering. Still, I think it will be suitably overkill if I use even three of the four. I’ll use a safety strap just in case, though.
Once I get the thing mounted I’ll get a bit of footage to play with for a while before uploading.
Some footage I’ve seen with the Fusion is excellent, and some is crap. I wonder how much has to do with the SD cards. I have a dashcam that will work with almost any SD card, but the image quality is crap with the El Cheapo jobbies.
Around here, the scenes may look barren to humans, but they’re pretty complex for a machine. It will be interesting to see how well the camera handles the forests.
I can send you some files if you want to test out fusion studio beforehand, or you can get some from 360rumors.
Didn’t think about magnet mounting, as I’m a bit afraid of paintwork damage, but it sounds quite sturdy.
Most of the footage I’ve seen from Fusions looks OK zoomed out (as expected), but loses all detail when zooming in. Colours are OK and dynamic range exists, but the detail is just not there, and no SD card would fix that.
If the rumours are true about the next fusion, we shouldn’t expect a bump in image quality…
Thank you. I appreciate the offer. However, the camera will be arriving tomorrow, the good Lord and UPS willing, so I should have some footage to play with quite soon.
I built most of the mount today and posted a link about it here. I still have to finish the mast (the one in there was just to align it while cementing the pipes), install a U-hook for the safety strap, and paint it. But it’s pretty much done.
I wound up picking up the magnets at Home Depot because Harbor Freight was taking their time shipping the order, which I wouldn’t have minded had I not spent almost as much on the 2-day shipping as the magnets cost. The cost was pretty much a wash save for the fuel and the time.
If you cover the bottom of the magnets with something very thin, like self-adhering shelf liner, and make sure the roof is clean (some wax doesn’t hurt, either), the magnets don’t cause any damage. I used to use magnetic antennas for two-way radios, and they never caused any problems. (And if they do, that’s what polishing compound is for.)
Hi all!
Does somebody have any idea how to batch-export panoramas in Fusion Studio? I’ve only seen a single-image export option.
I’m using the latest version of Fusion Studio (V 1.3.0.400) on Windows 10.
If you haven’t found it yet: you need to go to the “rendering” stage, and there, under export options among the video formats, you’ll find JPEG and possibly RAW. Need a screenshot?