Bi-directional sequence - proper handling?

Hi there,
I am about finished testing the sampling and processing of imagery data with mapillary_tools and have tested a small upload of pre-processed data.

Now I’ve got some questions to clarify.

My setup:

  1. Front-facing dashcam Blackvue DR650S-1CH (with built-in GPS logger)
  2. Rear-facing BlueSkySea B1W dashcam (without built-in GPS logger)

I have prepared a set of frames, 2 fps from each camera, aligned with one another.
The GPS log from the front cam was used to geotag both front and rear camera imagery.
Rear-cam imagery was processed with --offset_angle 180.

As a result, for every point I have 2 images, 1 front + 1 rear, both having IDENTICAL EXIF date/timestamps and coordinates; the only difference is the angle (rear is offset by 180).
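
To make the pairing concrete, here is a minimal Python sketch of the idea (my own helper code, not part of mapillary_tools, with hypothetical names): pair front and rear frames by their shared timestamp, and derive the rear heading as front heading plus 180, which is what the --offset_angle 180 step achieves.

```python
# Minimal sketch (my own helpers, not mapillary_tools code): pair the 2 fps
# front/rear frames by timestamp and apply the 180-degree heading offset.

def rear_heading(front_heading_deg: float, offset_deg: float = 180.0) -> float:
    """Compass heading of the rear view: front heading plus the offset, wrapped to 0-360."""
    return (front_heading_deg + offset_deg) % 360.0

def pair_frames(front, rear):
    """front/rear: lists of (timestamp, filename) sampled at the same 2 fps.
    Returns (timestamp, front_file, rear_file) for timestamps present in both."""
    rear_by_ts = dict(rear)
    return [(ts, f, rear_by_ts[ts]) for ts, f in front if ts in rear_by_ts]

# Example: front camera pointing WSW (250 deg) -> rear view points ENE (70 deg).
assert rear_heading(250.0) == 70.0
```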

I have uploaded a sample set of 15 frames front + 15 frames rear.
My thought was that Mapillary would automatically stitch them together, and the user would be able to "switch" between front and rear while following the sequence, and would also be able to see that the points are 'two-directional' (the compass would show 2 sectors).

But as a result I’ve got 2 separate sequences:

  1. Front
  2. Rear

Now, the questions:

  1. When a user clicks on one of the points on the map, they default to the 'rear' sequence, meaning they can't see the front unless they 'rotate' using a navigation arrow (which is not always present, by the way).
    What should I do to make the front-facing sequence show as the default one?
    I can only suspect that images uploaded later override the earlier ones, so maybe queueing the rear first, followed by the front, would help. But this seems odd…

  2. Something weird happens with the navigation arrows in the rear sequence. The angles (compass) in the images are set correctly, as you can see. But for some reason, after processing, Mapillary has altered the movement direction somehow and it does not match reality. So the forward/backward navigation arrows are not aligned with the road/driving direction.

    I guess the issue is related to 'north' not being aligned properly by Mapillary (it differs between front and rear by approx. 60 degrees, not 180 as it should); a quick way to double-check the source EXIF is sketched after this list.

  3. Having said the above, does it really make sense to spend extra effort on two-directional processing?
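
For what it's worth, here is a quick way to verify the headings in the source files before upload; this is my own snippet (assumes exiftool is installed, file names are hypothetical), just to rule out the local EXIF before blaming server-side processing.

```python
# Sanity check: read the compass direction actually written to a front/rear
# pair and confirm they differ by ~180 degrees in the source EXIF.
import subprocess

def exif_direction(path: str) -> float:
    """GPSImgDirection in degrees, read with exiftool (-n numeric, -s3 value only)."""
    out = subprocess.check_output(
        ["exiftool", "-n", "-s3", "-GPSImgDirection", path], text=True)
    return float(out.strip())

def heading_diff(a: float, b: float) -> float:
    """Smallest absolute difference between two compass angles (0..180)."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

# Hypothetical file names from one front/rear pair.
print(heading_diff(exif_direction("front_000123.jpg"),
                   exif_direction("rear_000123.jpg")))   # expect ~180
```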

I may be misunderstanding what you're getting at. I see one short sequence that looks to be the rear view. It's set up and working as I'd expect.

I also see another that’s the front view. That’s working and running as I’d expect, too.

You’re not seeing more arrows because there aren’t other images uploaded at this time. Maybe you have some in process that aren’t showing on the map yet?

OK, let me explain.

  1. Go to the map where both sequences are posted, not via the direct links above but via the zoomed map view.

  2. When you hover over the points with the mouse, it defaults to the 'rear camera' sequence, showing only the rear images and their compass angles.
    So it is NOT visible that there is a front sequence available.
    I do not know how this works (I can only guess that points uploaded last override the other ones). In fact, I expect the front sequence to show by default, but I don't know how to achieve that during upload.


  3. The ONLY (unobvious for the end user) way to switch to the front sequence is to use the navigation arrows on the image screen.


I thought this would be implemented in a more obvious way: since the points in both sequences have the same coordinates, timestamps and user, they should be marked somehow (e.g. showing the compass sector in both directions, with the ability to switch by clicking on it). E.g. on the map view:


or on the image view (two available sectors are shown and the switch is activated by clicking on the respective sector):

  4. Finally, after server-side processing was applied to the rear sequence, geographic north is not set properly, resulting in navigation arrows not lining up with the driving direction and generating a lot of distortion/glitches when 'playing' the sequence with 3D transitions enabled (the default). During rendering, the images are treated as taken from the side, not from the back.

Hmm, with the new uploads it only got worse.
Now the front + back sequences are combined into subsets of 500 images.
And playback is awful: it switches between front and back on every frame.

OK, I give up for now (on processing and uploading footage from the rear camera).
I will request a clean-up of all my sequences from the servers and will re-upload only the front-facing ones.

As @allen writes, the images may require more processing time. Usually the navigation arrows are only added when Mapillary detects enough common features to add them using computer vision. Because there are no side pictures, that may not be possible. In other cases it has worked fine for me, but with side cameras, and sometimes it puts a 90-degree picture on the 180-degree arrow.

I think it makes good sense. You will get pictures of more things, so the more the better. How Mapillary handles it at this time is less important to me. They might fix some issues in a couple of years, or others may upload pictures that help stitch it all together.

Something messes up the EXIF on those, it looks like.
It should theoretically be showing a U-turn arrow for those.
I wonder if you could stitch those into a fake 360, filling the sides with black pixels.
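
If anyone wants to experiment with that idea, here is a very rough sketch of mine: paste the front and rear frames onto a black equirectangular canvas. The field-of-view values and file names are guesses, no proper reprojection is done, and for Mapillary to treat the result as a 360 image it would presumably still need the GPano XMP tags (ProjectionType = equirectangular) set afterwards.

```python
# Rough fake-360 sketch: paste front and rear frames onto a black
# equirectangular canvas. No reprojection is done, so geometry near the frame
# edges will be wrong; FOV values below are guesses, not camera specs.
from PIL import Image

CANVAS_W, CANVAS_H = 4096, 2048        # equirectangular: 360 x 180 degrees
H_FOV, V_FOV = 110.0, 70.0             # assumed dashcam field of view

def paste_view(canvas, frame_path, yaw_deg):
    """Paste a frame at the given yaw (0 = forward, canvas centre)."""
    frame = Image.open(frame_path)
    w = int(CANVAS_W * H_FOV / 360.0)
    h = int(CANVAS_H * V_FOV / 180.0)
    frame = frame.resize((w, h))
    cx = int(CANVAS_W / 2 + (yaw_deg / 360.0) * CANVAS_W)   # centre x of the view
    y = CANVAS_H // 2 - h // 2                              # centre on the horizon
    # paste twice so a view wrapping past the right edge is not clipped
    for x in (cx - w // 2, cx - w // 2 - CANVAS_W):
        canvas.paste(frame, (x, y))

pano = Image.new("RGB", (CANVAS_W, CANVAS_H))               # black background
paste_view(pano, "front_000123.jpg", yaw_deg=0)
paste_view(pano, "rear_000123.jpg", yaw_deg=180)
pano.save("fake_360_000123.jpg")
```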

Well, it's definitely not me :slight_smile: The original EXIF is fine, so it's the Mapillary AI playing tricks…
Even on the map, the image angle settings are correct.

I could do many things, and can go the extra mile if there is real value.
Even processing 2 directions (which requires a lot of extra effort to sync and process them) is already too much.

I will keep 1 direction (front-facing) and will see later.
Though I drive a lot across Europe, I definitely don't have the money to invest $500 in a DR900S-2CH or a GoPro setup.

I wonder when I'll get a response from support… I asked them to wipe all my uploads so I can upload clean (front-facing) data again.
Erasing 60 sequences manually is not how I want to spend this evening… and I heard that sequence deletion does not wipe the data completely from the servers.
@peter, could you please help with this?

It's disheartening, but I guess going for a simpler, front-only workflow might be better at this point.
I didn't know Mapillary's processing could actually alter EXIF unless you tell it to; weird.
You can get a decent action cam setup (2 cameras, Bluetooth GPS receiver) for about $250, but obviously there's no need when you've already got the dashcam.
Also, I just had a look, and at least a couple of your images seemed fine, as in, the U-turn arrow was actually in place and flipped the cameras.

OK, all the sequences in question were removed and only the front-facing sequences were re-uploaded.

To fill in here: in the backend we do not have any logic attached to the actual EXIF coordinates in terms of navigation. Instead, the logic is:

  • ingest the images and segment them to find volatile and moving objects (sky, cars etc.)
  • omitting the volatile regions of the images, put them through a Structure from Motion pipeline, which results in a corrected lat/lon/heading that fits best (within the accuracy radius) with the existing point cloud, considering the other images in the area
  • in the viewer, request the locally corrected camera positions and then decide which of all the possible cameras to display as the one to move to (see the illustrative sketch after this list); there could possibly be other navigation logic in MapillaryJS in the future, e.g. based on image age, color or time of day
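
To picture that last step, here is an illustrative sketch of the kind of decision the viewer might make; this is my own guess at the logic, not Mapillary's actual code, and the field names are made up. With SfM-corrected positions and headings, a "forward" arrow would go to the neighbour whose bearing best matches the current corrected heading, so a heading that is off by ~60 degrees would naturally produce arrows pointing off to the side.

```python
# Illustrative only: choose which neighbouring image a "forward" arrow leads to,
# given SfM-corrected positions and headings (dicts with lat/lon/heading).
import math

def bearing(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 to point 2, in degrees [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

def angle_diff(a, b):
    """Smallest absolute difference between two compass angles (0..180)."""
    return abs((a - b + 180.0) % 360.0 - 180.0)

def pick_forward(current, candidates, max_off_deg=45.0):
    """Return the candidate whose direction from the current image best matches
    the current image's corrected heading, or None (no forward arrow)."""
    best, best_off = None, max_off_deg
    for cand in candidates:
        b = bearing(current["lat"], current["lon"], cand["lat"], cand["lon"])
        off = angle_diff(b, current["heading"])
        if off < best_off:
            best, best_off = cand, off
    return best
```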

With this, what you see is probably the SfM not estimating the directions between images correctly, thus giving an off-navigation experience (a left arrow instead of a turning arrow).

We are working on making SfM more stable with respect to horizon estimation (important for ground-level estimation) and on giving more weight to images coming from the same sequence (right now every image has the same weight when merging the point cloud).

Sorry that your cases are not working out right now. We have been considering actually stitching images in the backend, but that leads to very inflexible results when new material enters an area, and it makes certain filters impossible (like filtering by angle or user).
