Scripts to detect blurriness, adjust exposure before uploading?

Hi,

I was wondering if anyone else does any editing of their photos before uploading besides what’s available at https://github.com/mapillary/mapillary_tools

I occasionally still have about 5% or so of my photos that are blurry and figured someone may already have a script to detect blurriness and delete those before uploading.

I also occasionally have some photos that would benefit from increased exposure before uploading, even after using the new Mapillary app’s exposure quick setting (which I find useful).

Regards,
Will

I use a slightly adapted version of this https://gist.github.com/shahriman/3289170 in my upload script to delete all images with a blur of 30% and above.
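For anyone who wants to roll their own filter: a common blur metric (I haven't verified that the linked gist uses exactly this) is the variance of the Laplacian, since sharp images have strong second-derivative responses and blurred ones don't. A minimal pure-Python sketch on a grayscale pixel grid; any cutoff you pick (like the 30% mentioned above) would have to be tuned per camera:

```python
def laplacian_variance(gray):
    """Blur metric: variance of the 4-neighbour Laplacian of a
    grayscale image (list of rows of 0-255 ints). Lower = blurrier."""
    h, w = len(gray), len(gray[0])
    responses = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            lap = (gray[y - 1][x] + gray[y + 1][x] +
                   gray[y][x - 1] + gray[y][x + 1] - 4 * gray[y][x])
            responses.append(lap)
    mean = sum(responses) / len(responses)
    return sum((r - mean) ** 2 for r in responses) / len(responses)

# A sharp checkerboard scores high; a flat grey patch scores zero.
sharp = [[255 if (x + y) % 2 else 0 for x in range(8)] for y in range(8)]
flat = [[128] * 8 for _ in range(8)]
print(laplacian_variance(sharp) > laplacian_variance(flat))  # True
```

In practice you would load each image with a library such as OpenCV or Pillow, compute the score, and delete (or set aside) files below your threshold before running the upload script.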

Regarding the white balance (colors/exposure), I considered using this in my toolchain: http://www.fmwconcepts.com/imagemagick/autowhite/index.php, but I don’t think it is right to do any image processing on the client side. From my point of view, pictures should be authentic, straight from the camera without modifications, when uploaded to the Mapillary server.
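For context on what such a correction does (the linked autowhite script is ImageMagick-based and works differently in detail): tools of this kind typically apply a levels stretch per colour channel. A toy sketch of the general idea, purely to illustrate the kind of modification being debated; the percentile cutoffs are illustrative:

```python
def stretch_channel(values, lo_pct=1.0, hi_pct=99.0):
    """Illustrative levels stretch for one colour channel (0-255 ints):
    map the lo/hi percentiles of the input to 0 and 255, clipping
    outliers. Auto white-balance tools do something like this per
    channel so a colour cast or underexposure gets normalised away."""
    s = sorted(values)
    lo = s[int(len(s) * lo_pct / 100)]
    hi = s[min(int(len(s) * hi_pct / 100), len(s) - 1)]
    if hi == lo:
        return list(values)  # flat channel: nothing to stretch
    return [max(0, min(255, round((v - lo) * 255 / (hi - lo))))
            for v in values]

# An underexposed channel (values 40..80) gets spread across 0..255.
dark = list(range(40, 81))
print(min(stretch_channel(dark)), max(stretch_channel(dark)))  # 0 255
```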

Hi @skorasaurus,

Detecting blurriness automatically to a high degree of accuracy is difficult to achieve. We’re working towards doing this on our backend so that you don’t have to worry about it.

I agree with @moenk that all processing should be done by Mapillary on the server; then they can redo it whenever they improve it. I manually delete blurry photos, sometimes everything with a shutter speed below 1/50, as well as pictures taken too close together. But the latter would probably be useful sometimes, and I definitely don’t catch all the bad pictures.

The additional twist that I think is really important here is that, for some purposes, even quite a high level of blurriness doesn’t really matter. E.g. you can quite easily tell whether a road is paved even from a heavily blurred picture. The same goes for various landuse questions, whether a bridge (or just a ford) crosses a waterway, etc.

Of course, in a perfectly saturated Mapillary world, blurry is worse than sharp. But when the choice is between a blurry photo and none at all in a given place, blurry is much better for many uses.

This brings me to the point that essentially all quality control and adjustment of photos should happen on the server side. And the threshold for outright deleting or hiding should be high; simply prioritizing what is shown over something else is preferable. There could also be a toggle for “typical user” vs. “heavy user”/“mapper”/what_not, with different filters/priorities for each.

EDIT: addition of some examples:

Having noted the above, I too have sometimes vetted my photos before uploading. What I try to do is evaluate whether a photo adds anything valuable over the photos before or after it. If not, I delete it. But if it adds even quite a little (like the above-noted implication that e.g. the pavement continues without breaks), then I don’t.

Just my 2 cents.

Jaakoh,

Very good points. In many contexts, you can still gain valuable information and knowledge from blurry photos.
In less developed places like Nicaragua, which have limited spatial resources (current, high-resolution aerial imagery, shapefiles of road geometry from the government, etc.) and no Google Street View either, Mapillary photos, even blurry ones, provide knowledge that simply wasn’t there before.

My use cases for Mapillary are a bit different from yours (I often extract information from POIs on the side of the road that wouldn’t be legible in blurry photos), and they have convinced me that, when in doubt, users should generally upload the blurry photos and let the server make the blurriness determination. Perhaps in the future users could even set how blurry the photos they view may be (a big wish of mine :) )

This sequence of Cleveland that I took yesterday afternoon has some good examples of photos (particularly in the latter half, like here) that would benefit from some increased exposure to reveal a bit more POI information.

If you’re curious, I’d recommend watching the entire sequence from where it starts, here in downtown Cleveland, and hitting the play button (it takes 3-5 minutes to watch). It shows the Rock and Roll Hall of Fame in the background of a few photos, some of downtown Cleveland, and some of the varied housing styles here in Cleveland.

Regards,
Will

@skorasaurus: What type of camera do you use? Smartphone with the Mapillary app or a conventional camera? What kind of blur do you experience (caused by shaking or by wrong focus distance)?

I use a Canon S100 compact camera that I can program using a CHDK script ( https://github.com/ltog/mapillary_utils/blob/master/mpllry.lua ).

To prevent blur due to shaking I use short exposure times (around 1/500 to 1/1000, depending on context). To prevent blur due to wrong focus distance (often caused by the camera focusing on the car’s windshield), I used to filter on EXIF data (exiftool, value “Focus Distance Upper”). Later I was able to adjust the script to not take the picture in the first place if the focus distance is too short; the camera then tries to refocus instead.
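The EXIF filtering step above could look something like this once the tag values are in hand (e.g. dumped with `exiftool -FocusDistanceUpper -j *.jpg` and parsed into Python). The 1.0 m threshold and the file names are my illustration, not the poster’s actual values:

```python
def keep_photo(focus_distance_upper_m, min_distance_m=1.0):
    """Reject photos whose upper focus distance is suspiciously short,
    i.e. the camera likely locked focus on the windshield.
    The 1.0 m cutoff is illustrative; tune it for your setup.
    None means the tag was missing, so we keep the photo."""
    return (focus_distance_upper_m is None
            or focus_distance_upper_m >= min_distance_m)

# Hypothetical values as exiftool might report them (in metres).
photos = {"a.jpg": 0.4, "b.jpg": 6.5, "c.jpg": None}
kept = [name for name, dist in photos.items() if keep_photo(dist)]
print(kept)  # ['b.jpg', 'c.jpg']
```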

I can definitely see the need for exposure adjustment in your linked sequence, Will. The single posted photo is a prime example, but the whole sequence is a good one too, and one that I would assume to be reasonably simple to identify as a candidate for adjustment, both from the technical characteristics of the photos and from when and where they were taken (latitude + date + time of day).

The workaround/hack I use (until there are such tools/algorithms on the server) is to point the camera a bit more towards the ground. This makes the hood of the car visible in the photos, but when the (visible) horizon is above the centre line (so that the brighter sky covers less than half of the photo), it helps the camera app pick a better exposure level. Surely at the expense of an overexposed sky, but there’s not much up there we’re interested in, is there?
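A first-pass screen for the exposure problems discussed in this thread can be as simple as a mean-luminance check. This is a crude sketch of how a server-side (or pre-upload) filter might flag candidates; the thresholds are made up for illustration, and a real tool would use histograms rather than a single mean:

```python
def exposure_flag(gray, dark_thresh=60, bright_thresh=200):
    """Classify a grayscale image (list of rows of 0-255 ints) as
    'under', 'over' or 'ok' by its mean luminance. Thresholds are
    illustrative; a production filter would look at the histogram."""
    pixels = [p for row in gray for p in row]
    mean = sum(pixels) / len(pixels)
    if mean < dark_thresh:
        return "under"
    if mean > bright_thresh:
        return "over"
    return "ok"

# A uniformly dark frame gets flagged for an exposure boost.
dark_frame = [[30] * 4 for _ in range(4)]
print(exposure_flag(dark_frame))  # under
```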