A local user lives in an area with a lot of imagery. So they are filling the few paths with no imagery at all, and the roads with older imagery. However, some of the existing imagery is really low quality. Is it possible to filter images based on their quality? I imagine an AI/human filter approach could classify images, at least flagging shaky, low-res, or dark ones.
I agree with your request. A good starting point would be for Mapillary to better highlight areas with older imagery.
- Older is not always low quality
- Low quality may be the only imagery available, e.g. on hiking/foot trails there is almost no imagery. You will have to cherish that, even if the quality is low.
Optional filters on age and quality (if that can be measured) would be nice to have, but I would like them to be gradual and easy to set, e.g. a separate slider for each.
But nothing stands in the way of uploading better quality imagery if one happens to go that way anyhow - it will help to know which ways are affected, and that is the request I read in the OP’s post.
It would be helpful to know that the previous imagery is at a low resolution, with perfectly exposed cumulus clouds (but barely any detail in objects on the ground) and motion blur, perhaps photographed around nightfall, driving towards the sunset. Documenting that way again with a better camera at a different time of day would make sense, and for me would tip the balance in favour of taking that route.
Who is “they”?
I cannot reply at the correct place; the forum says “Body is too similar to what you recently posted”.
That is exactly why it would be nice to be able to filter separately on them. The website already has a way to filter out older photos. If one could also filter out bad quality, it would be easy to plan a trip to capture images for the most needed areas.
Another nice feature would be a more “in your face” representation of direction(s) in which the street was traveled. This would allow one to plan a trip going in the opposite direction.
@joostjakob, it’s not as slick, but if we could create more complex filters, ones with multiple date ranges and NOTs, you may be able to cobble something together. E.g. any pictures NOT taken by user123 AND NOT in range 01/01/2001 to 01/01/2002.
“A local user lives in an area with a lot of imagery. So they are filling the few paths with no imagery”
This “they” is the very same local user. It’s a bit of a weird construction, I admit, but I think it’s correct.
Your answer was in the correct place I believe.
It is not in Flanders, I suppose.
I know the bad guys there.
Haha, if I recall the original question correctly, it’s actually about some of your earlier work
(if you take a few million pictures, there’s bound to be a few ugly ones)
I still do not understand.
Can anyone else explain the sentences?
Read my profile. I ask for help to delete them. In the past it was not possible.
I know someone who made very bad pictures in the past.
I just asked for deletion of 84 overly fuzzy pictures from this sequence from 2015. I accidentally came across this sequence today. I am not at all to blame for these bad pictures. That is the history of Mapillary.
On the other hand, such things still happen. I am reluctant to mail or expose the culprits. I have been severely insulted on this forum for doing so.
It might help optimize collection planning from the community if Mapillary could help identify areas that need collecting (either they don’t have any images, or the images are too old/low quality).
Basically, right now the map shows where imagery HAS been collected (i.e. green lines). What if that were flipped, to help point mappers to places where collection is most needed? Mapillary best understands the imagery requirements of their algorithms and customers… It would help remove some of the guesswork from the community.
Bonus points for helping calculate the optimal route for collection
[quote=“koninklijke, post:5, topic:3111”]
perhaps photographed around nightfall, driving towards the sunset[/quote]
This sounds like a pretty easy problem to solve. SunCalc.js can give the sun’s altitude and azimuth for a given location and time; assume phone cameras have a 70° horizontal and 50° vertical field of view, and flag images where the sun is under 25° above the horizon and within 35° of the camera’s heading.
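A minimal sketch of that flagging rule, assuming the sun position has already been obtained from SunCalc’s `getPosition(date, lat, lon)` (which returns radians) and converted to degrees with a north-based bearing; the function name `likelySunGlare` and the reading of the two conditions as a conjunction are my own:

```javascript
// Smallest angular difference between two compass bearings, in degrees.
function bearingDiff(a, b) {
  const d = Math.abs(a - b) % 360;
  return d > 180 ? 360 - d : d;
}

// sunAltitudeDeg: sun elevation above the horizon (degrees).
// sunAzimuthDeg, cameraAzimuthDeg: compass bearings from north (degrees).
// Flags images where the sun is both low and near the camera's line of sight.
function likelySunGlare(sunAltitudeDeg, sunAzimuthDeg, cameraAzimuthDeg) {
  const lowSun = sunAltitudeDeg > 0 && sunAltitudeDeg < 25;
  const facingSun = bearingDiff(sunAzimuthDeg, cameraAzimuthDeg) < 35;
  return lowSun && facingSun;
}
```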
SunCalc would be usable if it were integrated into OsmAnd.
I do not take pictures facing a low sun or in low light when it is a third pass, unless I have a reason.
And there are no good photographers east of me.
It’d be reasonably easy to add this to OsmAnd. You’d just need a developer to use their API key to do the filtering and add tracks on top of OSM as tiles are requested, and add that as a custom map source. I’ve never done anything like that, I might have a stab at it.
Ideally the Mapillary app would support custom map layers, then we’d be able to see it from within the Android app too.
The Mapillary map does not even have a distance scale.
with perfectly exposed cumulus clouds (but barely any detail in objects on the ground
Been thinking about this one too. This ought to be possible by:
- shrinking the image (to reduce processing)
- splitting it into squares (to identify regions)
- putting each square through an edge-detection filter (a GPU-friendly one for maximum speed)
- resizing each square down to a 1×1 pixel holding the average colour of the region (which now represents its detail level)
If the squares at the top of the image are brighter than the bottom, then all the detail is in the sky and you’ve likely got happy clouds and grumpy ground. If they’re all dark then it’s likely a blurred image. If they’re black then they’re so blurred that they’re likely useless and are good candidates for deletion.
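The classification step could look something like this: a sketch assuming the earlier steps have already produced a grid of per-square edge strengths (0–255), with made-up thresholds purely for illustration:

```javascript
// grid: rows of per-square edge strengths (0-255), top rows first.
// Returns a rough label following the heuristic described above.
function classifyDetailGrid(grid) {
  const half = Math.floor(grid.length / 2);
  const avg = cells => cells.reduce((s, v) => s + v, 0) / cells.length;
  const top = avg(grid.slice(0, half).flat());    // sky half
  const bottom = avg(grid.slice(half).flat());    // ground half

  if (top < 10 && bottom < 10) return "blurred-useless"; // near-black everywhere
  if (top < 40 && bottom < 40) return "blurred";         // little edge detail anywhere
  if (top > bottom * 2) return "sky-detail-only";        // happy clouds, grumpy ground
  return "ok";
}
```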
Also, to catch general low-light images: I don’t think cameras pick up vivid colours in low light, so look at each pixel and see how far any of the red, green and blue channels are from their average (grey); call that vividity. If no pixel is very vivid and all are pretty dark, then it’s probably a poor-quality image.
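That vividity check can also be sketched in a few lines; the threshold values are illustrative assumptions, not tuned numbers:

```javascript
// "Vividity" of a pixel: how far its most extreme channel sits from the
// pixel's grey average. Grey pixels score 0; saturated colours score high.
function pixelVividity([r, g, b]) {
  const grey = (r + g + b) / 3;
  return Math.max(Math.abs(r - grey), Math.abs(g - grey), Math.abs(b - grey));
}

// pixels: array of [r, g, b] triples (0-255). Flags an image as likely
// low-light when every pixel is both dull and dark.
function likelyLowLight(pixels, { maxVividity = 20, maxBrightness = 60 } = {}) {
  return pixels.every(p => {
    const grey = (p[0] + p[1] + p[2]) / 3;
    return pixelVividity(p) < maxVividity && grey < maxBrightness;
  });
}
```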