Been thinking about weather for a while. We ought to be able to compute the sun’s position relative to the camera, and by extension whether the image is looking into the sun, or whether it’s day or night time, from the location, time and orientation of the shot, and use that to flag images that are likely bad ones. It could also give clues about the direction of shadows. I was thinking that encouraging night-time captures but filtering them out (at least by default) would make it much easier to detect street lighting and lit road signs… there’s a wealth of different data available at night.
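To make the day/night idea concrete, here’s a rough sketch of how you might derive the sun’s elevation from just the capture location and UTC timestamp, using the low-precision NOAA-style solar formulas. The function names and the twilight threshold are my own choices, not anything from an existing app; comparing sun azimuth against camera heading for the “looking into the sun” case would be a natural extension of the same maths.

```python
import math
from datetime import datetime, timezone

def solar_elevation(lat_deg, lon_deg, when_utc):
    """Approximate solar elevation in degrees (low-precision NOAA formulas).

    Accurate to well under a degree, which is plenty for a day/night filter.
    """
    doy = when_utc.timetuple().tm_yday
    hour = when_utc.hour + when_utc.minute / 60 + when_utc.second / 3600
    # Fractional year, in radians
    g = 2 * math.pi / 365 * (doy - 1 + (hour - 12) / 24)
    # Equation of time (minutes) and solar declination (radians)
    eqtime = 229.18 * (0.000075 + 0.001868 * math.cos(g) - 0.032077 * math.sin(g)
                       - 0.014615 * math.cos(2 * g) - 0.040849 * math.sin(2 * g))
    decl = (0.006918 - 0.399912 * math.cos(g) + 0.070257 * math.sin(g)
            - 0.006758 * math.cos(2 * g) + 0.000907 * math.sin(2 * g)
            - 0.002697 * math.cos(3 * g) + 0.00148 * math.sin(3 * g))
    # True solar time (minutes), then hour angle
    tst = hour * 60 + eqtime + 4 * lon_deg
    ha = math.radians(tst / 4 - 180)
    lat = math.radians(lat_deg)
    cos_zen = (math.sin(lat) * math.sin(decl)
               + math.cos(lat) * math.cos(decl) * math.cos(ha))
    return 90 - math.degrees(math.acos(max(-1.0, min(1.0, cos_zen))))

def classify(elev_deg):
    """Rough bucket using the civil-twilight threshold of -6 degrees."""
    if elev_deg > 0:
        return "day"
    if elev_deg > -6:
        return "twilight"
    return "night"
```

Filtering night shots by default would then just be dropping anything where `classify(...)` returns `"night"`, while still keeping those frames around for street-light and lit-sign detection.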
What would be cooler is if the app somehow stored local weather conditions in the EXIF metadata. This could give good clues about the sharpness of shadows, whether there’s likely to be rain on the road or windscreen, and how likely snow or ice is (I’ve got shots here that detect snow in Britain in mid-August, clearly a false detection). Gathering historic weather data after the fact would likely be expensive and awkward, but there are probably lots of ways to gather it on the phone at capture time for free, and it’d be a tiny amount of data compared to the pixels: well worth saving in case it’s useful to future detection models, or just for filtering out images of crappy days in my tourist town.
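There’s no standard EXIF tag for weather, so one option is to pack a small versioned JSON blob into something like the `UserComment` tag or an XMP sidecar. A minimal sketch of that payload, where every field name is my own invention rather than any standard:

```python
import json
from datetime import datetime, timezone

def weather_payload(temp_c, condition, precip_mm_h, wind_m_s, captured_at):
    """Build a compact weather record to stash alongside an image.

    The schema and field names here are illustrative only; the point is that
    the whole record is ~150 bytes, negligible next to the image itself.
    """
    return json.dumps({
        "schema": "weather-v1",             # version it, so future models can parse old captures
        "captured_at": captured_at.isoformat(),
        "temp_c": temp_c,
        "condition": condition,             # e.g. "clear", "rain", "snow"
        "precip_mm_h": precip_mm_h,
        "wind_m_s": wind_m_s,
    }, separators=(",", ":"))

payload = weather_payload(
    temp_c=14.2, condition="rain", precip_mm_h=1.8, wind_m_s=6.5,
    captured_at=datetime(2024, 8, 15, 11, 30, tzinfo=timezone.utc))
```

A future filtering pass (or a detection model sanity check, like rejecting “snow” classifications when the record says 14 °C and rain) would just parse this back out of the metadata.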