Today I saw something that I haven’t seen before. A face on a traffic sign had been blurred, as usual. But what was new is that not the entire rectangular blur area (the one that becomes visible when entering the blur editor) was blurred, only the face itself. In my experience, when an area gets auto-detected or manually set to be blurred, all of it gets blurred.
The question is: Is this something new, so that the AI now recognizes the exact outline of the face/license plate and only blurs that part? Or has this always been the case and I just haven’t noticed it until now?
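To illustrate what I mean, here is a minimal, hypothetical sketch (my own illustration, not how the actual pipeline works): the old behaviour blurs everything inside the rectangular detection box, while mask-based blurring only touches the pixels of a tight face mask inside that box. All function names and the dummy image are assumptions made up for the example.

```python
# Sketch only: box blur vs. mask blur. Not the real blurring pipeline.
import cv2
import numpy as np

def blur_box(image, box, ksize=31):
    """Old-style behaviour: blur everything inside the rectangular detection box."""
    x, y, w, h = box
    out = image.copy()
    out[y:y + h, x:x + w] = cv2.GaussianBlur(out[y:y + h, x:x + w], (ksize, ksize), 0)
    return out

def blur_mask(image, mask, ksize=31):
    """New-style behaviour: blur only where the per-pixel mask (e.g. the face outline) is set."""
    blurred = cv2.GaussianBlur(image, (ksize, ksize), 0)
    mask3 = mask[..., None] > 0            # broadcast single-channel mask over the color channels
    return np.where(mask3, blurred, image)

if __name__ == "__main__":
    img = np.full((200, 200, 3), 255, np.uint8)          # dummy white image
    cv2.circle(img, (100, 100), 40, (0, 0, 255), -1)     # a stand-in "face"
    box = (50, 50, 100, 100)                              # rectangular detection (the red area)
    mask = np.zeros(img.shape[:2], np.uint8)
    cv2.circle(mask, (100, 100), 40, 255, -1)             # tight mask of the face (the green area)
    cv2.imwrite("box_blur.png", blur_box(img, box))
    cv2.imwrite("mask_blur.png", blur_mask(img, mask))
```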
I’ve attached images for a better visualization of what I mean:
1st image: Screenshot from blur editor with the area that was auto-detected to be blurred
2nd image: The blurred image
3rd image: Used my professional MS Paint skills to mark the area that was set to be blurred (red) and what actually has been blurred (green) on the original picture. You can see at the boy’s hairline and ear that only part of the marked area is blurred, since these details aren’t blurred.
The image I’ve used in this example is this one. I’ve also seen this on other pictures with license plates and “real” faces, but only since today. Go back a couple of pictures to where two people are walking on the left side and a scooter driver is coming towards the camera; it happened there too in some pictures.
I’ve never seen this before, or at least haven’t noticed it, and found it kinda interesting, but I haven’t found anything about it on the blog or in this forum.
Thanks
Edit: Oh, and I also noticed that the auto-blurring seems to have become better in general. I’ve been doing a lot of blur edits for a while and have gotten a feeling for what gets correctly auto-blurred, what should be blurred but isn’t, what is blurred but shouldn’t be, etc. Long story short, the auto-blurs seem to have improved at long distances (for example, a car driving a couple of hundred meters in front of the cam with its license plate still readable - these usually weren’t blurred but are now) and at avoiding wrong blurs (for example, dark fences in front of a white wall). When I started blurring today or yesterday, I can’t remember which, I was surprised how well things were blurred already; if it weren’t for tiny details (like incorrectly blurred stuff, missing blurs, or blur areas way too big compared to the plate/face), I would’ve thought someone had already done it manually.
They are clearly not yet done (regarding number plates), as visible in this picture. In the next pictures, the faces are also nicely blurred, but just one street away, faces are still unblurred. But let’s see what happens; maybe they are still processing.
Edit: Meanwhile, both of the missing blurs mentioned above are now nicely covered.
I’d be interested in that, too.
I hope someone will say something about it soon, since apart from this topic nobody has said a word about it. I check the blog from time to time to see if there’s something official about it.
You are correct, we are currently reblurring the whole world with a new blur algorithm that’s really good. Happy for any feedback; also stay tuned for a blog post.
Currently we are reblurring at a rate of 500 images/second, so there is a lot of data flowing through the system.
I don’t know what I like more about the new algorithm: that it seems to find (almost) everything that needs blurring, or that it doesn’t blur random fences and stuff anymore.
I’ve just added a number of photos and the blurring is vastly improved, but I do have a female police officer clearly visible in one of my new images, and I have no way of redacting her without requesting a deletion.