The idea of using a picture upload for automated verification is completely unviable. A much more commonly used system is to have you perform a random gesture on camera on the spot, like “turn your head slowly” or “open your mouth slowly”, which would be trivial for a human but nearly impossible for AI generators.
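For what it's worth, that random-gesture approach amounts to a short-lived challenge-response loop. Here is a minimal sketch of the idea; the gesture list, the 30-second window, and the function names are my own illustrative assumptions rather than any real service's API, and the actual on-camera gesture detection is stubbed out.

```python
# Minimal sketch of a random-gesture liveness challenge (assumed design, not a real API).
import secrets
import time

GESTURES = ["turn your head slowly", "open your mouth slowly", "blink twice"]
CHALLENGE_TTL_SECONDS = 30  # response must come back quickly, so it can't be pre-generated

_active_challenges: dict[str, tuple[str, float]] = {}

def issue_challenge(session_id: str) -> str:
    """Pick a random gesture the user must perform on camera right now."""
    gesture = secrets.choice(GESTURES)
    _active_challenges[session_id] = (gesture, time.monotonic())
    return gesture

def verify_response(session_id: str, detected_gesture: str) -> bool:
    """Accept only if the detected gesture matches the one asked for,
    and the answer arrived inside the time window."""
    challenge = _active_challenges.pop(session_id, None)
    if challenge is None:
        return False
    expected, issued_at = challenge
    if time.monotonic() - issued_at > CHALLENGE_TTL_SECONDS:
        return False
    return detected_gesture == expected

# Example flow: the client is told the gesture, performs it on camera,
# and some (unspecified) gesture detector reports what it saw.
gesture = issue_challenge("session-123")
print("Please", gesture)
print(verify_response("session-123", gesture))  # True if the detector agrees and it's in time
```

The point of the randomness and the short window is that a pre-made AI image or video can't anticipate which gesture will be demanded.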
Because so many people try to impersonate me on the internet, I’ve become something of an expert on verification pictures.
You can still easily tell that this is fake: if you look closely, the details, especially the background clutter, are utterly nonsensical.
The point isn’t that you can spot it.
The point is that the automated system can’t spot it.
Or are you telling me there’s a person looking at every verification photo, and that they thoroughly scan each one for imperfections?
The idea of using a picture upload for automated verification is completely unviable. A much more commonly used system is to have you perform a random gesture on camera on the spot, like “turn your head slowly” or “open your mouth slowly”, which would be trivial for a human but nearly impossible for AI generators.
…I feel like this isn’t the first time I’ve heard that statement.