Hacker News

I know OpenAI watermarks their stuff. But I wish they wouldn't. It creates a false sense of trust.

Now it means whoever has access to uncensored/non-watermarking models can pass off their faked images as real and claim, "Look! There's no watermark, so of course it's not fake!"

Whereas, if none of the image models did watermarking, then people would (or at least should) know that no image can be trusted by default.

Yeah, I'd go the other way: camera manufacturers should have the camera cryptographically sign the sensor data directly in hardware, then provide an API to query whether a signed image was taken on one of their cameras.

Add an anonymizing scheme (blind signatures or group signatures), and you're done.
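A minimal sketch of the sign-at-capture / verify-via-vendor-API idea, with all names hypothetical. For simplicity this uses a symmetric HMAC held by both the (simulated) secure element and the vendor; real hardware would use an asymmetric signature (e.g. Ed25519) burned into a secure element, and the anonymizing layer above would use blind or group signatures so verification doesn't link a photo back to a specific device:

```python
import hashlib
import hmac
import os

class Camera:
    """Toy stand-in for a camera with a factory-provisioned device secret."""
    def __init__(self, device_secret: bytes):
        self._secret = device_secret  # in real hardware: never leaves the secure element

    def capture(self, sensor_data: bytes) -> tuple[bytes, bytes]:
        # Sign the raw bytes as they come off the sensor, before any processing.
        tag = hmac.new(self._secret, sensor_data, hashlib.sha256).digest()
        return sensor_data, tag

class ManufacturerAPI:
    """Toy stand-in for the vendor's online verification endpoint."""
    def __init__(self):
        self._secrets: dict[str, bytes] = {}  # device_id -> secret, held only by the vendor

    def provision(self, device_id: str) -> Camera:
        secret = os.urandom(32)
        self._secrets[device_id] = secret
        return Camera(secret)

    def was_taken_on_our_camera(self, device_id: str, image: bytes, tag: bytes) -> bool:
        secret = self._secrets.get(device_id)
        if secret is None:
            return False
        expected = hmac.new(secret, image, hashlib.sha256).digest()
        return hmac.compare_digest(expected, tag)

vendor = ManufacturerAPI()
cam = vendor.provision("cam-001")
image, tag = cam.capture(b"raw sensor bytes")
print(vendor.was_taken_on_our_camera("cam-001", image, tag))      # True: untouched capture
print(vendor.was_taken_on_our_camera("cam-001", b"edited", tag))  # False: pixels changed
```

Note that this per-device-key version is exactly what the anonymizing scheme is meant to fix: the vendor learns which camera took the photo. With group signatures the API could answer only "yes, some camera of ours signed this."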



