
> Apple have a list of photos they know are CSAM, or rather hashes of them

Technically, it's a list of photos that government agencies claim are CSAM, and the "hash" (NeuralHash) is vulnerable to second-preimage attacks: you can take a random, unrelated photo and perturb it until it matches a chosen target hash, https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX/issue...

This means people can take CSAM images and modify them so they hash to targeted innocent images, or take innocent images and modify them so they hash to known CSAM.
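
For concreteness, here's roughly what such a targeted collision looks like against a differentiable perceptual hash. This is a hypothetical sketch in PyTorch, not the code from the linked issue: it assumes you have `model`, a differentiable reimplementation of the network (e.g. converted from the ONNX export in that repo) that maps an image tensor to the real-valued logits whose signs become the hash bits. The function name, loss weights, and hyperparameters are made up for illustration.

    # Sketch of a second-preimage / targeted-collision attack on a
    # perceptual hash. Assumes `model(image) -> logits`, where the signs
    # of the logits are the hash bits, and `target_bits` is a 0/1 int
    # tensor of the same length (96 bits in NeuralHash's case).
    import torch

    def hash_bits(logits: torch.Tensor) -> torch.Tensor:
        """Binarise real-valued hash logits into 0/1 bits."""
        return (logits > 0).int()

    def forge_collision(model, source_img: torch.Tensor,
                        target_bits: torch.Tensor,
                        steps: int = 2000, lr: float = 1e-2) -> torch.Tensor:
        """Perturb `source_img` until its hash equals `target_bits`."""
        delta = torch.zeros_like(source_img, requires_grad=True)
        opt = torch.optim.Adam([delta], lr=lr)
        # Map target bits {0,1} to signs {-1,+1} for a hinge-style loss.
        target_signs = target_bits.float() * 2 - 1

        for _ in range(steps):
            adv = (source_img + delta).clamp(0, 1)
            logits = model(adv).flatten()
            if torch.equal(hash_bits(logits.detach()), target_bits):
                return adv.detach()  # hashes now match
            # Penalise every bit whose sign disagrees with the target;
            # the 0.1 margin pushes logits off the decision boundary.
            loss = torch.relu(0.1 - target_signs * logits).sum()
            # Keep the perturbation visually small.
            loss = loss + 0.01 * delta.pow(2).sum()
            opt.zero_grad()
            loss.backward()
            opt.step()
        return (source_img + delta).detach().clamp(0, 1)

The point isn't this particular loss or optimiser; it's that once the hash function is a public, differentiable neural network, gradient descent makes both directions of abuse cheap.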


