The new Data (Use and Access) Act, which criminalises intimate image abuse, is a huge victory won fast in a space where progress is often glacially slow
For Jodie*, watching the conviction of her best friend, and knowing she helped secure it, felt at first like a kind of victory. It was certainly more than most survivors of deepfake image-based abuse could expect.
They had met as students and bonded over their shared love of music. In the years since graduation, he’d also become her support system, the friend she reached for each time she learned that her images and personal details had been posted online without her consent. Jodie’s pictures, along with her real name and correct bio, were used on many platforms for fake dating profiles, then adverts for sex work, then posted on Reddit and other online forums with invitations to deepfake them into pornography. The results ended up on porn sites. All this continued for almost two years, until Jodie finally worked out who was doing it – her best friend – identified more of his victims, compiled 60 pages of evidence, and presented it to police. She had to try two police stations, having been told at the first that no crime had been committed. Ultimately he admitted to 15 charges of “sending messages that were grossly offensive or of an indecent, obscene or menacing nature” and received a 20-week prison sentence, suspended for two years.