Child pornography: police lack the tools to deal with deepfakes

The rise of artificial intelligence in crime is complicating the work of police officers, who notably have no tools to spot deepfakes in child pornography, our Bureau of Investigation has found.

This situation greatly worries the Canadian Centre for Child Protection (CCPE), an organization that works to reduce the sexual exploitation and abuse of minors, reports the show J.E., broadcast this evening on TVA.

With the rise of these technologies, it is now possible to produce material quickly, in large quantities, and with a quality that makes it difficult to tell real from fake, explains the organization’s spokesperson, René Morin.

René Morin, spokesperson for the Canadian Centre for Child Protection (CCPE).

Screenshot J.E.

The Sûreté du Québec (SQ) confirms that there is currently no tool that can quickly identify deepfakes built from images of known abuse cases.

“Several companies are working on different technologies to detect fake photos and videos, but for the moment we have not tested any of them,” concedes spokesperson Benoit Richard.

Concerns

The Canadian Centre for Child Protection fears that deepfakes will overload investigators.

“They will try to identify these victims without necessarily knowing that, with artificial intelligence, they may be looking for children who do not even exist,” says Mr. Morin.

These fears are justified, confirms the Sûreté du Québec.

“It certainly adds to the work… It will lead to additional investigations that have to be redone each time,” explains Mr. Richard.


Benoit Richard, lieutenant and communications coordinator at the Sûreté du Québec.

Photo Chantal Poirier

The Larouche case

The trial of Steven Larouche, 62, highlighted this phenomenon. Last April, the pedophile was sentenced to eight years in prison for producing more than 86,000 deepfaked images and videos of young abuse victims, a first in Canada.

He had used cutting-edge software to produce videos by inserting images of abused children drawn from his vast collection. He originally held more than half a million files and videos, which had been circulating for years on clandestine networks.

During the trial, the quality of the new images Steven Larouche generated using deepfakes stunned the court.

“These were images and even videos that looked completely real […] It was striking for everyone in the judicial system who had to be exposed to this,” recounts the Crown prosecutor in the case, Me Véronique Gingras-Gauthier.

Only the seasoned eye of investigator Charles-Henri Jenniss, of the Sûreté du Québec’s technology division, made it possible to uncover the deception. His extensive knowledge of the “pornographic media library” allowed him to realize that these were sophisticated montages.

Software in demand

Even today, this work is done “manually,” confirms the SQ.

The Canadian Centre for Child Protection is pressuring the companies that produce this software to ensure deepfakes remain traceable.

“It would also be useful for deepfakes of politicians, for example. It would tell people that what they’re seeing isn’t real,” explains spokesperson René Morin.
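What “traceable” could look like in practice: one commonly discussed approach is to have the generation software stamp every output with an invisible marker that verification tools can later check for. Below is a deliberately simplified Python sketch of that idea, using a least-significant-bit marker written with the Pillow library. It is not the scheme of any particular vendor; the marker string and file names are assumptions for illustration.

```python
# Toy illustration of "traceable" AI output: hide a provenance tag in the
# least-significant bits of an image's red channel, then check for it later.
# MARKER and file names are hypothetical; requires the Pillow library.
from PIL import Image

MARKER = b"AI-GENERATED"  # hypothetical provenance tag


def embed_marker(in_path: str, out_path: str) -> None:
    """Write MARKER, bit by bit, into the red-channel LSBs of the image."""
    img = Image.open(in_path).convert("RGB")
    pixels = img.load()
    # Flatten the marker into bits, least-significant bit of each byte first.
    bits = [(byte >> i) & 1 for byte in MARKER for i in range(8)]
    w, h = img.size
    assert len(bits) <= w * h, "image too small to hold the marker"
    for idx, bit in enumerate(bits):
        x, y = idx % w, idx // w
        r, g, b = pixels[x, y]
        pixels[x, y] = ((r & ~1) | bit, g, b)  # overwrite the red LSB
    img.save(out_path, "PNG")  # lossless format so the bits survive


def has_marker(path: str) -> bool:
    """Re-read the red-channel LSBs and compare them against MARKER."""
    img = Image.open(path).convert("RGB")
    pixels = img.load()
    w, _ = img.size
    recovered = bytearray()
    for byte_idx in range(len(MARKER)):
        value = 0
        for i in range(8):
            idx = byte_idx * 8 + i
            x, y = idx % w, idx // w
            value |= (pixels[x, y][0] & 1) << i
        recovered.append(value)
    return bytes(recovered) == MARKER
```

A marker this naive disappears as soon as the image is recompressed or resized, which is why real traceability proposals favour watermarks that are robust to re-encoding; the sketch only illustrates the principle of machine-checkable provenance that the CCPE is asking for.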

Artificial intelligence available to fraudsters

It is only a matter of time before fraudsters use artificial intelligence to clone voices and trick their victims in Quebec, believes the Sûreté du Québec.

“It’s coming,” says Benoit Richard, spokesperson for the Sûreté du Québec, without hesitation.

The phenomenon of “grandparent” fraud saw a marked increase last year, causing losses of more than $9 million in the country. In North America, some of these scams have been perpetrated using cloned voices, leading the victim to believe that a loved one needs their help… and their money.

Johanne Éthier, of Laval, was the victim of “grandparent” fraud last July. The voice on the line sounded so much like her son’s that she now wonders whether she was the victim of a deepfake.

“He told me: ‘Mom, I just had an accident… I hit a car,’” she recounted on the show J.E.

The young man told her that he was in jail and that she would have to pay nearly $5,000 bail to free him. A “bailiff” then came to her house to collect the money.


Johanne Éthier, from Laval, was the victim of “grandparent” fraud.

Screenshot J.E.

The next day, Mme Éthier checked on her son… only to realize that she had been duped. She filed a complaint with the Laval police, who have made no arrests in the case to date.

J.E. tries the experiment

The Sûreté du Québec and several municipal police forces in Quebec have not yet recorded any cases of “grandparent” fraud using cloned voices. The author of these lines, however, easily managed to fool an elderly uncle using artificial intelligence.

Using voice-cloning software that works in the language of Molière – software that was nearly impossible to find a year ago – our team synthesized the reporter’s voice.
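To give a sense of how low the barrier has become, here is a minimal sketch of the kind of off-the-shelf voice cloning now publicly available. It assumes the open-source Coqui TTS package and its multilingual XTTS v2 model, not the unnamed software our team actually used; the file names and the spoken text are placeholders.

```python
# Minimal voice-cloning sketch using the open-source Coqui TTS package.
# Assumes: `pip install TTS`; "reference_voice.wav" is a short clean
# recording of the target voice (both file names are hypothetical).
from TTS.api import TTS

# Load a multilingual voice-cloning model (weights download on first use).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Generate French speech in the voice captured in the reference clip.
tts.tts_to_file(
    text="Bonjour, c'est moi. Peux-tu préparer deux verres d'eau?",
    speaker_wav="reference_voice.wav",  # a few seconds of the target voice
    language="fr",
    file_path="cloned_voice.wav",
)
```

A few seconds of clean reference audio are typically enough for a passable imitation, which is precisely what makes the scenario that follows possible.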

A computer then telephoned the targeted victim. On the other end of the line, the voice asked him to quickly prepare two glasses of water: the reporter and his cameraman had a small emergency and would arrive shortly.

A few minutes later, the duo knocked on his door… and found the glasses of water waiting.


Journalist Richard Olivier managed to fool his 81-year-old uncle using artificial intelligence.

Screenshot J.E.

Some safety tips regarding deepfakes:

  • When a loved one calls you in an emergency and asks for money, hang up and call them back to verify that it is really them and that they are telling you the truth.
  • Establish a code word with your family to verify identity. During a distress call, you can then ask the caller to confirm who they are by giving this code word.
  • Limit access to your online photos and videos – especially those of your children – to those close to you only.

Do you have any information to share with us about this story?

Write to us at or call us directly at 1 800-63SCOOP.
