(LifeSiteNews) — Canadian country music star Paul Brandt is calling on Parliament to exercise the notwithstanding clause of the Charter of Rights and Freedoms in the wake of another horrifying case of a boy creating digital deepfake pornography of female classmates—child sexual abuse and exploitation material (CSAEM).
A 17-year-old boy in Calgary, Alberta, used AI to transform normal photographs of schoolgirls into child sexual abuse material and then uploaded the images to social media. According to police, the teen is facing charges “including making, possessing, and distributing child sexual abuse and exploitation materials, along with criminal harassment.”
“The victims—teen girls from multiple high schools—are now in trauma counselling,” wrote Brandt in an X thread that has already been viewed nearly a million times. “Some may never feel safe at school again. Their images are forever weaponized. This isn’t ‘bullying.’ This is sexual violence enabled by technology and enabled by weak laws.”
A 17-year-old boy in Calgary just used AI to turn innocent photos of his female classmates into explicit deepfakes. He then uploaded them. Real children were sexually violated without ever touching anyone. This is the new face of child sexual abuse material (CSAEM). Please… https://t.co/hYlQMUxatx
— paulbrandt (@paulbrandt) December 4, 2025
“Under the law before October 31, 2025, this boy faced a mandatory 1-year jail sentence for making & distributing CSAEM, plus another possible 1-year for possession,” Brandt continued. “That floor sent a clear message: you do not get to sexually exploit children and walk away with probation…The Supreme Court struck down the 1-year mandatory minimum for possession of CSAEM because of a hypothetical 18-year-old who keeps a consensual sext from his 17-year-old girlfriend.”
RELATED: Canada’s Supreme Court goes soft on child pornography
This ruling, Brandt noted, means that many judges will let “young offenders” off light—and the social consequences will be profound. “Imagine being one of those Calgary girls and hearing the boy who deepfaked you into porn might get… house arrest,” Brandt wrote. “Or a conditional discharge. Or NOTHING. Because a court 3,000 km away worried about a different, hypothetical 18-year-old’s love life.”
These “mandatory minimums,” Brandt stated, are essential. “They are the only way to tell every teenager with a phone and an AI app: If you turn a classmate into child sexual abuse material, your life changes too. You do not get to ruin hers and live yours without consequence. Without that certainty, we are teaching two generations the wrong lesson: Victims: ‘Your trauma is negotiable.’ Perpetrators: ‘Worst-case scenario is a stern talking-to.’”
Mandatory minimum sentences for these crimes, Brandt added, also served as a deterrent to sexually predatory behavior. “After mandatory minimums were introduced in 2015, self-reported teen sextortion and revenge-porn cases dropped,” Brandt wrote.
“When the floor disappears, the behaviour comes roaring back—we’re already seeing it in the 54% surge in Alberta ICE files this year,” he continued. “The perpetrator is also a child. He needed a bright red line years ago—before he pressed ‘generate.’ Clear, certain punishment protects him too: from becoming an adult predator, from a lifetime on the sex-offender registry, from potentially destroying his own future along with theirs.”
Brandt called on Prime Minister Mark Carney’s Liberal government to invoke the notwithstanding clause to “restore the mandatory minimums for possession, making, and distribution of CSAEM,” and to table a bill “with even stronger sentences if necessary.” He also urged Canadians to sign his petition calling on the government to take action “before the next classroom is violated.”
Paul Brandt launched an anti-sex trafficking organization in 2017 called #NotInMyCity, which focuses on raising awareness, advancing preventative strategies, mobilizing communities, and “facilitating transformational systems change.” The organization has assembled a vast army of corporate, political, and social allies over the past eight years.
LSN published a report in this space on December 3, detailing the growing crisis of AI-generated pornography via apps like “nudify,” which children and teens have used to digitally undress their classmates and produce CSAEM.
RELATED: AI is accelerating the porn crisis as kids create, consume explicit deepfake images of classmates