
Distributing a fake nude of your spouse is ‘morally reprehensible’ — but not a crime under Canadian law, Ontario judge warns
“It is not her nude body and, it is not her breasts,” the judge wrote, explaining why the Criminal Code does not apply.
Nov. 5, 2025
A Burlington judge recently acquitted a man accused of sending intimate images of his wife to an unknown man without her consent. (Andrew Francis Wallace/Toronto Star)
By Jacques Gallant, Courts and Justice Reporter
Distributing fake nude images of a real person without their consent may be “morally reprehensible and, frankly, obscene,” but it’s not a crime, a Burlington judge has ruled.
In a case that one expert says highlights the urgent need for Parliament to expand the definition of intimate images in the Criminal Code, Ontario Court Justice Brian Puddington acquitted a man accused of sending pictures of his wife without her consent to an unknown man via Snapchat.
The woman was clothed in some of the photos, but in one, she was wearing only a bra in her bathroom, while in another, her face was “digitally manipulated” and placed on top of a naked body that was not hers.
Neither the bra photo nor the fake met the Criminal Code’s definition of an intimate image, Puddington found. Under that definition, the person depicted must be nude, exposing his or her genital organs, anal region or breasts, or engaged in sexual activity.
The case is the latest example of a grey area in Canadian law around so-called deepfake images, where Photoshop or AI tools can be used to create convincing nude images of almost anyone, often with little time or skill involved.
In the fake photo, “it is not her nude body and, it is not her breasts — both of which are necessary to meet that definition,” Puddington wrote in his October ruling (emphasis his).
“If this type of photo were meant to be captured by this section, Parliament would have specifically done so … This is not to say that a fake image does not cause harm and embarrassment, but that harm is not captured by the current provisions.”
The bra photo, in which there was scribbling over the woman’s face, also failed to meet the definition: while it “shows a significant portion of her breasts,” they are not exposed because she is wearing clothing, the judge found. He nevertheless stated that sharing a photo of someone in their underwear without their knowledge is “disgusting and disgraceful.”
Puddington granted the defence’s application for a directed verdict, meaning he concluded at the end of the Crown’s case that prosecutors had failed to present sufficient evidence to support the charge. As a result, it was not necessary to hear closing submissions or for the defence to decide whether to call evidence.
Because the man was acquitted at that stage, no final determination was made as to whether he had sent the photos, which his daughter testified she had found on a Snapchat account on his phone. The decision also does not specify a motive for allegedly sharing the images.
“Nothing in these reasons should be read as saying that (the complainant) did not experience an embarrassing and humiliating event,” Puddington wrote. “The fact remains that the photos somehow exist, and (she) had to endure viewing and describing them in a courtroom full of people. Those photos may have also made their way into the public domain. She was emotional in court, and rightfully so.
“While I am sympathetic to her, that sympathy cannot play any role in my judgment. I must apply the law dispassionately, and not try to shoehorn images into a definition simply because I find the photographs deplorable.”
The proliferation of digitally altered photos and AI-generated pornography appropriating someone’s likeness “is a clear growing problem and Parliament needs to act,” said lawyer Gillian Hnatiw, who specializes in gender-based violence cases and was not involved with the matter before Puddington.
“Yes, I do think Parliament needs to expand the definition” of intimate images, she said. “Feminist legal scholars have been sounding the alarm about this for some time.”
As Puddington noted in his ruling, the previous Trudeau government did introduce an expanded definition in its Online Harms Act, which would have covered an image “that falsely presents in a reasonably convincing manner a person as being nude.” The bill died on the order paper following the prorogation of Parliament earlier this year and the subsequent federal election. The Carney government has not yet reintroduced a similar bill.
The issue of deepfake porn made headlines last year after pop star Taylor Swift threatened legal action over a series of pornographic fakes that were viewed millions of times before they were taken down.
Meanwhile, some jurisdictions are floating aggressive changes to the law in response. Denmark, for example, has proposed cracking down on AI-generated deepfakes by expanding its digital copyright laws to include a person’s features or voice.
“This is a facet of gender-based violence, which is endemic,” Hnatiw said, “and until the law starts to take these forms of violence/coercion/intimidation seriously, they will continue.”