David Do is, on the surface, unassuming and respectable. He owns a home just outside Toronto with his partner, drives a Tesla and is paid $121,000 a year as a hospital pharmacist.
But he leads a double life as a key figure behind the world's most notorious website for non-consensual, AI-generated porn of real people: MrDeepFakes.com. He has never been fully identified until now.
MrDeepFakes was the most popular site globally for deepfake porn, and hosted tens of thousands of non-consensual and sometimes violent deepfake videos and images of celebrities, politicians, social media influencers and private citizens, including Canadians.
This week, after CBC News's visual investigations unit, in collaboration with the open-source investigative outlet Bellingcat and the Danish publications Politiken and Tjekdet, contacted Do about his role in the site's operations, MrDeepFakes shut down for good.

A message posted to the site's homepage states that a "critical service provider has terminated service permanently," adding that the site "will not be relaunching."
The site had more than 650,000 users, some of whom charged hundreds of dollars to create custom videos. And the content, which ranges from graphic strangulation scenes involving an AI fake of actor Scarlett Johansson to group sex with actor Natalie Portman to masturbation videos of musician Michael Bublé, has gotten more than two billion views since the site's inception in 2018.
But not everybody on the site was a Hollywood actor or famous singer.

"[It's] pretty violating," said Sarah Z., a Vancouver-based YouTuber who CBC News found was the subject of several deepfake porn photos and videos on the website. "For anybody who does think that these images are harmless, just please consider that they're really not. These are real people … who often suffer reputational and psychological damage."
Creators on the site also took requests from people asking for deepfake porn of their partners and wives.
Sharing non-consensual deepfake porn is a crime in several countries, including South Korea, Australia and the U.K. It is also illegal in several U.S. states, and while there is no federal law yet, the House of Representatives passed a bipartisan bill banning it in April.
However, it is not a crime in Canada. Prime Minister Mark Carney pledged to pass a law criminalizing it during his federal election campaign, saying, "We will make producing and distributing non-consensual sexual deepfakes a criminal offence."
Requests for comment ignored
David Do took care to hide his association with MrDeepFakes; his name never appeared anywhere on the site. But Do's role can be pieced together using data from the open web, public records and forensic analysis of the site.
According to that research, Do had a central role in running the MrDeepFakes website and has a long online history of discussing the details of operating a popular, profit-making adult website.
Over a period of several weeks, CBC News sent multiple emails to Do seeking a response. Despite opening the emails several times, according to mail trackers attached to the emails, he never responded.
When a CBC News reporter hand-delivered a letter to Do on April 11 at Markham Stouffville Hospital, where he works as an in-patient pharmacist, he said, "I don't know anything about that."
"What do you mean?" the reporter asked.
"I'm at work right now," Do said, and insisted on going back to work.
After this interaction, Do's Facebook profile and the social media profiles of family members were taken down.
Two weeks later, a review posted on Do's Airbnb account indicated he was in Portugal.
Then, on May 4, MrDeepFakes.com went offline, apparently for good.

"A critical service provider has terminated service permanently. Data loss has made it impossible to continue operation," says a shutdown notice on the website's main page. "We will not be relaunching. Any site claiming this is fake. This domain will eventually expire and we are not responsible for future use. This message will be removed around one week."
On May 5, a CBC News reporter approached Do in an attempt to interview him about his role in the website. Do told the reporter he did not want to be recorded and that he was busy, before driving away in his car.
Hidden in plain sight
MrDeepFakes was shrouded in secrecy. Members were advised to remain anonymous using aliases, and video creators were paid in cryptocurrency. The site's hosting providers, according to a previous report by Bellingcat, moved around the world. But Do slipped up, leaving clues hiding in plain sight.
It began with a username.
As early as 2018, a user with the handle DPFKS was an administrator of the MrDeepFakes forum, where people could pay to have custom deepfakes made of celebrities and private individuals, even spouses.

In a 2018 post on the forum site Voat, which DPFKS said they used in posts on the MrDeepFakes forum, an account with the same username claimed to "own and run" MrDeepFakes.com.
"I just got home from my day job," they said in another post on Voat, "now back to this!"
DPFKS also said that "MrDeepFakes.com was formerly dpfks.com" in a 2018 post on the MrDeepFakes forum. The same unique Google Analytics tag is linked to both URLs, indicating they are controlled by the same owner.
DPFKS did more than run the website; they created more than 150 deepfake porn videos. They even posted a dataset of more than 6,000 images of U.S. Rep. Alexandria Ocasio-Cortez so other users could create non-consensual deepfake porn.
Running that username through a searchable database that includes records pulled from the open web and previous data breaches reveals more clues: usernames, IP addresses, a date of birth, physical home addresses and several personal email addresses featuring combinations of the name Duy David Do going back more than 10 years.

Another email also surfaced: DPFKScom@gmail.com, which was the contact email for MrDeepFakes until 2020. That email address also appeared in the website's source code at the time.
This is where Do's careful veil of secrecy fell apart: many of those email addresses had registered accounts on different websites using the same unique 11-character password, which was exposed in multiple data breaches.

Do's personal emails were also linked to a Yelp account for a user named David D. who lives in the Greater Toronto Area, and to an Airbnb account that contained a photo of Do. A profile on the document-hosting site Issuu under the username "dpfkscom" that links to the MrDeepFakes website also lists its location as Canada.
That Airbnb photo matched the profile of an Ontario pharmacist named David Do who works at Markham Stouffville Hospital and Uxbridge Hospital in the Oak Valley Health network.

A 2020 Instagram post from Oak Valley Health features a picture of Do, quoting him as saying that his role as a pharmacist "moves beyond dispensing medications," and that he is part of a team that is "often involved in medication management … and ensuring safe practices."
Oak Valley Health, Do's employer, told CBC News it had initiated an internal investigation into the matter "in consultation with legal counsel."
"We are unable to make further comment, but want to make clear that Oak Valley Health unequivocally condemns the creation or distribution of any form of violent or non-consensual sexual imagery."
The Ontario College of Pharmacists' code of ethics states that no member should engage in "any form of harassment," including "displaying or circulating offensive images or materials."
The Ontario College of Pharmacists told CBC News the "allegations you have shared with us are extremely serious" and that it was "taking immediate steps to look into this matter further and determine the necessary actions we need to take to protect the public."
Do has taken steps to maintain his own privacy. Property records show that he and his partner own a home in the Toronto area. The property is blurred on Google Maps, a privacy feature that is available upon request.
Private citizens, violent deepfaked videos
The faces on MrDeepFakes are, in many cases, household names: Taylor Swift, actor Emma Watson, former prime minister Justin Trudeau, U.S. politician Alexandria Ocasio-Cortez, Ivanka Trump and climate activist Greta Thunberg, just to name a few.
Some of the videos are violent in nature. "Scarlett Johannson gets strangled to death by creepy stalker" is the title of one video; another, called "Rape me Merry Christmas," features Taylor Swift.

The list of victims includes Canadian American Gail Kim, who was inducted into the TNA Wrestling Hall of Fame in 2016 and has made recent appearances on the reality-TV shows The Amazing Race Canada and The Traitors Canada.
Kim hadn't seen the videos of her on MrDeepFakes, because "it's scary to think about."

"A lot of young people commit suicide because they're shamed for things that are totally not them," Kim told CBC News in an interview. "I wish there was some sort of way to control this stuff."
Not all of the people featured on the site were celebrities. For example, one of the site's rules stated that only social media influencers with more than 120,000 Instagram followers are acceptable to deepfake, and that non-celebrities can't be deepfaked without consent.

Yet CBC News found deepfake porn of a woman from Los Angeles who has just over 29,000 Instagram followers.
"Every time it's being used on some really big-name celebrity like Taylor Swift, it emboldens people to use it on much smaller, much more niche, more private individuals like me," said the YouTuber Sarah Z.
Users also post about getting videos made of their partners or wives. One person wrote that they wanted "an expert to deepfake my partner" and to "please [direct message] me requirements and price."
Rapid evolution in deepfake porn
Deepfake porn technology has made significant advances since its emergence in 2017, when a Reddit user named "deepfakes" began creating explicit videos based on real people.
"In 2017, these [videos] were pretty glitchy. You could see a lot of glitchiness particularly around the mouth, around the eyes," said Suzie Dunn, a law professor at Dalhousie University in Halifax, N.S.
In 2025, she said, the technology has progressed to the point where "someone who's highly skilled can make an almost indiscernible sexual deepfake of another person."
Making a high-quality deepfake requires top-shelf computer hardware, time, money for electricity costs and effort. According to a 2025 preprint study by researchers at Stanford University and UC San Diego, discussion around assembling large datasets of victims' faces, often thousands of images, accounts for one-fifth of all forum threads on MrDeepFakes.
These high-quality deepfakes can cost $400 or more to purchase, according to posts seen by CBC News.

Making a digital fake doesn't necessarily require training a bespoke AI model. There are now countless "nudify" apps and websites that can do face swaps in seconds. They're getting easier to use, and many of them are free.
According to a report by the cybersecurity firm Security Hero, there was a 550 per cent increase in the number of deepfakes from 2019 to 2023.
In 2023, the firm found there were more than 95,000 deepfake videos online, 99 per cent of which were deepfake porn, mostly of women.
In 2025, MrDeepFakes hosted more than 70,000 deepfake porn videos.
"They see the women in these images as digital objects," said Dunn. "In a really serious way. It really discourages people from going into politics, going, even being a celebrity."
Cashing in on deepfake porn
Even early on in the site's existence, it was evident David Do was having trouble dealing with the growth of MrDeepFakes and the money it brought in through advertising.
In 2018, an account linked to Do's MrDeepFakes email, DPFKScom@gmail.com, with the username "Aznrico" posted a message on a web-hosting forum asking for help "identifying bottlenecks and causes for slowness" for an adult site "that gets around 15-20k visitors per day."
In March 2025, according to the web data platform Semrush, MrDeepFakes received more than 18 million visits.

A cryptocurrency trading account for Aznrico later changed its username to "duydaviddo."
An account with the username Aznrico also posted on an auto forum in 2009: "my car is the 06 lancer ralliart," a reference to the Mitsubishi Lancer Ralliart. Public records obtained by CBC News show that a 2006 Mitsubishi Lancer Ralliart is registered to Do's father, and the car appears in Google Maps imagery of Do's parents' house from 2009 onwards.

In 2020, another account linked to Do posted that he was looking for "business solutions" for the adult site where he is the "webmaster," and that it makes up to $7,000 per month. That account originally had the username "dj01039," an abbreviation of davidjames01039@gmail.com, an email linked to a PayPal donation button that appeared on MrDeepFakes in 2019.
An account on another forum with the username "dj01039" was registered with the DPFKScom@gmail.com address, records show.

But DPFKS's influence on the website went beyond hosting and IT. According to their account on the MrDeepFakes forum, DPFKS posted 161 deepfake porn videos.
On the site's forum, one user asked for a deepfake of Korean pop star Yeri. The user Paperbags, formerly DPFKS, posted that they had "already made 2 of her. I am moving onto other requests."
In a May 2018 exchange, a user requested a Sandra Bullock video. Paperbags responded, "Ok i'll see what I can do. Maybe in a couple weeks." Four weeks later, he posted that he was "currently working on Sandra Bullock" and shared a link to a video titled "Sandra Bullock Nude Ass Pounding."

Evolving legal landscape
Despite the popularity of deepfake porn and the widely available tools to make it, laws in Canada and internationally are just starting to catch up.
Creating and sharing non-consensual deepfake AI porn of adults is not a criminal offence in Canada, but laws around child pornography have recently been applied by courts to encompass AI deepfakes, said Moira Aikenhead, a lecturer at the University of British Columbia's Allard School of Law.
A recently passed law in British Columbia makes it easier for adult victims to pursue civil recourse such as takedowns and damages. Similar laws exist in Prince Edward Island, New Brunswick and Saskatchewan.
"Only the federal government can pass criminal legislation," said Aikenhead, and so "this move would need to come from Parliament."
In the U.S., no criminal laws exist at the federal level, but the House of Representatives overwhelmingly passed the Take It Down Act, a bipartisan bill criminalizing sexually explicit deepfakes, in April. A patchwork of laws exists at the state level.
In the U.K. and Australia, sharing non-consensual explicit deepfakes was made a criminal offence in 2023 and 2024, respectively.
But victims of deepfake porn in Canada are still waiting for recourse beyond civil litigation.
"There's only so much that I as an individual can do," said Sarah Z. "Any change needs to be legislative and systemic."
Do you have something that you think needs investigating? You can send tips to eric.szeto@cbc.ca.