Pornhub’s Parent Company MindGeek Is Being Investigated, Here’s Why

Fight the New Drug
5 min read · May 19, 2021
Image credit: The New York Times.

On December 4th, 2020, a New York Times investigative opinion column about Pornhub set off a chain of events that rippled through the porn industry in the following months.

Written by Pulitzer Prize-winning journalist Nicholas Kristof and titled "The Children of Pornhub," this article shed light on the numerous reported cases in which child sexual abuse material (CSAM), trafficking, rape, and nonconsensual content proliferated on one of the world's most popular porn sites.

Years of survivor stories and advocate work were given visibility on an international stage, and for the first time, Pornhub’s parent company, MindGeek, was thrown into the spotlight. This Montreal-based tech giant owns over 100 porn sites and subsidiaries and controls much of what people understand to be the mainstream porn industry today.

An aftermath of public outcry

What happened after the article was released was a tidal wave of public outcry and pressure for Pornhub and MindGeek to take responsibility for allowing the spread of nonconsensual content on their platforms, profiting from exploitation, and revictimizing survivors. Visa and Mastercard announced investigations into the claims of CSAM, warning that they would suspend their payment processing services on Pornhub if they found anything illicit.

In response, Pornhub made drastic site security changes that child exploitation survivors and anti-exploitation advocates had requested for years: increased moderation, disabling content download features that allow users to save videos, only permitting “verified” users to upload content to the site, and updating the profile verification process. It’s unclear whether MindGeek applied these increased security measures to all of its porn sites, but Pornhub announced these changes under the threat of losing payment processing services.

Despite the changes, payment processing giants Visa, Mastercard, and Discover severed ties with Pornhub, suspending payment processing on the site. They did not, however, sever ties with MindGeek's own ad company, TrafficJunky, which provides much of the financial support for MindGeek's porn tube sites, including Pornhub, by running ads on them. Note that 50% of Pornhub's reported revenue comes from advertisements, according to MindGeek executives, with the rest coming from site subscriptions and the sale of user data.

Just four days later, on December 14th, Pornhub deleted over 10 million videos that had been uploaded by unverified users. Pornhub's unfathomably large library of content, long considered an asset by the porn giant, had become a liability. The videos were deleted not because they were known to be exploitative, but because it could not be verified that they weren't. All that remains on Pornhub's site today are videos uploaded from verified accounts.

But here’s something to consider — do verified accounts completely solve the problem of nonconsensual content and CSAM? Unfortunately, they haven’t been shown to. There are numerous cases of sex trafficking victims and children being featured in content uploaded to verified accounts, including child trafficking survivors as well as dozens of trafficking victims in the infamous “GirlsDoPorn” case. Even “verified” content on Pornhub is not guaranteed to be exploitation-free.

And while these changes to Pornhub have been a step forward in preventing and removing some CSAM and nonconsensual content, there was still the matter of the porn site and its parent company, MindGeek, being held accountable.

MindGeek and Pornhub scrutinized by Canada’s Parliament

The Standing Committee on Access to Information, Privacy and Ethics of Canada's House of Commons (ETHI) launched an investigation into the Montreal-based company as part of the ripple effect of the New York Times article, calling child exploitation and rape survivors to testify about their exploitation and abuse being uploaded to Pornhub. The ETHI committee also called MindGeek executives to account for their actions and for the business model of their most popular site, Pornhub. And while the executives claimed that they reported cases of CSAM on Pornhub to child protection authorities as required by Canadian law, the evidence suggests otherwise.

During one ETHI committee meeting where child protection advocates were questioned about MindGeek's alleged actions, it was revealed that the porn giant had not reported a single case of CSAM to authorities, as required by Canadian law, until 2020. It appears that only after the New York Times article was published were more thorough steps taken to alert law enforcement to child exploitation on the site. It remains unclear whether MindGeek is working to hold all of its other sites and subsidiaries to account, as it appears to be doing with Pornhub.

Over the course of these ETHI committee meetings, the scope of the issue of CSAM and nonconsensual content on Pornhub was revealed. It’s become clear that this company, which controls much of the mainstream porn industry and claims to care about victims of child sexual abuse material and nonconsensual content, has reportedly only very recently put basic safeguards in place to prevent CSAM and nonconsensual content from spreading.

And these new, basic safeguards were put in place seemingly not because of the multitudes of victims of image-based abuse, trafficking, and child exploitation who had begged for years to have the videos and images of their exploitation removed, but reportedly because the company wanted to protect its financial success and preserve its bottom line.

As of this writing, the latest development in the public reckoning over MindGeek and Pornhub is whether the ETHI committee will recommend a full criminal investigation into the porn giant. In April, RCMP Commissioner Brenda Lucki reportedly told members of Parliament that a call for a criminal investigation into MindGeek and its website Pornhub is under review. According to a report by The Globe and Mail, Lucki said that the RCMP does not comment on whether an incident is under investigation, but that the matter is under review, including whether further action is required.

In the meantime, several child exploitation and child trafficking survivors, as well as dozens of other exploitation and trafficking survivors, have filed suit against MindGeek for allegedly failing to moderate and profiting from the videos of their abuse that were uploaded to Pornhub.

What’s happening now?

Nicholas Kristof wrote another article in April titled "Why Do We Let Corporations Profit From Rape Videos?" exposing the prevalence of CSAM and nonconsensual content on other porn tube sites, this time focusing on Pornhub's direct rivals: sister sites XVideos and XNXX.

Kristof's initial "Children of Pornhub" article and the subsequent events it sparked were years in the making. Countless advocates and survivors have unwaveringly raised the alarm, calling out the porn industry's exploitative and harmful business practices while shedding light on the innumerable cases of nonconsensual content on porn tube sites like Pornhub. This wave of events would not have been possible without their tireless work.

Now, it’s clear more than ever that taking a stand against exploitation and raising awareness on the harmful effects of porn to consumers, relationships, and our world has tangible effects. Giving visibility to the exploitative reality of the porn industry has never been more important, and we will continue to do the work as an organization to expose porn’s harms and the industry’s exploitative practices.

This fight isn’t over. To stay updated and see a detailed timeline of events of everything that’s happened since December, head over to FTND.org/PHtimeline.


Fight the New Drug exists to provide individuals the opportunity to make an informed decision regarding pornography by raising awareness on its harmful effects.