Can the Philippines confront threats of deepfake porn?

The Philippines' law enforcement and legal systems will soon be faced with the rising threat of deepfake pornography. Given legal and technological gaps, can authorities effectively protect the vulnerable and go after perpetrators?

MANILA, Philippines — The horror of deepfake pornography has swept across the world like a pandemic as the technology becomes more and more accessible to the public. 

In the Philippines, deepfakes are typically used as memes, with netizens able to generate made-up videos of famous personalities dancing to whatever TikTok song is viral at the moment. With the 2025 midterm elections approaching, one might see an increased use of deepfakes emulating the likeness of politicians, either to spread fake news or simply to make more comedic content. 

But deepfakes can have a more sinister use. In South Korea this year, underage girls were horrified to find fake photos and videos of themselves engaging in sexual acts, all created with artificial intelligence (AI), often by their own classmates. 

Even celebrities are not immune, with global pop star Taylor Swift recently becoming a victim of AI-generated porn. 

This is the horror of deepfake porn: a person’s likeness is stolen and used in an invasive and perverse manner. 

Deepfake is here

The question is not if deepfake porn is already in the Philippines: it is. A quick Google search would reveal disturbingly realistic deepfakes of Filipina celebrities engaging in pornographic acts. 

Christopher Porras, a gender and development researcher, told Philstar.com that creating deepfake pornography is “alarmingly easy,” adding that it could be done within a few hours.

“With the Philippines being a highly connected nation in terms of social media usage, cases of cyber sextortion and image-based abuse have become more common,” Porras said. 

Porras cited a local study which found that there were cases of deepfake pornography being used to blackmail victims. 

"The ease with which this content can be produced makes it a potent tool for cybercriminals and an increasing threat in the landscape of gender-based violence."

“The ease with which this content can be produced makes it a potent tool for cybercriminals and an increasing threat in the landscape of gender-based violence,” Porras said. 

The question then becomes: How will the Philippines combat the proliferation of deepfake pornography? 

An uncharted legal territory

Gabriela Women's Party legal counsel Minnie Lopez has, fortunately, not yet encountered a case of deepfake porn in her legal career. But having observed the deepfake pandemic in South Korea, she believes she may encounter one soon enough. 

Lopez told Philstar.com that the Philippines has no laws yet to prosecute someone who creates deepfake porn. 

“It’s very dangerous and doubly dangerous because we have no laws protecting the victims against deepfake pornography or AI-generated porn or AI-generated sexual abuses,” Lopez said in a mix of English and Filipino. 

This is not to say that there are no laws that protect victims of deepfake porn. However, lawyers like Lopez will have to get creative with interpreting existing legislation. 

Laws like the Safe Spaces Act or the Anti-Photo and Video Voyeurism Act may be used to prosecute perpetrators of deepfake porn, despite not having the terms deepfake or AI written into the law. 

"Because it's electronic evidence, you have to authenticate it, you have to prove the chain of custody from the time that you saved it, took a screenshot, etc. Prosecution is really hard."

The problem lies in the technicalities, Lopez said. She has handled cybercrime and cybersex cases that were dismissed on technicalities. For AI and deepfake porn, the evidence may clearly be there, but a clear trail of accountability still has to be established. 

“Because it's electronic evidence, you have to authenticate it, you have to prove the chain of custody from the time that you saved it, took a screenshot, etc. Prosecution is really hard,” Lopez said.

Passing a law targeting deepfake porn is also difficult. Unless a law is certified as urgent by the president, the legislative mill often runs slow. Bills can get bogged down for years, even decades. 

The advancement of technology easily outpaces the rate at which laws are passed, Lopez said. 

Law enforcement in the AI era

Even if laws specifically pertaining to AI and deepfake porn are passed, enforcing them is a completely different matter, Lopez said. 

Both the Philippine National Police (PNP) and the National Bureau of Investigation (NBI) have their own cybercrime units. Philstar.com spoke to PNP Anti-Cybercrime Group Cyber Response Unit Chief Jay Guillermo to discuss their countermeasures against deepfake pornography.

Guillermo admitted they have yet to receive a complaint on deepfake pornography, which he said is not even legally defined. “Deepfake... actually we do not have our own definition,” Guillermo said in Filipino.

Victims could still opt to file a libel case since their reputations are being tarnished, he suggested.

When it comes to investigating these cases, Guillermo stressed the need for additional legal tools. One proposed law, not yet passed, would grant authorities the power to compel telcos to surrender crucial data for investigations into cybersex crimes.

Guillermo also distinguished between deepfake pornography and more conventional cybersex crimes, such as the non-consensual sharing of explicit materials. In traditional cases, the law clearly identifies both the victim and the perpetrator. This is where deepfake pornography presents a unique challenge: there may not be a readily identifiable victim willing to step forward with a complaint.

“Deepfake pornography it’s hard to… you know, cases need to have a victim,” Guillermo said.

Potential victims may hesitate to file complaints, Guillermo said, knowing the explicit images aren't really them, even if they look convincing.

Vulnerable to virtual exploitation

But Lopez, the lawyer from women's group Gabriela, rejected the notion that deepfake pornography is a victimless crime. She said perpetrators are aware of what they are doing, and their claim that no one is getting hurt is just an old excuse. 

“It’s an attack on their dignity. They do not have consent, they do not know,” the lawyer said, referring to victims.

Lopez believes deepfakes have enabled new forms of abuse and exploitation, particularly affecting vulnerable groups.

Porras, the researcher, pointed to the sheer amount of explicit materials with women available online to turn into deepfake pornography. 

“There is an abundance of publicly accessible images of women online, which perpetrators can easily manipulate. Social and cultural norms around women’s sexuality in countries like the Philippines often exacerbate the harm caused by deepfakes, as victim-survivors face not only emotional trauma but also severe reputational damage,” Porras said. 

Porras and Guillermo both advised victims of deepfake pornography to report the incidents to authorities. But both also admitted that there are gaps extending beyond the scope of legal mechanisms: victims would have to depend on institutions that have shortcomings in preventing and pursuing cases of deepfake-enabled abuse.

“Combating deepfake pornography requires a multi-faceted approach that combines legal, technological and societal solutions,” Porras said.
