Calls are growing to crack down on "deepfake" porn, which uses advanced imaging technology to superimpose a person's face onto a body in a pornographic video.

Smartphone apps that make it easy to produce such images have fueled their rapid spread on social media. The FacePlay app, released in August, has been downloaded more than 1 million times in Korea, and around 37,000 clips have been posted on Instagram.

A similar app called Reface has been downloaded more than 100 million times, and over 650,000 images made with it are circulating online.

A deepfake image shows the face of a male Chosun Ilbo reporter pasted onto the body of a woman.

A petition posted on the Cheong Wa Dae website in January called for tough punishment for people creating and uploading deepfake porn videos of Korean celebrities and drew around 390,000 signatures.

But increasingly, ordinary friends and acquaintances are becoming victims. Police in North Jeolla Province arrested a man in his 20s who created deepfake porn videos of a woman he met on the Internet and uploaded them to porn websites.

A growing number of ads on social media offer to create deepfake porn using photos sent by clients.

A government taskforce tracked 11,891 deepfake porn videos on adult websites and social media last year and found that 20 percent of them targeted ordinary people.

Park Sung-hye at the taskforce said, "Deepfake technology has become more accessible and we need to be more vigilant against abuses."

People who create and distribute sexually explicit deepfake videos or images face criminal charges. A new clause in Korea's sexual violence law makes the offense punishable by up to five years in prison or a fine of up to W50 million (US$1=W1,184).

Ahn Joo-young, an attorney, said, "It is a digital crime to create sexually explicit deepfake videos or images without the consent of others. They could also constitute defamation."