TWICE
Girl group TWICE has responded strongly to the production and distribution of illegal deepfake* videos (AI-generated synthetic media). As sexual crimes involving the spread of deepfake sexual exploitation material have recently emerged as a serious social problem, the damage from such crimes is now spreading to the music industry.

TWICE’s agency, JYP Entertainment, announced via TWICE’s official fan community on August 30, “We take the recent spread of deepfake videos targeting our artists very seriously. This is a clearly illegal act, and we are currently collecting all related evidence and pursuing vigorous legal action through our law firm, without leniency.”
The agency then warned, “We will not sit idly by, and we will respond to the very end to any actions that violate the rights and interests of our artists.”
- DEEPFAKE: “Deepfake” is a portmanteau of “deep learning” and “fake.” It refers to technology that manipulates photos or videos to replace one person’s face with another’s. This allows a creator to make viewers believe a person appears in a particular kind of photo (often a nude image, or one placing them in a certain location) or, in a video, to make them appear to say anything.
Public figures such as celebrities, whose faces are widely known, are especially vulnerable to deepfake sex crimes because their photos are easy to obtain. Several stars in the Korean music industry, including singer Kwon Eun-bi and Yoo Jeong of the group Brave Girls, have already spoken out about harm from deepfake pornography. Taylor Swift, a globally popular pop star, also faced controversy in January over the distribution of deepfake pornography using her face.
Journalist: Shawn
Translator: Shawn
Source: SNS JYP Entertainment