Highly realistic deepfake videos didn't quite make the splash some feared they would during the 2020 presidential election. (Less sophisticated "cheapfake" videos certainly did make the rounds, though.) Nevertheless, deepfakes are causing trouble for regular people. In March, the Federal Bureau of Investigation warned that it expected fraudsters to leverage "synthetic content for cyber … operations in the next 12-18 months." In deepfake videos, which first appeared in 2017, a computer-generated face (often of a real person) is superimposed on someone else. After the swap, the fraudsters can make the target person say or do just about anything. To get a realistic image, scammers use artificial intelligence that studies photos and videos of the person from different angles. Producing a deepfake video does not require being a professional hacker: many popular face-swap programs are free, and it is possible to create one even on an iPhone. (However, high-quality content still requires a powerful computer.) As the technology becomes more widely available, fraudsters have started to use it in different spheres of life. On Wednesday, the Times of India reported on a new type of cybercrime in India: sextortion involving deepfake porn videos, not of the victim but of the fraudster, at least at first. Usually, a scammer (using a profile photo of a woman, almost certainly a fake one) sends a friend request to the target on social media. Then the scammer exchanges text messages with the victim to build trust and finally makes a video call.