Taylor Swift: scammers release fake promotional video – 01/09/2024 – Tech



Taylor Swift’s affinity for cookware maker Le Creuset is real: Her cookware collection was featured on a Tumblr account dedicated to the pop star’s home decor, in an in-depth look at her kitchen published by Variety magazine, and in a Netflix documentary that was highlighted on Le Creuset’s Facebook page.

What’s not real: Taylor Swift’s endorsement of the company’s products. Ads featuring her image and voice have nonetheless appeared in recent weeks on Facebook and elsewhere.

The ads are among the many scams that use supposed celebrity video and audio and that have become far more convincing with artificial intelligence. In a single week in October, actor Tom Hanks, journalist Gayle King and YouTuber MrBeast said AI versions of themselves had been used, without permission, in misleading promotions for dental plans, iPhone giveaways and other advertisements.

In Taylor Swift’s case, experts said artificial intelligence helped create a synthetic version of the singer’s voice, which was stitched together with images of her alongside snippets showing Le Creuset Dutch ovens.

In several advertisements, the music star’s cloned voice addressed the “Swifties” — as her fans are known — and said she was “enthusiastic” about giving away free sets of kitchen utensils. All people had to do was click a button and answer a few questions before the end of the day.

Le Creuset said it was not involved with the singer in any consumer sweepstakes. The company urged shoppers to check their official online accounts before clicking on suspicious ads. Representatives for Taylor Swift, who was named Time magazine’s Person of the Year in 2023, did not respond to requests for comment.

Famous people have lent their image to advertisers for as long as advertising has existed. Sometimes this happens against their will. More than three decades ago, Tom Waits sued Frito-Lay — and won nearly $2.5 million — after the snack company imitated the singer in a radio ad without his permission.

The scam campaign that used Le Creuset also featured fabricated versions of presenters Martha Stewart and Oprah Winfrey. In 2022, Winfrey posted a video criticizing the flood of false advertisements on social media, in emails and on websites that falsely claimed she endorsed a weight-loss product.

Over the past year, significant advances in artificial intelligence have made it much easier to produce “deepfakes” — unauthorized digital replicas of real people. Audio fakes have been especially easy to produce and difficult to identify, said Siwei Lyu, a computer science professor who directs the Media Forensic Laboratory at the University at Buffalo in New York.

The fake ad involving Le Creuset and Taylor Swift was likely created using a text-to-speech service, Lyu said. These tools typically translate a script into an AI-generated voice, which can then be embedded into existing video footage using lip-sync programs.

“These tools are becoming very affordable nowadays,” said Lyu, who added that it was possible to make a “decent quality video” in less than 45 minutes. “It’s getting really easy, which is why we’re seeing more of it.”

Dozens of separate but similar Le Creuset scam ads featuring Swift — many of them posted this month — were still visible as of late last week in the public Ad Library of Meta, the company that owns Facebook and Instagram. The campaign also ran on TikTok.

The ads directed viewers to websites that imitated legitimate outlets, such as the Food Network television channel, which displayed fake news coverage of the Le Creuset offering alongside fabricated customer testimonials.

Participants were asked to pay a “small shipping fee of $9.96” for the cookware. Those who complied faced hidden monthly charges without ever receiving the promised products.

Some of the fake Le Creuset ads, like one imitating interior designer Joanna Gaines, had a deceptive appearance of legitimacy on social media thanks to tags identifying them as sponsored posts or originating from verified accounts.

In April, the Better Business Bureau warned consumers that fake AI celebrity scams were “more convincing than ever.” Victims were often left with higher than expected charges and no response regarding the product they had ordered. Banks have also reported attempts by scammers to use voice “deepfakes,” or synthetic replicas of real people’s voices, to commit financial fraud.

Over the past year, several well-known people have publicly distanced themselves from ads featuring their image or voice manipulated by AI.

This summer, fake advertisements spread online that purportedly showed country singer Luke Combs promoting weight-loss gummies recommended to him by fellow country singer Lainey Wilson.

The singer posted a video on Instagram denouncing the ads, saying that “people will do anything to make money, even if it means lying.” Combs’ manager, Chris Kappy, also announced on Instagram that the singer had no involvement in the gummy campaign and accused foreign companies of using artificial intelligence to replicate his client’s image.

“For other entrepreneurs out there, AI is a scary thing, and they are using it against us,” Kappy said.

Companies say they are fighting scammers

A TikTok spokesperson said the app’s ad policy requires advertisers to obtain consent for “any synthetic media that contains a public figure,” adding that TikTok’s community standards require creators to disclose “synthetic or manipulated media that show realistic scenes.”

Meta reported that it has taken action against ads that violate its policies and that it prohibits content that uses public figures in a misleading way to try to deceive users and obtain money. The company said it has taken legal action against some of the scammers, but added that malicious ads often manage to escape Meta’s review systems by hiding their content.

With no federal law addressing AI scams, lawmakers have proposed legislation that aims to limit their impact. Two bills introduced in the United States Congress last year would require safeguards such as content labels or permission to use someone’s voice or image.
