Technology: How Siri and Alexa lost the AI race – 03/16/2023 – Tech

On a rainy Tuesday in San Francisco, Apple executives took to the stage in a packed auditorium to introduce the fifth generation of the iPhone. The device, which looked identical to the previous version, brought a novelty that soon caught the public’s attention: Siri, a virtual assistant.

Scott Forstall, then Apple’s head of software, pressed a button on the device to summon the assistant and asked it questions. At his request, Siri checked the time in Paris (“8:16 PM,” it replied), defined the word “mitosis” (“Cell division in which the nucleus divides into nuclei containing the same number of chromosomes”), and pulled up a list of 14 highly rated Greek restaurants, five of them in Palo Alto, California.

“I’ve been in the field of AI for a long time, and it still blows my mind,” Forstall said.

That was 12 years ago. Since then, people have been far from impressed by Siri and the competing assistants powered by artificial intelligence, such as Amazon’s Alexa and Google Assistant. The technology has remained largely stagnant, and the talking assistants have become the butt of jokes, including in a 2018 “Saturday Night Live” skit featuring a smart speaker for seniors.

Now, the tech world is excited about another type of virtual assistant: chatbots. These AI-powered bots [short for “robots”], such as ChatGPT and the new ChatGPT Plus from the San Francisco company OpenAI, can quickly improvise responses to questions typed into a chat box. People are using ChatGPT to tackle complex tasks like writing software, drafting business proposals, and writing fiction.

And ChatGPT, which uses AI to guess which word comes next, is rapidly improving. A few months ago, it couldn’t write a proper haiku; now it can do so with flair. On Tuesday, OpenAI unveiled GPT-4, its next-generation AI engine, which powers ChatGPT.

The hype around chatbots illustrates how Siri, Alexa and other voice assistants – which once sparked similar excitement – have squandered their lead in the AI race.

Over the past decade, the products have run into obstacles. Siri hit technological snags, including clunky code that took weeks to update with basic features, said John Burkey, a former Apple engineer who worked on the assistant. Amazon and Google miscalculated how voice assistants would be used, leading the companies to invest in areas of the technology that rarely paid off, former employees said. When those experiments failed, internal enthusiasm for the technology waned, they said.

Voice assistants are “dumb as a rock,” Satya Nadella, Microsoft’s chief executive, said in an interview this month with The Financial Times, declaring that newer AI would lead the way. Microsoft has worked closely with OpenAI, investing $13 billion (R$68.7 billion) in the startup and incorporating its technology into the Bing search engine, among other products.

Apple declined to comment on Siri. Google said it is committed to providing a great virtual assistant to help people on their phones and inside their homes and cars; the company is separately testing a chatbot called Bard. Amazon said there was a 30% increase in customer engagement with Alexa globally last year, and that it is optimistic about its mission to build world-class AI.

Assistants and chatbots rely on different types of AI. Chatbots are powered by what are known as large language models, systems trained to recognize and generate text based on huge datasets pulled from the web. They can then suggest the words most likely to complete a sentence.
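
To make the idea concrete, here is a minimal, illustrative Python sketch of next-word prediction. Real large language models use neural networks trained on vast web corpora; this toy bigram counter only shows the basic mechanic of suggesting a likely next word, and the tiny corpus is invented for the example.

```python
from collections import Counter, defaultdict

# Toy "training" text; real models learn from huge datasets pulled from the web.
corpus = (
    "the cat sat on the mat . the dog sat on the rug . "
    "the cat chased the dog across the mat ."
).split()

# Count how often each word follows each other word (a simple bigram model).
next_word_counts = defaultdict(Counter)
for current, following in zip(corpus, corpus[1:]):
    next_word_counts[current][following] += 1

def suggest_next(word, k=3):
    """Return up to k words most often seen after `word` in the corpus."""
    return [w for w, _ in next_word_counts[word].most_common(k)]

print(suggest_next("the"))  # most common continuations of "the"
print(suggest_next("sat"))  # ['on']
```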

On the other hand, Siri, Alexa and Google Assistant are what are known as command-and-control systems. They can understand a finite list of questions and requests like “What’s the weather like in New York City?” or “Turn on the bedroom light.” If a user asks the virtual assistant to do something that isn’t in its code, the bot simply says it can’t help.
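
For contrast, here is a minimal sketch of the command-and-control approach, again in Python and with hypothetical patterns (not Siri’s, Alexa’s or Google’s actual code): the assistant matches a request against a fixed list of known commands and refuses anything that falls outside it.

```python
import re

# Hypothetical patterns for illustration; a real assistant's catalog is far larger,
# but the structure is the same: a fixed list of commands it knows how to handle.
COMMANDS = [
    (re.compile(r"what'?s the weather (?:like )?in (?P<city>.+)", re.IGNORECASE),
     lambda m: f"Fetching the weather for {m.group('city')}..."),
    (re.compile(r"turn (?P<state>on|off) the (?P<device>.+)", re.IGNORECASE),
     lambda m: f"Turning {m.group('state')} the {m.group('device')}."),
]

def handle(request):
    text = request.strip().rstrip("?.!")
    for pattern, action in COMMANDS:
        match = pattern.fullmatch(text)
        if match:
            return action(match)
    # Anything outside the fixed list gets a canned refusal, not a generated answer.
    return "Sorry, I can't help with that."

print(handle("What's the weather like in New York City?"))
print(handle("Turn on the bedroom light"))
print(handle("Write a haiku about spring"))  # falls through to the refusal
```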

Siri also had a cumbersome design that made it time-consuming to add new features, said Burkey, who was given the task of improving the assistant in 2014. Siri’s database contains a huge list of words, including the names of musical artists and places like restaurants, in nearly two dozen languages.

That made it “a big snowball,” he said. If someone wanted to add a word to Siri’s database, he added, “it goes into a big pile.”

So seemingly simple updates, like adding new phrases to the dataset, required rebuilding the entire database, which could take up to six weeks, Burkey said. Adding more complex features, like new search tools, could take nearly a year. That meant there was no way for Siri to become a creative assistant like ChatGPT, he said.

Alexa and the Google Assistant relied on technology similar to Siri’s, but the companies struggled to generate significant revenue from the assistants, former Amazon and Google executives said – in contrast, Apple successfully used Siri to attract buyers to its iPhones.

After Amazon launched the Echo, an Alexa-powered smart speaker, in 2014, the company hoped the product would help boost sales at its online store by letting consumers place orders by speaking to Alexa, said a former Amazon leader who worked on Alexa. But while people had fun playing with Alexa’s ability to answer weather queries and set alarms, few asked Alexa to order products, he added.

Amazon may have invested too much in making new types of hardware, such as now-discontinued alarm clocks and microwaves that worked with Alexa and sold at or below cost, the former executive said.

The company also invested little in creating an ecosystem that let people easily expand Alexa’s abilities, the way Apple did with its App Store, which helped spark interest in the iPhone, the person said. Although Amazon offered a “skills” store that let the assistant control third-party accessories like light switches, it was difficult for people to find and configure skills for the speakers – unlike the simple experience of downloading mobile apps from app stores.

“We never had an App Store moment for assistants,” said Carolina Milanesi, a consumer technology analyst at research firm Creative Strategies and an Amazon consultant.

At the end of last year, the division of Amazon that worked on Alexa was one of the main targets of the company’s 18,000 layoffs, and several top executives left the company.

Kinley Pearsall, an Amazon spokesperson, said Alexa was much more than a voice assistant, and “we’re very optimistic about that mission, as always.”

Amazon’s missteps may have led Google astray, said a former manager who worked on the Google Assistant. Google engineers spent years experimenting with their assistant to mimic what Alexa did, including designing smart speakers and voice-controlled tablet screens to manage home accessories like thermostats and light switches. The company later integrated advertisements into these household products, but they did not become a major source of revenue.

Over time, Google realized that most people used the voice assistant only for a limited number of simple tasks, like starting timers and playing music, the former manager said. In 2020, when Prabhakar Raghavan, a Google executive, took over Google Assistant, his group refocused the virtual companion as a prominent feature for Android smartphones.

In January, when Google’s parent company laid off 12,000 employees, the team working on operating systems for home devices lost 16% of its engineers.

Today, many big tech companies are rushing to craft their responses to ChatGPT. Last month, Apple held its annual AI summit, an internal event where employees learn about its large language model and other AI tools, two people familiar with the program said. Many engineers, including members of the Siri team, have been testing language-generation concepts every week, the people said.

On Tuesday, Google also said it will soon launch generative AI tools to help companies, governments and software developers build applications with built-in chatbots and include the underlying technology in their systems.

In the future, chatbot and voice assistant technologies will converge, AI experts say. That means people will be able to control chatbots with their voice, and those using products from Apple, Amazon and Google will be able to ask virtual assistants to help them with their work, not just with simple tasks like checking the weather.

“These products didn’t work in the past because we didn’t have human-level conversation capabilities,” said Aravind Srinivas, founder of Perplexity, an AI startup that offers a chatbot-based search engine. “Now we do.”

Translated by Luiz Roberto M. Gonçalves


