Fake News bill will not affect Signal, says president of the app – 02/05/2023 – Politics



“The Fake News bill does not impose restrictions on communication apps like ours,” says Meredith Whittaker, president of the Signal Foundation, when asked about the impact of Bill 2,630/2020 on the technology developed by her organization.

Signal is a messaging app committed to not collecting personal data and to delivering fully encrypted messages. The organization does not disclose how many people use the technology, but the Play Store lists more than 100 million downloads on Android devices alone; competitor WhatsApp has been downloaded more than 5 billion times, and the Russian-founded Telegram, more than 1 billion.

Meredith says that the Russian app, which allows groups with more than 200,000 people, is different from Signal, which is limited to private conversations. Telegram also does not offer end-to-end encryption by default, the feature that prevents third parties from accessing the content of conversations.

In addition to leading the entity behind Signal, Meredith is a researcher, a former professor at New York University and president of the AI Now Institute, an organization critical of the concentration of power among technology giants and of the current use of artificial intelligence. Before that, she worked at Google for 13 years, leaving after learning of a contract between the search giant and the US Department of Defense.

The president of the Signal Foundation says she is waiting to see whether and in what form the Fake News bill will be approved, but that she is not against holding platforms accountable for their impact on the world. She says she keeps in contact with Brazilian research institutes such as InternetLab.

Signal is a non-profit organization, right? Is it a challenge to build a popular app without making a profit? Signal launched in 2011. People started using our app because they recognized that competing apps promote surveillance and Signal does not. It is, however, very difficult to operate as a nonprofit in the technology industry, where every part of the industry has been shaped by the surveillance economy. Everything we do goes against the grain of the tech industry, because we refuse to participate in the surveillance economy.

Is it necessary to change consumer culture, considering that users generally look for simpler tools, such as WhatsApp? I use WhatsApp sometimes, when there are people who don’t use Signal. We are always working to make Signal easier to use, more fun and more intuitive. People don’t pick up their phones to use an app, but to talk to a friend, to share the location of a party. We want the user not to have to think about privacy. Privacy is the default.

Is that the only difference? It is also important to mention that WhatsApp, Telegram and these other apps work as social media platforms, which allow wide dissemination through channels and other broadcast tools, which Signal does not have. They attract different forms of use. We really focus on being a private communication app, not a social media platform that allows broadcasts or other forms of mass communication. Very different from Telegram, for example, where you have these huge groups.

Why did Signal avoid adding mass-communication features? We did it consciously. Telegram has content moderation; it has people monitoring the big groups. They need those tools, both legally and because broadcast networks and social media don’t work without moderation, without ways to filter out spam, hate speech or other kinds of content that make it impossible for people to participate. There are real issues that Telegram faces that we don’t, because we made clear choices to remain a communication app rather than expand into a social media platform.

Is it possible to ensure privacy in large-group messaging apps like Telegram? Telegram talks a lot about privacy, but it is not a serious privacy app. They don’t encrypt groups, they don’t encrypt messages; they only encrypt private messages if the person turns the option on. Using Telegram, as a rule, there is no way to have more privacy, even though Pavel Durov [Telegram’s creator] constantly talks about how much they care about privacy.

Signal does a lot less marketing, but we are extremely serious about building technology whose privacy can be verified, and we take on Herculean tasks to protect every piece of information we can. We use the Signal protocol to protect messages, and then we use other techniques we have developed to protect users. We don’t know your contact list, we don’t know who is in the groups you belong to, we don’t know your name. That kind of personal information is sometimes more dangerous than the content of the message itself.
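One classic technique in this spirit, which Signal has discussed publicly in the past, is to compare hashed contact identifiers instead of raw phone numbers. The Python sketch below illustrates only that general idea; it is not Signal’s current contact-discovery mechanism, which relies on stronger protections precisely because plain hashes of phone numbers can be brute-forced.

# Simplified illustration of hashed contact discovery: the client sends hashes
# of phone numbers instead of the numbers themselves. NOTE: phone numbers have
# low entropy, so plain hashing can be reversed by brute force; real systems
# (including Signal's) use stronger mechanisms. Purely didactic.
import hashlib

def hash_number(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

# Server side: the registry of users is stored only as hashes.
registered_hashes = {hash_number(n) for n in ["+5511999990000", "+14155550123"]}

# Client side: hash the local address book before asking which contacts are registered.
address_book = ["+5511999990000", "+442071234567"]
matches = [n for n in address_book if hash_number(n) in registered_hashes]
print(matches)  # ['+5511999990000']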

The Brazilian Congress is currently debating a bill on the regulation of social networks, which includes provisions on platform accountability. Could this make it impossible for Signal to operate in the country with its model of private communication? I work with experts such as InternetLab and others in Brazil who know much more about the subject than I do, but from my reading of the bill, at this point, it would not implicate Signal. It would apply to Telegram, and it looks like it is specifically aimed at social media, which Signal is not. It remains to be seen how the vote turns out and how the law will be interpreted, but I will keep saying that I am not opposed to holding platforms accountable for their impacts on the world.

There’s a big difference between overseeing private communications, what you can say to your friends, your chat group, a journalist, and ensuring that a group of 200,000 people has moderation and doesn’t contribute to distorting public opinion and our notion of shared reality.

Folha reported on Monday (1st) that a Google search on the Fake News bill returns more negative news than positive. Google says it has no control over this, although it has invested heavily in advertising on the topic. Does that make sense? Of course they have control. But it could also have happened in other ways. Opponents of the bill may have bought a lot of ads on the topic. A company that opposes the text can produce a lot of search-optimized content that makes that material appear at the top.

There are many ways to rank highly in search without directly involving Google. Google may have put its finger on the website rankings; it’s possible, but we will never know. These big companies hold a lot of power without accountability. Anyone outside these companies is left to guess, because everything that happens inside them is invisible and protected by corporate secrecy. We don’t have the information we need to democratically govern these platforms.

These companies use a lot of public data available on the internet. Are there effective forms of control? It’s scary to think how much power they have gained in such a short period. On the other hand, we have some answers: the Fake News bill, the artificial intelligence law under discussion in the European Union, the Digital Markets Act, legislation that tries to inject some accountability into an industry that, at this point, is not trustworthy in any way. We also have other forces pushing back.

One example that made me very happy this morning is the Hollywood screenwriters’ union, whose members have seen their lives affected by studio and corporate choices. Writers who refuse to use certain technologies are paid less, and they have decided to strike over it.

One of the demands is that workers themselves decide whether and how artificial intelligence is used in their daily work. We can see that organized workers, the people who suffer most from the introduction of artificial intelligence by their bosses, are now fighting for the right to hold artificial intelligence accountable.

Is it possible to face the challenges of digital privacy and the concentration of power in these large companies without society having enough technical literacy to understand what is happening? Yes, of course. People have these conversations all the time without knowing they are about technology. In 2015, in the US state of Virginia, there was a huge wave of teacher strikes. Teachers walked out, arguing that the working conditions and the conditions in which students were studying were not working for anyone.

One of the reasons for the strike was the school district’s requirement to install an insurance app that tracked everywhere teachers went. It had access to photos and contact lists, and all of this was to monitor them in exchange for health coverage. In the US, people cannot access healthcare without insurance. The teachers, however, said they would not accept this situation. They wanted health insurance and also dignity, not a third-party app that sold their data. These teachers don’t necessarily have a computer science education. What they do know is that they don’t want someone with power over their lives following their every move. This is not a technical insight; it is about human dignity and the human desire for autonomy.

Has the privacy debate evolved in recent years? More and more people recognize its importance. I see it at Signal: our user base keeps growing. That reflects a growing awareness of the importance of privacy and a growing unease with the surveillance regimes of many US and Chinese companies. Signal is an alternative, built on a different business model, to messaging services like WhatsApp, Messenger and the others that have been the standard technologies for so long.

Can you share how many users Signal has? Unfortunately, we don’t share that number publicly. We have many millions of users, and it keeps growing. If you look at the Play Store, Google’s app store, which covers only Android devices, the app has been downloaded more than 100 million times.

Do Brazilians also seek out Signal? Yes, we have seen that there is a demand for a healthier use of technology in Brazil. People realize that Telegram is not private and that there are costs to using WhatsApp, and we are optimistic that people in Brazil will adopt Signal and prefer our application, because there really is no collection of personal data, which allows communication with privacy and dignity.

Could the lack of privacy pose risks for users? People are killed over it. We take this very seriously. We have spoken to people in Hong Kong, where it can really be the difference between disappearing and not disappearing.

And is it impossible to access messages sent on Signal? It is impossible today to hack Signal itself. We and other security testers have proven this time and time again. There are technologies like Pegasus [malicious software that gives a third party access to someone’s phone screen]. One has to be very careful to avoid them: don’t click strange links, don’t download anything, don’t open an unfamiliar PDF. With Pegasus, bad actors can listen to your playlist, check your email, open your messaging apps; they can do everything you can. But if they attack Signal’s servers, they won’t find anything, because we don’t store anything. Anyone can go to our website and see all the legal requests we have been forced to comply with. There is almost no data; it’s mostly blank pages.
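A minimal sketch of the end-to-end encryption principle she describes, using the PyNaCl library (Python bindings to libsodium): messages are encrypted on the sender’s device and can only be decrypted by the recipient, so a relaying server only ever handles ciphertext. This illustrates the general idea only, not Signal’s actual protocol, which adds forward secrecy, deniability and metadata protections on top of this basic model.

# End-to-end encryption in miniature with PyNaCl (pip install pynacl).
# Illustrative only; not Signal's protocol.
from nacl.public import PrivateKey, Box

# Each device generates its own key pair; private keys never leave the device.
alice_private = PrivateKey.generate()
bob_private = PrivateKey.generate()

# Alice encrypts for Bob using her private key and Bob's public key.
sending_box = Box(alice_private, bob_private.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual place at 8pm")

# The server only ever relays `ciphertext`. Without either private key,
# it cannot recover the message, so seizing the server yields nothing readable.
receiving_box = Box(bob_private, alice_private.public_key)
print(receiving_box.decrypt(ciphertext))  # b"Meet at the usual place at 8pm"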


MEREDITH WHITTAKER

She has chaired the Signal Foundation since 2022 and co-founded the AI Now Institute in 2017, of which she is also president. She worked for 13 years at Google, pursued an academic career and became a professor at New York University.

The journalist traveled to the Web Summit Rio at the invitation of Stone

