Big tech companies criticize the lack of debate on regulation in the country – 03/18/2023 – Politics

Internet platforms have criticized the discussion of new network regulation in Brazil, which they consider insufficiently transparent, and expressed concern about possible changes to the Marco Civil da Internet.

Folha spoke with representatives of six of the main platforms operating in Brazil. None of them is being formally heard in the conversations about regulation in the Chamber of Deputies, which is discussing PL 2630, or in the Executive, which is negotiating a proposal to be incorporated into the bill.

Neither the companies nor members of civil society have seen the proposal, and there has been no public discussion.

The government says it will open the text for discussion once there is a consensus version within the Executive to be negotiated with the Chamber. It also notes that the Fake News bill itself was the subject of several public hearings.

In a note sent to Folha, Google said it supports “informed public debate on creating regulatory measures to deal with societal challenges such as the phenomenon of disinformation and threats to the democratic process.”

“However, we believe it is important that any proposals be widely discussed with various sectors of society and designed to ensure the protection of fundamental rights such as freedom of expression, privacy and equal opportunities for all.”

The company also makes a veiled criticism of the type of regulation under discussion, which would supposedly benefit traditional communication groups. “It is also essential to ensure the maintenance of an economic environment that allows innovation and free competition, without favoring certain groups or sectors.”

The main platforms – Twitter, WhatsApp, Facebook and Instagram (Meta), Google and YouTube, TikTok, Kwai and Telegram – participate only in the working group set up by Minister Alexandre de Moraes, president of the TSE (Superior Electoral Court).

In the meetings conducted by the secretary general of the court, José Levi do Amaral, the objective is to reach a common proposal for self-regulation of the platforms. Companies have already sent suggestions. However, Moraes wants them to at least include some form of accountability for boosted or monetized content.

The platforms’ biggest concern is the prospect of changes to the Marco Civil, enacted in 2014. The Marco Civil is the main law regulating the internet in Brazil and determines that platforms can be held civilly liable for third-party content only if they fail to comply with court removal orders.

The proposal under discussion in the Executive provides for punishing big tech companies, even before a court order, for content involving racism, violations of the Democratic State Law, and violations of the rights of children and adolescents.

The text will be forwarded to and discussed with Deputy Orlando Silva (PC do B-SP), rapporteur for Bill 2630. The deputy supports accountability and has already said that a change to the Marco Civil is “inexorable”.

According to the government text, platforms would not have to proactively monitor content to detect illegal posts. They would be liable only if they knew about the illegal content and failed to act. This is the so-called “notice and action” mechanism found in the Digital Services Act, which has just entered into force in the European Union.

Platforms would need to offer a reporting channel that is easily accessible to users. When they receive a report, they would have to review it and decide whether the reported content violates the law and should therefore be removed. If they fail to act and the content is illegal, they can be held liable.

Every six months, companies would have to publish a report on the so-called “duty of care”, specifying complaints about allegedly illegal content, removals of posts that violate the law, and the mitigation measures adopted. The reports would undergo an independent audit.

Companies would not be punished for missing isolated pieces of illegal content; they would be fined only in cases of widespread non-compliance with the “duty of care”.

Members of the STF (Federal Supreme Court) such as ministers Gilmar Mendes, Alexandre de Moraes and Luís Roberto Barroso have already expressed support for holding platforms accountable for certain third-party content, such as those that incite violence or advocate a coup d’état.

The STF has convened a public hearing for March 28 to debate two extraordinary appeals that could change the Marco Civil. A decision in either case would have “general repercussion” (binding effect on lower courts) and could set a precedent for holding platforms civilly liable for content even before the court removal order that the Marco Civil currently requires.

Platforms see accountability as a threat to their business model. They argue that, to protect themselves, they would end up removing a multitude of posts to avoid possible punishment. They point to the difficulty of determining what content is “anti-democratic” or “hate speech”, since this depends on context.

Such an analysis would be much more difficult than assessing the types of content that companies already remove – copyright infringement, pornography and child sexual abuse material. They would need very specific parameters for what is illegal; otherwise, they will remove content just in case. And that, they say, would weaken the internet in Brazil as a space for exchanging ideas and reduce freedom of expression.

However, the government’s proposal provides that the assessment of compliance with the companies’ duty of care will take into account whether they erred through excessive removals, since content moderation would have to be “proportional”.

There is an awareness on the part of companies that the regulatory discussion environment has completely changed.

In early 2020, when the Fake News PL began to be processed in Congress, the discussion was centered on the need to expand media education and preserve freedom of expression. Now, after the Covid pandemic, the attack on the US Capitol in January 2021 and the coup violence in Brasilia in January 2023, there is enormous pressure to hold companies accountable for content that has real-world impacts.

A minority of the companies calculate that it will be necessary to make concessions, such as accepting liability for monetized or boosted content, or agreeing to very specific exceptions to Article 19 of the Marco Civil.

For others, however, any exception will demolish the Marco Civil, because it will open the door to litigation that will gradually erode immunity in other cases.

Others say the platforms’ existing rules, which broadly ban violent content, are enough. But they do not accept any form of liability for failing to enforce their own rules.

They claim that civil liability is not a panacea and that there are other tools far more appropriate than creating exceptions to Article 19. As suggestions, they mention tighter deadlines for complying with court orders, more transparency, user empowerment, more cooperation with authorities and more investment in fact-checking.


Understand what’s up for debate

What is the debate about the regulation of social networks?
Under the impact of the coup acts of January 8, the Lula government prepared a proposal for a provisional measure obliging networks to remove content that violates the Democratic State Law, such as incitement to a coup, with fines in cases of widespread non-compliance. Faced with resistance from Congress, Planalto backed down and is discussing including these measures in PL 2630, the so-called Fake News bill.

What is the Marco Civil da Internet?
It is a law establishing rights and duties for the use of the internet in the country. Article 19 of the framework exempts platforms from liability for damages caused by third-party content; that is, they are only subject to paying compensation, for example, if they fail to comply with a court removal order. The constitutionality of Article 19 is being questioned in the STF.

What is the discussion about this article?
The rule was passed with a view to ensuring freedom of expression. The justification is that, without it, networks would be encouraged to remove legitimate content for fear of being held liable. Critics, on the other hand, say the rule discourages companies from tackling harmful content.

Does the government’s proposal affect the Marco Civil?
The understanding is that the project would create one more exception in the Marco Civil. Today, companies are already required to remove non-consensual nude images even before a court order. The government wants coup-related content to also become an exception to the immunity granted by the law, although companies would not be subject to fines over isolated pieces of violating content found on the platform.

How has Congress reacted to the discussion?
Part of the Legislature criticizes Planalto’s proposal, arguing that accountability would lead companies to censor themselves to avoid sanctions. In addition, measures such as the creation of a regulatory body for platforms and parliamentary immunity on the networks are under study, the latter a point defended by Arthur Lira, president of the Chamber of Deputies.
