Understand which items of PL 2630 have to do with content – 05/06/2023 – Power

The “Fake News” bill (PL 2630), which aims to regulate social media, has generated disputes over how much it could affect freedom of expression.

The bill does not give the government or any other body the power to determine which specific pieces of content should be removed from the networks.

Even though it does not delegate that kind of power, the bill imposes content-related obligations on platforms, such as combating the dissemination of posts that constitute crimes against the democratic rule of law, crimes against children and adolescents, and racism, among others.

The body responsible for overseeing compliance and applying fines would not examine specific cases, but rather the measures companies take to comply.

It is not clear, however, to what extent such an analysis can be guaranteed to be free from bias. There is still no definition, for example, of which body will be assigned to supervise the application of the law, nor of how independent it will be from the government and the companies.

The bill’s vote in the Chamber of Deputies was postponed last Tuesday (2) to avoid a defeat for the Lula government, which supports the initiative. No date has been set for a new voting attempt.

Below, understand which parts of the bill relate to content, what the obligations and punishments are, and who would be responsible for applying them.

Does the text define or criminalize fake news? No. Despite the nickname the bill gained in 2020, when it began to be discussed in the Senate, the current version under discussion in the Chamber does not deal directly with misinformation, nor does it seek to define what fake news would be.

Is the text related to content? Yes. The current version of the bill imposes a series of obligations on platforms related to combating illegal content. The text includes, for example, a list of crimes over which companies have a “duty of care”.

Besides dealing with content, what else does PL 2630 determine? It establishes obligations for companies, such as publishing transparency reports, and rights for users, such as being notified when their content is moderated and being able to appeal those decisions.

Who will decide whether or not a post should be deleted? The task of moderating content and accounts would remain with the companies themselves. Today, except when a court decision orders removal, moderation is based solely on the rules established by the networks, which are often global. The bill obliges platforms to combat illegal content, according to a list of crimes defined in Brazilian law.

“Until now, we have only had their private rules, their community policies, defining what should be the object of peremptory content moderation action, carried out privately. Now we have the State saying: ‘look, I want you to look for this and for this,’” says Yasmin Curzi, professor of human rights and researcher at the Center for Technology and Society at FGV Direito Rio.

What punishments can companies suffer? From an administrative standpoint, companies may be subject to fines and other sanctions that can go as far as blocking, depending on the seriousness of the infraction. The text speaks of analyzing the companies’ “set of efforts and measures” and of “systematic non-compliance”.

In the case of specific posts, companies may continue to be the target of lawsuits filed by users requesting, for example, post removal and compensation for moral damages.

Does the bill change anything in companies’ civil liability for the content they host? Yes. Today, article 19 of the Marco Civil da Internet exempts platforms from liability for damages caused by third-party content. Under that law, they are liable for compensation only if they fail to comply with a prior court order for removal. The only exceptions are non-consensual nudity and copyrighted content.

The main change with PL 2630 is that there would be two new, more direct exceptions. One would apply to ads and boosted posts. The other would apply to posts on a given topic during the so-called “security protocol” (explained below).

What is the “systemic risk” analysis mentioned in the law? The bill provides that platforms must publish annual reports evaluating their “systemic risks”, considering aspects such as how their algorithms work, their content moderation systems, their terms of use, and how those terms are applied.

Topics such as the dissemination of illegal content, the guarantee and promotion of freedom of expression, the democratic rule of law and the soundness of the electoral process, racism and violence against women, among others, should be considered in the analysis.

“It [systemic risk] is itself an instrument that will encourage the platform to look at those five types of risk and proactively develop mitigation measures that will impact content,” says Francisco Brito Cruz, executive director of InternetLab.

He believes this legal instrument would not be tied to specific pieces of content, but would have a more general impact. “It will impact the structure, how much you [the platform], on average, will remove more or less.”

What is the “duty of care”? The text determines that platforms must prevent and mitigate illegal practices, such as content that constitutes crimes of terrorism, crimes against the democratic rule of law, racism, crimes against children and adolescents, and incitement to suicide, among others.

The bill provides for sanctions “in the event of systematic non-compliance” with the “duty of care” obligations. There is still no definition of which body will be responsible for carrying out that analysis.

According to the bill, the analysis will be based on the systemic risk and transparency reports, taking into account “the set of efforts and measures adopted by the providers, with no assessment of isolated cases”.

Bruna Martins dos Santos, a researcher at the Alexander von Humboldt Foundation and an activist with the Coalition Rights on the Network, says it is wrong to talk about censorship. “The approach will target content that the law already defines as criminal and that is normally already moderated,” she says. “Specific cases will continue to reach the Judiciary.”

What is the “security protocol”? The protocol could be triggered in three scenarios: when “imminent risk” is established, in cases of negligence, or when the company’s action fails. It would be necessary to specify which companies would be targeted by it.

It would last for 30 days, with the possibility of extension, and would have to be related to a specific topic.

During this period, moderation would remain the companies’ responsibility. The main consequence of activating the protocol is that it would change the Marco Civil’s liability exemption for content on that particular topic while the protocol lasts. It would be enough for the platform to have been notified of a piece of content and not have removed it to be ordered to pay compensation in a lawsuit for damages.

The protocol would be established by an administrative body, though it is not yet known which one. The text sets out some requirements for justifying its establishment, but points such as the protocol’s stages and objectives would be defined in regulation issued after the law is approved.

What does PL 2630 change about ads and boosted posts? Platforms could be held civilly liable for ads and boosted posts even without having disobeyed a court order for removal. In that case, someone would need to sue an advertiser or the author of a post, and the judge would need to find that there was damage and that the platform was also responsible for it. The rule is likely to affect which kinds of ads the platforms will and will not allow.

Who will monitor compliance with the law? An open point in the text is which body will be responsible for detailing the procedures for applying the law, as well as for overseeing it and applying fines.

What happens if that body abuses its prerogatives? Companies could turn to the Judiciary to challenge measures taken by the administrative body responsible for overseeing the law.
