Fake News PL: understand what the text says about criminal content and compliance with court orders

STF Minister Alexandre de Moraes and Elon Musk. Photos: Rosinei Coutinho/SCO/STF and REUTERS/Gonzalo Fuentes

The recent clashes between Elon Musk, owner of X, and Justice Alexandre de Moraes of the Federal Supreme Court (STF) have revived pressure on Congress to approve a bill regulating digital platforms – the so-called Fake News Bill.

In recent days, Musk has attacked STF decisions ordering the blocking of profiles on X. On Saturday (6), the businessman stated that he would not comply with Moraes' orders and would reactivate blocked accounts, disregarding court decisions. In response, Moraes set a daily fine of R$100,000 for each profile that X reactivates without authorization. The minister also ordered an investigation into the platform and the inclusion of Musk in the digital militias inquiry.

In the wake of Musk's moves, ministers and lawmakers began to defend the regulation of social networks. Representative Orlando Silva (PCdoB-SP), rapporteur of the Fake News Bill, said he will ask the president of the Chamber of Deputies, Arthur Lira (PP-AL), to place the proposal on the voting agenda.

The bill makes it a crime to promote or finance the mass dissemination of messages with untrue content through accounts controlled by bots. It also changes the rules on platforms' liability for criminal content and sets deadlines for compliance with court decisions.

The Fake News Bill has already been approved by the Senate but stalled in the Chamber. In 2023, Lira took up the negotiations himself. Last May, judging that there were not enough votes to approve the text, he postponed its analysis in the House plenary. Since then, he considered splitting the bill into smaller pieces, but the proposal did not move forward. Criticism from the evangelical caucus and pressure on lawmakers from big tech companies (the technology giants that control social networks) are seen as the main reasons for the text's retreat in the Chamber.
Last year, Arthur Lira signaled to interlocutors that the text would only regain momentum if the Federal Supreme Court (STF) issued a decision obliging Congress to legislate on the topic.

See below what the Fake News Bill says about:

- accountability of platforms
- duty of care
- judicial decisions
- punishments

Accountability of platforms

The latest version of the opinion filed by Orlando Silva in the Chamber establishes that platforms may be held civilly liable for criminal content published by users, provided it is proven that the company ignored risks and abandoned moderation mechanisms. Platforms will also be liable when criminal content is distributed through paid promotion and advertising tools. These measures amend the Marco Civil da Internet, under which providers can only be held responsible if, after a court order, they fail to remove criminal content.

Duty of care

According to the text, companies must adopt a protocol to assess risks related to their platforms and algorithms. This assessment should address, for example, the dissemination of content against the democratic rule of law and of prejudiced publications. Based on this analysis, companies will have to adopt risk-mitigation measures.

The bill also creates a so-called "duty of care" which, if ignored, can lead to the platform being held liable. The mechanism requires providers to act "diligently" to prevent or mitigate illicit content published on their platforms. Negligence by a company, or the identification of risks, may trigger the opening of a security protocol. Once that procedure starts, platforms may be held responsible for failing to act on user reports of criminal content available on their networks.

Content moderation is also covered by the bill. According to the text, the procedure must follow the "principles of necessity, proportionality and non-discrimination".
It also establishes that moderation decisions must be communicated to users, along with the reasoning behind the measure and the available appeal mechanisms.

Judicial decisions

The proposal establishes that digital platforms must comply, within 24 hours, with court decisions ordering the removal of criminal content. Non-compliance can be punished with a fine of up to R$1 million per hour, which can be tripled if the content was boosted with paid resources. Removed publications and the access data of the user responsible for the content must be stored for six months. According to the text, the platform must report to the authorities any signs of threats to a person's life.

Punishments

In addition to liability before the Judiciary, companies that fail to comply with the measures provided for in the text may be punished with, for example:

- a warning
- a daily fine of up to R$50 million
- a fine of up to 10% of the company's revenue in Brazil
- a fine per user
- a fine of up to R$50 million per infraction and temporary suspension of activities in Brazil

The proposal also provides that all companies operating in Brazil must have legal representatives in the country.
