A report from the US House of Representatives was made public yesterday. In short, X is in the crosshairs because it has (partly) stepped outside the logic of algorithm manipulation progressively put in place since 2015 (well before the DSA) to bury legal content that the European Commission nonetheless deems "borderline". This undesirable content includes:
- populist rhetoric
- anti-government/anti-EU content
- anti-elite content
- political satire
- anti-migrant and islamophobic content
- anti-refugee/immigrant sentiment
- anti-LGBT content
- meme subculture
The document in question:
https://judiciary.house.g[...]THREAT-PART-II-2-3-26.pdf
Some excerpts:
Across all of these subgroups, there were more than 90 meetings between platforms, censorious civil society organizations (CSOs), and European Commission regulators between late 2022 and 2024.
These DSA Election Guidelines were branded as voluntary best practices. But behind closed doors, the European Commission made clear that the Election Guidelines were obligatory. Prabhat Agarwal, the head of the Commission’s DSA enforcement unit, described the Guidelines as a floor for DSA compliance, telling platforms that if they deviated from the best practices, they would need to “have alternative measures that are equal or better.”
Since the DSA came into force in 2023, the European Commission has pressured platforms to censor content ahead of national elections in Slovakia, the Netherlands, France, Moldova, Romania, and Ireland, in addition to the EU elections in June 2024. Nonpublic documents produced to the Committee pursuant to subpoena demonstrate how the European Commission regularly pressured platforms ahead of EU Member State national elections in order to disadvantage conservative or populist political parties.
The European Commission also helped to organize “rapid response systems” where government-approved third parties were empowered to make priority censorship requests that almost exclusively targeted the ruling party’s opposition. TikTok reported to the European Commission that it censored over 45,000 pieces of alleged “misinformation,” including clear political speech on topics including “migration, climate change, security and defence and LGBTQ rights,” ahead of the 2024 EU elections.