2002 · DOI: 10.1023/a:1015691211878

Cited by 5 publications (4 citation statements, published 2003–2023) · References 0 publications

Citation statements (ordered by relevance):
“…There is plenty of lawful but awful content spreading over the internet, ranging from discriminatory speech to medical misinformation. 193 The DSA did not require platforms to moderate such content by prescribing new content prohibitions, but rather regulated the systems and processes by which platforms enforce their own house rules. 194 That is to say, platforms are regarded as a mini-government assigned with the power to define and moderate harmful content within their house rules.…”
Section: Reduce Platforms' Concentrated Power Over Speech (mentioning)
confidence: 99%
“…199 Rather than imposing stringent liability on platforms for user-generated content or mandating comprehensive content monitoring, contemporary platform regulation ought to concentrate on establishing norms for platforms' operational procedures, including modifications to terms of service and algorithmic decision-making processes. 200 Accountable governance, such as necessary notifications and disclosures to users whenever platforms change their terms of service, can help reduce the information asymmetry between users and powerful gatekeeper platforms. 201 Meanwhile, users should be empowered to better understand how they can notify platforms about both problematic content and problematic takedown decisions and should be informed about how content moderation works on large platforms.…”
Section: Reduce Platforms' Concentrated Power Over Speech (mentioning)
confidence: 99%
“…This dynamic is exacerbated by increasing reliance on algorithmic enforcement, given the limitations of currently-existing technology in understanding the meaning and context of expressions. 106 Speech rules shift to reflect what algorithms are capable of assessing, rather than what is actually considered desirable on policy grounds: for example, when all nudity is treated as pornography because it is what can most easily be identified by image recognition software. 107 ”
Section: Ambiguous Categories and the Power of Interpretation (mentioning)
confidence: 99%
“…For example, contextual factors which are traditionally considered relevant in applying the law but which are harder to incorporate into industrial-scale moderation processes may be excluded entirely. 117 As noted above, this is exacerbated by automated enforcement, as standards shift to reflect the limited evaluative capabilities of software. 118 Safeguards provided by law - such as appeals systems for users, which the EU relies upon heavily in the Terrorist Content Regulation, Copyright Directive and Digital Services Act 119 - may not be effective or widely used in practice. 120 …”
Section: Ambiguous Categories and the Power of Interpretation (mentioning)
confidence: 99%