
Understanding Section 230 of the Communications Decency Act: Why does it exist and what does it mean?

Updated: Sep 18

This article was written by Kayelene Kerr from eSafeKids.


The internet and technology have transformed the ways we learn, work, create, play, connect and are entertained. It’s given our children access to the world, but it has also given the world access to our children. Children can gain immense benefits from being online, but there are also risks. Digital environments, including emerging environments, are not designed for children or with children’s safety in mind, yet they’re an integral part of their lives.


Globally, the increasing number of children online has been accompanied by a corresponding upward trend in online grooming, online child sexual abuse and exploitation, sextortion, youth-produced sexual content, image-based abuse, cyberbullying, exposure to pornography and other illegal, hurtful, harmful and age-inappropriate content. Much of what children are exposed to and navigating is too much, too soon. Digital harm is occurring on apps, platforms and online services at unprecedented levels.


Safety has been, and in many cases continues to be, an afterthought for technology companies. Sadly, and to the detriment of the health, wellbeing and safety of children, technology companies have clearly demonstrated that they put profits over people, profits over harm prevention, profits over child safety. Technology companies have shown time and time again that they will not honour their civic responsibility to ensure their networks, platforms and services are not used in ways that cause or contribute to violating children’s rights to be protected from harm.

 

Technology companies’ priority is revenue-generating activity, not children’s safety. These services have been developed in such a way that they create a supply chain of commercial activity, data processing, advertising and marketing. Persuasive technology and design features anticipate, and often guide, children and young people towards more extreme and harmful content. While the above-mentioned features may not have been intended to cause harm to children, experience has shown us they have ultimately facilitated and perpetuated it. For years technology companies have engaged in wilful blindness, prioritising commercial gain ahead of children’s safety. This is an incredibly complex landscape, but if technology companies continue to operate with the relative impunity afforded by Section 230 of the United States Communications Decency Act of 1996 (Section 230), these issues will be difficult to address.


Children’s rights to online safety should not only be respected but protected; however, safeguarding children in online places has an added layer of complexity.


Often cited as 'The Twenty-Six Words That Created the Internet', Section 230 stands as a cornerstone of online freedom and innovation. Enacted in 1996, this piece of legislation is often hailed as the bedrock of the modern internet, yet it remains one of the most frequently debated aspects of online law. While many Australians are unaware of Section 230, it very much impacts Australian citizens, including our children and young people.


Let's delve into what Section 230 is, why it matters and the ongoing discussions surrounding its future. Please note: this is a very brief overview, not a deep dive.


Section 230 of the Communications Decency Act, 1996

Section 230 of the Communications Act of 1934, enacted as part of the Communications Decency Act of 1996, provides limited federal immunity to providers and users of interactive computer services.


Section 230(c)

PROTECTION FOR “GOOD SAMARITAN” BLOCKING AND SCREENING OF OFFENSIVE MATERIAL


Section 230(c)(1)

TREATMENT OF PUBLISHER OR SPEAKER

No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.


In simple terms this shields online platforms from liability for user-generated content.


Section 230(c)(2)

CIVIL LIABILITY

No provider or user of an interactive computer service shall be held liable on account of

(A) any action voluntarily taken in good faith to restrict access to or availability of material that the provider or user considers to be obscene, lewd, lascivious, filthy, excessively violent, harassing, or otherwise objectionable, whether or not such material is constitutionally protected; or

(B) any action taken to enable or make available to information content providers or others the technical means to restrict access to material described in [subparagraph (A)].


In simple terms this allows platforms to moderate content on their services by removing or restricting access to it provided they act in good faith.


This means that online platforms like social media sites, forums and other digital services are not legally responsible for the content posted by their users. If someone posts defamatory, illegal or otherwise harmful content on a platform, the platform itself cannot be held liable for that content. Instead, responsibility rests with the original content creator. This shields technology companies that host astronomical amounts of content from being sued by anyone who feels wronged by something someone else has posted on the platform, or by content the platform has removed.


While this may sound reasonable, the reality is that it has also afforded technology companies a protection that many argue has been misused and abused. Section 230 has created expansive immunity for claims based on third-party content that appears online. Consequently, internet companies frequently rely on Section 230’s protections to avoid liability in litigation. In recent years it has been argued that the broad immunity courts have recognised under Section 230 goes beyond the law’s intended scope.


One of the primary benefits of Section 230 in the early days of the internet was that it fostered innovation and growth in the then emerging online space. By shielding platforms from legal liability for user-generated content, Section 230 has enabled the rise of a diverse range of services - from social media giants like Facebook and Snapchat to niche forums and blogs. Without this legal protection, many of these platforms might never have come into existence due to the risk of costly litigation.


“The primary thing we do on the internet is we talk to each other. It might be email, it might be social media, might be message boards, but we talk to each other. And a lot of those conversations are enabled by Section 230, which says that whoever’s allowing us to talk to each other isn’t liable for our conversations” said Eric Goldman, a professor at Santa Clara University specialising in internet law. "The Supreme Court could easily disturb or eliminate that basic proposition and say that the people allowing us to talk to each other are liable for those conversations. At which point they won’t allow us to talk to each other anymore."


If platforms were not immune under the law, they might not risk the legal liability that could come with hosting user-generated content. Alternatively, they may choose to abandon moderation altogether, and we could end up with a far worse online environment than we currently experience.


In the Australian context the long tail of Section 230 was recently cited in eSafety Commissioner v X Corp [2024] FCA 499. The Office of the eSafety Commissioner (eSafety) issued X Corp, an American corporation, with a removal notice under s109 of the Online Safety Act 2021 (Cth) (Act). The removal notice concerned 65 links to user-generated posts on the X platform. The 65 links all contained video footage of the stabbing of Bishop Mar Mari Emmanuel on 15 April 2024. While not a particularly explicit thread in the dispute, the spectre of s230 of the infamous Communications Decency Act lurked in several key documents, including powerful expert affidavits.


Challenges and Controversies

Despite its benefits, Section 230 has faced significant scrutiny and calls for reform. Critics argue that the law allows platforms to avoid responsibility for harmful content, such as misinformation, hate speech and illegal activity, and contend that platforms should be more accountable for the content they host and the ways they curate and recommend information.


Misinformation and Harmful Content

One major concern is the proliferation of misinformation and harmful content. Critics argue that large platforms have become breeding grounds for false information, hate speech and extremist views, and that Section 230 shields them from necessary accountability. Calls for reform seek to address these issues by imposing stricter regulations or by revising the protections offered to platforms.


The Power of Big Tech

Another point of contention is the power wielded by big tech companies. Critics argue that platforms like Facebook, Google and X, with their immense reach and influence, should bear more responsibility for the content they host. They suggest that the current interpretation of Section 230 allows these companies to avoid accountability for the negative impacts of their platforms whilst making inordinate profits.


Criminal Activity

Some argue the courts have expanded the scope of immunity for big tech companies too far. Technology companies have evaded the laws and regulations that apply to brick-and-mortar products and environments, and platforms have used Section 230 to evade laws and liability even when they knew their services were being used for criminal activity. The Australian eSafety Commissioner Julie Inman Grant describes the American regime: "Some of the worst terrorist and child sex abuse and revenge porn sites in the world are hosted in the US… They don't have a regulatory structure or government agency to go after these sites."


Looking forward

Ensuring that the internet is a vibrant, open and competitive environment is important, but so too is online safety. It's time for Section 230 to be recalibrated to take into account the vast technological advances that have occurred since 1996. Many of the once-fledgling technology companies are now titans, wielding significant wealth and influence. It's important to acknowledge that changes to Section 230 are likely to have many ripple effects around the world.


A final thought

Section 230 remains a critical element of the digital landscape. Its protections have enabled the growth of the internet and supported free expression, but it also faces significant scrutiny as the online world continues to change. The ongoing discourse around Section 230 highlights the need for a nuanced approach to internet regulation that balances innovation with accountability. As this conversation unfolds, it will shape the future of online communication and the broader digital ecosystem; only time will tell what that might look like.


Further information

These are complex global issues that require global responses. It is unfair and unreasonable for Australian children, young people, schools, parents/caregivers, educators, other professionals and community-based organisations to address these issues alone. For too long technology companies have not carried the responsibility of protecting children from online harms and for managing and mitigating the serious real-world consequences that are impacting the health, wellbeing and personal safety of children and young people. The current situation is untenable and can’t continue.

 

Knowing that it is not always easy for adults to talk about these topics I’ve developed workshops for parents, carers, educators and other professionals who support children and young people. These workshops explore the practical skills and strategies that can be used to support children as they navigate the online places they spend time. I translate the research into easy-to-understand information, enriched by practical examples. I’ve also developed and sourced books and resources to support in-home education and curriculum delivery.


To learn more about eSafeKids workshops and training visit our services page.


To view our wide range of child friendly resources visit our online shop.


Join the free eSafeKids online Members' Community. It has been created to support and inspire you in your home, school, organisation and/or community setting.


About The Author

Kayelene Kerr is recognised as one of Western Australia’s most experienced specialist providers of Protective Behaviours, Body Safety, Cyber Safety, Digital Wellness and Pornography education workshops. Kayelene is passionate about the prevention of child abuse and sexual exploitation, drawing on over 27 years’ experience in study and law enforcement, investigating sexual crimes, including technology-facilitated crimes. Kayelene delivers engaging and sought-after prevention education workshops to educate, equip and empower children and young people, and to help support parents, carers, educators and other professionals. Kayelene believes protecting children from harm is a shared responsibility and everyone can play a role in the care, safety and protection of children. Kayelene aims to inspire the trusted adults in children’s lives to tackle sometimes challenging topics.


About eSafeKids

eSafeKids strives to reduce and prevent harm through proactive prevention education, supporting and inspiring parents, carers, educators and other professionals to talk with children, young people and vulnerable adults about protective behaviours, body safety, cyber safety, digital wellness and pornography. eSafeKids is based in Perth, Western Australia.


eSafeKids provides books and resources to teach children about social and emotional intelligence, resilience, empathy, gender equality, consent, body safety, protective behaviours, cyber safety, digital wellness, media literacy, puberty and pornography.


eSafeKids books can support educators teaching protective behaviours and child abuse prevention education that aligns with the Western Australian Curriculum, Australian Curriculum, Early Years Learning Framework (EYLF) and National Quality Framework: National Quality Standards (NQS).


Educate, equip and empower children with knowledge through stories!

Reading with children provides an opportunity to teach vital life skills in a child friendly, fun, age and stage appropriate way. Reading books that are meaningful can have a lasting impact. Selecting books with teachable moments and content can assist you to discuss a wide range of topics, particularly those that are sometimes tricky and sensitive.
