Signed just this month, the California Age-Appropriate Design Code Act (“ADCA”) places responsibilities on businesses that provide a product or service likely to be accessed by children. The law takes effect on July 1, 2024, introducing new obligations that Trust & Safety teams should begin preparing for now.
ActiveFence spoke with Michal Brand-Gold, our VP General Counsel, to understand who the act will apply to, what its obligations are, and how companies can prepare.
The ADCA seeks to protect children’s data. Specifically, the act aims to mitigate the following risks:
The ADCA will apply to businesses that provide an online service, product, or feature likely to be accessed by children in California. These companies:
Platforms that the ADCA applies to must comply with the following requirements:
Under the ADCA, violators are liable for fines of up to $2,500 per affected child for each negligent violation and up to $7,500 per affected child for each intentional violation.
However, businesses in substantial compliance with the law will receive written notice before any enforcement action is initiated and will then have 90 days to cure the violations before penalties are imposed.
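Because these fines scale per affected child and per violation, potential exposure grows quickly. The following is a minimal Python sketch of that arithmetic: the per-child rates come from the act itself, while the function name and the scenario figures are hypothetical illustrations.

```python
# Illustrative sketch of the ADCA's per-child, per-violation fine structure.
# The rates come from the statute; all counts below are hypothetical.

NEGLIGENT_FINE_PER_CHILD = 2_500    # USD per affected child, per negligent violation
INTENTIONAL_FINE_PER_CHILD = 7_500  # USD per affected child, per intentional violation

def max_exposure(affected_children: int,
                 negligent_violations: int,
                 intentional_violations: int) -> int:
    """Upper-bound fine exposure under the ADCA's penalty schedule."""
    return affected_children * (
        negligent_violations * NEGLIGENT_FINE_PER_CHILD
        + intentional_violations * INTENTIONAL_FINE_PER_CHILD
    )

# Hypothetical scenario: one negligent violation affecting 10,000 children.
print(f"${max_exposure(10_000, 1, 0):,}")  # -> $25,000,000
```

Even a single negligent violation on a platform with a modest child audience can, in principle, produce an eight-figure exposure, which is why the 90-day cure window matters.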
An essential piece of the ADCA is the creation of a working group tasked with developing best practices for implementing the law and identifying the services required to comply with it.
Additionally, the working group can leverage the California Privacy Protection Agency (CPPA), which has years of experience developing data privacy policies.
Given that the law establishes a working group and can draw on the previously established CPPA, it seems likely that the ADCA will be actively enforced.
The California act is modeled after the UK’s Age Appropriate Design Code (the “Children’s Code”), which took effect in September 2021. Given the two codes’ similar requirements, many major tech companies have already implemented measures to meet the UK’s code and, by extension, much of California’s.
Major tech companies redesigned their online platforms to comply with the Children’s Code; however, we have not yet seen enforcement actions or fines under it.
Instead, the UK code’s enforcement authority, the Information Commissioner’s Office (ICO), has focused on helping companies find solutions, issuing design guidance, a self-assessment risk tool, and transparency best practices.
In the evolving legal landscape of online liability, preparedness is key to staying compliant. To do so, Trust & Safety teams must stay up to date on internet legislation worldwide. Visit our Trust & Safety Compliance Center to help ensure your platform is compliant.