A bill aimed at combating the creation and dissemination of CSAM (child sexual abuse material) using artificial intelligence was recently introduced in the California state legislature. Assembly Bill 1831 was introduced by Assemblymember Marc Berman of Menlo Park in partnership with Ventura County District Attorney Erik Nasarenko. According to a news release, the bill seeks to “address the escalating threat posed by artificial intelligence (AI) in the creation of lifelike, illicit content involving children.”
AB 1831 is a response to recent incidents of AI-generated child abuse material, including an incident at a Beverly Hills middle school where students were caught sharing nude images of their classmates. The photos were made by superimposing headshots of the students onto nude bodies created by artificial intelligence software. An investigation into the incident, which came to light in February 2024, revealed that 16 students were victimized, and 5 students were expelled for their involvement.
California is not the only state dealing with an epidemic of AI-created child pornography. In September of last year, prosecutors from across the U.S. sent a letter to Congress, urging leaders from both parties to come together and “study the means and methods of AI that can be used to exploit children specifically.”
Using software and online technology like Photoshop to create revenge porn and other altered images has been a problem for many years. But nowadays, it’s not just adults who are being victimized; the abuse has spread to children, whose images can be used to create sexually explicit content. There is plenty of software available on the dark web, not to mention tools on social media platforms that can be exploited to generate and distribute images of child abuse.
Even kids can easily download the necessary technology and use it to “nudify” images of innocent girls and boys, then share them with their friends, who share them with other people, and so on. Eventually, it becomes impossible to track where these images have been sent, and the unfortunate reality is that the photos remain out there on the dark web to be used and sold by anybody.
Hopefully, Democratic and Republican leaders in Congress can put aside their differences and come up with federal legislation to deal with the proliferation of AI-generated child sexual abuse material. We wholeheartedly agree with the words of Ventura County DA Erik Nasarenko, who said in a public statement, “As technology evolves, so must our laws. This bill sends a clear message that our society will not tolerate the malicious use of artificial intelligence to produce harmful sexual content involving minors.”
As Assembly Bill 1831 makes its way through the state legislature, we are left with the question of how to bring justice to those whose lives have been permanently damaged by child sexual abuse content. There is the option of filing criminal charges, but you may also have the right to file a lawsuit for monetary compensation. For more information on your rights and legal options, contact us to speak with an AI-generated child nude and pornography lawsuit attorney.

Child Sexual Abuse Imagery and Content Laws in California
Prior to the introduction of Assembly Bill 1831, another law was passed regarding the facilitation and distribution of sexually explicit content involving minors. Assembly Bill 1394, which was signed into law by Governor Gavin Newsom in October 2023, aims to punish web services, primarily social media sites, for “knowingly facilitating, aiding, or abetting commercial sexual exploitation” of underage individuals.
Though this law will not go into effect until January 1, 2025, there has been much discussion about the liabilities and penalties that will apply to web service providers that fail to audit and remove child abuse content from their platforms. Furthermore, these companies can face criminal penalties for “deploy[ing] a system, design, feature, or affordance that is a substantial factor” in causing children to be victimized through “commercial sexual exploitation.”
While the law doesn’t explicitly say so, we can infer that this passage refers to technological features on social media apps that are regularly exploited to upload and share images of child sexual abuse. Under AB 1394, companies like TikTok and Instagram can face criminal and civil penalties for failing to properly audit such content or enforce rules and protective measures to keep predatory individuals off their platforms.
It comes as no surprise to us that Assembly Bill 1831 followed so closely on the heels of AB 1394, as California ramps up its efforts to prevent AI-generated child abuse imagery. Of course, there is no way to truly eradicate these practices, which is why it’s essential to go after anyone who creates this type of content or enables those who do.
How is AI Technology Used to Create Child Sexual Abuse Material?
We know that many parents are overwhelmed by the technology their kids use on a regular basis. But it’s crucial to understand the basics of artificial intelligence software and why it’s so dangerous to children, who spend much of their time on cell phones, tablets, and computers. Between May and June 2023, the UK-based watchdog organization the Internet Watch Foundation (IWF) investigated 29 cases of suspected child abuse content generated by artificial intelligence.
The IWF found that AI-generated nude and pornographic images of kids online had increased dramatically in just a few years. The group warned that AI-altered images of children have essentially flooded the internet, and most of these were created from pictures taken from social media sites. But children aren’t the only ones victimized by the criminal use of artificial intelligence. Photos and videos of adults have been “de-aged” using AI for the purpose of depicting them in situations of child sexual assault.
One of the leading AI tools used for this purpose is Stable Diffusion, a product of the open-source artificial intelligence company Stability AI. The company is quick to point out that it “prohibits any misuse for illegal or immoral purposes across our platforms.” But this is precisely how the software is used by some of its customers, and as a result, the company cannot simply claim that it bears zero responsibility for CSAM generated by its product. These companies also have a legal duty to investigate complaints from innocent victims and to comply with law enforcement investigations if someone has used their technology to create content related to child sexual abuse.
Sadly, we are all too aware that social media and software companies are reluctant to do their part when it comes to images, videos, and other content that’s harmful to children. Complaints are dismissed or not referred to the authorities, and these companies rarely conduct a proper investigation. If anything, the victim is told that nothing was found to substantiate their claims, followed by a bunch of legal jargon on why the company has no obligation to help them.
It’s no wonder the abusers get away with their actions and continue to prey on kids who will forever live with the scars of sexual exploitation using artificial intelligence technology.
Can I Sue as a Victim of CSAM Created by Artificial Intelligence?
Yes, you can sue if someone used your likeness to create AI-generated child sexual abuse imagery or videos. People who make, possess, and distribute such content argue that no one actually gets hurt, since these are artificially generated images. Or, they blame the victims or their parents for “putting themselves out there” on public accounts that anyone can access. We have even heard some people say that the images are not pornographic, but rather “artistic representations.”
These arguments, however, are no defense to the fact that it is illegal to create depictions of minors in sexual situations, including nude photos generated by AI technology. And there’s no excuse for taking advantage of someone in this manner, knowing that these images are out there forever to be used by countless people, shared and sold on porn sites, with or without the victim’s knowledge. Aside from the trauma and psychological damage, victims can be hurt by extortion attempts from scammers who threaten to send the photos and videos to loved ones and employers unless they pay an exorbitant sum of money.
This is why AI-generated CSAM is not just a collection of fake images that can be laughed off as a joke. And the people and entities that played a role in causing you to be victimized cannot be allowed to go unpunished. Thus, it’s important to investigate each of these claims and go after all the parties that are responsible, whether it’s a social media site, an AI software developer, or a public school district.
These are just some of the entities that have a duty of care to protect minors from sexual exploitation and assault. A school district, for example, must take immediate action when there are suspicions or complaints of AI-generated CSAM featuring its students. The middle school in Beverly Hills did its part by conducting an investigation and expelling the guilty students. However, we know of more than one school system where such incidents were swept under the rug and victims were told that it was all a silly prank.
No matter who is liable, counsel from an experienced sexual abuse lawyer is the key to obtaining justice in an AI-generated child pornography lawsuit. Please contact our office to schedule a free consultation if your likeness was used for CSAM generated by artificial intelligence.
Value of an AI-Generated Child Sexual Abuse Material Lawsuit
Lawsuits for the sexual abuse of a minor often result in settlements between $1,000,000 and $5,000,000. However, there are significant variations in the amount you can receive, with some cases settling for under $500,000 and others for upwards of $10,000,000. The differences are based on the degree of injury, the long-term effect on the victim’s life, their current and future monetary losses, and many other factors. That’s why there is no single amount that applies to every lawsuit for AI-generated child sexual abuse material. However, it’s safe to say that one can generally expect payments of 6 to 7 figures, and up to 8 figures for the most extreme cases of abuse and harm to the victim.
How Long Does It Take to Settle One of These Lawsuits?
The average amount of time to settle a lawsuit for child sexual exploitation is around 12 to 24 months, with some cases taking over 3 years due to issues that are beyond our control. One thing we can say for sure is that most cases will be settled privately between the plaintiff and the defendant, so there is very little chance that your case will go to trial. Even so, it’s uncommon for these cases to settle in less than a year, especially in a relatively new field of law such as child abuse via artificial intelligence.
What is the Statute of Limitations to Sue for AI-Generated CSAM?
Ordinarily, victims have 22 years from the date they turn 18 (i.e., until age 40) if they wish to sue for sexual abuse that occurred during their childhood. But there are circumstances where a victim has repressed the trauma of sexual abuse for a very long time. Because they spend years running from and denying the memories of abuse, it can take a very long time before they understand the harm they’ve suffered.
The majority of our clients are older adults who end up in therapy for mental health issues, ranging from mood disorders to long-term substance abuse. It can take quite a while before they reveal what happened to them as children and learn how those incidents are connected to their current suffering. This is why California law has a discovery rule, under which victims have 5 years to file a lawsuit from the time they learn of an injury or illness that resulted from child sexual assault and exploitation.
Our lawyers can help you determine the exact deadline for a lawsuit based on your circumstances, so contact us right away to explore the option of suing for artificially generated nude photos or child pornography.
Contact Our Law Firm
The sexual abuse lawsuit attorneys of Normandie are here for you 24/7 if you need advice on what to do as a victim of AI-generated CSAM and other forms of child sexual exploitation. We have a Zero Fee Guarantee policy, so you pay absolutely nothing if you decide to file a lawsuit. All legal fees are obtained from the defendant as long as we win your case. And if we don’t recover a settlement for you, you owe us $0, so there is never any risk to your finances.
Please contact our law firm and schedule a free case review. We look forward to fighting for your interests and bringing you the compensation you are entitled to.
Other Pages on Our Website Related to This Topic
Polinsky Children’s Center Foster Care Group Home Sexual Abuse Lawyer
Eastlake Juvenile Hall Sexual Abuse Lawyer
Camp David Gonzales – Juvenile Hall Sexual Abuse Lawyer