The Stop CSAM bill is harmless - extremely unlikely to pass - rather pointless - narrowly focused on removing materials depicting sexual abuse of children - already very illegal - does NOT break encryption
- pointless because NONE OF THE SITES MOST OF US USE WANT THIS MATERIAL on their website anyway and they are already doing what they can to stop it
This is a very narrowly focused bill that is brought to Congress every year and so far has never got anywhere, but every year it leads to people FALSELY claiming it will break encryption or force sites to take down LGBT content. It just requires Facebook to do what it already does.
Most sites use PhotoDNA from Microsoft, which is free. It hashes images through a proprietary method: it turns the image into a long string of letters and numbers called a hash, which will be the same for visually almost identical images.
For encrypted messaging, it makes a hash of every image before the message is encrypted.
If one of the hashes is identical to a hash in the database of hashes of known CSAM [Child Sexual Abuse Material], it saves a low resolution copy of the image.
Then the low resolution image and the hash are sent to a separate organization (NOT the social media company), which checks the low resolution image against the image the hash was made from, to see if it really is identical.
This works because most of the CSAM shared online is shared over and over everywhere that CSAM gets uploaded, and has already been discovered before and identified as CSAM.
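To make the hash-matching idea concrete, here is a very rough sketch in Python. PhotoDNA's actual algorithm is proprietary, so this uses a simple "average hash" as a stand-in, and the database of known hashes, the threshold and the file paths are all made-up placeholders - it just illustrates the general approach of fingerprinting an image and comparing it against a list of known fingerprints, not the real system.

```python
# Illustrative sketch only: PhotoDNA's actual algorithm is proprietary.
# This uses a simple "average hash" to show the general idea: visually
# near-identical images produce near-identical hashes, which can be
# compared against a database of hashes of already-identified CSAM.

from PIL import Image  # pip install Pillow


def average_hash(path, size=8):
    """Shrink the image to an 8x8 grayscale grid and record which pixels
    are brighter than the average: a 64-bit perceptual fingerprint."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits


def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return bin(h1 ^ h2).count("1")


# Hypothetical database of hashes of already-identified images (placeholder).
KNOWN_HASHES = {0x1234567890ABCDEF}


def check_upload(path, threshold=5):
    """Flag an upload if its hash is (nearly) identical to a known hash.
    A real system would then send a low-resolution copy plus the hash to a
    separate reviewing organization, not act on the hash match alone."""
    h = average_hash(path)
    return any(hamming_distance(h, known) <= threshold for known in KNOWN_HASHES)
```

The point of matching on "nearly identical" hashes rather than exact file contents is that the same images get re-shared with small crops, resizes and re-compressions, and a perceptual hash survives those changes where an ordinary file hash would not.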
Bluesky uses a different form of CSAM detection software, Safer from Thorn, which is also very effective and uses a similar hash-based approach: https://safer.io/resources/when-users-flock-harm-follows-what-rapid-migration-to-bluesky-teaches-us-about-content-moderation/
Bluesky is also a member of the Internet Watch Foundation which helps its members to find and detect CSAM https://www.iwf.org.uk/news-media/news/new-partnership-strengthens-bluesky-s-ability-to-tackle-child-sexual-abuse-imagery/
Every large platform has to tackle CSAM already because
- it is already illegal
- none of them want CSAM on their platform
The joint statement from Bluesky and Thorn says:
QUOTE The reality is that child sexual abuse material (CSAM) – and other threats to kids – can thrive anywhere there is an upload button https://safer.io/resources/when-users-flock-harm-follows-what-rapid-migration-to-bluesky-teaches-us-about-content-moderation/
It would depend on the details, but so long as the requirements are proportional to the size of the organization - so that, say, a small forum run by one volunteer doesn't have to do the same things as a big organization like Bluesky with millions of users - it should be fine.
The only thing is that nobody wants this stuff on their platform anyway - it's illegal and harmful to their users, so they are already doing the best they can to remove it.
So my main objection is that it's unnecessary to require it, since it's already illegal and there is free software out there to detect it, which the larger platforms will be using already.
But the bill is harmless, so long as it is worded carefully.
I haven't checked the latest version, but they keep trying to pass the same bill with nearly identical text over and over, and the same misunderstandings come up each time, so I expect it's the same as usual.
The Stop CSAM act is very narrowly focused. It is specifically about taking down Child Sexual Abuse Material, which is already illegal and which all the major social media companies take down already.
It is not about anything "legal but harmful". NONE of these bills involve any backdoors into secure encryption. NONE of them target LGBT material, and none of them can be used by states to target LGBT material.
NONE OF THEM WILL MAKE ANY DIFFERENCE TO FACEBOOK. Facebook ALREADY takes down CSAM very proactively, with community standards that go way beyond what the Stop CSAM act or similar acts like the EARN IT act would require.
QUOTE During the first quarter of 2023, the social network removed 8.9 million pieces of child sexual exploitation content that violated their community standards, and 1.9 million pieces of child nudity and physical abuse content.
https://www.statista.com/statistics/1013776/facebook-child-exploitation-removal-quarter/
EARN IT, Stop CSAM, etc. will make no difference to Facebook. They already take it all down very proactively and they are very successful at it. They do far more than would be required by the Stop CSAM or EARN IT acts.
And the claim that if they stop CSAM they have to block LGBTQ content is absolute NONSENSE. Facebook doesn't ban LGBTQ discussions! And for sites that host adult content, that is not CSAM either.
CSAM is already illegal.
And it doesn't make sense that they would change their community guidelines to ban LGBT content because they are required to take down CSAM.
That would be like changing their community guidelines to ban any adult discussion of sex.
Most CSAM is made for heterosexual viewers, so if they had to ban adult LGBT discussions they would have to ban adult heterosexual discussions too.
But it wouldn't make sense to respond to a requirement to take down CSAM by saying that adults can't talk about sex in an adult way on your platform. So why would they respond by saying that adults can't talk about LGBT sex? It makes no sense.
The usual reason given is a mention of grooming in KOSA, which is a different bill altogether. They claim that states will be able to use the mention of grooming in KOSA to force Facebook, Twitter, Instagram, etc. to ban LGBT conversations by people in their state. That makes NO SENSE.
Because grooming in this context refers specifically to activity conducted as part of a criminal offence.
Grooming has a precise legal definition: activity that persuades or coerces a person to engage in prostitution or other action that is a criminal offence. It has nothing to do with LGBT - most grooming would be heterosexual grooming. Not many people are coerced into gay prostitution.
QUOTE STARTS
In the U.S. child grooming is considered a federal offence pursuant to 18 USCS § 2422. the provision of the section reads as “(a) Whoever knowingly persuades, induces, entices, or coerces any individual to travel in interstate or foreign commerce, or in any Territory or Possession of the United States, to engage in prostitution, or in any sexual activity for which any person can be charged with a criminal offense, or attempts to do so, shall be fined under this title or imprisoned not more than 20 years, or both.
(b) Whoever, using the mail or any facility or means of interstate or foreign commerce, or within the special maritime and territorial jurisdiction of the United States knowingly persuades, induces, entices, or coerces any individual who has not attained the age of 18 years, to engage in prostitution or any sexual activity for which any person can be charged with a criminal offense, or attempts to do so, shall be fined under this title and imprisoned not less than 10 years or for life”.
A site with LGBT content such as sex education or porn would be at no more risk of being implicated in grooming than a site with heterosexual sex education or porn.
The requirement in KOSA is just to make a best effort to mitigate and prevent it, which the big social media giants are already doing. An important aspect of it is that it requires them to give researchers access to the data, which will give a better idea of how well they are doing this.
There are already laws against grooming. This is not introducing a new law against grooming.
The mention of grooming is in this section:
QUOTE STARTS
(b) Prevention of harm to minors.—In acting in the best interests of minors, a covered platform shall take reasonable measures in its design and operation of products and services to prevent and mitigate—
(1) mental health disorders or associated behaviors, including the promotion or exacerbation of self-harm, suicide, eating disorders, and substance use disorders;
(2) patterns of use that indicate or encourage addiction-like behaviors;
(3) physical violence, online bullying, and harassment of a minor;
(4) sexual exploitation, including enticement, grooming, sex trafficking, and sexual abuse of minors and trafficking of online child sexual abuse material;
(5) promotion and marketing of narcotic drugs (as defined in section 102 of the Controlled Substances Act (21 U.S.C. 802)), tobacco products, gambling, or alcohol; and
(6) predatory, unfair, or deceptive marketing practices, or other financial harms.
https://www.congress.gov/bill/117th-congress/senate-bill/3663/text
Stop CSAM itself is highly unlikely to be added to any other bill.
If Stop CSAM does get included in some bill, nothing is going to happen that anyone will notice except geeks who follow the statistics of the amount of CSAM removed by social media companies.
If the bills are effective they will see an uptick in the amount removed on some sites that don't do as good a job as Facebook. That would shrink the market for the people who make these videos, taking away the financial incentive that leads some criminals to exploit young kids to make them.
The Stop CSAM bill I did this debunk for is here:
https://www.lexology.com/library/detail.aspx?g=ed495b35-4f55-4116-b54b-762a8c863c1b
[but not likely to have changed significantly]
It is only about CSAM
This is a summary by its proposers of what it does:
https://www.durbin.senate.gov/imo/media/doc/STOP%20CSAM%20Act%20of%202023.pdf
The bill itself is hard to read - it's a series of instructions for someone to edit the legal code, saying go to this page, change this word in the third sentence, add conditions (e) to (g) to the end, etc.
So you can only figure out what it means by going to the legal code it modifies.
This is quite usual.
Most bills introduce new measures, so they just give the full text of those measures.
The ones modifying section 230 are simple because they just modify one part of it.
But ones that modify an existing complex piece of legislation are always done like this.
You then need to go to legal commentary by experts who are familiar with the legislation, or spend a few hours downloading the text and going through it, making the changes yourself.
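As a rough analogy (and only an analogy - this is not how legal drafting actually works in practice), an amendment bill reads like a patch file: a list of "strike this text, insert that text" instructions that tell you almost nothing until you apply them to the statute they modify. The statute snippet and the edit in the sketch below are invented placeholders.

```python
# Rough analogy only: an amendment bill reads like a patch - "strike X,
# insert Y" instructions. The statute text and the edit below are invented
# placeholders for illustration, not real legal text.

statute = (
    "Any person who, while a minor, was a victim of a violation of "
    "section 2251 of this title may bring a civil action."
)

# One "instruction from the bill": (text to strike, text to insert)
edits = [
    ("a victim of a violation of section 2251 of this title",
     "a victim of a child exploitation violation or of conduct "
     "relating to child exploitation"),
]

for strike, insert in edits:
    statute = statute.replace(strike, insert)

# Only after applying the edits to the original can you read what the
# amended law actually says - which is why you need the statute in front
# of you (or an expert's commentary) to understand the bill.
print(statute)
```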
Going by the description, it is a simple, straightforward bill.
Now on to the legal text.
On encryption, this is what the legal text says. I will summarize what it means after the quote.
QUOTE STARTS
``(g) Encryption Technologies.--
``(1) In general.--Notwithstanding subsection (a), none of the following actions or circumstances shall serve as an independent basis for liability of a provider of an interactive computer service for conduct relating to child exploitation:
``(A) The provider utilizes full end-to-end encrypted messaging services, device encryption, or other encryption services.
``(B) The provider does not possess the information necessary to decrypt a communication.
``(C) The provider fails to take an action that would otherwise undermine the ability of the provider to offer full end-to-end encrypted messaging services, device encryption, or other encryption services.
``(2) Consideration of evidence.--Nothing in paragraph (1) shall be construed to prohibit a court from considering evidence of actions or circumstances described in that paragraph if the evidence is otherwise admissible.
So it just says that providing encryption is not, by itself, a basis for liability.
It also says that a court may still take the provision of encryption into account when it is combined with other evidence.
You'd expect that.
If, for example, they specifically provided encryption services to a known ring of people sharing CSAM, you would need to take account of that.
You can't say in a bill like this that they would never in any circumstances be liable for providing encrypted services to criminals.
As for the rest of it, we need to look a bit closer. It is an amendment of Section 2255 of title 18, United States Code: you go to the end of that section and all the new text gets added there.
It's an amendment of this
https://www.law.cornell.edu/uscode/text/18/2255
It would add new subsections (d) to (h) at the end.
The one the EFF is worried about is subsection (e).
[EFF are THOROUGHLY UNRELIABLE]
QUOTE
`(e) Relation to Section 230 of the Communications Act of 1934.--Nothing in section 230 of the Communications Act of 1934 (47 U.S.C. 230) shall be construed to impair or limit any claim brought under this section for conduct relating to child exploitation.
The EFF claims this modifies section 230, which is part of another law.
But go to section 230 and you read:
QUOTE (1) No effect on criminal law
QUOTE Nothing in this section shall be construed to impair the enforcement of section 223 or 231 of this title, chapter 71 (relating to obscenity) or 110 (relating to sexual exploitation of children) of title 18, or any other Federal criminal statute.
So section 230 already has an exception for CSAM specifically (in its mention of chapter 110 of title 18).
The Stop CSAM bill is just reminding the reader of this.
Stop CSAM is a modification of chapter 110 of title 18, so it is automatically excepted from section 230 without them saying anything. They just state the exception as a courtesy to the reader, so you don't have to go and check section 230 for yourself.
Now, on to what Stop CSAM actually does to Section 2255 of title 18.
First of all it modifies it to be less specific.
Instead of:
QUOTE Any person who, while a minor, was a victim of a violation of section 1589, 1590, 1591, 2241(c), 2242, 2243, 2251, 2251A, 2252, 2252A, 2260, 2421, 2422, or 2423 of this title and who suffers personal injury as a result of such violation, regardless of whether the injury occurred while such person was a minor, may sue in any appropriate United States District Court and shall recover the actual damages such person sustains or liquidated damages in the amount of $150,000, and the cost of the action, including reasonable attorney’s fees and other litigation costs reasonably incurred. The court may also award punitive damages and such other preliminary and equitable relief as the court determines to be appropriate.
It becomes
QUOTE Any person who, while a minor, was a victim of a child exploitation violation or conduct relating to child exploitation and who suffers personal injury as a result of such violation or conduct, regardless of whether the injury occurred while such person was a minor, may bring a civil action in the appropriate United States District Court and shall recover the actual damages such person sustains or liquidated damages in the amount of $150,000, and the cost of the action, including reasonable attorney’s fees and other litigation costs reasonably incurred. The court may also award punitive damages and such other preliminary and equitable relief as the court determines to be appropriate.
Then it goes on to define the term "conduct relating to child exploitation".
For social media or other providers of interactive computer services, "conduct relating to child exploitation" includes:
QUOTE the intentional, knowing, or reckless hosting or storing of child pornography or making child pornography available to any person;
So it is saying that section 230 can’t be used to impair or limit any claim for
QUOTE the intentional, knowing, or reckless hosting or storing of child pornography
I.e. that a site can be sued for intentional, knowing, or reckless hosting of CSAM.
The EFF doesn't mention that it is only reckless hosting of CSAM that is covered, not just any hosting of CSAM.
Here reckless is NOT the same as negligent.
It has a precise legal definition.
QUOTE Behavior that is so careless that it is considered an extreme departure from the care a reasonable person would exercise in similar circumstances. As a mens rea (mental state) in the criminal law context, reckless action is distinguished from negligent action in that the actor consciously disregards a substantial and unjustified risk, as opposed to merely being unreasonable. For example, in State v. Olson, a 1990 South Dakota Supreme Court decision, the court did not find a tractor driver who turned left at 5-15 mph and hit another car reckless because the prosecution could not prove that he was aware that there was an oncoming car.
Extreme departure from what a reasonable person would do.
It's different from negligence.
So
Stop CSAM has nothing to do with adult or explicit material of any kind.
It is specifically about CSAM.
It has to affect minors.
In this case the exploitation is taking the photos - and facilitating that by knowingly or recklessly permitting the people who take or spread such photos to host them on your platform, and so encouraging people to force kids to do the things they photograph.
So it is not just hosting CSAM, but recklessly hosting CSAM in a way that is an extreme departure from what a reasonable person running a social media website would do.
It's not anything to worry about at all!
See also
BLOG: KOSA is about genuine issues kids face on social media
— similar to those who want to remove fake doomsday from their feed and can’t
— nothing to do with LGBT
— only genuinely harmful content in feeds
— not going to lead to content being taken down
You can read it here: https://debunkingdoomsday.quora.com/KOSA-is-about-genuine-issues-kids-face-on-social-media-similar-to-those-who-want-to-remove-fake-doomsday-from-their-fe
BLOG: The KOSA bill is designed to protect kids from serious online harm
— updates in new KOSA bill deal with concerns it could be misused
— and it is now fully supported by LGBTQ groups
You can read it here: https://doomsdaydebunked.miraheze.org/wiki/The_KOSA_bill_is_designed_to_protect_kids_from_serious_online_harm_-_updates_in_new_KOSA_bill_deal_with_concerns_it_could_be_misused_-_and_it_is_now_fully_supported_by_LGBTQ_groups
BLOG: EARN IT bill nothing to be scared of
— won’t introduce an encryption backdoor and won’t require platforms to ban LGBTQ or NSFW materials
— narrowly focused on child sexual abuse materials
— already illegal and against all platforms' terms of use
You can read it here: https://debunkingdoomsday.quora.com/EARN-IT-bill-nothing-to-be-scared-of-won-t-introduce-an-encryption-backdoor-and-won-t-require-platforms-to-ban-LGBTQ-o

