cross-posted from: https://jamie.moe/post/113630
There have been users spamming CSAM content in [email protected] causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.
I deleted every image from the past 24 hours personally, using the following command:
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred {} \;
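One caveat worth adding (a detail of GNU shred, not something from the original post): `shred` without `-u` only overwrites the file's contents in place; the file entry itself stays on disk, and pictrs may keep serving the now-garbage file. A variant that also unlinks, assuming the same path layout as above:

```shell
# Overwrite and then unlink (-u) every file created in the last 24 hours.
# The path is the example from the post above; adjust to your own pictrs volume.
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred -u {} \;
```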
Note: Your local jurisdiction may impose a duty to report or other obligations. Check those, but always prioritize ensuring that the content does not continue to be served.
Update
Apparently the Lemmy Shitpost community is shut down as of now.
Someone is trying really hard to hurt Lemmy by continually attacking the most popular instance. Is this all coming from right-wingers upset that their nazi instances were defederated across basically the whole fediverse?
The simplest explanation is 4chan types just doing it for the lulz.
Could be, I’m surprised /g/ didn’t create an instance
Their knowledge stops at creating sway screenshots.
I’m sure someone’s already created a logo. But, that’s as far as they ever get.
My tin foil hat is telling me it’s one of the other social media companies funding a hacking group to do it. They stand to have the most to lose, and they’ve seemingly decided to enjoy changing the narrative regarding multiple topics. Lemmy stands directly against what the bigger social medias stand for.
I have no evidence to back this though. As a business owner I just know that things become very consistent when people are being paid, and very inconsistent when they aren’t. These attacks are seemingly very consistent/organized.
You think a company that is poised to go public is going to attack a competitor with a minuscule amount of traffic, using extremely illegal material that could put them in prison for even having?
See, I don’t believe this was done by a large corp. But all the DDoSing that’s happened? I can see u/spez orchestrating that.
Lemmy isn’t a threat to Reddit. It’s the same old trolls doing it like every other time.
I don’t think they do see it as a threat, I just think spez is petty enough and juvenile enough to do it.
Like, again, I pretty solely think it’s spez’s own personal ego shit. For example, he could have just shut down the API. Instead, he had a weeks-long meltdown including committing libel against a developer. Someone like Zuckerberg doing this doesn’t make any sense to me, but I can totally see spez being exactly that kind of petty.
Reddit? No. I was thinking moreso Meta. They have the deeper pockets and a proven track record of breaking privacy laws to their own benefit.
That’s even worse. Meta probably doesn’t even know what Lemmy is.
So then why was Meta trying to get Threads to be on the Fediverse? Of course they’re aware of any potential threats, no matter how small.
Why reinvent the wheel if someone’s just going to hand you the backend? Lemmy is no threat to them.
The threat is a new sustainable community that’s sheltered from advertising that people could leave Factbook/Instagram/whatever and go to.
Meta was talking about adding Mastodon federation to their Threads app. So I very much doubt it.
They’d probably take an Embrace, Extend, Extinguish approach.
You would pay a third party to do it. And keep details extremely vague so you have plausible deniability.
Just No, it’s nonsense.
You have a massively inflated view of Lemmy’s importance in the social media market.
There must be room under that tinfoil hat for the both of us, because this was my first thought too.
The longer it continues, the more likely that scenario is IMO. Bitter alt-right extremists would probably start losing interest after a short while, whereas social media competitors would stand to gain from long-term interference.
Come on in! There’s cookies.
I’d go with state actors first.
When a particular social media platform is centralized, you can buy yourself, say, a percentage of stock and have sway over it (cough Tencent), or have a useful idiot ruin the platform (cough Musk), or another useful idiot run propaganda you like anyway (cough Truth Social, cough Fox News, cough Newsmax…), or yet another that will sell out its host country’s citizens for cold hard cash (cough Facebook).
But when that social media platform is decentralized? Well, then you’d need to figure out how to poison the well early on to stave off adoption. The Saudi Arabias, UAEs, Chinas definitely don’t like the idea of lemmy, and it’ll be way harder for them to control if critical mass is hit.
Yep, that’s a great point.
Add to that the fact that mainstream social media companies wouldn’t touch DDoS and CSAM attacks with a 100-foot pole, even if they contracted with a third party. Both of these attacks are highly illegal and would surely ruin a publicly traded company (or one that’s trying to go public, like Reddit).
And don’t forget Russia in your list of state actors who are threatened by the unrestricted flow of information. They definitely don’t want their citizenry to be informed of how disastrously their invasion of Ukraine is going, or what a murderous scumbag Putin is.
You don’t get a lot of upvotes, and sure, we don’t know, but it isn’t like the NSA hasn’t infiltrated (in person) left-wing groups before.
It’s definitely a possibility that someone doesn’t like decentralised content enough to put some meager efforts against it.
This makes the most sense to me. It’s a pretty vitriolic attack, so I don’t think it’s simply a troll, but at the same time I don’t believe it’s any corporate social media.
deleted by creator
Considering all the alt-right garbage that was popping up there the last couple of days this seems at least plausible. I sometimes envy their ability to utterly destroy anything they touch.
I’m sure you’d love to link to some examples
See people claim this constantly with no proof
You want me to link posts that the mods removed? That seems like an unrealistic expectation. You could always check the post pinned to the top of lemmyshitpost where they describe the recent problems, but I suspect you didn’t ask for proof in good faith
Ah that’s actually my bad, I thought you were replying to a different comment in reference to hexbear
So, from memory there has been:
- This recent attack
- Regular DDOS attacks
- Frequent attempts to spam community creation
- That one time the instance got hacked and set to redirect to shock sites
Am I missing anything?
This seems like more than just a few trolls. Maybe someone really doesn’t want to see user-owned social media take off.
I see where you’re going with this, but no, people really are just absolutely horrible. The fact is that other social media platforms are already very well set up to manage this, so we never see it. Lemmy wants to be open, and this is the flipside of that openness.
It’s generally easy to crap on what’s ‘bad’ about big players, while underestimating or undervaluing what they are doing right for product market fit.
A company like Meta puts hundreds of people in foreign nations through PTSD-inducing hell in order to moderate their own networks and keep them clean.
While I hope that’s not the solution that a community driven effort ends up with, it shows the breadth of the problems that can crop up with the product as it grows.
I think the community will overcome these issues and grow beyond it, but jerks trying to ruin things for everyone will always exist, and will always need to be protected against.
To say nothing of the far worse sorts behind the production and more typical distribution of such material, whom Lemmy will also likely need to deal with more and more as the platform grows.
It’s going to take time, and I wouldn’t be surprised if the only way a federated social network can eventually exist is within onion routing or something. At a certain point, the difference in resources available to defend against content litigation between a Meta and someone hosting a Lemmy server is impossible to equalize, and the privacy of hosts may need to be front and center.
The solution in this case is absolutely AI filters. Unfortunately, you won’t find many people willing to build a robust model for that, because they’d be the ones getting the PTSD you mention.
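For context, the standard industry approach isn’t a classifier that humans must train on the material, but matching uploads against hash lists of known-bad files supplied by organizations like NCMEC or the IWF. A minimal sketch of the idea (exact SHA-256 matching only; real deployments use perceptual hashes such as PhotoDNA or PDQ, which also catch re-encoded copies; the digest below is just the empty-file hash as a stand-in):

```shell
# Hypothetical blocklist of SHA-256 digests of known-bad files.
# This example digest is the hash of an empty file, used purely as a stand-in.
blocked="e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"

# Return success (0) if the uploaded file's digest is on the blocklist.
is_blocked() {
    local digest
    digest=$(sha256sum "$1" | awk '{print $1}')
    [ "$digest" = "$blocked" ]
}
```

An upload hook could then reject the file before pictrs ever stores it, so no human has to see the content at all.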
IIRC, PTSD is something only certain personality types develop. We should probably focus on finding people who genuinely have no problem watching rough content. I have PTSD, so I’m probably not the right person for the job.
I don’t want to try. I have a pretty low threshold. I set up the NSFW filter on Lemmy because I found the furry content that was common some time ago disturbing… I don’t even want to try anything worse than that.
Can absolutely relate. Just seeing NSFW content when you're not anticipating it is very weird.
It is very reminiscent of the trolls in the earlier web.
Why do these deranged fucks do this
You really think the trolls would pass up on this golden opportunity?
This isn’t trolling, this is just disgusting crime.
The crime happened in the past when the children were abused. This is some weird amalgam of criminal trolling.
Edit: yeah yeah I get that csam is criminal, that’s why I called it an amalgam. It’s both trolling and criminal.
spreading child pornography is in fact illegal in most of the world
It’s still a crime. Taking the pictures is a crime. Sharing the pictures is also a crime.
Depending on jurisdiction, I am not a lawyer, etc etc, but I’d imagine with fairly high degree of probability that re-distribution of CSAM is also a crime.
Some sick fuck out there kept the files
The crime happened in the past when the children were abused.
That’s true. You could look at it that way and stop right there and remain absolutely correct. Or, you could also look at it from the eventual viewpoint of that victim as a human being: as long as that picture exists, they are being victimized by every new use of it, even if the act itself was done decades ago.
Not trying to pile on, but anyone who has suffered that kind of violation as a child suffers for life to some extent. There are many who kill themselves, and even more that cannot escape addiction because the addiction is the only safe mental haven they have where life itself is bearable. Even more have PTSD and other mental difficulties that are beyond understanding for those who have not had their childhood development shattered by that, or worse, had that kind of abuse be a regular occurrence for them growing up.
So to me, adding a visual record of that original violating act to the public domain that anyone can find and use for sick pleasure is an extension of the original violation and not very different from it, IMO.
The visual records are kind of a sick gift that never stop giving, and worse still if the victim knows the pics or videos are out there somewhere.
I am well aware not everyone sees it this way, but an extra bit of understanding for the victims would not go amiss. Imagine being an adult and browsing the web, thinking it’s all in the past and maybe you’re safe now, and stumbling across a picture of yourself being raped at the age of five, or whatever, or worse still, having friends or family or spouse or children stumble across it.
So speaking only for myself, I think CSAM is a moral crime whenever it is accessed, one of the most hellish that can be committed against another human being, regardless of the specificities of the law.
I don’t have a problem with much else that people share, but goddamn I do have a problem with that.
Ok retard
Archive.org gets crapped up this way, too. Repulsive people posting repulsive stuff, “troll” is too kind a word.
deleted by creator
big F in chat for those of you dealing with this. my #1 fear about setting up an instance.
It impacts everyone when this shit happens. It takes time for mods/admins to take down. And you can’t unsee it.
I hope nobody else has the misfortune of stumbling on that shit
There have been studies which found that playing Tetris for an hour or two after seeing something traumatic can prevent it from taking root in our long-term memory.
I tried it once after accidentally clicking a link on reddit that turned out to be gore, I can’t remember exactly what it was now (about 9 months later) so it must have worked
This advice is a few hours too late for me. Hope it helps others
Don’t worry, life will hold many more traumas
Cue Homer Simpson: This is the worst thing you’ve seen in your life so far
I just posted an article explaining the study to the ‘You Should Know’ community, so hopefully some of the people who need to see it do so
That’s pretty genius. One way to work with trauma is moving the eyes from side to side, following a moving light. It basically forces the brain to work through something that used to stall it. So Tetris, since it is highly reactive and logical and also needs spatial thinking, should very much force the brain to work instead of stalling.
Can you please link a study?
Yeah you really can’t. I’m pretty desensitized from earlier internet with death and other shock gore content but had managed to avoid CSAM until today. It was a lot worse than I expected, felt my heart drop. Worse, my app autoplays gifs in thumbnail so it kept going while I was reporting it.
I’ve mostly forgotten and it wasn’t on my mind until I saw this thread (happened less than 24hr ago) but even the slightest reminder is oddly upsetting. Wish I’d thought of the Tetris thing.
Likely scum moves from reddit patriots to destroy or weaken the fediverse.
I remember when Murdoch hired that Israeli tech company in Haifa to find weaknesses in TV smart cards, then leaked them to destroy a competitor’s market by flooding it with counterfeit smart cards.
They are getting desperate along with those DDOS attacks.
Could be, but more likely it’s just a consequence of self-hosted services: you have individuals exposing their own small servers to the wilderness of the internet.
These trolls also constantly try to post their crap to mainstream social media, but they have a harder time there. My guess is that they noticed Lemmy is gaining big traction and has very poor media content control. Easy target.
Moderating media content is a difficult task and for sure centralized social media have better filters and actual humans in place to review content. Sadly, only big tech companies can pay for such infrastructure to moderate media content.
I don’t see an easy way for federated servers to cope with this.
Yeah exactly. This is the main reason I decided not to attempt to self host a Lemmy instance. No way am I going to let anyone outside of my control have the ability to place a file of their choosing on my hardware. Big nope for me.
Yup. Nope.
Pictrs is just completely disabled now. Rather be safe than sorry.
Is this why I couldn’t upload a meme to the Lemmy World servers earlier today?
Fuck…
Yeah… Just wow. I disabled pictrs and deleted all its images, which also means all my community images/uploaded images are gone, and it’s more of a hassle to see other people’s images, but in the end I think it’s worth it.
Through caching every image pictrs was also taking up a massive amount of space on my Pi, which I also use for Nextcloud. So that’s another plus!
Note, apparently, lemmy will get pretty pissy if pictrs isn’t working… and the “primary” lemmy GUI will straight-up stop working.
Although, https://old.lemmyonline.com/ will still work.
And- I am with you. My pictrs storage, has ended up taking up quite a bit of room.
There has to be a more elegant way of dealing with this in the future, like decoupling Lemmy account hosting (which effectively means ActivityPub/fediverse account hosting) from Lemmy community hosting.
Is disabling Pictrs as simple as stopping the Docker container?
Yup.
I went a step further, and commented out the pictrs related configuration from the lemmy.hjson too.
Does that disable image saving and processing for one’s instance?
Yup.
So far, mostly everything appears to work still. But, trying to upload an image, just throws an error.
SyntaxError: Unexpected token ‘R’, “Request er”… is not valid JSON
I don’t see a way to actually “gracefully” disable it, but, this works.
Edit- don’t just stop pictrs.
Lemmy gets very pissy… and breaks.
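Putting the steps from this thread together, fully removing pictrs rather than just stopping it might look like this (service name, paths, and the exact hjson keys are assumptions; match them to your own deployment):

```shell
# Sketch: remove the pictrs container instead of merely stopping it,
# since Lemmy reportedly breaks if a configured pictrs is unreachable.
# Service name "pictrs" is assumed; check your docker-compose.yml.
docker compose stop pictrs
docker compose rm -f pictrs

# Then comment out the pictrs section in lemmy.hjson, e.g. (keys assumed):
#   # pictrs: {
#   #   url: "http://pictrs:8080/"
#   #   api_key: "..."
#   # }

# Restart Lemmy so it picks up the config change.
docker compose up -d lemmy
```

With the config removed as well, image uploads fail with an error instead of Lemmy hanging on a dead container.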
How desperate to destroy Lemmy must you be to spam CSAM on communities and potentially get innocent people into trouble?
Maybe you’re a dev on the Reddit team and own a lot of shares for what you know is about to go public?
What’s CSAM?
Child sexual abuse material - underage porn. For obvious reasons, you don’t want this to be something you’re hosting automatically out of your basement server.
That’s what I thought. Back in my days it was called CP.
I’ve been listening to the audiobook for American Prometheus: The Triumph and Tragedy of J. Robert Oppenheimer, and the number of times they say “CP” as an abbreviation for “Communist Party” is too damn high.
Also last time I went to the amusement park Cedar Point they’ve got “CP” as an abbreviation on all sorts of stuff.
Made me chuckle, but I do think it’s perhaps time to move to the abbreviation CSAM since it’s less likely to get used for other purposes.
Youtube literally flat banned any channel with CP in it. Regardless of what CP actually stood for, like say combat power in pokemon go… Lots of bigger YouTubers got hit with account closures for like a week before it got reversed.
https://www.techspot.com/news/78814-youtube-bans-several-pokmon-go-channels-over-mistaken.html
Csam is an objectively better name.
‘Porn’ implies consent.
In what world would anyone think that CP implies consent? I mean, the word ‘child’ is right there. Do you think the term ‘child soldiers’ implies consent? I don’t have anything against the term CSAM, but if it was created because of doubts around consent, it was a silly reason to create it.
The term originates from professionals - psychiatrists etc - who work in that field, because they knew even decades ago that “porn” is the wrong word for this kind of material.
I think it’s more likely some people working in those fields wanted to improve their career by popularizing a new term.
I think it has less to do with the existence of non-consensual porn as with the possibility and, indeed, existence of vast amounts of consensual porn. Consent is very much possible in adult porn, it isn’t with CSAM. It’s also possible with soldiers, though of course conscription exists and ask a random Ukrainian they’d rather not have to be a soldier for their loved ones to be protected.
There’s a lot of porn that wasn’t made consensually either. I don’t care what we refer to csam as but I think it’s important to acknowledge that.
Child sexual abuse material
Just google it.
Edit: Not Safe For Life
I kind of suspected it’s better not to google it at work.
deleted by creator
i’d love for a good tech journalist to look into how and why this is happening and do a full write-up on it. come on ars, verge, vice
Self hoster here, I’m nuking all of pictrs. People are sick. Luckily I did not see anything; however, I was subscribed to the community.
- Did a shred on my entire pictrs volume (all images ever):
sudo find /srv/lemmy/example.com/volumes/pictrs -type f -exec shred {} \;
- Removed the pictrs config in lemmy.hjson
- Removed the pictrs container from docker compose
Anything else I should to protect my instance, besides shutting down completely?
I went ahead and just deleted my entire pictrs cache and will definitely disable caching other servers images when it becomes available.
Anyone know if this work is tracked anywhere? I’m suddenly really suspicious of continuing to run my own instance.
https://github.com/LemmyNet/lemmy/pull/3897
It does say “thumbnails” but as far as I know, Lemmy (or pictrs) makes a copy of the full image too. I don’t know if this PR includes full images.
To be clear, if no one on a given instance sub to that particular /c, the content won’t federate to said instance, correct?
At this point, the community is clean. So unless more is posted, then you should be good. If someone searched for the community and caused a preview to load while the content was active though, then it could be an issue.
Cool. Thanks. I cleaned up anything from the past 2 days, to be safe, and blocked that community.
blocked lemmyshitpost some time ago because it is trash anyway
I checked and there shouldn’t be any images stored on the server when running Lemmy 0.18.4. The post was made in high emotional distress and shouldn’t be taken at face value. If the posts are bothering you, I advise purging the posts in question. (I have already done that)
I’m on 0.18.4; once I deleted the most recent images, the former CSAM posts (among others) became broken images. So yes, it was pulling from local disk cache. Then I took care of the posts themselves after the content was invalidated.
How did you check this? From my understanding, images from external servers are copied (and transcoded) over locally. At least in my server (running 0.18.4), they do.
There is a possibility that my instance is buggy and it isn’t caching images even though it should.
It’s pretty inconsistent from my experience. Sometimes images do cache and sometimes they don’t.
edit:
Here’s an example from my instance:
https://ani.social/post/284147 - JPEG image that isn’t copied/cached by my server.
https://ani.social/post/285861 - WEBP image copied/cached by my server.
Let me try to figure this out. The first is a photo uploaded to lemmy.world, the second is a photo originally uploaded to lemmy.nz, both posts are in a federated version of lemmy.world’s shitpost community.
This is just a theory, but perhaps images hosted on the same server as the federated community will directly link, whereas images uploaded somewhere other than the federated community will be copied into cache, presumably in case the original host shuts down unexpectedly? See if this is the case?
images hosted on the same server as the federated community will directly link
https://ani.social/post/288601 - This image is uploaded from a user on the same instance as the federated community (lemmy.world) but the image is cached.
images uploaded somewhere other than the federated community will be copied into cache
https://ani.social/post/285354 - This image is uploaded from a user on a different instance (lemm.ee) from the federated community (lemmy.world) but the image is not cached.
The behaviour is pretty weird. Hopefully we can disable image caching/copying-over-locally so we don’t have to deal with problematic images hosted by other instances.
It depends on how the image is posted; the thumbnails might get federated. If the image is used in a post/comment body, usually the thumbnails are not federated.
You can refer to this post. The full image is copied to my instance (and transcoded). Not just the thumbnail.
I shut down the pictrs docker container on my instance, so all I host is the other containers and the database. All the images that I see on my instance are external links. I can check by just looking at the rendered HTML.
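Checking the rendered HTML can be scripted. A rough sketch (the URL pattern assumes pictrs’s usual `/pictrs/image/` path; the post URL is a placeholder): extract the image URLs a post page references and see whether they point at your own host or at remote instances.

```shell
# List the pictrs image URLs found in HTML read from stdin, deduplicated.
# URLs on your own domain were cached/copied locally; others are external links.
list_pictrs_urls() {
    grep -oE 'https?://[^"]+/pictrs/image/[^"]+' | sort -u
}

# Example usage (post URL is a placeholder):
#   curl -s https://example.com/post/12345 | list_pictrs_urls
```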
I was looking into self hosting. What can I do to avoid dealing with this? Can I not cache images? Would I get in legal trouble for being federated with an instance being spammed?
Someone else here mentioned not running “pictrs” at all
Refer to this comment chain from this same thread https://poptalk.scrubbles.tech/comment/577589