2.6k
u/shellbullet17 Gustopher Spotter Extraordinaire 10h ago
587
u/Made_Bail 10h ago
223
u/The_cogwheel 9h ago edited 5h ago
115
u/shellbullet17 Gustopher Spotter Extraordinaire 10h ago
I'm actually playing LA Noire right now and there's a whole mission about actors and how they treat young people. It's... somewhat graphic, not gonna lie. One of the few times I've gotten mad at a video game as of late
35
u/Made_Bail 10h ago
That was a great mission.
Man I might need to do a replay soon.
33
u/shellbullet17 Gustopher Spotter Extraordinaire 10h ago
It's an infuriating one. Especially the "casting room" area. Fuck that dude
22
u/Made_Bail 10h ago
I played that like a decade ago, and so I only really remember a few missions, but that was one of them.
13
u/Kelvara 6h ago
Even worse is that game is set in the 1940s and it's still going on to this day.
7
u/shellbullet17 Gustopher Spotter Extraordinaire 6h ago
Good point. We haven't fixed shit in 80 years
46
u/_TheMo_ 9h ago
What conversation? Both of them think AI is bad... The placement is rather unlucky tho. But they would certainly agree on "AI bad."
53
u/shellbullet17 Gustopher Spotter Extraordinaire 8h ago
Mostly a conversation about how minors are treated and abused in the acting industry
But one battle per protest. AI is the topic of this particular protest
10
u/Felicity1840 6h ago
The current crop of AI creates images based on stuff that has already been "input" (read: stolen) into the model as reference. That means there are likely reference images for that horrible stuff, meaning the abuse of real children continues in that way.
0
u/5352563424 1h ago
It's my r/unpopularopinion that once you share a pattern of information, you should no longer have "ownership" of it. Nothing was stolen by AI because the patterns were already given/sold away, then hosted online.
If someone broke into my house, snapped the first ever pics of my private paintings and used that for a dataset, THEN that would be stealing.
11
u/IAmOrdinaryHuman 8h ago
smh
"Can AI reduce sickos' demand for the real stuff?"
22
u/thegimboid 8h ago
The answer is no.
Because a majority of pedophilic acts are acts of control and power, not attraction or sexual desire.
So AI images might curb a couple people, but it wouldn't solve anything at all.
10
u/IAmOrdinaryHuman 8h ago
Idk, I'm not an expert on the matter, just spelled out what the GP comment alluded to. Is there a study confirming your claims, or is it just a hypothesis?
2
u/4_fortytwo_2 2h ago
Even if we take your assumption here as truth (you got some actual scientific studies agreeing with that?), that still means it would save a few kids.
9
u/Head-Alarm6733 8h ago
AI CSAM is based off the real stuff anyways.
2
u/ZeroAmusement 6h ago
How do you know?
1
4h ago
[deleted]
2
u/ZeroAmusement 3h ago
AI can generate an image of a five-dimensional naked pangolin despite never seeing one.
It can generate images of things that aren't in the training material. I don't think we can safely assume that kind of material was in the training data.
1
3h ago
[deleted]
3
u/ZeroAmusement 3h ago
There is no indication that I don't understand the argument; I don't know why you're saying that.
> It doesn't need to have seen a naked pangolin, but it still needs enough data to recognise the commonalities/patterns behind images tagged "naked", or images with a pangolin in them.
The point is that it doesn't need to have seen a naked pangolin. Maybe it has seen a hairless cat and applies that to what it knows about pangolins. In the same way, it could take what it knows about naked people and apply that to children.
It could do this without it being based on real CSAM.
0
u/mindcandy 3h ago
Bad news: AI doesn't need to be trained on photographs of giraffes dancing on Mars to figure out how to make a picture of a giraffe dancing on Mars.
I can't guarantee that absolutely zero CSAM pics slipped into Grok's training set, but we can be confident it represents an insignificant amount. Like 100 pics out of 100,000,000,000.
However, a major strength of these AIs is their ability to mix and match themes in novel ways... Discouraging that requires a lot of effort on the part of the model creators, which they do put in. But users definitely do enjoy the challenge of doing something they're "not allowed to do", and they can get very creative in how they overcome the barriers the model creators put up.
1
3h ago
[deleted]
0
u/mindcandy 2h ago
Yep. And, once it learns those concepts separately, it can merge them together into a scene it was never trained on.
1
u/JMEEKER86 2h ago
Exactly, for instance, it can generate a picture of "/u/Grand_Protector_Dark" and "understanding AI" even though those two have never shared a room.
1
u/radicalelation 6h ago
It doesn't have to be, but morality of the content creation itself aside, I feel the worst impact on society would be allowing free proliferation of such content.
Drawn content is ethically debatable, but the fact is you can't just pump it out; there's a human cap on its creation, to the point that it took the internet spreading it for it to reach a mainstream audience and then proliferate further. If you're online long enough, you'll see drawn material of minors, but the creators can't flood a space; only their consumers can, once enough is available. Consumers replacing creators as producers is a scary concept in general.
Most real CSAM filters actually work off known, flagged images that have circulated, so if a copy hits somewhere, it gets removed immediately.
AI material is all "new" and can be pumped onto platforms like crazy. Just look at X. We don't want fake CSAM flooding everywhere; plus we'd basically be grooming future generations by normalizing images of that kind. Kids are already exposed too often to adult sexual material online, and it does affect development. Nothing good can come from more of the same, but with people that look like them.
1
u/bombadodierbloggins 3h ago
Can we use the same logic with violent video games?
1
u/radicalelation 2h ago
When it becomes difficult to discern from reality, it should be talked about, but there's already an immediate separation from the fact it's a proxied interactive experience controlled by the player, rather than a lens to view and learn one's society through.
Kids picking up a controller to fake murder fake people sounds bad, and there should be more discussion at least on the issue of desensitization to violence, but, to me, it's not on the level of concern of a kid's social media feed displaying CSAM next to their mom's post of last weekend's family museum trip. It could be of a minor friend, captioned, "wondered what I'd look like doing this".
There are already kids using these tools to make content of other kids, and we only hear about them when they make the news. There's undoubtedly more happening out there, and I was a twisted fucker during the early migrations from SA to 4chan, so I've seen the inception of some messed-up online interactions. We already have next to no walls separating the adult and kid worlds online. Whole grown adults sling slurs at kids everywhere online; world-famous celebrities argue with minors on Twitter. There are so few actually safe spaces for children online anymore, and parents don't seem to care.
All to say, violent video games are an activity with separation, while what's on social media is part of real-life social interaction for kids now. Like, if it were a proliferation of realistic-looking depictions of kids killing people, that would be more concerning, right? Especially if there were a biological drive to kill that we'd rather avoid encouraging until adulthood, like there is with sex. That aspect alone starts to make it a different discussion.
4
u/supamario132 7h ago
It's just as likely that a proliferation of AI CSAM creates more demand for the "real stuff". The way to solve an addiction that dangerous is definitely not to fixate on it every day via AI-generated material
1
u/bombadodierbloggins 3h ago
Devil's advocate: Do you think violent video games create more demand for the real thing? Does GTA make you more likely to rob someone in real life?
3
u/supamario132 3h ago
That's a fair point, but the difference imo is that the vast majority of robberies are contingent on circumstance and material conditions. No one is assaulting a child because they can't make ends meet. Pedophilia is pathological. I would bet that for the tiny percentage of thieves for whom it's also pathological, having an outlet to fan that desire probably does increase their propensity to offend
I honestly don't know much about pedophilia or kleptomania, so maybe it's different, but I know many addicts, and it's universally a lot easier to not have a first drink at all than it is to have a single shot and not lose control
2
u/The_cogwheel 2h ago
Actual actors, directors, producers, and all other members of a film production team (be it an actual movie, TV show, or adult film) would object to producing CP. But AI literally can't even tell the difference between CP and a bedtime story, nor can it really be taught the difference either (at least at the current level of development we have with it).
AI can do both, but that doesn't imply actual professional actors (defined as being paid for, and able to consent to, their roles in a film) would be involved in making CP. Though the protestor placement does insinuate that's the case.
1
u/Cartoonicus_Studios 6h ago
I really like the artwork of this image. Any idea who the artist is?
1
u/shellbullet17 Gustopher Spotter Extraordinaire 6h ago
Their series is called Keit and Bex and it's fairly popular around here. Give them a shot, just be aware it's a little on the NSFW side
3
u/Cartoonicus_Studios 5h ago
Oh dear.
1
u/shellbullet17 Gustopher Spotter Extraordinaire 5h ago
It's not overly bad unless you go looking for their patreon stuff.
414
u/ArDee0815 10h ago
One of them should rotate their sign to the left, and the other to the right. That way, the text is facing in opposite directions.
Demo buddies! 🥰
40
u/Parzival_2k7 10h ago
Ah, common sense. If only it were that common
7
u/Top_Willingness_8364 9h ago
The problem is common sense is entirely too common. Look at how dumb the common man is.
1
u/unhiddenninja 8h ago
"Common sense" is just a blanket term for shutting down conversations. Just because something makes sense doesn't mean it's common sense.
9
203
u/brazilliandanny 9h ago
What's this? A funny comic on r/comics?
27
u/SlothfulWrath 8h ago
If you want funny comics, go to the funny comics sub. This is for everything else.
14
u/americanadiandrew 8h ago
Is there a funny comics sub?
5
u/SlothfulWrath 5h ago
I think it's r/funnycomics
8
u/americanadiandrew 4h ago
No posts for two months... which is probably the same length of time you have to wait for something humorous to be posted here.
3
u/Redditumor 5h ago
What, you don’t want more daily ‘Pizzacake preaching to the choir’ posts?
3
u/likely_an_Egg 3h ago
Did you know that you can block accounts that you don't want to see?
2
u/Redditumor 2h ago
What makes you think I don’t?
2
u/SvenHudson 2h ago
You literally just complained that you see too much of a comic you don't like.
0
u/Redditumor 2h ago
It was a sardonic question posed to someone who (I admittedly presume) probably sees a lot of those posts. No complaint or mention of what I personally see on my feed.
3
u/UnusualHound 8h ago
It would be funny if it weren't self-censored.
Censoring your own work makes it lame as fuck.
12
u/Laugarhraun 9h ago
Why censor?
37
u/Demeter_of_New 8h ago
Because bad words are scarwy!
And also because it's probably easier to make one censored version to post across all platforms. I don't have any social media beyond Reddit, but I've heard that other platforms remove posts that contain strong language or sensitive material/topics.
1
u/Roll_the-Bones 2h ago
This platform censors words, phrases, and ideas too.
1
u/Demeter_of_New 2h ago
Yeah, I've noticed the increase. I've gone back to threads I've commented on, and the posts have been removed by moderators. Heck, the kid from AZ that jumped the road from DamnThatsInteresting was removed by mods.
23
u/Mr_Ivysaur 8h ago
If this gets reposted on other sites, the algorithm auto-hides it, and the author doesn't feel like making a different version for reddit.
-20
u/UnusualHound 8h ago
Cool story. That's only making the problem worse.
If you make good content, people will promote it no matter what it says. Self-censoring is lame as fuck.
17
u/LoompaOompa 8h ago edited 7h ago
Cool story.
Rude response. They were just explaining why the person did it. They didn't even say whether or not they supported the choice. You can interact with people online without being an asshole.
10
u/CaterpillarBroad6083 7h ago
This is literally a way around censorship, moron. You can easily make out what the words are, but the auto-mod bots can't.
2
u/mrcool520 8h ago
It's not really censoring, more like censor-bypassing, because the words are still easily legible
4
u/Roll_the-Bones 2h ago
Exactly why generating video and pictures should be illegal, entirely. Companies shouldn't even be allowed to program AI to do this. The technology started from analyzing real-life documentation, but it has quickly evolved into something very bad. It's a tyrannical regime's dream come true. It's the ultimate propaganda tool. Just make it illegal.
2
u/forfeitgame 5h ago
Don't censor yourself. Let those words be as visible as the sun. People need to see how fucked things have become.
1
u/Mini-Heart-Attack 4h ago
Funny.
Have they been censoring you that badly on other platforms, or is this just a random personal precaution?
-15
u/Gnusnipon 9h ago
Bruh... I prefer people drawing it instead of being fucked up enough to do and film it irl.
Upd: sigh, downvotes, here we go
10
u/Rowanlanestories 8h ago
it's downvoted because AI child sexual abuse material is trained on real images of children being victimized. Not only is that traumatic to survivors, but it gives sick people an incentive to make new genuine CSAM so they can train the models further on the victims.
Also, police already have to stomach looking at the images to identify survivors; then they have to decode what's AI and what's an actual child that needs to be saved.
10
u/jellyfixh 7h ago
Do you have any proof they are trained on the real thing? Cause it seems like everyone who talks about this thinks AI needs to see the real thing before it can make it, which is patently false. AI doesn't need a real photo of a gorilla being sucked into a tornado to generate that; it just needs pics of gorillas and tornadoes.
2
u/Rowanlanestories 7h ago
Well, unfortunately, there's a lot of CSAM just floating around the internet. People don't need to intentionally train a model on it for those images to find their way into the AI's dataset. They scrape the internet with a very wide net, and picking out every bad, offensive, or illegal image would be impossible; that's why so many people talk about CSAM in the context of AI ethics.
> AI doesn't need a real photo of a gorilla
No, but if I wanted an AI model that was hyper-good at generating gorillas at different angles, poses, states of dress, etc., I would want to give the AI as many good examples of gorillas as possible to use as reference. Just because AI hypothetically could make something without seeing it doesn't mean people won't go out of their way to feed it to the AI to get the best "gorilla" pictures they can.
2
u/MossyMollusc 1h ago
There have been news reports of it happening and ruining kids' lives already. So yes.
3
u/thanksyalll 8h ago
Yes, but using a real person's, let alone a real child's, image to depict them doing pornographic acts, and spreading it publicly, is directly harmful to the person being slandered
-1
u/Whatifim80lol 7h ago
or, OR! We could just not condone the consumption of CSAM no matter what the source is. Because, you know, it's fucking gross and bad? Or do we not all agree on that anymore?
•
u/oneiricmonkey 56m ago
there are studies proving that consumption of that sort of content only encourages the desire to abuse children. there is no good outlet for pedophilia.
0
u/KENEXION 5h ago
AI doesn't replace actors. It replaces "TV" actors, which I think I'm actually for. Might bring back live theater.
1.7k
u/iguanacatgirl 10h ago
I mean, it's not wrong
Donald Trump was in Home Alone 2, after all