soap (fake)
@xn--jw9h.bsky.social
it's not possible dawg
you cannot extract training data from a model. that's simply how they work
February 11, 2025 at 3:48 AM
erm you don't get to engage in a conversation with someone and block them when they're right
and ftr these are all accounts i've had for a long time i just enjoy making accounts
February 11, 2025 at 3:34 AM
the model doesn't have any code in it. the code tells the model what to do, but the code simply cannot make the model produce images because that's not what the model does
February 11, 2025 at 3:24 AM
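a toy sketch of the point being argued here: an image-detection model is just a function from pixels to a score, so running it forward can only ever yield that score. everything below is hypothetical for illustration (the weights are random stand-ins, not a real trained model):

```python
import math
import random

random.seed(0)

# Toy stand-in for a trained detector: fixed weights mapping a
# flattened 8x8 "image" (64 pixels) to one match probability.
# Random weights here are illustration only; a real model's
# weights would come from training.
WEIGHTS = [random.gauss(0, 1) for _ in range(64)]
BIAS = 0.0

def detect(image):
    """Forward pass of a classifier: 64 pixel values in, one scalar out."""
    logit = sum(w * p for w, p in zip(WEIGHTS, image)) + BIAS
    return 1.0 / (1.0 + math.exp(-logit))  # a probability, not an image

score = detect([random.random() for _ in range(64)])
```

the output is a single float in [0, 1]; there is no code path in a model shaped like this that emits pixels back out.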
and the code does exactly nothing to a model that has already been trained
it's not like they're publishing a CSAM database. once the model is trained you simply cannot make it produce images. that's how ML works
February 11, 2025 at 3:23 AM
you're the one that insists human moderators should be watching child porn lol
February 11, 2025 at 3:22 AM
of course you can change the code. but you cannot extract the images from an already trained ML model, nor can you make an image detection model produce images
that's not what stalking is. i would love to have a real conversation about this
February 11, 2025 at 3:22 AM
then provide some kind of high level overview of how you could make the detection model produce images? that's simply not how it works
February 11, 2025 at 3:21 AM
straight cap unfortunately
you blocked me and everyone else that can form a coherent argument against you
again, please walk me through how you can reverse engineer an image detection model
February 11, 2025 at 3:15 AM
xn--jw9h
February 29, 2024 at 1:49 PM
this one is 🧼.bsky.social
February 29, 2024 at 1:47 PM
i'm saving the punycode handle for if bluesky ever supports them
February 29, 2024 at 1:47 PM
(fake)
February 29, 2024 at 1:44 PM