Mitchel
@mitmittens.bsky.social
(He/They) | Lead/Sr. Designer | Occasional DM | History, Sociology, Politics buff | Citation Needed | Good things happen when good people do something
bsky.app/profile/lias...

Your words; mine said 'no, you're just not reading far enough' and then argued it's 'normal' legal language. I see skepticism toward legal wording all the time, LLM or not... You're misattributing this to ML, not the other way around 🫶
July 15, 2025 at 9:38 AM
All I'm saying is we know companies make conscious decisions to keep face and this massive platform paid an expert to keep the door open, skepticism is warranted ('AI' or not)

We all know that's what lawyers do, that's why it's silly to just knee-jerk to 'people dislike ML'
July 15, 2025 at 9:27 AM
I suppose this is our differences then, I keep up with it, I don't 'stop reading' and assume it'll just get fixed

Gotta watch the canaries, but certainly shouldn't be saying other people are tuning out after "Machine learning" if you just tune to 'assuming' afterwards 🤐
July 15, 2025 at 9:26 AM
That's the point bud, they pay a (not dumb) lawyer to keep doors open whilst knowing the community is quite adamant on clarity, implying they at least know... History taught us what that implies.

You literally admit the same skepticism? This isn't new, so why are you saying "LLM and stop reading"?
July 15, 2025 at 9:24 AM
It was because of the grippers cam right?

(The missing thumbnails and the video's critiqued associated communities speak volumes for real though)
July 15, 2025 at 9:21 AM
Read further, you'll see it I promise!

*Including* implies there could be some feature creep, companies know LLM is unpopular in services creatives use, why would they consciously use that very specific language?

They're not dummies, neither are we :)
July 15, 2025 at 9:15 AM
Not going to lie, both level design and shader work made me much more detail-oriented and, in turn, appreciative of little nicks and dents, flakes of paint, cracks and the stories they tell, as well as more starry-eyed about subsurface scattering and 'proper' lighting!
July 15, 2025 at 9:12 AM
Re: video, you could just say 'I don't understand economies of scale' and save me the time, fucking Anglos 🫠
July 8, 2025 at 7:28 AM
Results don't lie, bud. I live in a country that has none of the US's issues except the ones we took from your neolib shills; our education is better and more widespread, higher per capita GDP, higher median income, higher median happiness...

Look outside your narrow Northern American micro-world, lmao
July 8, 2025 at 7:27 AM
Ye sure, right after the 15 mil people he cut off from healthcare and the billions he funneled from the middle class to the rich
July 8, 2025 at 7:25 AM
'i bet u voted for 1 of 2 even semi-viable candidates'

Please learn your civics, you'll look less moronic online. Embarrassing that a Euro (from an admittedly better-educated country) knows your shit better; why advertise that on social media??
July 8, 2025 at 7:23 AM
Literally everything good about my country's (and Mamdani's tbh) policies is Socialist, the US has been a dumpster fire for almost a century, incidentally overlapping with the collapse of Labour unions, etc (Socialists...)

Hope you guys manage to catch up from the propaganda y'all suffered
July 8, 2025 at 7:19 AM
Dems have been very loud about who they'll support while pretending to frown (read: fascists) and who they outright dismiss (Progressives) ever since the 'Blue no matter who' crowd though
July 8, 2025 at 7:15 AM
So usually pushed by C-suite mistakenly thinking we're scared of being outperformed by AI, rather than worried a shitty LLM app replaces or restricts a competent specialist 🤧

It's sticking because, unlike web3, loads of 'normies' find daily use in it as it's easier than being ace at search engines
May 24, 2025 at 6:32 AM
<10% using it usually uses it for something they're unskilled at, that LLMs are actually OK for (e.g. making text more concise/translations/reformatting)

In my experience this usually results in Dunning-Kruger valleys with C-suite folks, who think they can value a tool as well as a specialist can
May 24, 2025 at 6:27 AM
Ironically, I've jaded a little and can fathom how that norm was established and held, after the past decades!

That said, digitalisation has made it a lot harder to monopolise the outgoing narrative, despite all the downsides that come with it... Now we just need widespread media literacy 🤧
May 23, 2025 at 7:15 AM
My frequent examples are women's personal bank accounts ('74 in US), last freed colonies ('90s) and the end of SA Apartheid ('94)

Honestly much of the horrific behaviour of the early 20th century nations is recent history, history has a "winner's" bias after all, and we've a short memory
May 23, 2025 at 7:06 AM
Non-exhaustive list but:
Co-ownership
Royalties
Maybe profit sharing if we don't get laid off bi-yearly
Competitive salary (To tech, not exclusively games)
Sabbaticals

At least listen to devs and stop ridiculous scoped burn-out spirals, but most of these are pie in the sky in current system
May 13, 2025 at 7:18 AM
I feel my leanings are an ever-changing combination of a series of prefixes (mostly a long string of "post-") and a serotonin-dependent slot machine filled with obscure Leftist streams, but "then different"

Also anarcho-Posadism then??
May 1, 2025 at 1:35 AM
This is particularly funny when we're supposed to take AI technuts seriously posting a 'game' with every room filled to the brim with trolley carts in random 90 degree rotations, at least Ken actually adds something interesting to LinkedIn
April 23, 2025 at 8:37 AM
youtu.be/5sFBySzNIX0?...

Simon Clark also made a good video about this.

I often spend more work on the output myself, but templating/formatting, mass search, and even bad output give insight into AI and how people are going to use it.

Unfortunately with even more 'confirmation bias' risk and the like than before.
Should I feel guilty using AI? (YouTube video by Simon Clark, youtu.be)
April 23, 2025 at 8:11 AM
Two guys chilling in a corpo AI group pic, 5ft apart cause they're *not* gay
April 21, 2025 at 11:10 PM