Rob Horning
robhorning.bsky.social
robhorning.substack.com
this seems right, and always makes me wonder what the "suckers" get out of being on "sucker lists," why they enjoy being on them. Is it just some combination of masochism and getting paid attention to? Confirmation bias as a service? maxread.substack.com/p/prediction...
November 7, 2025 at 7:10 PM
regardless of what else they convey in a specific image, image generators offer images that tell those people who want to hear it that "effort/craft is a sham and a waste of time"
November 4, 2025 at 7:20 PM
seems like one of the main use cases for genAI is to eliminate the joy that other people take in thinking and actually doing things www.thecut.com/article/woul...
November 4, 2025 at 6:02 PM
Reposted by Rob Horning
Sora 2 is not being “misused” when people use it to produce racist, transphobic, misogynistic, & otherwise vile images—it exists to make it easier to put these images out into the world. Calling it a “misuse” is a grave misunderstanding of what these companies are up to, & lets OpenAI off the hook.
November 4, 2025 at 3:22 PM
Reposted by Rob Horning
However, the problem with these images is not just their genericness. It's the deeply populist idea that politics can be reduced to its immediately visible effects: politics is not judged by how it affects people's concrete daily lives, but rather by its aesthetics—by what image it produces
4/
October 23, 2025 at 11:37 AM
hope that is the actual cover
October 22, 2025 at 7:01 PM
analysis of political AI slop as "generated déjà vu" slavoj.substack.com/p/welcome-to...
October 22, 2025 at 6:56 PM
but the existence of text and video generators will also fortify those modes of social verification that don't amount to "it's acceptable for me to believe everything I see and read" www.nytimes.com/2025/10/19/o...
October 21, 2025 at 3:54 PM
when any person's face can be pasted into any face-shaped hole, it doesn't make things feel "personalized"; rather it negates the idea that there is anything personal about a face
October 16, 2025 at 4:20 PM
steal your face right off your head
if you want a picture of the future, imagine your own face staring back at you from the chumbox, forever nymag.com/intelligence...
October 16, 2025 at 4:13 PM
or you could say the entire station has been made into one big ad for this photographer news.artnet.com/art-world/hu...
October 14, 2025 at 5:03 PM
Reposted by Rob Horning
"Video generators allow people to experience ideas or beliefs as content without their having to invest their imagination into making them real, into 'really' believing in them and coming to terms with the implications of their beliefs." @robhorning.bsky.social on Sora Slop Feeds
There are too many waterfalls here
Sora slop feeds
robhorning.substack.com
October 14, 2025 at 7:49 AM
Reposted by Rob Horning
October 9, 2025 at 3:39 PM
that users could prefer a generated simulation to actual old clips for nostalgia purposes clarifies how nostalgia is about consuming "decontextualization" in itself — nostalgia negates history under the auspices of longing for it
YouTube has a legit library of recordings from quotidian settings (which are interesting, mostly as historical markers), but instead of promoting that, social media pushes soulless facsimiles solely meant to associate a feeling with a moment, sans the immediate, substantive context
This is doing numbers on social media right now and it's so depressing how people truly yearn for this shit and want to preserve that feeling indefinitely like a mausoleum of false memories.
October 7, 2025 at 5:44 PM
"new conspiracism" doesn't explain anything but is a means for isolated individuals to experience "social validation" on demand, in the absence of a verifiable public — a way to intensify the gratification of parasociality www.nplusonemag.com/issue-51/pol...
October 3, 2025 at 5:04 PM
“infinite video” means not infinite entertainment but infinite boredom; the death drive incarnate
October 2, 2025 at 7:51 PM
the idea that some videos are intrinsically interesting to watch (regardless of whether they have any reference to events or things in themselves, any kind of auratic appeal) feels like it can't survive generative models, which make all forms of mere seeing trivial
October 2, 2025 at 7:44 PM
this from Yves Citton's Mythocracy is maybe useful for thinking about Sora 2 and other slop feeds: Generated video constitutes an "imaginary of power" that gives consumers pictures of how they've been trained to believe things are "supposed to be"
October 2, 2025 at 7:24 PM
wonder if the ease and rapidity with which "AI" can generate right-wing fantasy images and propaganda makes them more convincing for their consumers — as though one shouldn't have to use one's own imagination to manifest the bigotry one insists on www.lrb.co.uk/blog/2025/se...
September 24, 2025 at 4:52 PM
LLMs mean that no one has to write anything they don't care about, but they also mean that "writing anything" will get equated with "not caring" for most people. (If you really cared, you would video yourself talking about it on your phone.)
September 23, 2025 at 6:25 PM
not bad advice, but presumes that most people read and write to experience "charm, surprise, and strangeness" when the opposite may be the case www.nplusonemag.com/issue-51/the...
September 23, 2025 at 6:06 PM
What does it mean to "optimize" for this condition — to train users to enjoy it? Why is it most profitable for companies to train us in wanting to pay attention as a way of avoiding rather than seeking meaning? www.noemamag.com/the-last-day...
September 11, 2025 at 1:50 PM
seems indicative of how stagnant the ideas behind "AI" are that Baudrillard could write a critique of them in 1995 (The Perfect Crime) and none of it seems dated
Why is AI so shit? Its shoddy outputs reflect the brittleness of an epistemology that relies fully on abstraction & reduction and actively eliminates relationality and adaptivity. As such, it doesn’t replace human activity but acts as its violent suppression.
September 10, 2025 at 2:36 PM
Reposted by Rob Horning
And how does the machine know your intent, you might ask? Well by constant surveillance.

Just kidding, the machine does not “know” your fucking “intent.”
September 9, 2025 at 7:30 PM
algorithmic recommendation tries to make it impossible for us to escape our own predictability; continued interaction with these sorts of surveillance systems changes our relationship to our own capability to want things—makes it alien, fully externalized www.theguardian.com/media/2025/a...
September 9, 2025 at 7:21 PM