So to reframe what we're all witnessing in real-time... the tech industry has already committed murder, and it is currently scrambling to figure out how to make that everyone else's problem, and to cover it up, before we notice what actually happened.
November 10, 2025 at 10:57 PM
It's still fundamentally the same problem. I don't see how a tech company that makes AI coding tools can sell those tools to other tech companies and grow its revenue YoY, even if it charges by token/compute/whatever unit.
November 10, 2025 at 10:53 PM
Even if you approach LLM coding assistance from the (imo catastrophic) perspective of "what if the entire tech sector can lay off most engineers, replace them with AI, and maintain everything they currently do?"
November 10, 2025 at 10:53 PM
Like crashes past. Like 2008. This will ultimately be a massive wealth transfer upward. Middle-class savings will be wrecked. The working class squeezed harder. An attempt will be made to sacrifice a few companies and individuals to the public by building them up as enemies in media narratives.
November 10, 2025 at 10:50 PM
I think the most likely outcome is that Wall Street has already seen the writing on the wall, and is currently engaged in a musical-chairs shadow standoff to pump the markets and get as well-positioned as possible for a crash.
A few big players will be wiped out.
November 10, 2025 at 10:50 PM
What we are effectively doing with AI at present is cannibalizing futures for present "value" that ultimately only exists on the basis of multiplying rather than reducing futures.
Or put another way, destruction for destruction's sake.
November 10, 2025 at 10:46 PM
Human beings, I believe, actually live in the past, present, and future simultaneously, in a very real way.
Our ability to "touch" the future is actually incredibly valuable, and not something AI is remotely close to accessing.
November 10, 2025 at 10:46 PM
LLMs can only regurgitate the past through very advanced pattern matching. Sometimes they can do it in incredible ways, and in ways that are truly beyond the limits of humans. LLMs cannot directly interact with the future.
November 10, 2025 at 10:46 PM
Eventually, something will be the thread that pulls it all apart.
Quite literally the only thing that could really prevent this crash is AGI, and unfortunately for the tech industry, AGI is still beyond the horizon line. We're nowhere close.
November 10, 2025 at 10:46 PM
This is an impossible market.
In the near future, it's very unlikely any AI company will see meaningful, scalable revenue growth in any space other than AI coding tools.
This will never be able to justify the insane capex spending of big tech and investors.
November 10, 2025 at 10:46 PM
The success of this product is strongly correlated with the decline in the size of its own market. No matter how you slice it, there's no way to grow the revenue of AI coding tools while also continuing to train better models and build better AI tools for engineers.
November 10, 2025 at 10:46 PM
Furthermore, the actual revenue of the AI coding tool products will plummet, as a company paying per engineer for access has zero reason to pay the same amount for AI coding tools for fewer engineers.
November 10, 2025 at 10:46 PM
As the tech industry contracts to greater and greater amounts of code being written by fewer and fewer engineers "multiplied" by AI, we're effectively polluting the datasets of potentially better future models by training them on accelerating echo-chambers.
November 10, 2025 at 10:46 PM
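A rough sketch of what that echo-chamber looks like as training-data composition over model generations. The growth and decay rates here are made up purely for illustration; this isn't a claim about any real corpus:

    # Toy illustration: the share of genuinely new human-written code in the
    # training pool shrinks as AI-generated code piles up faster each generation.
    # The rates below are invented for illustration only.
    human_rate = 1.0   # new human-written code per year (arbitrary units)
    ai_rate = 1.0      # new AI-generated code per year (arbitrary units)
    human_total = ai_total = 0.0

    for generation in range(1, 6):
        human_total += human_rate
        ai_total += ai_rate
        share = human_total / (human_total + ai_total)
        print(f"gen {generation}: {share:.0%} of the corpus is human-written")
        human_rate *= 0.8  # fewer engineers writing genuinely new code
        ai_rate *= 1.5     # more code "multiplied" by AI entering the pool
    # Each new model trains on a pool that is more and more echoes of the last one.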
AI coding tools always need *more* code, and more importantly more *new* code, written by humans outside the current scope of the AI's training data in order to improve. Some improvements can be made by training on the same data more effectively, but ultimately what you need is more and more new data to evolve.
November 10, 2025 at 10:46 PM
If you have 100 engineers, and you are paying $40/mo per engineer to GitHub, you are paying GitHub $48,000 a year.
If you downsize to 90 engineers, you are now paying GitHub $43,200 a year.
If you downsize to 70 engineers, you are now paying GitHub $33,600 a year.
Do you see the problem here?
November 10, 2025 at 10:46 PM
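To spell out the per-seat math, a quick back-of-envelope sketch in Python. The $40/seat/month figure is the hypothetical from the example above, not a real price list:

    # Annual spend on a per-seat AI coding tool as a function of headcount.
    SEAT_PRICE_PER_MONTH = 40  # hypothetical figure from the example, not a real quote

    def annual_spend(engineers: int) -> int:
        """Yearly spend for a customer paying per engineer."""
        return engineers * SEAT_PRICE_PER_MONTH * 12

    for headcount in (100, 90, 70):
        print(f"{headcount} engineers -> ${annual_spend(headcount):,}/year")
    # 100 engineers -> $48,000/year
    # 90 engineers -> $43,200/year
    # 70 engineers -> $33,600/year

Every seat the customer cuts is revenue the vendor loses, no matter how good the tool gets.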
The more engineers you can lay off, the more valuable the AI is to you, because you can pay less for code from AI than you pay for code from humans.
The more engineers you lay off, the fewer engineers you have left to lay off, and the less valuable the AI is to you.
AI coding tools eat their own markets.
November 10, 2025 at 10:46 PM
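A toy model of that feedback loop, purely to illustrate its shape. The salary, seat price, and layoff rate below are invented numbers, not data:

    # Toy model of "AI coding tools eat their own markets."
    # All figures are assumptions for illustration: salary, seat price, layoff rate.
    SALARY = 150_000        # assumed fully loaded annual cost per engineer
    SEAT_PRICE = 40 * 12    # assumed annual per-seat tool cost
    CUT_RATE = 0.15         # assumed share of remaining engineers cut each year

    engineers = 100
    for year in range(1, 6):
        cut = int(engineers * CUT_RATE)
        engineers -= cut
        savings = cut * SALARY                   # value the buyer credits to the tool
        vendor_revenue = engineers * SEAT_PRICE  # what the vendor can still bill
        print(f"year {year}: {engineers} engineers, buyer saves ${savings:,}, "
              f"vendor bills ${vendor_revenue:,}")
    # Both the savings that justify the tool and the seats the vendor can bill
    # shrink every year the tool "succeeds."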
There is real utility here. It may grow anywhere from a little to a lot over the next year. It's probably not going to become less useful than it currently is.
At the end of the day, what companies are willing to pay for AI coding tools is tied to headcount.
November 10, 2025 at 10:46 PM
it just seems like we’ve really passed the point in the flowchart where there’s an off ramp. i hope grey market stuff continues to accelerate and cis people get smart about funneling prescriptions to trans people, but i don’t see how that’s a viable long term solution at scale. it’s just a bandage.
November 10, 2025 at 4:59 AM