Ethan Mollick
@emollick.bsky.social
Professor at Wharton, studying AI and its implications for education, entrepreneurship, and work. Author of Co-Intelligence.
Book: https://a.co/d/bC2kSj1
Substack: https://www.oneusefulthing.org/
Web: https://mgmt.wharton.upenn.edu/profile/emollick
The other option, from Pater
November 2, 2025 at 1:10 AM
The Angel of History
November 1, 2025 at 7:23 PM
The lessons, such as they are:
1) Don’t trust anything you see online
2) Hard to imagine that AI video will not have a major impact on short-form video platforms
3) It's pretty fun to be able to summon whatever idea you have out of latent space
October 31, 2025 at 1:41 AM
And we can go further.
It would only take 780 volumes to contain the full weights of GPT-1. And only 30 person-years for a human scribe to do the math to generate the first token in response to a prompt using the paper version of GPT-1.
So this answer would take 60 years.
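A rough sanity check on those figures, sketched in Python. The GPT-1 parameter count (~117M) and the roughly 2-FLOPs-per-parameter-per-token forward-pass estimate are standard; the volume capacity and scribe speed below are my own assumptions, chosen only to show the post's numbers are in the right ballpark, not figures from the post itself.

```python
# Back-of-envelope check of the paper-GPT-1 estimate.
PARAMS = 117_000_000                    # GPT-1 parameter count
NUMBERS_PER_VOLUME = 150_000            # assumed capacity of one printed volume
FLOPS_PER_PARAM_PER_TOKEN = 2           # rough forward-pass cost per parameter
SECONDS_PER_OP = 1.0                    # assumed scribe speed: one multiply-add per second
WORK_SECONDS_PER_YEAR = 2_000 * 3_600   # one working person-year (~2,000 hours)

volumes = PARAMS / NUMBERS_PER_VOLUME                          # ~780 volumes
ops_per_token = PARAMS * FLOPS_PER_PARAM_PER_TOKEN             # ~234M operations
person_years = ops_per_token * SECONDS_PER_OP / WORK_SECONDS_PER_YEAR  # ~30 person-years

print(f"volumes needed: ~{volumes:,.0f}")
print(f"person-years per token: ~{person_years:,.0f}")
```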
October 29, 2025 at 12:10 PM
I was pretty surprised at the success rate myself. I spoke to the authors about it; I would not call the methodology dubious, but it is a survey of self-reports (and ultimately it is what senior management reports that drives decision-making, since actual ROI is hard to measure).
October 28, 2025 at 10:21 PM
And I made a math mistake: 900,000 people, not nine million.
I did the math in my head and was wrong. Should have used an LLM.
October 28, 2025 at 4:57 AM
I wonder if we will start to see pressure on other model-makers to also address mental health risks directly. It seems like a position that policymakers might take in the future.
October 28, 2025 at 4:43 AM