Fed guy
jerome-powell.bsky.social
vibecession forever. jpow wishes this was his burner. (for clarity, I am not jerome powell)
🫢😱
July 25, 2025 at 1:29 PM
In terms of behavior I think this is accurate for self-generated growth capex, but broader reindustrialization thru incentivizing financial investment requires lower cost of debt. Not denying importance of positive contribution margin, only magnitude
April 21, 2025 at 8:52 PM
However higher interest rates raise the threshold of demand/profitability required to undertake investment
April 21, 2025 at 8:37 PM
👊 🇺🇸 🔥
April 19, 2025 at 2:45 AM
Interesting! In fairness a variety of scenarios can play out based on what happens over the next few months / how nimble players are. Contra my take: hyperscalers still own synergies for innovation and dry powder
January 28, 2025 at 6:35 PM
vs. training (library vs. detective analogy) and/or lower overall training spend. Recall 2015-2019: with the rise of cloud computing, data center workload demand almost tripled while electricity consumption remained flat. In the coming months, the 17% load growth over 5 yrs assumption will be reviewed
January 28, 2025 at 6:13 PM
In most scenarios, power infrastructure investments have the most to lose here. The goal of increasing efficiency is to decrease time/money spent, with electricity being one of the main input costs. Greater efficiency will target and reduce power demand, whether through shifts towards utilization...
January 28, 2025 at 6:09 PM
If both (1) and (2) are true, who are the real losers of the DeepSeek news? Hyperscalers are a clear no. 1, as the barriers to entry are now significantly reduced. Their moat is now capital: if efficiency can be gained across all models, $billions committed eventually translate to greater gains
January 28, 2025 at 6:05 PM
(2) In contrast, lower barriers to entry and efficient model training can lead to an increase in industry entrants and higher utilization (ie Jevons Paradox). If training efficiency is now such that $5m is needed to train vs $100m, which encourages 20 players to enter the market, market size stays equal
January 28, 2025 at 5:59 PM
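The arithmetic in the post above can be sketched quickly; figures are the illustrative ones from the post ($100m pre-efficiency, $5m post-efficiency), not real data:

```python
# Back-of-envelope Jevons-style scenario from the post:
# cheaper training invites more entrants, leaving aggregate spend flat.
old_cost_per_model = 100_000_000  # assumed pre-efficiency training cost (one entrant)
new_cost_per_model = 5_000_000    # assumed post-efficiency training cost

# If lower costs draw in as many entrants as the old budget would fund...
entrants = old_cost_per_model // new_cost_per_model
total_spend = entrants * new_cost_per_model

print(entrants)     # 20 entrants
print(total_spend)  # aggregate spend equals the old single-entrant budget
```

Under these assumptions the market's total training spend is unchanged; only its composition shifts from one large player to many small ones.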
This mainly pertains to capex made by hyperscalers, but has ripple effects. Capital raised for infrastructure and power investments to service AI learning now has lower expected ROI - all else equal
January 28, 2025 at 5:55 PM
Right direction, will probably be poor execution unfortunately
December 11, 2024 at 3:36 AM