Jacob Eisenstein
@jacobeisenstein.bsky.social
natural language processing and computational linguistics at google deepmind.
🛸
August 27, 2025 at 7:23 PM
thanks! i was more confused about the “kugel” part but TIL that this is apparently inspired by an airy globe?
August 26, 2025 at 3:37 AM
August 25, 2025 at 9:55 PM
i think my great-grandmother was the last owner of these books who knew how to read them
August 25, 2025 at 9:52 PM
On the positive side, this vario grinder, which i bought secondhand, is the best technological upgrade of the summer in my house.

(Its grind settings are 1-10, a-z, so the chatgpt output is clearly wrong and the claude output is nonsensical)
August 11, 2025 at 4:06 PM
right but you’ll notice it’s pretty hard to validate a proposed answer to those why questions, so it was not unreasonable to hypothesize that a better formal model of language might yield better features for an NLP system
August 8, 2025 at 7:54 PM
The project of putting statistical meat on grammarian bones is, imo, a beautiful one (this is basically what the textbook is about), even if it didn’t work out as a way to build NLP. It was helpful to have the participation of people like Bender, who understood the latest ideas in theoretical syntax.
August 8, 2025 at 2:39 PM
Bender is/was fairly distinct among syntax people for caring about statistical NLP and for believing that it can or even must incorporate sophisticated ideas about grammar. Lots of NLP people thought this for a long time, but I don’t think many linguists did.
August 8, 2025 at 2:27 PM
Reposted by Jacob Eisenstein
this is very cool and i’m looking forward to reading the paper, but a basic question about this data: isn’t it likely that a congressional rep’s speeches are written by a shifting cast of speechwriters over the course of their career? wouldn’t that explain adoption of new usages?
July 29, 2025 at 9:16 PM