xcm.bsky.social
@xcm.bsky.social
Public librarian and total nerd in Palo Alto, CA

AKA @akamarkman
concert tour t-shirt spoof that just lists different LLM model drop dates in a year
November 24, 2025 at 7:41 PM
The hilarious idea of combining www.copyright.gov/vcc/ & DeepSeek OCR
Virtual Card Catalog | U.S. Copyright Office
www.copyright.gov
October 23, 2025 at 12:19 AM
10/20/25, a day forever known in the public library world as the "Hooplapocalypse"
October 20, 2025 at 11:54 PM
Don't be fooled, "Agentic AI" is "Daemonic AI" in sheep's clothing 🤣
September 18, 2025 at 12:51 AM
Reposted
What happens when you show up for a Zoom meeting ... and the only other participants are AI note takers?

It happened to my colleague:
www.washingtonpost.com/technology/2...
No one likes meetings. They’re sending their AI note takers instead.
Artificial intelligence apps that record and summarize meetings can tempt workers into skipping calls, leaving humans who join in the company of silent bots.
www.washingtonpost.com
July 2, 2025 at 9:49 PM
Reposted
why did this make me laugh
July 3, 2025 at 11:31 AM
Reposted
The diversity of horizontal navigation in 2000

#WebDesignHistory
June 19, 2025 at 1:08 PM
I'll say it again: can't ignore the personalized accessibility benefits of GenAI, especially combined w/ local LLM. Game changer in many ways.
March 14, 2025 at 5:52 PM
Extremely satisfying to fix $100 headphones with $0.02 tape 🎶
February 17, 2025 at 9:04 PM
So I just found this and it's very reminiscent of the ideas percolating at Xerox PARC before the PC revolution took off. What happens when we focus tech on making people smarter? In a lot of ways the internet became an entertainment machine, but that's not all it can do!
Tools for Thought - Microsoft Research
Better thinking through AI Much AI research focuses on solving specific tasks for people – generating content or automating processes. While such systems may be powerful, there are risks that this app...
www.microsoft.com
February 11, 2025 at 7:57 PM
Reposted
This paper is wild - a Stanford team shows the simplest way to make an open LLM into a reasoning model

They used just 1,000 carefully curated reasoning examples & a trick where if the model tries to stop thinking, they append "Wait" to force it to continue. Near o1 at math. arxiv.org/pdf/2501.19393
February 7, 2025 at 2:53 AM
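A minimal sketch in Python of the "Wait" budget-forcing trick described in the repost above, with a stand-in model_generate helper, an assumed end-of-reasoning marker, and an arbitrary token budget (all illustrative assumptions, not details from the paper):

# Minimal sketch of the "Wait" budget-forcing trick: if the model tries
# to close its reasoning early, suppress the stop and append "Wait" so
# it keeps thinking. `model_generate`, the stop marker, and the token
# budget are all illustrative stand-ins, not taken from the paper.

END_OF_THINKING = "</think>"   # assumed end-of-reasoning marker
MIN_THINKING_TOKENS = 512      # assumed minimum thinking budget

def model_generate(prompt: str, stop: str) -> str:
    """Placeholder for a real LLM call that decodes until `stop`."""
    raise NotImplementedError("plug in an actual model here")

def forced_reasoning(prompt: str) -> str:
    trace = ""
    while True:
        # Decode until the model tries to end its reasoning block.
        trace += model_generate(prompt + trace, stop=END_OF_THINKING)
        # Rough word-count proxy for a token budget.
        if len(trace.split()) >= MIN_THINKING_TOKENS:
            return trace + END_OF_THINKING
        # Below budget: swallow the stop and nudge it to keep going.
        trace += " Wait"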
LLM Horror Story: The (API) Call is Coming From Inside the House
January 28, 2025 at 5:45 AM
OpenAI "Operators" + scheduled tasks will be game changer for public library systems. We're all union catalogs now!
January 25, 2025 at 12:17 AM
Over the years I've come to realize that my quality of life is directly proportional to quick laser printer access. Respect the printer, behold the power of publishing🫡🖨️(and I'm only half kidding about this)
January 18, 2025 at 9:27 PM
Watching a webinar recording you did months ago and then realizing you're currently wearing the same shirt is funny. Perhaps I time traveled, but forgot?
January 17, 2025 at 1:50 AM
Nice to see "books3" in the LLM news once again. Trying to mesh this with Lawrence Lessig's book 'Remix' from 2009 and all the fair use advocacy that librarians did back then...
January 11, 2025 at 12:40 AM
Reposted
New AI Snake Oil essay: Last month the AI industry's narrative suddenly flipped — model scaling is dead, but "inference scaling" is taking over. This has left people outside AI confused. What changed? Is AI capability progress slowing? We look at the evidence. 🧵 www.aisnakeoil.com/p/is-ai-prog...
December 19, 2024 at 12:16 PM
I don't normally rate dogs, but this one is easily a 12/10
"a pug is crowned king of england in an elaborate ceremony, ending with a crown being placed on its head"
December 18, 2024 at 1:43 AM
Ok instead of Secret Santa I propose a new game called Secretive Santa™
December 7, 2024 at 12:50 AM
The irony of a 30-second video intro about "not wasting time and getting into it" 😉
December 6, 2024 at 6:57 PM
"We can rebuild it. We have the technology. We can make it better than it was. Better, stronger, faster. Higher Quality Memes. Lower pings. 144hz refresh rates."
November 15, 2024 at 12:06 AM
Reposted
The Onion should buy Elsevier next
November 14, 2024 at 8:28 PM