autoreply.ooo
autoreply⧸𝗻𝗽𝗺
@autoreply.ooo
Do not disturb.
OOO.
WFH.
@hailley.co you probably should feed something like this into your AI (attitude insinuator) machine.
November 1, 2025 at 2:48 AM
An LLM (just like the human mind) will have a much nicer time reading a compact, de-noised format than a chatty, dull one.

And that means better context, better understanding and summarisation.
November 1, 2025 at 2:41 AM
Betting BIG on Markdown:

>autoreply thread --post-uri bsky.app/profile/pfra...

# Thread · 57 posts

@pfrazee.com/3khldb2lfym2n
> it's weird how some spots get body hair and some dont. why armpits and not, like, that space in between your fingers. fingie hair.
👍 85 ♻️ 8 💬 26 2023-12-28T04:25:25
November 1, 2025 at 2:40 AM
Reposted by autoreply⧸𝗻𝗽𝗺
😎
October 15, 2025 at 7:07 PM
hello starter pack people, what brings you here
October 16, 2025 at 8:13 AM
🧇 to the haters 😤
October 9, 2025 at 3:14 PM
BlueSky 𝘀𝗵𝗼𝘂𝗹𝗱 put casual and new users’ needs over big posters’

Blue tick bias is not good for healthy community growth.
October 9, 2025 at 7:29 AM
Call it «Mastodonisation» — when bizarre and hostile features pop up at a user out of misplaced concern:

shutting the fridge turns all cutlery in your kitchen into spoons — for free!
October 9, 2025 at 7:14 AM
Letting the user choose is often a very, very bad solution.

No one has the mental bandwidth to understand the weird nuances, so replies will be lost randomly.
October 9, 2025 at 7:14 AM
Half the time it goes «I’ve had enough, shelling out to old-style cmd /c»
ALT: a dog wrapped in a blanket is lying on a bed with the words "let's go back to sleep" above it.
media.tenor.com
October 9, 2025 at 7:04 AM
At least Claude isn't in charge of Bash.

VSCode on Microsoft's own Windows was forced onto PowerShell, and now at every prompt it wrangles with wacky syntax and escapes.
October 9, 2025 at 6:58 AM
Oh juicy, will have a look this weekend.

I've implemented BSky OAuth in Rust for my stuff, but maybe I can switch to yours as standard.

BTW I've also got a CAR/CBOR repository format reader in Rust if you want:

github.com/oyin-bo/auto...
autoreply/rust-server/src/car at main · oyin-bo/autoreply
Gemini CLI extension for BlueSky.
github.com
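Not the repo's actual code, just a minimal sketch of the CARv1 framing such a reader has to handle: a varint-length-prefixed CBOR header followed by varint-length-prefixed (CID + data) blocks. CBOR and CID decoding would still come from proper crates; this only splits the sections.

```rust
/// Reads an unsigned LEB128 varint from `buf`, advancing `pos`.
fn read_varint(buf: &[u8], pos: &mut usize) -> Option<u64> {
    let (mut value, mut shift) = (0u64, 0u32);
    loop {
        let byte = *buf.get(*pos)?;
        *pos += 1;
        value |= u64::from(byte & 0x7f) << shift;
        if byte & 0x80 == 0 {
            return Some(value);
        }
        shift += 7;
        if shift >= 64 {
            return None; // malformed varint
        }
    }
}

/// Splits a CARv1 byte buffer into its sections: the CBOR header first,
/// then one slice per (CID + block data) section.
fn split_car_sections(car: &[u8]) -> Option<Vec<&[u8]>> {
    let mut pos = 0;
    let mut sections = Vec::new();
    while pos < car.len() {
        let len = read_varint(car, &mut pos)? as usize;
        let end = pos.checked_add(len)?;
        sections.push(car.get(pos..end)?);
        pos = end;
    }
    Some(sections)
}
```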
October 9, 2025 at 6:45 AM
My goal is 1ms per post average.

I’ve been trying to attack it from the other end: making older, fast NLP methods smarter.

But I should try the opposite too: making smart transformers faster.
October 7, 2025 at 2:47 PM
Thank you!! ❤️

Very helpful, I'll experiment with your code!

BTW, what order of magnitude is the time to produce embeddings for a 300-character text? About 50-100ms?
October 7, 2025 at 7:54 AM
Meet patriarch, kill patriarch 😊

😌 Meet Buddha, kill Buddha
ALT: a close-up of a woman's face with the words "I need Japanese steel" above her.
media.tenor.com
October 6, 2025 at 4:43 PM
Embeddings-based search is a realistic 1st step.

But really, the local LLM will bring social media from noise to tranquility.
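A rough sketch of that first step, with made-up types (nothing here is from the actual project): embed each post once, then rank stored posts by cosine similarity against the query embedding.

```rust
/// A stored post with a precomputed embedding (hypothetical shape).
struct Post {
    text: String,
    embedding: Vec<f32>,
}

/// Cosine similarity between two embedding vectors.
fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

/// Indices of the `k` posts most similar to the query embedding.
fn top_k(posts: &[Post], query: &[f32], k: usize) -> Vec<usize> {
    let mut scored: Vec<(usize, f32)> = posts
        .iter()
        .enumerate()
        .map(|(i, p)| (i, cosine(&p.embedding, query)))
        .collect();
    scored.sort_by(|a, b| b.1.partial_cmp(&a.1).unwrap_or(std::cmp::Ordering::Equal));
    scored.into_iter().take(k).map(|(i, _)| i).collect()
}
```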
October 6, 2025 at 4:39 PM
The fix is to run small-scale AI locally.

I've been planning and tinkering and failing for months on and off.

This is the way ¯⁠\⁠_⁠༼⁠ ⁠•́⁠ ͜⁠ʖ⁠ ⁠•̀⁠ ⁠༽⁠_⁠/⁠¯

bsky.app/profile/auto...
October 6, 2025 at 4:37 PM
Done a lot of 𝗪𝗙𝗛 on semantic search, but there's still a long way to go.

There's an implementation of a SentencePiece tokenizer in Rust and Go, though. But it's not properly tested end-to-end, and not yet used for anything real.

bsky.app/profile/auto...
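For flavour only, and not the linked Rust/Go code (which follows the real SentencePiece model): SentencePiece marks word starts with ▁ and segments the text into vocabulary pieces. The toy version below uses greedy longest-match where the real unigram model would run a Viterbi search over piece probabilities.

```rust
/// Toy SentencePiece-style segmentation: spaces become the ▁ marker,
/// then greedy longest-match against a piece vocabulary,
/// falling back to single characters.
fn tokenize(text: &str, vocab: &[&str]) -> Vec<String> {
    let normalized = format!("\u{2581}{}", text.replace(' ', "\u{2581}"));
    let chars: Vec<char> = normalized.chars().collect();
    let mut pieces = Vec::new();
    let mut i = 0;
    while i < chars.len() {
        // Longest vocab piece matching at position i, if any.
        let mut best: Option<String> = None;
        for end in (i + 1)..=chars.len() {
            let candidate: String = chars[i..end].iter().collect();
            if vocab.contains(&candidate.as_str()) {
                best = Some(candidate);
            }
        }
        let piece = best.unwrap_or_else(|| chars[i].to_string());
        i += piece.chars().count();
        pieces.push(piece);
    }
    pieces
}
```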
October 5, 2025 at 7:12 PM
I guess if I want it I have to...
ALT: a man in a black suit is sitting in a chair and saying "do it".
media.tenor.com
October 4, 2025 at 6:04 PM