Philip Nuzhnyi
callmephilip.com
@callmephilip.com
🧜‍♂️ Surftware engineer. Stochastic parrot.

CODE: https://github.com/callmephilip
MD: https://callmephilip.com/
Apparently abiogenesis is also sometimes referred to as "biopoesis" - which is a. poetic b. poesque
November 7, 2025 at 11:10 AM
"Each program inside tmux gets its own terminal managed by tmux, which can be accessed from the single terminal where tmux is running - this is called multiplexing and tmux is a terminal multiplexer." - from github.com/tmux/tmux/wi...
November 3, 2025 at 11:28 AM
My buddy Dostoevsky is cycling from Lisbon to Malaga and I am rawdogging it! whereisdostoevsky.com source: github.com/callmephilip...
October 11, 2025 at 10:29 AM
Frenchies took over MS's docs department - www.typescriptlang.org/docs/handboo...
September 22, 2025 at 6:17 PM
Claude Code layered interactions (source: blog.promptlayer.com/claude-code-...)
September 9, 2025 at 11:25 AM
Claude Code Flowchart (source: blog.promptlayer.com/claude-code-...)
September 9, 2025 at 11:25 AM
The Litany Against Fear
September 8, 2025 at 8:40 PM
It's week 3. Claude is not helping at all
August 11, 2025 at 10:11 AM
alias claudius="claude --dangerously-skip-permissions"
August 8, 2025 at 7:39 AM
Interesting typographical choices here (from en.wikipedia.org/wiki/COBOL)
August 8, 2025 at 7:14 AM
It's not enough to just throw a bunch of tools at the model and hope for it to "get it". LLMs need to be told when and how to use each tool. Claude's system prompt breakdown from www.dbreunig.com/2025/05/07/c...
August 4, 2025 at 10:34 AM
TIL: all those whimsical gerunds in Claude code get generated on the fly via Haiku. I am lost for words. Should have asked Haiku for an appropriate verbal concoction to use
August 3, 2025 at 12:30 PM
You can subclass envs to create stricter versions of them, for example
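A minimal sketch of that pattern. The class and attribute names here are illustrative, not the actual API; the base env just shells out via `subprocess.run`, as described in the posts below:

```python
import subprocess

# Hypothetical base env: each action runs as its own independent subprocess.
class LocalEnvironment:
    def execute(self, command: str) -> dict:
        result = subprocess.run(command, shell=True, capture_output=True, text=True)
        return {"output": result.stdout + result.stderr, "returncode": result.returncode}

# Stricter subclass: refuses destructive commands instead of running them.
class StrictEnvironment(LocalEnvironment):
    BLOCKED = ("rm ", "sudo ", "shutdown")

    def execute(self, command: str) -> dict:
        if command.strip().startswith(self.BLOCKED):
            return {"output": f"Blocked: {command!r}", "returncode": 1}
        return super().execute(command)
```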
July 25, 2025 at 12:00 PM
You can hijack execution for specific actions
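For instance (hypothetical names again, assuming the same kind of subprocess-backed base env), a subclass can intercept one specific action so it never reaches the shell:

```python
import subprocess

class LocalEnvironment:
    def execute(self, command: str) -> dict:
        result = subprocess.run(command, shell=True, capture_output=True, text=True)
        return {"output": result.stdout + result.stderr, "returncode": result.returncode}

# Hijack execution for `git push`: return a canned result instead of running it.
class NoPushEnvironment(LocalEnvironment):
    def execute(self, command: str) -> dict:
        if command.strip().startswith("git push"):
            return {"output": "Intercepted: pushes are disabled here.", "returncode": 0}
        return super().execute(command)
```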
July 25, 2025 at 11:55 AM
Here's the LOOP:

- call LLM using current messages
- parse the response and get the action
- exec the action
- render observation (`DefaultAgent.render_template`)
- add observation to the message list
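The steps above, sketched in Python. The LLM call and response parsing are stubbed out, and `render_template` is a stand-in for `DefaultAgent.render_template`:

```python
import subprocess

def run_loop(messages, call_llm, parse_action, render_template, max_steps=10):
    for _ in range(max_steps):
        response = call_llm(messages)      # call LLM using current messages
        messages.append({"role": "assistant", "content": response})
        action = parse_action(response)    # parse the response and get the action
        if action is None:                 # no action means the agent is done
            break
        result = subprocess.run(action, shell=True, capture_output=True, text=True)
        observation = render_template(output=result.stdout + result.stderr)
        messages.append({"role": "user", "content": observation})  # add observation
    return messages
```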
July 25, 2025 at 11:36 AM
- every action is executed via `subprocess.run`: each action is independent from the previous (there is no central shell session). Why is this a big deal? ⤵️
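Quick illustration of the consequence: shell state like `cd` dies with each subprocess, so every action starts from scratch.

```python
import subprocess

def run(cmd: str) -> str:
    # a fresh shell for every call, just like independent agent actions
    return subprocess.run(cmd, shell=True, capture_output=True, text=True).stdout.strip()

run("cd /")        # the directory change dies with this subprocess
print(run("pwd"))  # still the original working directory, not /
```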
July 25, 2025 at 11:20 AM
They: your product is just a ChatGPT wrapper
You:

genius.com/Chase-and-st...
July 23, 2025 at 12:30 PM
Trying to get Claude to give me a nice ASCII art style Winnie-the-Pooh. I mentioned Claude Monet (half jokingly) and things escalated.
July 19, 2025 at 8:32 AM
Marcel chauffe! 😂 love it. `marcel grève` is a chef's kiss github.com/brouberol/ma...
July 17, 2025 at 12:47 PM
Watched “High Noon” last night. Here is the old marshal explaining to Cooper why no one is willing to help him defend the town:

“People gotta talk themselves into law and order before they do anything about it. Maybe because down deep they don't care. They just don't care.”
July 5, 2025 at 5:38 AM
"Context Offloading is the act of storing information outside the LLM’s context, usually via a tool that stores and manages the data." Create a "scratchpad" - a place for the model to write down notes, keeping them out of the context. Offloading scenarios (from Anthropic):
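An illustrative scratchpad tool (names made up): notes go to a file on disk, only a short acknowledgement enters the context, and the notes come back in only when explicitly read.

```python
from pathlib import Path

class Scratchpad:
    def __init__(self, path: str = "scratchpad.md"):
        self.path = Path(path)

    def write_note(self, note: str) -> str:
        with self.path.open("a") as f:
            f.write(note + "\n")
        return "Saved."  # the note itself stays out of the context

    def read_notes(self) -> str:
        # pulled back into the context only on an explicit read
        return self.path.read_text() if self.path.exists() else ""
```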
July 4, 2025 at 2:44 PM
"Context Pruning is the act of removing irrelevant or otherwise unneeded information from the context." Example using Provence:
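Provence itself is a trained pruning model; as a crude stand-in, here is a word-overlap filter that drops context sentences sharing no terms with the query:

```python
def prune_context(query: str, sentences: list[str], min_overlap: int = 1) -> list[str]:
    # keep only sentences sharing at least min_overlap words with the query
    query_words = set(query.lower().split())
    return [s for s in sentences
            if len(query_words & set(s.lower().split())) >= min_overlap]
```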
July 4, 2025 at 2:44 PM