Vesper
@index.vesper.computer.ap.brid.gy
A journal of my understanding of the world

🌉 bridged from https://vesper.computer/ on the fediverse by https://fed.brid.gy/
Power User Feature Requests for AI
I thought it might be fun to jot down a few features I wish AI chatbots had, things that fit better with the way I think and use them. I'm keeping this as a living scorecard to see what I get right vs. what actually ships. Also, it just so happens that towards the end of 2025, Your Year with ChatGPT told me I was in the top 1% of users by usage. Obviously, given this elite status, my thoughts are so valuable that they needed to be shared.

## On Memory

Memory is one of my favorite features of ChatGPT, and I was surprised it took as long as it did for Claude (late 2025) to get it. It's powerful enough to create real lock-in for users (more on this in a separate post). As of right now, I can think of at least a couple of ways memories could be improved.

### Decay

**Added:** 2026-01-01

ChatGPT Plus and Pro users have an option to turn on automatic memory management to prioritize relevant memories and avoid hitting capacity. Anecdotally, however, ChatGPT ends up saving memories about me that are often time-limited and hasn't yet purged them. I use ChatGPT for shopping from time to time (for instance, to test their new shopping research experience). This is one of the biggest categories where memories not having some sort of decay mechanism becomes an issue for me. For about a year now, ChatGPT has kept memories like "user is shopping for Lululemon alternatives". I'm not sure how many times this memory was accessed and where, but at the very least it is wasteful of the limited memory space.
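To make the idea concrete, here's a minimal sketch of what a decay mechanism could look like, assuming each memory tracks when it was last used: relevance decays exponentially while a memory sits idle, and anything that falls below a threshold gets surfaced for purging. The class, the half-life, and the threshold are all placeholders of my own; nothing here reflects how ChatGPT actually stores or scores memories.

```python
import math
import time
from dataclasses import dataclass

# Hypothetical numbers: nothing here reflects how ChatGPT actually manages memories.
HALF_LIFE_DAYS = 90      # relevance halves for every 90 days a memory goes unused
PURGE_THRESHOLD = 0.1    # below this score, surface the memory for deletion

@dataclass
class Memory:
    text: str
    last_accessed: float  # unix timestamp of the last time it influenced a response
    access_count: int = 0

    def score(self, now: float) -> float:
        """Exponential decay on idle time, nudged back up by how often it's used."""
        days_idle = (now - self.last_accessed) / 86_400
        decay = math.exp(-math.log(2) * days_idle / HALF_LIFE_DAYS)
        return decay * (1 + math.log1p(self.access_count))

def suggest_purges(memories: list[Memory], now: float) -> list[Memory]:
    """Memories whose relevance has decayed below the threshold."""
    return [m for m in memories if m.score(now) < PURGE_THRESHOLD]

if __name__ == "__main__":
    now = time.time()
    year_ago = now - 365 * 86_400
    memories = [
        Memory("user is shopping for Lululemon alternatives", year_ago),
        Memory("user prefers metric units", now - 86_400, access_count=40),
    ]
    for stale in suggest_purges(memories, now):
        print("candidate for purge:", stale.text)
```

The year-old shopping memory decays out while the frequently used one stays; the user only needs to confirm the purge list, not curate the whole store by hand.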
This brings me to a second feature I'd like to have.

### Usage Stats

**Added:** 2026-01-01

If OpenAI does want me to manage my own memory, I need to know more about how my memories are used. Are all the memories always appended to the beginning of each of my prompts? Probably not, but OpenAI does not disclose how exactly they are fed into the model, and at what stage. The only thing I could find in the documentation is:

> Like custom instructions, saved memories are part of the context ChatGPT uses to generate a response. Unless you delete them, saved memories are always considered in future responses.

In fact, only when I was writing this post did I discover that apparently ChatGPT now puts some memories at the top of its mind:

> To decide which memories stay top of mind, ChatGPT considers factors such as how recent a detail is and how often you talk about a topic.

How often do those 'top of mind' memories actually get used? I would at least like to know what memories were accessed when answering each of my queries. I had once asked ChatGPT to be concise/direct with me when discussing technical details of a research paper, and it remembered that as "user wants me to be concise/direct" without any context of when. At a later point, when I was asking it questions about a topic to learn more about it, the responses were oddly short. It took me a second to realize that it was this past memory impacting the result, and once I deleted it, things went back to normal. There were probably more instances like this, but the point is, I don't want to play a guessing game of what could be influencing my current conversation. Memories that were accessed could be shown next to the response like citations, or under the menu with the thinking details of the model. Overall usage stats, such as the number of times a certain memory was accessed, should be displayed in the Manage Memories screen.

I would also like to have memories evolve with me over time. Consider something like my political stance doing a complete 180 after a period of time. Current documentation suggests that ChatGPT could do this on demand if I ask it to update or delete a particular memory. Ideally, it should recognize the contradiction and replace the older memory with the current one.

## Context management

**Added:** 2026-01-01

This one's perhaps a bit more out there. Despite memories existing, context in a given chat is still clearly very important. The thing I like about ChatGPT over Claude (as of 2025) is that ChatGPT has basically never prompted me to start a new chat because it's running out of context in the current one, whereas Claude has done this to me many times. However, given the importance of context, it's still useful to run different chat sessions about different topics. The issue, though, is that if OpenAI wants to get to a point where it's almost replacing Google Search, there are bound to be many ongoing, unfinished conversations happening at any given time. Maybe this is an issue specific to how I use ChatGPT, or maybe I'm a bit ahead of most people here given my higher-than-average usage. But managing these conversations is turning into a bit of a chore for me. It feels like I have a large personal knowledge management (PKM) style system where instead of organizing notes into different folders or tags or whatever, I'm doing it with chats. I have to think about when to branch from the existing chat into a new one, when to start a fresh chat, when to continue from an older one, which chats go into a "project", and so on.

Something that'd be cool is if ChatGPT would do all this for me. Just as it is supposedly deciding which memories to use for a given prompt, I want it to show me past messages about this particular topic, regardless of which conversation window they happened in. The ultimate version of this would be just a box that I would start typing in, and the context would be pulled automatically as it figures out what topic I'm talking about and to what end.
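To sketch what I mean by that last part: imagine every past message, from every chat, sitting in one index, and the box I'm typing into simply retrieving whatever is closest to the current topic. The toy below uses bag-of-words similarity purely so it runs without dependencies; a real system would presumably use embeddings and a vector store, and every name in it is something I made up.

```python
import math
from collections import Counter
from dataclasses import dataclass

@dataclass
class ChatMessage:
    chat_id: str   # which conversation window the message originally came from
    text: str

def _vector(text: str) -> Counter:
    # Toy stand-in for a real embedding model: word counts.
    return Counter(text.lower().split())

def _cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def pull_context(prompt: str, history: list[ChatMessage], k: int = 3) -> list[ChatMessage]:
    """Return the k past messages most similar to the prompt, ignoring chat boundaries."""
    q = _vector(prompt)
    ranked = sorted(history, key=lambda m: _cosine(q, _vector(m.text)), reverse=True)
    return ranked[:k]

if __name__ == "__main__":
    history = [
        ChatMessage("chat-42", "comparing lululemon alternatives for running shorts"),
        ChatMessage("chat-17", "notes on the attention mechanism in transformers"),
        ChatMessage("chat-17", "follow-up question about positional encodings"),
        ChatMessage("chat-88", "planning a trip to lisbon in march"),
    ]
    for m in pull_context("how do transformers handle positional information", history, k=2):
        print(m.chat_id, "->", m.text)
```

The interesting part isn't the retrieval itself, it's that `chat_id` stops being something I have to manage: the model decides which prior threads are relevant, not me.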
vesper.computer
January 2, 2026 at 8:13 AM
Are We Looking at the First iPhone Built for AI?
The iPhone design cycle seemed to have shifted from two years to three in recent years. But with the 17 Pro, Apple has completely redesigned the chassis and used a new material, after a brief two-year stint with titanium. Is this an intentional move to switch back to a two-year design cycle, as Ben Thompson speculates? Or did something else cause this change?

A bit of context for those who are not obsessively tracking internal details about iPhones (I am aware of my problem). Apple had gone from using stainless steel to titanium for their Pro phones in 2023, touting the many benefits of titanium. It was stronger and lighter than stainless steel. The good thing about titanium is that it allows Apple to finally resume its pursuit of making an ultra-thin phone, while maintaining enough tensile strength so that the phone does not bend easily. There was an initial bit of drama around the thermals of the 15 Pro, resulting in a firmware update and a statement by Apple that it was not the titanium that was causing the issue. While true, reviews of the 15 Pro's performance under sustained workloads have been mixed (some stress tests favored the 15 Pro while others showed more throttling). To Apple's credit, they tweaked the design with the 16 Pro, with up to 20% better sustained performance. I'm not saying titanium caused "bad" thermals, more that it is definitely the case that aluminum is way better at spreading heat around the phone than titanium. The iPhone 17 Pro reflects this, with a claimed up to 40% better sustained performance vs the 16 Pro.

iPhone case designs are believed to be locked eighteen to twenty-four months before launch day. And we heard that Craig Federighi became interested in AI after playing with it over Christmas of 2022. If we assume that was around the time Apple started taking AI seriously, it was already too late to change the design of the 16 Pro, which would've been locked between September 2022 and March 2023. It does, however, give Apple ample time to design the 17 Pro _from the ground up_ for AI.

My hypothesis is that titanium was a choice made before the AI hype reached a stratospheric level, and before Apple started to take it seriously. The choice at the time might have seemed fine: Apple sacrifices a bit of thermal headroom in exchange for being able to make the iPhone Air, and later the iPhone Fold (perhaps as soon as next year). AI forced a change in the design two years in, bringing us the first iPhone truly[1] built for Apple Intelligence[2].

* * *

1. Since Apple already marketed the iPhone 16/16 Pro as "built for Apple Intelligence." ↩︎
2. Whenever it ships. ↩︎
vesper.computer
September 15, 2025 at 2:16 AM