# Power User Feature Requests for AI
I thought it might be fun to jot down a few features I wish AI chatbots had, things that fit better with the way I think and use them. I'm keeping this as a living scorecard to see what I get right vs. what actually ships. Also, it just so happens that towards the end of 2025, Your Year with ChatGPT told me I was in the top 1% of users by usage. Obviously, given this elite status, my thoughts are so valuable that they needed to be shared.
## On Memory
Memory is one of my favorite features of ChatGPT, and I was surprised it took as long as it did for Claude (late 2025) to get it. It's powerful enough to create real lock-in for users (more on this in a separate post). As of right now, I can think of at least a couple of ways memories could be improved.
### Decay
**Added:** 2026-01-01
ChatGPT Plus and Pro users have an option to turn on automatic memory management to prioritize relevant memories and avoid hitting capacity. Anecdotally, however, ChatGPT ends up saving memories about me that are often time-limited, and it hasn't yet purged them.
I use ChatGPT for shopping from time to time (for instance, to test their new shopping research experience). Shopping is one of the biggest categories where the lack of a decay mechanism for memories becomes an issue for me. For about a year now, ChatGPT has had memories saved like "user is shopping for Lululemon alternatives". I'm not sure how many times this memory has been accessed or where, but at the very least it wastes limited memory space. This brings me to a second feature I'd like to have.
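Before getting to that second feature, here's a rough sketch of the kind of decay I have in mind. This isn't how ChatGPT actually works; the names and the 90-day half-life are placeholders I made up to make the idea concrete: score each memory by how recently and how often it's been used, and prune the low scorers.

```python
# Hypothetical sketch only; Memory, retention_score, and half_life_days are my own names.
import math
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Memory:
    text: str
    last_accessed: datetime
    access_count: int

def retention_score(mem: Memory, now: datetime, half_life_days: float = 90.0) -> float:
    """Decay a memory's score exponentially based on how long since it was last used."""
    age_days = (now - mem.last_accessed).total_seconds() / 86400
    decay = 0.5 ** (age_days / half_life_days)          # halves every `half_life_days`
    return decay * (1 + math.log1p(mem.access_count))   # frequently used memories fade slower

def prune(memories: list[Memory], keep: int) -> list[Memory]:
    """Keep only the `keep` highest-scoring memories and drop (or archive) the rest."""
    now = datetime.now(timezone.utc)
    return sorted(memories, key=lambda m: retention_score(m, now), reverse=True)[:keep]
```

Under something like this, that year-old Lululemon memory would sink to the bottom of the list instead of sitting in limited memory space indefinitely.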
### Usage Stats
**Added:** 2026-01-01
If OpenAI does want me to manage my own memory, I need to know more about how my memories are used. Are all of them always appended to the beginning of each of my prompts? Probably not, but OpenAI does not disclose how exactly they are fed into the model, and at what stage. The only thing I could find in the documentation is:
> Like custom instructions, saved memories are part of the context ChatGPT uses to generate a response. Unless you delete them, saved memories are always considered in future responses.
In fact, it was only while writing this post that I discovered ChatGPT apparently now puts some memories at the top of its mind:
> To decide which memories stay top of mind, ChatGPT considers factors such as how recent a detail is and how often you talk about a topic.
How often do those 'top of mind' memories actually get used? I would at least like to know which memories were accessed when answering each of my queries. I had once asked ChatGPT to be concise/direct with me when discussing the technical details of a research paper, and it remembered that as "user wants me to be concise/direct" without any context of when. At a later point, when I was asking it questions about a topic to learn more about it, the responses were oddly short. It took me a second to realize that this past memory was affecting the results, and once I deleted it, things went back to normal. There were probably more instances like this, but the point is, I don't want to play a guessing game about what could be influencing my current conversation.
Memories that were accessed could be shown next to the response like citations, or under the menu with the model's thinking details. Overall usage stats, such as the number of times a certain memory was accessed, should be displayed in the Manage Memories screen.
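As a sketch of what I'm picturing (the word-overlap retrieval here is a crude stand-in, and none of these names are OpenAI's), every response would record which memories were pulled in, and a running counter would feed the usage stats:

```python
# Toy sketch: cite which memories were used per response and tally overall usage.
from collections import Counter

MEMORIES = [
    "user is shopping for Lululemon alternatives",
    "user wants me to be concise/direct",
    "user is learning Rust",   # made-up example memory
]

access_counts: Counter[str] = Counter()

def retrieve(prompt: str, top_k: int = 2) -> list[str]:
    """Stand-in retrieval: rank memories by word overlap with the prompt."""
    words = set(prompt.lower().split())
    scored = [(len(words & set(m.lower().split())), m) for m in MEMORIES]
    scored.sort(reverse=True)
    return [m for score, m in scored[:top_k] if score > 0]

def answer(prompt: str) -> dict:
    used = retrieve(prompt)
    for mem in used:
        access_counts[mem] += 1   # usage stats for the Manage Memories screen
    # ...the actual model call would go here, with `used` prepended to the context...
    return {"prompt": prompt, "memories_cited": used}

print(answer("recommend some Lululemon alternatives"))
print(access_counts)              # which memories get accessed, and how often
```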
I would also like memories to evolve with me over time. Consider something like my political stance doing a complete 180 after a period of time. Current documentation suggests that ChatGPT could do this on demand if I ask it to update or delete a particular memory. Ideally, it should recognize the contradiction on its own and replace the outdated memory with the current one.
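A crude way to picture that replacement step, with the caveat that keying memories by topic is my simplification and real contradiction detection would need the model itself to judge whether two statements conflict:

```python
# Sketch only: one memory per topic key; a newer, contradicting statement replaces the old one.
from datetime import datetime, timezone

memories: dict[str, tuple[str, datetime]] = {}

def upsert(topic: str, statement: str) -> None:
    """Store the latest statement for a topic, replacing (not accumulating) older ones."""
    now = datetime.now(timezone.utc)
    if topic in memories and memories[topic][0] != statement:
        old, _ = memories[topic]
        print(f"replacing stale memory on {topic!r}: {old!r} -> {statement!r}")
    memories[topic] = (statement, now)

upsert("politics", "user leans left")
upsert("politics", "user leans right")   # the 180: the older memory is replaced, not kept alongside
```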
## Context Management
**Added:** 2026-01-01
This one's perhaps a bit more out there. Despite memories existing, context in a given chat is still clearly very important. The thing I like about ChatGPT over Claude (as of 2025) is that ChatGPT has basically never prompted me to start a new chat because it's running out of context in the current one, whereas Claude has done this to me many times. However, given the importance of context, it's still useful to run different chat sessions about different topics.
The issue, though, is that if OpenAI wants to get to a point where it's almost replacing Google Search, there are bound to be many ongoing, unfinished conversations at any given time. Maybe this is an issue specific to how I use ChatGPT, or maybe I'm a bit ahead of most people here given my higher-than-average usage. But managing these conversations is turning into a bit of a chore for me. It feels like I have a large personal knowledge management (PKM) style system where, instead of organizing notes into folders or tags or whatever, I'm doing it with chats. I have to think about when to branch an existing chat into a new one, when to start fresh, when to continue from an older one, which chats go into a "project", and so on.
Something that'd be cool is if ChatGPT would do all this for me. Just as it is supposedly deciding which memories to use for a given prompt, I want it to show me past messages about this particular topic, regardless of which conversation window they happened in. The ultimate version of this would be just a box that I would start typing in, and the context would be pulled automatically as it figures out what topic I'm talking about and to what end.
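A toy version of that one box, using word-overlap scoring as a stand-in for whatever embedding-based retrieval a real system would actually use, and with made-up chat contents:

```python
# Toy sketch: pull relevant past messages from every chat, so I don't have to pick one.
PAST_CHATS = {
    "chat-17": ["what are some good Lululemon alternatives under $50?"],
    "chat-42": ["explain the attention mechanism in transformers"],
    "chat-85": ["summarize this paper on retrieval-augmented generation"],
}

def pull_context(prompt: str, top_k: int = 3) -> list[tuple[str, str]]:
    """Return (chat_id, message) pairs most relevant to the new prompt, across all chats."""
    words = set(prompt.lower().split())
    scored = [
        (len(words & set(msg.lower().split())), chat_id, msg)
        for chat_id, msgs in PAST_CHATS.items()
        for msg in msgs
    ]
    scored.sort(reverse=True)
    return [(chat_id, msg) for score, chat_id, msg in scored[:top_k] if score > 0]

# Typing into the one box would first assemble context like this, then call the model:
print(pull_context("cheaper alternatives to Lululemon leggings"))
```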