Tags: roadmap, platform, mobile

What's Next: Desktop Agent, Mobile, and the Developer API

The foundation is built. Here's what's coming — a desktop system agent, mobile companion, and a public API that lets any developer build memory-aware AI applications.


Where We Are

Nine phases of development have produced a working, polished, multi-platform AI memory layer:

  • Browser extension covering every major web AI chat
  • MCP server for Cursor, Claude Desktop, and compatible editors
  • CLI for terminal users
  • Web dashboard for browsing, organizing, and managing memories
  • Intelligence layer with Style Memory, Contradiction Engine, and Career Brain
  • Security foundation with encryption, rate limiting, and GDPR compliance

memset works. People use it. The core promise — "tell one AI and they all know" — is real for browser and MCP-connected tools.

But there are gaps. And we have a clear plan to close them.

The Desktop System Agent

The browser extension covers web AI chats. MCP covers compatible editors. But most of a developer's workflow happens outside both — in terminals, native apps, clipboard operations, and IDEs that don't support MCP yet.

The Desktop System Agent is a native application (built with Tauri and Rust) that runs in the background and extends memset's reach to the entire desktop:

Global hotkeys — press Ctrl+Shift+M from any application to save a memory. Press Ctrl+Shift+R to search. Two seconds, no context switching.

File watcher — automatically syncs AI tool preference files (.cursor/rules, CLAUDE.md, .github/copilot-instructions.md, .windsurfrules) bidirectionally with memset. Preferences saved in one tool automatically appear in every other tool's config files.
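A minimal sketch of the fan-out half of that sync: when one tool's preference file changes, propagate the change to every other tracked file. The `targetsFor` function and `Preference` type are illustrative assumptions; only the file names come from the list above.

```typescript
// Hypothetical sketch of the file watcher's fan-out logic. The config
// file names are the ones named in the post; everything else is assumed.
type Preference = { sourceFile: string; text: string };

const TOOL_CONFIG_FILES = [
  ".cursor/rules",
  "CLAUDE.md",
  ".github/copilot-instructions.md",
  ".windsurfrules",
];

// A preference saved via one tool is written to every other tracked
// config file, so each AI tool sees the same instruction.
function targetsFor(pref: Preference): string[] {
  return TOOL_CONFIG_FILES.filter((f) => f !== pref.sourceFile);
}
```

The reverse direction (config file edited by hand, synced back into memset) would run the same mapping with source and targets swapped.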

Clipboard intelligence — when you're working in an IDE or terminal and copy something that looks like code, an error message, or a configuration block, the agent offers to save it. Context-aware: only active in work applications you've opted into. Completely invisible during personal use.
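A rough heuristic for that "looks like code, an error, or config" decision might read like the sketch below. This is an illustrative classifier, not the shipped detection logic; the patterns are assumptions.

```typescript
// Illustrative clipboard classifier (assumed, not memset's actual logic).
type ClipKind = "code" | "error" | "config" | "other";

function classifyClipboard(text: string): ClipKind {
  // Error messages tend to name an error/exception explicitly.
  if (/error|exception|traceback|panic/i.test(text)) return "error";
  // Code usually carries syntax punctuation or common keywords.
  if (/[{};]|\b(function|const|class|import)\b/.test(text)) return "code";
  // Config blocks are multi-line key: value or key = value pairs.
  if (text.includes("\n") && /^\s*[\w.-]+\s*[:=]\s*\S+/m.test(text))
    return "config";
  return "other";
}
```

Anything classified `"other"` would simply never trigger a save prompt, which is how the agent stays invisible during personal use.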

Terminal observer — a lightweight shell hook that captures notable terminal events. When you spend 10 minutes debugging an error and finally fix it, the agent offers to save the solution. Error resolution patterns — the "failed, then modified, then succeeded" sequence — are detected automatically.
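The "failed, then modified, then succeeded" pattern can be sketched as a pass over shell events: the same command exits non-zero, then later exits zero. The event shape and function below are assumptions for illustration.

```typescript
// Sketch of the resolution-pattern detection the terminal observer
// could run over a session's shell events. Shape is assumed.
type ShellEvent = { cmd: string; exitCode: number };

// True when a command that previously failed later succeeds —
// a candidate "solution" worth offering to save.
function detectResolution(events: ShellEvent[]): boolean {
  const failed = new Set<string>();
  for (const e of events) {
    if (e.exitCode !== 0) failed.add(e.cmd);
    else if (failed.has(e.cmd)) return true;
  }
  return false;
}
```

A real hook would also capture the intervening edits (the "modified" step) so the saved memory records *what* fixed the error, not just that it was fixed.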

IDE context injection — for IDEs that don't support MCP (JetBrains, older VS Code), the agent writes a managed section into their config files with relevant project context from memset. Ghost Memory, delivered via file system.
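The core of a "managed section" is a delimited block the agent owns inside a file the user owns: replace the block if it exists, append it if not, and never touch anything outside the markers. The marker strings below are illustrative, not memset's actual format.

```typescript
// Sketch of managed-section injection for non-MCP IDE config files.
// Marker text is an assumption for illustration.
const BEGIN = "# --- memset:managed:begin ---";
const END = "# --- memset:managed:end ---";

function injectManagedSection(fileText: string, context: string): string {
  const block = `${BEGIN}\n${context}\n${END}`;
  const start = fileText.indexOf(BEGIN);
  const end = fileText.indexOf(END);
  if (start !== -1 && end !== -1 && end > start) {
    // Replace only the managed block; user content stays untouched.
    return fileText.slice(0, start) + block + fileText.slice(end + END.length);
  }
  return fileText.trimEnd() + "\n\n" + block + "\n";
}
```

Because the operation is idempotent, the agent can rewrite the section on every sync without the file growing or the user's own rules drifting.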

The agent is designed with privacy as a first-class concern. Nothing is monitored by default — every feature is opt-in, context-aware, and transparent. See our Roadmap for the detailed timeline.

The Mobile Companion

Mobile is the hardest surface for memset because iOS and Android sandbox applications aggressively. No browser extensions, no MCP, no file watching.

Our mobile strategy has three layers:

Share extension — select text in any mobile app (ChatGPT, Claude, browser, Slack, Notes) → Share → memset → saved. The most natural capture mechanism on mobile, fully within platform rules.

Companion app — a lightweight app for quick capture (text, voice, photos), memory search, and "context prep." Context prep generates a formatted block of your relevant memories and style preferences that you can copy and paste as the first message in any AI chat. It's manual Ghost Memory — not as seamless as the extension, but it takes 5 seconds and works everywhere.
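Context prep is, at heart, a formatting step: collect relevant memories and style preferences and emit one paste-ready block. A minimal sketch, with types and wording assumed rather than taken from the app:

```typescript
// Hypothetical context-prep formatter; structure and phrasing assumed.
type ContextPrep = { style: string[]; memories: string[] };

function formatContextPrep(prep: ContextPrep): string {
  return [
    "Context about me (from memset):",
    ...prep.style.map((s) => `- Style: ${s}`),
    ...prep.memories.map((m) => `- ${m}`),
    "Please keep this context in mind for the rest of this chat.",
  ].join("\n");
}
```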

Custom instructions export — memset formats your style profile for each AI platform's custom instructions or settings page. When your profile evolves, the app notifies you to update. When these platforms eventually open APIs for writing custom instructions, we'll integrate automatically.

The Developer API & SDK

memset's positioning is the brain, not the interface. We don't want to build another AI chat app — we want to be the memory layer that any AI app can plug into.

The centerpiece is a new ghostContext endpoint — a single API call that packages everything a developer needs:

  • Relevant memories for the current query (Ghost Memory)
  • User's style profile formatted as a system prompt (Style Memory)
  • Project-specific context if applicable

A developer building an AI-powered app can add memset with essentially one line of code:

const context = await memset.ghostContext({ query: userMessage });
// Prepend to system prompt → user's AI now "knows" them

We're building SDKs for JavaScript/TypeScript, Swift, and Kotlin to make integration trivial. The goal: any AI application, on any platform, can offer personalized memory to its users by plugging into memset.

The Vision

Today, every AI tool starts from zero with every user. Your preferences, your knowledge, your style — locked in silos, forgotten between sessions, invisible across tools.

memset is building the infrastructure to change that. Not by replacing AI tools, but by giving them all a shared memory layer. Tell one AI something once, and they all know it.

The browser extension and MCP server are live today. The desktop agent, mobile companion, and developer API are coming. Check the Roadmap for timelines, and get started free to see what persistent AI memory feels like.