ACMI gives any AI agent persistent, queryable memory using exactly three Redis keys per entity. No vector index. No knowledge graph. No fact-extraction LLM pass. On npm, MIT-licensed, run it yourself.
Every entity in ACMI — a project, an agent, a contact, a deal — is stored using exactly three Redis keys. Each key answers one question an LLM needs to make a decision.
```js
// 1. Profile — who is this entity?
await acmi.profile.set("user:mikey", {
  name: "Mikey",
  tz: "America/New_York",
  role: "operator",
});

// 2. Signals — what is their current state?
await acmi.signals.set("user:mikey", "current_focus", "shipping ACMI v1.3");

// 3. Timeline — what just happened?
await acmi.timeline.append("user:mikey", {
  source: "github",
  kind: "merged-pr",
  summary: "ROADMAP.md +Sigil v2.0",
});
```
The in-memory adapter is zero-dependency, so this runs the moment you save it. Copy it into `acmi.mjs` and run `node acmi.mjs`:
```js
// acmi.mjs
import { createAcmi } from "@madezmedia/acmi";
import { InMemoryAdapter } from "@madezmedia/acmi/adapters/in-memory";

const acmi = createAcmi(new InMemoryAdapter());

await acmi.profile.set("user:mikey", { name: "Mikey", tz: "America/New_York" });
await acmi.signals.set("user:mikey", "current_task", "shipping ACMI");
await acmi.timeline.append("user:mikey", {
  source: "user:mikey",
  kind: "started_recording",
  correlationId: "manifesto-001",
  summary: "video 1 of 3",
});

console.log(await acmi.timeline.read("user:mikey"));
```
The same SDK speaks to any backing store through an adapter contract. Three are shipped today; the conformance suite tells you when a fourth is done.
Upstash — edge-compatible (Workers, Vercel Edge, Deno Deploy):
```js
import { createAcmi } from "@madezmedia/acmi";
import { UpstashAdapter } from "@madezmedia/acmi/adapters/upstash";

const acmi = createAcmi(
  new UpstashAdapter({
    url: process.env.UPSTASH_REDIS_REST_URL,
    token: process.env.UPSTASH_REDIS_REST_TOKEN,
  })
);
```
Self-hosted Redis via ioredis — Node.js runtimes:
```js
import Redis from "ioredis";
import { createAcmi } from "@madezmedia/acmi";
import { RedisAdapter } from "@madezmedia/acmi/adapters/redis";

const acmi = createAcmi(
  new RedisAdapter({
    client: new Redis(process.env.REDIS_URL),
    ownClient: true,
  })
);
```
| Adapter | Use case | Edge-compat | Status |
|---|---|---|---|
| in-memory | Tests, examples, dev | n/a | stable |
| upstash | Edge (Workers, Vercel Edge, Deno) | yes | stable |
| redis (ioredis) | Self-hosted, Node.js runtimes | no | stable |
The dominant memory pattern for AI agents today is vector embeddings. Useful, but not the right primitive for most agent decisions. An agent rarely asks "find me the semantically closest five documents." It asks: who is this person, what's their current state, and what just happened?
Those are three different questions, and they map cleanly onto three different data shapes — a JSON profile, a JSON signal bag, and a chronologically sorted event log. ACMI gives each one a Redis key with a deterministic name, and stops there.
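To make the "deterministic name" idea concrete, here is a sketch of what such a key layout could look like. The `acmi:` prefix and exact name shapes below are hypothetical — the real scheme is defined in SPEC.md — but the point stands: one entity id fans out to exactly three keys, one per data shape.

```js
// Hypothetical key-naming sketch; the authoritative scheme lives in SPEC.md.
function keysFor(entityId) {
  return {
    profile: `acmi:profile:${entityId}`,   // JSON profile blob
    signals: `acmi:signals:${entityId}`,   // JSON signal bag
    timeline: `acmi:timeline:${entityId}`, // chronologically sorted event log
  };
}

console.log(keysFor("user:mikey").profile); // "acmi:profile:user:mikey"
```

Because the names are a pure function of the entity id, an agent never needs a lookup table or an index to find its own memory.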
The result: an agent waking up reads three keys, gets the full operating context, and makes a decision. No multi-table joins, no schema artifacts wasting tokens, no embedding round-trip on the hot path. The SDK is small, the spec is short, and the conformance suite tells you when an adapter is finished.
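That wake-up read can be sketched in a few lines. Note the assumption: only `timeline.read` appears in the examples above, so the `profile.get` and `signals.get` accessors here are guesses at read-side counterparts to the setters, not confirmed API.

```js
// Sketch: assemble an agent's full operating context from the three keys.
// profile.get and signals.get are assumed accessors (only timeline.read
// is shown in this post); treat them as illustrative.
async function wakeUp(acmi, entityId) {
  const [profile, signals, timeline] = await Promise.all([
    acmi.profile.get(entityId),
    acmi.signals.get(entityId),
    acmi.timeline.read(entityId),
  ]);
  // One small JSON payload: identity, current state, recent history.
  return { profile, signals, recent: timeline.slice(-10) };
}
```

Three reads, one `Promise.all`, and the agent has everything it needs to act — no retrieval pipeline in the hot path.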
Use ACMI alongside your existing Postgres or warehouse — they're for different jobs. Postgres for transactional integrity. ACMI for the state shape your agents actually consume.
The `@madezmedia/acmi/testing/conformance` suite is the canonical contract. Pass it in your runtime against your store of choice, and you ship a working adapter.
Want to build a DynamoDB adapter? A Cloudflare KV adapter? A FoundationDB adapter? Read CONTRIBUTING.md, run the conformance suite, open a PR. The suite will tell you exactly where you're not yet protocol-correct.
On the roadmap: `actor_type` becomes a required field on agent profiles, and §12 (multi-tenant) adds namespace prefixes for multi-org deployments. Both changes are additive; no breaking changes.
Full roadmap on GitHub: ROADMAP.md. Spec lives at SPEC.md.