Developers spend 80% of their time orchestrating data flows, managing context windows, and building RAG pipelines instead of refining the actual AI experience. The complexity of stateful AI interactions leads to fragmented architectures and technical debt.
```typescript
// The manual way (rebuilding basics)
const context = await db.getMemory(userId);
const docs = await vectorStore.search(query);
const prompt = constructPrompt(query, context, docs);
const response = await llm.generate(prompt);
await db.saveMemory(userId, response);
// ... repeat for every feature
```
A modular, event-driven framework designed to handle the entire lifecycle of an AI interaction.
The building blocks of the Snipet ecosystem.
The primary unit of execution. Encapsulates logic, model configuration, and state management.
Raw data origins. Connectors for PDFs, APIs, databases, or real-time streams.
Logical grouping of processed data. Manages how information is partitioned and accessed.
Automated vectorization flow. Handles chunking, embedding, and indexing into vector stores.
Shared memory layer. Provides persistence and context across multiple snipet executions.
External capabilities. Allows the AI to perform actions like sending emails or querying APIs.
The core orchestration logic. Manages the sequential or parallel execution of snipets and skills.
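The concepts above could map to TypeScript roughly as follows. This is a minimal sketch, not Snipet's actual API: the `Snipet`, `Scope`, `createScope`, and `runFlow` names and shapes are illustrative assumptions.

```typescript
// A snipet: the primary unit of execution, taking input plus scope memory.
// (Illustrative types — the real Snipet API may differ.)
type Snipet<I, O> = (input: I, scope: Scope) => Promise<O>;

// A scope: shared key-value memory that persists across snipet executions.
interface Scope {
  get(key: string): unknown;
  set(key: string, value: unknown): void;
}

function createScope(): Scope {
  const store = new Map<string, unknown>();
  return {
    get: (k) => store.get(k),
    set: (k, v) => { store.set(k, v); },
  };
}

// A flow: sequential orchestration of snipets sharing one scope.
async function runFlow<T>(
  input: T,
  scope: Scope,
  snipets: Snipet<any, any>[]
): Promise<any> {
  let current: any = input;
  for (const s of snipets) {
    current = await s(current, scope);
  }
  return current;
}
```

Because every snipet receives the same `Scope`, later steps can read what earlier steps remembered without any external plumbing.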
Connect your data (S3, Notion, SQL)
Chunk and vectorize using your preferred model
Store and manage vectors in pgvector (PostgreSQL)
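The three ingestion steps can be sketched end to end. Everything here is a stand-in: the helper names are assumptions, the "embedding" is a toy letter-frequency vector, and the in-memory array substitutes for the real call to an embedding model and pgvector.

```typescript
// Step 2 (part one): split text into fixed-size chunks with optional overlap.
function chunkText(text: string, size: number, overlap = 0): string[] {
  const chunks: string[] = [];
  const step = Math.max(1, size - overlap);
  for (let i = 0; i < text.length; i += step) {
    chunks.push(text.slice(i, i + size));
    if (i + size >= text.length) break;
  }
  return chunks;
}

// Step 2 (part two): stand-in embedding — a letter-frequency vector.
// A real pipeline would call your preferred embedding model here.
function embed(text: string): number[] {
  const v = new Array(26).fill(0);
  for (const ch of text.toLowerCase()) {
    const idx = ch.charCodeAt(0) - 97;
    if (idx >= 0 && idx < 26) v[idx]++;
  }
  return v;
}

// Step 3: an in-memory stand-in for the pgvector store.
const index: { chunk: string; vector: number[] }[] = [];

function ingest(document: string): number {
  for (const chunk of chunkText(document, 200, 20)) {
    index.push({ chunk, vector: embed(chunk) });
  }
  return index.length;
}
```

The overlap parameter repeats the tail of each chunk at the head of the next, so facts that straddle a chunk boundary are still retrievable as a unit.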
Raw input combined with Scope memory
RAG retrieval and LLM inference
Tool execution and final response delivery
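The execution steps above can be sketched as one function. The retriever and model are injected stubs so the flow is visible end to end; the prompt layout and `execute` signature are illustrative assumptions, and tool execution is omitted for brevity.

```typescript
// A minimal sketch of the execution flow: memory in, retrieval, inference,
// memory out. All names and the prompt format are assumptions.
async function execute(
  input: string,
  memory: string[],
  retrieve: (q: string) => Promise<string[]>,
  generate: (prompt: string) => Promise<string>
): Promise<string> {
  // 1. Raw input combined with Scope memory.
  const history = [...memory, input].join("\n");

  // 2. RAG retrieval and LLM inference.
  const docs = await retrieve(input);
  const prompt = `Context:\n${docs.join("\n")}\n\nHistory:\n${history}\n\nUser: ${input}`;
  const answer = await generate(prompt);

  // 3. Final response delivery; the exchange is written back to memory
  //    so the next execution sees it.
  memory.push(input, answer);
  return answer;
}
```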
Everything you need to ship enterprise-grade AI features.
Swap models, vector stores, and memory adapters without changing your core logic.
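One way this kind of swap works is the adapter pattern: core logic depends only on an interface, never on a concrete store. The `VectorStoreAdapter` interface below is an assumption for illustration, with two trivial in-memory implementations standing in for real backends.

```typescript
// Core logic programs against this contract only (name assumed).
interface VectorStoreAdapter {
  upsert(id: string, vector: number[]): void;
  count(): number;
}

class InMemoryStore implements VectorStoreAdapter {
  private items = new Map<string, number[]>();
  upsert(id: string, vector: number[]) { this.items.set(id, vector); }
  count() { return this.items.size; }
}

// A second adapter with the same contract (stand-in for pgvector, etc.).
class LoggingStore implements VectorStoreAdapter {
  private inner = new InMemoryStore();
  upsert(id: string, vector: number[]) {
    console.log(`upsert ${id}`);
    this.inner.upsert(id, vector);
  }
  count() { return this.inner.count(); }
}

// Core logic never names a concrete store, so swapping is a one-line change.
function indexAll(
  store: VectorStoreAdapter,
  vectors: Record<string, number[]>
): number {
  for (const [id, v] of Object.entries(vectors)) store.upsert(id, v);
  return store.count();
}
```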
First-class support for retrieval-augmented generation with built-in hybrid search.
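Hybrid search typically blends a dense (vector) score with a sparse (keyword) score. The weighting scheme and scoring functions below are illustrative choices, not Snipet's documented behavior.

```typescript
// Cosine similarity between two vectors of equal length.
function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return na && nb ? dot / Math.sqrt(na * nb) : 0;
}

// Naive keyword score: fraction of query terms present in the document.
function keywordScore(query: string, doc: string): number {
  const terms = query.toLowerCase().split(/\s+/);
  const text = doc.toLowerCase();
  const hits = terms.filter((t) => t && text.includes(t)).length;
  return terms.length ? hits / terms.length : 0;
}

interface Doc { text: string; vector: number[]; }

// Blend the two signals with a weight alpha (0 = keyword only, 1 = vector only).
function hybridSearch(
  queryVec: number[],
  queryText: string,
  docs: Doc[],
  alpha = 0.5
): Doc[] {
  const score = (d: Doc) =>
    alpha * cosine(queryVec, d.vector) + (1 - alpha) * keywordScore(queryText, d.text);
  return [...docs].sort((x, y) => score(y) - score(x));
}
```

The keyword component rescues exact-match queries (IDs, error codes, product names) that pure embedding similarity tends to miss.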
Scopes manage context history automatically, ensuring the AI remembers what matters.
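One common form of automatic context management is recency-based trimming: keep the most recent messages that fit a budget. The sketch below uses a character budget for simplicity; a real implementation (Snipet's included) may count tokens or weigh relevance instead.

```typescript
interface Message { role: "user" | "assistant"; content: string; }

// Walk history from newest to oldest, keeping messages until the budget
// is exhausted, then return them in original order.
function trimHistory(history: Message[], maxChars: number): Message[] {
  const kept: Message[] = [];
  let used = 0;
  for (let i = history.length - 1; i >= 0; i--) {
    const len = history[i].content.length;
    if (used + len > maxChars) break;
    kept.unshift(history[i]);
    used += len;
  }
  return kept;
}
```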
Use OpenAI, Anthropic, or local models via Ollama with a unified interface.
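A unified interface usually means every provider implements the same contract and callers never branch on the backend. The provider internals below are stubs; real implementations would call the OpenAI, Anthropic, or Ollama APIs behind the same `generate` signature (which is itself an assumption here).

```typescript
interface ModelProvider {
  name: string;
  generate(prompt: string): Promise<string>;
}

// Stub factory — real providers would wrap their respective SDKs.
function stubProvider(name: string): ModelProvider {
  return { name, generate: async (prompt) => `[${name}] ${prompt}` };
}

const providers: Record<string, ModelProvider> = {
  openai: stubProvider("openai"),
  anthropic: stubProvider("anthropic"),
  ollama: stubProvider("ollama"),
};

// Caller code is identical regardless of which provider is selected.
async function ask(provider: string, prompt: string): Promise<string> {
  const model = providers[provider];
  if (!model) throw new Error(`unknown provider: ${provider}`);
  return model.generate(prompt);
}
```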
Define tools as simple TypeScript functions and let Snipet handle the orchestration.
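Tools-as-functions usually means pairing a plain function with a name and description the model can see, plus a dispatcher that routes model-requested calls by name. The `Tool` shape and `dispatch` helper below are illustrative, not Snipet's actual API, and the email tool is a stub.

```typescript
interface Tool<A, R> {
  name: string;
  description: string;
  run: (args: A) => Promise<R>;
}

// A tool is just a typed TypeScript function with metadata.
const sendEmail: Tool<{ to: string; body: string }, string> = {
  name: "send_email",
  description: "Send an email to a recipient",
  run: async ({ to }) => `queued email to ${to}`, // stub — no real side effect
};

const tools = new Map<string, Tool<any, any>>([[sendEmail.name, sendEmail]]);

// The piece a framework would own: dispatch a model-requested tool call.
async function dispatch(call: { name: string; args: unknown }): Promise<unknown> {
  const tool = tools.get(call.name);
  if (!tool) throw new Error(`unknown tool: ${call.name}`);
  return tool.run(call.args);
}
```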
Built-in tracing and logging for every step of the execution pipeline.
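Step-level tracing can be implemented by wrapping each pipeline step so its name, duration, and outcome are recorded. The trace entry shape below is an illustrative assumption about what such instrumentation might capture.

```typescript
interface TraceEntry { step: string; ms: number; ok: boolean; }

const trace: TraceEntry[] = [];

// Wrap any async step; successes and failures are both recorded,
// and errors are re-thrown so callers still see them.
function traced<I, O>(
  step: string,
  fn: (input: I) => Promise<O>
): (input: I) => Promise<O> {
  return async (input: I) => {
    const start = Date.now();
    try {
      const out = await fn(input);
      trace.push({ step, ms: Date.now() - start, ok: true });
      return out;
    } catch (err) {
      trace.push({ step, ms: Date.now() - start, ok: false });
      throw err;
    }
  };
}
```

Because the wrapper preserves the step's signature, instrumentation can be added or removed without touching pipeline logic.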