🚀 Developer Portal Coming Soon - SDK and API Documentation in Progress
Add a persistent memory layer and sentiment analysis to your AI application with just a few lines of code. Caura SDK and API handle identity management, conversation persistence, and emotional context—so your AI can build real relationships.
The foundation of memory-powered AI
Persistent user profiles that follow users across sessions, devices, and platforms. Each user gets a unique memory graph.
user = caura.get_user("user_123")
session_id = caura.create_session(user_id)
user.update_profile(preferences)
Seamless conversation flow with automatic context retrieval. Your AI remembers past interactions and learns from them.
session_id = caura.create_session(user_id)
response = caura.chat(session_id, message)
# Auto-retrieves relevant context
Automatic memory formation with semantic search. Store facts, preferences, and emotional states intelligently.
caura.store_memory(session_id, content)
memories = caura.search_memories(user_id, query)
Advanced capabilities for intelligent AI experiences
Track emotional patterns and adapt responses. Build empathetic AI that understands user mood and context.
sentiment = caura.get_sentiment(session_id)
caura.track_emotion(session_id, context)
Compare responses from multiple LLMs and let users select the best answer. Caura learns from selections to improve future responses.
responses = caura.compare_llms(session_id, query)
caura.record_preference(session_id, selected)
Autonomous memory agents that proactively organize, update, and synthesize information. Your AI's memory evolves and improves over time.
agent = caura.memory_agent(user_id)
agent.auto_organize(session_id)
# Install the Caura SDK:
#   pip install caura-sdk

import caura

# Initialize with your API key
caura.init(api_key="your_api_key")
user_id = "user_123"

# Create a new user session
session_id = caura.create_session(user_id)

# Store memory in the session
caura.store_memory(session_id, "I prefer TypeScript over JavaScript")

# Chat with perfect memory recall
response = caura.chat(session_id, "What's my favorite language?", llm="gpt-4o")
# AI responds: "You prefer TypeScript!"

# Compare multiple LLM responses
responses = caura.compare_llms(session_id, "Explain async programming")
caura.record_preference(session_id, responses[0])

# Track sentiment and emotions for this session
sentiment = caura.get_sentiment(session_id)
caura.track_emotion(session_id, "excited about new project")

# Search across all user memories
memories = caura.search_memories(user_id, "programming languages")
Everything you need to build memory-powered AI applications
Simple HTTP endpoints for all memory operations. Integrate Caura into any application with just a few API calls.
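As a sketch of what one of those HTTP calls might look like from plain Python (the base URL, path, and payload field names here are placeholder assumptions, not the published API):

```python
import json
import urllib.request

# Hypothetical endpoint and payload shape; the URL, path, and field
# names are placeholders until the API reference ships.
req = urllib.request.Request(
    "https://api.example.com/v1/memories",  # assumed base URL and path
    data=json.dumps({
        "session_id": "sess_123",
        "content": "I prefer TypeScript over JavaScript",
    }).encode("utf-8"),
    headers={
        "Authorization": "Bearer your_api_key",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(req) would send the request once the API is live.
```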
Works seamlessly with OpenAI, Anthropic Claude, Google Gemini, and local Ollama models. Switch models on the fly.
Real-time streaming responses for instant user feedback. Built-in support for both WebSocket and Server-Sent Events.
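Server-Sent Events arrive as `data:` lines separated by blank lines. A minimal parser for that wire format (independent of any Caura-specific event schema, which is not yet documented) looks like:

```python
def parse_sse(lines):
    """Collect SSE 'data:' lines into event payloads.

    A blank line terminates each event; multiple data lines within
    one event are joined with newlines, per the SSE specification.
    """
    events, buf = [], []
    for line in lines:
        if line.startswith("data:"):
            buf.append(line[len("data:"):].lstrip())
        elif line == "" and buf:
            events.append("\n".join(buf))
            buf = []
    return events

# Two events: one single-line, one spanning two data lines.
stream = ["data: Hello", "", "data: wor", "data: ld", ""]
```

Here `parse_sse(stream)` yields `["Hello", "wor\nld"]`.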
Powered by Pinecone for lightning-fast semantic search. Automatic embedding generation and similarity matching.
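Under the hood, semantic search ranks stored memories by embedding similarity. A toy illustration of the idea using cosine similarity (the embeddings and memory IDs are made up; Pinecone and the embedding model handle this at scale in production):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search_memories(query_vec, memory_vecs, k=2):
    """Return the ids of the k memories most similar to the query."""
    ranked = sorted(memory_vecs,
                    key=lambda mid: cosine(query_vec, memory_vecs[mid]),
                    reverse=True)
    return ranked[:k]

# Toy 2-d "embeddings"; real embeddings have hundreds of dimensions.
memories = {
    "likes-typescript": [0.9, 0.1],
    "prefers-tea":      [0.1, 0.9],
    "uses-vscode":      [0.8, 0.3],
}
```

With a query vector near `[1.0, 0.0]`, the two programming-related memories rank above the tea preference.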
Native Model Context Protocol support for Claude Desktop. Use Caura as a memory layer in your Claude conversations.
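Claude Desktop discovers MCP servers through its claude_desktop_config.json file. A plausible entry might look like the following (the server name, command, and package name are assumptions until official setup docs land):

```json
{
  "mcpServers": {
    "caura": {
      "command": "npx",
      "args": ["-y", "caura-mcp-server"],
      "env": { "CAURA_API_KEY": "your_api_key" }
    }
  }
}
```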
Full-featured Python client library with async support. Type hints, auto-retry logic, and comprehensive documentation.
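The auto-retry behavior mentioned above can be sketched generically as exponential backoff around any callable (an illustration of the pattern, not the SDK's actual internals):

```python
import time

def with_retry(fn, attempts=3, base_delay=0.01):
    """Call fn(); on failure, retry with exponentially growing delays."""
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of retries; surface the last error
            time.sleep(base_delay * (2 ** attempt))

# Example: a flaky call that fails twice, then succeeds.
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient")
    return "ok"
```

Calling `with_retry(flaky)` absorbs the two transient failures and returns "ok" on the third attempt.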