Documentation for the llmapi C++23 module library.
- Getting Started - install, build, and first request
- C++ API Guide - types, providers, and Client<P>
- Examples - chat, streaming, embeddings, and tool flows
- Providers - OpenAI, Anthropic, and compatible endpoints
- Advanced Usage - persistence, async calls, and custom configuration
- C++23 modules via import mcpplibs.llmapi
- Typed chat messages and multimodal content structs
- Provider concepts for sync, async, streaming, and embeddings
- Built-in OpenAI and Anthropic providers
- OpenAI-compatible endpoint support through configurable base URLs
- Conversation save/load helpers for local session persistence
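To illustrate the typed-message and session-persistence features above, here is a minimal sketch in plain C++. The struct and function names (ChatMessage, save_conversation, load_conversation) are hypothetical stand-ins, not the library's actual API, which is consumed via import mcpplibs.llmapi:

```cpp
#include <fstream>
#include <string>
#include <vector>

// Hypothetical stand-in for a typed chat message: the real llmapi
// structs may differ (e.g. richer multimodal content).
struct ChatMessage {
    std::string role;     // "system", "user", or "assistant"
    std::string content;
};

// Persist a conversation as one "role<TAB>content" line per message.
// Simplification: content containing tabs or newlines is not escaped.
void save_conversation(const std::vector<ChatMessage>& msgs,
                       const std::string& path) {
    std::ofstream out(path);
    for (const auto& m : msgs)
        out << m.role << '\t' << m.content << '\n';
}

// Reload a conversation previously written by save_conversation.
std::vector<ChatMessage> load_conversation(const std::string& path) {
    std::vector<ChatMessage> msgs;
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        const auto tab = line.find('\t');
        if (tab == std::string::npos) continue;  // skip malformed lines
        msgs.push_back({line.substr(0, tab), line.substr(tab + 1)});
    }
    return msgs;
}
```

A round trip (save, then load) reproduces the original role/content pairs, which is the contract the library's save/load helpers are described as providing for local session persistence.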
Apache-2.0 - see LICENSE