Description:
Create an OpenAI API-compatible HTTP server that wraps mellea, allowing any OpenAI-compatible client to use mellea backends.
Key deliverables:
- `m serve` serves OpenAI API-compatible endpoints
- FastAPI/Starlette server implementing the OpenAI chat completions endpoint
- `/v1/chat/completions` endpoint with streaming support
- Model listing endpoint
- Configuration for backend selection
Acceptance criteria:
Notes:
- This enables "any framework that supports OpenAI" to use mellea
- Consider using existing libraries like litellm proxy as a reference
Task breakdown
P0 — Breaks Compatibility
These are things that cause OpenAI SDK clients to fail or return incorrect results.
P1 — Core Features Expected by Clients