
docs(integration): add OpenSearch integration guide#164

Open
maheshbabugorantla wants to merge 3 commits into traceloop:main from maheshbabugorantla:mbg/opensearch-integration-docs

Conversation


@maheshbabugorantla maheshbabugorantla commented Apr 4, 2026

Summary

Adds comprehensive documentation for integrating OpenLLMetry with OpenSearch, enabling users to visualize LLM traces in OpenSearch Dashboards' Trace Analytics using a Data Prepper pipeline for OTLP ingestion.

What's Changed

  • New integration guide: openllmetry/integrations/opensearch.mdx
    • Quick start setup with step-by-step instructions
    • Data Prepper pipeline configuration for OTLP ingestion
    • Workflow decorator usage patterns for hierarchical traces
    • Production considerations (security, sampling, content logging)
    • Complete metadata attribute reference
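
On the sampling point above, one common approach is head-based sampling via the standard OpenTelemetry SDK environment variables, which the Traceloop SDK inherits from the underlying OTel Python SDK. This is a sketch; confirm the variable names and supported sampler values against your SDK version:

```shell
# Keep roughly 10% of traces using the OTel SDK's built-in
# trace-ID-ratio sampler (applies before export, reducing volume
# reaching the Collector and Data Prepper).
export OTEL_TRACES_SAMPLER=traceidratio
export OTEL_TRACES_SAMPLER_ARG=0.1
```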

Supporting screenshots:

  • img/integrations/opensearch-trace-details.png — Waterfall trace detail view in OpenSearch Dashboards showing parent/child LLM spans

Key Features Documented

  1. Service visibility: Shows how LLM calls appear as traces with full metadata (model, tokens, prompts) in OpenSearch Dashboards' Trace Analytics
  2. Workflow tracing: Demonstrates @workflow and @task decorators for complex multi-step applications
  3. Pipeline setup: End-to-end Data Prepper configuration for routing OTLP traces into OpenSearch
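
As an illustration of item 3, a minimal `data-prepper-pipelines.yaml` along these lines might look as follows. This is a sketch, not the guide's exact file: hosts and credentials are placeholders, and processor names have varied across Data Prepper releases, so check the Data Prepper documentation for your version:

```yaml
entry-pipeline:
  source:
    otel_trace_source:
      ssl: false                 # enable TLS in production
  sink:
    - pipeline:
        name: "raw-trace-pipeline"

raw-trace-pipeline:
  source:
    pipeline:
      name: "entry-pipeline"
  processor:
    - otel_traces:               # called otel_trace_raw in older releases
  sink:
    - opensearch:
        hosts: ["https://opensearch:9200"]
        username: "admin"        # placeholder credentials
        password: "admin"
        index_type: trace-analytics-raw
```

A production setup typically adds a parallel service-map pipeline (with the `service_map` processor and `index_type: trace-analytics-service-map`) so the Dashboards service map view is populated.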

Documentation Structure

  • Quick Start (4 steps)
  • Data Prepper pipeline configuration
  • Environment variables reference table
  • Workflow decorator examples
  • Example trace visualization (screenshot)
  • Captured metadata attributes
  • Production considerations (Content Logging, Sampling, Security)
  • External resources
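
To illustrate what the workflow-decorator examples establish, here is a toy stand-in that mimics the parent/child span hierarchy the real `@workflow` and `@task` decorators (from `traceloop.sdk.decorators`) produce. Everything below is illustrative pseudo-tracing, not the Traceloop API:

```python
# Toy sketch of hierarchical tracing: a decorated "workflow" call becomes a
# parent span, and "task" calls made inside it become child spans. The real
# Traceloop decorators emit OpenTelemetry spans instead of dict records.
import contextvars
import functools

_current = contextvars.ContextVar("current_span", default=None)
spans = []  # recorded spans, in start order


def span(kind):
    def deco(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            parent = _current.get()
            record = {
                "name": fn.__name__,
                "kind": kind,
                "parent": parent["name"] if parent else None,
            }
            spans.append(record)
            token = _current.set(record)  # children see us as parent
            try:
                return fn(*args, **kwargs)
            finally:
                _current.reset(token)
        return wrapper
    return deco


workflow = span("workflow")  # stand-ins for traceloop.sdk.decorators
task = span("task")


@task
def fetch_context():
    return "docs"


@workflow
def answer_question():
    ctx = fetch_context()  # recorded as a child of answer_question
    return f"answer using {ctx}"


answer_question()
```

After the call, `spans` holds the workflow span with no parent followed by the task span whose parent is the workflow, which is exactly the waterfall shape the screenshot shows in Trace Analytics.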

Testing

  • Verified all code examples are syntactically correct
  • Screenshot displays trace waterfall accurately with parent/child span hierarchy
  • Navigation entries added to mint.json and integrations/introduction.mdx

Summary by CodeRabbit

  • Documentation
    • Added OpenSearch integration docs: a comprehensive quick-start, step‑by‑step configuration for collectors and Data Prepper, viewing traces in OpenSearch Dashboards Trace Analytics, supported env vars, example workflow usage, captured LLM span attributes, and production recommendations (TLS, auth, sampling).
    • Added an Integrations Catalog entry and navigation link for OpenSearch.
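
A minimal `otel-collector-config.yaml` matching that description might look like the following sketch. The `data-prepper` hostname and port 21890 (Data Prepper's default OTLP gRPC trace port) are assumptions to verify against your deployment:

```yaml
receivers:
  otlp:
    protocols:
      grpc:
        endpoint: 0.0.0.0:4317
      http:
        endpoint: 0.0.0.0:4318

processors:
  batch: {}

exporters:
  otlp/data-prepper:
    endpoint: data-prepper:21890
    tls:
      insecure: true   # use real certificates in production

service:
  pipelines:
    traces:
      receivers: [otlp]
      processors: [batch]
      exporters: [otlp/data-prepper]
```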

Add documentation for using OpenSearch as an LLM traces destination
via Data Prepper pipeline ingestion. Covers setup, configuration,
SDK initialization, workflow decorator usage, and captured metadata.

Add a waterfall trace detail screenshot from OpenSearch Dashboards
to illustrate the parent/child span relationship visible after
instrumenting an LLM workflow with OpenLLMetry.

coderabbitai bot commented Apr 4, 2026

No actionable comments were generated in the recent review. 🎉

ℹ️ Recent review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 05e80906-d32a-4d9a-b546-58263379bdcb

📥 Commits

Reviewing files that changed from the base of the PR and between c635229 and d5ce291.

📒 Files selected for processing (1)
  • openllmetry/integrations/opensearch.mdx
✅ Files skipped from review due to trivial changes (1)
  • openllmetry/integrations/opensearch.mdx

📝 Walkthrough

Added OpenSearch integration docs: a new navigation entry, an integration card in the Integrations Catalog, and a full guide describing routing OpenLLMetry traces via OpenTelemetry Collector → Data Prepper → OpenSearch and viewing them in OpenSearch Dashboards.

Changes

Cohort / File(s) | Summary
  • Navigation & Catalog — mint.json, openllmetry/integrations/introduction.mdx: Inserted a new navigation route for the OpenSearch page and added an "OpenSearch" integration card to the Integrations Catalog.
  • OpenSearch Integration Guide — openllmetry/integrations/opensearch.mdx: Added comprehensive documentation covering Traceloop SDK setup, otel-collector-config.yaml (OTLP receivers/exporter, processors, health extension), data-prepper-pipelines.yaml with OpenSearch sinks, example Python instrumentation and workflow decorators, captured LLM span attributes, environment variables, security/production notes, and resource links.

Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant App as Traceloop-enabled App
    participant Collector as OpenTelemetry Collector
    participant DataPrepper as Data Prepper
    participant OpenSearch as OpenSearch
    participant Dashboards as OpenSearch Dashboards

    App->>Collector: Send OTLP traces (HTTP/gRPC)
    Collector->>DataPrepper: Export traces to Data Prepper (OTLP exporter)
    DataPrepper->>OpenSearch: Index traces into trace-analytics-* indices
    Dashboards->>OpenSearch: Query trace indices for visualization
    Dashboards-->>User: Display trace waterfalls and span metadata
```
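
For local experimentation, the four services in the diagram can be stood up with a compose file along these lines. This is hypothetical glue, not part of the PR: image tags, ports, and mount paths should be checked against the current OpenSearch, Data Prepper, and Collector documentation:

```yaml
services:
  opensearch:
    image: opensearchproject/opensearch:latest
    environment:
      - discovery.type=single-node
    ports: ["9200:9200"]
  dashboards:
    image: opensearchproject/opensearch-dashboards:latest
    environment:
      - OPENSEARCH_HOSTS=["https://opensearch:9200"]
    ports: ["5601:5601"]
  data-prepper:
    image: opensearchproject/data-prepper:latest
    volumes:
      - ./data-prepper-pipelines.yaml:/usr/share/data-prepper/pipelines/pipelines.yaml
    ports: ["21890:21890"]   # OTLP gRPC trace source
  otel-collector:
    image: otel/opentelemetry-collector-contrib:latest
    volumes:
      - ./otel-collector-config.yaml:/etc/otelcol-contrib/config.yaml
    ports: ["4317:4317", "4318:4318"]
```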

Estimated code review effort

🎯 2 (Simple) | ⏱️ ~8 minutes

Poem

🐰 I hop through docs with eager paws,
Tracing footsteps, mapping laws,
OpenSearch opens wide the gate,
Traces bloom and dashboards wait.
A merry hop — integration done! 🥕✨

🚥 Pre-merge checks | ✅ 3
✅ Passed checks (3 passed)
  • Description Check — ✅ Passed: check skipped because CodeRabbit's high-level summary is enabled.
  • Title Check — ✅ Passed: the title clearly and concisely summarizes the main change, adding an OpenSearch integration guide to the documentation.
  • Docstring Coverage — ✅ Passed: no functions found in the changed files to evaluate docstring coverage, so the check was skipped.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 2

🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.

Inline comments:
In `@openllmetry/integrations/opensearch.mdx`:
- Around line 152-168: Update the examples so Traceloop is initialized before
any LLM client is imported: move the Traceloop import and Traceloop.init(...)
call to precede importing OpenAI (or any LLM client) and adjust the prose to
state "Initialize Traceloop (call Traceloop.init) before importing the LLM
client (e.g., OpenAI)" so the example and text consistently show Traceloop.init
occurring prior to the OpenAI import to ensure auto-instrumentation works
correctly.
- Line 262: The <img src="/img/integrations/opensearch-trace-details.png" /> tag
is missing an alt attribute which harms accessibility; update the image element
(the <img ... /> line) to include a concise, descriptive alt text (for example:
alt="OpenSearch trace details screenshot") so screen readers can convey the
image content.

ℹ️ Review info
⚙️ Run configuration

Configuration used: defaults

Review profile: CHILL

Plan: Pro

Run ID: 93902af7-c58b-4e91-9901-f03997477a27

📥 Commits

Reviewing files that changed from the base of the PR and between e236f33 and c635229.

⛔ Files ignored due to path filters (1)
  • img/integrations/opensearch-trace-details.png is excluded by !**/*.png
📒 Files selected for processing (3)
  • mint.json
  • openllmetry/integrations/introduction.mdx
  • openllmetry/integrations/opensearch.mdx

Comment on lines +152 to +168
Import and initialize Traceloop before any LLM imports:

```python
from os import getenv

from traceloop.sdk import Traceloop
from openai import OpenAI

# Initialize Traceloop with OTLP endpoint
Traceloop.init(
    app_name="your-service-name",
    api_endpoint="http://localhost:4318"
)

# Traceloop must be initialized before importing the LLM client
# Traceloop instruments the OpenAI client automatically
client = OpenAI(api_key=getenv("OPENAI_API_KEY"))
```

⚠️ Potential issue | 🟠 Major

Fix contradictory initialization guidance vs example code order.

The page says to initialize Traceloop before LLM imports, but both examples import OpenAI before Traceloop.init(). This can mislead users and break auto-instrumentation expectations.

Suggested doc fix

```diff
-    from traceloop.sdk import Traceloop
-    from openai import OpenAI
+    from traceloop.sdk import Traceloop
@@
     Traceloop.init(
         app_name="your-service-name",
         api_endpoint="http://localhost:4318"
     )
+    from openai import OpenAI
@@
-from traceloop.sdk.decorators import workflow, task
-from openai import OpenAI
+from traceloop.sdk.decorators import workflow, task
@@
 Traceloop.init(
   app_name="recipe-service",
   api_endpoint="http://localhost:4318",
 )
+from openai import OpenAI
```
Also applies to: 227-229


