Hallucination Detection Requirements #503

@psschwei

Description

Add requirements for detecting hallucinations and unfounded claims, using both heuristic checks and LLM-as-judge validation.

Acceptance Criteria:

  • GroundedInSourceRequirement - output must be traceable to provided sources
  • NoFabricatedFactsRequirement - LLM-as-judge for unsupported claims
  • CitationRequirement - requires citations for factual claims
  • ConsistencyRequirement - output consistent with prior context
  • UncertaintyAcknowledgementRequirement - admits when unsure
  • Integration with existing mellea/stdlib/components/intrinsic/rag.py
  • Tests with known hallucination examples
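A minimal heuristic sketch of one of the requirements above. The class name follows the criterion, but the `validate()` signature, the claim markers, and the citation pattern are all illustrative assumptions — they are not mellea's actual `Requirement` API, which would need to be matched when integrating with `mellea/stdlib/components/intrinsic/rag.py`:

```python
import re

# Hypothetical heuristic side of CitationRequirement; the validate()
# signature and marker lists are assumptions for illustration only.
CITATION_PATTERN = re.compile(r"\[\d+\]|\(Source:[^)]+\)")

class CitationRequirement:
    """Passes only if every sentence making a factual claim cites a source."""

    # Crude markers that a sentence asserts a fact (illustrative only; a
    # real implementation would pair this with LLM-as-judge validation).
    CLAIM_MARKERS = ("according to", "studies show", "research", "% of")

    def validate(self, output: str) -> bool:
        sentences = re.split(r"(?<=[.!?])\s+", output.strip())
        for sentence in sentences:
            lowered = sentence.lower()
            makes_claim = any(m in lowered for m in self.CLAIM_MARKERS)
            if makes_claim and not CITATION_PATTERN.search(sentence):
                return False  # factual claim without a citation
        return True

req = CitationRequirement()
print(req.validate("Studies show 40% of outputs hallucinate [1]."))  # True
print(req.validate("Studies show models never hallucinate."))        # False
```

A heuristic like this is cheap enough to run on every output; the LLM-as-judge requirements (e.g. NoFabricatedFactsRequirement) would cover the unsupported claims this pattern cannot see.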
