Basic checks
What's broken?
message.content can be four different types (String, Hash, Content, Content::Raw) depending on where in the lifecycle you access it. This makes it difficult to work with reliably, and Content::Raw lacks a to_s implementation, so any code using .to_s gets #<RubyLLM::Content::Raw:0x...>.
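Until this is normalized upstream, a defensive helper can coerce all four shapes to a String. This is a workaround sketch, not RubyLLM API: the `value` accessor on the wrapped content objects is an assumption about the library's internals and may need adjusting.

```ruby
require "json"

# Workaround sketch: coerce message.content to a String no matter which
# of the four shapes it arrives in. NOTE: the `value` accessor used for
# Content / Content::Raw is a guess about RubyLLM internals.
def normalized_content(content)
  case content
  when String then content
  when Hash   then JSON.generate(content)
  else
    # Content or Content::Raw: unwrap an inner value if one is exposed,
    # then serialize Hashes and stringify everything else.
    inner = content.respond_to?(:value) ? content.value : content
    inner.is_a?(Hash) ? JSON.generate(inner) : inner.to_s
  end
end
```

This at least keeps downstream code (logging, display, re-prompting) working identically on the first and second turn.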
How to reproduce
Use any chat with a schema and ActiveRecord persistence:
chat = chat_model.create!
chat.with_schema(MySchema)
chat.ask("first question") # response.content is a Hash (after JSON.parse)
chat.ask("second question") # previous response reloaded as Content::Raw
Expected behavior
- Content type is normalized and stays consistent regardless of whether the message was just created or loaded from the database
- At minimum, Content::Raw should implement to_s
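The minimal to_s fix could look something like the monkey-patch below. It assumes the raw payload lives in an @value ivar, which is a guess about Content::Raw's internals; the ivar name would need to match the real class.

```ruby
require "json"

# Monkey-patch sketch: give Content::Raw a to_s that serializes the
# wrapped payload instead of the default #<RubyLLM::Content::Raw:0x...>.
# ASSUMPTION: the payload is stored in @value (unverified internal detail).
module RubyLLM
  class Content
    class Raw
      def to_s
        payload = instance_variable_get(:@value)
        payload.is_a?(Hash) ? JSON.generate(payload) : payload.to_s
      end
    end
  end
end
```

A real fix in the gem would presumably use the actual accessor rather than instance_variable_get, but this shape is enough to stop .to_s from producing the inspect string.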
What actually happened
On the second turn, to_llm reloads messages from the database. The first response's content, which was a Hash in memory, is now a Content::Raw wrapping that same Hash.
Environment
- Ruby version: 3.2.2
- RubyLLM version: 1.14.0
- Rails version: 7.2.3
- Provider (OpenAI, Anthropic, etc.): OpenAI