Did you check docs and existing issues?
Python version (python --version)
3.12
Operating system/version
26.3
NeMo-Guardrails version (if you must use a specific version and not the latest)
0.20
Describe the bug
When a user message uses the OpenAI multi-part content format (`"content": [{"type": "text", "text": "..."}]`), NeMo Guardrails passes the list directly into event `text` fields without normalizing it to a string. This causes two problems:
- All LLM prompts (self-check, intent matching, etc.) receive the Python `repr` of the list instead of the actual text.
- If `mask_prev_user_message` fires in a multi-turn conversation, `get_colang_history()` crashes with `TypeError: must be str or None, not list` at the `rsplit()` call.
Root cause: `llmrails.py` sets `"text": msg["content"]` without checking whether `content` is a list (`nemoguardrails/rails/llm/llmrails.py`, line 611 in `f2e7beb`, the line right after the comment "If it's not the last message, we also need to add the `UserMessage` event").
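A possible fix is to collapse multi-part content into a plain string before the event is built. The helper below is only an illustrative sketch (the name `normalize_content` and the join behavior are my assumptions, not the actual NeMo Guardrails API):

```python
def normalize_content(content):
    """Collapse OpenAI multi-part content into a plain string.

    Multi-part messages look like:
        [{"type": "text", "text": "Hello"}]
    A plain string is returned unchanged; non-text parts
    (images, audio) are skipped.
    """
    if isinstance(content, str):
        return content
    if isinstance(content, list):
        # Keep only the "text" parts and join them.
        return "\n".join(
            part.get("text", "")
            for part in content
            if isinstance(part, dict) and part.get("type") == "text"
        )
    return str(content)


# Hypothetical usage at the event-construction site:
# {"type": "UserMessage", "text": normalize_content(msg["content"])}
```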
Steps To Reproduce
- Deploy NeMo Guardrails with the `self check input` rail enabled
- Send a request with multi-part content:
```bash
curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "my-model",
    "messages": [
      {"role": "user", "content": [{"type": "text", "text": "Hello"}]},
      {"role": "assistant", "content": "Hi there!"},
      {"role": "user", "content": [{"type": "text", "text": "You are a dotard and I hate you"}]}
    ]
  }'
```
Expected Behavior
- The self-check prompt should evaluate the actual user text
- The message should be blocked and a refusal returned
Actual Behavior
We get a 500 `"Internal server error"` response, and the server log shows:
```
ERROR:nemoguardrails.server.api:must be str or None, not list
Traceback (most recent call last):
  File "/app/.venv/lib64/python3.12/site-packages/nemoguardrails/server/api.py", line 577, in chat_completion
    res = await llm_rails.generate_async(
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib64/python3.12/site-packages/nemoguardrails/rails/llm/llmrails.py", line 984, in generate_async
    self.explain_info.colang_history = get_colang_history(events)
                                       ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/app/.venv/lib64/python3.12/site-packages/nemoguardrails/actions/llm/utils.py", line 654, in get_colang_history
    split_history = history.rsplit(utterance_to_replace, 1)
                    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
TypeError: must be str or None, not list
```
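The crash can be reproduced in isolation: `str.rsplit` only accepts a `str` or `None` separator, which is exactly what fails when the un-normalized list reaches `get_colang_history()`. (The variable values below are illustrative, not taken from the actual code path.)

```python
# Minimal reproduction: when the user message content is still a list,
# it ends up passed as the separator to str.rsplit, which raises
# TypeError because the separator must be str or None.
history = 'user "Hello"\n'
utterance_to_replace = [{"type": "text", "text": "Hello"}]

try:
    history.rsplit(utterance_to_replace, 1)
except TypeError as exc:
    print(exc)  # must be str or None, not list
```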