Hi all,
I’m running into an issue with n8n’s OpenRouter Chat Model node. I have an AI Agent node that needs to return strict JSON output (enforced via a system prompt and a JSON Schema output parser), and it works perfectly with the built-in OpenAI Chat Model node (gpt-4.1-mini). As soon as I switch to the OpenRouter Chat Model node (openai/gpt-4.1-mini-2025-04-14), the execution fails with “Provider returned error,” and the logs show:
```json
"finish_reason": "tool_calls"
```
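For context, here’s a simplified version of the JSON Schema the output parser uses (the field names below are placeholders; the real schema just has more properties, but the shape is the same):

```json
{
  "type": "object",
  "properties": {
    "summary": { "type": "string" },
    "tags": {
      "type": "array",
      "items": { "type": "string" }
    }
  },
  "required": ["summary", "tags"],
  "additionalProperties": false
}
```

The system prompt simply instructs the model to reply with JSON matching that schema and nothing else.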
Has anyone solved this before?
Any guidance would be hugely appreciated!
n8n version: 1.89