Query multiple LLMs (Ollama)
In this UI, you can ask the same question to multiple supported Ollama models.
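As a sketch of what such a UI does under the hood, the snippet below sends the same prompt to several locally running Ollama models through Ollama's HTTP API (`POST /api/generate` on the default port 11434) and collects one answer per model. The model names are placeholders, not a claim about which models this UI supports — substitute whatever `ollama list` shows on your machine.

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"


def build_payload(model: str, prompt: str) -> dict:
    # Non-streaming request body for Ollama's /api/generate endpoint.
    return {"model": model, "prompt": prompt, "stream": False}


def ask_all(models: list[str], prompt: str) -> dict[str, str]:
    """Send the same prompt to each model and collect the responses."""
    answers = {}
    for model in models:
        req = urllib.request.Request(
            OLLAMA_URL,
            data=json.dumps(build_payload(model, prompt)).encode(),
            headers={"Content-Type": "application/json"},
        )
        with urllib.request.urlopen(req) as resp:
            answers[model] = json.load(resp)["response"]
    return answers


if __name__ == "__main__":
    # Hypothetical model names -- replace with models you have pulled locally.
    for model, answer in ask_all(["llama3", "mistral"], "Why is the sky blue?").items():
        print(f"--- {model} ---\n{answer}\n")
```

Each request is independent, so a real UI would typically issue them concurrently (e.g. with a thread pool) and render the answers side by side as they arrive.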