Ollama backend implementation for LLM interactions. Provides streaming chat, embeddings, tool calling, vision support, and model management.
llm_ollama is reported as safe to use (health: 51/100)
Get this data programmatically: free, no authentication.

curl https://depscope.dev/api/check/pub/llm_ollama

Last updated · 2026-02-28T13:23:05.445619Z
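Beyond the one-off curl call, the endpoint can be consumed from a script. A minimal sketch in Python: the URL pattern follows the curl example above, but the JSON response shape (a numeric `health` field) is an assumption, not documented here, so the parser is exercised against a sample payload rather than a live response.

```python
import json
from urllib.parse import quote

# Base URL taken from the curl example above.
API_BASE = "https://depscope.dev/api/check"

def check_url(registry: str, package: str) -> str:
    # Build the check URL, e.g. https://depscope.dev/api/check/pub/llm_ollama
    return f"{API_BASE}/{quote(registry)}/{quote(package)}"

def parse_health(payload: str) -> int:
    # Assumes the API returns JSON with a numeric "health" field
    # (hypothetical shape, mirroring the 51/100 score shown above).
    return int(json.loads(payload)["health"])

# Sample payload standing in for a live response:
sample = '{"package": "llm_ollama", "health": 51}'
print(check_url("pub", "llm_ollama"))
print(parse_health(sample))
```

In a real script you would fetch `check_url(...)` with `urllib.request.urlopen` or `requests` and pass the body to `parse_health`; the sample payload here only illustrates the assumed shape.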