Vercel AI Provider for running LLMs locally using Ollama
[email protected] low health (58/100) — consider alternatives
Get this data programmatically, free and with no authentication required:
curl https://depscope.dev/api/check/npm/ollama-ai-provider
First published · 2024-05-05T09:34:41.490Z
Last updated · 2025-01-17T16:54:36.767Z