
Ollama

Operational

Run AI models anywhere

Run large language models locally or via Ollama's cloud API. Simple, fast, and developer-friendly.

Company

Headquarters
Coming soon
Founded
Coming soon

Capabilities

Compute locations are estimated from public sources and may be outdated. Verify directly with the provider for compliance decisions.

Models Hosted
Curated Deck
Deployment
Local + Cloud
Specialties
Coming soon
API Style
Native + OpenAI-compatible
Est. Compute Region
Unknown — contacted by email 16 Feb 2026 [do you know? contact us]
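The "Native + OpenAI-compatible" API style above means Ollama serves two request shapes from the same local daemon: its own `/api/generate` endpoint and an OpenAI-style `/v1/chat/completions` endpoint. A minimal sketch of both payloads, assuming Ollama's documented default host `http://localhost:11434` and an illustrative model name; the snippet only builds the requests, it does not send them:

```python
import json

OLLAMA_HOST = "http://localhost:11434"  # Ollama's default local port

def native_generate_request(model: str, prompt: str) -> tuple[str, str]:
    """Native API: POST /api/generate with a model + prompt body."""
    url = f"{OLLAMA_HOST}/api/generate"
    body = json.dumps({"model": model, "prompt": prompt, "stream": False})
    return url, body

def openai_chat_request(model: str, prompt: str) -> tuple[str, str]:
    """OpenAI-compatible API: POST /v1/chat/completions, using the
    same wire format as the OpenAI Chat Completions endpoint."""
    url = f"{OLLAMA_HOST}/v1/chat/completions"
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, body

# Example: the same question phrased for each API style.
url, body = native_generate_request("llama3", "Why is the sky blue?")
url2, body2 = openai_chat_request("llama3", "Why is the sky blue?")
```

Because the `/v1` endpoint mirrors OpenAI's wire format, existing OpenAI client libraries can typically be pointed at a local Ollama instance by changing only the base URL.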

Models

Coming soon

We are standardizing model listings across providers.

Why Use Ollama

Coming soon

Details

Coming soon
Newsletter

Get the signal, skip the noise.

Weekly digest of new models and provider updates across 40+ compute providers. Curated for AI builders who ship.

New model releases
Capability updates
Provider status
© bots.so — The AI Inference Model Index

bots.so aggregates publicly available model deployment information from official provider sources. We are not affiliated with any model provider. Model availability changes rapidly; always verify on official sites.