ollamaPackages: docs: link to some mixture-of-experts models I'd like to try

2025-07-24 21:33:42 +00:00
parent e2a183e8d3
commit 011c428c08


@@ -106,3 +106,14 @@ ollama API isn't documented anywhere, and it has changed over time, but it's all
- [x] orca-mini (3b, 7b, 13b, 70b)
- released 2023-06-23
- <https://ollama.com/library/orca-mini>
- [ ] dolphin-mixtral (8x7b, 8x22b)
- Mixture-of-Experts
- <https://ollama.com/library/dolphin-mixtral>
- [ ] deepseek-coder-v2 (16b, 236b)
- Mixture-of-Experts
- <https://ollama.com/library/deepseek-coder-v2>
- [ ] mixtral (8x7b, 8x22b)
- Mixture-of-Experts
- <https://ollama.com/library/mixtral>