ollamaPackages: docs: link to some mixture-of-experts models I'd like to try
@@ -106,3 +106,14 @@ ollama API isn't documented anywhere, and it has changed over time, but it's all
 - [x] orca-mini (3b, 7b, 13b, 70b)
   - released 2023-06-23
   - <https://ollama.com/library/orca-mini>
+
+- [ ] dolphin-mixtral (8x7b, 8x22b)
+  - Mixture-of-Experts
+  - <https://ollama.com/library/dolphin-mixtral>
+- [ ] deepseek-coder-v2 (16b, 236b)
+  - Mixture-of-Experts
+  - <https://ollama.com/library/deepseek-coder-v2>
+- [ ] mixtral (8x7b, 8x22b)
+  - Mixture-of-Experts
+  - <https://ollama.com/library/mixtral>
+