Meta

Meta: Llama 4 Scout

meta-llama/llama-4-scout
Llama 4 Scout 17B Instruct (16E) is a mixture-of-experts (MoE) language model developed by Meta, activating 17 billion parameters out of a total of 10...
327,680 tokens text+image->text
2025
Input: $0.0000000800
Output: $0.0000003000
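The per-token rates above make request costs easy to estimate: multiply prompt tokens by the input rate and completion tokens by the output rate. A minimal sketch in Python, using Llama 4 Scout's listed rates (the token counts are made-up example values):

```python
# Estimate the cost of one request at Llama 4 Scout's listed per-token rates.
INPUT_RATE = 0.0000000800   # USD per input token
OUTPUT_RATE = 0.0000003000  # USD per output token

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Total USD cost: input and output tokens are billed at separate rates."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# Example: a 10,000-token prompt with a 1,000-token completion.
cost = request_cost(10_000, 1_000)
print(f"${cost:.6f}")  # $0.001100
```

The same arithmetic applies to every model in this list; only the two rates change.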

Meta: Llama Guard 4 12B

meta-llama/llama-guard-4-12b
Llama Guard 4 is a Llama 4 Scout-derived multimodal pretrained model, fine-tuned for content safety classification. Similar to previous versions, it c...
163,840 tokens text+image->text
2025
Input: $0.0000001800
Output: $0.0000001800

Meta: LlamaGuard 2 8B

meta-llama/llama-guard-2-8b
This safeguard model has 8B parameters and is based on the Llama 3 family. Just like its predecessor, [LlamaGuard 1](https://huggingface.co/meta-llama/...
8,192 tokens text->text
2024
Input: $0.0000002000
Output: $0.0000002000

Microsoft

Microsoft: MAI DS R1

microsoft/mai-ds-r1
MAI-DS-R1 is a post-trained variant of DeepSeek-R1 developed by the Microsoft AI team to improve the model’s responsiveness on previously blocked to...
163,840 tokens text->text
2025
Input: $0.0000003000
Output: $0.0000012000

Microsoft: MAI DS R1 (free)

microsoft/mai-ds-r1:free
MAI-DS-R1 is a post-trained variant of DeepSeek-R1 developed by the Microsoft AI team to improve the model’s responsiveness on previously blocked to...
163,840 tokens text->text
2025
Cost: Free
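The `:free` suffix on the slug selects the no-cost variant of the same model. As a sketch of how the slug is used, assuming an OpenAI-compatible chat-completions request body (the field names here are an illustrative assumption, not taken from this listing):

```python
import json

# Hypothetical request body for an OpenAI-compatible chat completions API;
# the payload shape is an assumption for illustration. The model slug is
# taken verbatim from the listing above.
payload = {
    "model": "microsoft/mai-ds-r1:free",  # ':free' suffix picks the free variant
    "messages": [
        {"role": "user", "content": "Summarize MAI-DS-R1 in one sentence."}
    ],
}

body = json.dumps(payload)
print(body)
```

Dropping the `:free` suffix routes the same request to the paid `microsoft/mai-ds-r1` listing.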

Microsoft: Phi 4

microsoft/phi-4
[Microsoft Research](/microsoft) Phi-4 is designed to perform well in complex reasoning tasks and can operate efficiently in situations with limited m...
16,384 tokens text->text
2025
Input: $0.0000000600
Output: $0.0000001400

Microsoft: Phi 4 Multimodal Instruct

microsoft/phi-4-multimodal-instruct
Phi-4 Multimodal Instruct is a versatile 5.6B parameter foundation model that combines advanced reasoning and instruction-following capabilities acros...
131,072 tokens text+image->text
2025
Input: $0.0000000500
Output: $0.0000001000

Microsoft: Phi 4 Reasoning Plus

microsoft/phi-4-reasoning-plus
Phi-4-reasoning-plus is an enhanced 14B parameter model from Microsoft, fine-tuned from Phi-4 with additional reinforcement learning to boost accuracy...
32,768 tokens text->text
2025
Input: $0.0000000700
Output: $0.0000003500

Microsoft: Phi-3 Medium 128K Instruct

microsoft/phi-3-medium-128k-instruct
Phi-3 128K Medium is a powerful 14-billion parameter model designed for advanced language understanding, reasoning, and instruction following. Optimiz...
128,000 tokens text->text
2024
Input: $0.0000010000
Output: $0.0000010000

Microsoft: Phi-3 Mini 128K Instruct

microsoft/phi-3-mini-128k-instruct
Phi-3 Mini is a powerful 3.8B parameter model designed for advanced language understanding, reasoning, and instruction following. Optimized through su...
128,000 tokens text->text
2024
Input: $0.0000001000
Output: $0.0000001000

Microsoft: Phi-3.5 Mini 128K Instruct

microsoft/phi-3.5-mini-128k-instruct
Phi-3.5 models are lightweight, state-of-the-art open models. These models were trained with Phi-3 datasets that include both synthetic data and the f...
128,000 tokens text->text
2024
Input: $0.0000001000
Output: $0.0000001000

WizardLM-2 8x22B

microsoft/wizardlm-2-8x22b
WizardLM-2 8x22B is Microsoft AI's most advanced Wizard model. It demonstrates highly competitive performance compared to leading proprietary models, ...
65,536 tokens text->text
2024
Input: $0.0000004800
Output: $0.0000004800

MiniMax

MiniMax: MiniMax M1

minimax/minimax-m1
MiniMax-M1 is a large-scale, open-weight reasoning model designed for extended context and high-efficiency inference. It leverages a hybrid Mixture-of...
1,000,000 tokens text->text
2025
Input: $0.0000004000
Output: $0.0000022000

MiniMax: MiniMax M2

minimax/minimax-m2
MiniMax-M2 is a compact, high-efficiency large language model optimized for end-to-end coding and agentic workflows. With 10 billion activated paramet...
204,800 tokens text->text
2025
Input: $0.0000002550
Output: $0.0000010200

MiniMax: MiniMax-01

minimax/minimax-01
MiniMax-01 combines MiniMax-Text-01 for text generation and MiniMax-VL-01 for image understanding. It has 456 billion parameters, with 45.9 billi...
1,000,192 tokens text+image->text
2025
Input: $0.0000002000
Output: $0.0000011000

Mistral AI

Mistral Large

mistralai/mistral-large
This is Mistral AI's flagship model, Mistral Large 2 (version `mistral-large-2407`). It's a proprietary weights-available model and excels at reasonin...
128,000 tokens text->text
2024
Input: $0.0000020000
Output: $0.0000060000

Mistral Large 2407

mistralai/mistral-large-2407
This is Mistral AI's flagship model, Mistral Large 2 (version mistral-large-2407). It's a proprietary weights-available model and excels at reasoning,...
131,072 tokens text->text
2024
Input: $0.0000020000
Output: $0.0000060000

Mistral Large 2411

mistralai/mistral-large-2411
Mistral Large 2 2411 is an update of [Mistral Large 2](/mistralai/mistral-large) released together with [Pixtral Large 2411](/mistralai/pixtral-large-...
131,072 tokens text->text
2024
Input: $0.0000020000
Output: $0.0000060000

Mistral Small

mistralai/mistral-small
With 22 billion parameters, Mistral Small v24.09 offers a convenient mid-point between [Mistral NeMo 12B](/mistralai/mistral-nemo) and [Mistral Large ...
32,768 tokens text->text
2024
Input: $0.0000002000
Output: $0.0000006000

Mistral Tiny

mistralai/mistral-tiny
Note: This model is being deprecated. The recommended replacement is the newer [Ministral 8B](/mistral/ministral-8b). This model is currently powered by...
32,768 tokens text->text
2024
Input: $0.0000002500
Output: $0.0000002500