Mixtral-8x7B-Instruct-v0.1

Large Language Models (LLM)

The Mixtral-8x7B-Instruct-v0.1 model, developed by Mistral AI, is a Sparse Mixture of Experts model released in 2023. It is optimized for following instructions, completing requests, and generating creative text formats.
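
For illustration, a minimal sketch of prompting the instruct model through the Hugging Face transformers library; the repository id, dtype, and sampling settings below are reasonable defaults rather than values taken from this card, and running the full 46.7B-parameter model locally assumes substantial GPU memory:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Public Hugging Face repository for the instruct model (assumption: enough
# GPU memory is available, e.g. via multi-GPU sharding or quantization).
model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # requires the accelerate package
)

# The instruct variant expects the Mistral chat template, applied by the tokenizer.
messages = [{"role": "user", "content": "Write a two-line poem about sparse mixture-of-experts models."}]
input_ids = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```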

About the Mixtral-8x7B-Instruct-v0.1 model

Published on huggingface

December 11, 2023


Input price

0.63 / Mtoken (input)

Output price

0.63 / Mtoken (output)


Supported features
Streaming
Output formats
raw_text, json_object, json_schema
Context sizes
32k
Parameters
46.7B
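
The streaming and JSON output formats listed above can be combined in a single request. A minimal sketch follows, assuming the model is served behind an OpenAI-compatible chat completions endpoint; the base URL, API key, and model name are placeholders, not values from this card:

```python
from openai import OpenAI

# Hypothetical OpenAI-compatible endpoint; replace base_url, api_key, and the
# model name with the values provided by your serving platform.
client = OpenAI(base_url="https://example.com/v1", api_key="YOUR_API_KEY")

# Request a JSON object response and stream tokens as they are produced.
stream = client.chat.completions.create(
    model="mixtral-8x7b-instruct-v0.1",
    messages=[
        {"role": "user",
         "content": "Return a JSON object with the keys 'city' and 'country' for Lisbon."}
    ],
    response_format={"type": "json_object"},  # the json_object output format
    stream=True,                              # the Streaming feature
    max_tokens=128,
)

for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```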
