Mixtral-8x7B-Instruct-v0.1

Large Language Models (LLMs)

The Mixtral-8x7B-Instruct-v0.1 model, developed by Mistral AI, is a Sparse Mixture of Experts model released in 2023. It is optimized for following instructions, completing requests, and generating creative text formats.
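The core idea of a sparse Mixture of Experts layer is that a learned router sends each token to only a few of the available expert sub-networks, so only a fraction of the parameters is active per token. A minimal sketch of top-2 routing (illustrative only; the function name, shapes, and the use of plain NumPy are assumptions, not Mixtral's actual implementation):

```python
import numpy as np

def moe_layer(x, gate_w, experts, top_k=2):
    """Sparse MoE sketch: route a token vector x to its top-k experts
    and return the softmax-weighted mix of their outputs."""
    logits = x @ gate_w                     # router scores, one per expert
    top = np.argsort(logits)[-top_k:]       # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                # softmax over the selected experts only
    return sum(w * experts[i](x) for w, i in zip(weights, top))
```

Because only `top_k` experts run per token, compute per token stays close to that of a much smaller dense model even though total parameter count (here, 46.7B) is large.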

About the Mixtral-8x7B-Instruct-v0.1 model

Published on Hugging Face

December 11, 2023


Input price

0.63 / Mtoken (input)

Output price

0.63 / Mtoken (output)


Supported features
Streaming
Output formats
raw_text, json_object, json_schema
Context sizes
32k
Parameters
46.7B
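Instruct-tuned Mistral-family models expect prompts wrapped in `[INST] ... [/INST]` markers. A minimal sketch of building that chat string by hand (the helper name is hypothetical, and in practice the tokenizer's own chat template is the canonical source of this format):

```python
def format_mixtral_prompt(messages):
    """Build a Mistral-style instruct prompt from a list of
    {"role": ..., "content": ...} messages (sketch only)."""
    out = "<s>"
    for m in messages:
        if m["role"] == "user":
            out += f"[INST] {m['content']} [/INST]"
        else:  # assistant turn: append the reply and close the sequence
            out += f"{m['content']}</s>"
    return out
```

For example, a single user message `"Hello"` yields `"<s>[INST] Hello [/INST]"`, after which the model generates the assistant reply.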

Try the model by playing with it.