Phi-2
Phi-2 is a 2.7-billion-parameter LLM trained by Microsoft on "textbook quality" data, part of it synthetic. Despite its small size, Microsoft reports that Phi-2 is competitive with much larger models, up to the 13B range.
Supported variants
Airtrain supports one variant of the Phi-2 model.
3b-instruct
This is the original model released by Microsoft, which supports instruction-style usage.
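For instruction-style usage, Phi-2's Hugging Face model card documents an "Instruct: ... Output:" prompt template. Below is a minimal sketch of building such a prompt; the helper name `build_phi2_prompt` is illustrative, and loading the actual weights (e.g. via `transformers`' `AutoModelForCausalLM.from_pretrained("microsoft/phi-2")`) is omitted to keep the example lightweight.

```python
def build_phi2_prompt(instruction: str) -> str:
    """Wrap a user instruction in Phi-2's documented instruct template.

    The "Instruct: ... Output:" format comes from the Phi-2 model card;
    the model then generates its answer after the "Output:" marker.
    """
    return f"Instruct: {instruction}\nOutput:"

prompt = build_phi2_prompt("Explain what a parameter is in a neural network.")
print(prompt)
# The resulting string would be passed to the model's tokenizer and
# generate() call when the weights are loaded.
```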