New Extremely Large Model!
over 1 year ago by Katherine Jackson
Our new and improved xlarge model has better generation quality and 4x faster prediction speed. It now supports a maximum token length of 2048 tokens, as well as frequency and presence penalties.
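For readers unfamiliar with these sampling controls, below is a minimal sketch of how frequency and presence penalties are commonly applied during decoding. This follows the widely used definition (logit reduced by the penalty scaled by repeat count, plus a flat penalty if the token has appeared at all); the exact formula this model uses is not specified in this announcement, so treat the code as illustrative.

```python
from collections import Counter

def apply_penalties(logits, generated_tokens,
                    frequency_penalty=0.0, presence_penalty=0.0):
    """Adjust per-token logits based on tokens already generated.

    Common convention (assumed here, not confirmed by the announcement):
    each token's logit is reduced by frequency_penalty times the number
    of times it has already appeared, plus presence_penalty once if it
    has appeared at all. Higher values discourage repetition.
    """
    counts = Counter(generated_tokens)
    adjusted = dict(logits)
    for token, count in counts.items():
        if token in adjusted:
            adjusted[token] -= frequency_penalty * count  # scales with repeats
            adjusted[token] -= presence_penalty           # flat, once per seen token
    return adjusted

logits = {"the": 2.0, "cat": 1.5, "sat": 1.0}
out = apply_penalties(logits, ["the", "the", "cat"],
                      frequency_penalty=0.5, presence_penalty=0.2)
# "the" appeared twice:  2.0 - 0.5*2 - 0.2
# "cat" appeared once:   1.5 - 0.5*1 - 0.2
# "sat" never appeared:  unchanged
```

Raising either penalty pushes the sampler away from tokens it has already produced, which is the main practical effect of enabling these parameters.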