emasters
[email protected] 1 point 1 year ago

Seems like they're sticking with the 7b series for the time being.


Long Sequence Modeling with XGen: A 7B LLM Trained on 8K Input Sequence Length -- https://blog.salesforceairesearch.com/xgen/