EleutherAI Open-Sources Six Billion Parameter GPT-3 Clone GPT-J

A team of researchers from EleutherAI has open-sourced GPT-J, a six-billion-parameter natural language processing (NLP) AI model based on the GPT-3 architecture. The model was trained on an 800GB open-source text dataset and has performance comparable to a GPT-3 model of similar size.

Developer Aran Komatsuzaki announced the release on his blog. The model was trained on EleutherAI’s Pile dataset using Google Cloud TPU v3-256 hardware; training took approximately five weeks. On common NLP benchmark tasks, GPT-J achieves accuracy similar to OpenAI’s published results for the 6.7B-parameter version of GPT-3. EleutherAI’s release includes the model code, pre-trained weight files, a Colab notebook, and a demo website. According to Komatsuzaki,…
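For readers who want to try the released weights, here is a minimal sketch of loading GPT-J and sampling a completion through the Hugging Face transformers library; the `EleutherAI/gpt-j-6B` checkpoint name and the transformers route are assumptions for illustration, not part of the release described above, which ships the original model code, weights, Colab notebook, and demo site.

```python
# Minimal sketch: load GPT-J and sample a short completion.
# Assumes the publicly hosted "EleutherAI/gpt-j-6B" checkpoint on the
# Hugging Face Hub; this is not the original JAX release artifact.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6B"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "EleutherAI's GPT-J is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=50,   # length of the generated continuation
    do_sample=True,      # sample rather than greedy-decode
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that loading the full six-billion-parameter model locally requires roughly 12 GB of memory in half precision (more in full precision), so the Colab notebook and demo website included in the release are the lighter-weight ways to experiment.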
