Google trained a trillion-parameter AI language model

VentureBeat article https://venturebeat.com/2021/01/12/google-trained-a-trillion-parameter-ai-language-model

Google Switch Transformers: Scaling to Trillion Parameter Models with constant computational costs
Towards Data Science article https://towardsdatascience.com/google-switch-transformers-scaling-to-trillion-parameter-models-with-constant-computational-costs-806fd145923d

Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity
33-minute YouTube video by Yannic Kilcher discussing the paper https://www.youtube.com/watch?v=iAR8LkkMMIM

Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficient Sparsity
Google paper on arXiv https://arxiv.org/abs/2101.03961
Google paper PDF https://arxiv.org/pdf/2101.03961

Google paper GitHub: Mesh TensorFlow — Model Parallelism Made Easier
GitHub https://github.com/tensorflow/mesh
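The paper's central idea is that each token is routed to just one expert feed-forward network (top-1 "switch" routing), so compute per token stays roughly constant even as total parameters grow into the trillions. A toy NumPy sketch of that routing step is below; this is an illustration only, not the paper's actual implementation, and the function and variable names (`switch_route`, `gate_w`, `experts`) are invented here:

```python
import numpy as np

def switch_route(tokens, gate_w, experts):
    """Toy top-1 expert routing.

    tokens:  (n, d) token representations
    gate_w:  (d, e) router weights, one column per expert
    experts: list of e weight matrices, each (d, d)
    """
    logits = tokens @ gate_w                           # (n, e) router logits
    probs = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs /= probs.sum(axis=1, keepdims=True)          # softmax over experts
    choice = probs.argmax(axis=1)                      # top-1 expert per token

    out = np.zeros_like(tokens)
    for e_idx, w in enumerate(experts):
        mask = choice == e_idx
        # Each token passes through only its chosen expert; the output is
        # scaled by the router probability, as described in the paper.
        out[mask] = (tokens[mask] @ w) * probs[mask, e_idx:e_idx + 1]
    return out

rng = np.random.default_rng(0)
d, n_experts, n_tokens = 8, 4, 16
tokens = rng.standard_normal((n_tokens, d))
gate_w = rng.standard_normal((d, n_experts))
experts = [rng.standard_normal((d, d)) for _ in range(n_experts)]
y = switch_route(tokens, gate_w, experts)
print(y.shape)  # each token was processed by exactly one expert
```

Because only one expert runs per token, adding more experts increases parameter count without increasing per-token FLOPs, which is the "constant computational cost" the article titles refer to.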

Stay up to date. Subscribe to my posts https://morrislee1234.wixsite.com/website/contact
Web site with my other posts by category https://morrislee1234.wixsite.com/website

LinkedIn https://www.linkedin.com/in/morris-lee-47877b7b

Photo by Oudom Pravat on Unsplash

AI News Clips by Morris Lee: News to help your R&D

Written by AI News Clips by Morris Lee: News to help your R&D

A computer vision consultant in artificial intelligence and related high-tech technologies for 37+ years. An innovator with 66+ patents, ready to help a firm's R&D.
