What is Generative Pre-trained Transformer 3 (GPT-3)?

GPT-3 is a large language model that uses deep learning to generate human-like text. Its full version has a capacity of 175 billion machine learning parameters. This can be loosely contrasted with Microsoft's Turing Natural Language Generation (T-NLG) model at 17 billion parameters and the Tangora model at over 8 trillion parameters.

It has been stated that the "quality of the text generated by GPT-3 is so high that it can be difficult to determine whether or not it was written by a human". The model can perform a variety of tasks, including summarizing text and answering questions.
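
In practice, GPT-3 is accessed through OpenAI's API rather than run locally. The sketch below shows how a summarization prompt might be sent using the legacy completion interface of the openai Python package (pre-1.0 versions); the engine name, prompt wording, and sampling parameters are illustrative assumptions, not a definitive recipe.

```python
# Minimal sketch: asking GPT-3 to summarize a passage via OpenAI's API.
# Assumes a pre-1.0 version of the `openai` package and a valid API key;
# the engine name and parameter values are illustrative.
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; supply your own key

article = (
    "GPT-3 is a model that uses deep learning to create human-like text. "
    "Its full version has a capacity of 175 billion parameters."
)

response = openai.Completion.create(
    engine="davinci",   # GPT-3 base engine at launch (assumption)
    prompt=f"Summarize the following text in one sentence:\n\n{article}\n\nSummary:",
    max_tokens=60,      # cap on the number of generated tokens
    temperature=0.3,    # low temperature keeps the summary focused
)

print(response.choices[0].text.strip())
```

The same pattern covers question answering: replacing the summarization prompt with a question followed by "Answer:" is enough, since GPT-3 performs these tasks through prompting rather than task-specific fine-tuning.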