
GPT-3

Generative Pre-trained Transformer 3 (GPT-3; stylized GPT·3) is an autoregressive language model that uses deep learning to produce human-like text.

Snippet from Wikipedia: GPT-3

Generative Pre-trained Transformer 3 (GPT-3) is an autoregressive language model released by OpenAI in 2020 that uses deep learning to produce human-like text. When given a prompt, it will generate text that continues the prompt.

The architecture is a decoder-only transformer network with a 2048-token context window and a then-unprecedented 175 billion parameters, requiring 800 GB of storage. The model was trained with a generative pre-training objective: predicting the next token from the tokens that precede it. It demonstrated strong zero-shot and few-shot learning on many tasks.
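To make that pre-training objective concrete, the sketch below shows the autoregressive loop in schematic Python: at each step the model yields a probability distribution over the next token, conditioned on at most 2048 preceding tokens, and a sampled token is appended before the loop repeats. The next_token_distribution function here is a hypothetical stand-in for GPT-3's transformer forward pass, not the real model.

  import random
  from typing import Dict, List

  CONTEXT_WINDOW = 2048  # GPT-3's maximum context length, in tokens


  def next_token_distribution(tokens: List[str]) -> Dict[str, float]:
      # Hypothetical stand-in for the transformer forward pass: a real
      # decoder-only model computes this distribution with stacked
      # self-attention layers over the context tokens.
      return {"the": 0.4, "a": 0.3, "and": 0.2, ".": 0.1}


  def generate(prompt_tokens: List[str], max_new_tokens: int) -> List[str]:
      tokens = list(prompt_tokens)
      for _ in range(max_new_tokens):
          # Condition only on the most recent CONTEXT_WINDOW tokens.
          context = tokens[-CONTEXT_WINDOW:]
          distribution = next_token_distribution(context)
          choices, weights = zip(*distribution.items())
          # Sample the next token, append it, and repeat.
          tokens.append(random.choices(choices, weights=weights)[0])
      return tokens


  print(" ".join(generate(["Once", "upon", "a", "time"], max_new_tokens=5)))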

GPT-3, the successor to GPT-2, is the third-generation language prediction model in the GPT series of foundation models created by OpenAI, a San Francisco-based artificial intelligence research laboratory. GPT-3 was introduced in May 2020 and was in beta testing as of July 2020; it is part of a trend in natural language processing (NLP) toward systems built on pre-trained language representations.

The quality of the text generated by GPT-3 is so high that it can be difficult to determine whether or not it was written by a human, which has both benefits and risks. Thirty-one OpenAI researchers and engineers presented the original May 28, 2020 paper introducing GPT-3. In their paper, they warned of GPT-3's potential dangers and called for research to mitigate risk. David Chalmers, an Australian philosopher, described GPT-3 as "one of the most interesting and important AI systems ever produced." An April 2022 review in The New York Times described GPT-3 as able to write original prose with fluency equivalent to that of a human.

Microsoft announced on September 22, 2020, that it had licensed "exclusive" use of GPT-3; others can still use the public API to receive output, but only Microsoft has access to GPT-3's underlying model.
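For illustration, the sketch below shows how a completion could be requested from GPT-3 through that public API, using the pre-1.0 "openai" Python package that was current when this page was last updated. The model name, prompt, and sampling parameters are examples only, and a valid OpenAI API key is required.

  import openai

  openai.api_key = "sk-..."  # replace with your own OpenAI API key

  # Ask a GPT-3 model to continue a prompt; "text-davinci-002" is one example
  # of a GPT-3 model name offered through the API at the time.
  response = openai.Completion.create(
      model="text-davinci-002",
      prompt="Write a one-sentence summary of GPT-3:",
      max_tokens=64,
      temperature=0.7,
  )

  print(response.choices[0].text.strip())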
