GPT-2

Generative Pre-trained Transformer 2 (GPT-2)
Original author(s): OpenAI
Initial release: 14 February 2019
Repository: https://github.com/openai/gpt-2
Predecessor: GPT-1
Successor: GPT-3
Type: Large language model
License: MIT[1]
Website: openai.com/blog/gpt-2-1-5b-release/

Generative Pre-trained Transformer 2 (GPT-2) is a large language model developed by OpenAI and the second in its foundational series of GPT models. GPT-2 was pre-trained on a dataset of 8 million web pages.[2] It was partially released in February 2019, followed by the full release of the 1.5-billion-parameter model on 5 November 2019.[3][4][5][6][7]

GPT-2 was created as a "direct scale-up" of GPT-1,[8] with a ten-fold increase in both its parameter count and the size of its training dataset.[7] It is a general-purpose learner: its ability to perform a variety of tasks follows from its general ability to accurately predict the next item in a sequence.[2][9] This lets it translate text, answer questions about a topic from a passage, summarize longer texts,[9] and generate output sometimes indistinguishable from human writing,[10] although it can become repetitive or nonsensical when generating long passages.[11] It was superseded by GPT-3 and GPT-4, whose weights, unlike GPT-2's, are not openly released.
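
Because GPT-2's weights are openly available, this next-token prediction loop is easy to observe directly. The sketch below uses the Hugging Face transformers library to greedily extend a prompt one token at a time; the library and the "gpt2" checkpoint name are assumptions of this illustration, not part of OpenAI's original release (which shipped its own TensorFlow code at github.com/openai/gpt-2).

    # Minimal sketch: GPT-2 as a next-token predictor (assumes the
    # Hugging Face "transformers" and "torch" packages are installed).
    import torch
    from transformers import GPT2LMHeadModel, GPT2TokenizerFast

    tokenizer = GPT2TokenizerFast.from_pretrained("gpt2")  # smallest released variant
    model = GPT2LMHeadModel.from_pretrained("gpt2")
    model.eval()

    prompt = "The transformer architecture replaced recurrence with"
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids

    with torch.no_grad():
        for _ in range(20):                   # grow the sequence by 20 tokens
            logits = model(input_ids).logits  # (1, seq_len, vocab_size)
            next_id = logits[0, -1].argmax()  # greedy: most probable next token
            input_ids = torch.cat([input_ids, next_id.view(1, 1)], dim=1)

    print(tokenizer.decode(input_ids[0]))

Every capability listed above reduces to repetitions of this single step: the model predicts the next token, the token is appended, and the loop continues. In practice, sampling strategies such as top-k or nucleus sampling usually replace the greedy argmax, which mitigates the repetition noted above.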

Like its predecessor GPT-1 and its successors GPT-3 and GPT-4, GPT-2 is a generative pre-trained transformer: a deep neural network built on the transformer architecture,[8] which uses attention in place of older recurrence- and convolution-based designs.[12][13] Attention mechanisms allow the model to focus selectively on the segments of input text it predicts to be most relevant.[14][15] This architecture permits far greater parallelization than RNN-, CNN-, or LSTM-based models and outperformed earlier benchmarks set by them.[8]
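
As an illustration of the mechanism, the scaled dot-product attention at the heart of the transformer can be written in a few lines. The sketch below implements softmax(QK^T / sqrt(d_k))V with a causal mask; the toy dimensions are illustrative assumptions, not GPT-2's actual configuration.

    # Sketch of scaled dot-product attention with a causal mask,
    # the core operation of transformer models such as GPT-2.
    import numpy as np

    def scaled_dot_product_attention(Q, K, V, mask=None):
        d_k = Q.shape[-1]
        scores = Q @ K.swapaxes(-2, -1) / np.sqrt(d_k)  # pairwise similarities
        if mask is not None:
            scores = np.where(mask, scores, -1e9)       # hide future positions
        weights = np.exp(scores - scores.max(-1, keepdims=True))
        weights /= weights.sum(-1, keepdims=True)       # softmax over key positions
        return weights @ V                              # weighted sum of values

    seq_len, d_k = 5, 8
    rng = np.random.default_rng(0)
    Q, K, V = (rng.normal(size=(seq_len, d_k)) for _ in range(3))
    causal = np.tril(np.ones((seq_len, seq_len), dtype=bool))  # no peeking ahead
    out = scaled_dot_product_attention(Q, K, V, causal)
    print(out.shape)  # (5, 8)

Because every output position is produced by the same matrix products, all positions are computed at once rather than one step at a time, which is the source of the parallelization advantage over recurrent models.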

  1. ^ "gpt-2". GitHub. Archived from the original on 11 March 2023. Retrieved 13 March 2023.
  2. ^ a b Radford, Alec; Wu, Jeffrey; Child, Rewon; Luan, David; Amodei, Dario; Sutskever, Ilya (2019). "Language Models are Unsupervised Multitask Learners". OpenAI.
  3. ^ Cite error: The named reference verge2 was invoked but never defined (see the help page).
  4. ^ "GPT-2: 1.5B Release". OpenAI. 5 November 2019.
  5. ^ Cite error: The named reference voxxy2 was invoked but never defined (see the help page).
  6. ^ Cite error: The named reference vb was invoked but never defined (see the help page).
  7. ^ a b "Better Language Models and Their Implications". OpenAI. 14 February 2019.
  8. ^ a b c Radford, Alec; Narasimhan, Karthik; Salimans, Tim; Sutskever, Ilya (2018). "Improving Language Understanding by Generative Pre-Training". OpenAI.
  9. ^ a b Cite error: The named reference badpaper was invoked but never defined (see the help page).
  10. ^ Cite error: The named reference tds2 was invoked but never defined (see the help page).
  11. ^ Cite error: The named reference guardian was invoked but never defined (see the help page).
  12. ^ Vaswani, Ashish; Shazeer, Noam; Parmar, Niki; Uszkoreit, Jakob; Jones, Llion; Gomez, Aidan N.; Kaiser, Łukasz; Polosukhin, Illia (2017). "Attention Is All You Need". Advances in Neural Information Processing Systems. 30.
  13. ^ Olah, Chris; Carter, Shan (2016). "Attention and Augmented Recurrent Neural Networks". Distill.
  14. ^ Bahdanau, Dzmitry; Cho, Kyunghyun; Bengio, Yoshua (2014). "Neural Machine Translation by Jointly Learning to Align and Translate". arXiv:1409.0473.
  15. ^ Luong, Minh-Thang; Pham, Hieu; Manning, Christopher D. (2015). "Effective Approaches to Attention-based Neural Machine Translation". arXiv:1508.04025.
