GPT-2
Generative Pre-trained Transformer 2 (GPT-2) is an open-source artificial intelligence created by OpenAI in February 2019. GPT-2 translates text, answers questions, summarizes passages, and generates text output on a level that, while sometimes indistinguishable from that of humans, can become repetitive or nonsensical when generating long passages. It is a general-purpose learner; it was not specifically trained to do any of these tasks, and its ability to perform them is an extension of its general ability to accurately synthesize the next item in an arbitrary sequence. GPT-2 was created as a "direct scale-up" of OpenAI's 2018 GPT model, with a ten-fold increase in both its parameter count and the size of its training dataset.
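The "synthesize the next item in an arbitrary sequence" ability described above is autoregressive generation: the model repeatedly scores possible next tokens given the context and appends one. As a minimal sketch, the toy bigram table below is a stand-in (an assumption for illustration only) for GPT-2's learned next-token distribution; the real model conditions on the entire preceding context with a transformer, not just the last token.

```python
# Toy sketch of autoregressive next-token generation, the principle
# behind GPT-2's text output. BIGRAM_SCORES is a hypothetical stand-in
# for the model's next-token distribution.
BIGRAM_SCORES = {
    "the": {"cat": 0.6, "dog": 0.4},
    "cat": {"sat": 0.7, "ran": 0.3},
    "sat": {"down": 0.9, "up": 0.1},
    "dog": {"ran": 0.8, "sat": 0.2},
}

def generate(prompt_tokens, max_new_tokens=3):
    """Greedy decoding: repeatedly append the highest-scoring next token."""
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        last = tokens[-1]
        if last not in BIGRAM_SCORES:  # no known continuation: stop early
            break
        scores = BIGRAM_SCORES[last]
        tokens.append(max(scores, key=scores.get))
    return tokens

print(generate(["the"]))  # → ['the', 'cat', 'sat', 'down']
```

Greedy decoding is the simplest strategy; GPT-2 deployments typically sample from the distribution instead (e.g. with temperature or top-k), which is one reason long generations can drift into repetition or nonsense as the abstract notes.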
Wikipage redirect
primaryTopic
has abstract
Generative Pre-trained Transfo ...... h an API offered by Microsoft.
@en
release date
2019-02-14
Link from a Wikipage to an external page
Wikipage page ID
66,045,029
page length (characters) of wiki page
Wikipage revision ID
1,023,343,358
Link from a Wikipage to another Wikipage
caption
GPT-2 completion using the Hug ...... xt from this Wikipedia article
@en
name
Generative Pre-trained Transformer 2
@en
released
2019-02-14
repo
screenshot
@en
website
wikiPageUsesTemplate
subject
comment
Generative Pre-trained Transfo ...... size of its training dataset.
@en
label
GPT-2
@en
sameAs
wasDerivedFrom
homepage
isPrimaryTopicOf
name
Generative Pre-trained Transformer 2 (GPT-2)
@en