Will human beings be superseded by Generative Pre-trained Transformer 3 (GPT-3) in programming?
Abstract
In today’s world of rapid innovation across many fields, the role of Information Technology (IT) has grown substantially. A vast number of independent software engineers and IT companies are working to pave the way for further developments in the IT world. Many recent technological advances have solved dozens of both virtual and real-world problems and have eased the burden on people’s shoulders. A conspicuous example is Generative Pre-trained Transformer 3 (GPT-3), introduced in May 2020 as part of a trend in natural language processing (NLP) toward systems built on pre-trained language representations. This article provides background on GPT-3, its applications today, its strengths and weaknesses, and the future of this remarkable yet unpredictable tool.