2021
DOI: 10.1177/20539517211047734

The great Transformer: Examining the role of large language models in the political economy of AI

Abstract: In recent years, AI research has become more and more computationally demanding. In natural language processing (NLP), this tendency is reflected in the emergence of large language models (LLMs) like GPT-3. These powerful neural network-based models can be used for a range of NLP tasks and their language generation capacities have become so sophisticated that it can be very difficult to distinguish their outputs from human language. LLMs have raised concerns over their demonstrable biases, heavy environmental …

Cited by 78 publications (45 citation statements)
References 53 publications
“…[199]. For training GPT-3, the supercomputer developed for OpenAI is a single system with more than 285,000 CPU cores, 10,000 NVIDIA V100 GPUs and 400 gigabits per second of network connectivity for each GPU server [200]. The Megatron-Turing Natural Language Generation (MT-NLG) model is an AI model with 530 billion parameters [201].…”
Section: Current Status
confidence: 99%
“…Kehlmann's use of CTRL was, one is led to speculate, in no small part a PR campaign by McCann on behalf of CTRL; given the virtually total insignificance of CTRL compared to other large language models today, the collaboration appears to have been not just an artistic but also a business failure. 4 CTRL has 1.6 billion parameters (or 'neurons' in its neural network), while GPT-3 boasts 175 billion; CTRL was trained on 140 gigabytes of text, GPT-3 on 570 gigabytes; PaLM, introduced by Google in April 2022, has 540 billion parameters and was trained on 780 gigabytes of text. For CTRL, see [8]; for GPT-3, see [9]; for PaLM, see [10].…”
Section: Strong and Weak Artistic AI
confidence: 99%
“…In 2021, he turned to an entirely different subject. In the essay …” [Footnote 1: A useful introduction to large language models is [4].]
Section: Introduction
confidence: 99%
“…While in the past, industrial revolutions improved only production technologies, now, under the conditions of the Fourth Industrial Revolution, management technologies are also being improved. AI provides intellectual support for decision making, in particular for investment decisions (Luitse and Denkena, 2021; Som, 2021).…”
Section: Introduction
confidence: 99%