
Generative pre-trained transformers (GPT) are a type of large language model (LLM) and a prominent framework for generative artificial intelligence.

The first GPT was introduced in 2018 by OpenAI. GPT models are artificial neural networks that are based on the transformer architecture, pre-trained on large data sets of unlabelled text, and able to generate novel human-like content. As of 2023, most LLMs have these characteristics and are sometimes referred to broadly as GPTs.


A transformer is a type of neural network that predicts the next word in a sentence from the previous words and their context, assigning each possible next word a probability (a degree of certainty).

For example, a GPT-style model can be asked which words it considers most likely to come next. The sketch below is illustrative rather than part of the original entry: it assumes the Hugging Face `transformers` library and the publicly available `gpt2` checkpoint as a stand-in for any GPT model, with an arbitrary example prompt.
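```python
# Illustrative sketch: next-token prediction with a GPT-style model.
# Assumes the Hugging Face `transformers` library and the public
# "gpt2" checkpoint; any causal language model works the same way.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "The first GPT was introduced in"  # arbitrary example prompt
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    # logits has shape (batch, sequence_length, vocabulary_size); the
    # last position scores every vocabulary token as a continuation.
    logits = model(**inputs).logits

# Softmax converts the raw scores into a probability distribution.
next_token_probs = torch.softmax(logits[0, -1], dim=-1)

# Print the five most probable next tokens and their probabilities.
top = torch.topk(next_token_probs, k=5)
for prob, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode(int(token_id))!r}: {prob.item():.3f}")
```

Running this prints five plausible continuations, each with the probability the model assigns to it, which is the "degree of certainty" described above.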
