GPT: Generative Pre-trained Transformer


GPT, or “Generative Pre-trained Transformer,” is an artificial intelligence language model built on the Transformer neural network architecture.

It is pretrained on extensive sets of textual data and is capable of generating text in a contextual and coherent manner. GPT was developed by OpenAI, and its initial version, GPT-1, was released in June 2018.
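The "generating text in a contextual manner" described above is autoregressive: the model repeatedly predicts the next token from the tokens so far, appends it, and continues. A minimal toy sketch of that loop (with a hand-written bigram table standing in for the actual Transformer; all names here are illustrative, not OpenAI's API):

```python
# Toy sketch of autoregressive generation: predict the next token from
# the context, append it, repeat. A real GPT model computes the next-token
# prediction with a Transformer; here a tiny bigram table stands in for it.
BIGRAMS = {
    "the": "cat",
    "cat": "sat",
    "sat": "on",
    "on": "the",
}

def generate(prompt: str, max_new_tokens: int = 5) -> str:
    tokens = prompt.split()
    for _ in range(max_new_tokens):
        next_token = BIGRAMS.get(tokens[-1])
        if next_token is None:  # no known continuation: stop early
            break
        tokens.append(next_token)
    return " ".join(tokens)

print(generate("the"))  # → "the cat sat on the cat"
```

A real model replaces the lookup table with a learned probability distribution over a large vocabulary and samples from it, but the outer generation loop has the same shape.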

Since then, successive iterations such as GPT-2, GPT-3, and GPT-4 have been released, each improving on the model’s performance and capabilities. GPT is widely used in applications including text generation, machine translation, natural language understanding, and other natural language processing tasks.