Distilled from Wikipedia: Generative Pre-Training (GPT) is a term from natural language processing. A GPT model is a "language model": a tool that uses context to distinguish between words and phrases that sound similar, and a common building block of AI systems. Models in the GPT family share two traits: they are language models based on the transformer architecture, and they are pre-trained in a generative, unsupervised manner, which yields strong performance across a range of language tasks. See also GPT-3.
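To make "pre-trained in a generative, unsupervised manner" concrete, here is a minimal sketch of the training signal such models learn from: predicting the next token in raw, unlabeled text. This toy uses bigram counts purely for illustration; actual GPT models learn this objective with a transformer neural network, and all names below are illustrative, not part of any GPT implementation.

```python
# Toy illustration of the generative, unsupervised objective behind
# GPT-style language models: predict the next token from unlabeled text.
# Real GPT models use a transformer network; this bigram counter only
# demonstrates the training signal, not the architecture.
from collections import Counter, defaultdict

def train_bigram_lm(corpus: str) -> dict:
    """Count next-token frequencies from raw text (no labels required)."""
    tokens = corpus.split()
    counts = defaultdict(Counter)
    for prev, nxt in zip(tokens, tokens[1:]):
        counts[prev][nxt] += 1
    return counts

def next_token_probs(counts: dict, prev: str) -> dict:
    """Turn raw counts into a probability distribution over next tokens."""
    total = sum(counts[prev].values())
    return {tok: c / total for tok, c in counts[prev].items()}

if __name__ == "__main__":
    text = "the cat sat on the mat the cat ate the fish"
    model = train_bigram_lm(text)
    # After "the", the model prefers "cat" (seen twice) over "mat"/"fish":
    # {'cat': 0.5, 'mat': 0.25, 'fish': 0.25}
    print(next_token_probs(model, "the"))
```

The same next-token objective, applied at a vastly larger scale with transformer networks instead of counts, is what the "pre-trained" in GPT refers to.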