
LLM inference is the process of submitting a prompt to a large language model (LLM) and generating a response. During inference, the model applies the patterns and relationships it learned during training to predict an appropriate output, typically one token at a time.
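
As an illustration, the sketch below shows a typical inference call using the Hugging Face `transformers` library; the model name `"gpt2"` and the prompt are placeholder examples, not a specific recommendation.

```python
# Minimal sketch of LLM inference with the Hugging Face `transformers` library.
# "gpt2" is used only as a small illustrative model; any causal LM checkpoint works.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# 1. Tokenize the prompt into input IDs the model understands.
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")

# 2. Inference: the model predicts new tokens one at a time,
#    conditioned on the prompt and the patterns learned during training.
output_ids = model.generate(**inputs, max_new_tokens=20)

# 3. Decode the generated token IDs back into text.
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```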