
FLOPS, or Floating Point Operations per Second, is a measure of computer performance, useful in fields of scientific computing that rely on floating-point calculations.

For AI, it is an important measure, since training complex machine learning models involves billions of such operations. The speed at which these operations are performed can significantly affect the time it takes to train a model. High-performance GPUs and TPUs, designed specifically for these operations, are therefore often used in AI computations. The higher the FLOPS, the faster an AI model can be trained or inference can be run.
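As a rough illustration (not part of the original glossary entry), achieved FLOPS can be estimated by timing a known workload. A dense n×n matrix multiplication performs about 2n³ floating-point operations, so dividing that count by the elapsed time gives an approximate throughput. A minimal sketch using NumPy:

```python
import time
import numpy as np

# A dense n x n matrix multiply performs roughly 2 * n^3
# floating-point operations (n multiplies and ~n adds per
# output element).
n = 1024
a = np.random.rand(n, n)
b = np.random.rand(n, n)

a @ b  # warm-up run, so one-time setup costs are not timed

start = time.perf_counter()
a @ b
elapsed = time.perf_counter() - start

flops = 2 * n**3 / elapsed
print(f"~{flops / 1e9:.1f} GFLOPS achieved")
```

The result measures what this particular CPU and BLAS library actually achieve on this workload, which is typically well below the hardware's theoretical peak FLOPS quoted in specifications.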
