
Algorithm

Fundamentals

A finite sequence of well-defined instructions for solving a problem or performing a computation.

An algorithm is a step-by-step procedure that takes an input, processes it through a defined sequence of operations, and produces an output. Algorithms are the foundation of all computing — every program, model, and system is ultimately an implementation of one or more algorithms.
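The input-process-output structure can be seen in a classic example, Euclid's algorithm for the greatest common divisor (a minimal sketch; the function name is ours):

```python
def gcd(a: int, b: int) -> int:
    # Euclid's algorithm: repeatedly replace (a, b) with (b, a % b).
    # When the remainder reaches zero, a holds the greatest common divisor.
    while b:
        a, b = b, a % b
    return a

print(gcd(48, 18))  # 6
```

Each step is well-defined, the sequence is finite (the remainder strictly decreases), and the procedure maps any valid input pair to a single output.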

In machine learning, the term usually refers to the learning algorithm: the method by which a model updates its parameters from data. Gradient descent, backpropagation, and k-means clustering are all algorithms. The choice of algorithm determines how a model learns, what tradeoffs it makes between speed and accuracy, and what kinds of patterns it can capture.
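As an illustration of a learning algorithm, here is a minimal sketch of gradient descent minimizing a one-dimensional function f(x) = (x − 3)², assuming a hand-written gradient; real implementations operate on high-dimensional parameters and computed gradients:

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    # Repeatedly step against the gradient to reduce the objective.
    # lr (learning rate) controls the size of each update.
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# For f(x) = (x - 3)^2, the gradient is 2 * (x - 3); the minimum is at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

The update rule is the entire algorithm; everything else (the model, the loss, the data) only determines what gradient gets fed into it.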

Algorithms are evaluated on correctness, time complexity (how runtime scales with input size), and space complexity (how memory usage scales). In practice, an algorithm that is theoretically optimal but hard to implement efficiently on real hardware loses to one that is "good enough" and runs fast. This tradeoff between theoretical elegance and practical performance is a recurring theme across computer science and AI.
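The complexity tradeoff shows up even in a task as simple as searching a list. A sketch contrasting two approaches (function names are ours):

```python
import bisect

def linear_search(items, target):
    # O(n) time: may examine every element; works on any list.
    for i, x in enumerate(items):
        if x == target:
            return i
    return -1

def binary_search(sorted_items, target):
    # O(log n) time: halves the search range each step,
    # but requires the list to be sorted in advance.
    i = bisect.bisect_left(sorted_items, target)
    if i < len(sorted_items) and sorted_items[i] == target:
        return i
    return -1
```

Binary search is asymptotically faster, yet for small or unsorted inputs the "slower" linear scan is often the better practical choice, since sorting first costs O(n log n).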

Last updated: March 7, 2026