Your definition can be sharpened up a bit.
Some conventional definitions of an algorithm include:
- An algorithm is a process the execution of which is clearly specified to the smallest details.
- An algorithm is a sequence of unambiguous instructions of finite length whose execution is deterministic and concludes after finitely many steps.
- An algorithm is any sequence of operations that can be simulated by a Turing-complete system.
An algorithm is generally assumed to be deterministic, terminating, and non-quantum unless otherwise stated.
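To make the conventional picture concrete, here is Euclid's gcd procedure (my illustration, not taken from any of the sources below): it is a finite text, every step is unambiguous, execution is deterministic, and it provably terminates.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a finite, unambiguous list of instructions.

    Each iteration strictly decreases the nonnegative remainder b,
    so the loop terminates after finitely many steps, and the same
    inputs always trace the same sequence of states.
    """
    a, b = abs(a), abs(b)
    while b:
        a, b = b, a % b
    return a
```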
Here are some selections from the first few pages of Hans Hermes' book
Enumerability - Decidability - Computability (from which I learned recursive function theory, Turing machines, etc.):
"An algorithm is a general procedure such that for any appropriate question the answer can be obtained by the use of a simple calculation according to a specified method."
"In this book we shall understand by a general procedure a process the execution of which is clearly specified to the smallest details. Among other things this means that we must be able to express the instructions for the execution of the process in finitely long text.
"There is no room left for the practice of the creative imagination of the executer. He has to work slavishly according to the instructions given to him, which determine everything to the smallest detail."
"In this book we want to adopt the convention of calling procedures general procedures only if the way of proceeding is completely unambiguous."
"There are terminating algorithms, whereas other algorithms can be continued as long as we like [e.g. calculating the square root]."
The text in [brackets] was added by Complexity.
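Hermes' square-root example can be sketched as a procedure that is perfectly well specified yet never halts on its own. The sketch below is mine, using Newton's iteration rather than whatever digit scheme Hermes had in mind; it yields ever-better approximations for as long as we care to continue:

```python
from fractions import Fraction
from itertools import islice

def sqrt_approximations(n: int):
    """Newton's iteration for sqrt(n): a clearly specified procedure
    that can be 'continued as long as we like' and never terminates
    by itself."""
    x = Fraction(max(n, 1))
    while True:
        yield x
        x = (x + Fraction(n, 1) / x) / 2

# The caller decides when to stop, e.g. take the first 5 approximations:
for approx in islice(sqrt_approximations(2), 5):
    print(float(approx))
```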
From the Wikipedia article on 'algorithm':
In mathematics and computer science, an algorithm /ˈælɡərɪðəm/ (from Algoritmi, the Latin form of Al-Khwārizmī) is a step-by-step procedure for calculations. More precisely, it is an effective method expressed as a finite list[1] of well-defined instructions[2] for calculating a function.[3] Algorithms are used for calculation, data processing, and automated reasoning.

Starting from an initial state and initial input (perhaps empty),[4] the instructions describe a computation that, when executed, will proceed through a finite[5] number of well-defined successive states, eventually producing "output"[6] and terminating at a final ending state. The transition from one state to the next is not necessarily deterministic; some algorithms, known as randomized algorithms, incorporate random input.[7]
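As an illustration of that last sentence (my sketch, not from the article): randomized quickselect consults a random number generator, so the sequence of intermediate states varies between runs even though the answer and the guarantee of termination do not.

```python
import random

def quickselect(xs: list, k: int):
    """Return the k-th smallest element (0-based) of xs.

    The pivot is chosen at random, so the state transitions are not
    deterministic, yet the procedure always terminates with the
    correct answer: a randomized (Las Vegas) algorithm.
    """
    pivot = random.choice(xs)
    lo = [x for x in xs if x < pivot]
    eq = [x for x in xs if x == pivot]
    hi = [x for x in xs if x > pivot]
    if k < len(lo):
        return quickselect(lo, k)
    if k < len(lo) + len(eq):
        return pivot
    return quickselect(hi, k - len(lo) - len(eq))

print(quickselect([7, 2, 9, 4, 1], 2))  # prints 4, the third smallest
```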
While there is no generally accepted formal definition of "algorithm," an informal definition could be "a set of rules that precisely defines a sequence of operations."[11] For some people, a program is only an algorithm if it stops eventually; for others, a program is only an algorithm if it stops before a given number of calculation steps.[12]
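That disagreement is not academic: for some procedures nobody knows whether they halt on every input. The Collatz iteration below (my example) is unambiguous and finite as a text, but whether the unbounded loop terminates for all n is an open problem, so whether it counts as an "algorithm" depends on which definition you adopt; adding a step budget makes it one under the stricter reading.

```python
def collatz_steps(n: int, max_steps: int = 10_000):
    """Count the Collatz steps from n down to 1.

    Without max_steps, nobody has proved the loop halts for every n,
    so its status as an 'algorithm' is definition-dependent; with the
    step budget it trivially terminates, satisfying the stricter
    'stops before a given number of steps' reading.
    """
    steps = 0
    while n != 1:
        if steps >= max_steps:
            return None  # budget exhausted; halting status unresolved
        n = 3 * n + 1 if n % 2 else n // 2
        steps += 1
    return steps
```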