Sunday, July 12, 2009

Niklaus Wirth: On recursive algorithms

I have started reading the computer science classic "ALGORITHMS + DATA STRUCTURES = PROGRAMS" by Niklaus Wirth. Dr. Wirth wrote this text in 1975. It's a great book.

Though "recursive algorithms" are widely known to computer science community since long time, I still could find some good advice in Dr. Wirth's book on usage of recursive algorithms.

Dr. Wirth mentions:
"An object is said to be recursive if it partially consists or is defined in terms of itself. Recursion is a particularly powerful means in mathematical definitions. The power of recursion evidently lies in the possibility of defining an infinite set of objects by a finite statement. In the same manner, an infinite number of computations can be described by a finite recursive program, even if this program contains no explicit repetitions."

We all know what recursive algorithms are; recursion is a widely known programming technique. But I found the advice on "when not to use recursion" in Dr. Wirth's book particularly worthwhile to apply.

Dr. Wirth further mentions:
"Recursive algorithms are particularly appropriate when the underlying problem or the data to be treated are defined in recursive terms. This does not mean, however, that such recursive definitions guarantee that a recursive algorithm is the best way to solve the problem.

Programs in which the use of algorithmic recursion is to be avoided can be characterized by a schema which exhibits the pattern of their composition. Such a schema can be described as follows:

[1]
P => if B then (S; P)

or, equivalently

P => (S; if B then P)
"

Dr. Wirth illustrates this principle with the well-known recursive definition of the factorial computation (shown below):

F(0) = 1
F(i+1) = (i + 1) * F(i)
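
Transcribed directly into code (my own sketch, not the book's), the recursive definition reads:

program FactRec;

function F(i: integer): integer;
begin
  if i = 0 then
    F := 1                          { F(0) = 1 }
  else
    F := i * F(i - 1)               { F(i) = i * F(i - 1), i.e. the definition above }
end;

begin
  writeln(F(5))                     { prints 120 }
end.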

Dr. Wirth maps the factorial problem onto the recursive anti-pattern he defines in [1] above:

[2]
P => if I < n then (I := I + 1; F := I * F; P)
I := 0; F := 1; P

In definition [2] above, S (from schema [1]) corresponds to
I := I + 1; F := I * F
and B corresponds to the test I < n.
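
Written out as a runnable sketch (again my own rendering, not the book's code), the recursive version of [2] looks like this:

program FactSchema;
var
  I, F, n: integer;

procedure P;                        { P => if I < n then (I := I + 1; F := I * F; P) }
begin
  if I < n then
  begin
    I := I + 1;
    F := I * F;
    P                               { the only recursive use of P is the final call }
  end
end;

begin
  n := 5;
  I := 0; F := 1;                   { initialisation from [2] }
  P;
  writeln(F)                        { prints 120 }
end.

Notice that P takes no parameters and returns no result; all the work happens on the global variables I and F, and the recursive call merely replays the loop.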

In the book, Dr. Wirth gives the following iterative formulation of the factorial computation:

I := 0;
F := 1;
while I < n do
begin I := I + 1; F := I * F
end
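
For completeness, here is that same loop wrapped into a complete program of my own (with n fixed to 5 purely for illustration), so it can actually be compiled and run:

program FactIter;
var
  I, F, n: integer;
begin
  n := 5;
  I := 0; F := 1;
  while I < n do
  begin
    I := I + 1; F := I * F
  end;
  writeln(F)                        { prints 120 }
end.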


Dr. Wirth says, "The lesson to draw is to avoid the use of recursion when there is an obvious solution by iteration.
This, however, should not lead to shying away from recursion at any price. The fact that implementations of recursive procedures on essentially non-recursive machines exist proves that for practical purposes every recursive program can be transformed into a purely iterative one. This, however, involves the explicit handling of a recursion stack, and these operations will often obscure the essence of a program to such an extent that it becomes most difficult to comprehend. The lesson is that algorithms which by their nature are recursive rather than iterative should be formulated as recursive procedures."
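
To see why the "explicit handling of a recursion stack" obscures a program, here is a deliberately stack-based version of the factorial, a sketch of my own and not from the book. It computes the same value as the short loop above, but the intent is much harder to read off:

program FactStack;
var
  stack: array[1..100] of integer;
  top, n, F: integer;
begin
  n := 5;

  { "calling" phase: push the pending multipliers n, n - 1, ..., 2 }
  top := 0;
  while n > 1 do
  begin
    top := top + 1;
    stack[top] := n;
    n := n - 1
  end;

  { "returning" phase: pop the stack, multiplying as we unwind }
  F := 1;
  while top > 0 do
  begin
    F := F * stack[top];
    top := top - 1
  end;

  writeln(F)                        { prints 120 }
end.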

I just thought of sharing a bit of text from this nice book, and of encouraging readers to read it!
