Traversing a tree is a systematic method for accessing or "visiting" each node. We will see and analyze three tree traversal algorithms: preorder, postorder, and inorder. They differ in when we visit an internal node relative to its children: in preorder we visit the node first, in postorder we visit it last, and in inorder, which is defined only for binary trees, we visit the node between visiting the left and right children.
Recursion will be a very big deal in traversing trees!!
On the right are three trees. The left one has just a root, the right one has a root with one leaf as a child, and the middle one has six nodes. For each node, the element in that node is shown inside the box. All three roots are labeled, and two other nodes are also labeled; that is, we give a name to the position, e.g., the root of the leftmost tree is position v, and we write the name of the position under the box. We call the left tree T0 to remind us that it has height zero; similarly, the middle tree is T2 (height two) and the right tree is T1 (height one).
Our goal in this motivation is to calculate the sum of the elements in all the nodes of each tree. The answers are, from left to right, 8, 28, and 9.
For a start, let's write an algorithm called treeSum0 that calculates the sum for trees of height zero. In fact, the algorithm will take two parameters, the tree and a node (position) in that tree, and it will calculate the sum in the subtree rooted at the given position, assuming the position is at height 0. Note this is trivial: since the node has height zero, it has no children and the desired sum is simply the element in this node. So legal invocations would include treeSum0(T0,v) and treeSum0(T2,t). Illegal invocations would include treeSum0(T0,t) and treeSum0(T1,r).
Algorithm treeSum0(T,v)
    Inputs: T a tree; v a height 0 node of T
    Output: the sum of the elements of the subtree rooted at v

    Sum ← v.element()
    return Sum
Now let's write treeSum1(T,v), which calculates the sum for a node at height 1. It will use treeSum0 to calculate the sum for each child.
Algorithm treeSum1(T,v)
    Inputs: T a tree; v a height 1 node of T
    Output: the sum of the elements of the subtree rooted at v

    Sum ← v.element()
    for each child c of v
        Sum ← Sum + treeSum0(T,c)
    return Sum
OK. How about height 2?
Algorithm treeSum2(T,v)
    Inputs: T a tree; v a height 2 node of T
    Output: the sum of the elements of the subtree rooted at v

    Sum ← v.element()
    for each child c of v
        Sum ← Sum + treeSum1(T,c)
    return Sum
So all we have to do is to write treeSum3, treeSum4, ..., where treeSum3 invokes treeSum2, treeSum4 invokes treeSum3, and so on.
That would be, literally, an infinite amount of work.
Do a diff of treeSum1 and treeSum2.
What do you find are the differences?
In the Algorithm line and in the Inputs line, a 1 becomes a 2.
In the subroutine call a 0 becomes a 1.
Why can't we write treeSumI and let I vary?
Because it is illegal to have a varying name for an algorithm.
The solution is to make the I a parameter and write
Algorithm treeSum(i,T,v)
    Inputs: i≥0; T a tree; v a height i node of T
    Output: the sum of the elements of the subtree rooted at v

    Sum ← v.element()
    for each child c of v
        Sum ← Sum + treeSum(i-1,T,c)
    return Sum
This is wrong, why?
Because treeSum(0,T,v) invokes treeSum(-1,T,c), which doesn't exist because i<0.
But treeSum(0,T,v) doesn't have to call anything since v can't have any children (the height of v is 0). So we get
Algorithm treeSum(i,T,v)
    Inputs: i≥0; T a tree; v a height i node of T
    Output: the sum of the elements of the subtree rooted at v

    Sum ← v.element()
    if i>0 then
        for each child c of v
            Sum ← Sum + treeSum(i-1,T,c)
    return Sum
The last two algorithms are recursive; they call themselves. Note that when treeSum(3,T,v) calls treeSum(2,T,c), the new treeSum has new variables Sum and c.
We are pretty happy with our treeSum routine, but ...
The algorithm is wrong! Why?
The children of a height i node need not all be of height i-1.
For example, s has height 2, but its left child w has height 0.
(A corresponding error also existed in treeSum2(T,v).)
But the only real use we are making of i is to prevent us from recursing when we are at a leaf (the i>0 test). We can use isInternal instead, giving our final algorithm.
Algorithm treeSum(T,v)
    Inputs: T a tree; v a node of T
    Output: the sum of the elements of the subtree rooted at v

    Sum ← v.element()
    if T.isInternal(v) then
        for each child c of v
            Sum ← Sum + treeSum(T,c)
    return Sum
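To see the recursion run, here is a minimal Python sketch of the final treeSum. It is not the book's Tree ADT: each node is a small Node object holding an element and a (possibly empty) list of children, and an empty child list plays the role of the isInternal test. The example tree and its elements are made up for illustration.

class Node:
    def __init__(self, element, children=None):
        self.element = element
        self.children = children if children is not None else []

def tree_sum(v):
    total = v.element              # Sum <- v.element()
    for c in v.children:           # loop body is skipped at a leaf
        total += tree_sum(c)       # Sum <- Sum + treeSum(T,c)
    return total

# Hypothetical height-2 tree, elements chosen arbitrarily:
t = Node(4, [Node(3, [Node(2), Node(5)]), Node(6)])
print(tree_sum(t))                 # prints 20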
Our medium term goal is to learn about tree traversals (how to "visit" each node of a tree once) and to analyze their complexity.
Our complexity analysis will proceed in a somewhat unusual order. Instead of starting with the bottom or lowest level routines (the tree methods in 2.3.1, e.g., isInternal(v)) or the top level routines (the traversals themselves), we will begin by analyzing some middle level procedures assuming the complexities of the low level are as we assert them to be. Then we will analyze the traversals using the middle level routines, and finally we will give data structures for trees that achieve our assumed complexity for the low level.
Let's begin!
These assumptions will be verified later.
Definitions of depth and height. The depth of a node v is 0 if v is the root, and otherwise it is 1 plus the depth of v's parent. The height of a node v is 0 if v is a leaf, and otherwise it is 1 plus the maximum height of a child of v; the height of a tree is the height of its root.
Remark: Even our definitions are recursive!
From the recursive definition of depth, the recursive algorithm for its computation essentially writes itself.
Algorithm depth(T,v)
    if T.isRoot(v) then
        return 0
    else
        return 1 + depth(T,T.parent(v))
The complexity is Θ(the answer), i.e., Θ(d_v), where d_v is the depth of v in the tree T.
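A corresponding Python sketch of depth. The Node class with an explicit parent pointer is an assumed stand-in for the Tree ADT's T.parent(v) and T.isRoot(v); the example nodes are made up.

class Node:
    def __init__(self, element, parent=None):
        self.element = element
        self.parent = parent       # None marks the root (plays the role of isRoot)

def depth(v):
    if v.parent is None:           # isRoot(v)
        return 0
    return 1 + depth(v.parent)     # 1 + depth of the parent

root = Node("r")
child = Node("c", parent=root)
grandchild = Node("g", parent=child)
print(depth(grandchild))           # prints 2; the work done is Theta(2)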
The following algorithm computes the height of a position in a tree.
Algorithm height(T,v):
    if T.isLeaf(v) then
        return 0
    else
        h ← 0
        for each w in T.children(v) do
            h ← max(h, height(T,w))
        return h+1
Remarks on the above algorithm
Algorithm height(T)
    return height(T,T.root())
Let's use the "official" iterator style.
Algorithm height(T,v):
    if T.isLeaf(v) then
        return 0
    else
        h ← 0
        childrenOfV ← T.children(v)    // "official" iterator style
        while childrenOfV.hasNext()
            h ← max(h, height(T,childrenOfV.nextObject()))
        return h+1
But the children iterator is defined to return the empty set for a leaf, so we don't need the special case, provided we fold the +1 into the recursive call inside the loop (otherwise a leaf would incorrectly come out with height 1).

Algorithm height(T,v):
    h ← 0
    childrenOfV ← T.children(v)    // "official" iterator style
    while childrenOfV.hasNext()
        h ← max(h, 1 + height(T,childrenOfV.nextObject()))
    return h
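The same algorithm as a short Python sketch; an ordinary child list stands in for the children iterator, so a leaf's empty list automatically gives height 0. The Node class and the example tree are assumptions for illustration.

class Node:
    def __init__(self, element, children=None):
        self.element = element
        self.children = children if children is not None else []

def height(v):
    h = 0
    for w in v.children:           # empty at a leaf, so a leaf returns 0
        h = max(h, 1 + height(w))
    return h

t = Node("a", [Node("b"), Node("c", [Node("d")])])
print(height(t))                   # prints 2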
Theorem: Let T be a tree with n nodes and let c_v be the number of children of node v. The sum of c_v over all nodes of the tree is n-1.
Proof:
This is trivial! ... once you figure out what it is saying.
The sum gives the total number of children in the tree. But every node except one is a child of exactly one node, so the sum counts almost all of the n nodes. Indeed, there is just one exception.
What is the exception?
The root.
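If you like, here is a quick Python sanity check of the theorem on a small, made-up tree, using the same assumed node representation as the sketches above.

class Node:
    def __init__(self, element, children=None):
        self.element = element
        self.children = children if children is not None else []

def all_nodes(v):
    # generate every node in the subtree rooted at v
    yield v
    for c in v.children:
        yield from all_nodes(c)

t = Node(1, [Node(2, [Node(3), Node(4)]), Node(5)])
nodes = list(all_nodes(t))
n = len(nodes)
assert sum(len(v.children) for v in nodes) == n - 1   # 4 == 5 - 1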
Corollary: Computing the height of an n-node tree has time complexity Θ(n).
Proof: Look at the code of the first version.
To be more formal, we should look at the "official" iterator version. The only real difference is that in the official version, we are charged for creating the iterator. But the charge is the number of elements in the iterator, i.e., the number of children this node has. So the sum of all the charges for creating iterators will be the sum of the number of children each node has, which is the total number of children, which is n-1, which is (another) Θ(n) and hence doesn't change the final answer.
Do a few on the board. As mentioned above, becoming facile with recursion is vital for tree analyses.
Definition: A traversal is a systematic way of "visiting" every node in a tree.
Visit the root and then recursively traverse each child. More formally we first give the procedure for a preorder traversal starting at any node and then define a preorder traversal of the entire tree as a preorder traversal of the root.
Algorithm preorder(T,v):
    visit node v
    for each child c of v
        preorder(T,c)

Algorithm preorder(T):
    preorder(T,T.root())
Remarks:
Do a few on the board. As mentioned above, becoming facile with recursion is vital for tree analyses.
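A minimal Python sketch of preorder, again on the assumed Node class with explicit child lists; "visiting" a node here just prints its element. The example tree is made up.

class Node:
    def __init__(self, element, children=None):
        self.element = element
        self.children = children if children is not None else []

def preorder(v, visit=print):
    visit(v.element)           # visit the node first ...
    for c in v.children:
        preorder(c, visit)     # ... then recursively traverse each child

t = Node("A", [Node("B", [Node("D")]), Node("C")])
preorder(t)                    # prints A B D C, one element per line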
Theorem: Preorder traversal of a tree with n nodes has complexity Θ(n).
Proof:
Just like height.
The nonrecursive part of each invocation takes O(1 + c_v) time.
There are n invocations and the sum of the c_v's is n-1, so the total work is Θ(n).
Homework: R-2.3
First recursively traverse each child, then visit the root. More formally,
Algorithm postorder(T,v):
    for each child c of v
        postorder(T,c)
    visit node v

Algorithm postorder(T):
    postorder(T,T.root())
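The corresponding Python sketch of postorder (same assumed Node representation; only the position of the visit moves).

class Node:
    def __init__(self, element, children=None):
        self.element = element
        self.children = children if children is not None else []

def postorder(v, visit=print):
    for c in v.children:
        postorder(c, visit)    # recursively traverse each child first ...
    visit(v.element)           # ... and only then visit the node

t = Node("A", [Node("B", [Node("D")]), Node("C")])
postorder(t)                   # prints D B C A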
Theorem: Postorder traversal of a tree with n nodes has complexity Θ(n).
Proof: The same as for preorder.
Remarks:
Problem Set 2, Problem 1. Note that the height of a tree is the depth of a deepest node. Extend the height algorithm so that it returns, in addition to the height, v.element() for some node v of maximal depth. Note that the height algorithm is for an arbitrary (not necessarily binary) tree; your extension should also work for arbitrary trees (this is *not* harder).
Recall that a binary tree is an ordered tree in which no node has more than two children. The left child is ordered before the right child.
The book adopts the convention that, unless otherwise mentioned, the term "binary tree" will mean "proper binary tree", i.e., all internal nodes have two children. This is a little convenient, but not a big deal. If you instead permitted non-proper binary trees, you would test whether a left child existed before traversing it (and similarly for the right child).
We will do binary preorder (first visit the node, then the left subtree, then the right subtree), binary postorder (left subtree, right subtree, node), and then inorder (left subtree, node, right subtree).
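As a preview, here is a minimal Python sketch of inorder on a binary tree. The BNode class with left/right fields is an assumption for illustration, not the book's BinaryTree ADT; testing whether a child exists before recursing is exactly the non-proper-tree convention mentioned above.

class BNode:
    def __init__(self, element, left=None, right=None):
        self.element = element
        self.left = left
        self.right = right

def inorder(v, visit=print):
    if v.left is not None:
        inorder(v.left, visit)     # left subtree first ...
    visit(v.element)               # ... then the node itself ...
    if v.right is not None:
        inorder(v.right, visit)    # ... then the right subtree

t = BNode(2, BNode(1), BNode(4, BNode(3), BNode(5)))
inorder(t)                         # prints 1 2 3 4 5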
We have three (accessor) methods in addition to the general tree methods.