
DSA AVL Trees

The AVL Tree is a type of Binary Search Tree named after its two Soviet inventors, Georgy Adelson-Velsky and Evgenii Landis, who invented it in 1962.

Because AVL Trees self-balance, the tree height is kept to a minimum, which guarantees a fast O(log n) time complexity for searching, inserting, and deleting nodes.

AVL Trees

An AVL Tree is identical to a standard Binary Search Tree, except that AVL Trees do additional rotation operations to keep the tree balanced.

A Binary Search Tree is in balance when, for every node, the difference between the heights of the left and right subtrees is less than 2.

By staying in balance, the AVL Tree guarantees a minimum tree height, which allows for very fast search, insert, and delete operations.

==== simulation ====

Even though the two Binary Search Trees above have the same nodes, and produce the same alphabetic in-order traversal, their heights differ significantly because the AVL Tree has balanced itself.

Step through the simulation below to see how an AVL Tree is built, how the balance factors are updated, and how rotation operations are done when required to restore balance.

==== simulation ====

Read on to learn more about how rotation operations are done, how the balance factor is calculated, and how AVL Trees are implemented.

Left and Right Rotations

An AVL Tree can be balanced again by rotating it to the left, right, or a combination of the two directions.

The animation above shows one specific left rotation and one specific right rotation.

However, left and right rotations are typically executed as seen in the video below.

==== simulation ====

Notice how the subtree changes its parent. Subtrees change parent in this way during rotation to keep the correct in-order traversal, and to maintain the BST property that, for every node, the left child is lower than the node and the right child is higher.
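As a minimal sketch of this idea (the small `Node` class and the node names are illustrative assumptions, not the full implementation further down), a left rotation re-parents a subtree without changing the in-order traversal:

```python
class Node:
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right

def left_rotate(x):
    # The right child y becomes the new subtree root;
    # y's old left subtree becomes x's new right subtree.
    y = x.right
    x.right = y.left
    y.left = x
    return y

def in_order(node):
    return [] if node is None else in_order(node.left) + [node.data] + in_order(node.right)

# An unbalanced chain: A -> B -> C, all right children.
root = Node('A', right=Node('B', right=Node('C')))
before = in_order(root)
root = left_rotate(root)  # B becomes the new root, A becomes B's left child
after = in_order(root)
print(before == after)    # True: the in-order traversal is preserved
```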

Also keep in mind that it is not always the root node that becomes unbalanced and needs rotation.

The Balance Factor

A node's balance factor is the difference between its subtree heights.

The AVL Tree stores the subtree height at each node, and the balance factor is calculated from the subtree heights to check whether the tree has become out of balance.

The height of a subtree is the number of edges between the root node of the subtree and the leaf node farthest down in that subtree.

The Balance Factor (BF) for a node (X) is the difference in height between its right and left subtrees.

BF(X) = height(rightSubtree(X)) − height(leftSubtree(X))

Balance factor values:

  • 0: The node is in balance.
  • More than 0: The node is “right heavy”.
  • Less than 0: The node is “left heavy”.

If the balance factor is less than -1, or more than 1, for one or more nodes in the tree, the tree is considered not in balance, and a rotation operation is needed to restore balance.
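These definitions can be sketched in a few lines (a minimal sketch: the `Node` class and helper names are assumptions for illustration, and the heights are recomputed rather than stored on the nodes as the full implementation further down does):

```python
class Node:
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right

def height(node):
    # Height counted in edges: an empty subtree is -1, a leaf node is 0.
    if node is None:
        return -1
    return 1 + max(height(node.left), height(node.right))

def balance_factor(node):
    # BF(X) = height(rightSubtree(X)) - height(leftSubtree(X))
    return height(node.right) - height(node.left)

# A left-heavy chain: C has left child B, which has left child A.
root = Node('C', left=Node('B', left=Node('A')))
print(height(root))          # 2
print(balance_factor(root))  # -2: less than -1, so a rotation is needed
```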

Let's take a look at the different rotation operations that an AVL Tree can do to regain balance.

The Four "out-of-balance" Cases

When the balance factor of one or more nodes is less than -1, or more than 1, the tree is regarded as out of balance, and a rotation is needed to restore balance. There are four different ways an AVL Tree can be out of balance, and each of these cases requires a different rotation operation.

Case             | Description                                                                 | Rotation to Restore Balance
Left-Left (LL)   | The unbalanced node and its left child node are both left heavy.            | A single right rotation.
Right-Right (RR) | The unbalanced node and its right child node are both right heavy.          | A single left rotation.
Left-Right (LR)  | The unbalanced node is left heavy, and its left child node is right heavy.  | First do a left rotation on the left child node, then do a right rotation on the unbalanced node.
Right-Left (RL)  | The unbalanced node is right heavy, and its right child node is left heavy. | First do a right rotation on the right child node, then do a left rotation on the unbalanced node.

See animations and explanations of these cases below.
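Taken together, the four cases can be sketched as one rebalancing step. This is an illustrative sketch (the `rebalance` name and the minimal `Node` class are assumptions, not the full implementation further down), using the BF = height(right) − height(left) convention from the text:

```python
class Node:
    def __init__(self, data, left=None, right=None):
        self.data = data
        self.left = left
        self.right = right

def height(node):
    # Height in edges: an empty subtree is -1, a leaf is 0.
    return -1 if node is None else 1 + max(height(node.left), height(node.right))

def bf(node):
    # BF(X) = height(rightSubtree(X)) - height(leftSubtree(X))
    return height(node.right) - height(node.left)

def right_rotate(y):
    x = y.left
    y.left, x.right = x.right, y
    return x

def left_rotate(x):
    y = x.right
    x.right, y.left = y.left, x
    return y

def rebalance(node):
    if bf(node) < -1:                      # node is left heavy
        if bf(node.left) > 0:              # Left-Right: rotate the left child left first
            node.left = left_rotate(node.left)
        return right_rotate(node)          # Left-Left (or finishing Left-Right)
    if bf(node) > 1:                       # node is right heavy
        if bf(node.right) < 0:             # Right-Left: rotate the right child right first
            node.right = right_rotate(node.right)
        return left_rotate(node)           # Right-Right (or finishing Right-Left)
    return node                            # already in balance

# Left-Right example: C is left heavy, and its left child A is right heavy.
root = Node('C', left=Node('A', right=Node('B')))
root = rebalance(root)
print(root.data)  # B is the new subtree root
```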

The Left-Left (LL) Case

The Left-Left (LL) case occurs when the unbalanced node is left heavy, and its left child node is also left heavy.

In the LL case, a single right rotation on the unbalanced node is enough to restore balance.

Step through the animation below to see the LL case, and how balance is restored by a single right rotation.

==== simulation ====

Two LL cases occur as you step through the animation above:

1. When D is added, the balance factor of Q becomes -2, meaning that the tree is out of balance. This is an LL case because both the unbalanced node Q and its left child node P are left heavy (they have negative balance factors). A single right rotation at node Q restores the tree balance.

2. After nodes L, C, and B are added, P's balance factor is -2, meaning that the tree is out of balance. This is also an LL case because both the unbalanced node P and its left child node D are left heavy. A single right rotation restores the balance.

Note: The second time the LL case happens in the animation above, a right rotation is done, and L goes from being the right child of D to being the left child of P. Rotations are done like that to keep the correct in-order traversal ('B, C, D, L, P, Q' in the animation above). Another way to see it is that changing parents during rotation maintains the BST property that the left child is always lower than the node, and the right child is always higher.

The Right-Right (RR) Case

The Right-Right (RR) case occurs when a node is unbalanced and right heavy, and its right child node is also right heavy.

In the RR case, a single left rotation at the unbalanced node is enough to restore balance.

==== simulation ====

The RR case occurs twice in the animation above:

1. When node D is inserted, node A becomes unbalanced, and both A and its right child B are right heavy. A left rotation at node A restores the tree balance.

2. After nodes E, C, and F are inserted, node B becomes unbalanced. This is an RR case because both node B and its right child node D are right heavy. A left rotation restores the tree balance.

The Left-Right (LR) Case

The Left-Right (LR) case occurs when the unbalanced node is left heavy, but its left child node is right heavy.

In the LR case, a left rotation is first done on the left child node, and then a right rotation is done on the original unbalanced node.

Step through the animation below to see how the Left-Right case can occur, and how the rotation operations are done to restore balance.

==== simulation ====

As you build the AVL Tree in the animation above, the Left-Right case occurs twice, and rotation operations are required to restore balance:

1. When K is inserted, node Q becomes unbalanced with a balance factor of -2, so it is left heavy, while its left child E is right heavy. This is a Left-Right case.

2. After nodes C, F, and G are inserted, node K becomes unbalanced and left heavy, with its left child node E right heavy. This is also a Left-Right case.

The Right-Left (RL) Case

The Right-Left (RL) case occurs when the unbalanced node is right heavy, and its right child node is left heavy.

In this case, we first do a right rotation on the unbalanced node's right child, and then a left rotation on the unbalanced node itself.

Step through the animation below to see how the Right-Left case can occur, and how rotations are done to restore balance.

==== simulation ====

1. After node B is added, node A becomes unbalanced and right heavy, while its right child F is left heavy. This is a Right-Left case. To restore balance, a right rotation is first done on node F, followed by a left rotation on node A.

2. The next Right-Left case occurs after nodes G, E, and D are inserted. B is unbalanced and right heavy, and its right child F is left heavy, so this is a Right-Left case. To restore balance, a right rotation is first done on node F, followed by a left rotation on node B.

Retracing in AVL Trees

After a node is inserted or deleted, an AVL Tree may become unbalanced. To find out whether the tree is unbalanced, we must update the heights and recalculate the balance factors of all ancestor nodes.

This process, known as retracing, is handled through recursion. After an insertion or deletion, the recursive calls propagate back up to the root, updating the height and recalculating the balance factor of each ancestor node. If any ancestor node is found to have a balance factor outside the range -1 to 1, a rotation is done at that node to restore the tree's balance.
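To see the order in which retracing visits nodes, here is a small sketch (a plain BST insert with an added trace list; the node values and the `trace` parameter are illustrative assumptions, not part of the AVL implementation below):

```python
class Node:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None

def insert(node, data, trace):
    if node is None:
        return Node(data)
    if data < node.data:
        node.left = insert(node.left, data, trace)
    else:
        node.right = insert(node.right, data, trace)
    # This line runs on the way back up the recursion: in an AVL Tree, this is
    # where each ancestor's height is updated and its balance factor rechecked.
    trace.append(node.data)
    return node

root = None
for letter in ['C', 'E', 'H', 'G', 'F']:
    root = insert(root, letter, [])

trace = []
root = insert(root, 'D', trace)
print(trace)  # ['E', 'C']: ancestors are visited from the parent up to the root
```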

In the simulation below, after node F is inserted, nodes C, E, and H are all out of balance. Because retracing works through recursion, the imbalance at node H is discovered and fixed first, which in this case also fixes the imbalance in nodes E and C.

==== simulation ====

After inserting node F, the code retraces, calculating balance factors as it propagates back up towards the root node. When node H is reached and its balance factor of -2 is calculated, a right rotation is done. Only after this rotation is complete does the code continue retracing, calculating balance factors on the ancestor nodes E and C.

Because of the rotation, the balance factors of nodes E and C are the same as they were before node F was inserted.

AVL Insert Node Implementation

This code for inserting nodes is based on the BST implementation seen on the preceding page.

The height stored in each node is the only new property compared to the BST, but many new methods and extra code lines are needed in the AVL Tree implementation because of how the AVL Tree rebalances itself.

The implementation below builds an AVL Tree from a list of characters, creating the AVL Tree in the simulation above. The last node to be inserted, 'F', also triggers a right rotation, just like in the simulation above.

Example

Python:

class TreeNode:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None
        self.height = 1

def getHeight(node):
    if not node:
        return 0
    # Note: heights are stored counting nodes (a leaf has height 1),
    # rather than counting edges as in the definition earlier on this page.
    return node.height

def getBalance(node):
    if not node:
        return 0
    # Note: this returns height(left) - height(right), the opposite sign
    # convention of BF(X) above: here a positive value means left heavy.
    return getHeight(node.left) - getHeight(node.right)

def rightRotate(y):
    print('Rotate right on node',y.data)
    x = y.left
    T2 = x.right
    x.right = y
    y.left = T2
    y.height = 1 + max(getHeight(y.left), getHeight(y.right))
    x.height = 1 + max(getHeight(x.left), getHeight(x.right))
    return x

def leftRotate(x):
    print('Rotate left on node',x.data)
    y = x.right
    T2 = y.left
    y.left = x
    x.right = T2
    x.height = 1 + max(getHeight(x.left), getHeight(x.right))
    y.height = 1 + max(getHeight(y.left), getHeight(y.right))
    return y

def insert(node, data):
    if not node:
        return TreeNode(data)

    if data < node.data:
        node.left = insert(node.left, data)
    elif data > node.data:
        node.right = insert(node.right, data)

    # Update the balance factor and balance the tree
    node.height = 1 + max(getHeight(node.left), getHeight(node.right))
    balance = getBalance(node)

    # Balancing the tree
    # Left Left
    if balance > 1 and getBalance(node.left) >= 0:
        return rightRotate(node)

    # Left Right
    if balance > 1 and getBalance(node.left) < 0:
        node.left = leftRotate(node.left)
        return rightRotate(node)

    # Right Right
    if balance < -1 and getBalance(node.right) <= 0:
        return leftRotate(node)

    # Right Left
    if balance < -1 and getBalance(node.right) > 0:
        node.right = rightRotate(node.right)
        return leftRotate(node)

    return node

def inOrderTraversal(node):
    if node is None:
        return
    inOrderTraversal(node.left)
    print(node.data, end=", ")
    inOrderTraversal(node.right)

# Inserting nodes
root = None
letters = ['C', 'B', 'E', 'A', 'D', 'H', 'G', 'F']
for letter in letters:
    root = insert(root, letter)

inOrderTraversal(root)

AVL Delete Node Implementation

When deleting a node that is not a leaf node, the AVL Tree needs the minValueNode() function to find a node's next node in the in-order traversal. This is the same as when deleting a node in a Binary Search Tree, as explained on the previous page.

After deleting a node in an AVL Tree, the same rebalancing code that is used after insertion is run to restore balance.

Example

Python:

def minValueNode(node):
    current = node
    while current.left is not None:
        current = current.left
    return current

def delete(node, data):
    if not node:
        return node

    if data < node.data:
        node.left = delete(node.left, data)
    elif data > node.data:
        node.right = delete(node.right, data)
    else:
        if node.left is None:
            temp = node.right
            node = None
            return temp
        elif node.right is None:
            temp = node.left
            node = None
            return temp

        temp = minValueNode(node.right)
        node.data = temp.data
        node.right = delete(node.right, temp.data)

    if node is None:
        return node

    # Update the balance factor and balance the tree
    node.height = 1 + max(getHeight(node.left), getHeight(node.right))
    balance = getBalance(node)

    # Balancing the tree
    # Left Left
    if balance > 1 and getBalance(node.left) >= 0:
        return rightRotate(node)

    # Left Right
    if balance > 1 and getBalance(node.left) < 0:
        node.left = leftRotate(node.left)
        return rightRotate(node)

    # Right Right
    if balance < -1 and getBalance(node.right) <= 0:
        return leftRotate(node)

    # Right Left
    if balance < -1 and getBalance(node.right) > 0:
        node.right = rightRotate(node.right)
        return leftRotate(node)

    return node

Time Complexity for AVL Trees

Take a look at the unbalanced Binary Search Tree below. Searching for "M" means that all nodes except one must be compared. But searching for "M" in the AVL Tree below only requires us to visit 4 nodes.

In the worst case, search, insert, and delete operations must traverse the whole height of the tree. This means that keeping the height (h) of the tree low, as we do with AVL Trees, gives a shorter runtime.


See below how the time complexities of AVL Trees and Binary Search Trees compare, and how the time complexities depend on the height (h) of the tree and the number of nodes (n) in the tree.

  • The BST is not self-balancing. This means that a BST can be very unbalanced, almost like a long chain, where the height is nearly the same as the number of nodes. This makes operations like searching, deleting and inserting nodes slow, with time complexity O(h)=O(n).
  • The AVL Tree however is self-balancing. That means that the height of the tree is kept to a minimum so that operations like searching, deleting and inserting nodes are much faster, with time complexity O(h)=O(logn).
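To make this difference concrete, here is a sketch (helper names like `avl_insert` are assumptions; the rotation logic mirrors the insert implementation earlier on this page, with heights counted in nodes) that inserts 127 sorted keys, the worst case for a plain BST:

```python
class TreeNode:
    def __init__(self, data):
        self.data = data
        self.left = None
        self.right = None
        self.height = 1  # height counted in nodes, as in the implementation above

def get_height(node):
    return node.height if node else 0

def get_balance(node):
    return get_height(node.left) - get_height(node.right) if node else 0

def right_rotate(y):
    x = y.left
    y.left, x.right = x.right, y
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    return x

def left_rotate(x):
    y = x.right
    x.right, y.left = y.left, x
    x.height = 1 + max(get_height(x.left), get_height(x.right))
    y.height = 1 + max(get_height(y.left), get_height(y.right))
    return y

def avl_insert(node, data):
    if not node:
        return TreeNode(data)
    if data < node.data:
        node.left = avl_insert(node.left, data)
    else:
        node.right = avl_insert(node.right, data)
    node.height = 1 + max(get_height(node.left), get_height(node.right))
    balance = get_balance(node)
    if balance > 1 and get_balance(node.left) >= 0:   # Left-Left
        return right_rotate(node)
    if balance > 1:                                   # Left-Right
        node.left = left_rotate(node.left)
        return right_rotate(node)
    if balance < -1 and get_balance(node.right) <= 0: # Right-Right
        return left_rotate(node)
    if balance < -1:                                  # Right-Left
        node.right = right_rotate(node.right)
        return left_rotate(node)
    return node

# Sorted input would degenerate a plain BST into a chain of height 127.
root = None
for key in range(1, 128):
    root = avl_insert(root, key)
print(root.height)  # 7: the AVL Tree stays at the minimum height
```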

O(log n) Explained

The fact that the time complexity is O(h) = O(log n) for search, insert, and delete on an AVL Tree with height h and n nodes can be explained like this:

Imagine a perfect Binary Tree where all nodes have two child nodes except on the lowest level, like the AVL Tree below.


We know that the time complexity for searching, deleting, or inserting a node in an AVL Tree is O(h), but we want to argue that the time complexity is actually O(log n), so we need to express the height h in terms of the number of nodes n:

n = 2^(h+1) − 1

n + 1 = 2^(h+1)

log2(n + 1) = h + 1

h = log2(n + 1) − 1

O(h) = O(log n)

It may not be obvious how the last line above follows, but for a Binary Tree with many nodes (large n), the "+1" and "-1" terms do not matter when we consider time complexity. For more details on how to calculate time complexity using Big O notation, see this page.
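A quick numeric sanity check of this relationship (an illustrative sketch, assuming the perfect-tree node count n = 2^(h+1) − 1 from above):

```python
import math

# For a perfect binary tree with n = 2^(h+1) - 1 nodes,
# solving for the height gives h = log2(n + 1) - 1.
for h in range(5):
    n = 2 ** (h + 1) - 1          # nodes in a perfect tree of height h
    assert math.log2(n + 1) - 1 == h
    print(f"height {h}: {n} nodes")
```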

The calculation above shows that the time complexity for search, delete, and insert operations on an AVL Tree, O(h), can indeed be written as O(log n), which is fast compared to the O(n) time complexity of an unbalanced BST.
