Explain the rotation operations used in AVL trees and their significance in maintaining balance.

  • Primary and secondary rotations; Primary rotations adjust immediate subtrees, while secondary rotations modify distant subtrees.
  • Simple and complex rotations; Simple rotations involve basic adjustments, while complex rotations involve intricate reconfigurations.
  • Single and double rotations; Single rotations involve left or right rotations, while double rotations involve combinations of single rotations.
  • Triple and quadruple rotations; Triple rotations involve three consecutive rotations, while quadruple rotations involve four rotations simultaneously.
The rotation operations used in AVL trees are single and double rotations. Single rotations (left and right) restore balance by adjusting the heights of subtrees when the imbalance is in a straight line (left-left or right-right). Double rotations (left-right and right-left) are combinations of two single rotations, used when a single rotation alone cannot restore balance.
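The single and double rotations described above can be sketched as follows (a minimal illustration with a hypothetical `Node` class; height bookkeeping and balance-factor checks are omitted for brevity):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def rotate_right(y):
    """Right rotation: lifts y's left child x into y's place (left-left case)."""
    x = y.left
    y.left = x.right   # x's right subtree becomes y's left subtree
    x.right = y        # y becomes x's right child
    return x           # x is the new subtree root

def rotate_left(x):
    """Left rotation: mirror of rotate_right (right-right case)."""
    y = x.right
    x.right = y.left
    y.left = x
    return y

def rotate_left_right(z):
    """Double rotation (left-right case): left-rotate z.left, then right-rotate z."""
    z.left = rotate_left(z.left)
    return rotate_right(z)
```

For example, a left-leaning chain 3 → 2 → 1 becomes balanced with 2 at the root after a single right rotation.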

Selection sort's time complexity can be improved to _______ by implementing certain optimizations.

  • O(log n)
  • O(n log n)
  • O(n)
  • O(n^2)
Selection sort's time complexity can be improved to O(n log n) by replacing the O(n) linear scan for the minimum with a more efficient data structure such as a binary heap, which finds and removes the minimum in O(log n) time. The resulting algorithm is essentially heapsort.
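As a hedged sketch of this optimization: using Python's `heapq` to perform the repeated "select the minimum" step turns the O(n²) selection sort into an O(n log n) heap-based sort:

```python
import heapq

def heap_selection_sort(items):
    """Selection sort with the minimum scan replaced by a binary heap."""
    heap = list(items)
    heapq.heapify(heap)               # O(n) heap construction
    return [heapq.heappop(heap)       # n pops, each O(log n)
            for _ in range(len(heap))]
```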

Merge sort is a _______ sorting algorithm that follows the _______ strategy.

  • Bubble
  • Divide and Conquer
  • Dynamic Programming
  • Greedy
Merge sort is a sorting algorithm that follows the Divide and Conquer strategy. It recursively divides the array into two halves, sorts each half, and then merges the sorted halves back together.
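The divide-sort-merge structure can be sketched as:

```python
def merge_sort(a):
    if len(a) <= 1:                   # base case: trivially sorted
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])        # divide and sort each half
    right = merge_sort(a[mid:])
    return merge(left, right)         # conquer: merge the sorted halves

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # <= keeps the sort stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])              # one side may have leftovers
    out.extend(right[j:])
    return out
```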

Explain why binary search is more efficient than linear search for large datasets.

  • Binary search always finds the element in the first comparison
  • Binary search can only be used with small datasets
  • Binary search divides the search space in half at each step, reducing the time complexity to O(log n)
  • Linear search has a time complexity of O(n^2)
Binary search is more efficient for large datasets because it divides the search space in half at each step, resulting in a time complexity of O(log n), which is significantly faster than linear search (O(n)).
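The halving of the search space can be sketched as (assumes the input list is already sorted):

```python
def binary_search(sorted_list, target):
    """Return the index of target in sorted_list, or -1 if absent."""
    lo, hi = 0, len(sorted_list) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_list[mid] == target:
            return mid
        if sorted_list[mid] < target:
            lo = mid + 1      # discard the lower half
        else:
            hi = mid - 1      # discard the upper half
    return -1
```

Each iteration halves the remaining range, so at most about log₂(n) comparisons are needed, versus up to n for a linear scan.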

Can LCS be applied to strings of different lengths? Why or why not?

  • No, because it can only be applied to arrays, not strings.
  • No, because it only works on strings of equal lengths.
  • Yes, as long as the algorithm is modified to handle different lengths.
  • Yes, without any modification.
Yes, the longest common subsequence (LCS) algorithm can be applied to strings of different lengths. The standard dynamic programming approach handles this naturally: for strings of lengths m and n, it builds an (m + 1) × (n + 1) table, so no special modification is needed beyond sizing the table to the two input lengths.
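A minimal sketch of the DP table for inputs of different lengths:

```python
def lcs_length(s, t):
    """Length of the longest common subsequence of s and t (lengths may differ)."""
    m, n = len(s), len(t)
    # dp[i][j] = LCS length of s[:i] and t[:j]
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if s[i - 1] == t[j - 1]:
                dp[i][j] = dp[i - 1][j - 1] + 1       # extend a common subsequence
            else:
                dp[i][j] = max(dp[i - 1][j], dp[i][j - 1])  # drop one character
    return dp[m][n]
```

For example, `lcs_length("ABCBDAB", "BDCAB")` is 4 (one LCS is "BCAB"), even though the inputs have lengths 7 and 5.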

In the Fractional Knapsack Problem, items can be divided to fit into the knapsack partially, whereas in the 0/1 Knapsack Problem, items must be chosen _______.

  • Arbitrarily
  • Completely
  • Exponentially
  • Sequentially
In the 0/1 Knapsack Problem, items must be chosen completely, meaning either an item is included in its entirety or not at all. On the other hand, the Fractional Knapsack Problem allows items to be divided and included partially.
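The ability to take fractions is what makes a greedy strategy optimal for the fractional variant; a minimal sketch (items as hypothetical `(value, weight)` pairs):

```python
def fractional_knapsack(items, capacity):
    """Greedy by value-to-weight ratio; fractions of items are allowed."""
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity == 0:
            break
        take = min(weight, capacity)       # take the whole item, or a fraction
        total += value * take / weight
        capacity -= take
    return total
```

With items [(60, 10), (100, 20), (120, 30)] and capacity 50, the greedy takes the first two whole and two-thirds of the third, for a value of 240. The 0/1 variant forbids that last fractional step, which is why it requires dynamic programming instead of a greedy choice.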

The time complexity of radix sort is _______ in most scenarios.

  • O(k * n)
  • O(n * log n)
  • O(n + k)
  • O(n^2)
The time complexity of radix sort is O(k * n), where 'k' is the number of digits (or key components) and 'n' is the number of elements. Since k is typically a small constant for fixed-width keys, radix sort runs in near-linear time and can outperform O(n log n) comparison-based sorts on suitable inputs.
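A minimal sketch of least-significant-digit (LSD) radix sort for non-negative integers in base 10; each of the k digit passes does O(n) work:

```python
def radix_sort(nums):
    """LSD radix sort, base 10, for non-negative integers."""
    if not nums:
        return nums
    exp = 1
    while max(nums) // exp > 0:              # one pass per digit position
        buckets = [[] for _ in range(10)]
        for n in nums:
            buckets[(n // exp) % 10].append(n)   # stable bucketing by digit
        nums = [n for b in buckets for n in b]   # concatenate buckets in order
        exp *= 10
    return nums
```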

How does Dijkstra's algorithm guarantee the shortest path in a graph with non-negative edge weights?

  • Always selects the smallest tentative distance
  • Considers random paths
  • Prioritizes longest paths
  • Utilizes heuristics for optimization
Dijkstra's algorithm guarantees the shortest path by always selecting the smallest tentative distance, ensuring that the chosen path at each step is the most optimal. It relies on a greedy approach and the non-negativity of edge weights to consistently find the shortest paths. Heuristics, random paths, or prioritizing longest paths are not part of Dijkstra's algorithm logic.
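The "always select the smallest tentative distance" step can be sketched with a priority queue (graph represented as a hypothetical adjacency dict of `(neighbor, weight)` pairs):

```python
import heapq

def dijkstra(graph, source):
    """Shortest distances from source; assumes non-negative edge weights."""
    dist = {source: 0}
    pq = [(0, source)]                        # min-heap of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)              # smallest tentative distance first
        if d > dist.get(u, float('inf')):
            continue                          # stale queue entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float('inf')):
                dist[v] = nd                  # relax the edge
                heapq.heappush(pq, (nd, v))
    return dist
```

Because weights are non-negative, once a node is popped with distance d, no later path can reach it more cheaply, which is exactly why the greedy choice is safe.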

How is the next number in the Fibonacci sequence generated from the previous two numbers?

  • Addition of the two preceding numbers.
  • Division of the two preceding numbers.
  • Multiplication of the two preceding numbers.
  • Subtraction of the two preceding numbers.
The next number in the Fibonacci sequence is generated by adding the two preceding numbers. For example, if the last two numbers are 'a' and 'b', then the next number is 'a + b'. This recurrence relation defines the Fibonacci sequence.
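The recurrence can be sketched iteratively:

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers, starting from 0, 1."""
    a, b = 0, 1
    seq = []
    for _ in range(n):
        seq.append(a)
        a, b = b, a + b   # next number is the sum of the previous two
    return seq
```

For example, the first eight terms are 0, 1, 1, 2, 3, 5, 8, 13.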

How does the performance of regular expression matching change with the complexity of the pattern and input text?

  • Performance degrades exponentially with the complexity of the pattern and input text.
  • Performance improves as both pattern and input text become more complex.
  • Performance is independent of the pattern complexity but depends on the input text complexity.
  • Performance remains constant regardless of the complexity of the pattern and input text.
The performance of regular expression matching with common backtracking engines can degrade exponentially in the worst case, a phenomenon known as catastrophic backtracking, which typically arises from nested or ambiguous quantifiers applied to long inputs. (Automaton-based engines avoid this worst case but are less widely used.) In practice, more complex patterns and longer input texts can lead to significantly increased processing time.