What is the time complexity of searching for an element in an unsorted array of size 'n' in the worst-case scenario?
O(1)
O(log n)
O(n)
O(n^2)
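For reference, a minimal Python sketch of linear search (the name linear_search is just illustrative); in the worst case the target is absent or sits in the last position, so all n elements are examined.

```python
def linear_search(arr, target):
    """Return the index of target in arr, or -1 if it is absent."""
    for i, value in enumerate(arr):
        if value == target:
            return i   # best case: found immediately, O(1)
    return -1          # worst case: every one of the n elements was checked, O(n)
```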
What is the time complexity of Dijkstra's algorithm for finding the shortest path in a graph with V vertices and E edges, using an adjacency list and a binary-heap priority queue?
O(V log E)
O(E log V)
O(V^2)
O(V + E)
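A rough Dijkstra sketch using Python's heapq as the priority queue, assuming the graph is given as a dict-of-lists adjacency list with non-negative edge weights (the representation and the name dijkstra are assumptions for illustration). With a binary heap there are at most O(E) pushes and pops, each costing O(log V), which is where the O(E log V) bound comes from.

```python
import heapq

def dijkstra(graph, source):
    """graph: dict mapping node -> list of (neighbor, weight) pairs (adjacency list).
    Returns a dict of shortest distances from source to every reachable node."""
    dist = {source: 0}
    heap = [(0, source)]                       # priority queue of (distance, node) entries
    while heap:
        d, u = heapq.heappop(heap)             # pop: O(log V)
        if d > dist.get(u, float("inf")):
            continue                           # stale entry left by an earlier relaxation
        for v, w in graph.get(u, []):          # scan the outgoing edges of u
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))  # push: O(log V), at most one per edge
    return dist                                # O(E) heap operations * O(log V) each = O(E log V)

# Hypothetical usage:
# g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1)], "c": []}
# dijkstra(g, "a")  ->  {"a": 0, "b": 2, "c": 3}
```

This "lazy deletion" variant re-pushes a node instead of decreasing its key; a heap with decrease-key gives the same O(E log V) bound.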
What does it mean for an algorithm to have a time complexity of Θ(1)?
The algorithm is guaranteed to be the fastest possible solution for the problem.
The runtime grows linearly with the input size.
The algorithm's runtime is unpredictable and varies greatly with different inputs.
The runtime is constant and independent of the input size.
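As a concrete illustration, indexing into a Python list touches a single element no matter how large the list is; a tiny sketch (assuming a non-empty list):

```python
def first_and_last(items):
    """Two index lookups, each Θ(1), regardless of len(items)."""
    return items[0], items[-1]   # constant work whether items has 10 or 10 million entries
```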
Which notation signifies that a function 'f(n)' grows strictly slower than another function 'g(n)' as 'n' approaches infinity?
Big Theta (Θ)
Big-O (O)
Little-o (o)
Big Omega (Ω)
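For reference, the usual limit characterization of little-o (assuming g(n) is positive for large n):

```latex
f(n) = o\big(g(n)\big) \iff \lim_{n \to \infty} \frac{f(n)}{g(n)} = 0,
\qquad \text{e.g. } n = o(n^2), \quad \log n = o(n).
```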
Which of the following scenarios typically leads to an algorithm with linear time complexity, O(n)?
Calculating the Fibonacci sequence recursively.
Finding the largest element in a max-heap.
Performing a linear search on a sorted array.
Searching for a key in a balanced binary search tree.
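For contrast with the linear-time choice above, a minimal sketch of the naive recursive Fibonacci from the first option (fib_naive is an illustrative name); because it recomputes the same subproblems over and over, its running time grows exponentially in n rather than linearly.

```python
def fib_naive(n):
    """Naive recursion: T(n) = T(n-1) + T(n-2) + O(1), which grows exponentially."""
    if n < 2:
        return n
    return fib_naive(n - 1) + fib_naive(n - 2)
```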
Which of the following has a time complexity of O(n^2) in its worst-case scenario?
Counting Sort
Heap Sort
Merge Sort
Bubble Sort
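A minimal Bubble Sort sketch; on a reverse-sorted input the two nested loops perform on the order of n^2/2 comparisons, which is the source of the O(n^2) worst case.

```python
def bubble_sort(arr):
    """Sort arr in place; the worst case (reverse-sorted input) does ~n^2/2 comparisons."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):       # each pass bubbles the largest remaining item up
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:                  # early exit gives an O(n) best case
            break
    return arr
```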
What is the Big-O notation of an algorithm that iterates over an array of size 'n' using two nested loops?
O(log n)
O(n)
O(n log n)
O(n^2)
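A tiny sketch of the pattern this question describes: two nested loops over the same array execute the inner body n * n times, hence O(n^2) (count_pairs is just an illustrative name).

```python
def count_pairs(arr):
    """Counts every ordered pair of positions; the inner body runs len(arr) ** 2 times."""
    count = 0
    for _ in arr:       # outer loop: n iterations
        for _ in arr:   # inner loop: n iterations per outer iteration
            count += 1
    return count        # equals len(arr) ** 2
```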
Why is understanding time complexity crucial when comparing the efficiency of algorithms?
It helps determine the exact amount of memory an algorithm will use.
It allows us to analyze how the algorithm's runtime changes relative to the input size.
It reveals the underlying hardware limitations that affect algorithm performance.
It provides a precise measurement of an algorithm's execution time in milliseconds.
What is the time complexity of calculating the nth Fibonacci number using the dynamic programming approach?
O(2^n)
O(n)
O(n log n)
O(n^2)
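A bottom-up dynamic-programming sketch for Fibonacci (fib_dp is an illustrative name); a single loop of constant-time steps builds the sequence from the bottom up, so the work is linear in n.

```python
def fib_dp(n):
    """Bottom-up DP: one pass of n-1 constant-time steps -> O(n) time, O(1) extra space."""
    if n < 2:
        return n
    prev, curr = 0, 1
    for _ in range(n - 1):
        prev, curr = curr, prev + curr   # reuse the two previous results instead of recomputing
    return curr
```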
An algorithm has a time complexity of O(n log n). Which of the following statements is NOT necessarily true?
The algorithm's best-case runtime could be O(log n).
The algorithm's runtime grows at most as fast as n log n.
For large enough input sizes, the runtime is bounded above by some constant multiple of n log n.
The algorithm will always be slower than an algorithm with O(n) complexity.
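For reference, the formal upper-bound definition that O(n log n) asserts (this is the bound the "constant multiple" statement appeals to):

```latex
f(n) = O(n \log n) \iff \exists\, c > 0,\ n_0 \ge 1 \ \text{such that}\ f(n) \le c \, n \log n \ \text{for all } n \ge n_0.
```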