Which sorting algorithm has a time complexity of O(n^2) in its average and worst case?
Heap Sort
Merge Sort
Bubble Sort
Quick Sort
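For reference (not part of the quiz), a minimal bubble sort sketch in Python: the two nested passes over the data are what give it O(n^2) comparisons in both the average and the worst case.

```python
def bubble_sort(items):
    """Repeatedly swap adjacent out-of-order pairs; O(n^2) average and worst case."""
    n = len(items)
    for i in range(n - 1):            # up to n-1 passes over the data
        for j in range(n - 1 - i):    # each pass compares adjacent pairs
            if items[j] > items[j + 1]:
                items[j], items[j + 1] = items[j + 1], items[j]
    return items
```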
What is the primary focus of Big-O notation in time complexity analysis?
Describing the upper bound of an algorithm's growth rate
Expressing the exact number of operations an algorithm performs
Calculating the average-case runtime of an algorithm
Representing the lower bound of an algorithm's growth rate
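A quick refresher (not part of the quiz): f(n) = O(g(n)) means there exist constants c > 0 and n0 such that f(n) <= c * g(n) for all n >= n0, i.e. g(n) describes an upper bound on how fast f(n) can grow.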
Which sorting algorithm is generally considered the fastest for large datasets with an average time complexity of O(n log n)?
Insertion Sort
Selection Sort
Quick Sort
Merge Sort
You have two algorithms for a task: Algorithm A has a time complexity of O(n log n), and Algorithm B has O(n^2). For which input size 'n' would Algorithm A likely start to outperform Algorithm B?
n = 10
n = 100
n = 1000
It depends on the specific algorithms and their constant factors.
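A rough comparison sketch (illustrative; the constant factors below are hypothetical) showing why the crossover point depends on the constants hidden by Big-O, not on n alone:

```python
import math

# Hypothetical constants: Algorithm A costs 100 * n * log2(n) steps,
# Algorithm B costs 1 * n^2 steps. Different constants shift the crossover.
for n in (10, 100, 1000, 10000):
    cost_a = 100 * n * math.log2(n)
    cost_b = n ** 2
    print(n, round(cost_a), cost_b, "A faster" if cost_a < cost_b else "B faster")
```

With these particular constants, B wins at n = 10 and n = 100, and A only pulls ahead around n = 1000.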
Which of the following operations typically represents constant time complexity, O(1)?
Sorting an array using bubble sort
Finding the smallest element in an unsorted array
Inserting an element at the beginning of a linked list
Searching for a specific value in an unsorted array
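A minimal singly linked list sketch (illustrative; the class names are made up for this example) showing why inserting at the head is O(1): only the new node and the current head are touched, no matter how long the list is.

```python
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next = next_node

class LinkedList:
    def __init__(self):
        self.head = None

    def insert_at_head(self, value):
        # O(1): create one node and repoint the head reference.
        self.head = Node(value, self.head)
```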
Which searching algorithm has a time complexity of O(log n) in the average case?
Binary Search
Jump Search
Linear Search
Interpolation Search
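A standard iterative binary search sketch (illustrative): halving the remaining range on every comparison is what gives the O(log n) behaviour, and it requires the input to already be sorted.

```python
def binary_search(sorted_items, target):
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if sorted_items[mid] == target:
            return mid           # found: return the index
        elif sorted_items[mid] < target:
            lo = mid + 1         # discard the lower half
        else:
            hi = mid - 1         # discard the upper half
    return -1                    # not found
```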
Which data structure, when used for searching, can potentially improve the time complexity from O(n) to O(log n)?
Linked List
Binary Search Tree
Queue
Array
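A sketch of looking up a value in a binary search tree (illustrative; it assumes each node exposes value, left and right attributes): on a reasonably balanced tree, each comparison discards roughly half of the remaining nodes, giving O(log n) instead of the O(n) scan needed for a linked list or unsorted array.

```python
def bst_search(node, target):
    # node is assumed to have .value, .left and .right attributes (hypothetical layout).
    while node is not None:
        if target == node.value:
            return node
        node = node.left if target < node.value else node.right
    return None
```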
What is the time complexity of the QuickSort algorithm in the worst-case scenario?
O(n)
O(n log n)
O(log n)
O(n^2)
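A simple quicksort sketch (illustrative, not an in-place production version): with a naive first-element pivot, already-sorted input makes every partition maximally unbalanced, which produces the O(n^2) worst case; randomized or median-of-three pivot selection keeps the average at O(n log n).

```python
def quicksort(items):
    if len(items) <= 1:
        return items
    pivot = items[0]                                # naive pivot choice
    smaller = [x for x in items[1:] if x < pivot]
    larger = [x for x in items[1:] if x >= pivot]
    # On sorted input, 'smaller' is empty at every level: n recursive calls,
    # each scanning the remaining elements, hence O(n^2).
    return quicksort(smaller) + [pivot] + quicksort(larger)
```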
What is the time complexity of accessing an element in an array using its index?
O(1)
O(log n)
O(n)
O(n log n)
How can understanding the time complexity of data structures aid in optimizing code?
It has no direct impact on code optimization; it's purely for theoretical analysis.
It helps determine the best programming language for the algorithm.
It helps choose the most appropriate data structure for the task, optimizing operations.
It guides the choice of variable names for improved code readability.
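A small illustration (hypothetical example, not part of the quiz) of how the choice of data structure changes the cost of an operation: a membership test is O(n) on a Python list but O(1) on average for a set.

```python
names_list = ["ada", "grace", "alan", "edsger"]
names_set = set(names_list)

# O(n): scans the list element by element until a match is found.
print("alan" in names_list)

# O(1) on average: a hash lookup, largely independent of collection size.
print("alan" in names_set)
```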