Algorithm: Sorting
Sorting Algorithms
Time and Space Complexity
Basic Sorts:
| Sorting Algorithm | Worst Time Complexity | Average Time Complexity | Best Time Complexity | Space Complexity |
|---|---|---|---|---|
| Selection Sort | O(n^2) | O(n^2) | O(n^2) | O(1) |
| Bubble Sort | O(n^2) | O(n^2) | O(n) (with swap-flag optimization) | O(1) |
| Insertion Sort | O(n^2) | O(n^2) | O(n) (already sorted) | O(1) |
| Quick Sort | O(n^2) | O(n * log n) | O(n * log n) | O(log n) (avg recursion) / O(n) (worst recursion) |
| Merge Sort | O(n * log n) | O(n * log n) | O(n * log n) | O(n) |
| Heap Sort | O(n * log n) | O(n * log n) | O(n * log n) | O(1) (aux space) |
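The O(n) best case for Bubble Sort in the table comes from the swap-flag optimization; a minimal sketch:

```python
def bubble_sort(a):
    """Bubble sort with the swap-flag optimization: if a full pass
    performs no swap, the array is already sorted and we stop early,
    which gives the O(n) best case on sorted input."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:  # no swaps in this pass: done
            break
    return a
```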
Advanced Sorts:
| Sorting Algorithm | Best Time Complexity | Average Time Complexity | Worst Time Complexity | Space Complexity |
|---|---|---|---|---|
| Shell Sort | O(n * log n) | O(n * log^2 n) | O(n^2) | O(1) |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(k) |
| Radix Sort | O(nk) | O(nk) | O(nk) | O(n + k) |
| Bucket Sort | O(n + k) | O(n + k) | O(n^2) | O(n + k) |
| Tim Sort | O(n) | O(n * log n) | O(n * log n) | O(n) |
Comparison Sort and Alternatives
Fastest sort (average case): Quick Sort, with average time complexity O(n * log n).
Comparison Sorting
- Quicksort usually runs in O(n * log n) time, but is there an algorithm that can sort even faster? In general, no. Most sorting algorithms are comparison sorts, i.e. they sort a list purely by comparing elements to one another. A comparison sort cannot beat O(n * log n) worst-case running time, since n * log n is the minimum number of comparisons needed to determine where each element belongs.
Alternative Sorting
- Another sorting method, the counting sort, does not require comparison. Instead, you create an integer array whose index range covers the entire range of values in your array to sort. Each time a value occurs in the original array, you increment the counter at that index. At the end, run through your counting array, printing the value of each non-zero valued index that number of times.
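The counting idea above can be sketched as follows; this is a minimal, non-stable variant that simply emits each value by its count (the stable prefix-sum variant used inside Radix Sort needs an extra output pass):

```python
def counting_sort(arr, k):
    """Sort non-negative integers in the range [0, k) without
    comparisons. Time O(n + k), extra space O(k)."""
    counts = [0] * k
    for value in arr:            # tally each occurrence
        counts[value] += 1
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit each value `count` times
    return result
```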
Stable, In-Place and Adaptive
Definitions:
| Term | Definition | Why it matters |
|---|---|---|
| Stable | If two elements compare equal, their original relative order is preserved after sorting. | Important when a list has secondary keys (e.g., sort by last name while keeping the existing first-name order). |
| In-place | The algorithm uses only O(1) (or a very small constant amount of) extra memory beyond the input array itself. | Saves RAM, which is crucial for huge data sets or memory-constrained environments. |
| Adaptive | The running time improves when the input is already partially ordered. Typically the cost is O(n + f(disorder)), where disorder measures how far the list is from sorted (e.g., the number of inversions). | Gives near-linear performance on "almost-sorted" data, which occurs often in real programs (log files, UI tables, etc.). |
Algorithms Satisfying Each Property
| Property | Algorithms that have it |
|---|---|
| Stable | Insertion, Bubble, Merge (classic), Counting, Radix, Tim Sort, Stable Quick Sort (extra memory) |
| In-place | QuickSort, HeapSort, Selection, Shell, Insertion, Bubble, Binary Insertion, (some variants of) Merge (complex) |
| Adaptive | Insertion, Bubble, Shell, TimSort, Counting/Radix (linear on already-sorted keys), Bucket (when data is clustered) |
Sorting algorithms that satisfy all three properties:
| Algorithm | Reason it meets all three |
|---|---|
| Insertion Sort | Stable by nature, works in-place on the original array, and stops early when the input is already (or nearly) sorted (O(n) best case). |
| Bubble Sort | Each pass swaps only adjacent out-of-order pairs (stable); uses only a few extra variables (in-place); terminates when a full pass makes no swap (adaptive). |
| Binary Insertion Sort (stable implementation) | Same as insertion sort, but finds the insertion point with binary search; still stable, in-place, and adaptive. |
Practical tip: for real-world code you rarely use plain bubble sort because its constant factors are high. Insertion sort is excellent for small or nearly sorted sub-arrays (e.g., as the insertion-sort fallback inside TimSort or introsort). If you need a stable, adaptive algorithm on large data and can afford linear extra memory, TimSort is usually the best choice.
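Insertion sort, the one algorithm above that is stable, in-place, and adaptive all at once, can be sketched as:

```python
def insertion_sort(a):
    """In-place, stable, adaptive insertion sort.

    Stable: the strict > comparison never moves equal elements past
    each other. Adaptive: on already-sorted input the inner loop exits
    immediately, giving the O(n) best case."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key                # drop key into its slot
    return a
```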
Sorting algorithms classified by these properties:
| Algorithm | Stable? | In-place? | Adaptive? | Typical time complexity |
|---|---|---|---|---|
| Insertion Sort | ✅ | ✅ | ✅ (runs in O(n + inv), where inv = number of inversions) | O(n²) worst, O(n) best |
| Bubble Sort | ✅ | ✅ | ✅ (stops early when no swaps) | O(n²) worst, O(n) best |
| Binary Insertion Sort | ✅ (if insertion is stable) | ✅ | ✅ (same adaptivity as plain insertion) | Same as insertion |
| Shell Sort | ❌ (standard version) | ✅ | ✅ (faster on partially sorted data) | ≈ O(n log² n), depending on gap sequence |
| TimSort (Python's/Java's built-in sort for objects) | ✅ | ⚠️ Not strictly in-place: uses a temporary run buffer of size ≤ n/2 (O(n) worst case) | ✅ (detects runs, merges them) | O(n log n) worst, O(n) best on already sorted |
| Merge Sort (classic top-down) | ✅ | ❌ (O(n) extra array) | ❌ (does not gain from presorted input) | O(n log n) always |
| Bottom-up Merge Sort with in-place merging | ✅ (possible) | ⚠️ In-place variants exist but are complex and slower | ❌ (still O(n log n)) | O(n log n) |
| QuickSort (Lomuto/Hoare partition) | ❌ (standard) | ✅ | ❌ (does not adapt to order) | O(n log n) avg, O(n²) worst |
| Random-pivot QuickSort | ❌ | ✅ | ❌ | Same as QuickSort |
| Stable QuickSort (extra buffer or linked list) | ✅ (with extra memory) | ❌ (needs O(n) aux) | ❌ | O(n log n) |
| HeapSort | ❌ | ✅ | ❌ | O(n log n) always |
| Selection Sort | ❌ (usually implemented unstable; equal-key order can be kept with extra care) | ✅ | ❌ (always scans the whole unsorted part) | O(n²) |
| Counting Sort / Radix Sort | ✅ | ⚠️ Not in-place: needs a counting array or buckets (O(k) extra) | ❌ (always O(n + k), regardless of input order) | O(n + k) |
| Bucket Sort | ✅ (if buckets keep order) | ⚠️ Not strictly in-place (needs bucket storage) | ✅ (fast when data is already clustered) | O(n + k) |
1. Stable vs Unstable Sort
Stable sorting: if two elements have the same key (equal value), their relative order in the original array is preserved in the sorted output.
Unstable sorting: two equal elements may change their relative order after sorting.
Example:
Array = [4a, 3, 4b, 2] (4a and 4b are equal but distinct items)
- Stable sort result: [2, 3, 4a, 4b] (order of 4a before 4b is preserved)
- Unstable sort result: [2, 3, 4b, 4a] (order of equal elements changed)
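The 4a/4b example can be observed directly with Python's built-in sort, which is stable (Timsort); the tags stand in for the distinct-but-equal items:

```python
# Each pair is (key, tag); 4a and 4b share the key 4 but carry
# different tags so we can see whether their order survives sorting.
data = [(4, "a"), (3, ""), (4, "b"), (2, "")]

# sorted() compares only on the key; Timsort guarantees stability,
# so (4, "a") must still come before (4, "b") in the output.
stable = sorted(data, key=lambda pair: pair[0])
assert stable == [(2, ""), (3, ""), (4, "a"), (4, "b")]
```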
Stable Sorting Algorithms
- Bubble Sort ✅ Stable: swaps only adjacent elements, so equal elements never cross order.
- Insertion Sort ✅ Stable: inserts each element into the sorted part without jumping equal elements ahead.
- Merge Sort ✅ Stable: during the merge, on a tie it takes the left element first (preserving original order).
- Counting Sort ✅ Stable: places elements in the output array in order of input traversal.
- Radix Sort ✅ Stable: relies on Counting Sort as a subroutine, which is stable.
- Bucket Sort ✅ Stable (if a stable sub-sort like Insertion is used inside the buckets).
- TimSort ✅ Stable: designed to be stable (merge + insertion hybrid).
- Cocktail Shaker Sort ✅ Stable: bidirectional Bubble Sort, still only adjacent swaps.
- Odd-Even Sort ✅ Stable: adjacent comparisons like Bubble, order preserved.
- C++ stable_sort ✅ Stable: explicitly implemented to keep equal order (uses merge-like logic).
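Merge Sort's stability hinges on one detail called out above: on a tie, take from the left half first. A minimal sketch:

```python
def merge_sort(a):
    """Classic top-down merge sort: O(n log n) time, O(n) extra space.
    Stable because the merge uses <=, so on equal keys the element
    from the left half (earlier in the original order) is taken first."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # <= keeps equal keys in order
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]  # append whichever half remains
```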
Unstable Sorting Algorithms
- Selection Sort ❌ Unstable: swaps non-adjacent elements; equal elements can jump over each other.
- Quick Sort ❌ Unstable: the partition step may reorder equal elements around the pivot.
- Heap Sort ❌ Unstable: heapify swaps elements far apart, disturbing the order of equals.
- Shell Sort ❌ Unstable: elements are moved across large gaps, breaking the order of equals.
- IntroSort (STL sort) ❌ Unstable: based on Quick + Heap + Insertion, so the instability is inherited.
- Tournament Sort ❌ Unstable: equal elements may be promoted/replaced in the tree arbitrarily.
Shortcut to remember:
- Most O(n²) simple sorts: Bubble (stable), Insertion (stable), Selection (unstable).
- Most O(n log n) divide & conquer sorts: Merge (stable), Quick (unstable), Heap (unstable).
- Counting/Radix/Bucket are stable, because they do not rely only on comparisons.
Shortcut rule:
- Algorithms that only swap neighbors (Bubble, Insertion, Odd-Even) are stable.
- Algorithms that move elements far apart (Selection, Quick, Heap, Shell) are unstable.
2. In-place vs Out-of-place Sorting
In-place Sorting Algorithms (use only constant, O(1), extra space besides the recursion stack)
- Bubble Sort: in-place
- Insertion Sort: in-place
- Selection Sort: in-place
- Quick Sort: in-place (needs an O(log n) recursion stack)
- Heap Sort: in-place
- Shell Sort: in-place
- IntroSort (C++ STL sort): in-place
Out-of-place Sorting Algorithms (Require extra arrays or large auxiliary space, $O(n)$ or more)
- Merge Sort: out-of-place (needs an $O(n)$ auxiliary array)
- Counting Sort: out-of-place (needs an $O(n+k)$ array)
- Radix Sort: out-of-place (needs counting arrays + temporary storage)
- Bucket Sort: out-of-place (needs buckets)
- TimSort: out-of-place (needs temporary arrays)
- External Merge Sort: out-of-place (disk-based, for very large data)
Ques. Why is Quick Sort in-place if it takes O(log n) space?
- In-place sorting means the algorithm does not need extra arrays proportional to n; it mostly rearranges elements within the same array. Quick Sort only uses extra space for the recursion stack, which is O(log n) on average. This stack space is tiny compared to a full extra array of size n, so it is still considered in-place.
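A sketch of in-place quicksort, assuming a Lomuto partition; recursing only into the smaller side keeps the stack at O(log n) even when the partitions are badly unbalanced:

```python
def partition(a, lo, hi):
    """Lomuto partition: pivot = last element; returns its final index."""
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] <= pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]
    return i

def quicksort(a, lo=0, hi=None):
    """In-place quicksort. Recurse only into the smaller partition and
    loop on the larger one, so the recursion depth is O(log n)."""
    if hi is None:
        hi = len(a) - 1
    while lo < hi:
        p = partition(a, lo, hi)
        if p - lo < hi - p:       # left side smaller: recurse there
            quicksort(a, lo, p - 1)
            lo = p + 1            # iterate on the larger right side
        else:                     # right side smaller: recurse there
            quicksort(a, p + 1, hi)
            hi = p - 1
```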
Shortcut rule:
- Simple "swap-based" sorts (Bubble, Insertion, Selection, Quick, Heap, Shell) are in-place.
- "Auxiliary-array based" sorts (Merge, Counting, Radix, Bucket, TimSort) are out-of-place.
Questions & Answers
Easy Sorting Q/A
- Fastest average-case sorting? Quick Sort.
- Guaranteed O(n log n) sorting? Merge Sort.
- Sorting for linked lists? Merge Sort.
- Sorting for priority queues? Heap Sort.
- Non-comparison-based sorting? Counting, Radix, Bucket.
- Which sort is best for small arrays? Insertion Sort; Time: $O(n^2)$, Space: $O(1)$.
- Which sorting algorithm is stable and has $O(n \log n)$ worst-case time? Merge Sort; stable.
- Which algorithm is used for heap-based priority queues? Heap Sort; Time: $O(n \log n)$, Space: $O(1)$.
- Which sorting is fastest in the average case? Quick Sort; Time: $O(n \log n)$ average, Space: $O(\log n)$.
- Which algorithm is guaranteed stable and out-of-place? Merge Sort; Space: $O(n)$.
- Which sort is in-place but unstable? Heap Sort; Space: $O(1)$, unstable.
- Which sort is good for linked lists? Merge Sort.
- Which sorting is non-comparison based? Counting Sort; Time: $O(n + k)$.
- Which sort works using digit grouping? Radix Sort; Time: $O(dn)$, where $d$ is the number of digits.
- Which algorithm is used by Python's built-in sort? Timsort (a hybrid of Merge and Insertion Sort).
- Which sort is unstable but in-place? Quick Sort; Space: $O(\log n)$.
- Which sort is stable and in-place for small arrays? Insertion Sort.
- Which sort swaps the minimum element into its correct place each pass? Selection Sort; Time: $O(n^2)$, Space: $O(1)$.
- Which sort is best for nearly sorted data? Insertion Sort; Time: $O(n)$ best case.
- Which sorting is divide and conquer plus merging? Merge Sort.
- Which sorting algorithm uses a pivot for partitioning? Quick Sort.
- What is the worst-case time complexity of Bubble Sort? $O(n^2)$.
- Which sort is both adaptive and stable? Insertion Sort.
- Which sort requires range information for $O(n)$ time? Counting Sort.
- Which sort is slow but simple for small data? Bubble Sort; Time: $O(n^2)$.
- Which sort uses a heap data structure? Heap Sort; Space: $O(1)$.
- Which sort is good for external sorting (large files)? Merge Sort; Space: $O(n)$.
- Which sort is not stable by default? Heap Sort; unstable.
- Which sort requires extra space for merging? Merge Sort; Space: $O(n)$.
- Which sort is best for large unsorted arrays? Quick Sort; Time: $O(n \log n)$ average case.
- Which sort is used in Java for primitives? Dual-Pivot Quick Sort.
- Which sorting uses the minimum number of assignment operations? Selection Sort.
- Which sort can detect sorted input and finish early? Bubble Sort (with the swap-flag optimization).
- Which sort is good for repeated keys? Counting Sort.
- Which sort is bad for large datasets? Bubble Sort; Time: $O(n^2)$.
- Which sort is both non-adaptive and unstable? Selection Sort.
- Which sort merges two sorted lists efficiently? Merge Sort.
- Which sort's worst case occurs when the data is already sorted? Quick Sort (with a poor pivot choice).
- Which sort uses a heap for sorting in-place? Heap Sort.
- Which sort is used in the C++ STL sort()? IntroSort (hybrid).
- Which sort compares every adjacent pair repeatedly? Bubble Sort.
- Which sort fills one sorted position in every pass? Selection Sort.
- Which sort is highly parallelizable? Merge Sort.
- Which sort's best case and worst case are the same? Selection Sort; Time: $O(n^2)$.
- Which sort partitions around a pivot? Quick Sort.
- Which sort is stable and works well for small files? Insertion Sort.
- Which sort is not in-place? Merge Sort; Space: $O(n)$.
- Which sort is in-place and simple to implement? Insertion Sort.
- Which sort divides the array down to one-element subarrays? Merge Sort.
- Which sort prefers swap cost over comparison cost? Selection Sort.
- Which sort is fastest for random large arrays? Quick Sort (on average).
- Which sort's basic operation is bubble-up or bubble-down? Heap Sort.
- Which sort's performance is unaffected by input order? Merge Sort.
- Which sort is unstable unless modified for stability? Quick Sort.
- Which sort is optimal for sorting strings or digits? Radix Sort; Time: $O(nd)$.
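The digit-grouping idea behind Radix Sort can be sketched as a least-significant-digit (LSD) pass per decimal digit, using stable per-digit bucketing:

```python
def radix_sort(nums):
    """LSD radix sort for non-negative integers, base 10.
    Each pass buckets stably by one digit, so earlier-digit order is
    preserved; total time O(d * (n + 10)) for d decimal digits."""
    if not nums:
        return nums
    exp = 1
    while max(nums) // exp > 0:
        buckets = [[] for _ in range(10)]
        for x in nums:
            buckets[(x // exp) % 10].append(x)  # stable: input order kept
        nums = [x for b in buckets for x in b]  # concatenate buckets
        exp *= 10
    return nums
```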
Medium Sorting Questions and Answers
- How is stability achieved in Merge Sort? The merge takes the left element first on ties; Time: $O(n \log n)$, stable.
- What is the space complexity of Heap Sort? $O(1)$; in-place and unstable.
- How does Quick Sort pick pivots to avoid the worst case? Median-of-three or randomized selection makes partitions more balanced; Time: $O(n \log n)$ average.
- How is the time complexity of Counting Sort determined? Time: $O(n + k)$, where $k$ is the range of keys.
- Why might Radix Sort be preferred over Merge Sort? It beats comparison-based sorts when the input is digit-based and the range is not too wide; Time: $O(nd)$.
- How does IntroSort combine three sorts? It starts with Quick Sort, switches to Heap Sort if the recursion gets too deep, and finishes with Insertion Sort for small arrays; Time: $O(n \log n)$.
- Which sorting is ideal for streaming data insertion? Insertion Sort adapts quickly; Time: $O(n^2)$ worst, $O(n)$ best.
- Which sort is non-adaptive but stable? Merge Sort; always $O(n \log n)$.
- What are the steps in Bucket Sort? Divide the input into buckets, then sort each bucket (typically with Insertion Sort); time depends on the distribution, average $O(n)$.
- Best sorting for external (disk) sorting? Merge Sort handles large files; Space: $O(n)$.
- Why is heapify used in Heap Sort? To maintain the heap property after each removal; Time: $O(n)$ to build the heap, then $O(n \log n)$ overall.
- Which algorithm is sensitive to repeated elements? Quick Sort can degrade with many duplicates; 3-way partitioning helps.
- Which algorithm offers the best parallelization? Merge Sort easily subdivides tasks; Time: $O(n \log n)$.
- Time complexity of Selection Sort on partially sorted data? Always $O(n^2)$; it is not adaptive.
- Space requirement for Counting Sort? $O(k)$ for the counts, $O(n)$ for the output.
- Sorting algorithm for large, nearly sorted datasets? Timsort; it combines insertion and merge; Time: $O(n)$ best, $O(n \log n)$ worst.
- What is the best-case complexity of Bubble Sort? $O(n)$ if the array is already sorted.
- What is a stable variant of Quick Sort? Implement it with additional arrays; not in-place, but stable.
- Which sorts use recursion stack space? Quick Sort and Merge Sort; Space: $O(\log n)$.
- Why is Selection Sort poor for adaptive cases? It does not exploit existing order; always quadratic time.
- Best algorithm for finding the Kth smallest element? QuickSelect (based on Quick Sort); Time: $O(n)$ average.
- What algorithm sorts by repeatedly extracting the min/max?
Selection Sort or Heap Sort (using the heap property).
- Which sorting is used in the C++ STL for large inputs? IntroSort, which switches between Quick, Heap, and Insertion; Time: $O(n \log n)$.
- How is stability defined in sorting algorithms? Equal keys retain their relative order; Merge, Bubble, and Insertion are stable.
- Which sorting algorithm is neither stable nor adaptive? Heap Sort.
- What is the maximum recursion depth in Quick Sort? Up to $O(n)$ in the worst case (stack overflow risk), typically $O(\log n)$.
- Which sorts use divide and conquer on arrays? Merge Sort and Quick Sort.
- Best sorting for distinct keys in a bounded range? Counting Sort; Time: $O(n + k)$.
- Which sort is slow when the input range is huge? Counting Sort; it uses too much space for large $k$.
- Which sort can be optimized for cache locality? Quick Sort (it partitions contiguous memory).
- How does the merge operation work in Merge Sort? It combines two sorted arrays in $O(n)$.
- How can Quick Sort be made stable? By using extra space; the result is no longer in-place.
- Best algorithm to sort constant-range integers? Counting Sort; Time: $O(n)$.
- How does heapify-up differ from heapify-down? Heapify-up is used on insertion, heapify-down on removal in Heap Sort.
- Which sorting algorithm is used in Java for objects? Timsort; stable; Time: $O(n \log n)$.
- Best way to merge sorted linked lists? Merge Sort or iterative merging; Time: $O(n)$.
- What is the space overhead of Merge Sort? At least $O(n)$ extra memory.
- Which sorting minimizes the number of data writes? Selection Sort; fewest swaps per pass.
- Which algorithm sorts character strings efficiently? Radix Sort (with fixed or bounded-length strings); Time: $O(nk)$.
- How can sorting help solve three-sum problems? Sort the array, then use two pointers for the combination search.
- Algorithm to reorder log files with letters before digits? A custom stable sort (Timsort or Merge Sort preferred).
- Which sort works well on uniform distributions? Bucket Sort; Time: $O(n)$ average case.
- Which algorithm minimizes comparisons in the best case? Insertion Sort; Time: $O(n)$ if nearly sorted.
- How does multi-way merge extend Merge Sort? It merges $k$ sorted arrays instead of two; Time: $O(n \log k)$.
- Time complexity of Timsort on random data? $O(n \log n)$.
- Which sort is highly cache-unfriendly? Classic Merge Sort.
- Mitigation for Quick Sort stack overflows? Use tail recursion or an iterative approach.
- Best approach for finding duplicates with sorting? Sort the array, then scan for adjacent duplicates (any $O(n \log n)$ sort).
- How does natural Merge Sort differ from the classic version? It detects pre-existing runs and merges bigger blocks; time depends on the input order.
- Which sort is used in Unix's sort utility? Usually Merge Sort or Timsort, for stability on large files.
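QuickSelect, mentioned above for the Kth smallest element, can be sketched as follows. Note this simple recursive version uses O(n) extra space per level rather than the in-place Hoare variant:

```python
import random

def quickselect(a, k):
    """Return the k-th smallest element (0-indexed) of a non-empty
    sequence in expected O(n) time, using random pivots and a
    three-way split (robust against duplicates)."""
    pivot = random.choice(list(a))
    less = [x for x in a if x < pivot]
    equal = [x for x in a if x == pivot]
    if k < len(less):                       # answer lies in the < part
        return quickselect(less, k)
    if k < len(less) + len(equal):          # answer is the pivot value
        return pivot
    greater = [x for x in a if x > pivot]   # recurse into the > part
    return quickselect(greater, k - len(less) - len(equal))
```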
Hard Sorting Questions and Answers
- How do you make any unstable sort stable? Pair each element with its original index and use the index as a tie-breaker; this increases space to $O(n)$.
- How do you sort a huge dataset that does not fit in RAM? Use External Merge Sort: divide into chunks, sort each chunk in memory, then merge; $O(n \log n)$ I/O.
- When does Quick Sort's recursion stack overflow? Worst case is $O(n)$ recursion depth with poor pivots; avoid it with median-of-three or randomized partitioning.
- How do you make Quick Sort stable? Use extra arrays for partitioning, losing the in-place advantage.
- Explain in-place merge for Merge Sort. Achievable, but it increases the time to $O(n \log^2 n)$; the typical merge sort instead uses $O(n)$ extra space for merging.
- Can you sort a string array lexicographically in O(n)? No; comparison-based sorts require $O(n \log n)$, but Counting/Radix Sort can do it if the bounds are known.
- How does Timsort detect natural runs? It finds ascending/descending runs and merges them under size thresholds; Time: $O(n)$ best, $O(n \log n)$ worst.
- How to optimize Quick Sort on many duplicates? Use 3-way partitioning ("Dutch National Flag"); it avoids redundant work on equal keys.
- Explain stable Bucket Sort for floating-point numbers. Place values into $O(n)$ buckets and stable-sort each with a list or queue; average $O(n)$ if uniformly distributed.
- What is the lower bound for comparison-based sorting? $\Omega(n \log n)$ for any deterministic comparison-based algorithm.
- Is Heap Sort stable or unstable? Why? Unstable: its long-distance swaps do not preserve the order of equal keys.
- Which sort is best for a very large distributed merge? Multi-way Merge Sort, where the $k$-way merge keeps the total merge cost at $O(n \log k)$.
- What is the cache complexity of Merge Sort? Poor in the naive version; Merge Path or tiled variants shift blocks to improve locality.
- Counting Sort: when is it optimal, and when catastrophic? Optimal when $k = O(n)$; impractical if $k \gg n$, e.g. a massive integer range.
- When can Radix Sort outperform Merge/Quick? With bounded word size/digit count; Time: $O(nd)$, using counting sort as the stable subroutine.
- Can Shell Sort reach $O(n \log n)$ time? Special gap sequences come close (e.g., the Pratt sequence gives $O(n \log^2 n)$), but common sequences run in about $O(n^{3/2})$.
- How does Bitonic Sort work? It recursively builds bitonic sequences, then merges them; depth $O(\log^2 n)$ when run in parallel.
- Why can't Heap Sort be perfectly parallelized? The heapify and extract steps are inherently serial; merge- and distribution-based sorts parallelize more easily.
- Which sorts achieve best-case O(n)? Insertion Sort and Timsort on nearly sorted input; Counting Sort when $k = O(n)$.
- How to minimize write operations in sorting? Cycle Sort and Selection Sort minimize data writes; Cycle Sort performs at most n writes for unique entries.
- Explain Adaptive Heap Sort. It modifies the heap structure based on run lengths; more efficient when the input is partially sorted.
- Where is Odd-Even Sort applicable? Parallel hardware (sorting networks); $O(n^2)$ sequential, but efficient on some hardware.
- How does Flashsort achieve linear time? It maps items to buckets proportionally, then sorts each; ideal for uniform distributions; average $O(n)$.
- When can you reduce Merge Sort space to O(1)? Using an in-place merge (e.g., Reinhardt's algorithm), at a cost in time and complexity; rarely practical.
- Tail recursion optimization for Quick Sort? Always recurse into the smaller subarray and convert the other call into a loop; this bounds the stack at $O(\log n)$.
- Is there a stable, in-place, $O(n \log n)$ sort? Classic Merge Sort is stable but not in-place; block merge sorts (e.g., GrailSort) achieve all three, at significant implementation complexity.
- How to efficiently merge K sorted arrays? Use a min-heap; Time: $O(n \log k)$, where $n$ is the total number of elements.
- When is Timsort's worst case $O(n \log n)$? On adversarial data that defeats natural run detection; it is still never slower than $O(n \log n)$.
- Can Radix Sort work on negatives? Yes, with a separate sign bucket or digit biasing; total time $O(nd)$.
- Best way to sort massive logs by timestamp? External Merge Sort (chunk, sort, and merge); $O(n \log n)$ disk I/O.
- How does patience sorting relate to LIS? It builds piles and uses binary search, giving an $O(n \log n)$ longest-increasing-subsequence computation.
- How to ensure stability in a multi-key (lexicographic) sort? Sort on the least-significant key first, then repeat for each key (the radix principle); each pass must be stable.
- What is the sorting cost with only O(1) extra writable memory? Selection-style approaches give $O(n^2)$ when no auxiliary arrays are allowed.
- Which sort is best for sorting objects by multiple fields? A stable sort (Merge/Timsort), applied from the least- to the most-significant key.
- Explain External Natural Merge Sort. It scans for runs already present in the file, reducing the number of merge passes.
- Sorting algorithm with the best cache-oblivious properties? Cache-oblivious Merge Sort divides recursively into sub-blocks, optimizing across multiple cache levels.
- How to parallelize sorting on a GPU? Use bitonic sort or radix sort; both parallelize well for SIMD/GPU.
- Space complexity of Counting Sort for Unicode or wide ints? $O(k)$, where $k$ can be huge (e.g., the Unicode codepoint range), making it impractical.
- How is cycle detection used in Cycle Sort? It finds where each value belongs and rotates it directly into place, achieving minimal ($O(n)$) data writes.
- Which sort suits data where writes are costly? Selection Sort (few writes, only swaps); Time: $O(n^2)$.
- Best algorithm for small but frequent sorts? Insertion Sort (tiny overhead; performs well for roughly n ≤ 20).
- Why is block sort "block stable"? Block sort (Block Merge Sort, GrailSort) merges using a small internal buffer, achieving in-place, stable $O(n \log n)$.
- How is entropy used in sorting lower bounds? A comparison tree needs $\log_2(n!)$ comparisons, about $n \log n$, which justifies the comparison-sort lower bound.
- Explain Recursive Insertion Sort and its utility. Recursively sort the first n-1 elements, then insert the last; pedagogical, not practical for large n.
- Which sort optimally sorts linked lists in O(n log n) time? Merge Sort, since linked lists lack random access.
- Best method to keep an ordered list under many inserts/deletes? A self-balancing BST or skip list; not sorting per se, but it maintains order in $O(\log n)$ per operation.
- Which sort is used in "k-way" external file sorting? Multi-way Merge Sort; it merges k files at a time for efficient I/O.
- Explain smoothsort's advantage over heapsort. Smoothsort adapts to pre-existing order: $O(n)$ if nearly sorted, $O(n \log n)$ worst case.
- How do you find the median of an unsorted array fast? Use QuickSelect (Hoare's selection algorithm); expected time $O(n)$.
- Does a comparison-based sort exist with O(n) time? No, but non-comparison sorts (Counting/Radix/Bucket) achieve it under certain constraints.
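Several answers above lean on the k-way min-heap merge; a minimal sketch using the standard-library `heapq` (the `(value, list_index, position)` tuples are just bookkeeping to pull the next element from the right source):

```python
import heapq

def merge_k(lists):
    """Merge k sorted lists into one sorted list using a min-heap.
    Total time O(n log k), where n is the total number of elements."""
    # Seed the heap with the head of each non-empty list.
    heap = [(lst[0], i, 0) for i, lst in enumerate(lists) if lst]
    heapq.heapify(heap)
    out = []
    while heap:
        value, i, j = heapq.heappop(heap)   # smallest pending element
        out.append(value)
        if j + 1 < len(lists[i]):           # push that list's next item
            heapq.heappush(heap, (lists[i][j + 1], i, j + 1))
    return out
```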