Section 8.4 Insertion Sort
Imagine you have a stack of phone bills from the past two years, and you want to arrange them in order by date. A natural way to tackle this task is to start with the first two bills and put them in the correct order. Then, take the third bill and place it in its proper position relative to the first two bills. You continue this process, taking each bill and adding it to the already sorted pile. This simple approach serves as the foundation for our first sorting algorithm, known as Insertion Sort.
Insertion Sort works by iterating through the list of records. During each iteration, the current record is inserted into its proper position within a sorted sublist made up of the records already processed. The insertion is done by repeatedly swapping the current record with its left neighbor whenever the two are out of order, until the record reaches its correct position.
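To make this concrete, here is a minimal sketch in Java of the procedure just described, sorting an array of int values in ascending order. The method name and the choice of a plain int array are illustrative assumptions, not the book's own code.

    // Sort A in ascending order using Insertion Sort.
    static void insertionSort(int[] A) {
        for (int i = 1; i < A.length; i++) {
            // Move A[i] toward the front until the prefix A[0..i] is sorted.
            for (int j = i; j > 0 && A[j] < A[j - 1]; j--) {
                int temp = A[j];      // Swap A[j] with its left neighbor.
                A[j] = A[j - 1];
                A[j - 1] = temp;
            }
        }
    }

For example, insertionSort(new int[] {15, 3, 9, 7}) rearranges the array into {3, 7, 9, 15}.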
On an input that is already sorted, Insertion Sort's best-case cost is only Θ(n), considerably faster than its Θ(n^2) average and worst cases; the average and worst cases, however, are the more reliable estimates of typical running time. Still, there are scenarios where we can anticipate the input to be already sorted or nearly sorted, and in such cases Insertion Sort is a suitable choice, especially when the disorder is minor. Even when the input is not perfectly sorted, Insertion Sort's running time is proportional to the number of inversions, making it efficient for "nearly sorted" lists. Shellsort and Quicksort are examples of algorithms that exploit Insertion Sort's near-best-case running time.
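To make the link between disorder and cost concrete, the sketch below counts inversions (pairs of records that are out of order relative to each other) with a simple quadratic scan; the method name is an illustrative assumption. Because each swap made by Insertion Sort removes exactly one inversion, this count is exactly the number of swaps the sort will perform on that input.

    // Count inversions: pairs (i, j) with i < j but A[i] > A[j].
    // Insertion Sort performs exactly one swap per inversion on this input.
    static long countInversions(int[] A) {
        long count = 0;
        for (int i = 0; i < A.length; i++) {
            for (int j = i + 1; j < A.length; j++) {
                if (A[i] > A[j]) {
                    count++;
                }
            }
        }
        return count;
    }

For a nearly sorted array such as {1, 2, 4, 3, 5} this returns 1, so Insertion Sort needs only a single swap, while a reverse-sorted array of n records has n(n-1)/2 inversions, the worst case.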
Counting comparisons or swaps yields similar results. Each iteration of the inner for loop performs one comparison and one swap, except for the final comparison of each pass, which fails the loop's test and therefore requires no swap. Consequently, the number of swaps over the entire sort is n-1 less than the number of comparisons: zero in the best case, and Θ(n^2) in the average and worst cases.
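This relationship can be observed directly by instrumenting the sort with counters, as in the sketch below; the counter fields and method name are illustrative assumptions.

    // Instrumented Insertion Sort that tallies comparisons and swaps.
    static long comparisons, swaps;

    static void insertionSortCounted(int[] A) {
        comparisons = 0;
        swaps = 0;
        for (int i = 1; i < A.length; i++) {
            int j = i;
            while (j > 0) {
                comparisons++;            // Count every test of A[j] against A[j-1].
                if (A[j] >= A[j - 1]) {
                    break;                // Failed test: a comparison with no matching swap.
                }
                int temp = A[j];          // Swap A[j] with its left neighbor.
                A[j] = A[j - 1];
                A[j - 1] = temp;
                swaps++;
                j--;
            }
        }
    }

On an already sorted array of n records this reports n-1 comparisons and no swaps; whenever each pass ends on a failed comparison rather than by reaching the front of the array, the swap count is exactly n-1 less than the comparison count.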
As we move on to algorithms with better growth rates, Insertion Sort becomes a poor choice for large arrays; it is not the most efficient sorting algorithm for most situations. However, it excels in specific scenarios, such as when the input is already sorted or when the arrays involved are very small. Algorithms with better asymptotic growth rates tend to be more complicated, which leads to larger constant factors in their running times: they require fewer comparisons on large arrays, but each comparison costs more. This observation might not seem very useful, yet there are many applications that perform a great number of sorts on very small arrays, and in those situations Insertion Sort's low overhead makes it the better choice.