Quick Sort for Dummies
Mahatma Gandhi once said, "Good algorithms are better than supercomputers." Just kidding; he didn't. Still, it's fascinating to see how big an impact a good algorithm can make in today's world, where everything is computationally intensive and millions of terabytes of data pour in every hour of every day. A good algorithm is always the need of the hour.
I present to you one of the game changers in the world of computer algorithms: the QuickSort algorithm, honored as one of the top 10 algorithms of the 20th century. Its inventor, Sir Charles Antony Richard (Tony) Hoare, received the Turing Award in 1980 for his fundamental contributions to the design of programming languages, with quicksort among his most celebrated discoveries.
An algorithm is a way to tell the computer how to execute a particular task step by step. A sorting algorithm is an algorithm that puts the elements of a list in a certain order, be it ascending or descending. Sorting algorithms range from basic ones that compare and swap adjacent items to ones that took decades to be discovered and years of research to make even more efficient. Some of them include selection sort, insertion sort, quick sort, radix sort, etc.
Let’s look at some stats, shall we?
In this comparison, I’m picking three sorting methods: the infamous bubble sort (the first thing you’ll learn from a “How to become a programmer in 30 days” book, because it is probably the simplest sorting algorithm ever), insertion sort, and quick sort.
We can infer the following points from the graph above:
- The x-axis represents the number of elements sorted, and the y-axis the time it took to sort them.
- Bubble sort’s curve would make a dream-come-true graph for a company’s profit. Jokes apart, it performs so badly that it shouldn’t even be in the syllabus. I only put it in this comparison to make you hate bubble sort even more.
- Quick sort amazed me. Weren’t you amazed?! The graph speaks for itself: quick sort seems to not want to let go of the x-axis. It is not a mistake, it is for real, and a shallow slope like that is a great quality for a sorting algorithm to have.
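If you'd like to reproduce a comparison like this yourself, here's a minimal timing harness for the two quadratic contenders. This is my own sketch, not the original measurement code, and absolute timings will vary from machine to machine:

```python
import random
import timeit

def bubble_sort(a):
    """Repeatedly swap adjacent out-of-order pairs."""
    n = len(a)
    for i in range(n):
        for j in range(n - 1 - i):
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]

def insertion_sort(a):
    """Insert each element into its place among the already-sorted prefix."""
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key

random.seed(1)
for n in (500, 1000, 2000):
    data = [random.random() for _ in range(n)]
    # sort a fresh copy each time so the input stays unsorted
    tb = timeit.timeit(lambda: bubble_sort(data[:]), number=1)
    ti = timeit.timeit(lambda: insertion_sort(data[:]), number=1)
    print(f"n={n}: bubble {tb:.3f}s  insertion {ti:.3f}s")
```

Both are O(n²), but you should see insertion sort beating bubble sort by a noticeable constant factor, and both times roughly quadrupling each time n doubles.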
Insertion sort was widely used before this algorithm stole the show. Shown below are rough empirical estimates of the running times of insertion sort and quick sort, on both a general-purpose home PC and a supercomputer.
Why is it so efficient?
The algorithm is efficient mainly because it requires a very small amount of extra memory, which makes it cache friendly. It sorts in place, so the only extra space it needs is the recursion stack, which is logarithmic in the input size on average. Since you won't need much extra memory, the processor isn't left sitting idle waiting for memory operations or switching to a different process instead.
Also, it takes on the approach of the divide and conquer, which is usually very effective in solving a problem. We all know how well the East India Company used this approach to break entire countries apart, don’t we?
Disclaimer: You may find some really disturbing, super-technical stuff in the next section. Reader's curiosity is mandatory.
Below are the basic steps describing the approach of the quick sort algorithm.
- Randomly shuffle the array [needed for the performance guarantee, as quick sort performs better when the values are randomly ordered]
- Partition the array so that, for some j:
- Entry a[j] is in place
- No larger entry is to the left of j
- No smaller entry is to the right of j
- Sort each of the two remaining pieces recursively, the same way
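The steps above can be sketched in code. This is a minimal illustration, assuming the classic Hoare-style two-pointer partition that matches the description; a production implementation would add refinements like cutting over to insertion sort for tiny subarrays:

```python
import random

def quicksort(a):
    random.shuffle(a)          # step 1: shuffle for the performance guarantee
    _sort(a, 0, len(a) - 1)

def _sort(a, lo, hi):
    if lo >= hi:               # zero or one element: already sorted
        return
    j = _partition(a, lo, hi)  # step 2: a[j] ends up in its final place
    _sort(a, lo, j - 1)        # step 3: recursively sort the left piece
    _sort(a, j + 1, hi)        #         ... and the right piece

def _partition(a, lo, hi):
    """Partition around a[lo]: smaller items left, larger items right."""
    pivot = a[lo]
    i, j = lo + 1, hi
    while True:
        while i <= hi and a[i] < pivot:   # scan right past smaller items
            i += 1
        while a[j] > pivot:               # scan left past larger items
            j -= 1
        if i >= j:                        # pointers crossed: done
            break
        a[i], a[j] = a[j], a[i]           # swap the two misplaced items
        i += 1
        j -= 1
    a[lo], a[j] = a[j], a[lo]             # drop the pivot into its final slot
    return j
```

Usage is just `quicksort(my_list)`; the list is sorted in place, which is exactly why the extra-memory footprint stays so small.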
Now that you've crossed that section, here's an interesting fact: a naive quick sort that always picks the first element as its pivot performs worst when the numbers are already sorted! That is exactly what the random shuffle in step one protects against.
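You can see this worst case directly by counting comparisons. The sketch below (my own illustration, using the simple Lomuto partition with the first element as the pivot and no shuffle) shows that on already-sorted input every partition peels off just one element, so the comparison count grows quadratically:

```python
def comparisons_on_sorted(n):
    """Count comparisons a naive first-pivot quicksort makes on [0..n-1]."""
    a = list(range(n))             # already sorted: the pathological input
    count = 0
    stack = [(0, n - 1)]           # iterative, to dodge deep recursion
    while stack:
        lo, hi = stack.pop()
        if lo >= hi:
            continue
        pivot = a[lo]              # naive choice: first element, no shuffle
        j = lo
        for k in range(lo + 1, hi + 1):   # Lomuto-style scan
            count += 1
            if a[k] < pivot:
                j += 1
                a[j], a[k] = a[k], a[j]
        a[lo], a[j] = a[j], a[lo]
        stack.append((lo, j - 1))  # on sorted input this side is empty...
        stack.append((j + 1, hi))  # ...and this side has n-1 elements left
    return count

print(comparisons_on_sorted(100))   # 4950 == 100*99/2: quadratic growth
print(comparisons_on_sorted(200))   # 19900 == 200*199/2
```

Doubling n roughly quadruples the work, which is the O(n²) worst case in action; with the shuffle, the expected count drops to about 2n ln n.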
In the end, it does matter
As we look more and more into this subject, we infer one important idea behind all these not merely good but great algorithms: reduce the overhead, be it memory, running time, power consumption, or anything else. As I quoted earlier, "Good algorithms are better than supercomputers", especially in this tremendously fast-paced field of computer science. There are still a few problems left unsolved, and it does matter to add your part of the spice to this delicious recipe called algorithms.