If it takes a given computer one second on average to run Quicksort on an array of 1000 records, how long (to the nearest thousand seconds) will it take to run Quicksort on 1,000,000 records? (Hint: You know from this statement that the machine can do about 10,000 comparisons per second. To get the answer, you first need to compute about how many total comparisons 1,000,000 records will require.)

2000

    What is Quicksort's average case running time?

    It is \Theta(n \log n)

    This means about 10 \cdot 1000 = 10,000 comparisons ran in one second for an input of 1000 records, since \log_2 1000 \approx 10

    What is n \log n when n = 1,000,000?

    It is about 20 \cdot 1,000,000 = 20,000,000, since \log_2 1,000,000 \approx 20

    How many seconds is 20,000,000/10,000?
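
    Putting these steps together, here is a minimal sketch (in Python, assuming the machine's rate of roughly 10,000 comparisons per second measured on the small input carries over unchanged to the larger one) of the whole estimate:

    import math

    # Observed: Quicksort on n0 = 1000 records takes about 1 second on average.
    n0, t0 = 1000, 1.0

    # Average-case work is proportional to n * log2(n), so estimate the
    # machine's speed in comparisons per second from the small run (~10,000).
    comparisons_per_second = n0 * math.log2(n0) / t0

    # Work required for n1 = 1,000,000 records (~20,000,000 comparisons).
    n1 = 1_000_000
    work = n1 * math.log2(n1)

    # Estimated time, rounded to the nearest thousand seconds.
    estimate = work / comparisons_per_second
    print(round(estimate, -3))   # -> 2000.0 seconds
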