We can place an upper bound on the execution time of an algorithm using big-O notation. An algorithm that runs in O(n^2) time indicates that execution time increases with the square of the dataset size: if we increase the dataset size by a factor of ten, execution time will increase by a factor of one hundred. A more precise definition of big-O follows.
Assume that execution time is some function t(n), where n is the dataset size. The statement

t(n) = O(g(n))

implies that there exist positive constants c and n0 such that

t(n) <= c·g(n)

for all n greater than or equal to n0. This is illustrated graphically in the following figure.
Figure 1-4: Big-Oh
Big-O notation does not describe the exact time that an algorithm takes; it indicates only an asymptotic upper bound on execution time, to within a constant factor. If an algorithm takes O(n^2) time, then execution time grows no worse than the square of the dataset size.
Table 1-1 illustrates growth rates for various functions. A growth rate of O(lg n) occurs for algorithms similar to binary search. The lg (logarithm, base 2) function increases by one when n is doubled. Recall that we can search twice as many items with one more comparison in binary search. Thus binary search is an O(lg n) algorithm.
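A minimal binary search sketch makes the O(lg n) behavior visible: each comparison halves the remaining range, so doubling n costs only one more comparison.

```c
/* Binary search over a sorted array of n ints.
 * Each iteration halves [lo, hi], so at most about lg n
 * comparisons are made. Returns the index of key, or -1. */
static int binary_search(const int *a, int n, int key) {
    int lo = 0, hi = n - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;  /* avoids overflow of lo + hi */
        if (a[mid] == key)
            return mid;
        else if (a[mid] < key)
            lo = mid + 1;              /* discard lower half */
        else
            hi = mid - 1;              /* discard upper half */
    }
    return -1;                         /* not found */
}
```

For example, searching the sorted array {2, 5, 8, 12, 16} for 12 returns index 3, after inspecting only the middle elements along the way.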
If the values in Table 1-1 represented microseconds, then an O(lg n) algorithm may take 20 microseconds to process 1,048,576 items, an O(n^1.25) algorithm about 33 seconds, and an O(n^2) algorithm up to 12 days! In the following chapters a timing estimate for each algorithm, using big-O notation, will be included. For a more formal derivation of these formulas you may wish to consult the references.