
**O-notation**

1. f(n) = O(g(n))

- For sufficiently large values of n, f grows no faster than g.
- Formally: there exist constants c and n0 such that, for all n >= n0, |f(n)| <= c|g(n)|.

- n = O(n)
- n^2 = O(n^2)
- 3n^2 + 2n + 5 = O(n^2)
- lower-order terms are dropped
- constant coefficients are dropped
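The definition can be checked concretely for the example above. A minimal Python sketch (the witness pair c = 4, n0 = 4 is one valid choice, not the only one):

```python
def f(n):
    return 3 * n**2 + 2 * n + 5  # the example function

def g(n):
    return n**2  # the claimed bound

# One valid witness pair for f(n) = O(g(n)): c = 4, n0 = 4.
c, n0 = 4, 4

# The definition requires |f(n)| <= c|g(n)| for all n >= n0.
assert all(f(n) <= c * g(n) for n in range(n0, 10_000))

# The bound may fail below n0 (here f(3) = 38 > 36 = c*g(3)),
# which is why the definition only asks for sufficiently large n.
assert f(3) > c * g(3)
```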

2. Application to algorithm analysis

- We state the running time of an algorithm using asymptotic notation such as O(n^2).
- This gives only the rate of growth, not the constant coefficients.
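Because the coefficients are hidden, an asymptotically slower algorithm can still do fewer steps on small inputs. A small illustration (the step counts 100n and n^2 are made-up numbers for the example):

```python
# Hypothetical step counts: algorithm A takes 100n steps (O(n) with a
# large constant), algorithm B takes n^2 steps (O(n^2) with constant 1).
def cost_a(n):
    return 100 * n

def cost_b(n):
    return n * n

# For small n the "slower" O(n^2) algorithm does fewer steps...
assert cost_b(10) < cost_a(10)        # 100 < 1_000
# ...but the rate of growth wins in the end (here, once n > 100).
assert cost_a(1_000) < cost_b(1_000)  # 100_000 < 1_000_000
```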

3. Some useful computing times

- O(1) < O(log n) < O(n) < O(n log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)
- log_2 n = O(log n)
- log_k n = O(log n) for any constant base k > 1 (changing the base changes only the constant factor)
- O(n^m) < O(2^n) for any fixed exponent m
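Evaluating these functions side by side shows the ordering emerging; note that it holds only for sufficiently large n (for instance, 2^n = 256 < 512 = n^3 at n = 8). A Python sketch:

```python
import math

# Common growth rates, listed from slowest to fastest
# (n! is omitted only because it dwarfs the rest immediately).
rates = [
    ("1",       lambda n: 1),
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: n),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: n ** 2),
    ("n^3",     lambda n: n ** 3),
    ("2^n",     lambda n: 2 ** n),
]

for n in (16, 32, 64):
    row = [f(n) for _, f in rates]
    print(n, [round(v) for v in row])
    # For these n the values are already in strictly increasing order.
    assert row == sorted(row)
```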

4. When is such an analysis useful?

- Ease of programming: although O(n^2) < O(n^3), we may prefer the O(n^3) algorithm if it is easier to program or has a small constant of proportionality.

- In some cases, the features available on a machine make a particular algorithm especially fast, e.g. parallel hardware.
- We often do worst-case analysis because it is easy, e.g. Quicksort is O(n^2) in the worst case.
- We cannot distinguish two algorithms of the same order using O-notation alone.
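Quicksort's worst case is easy to exhibit by counting comparisons: with the first element as pivot, already-sorted input degenerates to n(n-1)/2 comparisons, while shuffled input stays near n log n. A Python sketch (comparison counting is one common cost measure, chosen here because it is deterministic, unlike wall-clock timing):

```python
import random

def quicksort_comparisons(a):
    """Quicksort with the first element as pivot; returns the number
    of pivot comparisons made instead of the sorted list."""
    if len(a) <= 1:
        return 0
    pivot = a[0]
    smaller = [x for x in a[1:] if x < pivot]
    larger = [x for x in a[1:] if x >= pivot]
    # Every other element is compared against the pivot once.
    return (len(a) - 1
            + quicksort_comparisons(smaller)
            + quicksort_comparisons(larger))

random.seed(0)  # fixed seed so the run is repeatable
n = 200
worst = quicksort_comparisons(list(range(n)))           # sorted input
typical = quicksort_comparisons(random.sample(range(n), n))

assert worst == n * (n - 1) // 2  # 19_900: the O(n^2) worst case
assert typical < worst // 4       # shuffled input is far cheaper
```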

5. Some useful formulas

- O(f(n)) + O(g(n)) = O(f(n) + g(n))
- O(f(n)) x O(g(n)) = O(f(n) x g(n))
- O(n) + O(n^2) = O(n^2)
- O(n) x O(n^2) = O(n x n^2) = O(n^3)
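The sum and product rules correspond directly to sequential and nested loops. A Python sketch counting operations exactly:

```python
def sequential(n):
    """An O(n) loop followed by an O(n^2) loop:
    O(n) + O(n^2) = O(n + n^2) = O(n^2)."""
    ops = 0
    for _ in range(n):
        ops += 1
    for _ in range(n):
        for _ in range(n):
            ops += 1
    return ops  # exactly n + n^2

def nested(n):
    """An O(n) loop around an O(n^2) body:
    O(n) x O(n^2) = O(n x n^2) = O(n^3)."""
    ops = 0
    for _ in range(n):
        for _ in range(n):
            for _ in range(n):
                ops += 1
    return ops  # exactly n^3

assert sequential(100) == 100 + 100 ** 2  # the n^2 term dominates
assert nested(20) == 20 ** 3
```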

study/algorithms/o-notation.txt · Last modified: 2019/02/04 14:26 (external edit)

Except where otherwise noted, content on this wiki is licensed under the following license: CC Attribution-Noncommercial-Share Alike 4.0 International