Basic Algorithms

================ Start Lecture #2 ================

Homework: I should have given some last time. It is listed in the notes (search for homework). Also some will be listed this time. BUT, due to the Jewish holiday, none is officially assigned. You can get started if you wish since all will eventually be assigned, but none will be collected next class.

1.2: Asymptotic Notation

Now we are going to be less precise and worry only about approximate answers for large inputs.

1.2.1: The Big-Oh Notation

Definition: Let f(n) and g(n) be real-valued functions of a single non-negative integer argument. We write f(n) is O(g(n)) if there is a positive real number c and a positive integer n0 such that f(n)≤cg(n) for all n≥n0.

What does this mean?

For large inputs (n≥n0), f is not much bigger than g (f(n)≤cg(n)).
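For example, take f(n)=3n-6 and g(n)=n: since 3n-6 ≤ 3n for every n≥1, the choice c=3 and n0=1 witnesses that 3n-6 is O(n) (this is example 1 below).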

Examples to do on the board.

  1. 3n-6 is O(n). Some less common ways of saying the same thing follow.
  2. 3x-6 is O(x).
  3. If f(y)=3y-6 and id(y)=y, then f(y) is O(id(y)).
  4. 3n-6 is O(2n).
  5. 9n^4+12n^2+1234 is O(n^4).
  6. innerProduct is O(n) (see the sketch after this list).
  7. innerProductBetter is O(n).
  8. innerProductFixed is O(n).
  9. countPositives is O(n).
  10. n+log(n) is O(n).
  11. log(n)+5log(log(n)) is O(log(n)).
  12. 1234554321 is O(1).
  13. 3/n is O(1). True but not the best.
  14. 3/n is O(1/n). Much better.
  15. innerProduct is O(100n+log(n)+34.5). True, but awful.
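The O(n) bounds in examples 6-9 come from counting loop iterations. As a concrete illustration, here is a minimal Java sketch in the spirit of the book's innerProduct (the exact code in the text may differ): a single loop over the n entries, with constant work per iteration, gives O(n). countPositives has the same one-pass shape.

    // A sketch in the spirit of innerProduct (not the book's exact code);
    // assumes a and b have the same length n.
    public static double innerProduct(double[] a, double[] b) {
        double sum = 0;                      // constant-time setup
        for (int i = 0; i < a.length; i++)   // the loop body runs n times
            sum += a[i] * b[i];              // constant work per iteration
        return sum;                          // total running time is O(n)
    }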

A few theorems give us rules that make calculating big-Oh easier.

Theorem (arithmetic): Let d(n), e(n), f(n), and g(n) be nonnegative real-valued functions of a nonnegative integer argument and assume d(n) is O(f(n)) and e(n) is O(g(n)). Then

  1. a·d(n) is O(f(n)) for any nonnegative constant a
  2. d(n)+e(n) is O(f(n)+g(n))
  3. d(n)e(n) is O(f(n)g(n))

Theorem (transitivity): Let d(n), f(n), and g(n) be nonnegative real-valued functions of a nonnegative integer argument and assume d(n) is O(f(n)) and f(n) is O(g(n)). Then d(n) is O(g(n)).
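For example, these theorems let us redo example 5 above without hunting for constants: n^2 is O(n^4) (take c=1, n0=1), so 9n^4, 12n^2, and 1234 are each O(n^4) by arithmetic rule 1; rule 2 (applied twice) then gives that 9n^4+12n^2+1234 is O(n^4+n^4+n^4), i.e. O(3n^4); and since 3n^4 is O(n^4), transitivity shows the whole sum is O(n^4).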

Theorem (special functions): (Only n varies)

  1. If f(n) is a polynomial of degree d, then f(n) is O(n^d).
  2. n^x is O(a^n) for any x>0 and a>1.
  3. log(n^x) is O(log(n)) for any x>0.
  4. (log(n))^x is O(n^y) for any x>0 and y>0.
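For example, rule 1 gives example 5 above in a single step, and rule 4 says that even a large power of a logarithm grows more slowly (in the big-Oh sense) than any positive power of n, e.g. (log(n))^100 is O(n^0.01).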

    Homework: R-1.19 and R-1.20.

Definitions: (Common names)
  1. If a function is O(log(n)) we call it logarithmic.
  2. If a function is O(n) we call it linear.
  3. If a function is O(n^2) we call it quadratic.
  4. If a function is O(n^k) with k≥1, we call it polynomial.
  5. If a function is O(a^n) with a>1, we call it exponential.
Remark: The last definitions would be better with a relative of big-Oh, namely big-Theta, since, for example, 3log(n) is O(n^2), but we do not call 3log(n) quadratic.

Homework: R-1.10 and R-1.12.

R-1.13: The outer (i) loop is done 2n times. For each outer iteration the inner loop is done i times. Each inner iteration is a constant number of steps, so each inner loop is O(i), which is the time for the ith iteration of the outer loop. So the entire outer loop is Σ O(i), summed over the 2n values of i, which is O(n^2).
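Here is a hedged Java sketch of the kind of loop nest being analyzed (the exact fragment for R-1.13 is in the text; loopCount is just an illustrative name for its shape):

    // Illustrative shape of the R-1.13 loop nest (not the textbook's exact code).
    public static int loopCount(int n) {
        int count = 0;
        for (int i = 0; i < 2 * n; i++)   // outer loop: 2n iterations
            for (int j = 0; j < i; j++)   // inner loop: i iterations on pass i
                count++;                  // constant work, done 0+1+...+(2n-1) times
        return count;                     // so the total is O(n^2)
    }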

1.2.2: Relatives of the Big-Oh

Big-Omega and Big-Theta

Recall that f(n) is O(g(n)) if, for large n, f is not much bigger than g. That is, g is some sort of upper bound on f. How about a definition for the case when g is (in the same sense) a lower bound for f?

Definition: Let f(n) and g(n) be real-valued functions of an integer variable. Then f(n) is Ω(g(n)) if g(n) is O(f(n)).

Remarks:

  1. We pronounce f(n) is Ω(g(n)) as "f(n) is big-Omega of g(n)".
  2. What the last definition says is that we say f(n) is not much smaller than g(n) if g(n) is not much bigger than f(n), which sounds reasonable to me.
  3. What if f(n) and g(n) are about equal, i.e., neither is much bigger than the other?

Definition: We write f(n) is Θ(g(n)) if both f(n) is O(g(n)) and f(n) is Ω(g(n)).

Remark: We pronounce f(n) is Θ(g(n)) as "f(n) is big-Theta of g(n)".
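To see how such a claim is verified straight from the definitions, take f(n)=2n^2+3n and g(n)=n^2 (example 1 below, with n in place of x): 2n^2+3n ≤ 5n^2 for all n≥1, so f(n) is O(n^2); and n^2 ≤ 2n^2+3n for all n≥1, so n^2 is O(f(n)), i.e. f(n) is Ω(n^2). Hence 2n^2+3n is Θ(n^2).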

Examples to do on the board.

  1. 2x^2+3x is Θ(x^2).
  2. 2x^3+3x is not Θ(x^2).
  3. 2x^3+3x is Ω(x^2).
  4. innerProductRecursive is Θ(n) (see the sketch after this list).
  5. binarySearch is Θ(log(n)). Unofficial for now.
  6. If f(n) is Θ(g(n)), then f(n) is Ω(g(n)).
  7. If f(n) is Θ(g(n)), then f(n) is O(g(n)).
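Since innerProductRecursive is the book's routine, the code below is only a minimal sketch of a recursive inner product with the same flavor (the book's version may differ). The point is that there is one recursive call per index and constant work per call, so the running time is Θ(n). It assumes the two arrays have the same length and is called as innerProductRecursive(a, b, 0).

    // A minimal recursive inner-product sketch (not necessarily the book's
    // innerProductRecursive); assumes a and b have the same length.
    public static double innerProductRecursive(double[] a, double[] b, int i) {
        if (i >= a.length)            // base case: past the last index, constant work
            return 0;
        return a[i] * b[i]            // constant work at index i ...
            + innerProductRecursive(a, b, i + 1);  // ... plus one recursive call
    }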

Homework: R-1.6

Little-Oh and Little-Omega

Recall that big-Oh captures the idea that for large n, f(n) is not much bigger than g(n). Now we want to capture the idea that, for large n, f(n) is tiny compared to g(n).

If you remember limits from calculus, what we want is that f(n)/g(n)→0 as n→∞. However, the definition we give does not use limits (it essentially has the definition of a limit built in).

Definition: Let f(n) and g(n) be real-valued functions of an integer variable. We say f(n) is o(g(n)) if for any c>0, there is an n0 such that f(n)≤cg(n) for all n>n0. This is pronounced as "f(n) is little-oh of g(n)".

Definition: Let f(n) and g(n) be real-valued functions of an integer variable. We say f(n) is ω(g(n)) if g(n) is o(f(n)). This is pronounced as "f(n) is little-omega of g(n)".

Examples: log(n) is o(n) and n^2 is ω(n·log(n)).
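For the first claim, the definition is satisfied because for any c>0 we have log(n) ≤ c·n for all sufficiently large n (the limit idea mentioned above: log(n)/n → 0 as n → ∞). The second claim says n·log(n) is o(n^2), which follows the same way, since (n·log(n))/n^2 = log(n)/n → 0.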

Homework: R-1.4 and R-1.22.