The main idea of asymptotic analysis is to measure the efficiency of an algorithm independently of machine-specific constants, mainly because this kind of analysis does not require implementing the algorithms and comparing actual running times. We have already discussed the three main asymptotic notations. The following two asymptotic notations are also used to describe the time complexity of algorithms.

Little-o (ο) asymptotic notation

Big-O is used as a tight upper bound on the growth of an algorithm's running time (described by a function f(n)), although, as commonly written, it can also be a loose upper bound. The little-o notation (ο()) is used to describe an upper bound that is not asymptotically tight.

Definition: Let f(n) and g(n) be functions that map positive integers to positive real numbers. We say f(n) ∈ ο(g(n)) if, for every real constant c > 0, there exists an integer constant n0 ≥ 1 such that 0 ≤ f(n) < c*g(n) for every integer n ≥ n0. Thus ο() means that f(n) has a strictly loose upper bound: little-o is a rough overestimate of the order of growth, whereas Big-O may be the actual order of growth. In terms of limits, f(n) ∈ ο(g(n)) means lim_{n→∞} f(n)/g(n) = 0.

Example:

7n + 8 ∈ ο(n²)

For this to hold, for every c we must be able to find an n0 such that f(n) < c*g(n) for all n ≥ n0. Let's try some values. If c = 100, the inequality clearly holds. If c = 1/100, we have to look a little harder, but we can still find an n0 (try n0 = 1000). From these examples the claim appears to be correct. Now check the limit: lim_{n→∞} f(n)/g(n) = lim_{n→∞} (7n + 8)/n² = lim_{n→∞} 7/(2n) = 0 (by L'Hôpital's rule).

So 7n + 8 ∈ ο(n²).
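As a quick sanity check (not a proof), the defining inequality can be spot-checked numerically. The short Python sketch below uses the constant c = 1/100 and starting point n0 = 1000 from the discussion above and verifies 0 ≤ 7n + 8 < c*n² over a finite range of n; the helper name and the sampled range are my own choices, for illustration only.

def little_o_spot_check(f, g, c, n0, n_max):
    # Check 0 <= f(n) < c*g(n) for every sampled n in [n0, n_max].
    # This is only a finite spot check, not a proof of the asymptotic claim.
    return all(0 <= f(n) < c * g(n) for n in range(n0, n_max + 1))

f = lambda n: 7 * n + 8   # the function being bounded
g = lambda n: n ** 2      # the comparison function

print(little_o_spot_check(f, g, c=100, n0=1, n_max=10_000))        # True
print(little_o_spot_check(f, g, c=1 / 100, n0=1000, n_max=10_000)) # True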

Little-omega (ω) asymptotic notation

Definition: Let f(n) and g(n) be functions that map positive integers to positive real numbers. We say f(n) ∈ ω(g(n)) if, for every real constant c > 0, there exists an integer constant n0 ≥ 1 such that f(n) > c*g(n) ≥ 0 for every integer n ≥ n0.

Intuitively, f(n) has a strictly higher growth rate than g(n). The main difference between Big-Omega and little-omega lies in their definitions: for f(n) = Ω(g(n)), the bound 0 ≤ c*g(n) ≤ f(n) needs to hold for some constant c > 0, whereas for f(n) = ω(g(n)), the bound 0 ≤ c*g(n) < f(n) must hold for every constant c > 0.

The relationship between Big-Omega (Ω) and little-omega (ω) is similar to that between Big-O and little-o, except that now we are looking at lower bounds. Little-omega is a rough underestimate of the order of growth, while Big-Omega may represent the exact order of growth. We use the ω notation to denote a lower bound that is not asymptotically tight. Moreover, f(n) ∈ ω(g(n)) if and only if g(n) ∈ ο(f(n)).
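To illustrate this duality with the earlier example, the limit test can be evaluated symbolically. Below is a minimal sketch, assuming the third-party library sympy is available (my choice of tool, not something the article relies on), checking that n² ∈ ω(7n + 8) while, equivalently, 7n + 8 ∈ ο(n²).

import sympy as sp

n = sp.symbols('n', positive=True)
f = n ** 2       # candidate for omega(7n + 8)
g = 7 * n + 8

# f ∈ ω(g) when lim f/g = ∞ ...
print(sp.limit(f / g, n, sp.oo))   # oo
# ... which by duality is the same as g ∈ ο(f), i.e. lim g/f = 0.
print(sp.limit(g / f, n, sp.oo))   # 0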

In terms of limits, f(n) ∈ ω(g(n)) means lim_{n→∞} f(n)/g(n) = ∞.

Example:

Prove that 4n + 6 ∈ ω(1).

Membership in little-omega (ω) can be proved by applying the limit test given above: if lim_{n→∞} f(n)/g(n) = ∞, then f(n) ∈ ω(g(n)). Here we have f(n) = 4n + 6 and g(n) = 1, and lim_{n→∞} (4n + 6)/1 = ∞. Equivalently, for any constant c we can find an n0 such that 0 ≤ c*g(n) < f(n) for every n ≥ n0, that is, 0 ≤ c*1 < 4n + 6.
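Again as a finite sanity check rather than a proof, the sketch below picks an arbitrary constant c, derives a matching n0 from the inequality c < 4n + 6, and spot-checks 0 ≤ c*g(n) < f(n) over a range of n; the helper name and the particular c are my own, for illustration.

def little_omega_spot_check(f, g, c, n0, n_max):
    # Check 0 <= c*g(n) < f(n) for every sampled n in [n0, n_max].
    # A finite spot check only; it does not prove the asymptotic claim.
    return all(0 <= c * g(n) < f(n) for n in range(n0, n_max + 1))

f = lambda n: 4 * n + 6   # the function being bounded from below
g = lambda n: 1           # the comparison function (a constant)

c = 1000                            # any positive constant should work
n0 = max(1, (c - 6) // 4 + 1)       # smallest valid n with 4n + 6 > c
print(n0)                                                    # 249
print(little_omega_spot_check(f, g, c, n0, n_max=100_000))   # True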