
For a Turing machine, time complexity is the number of tape-head moves the machine makes when run on some input, and space complexity is the number of tape cells written.

Time complexity for all reasonable functions −

**T(n) = O(n log n)**

TM’s space complexity −

**S(n) = O(n)**
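These two measures can be made concrete with a toy simulator. The sketch below is a hypothetical one-state machine that scans right and marks every input cell, counting head moves (time) and distinct cells written (space); the function name `run_tm` is illustrative, not a standard API.

```python
def run_tm(tape):
    """Scan right, overwrite every symbol with 'X', halt at the first blank.

    Returns (moves, cells_written): the time and space used on this input.
    """
    tape = list(tape)
    moves = 0        # time complexity: number of head moves
    written = set()  # space complexity: tape cells written
    head = 0
    while head < len(tape):
        tape[head] = 'X'
        written.add(head)
        head += 1    # move the head one cell to the right
        moves += 1
    return moves, len(written)

print(run_tm("1011"))  # → (4, 4): on input of length n, T(n) = n and S(n) = n
```

On an input of length n this machine makes n moves and writes n cells, so T(n) = O(n) and S(n) = O(n).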

**1. Big Oh notation (O):**

It defines an upper bound on an algorithm: the most time the algorithm can take (its worst-case performance).

**Big oh notation** is used to describe **asymptotic upper bound**.

Mathematically, if **f(n)** describes the running time of an algorithm, then **f(n)** is **O(g(n))** if there exist positive constants **C** and **n0** such that

0 <= f(n) <= Cg(n) for all n >= n0

**Big Oh** is used to give an upper bound on a function.

If a function is **O(n)**, it is automatically **O(n²)** as well.
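The definition can be checked numerically for small inputs. A minimal sketch, using the assumed witness f(n) = 3n + 5 with g(n) = n, C = 4, and n0 = 5 (any constants satisfying the inequality would do):

```python
# Verify the Big-Oh condition 0 <= f(n) <= C*g(n) for all n >= n0,
# with the assumed witness f(n) = 3n + 5, g(n) = n, C = 4, n0 = 5.
f = lambda n: 3 * n + 5
g = lambda n: n
C, n0 = 4, 5

# 3n + 5 <= 4n holds exactly when n >= 5, so the check passes.
assert all(0 <= f(n) <= C * g(n) for n in range(n0, 10_000))
```

A finite check like this cannot prove the bound for all n, but it is a quick sanity test of a chosen C and n0.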

Graphical example for **Big Oh (O):**

**2. Big Omega notation (Ω):**

It defines a lower bound on an algorithm: the least amount of time the algorithm requires (the most efficient case, in other words, the best case).

Just like **O** **notation** provides an **asymptotic upper bound**, **Ω** **notation** provides an **asymptotic lower bound**.

Let **f(n)** define the running time of an algorithm;

**f(n)** is said to be **Ω(g(n))** if there exist positive constants **C** and **n0** such that

0 <= Cg(n) <= f(n) for all n >= n0

**Ω** is used to give a lower bound on a function.

If a function is **Ω(n²)**, it is automatically **Ω(n)** as well.
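The Ω condition can be sanity-checked the same way. A sketch with the assumed witness f(n) = 2n² − 3n, g(n) = n², C = 1, and n0 = 3:

```python
# Verify the Big-Omega condition 0 <= C*g(n) <= f(n) for all n >= n0,
# with the assumed witness f(n) = 2n^2 - 3n, g(n) = n^2, C = 1, n0 = 3.
f = lambda n: 2 * n * n - 3 * n
g = lambda n: n * n
C, n0 = 1, 3

# 2n^2 - 3n >= n^2 holds exactly when n >= 3, so the check passes.
assert all(0 <= C * g(n) <= f(n) for n in range(n0, 10_000))
```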

Graphical example for **Big Omega (Ω):**

**3. Big Theta notation (Θ):**

It defines the tightest bound: the function is bounded both from above and from below by the same g(n), up to constant factors, so g(n) describes the exact growth rate.

Let **f(n)** define the running time of an algorithm.

**f(n)** is said to be **Θ(g(n))** if **f(n)** is **O(g(n))** and **f(n)** is **Ω(g(n))**.

**Mathematically,**

0 <= f(n) <= C1g(n) for n >= n0

0 <= C2g(n) <= f(n) for n >= n0

**Merging both inequalities, we get:**

0 <= C2g(n) <= f(n) <= C1g(n) for n >= n0

**The inequality simply means there exist positive constants C1 and C2 such that f(n) is sandwiched between C2g(n) and C1g(n).**
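The sandwich condition can be checked with both constants at once. A sketch reusing the assumed witness f(n) = 2n² − 3n with g(n) = n², C1 = 2, C2 = 1, and n0 = 3:

```python
# Verify the Big-Theta sandwich 0 <= C2*g(n) <= f(n) <= C1*g(n) for n >= n0,
# with the assumed witness f(n) = 2n^2 - 3n, g(n) = n^2, C1 = 2, C2 = 1, n0 = 3.
f = lambda n: 2 * n * n - 3 * n
g = lambda n: n * n

C1, C2, n0 = 2, 1, 3

# Lower bound: 2n^2 - 3n >= n^2 for n >= 3.
# Upper bound: 2n^2 - 3n <= 2n^2 for all n >= 0.
assert all(0 <= C2 * g(n) <= f(n) <= C1 * g(n) for n in range(n0, 10_000))
```

Because both bounds hold with the same g(n), f(n) = Θ(n²).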

A graphical example of **Big Theta (Θ):**
