Introduction to Algorithms No.8

Hello. I have completed Chapter 4, "Divide and Conquer." I have learned how to find the running time of divide-and-conquer algorithms. First, I write down a recurrence for the divide-and-conquer algorithm. Taking merge sort as an example, the recurrence is as follows:

T(n) = T(floor(n/2)) + T(ceil(n/2)) + Theta(n)
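
To see where each term of this recurrence comes from, here is a minimal merge sort in C (an illustration of my own, not the program linked in the p.s.): the two recursive calls sort the two halves, of sizes ceil(n/2) and floor(n/2), and the merge step looks at every element once, which gives the Theta(n) term.

-------
/*
 * A minimal merge sort sketch in C (illustration only, not the program
 * linked in the p.s.).  The comments mark where each term of the
 * recurrence T(n) = T(floor(n/2)) + T(ceil(n/2)) + Theta(n) comes from.
 */
#include <stdio.h>
#include <string.h>

static void merge(int a[], int lo, int mid, int hi)
{
    int tmp[hi - lo + 1];
    int i = lo, j = mid + 1, k = 0;

    /* Merging the two sorted halves looks at every element once: Theta(n). */
    while (i <= mid && j <= hi)
        tmp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
    while (i <= mid)
        tmp[k++] = a[i++];
    while (j <= hi)
        tmp[k++] = a[j++];

    memcpy(&a[lo], tmp, sizeof tmp);
}

static void merge_sort(int a[], int lo, int hi)
{
    if (lo >= hi)
        return;                      /* size 0 or 1: already sorted */

    int mid = lo + (hi - lo) / 2;
    merge_sort(a, lo, mid);          /* left half has ceil(n/2) elements:  T(ceil(n/2))  */
    merge_sort(a, mid + 1, hi);      /* right half has floor(n/2) elements: T(floor(n/2)) */
    merge(a, lo, mid, hi);           /* Theta(n) */
}

int main(void)
{
    int a[] = {5, 2, 4, 7, 1, 3, 2, 6};
    int n = sizeof a / sizeof a[0];

    merge_sort(a, 0, n - 1);
    for (int i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
    return 0;
}
-------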

Then, I can prove the following proposition:

T(n) in Theta(n*log(n))

-------
f(n) in O(g(n)) <=> there exists a positive constant c such that f(n) <= c * g(n) for all sufficiently large n.
f(n) in Omega(g(n)) <=> there exists a positive constant c such that f(n) >= c * g(n) for all sufficiently large n.
f(n) in Theta(g(n)) <=> f(n) in O(g(n)) and f(n) in Omega(g(n)).
-------
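
As a concrete illustration of these definitions (an example of my own, not one from the book), take f(n) = 3*n^2 + 5*n and g(n) = n^2. Then

3*n^2 <= 3*n^2 + 5*n <= 4*n^2 for all n >= 5,

so the constants c = 3 (for Omega) and c = 4 (for O) show that f(n) in Theta(n^2).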

In other words, I can prove:

-------
For T(n) = T(floor(n/2)) + T(ceil(n/2)) + Theta(n), there exist two positive constants c_1 and c_2 such that c_1*n*log(n) <= T(n) <= c_2*n*log(n) for all sufficiently large n.
-------
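
One way to prove the upper bound is the substitution method: guess T(n) <= c_2*n*log(n) and substitute the guess into the recurrence. Here is a rough sketch, simplified by assuming n is a power of 2 so that the floor and ceiling disappear (the book handles the general case); a is the constant hidden in the Theta(n) term, and log is base 2.

-------
T(n) = 2*T(n/2) + a*n
     <= 2*c_2*(n/2)*log(n/2) + a*n
     = c_2*n*log(n) - c_2*n + a*n
     <= c_2*n*log(n)    (as long as c_2 >= a)
-------

The lower bound T(n) >= c_1*n*log(n) follows in the same way with any c_1 <= a.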

Therefore, I can conclude that the running time of merge sort is Theta(n*log(n)).

Thank you.


p.s.
I wrote about merge sort in No.5.
And the C program is here: https://github.com/uriha421/introduction_to_algorithms/blob/master/I_Foundations/merge_sort.c
