I am talking about those comparisons: they are surely made at most N times in total, across this loop:

while ((j < N - 1) && (A[i] - A[j] > D))
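For what it's worth, here is a sketch of why that condition fires at most O(N) times in total: j only ever moves forward, so across all iterations of i the while-test can succeed at most N-1 times. The function name, the sorted input A, and the meaning of D are my assumptions; the original problem statement isn't quoted in this thread.

```python
# Assumption: A is sorted ascending and D >= 0; the task (counting pairs
# with difference <= D) is a guess, but the loop matches the one quoted.
def count_close_pairs(A, D):
    N = len(A)
    j = 0
    total = 0
    for i in range(N):
        # j never moves backwards, so this test succeeds at most N-1
        # times over the WHOLE run: amortized O(1) per i, O(N) in total.
        while (j < N - 1) and (A[i] - A[j] > D):
            j += 1
        total += i - j          # pairs (k, i) with j <= k < i
    return total
```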

This must be something basic, but I don't know.

Please correct me if I'm wrong.

The link is here:

http://community.topcoder.com/tc?module=Static&d1=tutorials&d2=complexity2

I found the article on Computational Complexity: Section 2 very useful. While going through this article, I found a couple of points which looked incorrect to me.

1) In Example 7, the diagram of the recursion tree shows the root node with the value cN log N. Shouldn't it be cN, since we do only Theta(N) work at the root?

2) In the same example there is this statement: "Each of the levels has five times more vertices than the previous one, thus the last level has 5^(log n) levels." It should be corrected to "...thus the last level has 5^(log n) vertices (or nodes)."

i.e. the word "levels" has to be replaced by "vertices"/"nodes".

Regards

Dilip

Could someone please help me out in getting this example right?

Thanks and Regards,

Prabhakar

In example 6, since the total amount of work decreases with each successive level of the tree, we don't have to care much about anything beyond the first term. misof is evaluating that infinite geometric series just to provide reassurance that, yes, the overall complexity of this recursion really is Theta(n^3), just like the root of the tree. You're right that the real recursion we have here is finite, but misof's point is that it doesn't matter: even if we overcount and take the recursion to be infinite, we still get just Theta(n^3).
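A quick numeric sketch of that overcounting argument. The per-level ratio of 1/2 and the c*n^3*r^i work model are assumptions for concreteness; misof's actual Example 6 recurrence may use different constants, but the point is generic.

```python
import math

# Sketch: work at level i modeled as c * n**3 * r**i with assumed r = 1/2.
# Summing only the real log2(n) levels vs. bounding by the infinite
# series gives the same Theta(n^3), differing only by a constant factor.

def level_work(n, i, c=1.0, r=0.5):
    return c * n**3 * r**i

def finite_total(n):
    levels = int(math.log2(n)) + 1          # the real tree stops here
    return sum(level_work(n, i) for i in range(levels))

def infinite_bound(n, c=1.0, r=0.5):
    return c * n**3 / (1 - r)               # sum of the INFINITE series

for n in (64, 1024, 16384):
    f, b = finite_total(n), infinite_bound(n)
    assert f <= b <= 2 * f                  # within a constant factor
```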

In example 7, again, the recursion is actually a finite geometric progression, but we can skip evaluating the exact sum because we know that the bulk of the work is done in the last term, and the sum of all of the other work is O(the work at the bottom level). What he's doing, then, is noting that if you have an increasing finite geometric progression, you can reverse the order of the terms and get a decreasing finite geometric progression instead, just like in example 6. Thus, if we can compute the last term in the progression, we have the asymptotic complexity of the entire recursion, and we can ignore all of the other levels.
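And a matching sketch for the increasing case. The ratio of 5 is taken from the "five times more vertices" quote; the per-level work model c * r^i is my simplification.

```python
# Sketch: in an increasing G.P. the whole finite sum is within a constant
# factor (r / (r - 1) = 1.25 for r = 5) of its last term, so computing
# just the bottom level of the tree already gives the right Theta.

def total_work(levels, c=1.0, r=5.0):
    return sum(c * r**i for i in range(levels))   # forward finite sum

def last_term(levels, c=1.0, r=5.0):
    return c * r**(levels - 1)                    # work at the bottom level

def gp_formula(levels, c=1.0, r=5.0):
    return c * (r**levels - 1) / (r - 1)          # closed-form finite G.P.

L = 12
assert abs(total_work(L) - gp_formula(L)) < 1e-9 * gp_formula(L)
assert last_term(L) <= total_work(L) <= 1.25 * last_term(L)
```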

Is that a little more clear?

(sidenote, when I first read your post, it took me quite a while of scratching my head trying to figure out why you were talking about Gaussian processes...computer science is acronym hell, I tell you... :) )

Any thoughts?

Coming to example 6 under "More recursion trees", why are you summing the series till infinity? There are lg(N) levels, so shouldn't you be summing only the first lg(N) terms? If it doesn't matter, can you tell me why?

Also, there is a typo in the graph at level 3: each vertex should be c(N/4).

Now, example 7: as you said, the total amount of work increases as we go deeper into the tree, so most of the work is done at the lowest level. I don't understand why in this case you are not using the sum of a finite G.P. formula. The only difference between examples 6 and 7 is that you have a decreasing G.P. in 6 and an increasing G.P. in 7. I'm not sure if I'm missing something here!

You calculated the work done at the last level (= c * 5^(log n)).

TL;DR - In both examples under "More recursion trees", why are you not using the sum of a finite G.P. formula to calculate the total amount of work?

I was missing that... that's the point.

I knew that but then I read (from http://en.wikipedia.org/wiki/Context_of_computational_complexity):

And that raised some doubts for me about it.

I'll give that book a try!

Thanks again!