If you drop the constant, then it becomes O(n).
But conceptually, it's still half the loop.
In short, this is a complexity question.
Is n/2 = n?
O(n/2) is the same as O(n), because what's being measured is how fast the execution time/memory/etc. grows as n increases. https://en.m.wikipedia.org/wiki/Big_O_notation
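To make that concrete, here is a minimal sketch (the function name `half_loop` is just for illustration): a loop that does ~n/2 iterations still doubles its work when n doubles, which is exactly linear growth.

```python
def half_loop(n):
    """Count iterations of a loop that visits every other index (~n/2 steps)."""
    count = 0
    for i in range(0, n, 2):  # step of 2: roughly n/2 iterations
        count += 1
    return count

print(half_loop(1000))  # 500
print(half_loop(2000))  # 1000 -- work doubles when n doubles, hence O(n)
```

The constant factor 1/2 changes the absolute count, never the growth rate.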
True, sir. I am looking into the definition of asymptotic notation, and my doubt is: does Big Oh care about constants at all? Ref: https://xlinux.nist.gov/dads/HTML/bigOnotation.html

Excerpt from the above: "Definition: A theoretical measure of the execution of an algorithm, usually the time or memory needed, given the problem size n, which is usually the number of items. Informally, saying some equation f(n) = O(g(n)) means it is less than some constant multiple of g(n). The notation is read, 'f of n is big oh of g of n'."

But I am confused by this part: "Formal Definition: f(n) = O(g(n)) means there are positive constants c and k, such that 0 ≤ f(n) ≤ cg(n) for all n ≥ k. The values of c and k must be fixed for the function f and must not depend on n."

I get the gist, but I am still confused by this formal definition.
That is just how you would rigorously define it. For example, n^3 + 3n^2 + 1 = O(n^3), since 0 ≤ n^3 + 3n^2 + 1 ≤ 3n^3 for all n ≥ 2 (here c = 3, k = 2). Likewise, n/2 = O(n), since 0 ≤ n/2 ≤ 3n for all n ≥ 0 (here c = 3, k = 0).
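You can sanity-check those witness constants numerically. This is a quick sketch (the helper `bounded` is hypothetical, and it only checks a finite range of n, not a proof):

```python
def bounded(f, g, c, k, upto=1000):
    """Check that 0 <= f(n) <= c*g(n) holds for every n in [k, upto]."""
    return all(0 <= f(n) <= c * g(n) for n in range(k, upto + 1))

# f(n) = n^3 + 3n^2 + 1 vs g(n) = n^3, with c = 3, k = 2
print(bounded(lambda n: n**3 + 3*n**2 + 1, lambda n: n**3, c=3, k=2))  # True

# f(n) = n/2 vs g(n) = n, with c = 3, k = 0
print(bounded(lambda n: n / 2, lambda n: n, c=3, k=0))  # True
```

If you shrink c or k below a working pair, the check starts failing for small n, which shows why the definition lets you pick both constants freely.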
To calculate Big O, just think about which class of functions the complexity falls under.