Which time complexity is true?

Thread: Which time complexity is true?

  1. #1
    Join Date
    Mar 2012
    Posts
    19

    Which time complexity is true?

    Hi,
    Which time complexity is true?
    O(n)
    O(nlogn)
    O(logn)
    O(2^n/2)
    [Attached image: gifa.gif — the recurrence being asked about]

  2. #2
    Join Date
    Mar 2012
    Posts
    19

    Re: Which time complexity is true?

    Please help me

  3. #3
    Join Date
    Dec 2012
    Location
    England
    Posts
    2,262

    Re: Which time complexity is true?

    True for what? You have sqrt(2)*sqrt(F(n/2)). You don't specify what F() is. My limited understanding is that O() for evaluation would be dependent upon F().
    All advice is offered in good faith only. You are ultimately responsible for effects of your programs and the integrity of the machines they run on.

  4. #4
    Join Date
    Mar 2012
    Posts
    19

    Re: Which time complexity is true?

    Quote Originally Posted by 2kaud View Post
    True for what? You have sqrt(2)*sqrt(F(n/2)). You don't specify what F() is. My limited understanding is that O() for evaluation would be dependent upon F().
    Unfortunately, I don't understand that.
    I want to know the function's time complexity.

    http://forums.codeguru.com/attachmen...5&d=1401537827

    O(n) ?
    O(nlogn)??
    O(logn)???
    O(2^n/2)???
    Last edited by irpersian20; June 1st, 2014 at 12:57 PM.

  5. #5
    Join Date
    Aug 2013
    Posts
    55

    Re: Which time complexity is true?

    Quote Originally Posted by irpersian20 View Post
    Which time complexity is true?
    As has been said, the question could be clearer.

    But say the posted square-root expression describes F(n) recursively, and that the stop criterion is when n reaches 0, that is, the function stops at F(0). Then the function would evaluate recursively to a closed-form expression like this,

    F(n) = sqrt(2*sqrt(2*sqrt( ... 2*sqrt(2*F(0)) ... )))

    where the number of recursive invocations will depend on n. And since n is halved in each invocation, the number of invocations will be proportional to the logarithm of n. That means the function is O(log N).
    Last edited by zizz; June 4th, 2014 at 01:44 AM.
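
    The closed form above can be checked empirically. Below is a minimal C++ sketch (my own illustration, not code from the thread) of the single-call reading F(n) = sqrt(2*F(n/2)), with an assumed terminal value F(0) = 1; a counter records the number of invocations to expose the logarithmic depth:

    ```cpp
    #include <cmath>

    // Sketch (assumptions: single recursive call, terminal value F(0) = 1).
    // 'calls' counts invocations so the recursion depth can be observed.
    double F(int n, int& calls) {
        ++calls;
        if (n <= 0) return 1.0;                // stop criterion: F(0)
        return std::sqrt(2.0 * F(n / 2, calls)); // n is halved each step
    }
    ```

    For n = 1024 this makes 12 calls (n takes the values 1024, 512, ..., 1, 0), i.e. roughly log2(n) + 2, consistent with O(log N).
    
    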

  6. #6
    Join Date
    Apr 2014
    Location
    in northeast ohio
    Posts
    49

    Re: Which time complexity is true?

    Too bad I never got far in school.
    This is why I dislike advanced math stuff: tons of symbols with different meanings, very loose.
    Programming seems so clear and clean compared to that...
    To my algebra-uneducated eyes I just see
    O(n)

    I always thought n was the number of recursions or the number of indexes,
    so n/2 is half an array; then n/2 = 0.5, so 0.5 + 0.5 = 1.
    After that it looks like the result is square-rooted, since the root is around the function calls,
    but I don't see how that affects the count in the first place.
    If it does, it seems like a bogus argument, because...
    how do you square-root a number of invocations, like sqrt(24) = 4.9?
    Or sqrt(2) = 1.4, divided by 2 is 0.707, and 0.707 + 0.707 gives 1.4 iterations?
    How do you get 0.4 of an invocation? It doesn't make any sense.
    And if n can be a decimal, what is this function when n = 0.5 at the start?
    Then you square-root, then divide by 2 and add, and you're going up: O(n/2 + log n)?
    That's enough to give me a headache.
    ...
    Well, the point is:
    if this were in plain code with a clear question,
    I'm sure you could probably just see it without even asking.
    Heck, you could just run it and count for proof.
    So can anyone code this? Now I want to know what that means.
    Last edited by willmotil; June 18th, 2014 at 03:39 AM.

  7. #7
    Join Date
    Jul 2013
    Posts
    201

    Re: Which time complexity is true?

    Quote Originally Posted by willmotil View Post
    so can anyone code this now i wanna know wth that means ?
    A recursive C++ function would look like this,

    Code:
    #include <cmath>

    double F(int n) {
       if (n <= 0) {
          return 1.0; // assumed terminal value
       } else {
          return std::sqrt(F(n / 2) + F(n / 2)); // two recursive calls
       }
    }
    Note that n is an integer, so halving it uses integer division rather than floating-point division (for example, 1/2 is 0, not 0.5).

    The complexity of integer halving of n until it becomes 0 is O(log N). It's the same complexity you have in a binary search which is also based on successive halving of an initial range of size N.

    Also note that in this case, although F(n/2) + F(n/2) is mathematically equivalent to 2.0 * F(n/2), these expressions are not equivalent from a complexity standpoint. The latter is (as zizz notes) equivalent to successive halving and thus O(log N), but the former is not. It's equivalent to a binary tree traversal of N nodes, so it's O(N). It's easy to see this if one views the F(n/2) on each side of the + sign as the left and right child of a node in a binary tree.

    So I'd say the complexity of the recursive function as stated is O(N). If it's reduced in the mathematically obvious way the complexity drops to O(log N).
    Last edited by razzle; June 23rd, 2014 at 01:45 AM.
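
    The O(N) vs. O(log N) distinction can be demonstrated by counting invocations of both forms. The following C++ sketch is my own illustration (same assumed terminal value F(0) = 1 as above), not code from the thread:

    ```cpp
    #include <cmath>

    // Two-call form, F2(n) = sqrt(F2(n/2) + F2(n/2)): a binary tree of calls.
    double F2(int n, long& calls) {
        ++calls;
        if (n <= 0) return 1.0;
        return std::sqrt(F2(n / 2, calls) + F2(n / 2, calls));
    }

    // One-call form, F1(n) = sqrt(2.0 * F1(n/2)): a single chain of halvings.
    double F1(int n, long& calls) {
        ++calls;
        if (n <= 0) return 1.0;
        return std::sqrt(2.0 * F1(n / 2, calls));
    }
    ```

    For n = 1024 the two-call form makes 4095 calls (about 4N) while the one-call form makes only 12, matching the O(N) vs. O(log N) analysis.
    
    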
