
Big O of a recursive function

Question

I'm learning Big O notation but am stuck when it comes to recursion. This is a fairly new topic to me; I was taught to solve this using either substitution or the master method. I don't know how I would apply this to pseudocode, though.

    def isThis(n):
        print(n)                     # trace the value at each call
        if n == 1:
            return True
        if n == 0:
            return False
        if n % 3 == 0:
            return isThis(n // 3)    # integer division keeps n an int
        else:
            return False

My understanding is that this algorithm will run in O(n) if we use T(n) = aT(n/b) + f(n), where a = 1, b = 3 and d = 1:

  • 'a' is the number of recursive calls,
  • 'b' is the factor by which the subproblem shrinks,
  • 'd' is the exponent of the running time outside of the recursive call.

But I have no idea if I'm doing this correctly or tackling it in completely the wrong way.

Any help or suggestions would be appreciated.

Answer 1

Score: 2

Your code runs in O(log(n)) arithmetic operations, of course, since in the worst case it performs as many operations as there are base-3 (ternary) digits of n. If you want to apply the master theorem, your case is case 2 from Wikipedia, since f(n) = O(1).

Note that this asymptotic description is in arithmetic operations. This code does not run in O(log(n)) time, since the arithmetic operations themselves do not run in O(1) time (arithmetic on arbitrarily large integers is not constant-time).
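As a quick empirical check of the digit-counting argument (a small sketch of my own, not from the original answer: the helper name base3_digits and the extra call-counting parameter are illustrative additions to the question's function):

    def isThis(n, calls=None):
        # Same logic as the question's function, with an optional list used
        # only to record how many times the function is entered.
        if calls is not None:
            calls.append(n)
        if n == 1:
            return True
        if n == 0:
            return False
        if n % 3 == 0:
            return isThis(n // 3, calls)
        return False

    def base3_digits(n):
        # Number of digits of n written in base 3, for n >= 1.
        d = 0
        while n > 0:
            n //= 3
            d += 1
        return d

    for n in (1, 27, 3**10, 3**10 + 1):
        calls = []
        isThis(n, calls)
        print(n, len(calls), base3_digits(n))

For powers of 3 (the worst case) the number of calls equals the number of base-3 digits; for other inputs the recursion stops earlier.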

Answer 2

Score: 1

The Master Theorem formula using Landau notation is:

T(n) = aT(n/b) + O(n^d)

Where:

  • n is the size of the input.
  • T(n) is the time complexity of the algorithm for a problem of size n.
  • a is the number of sub-problems generated by the algorithm during each recursion.
  • n/b is the size of the sub-problems. This assumes that the algorithm divides the initial problem into sub-problems of size n/b.
  • O(n^d) is, expressed in Big O notation, the cost of merging the sub-problem solutions and of all other non-recursive operations performed.

In the algorithm you've given, the function isThis(n) recurses by dividing n by 3 until n reaches 1 or some other stopping condition. So we can consider n to be the size of our input, and:

  • a = 1, because the recursion generates only one sub-problem at each step (there is at most one call to isThis inside isThis).

  • b = 3, because n is divided by 3 at each recursion (each call of isThis).

To determine O(n^d), we see that the cost of the verification operations (the conditions and returns) is constant, regardless of the size of the input n.
So O(n^d) = O(1), and thus d = 0.

Now, with a, b and d, we can check which case of the Master Theorem matches ours:

  • Case 1: if d < log_b(a), then T(n) = O(n^log_b(a))
  • Case 2: if d = log_b(a), then T(n) = O(n^d log_b(n))
  • Case 3: if d > log_b(a), then T(n) = O(n^d)

In our case, we have log_b(a) = log_3(1) = 0, and we also have d = 0.

So we are in the second case, and the complexity of your function is:

O(n^d log_b(n)) = O(n^0 log_3(n)) = O(log_3(n)), which simplifies to O(log(n)).
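As an illustrative sketch (my own addition, with a hypothetical helper name, not part of the original answer), the case analysis above can be wired into a tiny function that, given a, b and d, reports which case applies and the resulting bound:

    import math

    def master_theorem(a, b, d):
        # Compare d with the critical exponent log_b(a) and return the bound
        # given by the three cases listed above. Exact float comparison is
        # fine for this illustration; real use would want exact arithmetic.
        crit = math.log(a, b)
        if d < crit:
            return f"Case 1: O(n^{crit:g})"
        if d == crit:
            return f"Case 2: O(n^{d:g} * log n)"
        return f"Case 3: O(n^{d:g})"

    # For isThis: a = 1, b = 3, d = 0  ->  Case 2, i.e. O(log n)
    print(master_theorem(1, 3, 0))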
