A fairly newbie question: what does it mean when a paper expresses delay with O()?
Date: 10-02
Compiled by: 3721RD
First let me paste the relevant passage from the paper. This is the first time I've seen this kind of definition and notation. Could an experienced forum member passing by give me a pointer? I've been googling for a long time without finding anything relevant. Is it some mathematical function notation?
The paper excerpt is as follows:
There are many cases where it is desired to add more than two numbers together. The straightforward way of adding
together m numbers (all n bits wide) is to add the first two, then add that sum to the next, and so on. This requires
a total of m − 1 additions, for a total gate delay of O(m · lg n) (assuming lookahead carry adders). Instead, a tree of
adders can be formed, taking only O(lg m · lg n) gate delays.
Google "big O".
It denotes order of growth (asymptotic complexity) and is commonly used when analyzing algorithm complexity. Pick up any textbook on algorithms; the first few chapters are sure to cover it.
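To see where the two bounds in the quoted paper come from, here is a minimal sketch (not from the paper, just an illustration): adding m numbers in a chain needs m − 1 additions that depend on each other, while a balanced tree of adders only has ⌈lg m⌉ levels of dependent additions. Each two-number addition with a lookahead-carry adder costs O(lg n) gate delays, which multiplies in to give O(m · lg n) versus O(lg m · lg n).

```python
import math

def chain_levels(m):
    # Chained addition: each sum feeds the next adder,
    # so there are m - 1 dependent addition steps.
    return m - 1

def tree_levels(m):
    # Balanced adder tree: inputs are paired up at each level,
    # so only ceil(lg m) dependent addition steps remain.
    return math.ceil(math.log2(m))

# Example: m = 16 numbers, each addition costing O(lg n) gate delays.
m = 16
print(chain_levels(m))  # 15 dependent additions -> O(m * lg n) total
print(tree_levels(m))   # 4 levels of additions  -> O(lg m * lg n) total
```

The point is that big O hides the constant factors and only keeps how the delay grows with m and n, which is exactly what the paper's two expressions are comparing.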
Thanks to the posters in #2 and #3. Could the poster in #3 recommend an algorithms book?
