In computer science, the **iterated logarithm** of *n*, written log^{*} *n*, is the number of times the logarithm function must be iteratively applied before the result is less than or equal to 1. The simplest formal definition is the result of this recursive function:

```
function iterLog(real n)
    if n ≤ 1
        return 0
    else
        return 1 + iterLog(log(n))
```

**Figure 1.** Demonstrating log^{*} 4 = 2.

In computer science, lg^{*} is often used to denote the binary iterated logarithm, which iterates the binary logarithm (base 2) instead. The iterated logarithm accepts any positive real number and yields a natural number. Graphically, it can be understood as the number of "zig-zags" needed in Figure 1 to reach the y-axis above the logarithm function.

The iterated logarithm is useful in the analysis of algorithms and their computational complexity, appearing in the time and space complexity bounds of several algorithms.

The iterated logarithm is an extremely slowly growing function, growing much more slowly than the logarithm itself; for all practical values of *n* (less than 2^{65536}, which is far more than the number of atoms in the universe), it does not exceed 5. Indeed, the only function used in complexity theory that grows more slowly is the inverse of the Ackermann function. For all practical purposes, the iterated logarithm may be considered to be a constant.
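The recursive definition above can also be written as a simple loop. A minimal Python sketch of the binary iterated logarithm lg^{*} (the function name `iterated_log2` is chosen here for illustration), using `math.log2`:

```python
import math

def iterated_log2(n):
    """Binary iterated logarithm lg* n: the number of times log2
    must be applied before the result drops to <= 1."""
    count = 0
    while n > 1:
        n = math.log2(n)
        count += 1
    return count

# Matches Figure 1: log2(4) = 2, log2(2) = 1, so lg* 4 = 2.
print(iterated_log2(4))         # → 2
# Even for astronomically large inputs the value stays tiny:
print(iterated_log2(65536))     # → 4
print(iterated_log2(2**65536))  # → 5
```

The last call illustrates the growth claim in the text: for every *n* below 2^{65536}, the loop runs at most five times.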