## Time complexity


"Running time" redirects here. For the film, see Running Time (film).

In computer science, the **time complexity** of an algorithm quantifies the
amount of time taken by an algorithm to run as a function of the length of
the string representing the input.^[1]^:226 The time complexity of an
algorithm is commonly expressed using big O notation, which excludes
coefficients and lower-order terms. When expressed this way, the time
complexity is said to be described *asymptotically*, i.e., as the input
size goes to infinity. For example, if the time required by an algorithm on
all inputs of size *n* is at most 5*n*^3 + 3*n* for any *n* (bigger than
some *n*₀), the asymptotic time complexity is O(*n*^3).
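To make the example concrete, here is a small sketch (the function name `cost` is illustrative, not from the article) showing why the coefficient 5 and the lower-order term 3*n* are dropped:

```python
def cost(n):
    # The article's example bound: at most 5n^3 + 3n steps on inputs of size n.
    return 5 * n**3 + 3 * n

# Dividing by n^3 shows the ratio settling toward the constant 5 as n grows,
# which is why the coefficient and the lower-order 3n term are discarded
# and the complexity is written O(n^3).
for n in (10, 100, 1000):
    print(n, cost(n) / n**3)
```

At *n* = 10 the ratio is already 5.03; by *n* = 1000 the 3*n* term contributes only 3/*n*² = 0.000003 to it.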

Time complexity is commonly estimated by counting the number of elementary
operations performed by the algorithm, where an elementary operation takes
a fixed amount of time to perform. Thus the amount of time taken and the
number of elementary operations performed by the algorithm differ by at
most a constant factor.
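As an illustration of counting elementary operations (the `linear_search` helper below is hypothetical, not from the article), one can instrument a simple scan and treat each comparison as one constant-time operation:

```python
def linear_search(items, target):
    # Count comparisons as the "elementary operations"; each takes a fixed
    # amount of time, so total running time is within a constant factor
    # of this count.
    comparisons = 0
    for x in items:
        comparisons += 1
        if x == target:
            return True, comparisons
    return False, comparisons

# When the target is absent, the scan makes exactly n comparisons
# for a list of n items.
found, ops = linear_search(list(range(100)), -1)
```

Because each comparison takes constant time, doubling the input length doubles the operation count, and hence the running time, up to a constant factor.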

Since an algorithm's performance time may vary with different inputs of the
same size, one commonly uses the worst-case time complexity of an
algorithm, denoted *T*(*n*), which is defined as the maximum amount of
time taken on any input of size *n*. Less common, and usually specified
explicitly, is the measure of average-case complexity. Time complexities
are classified by the nature of the function *T*(*n*). For instance, an
algorithm with *T*(*n*) = O(*n*) is called a *linear time algorithm*, and
an algorithm with *T*(*n*) = O(*M*^*n*) and *m*^*n* = O(*T*(*n*)) for some
*M* ≥ *m* > 1 is said to be an *exponential time algorithm*.


Source: en.wikipedia.org/wiki/Time_complexity