Algorithm Complexity

From NovaOrdis Knowledge Base
Revision as of 19:43, 6 August 2018

External

Internal

Overview

Algorithm analysis aims at determining, or predicting, the amount of resources consumed by an algorithm while solving a computational problem as a function of its input size. In most cases the resource being consumed is running time. An analysis can also refer to memory, network bandwidth, etc.

The concept of input size depends on the problem being addressed. In the case of sorting algorithms, the most natural measure of input size is the number of items in the input. For the problem of multiplying two integers, the best measure of input size is the total number of bits needed to represent the input in binary notation. In the case of graph algorithms, two numbers describe the input size: the graph's number of vertices and edges.
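As a minimal sketch of the two notions of input size described above (the function names are ours, introduced only for illustration): for sorting, the input size is the number of items; for integer multiplication, it is the total number of bits of the operands.

```python
def sorting_input_size(items):
    """Input size for a sorting problem: the number of items in the input."""
    return len(items)

def multiplication_input_size(a, b):
    """Input size for multiplying two integers: total bits in binary notation."""
    return a.bit_length() + b.bit_length()

print(sorting_input_size([5, 3, 8, 1]))     # 4
print(multiplication_input_size(12, 1000))  # 12 needs 4 bits, 1000 needs 10 -> 14
```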


running time (or computational time)

Random-Access Machine (RAM)

An algorithm is usually analyzed in the context of a random-access machine (RAM), which is a theoretical model of the computing resources typically used by algorithms and of their costs. A random-access machine has only one processor. In this model, instructions are executed one after another, with no concurrent operations, though extensions of the model exist for analyzing multithreaded algorithms. Algorithms are implemented as computer programs for such a machine. The random-access machine mimics how real computers are designed, so it has the instructions commonly found in real computers: arithmetic (add, subtract, multiply, divide, remainder, floor, ceiling), data movement (load, store, copy) and control (conditional branch, subroutine call and return). It is important to emphasize that each instruction takes a constant amount of time - this assumption is essential in algorithm analysis. In accordance with how real computers are implemented, operations such as multiplication by 2^k are also considered constant-time, because they are implemented by shifting the binary representation left by k positions, as long as k is no more than the number of bits in a computer word.

The data types in the RAM model are integer and floating point. The model assumes a limit on the size of each word of data (if the word size could grow arbitrarily, we could store large amounts of data in one word and operate on it in constant time, which is an unrealistic scenario).
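The constant-time shift claim above can be checked directly (a sketch with arbitrary example values, not from the article): multiplying by 2^k is a single left shift, provided k stays within the word size.

```python
x = 37
k = 5
# Shifting the binary representation of x left by k positions multiplies it by 2^k.
assert x << k == x * 2**k
print(x << k)  # 1184
```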

A random-access machine does not attempt to model a memory hierarchy (caches or virtual memory). In some cases memory-hierarchy effects are significant, but usually RAM-model analyses are good predictors of performance on actual machines.

Comparing the running times of two algorithms, one c₁n² and the other c₂n lg n: even if c₁ is much smaller than c₂, there will always be a crossover point beyond which the c₂n lg n algorithm is faster.
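The crossover claim can be illustrated numerically. This is a sketch with assumed constants (c₁ = 1 for the quadratic algorithm, c₂ = 10 for the n lg n one, chosen only for the example): despite the much larger constant, the n lg n algorithm eventually wins.

```python
import math

c1, c2 = 1, 10  # assumed constants: quadratic has the smaller one

def quadratic(n):
    return c1 * n * n

def n_log_n(n):
    return c2 * n * math.log2(n)

# Find the first n where the quadratic cost exceeds the n lg n cost.
n = 2
while quadratic(n) <= n_log_n(n):
    n += 1
print(n)  # 59 - beyond this point the n lg n algorithm is faster
```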

Best-case running time.

Worst-case running time.

Average-case running time.

Expected running time.
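The distinction between best-case and worst-case running time can be made concrete with a sketch (ours, not from the article) that counts the key comparisons insertion sort performs: an already-sorted input gives the best case (n-1 comparisons), a reversed input the worst case (n(n-1)/2 comparisons).

```python
def insertion_sort_comparisons(a):
    """Sort a copy of `a` with insertion sort; return the number of key comparisons."""
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0:
            comparisons += 1          # one comparison of key against a[j]
            if a[j] > key:
                a[j + 1] = a[j]       # shift the larger element right
                j -= 1
            else:
                break
        a[j + 1] = key
    return comparisons

n = 8
best = insertion_sort_comparisons(list(range(n)))          # sorted input
worst = insertion_sort_comparisons(list(range(n, 0, -1)))  # reversed input
print(best, worst)  # 7 28  (n-1 and n(n-1)/2 for n = 8)
```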

Order of growth

Examples of Time Complexity Analyses

Recurrence

The running time of recursive algorithms can be analyzed using recurrences.
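A common example of such a recurrence (assumed here for illustration, the article does not name it) is merge sort's T(n) = 2T(n/2) + n with T(1) = 1, whose closed form for n a power of two is T(n) = n lg n + n. A small sketch verifying the two agree:

```python
import math

def T(n):
    """Evaluate the recurrence T(n) = 2*T(n/2) + n, T(1) = 1, for n a power of two."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n

# Check the closed form T(n) = n*lg(n) + n against the recurrence.
for n in (2, 8, 64):
    assert T(n) == n * int(math.log2(n)) + n
print(T(64))  # 448
```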



A method of analyzing the running time of recursive algorithms is the "master theorem".
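A minimal sketch of the master theorem in its simplified form, for recurrences of the shape T(n) = a·T(n/b) + O(n^d) with a ≥ 1, b > 1, d ≥ 0 (the function and parameter names are ours, introduced for illustration):

```python
import math

def master_theorem(a, b, d):
    """Return the asymptotic bound for T(n) = a*T(n/b) + O(n^d)."""
    critical = math.log(a, b)          # compare d against log_b(a)
    if math.isclose(d, critical):
        return f"O(n^{d} log n)"       # work balanced across levels
    if d > critical:
        return f"O(n^{d})"             # root dominates
    return f"O(n^log_{b}({a}))"        # leaves dominate: O(n^{log_b a})

print(master_theorem(2, 2, 1))  # merge sort: O(n^1 log n)
print(master_theorem(1, 2, 0))  # binary search: O(n^0 log n), i.e. O(log n)
print(master_theorem(4, 2, 1))  # O(n^log_2(4)), i.e. O(n^2)
```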