Burton Rosenberg
1 April 2003
By definition, an algorithm proceeds by discrete steps of simple computation. Given an input of n characters, we can count the number of steps the algorithm takes before it produces an answer. We expect that number to depend on n: the larger n, the more steps the computation requires.
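For concreteness, here is a minimal Python sketch of this kind of step counting; the function, the choice of linear search, and the decision to count one step per element examined are all illustrative, not part of the definition.

    def count_steps_linear_search(items, target):
        steps = 0
        for item in items:
            steps += 1              # count one step per element examined
            if item == target:
                break
        return steps

    for n in (10, 100, 1000):
        # worst case: the target is absent, so all n elements are examined
        print(n, count_steps_linear_search(list(range(n)), -1))

Run on inputs of length n, the worst-case count is exactly n, a function of the input size as described above.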
If we consider problems whose results are produced in at most cn^k steps, for some constants c and k, these are considered polynomial time computations. What is counted as a single computation step is not particularly crucial. Counting any sufficiently obvious computation as a single step gives the same result when deciding whether a polynomial amount of time suffices. It has been the world's experience that no machine built, or abstract computation system considered, with its own notion of an elementary step, ever differs so much from another that a step posited elementary for one machine or abstraction cannot be reproduced by a polynomial number of steps elementary in the other.
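A small sketch of this granularity point (the algorithm and the two counting conventions are my own choices, made for illustration): count insertion sort's steps two ways, once tallying only comparisons and once tallying every elementary operation. Both counts stay below cn^2; only the constant c changes, never the exponent k.

    def insertion_sort_counts(a):
        a = list(a)
        comparisons = operations = 0
        for i in range(1, len(a)):
            key, j = a[i], i - 1
            operations += 2                  # read the key, initialize j
            while j >= 0 and a[j] > key:
                comparisons += 1
                operations += 2              # the comparison and the shift
                a[j + 1] = a[j]
                j -= 1
            comparisons += 1                 # the test that ends the inner loop
            operations += 2                  # that test and the final write
            a[j + 1] = key
        return comparisons, operations

    for n in (10, 100, 1000):
        worst = list(range(n, 0, -1))        # reverse order: the worst case
        print(n, insertion_sort_counts(worst))

Either convention classifies insertion sort as a polynomial time (indeed quadratic) computation, which is the sense in which the choice of elementary step is not crucial.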
In particular, the most elementary of computing machines, the Turing machine, whose computational power appears ridiculously limited, differs from the fastest, parallelized supercomputer by only a polynomial factor in the number of steps.
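To make the Turing machine's notion of an elementary step concrete, here is a minimal single-tape simulator in Python; the simulator, the state names, and the transition table are an illustrative sketch of my own, not a canonical formulation. The machine shown increments a binary number: it walks to the right end of the input, then propagates the carry leftward. Each application of the transition table counts as one elementary step.

    def run_tm(delta, tape_str, start, halt, blank='_'):
        tape = dict(enumerate(tape_str))      # sparse tape: position -> symbol
        head, state, steps = 0, start, 0
        while state != halt:
            symbol = tape.get(head, blank)
            state, write, move = delta[(state, symbol)]
            tape[head] = write
            head += 1 if move == 'R' else -1
            steps += 1                        # one elementary step per transition
        lo, hi = min(tape), max(tape)
        out = ''.join(tape.get(i, blank) for i in range(lo, hi + 1))
        return out.strip(blank), steps

    delta = {
        ('right', '0'): ('right', '0', 'R'),
        ('right', '1'): ('right', '1', 'R'),
        ('right', '_'): ('carry', '_', 'L'),  # hit the right end; start the carry
        ('carry', '1'): ('carry', '0', 'L'),
        ('carry', '0'): ('done',  '1', 'L'),
        ('carry', '_'): ('done',  '1', 'L'),  # carry past the leftmost bit
    }

    print(run_tm(delta, '1011', 'right', 'done'))   # ('1100', 8)

Even on so austere a machine, incrementing an n-bit number takes a number of steps linear in n, and simulations of far richer machines on this one are believed to cost only a polynomial blowup in the step count.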