FOM: re: Church's thesis and complexity theory
Soren Moller Riis
smriis at brics.dk
Tue Nov 3 07:14:09 EST 1998
---------------------------------------------
re: Church's thesis and complexity theory
---------------------------------------------
On Fri, 30 Oct 1998, Joe Shipman wrote:
> On the other hand, "polynomial time" may be too narrow a class to
> capture all that is "effectively computable", because there is no proof
> it includes all functions feasibly calculable by probabilistic
> algorithms or by quantum computation.
From a naive point of view there must be a limit to how much computational
power a volume of nature can possess. Thus the computational power
of a computer is limited by the volume of the computer. We can of
course view any physical system as a computer which simulates itself.
Quite recently I spoke to an astronomer who had made some simulations
of the sun. He had an algorithm for approximating its behaviour which
ran in time n^3, where n was the number of discrete points chosen in
each direction. He was very pleased to have improved on his earlier
algorithm, which ran in time n^4. Now (as I pointed out to him), there
had better be simulations running in time n^3, because otherwise nature
(here the sun) would occasionally have to "stall" in order to compute
its own behaviour (as its radius tends to infinity). I was told
(to my surprise) that this simple observation seems to be unknown. I have
often voiced this idea and have been met with the same response.
For all practical purposes the validity of my argument must depend
on the extent to which non-locality is irrelevant.
So on a sufficiently large scale (and assuming that classical local
physics suffices to simulate the system) the time required to
simulate the system ought to grow roughly with the volume
of the system. If time n^4 is needed, one should be suspicious
of the algorithm used for the simulation.
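The scaling argument can be made concrete with a small sketch (my illustration, not from the original discussion): a grid with n points per direction has n^3 cells, which is the volume being simulated, so an n^3 algorithm spends constant work per cell, while an n^4 algorithm spends work per cell that grows linearly with n.

```python
# Illustrative sketch: work per simulated cell for n^3 vs n^4 algorithms.
# A cubic grid with n points per direction has n**3 cells (the "volume").

def work_per_cell(total_steps, n):
    """Total simulation steps divided by the number of cells simulated."""
    return total_steps / n**3

for n in (10, 100, 1000):
    cubic = work_per_cell(n**3, n)    # n^3 algorithm: 1.0 per cell, constant
    quartic = work_per_cell(n**4, n)  # n^4 algorithm: n per cell, diverges
    print(n, cubic, quartic)
```

The printed per-cell cost of the n^4 algorithm grows without bound, which is the sense in which nature, simulating itself with only its own volume of "hardware", could not afford it.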
If nature contains hidden flat dimensions - as has been suggested -
they are presumably so flat they cannot contribute to computations
by more than a constant factor.
Of course (being cynical) all functions which can be computed
in the physical universe can presumably be computed in constant time
(for some sufficiently large constant). Yet I think it makes much
more sense to identify physically feasible computations with
something, say, slightly more* than runtime n^3.
(*) How much more depends (in practice) on one's patience
(= willing time investment) divided by one's ability to
write long inputs.
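One way to read the footnote (my reading, stated as an assumption): if one is willing to wait at most T elementary steps and to write inputs of length at most n, then runtime n^k is affordable precisely when n^k <= T, i.e. when k <= log(T)/log(n), so the feasible exponent is patience divided by input length on a logarithmic scale.

```python
import math

# Hypothetical formalisation of the footnote: the largest feasible
# exponent k is log(patience) / log(max input length), since
# n**k <= T  iff  k <= log(T) / log(n).

def feasible_exponent(patience_steps, max_input_length):
    """Largest k such that max_input_length**k <= patience_steps."""
    return math.log10(patience_steps) / math.log10(max_input_length)

# A patient user (10**12 steps) with short inputs (n = 10**4)
# can afford exponent 3; with longer inputs (n = 10**6), only exponent 2.
print(feasible_exponent(10**12, 10**4))  # -> 3.0
print(feasible_exponent(10**12, 10**6))  # -> 2.0
```

On this reading, greater patience raises the feasible exponent, while the ability to write longer inputs lowers it, matching the "patience divided by input length" phrasing.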
All this is, however, somewhat controversial, because quantum mechanics
seems to allow closed systems whose computational power grows
exponentially with the volume.
I think, however, that in all circumstances (including
non-local models of computation) it is safe to bound nature's
computational power by an exponential function (possibly relative to
some oracle) of the volume.
Finally, let me add: if one has the same patience for writing long
inputs as one has for building computers, a more feasible runtime
seems to be n^2.
Soren Riis