
There is a similar discussion on Reddit.

The idea is to use the number Pi as a trigger to prove to ourselves that we do not live in some kind of computer simulation.

The logic is simple: as we know from mathematics, the decimal expansion of Pi is infinite and never repeats. If we were simulated, the hardware running the simulation should have some limitations, for example an integer limit, and so could not hold Pi. So, we are not simulated.

Can this be a 'virtual infinity'? What do you think?

programings
  • Can you clarify better how this infinity might create a problem? – nir Jan 03 '16 at 14:07
  • Computers deal with numbers. Current architectures can handle as much as 64 bits can store. https://en.wikipedia.org/wiki/9223372036854775807 – programings Jan 03 '16 at 14:14
  • I believe that is not true. For example, this is from the docs of the python language: "Long integers have unlimited precision." Try typing 2 ** 1000 in the (right pane of the) following online Python interpreter and see what happens: https://repl.it/languages/python3. You can represent a number with any number of bits. – nir Jan 03 '16 at 14:24
  • Nope, the Python interpreter just keeps that illusion - http://stackoverflow.com/a/7604998 . The limit depends on the size of the CPU registers. – programings Jan 03 '16 at 14:30
  • Why do you call it an illusion rather than computation? – nir Jan 03 '16 at 14:40
  • So we can get confirmation we are in reality if we write out an infinite sequence -- something we cannot do in reality. Not useful. –  Jan 03 '16 at 20:31
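
A small Haskell sketch (illustrative only; the comments above use Python, but the point is language-independent) of the distinction being debated: a fixed-width 64-bit machine integer really does stop at 9223372036854775807, while arbitrary-precision integers such as Haskell's Integer or Python's int are built in software on top of fixed-width words and can grow as large as memory allows.

    import Data.Int (Int64)

    main :: IO ()
    main = do
      -- A fixed-width 64-bit machine integer tops out at 2^63 - 1.
      print (maxBound :: Int64)      -- 9223372036854775807
      -- An arbitrary-precision Integer has no such fixed bound;
      -- it is implemented in software on top of fixed-width words.
      print (2 ^ 1000 :: Integer)

So the 64-bit register size bounds what a single hardware word can hold, not what a program can represent.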

4 Answers


You are right: computational errors due to the finite precision of binary computers when representing irrational numbers like Pi serve as a counterargument against the hypothesis that our world is a simulated mathematical universe.

See one of the last chapters of Greene, Brian: The Hidden Reality.

What do you mean by "virtual infinity"?

Expanded. Assume a certain law of nature is the solution of a differential equation. A digital computer can calculate the solution as an extrapolation into the future only with finite precision. Hence, in the course of time, the simulated solution will necessarily deviate from the observed law of nature by an increasing amount. The error may become arbitrarily large, so that we would observe anomalies even in the mesocosmic domain.

In the long run the simulated results would no longer follow what we consider the laws of nature. Moreover, because of the finite precision we would get a random distribution of rounding errors, so the anomalies would not follow a common pattern, as would be expected if the only cause were a slightly different law of nature.
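
As an illustration only (this toy example is not from Greene's book), a few lines of Haskell show the kind of drift the answer describes. The 'law of nature' here is the differential equation dy/dt = y with exact solution exp t, and the 'simulation' is a single-precision Euler extrapolation; the function name eulerFloat, the step size, and the time span are arbitrary choices.

    -- Single-precision Euler extrapolation of dy/dt = y, compared with the
    -- exact solution exp t, to show how truncation and rounding errors grow.
    eulerFloat :: Float -> Float -> Int -> Float
    eulerFloat y0 h n = iterate step y0 !! n
      where step y = y + h * y      -- one Euler step for dy/dt = y

    main :: IO ()
    main = do
      let h         = 0.001 :: Float
          n         = 10000                -- simulate up to t = 10
          simulated = eulerFloat 1 h n
          exact     = exp 10 :: Double
      putStrLn ("simulated: " ++ show simulated)
      putStrLn ("exact:     " ++ show exact)
      putStrLn ("error:     " ++ show (exact - realToFrac simulated))

The point is only qualitative: the longer the simulated time span, the larger the accumulated error, which is the growing deviation the answer appeals to.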

These are the arguments why we would not live in a virtual world simulated by a digital computer; see Greene, Brian: The Hidden Reality, 2014.

Jo Wehler
  • Can you provide a summary or a more precise reference to Greene's argument? – nir Jan 03 '16 at 15:00
  • but the OP example is an ambiguous one sentence - that is why it is a question, not a book. (a) How would we realize the observed data is an anomaly? not by comparing the data with a limited precision computation of our own? (b) upon detecting such anomaly, how would we rule out it is the result of a (slightly) different law of nature? — this is why I ask for Greene's argument, for surely he has contemplated these problems. – nir Jan 03 '16 at 15:26
  • @nir I converted my two comments from our conversation into a supplement to my answer. – Jo Wehler Jan 05 '16 at 14:18

"the idea is to use the number pi to show that we don't live in a simulated universe"

Why is this of any consequence? I mean, why choose Pi? Any number, when considered as a real number, is 'endless':

1.00000 ...

or

0.0000 ...

Etc.

Even more, I know what the millionth, or even the googolplex-to-the-googolplexth, digit of this is: zero.

There seems to be a much more obvious problem with the notion of simulation, i.e. what is it simulated on, and is that a simulation? Is it simulations all the way down?

Also, simulations are programmed by a programmer; so are you also positing some God-like programmer?

Mozibur Ullah

If we live in a simulation, mathematics could be simulated, thus making us believe in the infinity of Pi without it being present in the 'simulator' world.

Also, 'infinite hardware' could be plausible in the simulator world, even if we cannot imagine it.

fpluis

There are programming languages that can, thanks to lazy evaluation, represent something infinite without actually having to store all of it in memory. For example, in Haskell one can write [1..] for the list of natural numbers, which is infinite. You can use it as long as you don't ask for the whole list at once. For example, you can write take 10 [1..] for the first ten natural numbers.
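
To make that concrete, here is a self-contained sketch in the same spirit (the names nats and piApprox and the use of the Leibniz series are illustrative choices, not part of the answer): two conceptually infinite lists, of which only the prefix we actually demand is ever computed.

    -- An infinite list of natural numbers; only the demanded prefix is built.
    nats :: [Integer]
    nats = [1..]

    -- Partial sums of the Leibniz series 4*(1 - 1/3 + 1/5 - ...): an infinite
    -- list of ever-better approximations of Pi, again produced lazily.
    piApprox :: [Double]
    piApprox = map (4 *) (scanl1 (+) terms)
      where terms = [ (-1) ^ k / fromIntegral (2 * k + 1) | k <- [0 ..] ]

    main :: IO ()
    main = do
      print (take 10 nats)     -- [1,2,3,4,5,6,7,8,9,10]
      print (take 5 piApprox)  -- the first five approximations of Pi

Each list is infinite as a value, yet any prefix we ask for costs only finite time and memory.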

So, there's no problem with the concept of something infinite existing in finite hardware.

Also, you don't argue why we couldn't be simulated on infinite hardware.