'Many works of science fiction as well as some forecasts by serious technologists and futurologists predict that enormous amounts of computing power will be available in the future. Let us suppose for a moment that these predictions are correct. One thing that later generations might do with their super-powerful computers is run detailed simulations of their forebears or of people like their forebears. Because their computers would be so powerful, they could run a great many such simulations. Suppose that these simulated people are conscious (as they would be if the simulations were sufficiently fine-grained and if a certain quite widely accepted position in the philosophy of mind is correct). Then it could be the case that the vast majority of minds like ours do not belong to the original race but rather to people simulated by the advanced descendants of an original race. It is then possible to argue that, if this were the case, we would be rational to think that we are likely among the simulated minds rather than among the original biological ones. Therefore, if we don't think that we are currently living in a computer simulation, we are not entitled to believe that we will have descendants who will run lots of such simulations of their forebears.' - Nick Bostrom
In his classic rendering of the simulation argument, Bostrom posits that at least one of the following three propositions (his famous trilemma) must be true:

1. Almost all civilisations at our level of development go extinct before reaching a post-human stage of technological maturity;
2. Almost no post-human civilisations are interested in running detailed simulations of their ancestors;
3. We are almost certainly living in a computer simulation.

So, to frame this by the assumptions it makes, if we assume the following:

1. It is possible for a civilisation to reach a post-human level of technological advancement;
2. At least some post-human civilisations would choose to run vast numbers of ancestor simulations;
3. Simulated minds would have conscious experiences like ours.
Then it must follow that there are many more simulated realities than the one true base reality (possibly reaching ratios of millions or even billions to one). Therefore, for any conscious being, the probability they can assign to being in base reality is 1/(N+1).
(Where N = the number of simulated realities, which is highly likely to be in the millions.)
Therefore the probability that we are in a simulated reality, N/(N+1), is very close to 1.
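The indifference reasoning above can be checked numerically. A minimal sketch (the values of N are purely illustrative):

```python
# Probability of being in base reality under the indifference principle:
# with N simulated realities plus 1 base reality, each observer assigns
# probability 1/(N+1) to being in the base reality.
def p_base_reality(n_simulations: int) -> float:
    return 1 / (n_simulations + 1)

for n in (10, 1_000_000, 1_000_000_000):
    p_base = p_base_reality(n)
    print(f"N = {n:>13,}: P(base) = {p_base:.2e}, P(simulated) = {1 - p_base:.9f}")
```

Even for a modest N of a million, the probability of being simulated already exceeds 0.999999.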
Now let's look at some of these assumptions in more detail and see if the argument still holds.
Is it possible for a civilisation to reach a post-human level of technological advancement?
This is actually one of the least certain of all the assumptions, in my opinion. Humanity sits precariously on the brink of self-annihilation, but we have to hope that it is possible for at least one civilisation, out of all those that could reach a post-human stage, to collaborate well enough not to destroy itself through nuclear warfare, superintelligent AI, superbug pandemics or catastrophic climate change.
Is it actually theoretically possible to create a simulation indistinguishable from reality?
This question essentially boils down to whether we would have enough computing power. Physicists in Oxford have recently claimed that accurately modelling quantum phenomena would require prohibitive levels of computing power: storing the full quantum state of just a few hundred electrons, for example, would require more atoms than are available in the universe. However, as we will discuss, a full simulation of quantum systems is not necessary for a simulated reality, so I don't think this really challenges the simulation theory.
Bostrom argues that if technology continues to progress at any rate, then it will become possible, whether in a few decades or in hundreds of thousands of years. It's extremely difficult to put an upper bound on what level of computing power is achievable, as we cannot imagine what future technology will look like. If you had told someone 100 years ago that they could have a matchbox-sized object giving access to all of the world's knowledge and information, they would have laughed at you.
However, we can put some lower bound estimates on it based on what we currently know about the laws of physics and computing. Eric Drexler has calculated a physical limit of 10^21 operations per second for a sugar-cube-sized computer. Robert Bradbury has written about the Matryoshka brain, a computer with a mass on the order of a planet's that would be able to achieve 10^42 operations per second. And in his seminal Nature paper, Seth Lloyd calculated the ultimate physical limit of computation to be roughly 5 x 10^50 operations per second, carried out on 10^31 bits, by an object with a volume of 1 litre and a mass of 1 kg.
Is this enough to create a simulated reality?
In short, yes. The main computing cost of a simulated reality would be the simulated minds within it, because it is simply not necessary to simulate the entire universe down to the quantum level. Nor does one need to fill in detail in unobserved regions such as the far reaches of the universe, the Earth's core, or the atomic structure of most objects. A simulator would have access to the belief states of its simulated minds and could fill in detail on an ad-hoc basis as and when needed: only when someone decides to look down an electron microscope does some of the atomic structure actually need to be in place. Even if an error did occur, the simulator could simply edit the brains of the simulated minds, or skip back a few seconds and re-run the simulation.
So, the computing power required for everything other than the simulated minds themselves is negligible, but would it be feasible to simulate the minds? It has been estimated that doing so would require no more than about 10^14-10^17 operations per second per brain. The sensory bandwidth of a human brain is only about 10^8 bits per second, so memory requirements are negligible. As a highly conservative upper bound estimate, then:
100 billion humans X 50 years per human X 30 million seconds per year X 10^14-10^17 operations/second ≈ 10^33-10^36 operations in total.
So a single Matryoshka brain (of which a highly advanced society would likely have many) can achieve 10^42 operations per second, and could thus simulate the entire mental history of humanity using less than one millionth of its processing power for one second.
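The arithmetic above is easy to verify. A quick check using the figures quoted in the text:

```python
# Rough check of the ancestor-simulation cost estimate (figures from the text).
humans = 100e9              # total humans ever to simulate
years_each = 50             # simulated lifespan per human, in years
seconds_per_year = 30e6     # ~3e7 seconds in a year
ops_per_brain_second = (1e14, 1e17)  # low and high per-brain estimates

total_ops = [humans * years_each * seconds_per_year * r for r in ops_per_brain_second]
# total_ops spans roughly 1.5e34 to 1.5e37 operations

matryoshka_ops_per_second = 1e42
seconds_needed = total_ops[1] / matryoshka_ops_per_second
print(f"total ops: {total_ops[0]:.1e} to {total_ops[1]:.1e}")
print(f"time at full Matryoshka power (high estimate): {seconds_needed:.1e} s")
```

Even the high estimate takes a Matryoshka brain only on the order of 10^-5 seconds at full power.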
Would our simulated minds have a conscious experience of reality like we do?
This is naturally a difficult question. I do think however, that it would be naïve to assume that consciousness is substrate dependent and that an essential property of it is that it is implemented on carbon-based biological neural networks housed in a cranium. I don't see any reason that a silicon-based system in a computer couldn't also work.
Would a post-human civilisation actually want to create simulated realities?
Some have argued that the scientific value of ancestor simulations to post-human civilisations would be negligible, and that such civilisations may regard recreational simulations as a very inefficient means of attaining pleasure, which would be much more easily obtained by direct stimulation of the brain's reward centres. Personally, I don't buy this: for the argument to fail here, there would have to be an extremely strong convergence among all possible post-human civilisations towards not wanting to create simulations, even at a negligible cost in computing power.
Given all of the above, I'm of the opinion that there is a non-negligible (and in fact highly probable) chance that we are indeed living in a simulation. Now let's turn to some of the consequences of this:
Thinking we might be in a simulation is not even that much of a fringe, niche idea. Neil deGrasse Tyson has put the odds of us being in a simulation at 50%, Max Tegmark at 17% and David Chalmers at 42%. Elon Musk is famed for stating his belief that there is only a one-in-billions chance that we are not in a simulation. The physicist James Gates has even found error-correcting codes (the kind of codes used to protect digital data, for example in web browsers) embedded in the physical laws governing quarks, electrons and supersymmetry, and Ray Kurzweil (Google's AI expert and famous futurist) is also a key proponent.
Student loans: it's almost never best to pay them off early.
If you're lucky enough to go into a well-paying job, this is the question that should be on your mind: is it better to just pay the bare minimum each month, or should you pay in as much as you can?
I've built a basic model to calculate the answer to that exact question, with space for you to edit (yellow highlighted boxes) to incorporate your own situation.
I've compared what would happen to the total amount paid over the 30 years (until it is written off) if you pay extra money in ('Raw amount saved with early payment' box). I've also included what the financial difference is between doing that and instead putting any additional payments into a savings account at 5% interest rate ('total amount saved inc. opportunity cost' box).
Essentially, it looks like you are nearly always better off putting spare money into a savings account rather than paying your student loan off early. Because the loan is written off after 30 years, much of the balance may never need to be repaid, while money in savings grows rapidly with compound interest.
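The logic of the spreadsheet can be sketched in a few lines of code. This is an illustrative model, not the author's spreadsheet: the loan balance, salary, repayment threshold, interest rates and spare-cash figure below are all assumed example numbers, and it uses a simplified 9%-over-threshold repayment rule with monthly compounding:

```python
# Sketch comparing two strategies over a 30-year loan write-off window:
# each month a fixed amount of spare cash either overpays the loan, or goes
# straight into a savings account. All parameter values are assumptions.
def net_position(balance, salary, spare, overpay, loan_rate=0.07,
                 savings_rate=0.05, threshold=27_295, years=30):
    """Return savings pot minus total loan payments after `years`.
    If overpay=True, spare cash goes to the loan until it is cleared."""
    savings, paid = 0.0, 0.0
    m_loan = (1 + loan_rate) ** (1 / 12) - 1    # equivalent monthly loan rate
    m_save = (1 + savings_rate) ** (1 / 12) - 1  # equivalent monthly savings rate
    for _ in range(years * 12):
        savings *= 1 + m_save
        if balance > 0:
            balance *= 1 + m_loan
            required = max(0.0, (salary - threshold) * 0.09 / 12)  # 9% over threshold
            payment = min(balance, required + (spare if overpay else 0.0))
            balance -= payment
            paid += payment
            if not overpay:
                savings += spare                 # spare cash saved instead
        else:
            savings += spare                     # loan cleared: save the spare cash
    return savings - paid

save_strategy = net_position(45_000, 40_000, spare=200, overpay=False)
pay_strategy = net_position(45_000, 40_000, spare=200, overpay=True)
print(f"save instead: {save_strategy:+,.0f}, overpay: {pay_strategy:+,.0f}")
```

With these example numbers, the minimum-payment borrower never clears the balance before the write-off, while the overpayer ends up repaying far more in total, so saving comes out well ahead.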
Download the spreadsheet to edit it properly!
Low carb brownies
Conventional wisdom argues that fat makes you fat and you should eat less of it and then you won't be fat anymore. Right?
Actually, not quite.
We are facing a paradigm shift in nutrition science, and it is increasingly widely acknowledged that fats are not to blame for the fact that more than 2/3 of us are overweight: carbohydrates are.
Your body can store energy in 3 ways:

1. Glucose circulating in the blood (a tiny amount);
2. Glycogen in the liver and muscles (around 2,000 calories' worth);
3. Fat (tens of thousands of calories or more).
1 kg of fat stores about 7,700 calories of energy, so on average we each have many tens or even hundreds of thousands of calories stored up on our hips and waists, and a severely obese person can carry close to a million. In fact, if we assume that you burn roughly 700 calories in a 1-hour jog, someone holding 1,000,000 calories of fat would have to run for 8 hours a day, every day, for 3 months, without eating anything extra, just to lose half of it. Keep on running!
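That back-of-envelope claim is easy to check with the figures from the text:

```python
# Check of the fat-loss arithmetic: how long to jog off half a million calories?
CAL_PER_KG_FAT = 7_700     # energy stored per kg of body fat
JOG_CAL_PER_HOUR = 700     # rough burn rate for a 1-hour jog

stored_calories = 1_000_000
hours_to_burn_half = (stored_calories / 2) / JOG_CAL_PER_HOUR
days_at_8h_per_day = hours_to_burn_half / 8

print(f"{hours_to_burn_half:.0f} hours of jogging")       # ~714 hours
print(f"{days_at_8h_per_day:.0f} days at 8 h/day")        # ~89 days, i.e. ~3 months
print(f"= {stored_calories / 2 / CAL_PER_KG_FAT:.0f} kg of fat lost")
```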
To lose weight, we of course have to reduce those stores of fat. Now, insulin is the hormone of plenty in the body; amongst its multitude of effects, it tells the body to store more fat and acts as a lever switching our bodies to preferentially burn glucose rather than fat. Not ideal: this is the opposite of what we want if we want to lose weight!
Now consider this: humans evolved for, and are built for, a life of scarcity. In hunter-gatherer times (i.e. 99% of the time Homo sapiens has walked the earth), we would have lived through cycles of fasting and feasting, where days of fasting (aside from a few berries here and there) were punctuated irregularly by large protein feasts. There certainly wouldn't have been any of these thrice-daily solo gorges on carb-heavy meals meant for a family of four. The Romans used to eat one meal a day, and it wasn't until the 18th century that 3 square meals a day became the norm.
It certainly wasn't until the birth of fast food and on-demand donuts over the past century that we became able to dump what is essentially spoonful after spoonful of pure sugar into our blood, on demand, day after day. The effect this has is to massively spike insulin levels, which, as mentioned earlier, causes us to build up bigger and bigger fat stores. Not only does this permanent state of high insulin make us fat, it also gives us diabetes and is thought to play a role in a host of other chronic diseases too.
Now here is the really key point.
Insulin levels are spiked by carbohydrate intake. Fat barely raises insulin at all, and protein raises it only modestly.
That means not just all those cookies and puddings, but all that pasta, rice and bread as well.
So, given all of that, doesn't minimising insulin levels by cutting down on carbohydrate intake seem like a reasonable idea? Not only would this reverse the process of storing fat, but it would also switch that lever in our body to force preferential burning of fat and decrease our risk of diabetes.
This is where the ketogenic diet comes out to play.