Upper limit on theoretical processing capacity?

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

adam_grif
Sith Devotee
Posts: 2755
Joined: 2009-12-19 08:27am
Location: Tasmania, Australia

Upper limit on theoretical processing capacity?

Post by adam_grif »

Just wondering if there's a theoretical upper limit on how well things can process information. Intuitively, I would think that one upper limit would be that no computer can ever simulate something as complex as itself in real time. Otherwise a computer would only need to break this limit, then simulate a more powerful computer than itself, ad infinitum, to generate infinite processing power.

Can anybody provide any insights? I've had people (usually singularity wank fanboys) claiming that "unlimited processing power" is possible, plausible, and even inevitable. Infinite processing power sounds like ridiculous fucking bullshit to me, but if it isn't I'd like to know.
A scientist once gave a public lecture on astronomy. He described how the Earth orbits around the sun and how the sun, in turn, orbits around the centre of a vast collection of stars called our galaxy.

At the end of the lecture, a little old lady at the back of the room got up and said: 'What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise.

The scientist gave a superior smile before replying, 'What is the tortoise standing on?'

'You're very clever, young man, very clever,' said the old lady. 'But it's turtles all the way down.'
Kuroji
Padawan Learner
Posts: 323
Joined: 2010-04-03 11:58am

Re: Upper limit on theoretical processing capacity?

Post by Kuroji »

Infinite processing power is ridiculous fucking bullshit, yeah. There's no way a computer can run an emulation of itself without some degradation either; there's always some overhead to running anything, especially an OS. The overhead can be minimized, sure, but not eliminated.

The only thing the singularity actually represents is a point beyond which we cannot predict what will happen even in general terms. While it's entirely possible that some computer will be able to design a better computer or a better version of itself, it won't be able to do that in real time unless it's modifying itself as it runs; otherwise there's lag time between the design and the creation of something like that.

Sure, in theory a sufficiently advanced computer could figure out ways around the screwy quantum effects we're starting to see as chip features shrink smaller and smaller. But eventually we'll run up against a wall, whether it's in 2020 or 2050, and chips will not be able to be miniaturized further. Right now we're maintaining Moore's Law by putting multiple cores on chips, but you can't keep multiplying your processing capacity forever -- they're expecting to be making roughly ten-nanometer chips around 2020, and when you go smaller than that, quantum tunneling is a bitch to mitigate. The universe has hard limits that cannot be bypassed unless you leave it, and there's no theory, however remote, that anyone can even speculate will let us leave the universe at this point.

Bullshit like 'the singularity means we will get infinite processing power' crosses the line past speculation; you may as well call it the fucking nerd rapture.
Steel, on nBSG's finale: "I'd liken it to having a really great time with these girls, you go back to their place, think its going to get even better- suddenly there are dicks everywhere and you realise you were in a ladyboy bar all evening."
adam_grif
Sith Devotee
Posts: 2755
Joined: 2009-12-19 08:27am
Location: Tasmania, Australia

Re: Upper limit on theoretical processing capacity?

Post by adam_grif »

Kuroji wrote:Infinite processing power is ridiculous fucking bullshit, yeah. There's no way a computer can run an emulation of itself without some degradation either; there's always some overhead to running anything, especially an OS. The overhead can be minimized, sure, but not eliminated.
Well yeah. It's an upper limit in the same way that the first law of thermodynamics forbids the total energy of a closed system from increasing: you're never going to brush up against it, because of other inefficiencies. I suppose you could think of objects as being the "ideal" simulations of themselves, with computational simulations of things never reaching the same level of efficiency.
Kuroji wrote:The only thing the singularity actually represents is a point beyond which we cannot predict what will happen even in general terms.
Of course. But this doesn't stop people from making claims like "we'll learn how to control physics!" "We'll get unlimited processing power!" "technology will increase forever!"
A scientist once gave a public lecture on astronomy. He described how the Earth orbits around the sun and how the sun, in turn, orbits around the centre of a vast collection of stars called our galaxy.

At the end of the lecture, a little old lady at the back of the room got up and said: 'What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise.

The scientist gave a superior smile before replying, 'What is the tortoise standing on?'

'You're very clever, young man, very clever,' said the old lady. 'But it's turtles all the way down.'
Ford Prefect
Emperor's Hand
Posts: 8254
Joined: 2005-05-16 04:08am
Location: The real number domain

Re: Upper limit on theoretical processing capacity?

Post by Ford Prefect »

adam_grif wrote:Just wondering if there's a theoretical upper limit on how well things can process information.
Bremermann's Limit sets the absolute physical maximum of a self-contained system in the material universe at 2.56x10^47 bits per second per gram. The Bekenstein Bound describes the maximum amount of information that can be stored in a volume of space. Black holes co-opted for computing purposes would have information storage density exactly equal to the Bekenstein Bound. Seth Lloyd tossed together what is essentially the upper limit for 'computers' allowed by the laws of physics; he uses a mass equivalent to that of your average laptop, but the numbers are so hilariously massive that envisioning a larger version is just silly.
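For a sense of where numbers like these come from, here's a quick back-of-envelope sketch (my own, just plugging the textbook forms mc^2/h and 2*pi*R*E/(hbar*c*ln 2) into Python; the exact constant factors differ between derivations, but everything lands in the same ballpark):

Code: Select all

from math import pi, log

c = 2.998e8          # speed of light, m/s
h = 6.626e-34        # Planck constant, J*s
hbar = h / (2 * pi)  # reduced Planck constant

# Bremermann-style maximum computation rate for one gram: m*c^2 / h
m = 1e-3  # kg
print(f"Bremermann: {m * c**2 / h:.2e} bits/s per gram")  # ~1.4e47

# Bekenstein bound for a 1 kg system of 1 m radius:
# I <= 2*pi*R*E / (hbar * c * ln 2), with E the rest-mass energy
R, M = 1.0, 1.0  # metres, kilograms
E = M * c**2
print(f"Bekenstein: {2 * pi * R * E / (hbar * c * log(2)):.2e} bits")  # ~2.6e43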
What is Project Zohar?

Here's to a certain mostly harmless nutcase.
adam_grif
Sith Devotee
Posts: 2755
Joined: 2009-12-19 08:27am
Location: Tasmania, Australia

Re: Upper limit on theoretical processing capacity?

Post by adam_grif »

And then the important question is: is 2.56x10^47 bits per second enough to simulate one gram of matter for one second with no loss of fidelity?
A scientist once gave a public lecture on astronomy. He described how the Earth orbits around the sun and how the sun, in turn, orbits around the centre of a vast collection of stars called our galaxy.

At the end of the lecture, a little old lady at the back of the room got up and said: 'What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise.

The scientist gave a superior smile before replying, 'What is the tortoise standing on?'

'You're very clever, young man, very clever,' said the old lady. 'But it's turtles all the way down.'
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: Upper limit on theoretical processing capacity?

Post by Starglider »

adam_grif wrote:I've had people (usually singularity wank fanboys) claiming that "unlimited processing power" is possible, plausible, and even inevitable.
That is a nonsensical claim. However, it's a symptom of the general problem that most people throw around the word 'infinity' without really comprehending what it means. No practical application requires 'infinite' computing power, although some may require a very large finite amount.
kuroji wrote:While it's entirely possible that some computer will be able to design a better computer or a better version of itself, it won't be able to do that in real time unless it's modifying itself as it runs; otherwise there's lag time between the design and the creation of something like that.
This is technically correct but not really relevant. Massive improvements are possible purely through software. You can see this even in biology: a human brain has slightly less mass than a dolphin brain and much the same general microstructure, yet humans are capable of higher reasoning. FPGAs are an example of very low-level reconfiguration accessible to software; future computing substrates will similarly blur the line between software changes and hardware changes. It does of course take time to build new hardware, and even longer to build the tools to build new designs of hardware, but this isn't usually relevant to predictions; any fantasy involving instantaneous changes is a pointless strawman.
kuroji wrote:But eventually we'll run up against a wall, whether it's in 2020 or 2050, and chips will not be able to be miniaturized further
While strictly correct, this is not a near-term limitation on processing power. Existing chips are 2D planar arrays of rather crudely formed transistors with a few layers of interconnect. We could improve capacity by three to six orders of magnitude by going to 3D chips with thousands to millions of layers, but no one has done it because of cost, power and heat dissipation issues. We could improve clock speed by a factor of ten to one hundred using superconducting RSFQ circuits, but no one wants to make the massive investments in development, tooling and installation of liquid nitrogen cooling systems. Reversible computing has the potential to decrease power use by at least two orders of magnitude, but currently we don't have the surplus die area, design tools or software support to implement it in a commercial chip. Asynchronous logic has the potential to significantly improve processing speeds and reduce power, but it isn't used much because we don't have effective design automation to reduce the massive man-hour requirements of designing such chips.

And this is just conventional semiconductors. Speculative designs for nanocomputers are a lot more impressive; see Drexler's mechanical nanocomputer lower-bound calculations in Nanosystems. Of course, we don't have the tools to build these yet; we can only simulate them. Quantum computers are a whole other issue: they potentially give you massive raw computing power, if we can scale them, but their power isn't fully comparable to that of normal computers.
adam_grif wrote:But this doesn't stop people from making claims like "we'll learn how to control physics!" "We'll get unlimited processing power!" "technology will increase forever!"
You might as well claim that we will discover a practical method of instantaneous teleportation across any distance (i.e. infinite travel speed), or a practical method of extracting free energy from quantum vacuum fluctuations. It isn't quite impossible, but it seems very unlikely given our current understanding of physics, and such predictions have no place in serious futurism.
adam_grif wrote:And then the important question is: is 2.56x10^47 bits per second enough to simulate one gram of matter for one second with no loss of fidelity?
This is a nonsensical question. You can't 'perfectly simulate' any physical system, because (a) you can never perfectly know the starting state and (b) the behaviour of subatomic particles is probabilistic. AFAIK there are an infinite number of possible virtual-particle interaction sequences that gram of matter could have. However, this is completely irrelevant for nearly all practical purposes, as being 99.99% sure of what will happen is good enough.
adam_grif
Sith Devotee
Posts: 2755
Joined: 2009-12-19 08:27am
Location: Tasmania, Australia

Re: Upper limit on theoretical processing capacity?

Post by adam_grif »

Starglider wrote:This is a nonsensical question.
Fine.

Can this be used to simulate a computer with more power than it has in real time?

This was more what my initial post was about; the actual numbers involved aren't of interest.
A scientist once gave a public lecture on astronomy. He described how the Earth orbits around the sun and how the sun, in turn, orbits around the centre of a vast collection of stars called our galaxy.

At the end of the lecture, a little old lady at the back of the room got up and said: 'What you have told us is rubbish. The world is really a flat plate supported on the back of a giant tortoise.

The scientist gave a superior smile before replying, 'What is the tortoise standing on?'

'You're very clever, young man, very clever,' said the old lady. 'But it's turtles all the way down.'
Kuroji
Padawan Learner
Posts: 323
Joined: 2010-04-03 11:58am

Re: Upper limit on theoretical processing capacity?

Post by Kuroji »

adam_grif wrote:But this doesn't stop people from making claims like "we'll learn how to control physics!" "We'll get unlimited processing power!" "technology will increase forever!"
I ... oh my GOD, the stupidity and ignorance of those statements astound me. Anyone who theorizes that should be answered with "and since we can learn how to control physics, we can travel back in time to the moment we have this discussion and you can prove me right, so where are you?"

If we are controlling physics then we functionally become God. The odds of that are, to be blunt, nil. Of course, you knew this...
Steel, on nBSG's finale: "I'd liken it to having a really great time with these girls, you go back to their place, think its going to get even better- suddenly there are dicks everywhere and you realise you were in a ladyboy bar all evening."
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: Upper limit on theoretical processing capacity?

Post by Starglider »

adam_grif wrote:Can this be used to simulate a computer with more power than it has in real time?

This was more what my initial post was about; the actual numbers involved aren't of interest.
In the general case, of course not. However, it is often possible to simulate the output of a computer running a specific program using fewer operations and/or less storage space, by using a more efficient implementation of the same algorithm. This is essentially what happens every time a contemporary programmer optimises an existing program to run faster.
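A trivial toy example of what I mean (nothing deep; two implementations of the same input-to-output mapping):

Code: Select all

def triangular_slow(n: int) -> int:
    """The 'simulated' program: sums 1..n with n additions."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def triangular_fast(n: int) -> int:
    """Same output in O(1), using Gauss's closed form."""
    return n * (n + 1) // 2

# The fast version reproduces the slow machine's output exactly,
# using far fewer operations -- no loss of fidelity in the result.
assert all(triangular_slow(n) == triangular_fast(n) for n in range(1000))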
Sea Skimmer
Yankee Capitalist Air Pirate
Posts: 37390
Joined: 2002-07-03 11:49pm
Location: Passchendaele City, HAB

Re: Upper limit on theoretical processing capacity?

Post by Sea Skimmer »

It depends on the technology, but a limit should exist for the maximum work a given type of processor can do, depending on the type of work. That's because past a certain point, the bigger the processor gets, the more the time lags multiply, until eventually adding more processing cores would not actually increase speed. Barring absurd breakthroughs in technology, you can't speed up electrical signals or light past a certain point.

However, this kind of limit may never be relevant, because such densely packed, high-power computer chips would probably become too big and overheat long before the time lag was crippling. Plus, as Starglider points out, software is an enormous limitation, and software optimized for one purpose won't work as well for others. That means endless new software to meet new requirements.
"This cult of special forces is as sensible as to form a Royal Corps of Tree Climbers and say that no soldier who does not wear its green hat with a bunch of oak leaves stuck in it should be expected to climb a tree"
— Field Marshal William Slim 1956
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: Upper limit on theoretical processing capacity?

Post by Starglider »

Sea Skimmer wrote:It depends on the technology, but a limit should exist for the maximum work a given type of processor can do, depending on the type of work. That's because past a certain point, the bigger the processor gets, the more the time lags multiply, until eventually adding more processing cores would not actually increase speed.
More specifically, all computing power is not created equal. Computers vary in the speed at which they can perform various basic operations, the speed at which they can access different levels of storage, the number of things they can process simultaneously, and the time taken to exchange data between different processes. Grouping a huge number of processors together improves performance on parallel tasks but does nothing for serial tasks. Serial speed is always preferable, because it's easier to program for and because you can use it to simulate parallel computing almost losslessly; parallel speed, however, is much easier to add. The actual utility of parallel speed on any given task is described by Amdahl's law - though the classic 'law' is an extreme simplification, and practical parallel performance optimisation is a highly multidimensional problem.
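The classic single-parameter form, for reference (a sketch only; as I said, real workloads are far messier than one 'parallel fraction'):

Code: Select all

def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Amdahl's law: overall speedup when a fraction p of the work
    parallelises perfectly and the remainder stays strictly serial."""
    p = parallel_fraction
    return 1.0 / ((1.0 - p) + p / n_cores)

# Even with 95% parallelisable work, extra cores hit a hard ceiling:
for cores in (4, 64, 1024):
    print(cores, round(amdahl_speedup(0.95, cores), 1))
# 4 -> 3.5x, 64 -> 15.4x, 1024 -> 19.6x; the limit is 1/(1-0.95) = 20x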
Sea Skimmer wrote:However, this kind of limit may never be relevant, because such densely packed, high-power computer chips would probably become too big and overheat long before the time lag was crippling.
This first became relevant when designing discrete supercomputers; the original Cray machines had crazy packaging strategies to minimise connection lengths. ICs solved the problem for a while, but it soon became relevant again for processor designers. The biggest issue is that the clock signal takes time to propagate across the chip and through all the clock distribution stages; circuits have to be carefully designed to allow for this. Clock distribution burns up quite a lot of power without doing any useful work, but clock-less logic is currently too expensive to design and debug. For a while several high-end processors included 'drive' stages in the pipeline that devote clock cycles purely to pushing data around the chip, but this seems to have gone away for now, as chips have continued to shrink while clock speeds have stagnated. At the supercomputer level, low latency is critical in cluster interconnects; the main issue is NIC and switch speed, but wire delay is often relevant too.
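To put rough numbers on the wire-delay problem (this sketch assumes vacuum light speed; real on-chip signals propagate a good deal slower, so the real situation is worse):

Code: Select all

# How far can a signal possibly travel in one clock cycle?
c = 3.0e8  # speed of light in vacuum, m/s

for ghz in (1, 3, 10):
    period_s = 1.0 / (ghz * 1e9)
    reach_cm = c * period_s * 100
    print(f"{ghz} GHz: cycle = {period_s * 1e12:.0f} ps, "
          f"light covers {reach_cm:.0f} cm")
# At 3 GHz light covers only ~10 cm per cycle, and real on-chip
# signals manage a fraction of that -- hence drive stages, careful
# clock-tree design, and the premium on short interconnects.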
Purple
Sith Acolyte
Posts: 5233
Joined: 2010-04-20 08:31am
Location: In a purple cube orbiting this planet. Hijacking satellites for an internet connection.

Re: Upper limit on theoretical processing capacity?

Post by Purple »

I was reading this and just wondering something.
If we ignore things like energy losses and assume that by some magical effect we had a computer that can simulate itself in real time, the simulation cannot possibly be more efficient than the computer itself (otherwise it would not be a simulation).
Now suppose we run the simulation, and the simulation simulates another one, and so on for N iterations. How would we then get infinite processing power?
Wouldn't we only have one iteration actually available to do something, while the rest of them (N-1) are using 100% of their processing power to run the next simulation?

Either my logic is flawed or the whole endeavor is pointless.

adam_grif wrote:And then the important question is: is 2.56x10^47 bits per second enough to simulate one gram of matter for one second with no loss of fidelity?
Also, I am not exactly well educated in terms of physics, but would this not go against Heisenberg's uncertainty principle?
It has become clear to me in the previous days that any attempts at reconciliation and explanation with the community here has failed. I have tried my best. I really have. I pored my heart out trying. But it was all for nothing.

You win. There, I have said it.

Now there is only one thing left to do. Let us see if I can sum up the strength needed to end things once and for all.
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: Upper limit on theoretical processing capacity?

Post by Starglider »

Purple wrote:If we ignore things like energy losses and assume that by some magical effect we had a computer that can simulate itself in real time, the simulation cannot possibly be more efficient than the computer itself (otherwise it would not be a simulation).
It can be more efficient if you only want the same output. It can't if you want the complete internal state description.
Purple wrote:Either my logic is flawed or the whole endeavor is pointless.
It is inherently pointless, because you won't get any information out of ten nested VMs that you wouldn't get out of a single VM.
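A toy model, if it helps (the 'efficiency' factor here is just an assumed per-level emulation overhead, not a measured figure):

Code: Select all

def useful_capacity(base_ops: float, levels: int, efficiency: float) -> float:
    """Nested self-simulation: each level spends all its capacity
    hosting the next, so only the innermost level does useful work."""
    return base_ops * efficiency ** levels

base = 1e12  # ops/sec of the physical machine (assumed figure)
for n in (0, 1, 5, 10):
    print(n, useful_capacity(base, n, efficiency=0.9))
# Even with a magical efficiency of 1.0 the useful capacity never
# exceeds base -- nesting multiplies it by at most 1, never more.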
Sarevok
The Fearless One
Posts: 10681
Joined: 2002-12-24 07:29am
Location: The Covenants last and final line of defense

Re: Upper limit on theoretical processing capacity?

Post by Sarevok »

What could be some possible uses for extremely high computational power? As in, far-future computers approaching the upper limit of what is physically possible.
I have to tell you something: everything I wrote above is a lie.
Kuroji
Padawan Learner
Posts: 323
Joined: 2010-04-03 11:58am

Re: Upper limit on theoretical processing capacity?

Post by Kuroji »

Running Crysis on Very High.

But seriously? It will run the full gamut, as it does today. Rapid sequencing of genomes, simulations of any and every kind, crunching data to solve problems or to design new drugs, or for that matter to solve engineering problems (can you say 'space elevator' and '50% efficient solar panels'?). Self-writing software, or at least self-optimizing software, to counter bloatware and to predict and patch security holes and bugs. And possibly artificial intelligence, once there's enough raw processing power. Of course, there's also the matter of simple recreational use. People love their PCs and their cell phones...
Steel, on nBSG's finale: "I'd liken it to having a really great time with these girls, you go back to their place, think its going to get even better- suddenly there are dicks everywhere and you realise you were in a ladyboy bar all evening."
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: Upper limit on theoretical processing capacity?

Post by Starglider »

We have as much chance of predicting the major uses of computing power in 50 years' time as people in 1960 did of predicting current computer power usage. Which is to say, very little.
Teleros
Jedi Council Member
Posts: 1544
Joined: 2006-03-31 02:11pm
Location: Ultra Prime, Klovia

Re: Upper limit on theoretical processing capacity?

Post by Teleros »

Sarevok wrote:What could be some possible uses for extremely high computational power? As in, far-future computers approaching the upper limit of what is physically possible.
Take something in the world, reduce it to numbers, and a computer can in principle simulate it. To use some obvious examples: audio and video files, games, artwork, protein simulations, cryptography, the human brain... in other words, a hell of a lot. There's a reason "virtual reality" has the word "reality" in it, after all. ;)