Quantum computers
Moderator: Alyrium Denryle
Quantum computers
Anyone here have techTV? (or ZDTV as it's also called, I think that's another name for it) I was watching The Screen Savers and frequently they have a guest scientist named Michio Kaku who talks about how in about 20 years Moore's law will collapse and Silicon Valley will become a rust belt with the rise of quantum computers. I kinda forget a little about what he was talking about, but he was saying that silicon is limited in the sense that it can't be used forever as you make the processors smaller and smaller, that pretty soon you'll only be able to use a few atoms, and then you have to go smaller - with quantum computers. How exactly would quantum computers work? I mean, what kind of subatomic particles would they use, positrons? Photons are a quantum particle, aren't they?
Quantum computing refers to the concept of using particles that are in a superposition of two or more states to solve certain types of equations very quickly, because every possible answer is tested at once. This is possible because on the quantum level the state of any particle is indeterminate until it is measured, thus it is possible for a particle that is not being measured to be in two states at once. By using input that is in two states at once, one can then theoretically test all possible input states for correctness at once, making certain problems for which there is no efficient algorithm (factoring) computable much faster. This is a very bad thing for cryptographers, because most modern computer codes rely on the sheer impracticality of factoring large numbers to remain secure. If a quantum computer ever became a reality, then it would become possible to break any code very quickly, posing a major threat to national security. On the other hand, this would be very beneficial to mathematicians, because they would be virtually guaranteed of employment.
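As a very rough illustration of why factoring is the bottleneck, here's a toy Python sketch (purely illustrative, not anything a real codebreaker would run): a classical machine has to try candidate divisors one after another, whereas the idea above is that a quantum machine could explore the candidates in superposition.

Code:
# Toy classical factoring by trial division: divisors are checked one at a
# time, which is why factoring a number hundreds of digits long is hopeless
# on an ordinary computer.
def trial_division(n):
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d   # found a nontrivial factorisation
        d += 1
    return None                # n is prime

print(trial_division(3127))    # (53, 59) -- easy here, hopeless at RSA sizes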
As for what this guy is talking about, he's only saying that eventually computing will switch to a medium other than silicon. Beyond that he only proves that he has no clue what quantum computing is.
data_link has resigned from the board after proving himself to be a relentless strawman-using asshole in this thread and being too much of a pussy to deal with the inevitable flames. Buh-bye.
data_link wrote: Quantum computing refers to the concept of using particles that are in a superposition of two or more states to solve certain types of equations very quickly, because every possible answer is tested at once. This is possible because on the quantum level the state of any particle is indeterminate until it is measured, thus it is possible for a particle that is not being measured to be in two states at once. By using input that is in two states at once, one can then theoretically test all possible input states for correctness at once, making certain problems for which there is no efficient algorithm (factoring) computable much faster. This is a very bad thing for cryptographers, because most modern computer codes rely on the sheer impracticality of factoring large numbers to remain secure. If a quantum computer ever became a reality, then it would become possible to break any code very quickly, posing a major threat to national security.

Um, what about the security systems that shut you out after 3 failed tries of trying to crack the password?

data_link wrote: As for what this guy is talking about, he's only saying that eventually computing will switch to a medium other than silicon. Beyond that he only proves that he has no clue what quantum computing is.

He's a Harvard educated theoretical physicist, doesn't mean he's infallible but I'm sure he knows more than you do about it.
- GrandMasterTerwynn
- Emperor's Hand
- Posts: 6787
- Joined: 2002-07-29 06:14pm
- Location: Somewhere on Earth.
Re: Quantum computers
Shrykull wrote: Anyone here have techTV? (or ZDTV as it's also called, I think that's another name for it) I was watching The Screen Savers and frequently they have a guest scientist named Michio Kaku who talks about how in about 20 years Moore's law will collapse and Silicon Valley will become a rust belt with the rise of quantum computers. I kinda forget a little about what he was talking about, but he was saying that silicon is limited in the sense that it can't be used forever as you make the processors smaller and smaller, that pretty soon you'll only be able to use a few atoms, and then you have to go smaller - with quantum computers. How exactly would quantum computers work? I mean, what kind of subatomic particles would they use, positrons? Photons are a quantum particle, aren't they?

Current single-processor computers are very limited. In ten years, they will have clock speeds of 10 - 15 GHz and feature sizes of 35 nm (about a quarter the size of modern processors, which have 130 nm features). At these speeds, the speed of light starts to become a serious issue in processor design. (The signals, travelling at about 2/3 lightspeed, can't make it across the chip in a single clock cycle.)
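To put rough numbers on that light-speed point, here's a quick back-of-the-envelope check (a Python one-off; the figures are ballpark):

Code:
# At 10 GHz, one clock cycle lasts 100 picoseconds.  A signal moving at
# roughly 2/3 of lightspeed covers only about 2 cm in that time, which is
# on the order of the width of a chip.
c = 3.0e8                              # speed of light, m/s
cycle_time = 1 / 10e9                  # seconds per cycle at 10 GHz
distance = (2 / 3) * c * cycle_time    # metres covered per cycle
print(f"{cycle_time * 1e12:.0f} ps per cycle, ~{distance * 100:.0f} cm per cycle")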
But, before quantum computers, there are several other tricks consumer-grade microprocessor designers will resort to.
A) Parallel computing: Two approaches. Multiple CPUs sharing one chip and a bunch of glue logic. We already see devices like this, such as Xilinx's Virtex II Pro, which has four IBM PowerPC cores embedded in a reconfigurable logic matrix. You hook up microprocessor cores according to what you want the computer to do. The other approach is just to have the reconfigurable logic assembled into numerous specialized parallel units. This means you'd have a computer that's good at basic window drawing and integer maths for your word processor one day, and a computer capable of fast floating-point calculations for your game the next. (There's a loose software sketch of this idea after point B.)
B) Asynchronous clocks: Modern microprocessors march to the beat of a single clock. Unfortunately some instructions take multiple clock-cycles to execute, tying up valuable resources and computing time. Letting the various parts of the chip run at different speeds is one way of squeezing a little extra performance from the chip.
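Going back to (A) for a moment, here's that loose software-level sketch of the split-the-work-across-units idea (purely illustrative Python; it has nothing to do with how the actual silicon would be wired):

Code:
# Four workers standing in for four cores on one chip, each handling a
# share of the same job in parallel.
from multiprocessing import Pool

def transform(vertex):
    x, y, z = vertex
    return (2 * x, 2 * y, 2 * z)       # stand-in for a real vector transform

if __name__ == "__main__":
    vertices = [(1, 2, 3), (4, 5, 6), (7, 8, 9), (10, 11, 12)]
    with Pool(processes=4) as pool:
        print(pool.map(transform, vertices))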
A Primer on Quantum Computing.
But quantum computing is a whole other animal. A quantum computer has a number of things called qubits. These are the basic unit of information representation for a quantum computer, just as a bit is for a binary computer. However, a qubit can take on all of its potential values through quantum superposition until you observe it, collapsing it into a single state that produces the answer.
To better explain this, one must realize that a computer is basically a complex state machine. A state machine has a bunch of discrete states that it can be in. For example, on the computer you're reading this on, there are several states that it has to go through.
State A: Wait here for a user input event.
If the event was a click on the scroll button, then we go to State B.
State B: Change the contents of the text window.
Then we go to State C.
State C: Redraw the window.
At which point, we go back to State A.
A better example of this is when you first learned how to multiply large numbers in elementary school. Back then you had to go through several steps to get the answer. These steps could be thought of as states in your state machine. The computer has to go through steps to get to its answer too.
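Here's a quick Python sketch of that A/B/C machine (the event name is made up, just to show the one-step-at-a-time nature of it):

Code:
# One transition per call: the machine only ever does a single step at a
# time, which is the point being made about serial computers.
def step(state, event=None):
    if state == "A":                      # State A: wait for user input
        return "B" if event == "scroll_click" else "A"
    if state == "B":                      # State B: change the text contents
        return "C"
    return "A"                            # State C: redraw, then wait again

state = "A"
for event in ("scroll_click", None, None):
    state = step(state, event)
    print(state)                          # prints B, then C, then A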
In an ordinary computer, each state would take about one tick of the clock. So to produce a vector transform that moves your character in UT, the computer might take 5 - 15 clock ticks.
Now here's the difference between quantum computers and ordinary serial computers. Because of superposition, a quantum computer can represent all these states in a single clock cycle!! A quantum computer, if we could build one, would be the ultimate massively parallel computing machine, similar to the ones I talked about waaay back (but much bigger in scope.)
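To get a feel for the scale, here's a classical simulation of just the bookkeeping (Python with numpy; a real quantum computer would not work this way, it's only to show how many states a register represents at once):

Code:
# An n-qubit register is described by 2**n complex amplitudes.  In an equal
# superposition, all 2**n basis states carry weight at the same time.
import numpy as np

n = 10
state = np.full(2**n, 1 / np.sqrt(2**n), dtype=complex)
print(f"{2**n} basis states held simultaneously")             # 1024 for n = 10
print(f"total probability = {np.sum(np.abs(state)**2):.3f}")  # sums to 1.000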
However, quantum computers are not twenty years off. They're more like fifty or 100 years off.
Tales of the Known Worlds:
2070s - The Seventy-Niners ... 3500s - Fair as Death ... 4900s - Against Improbable Odds V 1.0
Shrykull wrote: Um, what about the security systems that shut you out after 3 failed tries of trying to crack the password?

Um... you do realize the difference between code-breaking and hacking, right?

Shrykull wrote: He's a Harvard educated theoretical physicist, doesn't mean he's infallible but I'm sure he knows more than you do about it.

Appeal to authority. Look it up.
data_link has resigned from the board after proving himself to be a relentless strawman-using asshole in this thread and being too much of a pussy to deal with the inevitable flames. Buh-bye.
data_link wrote:

Shrykull wrote: Um, what about the security systems that shut you out after 3 failed tries of trying to crack the password?

Um... you do realize the difference between code-breaking and hacking, right?

Shrykull wrote: He's a Harvard educated theoretical physicist, doesn't mean he's infallible but I'm sure he knows more than you do about it.

Appeal to authority. Look it up.
I figured you would say that. It isn't an appeal to authority when he actually knows what he's talking about, like Wong's Stephen Hawking example. You didn't even see what this guy was talking about (and I'm guessing you haven't even heard of him, I bet), and you're making assumptions already? Sounds like a hasty generalization.
Shrykull wrote: I figured you would say that. It isn't an appeal to authority when he actually knows what he's talking about, like Wong's Stephen Hawking example. You didn't even see what this guy was talking about (and I'm guessing you haven't even heard of him, I bet), and you're making assumptions already? Sounds like a hasty generalization.

I don't make the assumption that he has no understanding of the subject; I was only pointing out that he wasn't showing any in the argument you posted. You're right, however - I made the assumption that you reported what he said accurately, which in the absence of a direct quote is clearly fallacious. It is possible that his argument demonstrated a clear understanding of quantum computing, which was lost when you posted your understanding of it here. Regardless, since I never made an evaluation of him, your pointing out that he knows what he is talking about does not show that my knowledge is in error. Now, if you want to criticize my understanding of quantum mechanics, I suggest that you first back up your statements with an actual point.
data_link has resigned from the board after proving himself to be a relentless strawman-using asshole in this thread and being too much of a pussy to deal with the inevitable flames. Buh-bye.
- Enlightenment
- Moderator Emeritus
- Posts: 2404
- Joined: 2002-07-04 07:38pm
- Location: Annoying nationalist twits since 1990
Re: Quantum computers
Shrykull wrote: ...in about 20 years Moore's law will collapse and Silicon Valley will become a rust belt with the rise of quantum computers.

That's silly. If Moore's Law collapses for traditional computing techniques and everyone switches to quantum computing, then Silicon Valley will switch from making traditional logic to making quantum logic. The only way Silicon Valley is going to turn into a rust belt is if Moore's Law collapses full stop, or the global market for US-designed computers implodes on account of legally-mandated Palladium.
It's not my place in life to make people happy. Don't talk to me unless you're prepared to watch me slaughter cows you hold sacred. Don't talk to me unless you're prepared to have your basic assumptions challenged. If you want bunnies in light, talk to someone else.
The current issue of Scientific American has a nice article on Quantum Computing, giving some info on superposition and such.

Supposedly, a quantum computer could potentially perform an operation in one second which might take 150,000 years for a traditional supercomputer. Sweet... how will Gates slow *that* computer down?

(It also has a nice article covering progress on loop quantum gravity, a candidate theory of everything. These articles can be found in the mag and online at www.sciam.com under the current issue.)
- Lord Pounder
- Pretty Hate Machine
- Posts: 9695
- Joined: 2002-11-19 04:40pm
- Location: Belfast, unfortunately
- Contact:
After reading the book Timeline by Michael Crichton I became fascinated with the idea of quantum physics and quantum computers. According to Crichton's theory, the computers will abandon binary; no more will it be 1s and 0s, but you can use 0-9, giving you much more power in a smaller space. I know this was only a book and Crichton is only an author, but his previous books have been well researched, so how accurate are his theories?
RIP Yosemite Bear
Gone, Never Forgotten
- Wicked Pilot
- Moderator Emeritus
- Posts: 8972
- Joined: 2002-07-05 05:45pm
Darth Pounder wrote: I know this was only a book and Crichton is only an author, but his previous books have been well researched, so how accurate are his theories?

Well, there still is no Jurassic Park. Just out of curiosity, does he hold any scientific or engineering degrees?
The most basic assumption about the world is that it does not contradict itself.
- Uraniun235
- Emperor's Hand
- Posts: 13772
- Joined: 2002-09-12 12:47am
- Location: OREGON
- Contact:
Zoink wrote: The current issue of Scientific American has a nice article on Quantum Computing, giving some info on superposition and such. Supposedly, a quantum computer could potentially perform an operation in one second which might take 150,000 years for a traditional supercomputer. Sweet... how will Gates slow *that* computer down?

Because as we all know, it's secretly Bill Gates' desire to bring every computer he sees to a dying crawl.