Next Big Future had an update on the Blue Brain project today, and this bit caught my eye:
The Blue Brain team is now developing a model of a complete rat brain—that should be done in 2010—Markram will download the simulation into a robotic rat, so that the brain has a body. He’s already talking to a Japanese company about constructing the mechanical animal.
Installing Blue Brain in a robot will also allow it to develop like a real rat. The simulated cells will be shaped by their own sensations, constantly revising their connections based upon the rat’s experiences. “What you ultimately want,” Markram says, “is a robot that’s a little bit unpredictable, that doesn’t just do what we tell it to do.” His goal is to build a virtual animal—a rodent robot—with a mind of its own.
I had no idea that they were this far along. How freakin' cool will that be if they pull off the simulated cyber-rat in the next year?
Also, what implications does this have for a human-level simulation in the next 10-20 years?
All those moments will be lost in time... like tears in rain...
I think the claimed timeline is much too optimistic - I suspect that creating the rat body alone will take several years. Also, I think there will be huge teething problems with creating the mind/body interface. Starglider is involved in AI and can provide a less intuition-based answer, but I suspect his expert knowledge will agree with my intuition.
TVWP: "Janeway says archly, "Sometimes it's the female of the species that initiates mating." Is the female of the species trying to initiate mating now? Janeway accepts Paris's apology and tells him she's putting him in for a commendation. The salamander sex was that good."
"Not bad - for a human" -Bishop to Ripley
GALACTIC DOMINATION Empire Board Game, visit link below:
GALACTIC DOMINATION
The Future of Humanity Institute at Oxford University released a Whole Brain Emulation Roadmap last year; people interested in the timeline for human brain simulation may want to take a look.
tl;dr version: the roadmap claims that all we need are incremental improvements in currently existing technologies, and that getting enough computing power/memory is the main bottleneck for emulating a whole human brain. Page 80 has a table of when they estimate the necessary computing power to be available: depending on how accurately the brain needs to be modeled, they estimate that the earliest year in which $1 million buys enough supercomputer time for a real-time emulation falls somewhere between 2008 and 2111. Earlier in the document (pages 13-14), it's remarked that a consensus of people involved in one emulation research workshop places the necessary level of detail at level 4, 5 or 6 on the scale they use, which in computing-power terms translates into the years 2019, 2033 and 2044 respectively. Take from that what you will.
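If you want a feel for where those dates come from, here's a minimal Python sketch of the kind of price-performance extrapolation behind the roadmap's table. Every figure below is my own illustrative assumption (round numbers picked to land near the roadmap's dates), not anything taken from the document:

import math

# All figures are illustrative assumptions, not the roadmap's numbers.
FLOPS_PER_DOLLAR_2008 = 1e9    # assume ~1 GFLOPS per dollar around 2008
DOUBLING_TIME_YEARS = 1.5      # assumed price-performance doubling time
BUDGET = 1e6                   # the roadmap's $1 million benchmark

# Hypothetical compute demands for three levels of modelling detail (FLOPS)
demand = {"level 4-ish": 1e17, "level 5-ish": 1e20, "level 6-ish": 1e22}

for level, flops in demand.items():
    affordable_now = FLOPS_PER_DOLLAR_2008 * BUDGET   # FLOPS $1M buys in 2008
    doublings = math.log2(flops / affordable_now)     # doublings still needed
    print(f"{level}: ~{2008 + doublings * DOUBLING_TIME_YEARS:.0f}")

With those assumptions it prints roughly 2018, 2033 and 2043, the same ballpark as the roadmap's table; the point is that the whole forecast hangs on the doubling time holding up.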
"You have zero privacy anyway. Get over it." -- Scott McNealy, CEO Sun Microsystems
"Did you know that ninety-nine per cent of the people who contract cancer wear shoes?" -- Al Bester in J. Gregory Keyes' book Final Reckoning
There are two components to whole-brain emulation: the computing power and the model. The computing power part is coming along nicely and has demonstrated nice smooth scaling, so predictions about when it will reach specific thresholds are quite sensible.
The model, not so much. We've characterised gross brain morphology quite well, and there are now very good models of the low level (firing patterns in local circuits, e.g. within a cortical microcolumn, down to the synapse level), but there are still big gaps in intermediate-scale morphology, long-term plasticity and brain development. The assumption that existing models will work perfectly (for whole-brain simulation) as soon as we have sufficient computing power to run them is the standard brand of ridiculous over-optimism that has been plaguing artificial intelligence since the field was founded fifty years ago. Of course the decent researchers aren't making that assumption, but plenty of the boosters, PR people and general hangers-on are.
All that said, accurate whole-brain simulation within a few decades is pretty much inevitable. Unlike de-novo (from scratch) AI, the problem is crackable by brute force and hard work, no special and unpredictable insight required.
Would you say a whole-brain simulation of a human could occur before the first AGI is ready? After all, as you mention, this is more of a brute-force problem. Progress has been steady in both increased computing power and detailed understanding of the brain. An AGI, on the other hand, is more of an uncharted territory with many unsolved problems.
I have to tell you something: everything I wrote above is a lie.
Sarevok wrote:Would you say a whole-brain simulation of a human could occur before the first AGI is ready?
A simulated human brain is an AGI. If you mean 'will this approach work before the people trying to make AGIs from scratch succeed' then yes, certainly that may happen.
The human brain (~1,300-1,400 g) outweighs an entire rat (~600-700 g) at least twofold, and the rat's own brain is only around 2 g, so the difference in simulation scale is astronomically massive.
One thing going for whole-mind simulation is the possibility of destructively scanning a flash-frozen brain. You would need to do some reconstruction on the resulting model, but it would skip a lot of the mess of trying to model a brain without a complete understanding.
Non-destructive whole-brain scanning, now that's a pipe dream.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
A brain is very fragile, quite sensitive to heat and, to boot, constantly changing. The immune system doesn't function inside the brain because the neurons physically map their synapses into their RNA, and so would always turn up as an unknown to the immune system.
For a brain scan worth a damn, you need to map every neuron (10^11) and all the synapses (10^14) between those neurons, and record the chemical composition of the surrounding fluids (because neurons can communicate over "long" distances chemically). The entire mess is a vast array of subtle differences which make up a functioning brain.
The required resolution is so fine, and the amount of data so vast, that there are real physical limitations on reading the information out of a brain fast enough to be useful.
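A quick back-of-the-envelope Python sketch of the raw data volume those counts imply; the bytes-per-element figures are guesses of mine, for illustration only:

# Crude storage estimate for a raw structural scan.
NEURONS = 1e11
SYNAPSES = 1e14
BYTES_PER_NEURON = 100   # assumed: position, cell type, morphology summary
BYTES_PER_SYNAPSE = 10   # assumed: endpoints, strength, receptor type

total_bytes = NEURONS * BYTES_PER_NEURON + SYNAPSES * BYTES_PER_SYNAPSE
print(f"~{total_bytes / 1e15:.1f} PB")   # ~1.0 petabyte, structure only

And that's just static structure; recording the chemical state of the surrounding fluid at any useful time resolution would multiply it many times over.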
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
I think it's more likely that we'll have some form of gradual electronic replacement - beginning with brain-damaged individuals, for example. The brain has something around a hundred subunits; replacing and, eventually, enhancing brain function one small step at a time may be a future possibility.
There is also the possibility of gene therapy to make this process easier or to somehow replace it entirely.
Give fire to a man, and he will be warm for a day.
Set him on fire, and he will be warm for life.
Xon wrote:The required resolution is so fine, and the amount of data so vast, that there are real physical limitations on reading the information out of a brain fast enough to be useful.
The only practical way to do real-time scanning (including non-destructive and progressive uploading) is to thread the whole brain with a nanoengineered sensor network, e.g. a network built out of carbon nanotube interconnects and processor nodes the size of cell nuclei. This is physically possible, in that simulations of basic nanorobotic systems indicate that the components needed for systems like this should work. Unfortunately we have neither the ability to fabricate nanorobotic components nor the body of engineering knowledge required for such intricate applications. Give it another 50 years (or wait for someone to make an AI that solves the problem).
Starglider, barring magic sensors, you are absolutely correct. Any type of brain upload would require adding stuff to the human brain; it just isn't wired to have its state read.
Xeriar wrote:I think it's more likely that we'll have some form of gradual electronic replacement - beginning with brain damaged individuals, for example. The brain has something around a hundred subunits - replacing and, eventually, enhancing brain function small steps at a time may be a future possibility.
One major issue with implants or nanotech like this is heat dissipation. This massively limits what you can do computationally, in both complexity and magnitude.
The human brain just isn't designed to radiate more than a few watts, possibly tens of watts.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
Xon wrote:One major issue with implants or nanotech like this is heat dissipation. This massively limits what you can do computationally, in both complexity and magnitude.
Oh, it's not so bad. The computations-per-watt ratio has been scaling nearly as well as the computations-per-dollar ratio, as long as you're not after absolute serial speed. The theoretical waste heat limit for technology is at least five orders of magnitude better than the brain, even without considering reversible computing, partly because biochemistry is inherently inefficient and partly because the brain wastes so much energy on non-compute biological activities.
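To put rough numbers on that 'five orders of magnitude' claim, here's a quick Python sketch comparing the brain's energy per synaptic event to the Landauer limit; the synaptic-event rate is an assumed round figure (~10^14 synapses firing at ~10 Hz):

import math

K_B = 1.38e-23           # Boltzmann constant, J/K
T = 310                  # body temperature, kelvin
BRAIN_WATTS = 20         # typical whole-brain power draw
EVENTS_PER_SEC = 1e15    # assumed: ~1e14 synapses at ~10 Hz

landauer = K_B * T * math.log(2)          # ~3e-21 J, minimum per bit erased
per_event = BRAIN_WATTS / EVENTS_PER_SEC  # ~2e-14 J per synaptic event

print(f"brain runs ~{per_event / landauer:.0e}x above the Landauer limit")

That comes out around 10^6-10^7; treating a synaptic event as a single bit operation is obviously crude, but it gives the right order of magnitude for the comparison.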
The human brain just isn't designed to radiate more than a few watts, possibly tens of watts.
Even if offloading to an external device is out of the question, there's no need to have the bulk of the implant in the brain cavity. Electronic communication is ridiculously fast compared to neural impulses. For heat dissipation you could distribute the computing elements as a series of flat bricks implanted beneath the skin of the torso, and just have the electrode array in the brain. For additional freak-out-the-luddites-factor you could have combined solar charger and heat radiator panels that unfold out of people's backs like black insect wings.
Starglider wrote:For additional freak-out-the-luddites-factor you could have combined solar charger and heat radiator panels that unfold out of people's backs like black insect wings.
You know the hardcore body-mod types won't settle for anything less than grafting big hunks of aluminum to their skulls.
All those moments will be lost in time... like tears in rain...
Xon wrote:
One major issue with implants or nanotech like this is heat dissipation. This massively limits what you can do computationally, in both complexity and magnitude.
The human brain just isn't designed to radiate more than a few watts, possibly tens of watts.
Dissipation scales with the fourth power of temperature. Adding another watt or so onto the ~25 watts the brain currently dissipates is not a huge issue, and we can already do quite a bit with a single watt. The initial step is not going to involve improving the original design at all - it's going to be about restoring functionality to damaged and destroyed parts of the brain. Like an artificial heart transplant - it doesn't need to function better, it needs to function.
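A quick sanity check of that fourth-power arithmetic in Python, treating the head as an ideal radiator (a big simplification, since in reality most brain heat leaves via blood flow rather than radiation):

# If radiated power scales as P ~ T^4, then dP/P = 4*dT/T,
# so the temperature rise for a small extra load is dT = T*(dP/P)/4.
T_BRAIN = 310      # kelvin, roughly body temperature
P_NOW = 25         # watts, the figure used above
P_EXTRA = 1        # watt of added implant heat

dT = T_BRAIN * (P_EXTRA / P_NOW) / 4
print(f"~{dT:.1f} K warmer")   # ~3.1 K under this crude radiator model

Under the naive radiator model that's about 3 K, which sounds worse than it is; in practice increased blood perfusion, not radiation, carries away the extra heat, which is why a watt-scale implant looks manageable.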
ThomasP wrote:
You know the hardcore body-mod types won't settle for anything less than grafting big hunks of aluminum to their skulls.
These are humans we're talking about. We will see silver, copper, and occasionally gold plating. It's going to be a fashion statement.
Give fire to a man, and he will be warm for a day.
Set him on fire, and he will be warm for life.
The human brain and nerve tissue are actually surprisingly thermally efficient.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
Compared to modern machines, for what it does and the way it does it, yes. I wouldn't be surprised if a lot of visual tasks are close to ideal, but for logic and math, it is clearly a pathetic hack job.
Give fire to a man, and he will be warm for a day.
Set him on fire, and he will be warm for life.
"Okay, I'll have the truth with a side order of clarity." ~ Dr. Daniel Jackson.
"Reality has a well-known liberal bias." ~ Stephen Colbert
"One Drive, One Partition, the One True Path" ~ ars technica forums - warrens - on hhd partitioning schemes.
Completely irrelevant (to Xeriar's point). That article discusses how closely the mechanism for propagating depolarisation waves along neuron cell membranes approaches the theoretical chemical optimum. It says nothing at all about how this compares to the efficiency of other mechanisms for transmitting information, and in fact the way neurons do so is ridiculously slow and inefficient compared to electronic transmission in a simple wire. Energy is constantly wasted maintaining concentration gradients, and even more energy is wasted restoring the gradient after a local depolarisation.
Copper interconnects in CPUs require no energy when not in use and dissipate negligible energy to resistance (switching and gate leakage dominate semiconductor power use), despite operating a million times faster than the fastest nerves. Resistive losses are set to drop even further when we start using optical on-chip interconnects and/or carbon-nanotube-based hyperconductors, and if it really bothers you, you can always use superconductors and transmit information at zero energy cost.
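For scale, the 'million times faster' figure is easy to sanity-check in Python; the velocities are standard textbook values (fastest myelinated axons versus signals in a wire at roughly two-thirds of light speed):

NERVE_M_PER_S = 120     # fastest myelinated fibres, ~100-120 m/s
WIRE_M_PER_S = 2e8      # electrical signal in copper, ~2/3 of light speed

print(f"wire carries signals ~{WIRE_M_PER_S / NERVE_M_PER_S:.1e}x faster")

That prints ~1.7e+06, and the gap in switching rate (gigahertz transistors versus neurons that top out near a kilohertz) is of the same order.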
Starglider wrote: you can always use superconductors and transmit information at zero energy cost.
Assuming room-temperature superconductors. Otherwise there's still the cost of cooling the fuckers, which would make it kinda like putting your car on a flatbed with a full tank of gas and then claiming that your car gets INFINITY MPG because the amount of gas in the tank never drops no matter how far you drive...
Xeriar wrote:This belief that humans will all stay as fleshy bags of meat clinging to a thin shell of ~300K temperatures amuses me.
Current superconducting processor designs only work with low-temperature superconductors, and they do dissipate (a small amount of) power in the logic gates, so they'd need active cooling even in space. That's probably a fabrication issue though; there's no reason in theory why we couldn't use high-temperature superconductors, and liquid-nitrogen temperatures are rather easier to achieve without active cooling. Power consumption is not really the attraction of superconducting CPUs anyway; it's the fact that they can achieve terahertz gate-switching frequencies and effective clock speeds in the 100 GHz range. At the moment massive parallelism is getting all the attention, but sometimes there's no substitute for serial speed.
Anyway, my mention of superconductors was facetious; for CPUs based on conventional transistor gates, they are overkill for on-chip interconnects. Certain types of carbon nanotubes have been demonstrated to conduct electrons ballistically, which equates to basically zero resistance over sub-centimeter distances, with easier fabrication and a much more generous temperature range.
A biological human brain is not intrinsically superior to a stash of sufficiently interconnected circuits. The problem is that a human-like intelligence would have to emerge from a human-like development... After all, we're all supposedly human from the day we're born, but for years our intelligence (measured as our ability to understand problems and solve them with the least input possible) doesn't even match that of a dog, or, at later stages, a chimpanzee.
The experience component should not be dismissed out of hand. It is experience that makes us the people we are now, and very likely it will be what ultimately makes computers intelligent. But first they've got to develop an intelligence that learns from experience as ours does; otherwise, their efforts will run into a hard brick wall, so to speak. There's no way we can realistically input all the preset electrical pathways of an adult brain into computer chips any time soon, IMHO.
Life in Commodore 64: 10 OPEN "EYES",1,1
20 GET UP$:IF UP$="" THEN 20
30 GOTO BATHROOM
...
GENERATION 29
Don't like what I'm saying?
Take it up with my representative:
Akkleptos wrote:The experience component should not be dismissed out of hand. It is experience that makes us the people we are now, and very likely it will be what ultimately makes computers intelligent. But first they've got to develop an intelligence that learns from experience as ours does; otherwise, their efforts will run into a hard brick wall, so to speak. There's no way we can realistically input all the preset electrical pathways of an adult brain into computer chips any time soon, IMHO.
For brain simulations, this problem is actually even harder than it sounds. The reason is that human brain development is very carefully choreographed to build up capability; the way children learn is not the same as the way adults learn. Being able to recreate an adult brain is not enough, and simulating the entire brain-development path (the equivalent of a human childhood) is a really hard problem. Currently most researchers seem to be hoping they can fudge/tweak their way around this problem. Historically that hasn't worked very well.
Other kinds of AI have the same general problem of 'how to get a reasonable base of knowledge into the system', but the specifics are radically different depending on the architecture.