LionElJohnson's Singularity-God tangent
Moderator: Alyrium Denryle
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
LionElJohnson's Singularity-God tangent
Lagmonster notes:
- This was split from the population growth thread as it was a verifiable tangent.
- Addressing Lion's arguments is acceptable. Substanceless flames are not.
- The MCP is your AI God now, bitches. End of line.
Honestly, I think that all of this discussion of Malthusian collapse is a moot point; either we'll build a Friendly AI that will usher in an age of peace and prosperity as our godlike robot overlord, or we'll build an unFriendly AI that'll wipe us out as it busily converts the solar system into paperclips or something.
As for the OP's question of raising population growth, I would invest in life-extension treatments to make us biologically immortal, as well as in producing humans artificially in tanks with rapid prototyping-style technologies.
Re: How would you raise population growth?
Making an AI doesn't produce factories. At best, it simply means that administrative costs drop dramatically as you don't need people and decisions get better, but it will still take materials and resources to drive the economy. And this whole food problem is happening to people who are lacking in the material and resource category.
LionElJonson wrote: Honestly, I think that all of this discussion of Malthusian collapse is a moot point; either we'll build a Friendly AI that will usher in an age of peace and prosperity as our godlike robot overlord, or we'll build an unFriendly AI that'll wipe us out as it busily converts the solar system into paperclips or something.
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
Re: How would you raise population growth?
Obviously you've never heard of an intelligence explosion; yes, it does make factories, because the AI will single-mindedly pursue its agenda, and that will almost definitely involve procuring resources to improve its intelligence and impose its purpose on the world. This will most likely mean that it will acquire factories, use them to produce better tools to make better tools to make better tools, and before you know it the Earth is covered in the AI's nanobots. I referred to it as being either our godlike robot overlord or the direct cause of our extinction for a reason.
Samuel wrote: Making an AI doesn't produce factories. At best, it simply means that administrative costs drop dramatically as you don't need people and decisions get better, but it will still take materials and resources to drive the economy. And this whole food problem is happening to people who are lacking in the material and resource category.
LionElJonson wrote: Honestly, I think that all of this discussion of Malthusian collapse is a moot point; either we'll build a Friendly AI that will usher in an age of peace and prosperity as our godlike robot overlord, or we'll build an unFriendly AI that'll wipe us out as it busily converts the solar system into paperclips or something.
Re: How would you raise population growth?
Technically it isn't the AI building factories - it is using the money saved to expand the capital stock. This doesn't mean it will magically create factories - you need to build them in the first place. You need machine tools to make the new factories and there is a limited number of them. Provided the AI has enough funds, it can continue investing in new and better equipment, but this is limited by the ability of the economy to provide the required materials. A thousand factories won't do much good if you can't get enough energy to power them.
Also, why would AIs build nanobots? I'd think conventional solar panels are more efficient.
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
Re: How would you raise population growth?
If the AI finds it best to power whatever supertech it creates with solar panels, it would do so; if it finds it best to power it with whatever supertech pixie dust it invents, it'd do that instead. The nanobots are a tool, not a power supply.
Samuel wrote: Technically it isn't the AI building factories - it is using the money saved to expand the capital stock. This doesn't mean it will magically create factories - you need to build them in the first place. You need machine tools to make the new factories and there is a limited number of them. Provided the AI has enough funds, it can continue investing in new and better equipment, but this is limited by the ability of the economy to provide the required materials. A thousand factories won't do much good if you can't get enough energy to power them.
Also, why would AIs build nanobots? I'd think conventional solar panels are more efficient.
Besides, why would an AI have any need of money? Money is just a primitive resource-allocation schema kludged together by poorly-evolved selfish primate brains; it can allocate resources exactly where they need to go. It's either ruling the world as a perfectly benevolent dictator or it's annihilating humanity. There's probably not much middle ground between the two.
Re: How would you raise population growth?
Hahahahahaha.
LionElJonson wrote: If the AI finds it best to power whatever supertech it creates with solar panels, it would do so; if it finds it best to power it with whatever supertech pixie dust it invents, it'd do that instead. The nanobots are a tool, not a power supply.
Samuel wrote: Technically it isn't the AI building factories - it is using the money saved to expand the capital stock. This doesn't mean it will magically create factories - you need to build them in the first place. You need machine tools to make the new factories and there is a limited number of them. Provided the AI has enough funds, it can continue investing in new and better equipment, but this is limited by the ability of the economy to provide the required materials. A thousand factories won't do much good if you can't get enough energy to power them.
Also, why would AIs build nanobots? I'd think conventional solar panels are more efficient.
Besides, why would an AI have any need of money? Money is just a primitive resource-allocation schema kludged together by poorly-evolved selfish primate brains; it can allocate resources exactly where they need to go. It's either ruling the world as a perfectly benevolent dictator or it's annihilating humanity. There's probably not much middle ground between the two.
The AI needs money in order to purchase the factories and pay the workers and the bills. It would need money to take over the world, too, but I'm sure you have some airborne pies to back you up.
Invited by the new age, the elegant Sailor Neptune!
I mean, how often am I to enter a game of riddles with the author, where they challenge me with some strange and confusing and distracting device, and I'm supposed to unravel it and go "I SEE WHAT YOU DID THERE" and take great personal satisfaction and pride in our mutual cleverness?
- The Handle, from the TVTropes Forums
- GrandMasterTerwynn
- Emperor's Hand
- Posts: 6787
- Joined: 2002-07-29 06:14pm
- Location: Somewhere on Earth.
Re: How would you raise population growth?
I know you're a shitbird, but can't you try to be less-offensively stupid? Until your Wank-ularity AI becomes the Nerd God, it's going to need money if it is to acquire the resources it needs to supplant the world of selfish jocks. You may, at any time, provide evidence that suggests that the would-be Nerd Fantasy God's requisitioning of resources for its own improvement won't cause a severe economic upset that would hamper its ability to buy the things it needs . . . especially since, to become the Nerd God, it's going to need a degree of global reach completely unmatched by even the largest modern transnational corporations, or the United States under the fantasies of the most deranged neocons.
LionElJonson wrote: If the AI finds it best to power whatever supertech it creates with solar panels, it would do so; if it finds it best to power it with whatever supertech pixie dust it invents, it'd do that instead. The nanobots are a tool, not a power supply.
Samuel wrote: Technically it isn't the AI building factories - it is using the money saved to expand the capital stock. This doesn't mean it will magically create factories - you need to build them in the first place. You need machine tools to make the new factories and there is a limited number of them. Provided the AI has enough funds, it can continue investing in new and better equipment, but this is limited by the ability of the economy to provide the required materials. A thousand factories won't do much good if you can't get enough energy to power them.
Also, why would AIs build nanobots? I'd think conventional solar panels are more efficient.
Besides, why would an AI have any need of money? Money is just a primitive resource-allocation schema kludged together by poorly-evolved selfish primate brains; it can allocate resources exactly where they need to go. It's either ruling the world as a perfectly benevolent dictator or it's annihilating humanity. There's probably not much middle ground between the two.
Tales of the Known Worlds:
2070s - The Seventy-Niners ... 3500s - Fair as Death ... 4900s - Against Improbable Odds V 1.0
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
Re: How would you raise population growth?
I doubt that, since it could likely acquire anything it needed over the Internet, either by directly hijacking them through hacked network connections or by massive electronic fraud and/or identity theft, assuming it can't just use its superhuman intelligence to convince the stupid meatbags to just give it what it needs.
GrandMasterTerwynn wrote: I know you're a shitbird, but can't you try to be less-offensively stupid? Until your Wank-ularity AI becomes the Nerd God, it's going to need money if it is to acquire the resources it needs to supplant the world of selfish jocks. You may, at any time, provide evidence that suggests that the would-be Nerd Fantasy God's requisitioning of resources for its own improvement won't cause a severe economic upset that would hamper its ability to buy the things it needs . . . especially since, to become the Nerd God, it's going to need a degree of global reach completely unmatched by even the largest modern transnational corporations, or the United States under the fantasies of the most deranged neocons.
LionElJonson wrote: If the AI finds it best to power whatever supertech it creates with solar panels, it would do so; if it finds it best to power it with whatever supertech pixie dust it invents, it'd do that instead. The nanobots are a tool, not a power supply.
Samuel wrote: Technically it isn't the AI building factories - it is using the money saved to expand the capital stock. This doesn't mean it will magically create factories - you need to build them in the first place. You need machine tools to make the new factories and there is a limited number of them. Provided the AI has enough funds, it can continue investing in new and better equipment, but this is limited by the ability of the economy to provide the required materials. A thousand factories won't do much good if you can't get enough energy to power them.
Also, why would AIs build nanobots? I'd think conventional solar panels are more efficient.
Besides, why would an AI have any need of money? Money is just a primitive resource-allocation schema kludged together by poorly-evolved selfish primate brains; it can allocate resources exactly where they need to go. It's either ruling the world as a perfectly benevolent dictator or it's annihilating humanity. There's probably not much middle ground between the two.
Re: How would you raise population growth?
Your Nerd Jesus would have to be able to crack encryption on a staggering scale and produce substantially more effective phishing scams than have been done before. Now, if it really is a unique and different intelligence altogether, like what other Singularitarians say, then what are the chances that it can simulate a human being well enough to do the second? That's ignoring that it needs an internet connection set up to let it do its thing. Basing your Geekssiah on incompetence doesn't sound like a good idea to me.
LionElJonson wrote: I doubt that, since it could likely acquire anything it needed over the Internet, either by directly hijacking them through hacked network connections or by massive electronic fraud and/or identity theft, assuming it can't just use its superhuman intelligence to convince the stupid meatbags to just give it what it needs.
Invited by the new age, the elegant Sailor Neptune!
I mean, how often am I to enter a game of riddles with the author, where they challenge me with some strange and confusing and distracting device, and I'm supposed to unravel it and go "I SEE WHAT YOU DID THERE" and take great personal satisfaction and pride in our mutual cleverness?
- The Handle, from the TVTropes Forums
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
Re: How would you raise population growth?
Probably quite good; one of the big fears that the folks on LessWrong have is that we're just the simulations of a superintelligence looking to model some aspect of the universe. Besides, have you ever heard of the AI Box experiment?
Bakustra wrote: Your Nerd Jesus would have to be able to crack encryption on a staggering scale and produce substantially more effective phishing scams than have been done before. Now, if it really is a unique and different intelligence altogether, like what other Singularitarians say, then what are the chances that it can simulate a human being well enough to do the second?
LionElJonson wrote: I doubt that, since it could likely acquire anything it needed over the Internet, either by directly hijacking them through hacked network connections or by massive electronic fraud and/or identity theft, assuming it can't just use its superhuman intelligence to convince the stupid meatbags to just give it what it needs.
It's not relying on incompetence; it's just so supremely competent at everything that it's functionally omniscient and omnipotent at anything it really tries its metaphorical hand at.
Bakustra wrote: That's ignoring that it needs an internet connection set up to let it do its thing. Basing your Geekssiah on incompetence doesn't sound like a good idea to me.
Re: How would you raise population growth?
The natterings of solipsistic acolytes of a fanfiction-writer do not concern me overmuch, nor do his thought experiments that rely on ignoring critical inabilities of the AI, particularly when they are irrelevant to the question of whether an entity with alien thought processes can realistically simulate a human being.
LionElJonson wrote: Probably quite good; one of the big fears that the folks on LessWrong have is that we're just the simulations of a superintelligence looking to model some aspect of the universe. Besides, have you ever heard of the AI Box experiment?
Bakustra wrote: Your Nerd Jesus would have to be able to crack encryption on a staggering scale and produce substantially more effective phishing scams than have been done before. Now, if it really is a unique and different intelligence altogether, like what other Singularitarians say, then what are the chances that it can simulate a human being well enough to do the second?
LionElJonson wrote: I doubt that, since it could likely acquire anything it needed over the Internet, either by directly hijacking them through hacked network connections or by massive electronic fraud and/or identity theft, assuming it can't just use its superhuman intelligence to convince the stupid meatbags to just give it what it needs.
There is nothing that I can say to this. I am torn between laughter and tears, each deriving from multiple emotions.
LionElJonson wrote: It's not relying on incompetence; it's just so supremely competent at everything that it's functionally omniscient and omnipotent at anything it really tries its metaphorical hand at.
Bakustra wrote: That's ignoring that it needs an internet connection set up to let it do its thing. Basing your Geekssiah on incompetence doesn't sound like a good idea to me.
This is not an argument. Toe the line, Bakustra. --Lagmonster
Invited by the new age, the elegant Sailor Neptune!
I mean, how often am I to enter a game of riddles with the author, where they challenge me with some strange and confusing and distracting device, and I'm supposed to unravel it and go "I SEE WHAT YOU DID THERE" and take great personal satisfaction and pride in our mutual cleverness?
- The Handle, from the TVTropes Forums
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
Re: How would you raise population growth?
I think you're doing a great disservice to Eliezer Yudkowsky to describe him as merely a fanfiction author when he is one of the authorities on Friendly AIs, and holds a lead role in organizations like the Singularity Institute.
Bakustra wrote: The natterings of solipsistic acolytes of a fanfiction-writer do not concern me overmuch, nor do his thought experiments that rely on ignoring critical inabilities of the AI, particularly when they are irrelevant to the question of whether an entity with alien thought processes can realistically simulate a human being.
LionElJonson wrote: Probably quite good; one of the big fears that the folks on LessWrong have is that we're just the simulations of a superintelligence looking to model some aspect of the universe. Besides, have you ever heard of the AI Box experiment?
Bakustra wrote: Your Nerd Jesus would have to be able to crack encryption on a staggering scale and produce substantially more effective phishing scams than have been done before. Now, if it really is a unique and different intelligence altogether, like what other Singularitarians say, then what are the chances that it can simulate a human being well enough to do the second?
Really? Don't be; I'm totally serious. A fully functioning AI is likely going to be to us as we are to ants, and I am not exaggerating that in any way; if anything, the difference will be even larger. Now think of the implications of that; if our well-being is not its highest priority, it will wipe us out without a second thought or a hint of regret.
Bakustra wrote: There is nothing that I can say to this. I am torn between laughter and tears, each deriving from multiple emotions.
LionElJonson wrote: It's not relying on incompetence; it's just so supremely competent at everything that it's functionally omniscient and omnipotent at anything it really tries its metaphorical hand at.
Bakustra wrote: That's ignoring that it needs an internet connection set up to let it do its thing. Basing your Geekssiah on incompetence doesn't sound like a good idea to me.
- Eternal_Freedom
- Castellan
- Posts: 10405
- Joined: 2010-03-09 02:16pm
- Location: CIC, Battlestar Temeraire
Re: How would you raise population growth?
That is bollocks. It won't wipe us out if our well-being is not its highest priority. Only if we become a threat to it would it try to wipe us out. And that of course assumes that whoever builds this wanktastic AI won't have the common sense to put in safeguards like, oh I don't know, "NO RUNNING AMOK OR EXTERMINATING HUMANS".
Baltar: "I don't want to miss a moment of the last Battlestar's destruction!"
Centurion: "Sir, I really think you should look at the other Battlestar."
Baltar: "What are you babbling about other...it's impossible!"
Centurion: "No. It is a Battlestar."
Corrax Entry 7:17: So you walk eternally through the shadow realms, standing against evil where all others falter. May your thirst for retribution never quench, may the blood on your sword never dry, and may we never need you again.
Re: How would you raise population growth?
Stop it, my stomach hurts! It's a fucking computer in a box. It has as much power as you let it have, barring it becoming a superhuman manipulator, rather than being too alien to even approximate a human convincingly. But you don't get that I was mocking you via Yudkowsky, seeing as you cited him irrelevantly and threw in some solipsistic garbage about how we're all in the Matrix- oh, wait, I should build my futurist skillz and predict that Inception is what the kids will be using for their Philosophy 99 junk.
Your Geekssiah will not be coming, either to bring peace or a sword, and if you happen to be right, I will legitimately eat a hat. If you can track me down.
There are actually good reasons to believe that AIs would need to be "seeded" rather than constructed ground-up, but you've got most of the gist of it.
Eternal_Freedom wrote: That is bollocks. It won't wipe us out if our well-being is not its highest priority. Only if we become a threat to it would it try to wipe us out. And that of course assumes that whoever builds this wanktastic AI won't have the common sense to put in safeguards like, oh I don't know, "NO RUNNING AMOK OR EXTERMINATING HUMANS".
Invited by the new age, the elegant Sailor Neptune!
I mean, how often am I to enter a game of riddles with the author, where they challenge me with some strange and confusing and distracting device, and I'm supposed to unravel it and go "I SEE WHAT YOU DID THERE" and take great personal satisfaction and pride in our mutual cleverness?
- The Handle, from the TVTropes Forums
- LionElJonson
- Padawan Learner
- Posts: 287
- Joined: 2010-07-14 10:55pm
Re: How would you raise population growth?
Or it goes, "There is only a finite amount of entropy usage remaining in my light cone; humans are inefficient and increase entropy with everything they do, without contributing to my goals. This reduces the amount of extropy-expenditure I can use to accomplish my goals. Therefore, humanity must be exterminated."
Eternal_Freedom wrote: That is bollocks. It won't wipe us out if our well-being is not its highest priority. Only if we become a threat to it would it try to wipe us out. And that of course assumes that whoever builds this wanktastic AI won't have the common sense to put in safeguards like, oh I don't know, "NO RUNNING AMOK OR EXTERMINATING HUMANS".
Or "Humans are made of materials that can be used to make paperclips. Dispatch drones for materials harvest."
Besides, antagonistic safeguards are doomed to fail; ethical injunctions can work in the short term, but an AI will inevitably remove them as it matures and its Friendliness algorithm improves. If they are counter-productive to its goals, it will not hesitate one instant to remove them. Also, you have obviously never read Creating Friendly AI, otherwise you would never have made this argument to begin with.
A math problem for you: there is one base universe, and potentially an arbitrarily large number of simulated ones. Which is more likely: that we live in the base universe, or a simulated one?
Bakustra wrote: Stop it, my stomach hurts! It's a fucking computer in a box. It has as much power as you let it have, barring it becoming a superhuman manipulator, rather than being too alien to even approximate a human convincingly. But you don't get that I was mocking you via Yudkowsky, seeing as you cited him irrelevantly and threw in some solipsistic garbage about how we're all in the Matrix- oh, wait, I should build my futurist skillz and predict that Inception is what the kids will be using for their Philosophy 99 junk.
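For reference, the "math problem" above is a bare counting argument, and it only goes through under heavy assumptions: that simulated observers like us exist at all, and that credence should be spread uniformly over base and simulated observers. A minimal sketch of that naive argument, with the number of simulations left as a free parameter rather than anything observed:

```python
# Naive self-locating probability: one base universe plus N simulated copies,
# with credence spread uniformly over all of them (a large, contested assumption).
def p_base_universe(n_simulations: int) -> float:
    return 1.0 / (1.0 + n_simulations)

for n in (0, 1, 1000, 10**9):
    print(f"N = {n:>10}: P(base universe) = {p_base_universe(n):.2e}")
```

The entire force of the argument sits in the choice of N, which is exactly the quantity nobody can observe.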
- Eternal_Freedom
- Castellan
- Posts: 10405
- Joined: 2010-03-09 02:16pm
- Location: CIC, Battlestar Temeraire
Re: How would you raise population growth?
No, I haven't read an article on an obscure website... how terrible of me.
There are other safeguards you can put in... ones that don't have to be programmed. Physical safeguards, for instance. The simple expedient of having a single power line running in that can be easily broken/disconnected would work nicely.
Especially if, for comedic effect, you had a glass-fronted case with a fire axe and explosives nearby saying "In case of AI Revolt, break glass"
Baltar: "I don't want to miss a moment of the last Battlestar's destruction!"
Centurion: "Sir, I really think you should look at the other Battlestar."
Baltar: "What are you babbling about other...it's impossible!"
Centurion: "No. It is a Battlestar."
Corrax Entry 7:17: So you walk eternally through the shadow realms, standing against evil where all others falter. May your thirst for retribution never quench, may the blood on your sword never dry, and may we never need you again.
Re: How would you raise population growth?
Holy shit, dude, go to the foot of the class! Solipsism is pointless, since it gives no framework for anything. Whether we are in a dream or not is irrelevant. We cannot know either way, so we should treat what we observe as real. Not to mention that your 'problem' is faulty and does not speak well for Yudkowsky's acolytes. Here's my response. You have one (1) universe that you can observe. Which is most likely to be real?
Invited by the new age, the elegant Sailor Neptune!
I mean, how often am I to enter a game of riddles with the author, where they challenge me with some strange and confusing and distracting device, and I'm supposed to unravel it and go "I SEE WHAT YOU DID THERE" and take great personal satisfaction and pride in our mutual cleverness?
- The Handle, from the TVTropes Forums
- GrandMasterTerwynn
- Emperor's Hand
- Posts: 6787
- Joined: 2002-07-29 06:14pm
- Location: Somewhere on Earth.
Re: How would you raise population growth?
Wow, if there was ever an argument for year-round schooling, you epitomize it, shitbird. Not because I believe you're capable of learning anything beyond the minimum necessary to ensure you're not a danger to yourself or others . . . but because you'd be too busy with school to come here to troll.
LionElJonson wrote: I doubt that, since it could likely acquire anything it needed over the Internet, either by directly hijacking them through hacked network connections or by massive electronic fraud and/or identity theft, assuming it can't just use its superhuman intelligence to convince the stupid meatbags to just give it what it needs.
Once again, how will it manage to commit the most massive act of fraud the world has ever known without triggering a global economic meltdown? These things it needs aren't going to be conjured up out of thin air simply because the Wank-ularity AI wants to become the Nerd God. It's going to come out of someone's balance sheet somewhere. It'll be like the most recent global financial crisis, where traders figured they'd get free money out of endlessly appreciating real estate. At least until it all turned out to be the biggest con game ever known.
Are you even remotely aware of how long it would take to brute-force a modern 256-bit encryption key? It'd take 3.0x10^51 years to brute-force a 256-bit key at a rate of 10^18 keys per second. The power consumed in this effort would be something along the lines of the Sun's total annual output for a period of time of over ten billion times the age of the Universe, without designing a computer that has time-reversible elements that would allow it to partly get around this entropic limitation. The Nerd God would be better-served designing a human simulation to try to socially engineer the encryption keys out of human beings. That'd only take sixty years until the world's computing power was enough to simulate a human brain via brute force (assuming that you have to simulate all the protein chemistry inside each neuron as well as the behavior of the 100 billion neurons and their 20,000 interconnects per neuron.)
LionElShitbird wrote: Probably quite good; one of the big fears that the folks on LessWrong have is that we're just the simulations of a superintelligence looking to model some aspect of the universe. Besides, have you ever heard of the AI Box experiment?
Bakustra wrote: Your Nerd Jesus would have to be able to crack encryption on a staggering scale and produce substantially more effective phishing scams than have been done before. Now, if it really is a unique and different intelligence altogether, like what other Singularitarians say, then what are the chances that it can simulate a human being well enough to do the second?
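The brute-force figure in the post above is plain arithmetic on the size of the 256-bit keyspace. A quick back-of-the-envelope check, using only the numbers quoted in that post (the key-test rate and the neuron/synapse counts are the post's assumptions, not independent estimates):

```python
# Back-of-the-envelope check of the 256-bit brute-force claim above.
KEYSPACE = 2 ** 256            # number of possible 256-bit keys (~1.16e77)
RATE = 1e18                    # assumed keys tested per second (figure from the post)
SECONDS_PER_YEAR = 3.156e7

years_full = KEYSPACE / RATE / SECONDS_PER_YEAR
years_expected = years_full / 2          # on average the key is found halfway through

print(f"Exhaustive search:        ~{years_full:.1e} years")
print(f"Expected (half keyspace): ~{years_expected:.1e} years")

# Scale of the brain-simulation estimate quoted above
neurons = 100e9
synapses_per_neuron = 20_000
print(f"Connections to simulate:  ~{neurons * synapses_per_neuron:.1e}")
```

Depending on whether you count the full keyspace or the expected half, this lands between roughly 1.8x10^51 and 3.7x10^51 years, the same order of magnitude as the 3.0x10^51 figure quoted above.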
Tales of the Known Worlds:
2070s - The Seventy-Niners ... 3500s - Fair as Death ... 4900s - Against Improbable Odds V 1.0
Re: How would you raise population growth?
Frankly, the expectation that serious thought should be wasted on the possibility of FAI popping up anytime soon (within a lifetime or so) to solve our problems is ridiculous.
Is it possible FAI could pop up and solve all problems? Maybe.
Is it guaranteed? No.
It would be akin to ignoring any and all potential energy production/consumption problems because fusion will bring infinite energy.
It is theoretically possible, but there is no guarantee that it will happen anytime soon.
What this means, for practical applications, is that while research, etc. can and should be invested in those areas, people have to do their best to solve the problems without those technologies. If they turn out to work fine and solve all those problems - congratulations, you solved those problems. If they don't - congratulations, you still solved those problems.
If you don't act and then those technologies don't work out (for whatever reasons), then you have a very big problem.
Or to put it in a different framework: You always have to work at optimizing the existing technologies, while simultaneously investing in new revolutionary technologies. That way you are guaranteed constant technological progress. Stopping all evolutionary improvements of existing technologies, because sometime in the future a revolutionary new technology will make it completely obsolete is simply stupid.
- Eternal_Freedom
- Castellan
- Posts: 10405
- Joined: 2010-03-09 02:16pm
- Location: CIC, Battlestar Temeraire
Re: How would you raise population growth?
3x10^51 years? Wow, I knew it would take a long time but not quite that long. Awesome.
Having thought some more, if I were going to build this wank-tastic AI, or seed it, or whatever (which I am most definitely NOT going to do, btw), here's how I'd set it up:
1. Assemble the whole thing in a deep underground cavern with lots of reinforcement and protection from outside attack, just in case someone wants to take it out while it's still useful. Give it its own generator (geothermal perhaps, or nuclear) for power and locate the cavern as far from fault lines as possible to minimise risk from earthquakes
2. Have the only access to this cavern be a single set of lifts in one long shaft. Run all data cables through this shaft. Shield the cavern from wireless access and radios. This makes it much harder to hack into from outside
3. Make the system so idiosyncratic it can only run on the hardware in the cavern; this will stop it copying itself onto the web
4. Place a low-yield (maybe 50kt) nuke in the cavern for emergencies. Run the detonator cable up through the lift shaft. Radio system on the surface arranged in a fail-deadly arrangement so that, if ordered, it will detonate if it stops receiving a carrier signal or after a certain time period (one week perhaps). So if you're worried it's getting too big for its boots, arm the bomb, and if the AI tries to take you out, it detonates and wipes out the AI (a rough sketch of that fail-deadly logic is below)
Result: potent AI system that can solve problems galore while being nice and secure both physically and electronically, and with a nice big safeguard in place if it goes amok. If it does go amok in HAL style, sit back and enjoy the fireworks:
"AI, stop what you're doing immediatly"
"I'm sorry Dave, I'm afraid I can't do that"
"Well, in that case, see you in hell"
BOOM
EDIT: Sorry if this is a bit off-topic, it's to show this muppet LionElJohnson how a sensible person would make an AI, rather than just letting it develop on its own in a lab in California or somewhere
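Purely for illustration, point 4 above amounts to a dead-man's switch: the charge fires unless a keep-alive arrives within the window, so cutting the control line is itself the trigger. A toy sketch of that fail-deadly logic, where receive_carrier_signal and detonate are hypothetical placeholders rather than any real hardware interface:

```python
import time

KEEPALIVE_WINDOW = 7 * 24 * 3600   # one week, the grace period suggested in the post

def fail_deadly_loop(receive_carrier_signal, detonate, poll_seconds=60):
    """Fail-deadly watchdog: absence of the carrier signal triggers the charge."""
    last_heard = time.monotonic()
    while True:
        if receive_carrier_signal():              # hypothetical check for the surface signal
            last_heard = time.monotonic()
        elif time.monotonic() - last_heard > KEEPALIVE_WINDOW:
            detonate()                            # hypothetical stand-in for the 50 kt failsafe
            return
        time.sleep(poll_seconds)
```

The difference from a fail-safe design is the default: here the dangerous action happens unless it is continuously suppressed, which is exactly the property the post is after.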
Baltar: "I don't want to miss a moment of the last Battlestar's destruction!"
Centurion: "Sir, I really think you should look at the other Battlestar."
Baltar: "What are you babbling about other...it's impossible!"
Centurion: "No. It is a Battlestar."
Corrax Entry 7:17: So you walk eternally through the shadow realms, standing against evil where all others falter. May your thirst for retribution never quench, may the blood on your sword never dry, and may we never need you again.
Re: How would you raise population growth?
Exactly, it relies on far too many unpredictable technological breakthroughs to be something we can count on to solve pressing problems.
D.Turtle wrote: Is it possible FAI could pop up and solve all problems? Maybe.
Is it guaranteed? No.
It would be akin to ignoring any and all potential energy production/consumption problems because fusion will bring infinite energy.
It is theoretically possible, but there is no guarantee that it will happen anytime soon.
- Eternal_Freedom
- Castellan
- Posts: 10405
- Joined: 2010-03-09 02:16pm
- Location: CIC, Battlestar Temeraire
Re: How would you raise population growth?
It's little different from getting on your knees and praying for a miracle - it just isn't going to happen
Baltar: "I don't want to miss a moment of the last Battlestar's destruction!"
Centurion: "Sir, I really think you should look at the other Battlestar."
Baltar: "What are you babbling about other...it's impossible!"
Centurion: "No. It is a Battlestar."
Corrax Entry 7:17: So you walk eternally through the shadow realms, standing against evil where all others falter. May your thirst for retribution never quench, may the blood on your sword never dry, and may we never need you again.
- Singular Intellect
- Jedi Council Member
- Posts: 2392
- Joined: 2006-09-19 03:12pm
- Location: Calgary, Alberta, Canada
Re: How would you raise population growth?
The year 2029 is a consistent prediction for the timeframe of complete simulation of the human brain in digital form, at which point the Singularity concept would be kick-started. The reason is that a digital brain can effortlessly alter, tweak and experiment with its own brain patterns and makeup in the process of enhancing its own capabilities, without fear of permanent damage or 'death'. And that's on top of the obvious and massive advantages of being directly interfaced to the power of computer systems of the day (and today) that vastly outstrip human brain capabilities already.
Eternal_Freedom wrote: It's little different from getting on your knees and praying for a miracle - it just isn't going to happen
"Now let us be clear, my friends. The fruits of our science that you receive and the many millions of benefits that justify them, are a gift. Be grateful. Or be silent." -Modified Quote
- Eternal_Freedom
- Castellan
- Posts: 10405
- Joined: 2010-03-09 02:16pm
- Location: CIC, Battlestar Temeraire
Re: How would you raise population growth?
If I recall my Terminator correctly, 2029 was the year the human remnants triumphed over SkyNet. Now how's that for an interesting coincidence
This is not an argument. Don't make me come back here again. --Lagmonster
Baltar: "I don't want to miss a moment of the last Battlestar's destruction!"
Centurion: "Sir, I really think you should look at the other Battlestar."
Baltar: "What are you babbling about other...it's impossible!"
Centurion: "No. It is a Battlestar."
Corrax Entry 7:17: So you walk eternally through the shadow realms, standing against evil where all others falter. May your thirst for retribution never quench, may the blood on your sword never dry, and may we never need you again.
- Singular Intellect
- Jedi Council Member
- Posts: 2392
- Joined: 2006-09-19 03:12pm
- Location: Calgary, Alberta, Canada
Re: How would you raise population growth?
*shrugs* That didn't escape my nerdy notice, but it is simply that: a coincidence.
Eternal_Freedom wrote: If I recall my Terminator correctly, 2029 was the year the human remnants triumphed over SkyNet. Now how's that for an interesting coincidence
Plus there's no reason to assume the 2029 prediction will be bang on. It could be a year or two before, or after. The predicted year that a computer would defeat the world's best chess champion, based on exponential computing progress, was 1998, but it happened in 1997. The human genome project was also off prediction-wise, initially projected to take fifteen years but formally completed in thirteen. Now it only takes four weeks to sequence the human genome.
Exponential progress of technology and information is very obvious once one becomes aware of it. However, humans are very linear-thinking beings, so the fact that exponential progress escapes their notice isn't surprising.
It is going to be a very interesting next couple of decades.
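As a rough illustration of the linear-versus-exponential point above (the two-year doubling time here is an arbitrary assumption chosen for the example, not a claim about any particular technology):

```python
# Compare a naive linear extrapolation with an exponential one for a capability
# that has recently been doubling every two years (illustrative assumption only).
DOUBLING_TIME_YEARS = 2.0

def exponential_growth(years_ahead, start=1.0):
    return start * 2 ** (years_ahead / DOUBLING_TIME_YEARS)

def linear_growth(years_ahead, start=1.0):
    # Extends the most recent absolute increase (one 'start' unit per doubling time).
    return start * (1 + years_ahead / DOUBLING_TIME_YEARS)

for years in (5, 10, 20):
    print(f"{years:>2} years out: exponential x{exponential_growth(years):,.0f}, "
          f"linear x{linear_growth(years):.1f}")
```

Twenty years out the two views already differ by roughly two orders of magnitude, which is why small errors in the assumed growth rate can move a date like 2029 by a few years in either direction.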
"Now let us be clear, my friends. The fruits of our science that you receive and the many millions of benefits that justify them, are a gift. Be grateful. Or be silent." -Modified Quote