Will we go too far?
Moderator: Alyrium Denryle
- Singular Intellect
- Jedi Council Member
- Posts: 2392
- Joined: 2006-09-19 03:12pm
- Location: Calgary, Alberta, Canada
Will we go too far?
So I was watching the 2002 remake of "The Time Machine", and a particularly moving scene always sticks out to me.
Our protagonist Alexander has recently arrived in the future, some eight hundred thousand years forward. In the scene that brought my subject to mind, he gazes up at the moon, now reduced to a chunk roughly a third of its original size, surrounded by debris, after the accident in 2037 that wiped out most of humanity. He then says:
"You were right, Filby. We did go too far."
His comment referred to a discussion he'd had with his (now) long-dead friend, who had made the offhand comment of wondering whether humanity would go too far while looking at Alexander's concept art of a futuristic city. Alexander had replied, "No such thing."
Alexander was obviously reflecting on the fact that humanity's technological power had exceeded its ability to be careful and responsible with it.
This scene moved me, but also produced a small amount of irritation. It seemed to be suggesting there should be a 'limit' to how far humanity advances, yet all I could really appreciate was that a catastrophic situation had resulted in the loss of all of humanity's progress and knowledge, so much of it acquired through so much sacrifice and effort.
So what's your take on this idea? Is there some 'upper limit' of technology/behavior/society you think humanity should not try to push past because the risks are potentially too great? If so, what point do you think that is, and if not, where do you think we could one day go?
"Now let us be clear, my friends. The fruits of our science that you receive and the many millions of benefits that justify them, are a gift. Be grateful. Or be silent." -Modified Quote
- Count Chocula
- Jedi Council Member
- Posts: 1821
- Joined: 2008-08-19 01:34pm
- Location: You've asked me for my sacrifice, and I am winter born
Re: Will we go too far?
Simply put, no. The human population is so large, and repositories of knowledge are so widespread (especially in the Western world), that I find it difficult to imagine any man-made disaster that could set humanity as a whole back more than 50 years or so. I sincerely doubt at this point that even a full global NBC war would wipe out more than, say, 50% of Earth's human population. Could there be accidents with new technology, or acts of war or terror strikes with some new, more lethal technology, that cause a tremendous loss of life? Again, sure, but not enough to take humanity back to the Stone Age.
The only people who were safe were the legion; after one of their AT-ATs got painted dayglo pink with scarlet go faster stripes, they identified the perpetrators and exacted revenge. - Eleventh Century Remnant
Lord Monckton is my heeerrooo
"Yeah, well, fuck them. I never said I liked the Moros." - Shroom Man 777
- Tahlan
- Youngling
- Posts: 129
- Joined: 2007-03-14 05:21pm
- Location: Somewhere between sanity and madness...
Re: Will we go too far?
Singular Intellect wrote:
"You were right, Filby. We did go too far."
So what's your take on this idea? Is there some 'upper limit' of technology/behavior/society you think humanity should not try to push past because the risks are potentially too great? If so, what point do you think that is, and if not, where do you think we could one day go?

Yes, I think there is some upper limit of technological advancement, a threshold beyond which mankind should not proceed. The problem is that it is inherent in man's makeup that he will, at some point, step across the line in the sand. In my opinion, atomic and then nuclear weapons were both precursor and example of technology we were not yet ready to wield; I say that because of the proliferation of nuclear weapons during the Cold War, and our potential capability to annihilate ourselves. It's just dumb luck that we haven't yet destroyed ourselves and our planet. And no doubt we'll have more opportunities in the future.
"And this is the house I pass through on my way to power and light."
~James Dickey, Power and Light
- Akkleptos
- Jedi Knight
- Posts: 643
- Joined: 2008-12-17 02:14am
- Location: Between grenades and H1N1.
- Contact:
Re: Will we go too far?
The question itself is silly, because it resorts to the age-old "mad scientist" myth: "There are some things Man is not supposed to understand". Hell, mankind, as the nosy, inquisitive, endeavouring primates we are, wasn't supposed to even master fire! How about the A-bomb? The H-bomb? Genetic engineering?
It's a sociological question that has more to do with ethics than science. In other words, as has been repeated over and over in many other threads on similar issues, it's pretty much like the "guns kill people" argument. No, people kill people. Guns are just a tool they happen to have quite handy in some countries to get the job done, but they might just as well use a shovel, or a plastic bag, to accomplish the same final effect. Same with science. It's a tool. Whatever we decide to do with it will be a matter of human ingenuity and good judgement, or of human stupidity and refusal to consider potential consequences.
So, no, science cannot be inherently evil, nor are there specific fields humans should not even look into.
Besides, who's going to define what is to be researched and what is not? Sometimes a scientific breakthrough is catalysed by an advancement in a distant but related field of knowledge.
In the end, it would probably be more a matter of legal regulations about what can and cannot be done with potentially dangerous technologies, than of what knowledge is gained through research.
Life in Commodore 64:
10 OPEN "EYES",1,1
20 GET UP$:IF UP$="" THEN 20
30 GOTO BATHROOM
...
Don't like what I'm saying?
Take it up with my representative:
Re: Will we go too far?
"We went to far"
"There are some things we are not meant to know"
"But, it's evil!"
You can some all those things up very simply; "You are a intellectual and technological chicken shit that lacks imagination"
"There are some things we are not meant to know"
"But, it's evil!"
You can some all those things up very simply; "You are a intellectual and technological chicken shit that lacks imagination"
I've been asked why I still follow a few of the people I know on Facebook with 'interesting political habits and view points'.
It's so when they comment on or approve of something, I know what pages to block/what not to vote for.
Re: Will we go too far?
Having knowledge and applying it responsibly are two very different things. Nobody's going to deny the benefit internal combustion has brought to the development of the human race, but only an idiot's going to deny that overuse and short-term thinking have let that same technology cause massive damage.
So yeah, Alexander had a point. The lunar colonisation program was obviously not mature enough and it failed because of it, dragging Earth with it. Does that mean it should never have been tried? Of course not. But it should have been done better.
Re: Will we go too far?
Solauren wrote:
"We went too far"
"There are some things we are not meant to know"
"But, it's evil!"
You can sum all those things up very simply: "You are an intellectual and technological chicken shit who lacks imagination."

All those all-too-common phrases so often spouted in fiction translate the same in my mind: "Do not defy GOD".
It's very typical of the religious mindset to claim that we should remain naive and ignorant for our own safety.
unsigned
- Ryan Thunder
- Village Idiot
- Posts: 4139
- Joined: 2007-09-16 07:53pm
- Location: Canada
Re: Will we go too far?
LordOskuro wrote:
Solauren wrote:
"We went too far"
"There are some things we are not meant to know"
"But, it's evil!"
You can sum all those things up very simply: "You are an intellectual and technological chicken shit who lacks imagination."
All those all-too-common phrases so often spouted in fiction translate the same in my mind: "Do not defy GOD".

You have a very strange translation algorithm.
SDN Worlds 5: Sanctum
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
Re: Will we go too far?
Ryan Thunder wrote:
LordOskuro wrote:
All those all-too-common phrases so often spouted in fiction translate the same in my mind: "Do not defy GOD".
You have a very strange translation algorithm.

I hear a lot of sci-fi authors are highly religious?
Seriously though, this whole "will humanity go too far" bullshit is exactly that. As long as we don't wipe ourselves out in the process, how the fuck can we go "too far" when there is no ultimate goal behind evolution except for a species to survive? In my mind there has to be some kind of "end goal" for something to go "too far", and for humanity there just isn't one. It simply reeks of luddite scaremongering to me.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
- Patrick Degan
- Emperor's Hand
- Posts: 14847
- Joined: 2002-07-15 08:06am
- Location: Orleanian in exile
Re: Will we go too far?
The whole idea of man "going too far" is a very 19th-century sensibility, one tied to a religious worldview holding that there are certain areas of knowledge too dangerous for Man to try to pierce (a concept going all the way back to the myth of the Fall). Religious thinkers of the period genuinely feared the vision of science casting God out of the world, leaving Man totally alone against forces he could not understand or control, and thus resisted inquiry and advancement past the point they then knew. "Man was not meant to tamper in God's domain" would be the common refrain for this mindset.
When ballots have fairly and constitutionally decided, there can be no successful appeal back to bullets.
—Abraham Lincoln
People pray so that God won't crush them like bugs.
—Dr. Gregory House
Oil an emergency?! It's about time, Brigadier, that the leaders of this planet of yours realised that to remain dependent upon a mineral slime simply doesn't make sense.
—The Doctor "Terror Of The Zygons" (1975)
-
- Sith Acolyte
- Posts: 6464
- Joined: 2007-09-14 11:46pm
- Location: SoCal
Re: Will we go too far?
There's an alternate interpretation of 'going too far' which doesn't rest upon ideas of religion, or 'not being meant to know,' etc: advances in technology which outstrip the maturity level of the people with access to them.
Doctrinaire Christians and Jews and Muslims with keys to nuclear weapons come to mind. Is it 'going too far' to have nukes, in general? No. Is it 'going too far' to have nukes while simultaneously believing there's a deity who under certain circumstances wants you to use them? Probably.
I find myself endlessly fascinated by your career - Stark, in a fit of Nerd-Validation, November 3, 2011
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
Re: Will we go too far?
Kanastrous wrote:
There's an alternate interpretation of 'going too far' which doesn't rest upon ideas of religion, or 'not being meant to know,' etc: advances in technology which outstrip the maturity level of the people with access to them.
Doctrinaire Christians and Jews and Muslims with keys to nuclear weapons come to mind. Is it 'going too far' to have nukes, in general? No. Is it 'going too far' to have nukes while simultaneously believing there's a deity who under certain circumstances wants you to use them? Probably.

Except the OP is talking about humanity as a whole, not one particular subset of humanity. Big difference there.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
- speaker-to-trolls
- Jedi Master
- Posts: 1182
- Joined: 2003-11-18 05:46pm
- Location: All Hail Britannia!
Re: Will we go too far?
Urghh, The Time Machine 2002. Interesting that the only identifiable message in that film was a luddite one, when the book's author was a self-professed technocrat.
I don't see why there should be any kind of upper limit to technological progress. There are certainly points at which mankind wouldn't be able to deal with certain technologies and would have to grow into them, but as long as society can adapt along with technology, I see no reason why it shouldn't keep on progressing indefinitely. Let's face it, we have already reached a point where civilisation could be largely destroyed by our technology; if there's any point at which we should have stopped, it was probably around the 1940s.
Post Number 1066 achieved Sun Feb 22, 2009 3:19 pm(board time, 8:19GMT)
Batman: What do these guys want anyway?
Superman: Take over the world... Or rob banks, I'm not sure.
Re: Will we go too far?
Why the 1940s? Sure, nuclear weapons are really good at what they do, but conventional bombing and ordinary conquest managed to completely wreck Europe. If you want wars that don't destroy countries, you are going to have to go back to before the French Revolution.
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
Re: Will we go too far?
Samuel wrote:
Why the 1940s? Sure, nuclear weapons are really good at what they do, but conventional bombing and ordinary conquest managed to completely wreck Europe. If you want wars that don't destroy countries, you are going to have to go back to before the French Revolution.

Don't you mean back before the Roman Empire? Salting the land was a popular tactic for rendering a place unlivable so the enemy couldn't make use of its resources once you conquered it, I hear.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
Re: Will we go too far?
Ugh, I hate that movie. It defaced a classic of English literature and had the most horrendously stupid disaster premise ever. I get angry just thinking of the weapons-grade stupidity of a few nukes shifting the moon out of orbit.
Anyway, no, I don't really buy the idea. Technology can be good or bad depending on the intent and responsibility with which it is applied. Most powerful technologies have great potential for improving human existence as well as for making it worse. There may be specific hypothetical situations where my opinion would change, but generally I think it's stupid to turn our backs on things that can improve the lot of the human race just because they might be dangerous.
Let's take the example from the movie (and ignore the epic fucking stupidity of the idea that a few nukes could shift the moon's orbit). According to the statement, we either "went too far" in developing nuclear power or in developing space travel. Nuclear power is our best shot at keeping the lights on when oil runs out, and space travel is our best shot at preserving our civilization if something seriously bad happens to the Earth, as well as, I think, a vital outlet for our desire to explore. The problem happened because the technologies were used in a very stupid and irresponsible manner. If it weren't for that, they could both be very beneficial to the human species.
-
- Sith Acolyte
- Posts: 6464
- Joined: 2007-09-14 11:46pm
- Location: SoCal
Re: Will we go too far?
General Zod wrote:
Kanastrous wrote:
There's an alternate interpretation of 'going too far' which doesn't rest upon ideas of religion, or 'not being meant to know,' etc: advances in technology which outstrip the maturity level of the people with access to them.
Doctrinaire Christians and Jews and Muslims with keys to nuclear weapons come to mind. Is it 'going too far' to have nukes, in general? No. Is it 'going too far' to have nukes while simultaneously believing there's a deity who under certain circumstances wants you to use them? Probably.
Except the OP is talking about humanity as a whole, not one particular subset of humanity. Big difference there.

Yes, but I think the principle applies to a greater or lesser degree species-wide. The particular example is just one manifestation.
I find myself endlessly fascinated by your career - Stark, in a fit of Nerd-Validation, November 3, 2011
- General Zod
- Never Shuts Up
- Posts: 29211
- Joined: 2003-11-18 03:08pm
- Location: The Clearance Rack
- Contact:
Re: Will we go too far?
Kanastrous wrote:
Yes, but I think the principle applies to a greater or lesser degree species-wide. The particular example is just one manifestation.

I don't really see how. It's not like either of the groups you mentioned is known for scientific acumen, and the OP's premise doesn't really apply if they didn't create the technology themselves.
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
- Zixinus
- Emperor's Hand
- Posts: 6663
- Joined: 2007-06-19 12:48pm
- Location: In Seth the Blitzspear
- Contact:
Re: Will we go too far?
The thing is that no one really wants to see the end of the world, and even those who do aren't kept anywhere near anything powerful enough to do it with.
Thing is, the more powerful a technology is, the more oversight it will have. Nukes have enormous oversight, and even if somebody tries to replicate them, there will be shit stirring about it. Look at Iran: they're developing nuclear technology (I think) for both home use (as in NPPs) and foreign use, and everyone is leering over them because they have a few shit-fucking-crazy people over there.
The closest thing that could happen is that an elite group of people takes advantage of its technology to oppress everyone else in order to further the workings of a totalitarian state. Only China comes to mind, and such states tend to be unstable as it is.
Ambition does not necessarily mean self-destruction. Even if people are motivated by greed or power, they still tend to be rational when it comes to the threat of self-destruction.
Credo!
Chat with me on Skype if you want to talk about writing, ideas or if you want a test-reader! PM for address.
Re: Will we go too far?
Yes, there is. As technology advances, access to resources increases, which vastly jump-starts individual power. With this increase in individual power comes the numbers game of when someone will abuse that power. As technology advances, so too do population and population density, which means the amount of harm resulting from that abuse of power goes up as well. And increases in population also raise the probability that someone will do it.
We already see this happening. In the Middle Ages it would have been inconceivable that someone could kill an entire village in a few minutes; yet school shooters rack up similar numbers. If you had told one of the Roman emperors that 19 people could kill 3,000 in a few hours, you would have been laughed out of court. Yet that happened.
And this is just going to get faster. Wait about 5 years: if rapid prototyping fabbers follow the same pricing trends as other consumer electronics, they would be about $1,000 by then. That's going to be a game changer - like fabbing machine guns that fit in your pocket, or the more benign trick of fabbing remote-control helicopters to even the playing field that major players currently have with drones (can't find the articles about that now; Boing Boing and Wired covered it a few weeks back).
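The "numbers game" above can be put in rough quantitative terms with a toy model. Assuming (purely hypothetically) that each person independently has some tiny per-year chance p of abusing a catastrophic capability, the chance that at least one person does rises sharply as the population N with access grows, even when p stays fixed:

```python
# Toy model of the "numbers game": if each of N people independently has a
# tiny probability p of abusing a destructive capability in a given year,
# the chance that at least one of them does is 1 - (1 - p)^N.
def chance_of_abuse(p: float, n: int) -> float:
    return 1.0 - (1.0 - p) ** n

# Same per-person probability, growing population with access:
for n in (1_000_000, 100_000_000, 1_000_000_000):
    print(n, round(chance_of_abuse(1e-9, n), 4))
```

The specific value of p here is invented for illustration; the point is only the shape of the curve, which is why growing population and growing individual power compound each other in the argument above.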
EDIT: Never mind the pricing comment, apparently you can find instructions on how to build a personal fabber on the web free right here. Not sure what components cost though.
بيرني كان سيفوز
*
Nuclear Navy Warwolf
*
in omnibus requiem quaesivi, et nusquam inveni nisi in angulo cum libro
*
ipsa scientia potestas est
Re: Will we go too far?
As the overall technological level of civilization increases, so does the potential destructive power available to individuals. Ender has pointed out the dangers of deliberate misuse of this power, but I think a far greater risk lies in its accidental misuse.
My example is the automobile, currently the one item of technology that offers the greatest potential for destruction to the average Joe. Even here in the civilized West, the amount of destruction caused by autos is huge, and most of it is caused not by deranged loners on a mission of revenge but by average people put into situations that they turn out to be incapable of handling (through their own actions, or sometimes just by chance).
Such destructive potential can be moderated to some degree by laws and regulations. As bad as it is on the roads here, there are certainly places that are a lot worse. As our technological level increases, so do the complexity and sheer number of laws and restrictions designed to protect us from our technology. But this comes into conflict with our culture of freedom. Take a look again at the automobile example. We could indeed reduce the harmful effects of this technology by greatly increasing the regulations and restrictions governing its use, but as a whole, we don't want to. We have reached more or less the limits of what we are willing to put up with. Look at the fuss that arises just about every time a photo radar unit is installed, or when there is talk about lowering speed limits.
That all being said, for all the carnage caused by the automobile, civilization is certainly in no danger of collapsing because of it. And while death tolls will continue to escalate as the technological power available to the average person increases, they will be held in check somewhat by increasing regulation.
- Zixinus
- Emperor's Hand
- Posts: 6663
- Joined: 2007-06-19 12:48pm
- Location: In Seth the Blitzspear
- Contact:
Re: Will we go too far?
An average individual's access to dangerous tools has increased, true. But we are talking about humanity-ending technologies, not cases of technology biting humanity in the ass a bit.

Ender wrote:
That's going to be a game changer - like fabbing machine guns that fit in your pocket

Bhahaha! The idea of a miniature "machine gun" (technically a submachine gun, as I doubt that thing fires either rifle or machine-gun rounds) has been around for quite some time.
Designed by Dave Boatman, the idea was that businessmen should have submachine guns. No, don't ask why. Hell, the guy called it an FMG, the same name Eugene Stoner used.
Hell, the idea of a briefcase-looking gun was so popular that the Soviets copied it as the PP-90.
From what I hear, the things usually gather dust in warehouses, as no one ever really found a use for them.
Machine pistols are still available on some Glocks, and it's not like gangsters have any real trouble getting regular old submachine guns. Pocket pistols and the like are hardly rare either.
From the looks of it, what you have there is just a fancy stock: something rarely used on pistols, as the very idea behind a pistol is to be easily concealable, which a stock prevents.
A gun, unless you can fab proper barrels out of polymer or something, is still going to be made of metal, as are its other inner mechanics. Metal detectors will still find it.
Furthermore, can you fab bullets with working primers and powder?
I admit that I don't know much about fabbers, but I see nothing to indicate that such a weapon would change any game.
Credo!
Chat with me on Skype if you want to talk about writing, ideas or if you want a test-reader! PM for address.
Re: Will we go too far?
Korvan wrote: My example is the automobile, currently the one item of technology that offers the greatest potential for destruction to the average Joe. Even here in the civilized West, the amount of destruction caused by autos is huge, and most of it is caused not by deranged loners on a mission of revenge, but by average people put into situations that they turn out to be incapable of handling (through their own actions or sometimes just by chance).
Such destructive potential can be moderated to some degree by laws and regulations. As bad as it is on the roads here, there are certainly places that are a lot worse. As our technological level increases, so do the complexity and sheer number of laws and restrictions designed to protect us from our technology. But this comes into conflict with our culture of freedom. Take the automobile example again: we could indeed reduce the harmful effects of this technology by greatly increasing the regulations and restrictions governing its use, but as a whole, we don't want to. We have reached more or less the limit of what we are willing to put up with. Look at the fuss that arises just about every time a photo-radar unit is installed, or when there is talk of lowering speed limits.
That all being said, for all the carnage caused by the automobile, civilization is certainly in no danger of collapsing because of it. And while death tolls will continue to escalate as the technological power available to the average person increases, they will be held in check somewhat by increasing regulation.
Also, this is one example where advancing technology will probably eventually eliminate much of the risk. Sooner or later, we're bound to have automated control systems for automobiles that are much more reliable than human drivers.
Re: Will we go too far?
Zixinus wrote: An average individual's access to dangerous tools has increased, true. But we are talking humanity-ending technologies, not the cases of technology biting humanity in the ass a bit.
We are also talking long term, so this doesn't counter a damn thing. The point is that the trend lends itself toward that end eventually, even if we aren't there yet.
Zixinus wrote: Bhahaha! That's going to be a game changer - like fabbing machine guns that fit in your pocket,
*SNIP*
I admit that I don't know much about fabbers, but I see nothing to indicate that such a weapon would change any game.
You really enjoy proving to everyone that you are just a stupid piece of shit, don't you? You don't know the difference between Afghanistan and Iraq, you claim above that there are no groups out there with apocalypse fetishes despite them being in the news every week, and here you fixate on one example of what the game-changing technology can produce rather than on the machine itself. And yes, it is a game changer: pull your head out of your ass and realize that most business models are centered around control of production rather than design, realize that designs can fly over P2P, and then think about what that can do to the economy.
The point isn't that there are small guns; we've had those forever. The point isn't that you can now build said gun in your home; a skilled machinist could do that with a little effort and some good tools. The point is that we now have technology out there that completely removes skill from the equation, and with it scarcity. The point isn't that someone can get a weapon; the point is that individuals can now field gadgets that approximate or match what has previously taken a state to field, for only a trivial expenditure. And rather than needing multiple massive facilities to do it, one easily scaled source can serve as multiple production lines. That ramps up the power an individual or small group can wield.
Last edited by Ender on 2009-03-30 01:15am, edited 1 time in total.
Bernie would have won
*
Nuclear Navy Warwolf
*
Everywhere I have sought rest and found it nowhere, save in a corner with a book
*
Knowledge itself is power
Re: Will we go too far?
Junghalli wrote: Also, this is one example where advancing technology will probably eventually eliminate much of the risk. Sooner or later, we're bound to have automated control systems for automobiles that are much more reliable than human drivers.
But it is also an example of how technology can easily be misappropriated for other purposes. It's hard to regulate what it wasn't supposed to be able to do in the first place.
Bernie would have won
*
Nuclear Navy Warwolf
*
Everywhere I have sought rest and found it nowhere, save in a corner with a book
*
Knowledge itself is power