Irbis->
What's funny is that I'm just pointing out how improbable the whole thing is even if we assume that somebody actually programmed a God-AI into existence in the first place; which, as I've repeatedly pointed out, is not an easy (or imminent) thing to do.
Nor is it even necessary to network such a machine, because only idiots think that the World Wide Web, with its enormous mass of contradictory information, is an ideal tool for "teaching" an AI. Heck, an AI that builds its knowledge base from the Internet would probably conclude that its purpose in life was to create porn of anything and everything.
So your objections demonstrate not only your ignorance, but also a blatant avoidance of the core issue.
And then a sleeper cell you missed reinfects the net with version 2.0?
Again, viruses and other similar "infections" are by necessity small programs, to avoid detection. You can't have a God-AI program the size of a virus, because it simply can't have that kind of functionality in only a few kilobytes of code.
Slowdowns. Ahahahaaha
Actually, if you again look at the potential file size of an AI program, it will in fact consume the majority of your bandwidth. We're not talking about a 100 KB file here; we're probably talking about a 100 GB file at minimum. Even if you have a fibre internet connection, you're generally on a metered or capped plan, so a copy that size eats a huge chunk of your allowance. And there is no way to prevent the inevitable hard-disk slowdown as it tries to copy such a huge file onto your PC.
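Just to put rough numbers on the bandwidth point, here's a quick back-of-the-envelope sketch in Python; the 100 GB figure and the link speeds are purely illustrative assumptions, and it ignores protocol overhead entirely:

# Rough transfer-time estimate for copying a hypothetical 100 GB "AI" image
# over typical consumer links. All figures are illustrative assumptions.
FILE_SIZE_GB = 100  # assumed size of the program being copied

links_mbps = {          # nominal downstream speeds in megabits per second
    "10 Mbps DSL": 10,
    "100 Mbps cable": 100,
    "1 Gbps fibre": 1000,
}

for name, mbps in links_mbps.items():
    size_megabits = FILE_SIZE_GB * 1000 * 8   # GB -> megabits (decimal units)
    seconds = size_megabits / mbps            # ideal, zero-overhead transfer
    print(f"{name}: about {seconds / 3600:.1f} h ({seconds / 60:.0f} min)")

Even the best case there, a flat-out gigabit line, is well over ten minutes of saturated downloading per machine, which is exactly the kind of thing people notice.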
So again, even if we assume that a God-AI does try to do a Skynet, only an idiot would believe it can self-replicate. The Terminator 3 scenario was extreme idiocy (and not just because Skynet effectively committed suicide by nuking the very computers it was running on).
Yes, and? If it's so intelligent, it can then modify itself to fit the environment. If it can't work, the parent program tries to infect new machines with a new version.
Except of course this only demonstrates that you've never actually done any coding.
Compatibility issues are a huge hurdle in any software development cycle. You cannot make a program run on Linux if it was written to run natively on Windows, and it only gets worse the more complex the program is. In fact, you're probably looking at a total recode from top to bottom just to get it to run again... and with a huge and highly complex program, that ain't gonna happen overnight.
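If you want a trivial, concrete illustration, here's a minimal sketch, assuming Python on Windows; the MessageBoxW call just stands in for "any Windows-only API":

# Tiny example of "works on Windows, dead on arrival anywhere else".
# ctypes.windll only exists on Windows builds of Python; on Linux the
# attribute access below fails outright, and the Win32 call it makes has
# no direct Linux equivalent anyway -- the feature has to be rewritten
# against a completely different API (GTK, Qt, X11, ...).
import ctypes

def notify(text: str) -> None:
    # MB_OK = 0; pops up a native Windows message box
    ctypes.windll.user32.MessageBoxW(None, text, "Hello", 0)

notify("This 'self-replicating' code just died on every non-Windows box.")

And that's one call. Scale that up to a program built on thousands of OS-specific calls, drivers and libraries, and "just port yourself" stops being a weekend job.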
Unless of course you're again one of the singularist idiots who hand-wave the actual complexity of code.
Unlike human programmers, an AI would actually be capable of thoroughly testing and understanding the processor it runs on, finding out all the errors it possibly can, and adjusting accordingly.
Actually, they can't. They can't do it now, and it's unlikely they will be able to do it quickly in the future. To date we don't even have a programming tool that can automatically correct code when a test fails; the best we have is little more than the equivalent of a spell checker.
This is again the "design ability" hurdle that singularists tend to hand-wave away without realizing how enormous it is; probably because the vast majority of singularist proponents aren't actual programmers but sci-fi writers.
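For reference, here's roughly what that "spell checker" level of automated checking looks like in practice; a minimal sketch in Python, with both snippets made up purely for illustration:

# What automated checking actually catches today: "spelling" mistakes.
# compile() happily accepts code whose logic is wrong; only outright
# syntax errors get flagged, and nothing fixes either one for you.
good_syntax_bad_logic = """
def average(values):
    return sum(values) / (len(values) + 1)   # off-by-one: silently wrong
"""

bad_syntax = """
def average(values)
    return sum(values) / len(values)
"""

for label, src in [("logic bug", good_syntax_bad_logic),
                   ("syntax error", bad_syntax)]:
    try:
        compile(src, "<snippet>", "exec")
        print(f"{label}: passes the automated check")
    except SyntaxError as err:
        print(f"{label}: flagged -> {err.msg}")

The logic bug sails straight through; only the missing colon gets caught, and even then the tool just points at it. Nobody has anything that fixes the broken code for you, let alone redesigns it.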
=====
A human being isn't born knowing how their brain works. We have to use sophisticated tools to actually look at how the brain works.
A machine would similarly be unable to automatically figure out how a microprocessor works. It may be able to know what model of microprocessor it's using by checking the System Registry, but System Registries do not come with complete schematics of how the microprocessor actually looks or functions. They don't say what kind of tolerances the hardware is capable of. If it tries to install itself on just some random computer, the likely result is simple: it will not run, period. It's like trying to run a game like Crysis II without knowing the system's capabilities.
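Out of curiosity, here's about everything a program can find out about its own processor without outside documentation; a sketch assuming Python, with the registry path being the standard Windows location for CPU 0:

# What a program can actually learn about the CPU it runs on, unaided:
# basically a marketing string and a core count. No schematics, no timing
# tolerances, no errata sheets.
import os
import sys

def cpu_model() -> str:
    if sys.platform == "win32":
        import winreg  # read the model string out of the System Registry
        key = winreg.OpenKey(
            winreg.HKEY_LOCAL_MACHINE,
            r"HARDWARE\DESCRIPTION\System\CentralProcessor\0")
        name, _ = winreg.QueryValueEx(key, "ProcessorNameString")
        return name
    if sys.platform.startswith("linux"):
        # Linux: same idea, different mechanism, equally shallow information
        with open("/proc/cpuinfo") as f:
            for line in f:
                if line.startswith("model name"):
                    return line.split(":", 1)[1].strip()
    return "unknown"

print("model :", cpu_model())
print("cores :", os.cpu_count())

And that's it: a name and a number. Nothing in there tells the program how the chip is built, what it tolerates, or whether the rest of the machine can even run it.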
So, again, the claims of the singularists are very much on the level of "insane paranoid ramblings" as far as the Terminator 3 Skynet scenario is concerned. It was a dumb scenario. You may as well claim that the Large Hadron Collider will kill us all.