What if the optimism is off by centuries, not decades?
As a programmer, I can tell you that our techniques have not kept up with the hardware. Yes, we do a lot more operations per second, and store a lot more data in increasingly smaller space, but we still use the same basic algorithms and data structures we used forty years ago. As a professional in the field, I'm not at all convinced we know the first thing about how to program a fundamentally intelligent, self-aware machine.
Software Advances of the Last 40 Years
Moderator: Thanas
Software Advances of the Last 40 Years
I think I may finally have caught this person spewing utter bullshit. Kind of hoping for help busting this, or if he is right, well, I guess I will have to live with that as well.
"He that would make his own liberty secure must guard even his enemy from oppression; for if he violates this duty, he establishes a precedent that will reach to himself."
Thomas Paine
"For the living know that they shall die: but the dead know not any thing, neither have they any more a reward; for the memory of them is forgotten."
Ecclesiastes 9:5 (KJV)
Re: Software Advances of the Last 40 Years
There's a difference between computer science theory following rules as old as Alan Turing and 'not keeping up'. Although it wouldn't surprise me if there was some truth to that.
The AI thing is just a massive red-herring, and I expect Starglider to school that idea pretty firmly.
Re: Software Advances of the Last 40 Years
I know you personally do not like people just asking questions without doing research, but in this case there is just so much stuff that it really is almost impossible to know where to start.
If I understand correctly, even such things as shells and the Google bots I used to try to find this information are forms of AI.
"He that would make his own liberty secure must guard even his enemy from oppression; for if he violates this duty, he establishes a precedent that will reach to himself."
Thomas Paine
"For the living know that they shall die: but the dead know not any thing, neither have they any more a reward; for the memory of them is forgotten."
Ecclesiastes 9:5 (KJV)
Re: Software Advances of the Last 40 Years
Uh, the quote says 'fundamentally intelligent, self-aware machine', not a webcrawler. He seems to be suggesting that current thinking is tied to the past and not keeping up with the possibilities of new hardware, which is why we can't build AI yet.
Re: Software Advances of the Last 40 Years
That is why I need someone who can at least help me, though...
Some other examples are the fuzzy logic used for the battle scenes in the Lord of the Rings films and in the newest video games. They may not be self-aware AI, but are they steps in that direction?
"He that would make his own liberty secure must guard even his enemy from oppression; for if he violates this duty, he establishes a precedent that will reach to himself."
Thomas Paine
"For the living know that they shall die: but the dead know not any thing, neither have they any more a reward; for the memory of them is forgotten."
Ecclesiastes 9:5 (KJV)
- Starglider
- Miles Dyson
- Posts: 8709
- Joined: 2007-04-05 09:44pm
- Location: Isle of Dogs
Re: Software Advances of the Last 40 Years
The first part is a non-sequitur. Of course we use the same basic structures; that's what all the more sophisticated structures are constructed from. Computer science is quite close to maths (although not actually maths) and it shares the property of progressing in a steady aggregation of techniques that build on each other. That said, there are a lot of 'basic structures' that most programmers would have had to know inside and out in the 1980s (e.g. segmentation/overlays/addressing modes) but which modern programmers can completely ignore, plus there is a huge raft of stuff that most programmers treat as a basic primitive today (e.g. associative arrays) which would've been a sophisticated custom code block in the 1980s.
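To make that last point concrete (a toy sketch of my own, not from any real codebase): counting word frequencies with an associative array is a couple of lines today, where each of those lines would have been a hand-written hash table routine in the 1980s.

    #include <iostream>
    #include <string>
    #include <unordered_map>

    int main() {
        // What took a hand-rolled hash table in the 1980s is now a library primitive.
        std::unordered_map<std::string, int> counts;
        for (std::string w : {"spam", "eggs", "spam"})
            ++counts[w];  // insert-or-increment; hashing and collision handling are hidden
        std::cout << counts["spam"] << "\n";  // prints 2
    }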
The second one is only debatable because of the 'fundamentally intelligent, self-aware machine' qualifier. Obviously AI has made a fair amount of progress and we do have programs that can do all kinds of interesting and useful things automatically. The obvious conclusion is that we are making slow progress towards general intelligence, but unfortunately I have to agree that most current AI research does not have much bearing on the central problems of general intelligence. That said, we have lots of viable ideas - Bayes/Kolmogorov probabilistic logic systems, workable genetic programming, recurrent and spiking NNs, hierarchical temporal memory, many other specialised techniques - that were not around in 1969 (in fact pretty much the only AGI-relevant things that were around then were simple planners using classic propositional logic). Dismissing all that as irrelevant would take fairly spectacular ignorance and/or arrogance, but again those weasel words ("I'm not at all convinced") may make it difficult to nail this person on it.
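To give a flavour of what the probabilistic logic stuff means at its most basic level, here is a single Bayes'-rule update (the numbers are invented purely for illustration; real systems chain huge numbers of these):

    #include <iostream>

    int main() {
        // Toy Bayes'-rule update: belief in hypothesis H after seeing evidence E.
        double prior      = 0.01;  // P(H): assumed prior belief in H
        double likelihood = 0.9;   // P(E|H): assumed
        double false_pos  = 0.05;  // P(E|not H): assumed
        double evidence  = likelihood * prior + false_pos * (1.0 - prior); // P(E)
        double posterior = likelihood * prior / evidence;                  // Bayes' rule
        std::cout << "P(H|E) = " << posterior << "\n"; // ~0.154
    }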
The hardware whining thing is funny - not just because 99% of those 'oh hardware is designed wrong, I know much better than all the engineers at Intel' or 'oh noes a x100 clock speed improvement has made all our programming techniques irrelevant no really' blog posts are bullshit (though that is hilarious) - but also because modern CPUs use the same logic gates, the same basic instruction pipeline, even a lot of the same low level logic circuit designs (flip flops, adder blocks) as 1960s CPUs. Oh noes! That means hardware is just as backwards as software after all!
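To underline the adder point, here is a one-bit full adder written with exactly the gates a 1960s designer would recognise (an illustrative sketch, not any particular chip's circuit):

    #include <cstdint>
    #include <iostream>

    // One-bit full adder built from the same Boolean gates as a 1960s CPU.
    void full_adder(uint8_t a, uint8_t b, uint8_t cin, uint8_t& sum, uint8_t& cout) {
        sum  = a ^ b ^ cin;                // two XOR gates
        cout = (a & b) | (cin & (a ^ b));  // two AND gates, one OR gate
    }

    int main() {
        uint8_t s, c;
        full_adder(1, 1, 0, s, c);
        std::cout << int(c) << int(s) << "\n"; // prints 10, i.e. 1+1 = binary 10
    }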
Send this person over here if you can and we will roast them.
Re: Software Advances of the Last 40 Years
His entire logic can also be refuted by this:
We have the same basic four-base genetic structure as single-cell organisms did 3.6 billion years ago. Clearly, evolution is not advancing us as a lifeform.
It's the application of the basic principles that has to advance, not the principles themselves.
And if programmers find a better way to do something, they let you know about it. We are braggarts, after all.
Although Starglider put things better than I ever could.
I've been asked why I still follow a few of the people I know on Facebook with 'interesting political habits and view points'.
It's so when they comment on or approve of something, I know what pages to block/what not to vote for.
Re: Software Advances of the Last 40 Years
His post is incredibly rude but I am still going to post it here.
Here is a link to the thread
http://208.84.116.223/forums/index.php? ... 7357&st=60
You can tell your boyfriends on SDN this:
1. I don't goof around on "which is better: Imperial Star Destroyer or Enterprise D" fanboy sites. If Kitsune (called "Desert Fox" here, or just "DF") wants to channel me for the purposes of this specific conversation, he can (most likely to his detriment).
2. The first line of the quote is a reference to predictions of near-term (ca. 20 years) commercial fusion power. It is irrelevant to a discussion about software technology.
3. What DF neglected to tell you is that the original discussion was about naive Positivism, and my statement was in response to this wonderfully vague and categorical assertion about the programmability of intelligent machines: "As for programming, sure it is hard. However, there is no reason it can't be done since it occurs in every human."
4. I apparently confused poor ol' DF when I said that I was a "professional in the field". I was referring back to the first sentence of the paragraph, which began, "As a programmer..." He apparently took it to mean that I was claiming to be an AI expert. Nothing could be further from the truth.
5. Where I went to school, we learned to build those "sophisticated custom code blocks" from the real basic primitives before we were allowed to assume that they'd just be available. With a little bit of refresher work, I'm confident I could match whoever you are at pointer-chasing through a BST or quadratically probing a hash table (see the sketch after this list).
6. I'm neither impressed nor intimidated by a list of buzzwords that can easily be lifted off of the interwebs. Yes, we have developed great monuments to complexity and subtlety with our current tools. I benefit from some of that work daily. (Query optimizers in database engines being a particularly elementary but useful form of AI.) My skepticism comes from the suspicion -- and I'll freely admit that's all that it is -- that intelligence may necessarily be based on a fundamentally different means of organizing and processing data than we use in modern digital computers. I'm not saying that it is, but I'm not willing to rule it out.
7. IOW, I don't believe something is a technological imperative simply because it exists in nature. Actually, the extraneous statement about commercial fusion power is right on point in this regard.
8. What you call "weasel words" most people call qualification. Unlike DF, I was trying to avoid a categorical statement.
9. For the person that brought up DNA, a couple of things:
a. DNA is a data storage medium. Data storage is nothing without rules to organize the data and programs that put it to use. Here in our little corner of the universe, it took right around 3 billion years to go from DNA synthesis to human intelligence. Now, we do have a big head start in knowing that intelligence exists and what its observable properties are. Let's not flatter ourselves that we know much more than that, however.
b. What if what we're doing with computers is the informational equivalent of inorganic chemistry? Take the oxygen and phosphorus out of DNA, add a bunch of silicon and traces of the most common metals, and you get granite. One can make Mt. Everest out of granite and not know how to make a paramecium.
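Since I promised I could, here's what I mean by quadratically probing a hash table (a toy lookup -- fixed-size table, made-up names, no resizing or deletion):

    #include <cstddef>
    #include <functional>
    #include <string>

    const std::size_t N = 16;   // table size
    std::string table[N];       // empty string marks an empty slot

    // Probe slots h, h+1, h+4, h+9, ... until we find the key or an empty slot.
    long find(const std::string& key) {
        std::size_t h = std::hash<std::string>{}(key) % N;
        for (std::size_t i = 0; i < N; ++i) {
            std::size_t slot = (h + i * i) % N;   // quadratic step
            if (table[slot] == key) return (long)slot;
            if (table[slot].empty()) return -1;   // not present
        }
        return -1;                                // probed out without a hit
    }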
I'll be watching you, DF. Don't change or leave out a word. I'll post the changes here alongside my original text if you do.
"He that would make his own liberty secure must guard even his enemy from oppression; for if he violates this duty, he establishes a precedent that will reach to himself."
Thomas Paine
"For the living know that they shall die: but the dead know not any thing, neither have they any more a reward; for the memory of them is forgotten."
Ecclesiastes 9:5 (KJV)
- Starglider
- Miles Dyson
- Posts: 8709
- Joined: 2007-04-05 09:44pm
- Location: Isle of Dogs
Re: Software Advances of the Last 40 Years
3. What DF neglected to tell you is that the original discussion was about naive Positivism

I imagine he was actually complaining about the lack of real meaning in the semantic networks Cyc-style classic propositional reasoners use, which probably constituted most of his limited exposure to AI. Equating this with philosophical positivism (which is in fact one of the few good things in the whole field of philosophy) is nonsense; yes, essentially all of those symbolic AI researchers were explicit or implicit positivists, but their extreme oversimplification (largely forced by the hardware of the time) is in no way a part of the positivist paradigm.
Where I went to school, we learned to build those "sophisticated custom code blocks" from the real basic primitives before we were allowed to assume that they'd just be available.

A clarification here; today if you want to store data you normally just pick an appropriate library class; STL or the equivalents in all major languages. You get a simple common interface and can pick from a few generic implementations (e.g. red-black tree, linked hashmap). Performance is merely adequate, but that's ok; better to have something highly functional, well tested and well optimised. Most programmers these days aren't trained to write optimised low-level stuff anyway; you can't really blame them, when they have to learn so much other material (multimedia and 3D APIs, web programming, umpteen different enterprise 'frameworks' for CV buzzword compliance) that earlier programmers didn't.
If you look at older code, you find a much higher incidence of custom-designed data structures hand-optimised for specific usage scenarios - /not/ the toy implementations you learned in Data Structures 101 (though of course there were plenty of those in low quality code). Today you only really see this in kernels, database engines and a few scientific applications - even games are now written using relatively high level libraries as much as possible. Generally this is a good tradeoff; most code isn't time-critical anyway, and saving those developer hours for use somewhere else is more important. I recently spent six months implementing a graph database engine chock-full of bit level packing, blocking for cache optimisation and assorted inline assembly - because it really was necessary for fast processing of large datasets - but this is extremely unusual in modern practice and caused just about everyone who's seen the source to go 'wuh?'.
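For a flavour of that bit-level packing (an invented layout for illustration, not the actual engine's format), here's a graph edge squeezed into a single 64-bit word:

    #include <cstdint>
    #include <iostream>

    // Invented packed edge record: [ source:28 | target:28 | weight:8 ] in one word.
    struct PackedEdge {
        uint64_t bits;
        uint32_t source() const { return (uint32_t)(bits >> 36); }
        uint32_t target() const { return (uint32_t)((bits >> 8) & 0xFFFFFFF); }
        uint8_t  weight() const { return (uint8_t)(bits & 0xFF); }
    };

    PackedEdge make_edge(uint32_t s, uint32_t t, uint8_t w) {
        return { ((uint64_t)s << 36) | ((uint64_t)t << 8) | w };
    }

    int main() {
        PackedEdge e = make_edge(123456, 654321, 42);
        std::cout << e.source() << " -> " << e.target()
                  << " (w=" << int(e.weight()) << ")\n"; // 123456 -> 654321 (w=42)
    }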
My skepticism comes from the suspicion -- and I'll freely admit that's all that it is -- that intelligence may necessarily be based on a fundamentally different means of organizing and processing data than we use in modern digital computers. I'm not saying that it is, but I'm not willing to rule it out.

That is actually the correct position to take, if you are not an AGI researcher. However I should stress that if you have not spent years studying the field (not just playing with your own prototypes, but reading everyone else's papers), you are not qualified to have any real opinion about it. AGI really is that difficult and that counterintuitive - it takes years of work just to be qualified to have a credible hunch, and to date >90% of those have still turned out wrong. I personally have now become quite confident that representations relatively similar to those used by conventional programs (as opposed to completely opaque artificially evolved or low-level-brain-mimicking ones) will work, and somewhat less confident, but still inclined to think, that they are optimal; but there's still too much personal interpretation in that for me to be able to dismiss researchers who don't agree with me (besides, most of the alternative approaches are dismissible by their own specific flaws, so it's usually easier to win by default).
Re: Software Advances of the Last 40 Years
I really wish he would discuss this with you directly instead of going through me... Instead, I want to go back to the original subject of the Pregnant Walking Whale.
"He that would make his own liberty secure must guard even his enemy from oppression; for if he violates this duty, he establishes a precedent that will reach to himself."
Thomas Paine
"For the living know that they shall die: but the dead know not any thing, neither have they any more a reward; for the memory of them is forgotten."
Ecclesiastes 9:5 (KJV)