Computer Science - why don't people 'get it'?

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

Darth Wong wrote:So? There's a lot more applied science in optimally programming a CNC machine than there is in programming computers.
What's your point? "Programming" can encompass a wide variety of things.
The complexity of the program has no bearing whatsoever on the question of whether the task should be called "programming".
No, but it does have bearing on whether or not the person who performs the task should be classified as a "technician" or something else. The guy who programs a CNC machine to machine a part isn't designing any new systems; he's just scripting a machine's movements. So yeah, you could classify him as a technician. The guy who writes an IO subsystem and its associated APIs for an operating system is designing an entirely new system, which is not something technicians do.

CNC programmers and application software developers may live in the same world, but they do entirely different things. The overlap in skill sets is minimal, at best.

Though I'm curious, what exactly is your definition of proper engineering? Must it apply a physical science of some sort, be regulated and licensed by a board, or both?
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
User avatar
Winston Blake
Sith Devotee
Posts: 2529
Joined: 2004-03-26 01:58am
Location: Australia

Post by Winston Blake »

Darth Wong wrote:So? There's a lot more applied science in optimally programming a CNC machine than there is in programming computers.
Defining engineering as applied science clearly excludes making software, which is inherently unphysical and so not governed by scientific physical laws. But it also excludes control systems engineering (somebody had to make that CNC machine move properly). Control engineering is more about applied mathematics than science.

If engineering is defined as applying science and mathematics, then many areas of software development become engineering. For example, making a search engine by applying the theory of searching and sorting. Or more generally, applying computer science concepts (which I agree should probably be called 'computational mathematics') to manage pieces of information. Data structures and algorithms have to be selected and implemented based on their particular characteristics. I think this only applies to a minority of the work - the vast majority of software engineers aren't engineers.
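
As a toy illustration of that kind of selection (my own sketch, nothing to do with real search engines): for sorted data, theory says binary search needs O(log n) comparisons where a linear scan needs O(n), so you pick accordingly. In C:

#include <stdlib.h>

/* Comparison callback for the C library's bsearch(). */
static int cmp_int(const void *a, const void *b)
{
    int x = *(const int *)a;
    int y = *(const int *)b;
    return (x > y) - (x < y);
}

/* For sorted data, binary search takes O(log n) comparisons versus
   O(n) for a linear scan - a choice made from the known
   characteristics of the algorithms, not from trial and error. */
int contains(const int *sorted, size_t n, int key)
{
    return bsearch(&key, sorted, n, sizeof *sorted, cmp_int) != NULL;
}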

A big difference from ordinary engineering is that making software doesn't use calculus. I don't know much, but I think engineering typically means mathematically describing an electrical or mechanical or chemical etc. system and then solving those equations to find the design parameters, or to determine whether it will fail or perform properly. I don't see how you can do that with software - form equations completely describing the system and then solve them to produce the code.

The way I see it, the vast majority of software is made in a similar way to how the Romans built aqueducts and the Colosseum. They didn't have our mathematical descriptions of why things failed back then. They just built a body of knowledge by trial and error, experience-born instinct and simple geometry. This is much more like a trade (like carpentry) than engineering. Sure, they could have theory on the characteristics of common problems/solutions/pitfalls, but that's not based on higher mathematics.

Software development is like building a bridge by turning gravity off, sketching on a piece of paper, and then a bridge materialises. Then you turn gravity on and one end collapses. So you rewind time, find out what failed and fix it. Then you run an expected load of traffic across, and the middle fails. Rewind, fix. Then you run a 6x load and it stays up, and you go have lunch and it's still up when you come back. There: product finished.

If physical engineering was like that, prehistoric man would have had aqueducts/railways/etc. We'd also be a race of telekinetic Time Lords. This isn't irrelevant - even if a full mathematical description of software arose, repeated prototyping is so ridiculously easy that there's no guarantee it would ever be used.

So basically, what I'm saying is that a small part of software development is engineering in the sense of applied mathematics. The majority is not, and further, will probably never be.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Winston Blake wrote:Defining engineering as applied science clearly excludes making software, which is inherently unphysical and so not governed by scientific physical laws.
I just pointed out that computer science does rely (to some extent) on empirical tests, for characterisation of performance and behaviour. These are run on real physical systems (of a certain structure that implements the logical structure of interest). You'll have to say why this isn't 'real science' if you think software engineering isn't applied science. Certainly the mathematicians don't think it's 'real maths'.
A big difference from ordinary engineering is that making software doesn't use calculus.
Or rather, it rarely uses calculus. Some kinds of machine learning do, as do physics engines for games and modelling software. But anyway I don't see why calculus is any more special than any other branch of maths.
I don't know much, but I think engineering typically means mathematically describing an electrical or mechanical or chemical etc. system and then solving those equations to find the design parameters, or to determine whether it will fail or perform properly. I don't see how you can do that with software - form equations completely describing the system and then solve them to produce the code.
Ah now you've hit on something relevant. The software equivalent of this is 'formal methods', where you do indeed create a mathematical description of the software and prove that it will work. Unfortunately, only a tiny fraction of software developers use this approach, because it's difficult and slow and the tools aren't very good. Formal methods are used for software where quality/reliability is critical and the cost and time taken is worth it; usually embedded software in safety-critical applications. I would not object to the idea that only this kind of software development is real engineering. IMHO all software should be developed like this, but realistically that will require much better development tools (not coincidentally, that is what my company is working on).
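
To give a flavour of what that looks like: a toy contract in the style of ACSL annotations (the specification language used by C verification tools such as Frama-C). A sketch only - a real proof would also need loop invariants, which I've left out.

/*@ requires n >= 0;
  @ requires \valid_read(a + (0 .. n-1));
  @ ensures \result >= 0 ==> a[\result] == key;
  @ ensures \result == -1 ==>
  @   \forall integer i; 0 <= i < n ==> a[i] != key;
  @*/
/* The prover must show that, for every input satisfying the
   preconditions, the postconditions hold - no test cases involved. */
int find(const int *a, int n, int key)
{
    for (int i = 0; i < n; i++)
        if (a[i] == key)
            return i;
    return -1;
}
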
The way I see it, the vast majority of software is made in a similar way to how the Romans built aqueducts and the Colosseum. They didn't have our mathematical descriptions of why things failed back then. They just built a body of knowledge by trial and error, experience-born instinct and simple geometry.
90% of the time, we have the theory required to describe software in detail. The problem is that most customers don't want to pay for it - neither for the dev time nor for the high salaries of real software engineers who know how to do formal methods (a tiny minority of software developers; single-digit percent at best, possibly less than 1%). It's equivalent to nearly everyone ignoring professional architects and getting a graphic designer to sketch out their new house in an afternoon - because it's quicker and cheaper. People get away with this because most software isn't safety critical, but it's why most software is quite buggy.
User avatar
Winston Blake
Sith Devotee
Posts: 2529
Joined: 2004-03-26 01:58am
Location: Australia

Post by Winston Blake »

Starglider wrote:
Winston Blake wrote:Defining engineering as applied science clearly excludes making software, which is inherently unphysical and so not governed by scientific physical laws.
I just pointed out that computer science does rely (to some extent) on empirical tests, for characterisation of performance and behaviour. These are run on real physical systems (of a certain structure that implements the logical structure of interest). You'll have to say why this isn't 'real science' if you think software engineering isn't applied science. Certainly the mathematicians don't think it's 'real maths'.
Running it is physical, sure, but what I meant was that the 'design' part is not dependent on physical laws. I was unclear here, but I elaborated later.
A big difference from ordinary engineering is that making software doesn't use calculus.
Or rather, it rarely uses calculus. Some kinds of machine learning do, as do physics engines for games and modelling software. But anyway I don't see why calculus is any more special than any other branch of maths.
It's not; it's just one difference between software engineering and the other kinds, and a possible reason why Darth Wong and others just 'feel' that it's not 'real' engineering.
The software equivalent of this is 'formal methods', where you do indeed create a mathematical description of the software and prove that it will work. Unfortunately, only a tiny fraction of software developers use this approach, because it's difficult and slow and the tools aren't very good.
Ah. I didn't know this existed.
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Starglider wrote:IMHO all software should be developed like this, but realistically that will require much better development tools (not coincidentally, that is what my company is working on).
It certainly would be nice, though could you do formal verification for really complicated systems? There are so many possible state changes that it seems rather infeasible.
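
Back-of-envelope, just to show the scaling (nothing rigorous): n independent boolean flags already give 2^n distinct states to explore.

#include <stdio.h>

/* 2^n states for n boolean flags; log10(2) is about 0.30103.
   Exhaustive exploration is hopeless long before n approaches the
   state carried by a real operating system. */
int main(void)
{
    for (int n = 10; n <= 80; n += 10)
        printf("%2d flags -> ~1e%.0f states\n", n, n * 0.30103);
    return 0;
}
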
User avatar
Darth Wong
Sith Lord
Sith Lord
Posts: 70028
Joined: 2002-07-03 12:25am
Location: Toronto, Canada
Contact:

Post by Darth Wong »

OK, a couple of things:

1) Assuming that it is a form of systems design, I would be a lot more comfortable with calling something "engineering" if it incorporated professional standards, a self-policing association, a code of conduct, and the possibility of loss of credentials due to misconduct. As far as I know, no such thing exists in the programming world. At all. Feel free to enlighten me if this is not the case.

2) Despite what you may think, software people are above and beyond others when it comes to abusing the term "engineer". While stupid terms like "sanitary engineer" have been around for a while, nobody has ever taken them seriously. But how many thousands of resumes have landed on employers' desks with bullshit credentials on them like MCSE, an acronym which I am reluctant to expand for fear that I will vomit all over my keyboard?
"It's not evil for God to do it. Or for someone to do it at God's command."- Jonathan Boyd on baby-killing

"you guys are fascinated with the use of those "rules of logic" to the extent that you don't really want to discussus anything."- GC

"I do not believe Russian Roulette is a stupid act" - Embracer of Darkness

"Viagra commercials appear to save lives" - tharkûn on US health care.

http://www.stardestroyer.net/Mike/RantMode/Blurbs.html
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Darth Wong wrote:1) Assuming that it is a form of systems design, I would be a lot more comfortable with calling something "engineering" if it incorporated professional standards, a self-policing association, a code of conduct, and the possibility of loss of credentials due to misconduct. As far as I know, no such thing exists in the programming world. At all. Feel free to enlighten me if this is not the case.
There are two associations involved with the CS field in general, but they're weaker than their counterparts in traditional engineering fields. IEEE-CS and the Association for Computing Machinery (ACM) are the two big ones. Both ACM and IEEE-CS have standardized on a code of ethics and professional practice.

There remains no authoritative credentialing system in the US for programmers, and even in traditional engineering fields a PE license is generally only required for work offered directly to the public.
Despite what you may think, software people are above and beyond others when it comes to abusing the term "engineer". While stupid terms like "sanitary engineer" have been around for a while, nobody has ever taken them seriously. But how many thousands of resumes have landed on employers' desks with bullshit credentials on them like MCSE, an acronym which I am reluctant to expand for fear that I will vomit all over my keyboard?
As a nitpick, the MCSE is actually an IT administration certification. Microsoft's programming certifications end with "developer," not "engineer."
User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

phongn wrote:
Darth Wong wrote:1) Assuming that it is a form of systems design, I would be a lot more comfortable with calling something "engineering" if it incorporated professional standards, a self-policing association, a code of conduct, and the possibility of loss of credentials due to misconduct. As far as I know, no such thing exists in the programming world. At all. Feel free to enlighten me if this is not the case.
There are two associations involved with the CS field in general, but they're weaker than their counterparts in traditional engineering fields. IEEE-CS and the Association for Computing Machinery (ACM) are the two big ones. Both ACM and IEEE-CS have standardized on a code of ethics and professional practice.

There remains no authoritative credentialing system in the US for programmers, and even in traditional engineering fields a PE license is generally only required for work offered directly to the public.
I think the key requirement is issuing a revocable license. If a software engineer is guilty of unethical activity or incompetence, there's really no institution that keeps a record of it. And beyond that, any institution with this power would have to have much looser standards for incompetence than are commonly accepted for civil or mechanical engineering. At the end of the day, software has bugs, and it will always have bugs. In the software community, we've learned to live with that and learned how to fix them quickly.

A bridge, on the other hand, really can't have any "bugs" without catastrophic consequences.
Despite what you may think, software people are above and beyond others when it comes to abusing the term "engineer". While stupid terms like "sanitary engineer" have been around for a while, nobody has ever taken them seriously. But how many thousands of resumes have landed on employers' desks with bullshit credentials on them like MCSE, an acronym which I am reluctant to expand for fear that I will vomit all over my keyboard?
As a nitpick, the MCSE is actually an IT administration certification. Microsoft's programming certifications end with "developer," not "engineer."

Everyone recognizes that MCSEs are basically useless.
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Durandal wrote:Everyone recognizes that MCSEs are basically useless.
Pretty much. I think Microsoft meant well with them, but the rise of boot camps in the dot-com boom ruined the certification for everyone else.
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Durandal wrote:And beyond that, any institution with this power would have to have much looser standards for incompetence than are commonly accepted for civil or mechanical engineering. At the end of the day, software has bugs, and it will always have bugs. In the software community, we've learned to live with that and learned how to fix them quickly.
I don't agree with that sentiment. In principle, formal software engineering tools make it possible to guarantee that software will meet the specification and will not fail catastrophically. Yes, it's extremely time consuming, it requires specialised skillsets, and the tools are expensive and kinda sucky. But it is possible. Of course bugs will still exist in the original specification, because it's impossible to eliminate human error from requirements capture, and possibly in the tools. Look at electronic engineering, though. No one (that I know of) objects to people who design circuitry being called 'electronic engineers', even though they can get the requirements wrong and sometimes even introduce glitches into their circuits. IMHO it is definitely possible to achieve standards of quality and certification in software engineering equivalent to electrical engineering.
A bridge, on the other hand, really can't have any "bugs" without catastrophic consequences.
A bridge is overdesigned to achieve that. Overdesigning software is harder, but possible (pervasive verification, redundant code paths that must agree, hard performance guarantees that you then multiply by a safety factor, etc.). For example, software for the Space Shuttle is written like this. But almost no one else is prepared to pay for it; generally it's only done when there are unavoidable liability concerns and developing the software properly is cheaper than just eating the litigation (or insurance) costs if it fails.
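
As a toy sketch of the 'redundant code paths that must agree' idea (illustrative only - real avionics voting logic is far more elaborate, and votes across independently developed implementations):

#include <stdlib.h>

/* Two independently written routines computing the same specification. */
static long sum_forward(const int *a, int n)
{
    long s = 0;
    for (int i = 0; i < n; i++)
        s += a[i];
    return s;
}

static long sum_backward(const int *a, int n)
{
    long s = 0;
    for (int i = n - 1; i >= 0; i--)
        s += a[i];
    return s;
}

/* The paths must agree; on disagreement, fail safe rather than
   return a possibly-wrong answer. */
long sum_voted(const int *a, int n)
{
    long r1 = sum_forward(a, n);
    long r2 = sum_backward(a, n);
    if (r1 != r2)
        abort();
    return r1;
}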

Basically we could have properly certified 'software engineers', but they'd be a small niche compared to the hordes of 'software developers'. The problem is the market demand/interest isn't there to set up the legislation and organisations required.
Everyone recognizes that MCSEs are basically useless.
All good technical people do, sure. A lot of non-technical management don't; a lot of low-grade technical people who were suckered into getting one think it's worth something; and recruiters and trainers are spending a lot of money on marketing trying to maintain that illusion.
User avatar
Battlehymn Republic
Jedi Council Member
Posts: 1824
Joined: 2004-10-27 01:34pm

Post by Battlehymn Republic »

What about Cisco's certifications?
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Battlehymn Republic wrote:What about Cisco's certifications?
They're pretty good, AFAICT.
User avatar
The Jester
Padawan Learner
Posts: 475
Joined: 2005-05-30 08:34am
Location: Japan

Post by The Jester »

Battlehymn Republic wrote:What about Cisco's certifications?
I would only put stock in the CCIE certification, since it's the only one that involves any real challenge to obtain.
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Aren't the CCNA and CCNP exams reasonably challenging as well? They're no CCIE, granted, but they're not a slam dunk.
User avatar
Edi
Dragonlord
Dragonlord
Posts: 12461
Joined: 2002-07-11 12:27am
Location: Helsinki, Finland

Post by Edi »

phongn wrote:Aren't the CCNA and CCNP exams reasonably challenging as well? They're no CCIE, granted, but they're not a slam dunk.
CCNA reasonably challenging? Hahahaaaaaa! It might be if they have changed it from what it used to be, but when the standard of difficulty used to be "What color is this red cable?", the improvement would have to be pretty big.

CCNP actually requires some more skill so you do have to have an idea of what you're doing, but a high level certification it ain't. The Jester can probably elaborate better on these, as his knowledge of the Cisco certificates is more current than mine.
Warwolf Urban Combat Specialist

Why is it so goddamned hard to get little assholes like you to admit it when you fuck up? Is it pride? What gives you the right to have any pride?
–Darth Wong to vivftp

GOP message? Why don't they just come out of the closet: FASCISTS R' US –Patrick Degan

The GOP has a problem with anyone coming out of the closet. –18-till-I-die
User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

Starglider wrote:I don't agree with that sentiment. In principle, formal software engineering tools make it possible to guarantee that software will meet the specification and will not fail catastrophically.
That's the desire. But anyone who's actually worked on large, complex systems will tell you that, in practice, it's simply not achievable. Frankly, I've never put much stock in formal software engineering. Operating systems are a prime example of this because, even if each component is tested in its own little environment and works, they can still come together to cause the system to fail catastrophically. (Yes, even if you've designed your operating system to segregate user-space processes from each other and from the kernel and all those other good design practices.)

At the end of the day, operating systems and computers are extraordinarily complex systems. They can't be reliably predicted. I've seen simple, one-line changes result in kernel panics and hugely complex changes go in without any problems. You just can't guarantee anything. And it's not because software developers are incompetent or don't know how to test, it's just that achieving exhaustive coverage in your testing is impossible.
Yes, it's extremely time consuming, it requires specialised skillsets, and the tools are expensive and kinda sucky. But it is possible. Of course bugs will still exist in the original specification, because it's impossible to eliminate human error from requirements capture, and possibly in the tools. Look at electronic engineering, though. No one (that I know of) objects to people who design circuitry being called 'electronic engineers', even though they can get the requirements wrong and sometimes even introduce glitches into their circuits. IMHO it is definitely possible to achieve standards of quality and certification in software engineering equivalent to electrical engineering.
Historically, that's just wrong, and it has to do with complexity. Look at the reliability of CPUs versus the reliability of software. When was the last time a really, really serious bug was found in a CPU? I mean one that would cause it to simply stop functioning properly? I'm not talking about the unit being part of a bad batch; I'm talking about an overwhelming failure in the design.

There haven't been very many, and there's a good reason. For all the difficulties of CPU design and hardware logic design, the end product is a whole hell of a lot more predictable in its behavior. It's a lot easier to develop a comprehensive test suite for a CPU than it is for a complex software system because you can develop unit tests for the CPU that achieve excellent coverage. After all, on a CPU, making sure that a given set of inputs produces the correct set of outputs pretty much covers the entire set of functionality.

Not the case in software. Making sure all your methods return the proper values is only a small part of how the system works. You have to deal with things like threading issues, asynchronous behavior, blocking, possibly weird error conditions in the system generating exceptions, etc. I shudder to think how many components in any operating system would just puke themselves if malloc() suddenly started failing.
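
The classic shape of that failure, as a sketch (not code from any real OS):

#include <stdlib.h>
#include <string.h>

/* Written on the assumption that allocation never fails: if malloc()
   returns NULL, strcpy() dereferences it and the process dies. */
char *copy_unchecked(const char *s)
{
    char *p = malloc(strlen(s) + 1);
    strcpy(p, s);
    return p;
}

/* The defensive version degrades gracefully instead. */
char *copy_checked(const char *s)
{
    char *p = malloc(strlen(s) + 1);
    if (p == NULL)
        return NULL;  /* the caller must handle the failure */
    strcpy(p, s);
    return p;
}
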
A bridge is overdesigned to achieve that. Overdesigning software is harder, but possible (pervasive verification, redundant code paths that must agree, hard performance guarantees that you then multiply by a safety factor, etc.). For example, software for the Space Shuttle is written like this. But almost no one else is prepared to pay for it; generally it's only done when there are unavoidable liability concerns and developing the software properly is cheaper than just eating the litigation (or insurance) costs if it fails.
The Space Shuttle software also runs on 8086's, if memory serves. How complex can it really be compared to modern operating systems?
Basically we could have properly certified 'software engineers', but they'd be a small niche compared to the hordes of 'software developers'. The problem is the market demand/interest isn't there to set up the legislation and organisations required.
The problem is that so much software development is done by amateurs. Anyone with a computer can pretty much dive right in. And the trend in the industry is to make development more easily accessible, which encourages more amateurs to enter.

But it's not like you can just go and buy yourself a shop and a lathe to start your career in mechanical engineering. And there is no trend in mechanical engineering that has the effect of making entry into the field easier.
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Durandal wrote:Historically, that's just wrong, and it has to do with complexity. Look at the reliability of CPUs versus the reliability of software. When was the last time a really, really serious bug was found in a CPU? I mean one that would cause it to simply stop functioning properly? I'm not talking about the unit being part of a bad batch; I'm talking about an overwhelming failure in the design.
K10 had a nasty TLB bug in it that caused AMD to stop shipping Opterons until there was a fix (and the software fix causes a fairly significant performance drop). Core 2 has plenty of errata, too, and a lot of it won't be fixed.
After all, on a CPU, making sure that a given set of inputs produces the correct set of outputs pretty much covers the entire set of functionality.
But how many combinations of inputs are there? It's not possible to test them exhaustively.
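
Rough numbers, just to illustrate the scale: a single 64-bit binary operation already has 2^128 possible input pairs.

#include <math.h>
#include <stdio.h>

/* Back-of-envelope only: input pairs for one 64-bit binary operation,
   and how long exhausting them would take at a trillion tests per
   second. */
int main(void)
{
    double combos  = pow(2.0, 128.0);
    double seconds = combos / 1e12;
    double years   = seconds / (3600.0 * 24.0 * 365.0);
    printf("%.2e input pairs, ~%.1e years to test them all\n",
           combos, years);
    return 0;
}
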
Not the case in software. Making sure all your methods return the proper values is only a small part of how the system works. You have to deal with things like threading issues, asynchronous behavior, blocking, possibly weird error conditions in the system generating exceptions, etc. I shudder to think how many components in any operating system would just puke themselves if malloc() suddenly started failing.
CPU designers have to deal with their own problems too, like ensuring the cache hierarchy doesn't go wonky. CPUs are immensely complex, just like complex software, only in different ways.
The Space Shuttle software also runs on 8086's, if memory serves. How complex can it really be compared to modern operating systems?
It's 420,000 LOC for the flight control software, with another 1.4 million LOC for development, testing, simulation, etc. And even they have major errors at times - most recently, they never accounted for the fact that the Shuttle might be flying over New Year's, when there would be a year change. The general-purpose computer is an AP-101S, which apparently is a custom design. A bunch of other stuff runs on 386 and PPC; see here for some details.
User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

phongn wrote:K10 had a nasty TLB bug in it that caused AMD to stop shipping Opterons until there was a fix (and the software fix causes a fairly significant performance drop). Core 2 has plenty of errata, too, and a lot of it won't be fixed.
Okay, so that's one potentially nasty bug. Errata are a different matter, since they don't impair general functionality. A catastrophic CPU failure that just causes it to stop is pretty rare, and if there is one in the errata, you can bet it doesn't manifest itself under anything but the weirdest of circumstances.

Hell, the very existence of errata almost proves my point. They're so damn good at testing that they can come up with a decent list of "known issues". It's not exactly typical for people to discover CPU issues that weren't documented in the errata. It's very much typical for people to find issues in software that weren't documented in the "known issues" section, if there was one.
But how many combinations of inputs are there? It's not possible to test them exhaustively.
Not all of them, but you can certainly get better coverage. Just look at the reliability of CPUs versus the reliability of operating systems. CPUs aren't more reliable for the average user just by chance; they're more reliable because they're easier to predict and the "buck stops there", so to speak. There are no black boxes beneath the CPU layer, unlike with software.
CPU designers have to deal with their own problems too, like ensuring the cache hierarchy doesn't go wonky. CPUs are immensely complex, just like complex software, only in different ways.
Sure, but they manage. I'm not saying that CPUs aren't complex, but they're a lot easier to test due to the lack of multiple abstraction layers.
It's 420,000 LOC for the flight control software, with another 1.4 million LOC for development, testing, simulation, etc. And even they have major errors at times - most recently, they never accounted for the fact that the Shuttle might be flying over New Year's, when there would be a year change. The general-purpose computer is an AP-101S, which apparently is a custom design. A bunch of other stuff runs on 386 and PPC; see here for some details.
Complexity isn't just a function of lines of code. It's a function of asynchronous operation. Just how many of those 420,000 lines can be executing simultaneously, for example? In a modern OS like Mac OS X or Windows, I'd wager that half the components that go into it can be in-flight at any given time.

Also, going by straight LOC, Windows XP is estimated to have about 40 million LOC, and Mac OS X Tiger has about 86 million LOC. So yes, the Space Shuttle software is far less complex just based on that metric.
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
User avatar
Aquatain
Padawan Learner
Posts: 294
Joined: 2004-11-02 07:13am
Location: Ever Expanding Empire of Denmark

Post by Aquatain »

Computer science - Christ you got it easy, try explaining to somebody that you're a synthesis operator, and they'll all go "huh..what?"
There Lives More Faith In Honest Doubt, Believe Me, Than In Half The Creeds. ~ Alfred Lord Tennyson.

"The two most common elements in the universe are Hydrogen and stupidity."
User avatar
Zac Naloen
Sith Acolyte
Posts: 5488
Joined: 2003-07-24 04:32pm
Location: United Kingdom

Post by Zac Naloen »

Aquatain wrote:Computer science - Christ you got it easy, try explaining to somebody that you're a synthesis operator, and they'll all go "huh..what?"
For some reason I'm picturing you sat in a recording studio with a synthesiser :P



Microsoft have recently changed their qualifications system. I don't know if they've made it more challenging, but the low-level stuff I've just been doing wasn't the brain-dead "what colour is the screen?" shit that I was informed it would be.
Member of the Unremarkables
Just because you're god, it doesn't mean you can treat people that way : - My girlfriend
Evil Brit Conspiracy - Insignificant guy
User avatar
The Jester
Padawan Learner
Posts: 475
Joined: 2005-05-30 08:34am
Location: Japan

Post by The Jester »

Edi wrote:CCNP actually requires some more skill so you do have to have an idea of what you're doing, but a high level certification it ain't. The Jester can probably elaborate better on these, as his knowledge of the Cisco certificates is more current than mine.
I've only sat through Cisco's CCNA courses and their first WLAN course, and can testify that these courses are a complete joke. I know plenty of people who have gone through the CCNP certification process, and while it's significantly more complicated than the CCNA, it's still a joke.

Cisco uses online multiple-choice exams for testing theory (with no penalty for random guessing), which puts them on the same level as American SATs. Furthermore, since the exams are taken online by so many people, there will always be people who copy the questions and post them online. Since Cisco don't rotate their question pool anywhere near quickly enough, this makes cheating rampant and impossible for instructors to defeat. Most guys I know working in telecoms who bothered with the certification basically got it as an afterthought.
User avatar
Dahak
Emperor's Hand
Posts: 7292
Joined: 2002-10-29 12:08pm
Location: Admiralty House, Landing, Manticore
Contact:

Post by Dahak »

My only experience is with Sun and OMG certifications. I did them because it makes your profile look shinier for the customer your boss wants to sell you to (and because said boss listed them in the goal sheet for that year...).

The Java Programmer exam is quite difficult. Even if you've been programming Java for the last five years, you will probably fail. Not because you cannot do Java, but because they ask for things you never have to bother with in real work (because the editor will take care of them for you), or which are so specialised or detailed that it's highly unlikely you'll ever see them in real life. This exam, basically, teaches you to play "compiler".

OMG is somewhat nicer, but it also comes down to memorising diagrams, properties of diagrams and the like. Nothing too difficult...
Great Dolphin Conspiracy - Chatter box
"Implications: we have been intercepted deliberately by a means unknown, for a purpose unknown, and transferred to a place unknown by a form of intelligence unknown. Apart from the unknown, everything is obvious." ZORAC
GALE Force Euro Wimp
Human dignity shall be inviolable. To respect and protect it shall be the duty of all state authority.
User avatar
Aquatain
Padawan Learner
Posts: 294
Joined: 2004-11-02 07:13am
Location: Ever Expanding Empire of Denmark

Post by Aquatain »

Zac Naloen wrote:
Aquatain wrote:Computer science - Christ you got it easy, try explaining to somebody that you're a synthesis operator, and they'll all go "huh..what?"
For some reason I'm picturing you sat in a recording studio with a synthesiser :P



Microsoft have recently changed their qualifications system. I don't know if they've made it more challenging, but the low-level stuff I've just been doing wasn't the brain-dead "what colour is the screen?" shit that I was informed it would be.
Actually it's the creation of medicine through chemical synthesis :D
There Lives More Faith In Honest Doubt, Believe Me, Than In Half The Creeds. ~ Alfred Lord Tennyson.

"The two most common elements in the universe are Hydrogen and stupidity."
User avatar
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs
Contact:

Post by Starglider »

Durandal wrote:That's the desire. But anyone who's actually worked on large, complex systems will tell you that, in practice, it's simply not achievable.
No, they won't - or rather, if they do, nearly all of them will be speaking from a position of near-complete ignorance. Only a small fraction of programmers ever learn what formal methods entail, never mind actually apply them. They may honestly say that they weren't able to prove their software correct, but that says nothing about whether appropriately trained engineers could.
Frankly, I've never put much stock in formal software engineering.
Ok, which formal methods frameworks have you used? I've used Z, did a course on VDM (never actually applied it, though) and am currently working on an intelligent code generation engine that uses techniques quite similar to formal verification (for a start, the basis is essentially a Petri net). This is the basis for my opinion that 'the methods work in principle, but they're tedious and fiddly and the tools suck'.
Operating systems are a prime example of this because,
Operating systems are an exception; for one thing, the OS is often blamed for the failings of drivers (Windows in particular; drivers for consumer hardware are usually awful), and for another, the amount of abstraction that can be used is heavily limited by performance concerns (backwards compatibility also imposes crippling constraints). That said, it is still possible to engineer an OS with formal methods and hard reliability guarantees. It's just that no one in their right mind would do this for a major OS, because they're already very expensive and time consuming to create from scratch. Microkernels are a step in this direction, and that concept mostly failed due to unacceptable performance and (whole-system) complexity overhead.
even if each component is tested in its own little environment and works, they can still come together to cause the system to fail catastrophically
Strictly, formal methods have to be applied to the whole system, and that's harder to do for OSes than for nearly anything else. That said, there's much less scope for formally engineered components to interfere with each other, even if they're developed and verified completely separately, provided reasonably defensive assumptions are used.
(Yes, even if you've designed your operating system to segregate user-space processes from each other and from the kernel and all those other good design practices.)
Actually formal verification eliminates the need for address space protection - this is exactly what the Java VM does (well, strictly it does it by a combination of verification and weakened instruction set expressivity), and people have seriously proposed 'Java operating systems' that don't need memory protection. This isn't a practical proposition at this time as it's hard to even contemplate replacing the massive investment in conventional memory-protected systems. This is why projects like Tunes will remain pie-in-the-sky theory or at best toy prototypes unless and until we have the tools to heavily automate massive replacement of the existing software base (with formal methods).
At the end of the day, operating systems and computers are extraordinarily complex systems. They can't be reliably predicted.
Computers are carefully engineered to be close to deterministic. Obviously concurrent systems aren't deterministic by default, but much of the art of parallel programming deals with various mechanisms to guarantee specific required types of determinism. What you're saying is just 'humans can't always judge the effect of small changes to huge systems by intuition/experience'. Well, no shit, Sherlock. Guess what: it isn't practical to write Linux entirely by typing machine code into a hex editor either - yet we do build massively complex software systems, because compilers (and to a lesser extent, IDEs) are major 'force multipliers' for the amount of complexity a programmer can handle. Meanwhile, most programmers use no verification tools beyond maybe static type checking (though ignoring even this minimal level of verification seems to be trendy these days).
I've seen simple, one-line changes result in kernel panics and hugely complex changes go in without any problems. You just can't guarantee anything.
Have you tried creating a VDM model of your system and asking the prover that question?
And it's not because software developers are incompetent or don't know how to test, it's just that achieving exhaustive coverage in your testing is impossible.
And here we see the crux of the problem. All of your 'oh, you can't be sure' whining is based on the concept of testing as the sole quality assurance mechanism. This is exactly what DW was slamming: programmers may use maths to make their systems work, but almost all of them don't use maths to prove their systems will work, the way engineers prove that a bridge won't collapse. This is understandable for a whole host of reasons, but claiming that it is impossible simply because your definition of 'quality assurance' is locked in to 'functional testing alone' is just ignorance.
There haven't been very many, and there's a good reason. For all the difficulties of CPU design and hardware logic design, the end product is a whole hell of a lot more predictable in its behavior. It's a lot easier to develop a comprehensive test suite for a CPU
Again you are zeroing in on testing when that is not what I was talking about at all. Again, I imagine this is because you have no real experience with formal verification. CPUs are formally verified to a certain extent, both because the relatively straightforward and formal spec (the instruction set definition, mostly) makes it easier and because CPU design teams have relatively high budgets and skill levels.
After all, on a CPU, making sure that a given set of inputs produces the correct set of outputs pretty much covers the entire set of functionality.
It is impossible to exhaustively test even a CPU for all possible instruction sequences that create unique internal states. With 64-bit operations, it isn't even possible to exhaustively test individual functional units. What you can do is prove the design works on paper, then go on to prove that a specific set of functional tests will achieve full coverage of the physical gates - no one wants to leave this to guesswork, and no manager is going to be satisfied with 'pretty much' when billions of dollars are on the line. Unfortunately, the currently practical techniques still can't cover everything, and with the extreme pressure to get new CPU designs out on schedule, mistakes are still made.
Making sure all your methods return the proper values is only a small part of how the system works. You have to deal with things like threading issues, asynchronous behavior, blocking, possibly weird error conditions in the system generating exceptions, etc.
All of which can be dealt with by temporal logic constructs in principle. As I said, the problem is just that the current tools are lagging what people actually want to develop by quite a way.
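
To make that concrete with a deliberately tiny example (a sketch of mine, not a temporal logic spec itself): the property 'increments are never lost' is exactly the kind of always/never statement temporal logic expresses, and a model checker proves it over every possible interleaving instead of hoping a test happens to hit the bad one. Here the mutex is what makes the property hold; remove it and the final assertion can fail, but only on some runs - which is why testing alone is so weak for concurrency.

#include <assert.h>
#include <pthread.h>

static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;
static long counter = 0;

/* The invariant to verify: at most one thread is inside the critical
   section at any instant, so no increment is ever lost. */
static void *worker(void *arg)
{
    (void)arg;
    for (int i = 0; i < 100000; i++) {
        pthread_mutex_lock(&lock);
        counter++;
        pthread_mutex_unlock(&lock);
    }
    return NULL;
}

int main(void)
{
    pthread_t a, b;
    pthread_create(&a, NULL, worker, NULL);
    pthread_create(&b, NULL, worker, NULL);
    pthread_join(a, NULL);
    pthread_join(b, NULL);
    assert(counter == 200000);  /* holds only because of the mutex */
    return 0;
}
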
I shudder to think how many components in any operating system would just puke themselves if malloc() suddenly started failing.
This is simply an argument for ensuring that malloc() always works. Though >90% of the time automatic memory management is more appropriate (yes, kernels are an exception) and not coincidentally more amenable to formal verification.
The problem is that so much software development is done by amateurs. Anyone with a computer can pretty much dive right in. And the trend in the industry is to make development more easily accessible, which encourages more amateurs to enter.
Well, yes, granted. I think a major part of that was that the microcomputer revolution of the 80s moved too fast (much faster than even the rise of electronic engineering) for a professional structure similar to other professions to take hold. The dot-com boom certainly didn't help matters either.
And there is no trend in mechanical engineering that has the effect of making entry into the field easier.
Not yet, but I'm surprised at the progress currently being made in '3D printing' systems, well before nanoscale machinery can become relevant.