nice article on parallel processor limitations

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

General Zod
Never Shuts Up
Posts: 29211
Joined: 2003-11-18 03:08pm
Location: The Clearance Rack

Re: nice article on parallel processor limitations

Post by General Zod »

Akkleptos wrote: This is rich. An old saying goes: if it's not broken, why fix it? If a vital program worked fine under XP but fails to do so under Vista, wouldn't it suck to be the employee who suggested the whole company's migration to Vista?
It's not Vista's fault that employee was a stupid dipshit who didn't bother doing any kind of research beforehand to make sure their oh-so vital program worked. But hey, feel free to keep blaming the software for user idiocy.
Besides, Vista is marketed as an OS that will help you run things more efficiently. In a case like the one above, it just fails big time. Maybe it's not the usual case, but for most sensible managers it would still be a definite deal-breaker. You may suggest upgrading to a Vista-compatible version of the program. More spending. And with software for special niches, that could mean waiting months, if not years. But what you already had worked! It's hard not to see why this situation is bad.
Once again, employee stupidity does not mean Vista is inefficient. But feel free to continue pretending otherwise. The idea that it should cater to every single fucking piece of software that was developed under a previous version and you're somehow forced to upgrade is pure idiocy. I think you're just trolling at this point since you can't do anything but repeat the same fucking nonsense over and over.
My dear General, you're right. Nevertheless, if you could find 20 random people on the street who could google and install lighter, cheaper or even free alternatives, I'd be surprised. If you expand that to the whole planet, where computer literacy is well below the standards of, say, the US...
Jesus Christ you're a fucking retard. If you honestly think that because an operating system isn't usable by idiotic bogans who have no clue how to type things into a url makes it a bad operating system you should just maybe consider why everybody needs access to a computer. (Neverminding that virtually any highschooler should know how to do these things).
"It's you Americans. There's something about nipples you hate. If this were Germany, we'd be romping around naked on the stage here."
Mad
Jedi Council Member
Posts: 1923
Joined: 2002-07-04 01:32am
Location: North Carolina, USA

Re: nice article on parallel processor limitations

Post by Mad »

Akkleptos wrote: The key term here is "necessary". It would be bloat, however, if it uses more resources to do something that other programs can do with less. Or it's just bad programming.
How do you know what's necessary and what's not?

And necessary is a relative term in this case, because implementations can use more memory than strictly necessary for the sake of speed or stability. Would you rather it use as little memory as possible, and therefore be less bloated by your definition, but run more slowly and have more bugs and crashes?
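That speed-for-memory trade-off can be sketched in a few lines of Python (a hypothetical illustration, not taken from any program discussed here): caching results costs extra memory but avoids recomputation.

```python
from functools import lru_cache

def fib_slow(n):
    # Minimal memory: recomputes every subproblem, exponential time.
    return n if n < 2 else fib_slow(n - 1) + fib_slow(n - 2)

@lru_cache(maxsize=None)  # trades memory for speed: stores every result
def fib_cached(n):
    return n if n < 2 else fib_cached(n - 1) + fib_cached(n - 2)

print(fib_cached(30))  # returns instantly; fib_slow(30) takes noticeably longer
```

To an end-user watching a process monitor, the cached version simply "uses more memory"; whether that counts as bloat depends entirely on whether the speed was worth it.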

Cross-platform applications may have higher requirements because it gets a lot trickier to keep everything optimized for each major platform.

Yes, software bloat is bad. But not everything that causes increased memory usage for no apparent reason (to the end-user) is bloat. It may appear that way, but the assumption could be completely wrong. There are a lot of trade-offs.

Office 2007 is doing more than Office 2000 and so will need more memory. But how much more? What's the cut-off between "this is bloated" and "this is not bloat"?

I'm just saying that it's not something an end-user can easily quantify.
Also, when it comes to writing code in 200 MB when it could have been done in 20, well, that's not good, IMHO.
How do you know it can be done with only 20 MB of memory as opposed to 200 MB? Do you know how much of the 200 MB is code, data, or artwork?
I've read no blogs. I've seen expensive programs we use at work fail to run under Vista when they work just fine under XP. I've googled for fixes and workarounds only to find that it's a Vista compatibility issue and that no solution exists at the moment. In these instances, the "upgrade" killed productivity. And, no, I'm not saying Vista is necessarily bad. Just that I see it as an unnecessary upgrade.
Vista certainly has a number of problems (I use it at work and regularly experience some of them), but I view some of its compatibility issues as necessary for the health of the industry. Earlier versions of Windows allowed developers to be lazy about security and assume all users were running with admin rights (bad security practice). Microsoft should have fixed that long ago, when there would have been fewer problems.
Back to bloat, here's a good example of apps that do pretty much the same essential job yet have a significant disparity in resource consumption:
They do the "same essential job," but also have a disparity in features. As mentioned by another, iTunes has a built-in shop. It's also cross-platform (probably optimized much more for OSX than Windows, though that's purely a guess as I don't use OSX).
Finally, all I'm saying is not features=bloat (though it's nice to be able to opt out of the ones one doesn't really need), but rather bad, inefficient and redundant programming=bloat.
How do you know what "bad, inefficient and redundant programming" is? Yeah, it's bad, but do you have the source code to be able to tell that a particular application is guilty of it?
Later...
Starglider
Miles Dyson
Posts: 8709
Joined: 2007-04-05 09:44pm
Location: Isle of Dogs

Re: nice article on parallel processor limitations

Post by Starglider »

Xon wrote:
Starglider wrote: A lot of that will be the hard disk; modern hard disks have a transfer rate about 100 times faster than mid-90s ones.
In comparison to ram and CPU, storage simply hasn't improved as fast.
Actually, yes it has, in two metrics: transfer rate and storage space per dollar. For hard drives these have scaled at least as well as RAM, if not better. The one thing that has not improved much is average latency, because it's limited (for a hard drive) by RPM. To be fair, latency for RAM has also been improving much more slowly than bandwidth; this is the primary reason processor cache sizes have ballooned in recent years.
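A back-of-the-envelope calculation (illustrative figures, not benchmarks of any particular drive) shows why latency rather than transfer rate dominates small random reads on a hard disk:

```python
# Rough, hypothetical figures; real drives vary widely.
seek_latency_s = 0.008    # ~8 ms average seek + rotational delay
bandwidth_bps = 100e6     # ~100 MB/s sequential transfer rate
reads = 10_000            # small random reads, e.g. loading an application
read_size = 4 * 1024      # 4 KB each

seek_time = reads * seek_latency_s
transfer_time = reads * read_size / bandwidth_bps
print(f"seek: {seek_time:.1f} s, transfer: {transfer_time:.2f} s")
# seek: 80.0 s, transfer: 0.41 s -- the RPM-limited seeks dominate
```

An SSD cuts the latency term by orders of magnitude while leaving the transfer term roughly comparable, which is exactly why it helps this workload so much.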
SSD has massive improvements in small random reads, which is what an OS does most of the time running applications.
I know, that's why I have one in my laptop. It does help noticeably with response times.
A lot of home users never have to perform any maintenance on their systems, if they never uninstall anything, don't install junk and don't catch viruses.
If that were the case, service desks wouldn't be such a nightmare to man.
Regardless of the fact that plenty of people do constantly nag the support desk, there are also plenty of people who don't (even though in some cases they probably should, e.g. machine is virus infested).
The largest issue for computer maintenance arises when someone changes something, either hardware or software; then you have potential issues.
Yes. Hopefully modularisation and intelligent system management will eventually eliminate these problems from a user point of view. I am encouraged by the progress that's been made with Linux package systems over the last few years. It used to be (circa 2000) that installing anything nontrivial on a Linux system was a multi-hour exercise in frustration. Now 99% of the time installation and removal is one click despite Linux's tendency towards spectacularly complex dependency chains.
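Those one-click installs rest on automatic dependency resolution, the core of which is a topological sort over the dependency graph. A sketch in Python, using a made-up package graph (the package names are purely hypothetical):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical dependency graph: package -> packages it depends on.
deps = {
    "app":    {"libgui", "libnet"},
    "libgui": {"libc"},
    "libnet": {"libc", "libssl"},
    "libssl": {"libc"},
    "libc":   set(),
}

# A valid install order puts every package after its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)  # libc first, app last; order among independent libs may vary
```

Real package managers layer version constraints and conflict handling on top of this ordering, which is where most of the historical pain came from.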