
ArsTechnica reviews OS X 10.4

Posted: 2005-04-28 02:31pm
by phongn
As usual, Ars reviews OS X. This will likely be the best and most in-depth review of OS X 10.4 you'll find anywhere, touching on everything from APIs to metadata to the user interface.

Posted: 2005-04-28 03:52pm
by Melchior
Interesting.
I hope they fix the window appearance problem quickly; it seems annoying.

Posted: 2005-04-28 06:42pm
by Praxis
This doesn't have to do with Ars Technica, but it does have to do with Tiger, so I figured I might as well post here instead of making a new thread.

http://apple.slashdot.org/article.pl?si ... =179&tid=3

"Online retailer Tiger Direct has reportedly sued Apple over the use of the Tiger name just one day before the Mac maker is scheduled to roll-out its next-generation Mac OS X 10.4 'Tiger' operating system, according to an article at AppleInsider. TigerDirect, which owns trademarks on the names Tiger, TigerDirect and TigerSoftware, has requested an injunction that could prevent Friday's launch of the Tiger OS. Tiger Direct is also seeking damages and legal fees. 'Apple Computer has created and launched a nationwide media blitz led by Steven Jobs, overwhelming the computer world with a sea of Tiger references,' Tiger Direct's attorneys wrote in the lawsuit." While the suit may have some merit, it is odd for them to wait until now to try and halt such a heralded product.
So TigerDirect is suing Apple over Mac OS X Tiger.

These guys have a copyright on the word Tiger now, do they? Better warn the local zoos...

Posted: 2005-04-28 09:10pm
by phongn
No, but they do have a trademark, and Apple Computer is in the business of selling computers.

Re: ArsTechnica reviews OS X 10.4

Posted: 2005-04-29 05:14pm
by Durandal
phongn wrote:As usual, Ars reviews OS X. This will likely be the best and most in-depth review of OS X 10.4 you'll find anywhere, touching on everything from APIs to metadata to the user interface.
I slogged through all 20 pages of it. The little movie demoing kqueue made it all worthwhile though. :)
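
For anyone who didn't watch it: kqueue is the BSD kernel event interface that OS X inherited, and the movie shows it reporting file changes the instant they happen. A minimal sketch of watching one file (my own toy example, not code from the review; the path is made up):

Code:

#include <sys/types.h>
#include <sys/event.h>
#include <sys/time.h>
#include <fcntl.h>
#include <stdio.h>

int main(void)
{
    int kq = kqueue();                           /* kernel event queue */
    int fd = open("/tmp/watched.txt", O_RDONLY); /* file to monitor */

    /* Register interest in writes to, or deletion of, this file. */
    struct kevent change, event;
    EV_SET(&change, fd, EVFILT_VNODE, EV_ADD | EV_CLEAR,
           NOTE_WRITE | NOTE_DELETE, 0, NULL);

    /* Block until the kernel reports that something happened. */
    if (kevent(kq, &change, 1, &event, 1, NULL) > 0)
        printf("file changed (fflags = 0x%x)\n", (unsigned)event.fflags);

    return 0;
}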

Posted: 2005-04-29 06:52pm
by MKSheppard
Despite its NeXTSTEP roots, Mac OS X is still a very young operating system. Most of the technologies that make it interesting and unique are actually brand new: Quartz, Core Audio, IOKit, Core Foundation. The hold-overs from NeXT and classic Mac OS have also evolved substantially: QuickTime, Carbon, Cocoa.
What's up with Apple naming everything so cheesily?

Posted: 2005-04-29 06:54pm
by Spanky The Dolphin
Because Apple is full of dorks. :P

Posted: 2005-04-29 07:21pm
by MKSheppard
After reading this, my impression is that the real big leap comes from making all of the pretty graphical effects Apple put in to make OS X look cool hardware accelerated.

In short, they made everything OpenGL and moved all the hard work of drawing and moving those transparent windows and menus off to the graphics card, because previous OS X releases were slow as shit graphically.

Meanwhile, most people turn off the pretty transparent backgrounds in Windows to speed up their computers....

Posted: 2005-04-29 07:49pm
by phongn
MKSheppard wrote:After reading this, my impression is that the real big leap comes from making all of the pretty graphical effects Apple put in to make OS X look cool hardware accelerated.

In short, they made everything OpenGL and moved all the hard work of drawing and moving those transparent windows and menus off to the graphics card, because previous OS X releases were slow as shit graphically.

Meanwhile, most people turn off the pretty transparent backgrounds in Windows to speed up their computers....
The other problem is that even with many of the fancy graphical features turned off, the OS X GUI was pretty slow. The classic methods of hardware-accelerated 2D desktops simply could not be used in the OS X window manager. At first that meant the entire GUI was drawn in software; since 10.0, Apple has incrementally moved portions onto the GPU to speed things up.

Longhorn will have the same problem, but by the time it comes out it will enjoy a broad variety of GPUs that can hardware-accelerate it, unlike OS X at its inception.

Posted: 2005-04-29 07:58pm
by MKSheppard
Okay, from talking with Phong:

1.) On OS X, each window is essentially a PDF object, which is composited with the other windows to create the full-screen image, which is quite difficult to do.

2.) OS X 10.4 essentially uses pixel shaders to draw the window; OS X 10.3 introduced the ability to do the compositing stage in hardware via OpenGL (rough sketch below).
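
(To make point 2 concrete: per Phong's description, the compositing pass amounts to drawing one textured, alpha-blended quad per window, back to front. This is only an illustrative sketch, not Apple's actual code; the window's already-rendered contents are assumed to live in a texture.)

Code:

#include <OpenGL/gl.h>
#include <OpenGL/glext.h>

/* Illustrative sketch only -- not Apple's code. Each window's contents
   live in a texture (its backing store); compositing the desktop is
   then just drawing one alpha-blended quad per window, back to front. */
void composite_window(GLuint window_tex, float x, float y,
                      float w, float h, float alpha)
{
    glEnable(GL_TEXTURE_RECTANGLE_EXT);
    glBindTexture(GL_TEXTURE_RECTANGLE_EXT, window_tex);
    glEnable(GL_BLEND);
    glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
    glColor4f(1.0f, 1.0f, 1.0f, alpha);  /* whole-window transparency */

    glBegin(GL_QUADS);                   /* rect textures use pixel coords */
    glTexCoord2f(0, 0); glVertex2f(x,     y);
    glTexCoord2f(w, 0); glVertex2f(x + w, y);
    glTexCoord2f(w, h); glVertex2f(x + w, y + h);
    glTexCoord2f(0, h); glVertex2f(x,     y + h);
    glEnd();
}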


I've got a question for the Mac fanatics: WHY, WHY did your beloved company decide to do everything like a German engineer trying to avoid being sent to the Ostfront in WWII?

Pixel shaders being used to draw a simple 2D window? :wtf: Not even MICROSOFT is that insane.

Posted: 2005-04-29 08:00pm
by Praxis
MKSheppard wrote:Okay, from talking with Phong:

1.) On OS X, each window is essentially a PDF object, which is composited with the other windows to create the full-screen image, which is quite difficult to do.

2.) OS X 10.4 essentially uses pixel shaders to draw the window; OS X 10.3 introduced the ability to do the compositing stage in hardware via OpenGL.

I've got a question for the Mac fanatics: WHY, WHY did your beloved company decide to do everything like a German engineer trying to avoid being sent to the Ostfront in WWII?

Pixel shaders being used to draw a simple 2D window? :wtf: Not even MICROSOFT is that insane.
Since the graphics card is doing all the work rather than the processor, who really cares? There's no notable effect on performance, and it looks pretty...

As for Microsoft not being insane, I seem to recall Microsoft doing that in Longhorn, though I may be wrong :lol:

Posted: 2005-04-29 08:01pm
by phongn
MKSheppard wrote:I've got a question for the Mac fanatics: WHY, WHY did your beloved company decide to do everything like a German engineer trying to avoid being sent to the Ostfront in WWII?

Pixel shaders being used to draw a simple 2D window? :wtf: Not even MICROSOFT is that insane.
They will be ;)

Posted: 2005-04-29 08:03pm
by MKSheppard
phongn wrote:They will be ;)
Microsoft has always been very good about allowing users "classic Windows"; even their stupid-arsed "Active Desktop" shit could be turned off, along with menu fading...

Posted: 2005-04-30 12:51am
by Xon
MKSheppard wrote:After reading this, my impression is that the real big leap comes from making all of the pretty graphical effects Apple put in to make OS X look cool hardware accelerated.
There are some massive changes under the covers. Say what you will about user-land Windows, but the Windows NT kernel is fairly sophisticated. Apple is only just getting features into the Mac OS X kernel that have been in the NT kernel since day 1.


Meanwhile, most people turn off the pretty transparent backgrounds in Windows to speed up their computers....
If you have a reasonably modern graphics card and proper drivers, that stuff is hardware accelerated :P

Posted: 2005-05-01 08:16pm
by Durandal
MKSheppard wrote:I've got a question for the Mac fanatics: WHY, WHY did your beloved company decide to do everything like a German engineer trying to avoid being sent to the Ostfront in WWII?

Pixel shaders being used to draw a simple 2D window? :wtf: Not even MICROSOFT is that insane.
You know, most people who actually know something about the subject consider GPU utilization in drawing the UI to be the preferable way to do things, since it relieves the CPU of these tasks and allows for a great range of real-time effects for free.

By the way, coolest feature ever in Tiger.

Image

This works for any application.

Posted: 2005-05-01 09:18pm
by Pu-239
Hey! My name's John... :evil:

Posted: 2005-05-01 10:17pm
by Durandal
Pu-239 wrote:Hey! My name's John... :evil:
Here's the source of that particular image. :)

Posted: 2005-05-01 10:27pm
by MKSheppard
Durandal wrote:You know, most people who actually know something about the subject consider GPU utilization in drawing the UI to be the preferable way to do things, since it relieves the CPU of these tasks and allows for a great range of real-time effects for free.
Actually, lots of stuff is already 2D hardware accelerated; I just question the need to use a 3D pixel shader to draw the damn window on screen. It's Germanic in its inefficiency.

EDIT: And why did it take Apple until the 4th iteration of OS X to add hardware acceleration for all the pretty fancy GUI effects?

They knew that all this stuff imposed a severe performance penalty, yet put it in anyway back in about 2000.

Posted: 2005-05-01 11:43pm
by phongn
MKSheppard wrote:Actually, lots of stuff is already 2D hardware accelerated; I just question the need to use a 3D pixel shader to draw the damn window on screen. It's Germanic in its inefficiency.
Apple wanted a fairly sophisticated windowing system that could do all sorts of tricks. The price for that was complexity. One side-effect is that it is trivial to produce PDFs from OS X since that is (more or less) what your window is.
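(And to illustrate just how trivial: the very same Quartz drawing calls an app uses to paint its window can be pointed at a PDF file instead. A rough sketch against the Core Graphics C API; the path and rectangle here are my own made-up values:)

Code:

#include <ApplicationServices/ApplicationServices.h>

/* Rough sketch: the same CGContext drawing code that paints a window
   can render straight into a PDF file. Path and rects are made up. */
void draw_to_pdf(void)
{
    CGRect page = CGRectMake(0, 0, 612, 792);   /* US Letter, in points */
    CFURLRef url = CFURLCreateWithFileSystemPath(
        NULL, CFSTR("/tmp/window.pdf"), kCFURLPOSIXPathStyle, false);

    CGContextRef pdf = CGPDFContextCreateWithURL(url, &page, NULL);
    CGContextBeginPage(pdf, &page);

    /* Any normal Quartz drawing goes here... */
    CGContextSetRGBFillColor(pdf, 0.2, 0.4, 1.0, 1.0);
    CGContextFillRect(pdf, CGRectMake(72, 72, 200, 100));

    CGContextEndPage(pdf);
    CGContextRelease(pdf);   /* flushes and closes the PDF */
    CFRelease(url);
}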
EDIT: And why did it take Apple until the 4th iteration of OS X to add hardware acceleration for all the pretty fancy GUI effects?

They knew that all this stuff imposed a severe performance penalty, yet put it in anyway back in about 2000.
GPUs capable of hardware accelerating that section of the GUI did not arrive until DX9-compliant video cards appeared on the market. The window-composition layer has been accelerated for some time, though.

Posted: 2005-05-02 12:24am
by Durandal
MKSheppard wrote:Actually, lots of stuff is already 2D hardware accelerated; I just question the need to use a 3D pixel shader to draw the damn window on screen. It's Germanic in its inefficiency.
How is it inefficient? The advent of high-resolution displays (200 dpi and up) will make pixel-pushing power an absolute necessity for user interfaces, especially when the vaunted resolution-independent GUIs start showing up. GPUs have monstrous bandwidth and excel at parallelized tasks, like the ones often involved in image manipulation. It's incredibly efficient.

Without hardware acceleration, the CPU has to do all the work to composite screen elements, which results in massive slowdowns for even trivial things like alpha blending. The GPU does these things much, much faster. Inefficient would be letting the massive processing power of today's GPUs sit there idly during 95% of the computer's usage.
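
(To put a number on "all the work": the standard "over" blend below has to run for every pixel of every translucent window, every frame. A naive software version, purely for illustration:)

Code:

#include <stdint.h>

/* Naive software "over" compositing of one ARGB pixel, assuming an
   opaque destination (illustration only). The CPU must do this for
   every pixel of every translucent window, every frame; a GPU does
   the same math massively in parallel. */
static inline uint32_t blend_over(uint32_t src, uint32_t dst)
{
    uint32_t sa = (src >> 24) & 0xff;       /* source alpha */
    uint32_t result = 0xff000000;           /* opaque output */
    for (int shift = 0; shift <= 16; shift += 8) {  /* B, G, R channels */
        uint32_t sc = (src >> shift) & 0xff;
        uint32_t dc = (dst >> shift) & 0xff;
        uint32_t out = (sc * sa + dc * (255 - sa)) / 255;
        result |= out << shift;
    }
    return result;
}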

The current 2D acceleration that Windows uses is blindingly fast for simple pixel-blitting, but it's been around for a very long time, and the benefits new DirectX 9-class GPUs bring to the table simply don't help the current Windows GUI all that much. That's why Microsoft is moving the Longhorn and XP GUI over to Avalon, which includes a DirectX 9 equivalent of Quartz 2D Extreme.

The point is that if you view every window as a flat, textured polygon, you can do more with that window. You can do anything you want to it with a fragment program and in real time. Viewing the window as a texture makes things like Exposé feasible and fast.
EDIT: And why did it take Apple until the 4th iteration of OS X to add hardware acceleration for all the pretty fancy GUI effects?
Apple's hardware acceleration makes extensive use of the ARB_fragment_program extension, which must be supported at the hardware level on the GPU in order for these improvements to work. And those GPUs are all DirectX 9-class (i.e. nVidia's GeForce 5200 and up and ATi's Radeon 9600 and up).
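
For the curious, an ARB fragment program is a short assembly-like routine the GPU runs once per pixel. Here's a trivial one that samples a window's texture and modulates it by a color (say, to fade the whole window), loaded through the standard ARB entry points. A sketch, not Apple's compositor code:

Code:

#include <OpenGL/gl.h>
#include <OpenGL/glext.h>
#include <string.h>

/* Trivial ARB fragment program: sample the window texture and modulate
   it by the current color (e.g. to fade a window). Sketch only. */
static const char *fade_fp =
    "!!ARBfp1.0\n"
    "TEMP texel;\n"
    "TEX texel, fragment.texcoord[0], texture[0], RECT;\n"
    "MUL result.color, texel, fragment.color;\n"
    "END\n";

GLuint load_fade_program(void)
{
    GLuint prog;
    glGenProgramsARB(1, &prog);
    glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
    glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,
                       GL_PROGRAM_FORMAT_ASCII_ARB,
                       (GLsizei)strlen(fade_fp), fade_fp);
    /* A real loader would check glGetError() and
       GL_PROGRAM_ERROR_POSITION_ARB here. */
    return prog;
}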
They knew that all this stuff imposed a severe performance penalty, yet put it in anyway back in about 2000.
They bet on the long term. Because of their strategy, any app natively developed using CoreGraphics gets most of the acceleration for free with no work on the app developer's part.

Apple chose long-term benefits and took the growing pains associated with it, and now it's paying off in a big way. Their developers have had access to the underlying APIs for 4 years now and have started using them already. Trust me, Microsoft would love to be in Apple's position with Avalon, but they won't be because all their big-time app developers are using XP's drawing system and will likely not switch over to Avalon for a very long time. Meanwhile, Apple has been able to deprecate QuickDraw without upsetting too many people.

Posted: 2005-05-02 12:54am
by Durandal
More coolness:

Image

:mrgreen:

Posted: 2005-05-02 01:30am
by Slartibartfast
MKSheppard wrote:Actually, lots of stuff is already 2D hardware accelerated; I just question the need to use a 3D pixel shader to draw the damn window on screen. It's Germanic in its inefficiency.
Because video card manufacturers aren't working nearly as hard at improving their cards' 2D performance as they are at 3D performance.

That would be my guess. Also, if the 3D chipset isn't doing any 3D rendering at the moment, it's just dead weight; might as well use it.

EDIT: Also because, even if they lack depth, windows ARE placed in a 3D environment (one in front of the other), so it makes sense to speed up the process by using 3D hardware instead of silly blit methods to repaint the "part of the screen that has changed".

Posted: 2005-05-02 02:25am
by Praxis
phongn wrote:The window-composition layer has been accelerated for some time, though.
Since OS X 10.2 Jaguar, in fact.

Posted: 2005-05-02 06:00am
by Xon
Durandal wrote:The current 2D acceleration that Windows uses is blindingly fast for simple pixel-blitting, but it's been around for a very long time, and the benefits new DirectX 9-class GPUs bring to the table simply don't help the current Windows GUI all that much. That's why Microsoft is moving the Longhorn and XP GUI over to Avalon, which includes a DirectX 9 equivalent of Quartz 2D Extreme.

The point is that if you view every window as a flat, textured polygon, you can do more with that window. You can do anything you want to it with a fragment program and in real time. Viewing the window as a texture makes things like Exposé feasible and fast.
That's the biggest difference between Quartz 2D Extreme and Avalon. As it stands, with Quartz, the compositing engine treats each window as a flat, textured polygon.

With Avalon, each window is a scene graph. Avalon can rescale/reskin/move elements in a window on the fly without any help from the application.

A trivial example would be some managed code injected into an Avalon markup file that causes the GUI elements to run away from the cursor, or something stupid like that.
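
(Very roughly, "scene graph" just means the compositor keeps a tree of the window's elements instead of one flat picture. A toy illustration, with every name invented:)

Code:

/* Toy illustration only -- all names invented. With a scene graph the
   compositor sees individual elements, not one flat window texture. */
typedef struct Node {
    float x, y;               /* position relative to parent */
    float scale;              /* uniform scale, for simplicity */
    const char *kind;         /* "button", "text", "panel", ... */
    struct Node **children;
    int child_count;
} Node;

/* Children inherit the parent transform, so scaling one node is enough
   to rescale everything under it -- without the application redrawing
   a single pixel. */
void scale_subtree(Node *n, float factor)
{
    n->scale *= factor;
}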

Posted: 2005-05-02 10:51am
by Praxis
Xon wrote:
Durandal wrote:The current 2D acceleration that Windows uses is blindingly fast for simple pixel-blitting, but it's been around for a very long time, and the benefits new DirectX 9-class GPUs bring to the table simply don't help the current Windows GUI all that much. That's why Microsoft is moving the Longhorn and XP GUI over to Avalon, which includes a DirectX 9 equivalent of Quartz 2D Extreme.

The point is that if you view every window as a flat, textured polygon, you can do more with that window. You can do anything you want to it with a fragment program and in real time. Viewing the window as a texture makes things like Exposé feasible and fast.
That's the biggest difference between Quartz 2D Extreme and Avalon. As it stands, with Quartz, the compositing engine treats each window as a flat, textured polygon.

With Avalon, each window is a scene graph. Avalon can rescale/reskin/move elements in a window on the fly without any help from the application.

A trivial example would be some managed code injected into an Avalon markup file that causes the GUI elements to run away from the cursor, or something stupid like that.
What about Core Image? (Note: asking a question, not arguing. I knew Quartz Extreme wasn't as good as Avalon, but that was obvious, since QE was released, oh, three years ago...)