Yeah yeah, that article is basically irrelevant.
Yes, most hardware now works on 64-bit operating systems. Yes, you can use all your RAM. Yes, you gain the ability to run 64-bit applications. But what the author fails to recognize is that having hardware WORK does not necessarily mean it works WELL. That's particularly true of Creative's drivers. ATi and nVidia weren't much better, but they have since improved a lot.
There are also very few 64-bit applications that the average user would need or want. Why would I run 64-bit Internet Explorer when there's no Flash plugin for it?
See - aside from access to more RAM, there's no compelling reason for the average user or gamer to run a 64-bit operating system. I consider myself a fairly advanced user, particularly when it comes to video encoding. I did see some performance gains in that department, but getting them meant jumping through several hoops - and it ended up breaking my workflow.
Show me one game that benefits from 64-bit. Crysis doesn't count, since it's too far ahead of its time to be taken into consideration for ANYTHING. Far Cry? Who cares... Source? Sort of, but not really. Those are the only ones I know of...
Photoshop (and the other Adobe products) aren't 64-bit yet, and according to Adobe's engineers they're limited more by memory bandwidth than by memory capacity anyway. I'm still scratching my head over that one, to be perfectly honest.
That rules out a big chunk of the content-creation work that could potentially be faster on 64-bit. What other system-intensive stuff does the average user do besides gaming, content creation, and media encoding?
I'm not sure, but I welcome input.
32-bit XP FTW - IMHO. 64-bit is here to stay, but it's going to be a long, drawn-out transition. We absolutely will need more than 2 GB of RAM per process and ~3.5 GB for the whole system, and sooner than a lot of people think. But it's not today.
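For what it's worth, here's the back-of-the-envelope math behind those numbers, written out as a little C program. Purely illustrative - the 512 MB reserved for device mappings is just an assumed figure; the real amount depends on your motherboard, video card, and BIOS.

/* Rough sketch of where the 32-bit limits come from.
 * The MMIO reservation below is an assumption; actual values vary by system. */
#include <stdio.h>
#include <stdint.h>

int main(void)
{
    /* 32-bit pointers can address at most 2^32 bytes = 4 GiB. */
    const uint64_t addr_space = 1ULL << 32;

    /* PCI/AGP/graphics apertures get mapped into that same 4 GiB,
       so the RAM actually visible to a 32-bit OS is typically less. */
    const uint64_t mmio_reserved = 512ULL << 20;   /* ~0.5 GiB, varies */
    const uint64_t usable_ram    = addr_space - mmio_reserved;

    /* 32-bit Windows splits each process's 4 GiB virtual space
       2 GiB user / 2 GiB kernel by default. */
    const uint64_t per_process = addr_space / 2;

    printf("Virtual address space : %llu MiB\n", (unsigned long long)(addr_space >> 20));
    printf("Usable physical RAM   : ~%llu MiB\n", (unsigned long long)(usable_ram >> 20));
    printf("User space per process: %llu MiB\n", (unsigned long long)(per_process >> 20));
    return 0;
}

That works out to roughly 3.5 GB usable system-wide and 2 GB per process, which is exactly the ceiling I'm talking about.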
~MiSfit