Mac OS X 10.6 Snow Leopard: Reading from the Intel Cookbook
By Josh Bancroft (Intel) on June 10, 2008 at 10:08 pm
The Apple WWDC 2008 keynote has come and gone. My wild speculation about the next version of OS X, 10.6, code-named "Snow Leopard" (and affectionately christened "Snot Leopard" thanks to a typo during my WWDC liveblogging ;-) ) - that it would be announced as the operating system for a "netbook" or Mobile Internet Device powered by the Intel Atom processor - didn't come true. In fact, besides a brief reference to an after-lunch WWDC session (under NDA), Steve Jobs didn't say much about Snow Leopard at all. Since then, a few more details have become available, and Apple has put up a page with the (limited) info:
http://www.apple.com/macosx/snowleopard/
Much has been written about the more controversial questions - are they really not adding any new features? Are they going to drop PPC support? Is it going to be 64-bit only (and if so, what about the early Intel Core Duo chips that aren't fully 64-bit capable)? I'll leave all that to the people who know what they're talking about. But what strikes me as interesting is that the few fundamental technologies they HAVE discussed look like a mirror image of the technologies that Intel - and specifically our group, Intel Software Network, Intel's developer community - has been promoting and evangelizing to software developers for quite a while now.
First, I have to cling to my hope and dream that one day, Apple will release something along the lines of a "netbook", like the Asus Eee PC or the MSI Wind. Something like the MacBook Air, but much smaller. Apple's throwing fuel on that particular speculative fire with statements like this:
Snow Leopard dramatically reduces the footprint of Mac OS X, making it even more efficient for users, and giving them back valuable hard drive space for their music and photos.
Having recently paved and done a clean install of Mac OS X Leopard on my MacBook Pro, I can tell you that the operating system itself only takes up about 5.5 GB of hard drive space. Hard drives are growing in capacity and dropping in price at an astounding rate (did you ever dream you'd be able to pick up a terabyte of disk space for a couple hundred bucks?). So why would Apple care about reducing that 5-6 GB footprint, when drives are huge and cheap? Think SSD. Solid State Disks. Like the ones in the netbook devices. The Asus Eee PC I got to play with a while ago had a 4 GB SSD. Current models have 12 or 20GB. Fast, efficient, and no moving parts. Perfect for mobile devices. But still really expensive - you can get a 64GB SSD in a MacBook Air instead of the much slower 80GB hard drive, but it will cost you a cool $999 for the upgrade. SSDs are coming down in price, but they're still going to be expensive in any really large sizes for a while. So, if Apple was thinking of doing a Mobile Internet Device or netbook, it makes sense to squeeze OS X down as much as they can, to make, say, an affordable 16GB SSD a viable option that won't get hogged by just the OS.
Next, there's the new "Grand Central" technology, that focuses on taking full advantage of multicore processors:
"Grand Central," a new set of technologies built into Snow Leopard, brings unrivaled support for multicore systems to Mac OS X. More cores, not faster clock speeds, drive performance increases in today's processors. Grand Central takes full advantage by making all of Mac OS X multicore aware and optimizing it for allocating tasks across multiple cores and processors. Grand Central also makes it much easier for developers to create programs that squeeze every last drop of power from multicore systems.
Emphasis mine. Intel Software Network has been banging on the multicore drum for quite a while now, ever since it became clear that the future of processor performance was more and more cores working in parallel, rather than ever-increasing clock speeds. In fact, we have a whole multicore developer community (hosted by my awesome colleague, Aaron Tersteeg) dedicated to multicore programming resources, tools, learning, and access to the Intel experts who literally wrote the book on this stuff. I'm sure as Snow Leopard gets closer, you Mac developers will (hopefully) be seeing a lot more details from both Apple and Intel on how to make your apps sing on many-core processors. It's the biggest fundamental shift in computing since, say, the x86 architecture became the standard. I can't wait to see this gain broader acceptance and implementation.
Finally, Apple teases us with this little tidbit on the vaguely-named OpenCL (Open Computing Language), apparently aimed at taking advantage of upcoming super-powerful GPUs for other computing tasks:
Another powerful Snow Leopard technology, OpenCL (Open Computing Language), makes it possible for developers to efficiently tap the vast gigaflops of computing power currently locked up in the graphics processing unit (GPU). With GPUs approaching processing speeds of a trillion operations per second, they're capable of considerably more than just drawing pictures. OpenCL takes that power and redirects it for general-purpose computing.
They don't name any one company's products or technologies, but it's well known that Nvidia and Intel are both working on many-core GPUs that support "GPGPU" - General Purpose (Computing) on the GPU. And again, my group, Intel Software Network, has a whole community (this one just freshly minted!) dedicated to what we call Visual Computing. Steve Pitzel hosts this community (Steve has more interesting stories than ANYONE I know - ask him some time!), and the super swanky page design came from our resident web development wizard, Kevin Pirkl. Intel has a little upcoming product called Larrabee that we think is going to really turn the notion of what a GPU is for on its head. Have you noticed how Nvidia has been getting very aggressive towards Intel, some might say even attacking? Yeah, it's because of Larrabee. And knowing Apple, they'll be right there, ready to take advantage of all of the advances in the visual computing world. Competition is a good thing.
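Since Apple hasn't released any OpenCL specs yet, here's a hedged sketch of the general GPGPU programming model as I understand it: you write a small "kernel" function describing the work for a single data element, and the runtime launches one instance of it per element across the GPU's many cores. The plain-C loop below just simulates that launch serially (the kernel and function names are my own invention, not anything from Apple):

```c
#include <stddef.h>

/* A GPGPU-style "kernel": the work for ONE data element, identified
 * by its global index. On a real GPU, thousands of instances of this
 * would run in parallel; this is only an illustration of the model. */
static void saxpy_kernel(size_t gid, float a, const float *x, float *y) {
    y[gid] = a * x[gid] + y[gid];
}

/* Stand-in for the runtime: "launch" one kernel instance per index.
 * A GPU runtime would distribute these across its cores; here we
 * simply iterate to show the execution model. */
void launch_kernel(size_t global_size, float a, const float *x, float *y) {
    for (size_t gid = 0; gid < global_size; gid++)
        saxpy_kernel(gid, a, x, y);
}
```

For example, calling `launch_kernel(4, 2.0f, x, y)` on `x = {1, 2, 3, 4}` and `y = {0, 0, 0, 0}` leaves `y = {2, 4, 6, 8}`. The appeal is that the kernel has no loop and no thread management at all - the parallelism lives entirely in the runtime, which is what makes this style such a natural fit for many-core GPUs.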
Anyway, that's it for today's dose of idle speculation, and listening to me play armchair industry analyst. I have to say it feels pretty cool to work for a company (Intel) that has such influence over the world of technology. I get to see SO MANY COOL THINGS in the course of my job, I feel spoiled. And I try to share as much with you as I can - like tomorrow, I'll be filming demos at the Research@Intel event at the Computer History Museum in Mountain View. From the previews I've seen, some of this stuff is just freaky sci-fi cool. I can't wait to see it, shoot it, and get it out to you. As usual, I'd love to hear your thoughts, even if all you have to say is how wrong you think I am. Leave it in a comment!