I don't think I'd want to be competing against Intel and AMD in the chipset market when, in both cases, the
competitor defines and manufactures the CPU and, to a large degree, the CPU socket / interface / chipset specification.
It seems like you'd always be playing catch-up with whatever proprietary interface IP the CPU vendors
use to talk to their own CPUs.
On the upside, though, once CPUs commonly get integrated memory controllers, at least the northbridge won't
have to perform that particular function. Still, one has to deal with high-bandwidth links to the CPU built on fairly
rapidly changing technologies like HyperTransport, QuickPath, and whatever they come up with next year.
IMHO there's nothing at all wrong with them looking to make excellent "chipset" type chips or IP cores for functions
like GPUs, DisplayPort, DVI, Gigabit Ethernet, USB 2.0, USB 3.0, HD Audio, SATA, PCIe, etc., and either selling the
chips as next-generation southbridges or as next-generation northbridges (sans memory controller, but with
QPI or HT or whatever). IMHO, once you cut the memory controller out of the NB, you're almost at the point where
you can start merging NB and SB into one chip, especially if you save on pin count thanks to PATA -> SATA,
the elimination of parallel ports, multi-lane SATA or SAS technology, and perhaps the elimination of PS/2
keyboard/mouse and integrated RS232 ports.
Now I'd ask myself, though: do I really want to compete in the IP core or SB market for functions like Gb Ethernet,
USB 2.0, USB 3.0, HD Audio, SATA, etc.? Clearly Intel, AMD, VIA, etc. will have their own USB 2.0/3.0 solutions,
and at best you'd probably achieve parity in features/performance/stability at a commodity cost.
Clearly Intel/AMD/VIA will have their own SATA solutions, and there are already excellent IP cores and chips from
Silicon Image, etc. Clearly Marvell, Intel, Realtek, etc. dominate the Gb Ethernet landscape,
so there's little opportunity to innovate or command a premium there.
Firewire? Seems to be getting less popular, and even so, Intel has that well implemented at a commodity cost,
as do other companies.
HD Audio? Everyone and their brother has their own audio chip. I frankly think it would be a *triviality* to put some
really NICE analog and digital I/O interfaces onto a GPU chip (or a modern 8-core CPU, for that matter), write some
nice embedded code for 3D audio, and have a good 3D graphics AND 3D surround sound system complete with
all kinds of spatial filters, real-time equalization, echo control, etc. I can't even see why the poorly supported
junk Creative Labs sells is still allowed to turn a profit or hold market share compared to the trivially easy
3D surround sound you can do on the CPU or GPU along with a few nice DACs and digital interfaces. So certainly
there's room for innovation and novel solutions here, but, again, the built-in Realtek AC'97/HD Audio codecs
found on lots of motherboards are MORE than good enough for 99% of the market, so innovation is really just
for audiophiles, gamers, or to advance the state of the art of the next-generation PC media standards.
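To illustrate just how computationally cheap basic 3D positioning is, here's a minimal sketch (hypothetical plain Python, no real audio API; the head radius and 1/r attenuation model are simplifying assumptions) of delay-and-gain spatialization for a single mono source, the kind of per-source work a GPU or a spare CPU core could do for hundreds of sources at once:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air
HEAD_RADIUS = 0.09       # m, rough human head half-width (assumption)
SAMPLE_RATE = 48000      # Hz

def spatialize(mono, source_x):
    """Crude 3D panning: per-ear 1/r distance attenuation plus an
    interaural delay in whole samples, for a source offset source_x
    metres to the listener's right (negative = left), 1 m in front."""
    left_ear, right_ear = (-HEAD_RADIUS, 0.0), (HEAD_RADIUS, 0.0)
    src = (source_x, 1.0)
    def dist(ear):
        return math.hypot(src[0] - ear[0], src[1] - ear[1])
    dl, dr = dist(left_ear), dist(right_ear)
    gains = (1.0 / dl, 1.0 / dr)                       # closer ear is louder
    delays = (int(dl / SPEED_OF_SOUND * SAMPLE_RATE),  # closer ear hears it
              int(dr / SPEED_OF_SOUND * SAMPLE_RATE))  # sooner
    out = [[0.0] * (len(mono) + max(delays)) for _ in range(2)]
    for ch in range(2):
        for i, s in enumerate(mono):
            out[ch][i + delays[ch]] += s * gains[ch]
    return out  # [left channel, right channel]

# A 100-sample burst, half a metre to the listener's right:
left, right = spatialize([1.0] * 100, source_x=0.5)
```

Real spatial audio adds head-related transfer functions, reverb, and occlusion, but the core per-source math stays this simple, which is the point: it's embarrassingly parallel across sources.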
So, really where's the profit and innovation in making Yet Another PC Chipset if you're not also making the CPU
and motherboard?
IMHO they should pursue the whole integrated CPU + GPU + chipset thing as an opportunity for the next
3+ years, when process technology will allow monolithic integration of something comparable to a
Nehalem quad-core CPU + HD4870 GPU + all the peripherals you could want onto one or two chips that
NVIDIA could design and produce alone.
I'd say they'd be stronger if they just bought / merged with VIA, possibly AMD, and maybe someone like
Marvell or Silicon Image plus a big motherboard OEM to get the best of current CPU + GPU + chipset + networking +
peripheral technology in one house; then don't sell the chips, but make motherboards and PCs directly as a wholly
proprietary integration of all their best products, and thus keep more of the profit in-house.
Competing with Intel in the CPU space of the current x86 legacy architecture is probably a losing proposition, but
with that kind of added IP and expertise, and with the whole backward-compatible x86 architecture about to become
irrelevant in the face of massively multi-core and object-oriented CPUs, a good consortium like the above could have
a real chance to totally redefine PC architecture and CPU architecture as well. This is especially true when it comes to
laptops, ultra-portables, and embedded converged media devices. If they planned to dominate
the space of mobile information devices that do things like real-time 3D displays (not 3D on a 2D monitor!), projection
video, holographic displays, eInk-type devices, and convergence devices to replace your camera / PDA / cell phone /
laptop / desktop PC / GPS / MP3 player / wristwatch, then they'd really be aiming at the right target.
Who the hell is going to want a desktop PC in a few years when your laptop or even pocket-sized uber-PDA is going
to be more powerful than four of the current-generation high-end desktops? Future GPUs will be much less about
fitting into PC-case peripheral slots and much more about sitting in a pocket-sized device to drive the 3D holo-display,
or at the very least being built into the monitor and integrating strongly with its drive electronics.
Your massively parallel GPU technology could just as well be aiding software-defined radio in a next-generation
3G/4G cell phone, doing DSP for miniature low-cost ultrasound / CT / MRI devices, or acting as a content scanner for
the gigabit-level internet connections we might get to our houses with "fiber to the curb" type technologies and dense
mesh wireless / optical networks, etc. How are you going to do "Photoshop" type image processing with your new
stereoscopic 3D 64-megapixel-per-frame camera? Oh, wait, it won't BE a STILL camera; it'll be a multi-spectral
real-time HD 3D video stream at gigabit-per-second rates. Better start thinking about encoding / decoding /
compressing / information-processing THAT, NVIDIA.
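To put a rough number on that back-of-the-envelope claim (illustrative arithmetic only; the 30 fps frame rate, 24-bit RGB depth, and 100:1 compression ratio are all assumptions, not specs from anywhere):

```python
# Raw data rate of a hypothetical stereoscopic 64-megapixel video stream.
pixels_per_frame = 64e6    # per eye
eyes = 2                   # stereoscopic: two views
fps = 30                   # assumed frame rate
bits_per_pixel = 24        # assumed 8-bit RGB

raw_gbps = pixels_per_frame * eyes * fps * bits_per_pixel / 1e9
print(f"uncompressed: {raw_gbps:.1f} Gbit/s")

# Even an aggressive ~100:1 video codec leaves roughly a gigabit
# per second, which is where the "gigabit per second rates" land.
compressed_gbps = raw_gbps / 100
print(f"at 100:1 compression: {compressed_gbps:.2f} Gbit/s")
```

So the uncompressed stream is on the order of 90+ Gbit/s; real-time compression by two orders of magnitude is exactly the kind of massively parallel DSP workload the paragraph above is talking about.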
We should worry less about slots, chips, sockets, and backwards compatibility, and more about the convergence
of communications, interfaces, display technology, signal acquisition technology (2D/3D/4D imaging, and 3D/4D sound
*capture* as well as processing / playback), broadband wireless, personal area networks, grid / cluster /
personal-area-network computing, etc. If a company with that much IT IP is still thinking about sockets / slots /
2D capture -> 3D spatial -> 2D DISPLAY type GPUs and motherboard chipsets for the MERE low level of
technology and peripheral density of today, they're really missing the boat for tomorrow's convergence and
ultra-broadband ultra-mesh / cloud type technologies.