AMD's Chip Architect Brad Burgess on Mobile Computing's Future


AMD just revealed the two chip architectures that'll be at the core of the company's future products: Bulldozer for high-end computing, and Bobcat for lower-power demands. We spoke to Bobcat's chief architect about the chips, the market, and how he sees the future.

Bobcat is, by AMD's own press info, destined for "low power markets" like netbooks and nettops, and is "cloud-client optimized." Very much in the mold of Intel's Atom, with which it's designed to compete directly, the CPU has a "small die area" and is built to be flexible enough to slot into numerous devices. But Brad Burgess was careful to note that "Bobcat is also critical to our strategy of combining traditional CPU cores, like Bobcat, and graphics processing capability onto a single chip." That strategy is currently in vogue across chip design: "That's where improving the user experience really comes into play because graphics processors can do so much more than gaming; they are actually great at general purpose processing as well." Bobcat is also the foundation of a whole line of "Fusion" chips, which marry CPUs and GPUs into a single unit, and that makes it a huge part of the company's future.

We asked Brad what sets his company's chips apart from the competition, and while he was reluctant to compare figures directly, Burgess noted Bobcat is "unique because we have focused on power management not just in terms of implementation tricks or clock gating, but rather we have put significant effort into reducing power all the way down to fundamental microarchitecture decisions." Its reliance on out-of-order execution is also key, providing "one of the biggest performance advantages" over in-order processors, in which instructions are simply processed in the order they arrive, so a single computationally "expensive" operation can hold up a whole string of later, smaller ones.
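
To see what that means in practice, here's a minimal C sketch (our illustration, not AMD's code or anything Burgess described) of the kind of instruction stream where out-of-order execution pays off: the strided load is the "expensive" operation, and the second accumulation is independent work an out-of-order core can run ahead on instead of stalling behind it.

    /* Illustrative sketch: why out-of-order execution helps.
     * On an in-order core, a slow (cache-missing) load of a[] stalls everything
     * behind it; an out-of-order core keeps computing sum_b, which doesn't depend
     * on that load, while the miss is outstanding. */
    #include <stdio.h>
    #include <stdlib.h>

    #define N (1 << 20)

    int main(void) {
        long *a = malloc(N * sizeof *a);
        long *b = malloc(N * sizeof *b);
        if (!a || !b) return 1;
        for (long i = 0; i < N; i++) { a[i] = i; b[i] = i * 2; }

        long sum_a = 0, sum_b = 0;
        for (long i = 0; i < N; i++) {
            sum_a += a[(i * 797) % N];  /* strided access: likely cache misses, the "expensive" work */
            sum_b += b[i];              /* independent, cheap work that need not wait */
        }
        printf("%ld %ld\n", sum_a, sum_b);
        free(a);
        free(b);
        return 0;
    }

An in-order design processes both additions strictly in program order, so every miss on the first line delays the second; an out-of-order core tracks the data dependences and keeps the pipeline busy.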

But the future of computing isn't as clear-cut as it has been in recent decades, when Moore's law saw ever faster, more powerful, more power-hungry chips cram more and more transistors into every square millimeter of silicon. Two new ideas in particular are changing the market: desktop supercomputers and servers powered by smaller, less energy-hungry chips, and multifunction "heterogeneous" chips that act as both central processor and graphics processor.

"In terms of delivering a 'supercomputer on your desktop,'" Burgess highlighted that it's "actually possible today with the incredible processing capability of graphics technology, such as our ATI Radeon HD 5000 series graphics cards. With multiple Teraflops of compute power, we are seeing customers running applications on their desktops that would have been classified as supercomputers not too long ago." But he agreed that the new trend of using chips like Bobcat in servers and powerful many-cored desktop machines was an "interesting concept." Could his chip be used like this? "It could," he conceded, "but we will watch the market very closely before making any decision."

Meanwhile another revolution, headlined by things like Microsoft's new Xbox processor that puts CPU and GPU on a single chip, is under way. AMD is in a prime position to capitalize on it, since it has expertise in designing both types of processor (unlike Intel, as pretty much anyone who's ever had to rely on Intel's "integrated graphics" solutions will attest). AMD is in fact in a "unique position" to do this, but the real trick, Burgess thinks, is to integrate "this technology in a way where the people who program applications don't see any difference," with the platform intelligently mapping specific tasks to the right resources all by itself. That's interesting, given how much emphasis is currently placed on clever code (for number-crunching tasks like encryption) crafted specifically to exploit the powerful chips on GPU cards.
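
To make the "right resources" idea concrete, here's a hypothetical OpenCL host-code fragment in C (our sketch, not AMD's software or anything Burgess outlined): today the programmer explicitly chooses the GPU or the CPU before dispatching work, which is exactly the decision Burgess wants the platform to make invisibly. It assumes an OpenCL runtime is installed and the program is linked with -lOpenCL.

    /* Hypothetical sketch of hand-mapping a task to a device with OpenCL. */
    #include <stdio.h>
    #include <CL/cl.h>

    int main(void) {
        cl_platform_id platform;
        cl_device_id device;
        cl_uint count = 0;

        if (clGetPlatformIDs(1, &platform, &count) != CL_SUCCESS || count == 0) {
            fprintf(stderr, "no OpenCL platform found\n");
            return 1;
        }

        /* Prefer the GPU for data-parallel number crunching; fall back to the CPU. */
        cl_int err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL);
        const char *target = "GPU";
        if (err != CL_SUCCESS) {
            err = clGetDeviceIDs(platform, CL_DEVICE_TYPE_CPU, 1, &device, NULL);
            target = "CPU";
        }
        if (err != CL_SUCCESS) {
            fprintf(stderr, "no usable OpenCL device\n");
            return 1;
        }

        char name[256];
        clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof name, name, NULL);
        printf("dispatching kernel to %s: %s\n", target, name);
        /* ...build the program, create a command queue, and enqueue the kernel here... */
        return 0;
    }

The point of the Fusion vision is that this kind of device-picking boilerplate, and the decision behind it, disappears into the hardware and runtime rather than sitting in every application.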

So what about the distant future? Some noise is being made about optical chip interconnects, and you can just about argue that the fiber optics being floated for USB 3.0 and Intel's Light Peak are a first step toward replacing silicon with "light chips." But Burgess, who should know, thinks we shouldn't rule out CMOS and silicon just yet, thanks to the raw "economics of manufacturing." The existing "workhorse of the industry is plain-old CMOS," and there will ultimately "be more mainstream applications for optical computing, but for the near-term, the economics of semiconductor manufacturing will keep us on a more traditional path."

To keep up with this news follow me, Kit Eaton, on Twitter.
