A place where I write about my personal and professional life, sharing experiences and an occasional rant about anything and everything that comes to my head. Thanks for visiting!

Industry suddenly realises multi-cored chips are useless unless used

STANFORD UNIVERSITY HAS joined forces with IBM, AMD, Sun Microsystems, Nvidia, Hewlett-Packard, and Intel to create innovative software that will let chips better process several tasks simultaneously.

According to the New York Times, the partnership between the University and the six rival computer and chip makers will be formally announced this Friday, and the project will be dubbed the “Pervasive Parallelism Lab”.

In mid-March of this year, Chipzilla and the Vole announced that they'd be funneling a combined $20 million into building specialised labs at the University of California at Berkeley and the University of Illinois at Urbana-Champaign for parallel computing research, which would effectively tackle the same problem.

The massive amounts of funding and effort being channeled by the big players into such research programmes show just how worried the software industry actually is about future microprocessors with 8, 16 or more cores on a single chip. The concern is that software will not be able to exploit the new hardware: without parallel-aware programming, applications don't profit from the added cores, and in some cases can even run slower. This means that customers could just decide that it's not worth their while to upgrade their systems.
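The worry above is usually summed up by Amdahl's law: a program's speedup is capped by whatever fraction of it stays sequential, no matter how many cores you throw at it. A minimal sketch (the function name and the 50%-parallel figure are illustrative, not from the article):

```python
def speedup(parallel_fraction, cores):
    """Amdahl's law: theoretical speedup when a fraction of the
    program runs in parallel across the given number of cores."""
    return 1 / ((1 - parallel_fraction) + parallel_fraction / cores)

# A program that is only 50% parallelisable barely benefits from
# going beyond a handful of cores:
for cores in (2, 8, 16):
    print(cores, "cores ->", round(speedup(0.5, cores), 2), "x")
```

Even with 16 cores, a half-sequential program tops out below a 2x speedup, which is why the article's fear of upgrades that "don't profit" is plausible.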

Clock speed is no longer as important as performance per watt in computing, and hiking performance is now the domain of multicores, with most corporate server microprocessors and gaming machines now packing around eight cores.

To help software take better advantage of the increased number of cores, the competing teams of boffins are going to have to experiment with new programming languages and tweaks in the hardware, as well as going back to the drawing board on things like operating systems and compilers (which translate programming gibberish into commands a computer can actually understand).
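The kind of restructuring the researchers are after can be glimpsed even in today's languages: independent units of work get fanned out across a pool of workers instead of running in one sequential loop. A rough sketch (the `work` and `parallel_map` names are mine; threads are used here only to show the shape of the pattern, since for genuinely CPU-bound Python work a process pool would be needed to sidestep the interpreter's global lock):

```python
from concurrent.futures import ThreadPoolExecutor

def work(n):
    """A stand-in for one independent unit of work."""
    return sum(i * i for i in range(n))

def parallel_map(func, inputs, workers=8):
    """Fan independent work items out across a pool of workers,
    preserving the order of results."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(func, inputs))

print(parallel_map(work, [10_000] * 8))
```

The hard part, and the point of the labs, is that most real applications are not neat lists of independent items like this; untangling their dependencies is exactly where new languages and compilers are supposed to help.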

With all the effort going into it, it definitely seems that sequential programming will soon be a thing of the distant past, whilst parallel programming's day is about to dawn.
