Intel exec: Programming for multicore chips a challenge

Adding cores could create challenges for programmers writing code that enables applications to work effectively with multicore chips.

Adding more cores is desirable to meet growing computing demands, but it could create more challenges for programmers writing code that enables applications to work effectively with multicore chips.

As the technology develops at a rapid pace, developers face the challenge of adapting to programming for multicore systems, said Doug Davis, vice-president of the digital enterprise group at Intel, during a speech Tuesday at the Multicore Expo in Santa Clara, California. Programmers will have to move from writing for single-core processors to writing for multiple cores, while future-proofing that code so it continues to scale as additional cores are added to a system.

Programming models can be designed to take advantage of hyperthreading and the parallel processing capabilities of multiple cores, boosting application performance in a cost-effective way, Davis said. Intel is working with universities and funding programs to train programmers to develop applications that solve those problems, he said.
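As a rough illustration of what that shift involves (a hypothetical sketch, not code Davis presented), the fragment below asks the operating system at run time how many hardware threads are available and spreads a CPU-bound loop across that many workers, so the same program can use additional cores without being rewritten.

```cpp
// Hypothetical example: size the worker count from the hardware at run time,
// so the same code keeps scaling if more cores are added later.
// Build (assumed toolchain): g++ -std=c++14 -pthread scale_sum.cpp
#include <cstdint>
#include <iostream>
#include <numeric>
#include <thread>
#include <vector>

int main() {
    // Ask how many hardware threads (cores times hyperthreads) are available.
    unsigned workers = std::thread::hardware_concurrency();
    if (workers == 0) workers = 1;  // the call may return 0 if it cannot tell

    const std::size_t n = 100'000'000;
    std::vector<std::uint64_t> partial(workers, 0);
    std::vector<std::thread> pool;

    // Split one CPU-bound loop into 'workers' independent chunks.
    for (unsigned w = 0; w < workers; ++w) {
        pool.emplace_back([&partial, w, workers, n] {
            std::uint64_t sum = 0;
            for (std::size_t i = w; i < n; i += workers) sum += i;
            partial[w] = sum;  // each thread writes its own slot, so no race
        });
    }
    for (auto& t : pool) t.join();

    const std::uint64_t total =
        std::accumulate(partial.begin(), partial.end(), std::uint64_t{0});
    std::cout << "threads: " << workers << "  sum: " << total << '\n';
}
```

Querying the core count at run time rather than hard-coding it is the simple version of the future-proofing Davis describes: a binary tuned by hand for today's dual-core machine would leave the extra cores of a later system idle.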

Intel, along with Microsoft, has donated $US20 million to the University of California at Berkeley and the University of Illinois at Urbana-Champaign to train students and conduct research on multicore programming and parallel computing. The centers will tackle the challenge of programming multicore processors to carry out more than one set of program instructions at a time, an approach known as parallel computing.

Beyond future-proofing code for parallelism, adapting legacy applications to work in new computing environments that take advantage of multicore processing is a challenge coders face, Davis said. Writing code from scratch is the ideal option, but it can be expensive.

"The world we live in today has millions of lines of legacy code ... how do we take legacy of software and take advantage of legacy technology?" Coders could need to deliver what's best for their system, Davis said.

Every major processor architecture has evolved quickly under the pace set by Moore's Law, the observation that the number of transistors on a chip roughly doubles every two years, but the challenge now is to deliver that performance within a defined power envelope. Power consumption is driving multicore chip development, and programmers need to write code that works within that power envelope, Davis said.

Adding cores to a chip is a more power-efficient way to boost performance than cranking up the clock frequency of a single-core processor, Davis said. The extra cores raise performance while keeping power consumption in check.
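That argument follows from the standard first-order model of CMOS dynamic power, P ≈ C × V² × f (switched capacitance times supply voltage squared times clock frequency); the figures here are illustrative, not Intel's. Pushing a single core from 2GHz to 3GHz typically also means raising the supply voltage by roughly 10 to 15 per cent, so power grows by around 1.5 × 1.25, or close to 90 per cent, for at best a 50 per cent gain in throughput. Adding a second identical core at the original clock roughly doubles power but can come close to doubling throughput on well-parallelised code, which is why performance per watt favours the extra core.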

In 2007, about 40 per cent of desktops, laptops and servers shipped with multicore processors, and by 2011 about 90 per cent of PCs shipping will be multicore systems, Davis said. Almost all Windows Vista PCs shipping today already have multicore chips, he said.

Intel is also working on Polaris, an 80-core research chip capable of teraflop-level performance.

"We're not only talking about terabit computing, but the terabyte sets [of data] we can manage." Davis said. Users are consuming and storing tremendous amounts of data now, and in a few years, the amount of data should reach zettabytes, Davis said.

The next "killer" application for multicore computing could be tools that enable the real-time collection, mining and analysis of data, Davis said. For example, military personnel using wearable multicore computers are able to simulate, analyse and synthesize data in real time to show how a situation will unfold. Doing so is viable and doesn't create risk for military personnel, Davis said.

"These types of applications have taken weeks to do ... now these types of applications are literally running in minutes," Davis said.

As cores are added, the performance boost may also enable new applications, Davis said. The oil and gas industry will demand one petaflop of computing capacity in 2010, compared to 400 teraflops in 2008, to cost-effectively collect seismic data, compare it to historical data and analyse the results. Oil and gas explorers can already collect and analyse data far faster than they could in the past, Davis said.


Agam Shah

IDG News Service