Brown CS News

Maurice Herlihy's Transactional Memory Research Put into Action with Intel's New Haswell Architecture


Maurice Herlihy "We always said that parallel machines with more than one processor were going to be important someday, but nobody knew when it would happen." Credit: Mike Cohea/Brown University

In 1993, Maurice Herlihy, along with Eliot Moss of the University of Massachusetts Amherst, invented transactional memory in the paper "Transactional Memory: Architectural Support for Lock-Free Data Structures."

Transactional memory is a promising technique designed to make it easier to write reliable multithreaded programs. It uses a transactional model in which complex operations execute concurrently, in isolation from each other, and either complete entirely or are rolled back as if they had never started, a model that developers already know from database programming.
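To make the model concrete, here is a minimal sketch of what transactional code can look like. It uses GCC's experimental transactional-memory language extension (compiled with -fgnu-tm); the account type and transfer function are illustrative assumptions, not code from the paper.

    /* Illustrative sketch: account_t and transfer() are assumed for
     * this example. Compile with gcc -fgnu-tm. */
    typedef struct { long balance; } account_t;

    void transfer(account_t *from, account_t *to, long amount)
    {
        /* The block below executes as one atomic transaction: other
         * threads see either all of its effects or none of them. If
         * two transfers conflict, one is rolled back and retried as
         * if it had never started. No locks, so no deadlock to
         * reason about. */
        __transaction_atomic {
            from->balance -= amount;
            to->balance   += amount;
        }
    }

A lock-based version of the same function would force the programmer to acquire locks on both accounts in a globally consistent order to avoid deadlock; the transactional version leaves that bookkeeping to the system.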

“At the time, we thought (transactional memory) was really cool, but it was hard to attract attention to it — it was this specialized corner,” Herlihy said. “We weren’t necessarily geniuses ahead of our time, because it solved a problem that didn’t exist yet.”

For a decade this work attracted little attention. In 2004, however, it began to gain traction, and the paper now has more than 1,300 citations. Intel recently announced that its Haswell architecture, due to ship in 2013, will include hardware support for transactional memory. Until now, transactional memory has been a technique best described as experimental: the theoretical gains, a simpler programming model that allows much greater concurrency than lock-based systems, are well known, but software-based implementations have so far been too slow to deliver them. With Intel building the feature into a mainstream, mass-market processor, those gains can finally be realized in practice, and transactional memory will start being used for real in millions of computers. For parallel programmers, that's an exciting prospect indeed.
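On Haswell, that hardware support arrives as Intel's Transactional Synchronization Extensions (TSX), whose Restricted Transactional Memory (RTM) instructions are exposed to C programmers as compiler intrinsics. The sketch below shows the usual pattern, assuming the same illustrative transfer function as above plus a simple fallback spinlock of our own (hardware transactions can always abort, so correct code needs a software path too). Compile with gcc -mrtm.

    #include <immintrin.h>   /* _xbegin, _xend, _xabort, _XBEGIN_STARTED */
    #include <stdatomic.h>

    typedef struct { long balance; } account_t;

    /* Fallback lock for when the hardware transaction aborts
     * (conflict, capacity limit, interrupt, ...). */
    static atomic_int fallback_lock;

    void transfer(account_t *from, account_t *to, long amount)
    {
        if (_xbegin() == _XBEGIN_STARTED) {
            /* Read the lock word inside the transaction so a thread
             * on the fallback path aborts us instead of racing us. */
            if (atomic_load(&fallback_lock))
                _xabort(0xff);
            from->balance -= amount;   /* speculative writes... */
            to->balance   += amount;
            _xend();                   /* ...committed atomically here */
        } else {
            /* Transaction aborted: redo the work under the lock. */
            while (atomic_exchange(&fallback_lock, 1))
                ;                      /* spin until acquired */
            from->balance -= amount;
            to->balance   += amount;
            atomic_store(&fallback_lock, 0);
        }
    }

When the transaction commits, the two updates complete with no lock traffic at all; only on abort does the code fall back to conventional synchronization.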
