Department of Computer Science Colloquium
Thursday May 2nd, 2002 at 4:15pm
Upson Hall B17
SUDS: Thread Level Speculation with Minimal Hardware Support
Matthew Frank
http://www.cag.lcs.mit.edu/~mfrank/
Massachusetts Institute of Technology
General-purpose processor architectures devote substantial chip area to hardware
structures for dynamic prediction, speculation, and instruction scheduling.
Digital signal processors and multimedia processors, on the other hand, devote
this area to additional functional units to provide better peak performance.
This works because the important applications in these domains are relatively
easy to parallelize. I will argue that compiler technology makes it possible to
parallelize a broader class of applications than is commonly believed.
To demonstrate this, I will describe SUDS (Software Un-Do System), a compiler and runtime system that automatically finds and exploits thread-level parallelism in pointer-intensive programs with unstructured control flow. The SUDS compiler performs two tasks. First, it exposes thread-level parallelism in do-across loops. I will demonstrate that this requires splitting cyclic control dependence regions, as in the sketch below. The global analysis this requires is therefore best performed at compile time rather than at runtime.
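To make the first task concrete, consider a hypothetical do-across loop in C (my illustration, not an example from the talk). The pointer-chasing recurrence p = p->next forms the cyclic control dependence region and must remain sequential; each iteration's body touches only the current node, so once the compiler splits the recurrence off, the bodies can run as speculative threads.

    #include <stddef.h>

    struct node { int value; struct node *next; };

    /* Hypothetical do-across loop.  The recurrence p = p->next is the
     * loop-carried dependence and executes sequentially; the body of each
     * iteration is independent and is a candidate speculative thread. */
    void scale_list(struct node *head, int factor)
    {
        for (struct node *p = head; p != NULL; p = p->next) {
            /* per-node work with no cross-iteration dependence */
            p->value = p->value * factor + 1;
        }
    }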
The compiler's second task is to identify memory renaming opportunities. The SUDS runtime system speculatively parallelizes across any memory dependences that the compiler cannot analyze, checks whether the parallel execution produced a result consistent with sequential semantics, and then either commits or rolls back the speculative execution path. Because the compiler eliminates most of the renaming work, the runtime speculation system is particularly efficient. As a result, SUDS needs only minimal hardware support for thread-level speculation.
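As a rough illustration of the commit-or-rollback idea, the sketch below shows a minimal log-based scheme for undoing speculative stores. It is a generic sketch under my own assumptions, not the actual SUDS runtime: each speculative store records the old value, and the log is either discarded on commit or replayed in reverse on rollback.

    #include <stdint.h>
    #include <stdio.h>

    #define LOG_CAPACITY 1024

    struct undo_entry { uint32_t *addr; uint32_t old_value; };

    static struct undo_entry undo_log[LOG_CAPACITY];
    static int log_top = 0;

    /* Speculative store: remember the old value before overwriting it. */
    static int spec_store(uint32_t *addr, uint32_t value)
    {
        if (log_top == LOG_CAPACITY)
            return -1;                 /* log full: caller must abort the thread */
        undo_log[log_top].addr = addr;
        undo_log[log_top].old_value = *addr;
        log_top++;
        *addr = value;
        return 0;
    }

    /* Commit: the speculative execution was consistent, so discard the log. */
    static void spec_commit(void) { log_top = 0; }

    /* Rollback: restore old values in reverse order, undoing the thread. */
    static void spec_rollback(void)
    {
        while (log_top > 0) {
            log_top--;
            *undo_log[log_top].addr = undo_log[log_top].old_value;
        }
    }

    int main(void)
    {
        uint32_t x = 1, y = 2;
        spec_store(&x, 42);
        spec_commit();                 /* speculation succeeded: keep x == 42 */
        spec_store(&y, 99);
        spec_rollback();               /* dependence violation: restore y == 2 */
        printf("x = %u, y = %u\n", (unsigned)x, (unsigned)y);
        return 0;
    }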