Most existing program optimization techniques are based on the code or the behavior of the target program alone, so they can be characterized as self-aware. However, self-aware optimization does not consider interaction between different programs, a factor that often has a significant impact on performance. Interaction may be positive (collaborative) or negative (interfering). Collaborative interaction happens in message-passing or shared-memory parallel programs, when a group of tasks works toward a common goal. Interference happens in resource sharing, when one task's resource usage is limited by that of other tasks. Parallel programs interact in both ways, while independent programs interact only through resource sharing.
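The following is a minimal sketch of interference through resource sharing, not an example from the dissertation: two independent tasks that share no data still slow each other down by competing for the shared last-level cache. The array size, pass count, and cache-size assumption are illustrative.

```c
/* Two independent scans interacting only through the shared cache.
 * Compile with: cc -O2 -pthread interfere.c */
#include <pthread.h>
#include <stdio.h>
#include <time.h>

#define N (8L * 1024 * 1024)   /* 64 MB per array: assumed to exceed
                                  the shared last-level cache */
static double a[N], b[N];

/* Each task repeatedly scans its own private array. */
static void *scan(void *arg) {
    double *x = arg;
    volatile double sum = 0.0;          /* keep the loop live */
    for (int pass = 0; pass < 10; pass++)
        for (long i = 0; i < N; i++)
            sum += x[i];
    return NULL;
}

static double elapsed(struct timespec t0, struct timespec t1) {
    return (t1.tv_sec - t0.tv_sec) + (t1.tv_nsec - t0.tv_nsec) / 1e9;
}

int main(void) {
    pthread_t t1, t2;
    struct timespec s, e;
    for (long i = 0; i < N; i++) { a[i] = 1.0; b[i] = 2.0; }

    /* Solo run: one scan has the whole cache to itself. */
    clock_gettime(CLOCK_MONOTONIC, &s);
    scan(a);
    clock_gettime(CLOCK_MONOTONIC, &e);
    printf("solo:   %.2fs\n", elapsed(s, e));

    /* Co-run: the two scans evict each other's cache lines, so the
     * co-run typically takes longer than a solo scan even though the
     * threads execute on different cores. */
    clock_gettime(CLOCK_MONOTONIC, &s);
    pthread_create(&t1, NULL, scan, a);
    pthread_create(&t2, NULL, scan, b);
    pthread_join(t1, NULL);
    pthread_join(t2, NULL);
    clock_gettime(CLOCK_MONOTONIC, &e);
    printf("co-run: %.2fs\n", elapsed(s, e));
    return 0;
}
```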
Peer-aware optimization improves a task by considering the code or the behavior of other tasks. This dissertation studies three such techniques: defensive tiling to guard against contention among independent programs, delta send-receive for efficient message passing, and shared cache modeling and performance scaling for symmetric tasks. The results show that peer analysis can be applied systematically at both the code and the behavior level, and that peer-aware optimization is profitable and increasingly important on today's multicore systems.
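To make the first technique concrete, here is a sketch of the idea behind defensive tiling: instead of sizing loop tiles to the whole shared cache, size them to the fraction of the cache a task can expect when co-run with peers. The cache size, the even-split budget, and the tiled loop are illustrative assumptions, not the dissertation's actual policy.

```c
#include <stddef.h>

#define LLC_BYTES (8UL * 1024 * 1024)  /* assumed shared cache size */

/* Pick the largest square tile of doubles that fits in this task's
 * share of the cache, assuming `peers` co-running tasks split it
 * evenly (a simplifying assumption for illustration). */
static size_t defensive_tile(size_t peers) {
    size_t budget = LLC_BYTES / (peers > 0 ? peers : 1);
    size_t edge = 1;
    while ((edge + 1) * (edge + 1) * sizeof(double) <= budget)
        edge++;
    return edge;
}

/* Tiled in-place scaling of an n-by-n matrix using the defensive
 * tile size: with peers present, smaller tiles keep the working set
 * within the cache share this task is likely to retain. */
static void scale_tiled(double *m, size_t n, double s, size_t peers) {
    size_t T = defensive_tile(peers);
    for (size_t ii = 0; ii < n; ii += T)
        for (size_t jj = 0; jj < n; jj += T)
            for (size_t i = ii; i < n && i < ii + T; i++)
                for (size_t j = jj; j < n && j < jj + T; j++)
                    m[i * n + j] *= s;
}
```

The trade-off is that a solo run pays a small cost for the smaller tiles, while a co-run avoids the larger cost of tiles that no longer fit once peers take their share of the cache.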