Guaranteed Optimization: Proving Nullspace Properties of Compilers
| dc.contributor.author | Veldhuizen, Todd; Lumsdaine, Andrew | |
| dc.date.accessioned | 2025-11-12T00:09:42Z | |
| dc.date.available | 2025-11-12T00:09:42Z | |
| dc.date.issued | 2002-07 | |
| dc.description.abstract | Writing performance-critical programs can be frustrating because optimizing compilers for imperative languages tend to be unpredictable. For a subset of optimizations -- those that simplify rather than reorder code -- it would be useful to prove that a compiler reliably performs optimizations. We show that adopting a "superanalysis" approach to optimization enables such a proof. By analogy with linear algebra, we define the nullspace of an optimizer as those programs it reduces to the empty program. To span the nullspace, we define rewrite rules that de-optimize programs by introducing abstraction. For a model compiler we prove that any sequence of de-optimizing rewrite rule applications is undone by the optimizer. Thus, we are able to give programmers a clear mental model of what simplifications the compiler is guaranteed to perform, and make progress on the problem of the "abstraction penalty" in imperative languages. | |
| dc.identifier.uri | https://hdl.handle.net/2022/34407 | |
| dc.relation.ispartofseries | Indiana University Computer Science Technical Reports; TR564 | |
| dc.rights | This work is protected by copyright unless stated otherwise. | |
| dc.rights.uri | | |
| dc.title | Guaranteed Optimization: Proving Nullspace Properties of Compilers |
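The abstract's nullspace idea can be illustrated with a toy sketch (not from the report itself; all names and the expression encoding below are illustrative assumptions). A de-optimizing rewrite introduces abstraction by wrapping an expression in an immediately applied identity function; a simplifying optimizer with beta-reduction and constant folding then undoes any sequence of such rewrites, recovering the original value:

```python
# Illustrative sketch only: a toy expression language showing the
# "de-optimize, then guarantee the optimizer undoes it" idea.
# Expressions are ints, ("add", e1, e2), ("var", name),
# and ("apply", ("lam", name, body), arg).

def deoptimize(expr):
    # De-optimizing rewrite: introduce abstraction by wrapping expr
    # as (lambda x. x) applied to expr.
    return ("apply", ("lam", "x", ("var", "x")), expr)

def substitute(body, name, value):
    # Substitution for this toy language (each lambda binds one name).
    if body == ("var", name):
        return value
    if isinstance(body, tuple):
        return tuple(
            substitute(part, name, value) if isinstance(part, tuple) else part
            for part in body
        )
    return body

def optimize(expr):
    # Simplifying rewrites: beta-reduce applications, fold constant adds.
    if isinstance(expr, int):
        return expr
    tag = expr[0]
    if tag == "apply":
        _, (_, name, body), arg = expr
        return optimize(substitute(body, name, optimize(arg)))
    if tag == "add":
        left, right = optimize(expr[1]), optimize(expr[2])
        if isinstance(left, int) and isinstance(right, int):
            return left + right
        return ("add", left, right)
    return expr

expr = ("add", 2, 3)
worse = deoptimize(deoptimize(expr))       # two de-optimizing rewrites
assert optimize(worse) == optimize(expr) == 5
```

In the paper's terms, the de-optimized variants of `expr` differ from it only by terms the optimizer maps away, so they all land on the same optimized result; the report proves this "undoing" property for a model compiler rather than a toy like this one.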