\begin{document}
\maketitle

PP leader: \emph{Jens Knoop and Andreas Krall (both E185.1)}

Associated researchers: \emph{Anton Ertl (E185.1), Bernhard Gramlich (E185.2)}

%\emph{Informal description of the purpose of the PP (3-5 lines)}

Every embedded system consists of software which is written in a
high-level language, compiled to machine language, and executed on a
processor. For robust embedded systems, new verified analysis,
compilation, simulation, and specification methods are necessary to
support the programmer during application development and maintenance
and to optimize for performance, power, space, concurrency, and
reliability during compilation.
%for short, new programming and
%compilation techniques for robust embedded systems development and
%deployment.

\subsubsection*{State of the art and related work:}
%\emph{Briefly describe the scientific state of the art (20-30 lines)}

%Compilation Techniques for Reliability

Because of the exponential increase in the number of transistors and
the continuing decrease of the feature sizes of current processors,
\emph{soft errors}, mainly caused by energetic particles, are becoming
an important design issue for robust embedded systems. Blome et
al.~\cite{Blome+06} observed that a majority of the faults that affect
the architectural state of a processor originate in the register file.
Lee and Shrivastava \cite{LeeShrivastava09a,LeeShrivastava09c} proposed
two solutions to cope with this problem. The first assigns variables,
depending on their lifetime, to either the ECC-protected or the
unprotected part of a register file to balance energy consumption and
reliability \cite{LeeShrivastava09a}. The second spills registers to
ECC-protected memory if the register contents are not used for a long
period \cite{LeeShrivastava09c}. There are also pure software solutions
which use different forms of code duplication \cite{Oh+02a,Reis+05},
perform failure virtualization \cite{WapplerMueller08}, or apply
techniques like control flow checking \cite{Oh+02b}.
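To make the flavour of such lifetime-based protection schemes concrete,
the following sketch greedily places long-lived values into the
ECC-protected part of a split register file and spills the remainder to
ECC-protected memory. It is a deliberately simplified illustration: the
names, the threshold, and the greedy policy are ours, not the published
algorithms of Lee and Shrivastava.

\begin{verbatim}
# Simplified illustration (not the published algorithms): long-lived
# values are exposed to particle strikes for more cycles, so they are
# placed in the ECC-protected register bank first; values that fit into
# neither bank are spilled to ECC-protected memory.
from dataclasses import dataclass

@dataclass
class LiveRange:
    name: str    # variable or virtual register
    start: int   # first instruction index at which the value is live
    end: int     # last instruction index at which the value is live

    @property
    def lifetime(self) -> int:
        return self.end - self.start

def partition(ranges, protected_regs, unprotected_regs, threshold):
    protected, unprotected, spilled = [], [], []
    for lr in sorted(ranges, key=lambda r: r.lifetime, reverse=True):
        if lr.lifetime >= threshold and len(protected) < protected_regs:
            protected.append(lr.name)      # ECC-protected register bank
        elif len(unprotected) < unprotected_regs:
            unprotected.append(lr.name)    # cheaper, unprotected bank
        else:
            spilled.append(lr.name)        # spill to ECC-protected memory
    return protected, unprotected, spilled

if __name__ == "__main__":
    ranges = [LiveRange("i", 0, 4), LiveRange("sum", 0, 40),
              LiveRange("tmp", 10, 12), LiveRange("cfg", 2, 38)]
    print(partition(ranges, protected_regs=1, unprotected_regs=2,
                    threshold=16))
    # (['sum'], ['cfg', 'i'], ['tmp'])
\end{verbatim}

A real register allocator would integrate this decision with
interference and energy models; the sketch only conveys why lifetime is
a natural partitioning criterion.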
The book by Mishra and Dutt \cite{MishraDutt08} gives a complete
overview of processor description languages and the generation of
compilers and simulators from processor specifications. Our chapter in
the \emph{Handbook of Signal Processing Systems} \cite{BrHoKr09} gives
a good survey of current instruction set simulators. A well-known
instruction set simulator with modelling of energy consumption is
\emph{Wattch} \cite{BrooksTiwariMartonosi00}.

\paragraph{JK}
Methods for \emph{compiler verification} do exist
\cite{Langmaack97a,Po-lncs124,MMO-lncs1283,Goos:99:verifix,Goos:00:ASM,1328444}.
Most notable are the pioneering approaches of the \emph{ProCoS}
\cite{Langmaack96a} and \emph{Verifix} \cite{Goerigk-et-al:CC96}
projects and, more recently, of the \emph{CompCert} project
\cite{CompCert,BDL-fm06,Le-popl06}. There is also a rich body of work
on the related approaches of \emph{translation validation}
\cite{PSS-tacas98,Ne-pldi00}, \emph{certifying compilation}
\cite{NL-pldi98,Colby-etal-pldi00,BlechPoetzsch07}, and
\emph{proof-carrying code} \cite{Ne-popl97,AF-popl00,FNSG-tlfi07}.
However, a verified compiler that is optimizing and at the same time
ensures non-functional program properties, such as the time and space
resources required by the compiled program, is still missing.

\emph{Worst-case execution time $($WCET$)$ analysis} for real-time
systems, which are often safety-critical, is a vibrant field of
research in academia and industry and of fast-growing economic
relevance, especially in the avionics and automotive industry. A survey
of state-of-the-art tools and methods for WCET analysis has recently
been given by Wilhelm et al.~\cite{Wilhelm:TECS2008}. The outcomes of
the WCET Tool Challenges \cite{Gus:ISoLA2006,Holsti:WCET2008}, however,
demonstrate that all these tools have their own strengths and
limitations. In particular, they all rely to some extent on user
assistance and thus on a \emph{trusted information basis} guiding the
WCET analysis \cite{Prantl:WCET2009}.

\paragraph{AK}
There are three aspects of program and compiler correctness. The
verifying compiler proves properties of the translated program and is a
grand challenge for computing research \cite{Hoare03}. A certified
compiler like Verifix is proven once to perform only semantically
equivalent optimizations and translations
\cite{GlesnerGoosZimmeermann04,GoosZimmermann00}. Translation
validation, introduced by Pnueli et al.\ \cite{Pnueli98a,Pnueli98b} and
Necula \cite{Necula00}, proves at every compiler run that the
translation is correct. Until now only some optimizations have been
verified, recently lazy code motion \cite{TristanLeroy09}, instruction
scheduling \cite{TristanLeroy08}, and the whole code generation phase
\cite{BlechPoetzsch07}. Another research direction is the construction
of general frameworks for validation \cite{ZaksPnueli08} or
generalizations like parameterized equivalence checking \cite{Kundu+09}.
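To illustrate the translation-validation idea on a toy scale (this is
our own minimal sketch, not one of the cited systems), the following
checker re-checks after every run of an untrusted constant-folding pass
that source and target programs agree on a set of test environments; a
real validator establishes equivalence symbolically rather than by
testing.

\begin{verbatim}
# Minimal sketch of the translation-validation idea (illustrative only):
# after each "compiler run", an independent checker compares the
# observable behaviour of the input and output programs.  Here the
# programs are arithmetic expression trees, the optimization is constant
# folding, and the checker validates the pair on test environments.

Const = lambda n: ("const", n)
Var   = lambda x: ("var", x)
Add   = lambda a, b: ("add", a, b)

def evaluate(e, env):
    tag = e[0]
    if tag == "const": return e[1]
    if tag == "var":   return env[e[1]]
    if tag == "add":   return evaluate(e[1], env) + evaluate(e[2], env)
    raise ValueError(tag)

def constant_fold(e):
    """The 'untrusted' optimization pass."""
    if e[0] == "add":
        a, b = constant_fold(e[1]), constant_fold(e[2])
        if a[0] == "const" and b[0] == "const":
            return Const(a[1] + b[1])
        return Add(a, b)
    return e

def validate(source, target, envs):
    """The independent checker run after every compilation."""
    return all(evaluate(source, env) == evaluate(target, env)
               for env in envs)

if __name__ == "__main__":
    src = Add(Var("x"), Add(Const(2), Const(3)))
    tgt = constant_fold(src)          # Add(Var("x"), Const(5))
    print(validate(src, tgt, [{"x": v} for v in (-1, 0, 7)]))   # True
\end{verbatim}

The challenge addressed by the cited work is to replace such testing by
proofs that scale to realistic intermediate representations and
optimizations.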
\subsubsection*{Previous achievements:}
%\emph{Brief description of your own contributions to the related
%scientific state-of-the art (5-10 lines)}

Jens Knoop's research focuses on provably correct and optimal program
analyses and optimizations \cite{Kn-lncs1428}. He is the co-inventor of
\emph{Lazy Code Motion} \cite{KRS-pldi92,KRS-retrolcm04,XueK06}, of the
code-size sensitive \emph{Sparse Code Motion} \cite{RKS-popl00}, and of
numerous other program analyses and optimizations, including
\emph{partial dead-code elimination} \cite{KRS-pldi94}, \emph{partially
redundant assignment elimination} \cite{KRS-pldi94}, and
\emph{code-size sensitive speculative code motion} \cite{scholz04},
many of which are now part of state-of-the-art compilers. His recent
research focuses on compiler support for \emph{worst-case execution
time analysis} for safety-critical real-time embedded systems
\cite{Prantl:WCET2009,SchrSchoKn09,Prantl:WLPE2008,prantl_et_al:DSP:2008:1661,kirner_et_al:DSP:2008:1657,kirner_et_al:DSP:2007:1197}.
He has served on more than 50 programme committees of international
conferences, including PLDI, CC, TACAS, Formal Methods, and
Supercomputing. He was the General Chair of PLDI'02 and ETAPS'06, and
is Programme Committee Co-Chair of PACT'10. He is the initiator and
co-founder of the annual workshop series on \emph{Compiler Optimization
meets Compiler Verification} (since 2002), co-organizer of four
Dagstuhl seminars, most recently on \emph{Verifying Optimizing
Compilation}, and a member of the European Network of Excellence
HiPEAC.
%, and the IFIP WG 2.4 \emph{Software Implementation Technology}.

Andreas Krall does research in the area of architecture description
languages and the automatic generation of highly optimizing compilers.

\subsubsection*{Goals (first 4 years):}
%stress what the significant additions to the scientific knowledge are,
%and why they are important. (30-40 lines)}
\begin{itemize}
\item New modeling and representation techniques for non-functional
  program and system properties on the programming and intermediate
  language levels
\item Definitions and measures of non-functional program and system
  properties (performance, time, space/memory, power, concurrency)
\item Modeling and representation of these properties alongside the
  programming language semantics
\item Adapting and enhancing state-of-the-art compilation techniques
  towards non-functional property and platform awareness
\item New functional and non-functional property and platform-aware
  compilation techniques
\item Analyses for non-functional program and system properties
\item Functional and non-functional property and platform-aware code
  generation techniques
\item Enabling validation and verification throughout the compilation
  process
\item Techniques for reducing or eliminating trusted code, annotation,
  etc., bases
\end{itemize}

\subsubsection*{Work Plan (first 4 years):}
%to also describe and (coarsely) quantify the resources (staff, cost of
%special equipment) required for this work in a table. (20-30 lines)}

Compilation techniques for robust embedded systems comprise different
areas. Therefore, the project is divided into three work packages:
compilation and simulation techniques for reliability, verified
compilation, and resource analysis.

\paragraph*{WP1 - Compilation and Simulation Techniques for Reliability}

\paragraph*{WP2 - Verified Compilation}

Suitable semantics are necessary which support efficient translation
validation or easy verification of a compiler. We will investigate
different semantics and mappings between the semantics of our processor
description language \cite{BrEbKr07} and a compiler backend semantics,
an intermediate representation semantics (compatible with LLVM), and a
source language semantics. The main research will be on verification
and translation validation for all kinds of compiler optimizations; a
toy reference semantics of the kind such validation can build on is
sketched after the following list.

\begin{itemize}
\item Evaluate different semantics regarding their suitability for
  compiler verification
\item Verification and translation validation of frontend and backend
  optimizations
\end{itemize}
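The following toy reference semantics, an interpreter for a minimal
three-address intermediate representation, sketches what we mean by an
executable semantics against which backend mappings and optimizations
can be validated. The instruction set and its encoding are invented for
illustration; they are neither LLVM's nor that of our processor
description language.

\begin{verbatim}
# Minimal sketch of an executable reference semantics for a toy
# three-address IR (illustrative only).  Such an interpreter can serve
# as the reference against which a backend mapping or an optimized
# program is validated.

def run(program, regs=None):
    """Execute a list of (op, dst, src1, src2) instructions.

    Supported ops: 'li' (load immediate), 'add', 'mul', 'blez' (branch
    to an absolute instruction index if src1 <= 0), 'ret' (return the
    value of src1).
    """
    regs = dict(regs or {})
    pc = 0
    while pc < len(program):
        op, dst, a, b = program[pc]
        if op == "li":
            regs[dst] = a
        elif op == "add":
            regs[dst] = regs[a] + regs[b]
        elif op == "mul":
            regs[dst] = regs[a] * regs[b]
        elif op == "blez":
            if regs[a] <= 0:
                pc = b
                continue
        elif op == "ret":
            return regs[a]
        else:
            raise ValueError("unknown opcode: " + op)
        pc += 1
    raise RuntimeError("fell off the end of the program")

if __name__ == "__main__":
    # r2 = r0 * r0 + 1
    prog = [("li",  "r1", 1,    None),
            ("mul", "r2", "r0", "r0"),
            ("add", "r2", "r2", "r1"),
            ("ret", None, "r2", None)]
    print(run(prog, {"r0": 5}))   # 26
\end{verbatim}

Different candidate semantics (big-step as above, small-step, or
SSA-based) can then be compared with respect to how well such a
definition supports equivalence proofs and translation validation.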
\paragraph*{WP3 - Resource Analysis}

For safety-critical real-time embedded systems, resource consumption,
measured in terms of quantitative aspects of a program execution such
as execution time, storage use, and power consumption, belongs to the
functional properties of an application rather than to its
non-functional ones. Formal guarantees on resource consumption are thus
essential to ensure the robustness of such systems. This requires new,
usually more complex but more expressive program analyses and
transformations to support (1) the programmer during source code
development by automatically providing early hints on resource
consumption and (2) the compiler in optimizing for resource
consumption. In our previous work we focused on compiler support for
\emph{worst-case execution time $($WCET$)$ analysis}
\cite{Prantl:WCET2009,Prantl:WLPE2008,prantl_et_al:DSP:2008:1661,kirner_et_al:DSP:2008:1657}.
Based on this work and expertise, we will extend this research towards
other quantitative aspects of resource consumption, especially storage
usage, with these two global objectives in mind, using the programming
environment of our previous work as the testbed for implementation.

\begin{itemize}
\item Research into new program analyses for providing high-quality
  bounds on resource consumption which are useful for both the
  application programmer and the compiler.
\item Research into new program analyses and static optimizations to
  optimize for resource consumption and to help comply with possibly
  given limits.
\item Research into suitable abstraction levels of interfaces and modes
  of interaction between fully automatic program analysis and
  verification methods and semi-automatic ones relying on user
  assistance because of undecidability issues.
\item Research into the synergies and the trade-offs between fully
  automatic program analysis and verification methods and
  semi-automatic ones utilizing user assistance, with respect to both
  the quality of the resource bounds and the computational costs of
  obtaining them.
\item Research into simulation and profiling methods to assess the
  quality of resource consumption analyses.
\end{itemize}

Overall, this WP will contribute to the design, foundations,
verification, implementation, and application of resource analyses; a
schematic sketch of the kind of structural resource-bound computation
such analyses refine is given below.
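As a schematic illustration of the starting point for such analyses
(our own simplified sketch, not an existing WCET tool), the following
function computes a resource bound over a structured program tree from
per-block costs and user-provided loop bounds, the kind of annotation
that forms the trusted information basis discussed above.
Industrial-strength tools combine this idea with hardware-level timing
models and ILP-based path analyses (IPET).

\begin{verbatim}
# Minimal sketch of a structural WCET/resource-bound computation
# (illustrative only).  Programs are trees of ('block', cycles),
# ('seq', parts...), ('alt', a, b), and ('loop', bound, body) nodes.
# Loop bounds are user-provided or inferred annotations.

def wcet(node):
    tag = node[0]
    if tag == "block":              # straight-line code: known cycle cost
        return node[1]
    if tag == "seq":                # sequence: costs add up
        return sum(wcet(child) for child in node[1:])
    if tag == "alt":                # conditional: take the worse branch
        return max(wcet(node[1]), wcet(node[2]))
    if tag == "loop":               # loop: bound times the body's WCET
        return node[1] * wcet(node[2])
    raise ValueError(tag)

if __name__ == "__main__":
    program = ("seq",
               ("block", 10),
               ("loop", 100, ("alt", ("block", 4), ("block", 9))),
               ("block", 2))
    print(wcet(program))   # 10 + 100 * 9 + 2 = 912
\end{verbatim}

The research items above aim at computing and refining such bounds
automatically, for storage and other resources as well as for time. The
planned staffing for the three work packages is summarized in the
following table.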
\begin{tabular}{llll}
\hline
{\bf Pos} & {\bf Type} & {\bf Description} & {\bf Duration} \\
\hline
NN1 & PhD & reliable compilation / simulation & 4 years \\
NN2 & PhD & verified compilation & 4 years \\
NN3 & PhD & resource analysis & 4 years \\
\hline
\end{tabular}

\subsubsection*{Goals (last 4 years):}
%\emph{Brief description of the
%research topics to be addressed during the last 4 years. Make sure to
%explicitly stress what the significant additions to the scientific
%knowledge are, and why they are important. (20-30 lines)}

In the last 4 years we will extend the research of the first 4 years in
additional directions such as:

\begin{itemize}
\item New programming languages and compilers for RESs
\item Non-functional properties and requirements as first-class
  language and compiler citizens
\item New compilation techniques enabling a uniform and integrated
  approach for ensuring functional and non-functional program and
  system requirements
\item Verified compilers, proof-carrying code, and verifying
  compilation for RESs
\item Making legacy applications fit for and available on RESs
\item Techniques for adjusting and decompiling legacy applications
\end{itemize}

Applying the results of this research will reduce the cost of
developing reliable and correct embedded systems and make them safer
and more robust.

\subsubsection*{Collaboration with other PPs:}
%\emph{List the PPs you are expecting to collaborate with, and describe briefly

\begin{itemize}
\item PP [E182/Puschner]: Links to hard- and software models for
  time-predictable systems, verification of timing behaviour.
\item PP Formal Verification for Robustness [E184/Veith]: Links to
  software model-checking and testing of code (on source code and
  intermediate code levels), support for program analysis and
  transformation.
\item PP Modeling \& Analysis of Robust Distributed Systems
  [E182/Schmid]: Links to functional and non-functional system
  requirements.
\end{itemize}

\subsubsection*{International and national collaborations:}
%\emph{List envisioned international and national collaborations, and
%describe briefly the topic and nature of such a collaboration. (5-10
%lines)}

\begin{itemize}
\item Walter Binder, University of Lugano, Switzerland (resource analysis)
\item Sabine Glesner, TU Berlin, Berlin, Germany (verified compilation)
\item Aviral Shrivastava, Arizona State University, Tempe, AZ, USA
  (reliable compilation)
\item Wolf Zimmermann, Martin-Luther Universit\"at Halle-Wittenberg,
  Halle, Germany (verified compilation)
\end{itemize}

\begin{comment}
%Please fill in the BibTeX entries here, e.g.,