Friday, March 21, 2014

Reproducibility and information preservation

Measuring reproducibility in computer systems research

Christian Collberg, Todd Proebsting, Gina Moraila, Akash Shankara, Zuoming Shi and Alex M Warren

The reproducibility, or lack thereof, of scientific research has become a hot topic recently.  Among real scientists, reproducibility of experiments has always been an important goal, although as experimental materials and the volumes of data they generate have grown rapidly, this goal is becoming more and more difficult to achieve.  In principle, the "Methods and Materials" section of a scientific journal article should provide the information needed (at least, by another qualified scientist in the same area) to reproduce the results.  In practice this is not always possible, although one must at least pay lip service to it in order to get a paper accepted.  Indeed, in some domains it is estimated that half or more of reported results are irreproducible; yet once a result is published (especially in a high-profile journal), it can become difficult for others to publish results that contradict apparently established facts.


Sunday, March 16, 2014

Introduction to Bisimulation and Coinduction

Introduction to Bisimulation and Coinduction
Davide Sangiorgi
Cambridge University Press, 2012

Bisimulation is a natural notion of equivalence for process algebras, and arises naturally in modal logic, graph theory and other settings.  Given a labeled transition system $(X,\stackrel{a}{\to})$, a bisimulation $R$ is a relation on $X$ satisfying:

• for all $x~R~y$, if $x \stackrel{a}{\to} x'$ then there exists $y'$ such that $y \stackrel{a}{\to} y'$ and $x'~R~y'$
• for all $x~R~y$, if $y \stackrel{a}{\to} y'$ then there exists $x'$ such that $x \stackrel{a}{\to} x'$ and $x'~R~y'$
Bisimilarity is then defined as the union $\approx$ of all bisimulations; this union is itself a bisimulation, and hence the largest one.  Equivalently, it is the largest relation closed under the above two rules.
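To make the definition concrete, here is a minimal sketch (mine, not from the book) of how bisimilarity can be computed on a finite transition system: start from the full relation and repeatedly discard pairs that violate one of the two clauses, until the relation stabilizes.

```python
def bisimilarity(states, trans):
    """Compute bisimilarity on a finite labeled transition system as a
    greatest fixed point: start from the full relation X x X and delete
    pairs violating the two bisimulation clauses until nothing changes.

    states: finite iterable of states
    trans:  set of triples (x, a, y) meaning x --a--> y
    """
    states = list(states)

    def moves(s):
        return [(a, t) for (p, a, t) in trans if p == s]

    def ok(x, y, R):
        # clause 1: every x --a--> x' must be matched by some
        # y --a--> y' with (x', y') still in R
        for a, x2 in moves(x):
            if not any(b == a and (x2, y2) in R for b, y2 in moves(y)):
                return False
        # clause 2: the symmetric condition on moves of y
        for a, y2 in moves(y):
            if not any(b == a and (x2, y2) in R for b, x2 in moves(x)):
                return False
        return True

    R = {(x, y) for x in states for y in states}
    changed = True
    while changed:
        changed = False
        for pair in list(R):
            if not ok(pair[0], pair[1], R):
                R.discard(pair)
                changed = True
    return R
```

Pairs removed along the way can never belong to any bisimulation, so the stable relation reached is exactly the largest one; real tools use partition refinement for efficiency, but the idea is the same.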

Bisimulation is defined coinductively, that is, in terms of the greatest fixed point of a monotone operator, in contrast to inductive definitions (least fixed points) that are more familiar to computer scientists.  Bisimulation and coinduction are general concepts, but they have generally been covered in the context of process calculi such as CCS or the $\pi$-calculus, or for reasoning about infinite streams or other infinite behaviors in functional programming.  This book provides an accessible treatment of both topics in their own right, rather than as a secondary matter arising in the study of process calculus or streams.
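The least/greatest contrast can be seen in a tiny example (an illustration of mine, not the book's): on a finite lattice, both fixed points of a monotone operator are reachable by plain iteration, upward from the bottom element for the least, downward from the top element for the greatest.

```python
def fixpoint(F, start):
    """Iterate a monotone operator F from `start` until it stabilizes.
    Starting from the bottom element yields the least fixed point;
    starting from the top element yields the greatest (Kleene iteration
    on a finite lattice)."""
    S = start
    while F(S) != S:
        S = F(S)
    return S

# Example graph: 1 -> 2 -> 3 (3 is stuck) and a self-loop 4 -> 4.
succ = {1: {2}, 2: {3}, 3: set(), 4: {4}}

# Monotone operator on subsets of the state set:
# F(S) = states with at least one successor in S.
F = lambda S: {x for x in succ if succ[x] & S}

least = fixpoint(F, set())         # -> set(): no finite evidence admits anything
greatest = fixpoint(F, set(succ))  # -> {4}: the states on an infinite path
```

The least fixed point collects what finite derivations can justify (here, nothing), while the greatest keeps everything not refuted by any finite observation (here, the state with an infinite run), which is the sense in which bisimilarity is a coinductive notion.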


Friday, March 07, 2014

Programming with Higher-Order Logic

Programming with Higher-Order Logic
Dale Miller and Gopalan Nadathur
Cambridge University Press, 2012
Church's higher-order logic, based on the typed $\lambda$-calculus, is a foundation of mathematics and computer science.  It underpins a number of other influential logics and type theories, including systems for formalized mathematics such as LF, HOL and Coq, which in turn have led to interesting developments relevant to mainstream mathematics such as formal proofs of the Four-Color Theorem and Feit-Thompson Theorem in Coq, as well as the Homotopy Type Theory program initiated by Voevodsky.

This book is about higher-order logic viewed in a different light, as a foundation for programming, specifically following the computation-as-deduction approach taken in logic programming languages such as Prolog.  Higher-order logic programming builds on higher-order unification, and generalizes Horn clauses over first-order terms in Prolog to hereditary Harrop formulas over higher-order terms in languages such as $\lambda$Prolog.  One might view (higher-order) logic programming as a tool deserving a place in a logician's toolbox analogous to numerical or symbolic tools for analysis: not a replacement for mathematical reasoning, but a framework for experimentation and automation.
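To give a feel for the starting point that higher-order unification generalizes, here is a rough sketch of first-order unification, the engine behind Prolog's Horn-clause resolution (a Python illustration of mine, not code from the book; the term representation is an arbitrary choice, and the occurs check is omitted as many Prologs do by default).

```python
def is_var(t):
    """Variables are strings beginning with an uppercase letter."""
    return isinstance(t, str) and t[:1].isupper()

def walk(t, s):
    """Follow variable bindings in substitution s to a representative."""
    while is_var(t) and t in s:
        t = s[t]
    return t

def unify(t1, t2, s=None):
    """Return a substitution (dict) unifying t1 and t2, or None.
    Terms: uppercase strings are variables, lowercase strings are
    constants, and tuples ('f', a1, ..., an) are compound terms."""
    if s is None:
        s = {}
    t1, t2 = walk(t1, s), walk(t2, s)
    if t1 == t2:
        return s
    if is_var(t1):
        return {**s, t1: t2}   # bind variable (occurs check omitted)
    if is_var(t2):
        return {**s, t2: t1}
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        # unify componentwise, threading the substitution through
        for a, b in zip(t1, t2):
            s = unify(a, b, s)
            if s is None:
                return None
        return s
    return None

# e.g. f(X, b) unifies with f(a, Y) under {X -> a, Y -> b}
print(unify(("f", "X", "b"), ("f", "a", "Y")))
```

First-order unification is decidable with unique most general unifiers; the higher-order unification underlying $\lambda$Prolog must also solve for functions modulo $\beta\eta$-conversion, which is undecidable in general, and the book's languages work within well-behaved fragments of it.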