Progress in Higher Order Automatic Differentiation or AXIS OF EVAL! AUTOMATIC DIFFERENTIATION mates with LAMBDA CALCULUS birthing MONSTER COMPILER faster than FORTRAN
Speaker: Prof Barak Pearlmutter, Maynooth University
Time: 3:00 PM
Date: Wednesday 29 October 2014
Location: Room H1.52, UCD Science Hub, University College Dublin
Abstract: The technique known in the machine learning community as "back-propagation" is a special case of "reverse-mode accumulation automatic differentiation", or "reverse AD". We will explore forward and reverse AD using novel formulations that make contact with differential geometry and the lambda calculus. In this context, the AD operators naturally generalise to a much broader range of computer programs, including programs containing iterate-to-fixed-point loops; invoking or embodying higher-order functions; invoking optimisers; or even themselves invoking AD operators. Algorithms including fast exact Hessian-vector multiplication, Pineda/Almeida fixed-point back-propagation, and a wide variety of other techniques can be defined and implemented as one-liners. These methods allow very complicated systems, such as bi-level optimisation architectures, to be built and optimised using gradient methods. We are in the process of formalising this system using the tools of programming language theory, and a research prototype implementation has been constructed which exhibits startlingly good (faster-than-FORTRAN) numeric performance.
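To give a flavour of the "one-liner" claim in the abstract, here is a minimal pure-Python sketch (not the speaker's system) of the fast exact Hessian-vector product: pushing forward-mode dual numbers through a gradient computes Hv without ever forming the Hessian. The example function f and its hand-written gradient are illustrative assumptions, not from the talk.

```python
# Minimal dual-number forward-mode AD, enough to illustrate the exact
# Hessian-vector product trick: Hv = d/dr grad f(x + r v) at r = 0.
class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps  # value and tangent (derivative) parts
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.eps + o.eps)
    __radd__ = __add__
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # Product rule carried in the tangent component.
        return Dual(self.val * o.val, self.val * o.eps + self.eps * o.val)
    __rmul__ = __mul__

# Hypothetical example: f(x, y) = x^2 * y + y^3, with gradient written by hand
# (a full AD system would derive this automatically, even from a program).
def grad_f(x, y):
    return [2 * x * y, x * x + 3 * y * y]

def hvp(x, v):
    # Seed each input with tangent v_i; the tangent parts of the gradient
    # outputs are exactly the components of H v.
    duals = [Dual(xi, vi) for xi, vi in zip(x, v)]
    return [g.eps for g in grad_f(*duals)]

print(hvp([1.0, 2.0], [1.0, 0.0]))  # first column of the Hessian: [4.0, 2.0]
```

The essential point, as in the abstract, is that composing a forward-mode AD operator with a reverse-mode gradient yields exact Hessian-vector products at roughly the cost of a few gradient evaluations.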
Series: Applied and Computational Mathematics Seminar Series
Please Note: All are welcome and tea/coffee will be provided