# Talks

### Learning in the Physical World

I present how machine learning tasks in industrial settings differ from typical applications on the internet. As they require explicit handling of uncertainties and the incorporation of expert knowledge, the Bayesian paradigm is a good fit to formulate models.

### Uncertainties Need a Purpose

I argue that the task a model will be used for should be an explicit part of the modelling process. As an example, I show how ideas from Bayesian Optimization and Probabilistic Numerics can be used to reinterpret Reinforcement Learning.

### Sparse GP Approximations

In this talk, I present an introduction to pseudo-input methods for sparse GP approximations. I derive the variational lower bounds for SGPR and SVGP and give some intuition for how they should be interpreted.

# Experience

#### Siemens AG

September 2018 – Present, Neuperlach
I work on applying my own research and other state-of-the-art Bayesian models to industrial problems. My responsibilities include:

• Bayesian modelling in interaction with domain experts
• Infrastructure design for scalable inference
• Internal knowledge transfer
• Patent harvesting and collaboration on grant applications

#### Siemens AG and Technical University of Munich

September 2016 – Present, Neuperlach
My research interests include:

• Hierarchical probabilistic models
• Reinforcement learning under uncertainty
• Expert-interpretable models

#### Siemens AG

September 2015 – June 2016, Neuperlach
Master's thesis: Incorporating Uncertainty into Reinforcement Learning through Gaussian Processes.

• Model based reinforcement learning with Gaussian Processes
• Propagation of predictive uncertainties
• Evaluation on a bicycle benchmark

#### Technical University of Munich

September 2012 – July 2014, Garching
Teaching assistant in multiple undergraduate courses.

# Projects

#### Modulated Bayesian Optimization using Latent Gaussian Process Models

We show how additional structure can be placed on surrogate models for Bayesian optimization to uncover the trends worth exploiting in the search for the optimum. At the core of our approach is a Latent Gaussian Process Regression model that allows us to modulate the input domain with an orthogonal latent space.

#### Compositional uncertainty in deep Gaussian processes

We discuss how mean-field assumptions for inference in deep Gaussian processes lead to collapse of uncertainties. We propose possible modifications to discover compositional structure in training data and yield informative uncertainties.

#### Bayesian Decomposition of Multi-Modal Dynamical Systems for Reinforcement Learning

In this extension, we demonstrate how semantic decompositions of dynamics models for Reinforcement Learning significantly increase data efficiency. We show how good model specification is critical for success and how the decomposition can be used for reward shaping.

#### Learning in the Physical World

I present how machine learning tasks in industrial settings differ from typical applications on the internet. As they require explicit handling of uncertainties and the incorporation of expert knowledge, the Bayesian paradigm is a good fit to formulate models.

#### Workshop on Uncertainty Propagation in Composite Models

Together with Carl Henrik Ek and Neill Campbell, I organized a workshop on uncertainty propagation in composite models at the Siemens AI Lab.

#### Uncertainties Need a Purpose

I argue that the task a model will be used for should be an explicit part of the modelling process. As an example, I show how ideas from Bayesian Optimization and Probabilistic Numerics can be used to reinterpret Reinforcement Learning.

#### Data Association with Gaussian Processes

We interpret the data-association problem of multimodal regression in the context of deep Gaussian processes and present an inference scheme based on doubly stochastic variational inference.

#### Interpretable Dynamics Models for Data-Efficient Reinforcement Learning

We demonstrate how expert knowledge can be incorporated in probabilistic policy search by imposing Bayesian structure on the learning problem. Our models yield human-interpretable insights about the underlying dynamics and significantly increase data efficiency.

#### Bayesian Alignments of Warped Multi-Output Gaussian Processes

We extend multi-output Gaussian processes with nonlinear alignments and warpings. The resulting model connects multiple deep Gaussian processes with a shared layer that allows us to extract shared latent data from multiple time series.

#### Sparse GP Approximations

In this talk, I present an introduction to pseudo-input methods for sparse GP approximations. I derive the variational lower bounds for SGPR and SVGP and give some intuition for how they should be interpreted.
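Some of that intuition can be built numerically. The sketch below is an illustrative NumPy implementation of Titsias' collapsed SGPR bound, not code from the talk; the RBF kernel, function names, and jitter constant are all assumptions made for the example:

```python
import numpy as np

def rbf(a, b, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel matrix between the rows of a and b."""
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

def sgpr_lower_bound(X, y, Z, noise=0.1):
    """Collapsed SGPR bound for inducing inputs Z:
    log N(y | 0, Qnn + s2 I) - tr(Knn - Qnn) / (2 s2),
    where Qnn = Knm Kmm^-1 Kmn (the Nystroem approximation of Knn)."""
    n = X.shape[0]
    s2 = noise**2
    Kmm = rbf(Z, Z) + 1e-8 * np.eye(Z.shape[0])  # jitter for stability
    Kmn = rbf(Z, X)
    Qnn = Kmn.T @ np.linalg.solve(Kmm, Kmn)
    cov = Qnn + s2 * np.eye(n)
    _, logdet = np.linalg.slogdet(cov)
    quad = y @ np.linalg.solve(cov, y)
    log_marg = -0.5 * (n * np.log(2 * np.pi) + logdet + quad)
    # Trace penalty: how much of Knn the inducing points fail to explain.
    trace_term = (np.trace(rbf(X, X)) - np.trace(Qnn)) / (2 * s2)
    return log_marg - trace_term
```

With `Z` equal to the full training inputs, the trace term vanishes and the bound recovers the exact log marginal likelihood; removing inducing points can only lower the bound.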

#### zfix-docker: Dockerized deployment of my server infrastructure

This project contains the code required for the installation and configuration of the different services running on my Linux server. To simplify dependency management, I use Docker-based deployments.

#### Incorporating Uncertainty into Reinforcement Learning through Gaussian Processes

In my master’s thesis I explore a variant of PILCO for Bayesian model-based reinforcement learning using Gaussian processes. Instead of optimizing a closed-form parameterized policy, I select actions by applying particle swarm optimization to the expected reward, which takes uncertainties about the system dynamics into account.
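The action-selection step can be illustrated with a generic particle swarm optimizer. This is a textbook PSO sketch, not the thesis code; the inertia and attraction coefficients are standard illustrative defaults:

```python
import numpy as np

def particle_swarm_minimize(f, dim, n_particles=30, iters=100,
                            bounds=(-5.0, 5.0), seed=0,
                            w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimizer: each particle remembers its own best
    position and is pulled toward both it and the swarm-wide best."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([f(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Inertia plus stochastic pulls toward personal and global bests.
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[pbest_val.argmin()].copy()
    return gbest, pbest_val.min()
```

In the thesis setting, `f` would be the negative expected reward of an action under the uncertain GP dynamics model; in this sketch any smooth function of the action vector works.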

#### Incidence-Structures of Power Diagrams

Power diagrams are a generalization of Voronoi diagrams in which the cell centers attract points with different forces. In this report I present an algorithm that computes the incidence structure of such a diagram using the convex hull of a set of dual points.
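The lifting-map construction behind such an algorithm can be sketched in a few lines. The following SciPy example is illustrative rather than the report's code: it lifts each weighted site and reads off which power cells share a face from the edges of the lower convex hull:

```python
import numpy as np
from scipy.spatial import ConvexHull

def power_diagram_adjacency(points, weights):
    """Incidence structure of a 2-D power diagram via the lifting map:
    each weighted site (p, w) is lifted to (p, |p|^2 - w); the facets of
    the lower convex hull form the regular triangulation, whose edges
    connect exactly the sites whose power cells share a boundary face."""
    lifted = np.column_stack([points, (points**2).sum(1) - weights])
    # "Qt" triangulates merged facets so every facet is a simplex.
    hull = ConvexHull(lifted, qhull_options="Qt")
    edges = set()
    for simplex, eq in zip(hull.simplices, hull.equations):
        if eq[2] < 0:  # outward normal points down -> lower-hull facet
            i, j, k = simplex
            edges.update({tuple(sorted(e)) for e in [(i, j), (j, k), (i, k)]})
    return sorted(edges)
```

With all weights equal to zero, the construction reduces to ordinary Delaunay adjacency, the dual of the Voronoi diagram.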

#### LLVM-IL: A Scala Library that Emits LLVM Intermediate Language

LLVM-IL is a Scala library for emitting a subset of textual LLVM IR. Besides the basic instructions, it provides some object-oriented features, such as the creation of simple v-tables paired with field access and virtual method resolution. It works together with a simple runtime written in C.

#### Theoretical Computer Science Tutorial

The slides I created while teaching the tutorial for theoretical computer science at TU Munich. Theoretical computer science is held in the fourth semester of the bachelor's programme and is an introduction to automata theory, formal grammars, computability, and complexity theory.

#### Oblivious Routing and Minimum Bisection

Oblivious routing is a generalization of multi-commodity flows in which the actual demand function is unknown. In this report I present an $\mathcal{O}(\log n)$ approximation algorithm using tree metrics. This result is then applied to the minimum bisection problem, which asks for a vertex bisection minimizing the cost of the edges between the two sets, also yielding an $\mathcal{O}(\log n)$ approximation.

#### Discrete Structures Tutorial

The slides I created while teaching the tutorial for discrete structures at TU Munich. Discrete structures is the first mathematics course for computer scientists, held in the first semester of the bachelor's programme. It is an introduction to mathematical proofs, combinatorics, graph theory, and algebra.