I am highly interested in big data analytics, both from the architectural and the data science perspective. In 2016, I completed the Stanford Statistical Learning online course and received training on the Teradata Aster Analytics platform. I am employed as a Senior Consultant by BTC Business Technology Consulting AG, working in the roles of Data Scientist and Analytics Architect. A significant part of my current work focuses on building a cloud-based big data platform for advanced analytics and smart data services using Apache Mesos, Apache Hadoop and Docker.
For my graduation project in 2008, I developed tools for simulating and controlling populations of adaptive household appliances to support grid stability. Afterwards, I turned to the broader topics of self-organized coordination and distributed heuristics in the domain of energy systems as the subject of my PhD thesis. Ultimately, I developed the Combinatorial Optimization Heuristic for Distributed Agents (COHDA), which solves combinatorial problems in distributed systems. While the earlier projects were implemented in Java and C, COHDA is written in Python, my present language of choice. Currently, I mainly use Jupyter for my daily data science work.
Until 2016, I was a full-time employee of the Environmental Informatics group at the University of Oldenburg, Germany. There, I gained experience in algorithm design, empirical experiments, statistical analysis and high-performance computing (HPC). I published in peer-reviewed journals, books and proceedings, and collaborated with national and international research groups. Furthermore, I was a member of several university committees and served as a referee for journals and conferences.
My postdoctoral position involved a fair amount of teaching. I am experienced in both small-scale courses (seminars, software projects, tutorials) and large-scale lectures with up to 350 participants. I enjoy giving talks, both nationally and internationally, and I put much effort into my presentations. Although I am no longer employed by the university, I still supervise and examine theses.
When I started working at the Environmental Informatics group in 2008, I was given responsibility for keeping the group's entire IT infrastructure up and running. My predecessor left me just a folder of hand-written notes and hard copies of man pages, along with a bunch of servers running SunOS and Linux. Without further ado, I deleted my MS Windows installation, switched to Linux and became familiar with Bash and friends. Over time, I built a completely new infrastructure comprising application servers, a gateway, a firewall, a multi-tier backup system (on-site and off-site storage), central user/project/repository management, a handful of virtual machines and a sophisticated monitoring system. Of course, I documented everything in fine detail, using reStructuredText files in a Mercurial repository with Sphinx as the rendering backend. In summary, this part of my duties strongly shaped my daily work in terms of planning ahead, acting on demand and taking responsibility (24/7). I also learned to love automation and scripting: let the machines do the work!