Postdoctoral fellow - Biomechatronics
Since January 2016
Postdoctoral fellow. CITEC and Research Group Biomechatronics. Bielefeld University.
January 2014 - January 2016
Postdoctoral researcher. CITEC and Department of Neurobiology. Bielefeld University.
PhD in Computational Neuroscience. "Representation of Visual Motion Information in the Fly Brain". Department of Neurobiology. Bielefeld University.
April 2013 - December 2013
Scientific assistant. Department of Neurobiology, Egelhaaf Group. Bielefeld University.
October 2012 - March 2013
Scientific assistant. Straw Lab. The Research Institute of Molecular Pathology. Vienna, Austria.
June 2010 - September 2013
Scientific Assistant. Department of Neurobiology, Warzecha Group. Bielefeld University.
Master thesis. Department of Neurobiology. Bielefeld University.
Visiting student in the laboratory of Prof. Dr. Dr. h. c. Randolf Menzel at the Institute of Neurobiology, Free University of Berlin.
2008 - 2010
International master's programme 'Systems Biology of Brain and Behaviour', Bielefeld University.
Bachelor thesis. Department of Neurobiology. Bielefeld University.
2003 - 2008
Biology and Psychology at Bielefeld University.
Flying insects such as flies and bees rely heavily on their visual system when performing behavioural tasks like course control, collision avoidance and spatial navigation. Compared with current approaches in machine vision, the visual systems of flying insects are - despite the relatively small number of neurons involved and their poor spatial resolution - superior in terms of speed, flexibility and resource efficiency. From an engineer's perspective, the underlying mechanisms that allow such tiny animals to interact with their environment are therefore of great interest.
Our goal is to develop a bio-inspired camera system that allows robotic platforms to perform visually guided orientation behaviour based on physiological and computational principles found in insects. In cooperation with the Cognitronics group at CITEC, we develop and implement algorithms on dynamically reconfigurable hardware, combined with low-power microprocessors, to extract the motion information contained in the optic flow a robotic agent perceives during self-motion. This visual motion information is subsequently used in bio-inspired algorithms controlling the agent's orientation behaviour in unfamiliar terrain.
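A classic computational principle of insect motion vision of the kind referred to above is the Hassenstein-Reichardt elementary motion detector (EMD), which correlates the signal of one photoreceptor with a delayed copy of its neighbour's signal. The sketch below is a minimal illustrative implementation, not the group's actual system; the filter time constant, stimulus frequency, and function names are assumptions chosen for demonstration.

```python
import numpy as np

def emd_response(signal_a, signal_b, dt=1.0, tau=10.0):
    """Hassenstein-Reichardt elementary motion detector (illustrative sketch).

    Correlates photoreceptor signal B with a low-pass-filtered (delayed)
    copy of neighbouring signal A, and subtracts the mirror-symmetric
    term. A positive mean output indicates motion from A towards B.
    tau is the hypothetical delay-filter time constant in units of dt.
    """
    alpha = dt / (tau + dt)  # first-order low-pass filter coefficient
    delayed_a = np.zeros_like(signal_a)
    delayed_b = np.zeros_like(signal_b)
    for t in range(1, len(signal_a)):
        delayed_a[t] = delayed_a[t - 1] + alpha * (signal_a[t] - delayed_a[t - 1])
        delayed_b[t] = delayed_b[t - 1] + alpha * (signal_b[t] - delayed_b[t - 1])
    # correlate delayed A with B, minus the mirror-symmetric half-detector
    return delayed_a * signal_b - delayed_b * signal_a

# A sinusoidal grating drifting past two neighbouring photoreceptors:
# receptor B sees the same luminance pattern as A, but phase-delayed,
# i.e. the pattern moves from A towards B.
t = np.arange(0, 500)
a = np.sin(2 * np.pi * 0.02 * t)
b = np.sin(2 * np.pi * 0.02 * t - 0.5)

response = emd_response(a, b)
print(np.mean(response[100:]))  # positive: motion detected from A to B
```

Swapping the inputs (`emd_response(b, a)`) yields the sign-inverted response, reflecting the detector's direction selectivity. This opponent correlation scheme is cheap enough to map onto the reconfigurable hardware mentioned above, since it needs only delays, multiplications, and subtractions per photoreceptor pair.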