Prof. Marcus Hudritsch
Lecturer
Engineering and Computer Science
Lecturer in the computer science specialization CPVR (Computer Perception and Virtual Reality)
Project management and acquisition in applied research and development.
Image Analysis and Synthesis
2D & 3D Computer Graphics
Augmented and Virtual Reality
BTI4402 Introduction to Image Processing and Artificial Intelligence
BTI4404 Advanced Image Processing and Artificial Intelligence
BTI4406 Special Topics in Image Processing and Artificial Intelligence
BTI7527 Game Development
BTI7534 Advanced Game Development
Image Analysis and Image Synthesis
2D & 3D Computer Graphics
Augmented and Virtual Reality
Image-based Machine Learning
- 1993-1995 Lead Architect, CoBau AG, Stein am Rhein
- 1997-2004 Software Engineer, ProcessLink AG, Basel
- 2001-2010 Lecturer (40%) in Computer Graphics and Image Processing, FHNW
- 2010-2012 Lecturer (100%) in Computer Graphics and Image Processing, FHNW
- since 2012 Lecturer (100%) in Computer Graphics and Image Processing, BFH
- 1979-1985 Matura, Gymnasium Biel
- 1986 English language diploma, language school, Perth, Australia
- 1986-1992 Dipl. Arch. ETH, ETH Zürich
- 1991 Exchange semester, International School of Architecture, Ahmedabad, India
- 1996-1997 Dipl. Inf. NDS, Ingenieurschule beider Basel
2019-2020: In the Erleb-AR project, financed by the Swiss Federal Office of Culture, an outdoor augmented reality application is being developed to visualize four cultural heritage sites in Switzerland. The project is called Erleb-AR because it is meant to make cultural heritage practically tangible with augmented reality: the title is a play on the German adjective "erlebbar" ("experienceable") and AR. In the project, an app for mobile devices is being developed that allows the outdoor AR visualization of historical buildings that no longer exist. To illustrate the project idea, a concept video was created: https://youtu.be/zdezcX9PfRs. For the concept video, we chose the Christoffelturm at Bern's main railway station, which was demolished in 1834. Anyone who has installed our app on their mobile device can experience this imposing tower again.
2017-2018: In the ADPS - Alzheimer Disease Prediction Service project (KTI 18996.2 PFLS-LS), a mobile application is being developed that collects information for a non-invasive diagnosis of Alzheimer's disease from an augmented reality game and the device's motion sensor data. The game will be played by patients at risk of Alzheimer's disease. Current state-of-the-art diagnostic measures for Alzheimer's disease (AD) are invasive (cerebrospinal fluid analysis), expensive (neuroimaging) and time-consuming (neuropsychological assessment), and thus have limited accessibility as frontline screening and diagnostic tools for AD. This project proposes a novel computational biomarker for mobile phones (ADPS) which may facilitate early and potentially more effective therapeutic and preventative strategies for AD.
2016-2017: In the Bodygee3D project (CTI 19154.2 PFES-ES), a morphable body model is created from 3D body scans for use in nutritional advice and fitness. The aim of the project is an analysis module that determines, based on 3D body scans, how effective a habit change or training program has been on specific body parts, and that enables predictions of body changes using a morphable body model. Obesity is becoming a widespread disease, and only those who permanently change their habits have a good chance of reaching and maintaining a lower body weight. This project makes it possible to state not only that body fat has been reduced, but with much greater accuracy where fat has been lost. In this way, weight-loss behavior and the effectiveness of the training or behavior change can be documented. In the medium term, the goal is to make additional statements about the relationship between muscle, fat and water content in the individual body parts using local impedance measurements; however, this is not part of this project. With the morphable body model, not only can states be recorded, but thanks to the parameterization and the statistically determined data, the course of the change in body shape (first the legs, then the buttocks, ...) can also be simulated in the medium term.
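The project's actual model is not described in detail here; morphable shape models of this kind are commonly built by running PCA over a set of vertex-correspondent 3D scans, so that a low-dimensional coefficient vector parameterizes body shape. A minimal sketch of that general technique (all names and data are illustrative, not the Bodygee3D implementation):

```python
import numpy as np

def build_morphable_model(scans):
    """scans: array of shape (N, V, 3), N registered scans with V corresponding vertices."""
    n, v, _ = scans.shape
    flat = scans.reshape(n, v * 3)            # one row per scan
    mean = flat.mean(axis=0)                  # average body shape
    centered = flat - mean
    # SVD of the centered data yields the principal shape components.
    _, sigma, components = np.linalg.svd(centered, full_matrices=False)
    return mean, sigma, components

def morph(mean, components, coeffs):
    """Synthesize a body shape from k low-dimensional coefficients."""
    k = len(coeffs)
    flat = mean + coeffs @ components[:k]
    return flat.reshape(-1, 3)
```

With such a parameterization, measured scans can be projected into the coefficient space, and trajectories in that space can be used to interpolate or extrapolate body changes over time.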
2015-2016: In the CleverScan project (KTI 15620-1 PFES-ES), a camera system was developed with which manholes can be scanned and measured in 2D and 3D. The system will go into production in 2016. The new camera produces a detailed 3D reconstruction of the manhole with detailed images mapped onto the 3D surface. The camera uses five sensors: four synchronized sensors pointing to the side are used for the scanning, and one sensor pointing downwards captures a full HD video stream. The 3D reconstruction is done using four plane lasers.
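Laser-plane scanning of this kind typically follows a standard triangulation scheme: each detected laser pixel defines a viewing ray through the camera center, and the 3D point is the intersection of that ray with the calibrated laser plane. A minimal sketch of the geometry (variable names and calibration values are assumed, not taken from the CleverScan system):

```python
import numpy as np

def ray_plane_point(pixel, K, plane_n, plane_d):
    """Intersect a camera viewing ray with a calibrated laser plane.

    pixel:   (u, v) image coordinates of a detected laser point
    K:       3x3 camera intrinsic matrix
    plane_n: unit normal of the laser plane in the camera frame
    plane_d: plane offset, so every plane point p satisfies plane_n . p = plane_d
    """
    # Back-project the pixel to a ray direction through the camera origin.
    ray = np.linalg.inv(K) @ np.array([pixel[0], pixel[1], 1.0])
    # Scale the ray so it lands on the plane: plane_n . (t * ray) = plane_d
    t = plane_d / (plane_n @ ray)
    return t * ray
```

Repeating this for every laser pixel in every frame, while the scan head moves down the manhole, yields the point cloud onto which the image data is then mapped.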
2014-2015: In the i-Lumica Expert Gate project (KTI 14281-1 PFES-ES), a system consisting of 40 cameras and 20 projectors was developed with which hail damage can be detected on car bodies. The application comprises several modules to accomplish this task. A data acquisition module retrieves 2D and 3D data from the surface-scanning hardware provided by i-Lumica AG. Depending on the current analysis run, the 3D data is cleaned, repaired and optimized to produce mesh data of cars, which act as virtual master models. A segmentation module identifies, segments and stores car body parts based on these master models. Finally, a surface analysis module uses the acquired 2D hail damage data of an ad-hoc car body to simulate hail and assess damaged parts based on the master model of the same car type.
The information gained from such a damage analysis is stored in enhanced databases for further analysis, online access and online visualization for third-party service providers such as insurance companies, logistics experts and engineers.
2012-2014: In the Multimodal Passenger Flow Control System (KTI 13316.1 PFFLE-IW) project, intelligent stereo cameras were developed with which the flow of people in large buildings can be monitored. The camera system was successfully launched on the market.
2015-2017: Virtual Room VR: The Virtual Room project allows several people to meet in a virtual room over the network. Each person wears a current-generation VR headset (e.g. HTC Vive or Oculus Rift) with two hand controllers, whose position and orientation are precisely tracked in the respective real room and synchronized over the network. This makes it possible for a person in Switzerland and a person in Australia to meet in a virtual space and not only communicate verbally and visually, but also interact manually with their hands. Together they can, for example, examine a model of a machine and agree on a possible repair. The representation of one's own body in the virtual space can be chosen freely. The video below shows a demonstration of the project at the BFH stand at the 2016 professional fair in Bern.
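Synchronizing the tracked poses means each client streams the position and orientation of its headset and two controllers at a high rate. A minimal sketch of what such a fixed-size update packet could look like (the packet layout and field names are illustrative assumptions, not the project's actual protocol):

```python
import struct

# One pose = 3 float32 position + 4 float32 quaternion.
# Each update carries three poses: head, left controller, right controller.
PACKET_FMT = "<Bf" + "7f" * 3   # user id (1 byte), timestamp, 3 poses; no padding

def pack_update(user_id, timestamp, poses):
    """poses: list of three (px, py, pz, qx, qy, qz, qw) tuples."""
    flat = [v for pose in poses for v in pose]
    return struct.pack(PACKET_FMT, user_id, timestamp, *flat)

def unpack_update(data):
    vals = struct.unpack(PACKET_FMT, data)
    user_id, timestamp = vals[0], vals[1]
    poses = [tuple(vals[2 + 7 * i: 9 + 7 * i]) for i in range(3)]
    return user_id, timestamp, poses
```

At 89 bytes per update, even a 90 Hz tracking rate stays well under 10 kB/s per user, which is why pose streaming over an ordinary internet connection is feasible for intercontinental sessions.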
Olivier Gafner, Frederik Heck, Yannik Stuker Stratoon (Tracked Stratosphere Balloon) 2020
David Märki Semantic Segmentation for LOCSIM 2020
Michael Schertenleib Shadow Mapping with OpenGL in SLProject 2020
Steven Henz UI Framework for Oculus VR 2020
Nic Dorner Ray Tracing with Optix 2019
Joel Frutiger HoloLens for Machine Maintenance 2019
Roland Bruggmann Volume Rendering in Unity 2016
Nic Dorner Eulerian Video Magnification 2019
Michael Utz Player Tracking 2018
Stefan Töni Voxel Cone Tracing 2018
Jan Dellsperger & David Sheppard Low Cost Motion Capturing System 2018
Matthias Spring VR Kitchen Configurator 2018
Sebastian Häni & Raphael Laubscher Thermal Infrared Scanner for SBB Trains 2017
Vincent Genecand & Lukas Knöpfel Hololens for BIM 2017
Patrik Marti & Mathias Winkler Tetris GameBot 2016
Tina Gerber Face Morphing in PCA Space 2016
Daniel Probst & Kaspar Schmid Billiard Tracking at 120 FPS 2013
Language and country skills
- German - Native or bilingual
- English - Fluent
- French - Conversational