This lecture introduces students to machine learning basics and possible applications in engineering. The basic notions of classification and regression will be introduced, with a primary focus on regression. Students shall understand the basic structure of machine learning algorithms, which typically consists of three ingredients: 1) a parametric model or ansatz (e.g. a neural network), 2) a loss or objective function and 3) an optimization algorithm. The peculiarities of uncertain data as well as the importance of underfitting and overfitting and of training, test and validation sets will be discussed.
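The following minimal Python sketch (not part of the lecture material; the data and parameter names are made up for illustration) shows how these three ingredients fit together for a one-dimensional regression problem: a linear ansatz, a mean-squared-error loss and plain gradient descent.

```python
import numpy as np

# Synthetic, noisy regression data (invented for illustration)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 50)
y = 2.0 * x + 0.5 + 0.1 * rng.standard_normal(x.shape)

# 1) Parametric model (ansatz): a straight line with parameters w, b
def model(x, w, b):
    return w * x + b

# 2) Loss / objective function: mean squared error
def loss(w, b):
    return np.mean((model(x, w, b) - y) ** 2)

# 3) Optimization algorithm: plain gradient descent on w and b
w, b, lr = 0.0, 0.0, 0.5
for _ in range(2000):
    residual = model(x, w, b) - y
    grad_w = 2.0 * np.mean(residual * x)
    grad_b = 2.0 * np.mean(residual)
    w -= lr * grad_w
    b -= lr * grad_b

print(f"fitted w={w:.2f}, b={b:.2f}, loss={loss(w, b):.4f}")
```

Replacing the linear ansatz by a neural network and the plain gradient descent by a stochastic variant yields the general recipe discussed in the lecture.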
Algorithms and Programming
In algorithms and programming, concepts are taught that enable students to understand existing code and, in particular, to build new R&D software. In general, software development expertise is useful in all engineering disciplines and provides invaluable benefits especially in the context of numerics and machine learning, covered for instance by courses on finite elements or finite volumes and on data-driven material modeling. Throughout the exercises, the programming language Python is used. However, all concepts taught are applicable in a variety of languages, e.g. Java or C++, to name only a few. Besides programming basics, version control and container virtualization, principles of object-oriented programming, data structures and complexity considerations for different algorithms are discussed.
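To give a hedged taste of two of these topics, object-oriented programming and complexity considerations, the short sketch below wraps a toy benchmark in a class and contrasts membership tests in a list (linear time) with a set (expected constant time); all class and variable names are invented for illustration and do not stem from the course material.

```python
import time

class MembershipBenchmark:
    """Toy benchmark comparing O(n) list lookup with O(1) set lookup."""

    def __init__(self, n):
        self.items_list = list(range(n))       # contiguous list, linear search
        self.items_set = set(self.items_list)  # hash set, constant-time lookup

    def time_lookup(self, container, value, repeats=1000):
        start = time.perf_counter()
        for _ in range(repeats):
            _ = value in container
        return time.perf_counter() - start

bench = MembershipBenchmark(100_000)
missing = -1  # worst case: the element is not contained
print("list:", bench.time_lookup(bench.items_list, missing))
print("set: ", bench.time_lookup(bench.items_set, missing))
```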
As a cost-effective supplement to experiments and prototyping, computational engineering has become an indispensable part of development cycles. In parallel, the possibilities for collecting and storing in-situ data from the actual operation of products, processes and infrastructure have increased enormously.
Against this background, the aim of data-driven modeling is to link all available information from experiment, simulation and monitoring. In this way, it makes an important contribution to the development of digital twins: digital models that accompany physical products, processes or infrastructure throughout their life cycle. For this purpose, the specialization conveys competencies at the interface of modeling, numerics, machine learning and the quantification of uncertainties.
Course: Data-driven material modeling
Conventional material modeling is based on the assumption of simplified mathematical relationships for which material-specific parameters have to be determined. With machine learning methods, models can also be extracted directly from large data sets. This becomes especially appealing in the context of architectured multi-scale materials. At the end of the course, students are able to formulate material models using machine learning methods and to implement them in a finite element environment. The lecture is accompanied by a computer exercise and includes the following contents (a code sketch for the first topic follows the list):
Constitutive modeling with neural networks
Neural network architectures that respect fundamental modeling requirements, such as objectivity and thermodynamic consistency
Implementation of neural network constitutive models into finite element software
Multi-scale modeling
Computational homogenization with Convolutional Neural Networks
Generative Adversarial Neural Networks for augmentation of experimental data
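As a hedged illustration of the first topic above, the following sketch trains a small feed-forward network as a one-dimensional stress-strain surrogate on synthetic linear-elastic data. It uses PyTorch and an assumed material value purely for illustration, does not enforce the modeling requirements mentioned above (objectivity, thermodynamic consistency), and is not the implementation used in the course.

```python
import torch

# Synthetic 1D training data: linear elasticity sigma = E * eps (illustrative only)
E = 210.0e3                                          # Young's modulus in MPa (assumed value)
eps = torch.linspace(-0.01, 0.01, 200).unsqueeze(1)
sigma = E * eps

# Normalize inputs and outputs so the network trains on O(1) quantities
eps_n = eps / eps.abs().max()
sigma_n = sigma / sigma.abs().max()

# Small feed-forward network mapping normalized strain to normalized stress
model = torch.nn.Sequential(
    torch.nn.Linear(1, 16), torch.nn.Tanh(),
    torch.nn.Linear(16, 16), torch.nn.Tanh(),
    torch.nn.Linear(16, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = torch.nn.MSELoss()

for _ in range(2000):
    optimizer.zero_grad()
    loss_fn(model(eps_n), sigma_n).backward()
    optimizer.step()

# Query the trained surrogate like a constitutive routine (rescale afterwards)
eps_query = torch.tensor([[0.002]])
sigma_pred = model(eps_query / eps.abs().max()) * sigma.abs().max()
print(sigma_pred.item(), "MPa vs. exact", E * 0.002)
```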
Course: Advanced data-driven modeling
Like its precursor, data-driven material modeling, advanced data-driven modeling addresses topics at the intersection of continuum mechanics, numerics and machine learning, with a glimpse into uncertainty quantification. A fundamental understanding of the aforementioned disciplines, gained either from data-driven material modeling or from lectures on solid mechanics, finite elements and machine learning, is mandatory to follow this course. Advanced data-driven modeling provides a holistic view of parameter identification. Different optimization and sampling methods as well as different discretization techniques are discussed. Beyond parameter identification, data-driven closure terms will be introduced that can be used to update computational models from noisy data; a minimal Gaussian process example is sketched after the topic list below.
Inverse Problems and Parameter Identification
Many-query approaches to parameter identification
Physics-informed neural networks for parameter identification
Physics-informed neural networks in inverse problems, e.g. reconstruction of physical fields from incomplete observational data
Model Updating
Gaussian Processes
Statistical Finite Element Method
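As announced above, the following self-contained NumPy sketch illustrates Gaussian process regression from noisy data, one building block of the model-updating topics listed here. The kernel, hyperparameters and synthetic observations are assumptions chosen for illustration and are not the course's own code.

```python
import numpy as np

def rbf_kernel(xa, xb, length=0.2, variance=1.0):
    """Squared-exponential covariance between two 1D point sets."""
    d = xa[:, None] - xb[None, :]
    return variance * np.exp(-0.5 * (d / length) ** 2)

# Noisy observations of an unknown response (synthetic, for illustration)
rng = np.random.default_rng(1)
x_obs = np.linspace(0.0, 1.0, 10)
y_obs = np.sin(2.0 * np.pi * x_obs) + 0.1 * rng.standard_normal(x_obs.shape)
noise_var = 0.1 ** 2

# GP posterior at prediction points, conditioned on the noisy data
x_pred = np.linspace(0.0, 1.0, 100)
K = rbf_kernel(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
K_s = rbf_kernel(x_pred, x_obs)
alpha = np.linalg.solve(K, y_obs)
mean = K_s @ alpha                                   # posterior mean
cov = rbf_kernel(x_pred, x_pred) - K_s @ np.linalg.solve(K, K_s.T)
std = np.sqrt(np.clip(np.diag(cov), 0.0, None))      # pointwise posterior std

print("max predictive std:", std.max())
```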
Course: Methods of Uncertainty Quantification and Analysis
Predicting with complex computational models requires accurate parameter estimates. However, in practice, noisy and limited data introduce uncertainty. Therefore, this course presents concepts and methods that account for limited parameter knowledge in predictive modeling. Starting with the required background in probability theory, random descriptions of scalar and distributed model parameters will be studied first. Then, different approaches to surrogate modeling will be discussed, with emphasis on the underlying probability distributions. The methods range from non-intrusive spectral modeling to the stochastic Galerkin Finite Element method. Finally, advanced Monte Carlo approaches, such as the multilevel Monte Carlo method, will be introduced, which make it possible to quantify uncertainties efficiently even for complex models. At the end, students will be able to identify suitable methods for the uncertainty analysis of complex models and to carry out first computational UQ studies with open source software.
This course is offered by Prof. Ulrich Römer, Institut für Dynamik und Schwingungen.
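To illustrate the most basic ingredient of the UQ studies described above, the hedged NumPy sketch below propagates an uncertain stiffness parameter through a toy spring model with plain Monte Carlo sampling and reports the standard error of the mean estimate. The model, distribution and parameter values are invented for illustration and are not taken from the course.

```python
import numpy as np

def model(k):
    """Toy computational model: static deflection u = F / k of a spring."""
    F = 1.0  # deterministic load (assumed)
    return F / k

# Uncertain stiffness parameter, described by a lognormal distribution (assumed)
rng = np.random.default_rng(2)
n_samples = 100_000
k_samples = rng.lognormal(mean=np.log(10.0), sigma=0.1, size=n_samples)

# Plain Monte Carlo: propagate all samples through the model
u_samples = model(k_samples)

u_mean = u_samples.mean()
u_std = u_samples.std(ddof=1)
mc_error = u_std / np.sqrt(n_samples)  # standard error of the Monte Carlo mean

print(f"E[u] ~ {u_mean:.5f} +/- {mc_error:.5f}, Std[u] ~ {u_std:.5f}")
```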
Artificial intelligence is rapidly gaining importance across disciplines. How can we cope with the fast-paced research and development in this field? In the BMBF-funded collaborative project KI4ALL, we are developing AI-related microcredits: small, interchangeable teaching units. When clustered, microcredits can be integrated into existing curricula or be used in continuing education scenarios. For our collection of microcredits, please check the KI4ALL webpage.
To allow independent work on our microcredits without constraints on time and location, we offer access to a JupyterHub, which is documented here: KI4All. The terms of use can be found here: Nutzungsordnung Cluster KI4All.