In recent years, Graph Neural Networks (GNNs) and Transformers have led to numerous breakthrough achievements in a variety of fields such as Natural Language Processing (NLP), chemistry, and physics. By doing away with the need for fixed-size inputs, these architectures significantly extend the scope of problems to which deep learning can be applied.
Preliminary Agenda
This workshop will take you from the representation of graphs and finite sets as inputs for neural networks to the implementation of full GNNs for a variety of tasks. You will learn about the central concepts used in GNNs in a hands-on setting, using Jupyter Notebooks and a series of coding exercises. While the workshop will use problems from the field of chemistry as example applications, the skills you learn can be transferred to any domain where finite-set or graph-based representations of data are appropriate. From GNNs, we will make the leap to Transformer architectures and explain the conceptual ties between the two.
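To give a flavour of the central concept, below is a minimal sketch of one round of message passing in plain PyTorch. It is an illustration only, not taken from the workshop material; the class name, feature dimensions, and toy graph are made up for the example.

    import torch
    import torch.nn as nn

    # A toy graph (e.g. a tiny molecule): 3 nodes with 4-dimensional
    # features, connectivity stored as an adjacency matrix.
    x = torch.randn(3, 4)                  # node features (num_nodes x feat_dim)
    adj = torch.tensor([[0., 1., 1.],      # node 0 is connected to nodes 1 and 2
                        [1., 0., 0.],
                        [1., 0., 0.]])

    class SimpleGNNLayer(nn.Module):
        """One round of message passing: each node sums its neighbours'
        features, concatenates them with its own, and applies a learned
        linear map followed by a nonlinearity."""
        def __init__(self, in_dim, out_dim):
            super().__init__()
            self.linear = nn.Linear(2 * in_dim, out_dim)

        def forward(self, x, adj):
            messages = adj @ x                        # sum of neighbour features per node
            combined = torch.cat([x, messages], dim=-1)
            return torch.relu(self.linear(combined))

    layer = SimpleGNNLayer(in_dim=4, out_dim=8)
    h = layer(x, adj)                                 # updated node embeddings, shape (3, 8)

Note how nothing in the layer depends on the number of nodes: the same weights handle graphs of any size, which is the sense in which these architectures do away with fixed-size inputs.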
The workshop is free of charge and will be conducted fully online using Zoom.
To successfully participate in this workshop, you should have a good understanding of basic linear algebra and core concepts of deep learning such as CNNs, stochastic gradient descent, and supervised learning. You should also be familiar with the implementation of neural networks using PyTorch. A basic conceptual understanding of mathematical graphs is recommended but not a prerequisite.
For the updated agenda, follow this link:
https://enccs.se/events/2021/10/advanced-deep-learning/
For further questions, please contact us at training@enccs.se.