BEGIN:VCALENDAR
VERSION:2.0
PRODID:-//ENCCS - ECPv6.15.16//NONSGML v1.0//EN
CALSCALE:GREGORIAN
METHOD:PUBLISH
X-ORIGINAL-URL:https://enccs.se
X-WR-CALDESC:Events for ENCCS
REFRESH-INTERVAL;VALUE=DURATION:PT1H
X-Robots-Tag:noindex
X-PUBLISHED-TTL:PT1H
BEGIN:VTIMEZONE
TZID:Europe/Stockholm
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20200329T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20201025T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20210328T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20211031T010000
END:STANDARD
BEGIN:DAYLIGHT
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
TZNAME:CEST
DTSTART:20220327T010000
END:DAYLIGHT
BEGIN:STANDARD
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
TZNAME:CET
DTSTART:20221030T010000
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTART;TZID=Europe/Stockholm:20211123T090000
DTEND;TZID=Europe/Stockholm:20211124T160000
DTSTAMP:20260422T053848Z
CREATED:20211011T095300Z
LAST-MODIFIED:20211019T070509Z
UID:11177-1637658000-1637769600@enccs.se
SUMMARY:Advanced Deep Learning with Transformers - ENCCS/RISE
DESCRIPTION:Overview\nIn recent years\, Graph Neural Networks (GNNs) and Transformers have led to numerous breakthroughs in fields such as Natural Language Processing (NLP)\, chemistry\, and physics. By doing away with the need for fixed-size inputs\, these architectures significantly extend the range of problems to which deep learning can be applied.\nPreliminary Agenda\nThis workshop will take you from the representation of graphs and finite sets as inputs for neural networks to the implementation of full GNNs for a variety of tasks. You will learn the central concepts behind GNNs in a hands-on setting\, using Jupyter Notebooks and a series of coding exercises. While the workshop uses problems from chemistry as example applications\, the skills you learn can be transferred to any domain where finite-set or graph-based representations of data are appropriate. From GNNs\, we will make the leap to Transformer architectures and explain the conceptual ties between the two.\nThe workshop is free of charge and will be conducted fully online via Zoom.\nPrerequisites\nTo participate successfully in this workshop\, you should have a good understanding of basic linear algebra and of core deep learning concepts such as CNNs\, stochastic gradient descent\, and supervised learning. You should also be familiar with implementing neural networks in PyTorch. A basic conceptual understanding of mathematical graphs is recommended but not required.\nAgenda\nTuesday\, 23 November 2021\n[ninja_tables id="10926"]\nWednesday\, 24 November 2021\n[ninja_tables id="10927"]\nRegistration\nRegistration is now closed\, as the maximum number of participants has been reached. Subscribe to our newsletter or follow us on Twitter and LinkedIn\, where we regularly announce all our new events.\n————\nThis training is intended for users established in the European Union or in a country associated to Horizon 2020.
URL:https://enccs.se/events/graph-neural-networks-and-transformers-2/
LOCATION:Online
CATEGORIES:ENCCS Event,RISE Event
ATTACH;FMTTYPE=image/jpeg:https://media.enccs.se/2021/09/Avanced-deep-learning-RISE.jpg
END:VEVENT
END:VCALENDAR