Invited Speakers per track/theme

Tracks:

Joint sessions by ASCI, IPA & SIKS

Bart Jacobs (day 1)

Bart Jacobs is a professor of computer security at Radboud University Nijmegen. With his research group he has worked over the last decade on a number of scientifically and societally relevant security topics, such as chipcards (e.g. in passports and transport), electronic voting, smart metering, road pricing and privacy. He is a member of the Academia Europaea and of the National Cyber Security Council, and he heads the advisory board of the digital rights organisation Bits of Freedom.

 

Abstract

ICT issues are high on the political agenda, for instance in the security of voting or in debates about the role of big data in policy making. This talk will give an overview of the underlying issues, from the perspective that computer scientists are no longer the architects of the digital world but have become architects of the social world.

 

Marc van Kreveld (day 2)

Marc van Kreveld is a professor in computational geometry and its applications at the Department of Information and Computing Sciences of Utrecht University, the Netherlands. Best known as a co-author of the textbook "Computational Geometry - Algorithms and Applications", he leads the Virtual Worlds Division in Utrecht. His work has been published in about 100 journal articles and 100 peer-refereed conference papers, with over 150 different co-authors in total. In addition to geometric algorithms, Prof. van Kreveld's research interests include GIScience, movement data algorithms, graph drawing, and puzzle design. The latter started as a hobby, made its way into his research and teaching in computational geometry, and expanded to other puzzle types such as paper-based drawing puzzles and digital puzzles. Prof. van Kreveld served as program committee co-chair of the International Symposium on Computational Geometry and the International Symposium on Graph Drawing, and as organizing co-chair of CG Week, which includes the International Symposium on Computational Geometry. He has been an invited speaker at the European Symposium on Algorithms (Wroclaw) and at the Workshop on Computational Geometry and Games (Kyoto). In addition, he is on the editorial board of four major journals: Computational Geometry - Theory and Applications, Journal of Computational Geometry, Journal of Spatial Information Science, and ACM Transactions on Spatial Algorithms and Systems.

 

Abstract

Title: Geometric Measures for Geometric Data Analysis and Geometric Content Generation

While the whole world seems to be moving towards data science, there is also science that starts without any data and whose purpose is to generate data out of nothing. We will discuss the generation of geometric data, where shapes, distances, and angles play a role. The generated content should typically have certain properties, specified by geometric measures. Think of generating a set of 1000 floorplans of buildings for analyzing motion models of humans during evacuation. These floorplans must satisfy geometric constraints to be realistic, yet they should show sufficient variety to allow the motion model to be analyzed in different environments. Computing geometric measures, or modifying content so that it satisfies certain properties, requires geometric algorithms. Geometric measures are not only needed in geometric content generation; they also play an essential role in other computational tasks. We will review a number of them, ranging from cartographic algorithms via puzzle design to movement data analysis.
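To make the idea of measure-driven content generation concrete, the sketch below (not part of the talk; the constraints and numbers are purely illustrative) rejection-samples random rectangular rooms and keeps only those whose geometric measures, here area and aspect ratio, stay within prescribed bounds, so the generated set is both realistic and varied.

```python
import random

# Illustrative sketch: accept randomly generated rooms only if their geometric
# measures (area, aspect ratio) satisfy the constraints.
MIN_AREA, MAX_ASPECT = 12.0, 3.0  # made-up constraints

def random_room():
    return random.uniform(2, 10), random.uniform(2, 10)  # width, height in metres

def satisfies_measures(w, h):
    area = w * h
    aspect = max(w, h) / min(w, h)
    return area >= MIN_AREA and aspect <= MAX_ASPECT

rooms = []
while len(rooms) < 1000:            # e.g. 1000 rooms for an evacuation study
    w, h = random_room()
    if satisfies_measures(w, h):    # geometric measure acts as the acceptance test
        rooms.append((w, h))

print(len(rooms), "rooms generated; example:", rooms[0])
```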

 

PROGRESS

Said Hamdioui (day 2)

Said Hamdioui is a chair professor of dependable and emerging computing technologies at Delft University of Technology (TU Delft), Delft, The Netherlands. Prior to joining TU Delft, Hamdioui worked for Intel (CA), Philips Semiconductors R&D (France) and Philips/NXP Semiconductors (the Netherlands). His research focuses on two domains: dependable CMOS nano-computing (including reliability, testability and hardware security) and emerging technologies and computing paradigms (including 3D stacked ICs, memristors for logic and storage, and in-memory computing for big-data applications).

 

Abstract
Title: Computation for the future: Beyond CMOS and beyond von Neumann

Today’s computing systems are based on von Neumann (VN) architectures and mainly rely on many parallel (mini-)cores with a shared SRAM cache (parallel CPUs, GPUs, SIMD-VLIWs, and vector processors). It is well recognised that such solutions suffer from major limitations such as decreased performance acceleration per core, increased power consumption, and limited system scalability. These limitations are mainly caused by the processor-memory bottleneck. As current and future data-intensive applications require huge data transfers back and forth between processors and memories through load/store instructions, the maximal performance cannot be extracted, because the processors will have many idle moments while waiting for data. Moreover, today’s computers are manufactured using traditional CMOS technology, which is reaching its inherent physical limits due to transistor down-scaling. Technology nodes far below 20 nm are presumably only practical for limited applications due to multiple challenges, such as high static power consumption, reduced performance gain, reduced reliability, a complex manufacturing process leading to low yield, a complex testing process, and extremely costly masks.


This talk will first address CMOS scaling and its impact on different aspects of ICs and electronics; the major limitations and challenges that scaling is facing (such as leakage, yield and reliability) will be shown and the need for a new technology will be motivated. Thereafter, an overview of computing systems, developed since the introduction of stored-program computers by John von Neumann in the forties, will be given. Shortcomings of today’s architectures, including their limitations in dealing with data-intensive applications, will be discussed. It will be shown that the speed at which data is growing has already surpassed the capabilities of today’s computing architectures, which suffer from a communication bottleneck and energy inefficiency; hence the need for a new architecture. Possible future architectures will be highlighted and a new architecture paradigm for data-intensive applications will be introduced; it is based on the integration of storage and computation in the same physical location (using a crossbar topology) and on the use of non-volatile resistive-switching technology, based on memristors, instead of CMOS technology. The huge potential of such an architecture in realizing orders-of-magnitude improvements will be illustrated by comparing it with state-of-the-art architectures for different data-intensive applications.
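As a rough illustration of the computation-in-memory idea the abstract refers to, the sketch below (an idealized model, not material from the talk) shows how a crossbar of resistive devices that stores a matrix as conductances computes a matrix-vector product in place: applying voltages to the rows yields, by Ohm's and Kirchhoff's laws, column currents equal to the product, without shuttling the matrix through load/store instructions.

```python
import numpy as np

# Idealized memristor-crossbar sketch (assumed, simplified device model):
# the matrix lives in the crossbar as conductances G; the computation happens
# where the data is stored.
rng = np.random.default_rng(0)

G = rng.uniform(1e-6, 1e-4, size=(4, 3))  # conductances (S), one device per crosspoint
V = rng.uniform(0.0, 0.5, size=4)         # input voltages (V) applied to the 4 rows

I = G.T @ V                               # column currents (A) = analog matrix-vector product
print("column currents:", I)
```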

 

Kerstin Eder (day 1)

Dr Kerstin Eder is a Professor of Computer Science at the University of Bristol, UK. She set up the Energy Aware COmputing (EACO) initiative (http://www.cs.bris.ac.uk/Research/eaco/) and leads the Verification and Validation for Safety in Robots research theme at the Bristol Robotics Laboratory (http://www.brl.ac.uk/vv). Her research is focused on specification, verification and analysis techniques which allow engineers to design a system and to verify/explore its behaviour in terms of functional correctness, performance and energy efficiency. Kerstin has gained extensive experience of verifying complex microelectronic designs while working with leading semiconductor design and Electronic Design Automation companies. In her research she seeks novel combinations of techniques to achieve solutions that make a difference in practice. Her most recent work includes intelligent, agent-based testing of code for robots that directly interact with humans, using assertion checks and theorem proving to verify control system designs, energy modelling of software and static analysis to predict energy consumption of programs. She is particularly interested in safety assurance for learning machines.

Kerstin was a Principal Investigator of the EC FP7 FET MINECC (Minimizing Energy Consumption of Computing to the Limit) collaborative research project ENTRA (Whole Systems Energy Transparency), which developed techniques to promote energy efficiency to a first-class software design goal by utilizing advanced energy modelling and static analysis techniques. At the Bristol Robotics Laboratory she is the Principal Investigator of two EPSRC projects: RIVERAS (Robust Integrated Verification of Autonomous Systems) and ROBOSAFE (Trustworthy Robotic Assistants). Kerstin has co-authored over 60 internationally refereed publications and was awarded a Royal Academy of Engineering "Excellence in Engineering" prize. She holds a PhD in Computational Logic, an MSc in Artificial Intelligence and an MEng in Informatics.
 

Abstract
Title: Whole Systems Energy Transparency / More Power to Software Developers!

Energy efficiency is now a major (if not the major) concern in electronic systems engineering. While hardware can be designed to save a modest amount of energy, the potential for savings is far greater at the higher levels of abstraction in the system stack. The greatest savings are expected from energy-consumption-aware software. This seminar emphasizes the importance of energy transparency from hardware to software as a foundation for energy-aware system design. Energy transparency enables a deeper understanding of how algorithms and coding impact the energy consumption of a computation when it is executed on hardware. It is a key prerequisite for informed design space exploration and helps system designers find the optimal tradeoff between performance, accuracy and energy consumption of a computation. Promoting energy efficiency to a first-class software design goal is therefore an urgent research challenge. I will outline our approach, techniques and recent results towards giving "more power" to software developers. We will cover energy monitoring of software, energy modelling at different abstraction levels, including insights into how data affects the energy consumption of a computation, and static analysis techniques for estimating energy consumption.
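As a minimal, hypothetical illustration of software energy modelling (the per-instruction costs below are invented for the example and are not ENTRA results), one can estimate the energy of a code fragment by multiplying dynamic instruction counts, obtained from profiling or static analysis, by an average energy cost per instruction class:

```python
# Toy instruction-level energy model: estimated energy =
# sum over instruction classes of (execution count) x (cost per instruction).
ENERGY_PER_INSTR_NJ = {   # hypothetical per-class costs in nanojoules
    "alu":    0.5,
    "load":   2.0,
    "store":  2.2,
    "branch": 0.8,
}

def estimate_energy_nj(instr_counts):
    """instr_counts: mapping from instruction class to dynamic execution count."""
    return sum(ENERGY_PER_INSTR_NJ[cls] * n for cls, n in instr_counts.items())

# Counts as a profiler or static analysis might report them for one loop.
counts = {"alu": 12_000, "load": 4_000, "store": 1_000, "branch": 3_000}
print(f"estimated energy: {estimate_energy_nj(counts) / 1e3:.1f} microjoules")
```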

 

Orlando Moreira (day 1)

Orlando Moreira leads an R&D group at Intel Corporation that develops compilers and tools for simulator and hardware generation. He received his engineering degree from the University of Aveiro and holds a PhD in Electrical Engineering from the Eindhoven University of Technology. He has worked in embedded systems research and development since 2000, first at Philips, then at NXP and Ericsson, before joining Intel in early 2015. He has published work on several topics, including parallel models of computation, real-time analysis, runtime resource management, and tools for programming, scheduling and temporal analysis of multi-processor systems.

 

Abstract

Title: Software Development Tools for Embedded Systems: the challenge of going from research to development

 

ProRISC

Bram Nauta (day 1)

Bram Nauta was born in 1964 in Hengelo, The Netherlands. In 1987 he received the M.Sc. degree (cum laude) in electrical engineering from the University of Twente, Enschede, The Netherlands. In 1991 he received the Ph.D. degree from the same university on the subject of analog CMOS filters for very high frequencies. In 1991 he joined the Mixed-Signal Circuits and Systems Department of Philips Research, Eindhoven, The Netherlands. In 1998 he returned to the University of Twente as a full professor heading the IC Design group. His current research interests are high-speed analog CMOS circuits, software-defined radio, cognitive radio and beamforming techniques.

 

Jan Prummel (day 2)

Jan Prummel received the M.Sc. degree in electronic control engineering from the University of Salford, Manchester, U.K., in 1993.

He joined NXP Semiconductors, Eindhoven, The Netherlands, in 1994 as an analogue Circuit Designer, where he worked on portable and car radio receiver ICs.

In 2000, he joined Dialog Semiconductor as an RF circuit and system development engineer. He is currently a Member of Technical Staff in the Advanced Technology Connectivity Group, focusing on low-power, high-performance radio architectures in standard CMOS for the IoT. He holds seven patents and has co-authored four academic journal and conference papers.

 

Abstract
Title: Interference-robust low-power radios for IoT

In 2014 Dialog Semiconductor introduced its first Bluetooth Low Energy integrated circuit to the market, which was at the time the first IC designed from scratch for this purpose. As a result, it set the benchmark for low power consumption, high interference robustness, small size and low cost in commercially available IoT solutions. This talk will focus on the architecture, circuit design and measurement results of the radio part, in comparison to several recently published developments.

 

SAFE

Gijs Krijnen (day 1)

Sten Vollebregt (day 1)

Themes:

Cyber Security

Maria Dubovitskaya

Dr. Maria Dubovitskaya is a research staff member at IBM Research Zurich working on designing cryptographic protocols for privacy protection and applying them in practice. She recently gave a TED@IBM talk on privacy-preserving authentication. Maria holds a Ph.D. in cryptography and privacy from ETH Zurich and has been granted several patents in the security field. She is a founder and co-leader of IBM Technical Excellence Council R/CIS, and is the recipient of the 2012 Anita Borg Change Agent Award for promoting science and tech for young women.

 

Abstract

Title: Privacy-preserving identity management and authentication: Blockchain and beyond
People use online services with the belief that small amounts of information cannot reveal enough to impact them in a negative way. Therefore, they give away much more information about themselves than they may care to admit. Whether buying a bottle of wine, making an online purchase, or submitting a transaction on a blockchain, most of us share far more information than is necessary: birthdates, credit card numbers, addresses.
Indeed, it is possible to build a complete picture of someone's movements, transactions, preferences, and relationships from the trail left by online interactions and by querying various databases.

With an increasing number of identity and service providers, it is very hard to keep control over one’s personal data. Therefore, identity management systems and privacy-preserving technologies are becoming a key ingredient of modern IT infrastructure.

IBM Identity Mixer is a cryptographic protocol suite that provides user-centric identity management and strong authentication without collecting personal data. Thus, no personal data needs to be protected, managed, and treated according to complex legal regulations, while service providers can rest assured that their access restrictions are fully satisfied.

In this talk, I will explain the basic principles behind the Identity Mixer technology (zero-knowledge proofs) and compare it to the standard means of authentication (OAuth/OpenID and X.509 certificates). Then I will describe how Identity Mixer can be used in practice as a service and mobile app and show a live demo. I will also highlight the most prominent use cases and applications of Identity Mixer that include IoT and signing transactions on a blockchain.
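Identity Mixer itself is built on Camenisch-Lysyanskaya signatures, but the zero-knowledge principle it relies on can be illustrated with a much simpler, toy-sized Schnorr-style proof of knowledge of a discrete logarithm. The sketch below uses deliberately tiny parameters and is in no way the actual Identity Mixer protocol.

```python
import secrets

# Toy Schnorr-style zero-knowledge proof of knowledge of x with y = g^x mod p.
# NOT the Identity Mixer protocol; parameters are far too small for real use.
p, q, g = 23, 11, 2          # g generates a subgroup of prime order q in Z_p*

x = secrets.randbelow(q)     # prover's secret
y = pow(g, x, p)             # public value

# Prover commits
r = secrets.randbelow(q)
t = pow(g, r, p)

# Verifier sends a random challenge
c = secrets.randbelow(q)

# Prover responds; s alone reveals nothing about x
s = (r + c * x) % q

# Verifier accepts iff g^s == t * y^c (mod p)
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("proof accepted; the verifier learned y but not x")
```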

 

Data Science

Reynold Xin (day 2)

Reynold Xin is a co-founder and Chief Architect at Databricks, a San Francisco-based cloud big data platform company. At Databricks, he has led the development of Apache Spark and helped push Spark to become the most popular open source big data project. Prior to Databricks, he pursued PhD research at the UC Berkeley AMPLab, where he worked on large-scale data processing.

 

Abstract

Title: Simplifying Big Data with Spark: The Journey So Far and the Road Ahead

Over the last couple of years we have witnessed Apache Spark's rise to dominance in the big data software space, due to its simplicity, expressiveness, and performance. In this talk, I will review some of the important developments in Spark that led to its success, followed by broader technology trends in 2017 and how Spark is moving to meet them.
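To give a flavour of the simplicity and expressiveness the talk credits for Spark's success, here is a minimal PySpark sketch (assuming a local pyspark installation; the data and column names are made up) that declares a grouped aggregation and leaves the distributed execution planning to Spark:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Start a local Spark session (illustrative example, not from the talk).
spark = SparkSession.builder.appName("demo").getOrCreate()

df = spark.createDataFrame(
    [("alice", 3), ("bob", 5), ("alice", 7)],
    ["user", "clicks"],
)

# Declarative aggregation; Spark's optimizer plans the distributed execution.
df.groupBy("user").agg(F.sum("clicks").alias("total_clicks")).show()

spark.stop()
```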

 

Health

Marleen de Bruijne (day 2)

Marleen de Bruijne is associate professor of medical image analysis at Erasmus MC Rotterdam, The Netherlands, and at the University of Copenhagen, Denmark. She received an MSc degree in physics (1997) and a PhD degree in medical imaging (2003), both from Utrecht University, The Netherlands. From 2003 to 2006 she was assistant professor and later associate professor at the IT University of Copenhagen, Denmark.

Dr. De Bruijne has (co-)authored over 150 peer-reviewed papers in international conferences and journals and is the recipient of an NWO-VENI, NWO-VIDI, and DFF-YDUN award. She has been a member of the program committee of many international conferences in medical imaging and computer vision, including MICCAI, SPIE Medical Imaging, ISBI, and IPMI. She is a member of the EMBS Technical Committee on Biomedical Imaging and Image Processing, of the Information Processing in Medical Imaging board, and of the editorial boards of Medical Image Analysis and Frontiers in ICT - Computer Image Analysis. Her research interests are model-based and quantitative analysis of medical images, with applications including pulmonary imaging, neuroimaging, and cardiovascular imaging.

 

Abstract
Title: Machine learning imaging biomarkers

Quantitative analysis of medical imaging data plays an increasingly important role both in clinical studies and in the diagnosis, monitoring, and prognosis of disease in individual patients. Traditional techniques measure factors that are well known to be related to disease, such as the density of lung tissue, which relates to lung function, or the size of certain brain structures, which may help to predict the development of dementia. Large databases that contain both medical image data and related patient data enable a new, more data-driven approach, in which image characteristics that are related to disease outcome are learned directly from a training set of patient data. This talk will cover different approaches to learning disease-specific models from imaging data, discuss common issues in large medical imaging studies, and show applications in early diagnosis and prediction in various clinical settings.
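A minimal sketch of this data-driven approach, on synthetic data rather than anything from the talk, is to learn which pre-extracted image features predict an outcome instead of relying on a single hand-picked measure; for example, with scikit-learn:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Synthetic example: learn a disease-outcome classifier from image features.
rng = np.random.default_rng(0)
n_patients, n_features = 200, 20

X = rng.normal(size=(n_patients, n_features))    # e.g. regional texture/density features
w_true = np.zeros(n_features)
w_true[:3] = 1.5                                 # only a few features carry signal
y = (X @ w_true + rng.normal(size=n_patients) > 0).astype(int)  # disease outcome

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```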

 

Machine Learning

Bert Kappen (day 1)

Bert Kappen completed his PhD in theoretical particle physics in 1987 at Rockefeller University in New York. From 1987 until 1989 he worked as a scientist at the Philips Research Laboratories in Eindhoven, the Netherlands. Since 1989 he has conducted research on neural networks, Bayesian machine learning, stochastic control theory and computational neuroscience. His research has made significant contributions to approximate inference in machine learning using methods from statistical physics; he has pioneered the field of path integral control methods for solving large non-linear stochastic optimal control problems and established their relation to statistical physics. In 1997 his research was awarded the prestigious national PIONIER research subsidy from NWO/STW. He has been associate professor since 1997 and full professor since 2004 at Radboud University. In 2005 he was Miller visiting professor at the University of California at Berkeley. Since 2009 he has been honorary faculty at UCL's Gatsby Computational Neuroscience Unit in London. In 1998 he co-founded the company Smart Research, which commercializes applications of neural networks and machine learning. Smart Research has developed forensic software for DNA matching used by the Dutch Forensic Institute (MH17 plane crash over Ukraine in 2014), Interpol, the Vietnamese government for the analysis of victims of the Vietnam war, and the Australian police force. He is director of the Dutch Foundation for Neural Networks (SNN), which coordinates research on neural networks and machine learning in the Netherlands through the Machine Learning Platform.

 

Abstract

In this talk, I give an overview of recent work that relates control theory, Monte Carlo sampling and learning. 

 

In particular, I will show how the solution to a stochastic optimal control problem can be expressed as a statistical estimation problem that can be approximated by Monte Carlo sampling. I will show how this idea can be applied to very high dimensional non-linear real-time control problems, such as the coordination of autonomous helicopters. 
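A toy sketch of this idea for a one-dimensional problem (the setup and constants are illustrative, not the talk's): sample uncontrolled noisy rollouts, weight each by exp(-cost/lambda), and estimate the optimal first control as the weighted average of the first noise increment, i.e. a plain Monte Carlo estimate of an expectation.

```python
import numpy as np

# Toy path-integral-style control estimate for x_{t+1} = x_t + u*dt + noise.
rng = np.random.default_rng(0)
dt, T, lam, sigma = 0.05, 20, 1.0, 0.5
n_rollouts, x0, target = 5000, 0.0, 1.0

noise = rng.normal(0.0, sigma * np.sqrt(dt), size=(n_rollouts, T))  # sampled noise paths
paths = x0 + np.cumsum(noise, axis=1)                               # uncontrolled rollouts
cost = np.sum((paths - target) ** 2, axis=1) * dt                   # state cost per rollout

w = np.exp(-(cost - cost.min()) / lam)                              # exp(-cost/lambda) weights
w /= w.sum()
u0 = (w @ noise[:, 0]) / dt                                         # weighted first noise -> control
print("estimated first control:", u0)
```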

 

In the second half of the talk, I will show how the relation between control and inference can also be reversed. We consider a complex time-series estimation problem, in this case the estimation of neural activity from observed fMRI signals. The control problem is to ‘steer’ the latent neural state such that it best explains the observed fMRI data. 

 

Ramon Schiffelers (day 1)

Ramon Schiffelers is a software innovation architect at ASML, the world's leading provider of lithography systems for the semiconductor industry, and a part-time assistant professor in the Department of Mathematics and Computer Science at the Eindhoven University of Technology.

Within ASML, he leads a software research group (approximately 25 FTE) consisting of software architects, researchers, scientific programmers, and PhD, PDEng and MSc students.

He focuses on developing theory, methods and tools to enable cost-effective, industrial-scale model-driven system and software engineering. For this, he combines state-of-the-art methods and techniques from academia with the state of the practice in industry into innovative (software) products. These products empower their users to develop highly complex software-intensive (sub)systems.

Ramon is positioned at the interface between scientific knowledge and its application in industry. In addition to innovative products, this has resulted in long-term collaborative research and innovation between ASML, several departments of the Eindhoven University of Technology, and knowledge institutes such as the High Tech Systems Center (HTSC) and TNO-ESI.

 

Abstract
Title: Model Driven Development of TWINSCAN software, but not from scratch!

ASML is the world's leading provider of complex lithography systems (TWINSCAN) for the semiconductor industry. Lithography systems are highly complex cyber-physical systems. They contain a huge amount of complex software, which is essential for achieving extreme accuracy, very high throughput and exceptionally reliable results in 24/7 operation.

To develop this TWINSCAN software, ASML uses model-driven engineering techniques. However, a large part of the software was developed in the past using traditional software engineering techniques. Since it is too time-consuming to develop or re-engineer models for this part of the software by hand, ASML aims for (highly) automated model inference techniques to obtain those models.

In this talk, Ramon discusses ASML’s software engineering challenges. Three PhD students from the Eindhoven University of Technology, Maikel Leemans, Kousar Aslam, and Alfredo Bolt, will introduce and apply state-of-the-art model inference techniques to TWINSCAN software and discuss their results.
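As a minimal illustration of what passive model inference does (the event names and traces below are invented, not ASML data), one can build a prefix-tree automaton, a simple state-machine model, from logged execution traces; state-of-the-art tools then generalize such a model by merging compatible states.

```python
# Build a prefix-tree automaton from (made-up) execution traces.
traces = [
    ["init", "load", "expose", "unload"],
    ["init", "load", "measure", "expose", "unload"],
    ["init", "calibrate", "load", "expose", "unload"],
]

transitions = {}   # (state, event) -> next state
state_count = 1    # state 0 is the initial state

for trace in traces:
    state = 0
    for event in trace:
        key = (state, event)
        if key not in transitions:
            transitions[key] = state_count   # create a fresh state for a new prefix
            state_count += 1
        state = transitions[key]

for (state, event), dst in sorted(transitions.items()):
    print(f"s{state} --{event}--> s{dst}")
```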