9 October 2018

The next-generation DAS, DAS-6, will receive funding! For details, see the DAS Achievements page. DAS-6 is expected to become operational in the second half of 2020.

2 Jan 2017

DAS-5/VU has been extended with 4 TitanX-Pascal GPUs.

May, 2016

IEEE Computer publishes a paper about 20 years of the Distributed ASCI Supercomputer. See the DAS Achievements page.

28 Sep 2015

DAS-5/VU has been extended with 16 GTX TitanX GPUs.

DAS-5 Research

e-Infrastructure management

The COMMIT P20 Infrastructure management work packages

(prof. de Laat, dr. Grosso; prof. Epema, TUD)

These projects aim to ease the management of highly complicated scientific computing infrastructures by effectively shielding the user from low-level complexity. P20 investigates how to design a programmable e-Science architecture that can describe the infrastructure components and optimize them for typical usage scenarios. The architecture will allow users to create, configure, and operate secure virtualized distributed computing environments that automatically scale with computational demands. Specific work packages (each having one PhD student or postdoc) focus on:

  • Using semantic description of infrastructures for better integration, federation and sustainable ICT;
  • Mechanisms and tools to provide security of (virtual) e-Science infrastructure;
  • System-oriented studies in scheduling and resource management in data centers;
  • Application-oriented studies in scheduling and resource management in data centers.

ENVRI: Common Operations of ENVironmental Research Infrastructures

(prof. de Laat, dr. Grosso, UvA).

This EU FP-7 project investigates how our information modeling and workflow planning tool can be used to federate and integrate diverse environmental research infrastructures. The main goal is to develop a common reference model for the ESFRI (European Strategy Forum on Research Infrastructures) and to prototype selected common functionalities for these infrastructures, using the semantic linking framework that the UvA leads. The results will speed up the construction of these Environmental Sciences research infrastructures and will allow scientists to use the data and software from each facility to enable multi-disciplinary science.

Research on Really Reliable and Secure Systems Software

(prof. Tanenbaum, VU)

This is a 2.4M euro EU ERC Advanced Grant project. It is based on MINIX-3, a modular multiserver operating system designed for high reliability. Analyzing its performance requires running timing tests with many different workloads; DAS-5 will provide a good testbed for such large-scale tests (using virtualization). Another PhD subproject is fault injection, in which we test and compare systems designed to contain failures caused by buggy software. Here, workloads need to be executed many times with different faults injected.

Greening software with software-defined network services

(dr. Lago, VU; dr. Grosso, UvA).

This project is funded by the VU and UvA and studies how to make distributed software systems and applications more energy efficient by dynamically selecting and exploiting the greenest available network connections. The emerging Software Defined Networking paradigm empowers end-users to control the behavior of the network and lends itself to programming energy-efficient and sustainable network services. The project aims to determine how existing software systems can be transformed according to green and cloud-based models that integrate dynamic green network services.

Taming Hardware Diversity

(dr. Varbanescu, UvA).

This NWO VENI project does research in large-scale graph processing across multiple platforms (GPUs, multicores, clusters).

NWO GreenClouds

(prof. Bal, VU; prof. de Laat, UvA).

In GreenClouds we study how to exploit hardware diversity to reduce energy consumption of high-performance applications. The research focuses on green scheduling and management of clouds. For example, we have implemented a MapReduce framework on DAS-4 that can run on a mix of CPUs and GPUs to save energy.
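The MapReduce model mentioned above revolves around a map phase that emits key/value pairs, a shuffle that groups values by key, and a reduce phase that combines each group. A minimal single-machine Python sketch of this model (purely illustrative; not the actual DAS-4 CPU/GPU framework):

```python
from collections import defaultdict

def map_reduce(records, map_fn, reduce_fn):
    # Map phase: each input record emits zero or more (key, value) pairs.
    groups = defaultdict(list)
    for record in records:
        for key, value in map_fn(record):
            groups[key].append(value)  # shuffle: group values by key
    # Reduce phase: combine the values collected for each key.
    return {key: reduce_fn(key, values) for key, values in groups.items()}

# Classic word-count example.
docs = ["to be or not to be"]
counts = map_reduce(
    docs,
    map_fn=lambda doc: [(word, 1) for word in doc.split()],
    reduce_fn=lambda key, values: sum(values),
)
# counts == {"to": 2, "be": 2, "or": 1, "not": 1}
```

In a real framework the map and reduce phases run in parallel across workers (CPUs or GPUs), which is what makes hardware-aware scheduling of the two phases an energy-saving opportunity.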

The COMMIT P20 Distributed Computing work packages

(prof. Bal, dr. Kielmann, VU; prof. Bubak, dr. Belloum, UvA)

These work packages investigate efficient methods to program data-intensive applications on heterogeneous systems and to build workflow-based collaborative problem solving environments. Specific work packages focus on:

  • Programming systems for elastic cloud applications;
  • e-Science applications on large-scale hybrid distributed systems;
  • Modeling and integrating workflow processes of complex experiments;
  • Methods enabling workflow sharing and reproducibility of results;
  • Workflow and application component integration.

NLeSC Climate Research Project

(prof. Dijkstra, IMAU/UU; prof. Bal, VU; dr. Seinstra, NLeSC).

This project studies projections of future sea level changes due to changes in the North Atlantic ocean circulation. The innovative element is the use of an eddy-resolving ocean model to determine these sea level changes with an unprecedented level of detail. New multi-model/multi-kernel techniques will be applied using a diverse combination of high-performance and distributed computing facilities (referred to as Jungle Computing Systems).

NLeSC eScience Technology Project

(prof. Bal, VU; prof. de Laat, UvA).

This project addresses the data volume and complexity of modern applications. In particular, it does research on how to efficiently implement the LOFAR imaging pipeline for the AARTFAAC (Amsterdam - ASTRON Radio Transient Facility And Analysis Center) project on GPUs.

The Final Parsec

(prof. Portegies Zwart, LU).

This NWO-VICI project uses the Starlab software package for simulating the evolution of dense stellar systems. Starlab is a collection of tools that share a common data structure and that can be combined in arbitrarily complex ways to study the dynamics of star clusters and galactic nuclei. Two tools connected by UNIX pipes may operate on different portions of the same data set, even though neither understands the data structures, or even the physical variables, used by the other.
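The pipe-based composition described above can be mimicked in a few lines of Python, with generators standing in for Starlab tools and a shared record format standing in for the common data structure (the tool names below are hypothetical, not actual Starlab programs):

```python
# Each "tool" consumes and produces a stream of star records (dicts), so
# tools can be chained like UNIX pipes without knowing each other's logic.
def make_stars(n):
    # Source tool: generate n stars of equal mass summing to 1.
    for i in range(n):
        yield {"id": i, "mass": 1.0 / n}

def scale_masses(stream, factor):
    # Filter tool: rescale one field, pass everything else through untouched.
    for star in stream:
        star["mass"] *= factor
        yield star

def total_mass(stream):
    # Sink tool: reduce the stream to a single number.
    return sum(star["mass"] for star in stream)

# Equivalent of a shell pipeline: make_stars | scale_masses | total_mass
m = total_mass(scale_masses(make_stars(10), 2.0))  # approximately 2.0
```

Because each stage only touches the fields it understands, stages can be recombined arbitrarily, which is the property the Starlab description above emphasizes.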

The Virtual Galaxy project

(prof. Portegies Zwart, LU).

This project intends to run a gravitational N-body simulation of a Milky Way-like galaxy on a star-by-star basis, using 100 billion particles. The calculation results will be used to study the formation and evolution of spiral structure in the Galaxy. A 100 billion particle simulation is more than a factor of 1000 larger than any previous simulation of the Milky Way. In earlier test simulations on 4096 GPU nodes of the Titan supercomputer, a sustained performance of 4227 TFlops was achieved. DAS-5 will be used to develop an efficient GPU code in preparation for large-scale runs that simulate the entire Milky Way.
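At its heart, a direct (star-by-star) N-body code evaluates all pairwise gravitational forces, an O(N²) computation that maps well to GPUs. A tiny illustrative Python sketch of one integration step, with a softening length eps to avoid singularities (this is not the project's production code):

```python
import math

def nbody_step(pos, vel, mass, dt, G=1.0, eps=1e-3):
    """One step of direct-summation gravity: O(N^2) pairwise forces."""
    n = len(pos)
    acc = [[0.0, 0.0, 0.0] for _ in range(n)]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            # Vector from particle i to particle j.
            dx = [pos[j][k] - pos[i][k] for k in range(3)]
            r2 = sum(d * d for d in dx) + eps * eps  # softened distance^2
            inv_r3 = 1.0 / (math.sqrt(r2) * r2)
            for k in range(3):
                acc[i][k] += G * mass[j] * dx[k] * inv_r3
    # Simple Euler update; production codes use higher-order integrators.
    for i in range(n):
        for k in range(3):
            vel[i][k] += acc[i][k] * dt
            pos[i][k] += vel[i][k] * dt
    return pos, vel
```

The inner double loop is what a GPU implementation parallelizes: each thread accumulates the forces on one particle, and at 100 billion particles the pairwise interaction count is what drives the petaflop-scale compute requirement.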

From complex living systems to smarter computers

(dr. Kaandorp, UvA).

The FP7 project SWARM-ORGAN tries to understand complex living systems, such as cells forming an organ or the spatially controlled growth of a plant, and to apply their principles to technological systems, in particular more intelligent and adaptable robot swarms. The project will identify the principles of these systems and use them to design a theoretical framework for distributed adaptive control. It will explore a specific approach, gene regulatory networks, as a potentially powerful control method for such systems. By comparing networks across different biological systems, the project will identify patterns and fundamental principles that can be applied to technology.

Interacting with big data

A Reasonable Web

(dr. Urbani, VU).

This NWO VENI project will investigate methods to estimate the difficulty of reasoning at web scale.


(prof. Bal, VU)

This COMMIT work package will develop a smartphone-based distributed sensor network for real-time coaching of (groups of) amateur athletes. Sensor data collected through the smartphones will be sent to a cloud (DAS-5), which will help in providing detailed feedback and in doing population-level data analytics.


(prof. van Steen, VU)

This work package studies large sets of nodes in a wireless distributed system. The goal of the project is to apply gossip-based techniques to real-world problems on large wireless sensor networks, serving societal goals such as wellbeing and independent living. This project will use DAS-5 to do simulations and large-scale (parallel) network analysis.


(prof. Adriaans, UvA)

This work package studies measures of information content and complexity that can be used to optimize a 'matching' and query utility in a given set of conditions. Also, it develops platform-independent tools that enable retrieval of distributed complex data sets.

Reasoning in a Changing World

(prof. Bal, prof. van Harmelen, VU)

This is a VU-funded project studying novel parallel and distributed algorithms for reasoning on highly dynamic data. It bridges the gap between reasoning and more traditional stream processing and it will produce high-performance algorithms for incremental, approximate and streaming reasoning.

NLeSC Food Research Project

(Prof. Alkema, Radboud University).

The technological developments in life science research have led to a vast increase in data that are available in public and proprietary databases. To efficiently capitalize on these data, dedicated vocabularies and algorithms are necessary for annotating, searching, filtering and integrating data from various sources. This project will develop structured vocabularies covering the food domain, which will be incorporated in existing knowledge management tools to link potentially related research findings. These relations will be used to generate hypotheses addressing important areas in food research.

This project and the next two NLeSC projects aim to explore efficient solutions for interacting with Big data in different domains, which ultimately should result in large-scale production software for these domains. Obtaining efficient solutions, however, requires much interactive experimentation work with distribution, accelerators, networks, etc., which can only be done on DAS. DAS-5 will thus be used as a stepping stone towards production work, and serve as a catalyst for eScience technology innovation.

NLeSC Geographic Data Project

(Prof. van Oosterom, TUD).

Modern geographic data acquisition technologies generate point clouds with billions (or even trillions) of elevation/depth points, which are too big (several terabytes) to be handled efficiently by common ICT infrastructures. This project develops several novel and innovative eScience techniques for data management, dissemination, processing and visualization.

NLeSC Water Management Project

(Prof. van der Giesen, TUD).

The development of a high resolution global hydrological model has recently been put forward as Grand Challenge for the hydrological community. To ensure proper parameterization of such a model, massive assimilation of remotely sensed data is needed, in turn requiring vast amounts of computational power. Moreover, updating a global hydrological model with Earth observations will be a major computational challenge that demands close cooperation between ICT and hydrology.


(dr. Belloum, UvA)

This is an FP7 project studying how to efficiently move large datasets. It will provide a cloud platform for easy-to-use access to compute and data resources, including services for deployment of scientific software on a virtualized infrastructure, and cloud data access and transfer for very large objects.

Multimedia and games


(dr. Snoek, UvA, prof. Smeulders, UvA)

This is an IARPA project that aims to develop tools that dramatically improve an analyst's ability to search massive, continuously growing video databases for specific events, efficiently and with high precision. As with many projects in this category, efficient solutions need flexible control over the composition of the hardware (storage, network, accelerators), which is only possible in experimental systems like DAS.


(dr. Snoek, UvA)

This is an NWO/STW VIDI project. Video images are only findable when people describe their content in advance. In this project we teach computers to recognise video images that have not been given descriptions, on the basis of recognisable people, objects, and scenes and their interactions.


(dr. Worring, UvA)

This is an STW project that studies the interactive categorization of large collections of images and their metadata by combining information visualization with multimedia analysis and intelligent interaction.

COMMIT P6-WP1: Mining online multimedia as training resource

(dr. Snoek, UvA)

This work package aims to develop tools and techniques for leveraging user-generated multimedia as training resource for automatic semantic labeling. Intuitively, if different persons label visually similar images and videos using the same tags, these tags are likely to reflect objective aspects of the visual content. We will study how this intuition can be exploited to obtain relevant labels for visual content.
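The tag-agreement intuition above can be made concrete with a small sketch: given a cluster of visually similar images, keep only the tags that independent users agree on (the function and data names here are hypothetical, not from the project):

```python
from collections import defaultdict

def objective_tags(cluster, min_users=2):
    """cluster: list of (user_id, tags) pairs for visually similar images.
    Return the tags used by at least min_users distinct users."""
    users_per_tag = defaultdict(set)
    for user, tags in cluster:
        for tag in tags:
            users_per_tag[tag].add(user)
    # Tags confirmed by independent users likely describe the visual content.
    return {tag for tag, users in users_per_tag.items()
            if len(users) >= min_users}

# Two users independently tag similar images with "cat": likely objective.
labels = objective_tags([("u1", {"cat", "cute"}), ("u2", {"cat", "grumpy"})])
# labels == {"cat"}
```

Subjective tags ("cute", "grumpy") drop out because they are not shared across users, which is exactly the filtering effect the work package aims to exploit at scale.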

COMMIT P6-WP2: Interactive visual learning

(dr. Snoek, UvA)

This work package aims to develop tools and techniques for leveraging user-generated multimedia as training resource for interactive visual learning. Rather than relying on an interacting user, we aim to exploit weakly labeled online data as starting point and emphasize in particular the role of diverse, yet compact, visual features and efficient machine learning schemes.

COMMIT P6-WP3: Identity resolution

(dr. Snoek, UvA)

This work package aims to design, develop and evaluate methods for positive identification of objects in visual search engines. The project builds on the extensive success of visual recognition in computer vision. What is missing in current search engines is evidence that the indicated object is really there. Also, nearly identical objects or object classes cannot yet be discriminated.

@Large

(dr. Iosup, TUD)

This NWO VENI project (augmented with additional PhD students on a scholarship) does research in resource management for Massively Multiplayer Online Games (MMOGs) in large-scale distributed systems (content generation, game analytics, etc.).

Astronomy applications

NLeSC Astronomy Project

(dr. de Vos, ASTRON; prof. Bal, VU; dr. van Nieuwpoort, NLeSC).

The most advanced modes of large radio/mm telescopes like ALMA and LOFAR require structural collaboration between astronomers and e-Science experts. The project addresses the issues related to the huge size of the datasets produced by the most extreme observations. Areas of optimization include interoperability between existing software packages, high-performance computing platforms, extremely large databases, and streaming processing pipelines.


(dr. Boonstra, ASTRON; dr. Engbersen, IBM)

This is a 32.9M euro project, supported by grants from the Dutch Ministry of EL&I and the province of Drenthe. In this collaboration between ASTRON and IBM, DOME addresses the computational challenges of the Square Kilometre Array, the next-generation radio telescope that has exa-scale compute requirements. Research areas relevant to DAS-5 include algorithmic research, the use of accelerator hardware and microservers, and real-time communication. About 10 DOME researchers are expected to use DAS-5.


(dr. Vermeulen, ASTRON)

This project will continue to use DAS-5 for the development of software pipelines for the LOFAR radio telescope. LOFAR is the first of a novel type of radio telescopes (essentially a distributed sensor network), and enables groundbreaking science in astronomy and astrophysics. Its versatility (due to the absence of dishes that need mechanical steering) allows new observation modes that require new processing pipelines.


(prof. Wijers, UvA; dr. de Vos, ASTRON)

This is an ERC project. This UvA/ASTRON collaboration expands the LOFAR radio telescope to an all-sky monitoring instrument. AARTFAAC will be used to detect and study transient events (such as the birth of a black hole). DAS-5 allows us to continue to develop the data processing pipeline.

Uniboard 2

(dr. Szomoru, JIVE)

This is a RadioNet3/FP7 project. Within Uniboard 2, we will develop the next-generation FPGA-based processing board for digital signal processing of radio telescope data. DAS-5 will allow us to create a prototype system that integrates Uniboard 2 with conventional high-performance computing hardware. DAS-5 also allows us to do research on applying emerging high-level languages like OpenCL for programming FPGAs.


(dr. van Leeuwen, ASTRON)

This is an NWO-M and NOVA funded project. ARTS will turn the new wide-field APERTIF focal-plane array receivers on the Westerbork telescope (funded by NWO and EC grants) into high-speed cameras, and will be powered by a 500-TFLOP, 100-GPU cluster. These cameras will then be able to detect, for the first time, the source of the enigmatic short radio flashes that appear all over the sky. DAS-5 will be used to develop the GPU algorithms needed for this processing pipeline.


(dr. de Vos, ASTRON)

This is an international RadioNet3/FP7 project, in which we will create optimized prototype software pipelines for current and emerging radio telescopes.


(dr. Romein, ASTRON)

This is an NWO Open Competition project that studies key radio-astronomical signal-processing algorithms on a wide range of accelerator platforms. This way, we learn which properties make an architecture (in)efficient; we study energy efficiency, programmability, and device-(in)dependent optimizations.