Reports (Research Report), Year: 2014

Scheduling the I/O of HPC applications under congestion

Abstract

A significant fraction of the computing capacity of large-scale platforms is wasted due to interference among multiple applications that access a shared parallel file system concurrently. One way to handle I/O bursts in large-scale HPC systems is to absorb them at an intermediate storage layer consisting of burst buffers. However, our analysis of Argonne's Mira system shows that burst buffers cannot prevent congestion at all times. As a consequence, I/O performance is dramatically degraded, with I/O throughput dropping by 67% in some cases. In this paper, we analyze the effects of interference on application I/O bandwidth and propose several scheduling techniques to mitigate congestion. We show through extensive experiments that our global I/O scheduler is able to reduce the effects of congestion, even on systems where burst buffers are used, and can increase the overall system throughput by up to 56%. We also show that it outperforms current Mira I/O schedulers.
In this work, we propose efficient algorithms to mitigate congestion problems during I/O data transfers. We evaluate them on high-performance machines.
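The report's scheduling algorithms are not reproduced on this page. As a rough illustration of the problem setting only, the Python sketch below simulates applications whose I/O bursts contend for a shared file-system bandwidth budget, with a global scheduler granting each burst exclusive access in FCFS order. The bandwidth value, burst parameters, and the FCFS policy itself are assumptions made for this example, not the authors' method.

    # Toy model (not the report's algorithm): I/O bursts share a parallel
    # file system with aggregate bandwidth B. A global scheduler serializes
    # bursts (FCFS, exclusive access) so no two transfers interfere.
    import heapq
    from dataclasses import dataclass, field

    B = 100.0  # assumed aggregate file-system bandwidth (GB/s)

    @dataclass(order=True)
    class Burst:
        release: float                        # time the burst becomes ready (s)
        volume: float = field(compare=False)  # data to transfer (GB)
        app: str = field(compare=False)       # owning application

    def fcfs_exclusive(bursts):
        """Serve one burst at a time at full bandwidth B (no interference)."""
        heapq.heapify(bursts)  # orders bursts by release time
        clock, schedule = 0.0, []
        while bursts:
            b = heapq.heappop(bursts)
            start = max(clock, b.release)     # wait for the burst to be ready
            clock = start + b.volume / B      # exclusive transfer at rate B
            schedule.append((b.app, start, clock))
        return schedule

    if __name__ == "__main__":
        demo = [Burst(0.0, 200.0, "A"), Burst(1.0, 50.0, "B"), Burst(1.5, 50.0, "C")]
        for app, start, end in fcfs_exclusive(demo):
            print(f"{app}: I/O from t={start:.2f}s to t={end:.2f}s")

The sketch only captures the lever the report exploits: a global scheduler decides which application accesses the file system and when, instead of letting concurrent bursts interleave and degrade each other's throughput.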
Main file: RR-8519.pdf (1.09 MB). Origin: files produced by the author(s).

Dates and versions

hal-00983789, version 1 (25-04-2014)
hal-00983789, version 2 (29-04-2014)
hal-00983789, version 3 (20-10-2014)

Identifiers

  • HAL Id: hal-00983789, version 3

Cite

Ana Gainaru, Guillaume Aupy, Anne Benoit, Franck Cappello, Yves Robert, et al. Scheduling the I/O of HPC applications under congestion. [Research Report] RR-8519, LIP; INRIA. 2014, pp. 25. ⟨hal-00983789v3⟩