
Hardware Accelerated ATLAS Workloads on the WLCG Grid

Publication: Contribution to journal › Conference article › Peer-reviewed

1 citation (Scopus)

Abstract

In recent years the use of machine learning techniques within data-intensive sciences in general, and high-energy physics in particular, has rapidly increased. This is due in part to the availability of large datasets on which such algorithms can be trained, as well as suitable hardware, such as graphics or tensor processing units, which greatly accelerates the training and execution of such algorithms. Within the HEP domain, the development of these techniques has so far relied on resources external to the primary computing infrastructure of the WLCG (Worldwide LHC Computing Grid). In this paper we present an integration of hardware-accelerated workloads into the Grid through the declaration of dedicated queues with access to hardware accelerators and the use of Linux container images holding a modern data science software stack. A frequent use case in the development of machine learning algorithms is the optimization of neural networks through the tuning of their hyper-parameters (HP). This often requires a large range of network variations to be trained and compared, which for some optimization schemes can be performed in parallel, a workload well suited to Grid computing. An example of such a hyper-parameter scan on Grid resources for the case of flavor tagging within ATLAS is presented.
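The embarrassingly parallel structure of such a scan can be illustrated with a minimal sketch. Assuming a simple exhaustive grid search (one of several possible optimization schemes; the parameter names below are illustrative, not taken from the paper), each configuration is independent and could be submitted as its own Grid job:

```python
from itertools import product

def expand_grid(grid):
    """Expand a dict of hyper-parameter lists into one dict per configuration.

    Each resulting configuration is independent of the others, so in a
    Grid-based scan every one could be trained as a separate job and the
    results compared afterwards.
    """
    keys = list(grid)
    return [dict(zip(keys, values)) for values in product(*(grid[k] for k in keys))]

# Hypothetical search space for a small neural network.
grid = {
    "learning_rate": [1e-3, 1e-4],
    "n_layers": [2, 3],
    "units_per_layer": [64, 128],
}

configs = expand_grid(grid)
print(len(configs))  # 2 * 2 * 2 = 8 independent trainings
```

Each entry of `configs` would then be passed to a training job on an accelerator-equipped queue; only the final comparison of trained models requires gathering the results.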

Original language: English
Article number: 012059
Journal: Journal of Physics: Conference Series
Volume: 1525
Issue number: 1
DOIs
Publication status: Published - 7 July 2020
Externally published: Yes
Event: 19th International Workshop on Advanced Computing and Analysis Techniques in Physics Research, ACAT 2019 - Saas-Fee, Switzerland
Duration: 11 March 2019 - 15 March 2019
