Considering GPGPU for HPC centers: Is it worth the effort?

Hans Hacker, Carsten Trinitis, Josef Weidendorfer, Matthias Brehm

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

11 Scopus citations

Abstract

In contrast to just a few years ago, the answer to the question "What system should we buy next to best assist our users?" has become far more complicated for the operators of an HPC center today. In addition to multicore architectures, powerful accelerator systems have emerged, and the future looks heterogeneous. In this paper, we apply this question to a specific accelerator and its programming environment that have become increasingly popular: systems using graphics processors from NVIDIA, programmed with CUDA. Using three benchmarks that cover the main computational needs of scientific codes, we compare performance results with those obtained on systems with modern x86 multicore processors. Drawing on our experience from optimizing and running the codes, we discuss whether the presented performance numbers really apply to computing center users running codes in their everyday work.
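The abstract does not name the three benchmarks or reproduce any code. Purely as a hypothetical sketch of the CUDA programming model the paper evaluates, and not material from the paper itself, the vector addition below shows the explicit device memory management, host-device transfers, and kernel launch that distinguish a CUDA port from a plain x86 multicore implementation:

    // Hypothetical illustration (not from the paper): CUDA vector addition.
    #include <cstdio>
    #include <cstdlib>
    #include <cuda_runtime.h>

    // Each GPU thread computes one element of c = a + b.
    __global__ void vecAdd(const float *a, const float *b, float *c, int n) {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n) c[i] = a[i] + b[i];
    }

    int main() {
        const int n = 1 << 20;
        const size_t bytes = n * sizeof(float);

        // Allocate and initialize host data.
        float *ha = (float *)malloc(bytes);
        float *hb = (float *)malloc(bytes);
        float *hc = (float *)malloc(bytes);
        for (int i = 0; i < n; ++i) { ha[i] = 1.0f; hb[i] = 2.0f; }

        // Allocate device buffers and copy the inputs to the GPU. These
        // explicit transfers are part of the porting effort the paper weighs
        // against simply running on x86 multicore processors.
        float *da, *db, *dc;
        cudaMalloc(&da, bytes);
        cudaMalloc(&db, bytes);
        cudaMalloc(&dc, bytes);
        cudaMemcpy(da, ha, bytes, cudaMemcpyHostToDevice);
        cudaMemcpy(db, hb, bytes, cudaMemcpyHostToDevice);

        // Launch enough 256-thread blocks to cover all n elements.
        const int threads = 256;
        const int blocks = (n + threads - 1) / threads;
        vecAdd<<<blocks, threads>>>(da, db, dc, n);

        // Copy the result back; cudaMemcpy synchronizes with the kernel.
        cudaMemcpy(hc, dc, bytes, cudaMemcpyDeviceToHost);
        printf("c[0] = %f\n", hc[0]);  // expected: 3.000000

        cudaFree(da); cudaFree(db); cudaFree(dc);
        free(ha); free(hb); free(hc);
        return 0;
    }

Even in a toy case like this, the host-device transfers over PCIe can outweigh the arithmetic, which is one reason the abstract questions whether benchmark numbers carry over to users' everyday codes.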

Original language: English
Title of host publication: Facing the Multicore-Challenge - Aspects of New Paradigms and Technologies in Parallel Computing
Pages: 118-130
Number of pages: 13
DOIs
State: Published - 2010
Event: Conference for Young Scientists: Facing the Multicore-Challenge - Heidelberg, Germany
Duration: 17 Mar 2010 - 19 Mar 2010

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 6310 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: Conference for Young Scientists: Facing the Multicore-Challenge
Country/Territory: Germany
City: Heidelberg
Period: 17/03/10 - 19/03/10
