White Papers

Copyright © 2019 Dell Inc. or its subsidiaries. All Rights Reserved. Dell, EMC and other trademarks are trademarks of Dell Inc. or its subsidiaries
PowerEdge Servers and Deep Learning Domains:
The Impact of Scaling Accelerators
Evaluating Performance Using MLPerf Benchmark
To accurately harvest Artificial Intelligence (AI) performance data, it is
critical to select a benchmark qualified to test multiple domain types.
MLPerf is a new, broad Machine Learning (ML) and Deep Learning (DL)
benchmark suite that is gaining popularity and adoption for its
multi-domain coverage and representative models. The current version
(v0.5) covers five domains associated with AI subsets, as shown in
Figure 1: image classification, object detection, language translation,
reinforcement learning, and recommendation.
For each domain, MLPerf measures performance as total training time:
the time required to train a neural network model for that domain to a
target accuracy. The Dell EMC team benchmarked various GPU-capable
PowerEdge servers to help customers select the GPU infrastructure that
will meet their requirements. We used multi-GPU training to identify
the configurations that reach target accuracy in the shortest training
time on MLPerf.
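MLPerf's time-to-train metric can be illustrated with a minimal sketch: train until a quality threshold is met and report the elapsed wall-clock time. The function and the toy training callables below are hypothetical stand-ins, not MLPerf reference code.

```python
import time

def time_to_accuracy(train_epoch, evaluate, target_accuracy, max_epochs=100):
    """Run training epochs until evaluate() reaches target_accuracy.

    Returns (elapsed_seconds, epochs_run). Mirrors the idea behind
    MLPerf's metric: the clock stops when target quality is reached.
    """
    start = time.time()
    for epoch in range(1, max_epochs + 1):
        train_epoch()
        if evaluate() >= target_accuracy:
            return time.time() - start, epoch
    return None, max_epochs  # target accuracy never reached

# Toy stand-in model: accuracy improves by a fixed step per "epoch".
accuracy = [0.0]
def fake_train():
    accuracy[0] += 0.2
def fake_eval():
    return accuracy[0]

elapsed, epochs = time_to_accuracy(fake_train, fake_eval, target_accuracy=0.75)
```

In a real benchmark run, `train_epoch` and `evaluate` would wrap the framework's training and validation loops; faster multi-GPU configurations shorten the elapsed time returned here.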
Tech Note by
Matt Ogle
Ramesh Radhakrishnan
Summary
With deep learning
principles becoming a
widely accepted practice,
customers are keen to
understand how to select
the most optimized server,
based on GPU count, to
accommodate varying
machine and deep learning
workloads.
This tech note delivers test
results that show how
scaling NVIDIA GPUs on
PowerEdge server
configurations impacts
performance across various
deep learning domains, and
how these results suggest
general guidelines for
constructing an optimized
deep learning platform.
Direct from Development
Server and Infrastructure
Engineering
Figure 1: Domains covered within the MLPerf v0.5 benchmark
