DAWNBench

An End-to-End Deep Learning Benchmark and Competition

ImageNet Training

| Submission Date | Model | Submitted By | Time to 93% Accuracy | Cost (USD) | Max Accuracy | Hardware | Framework |
|---|---|---|---|---|---|---|---|
| Apr 2018 | ResNet-50 | Google Cloud TPU | 8:52:33 | $58.53 | 93.11% | GCP n1-standard-2, Cloud TPU | TensorFlow v1.8rc1 |
| Apr 2018 | ResNet-50 | Intel(R) Corporation | 3:25:55 | N/A | 93.02% | 128 nodes with Xeon Platinum 8124M / 144 GB / 36 cores (Amazon EC2 c5.18xlarge) | Intel(R) Optimized Caffe |
| Apr 2018 | AmoebaNet-D N6F256 | Google | 1:58:24 | N/A | 93.17% | 1/16 of a TPUv2 Pod | TensorFlow 1.8.0-rc1 |
| Sep 2018 | ResNet-50 | Google Cloud TPU | 2:44:31 | $12.60 | 93.34% | GCP n1-standard-2, Cloud TPU | TensorFlow v1.11.0 |
| Oct 2017 | ResNet-152 | Stanford DAWN | 10 days, 3:59:59 | $1112.64 | 93.00% | 8 K80 / 488 GB / 32 CPU (Amazon EC2 p2.8xlarge) | MXNet 0.11.0 |
| Apr 2018 | AmoebaNet-D N6F256 | Google Cloud TPU | 7:28:30 | $49.30 | 93.11% | GCP n1-standard-2, Cloud TPU | TensorFlow 1.8.0-rc0 |
| Mar 2020 | ResNet-50 v1.5 | Apsara AI Acceleration (AIACC) team, Alibaba Cloud | 0:21:38 | $7.43 | 93.05% | 1 ecs.gn6e-c12g1.24xlarge (Alibaba Cloud) | AIACC-Training 1.3 + TensorFlow 2.1 |
| May 2019 | ResNet-50 | ModelArts Service of Huawei Cloud | 0:02:43 | N/A | 93.10% | 16 nodes with InfiniBand (8 * V100 with NVLink per node) | Moxing v1.13.0 + TensorFlow v1.13.1 |
| Apr 2018 | ResNet-56 | Intel(R) Corporation | 3:31:47 | N/A | 93.11% | 128 nodes with Xeon Platinum 8124M / 144 GB / 36 cores (Amazon EC2 c5.18xlarge) | Intel(R) Optimized Caffe |
| Aug 2019 | ResNet-50 | ZTE AI Platform | 0:23:11 | N/A | 93.03% | 8 nodes with InfiniBand (8 * P100 per node) | TensorFlow v1.12.0 |
| Apr 2019 | ResNet-50 | Setu Chokshi (MS AI MVP, PropertyGuru) | 1:42:23 | $20.89 | 93.02% | Azure ND40s_v2 | PyTorch 1.0 |
| Sep 2018 | ResNet-50 | Andrew Shaw, Yaroslav Bulatov, Jeremy Howard | 0:29:43 | $48.48 | 93.02% | 32 * V100 (4 machines, AWS p3.16xlarge) | ncluster / PyTorch 0.5.0a0+0e8088d |
| Oct 2017 | ResNet-152 | Stanford DAWN | 13 days, 10:41:37 | $2323.39 | 93.38% | 4 M60 / 488 GB / 64 CPU (Amazon EC2 g3.16xlarge) | TensorFlow v1.3 |
| Apr 2018 | ResNet-50 | Intel(R) Corporation | 6:09:50 | N/A | 93.05% | 64 nodes with Xeon Platinum 8124M / 144 GB / 36 cores (Amazon EC2 c5.18xlarge) | Intel(R) Optimized Caffe |
| Mar 2020 | ResNet-50 v1.5 | Apsara AI Acceleration (AIACC) team, Alibaba Cloud | 0:02:38 | $14.42 | 93.04% | 16 ecs.gn6e-c12g1.24xlarge (Alibaba Cloud) | AIACC-Training 1.3 + TensorFlow 2.1 |
| Apr 2018 | AmoebaNet-D N6F256 | Google | 1:06:32 | N/A | 93.03% | 1/4 of a TPUv2 Pod | TensorFlow 1.8.0-rc1 |
| Aug 2019 | ResNet-50 | Chuan Li | 12:39:49 | $19.00 | 93.05% | Lambda GPU Cloud, 4x GTX 1080 Ti | ncluster / PyTorch 1.0.0 |
| Sep 2018 | ResNet-50 | Andrew Shaw, Yaroslav Bulatov, Jeremy Howard | 0:18:53 | $61.63 | 93.19% | 64 * V100 (8 machines, AWS p3.16xlarge) | ncluster / PyTorch 0.5.0a0+0e8088d |
| Feb 2019 | ResNet-50 v1 | GE Healthcare (Min Zhang) | 1:44:34 | $42.66 | 93.24% | 8 * V100 (single p3.16xlarge) | TensorFlow 1.11 + Horovod |
| Sep 2018 | ResNet-50 | fast.ai / DIUx (Yaroslav Bulatov, Andrew Shaw, Jeremy Howard) | 0:18:06 | $118.07 | 93.11% | 16 p3.16xlarge (AWS) | PyTorch 0.4.1 |
| Apr 2018 | ResNet-50 | Google | 0:30:43 | N/A | 93.03% | Half of a TPUv2 Pod | TensorFlow 1.8.0-rc1 |
| Dec 2017 | ResNet-152 | ppwwyyxx | 1 day, 20:28:27 | N/A | 93.94% | 8 P100 / 512 GB / 40 CPU (NVIDIA DGX-1) | tensorpack 0.8.0 |
| Apr 2018 | ResNet-50 | fast.ai + students team: Jeremy Howard, Andrew Shaw, Brett Koonce, Sylvain Gugger | 2:57:28 | $72.40 | 93.05% | 8 * V100 (AWS p3.16xlarge) | fastai / PyTorch |
| Jan 2018 | ResNet-50 | DIUx | 14:37:59 | $358.22 | 93.07% | p3.16xlarge | TensorFlow 1.5, tensorpack 0.8.1 |
| Mar 2018 | ResNet-50 | Google Cloud TPU | 12:26:39 | $82.07 | 93.15% | GCP n1-standard-2, Cloud TPU | TensorFlow v1.7rc1 |
| Dec 2018 | ResNet-50 | ModelArts Service of Huawei Cloud | 0:09:22 | N/A | 93.23% | 16 * 8 * Tesla V100 (ModelArts Service) | Huawei Optimized MXNet |
| Sep 2018 | ResNet-50 | Google Cloud TPU | 5:52:31 | $27.00 | 93.36% | GCP n1-standard-2, Cloud TPU | TensorFlow v1.11.0 |
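
Every entry above is measured against the same target: wall-clock time from the start of training until the model first reaches 93% top-5 accuracy on the ImageNet validation set, with cost (where reported) taken from the public-cloud price of the listed hardware over that duration. The snippet below is a minimal sketch of how such a time-to-accuracy measurement can be instrumented in PyTorch. It is not the official DAWNBench harness: the torchvision FakeData dataset stands in for ImageNet, and the optimizer settings, batch size, and epoch budget are illustrative placeholders chosen only to keep the example self-contained and runnable.

```python
# Sketch of a DAWNBench-style "time to 93% top-5 accuracy" measurement.
# Not the official harness: FakeData stands in for ImageNet and the
# hyperparameters are placeholders, so the threshold will not actually be
# reached here -- the point is the instrumentation around the training loop.
import time

import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, models, transforms

TARGET_TOP5 = 0.93  # DAWNBench ImageNet training threshold
device = "cuda" if torch.cuda.is_available() else "cpu"

transform = transforms.ToTensor()
train_set = datasets.FakeData(size=512, image_size=(3, 224, 224), num_classes=1000, transform=transform)
val_set = datasets.FakeData(size=128, image_size=(3, 224, 224), num_classes=1000, transform=transform)
train_loader = DataLoader(train_set, batch_size=32, shuffle=True)
val_loader = DataLoader(val_set, batch_size=32)

model = models.resnet50(num_classes=1000).to(device)
criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9, weight_decay=1e-4)


def top5_accuracy(loader):
    """Fraction of samples whose true label appears in the model's top-5 predictions."""
    model.eval()
    correct, total = 0, 0
    with torch.no_grad():
        for images, labels in loader:
            images, labels = images.to(device), labels.to(device)
            top5 = model(images).topk(5, dim=1).indices          # shape [batch, 5]
            correct += (top5 == labels.unsqueeze(1)).any(dim=1).sum().item()
            total += labels.size(0)
    return correct / total


start = time.monotonic()  # wall clock from the start of training
for epoch in range(90):   # typical ImageNet schedules run on the order of 90 epochs
    model.train()
    for images, labels in train_loader:
        images, labels = images.to(device), labels.to(device)
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()

    acc = top5_accuracy(val_loader)
    elapsed = time.monotonic() - start
    print(f"epoch {epoch}: top-5 accuracy {acc:.4f}, elapsed {elapsed / 3600:.2f} h")
    if acc >= TARGET_TOP5:  # reported metric: time at which 93% top-5 is first reached
        print(f"reached {TARGET_TOP5:.0%} top-5 after {elapsed / 3600:.2f} hours")
        break
```

As a sanity check on the cost column, the September 2018 Cloud TPU entry's $12.60 over 2:44:31 implies roughly $4.60 per hour for the n1-standard-2 VM plus Cloud TPU, consistent with cost being the on-demand hourly price of the listed hardware multiplied by the reported training time.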
Disclosure: The Stanford DAWN research project is a five-year industrial affiliates program at Stanford University and is financially supported in part by founding members including Intel, Microsoft, NEC, Teradata, VMware, and Google. For more information, including information regarding Stanford’s policies on openness in research and policies affecting industrial affiliates program membership, please see DAWN's membership page.