| Rank | Model | average | V1 (1 benchmark) | V2 (1 benchmark) | V4 (1 benchmark) | IT (2 benchmarks) | behavior (1 benchmark) | engineering (1 benchmark) | Deng2009-top1 v1 |
|---|---|---|---|---|---|---|---|---|---|
| 1 | CORnet-S (Kubilius et al., 2018) | .417 | .294 | .242 | .581 | .423 | .545 | .747 | .747 |
| 2 | vgg-19 (Simonyan et al., 2014) | .408 | .347 | .341 | .610 | .248 | .494 | .711 | .711 |
| 3 | resnet-50-robust (Santurkar et al., 2019) | .408 | .378 | .365 | .537 | .243 | .515 | | |
| 4 | resnet-101_v1 (He et al., 2015) | .407 | .266 | .341 | .590 | .274 | .561 | .764 | .764 |
| 5 | vgg-16 (Simonyan et al., 2014) | .406 | .355 | .336 | .620 | .259 | .461 | .715 | .715 |
| 6 | resnet-152_v1 (He et al., 2015) | .405 | .282 | .338 | .598 | .277 | .533 | .768 | .768 |
| 7 | resnet-101_v2 (He et al., 2015) | .404 | .274 | .332 | .599 | .263 | .555 | .774 | .774 |
| 8 | resnet50-SIN_IN (Geirhos et al., 2019) | .404 | .282 | .324 | .599 | .276 | .541 | .746 | .746 |
| 9 | densenet-169 (Huang et al., 2016) | .404 | .281 | .322 | .601 | .274 | .543 | .759 | .759 |
| 10 | densenet-201 (Huang et al., 2016) | .402 | .277 | .325 | .599 | .273 | .537 | .772 | .772 |
| 11 | resnet-50-pytorch (He et al., 2015) | .399 | .289 | .317 | .600 | .259 | .528 | .752 | .752 |
| 12 | resnet-50_v1 (He et al., 2015) | .398 | .274 | .317 | .594 | .278 | .526 | .752 | .752 |
| 13 | resnet50-SIN_IN_IN (Geirhos et al., 2019) | .397 | .275 | .321 | .596 | .273 | .523 | .767 | .767 |
| 14 | resnet-152_v2 (He et al., 2015) | .397 | .274 | .326 | .591 | .266 | .528 | .778 | .778 |
| 15 | resnet-50_v2 (He et al., 2015) | .396 | .270 | .323 | .596 | .260 | .531 | .756 | .756 |
| 16 | densenet-121 (Huang et al., 2016) | .396 | .277 | .306 | .595 | .267 | .535 | .745 | .745 |
| 17 | resnext101_32x32d_wsl (Mahajan et al., 2018) | .396 | .267 | .289 | .574 | .254 | .594 | .854 | .854 |
| 18 | mobilenet_v1_1.0_160 (Howard et al., 2017) | .393 | .290 | .332 | .588 | .275 | .480 | .680 | .680 |
| 19 | resnext101_32x8d_wsl (Mahajan et al., 2018) | .392 | .271 | .312 | .586 | .241 | .551 | .842 | .842 |
| 20 | inception_v2 (Szegedy et al., 2015) | .392 | .284 | .313 | .587 | .270 | .505 | .739 | .739 |
| 21 | dorinet_cornet_z | .390 | .310 | .311 | .582 | .468 | .279 | | |
| 22 | resnet-18 (He et al., 2015) | .390 | .274 | .302 | .583 | .266 | .524 | .698 | .698 |
| 23 | mobilenet_v2_1.0_224 (Howard et al., 2017) | .389 | .245 | .331 | .573 | .273 | .521 | .718 | .718 |
| 24 | mobilenet_v2_0.75_224 (Howard et al., 2017) | .388 | .236 | .316 | .586 | .268 | .533 | .698 | .698 |
| 25 | efficientnet-b0 (Tan et al., 2019) | .387 | .215 | .317 | .556 | .274 | .573 | | |
| 26 | fixres_resnext101_32x48d_wsl (Touvron et al., 2019) | .387 | .246 | .288 | .582 | .257 | .561 | .863 | .863 |
| 26 | resnext101_32x48d_wsl (Mahajan et al., 2018) | .387 | .246 | .288 | .582 | .257 | .561 | .822 | .822 |
| 28 | mobilenet_v2_1.3_224 (Howard et al., 2017) | .386 | .253 | .332 | .575 | .271 | .500 | .744 | .744 |
| 29 | resnet50-SIN (Geirhos et al., 2019) | .386 | .300 | .333 | .580 | .267 | .448 | .602 | .602 |
| 30 | pnasnet_large (Liu et al., 2017) | .385 | .264 | .305 | .578 | .263 | .515 | .829 | .829 |
| 31 | AT_efficientnet-b7 (Xie et al., 2020) | .385 | .276 | .308 | .583 | .281 | .475 | | |
| 32 | mobilenet_v2_0.75_192 (Howard et al., 2017) | .384 | .245 | .306 | .573 | .275 | .524 | .687 | .687 |
| 33 | mobilenet_v2_1.4_224 (Howard et al., 2017) | .384 | .257 | .321 | .566 | .277 | .500 | .750 | .750 |
| 34 | inception_v1 (Szegedy et al., 2014) | .384 | .259 | .311 | .589 | .244 | .518 | .698 | .698 |
| 35 | xception (Chollet et al., 2016) | .384 | .245 | .306 | .610 | .249 | .508 | .790 | .790 |
| 36 | AT_efficientnet-b4 (Xie et al., 2020) | .383 | .246 | .339 | .549 | .279 | .503 | | |
| 37 | mobilenet_v2_0.75_160 (Howard et al., 2017) | .383 | .278 | .316 | .573 | .273 | .473 | .664 | .664 |
| 38 | inception_v4 (Szegedy et al., 2016) | .382 | .238 | .299 | .574 | .263 | .537 | .802 | .802 |
| 39 | resnext101_32x16d_wsl (Mahajan et al., 2018) | .382 | .263 | .302 | .587 | .250 | .509 | .851 | .851 |
| 40 | inception_resnet_v2 (Szegedy et al., 2016) | .381 | .233 | .319 | .583 | .272 | .499 | .804 | .804 |
| 41 | efficientnet-b6 (Tan et al., 2019) | .381 | .263 | .295 | .563 | .271 | .513 | | |
| 42 | efficientnet-b2 (Tan et al., 2019) | .380 | .213 | .317 | .569 | .273 | .526 | | |
| 43 | nasnet_large (Zoph et al., 2017) | .380 | .282 | .291 | .585 | .270 | .470 | .827 | .827 |
| 44 | mobilenet_v1_1.0_224 (Howard et al., 2017) | .380 | .223 | .341 | .560 | .273 | .502 | .709 | .709 |
| 45 | efficientnet-b4 (Tan et al., 2019) | .379 | .228 | .286 | .575 | .272 | .535 | | |
| 46 | inception_v3 (Szegedy et al., 2015) | .379 | .241 | .307 | .596 | .273 | .477 | .780 | .780 |
| 47 | mobilenet_v2_1.0_192 (Howard et al., 2017) | .377 | .216 | .322 | .572 | .273 | .503 | .707 | .707 |
| 48 | mobilenet_v2_1.0_160 (Howard et al., 2017) | .376 | .239 | .322 | .570 | .275 | .472 | .688 | .688 |
| 49 | mobilenet_v2_0.5_192 (Howard et al., 2017) | .375 | .263 | .329 | .566 | .264 | .454 | .639 | .639 |
| 50 | mobilenet_v2_0.5_224 (Howard et al., 2017) | .372 | .229 | .308 | .569 | .266 | .488 | .654 | .654 |
| 51 | mobilenet_v1_0.75_224 (Howard et al., 2017) | .372 | .223 | .336 | .558 | .267 | .477 | .684 | .684 |
| 52 | AT_efficientnet-b2 (Xie et al., 2020) | .372 | .248 | .295 | .563 | .275 | .480 | | |
| 53 | resnet-34 (He et al., 2015) | .372 | .230 | .286 | .560 | .237 | .546 | .733 | .733 |
| 54 | AT_efficientnet-b0 (Xie et al., 2020) | .371 | .238 | .334 | .570 | .267 | .447 | | |
| 55 | mobilenet_v1_0.5_224 (Howard et al., 2017) | .370 | .221 | .340 | .555 | .260 | .474 | .633 | .633 |
| 56 | mobilenet_v1_1.0_192 (Howard et al., 2017) | .370 | .235 | .329 | .548 | .271 | .466 | .700 | .700 |
| 57 | mobilenet_v2_0.75_128 (Howard et al., 2017) | .369 | .237 | .320 | .553 | .271 | .464 | .632 | .632 |
| 58 | cornetz_contrastive | .369 | .325 | .353 | .551 | .268 | .346 | | |
| 59 | alexnet | .368 | .316 | .353 | .550 | .254 | .370 | .577 | .577 |
| 60 | mobilenet_v1_1.0_128 (Howard et al., 2017) | .368 | .254 | .325 | .557 | .267 | .437 | .652 | .652 |
| 61 | mobilenet_v1_0.75_128 (Howard et al., 2017) | .368 | .267 | .330 | .564 | .252 | .425 | .621 | .621 |
| 62 | mobilenet_v2_0.5_160 (Howard et al., 2017) | .368 | .258 | .305 | .562 | .264 | .448 | .610 | .610 |
| 63 | mobilenet_v2_1.0_128 (Howard et al., 2017) | .368 | .252 | .303 | .569 | .267 | .447 | .653 | .653 |
| 64 | mobilenet_v1_0.5_192 (Howard et al., 2017) | .367 | .220 | .337 | .566 | .260 | .454 | .617 | .617 |
| 65 | mobilenet_v1_0.75_192 (Howard et al., 2017) | .367 | .229 | .339 | .549 | .267 | .449 | .672 | .672 |
| 66 | mobilenet_v2_0.35_192 (Howard et al., 2017) | .366 | .264 | .301 | .568 | .259 | .437 | .582 | .582 |
| 67 | mobilenet_v2_1.0_96 (Howard et al., 2017) | .363 | .256 | .332 | .530 | .257 | .443 | .603 | .603 |
| 68 | resnet18-supervised (He et al., 2015) | .361 | .276 | .281 | .539 | .263 | .446 | | |
| 69 | mobilenet_v1_0.5_160 (Howard et al., 2017) | .361 | .265 | .320 | .557 | .252 | .410 | .591 | .591 |
| 70 | mobilenet_v2_0.35_160 (Howard et al., 2017) | .359 | .269 | .292 | .554 | .259 | .424 | .557 | .557 |
| 71 | mobilenet_v1_0.75_160 (Howard et al., 2017) | .359 | .213 | .346 | .558 | .264 | .413 | .653 | .653 |
| 72 | mobilenet_v2_0.35_224 (Howard et al., 2017) | .359 | .215 | .296 | .554 | .253 | .474 | .603 | .603 |
| 73 | mobilenet_v2_0.5_128 (Howard et al., 2017) | .358 | .222 | .309 | .557 | .262 | .440 | .577 | .577 |
| 74 | nasnet_mobile (Zoph et al., 2017) | .357 | .272 | .273 | .566 | .268 | .406 | .740 | .740 |
| 75 | mobilenet_v2_0.75_96 (Howard et al., 2017) | .350 | .208 | .305 | .527 | .258 | .451 | .588 | .588 |
| 76 | squeezenet1_0 (Iandola et al., 2016) | .341 | .304 | .320 | .591 | .229 | .263 | .575 | .575 |
| 77 | mobilenet_v1_0.5_128 (Howard et al., 2017) | .341 | .245 | .304 | .550 | .234 | .373 | .563 | .563 |
| 78 | squeezenet1_1 (Iandola et al., 2016) | .336 | .265 | .311 | .582 | .229 | .291 | .575 | .575 |
| 79 | mobilenet_v2_0.35_128 (Howard et al., 2017) | .333 | .245 | .289 | .530 | .235 | .367 | .508 | .508 |
| 80 | mobilenet_v2_0.5_96 (Howard et al., 2017) | .331 | .266 | .278 | .501 | .239 | .370 | .512 | .512 |
| 81 | ViT_L_32_imagenet1k (Dosovitskiy et al., 2021) | .328 | .265 | .291 | .531 | .227 | .324 | | |
| 82 | mobilenet_v1_0.25_224 (Howard et al., 2017) | .327 | .231 | .296 | .538 | .240 | .333 | .498 | .498 |
| 83 | ViT_L_32 (Dosovitskiy et al., 2021) | .324 | .305 | .301 | .511 | .219 | .286 | | |
| 84 | mobilenet_v1_0.25_192 (Howard et al., 2017) | .323 | .208 | .318 | .517 | .226 | .344 | .477 | .477 |
| 85 | CORnet-Z (Kubilius et al., 2018) | .322 | .298 | .182 | .553 | .223 | .356 | .470 | .470 |
| 86 | ViT_B_32_imagenet1k (Dosovitskiy et al., 2021) | .317 | .271 | .285 | .536 | .219 | .276 | | |
| 87 | resnet18-local_aggregation (Zhuang et al., 2019) | .314 | .253 | .308 | .563 | .268 | .177 | | |
| 88 | ViT_B_32 (Dosovitskiy et al., 2021) | .313 | .308 | .275 | .504 | .208 | .270 | | |
| 89 | mobilenet_v1_0.25_160 (Howard et al., 2017) | .312 | .198 | .293 | .509 | .229 | .330 | .455 | .455 |
| 90 | ViT_L_16_imagenet1k (Dosovitskiy et al., 2021) | .311 | .215 | .269 | .494 | .244 | .333 | | |
| 91 | bagnet9 (Brendel et al., 2019) | .307 | .215 | .260 | .550 | .200 | .307 | .260 | .260 |
| 92 | ViT_B_16_imagenet1k (Dosovitskiy et al., 2021) | .304 | .234 | .261 | .485 | .204 | .335 | | |
| 93 | mobilenet_v2_0.35_96 (Howard et al., 2017) | .303 | .183 | .249 | .501 | .230 | .351 | .455 | .455 |
| 94 | mobilenet_v1_0.25_128 (Howard et al., 2017) | .302 | .262 | .238 | .513 | .213 | .286 | .415 | .415 |
| 95 | ViT_B_16 (Dosovitskiy et al., 2021) | .302 | .242 | .260 | .498 | .190 | .320 | | |
| 96 | vggface (Parkhi et al., 2015) | .301 | .358 | .339 | .555 | .176 | .078 | | |
| 97 | resnet18-contrastive_multiview (Zhuang et al., 2020) | .293 | .258 | .265 | .551 | .231 | .161 | | |
| 98 | resnet18-instance_recognition (Wu et al., 2018) | .292 | .267 | .294 | .548 | .261 | .090 | | |
| 99 | resnet18-colorization (Zhuang et al., 2020) | .273 | .269 | .265 | .568 | .205 | .060 | | |
| 100 | resnet18-deepcluster (Zhuang et al., 2020) | .272 | .258 | .306 | .545 | .253 | .000 | | |
| 101 | resnet18-relative_position (Zhuang et al., 2020) | .262 | .278 | .302 | .544 | .194 | -.006 | | |
| 102 | resnet18-depth_prediction (Zhuang et al., 2020) | .260 | .285 | .246 | .509 | .158 | .102 | | |
| 103 | dcgan | .242 | .316 | .226 | .432 | .214 | .023 | | |
| 104 | resnet18-contrastive_predictive (Zhuang et al., 2020) | .236 | .247 | .263 | .497 | .163 | .010 | | |
| 105 | prednet (Zhuang et al., 2020) | .222 | .224 | .234 | .503 | .138 | .014 | | |
| 106 | resnet18-autoencoder (Zhuang et al., 2020) | .218 | .298 | .165 | .438 | .103 | .083 | | |
| 107 | pixels | .030 | .053 | .003 | .068 | .008 | .020 | | |
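A note on reading the table: the average column matches the unweighted mean of the five brain/behavior columns (V1, V2, V4, IT, behavior), and because the engineering section here contains a single benchmark, the engineering and Deng2009-top1 columns coincide. Below is a minimal check of that reading in Python, using values copied from the CORnet-S row; the column semantics are an inference from the numbers, not an official statement of the aggregation rule.

```python
# Sanity-check the "average" column: unweighted mean of the five
# brain/behavior columns (V1, V2, V4, IT, behavior).
# Values are copied from the CORnet-S row of the table above; treating
# "average" as the plain mean of these columns is an assumption.
cornet_s = {"V1": .294, "V2": .242, "V4": .581, "IT": .423, "behavior": .545}

average = sum(cornet_s.values()) / len(cornet_s)
print(round(average, 3))  # 0.417 -- matches the "average" column for CORnet-S
```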
About
The Brain-Score platform aims to yield strong computational models of the ventral stream. We enable researchers to quickly get a sense of how their model scores against standardized brain benchmarks on multiple dimensions and facilitate comparisons to other state-of-the-art models. At the same time, new brain data can quickly be tested against a wide range of models to determine how well existing models explain the data.
Brain-Score is organized by the Brain-Score team in collaboration with researchers and labs worldwide. We are working towards an easy-to-use platform where a model can be submitted to yield its scores on a range of brain benchmarks, and where new benchmarks can be incorporated to challenge the models.
This quantified approach lets us keep track of how close our models are to the brain on a range of experiments (data) using different evaluation techniques (metrics). For more details, please refer to the technical paper and the perspective paper.
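For concreteness, scoring a model at the time of this leaderboard went roughly as follows with the brainscore Python package. This is a sketch only: the imports, the pooled model registry, and the benchmark identifier below are assumptions based on the 1.x-era API and may differ in current releases.

```python
# Rough sketch of scoring a pre-wrapped model with the brainscore package (1.x era).
# The registry `brain_translated_pool` and the benchmark identifier are assumptions
# from that era's candidate_models/brainscore API and may not match current releases.
from brainscore import score_model
from candidate_models.model_commitments import brain_translated_pool

identifier = 'alexnet'
model = brain_translated_pool[identifier]  # base model wrapped with layer-to-region commitments

# Score the model on a public IT neural benchmark (scores are ceiling-normalized).
score = score_model(model_identifier=identifier, model=model,
                    benchmark_identifier='dicarlo.MajajHong2015public.IT-pls')
print(score)
```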
Compare
The website's Compare view is an interactive scatter plot: each dot is a model, hovering reveals model details, scrolling zooms in and out, and dropdown menus change which benchmark scores are plotted on the x- and y-axes.
Participate
Challenge the data: Submit a model
If you would like to score a model, please log in here.
Challenge the models: Submit data
If you have neural or behavioral recordings that you would like models to compete on, please get in touch with us to submit data.
Change the evaluation: Submit a metric
If you have an idea for a different way of comparing brain and machine, please send in a pull request (a toy example of such a metric is sketched below).
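A metric in this sense takes model activations and brain recordings for the same stimuli and returns a single comparison score. The sketch below is a toy cross-validated linear-predictivity metric, not the official Brain-Score neural-predictivity metric (which uses PLS regression and ceiling normalization); the function name, array shapes, and hyperparameters are illustrative assumptions.

```python
# Toy metric sketch: how well do model activations linearly predict neural responses?
# Not the official Brain-Score metric; names, shapes, and hyperparameters are placeholders.
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold

def toy_neural_predictivity(model_activations: np.ndarray,
                            neural_responses: np.ndarray,
                            n_splits: int = 5) -> float:
    """model_activations: (n_stimuli, n_features); neural_responses: (n_stimuli, n_neurons)."""
    per_split_scores = []
    for train_idx, test_idx in KFold(n_splits=n_splits, shuffle=True, random_state=0).split(model_activations):
        # Fit a regularized linear mapping from activations to all neurons at once.
        regression = Ridge(alpha=1.0).fit(model_activations[train_idx], neural_responses[train_idx])
        predictions = regression.predict(model_activations[test_idx])
        # Pearson r between predicted and measured responses, per neuron, then averaged.
        r_per_neuron = [np.corrcoef(predictions[:, n], neural_responses[test_idx, n])[0, 1]
                        for n in range(neural_responses.shape[1])]
        per_split_scores.append(np.nanmean(r_per_neuron))
    return float(np.mean(per_split_scores))
```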
Citation
If you use Brain-Score in your work, please cite "Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like?" (technical) and "Integrative Benchmarking to Advance Neurally Mechanistic Models of Human Intelligence" (perspective), as well as the respective benchmark sources.

@article{SchrimpfKubilius2018BrainScore,
  title={Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like?},
  author={Martin Schrimpf and Jonas Kubilius and Ha Hong and Najib J. Majaj and Rishi Rajalingham and Elias B. Issa and Kohitij Kar and Pouya Bashivan and Jonathan Prescott-Roy and Franziska Geiger and Kailyn Schmidt and Daniel L. K. Yamins and James J. DiCarlo},
  journal={bioRxiv preprint},
  year={2018},
  url={https://www.biorxiv.org/content/10.1101/407007v2}
}

@article{Schrimpf2020integrative,
  title={Integrative Benchmarking to Advance Neurally Mechanistic Models of Human Intelligence},
  author={Schrimpf, Martin and Kubilius, Jonas and Lee, Michael J and Murty, N Apurva Ratan and Ajemian, Robert and DiCarlo, James J},
  journal={Neuron},
  year={2020},
  url={https://www.cell.com/neuron/fulltext/S0896-6273(20)30605-X}
}