| Rank | Model | Reference | Average | V1 | V2 | V4 | IT | Behavior | Engineering (Deng2009-top1 v1) |
|------|-------|-----------|---------|----|----|----|----|----------|--------------------------------|
| 1 | CORnet-S | Kubilius et al., 2018 | .417 | .294 | .242 | .581 | .423 (.541, .305) | .545 | .747 |
| 2 | vgg-19 | Simonyan et al., 2014 | .408 | .347 | .341 | .610 | .248 (.496, X) | .494 | .711 |
| 3 | resnet-50-robust | Santurkar et al., 2019 | .408 | .378 | .365 | .537 | .243 (.486, X) | .515 | X |
| 4 | resnet-101_v1 | He et al., 2015 | .407 | .266 | .341 | .590 | .274 (.549, X) | .561 | .764 |
| 5 | vgg-16 | Simonyan et al., 2014 | .406 | .355 | .336 | .620 | .259 (.518, X) | .461 | .715 |
| 6 | resnet-152_v1 | He et al., 2015 | .405 | .282 | .338 | .598 | .277 (.553, X) | .533 | .768 |
| 7 | resnet-101_v2 | He et al., 2015 | .404 | .274 | .332 | .599 | .263 (.527, X) | .555 | .774 |
| 8 | resnet50-SIN_IN | Geirhos et al., 2019 | .404 | .282 | .324 | .599 | .276 (.552, X) | .541 | .746 |
| 9 | densenet-169 | Huang et al., 2016 | .404 | .281 | .322 | .601 | .274 (.548, X) | .543 | .759 |
| 10 | densenet-201 | Huang et al., 2016 | .402 | .277 | .325 | .599 | .273 (.545, X) | .537 | .772 |
| 11 | resnet-50-pytorch | He et al., 2015 | .399 | .289 | .317 | .600 | .259 (.518, X) | .528 | .752 |
| 12 | resnet-50_v1 | He et al., 2015 | .398 | .274 | .317 | .594 | .278 (.555, X) | .526 | .752 |
| 13 | resnet50-SIN_IN_IN | Geirhos et al., 2019 | .397 | .275 | .321 | .596 | .273 (.545, X) | .523 | .767 |
| 14 | resnet-152_v2 | He et al., 2015 | .397 | .274 | .326 | .591 | .266 (.532, X) | .528 | .778 |
| 15 | resnet-50_v2 | He et al., 2015 | .396 | .270 | .323 | .596 | .260 (.520, X) | .531 | .756 |
| 16 | densenet-121 | Huang et al., 2016 | .396 | .277 | .306 | .595 | .267 (.533, X) | .535 | .745 |
| 17 | resnext101_32x32d_wsl | Mahajan et al., 2018 | .396 | .267 | .289 | .574 | .254 (.507, X) | .594 | .854 |
| 18 | mobilenet_v1_1.0_160 | Howard et al., 2017 | .393 | .290 | .332 | .588 | .275 (.549, X) | .480 | .680 |
| 19 | resnext101_32x8d_wsl | Mahajan et al., 2018 | .392 | .271 | .312 | .586 | .241 (.481, X) | .551 | .842 |
| 20 | inception_v2 | Szegedy et al., 2015 | .392 | .284 | .313 | .587 | .270 (.539, X) | .505 | .739 |
| 21 | dorinet_cornet_z | | .390 | .310 | .311 | .582 | .468 (.468, X) | .279 | X |
| 22 | resnet-18 | He et al., 2015 | .390 | .274 | .302 | .583 | .266 (.531, X) | .524 | .698 |
| 23 | mobilenet_v2_1.0_224 | Howard et al., 2017 | .389 | .245 | .331 | .573 | .273 (.546, X) | .521 | .718 |
| 24 | mobilenet_v2_0.75_224 | Howard et al., 2017 | .388 | .236 | .316 | .586 | .268 (.535, X) | .533 | .698 |
| 25 | efficientnet-b0 | Tan et al., 2019 | .387 | .215 | .317 | .556 | .274 (.547, X) | .573 | X |
| 26 | fixres_resnext101_32x48d_wsl | Touvron et al., 2019 | .387 | .246 | .288 | .582 | .257 (.513, X) | .561 | .863 |
| 26 | resnext101_32x48d_wsl | Mahajan et al., 2018 | .387 | .246 | .288 | .582 | .257 (.513, X) | .561 | .822 |
| 28 | mobilenet_v2_1.3_224 | Howard et al., 2017 | .386 | .253 | .332 | .575 | .271 (.543, X) | .500 | .744 |
| 29 | resnet50-SIN | Geirhos et al., 2019 | .386 | .300 | .333 | .580 | .267 (.534, X) | .448 | .602 |
| 30 | pnasnet_large | Liu et al., 2017 | .385 | .264 | .305 | .578 | .263 (.526, X) | .515 | .829 |
| 31 | AT_efficientnet-b7 | Xie et al., 2020 | .385 | .276 | .308 | .583 | .281 (.562, X) | .475 | X |
| 32 | mobilenet_v2_0.75_192 | Howard et al., 2017 | .384 | .245 | .306 | .573 | .275 (.550, X) | .524 | .687 |
| 33 | mobilenet_v2_1.4_224 | Howard et al., 2017 | .384 | .257 | .321 | .566 | .277 (.554, X) | .500 | .750 |
| 34 | inception_v1 | Szegedy et al., 2014 | .384 | .259 | .311 | .589 | .244 (.488, X) | .518 | .698 |
| 35 | xception | Chollet et al., 2016 | .384 | .245 | .306 | .610 | .249 (.498, X) | .508 | .790 |
| 36 | AT_efficientnet-b4 | Xie et al., 2020 | .383 | .246 | .339 | .549 | .279 (.559, X) | .503 | X |
| 37 | mobilenet_v2_0.75_160 | Howard et al., 2017 | .383 | .278 | .316 | .573 | .273 (.547, X) | .473 | .664 |
| 38 | inception_v4 | Szegedy et al., 2016 | .382 | .238 | .299 | .574 | .263 (.526, X) | .537 | .802 |
| 39 | resnext101_32x16d_wsl | Mahajan et al., 2018 | .382 | .263 | .302 | .587 | .250 (.499, X) | .509 | .851 |
| 40 | inception_resnet_v2 | Szegedy et al., 2016 | .381 | .233 | .319 | .583 | .272 (.543, X) | .499 | .804 |
| 41 | efficientnet-b6 | Tan et al., 2019 | .381 | .263 | .295 | .563 | .271 (.541, X) | .513 | X |
| 42 | efficientnet-b2 | Tan et al., 2019 | .380 | .213 | .317 | .569 | .273 (.547, X) | .526 | X |
| 43 | nasnet_large | Zoph et al., 2017 | .380 | .282 | .291 | .585 | .270 (.541, X) | .470 | .827 |
| 44 | mobilenet_v1_1.0_224 | Howard et al., 2017 | .380 | .223 | .341 | .560 | .273 (.546, X) | .502 | .709 |
| 45 | efficientnet-b4 | Tan et al., 2019 | .379 | .228 | .286 | .575 | .272 (.543, X) | .535 | X |
| 46 | inception_v3 | Szegedy et al., 2015 | .379 | .241 | .307 | .596 | .273 (.545, X) | .477 | .780 |
| 47 | mobilenet_v2_1.0_192 | Howard et al., 2017 | .377 | .216 | .322 | .572 | .273 (.547, X) | .503 | .707 |
| 48 | mobilenet_v2_1.0_160 | Howard et al., 2017 | .376 | .239 | .322 | .570 | .275 (.550, X) | .472 | .688 |
| 49 | mobilenet_v2_0.5_192 | Howard et al., 2017 | .375 | .263 | .329 | .566 | .264 (.529, X) | .454 | .639 |
| 50 | mobilenet_v2_0.5_224 | Howard et al., 2017 | .372 | .229 | .308 | .569 | .266 (.533, X) | .488 | .654 |
| 51 | mobilenet_v1_0.75_224 | Howard et al., 2017 | .372 | .223 | .336 | .558 | .267 (.535, X) | .477 | .684 |
| 52 | AT_efficientnet-b2 | Xie et al., 2020 | .372 | .248 | .295 | .563 | .275 (.550, X) | .480 | X |
| 53 | resnet-34 | He et al., 2015 | .372 | .230 | .286 | .560 | .237 (.474, X) | .546 | .733 |
| 54 | AT_efficientnet-b0 | Xie et al., 2020 | .371 | .238 | .334 | .570 | .267 (.534, X) | .447 | X |
| 55 | mobilenet_v1_0.5_224 | Howard et al., 2017 | .370 | .221 | .340 | .555 | .260 (.521, X) | .474 | .633 |
| 56 | mobilenet_v1_1.0_192 | Howard et al., 2017 | .370 | .235 | .329 | .548 | .271 (.543, X) | .466 | .700 |
| 57 | mobilenet_v2_0.75_128 | Howard et al., 2017 | .369 | .237 | .320 | .553 | .271 (.541, X) | .464 | .632 |
| 58 | cornetz_contrastive | | .369 | .325 | .353 | .551 | .268 (.535, X) | .346 | X |
| 59 | alexnet | | .368 | .316 | .353 | .550 | .254 (.508, X) | .370 | .577 |
| 60 | mobilenet_v1_1.0_128 | Howard et al., 2017 | .368 | .254 | .325 | .557 | .267 (.535, X) | .437 | .652 |
| 61 | mobilenet_v1_0.75_128 | Howard et al., 2017 | .368 | .267 | .330 | .564 | .252 (.505, X) | .425 | .621 |
| 62 | mobilenet_v2_0.5_160 | Howard et al., 2017 | .368 | .258 | .305 | .562 | .264 (.528, X) | .448 | .610 |
| 63 | mobilenet_v2_1.0_128 | Howard et al., 2017 | .368 | .252 | .303 | .569 | .267 (.534, X) | .447 | .653 |
| 64 | mobilenet_v1_0.5_192 | Howard et al., 2017 | .367 | .220 | .337 | .566 | .260 (.520, X) | .454 | .617 |
| 65 | mobilenet_v1_0.75_192 | Howard et al., 2017 | .367 | .229 | .339 | .549 | .267 (.535, X) | .449 | .672 |
| 66 | mobilenet_v2_0.35_192 | Howard et al., 2017 | .366 | .264 | .301 | .568 | .259 (.518, X) | .437 | .582 |
| 67 | mobilenet_v2_1.0_96 | Howard et al., 2017 | .363 | .256 | .332 | .530 | .257 (.514, X) | .443 | .603 |
| 68 | resnet18-supervised | He et al., 2015 | .361 | .276 | .281 | .539 | .263 (.526, X) | .446 | X |
| 69 | mobilenet_v1_0.5_160 | Howard et al., 2017 | .361 | .265 | .320 | .557 | .252 (.503, X) | .410 | .591 |
| 70 | mobilenet_v2_0.35_160 | Howard et al., 2017 | .359 | .269 | .292 | .554 | .259 (.517, X) | .424 | .557 |
| 71 | mobilenet_v1_0.75_160 | Howard et al., 2017 | .359 | .213 | .346 | .558 | .264 (.529, X) | .413 | .653 |
| 72 | mobilenet_v2_0.35_224 | Howard et al., 2017 | .359 | .215 | .296 | .554 | .253 (.506, X) | .474 | .603 |
| 73 | mobilenet_v2_0.5_128 | Howard et al., 2017 | .358 | .222 | .309 | .557 | .262 (.525, X) | .440 | .577 |
| 74 | nasnet_mobile | Zoph et al., 2017 | .357 | .272 | .273 | .566 | .268 (.536, X) | .406 | .740 |
| 75 | mobilenet_v2_0.75_96 | Howard et al., 2017 | .350 | .208 | .305 | .527 | .258 (.516, X) | .451 | .588 |
| 76 | squeezenet1_0 | Iandola et al., 2016 | .341 | .304 | .320 | .591 | .229 (.459, X) | .263 | .575 |
| 77 | mobilenet_v1_0.5_128 | Howard et al., 2017 | .341 | .245 | .304 | .550 | .234 (.467, X) | .373 | .563 |
| 78 | squeezenet1_1 | Iandola et al., 2016 | .336 | .265 | .311 | .582 | .229 (.457, X) | .291 | .575 |
| 79 | mobilenet_v2_0.35_128 | Howard et al., 2017 | .333 | .245 | .289 | .530 | .235 (.470, X) | .367 | .508 |
| 80 | mobilenet_v2_0.5_96 | Howard et al., 2017 | .331 | .266 | .278 | .501 | .239 (.479, X) | .370 | .512 |
| 81 | ViT_L_32_imagenet1k | Dosovitskiy et al., 2021 | .328 | .265 | .291 | .531 | .227 (.454, X) | .324 | X |
| 82 | mobilenet_v1_0.25_224 | Howard et al., 2017 | .327 | .231 | .296 | .538 | .240 (.480, X) | .333 | .498 |
| 83 | ViT_L_32 | Dosovitskiy et al., 2021 | .324 | .305 | .301 | .511 | .219 (.439, X) | .286 | X |
| 84 | mobilenet_v1_0.25_192 | Howard et al., 2017 | .323 | .208 | .318 | .517 | .226 (.451, X) | .344 | .477 |
| 85 | CORnet-Z | Kubilius et al., 2018 | .322 | .298 | .182 | .553 | .223 (.447, X) | .356 | .470 |
| 86 | ViT_B_32_imagenet1k | Dosovitskiy et al., 2021 | .317 | .271 | .285 | .536 | .219 (.437, X) | .276 | X |
| 87 | resnet18-local_aggregation | Zhuang et al., 2019 | .314 | .253 | .308 | .563 | .268 (.536, X) | .177 | X |
| 88 | ViT_B_32 | Dosovitskiy et al., 2021 | .313 | .308 | .275 | .504 | .208 (.417, X) | .270 | X |
| 89 | mobilenet_v1_0.25_160 | Howard et al., 2017 | .312 | .198 | .293 | .509 | .229 (.457, X) | .330 | .455 |
| 90 | ViT_L_16_imagenet1k | Dosovitskiy et al., 2021 | .311 | .215 | .269 | .494 | .244 (.487, X) | .333 | X |
| 91 | bagnet9 | Brendel et al., 2019 | .307 | .215 | .260 | .550 | .200 (.401, X) | .307 | .260 |
| 92 | ViT_B_16_imagenet1k | Dosovitskiy et al., 2021 | .304 | .234 | .261 | .485 | .204 (.408, X) | .335 | X |
| 93 | mobilenet_v2_0.35_96 | Howard et al., 2017 | .303 | .183 | .249 | .501 | .230 (.460, X) | .351 | .455 |
| 94 | mobilenet_v1_0.25_128 | Howard et al., 2017 | .302 | .262 | .238 | .513 | .213 (.425, X) | .286 | .415 |
| 95 | ViT_B_16 | Dosovitskiy et al., 2021 | .302 | .242 | .260 | .498 | .190 (.380, X) | .320 | X |
| 96 | vggface | Parkhi et al., 2015 | .301 | .358 | .339 | .555 | .176 (.351, X) | .078 | X |
| 97 | resnet18-contrastive_multiview | Zhuang et al., 2020 | .293 | .258 | .265 | .551 | .231 (.461, X) | .161 | X |
| 98 | resnet18-instance_recognition | Wu et al., 2018 | .292 | .267 | .294 | .548 | .261 (.522, X) | .090 | X |
| 99 | resnet18-colorization | Zhuang et al., 2020 | .273 | .269 | .265 | .568 | .205 (.410, X) | .060 | X |
| 100 | resnet18-deepcluster | Zhuang et al., 2020 | .272 | .258 | .306 | .545 | .253 (.506, X) | .000 | X |
| 101 | resnet18-relative_position | Zhuang et al., 2020 | .262 | .278 | .302 | .544 | .194 (.388, X) | -0.006 | X |
| 102 | resnet18-depth_prediction | Zhuang et al., 2020 | .260 | .285 | .246 | .509 | .158 (.315, X) | .102 | X |
| 103 | dcgan | | .242 | .316 | .226 | .432 | .214 (.214, X) | .023 | X |
| 104 | resnet18-contrastive_predictive | Zhuang et al., 2020 | .236 | .247 | .263 | .497 | .163 (.325, X) | .010 | X |
| 105 | prednet | Zhuang et al., 2020 | .222 | .224 | .234 | .503 | .138 (.275, X) | .014 | X |
| 106 | resnet18-autoencoder | Zhuang et al., 2020 | .218 | .298 | .165 | .438 | .103 (.207, X) | .083 | X |
| 107 | pixels | | .030 | .053 | .003 | .068 | .008 (.015, X) | .020 | X |
Model scores on brain benchmarks. Scores are ceiled, i.e., normalized by each benchmark's noise ceiling. V1, V2, V4, and behavior each comprise a single benchmark, so the column shows that benchmark's score directly; IT comprises two benchmarks and is shown as the aggregate followed by the two per-benchmark scores in parentheses. The engineering column is the Deng2009-top1 v1 benchmark (ImageNet top-1 accuracy). X marks a benchmark the model was not scored on.
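
To make the bookkeeping concrete, here is a minimal Python sketch of how the displayed numbers compose, assuming that "ceiled" means a raw benchmark score divided by that benchmark's noise ceiling and that region and average columns are plain means of their constituent ceiled scores; the worked example reproduces the CORnet-S IT aggregate from the table.

def ceiled(raw_score, ceiling):
    """Normalize a raw benchmark score by the benchmark's noise ceiling (assumed convention)."""
    return raw_score / ceiling

def aggregate(scores):
    """Average ceiled per-benchmark scores into a region score (assumed convention)."""
    return sum(scores) / len(scores)

# Worked example from the table: CORnet-S on the two IT benchmarks.
print(round(aggregate([.541, .305]), 3))  # 0.423, matching the IT column above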

About

The Brain-Score platform aims to yield strong computational models of the ventral stream. We enable researchers to quickly get a sense of how their model scores on standardized brain benchmarks along multiple dimensions, and we facilitate comparisons to other state-of-the-art models. At the same time, new brain data can quickly be tested against a wide range of models to determine how well existing models explain the data.

Brain-Score is organized by the Brain-Score team in collaboration with researchers and labs worldwide. We are working towards an easy-to-use platform where a model can be submitted to yield its scores on a range of brain benchmarks, and where new benchmarks can be incorporated to challenge the models.

This quantified approach lets us keep track of how close our models are to the brain on a range of experiments (data) using different evaluation techniques (metrics). For more details, please refer to the technical paper and the perspective paper.
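
As a rough illustration of that structure, the sketch below pairs recordings (data) with an evaluation technique (metric) and a ceiling to form a benchmark. All names here are illustrative and are not the Brain-Score library's actual API.

import numpy as np

class Benchmark:
    """Illustrative benchmark: recordings (data) + a comparison function (metric) + a ceiling."""
    def __init__(self, recordings, metric, ceiling):
        self.recordings = recordings
        self.metric = metric
        self.ceiling = ceiling

    def score(self, model_responses):
        raw = self.metric(model_responses, self.recordings)
        return raw / self.ceiling  # "ceiled" score, as in the table above

# Toy usage with synthetic data and a correlation metric.
rng = np.random.default_rng(0)
recordings = rng.normal(size=100)
model_responses = recordings + rng.normal(scale=0.5, size=100)
pearson = lambda a, b: np.corrcoef(a, b)[0, 1]
benchmark = Benchmark(recordings, pearson, ceiling=0.9)
print(round(benchmark.score(model_responses), 3))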

Compare

Interactive scatter plot on the website: choose any two benchmarks as the x- and y-axis data to compare model scores, with one dot per model.

Participate

Challenge the data: Submit a model

If you would like to score a model, please log in on the Brain-Score website.

Challenge the models: Submit data

If you have neural or behavioral recordings that you would like models to compete on, please get in touch with us to submit data.

Change the evaluation: Submit a metric

If you have an idea for a different way of comparing brain and machine, please send in a pull request.
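
For a sense of what a metric can look like, here is a self-contained sketch in the spirit of regression-based neural predictivity (fit a linear map from model features to neural responses, then correlate predictions with held-out responses). It is illustrative only, not the library's implementation, and all names and parameters are assumptions.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

def neural_predictivity(model_features, neural_responses, seed=0):
    """Fit a linear map from model features to neural responses and
    return the median Pearson r across neurons on held-out stimuli."""
    X_train, X_test, y_train, y_test = train_test_split(
        model_features, neural_responses, test_size=0.25, random_state=seed)
    predictions = Ridge(alpha=1.0).fit(X_train, y_train).predict(X_test)
    rs = [np.corrcoef(predictions[:, i], y_test[:, i])[0, 1]
          for i in range(y_test.shape[1])]
    return float(np.median(rs))

# Toy data: 200 stimuli, 50 model features, 10 recorded neurons.
rng = np.random.default_rng(0)
features = rng.normal(size=(200, 50))
responses = features @ rng.normal(size=(50, 10)) + rng.normal(size=(200, 10))
print(round(neural_predictivity(features, responses), 3))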

Citation

If you use Brain-Score in your work, please cite Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like? (technical) and Integrative Benchmarking to Advance Neurally Mechanistic Models of Human Intelligence (perspective) as well as the respective benchmark sources.
@article{SchrimpfKubilius2018BrainScore,
  title={Brain-Score: Which Artificial Neural Network for Object Recognition is most Brain-Like?},
  author={Martin Schrimpf and Jonas Kubilius and Ha Hong and Najib J. Majaj and Rishi Rajalingham and Elias B. Issa and Kohitij Kar and Pouya Bashivan and Jonathan Prescott-Roy and Franziska Geiger and Kailyn Schmidt and Daniel L. K. Yamins and James J. DiCarlo},
  journal={bioRxiv preprint},
  year={2018},
  url={https://www.biorxiv.org/content/10.1101/407007v2}
}

@article{Schrimpf2020integrative,
  title={Integrative Benchmarking to Advance Neurally Mechanistic Models of Human Intelligence},
  author={Schrimpf, Martin and Kubilius, Jonas and Lee, Michael J and Murty, N Apurva Ratan and Ajemian, Robert and DiCarlo, James J},
  journal={Neuron},
  year={2020},
  url={https://www.cell.com/neuron/fulltext/S0896-6273(20)30605-X}
}