THINGS-data: A multimodal collection of large-scale datasets for investigating object representations in brain and behavior

Posted on 2023-01-17 - 19:30 authored by Martin Hebart

Here we provide all datasets that are part of the THINGS-data collection, comprising functional MRI recordings, magnetoencephalography (MEG) recordings, and 4.70 million behavioral similarity judgments in response to thousands of photographic images of up to 1,854 object concepts. THINGS-data is unique in its breadth of richly annotated objects, allowing countless hypotheses to be tested at scale while assessing the reproducibility of previous findings. Beyond the unique insights promised by each individual dataset, the multimodality of THINGS-data allows the datasets to be combined for a much broader view of object processing than was previously possible.

CITE THIS COLLECTION

Hebart, Martin; Contier, Oliver; Teichmann, Lina; Rockter, Adam; Zheng, Charles; Kidder, Alexis; et al. (2023): THINGS-data: A multimodal collection of large-scale datasets for investigating object representations in brain and behavior. Figshare+. Collection. https://doi.org/10.25452/figshare.plus.c.6161151.v1

FUNDING

Intramural Research Program of the National Institutes of Health (ZIA-MH-002909, ZIC-MH002968)

ERC Starting Grant project COREDIM (101039712)

Research group grant by the Max Planck Society awarded to MNH

Object, face, body and scene representations in the human brain (National Institute of Mental Health)

Machine Learning Team (National Institute of Mental Health)

RESEARCH INSTITUTION(S)

National Institute of Mental Health, Max Planck Institute for Human Cognitive and Brain Sciences

CONTACT EMAIL

hebart@cbs.mpg.de
