Perceiving animacy in 'identical' images
Boger, T.; Firestone, C.
Some objects appear animate (e.g., dogs and elephants) while others do not (e.g., boots and sofas). This distinction pervades human cognition, with an expansive literature reporting striking effects of animacy on vision, memory, social perception, and neural organization. But studies of perceived animacy face a persistent challenge: Objects that differ in animacy tend to differ in many lower-level visual features (e.g., shape, texture, spatial frequency). Thus, it remains controversial whether animacy per se -- as opposed to its lower-level correlates -- drives visual processing. Here, we achieve previously unattainable levels of experimental control to demonstrate that the visual system represents animacy itself, beyond its lower-level covariates. We vary animacy while holding nearly all lower-level features constant by exploiting "visual anagrams" -- a diffusion-based technique for generating static images whose interpretations change radically with orientation. Seven pre-registered experiments leverage this approach to demonstrate that representations of animacy structure visual working memory and guide visual attention. Thus, the visual system extracts animacy itself, beyond its lower-level correlates.