Abstract: How similar is the human mind to the sophisticated machine-learning systems that mirror its performance? Biologically inspired convolutional neural networks (CNNs) have taken our field by storm, achieving human-level benchmarks in recognizing novel images and objects. These advances support transformative technologies such as self-driving cars and machine medical diagnosis; beyond this, they also serve as candidate models of the human mind itself. However, unlike humans, CNNs can be "fooled" by adversarial examples -- carefully crafted images that look like nonsense patterns to humans but are recognized as familiar objects by machines, or that look like one object to humans (e.g., an orange) and a different object to machines (e.g., a missile). This seemingly extreme divergence between human and machine classification fundamentally challenges these new advances, both as applied image-recognition systems and as models of the human mind. Surprisingly, however, little work has empirically investigated human classification of such stimuli: Do human and machine performance ultimately diverge? Or could humans engage in "machine theory of mind" and predict the CNN's preferred labels? Here, I will show that human and machine classification are robustly related: across many prominent and diverse adversarial image sets, human subjects can reliably identify the machine's preferred label over relevant foils, even for images described in the literature as "totally unrecognizable to human eyes". I suggest that human intuition is a surprisingly reliable guide to machine (mis)classification, and I explore the consequences of these results for psychology, neuroscience, computer vision, "explainable AI", and our shared cyberpunk future.
For further information please contact Johnny Dubois at email@example.com.
If you require an accommodation due to a disability, please contact the event coordinator or email firstname.lastname@example.org at least five days prior to the event. We will work with you to make appropriate arrangements.