Sensing space – the world around us and our position in it – is one of the most important jobs the brain faces. Spatial information derives from many sources – vision, hearing, balance, touch, proprioception, movement and memory – and these systems must coordinate with one another to synthesize our spatial knowledge. I will describe some of the computations essential to knitting together visual and auditory information, which requires coordination not only across different reference frames but also across different formats for encoding information (akin to digital vs. analog coding). Recent work in my laboratory suggests that this process begins in the ear itself, via the brain’s ability to motorically control the mechanics of the eardrum. Finally, I will discuss the important role that spatial sensory and motor processing plays in supporting our abilities to think and remember.