Although vision seems to predominate in triggering the simulation of the behaviour and mental states of others, the social perception of actions might rely on auditory and olfactory information not only when vision is lacking (e.g. in congenitally blind individuals), but also in daily life (e.g. hearing footsteps along a dark street prompts an appropriate fight-or-flight reaction, and smelling the scent of coffee prompts the act of grasping a mug). Here, we review recent evidence showing that non-visual, telereceptor-mediated motor mapping might occur as an autonomous process, as well as within the context of the multimodal perceptions and representations that characterize real-world experiences. Moreover, we discuss the role of auditory and olfactory resonance in anticipating the actions of others and, therefore, in shaping social interactions.