The processing capabilities of biological visual systems still far outperform conventional artificial vision in real-time, low-power applications. There is increasing evidence that biology has evolved a multitude of cell types, including at the level of the retina, to adapt to an extensive set of dynamic visual environments. Existing bio-inspired artificial vision technology has failed to consider the utility of modelling this rich diversity of cells, even though these cells are crucial to biology's ability to process the natural visual environment. To address this shortcoming, the VISUALISE project will create a refined understanding of retinal function in natural visual environments, enhanced models of biological signal processing in the retina, and the next generation of bio-inspired asynchronous vision sensors. To achieve these objectives we will combine the efforts of physiologists, computational neuroscientists, neuromorphic electronic engineers, and roboticists to build novel theoretical and hardware models of biological retinal ganglion cell types for dynamic vision applications. We will 1) record the activities of vertebrate retinal ganglion cells with multi-electrode arrays under dynamic natural stimulation, 2) analyse their functional response properties to expose new principles of spike encoding that bridge the gap between single-cell and population information processing, 3) exploit these principles in multi-scale mathematical models that permit efficient digital circuit implementations for a next generation of real-time event-based vision sensors, and 4) evaluate the sensors' effectiveness in a challenging high-speed predator-prey robot scenario.
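To make the notion of an event-based vision sensor concrete, the sketch below shows the standard event-camera pixel principle: each pixel asynchronously emits an ON or OFF event whenever its log-intensity changes by more than a fixed threshold since the last event, so output is driven by scene dynamics rather than a frame clock. This is a minimal illustration of the general principle only, not the retina-derived encoding models the project itself will develop; the function name and threshold value are illustrative assumptions.

```python
import math

def dvs_events(intensities, threshold=0.2):
    """Emit (sample_index, polarity) events whenever log-intensity
    moves by at least `threshold` from the stored reference level --
    a minimal sketch of the generic event-camera pixel, not the
    project's actual retinal ganglion cell model."""
    events = []
    ref = math.log(intensities[0])  # reference log-intensity at reset
    for i, value in enumerate(intensities[1:], start=1):
        level = math.log(value)
        while level - ref >= threshold:   # brightness increased: ON event
            ref += threshold
            events.append((i, +1))
        while ref - level >= threshold:   # brightness decreased: OFF event
            ref -= threshold
            events.append((i, -1))
    return events

# A constant input produces no events; only changes are signalled.
print(dvs_events([1.0, 1.5, 1.5, 0.9]))
```

Because an unchanging scene generates no events at all, such sensors can devote their bandwidth and power budget entirely to motion, which is what makes them attractive for the high-speed, low-power robot scenario described above.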