A key question in visual neuroscience is how our subjective experience of the visual world remains largely uninterrupted by the many eye movements we make, each of which abruptly displaces the retinal image. It has long been hypothesized that extra-retinal signals, generated by voluntary movements of the eye, alter our vision in ways that facilitate visual perception across eye movements. However, it is unclear how these extra-retinal signals influence the integration of form information, which underlies our ability to identify visual objects even though eye movements shift those objects on the retina. I therefore used eye tracking and psychophysics to test the influence of eye movements on object perception in peripheral vision. In this thesis I present studies showing, for the first time, substantial changes in object identification that result from eye movement preparation and execution. The results demonstrate that our visual experience is not constructed solely from the successive images that fall on the retina, but is also shaped by eye movement signals that facilitate the perception of objects from one glance to the next.