
Time perception as accumulation of salient events


  • Speaker: Warrick Roseboom (University of Sussex)
  • Time: Friday 19 November 2021, 16:15-18:00
  • Venue: Zoom meeting

If you have a question about this talk, please contact Yasmin Fouani-Eckstein.

Human perception and experience of time on the scale of seconds to minutes depends heavily on the context and content of experience, as clearly reflected in aphorisms such as “time flies when you’re having fun” or “a watched pot never boils”. It has been established for decades that both lower-level stimulus properties – e.g. the rate of change, speed, or complexity of a stimulus – and higher-level contextual properties – e.g. whether the scene you are in is busy or quiet – strongly influence perceived time. In these cases, the intuition that more-stuff-happening-faster results in longer perceived duration generally holds.

These features of the natural experience of time suggest that a basis for human time perception might be found in the dynamics of neural activity across sensory processing hierarchies – specifically in the moments of larger changes in activity that we refer to as salient events. We proposed that a simple way to characterise and track salient events is as a kind of prediction error, starting from the difference between the states of a neural network at successive instants. This assumes that, in the absence of any more precise information, the immediate past is a good prediction of the immediate future.

We have shown that this algorithmic approach can reproduce human-like estimates of, and biases in, time perception when applied both to models of the visual processing hierarchy – deep convolutional neural networks trained for image classification – and to neuroimaging data from the human visual processing hierarchy. Further, we can even predict trial-by-trial subjective reports of duration for a given participant based only on the fMRI BOLD signal measured while they view naturalistic videos. Using salient events as a basis for time perception links naturally with predictive coding accounts of perception, as well as with the prominent event segmentation-based accounts of episodic memory.
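The successive-state-difference idea above can be illustrated as a simple accumulator. The sketch below is a minimal, hypothetical rendering (the function name, threshold values, and decay rule are assumptions for illustration, not details given in the talk): a salient event is registered whenever the change between consecutive activation vectors exceeds a threshold that decays between events, and the accumulated event count serves as a proxy for perceived duration.

```python
import numpy as np

def accumulate_salient_events(states, threshold=1.0, decay=0.99):
    """Count salient events in a sequence of network states.

    states: array of shape (T, D), one activation vector per time step.
    An event is registered when the Euclidean distance between
    successive states exceeds the current threshold; the threshold
    resets after each event and decays otherwise, so sensitivity
    increases during quiet stretches. All parameters are illustrative.
    """
    thr = threshold
    events = 0
    for t in range(1, len(states)):
        change = np.linalg.norm(states[t] - states[t - 1])
        if change > thr:
            events += 1
            thr = threshold      # reset threshold after an event
        else:
            thr *= decay         # decay threshold toward higher sensitivity
    return events
```

Under this sketch, faster-changing input produces more threshold crossings, and hence a longer estimated duration, matching the more-stuff-happening-faster intuition described above.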
We are currently working to compare model-based and data-driven approaches to event segmentation, as applied to EEG/MEG, to see which features are common across the different methods and where they diverge both from each other and from human annotations of naturalistic experience. This line of work resolves many contentious perspectives in the time perception field, while also bringing time perception and episodic memory back to the same basic units of operation – salient events in experience – all under a predictive processing framework.

This talk is part of the Zangwill Club series.



© 2006-2021 Talks.cam, University of Cambridge.