How We Process Events—and Why It Matters for Games

Every time we move through the world, experience a story, or play a game, our brains are actively dividing those experiences into meaningful chunks. This process, known as event segmentation, helps us organize information, predict what will happen next, and decide what details to remember or forget. But event boundaries—like walking through a doorway, transitioning between scenes, or shifting between different sounds—can also disrupt memory and change how we process information.

Understanding how event segmentation works is crucial not just for psychology and neuroscience but also for game design. How do players experience level transitions? How do they recall information across different environments? Can sound shifts, time jumps, or even subtle environmental changes impact immersion and memory? Research into event segmentation helps answer these questions, offering insights that can shape narrative pacing, level design, player tutorials, and even AI-driven game adaptation.

This page explores the science behind event segmentation—focusing on the Event Horizon Model, key findings from cognitive research, and open questions that remain. It also highlights how virtual reality (VR) provides new ways to study these effects and, most importantly, how these insights can be applied to game design to create more seamless, engaging, and immersive player experiences.

Event Segmentation Theory and the Event Horizon Model

Event Segmentation Theory proposes that we naturally divide continuous experiences into meaningful segments, or “events,” to help process and store information efficiently. These event boundaries act as markers, influencing what we remember and how we predict future occurrences. The Event Horizon Model builds on this idea, suggesting that when we encounter an event boundary—such as entering a new room or shifting to a different topic—our cognitive system prioritizes new information while de-emphasizing the previous event. This has profound implications for memory recall, learning, and attention.

What Has Gabriel Radvansky’s Team Found?

Gabriel Radvansky and colleagues have shown that event boundaries impair memory recall. For example, their research demonstrates that simply walking through a doorway (a spatial event boundary) reduces memory for information learned in the prior room—often referred to as the “doorway effect”. These findings suggest that our cognitive system resets at event boundaries, making it harder to retrieve details from before the shift. They’ve also explored how time-based event shifts and narrative structure influence recall, reinforcing the idea that event segmentation is a fundamental aspect of human cognition.

Unanswered Questions and New Research Directions

While event segmentation has been well-documented in spatial and narrative contexts, many open questions remain:

Will different spatio-temporal event shifts (e.g., weather changes, time jumps) produce the same effect?

If a sudden storm transition (such as in Gris) or a time skip in a story triggers an event boundary, will it disrupt memory in the same way a doorway does? Investigating this would help refine our understanding of how different types of boundaries influence recall.

Can an “event shift” in non-verbal audio also produce this effect?

Most event segmentation studies focus on visual or linguistic changes, but what about sound-based shifts? If a game or movie suddenly changes background music or shifts between different ambient soundscapes, could this trigger event segmentation in the same way a scene change does? Studying this could provide insight into how sound influences cognition and memory, and into the cognitive effects that games with adaptive soundtracks (like Breath of the Wild) may be producing.

Will eye tracking provide information about prediction processes in participants?

Eye movements reflect anticipation and attention, meaning that gaze behavior could reveal how people predict upcoming events based on segmentation cues. If participants’ gaze patterns shift before an event boundary, this could indicate that they’re preparing for the transition, giving us new ways to measure cognitive processing of event shifts.
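One way this question might be operationalized: compare where participants look in the moments just before a boundary against an earlier baseline window, and ask whether gaze has already drifted toward the upcoming region. The sketch below assumes a hypothetical data layout (gaze samples as timestamp/x/y tuples, known boundary times, a 0.5-second analysis window); the names and thresholds are illustrative, not from any published protocol.

```python
# Sketch: do fixations drift toward the upcoming region before a boundary?
# Gaze samples are (timestamp_seconds, x, y) tuples in screen space.
# The 0.5 s window and all names are illustrative assumptions.

def mean_gaze(samples, t_start, t_end):
    """Average gaze position over a time window, or None if no samples fall in it."""
    window = [(x, y) for t, x, y in samples if t_start <= t < t_end]
    if not window:
        return None
    n = len(window)
    return (sum(x for x, _ in window) / n, sum(y for _, y in window) / n)

def anticipatory_shift(samples, boundary_t, target_xy, window=0.5):
    """True if gaze is closer to the upcoming region's center in the window
    just before the boundary than in the baseline window before that."""
    baseline = mean_gaze(samples, boundary_t - 2 * window, boundary_t - window)
    pre = mean_gaze(samples, boundary_t - window, boundary_t)
    if baseline is None or pre is None:
        return False

    def dist(p):
        return ((p[0] - target_xy[0]) ** 2 + (p[1] - target_xy[1]) ** 2) ** 0.5

    return dist(pre) < dist(baseline)
```

If gaze patterns reliably pass this kind of test before boundaries, that would be converging evidence that participants are predicting the transition rather than merely reacting to it.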

How Virtual Reality Helps Us Investigate These Questions

Virtual reality (VR) provides a powerful tool for testing event segmentation in controlled yet immersive environments. Unlike traditional lab experiments, VR allows researchers to precisely manipulate event boundaries—whether spatial (doorways, landscape changes), temporal (time jumps, day/night cycles), or sensory (sudden changes in lighting or sound). By using VR, we can:

  • Track how participants explore environments and whether their movement patterns align with event segmentation theories.
  • Measure eye tracking data in real time to assess how attention and prediction shift at event boundaries.
  • Test audio-based event segmentation by controlling how soundscapes change in response to player actions.
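As a concrete illustration of the first point, spatial event boundaries in a VR study can be logged directly from the tracked head position: label the environment's rooms, then record an event whenever the player's room label changes. The room layout, names, and coordinates below are hypothetical, a minimal sketch rather than any particular experiment's setup.

```python
# Sketch: flag spatial event boundaries from a stream of VR head positions.
# Rooms are hypothetical axis-aligned rectangles (x_min, x_max, z_min, z_max);
# a boundary event is logged whenever the player's room label changes.

ROOMS = {
    "kitchen": (0.0, 4.0, 0.0, 4.0),
    "hallway": (4.0, 6.0, 0.0, 4.0),
}

def room_of(x, z):
    """Return the label of the room containing (x, z), or None if outside all rooms."""
    for name, (x0, x1, z0, z1) in ROOMS.items():
        if x0 <= x < x1 and z0 <= z < z1:
            return name
    return None

def boundary_events(positions):
    """Return (timestamp, from_room, to_room) for each room-to-room transition."""
    events, prev = [], None
    for t, x, z in positions:
        cur = room_of(x, z)
        if prev is not None and cur is not None and cur != prev:
            events.append((t, prev, cur))
        if cur is not None:
            prev = cur
    return events
```

The resulting event log can then be aligned with memory probes or eye-tracking data to test whether recall and attention change at exactly these crossings.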

Why This Research Matters for Game Design

Games are built on progression and segmentation—whether it’s moving between levels, transitioning through story beats, or exploring new environments. Understanding event segmentation can help designers:

Optimize level transitions to reduce memory disruption and improve narrative cohesion.

In Hades, when Zagreus dies, he doesn’t just “restart”; he returns to the House of Hades, where characters acknowledge his progress. This prevents resets from feeling like a disjointed game mechanic and instead integrates them into the story. Additionally, interwoven character arcs unfold gradually, giving a sense of progression even when core gameplay loops reset.


Leverage event boundaries to create impactful moments, such as using audio cues or environmental changes to naturally segment gameplay.

In Nier: Automata, when 2B and 9S first enter the amusement park, the shift in atmosphere is immediately striking. The sudden contrast—cheerful music, fireworks, and passive machine lifeforms—marks a clear event boundary from the war-torn, desolate city ruins. This primes players to process the amusement park as a distinct experience, reinforcing the eerie contrast between the festive setting and the underlying tension of Nier: Automata’s themes.

The rollercoaster sequence takes this even further. As the player boards the ride, the camera perspective shifts to a dynamic side-scrolling view, altering how movement and combat feel. This shift in perspective acts as a cognitive event boundary, signaling that the gameplay experience is about to change. The player is no longer navigating an open 3D space but engaging in an on-rails combat segment, reinforcing the moment’s cinematic quality.
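In implementation terms, an audio cue like this often comes down to a crossfade: when the boundary is crossed, the outgoing soundscape fades out while the incoming one fades in. A minimal sketch of the gain math, assuming a linear fade (the duration and the two-track setup are illustrative, not how any particular engine does it):

```python
# Sketch: linear crossfade gains for marking an audio event boundary.
# Given a time t (seconds since the boundary) and a fade duration,
# return (outgoing_gain, incoming_gain). Values are illustrative.

def crossfade_gains(t, fade_duration=2.0):
    """Linear crossfade: the outgoing track fades out as the incoming fades in."""
    if fade_duration <= 0:
        return (0.0, 1.0)
    progress = min(max(t / fade_duration, 0.0), 1.0)
    return (1.0 - progress, progress)
```

Real middleware typically offers equal-power curves and beat-synchronized transitions, but even this simple version shows how an audible shift can be timed to coincide with, and thereby reinforce, an intended event boundary.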

Design better tutorials and onboarding, ensuring that critical information is retained even as players move between different gameplay sections.

Breath of the Wild reinforces mechanics naturally by aligning them with event boundaries, ensuring players retain key information as they transition between gameplay sections. For example, environmental shifts—like the cold in the Great Plateau—segment survival mechanics within distinct moments. By embedding tutorials within meaningful gameplay shifts, the game ensures players absorb mechanics through experience rather than repetition.

Strengthen narrative flow and pacing by breaking stories into digestible chunks that allow major plot points to land at the right moments.

Scarlet Nexus is divided into clear Phases (chapters), which naturally segment the story. Each phase introduces major plot developments, new characters, and fresh objectives, helping players organize information and anticipate what’s coming next. Event boundaries are also reinforced by gameplay, where major battles or story beats often coincide with unlocking new SAS abilities or adding new party members.

By bridging cognitive science and game design, this research can inform how players perceive, process, and remember experiences, ultimately shaping how we create engaging, intuitive, and memorable games.