Lester Loschky

Contact Information

Office: BH 471

Phone: 532-6882

E-mail: loschky@ksu.edu

Curriculum Vitae (CV)

ResearchGate Profile

Google Scholar Profile

Visual Cognition Laboratory

Faculty Advisor to the Cognitive Neuroscience Core, Center for Cognitive and Neurobiological Approaches to Plasticity (CNAP)

Funding Sources

Current & Previous:

National Science Foundation (NSF)

Office of Naval Research (ONR)


Major Research Themes

  • How scene perception and event comprehension influence eye movements
  • The role of attention and eye movements in online learning

Research Interests

  • Research Focus: Our laboratory conducts cutting-edge, interdisciplinary research in Visual Cognition & Attention, both basic and applied. Using real-time eye tracking, we explore how attention shapes people’s awareness, understanding, learning, and memory. We also integrate machine learning and AI into our methods.
  • Laboratory Credo: We believe that excellent basic research inspires practical applications, and excellent applied research inspires theoretical implications.
  • Current Applied Research: Supported by a National Science Foundation grant, our current focus is on measuring and modeling students' attention during online learning. We use webcam-based eye movement measures, mouse movements, and keyboard input to train AI, with the goal of enhancing student engagement by enabling instructors to adapt their teaching strategies effectively.
  • Current Basic Research: Our pioneering research investigates how viewers’ understanding affects their visual attention. Using stimuli such as movie clips, we explore how comprehension influences eye movements, advancing our understanding of visual cognition and its implications for artificial intelligence (AI). We are continuously developing a theory, the Scene Perception & Event Comprehension Theory (SPECT), that connects this work and guides our research.
  • Active Interdisciplinary Collaborations: We collaborate globally with psychologists (studying eye movements, perception, and comprehension), computer scientists (using machine learning and computational modeling), and STEM education researchers (studying physics learning). We also collaborate with Kansas State faculty using EEG, TMS, and fMRI. These collaborators often serve on our PhD students' dissertation committees, expanding our students' skills and professional networks.

Student Involvement

Research Opportunities in My Lab: A Collaborative and Empowering Approach

My philosophy for working with students centers on providing guidance while encouraging them to contribute their own ideas and viewpoints.

Graduate students can either work on one of my ongoing research projects or propose their own topics, depending on their experience and motivation. They will also gain valuable experience supervising undergraduate research assistants. Financial support for graduate students comes from grant funds when available or departmental graduate teaching assistantships. Students who contribute significantly to our research will have ample opportunity to co-author publications resulting from their work.

I am currently accepting applications for graduate students for 2025.

  • Enrollment Options & Deadlines:
    • January 2025 (Application deadline: August 1, 2024)
    • August 2025 (Application deadline: December 1, 2024)

Students interested in working with me can contact me by e-mail (loschky@ksu.edu); please use "Prospective PhD Student in Visual Cognition" as the subject line.

  • Financial Support:
    • Five years of funding for students entering with a BS/BA, or four years for those entering with an MS/MA.
    • Funding includes Graduate Research Assistantship and/or Graduate Teaching Assistantship, covering tuition and benefits.

Undergraduate students interested in research can apply to be a PSYCH 599 research assistant in my lab. As a research assistant, students will experience the entire research cycle, including:

  • Reading and reviewing relevant literature
  • Generating research questions and hypotheses
  • Designing and preparing experiments
  • Conducting experiments
  • Analyzing data
  • Writing up results
  • Presenting findings at conferences or submitting them for publication in scientific journals

The activities that an individual research assistant participates in will depend on their level of motivation and commitment. This experience is invaluable for understanding graduate-level research and can greatly strengthen a graduate school application.

Former Graduate Students

  • Jared J. Peterson, Ph.D. (2018), M.S. (2016), Kansas State University; B.S., University of Wisconsin-La Crosse. ResearchGate profile. Research Psychologist at the U.S. Coast Guard Research and Development Center.
  • Tyler E. Freeman, Ph.D. (2012), M.S. (2009), Kansas State University; B.S., University of North Carolina at Wilmington. ResearchGate profile, LinkedIn profile. Director, Human Performance at ICF International.

Representative Publications

(* indicates current or former student co-author)

Loschky, L. C., *Smith, M. E., *Chandran, P., *Hutson, J. P., Smith, T. J., & Magliano, J. P. (submitted). The Role of Event Understanding in Guiding Attentional Selection in Real-world Scenes: The Scene Perception & Event Comprehension Theory (SPECT). Invited submission to Vision Research.

 

*Chandran, P., *Huang, Y., *Munsell, J., *Howatt, B., *Wallace, B., *Wilson, L., D’Mello, S., Hoai, M., Rebello, N.S., & Loschky, L.C. (2024). Characterizing Learners’ Complex Attentional States During Online Multimedia Learning Using Eye-tracking, Egocentric Camera, Webcam, and Retrospective Recalls. Proceedings of the 2024 Symposium on Eye Tracking Research and Applications (pp. 59-66). New York, NY: ACM.

 

*Smith, M. E., Loschky, L. C., & Bailey, H. R. (2023). Eye movements and event segmentation: Eye movements reveal age-related differences in event model updating. Psychology and Aging. Advance online publication. https://doi.org/10.1037/pag0000773

 

*Miller, S. S., *Hutson, J. P., *Strain, M. L., Smith, T. J., *Palavamäki, M., Loschky, L. C., & Saucier, D. A. (2023). The role of individual differences in resistance to persuasion on memory for political advertisements. Frontiers in Psychology, 14. https://doi.org/10.3389/fpsyg.2023.1196209

 

*Hutson, J.P., *Chandran, P., Magliano, J.P., Smith, T.J., & Loschky, L.C. (2022). Narrative comprehension guides eye movements in the absence of motion. Cognitive Science, 46(5), e13131. https://doi.org/10.1111/cogs.13131

 

*Smith, M. E., Loschky, L. C., & Bailey, H. R. (2021). Knowledge guides attention to goal-relevant information in older adults. Cognitive Research: Principles and Implications, 6(1), 1-22. https://doi.org/10.1186/s41235-021-00321-1

 

*Ringer, R.V., *Coy, A.M., Larson, A.M., & Loschky, L.C. (2021). Investigating visual crowding of objects in complex real-world scenes. i-Perception, 12(2), 1–24. https://doi.org/10.1177/2041669521994150

 

*Hutson, J. P., Magliano, J. P., Smith, T. J., & Loschky, L. C. (2021). “This Ticking Noise in My Head”: How Sound Design, Dialogue, Event Structure, and Viewer Working Memory Interact in the Comprehension of Touch of Evil (1958). Projections, 15(1), 1-27. https://doi.org/10.3167/proj.2021.150102

 

Loschky, L. C., Larson, A.M., Smith, T. J., & Magliano, J. P. (2020). The Scene Perception & Event Comprehension Theory (SPECT) Applied to Visual Narratives. Topics in Cognitive Science, 12(1), 311-351. https://doi.org/10.1111/tops.12455

 

*Zu, T.L., *Hutson, J., Loschky, L.C., & Rebello, N.S. (2020). Using Eye Movements to Measure Intrinsic, Extraneous, and Germane Load in a Multimedia Learning Environment. Journal of Educational Psychology, 112(7), 1338–1352. https://doi.org/10.1037/edu0000441

 

*Smith, M. E. & Loschky, L. C. (2019). The influence of sequential predictions on scene gist recognition. Journal of Vision, 19(12):14, 1–24. https://doi.org/10.1167/19.12.14

** 2020 Kansas State University nominee for the MAGS/ProQuest Distinguished Thesis Award for the Social Sciences.

 

Loschky, L. C., Szaffarczyk, S., *Beugnet, C., Young, M. E., & Boucart, M. (2019). The contributions of central and peripheral vision to scene-gist recognition with a 180° visual field. Journal of Vision, 19(5), 1-15. https://doi.org/10.1167/19.5.15

 

Loschky, L. C., *Hutson, J. P., *Smith, M. E., Smith, T. J., & Magliano, J. P. (2018). Viewing Static Visual Narratives Through the Lens of the Scene Perception and Event Comprehension Theory (SPECT). In J. Laubrock, J. Wildfeuer, & A. Dunst (Eds.), The Empirical Study of Comics. Routledge.

 

*Hutson, J. P., Magliano, J. P., Smith, T. J., & Loschky, L. C. (2017). What is the role of the film viewer? The effects of narrative comprehension and viewing task on gaze control in film. Cognitive Research: Principles and Implications, 2(1), 46, 1-30. https://doi.org/10.1186/s41235-017-0080-5

 

*Ringer, R.V., *Throneburg, Z., Johnson, A.P., Kramer, A.F., & Loschky, L.C. (2016). Impairing the Useful Field of View in natural scenes: Tunnel vision versus general interference. Journal of Vision, 16(2):7, 1-25. https://doi.org/10.1167/16.2.7

 

Loschky, L.C., *Larson, A.M., Magliano, J.P., & Smith, T.J. (2015). What would Jaws do? The tyranny of film and the relationship between gaze and higher-level narrative film comprehension. PLoS ONE, 10(11), e0142474. https://doi.org/10.1371/journal.pone.0142474

 

*Ringer, R. V., Johnson, A. P., *Gaspar, J., Neider, M., Crowell, J., Kramer, A. F., & Loschky, L. C. (2014). Creating a new dynamic measure of the Useful Field of View. In J. Mulligan (Ed.), Proceedings of the 2014 Symposium on Eye Tracking Research and Applications (pp. 59-66). New York, NY: ACM.

** 2014 Eye Tracking Research & Applications Symposium Best Full Paper Award, and Best Student Paper Award to graduate student Ryan Ringer for the same paper.

 

*Rouinfar, A., *Agra, E., *Larson, A. M., Rebello, N. S., & Loschky, L. C. (2014). Linking attentional processes and conceptual problem solving: Visual cues facilitate the automaticity of extracting relevant information from diagrams. Frontiers in Psychology, 5. https://doi.org/10.3389/fpsyg.2014.01094

 

Loschky, L.C., & *Larson, A.M. (2010). The natural/man-made distinction is made prior to basic-level distinctions in scene gist processing. Visual Cognition, 18(4), 513-536.

 

Loschky, L.C., Hansen, B.C., Sethi, A. & *Pydimarri, T. (2010). The role of higher-order image statistics in masking scene gist recognition. Attention, Perception & Psychophysics, 72(2), 427-444.

 

*Larson, A.M. & Loschky, L.C. (2009). The contributions of central versus peripheral vision to scene gist recognition. Journal of Vision, 9(10):6, 1-16. https://doi.org/10.1167/9.10.6

 

Loschky, L.C., McConkie, G.W., Yang, J. & Miller, M.E. (2005). The limits of visual resolution in natural scene viewing. Visual Cognition, 12(6), 1057-1092.

 

Zelinsky, G.J. & Loschky, L.C. (2005). Eye movements serialize memory for objects in scenes. Perception & Psychophysics, 67(4), 676-690.