1. 이성현 – Chris Dede, Tina A. Grotzer, Amy Kamarainen, Shari Metcalf, “EcoXPT: Designing for Deeper Learning through Experimentation in an Immersive Virtual Ecosystem”, Journal of Educational Technology & Society, Vol. 20, No. 4 (October 2017), pp. 166-178.
2. 오준오 – David M. Markowitz, Rob Laha, Brian P. Perone, Roy D. Pea, Jeremy N. Bailenson, “Immersive Virtual Reality Field Trips Facilitate Learning About Climate Change”, Frontiers in Psychology, Vol. 9, 2018.
3. 김대한 – D. Hamilton, J. McKechnie, E. Edgerton, C. Wilson, “Immersive virtual reality as a pedagogical tool in education: a systematic literature review of quantitative learning outcomes and experimental design”, Journal of Computers in Education, 2020.
4. 김희주 – Amy Kamarainen, Joseph Reilly, Shari Metcalf, Tina Grotzer, Chris Dede, “Using Mobile Location-Based Augmented Reality to Support Outdoor Learning in Undergraduate Ecology and Environmental Science Courses”, The Bulletin of the Ecological Society of America, 2018.
5. 서상우 – Carolin Helbig, Hans-Stefan Bauer, Karsten Rink, Volker Wulfmeyer, Michael Frank, Olaf Kolditz, “Concept and workflow for 3D visualization of atmospheric data in a virtual reality environment for analytical approaches”, Environmental Earth Sciences, Vol. 72 (2014), pp. 3767-3780.
So how does the Parallel Reality technology work?
Parallel Reality displays are enabled by a new kind of pixel. As demonstrated at CES 2020, each of these pixels can simultaneously project millions of light rays of differing color and brightness, and software can direct each ray toward a specific person. This is why a camera tracks you: the system needs to know your position so it can steer the right rays, from every pixel, toward your eyes.
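The core geometry behind this can be sketched in a few lines: for each tracked viewer, compute the direction from a pixel to that viewer's position, and emit the viewer's assigned content along that ray. This is a simplified illustration under assumed names (`ray_direction`, `assign_rays`, the viewer data), not the actual display firmware demonstrated at CES 2020.

```python
import math

def ray_direction(pixel_pos, viewer_pos):
    # Unit vector from a pixel to a tracked viewer's eye position.
    dx, dy, dz = (v - p for p, v in zip(pixel_pos, viewer_pos))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)

def assign_rays(pixel_pos, viewers):
    # viewers: {name: (position, color)}. For one multi-view pixel,
    # map each viewer to the ray direction and content (here just a
    # color) that the pixel should emit toward them.
    return {name: (ray_direction(pixel_pos, pos), color)
            for name, (pos, color) in viewers.items()}

# Two viewers standing at different positions receive different
# content from the same physical pixel at the origin.
viewers = {
    "alice": ((1.0, 0.0, 2.0), "red"),
    "bob": ((-1.0, 0.0, 2.0), "blue"),
}
rays = assign_rays((0.0, 0.0, 0.0), viewers)
```

In a real display this assignment would run per pixel and per frame, with the camera feed continuously updating each viewer's position.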