Virtual Reality (071011-1) Fall 2020 Final Exam (Due by 12/15/2020)
Dankook University, College of SW Convergence, Computer Engineering
The final exam is a take-home exam, handed out on Wednesday, December 9th in class and due by Tuesday, December 15th at 11:59 PM. Please turn in an MS Word (.doc) or Adobe Acrobat (.pdf) file via e-learning, and put your name and student ID on the exam you submit.
This is an individual exam, to be completed without the aid of other students in the class. All of your answers should be in your own words and in complete sentences, NOT simply quotes repeated from publications, books, lecture notes, or web pages. Answers that directly copy sentences from the book will NOT receive full credit. In answering the exam questions, it is crucial that you cite readings and outside publications relevant to your arguments (lecture notes from this class excepted).
“VR/AR Environmental Education” Term Project Final Document & Presentation (Due by 12/9)
– Prepare your final presentation (in YouTube presentation format)
– The final term project report will include information about your application, such as the features, a user interface description (including diagrams and pictures), sketches, images used in the design, and the various versions made during development. Include about 30 illustrations and technical documentation (main loop, finite state machine, data structures, descriptions and illustrations of implemented graphic effects, descriptions of the basic algorithms, and effects and methods created), etc.
– Submit the final presentation (yourname_termproject_final.ppt) and the final report (yourname_termproject_final.doc).
1. 이성현 – Chris Dede, Tina A. Grotzer, Amy Kamarainen, Shari Metcalf, “EcoXPT: Designing for Deeper Learning through Experimentation in an Immersive Virtual Ecosystem”, Journal of Educational Technology & Society, Vol. 20, No. 4 (October 2017), pp. 166-178.
2. 오준오 – David M. Markowitz, Rob Laha, Brian P. Perone, Roy D. Pea, Jeremy N. Bailenson, “Immersive Virtual Reality Field Trips Facilitate Learning About Climate Change”, Frontiers in Psychology, 9, 2018.
3. 김대한 – D. Hamilton, J. McKechnie, E. Edgerton, C. Wilson, “Immersive virtual reality as a pedagogical tool in education: a systematic literature review of quantitative learning outcomes and experimental design”, Journal of Computers in Education, 2020.
4. 김희주 – Amy Kamarainen, Joseph Reilly, Shari Metcalf, Tina Grotzer, Chris Dede, “Using Mobile Location-Based Augmented Reality to Support Outdoor Learning in Undergraduate Ecology and Environmental Science Courses”, The Bulletin of the Ecological Society of America, 2018.
5. 서상우 – Carolin Helbig, Hans-Stefan Bauer, Karsten Rink, Volker Wulfmeyer, Michael Frank, Olaf Kolditz, “Concept and workflow for 3D visualization of atmospheric data in a virtual reality environment for analytical approaches”, Environmental Earth Sciences (2014) 72:3767-3780.
So how does the Parallel Reality technology work?
Parallel Reality displays are enabled by a new kind of pixel. As demonstrated at CES 2020, these pixels can simultaneously project millions of light rays of different colors and brightnesses. Each ray can then be directed, via software, to a specific person. This is why a camera tracks you: the system needs your position to know how to aim the rays.
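The core idea above can be sketched in code. The following is a minimal, hypothetical illustration (not Misapplied Sciences' actual API): for a single multi-ray pixel, each tracked viewer gets a ray aimed at their head position, carrying a color chosen for that viewer. All function and variable names here are invented for illustration.

```python
import math

def ray_direction(pixel_pos, viewer_pos):
    """Unit vector from a pixel toward a viewer's tracked head position."""
    dx = [v - p for p, v in zip(pixel_pos, viewer_pos)]
    norm = math.sqrt(sum(c * c for c in dx))
    return tuple(c / norm for c in dx)

def assign_rays(pixel_pos, viewers):
    """For one pixel, pair each viewer's color with the ray aimed at them.

    `viewers` maps a viewer id to (tracked_position, color_for_this_pixel).
    Returns {viewer_id: (ray_direction, color)}.
    """
    return {
        vid: (ray_direction(pixel_pos, pos), color)
        for vid, (pos, color) in viewers.items()
    }

# Two viewers standing in front of the same pixel see different colors:
rays = assign_rays(
    (0.0, 0.0, 0.0),                                # pixel at the origin
    {
        "viewer_a": ((-1.0, 0.0, 2.0), (255, 0, 0)),  # this viewer sees red
        "viewer_b": ((1.0, 0.0, 2.0), (0, 0, 255)),   # this viewer sees blue
    },
)
```

In a real display this per-viewer assignment would run for every pixel, driven by live camera tracking, so that each person perceives a coherent, personalized image from the same physical screen.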