
JBE, vol. 23, no. 1, pp. 36-44, January 2018

DOI: https://doi.org/10.5909/JBE.2018.23.1.36

Spatio-temporal Data Visualization Survey for VR and AR Environment

Hyunjoo Song

Corresponding Author E-mail: hjsong0001@duksung.ac.kr

Abstract:

VR (Virtual Reality) and AR (Augmented Reality) devices are becoming more common, and the need for proper content presentation techniques in such environments has grown with the popularization of these devices. One such type of content is spatio-temporal data, which has become more prominent because it can be both generated and consumed by a large number of ordinary users. In this work, the researcher analyzed the characteristics of spatio-temporal data as a source for visualization in VR and AR environments, and categorized prior visualization methods for such data that were devised for traditional monitors. The researcher also reviewed the hardware specifications of state-of-the-art devices and examined the possibility of adopting the previous visualization approaches. This work is expected to contribute to the design of spatio-temporal visualizations for VR and AR environments that exploit their unique characteristics.
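For reference, the following is a minimal, hypothetical sketch (in Python, not taken from the paper) of the kind of record that makes up user-generated spatio-temporal data, together with the space-time cube mapping that places time on a third axis, a representative monitor-oriented technique in this literature. The record fields and function names are illustrative assumptions only.

from dataclasses import dataclass
from datetime import datetime

# Hypothetical record type: one observation of user-generated spatio-temporal
# data (a place, a time, and a measured attribute).
@dataclass
class SpatioTemporalRecord:
    lat: float      # latitude in degrees
    lon: float      # longitude in degrees
    t: datetime     # timestamp of the observation
    value: float    # measured attribute (e.g., a rating or a count)

def to_space_time_cube(records, t0):
    """Map each record to (x, y, z): x/y come from the map position and
    z is elapsed time, i.e., the classic space-time cube encoding."""
    points = []
    for r in records:
        z = (r.t - t0).total_seconds() / 3600.0  # hours since reference time
        points.append((r.lon, r.lat, z, r.value))
    return points

if __name__ == "__main__":
    t0 = datetime(2017, 11, 10, 0, 0)
    data = [
        SpatioTemporalRecord(37.5665, 126.9780, datetime(2017, 11, 10, 9, 30), 4.5),
        SpatioTemporalRecord(37.5796, 126.9770, datetime(2017, 11, 10, 12, 0), 3.0),
    ]
    for p in to_space_time_cube(data, t0):
        print(p)

In a VR or AR setting, the resulting (x, y, z) points could be rendered directly in the user's 3D space instead of being projected onto a 2D screen; this is the kind of adaptation the survey considers.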



Keywords: Virtual Reality, Augmented Reality, Spatio-temporal Data, Information Visualization

