Modifying Visual Stimuli in Virtual Reality to Reduce Hand Tremor in Micromanipulation Tasks

  • Current experiment environment. By adding an extra IMU sensor, we aim to increase the precision of location tracking.

Involuntary hand tremor is a serious challenge in micromanipulation tasks and has drawn significant attention from related fields. To minimize its effect, a variety of mechanically assistive solutions have been proposed. However, approaches that increase people's awareness of their own hand tremor have not been extensively studied. In this paper, a head-mounted display based virtual reality (VR) system to increase self-awareness of hand tremor is proposed. It shows the user a virtual image of a handheld device with emphasized hand tremor information. We hypothesize that, provided with this emphasized tremor information, subjects will control their hand tremor more effectively. Two methods of emphasizing hand tremor information are demonstrated: (1) direct amplification of the tremor and (2) magnification of the virtual object, compared against a control condition without emphasized tremor information. A human-subject study with twelve trials was conducted with four healthy participants, who performed a task of holding a handheld gripper device in a specified direction. The results showed that the proposed methods achieved a reduced level of hand tremor compared with the control condition.
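
As a rough sketch of what method (1) could look like in code, assuming an exponential low-pass filter to estimate the intended pose (the smoothing factor and gain below are illustrative, not the paper's parameters):

```python
# Illustrative sketch of method (1): direct amplification of tremor.
# The smoothing factor and gain are hypothetical values, not those from the paper.

def exp_smooth(prev, curr, alpha=0.05):
    """Exponential low-pass filter estimating the intended (tremor-free) position."""
    return tuple(p + alpha * (c - p) for p, c in zip(prev, curr))

def amplify_tremor(raw_pos, smooth_pos, gain=3.0):
    """Render the virtual tool at the smoothed pose plus an exaggerated tremor component."""
    tremor = tuple(r - s for r, s in zip(raw_pos, smooth_pos))
    return tuple(s + gain * t for s, t in zip(smooth_pos, tremor))

# Example: one tracking update (positions in meters).
smooth = (0.10, 1.20, 0.30)      # running low-pass estimate
raw = (0.102, 1.199, 0.301)      # latest tracked position, including tremor
smooth = exp_smooth(smooth, raw)
print(amplify_tremor(raw, smooth, gain=3.0))
```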

We are extending the system to support more precise location tracking for micromanipulation simulation and assistance, using consumer virtual reality headsets including the Oculus Rift and HTC VIVE.
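
One plausible way to combine the extra IMU with the headset's optical tracking is a complementary filter; the sketch below is illustrative only, with assumed update rates and blend weight:

```python
# Hypothetical fusion of optical (Rift/VIVE) tracking with an extra IMU via a
# complementary filter. The blend weight, rates, and values are illustrative only.

def dead_reckon(pos, vel, accel, dt):
    """Integrate IMU acceleration between optical updates (fast, but drifts)."""
    vel = tuple(v + a * dt for v, a in zip(vel, accel))
    pos = tuple(p + v * dt for p, v in zip(pos, vel))
    return pos, vel

def complementary_filter(imu_pos, optical_pos, alpha=0.98):
    """Trust high-rate IMU integration short-term; correct drift with optical tracking."""
    return tuple(alpha * i + (1 - alpha) * o for i, o in zip(imu_pos, optical_pos))

pos, vel = (0.0, 1.0, 0.0), (0.0, 0.0, 0.0)
pos, vel = dead_reckon(pos, vel, accel=(0.02, 0.0, 0.0), dt=0.005)  # 200 Hz IMU step
pos = complementary_filter(pos, optical_pos=(0.001, 1.0, 0.0))      # 90 Hz optical update
print(pos)
```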

  • Related Publication
    • John Prada, *Taiwoo Park, Jintaek Lim, and *Cheol Song. Exploring the Potential of Modifying Visual Stimuli in Virtual Reality to Reduce Hand Tremor in Micromanipulation Tasks. Current Optics and Photonics. December 2017, 1(6), 642-648. (SCI-E Indexed)

JARVIS: Ubiquitous Mixed Reality Fitness Platform

  • Ubiquitous mixed reality concept and a screenshot of the JARVIS fitness app

This project envisions genuinely ubiquitous mixed reality experiences. The next generation of mixed reality devices needs to closely interact with objects in a user’s surrounding environment, as envisioned in Keiichi Matsuda’s Hyper-Reality concept. As a potential approach to enabling this capability, we propose to leverage Internet of Things (IoT) technologies that integrate sensors and connectivity into everyday objects such as toys, home appliances, and shopping carts. These sensors capture a user’s interactions with the objects in her vicinity, expanding the capability of mixed reality platforms beyond tracking users’ head and whole-body movements.
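
As an illustration of the kind of data such instrumented objects could contribute, here is a hypothetical interaction-event message from a sensor-equipped exercise object (the format and field names are assumptions, not a defined JARVIS protocol):

```python
# Hypothetical message format for an IoT-instrumented object (e.g., a sensor-equipped
# dumbbell) reporting an interaction event to a nearby mixed reality client.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class ObjectInteractionEvent:
    object_id: str        # identifier of the everyday object
    event_type: str       # e.g., "picked_up", "rep_completed" (illustrative labels)
    timestamp: float      # seconds since epoch
    payload: dict         # sensor-specific details

event = ObjectInteractionEvent(
    object_id="dumbbell-07",
    event_type="rep_completed",
    timestamp=time.time(),
    payload={"rep_count": 12, "peak_accel_g": 1.8},
)

# Serialized form, e.g., to be published over a local pub/sub channel.
print(json.dumps(asdict(event)))
```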

We describe the challenges of realizing the vision of ubiquitous mixed reality, and then present our ongoing work on developing a virtual fitness coach and its supporting platform as a concrete example of ubiquitous mixed reality technology.

This work is in press at the quarterly ACM SIGMOBILE Mobile Computing and Communications Review (GetMobile) and under major revision for the Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies (IMWUT). Find the current draft of our vision paper here.

  • Date
    • 2015-Current
  • Role
    • Project Co-Lead, Frontend VR/AR Application Design and Development, User Experience Research
  • Collaborators
    • Mi Zhang (MSU), Youngki Lee (Singapore Management University)

JediFlight VR

JediFlight is an interaction mechanism built around virtual wings, where a player moves both the wings and their own limbs to interact with the environment and other objects. The major contribution of the design is enabling a player to naturally and simultaneously move real and extended limbs (i.e., wings) in a multi-faceted interactive task setup, whereas previous work only explored the basic feasibility of extended-limb manipulation.
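
As a rough sketch of how tracked limb motion might drive the wings, here is a hypothetical mapping from downward hand velocity to lift (both the constants and the mapping itself are illustrative assumptions, not the project's actual design):

```python
# Hypothetical mapping from hand/arm tracker motion to wing lift.
# Constants are illustrative; the actual game tunes its mapping interactively.

def wing_lift(prev_y, curr_y, dt, flap_gain=40.0):
    """Downward hand motion (a flap) produces upward lift; upward motion produces none."""
    vertical_velocity = (curr_y - prev_y) / dt
    downward_speed = max(0.0, -vertical_velocity)
    return flap_gain * downward_speed

# Example: left and right trackers sampled at 90 Hz.
dt = 1.0 / 90.0
left_lift = wing_lift(prev_y=1.40, curr_y=1.38, dt=dt)
right_lift = wing_lift(prev_y=1.41, curr_y=1.39, dt=dt)
print(left_lift + right_lift)  # total upward force applied to the player's avatar
```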

  • Date
    • 2017-Current
  • Hardware
    • HTC VIVE and VIVE Trackers

Touching the Virtual: Individual differences in approach and avoidance behaviors in VR

This study investigates the relationship between individual differences in motivational activation and approach/avoidance behaviors in a 3D virtual environment (VE). The primary hypotheses are that 1) motivational relevance shapes the facilitation or inhibition of behaviors while reaching for, holding, and manipulating virtual objects, and 2) variations in individuals’ trait appetitive system activation (ASA) and defensive system activation (DSA) moderate this relationship. To unobtrusively observe individuals’ unconscious and automatic behaviors, we measure eye gaze and the distance participants keep from virtual objects while playing a VR game that involves sorting emotional pictures. We expect closer distances and longer visual inspection of virtual objects to be associated with ASA, and greater distances and shorter interaction to be related to DSA. The relationship between trait motivational activation and other individual factors, such as VR skill and experience, will also be investigated.
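
A minimal sketch of the per-frame measures described above, approach distance and gaze dwell time, with illustrative names and an assumed gaze-hit test:

```python
# Illustrative per-frame logging of approach distance and gaze dwell time.
# The gaze-hit test and field names are assumptions for this sketch.
import math

def distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def update_measures(head_pos, object_pos, gaze_on_object, dt, log):
    """Accumulate minimum approach distance and total gaze dwell time for one object."""
    d = distance(head_pos, object_pos)
    log["min_distance"] = min(log.get("min_distance", float("inf")), d)
    if gaze_on_object:                      # e.g., the eye-tracker ray hits the object
        log["dwell_time"] = log.get("dwell_time", 0.0) + dt

log = {}
update_measures((0.0, 1.6, 0.0), (0.4, 1.2, 0.8), gaze_on_object=True, dt=1/90, log=log)
update_measures((0.1, 1.6, 0.1), (0.4, 1.2, 0.8), gaze_on_object=False, dt=1/90, log=log)
print(log)
```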

  • Date
    • 2017-Current
  • Hardware
    • HTC VIVE, aGlass Eye Tracker
  • Original VR Environment Credits
    • Sage Miller, Yilang Zhao, Byron Lau

 

VR Mental Health

In this project, we use 360-degree VR video as an immersive, interactive storytelling method for health promotion. Talking about one’s feelings is helpful; however, due to stigma, many people with depression keep their suffering to themselves. In this VR environment, we used 360-degree videos to reproduce a counseling session in virtual reality, giving users an opportunity to vent without fear of stigma or discrimination.

  • Date
    • 2017-Current
  • Related Publications
    • Syed Ali Hussain, Taiwoo Park, Irem Gokce Yildirim, and Zihan Xiang. Virtual Reality-based Counseling for People with Mild Depression. HCI International 2018, ICA 2018. (Accepted)

CoSMiC: Crowd-Sourced Mobile App to Find a Missing Child

  • Conceptual UI of CoSMiC Application

Finding a missing child is an important problem concerning not only parents but also our society. It is essential and natural to use serendipitous clues from neighbors for finding a missing child. In this paper, we explore a new architecture of crowd collaboration to expedite this mission-critical process and propose a crowd-sourced collaborative mobile application, CoSMiC. It helps parents find their missing child quickly on the spot before he or she completely disappears. A key idea lies in constructing the location history of a child via crowd participation, thereby leading parents to their child easily and quickly. We implement a prototype application and conduct extensive user studies to assess the design of the application and investigate its potential for practical use.
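
A minimal sketch of the core idea, a location history assembled from neighbors' crowd-sourced sighting reports (the class and field names are illustrative, not CoSMiC's actual schema):

```python
# Hypothetical crowd-sourced location history for a missing child.
# Field names are illustrative, not CoSMiC's actual data schema.
from dataclasses import dataclass

@dataclass
class SightingReport:
    reporter_id: str
    child_id: str
    latitude: float
    longitude: float
    timestamp: float      # seconds since epoch

def location_history(reports, child_id):
    """Order sighting reports for one child by time, giving parents a trail to follow."""
    return sorted((r for r in reports if r.child_id == child_id),
                  key=lambda r: r.timestamp)

reports = [
    SightingReport("neighbor-3", "child-1", 42.7325, -84.5555, 1000.0),
    SightingReport("neighbor-8", "child-1", 42.7331, -84.5542, 1060.0),
]
for r in location_history(reports, "child-1"):
    print(r.timestamp, r.latitude, r.longitude)
```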

Plant-Based Games for Anxiety Reduction

Researchers increasingly identify anxiety and stress as critical health problems that affect quality of life and contribute to various illnesses. Studies suggest gardening activities help with anxiety. Our goal is to create engaging ways for people to interact with plants and ultimately reduce anxiety and stress. We made three short games that employ a person’s touch interaction with a plant as the input interface. Each of the three games implements a unique interaction: tapping, patting, and gentle pinching. We then tested the games with ten players, five of whom (the plant group) played the games with the plant as the input interface. The other five (the non-plant group) played the games with a pressure sensor board. The plant group showed decreased anxiety with borderline statistical significance (p=0.054) and a Cohen’s d of 0.20 (a ‘small’ effect), while the non-plant group showed a non-significant decrease in anxiety after the gameplay (p=0.65). We further examined which in-game elements contributed to calming the participants, as well as the design elements that need to be improved for plant-based games.
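
As a rough sketch of how the three touch interactions might be distinguished, assuming the plant interface reports discrete contact episodes (the decision rules below are illustrative assumptions, not the games' actual detection logic):

```python
# Illustrative sketch: distinguishing tap, pat, and gentle pinch from touch contacts.
# The rules and thresholds are assumptions, not the games' actual detection logic.

def classify_gesture(contact_durations):
    """contact_durations: durations (s) of contacts observed within a short time window."""
    if not contact_durations:
        return "none"
    if len(contact_durations) >= 3:
        return "pat"                      # several brief touches in quick succession
    if max(contact_durations) < 0.2:
        return "tap"                      # a single brief touch
    return "pinch"                        # one sustained, gentle contact

print(classify_gesture([0.1]))            # -> "tap"
print(classify_gesture([0.1, 0.1, 0.1]))  # -> "pat"
print(classify_gesture([0.8]))            # -> "pinch"
```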

  • Publications
    • Taiwoo Park, Tianyu Hu, and Jina Huh. 2016. Plant-based Games for Anxiety Reduction. In Proceedings of the 2016 Annual Symposium on Computer-Human Interaction in Play (CHIPlay), 199–204.

BeUpright: Posture Correction Using Relational Norm Intervention

(Concept Video)

(Presentation at CHI 2016)

We propose the ‘relational norm intervention’ (RNI) model for behavior change, develop a sample wearable/mobile service, and evaluate its effectiveness. The RNI model uses negative reinforcement and other-regarding preferences as motivating factors for behavior change. The model features the passive participation of a helper who undergoes artificially generated discomforts (e.g., limited access to a mobile device) when the target user acts against a target behavior.
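
A minimal sketch of the intervention loop this model implies, assuming a posture sensor and a simple angle threshold (the threshold, grace period, and restriction action are illustrative, not BeUpright's actual parameters):

```python
# Illustrative sketch of a relational norm intervention loop for posture correction.
# The angle threshold, grace period, and "restrict" action are assumptions.

def rni_step(neck_angle_deg, bad_since_s, dt, threshold_deg=25.0, grace_s=10.0):
    """Return (updated bad-posture duration, whether to restrict the helper's device)."""
    if neck_angle_deg > threshold_deg:          # slouching posture detected
        bad_since_s += dt
    else:
        bad_since_s = 0.0
    return bad_since_s, bad_since_s >= grace_s  # restrict after the grace period elapses

bad = 0.0
for angle in [10, 30, 32, 31, 33]:              # one sample every 3 seconds (hypothetical)
    bad, restrict = rni_step(angle, bad, dt=3.0)
    print(f"angle={angle:>2} deg  bad_for={bad:>4.1f}s  restrict_helper_device={restrict}")
```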

  • Related Publications
    • Jaemyung Shin, Bumsoo Kang, *Taiwoo Park, Jina Huh, Jinhan Kim, and Junehwa Song. 2016. BeUpright: Posture Correction Using Relational Norm Intervention. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’16), 6040–6052. (Acceptance Rate: 23%)

Games for Swimmers

The unique aquatic nature of swimming makes it very difficult to use social or technical strategies to mitigate the tediousness of monotonous exercises. We developed MobyDick, a smartphone-based multi-player exergame designed to be used while swimming, in which a team of swimmers collaborate to hunt down a virtual monster.

In developing the game and its supporting platform, we conducted a thorough underwater network performance analysis of three representative wireless communication technologies (LTE, 3G, and WiFi), and built a swimming stroke recognition system to enable real-time game input while swimming.
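
As a rough sketch of the stroke-recognition idea, here is a simple peak detector over wrist acceleration magnitude (the thresholds and timing are illustrative assumptions, not MobyDick's recognizer):

```python
# Illustrative swimming-stroke detection via simple peak picking on the magnitude of
# wrist acceleration. Thresholds and timing are assumptions, not MobyDick's recognizer.
import math

def count_strokes(accel_samples, rate_hz=50, threshold_g=1.5, refractory_s=0.6):
    """accel_samples: (x, y, z) accelerations in g. Returns the number of detected strokes."""
    strokes, last_stroke_t = 0, -refractory_s
    for i, (x, y, z) in enumerate(accel_samples):
        t = i / rate_hz
        magnitude = math.sqrt(x * x + y * y + z * z)
        if magnitude > threshold_g and (t - last_stroke_t) >= refractory_s:
            strokes += 1
            last_stroke_t = t
    return strokes

# Two synthetic acceleration bursts 0.6 s apart -> two strokes.
samples = [(0.0, 0.0, 1.0)] * 50
samples[10] = (1.5, 0.5, 1.0)   # |a| about 1.9 g
samples[40] = (1.4, 0.6, 1.0)   # |a| about 1.8 g, 0.6 s later
print(count_strokes(samples))
```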

Our study revealed the game's basic fun factors as well as unique aspects of multi-player swimming games.

  • Date
    • 2013-2017
  • Related Publications
    • Woohyeok Choi, Jeungmin Oh, Taiwoo Park, Seongjun Kang, Miri Moon, Uichin Lee, Inseok Hwang, Darren Edge, and Junehwa Song. 2016. Designing Interactive Multiswimmer Exergames. ACM Transactions on Sensor Networks 12, 3: 1–40. (SCI-E Indexed)
    • Woohyeok Choi, Jeungmin Oh, Taiwoo Park, Seongjun Kang, Miri Moon, Uichin Lee, Inseok Hwang, Junehwa Song. MobyDick: an interactive multi-swimmer exergame, in Proceedings of ACM SenSys 2014, Memphis, TN, November, 2014.
    • (Poster) Haechan Lee, Miri Moon, Taiwoo Park, Inseok Hwang, Uichin Lee, Junehwa Song. Dungeons & swimmers: designing an interactive exergame for swimming, in Proceedings of ACM UbiComp 2013 Adjunct Publication, 2013.

Pervasive Exergame Platform

Exertainer, a pervasive exergaming platform, supports multiple heterogeneous exercise devices, such as hula hoops, jump ropes, exercise bikes, and interactive treadmills, as well as wearable motion sensors. I prototyped the exercise devices and the wearable sensors, in both hardware and software. The platform uses a variety of sensors, including accelerometers, gyroscopes, proximity sensors, rotation-speed sensors, and magnetic switches, to detect players’ activities and exercise context.
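
As an example of the kind of device-side processing involved, here is a sketch that counts jump-rope rotations from a magnetic switch signal (the debounce interval and sample rate are illustrative assumptions):

```python
# Illustrative jump-rope rotation counting from a magnetic switch, with a simple
# debounce. The debounce interval and sample rate are assumptions for this sketch.

def count_rotations(switch_samples, rate_hz=100, debounce_s=0.2):
    """switch_samples: 0/1 readings of a magnetic switch that closes once per rotation."""
    rotations, last_close_t, prev = 0, -debounce_s, 0
    for i, value in enumerate(switch_samples):
        t = i / rate_hz
        if value == 1 and prev == 0 and (t - last_close_t) >= debounce_s:
            rotations += 1                      # rising edge = one rope rotation
            last_close_t = t
        prev = value
    return rotations

# One second of samples containing three clean closures -> three rotations.
samples = [0] * 100
for start in (10, 45, 80):
    samples[start:start + 3] = [1, 1, 1]
print(count_rotations(samples))                 # -> 3
```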

  • Date
    • 2010-2016
  • Related Publications
    • Taiwoo Park, Inseok Hwang, Youngki Lee, Junehwa Song, “Toward a Mobile Platform for Pervasive Games“, in Proceedings of ACM SIGCOMM Workshop on Mobile Gaming. Helsinki, Finland, August 2012.
    • Taiwoo Park, Inseok Hwang, Uichin Lee, Sunghoon Ivan Lee, Chungkuk Yoo, Youngki Lee, Hyukjae Jang, Sungwon Peter Choe, Souneil Park, Junehwa Song, “ExerLink: Enabling Pervasive Social Exergames with Heterogeneous Exercise Devices“, in Proceedings of ACM MobiSys 2012. Lake District, UK, June 2012.
  • Awards
    • Best Demo Award, ACM MobiSys 2012
    • Best Demo Honorable Mention, IEEE SECON 2012

Swan Boat

Swan Boat is a social multiplayer exergame that enhances the treadmill running experience. It targets the tedious nature of running on a treadmill by making the experience fun through collaborative social interaction and immersive gameplay. I designed and implemented the game and its supporting platform, including hardware and software.

  • Date
    • 2007-2015
  • Related Publications
    • Taiwoo Park, Uichin Lee, I. Scott MacKenzie, Miri Moon, Inseok Hwang, Junehwa Song. “Human Factors of Speed-based Exergame Controllers“, in Proceedings of ACM CHI 2014, Toronto, Canada, April, 2014. (Honorable Mention Award).
    • Taiwoo Park, Uichin Lee, Bupjae Lee, Haechan Lee, Sanghun Son, Seokyoung Song, Junehwa Song, “ExerSync: Facilitating Interpersonal Synchrony in Social Exergames“, in Proceedings of ACM CSCW 2013. San Antonio, TX, February, 2013.
    • Taiwoo Park, Chungkuk Yoo, Sungwon Peter Choe, Byunglim Park, Junehwa Song, “Transforming Solitary Exercises into Social Exergames“, in Proceedings of ACM CSCW 2012. Seattle, WA, February, 2012.
    • Miru Ahn, Sungjun Kwon, Byunglim Park, Kyungmin Cho, Sungwon Peter Choe, Sooho Cho, Inseok Hwang, Hyukjae Jang, Taiwoo Park, Jaesang Park, Yunseok Rhee, Junehwa Song, “Exertainer: An Interactive Entertainment System for Pervasive Running Applications“, in Proceedings of UbiComp 2009 Video, Orlando, USA, October 2009.
    • Miru Ahn, Sungjun Kwon, Byunglim Park, Sungwon Peter Choe, Taiwoo Park, Sooho Cho, Jaesang Park, Yunseok Rhee, Junehwa Song, “Swan Boat: Pervasive Social Game to Enhance Treadmill Running“, in Proceedings of ACM Multimedia 2009 Technical Demonstrations, Beijing, China, September 2009.

Mobile Gesture Interaction Platform

Designed and developed E-Gesture, a mobile gesture platform that enables eyes- and hands-free gesture interaction using a mobile device and a wristwatch-type motion sensor. Its key technical feature is feedback-based closed-loop sensor fusion, which minimizes the energy consumption of motion sensors while preserving gesture-sensing quality. I implemented the gesture processing architecture using the Android NDK (Native Development Kit), TinyOS, and HTK (Hidden Markov Model Toolkit).
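
To illustrate the energy-saving idea only (this is not E-Gesture's actual collaborative architecture), here is a sketch in which the low-power accelerometer gates the power-hungry gyroscope and classifier feedback adjusts the gate:

```python
# Illustrative sketch of accelerometer-gated gyroscope duty cycling with a simple
# feedback loop. This approximates the energy-saving idea; it is not E-Gesture's
# actual collaborative architecture.

class GatedSensing:
    def __init__(self, wake_threshold_g=1.3):
        self.wake_threshold_g = wake_threshold_g   # accel magnitude that wakes the gyro
        self.gyro_on = False

    def on_accel_sample(self, accel_magnitude_g):
        """Cheap always-on check: enable the gyroscope only when motion looks gesture-like."""
        self.gyro_on = accel_magnitude_g > self.wake_threshold_g
        return self.gyro_on

    def on_missed_gesture(self):
        """Feedback path: if gestures are being missed, lower the wake threshold slightly."""
        self.wake_threshold_g = max(1.05, self.wake_threshold_g - 0.05)

sensing = GatedSensing()
print(sensing.on_accel_sample(1.1))   # small motion: gyro stays off, saving energy
sensing.on_missed_gesture()           # classifier reports a missed gesture
sensing.on_missed_gesture()
print(sensing.wake_threshold_g)       # threshold adapts downward
print(sensing.on_accel_sample(1.25))  # now wakes the gyroscope
```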

  • Date
    • 2010-2014
  • Related Publications
    • Ju-Hwan Kim, Tek-Jin Nam, Taiwoo Park. “CompositeGesture: Creating Custom Gesture Interfaces with Multiple Mobile or Wearable Devices”, International Journal on Interactive Design and Manufacturing (IJIDeM), 2014.
    • Taiwoo Park, Jinwon Lee, Inseok Hwang, Chungkuk Yoo, Lama Nachman, Junehwa Song, “E-Gesture: A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices”, in Proceedings of ACM SenSys 2011. Seattle, WA, November, 2011.
    • Taiwoo Park, Jinwon Lee, Inseok Hwang, Chungkuk Yoo, Lama Nachman, Junehwa Song. “Demo: E-Gesture – A Collaborative Architecture for Energy-efficient Gesture Recognition with Hand-worn Sensor and Mobile Devices”, ACM MobiSys Demonstration, 2011 (Also demonstrated at IEEE SECON 2012) – Best Demo Award

Mobile Context-Aware Service Platform

  • Challenges in designing and developing a mobile context-aware service platform

Participated in designing and implementing a context-aware service platform and applications. For the project, I built a large set of wearable sensor components and experimental toolkits for measuring energy consumption. I also developed situation-specific context-aware applications for children in kindergartens and demonstrated their feasibility through several field studies.

  • Date
    • 2007-2012
  • Related Publications
    • Youngki Lee, Sitharama S. Iyengar, Chulhong Min, Younghyun Ju, Seungwoo Kang, Taiwoo Park, Jinwon Lee, Yunseok Rhee, and Junehwa Song. “MobiCon: Mobile Context-Monitoring Platform“, Communications of the ACM (CACM), March 2012.
    • Inseok Hwang, Hyukjae Jang, Taiwoo Park, Aram Choi, Youngki Lee, Chanyou Hwang, Yanggui Choi, Lama Nachman, Junehwa Song. “Leveraging Children’s Behavioral Distribution and Singularities in New Interactive Environments: Study in Kindergarten Field Trips“, in Proceedings of Pervasive 2012, Newcastle, UK, June, 2012.
    • Inseok Hwang, Hyukjae Jang, Taiwoo Park, Aram Choi, Chanyou Hwang, Yanggui Choi, Lama Nachman, Junehwa Song, “Toward Delegated Observation of Kindergarten Children’s Exploratory Behaviors in Field Trips“, in Proceedings of the 13th ACM International Conference on Ubiquitous Computing (UbiComp 2011) (Poster), Beijing, China, 2011.
    • Seungwoo Kang, Youngki Lee, Chulhong Min, Younghyun Ju, Taiwoo Park, Jinwon Lee, Yunseok Rhee, Junehwa Song. “Orchestrator: An Active Resource Orchestration Framework for Mobile Context Monitoring in Sensor-rich Mobile Environments“, in Proceedings of IEEE PerCom 2010, Mannheim, Germany, 2010.
    • Seungwoo Kang, Jinwon Lee, Hyukjae Jang, Hyonik Lee, Youngki Lee, Souneil Park, Taiwoo Park, and Junehwa Song. “SeeMon: Scalable and Energy-efficient Context Monitoring Framework for Sensor-rich Mobile Environments“, in Proceedings of ACM MobiSys 2008, Colorado, USA, June 2008.
