Publications

By Yanxia Zhang

2019
Publication Details
  • The 17th IEEE International Conference on Embedded and Ubiquitous Computing (IEEE EUC 2019)
  • Aug 2, 2019

Abstract

Human activity forecasting from videos in routine-based tasks is an open research problem with numerous applications in robotics, visual monitoring and skill assessment. Many challenges remain in activity forecasting because human actions are not fully observable from continuous recording. Additionally, a large number of human activities involve fine-grained articulated human motions that are hard to capture using frame-level representations. To overcome these challenges, we propose a method that forecasts human actions by learning the dynamics of local motion patterns extracted from dense trajectories using long short-term memory (LSTM). Experiments on a public dataset validated the effectiveness of our proposed method in activity forecasting and demonstrated large improvements over the baseline two-stream end-to-end model. We also found that human activity forecasting benefits from learning both short-range motion patterns and long-term dependencies between actions.
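The abstract above forecasts actions by feeding motion features through an LSTM. As a rough illustration of the recurrence involved (not the authors' implementation; the single-unit size, weights and feature values are all invented), here is the forward step of one LSTM cell in plain Python:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One forward step of a single-unit LSTM cell.
    W maps each gate name to its (input weight, recurrent weight, bias)."""
    pre = {g: W[g][0] * x + W[g][1] * h_prev + W[g][2]
           for g in ("i", "f", "o", "c")}
    i = sigmoid(pre["i"])        # input gate: how much new info to admit
    f = sigmoid(pre["f"])        # forget gate: how much memory to keep
    o = sigmoid(pre["o"])        # output gate: how much state to expose
    c_hat = math.tanh(pre["c"])  # candidate cell state
    c = f * c_prev + i * c_hat   # updated cell memory (long-term dependency)
    h = o * math.tanh(c)         # emitted hidden state (short-range pattern)
    return h, c

# Run the cell over a short sequence of hypothetical scalar motion features,
# e.g. per-frame trajectory displacement magnitudes.
W = {g: (0.5, 0.25, 0.0) for g in ("i", "f", "o", "c")}
h, c = 0.0, 0.0
for x in [0.1, 0.4, 0.9]:
    h, c = lstm_step(x, h, c, W)
```

The forget-gate term `f * c_prev` is what lets the cell carry information across many steps, which is why LSTMs suit the long-term dependencies between actions that the abstract highlights.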

Abstract

An open challenge in current telecommunication systems, including Skype and existing research systems, is the lack of physical interaction and the consequently restricted feeling of connection for users. For example, these systems do not allow a remote user to move the pieces of a board game while playing with a local user. We propose that installing a teleoperated robot arm can address this problem by enabling remote physical interaction. We compare three remote-control methods to study how the feeling of connection relates to the agency and autonomy afforded by each control scheme.
2018
Publication Details
  • The 8th International Conference on the Internet of Things (IoT 2018)
  • Oct 15, 2018

Abstract

With the tremendous progress in sensing and IoT infrastructure, it is foreseeable that IoT systems will soon be available in commercial markets, such as people's homes. In this paper, we present a deployment study using sensors attached to household objects to capture the resourcefulness of three individuals. The concept of resourcefulness highlights the ability of humans to spontaneously repurpose objects for a use case other than the one originally intended. It is a crucial element of human health and wellbeing, and of great interest for various aspects of HCI and design research. Traditionally, resourcefulness is captured through ethnographic practice. Ethnography, however, can provide only sparse and often short-duration observations of human experience, and it often relies on participants being aware of and remembering the behaviours or thoughts they need to report. Our hypothesis is that resourcefulness can also be captured by continuously monitoring objects as they are used in everyday life. We developed a system that records object movement continuously and deployed it in the homes of three elderly people for over two weeks. We explored the use of probabilistic topic models to analyze the collected data and identify common patterns.
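The abstract applies probabilistic topic models to object-movement data. One plausible preprocessing step, purely illustrative (the paper's actual pipeline and object set are not given here), is to turn a movement log into per-day bag-of-words "documents" that a topic model such as LDA can consume:

```python
from collections import Counter

# Hypothetical movement log from accelerometer-tagged household objects:
# (day, object) pairs, one entry per detected movement event.
events = [
    (1, "kettle"), (1, "kettle"), (1, "walking_stick"),
    (2, "kettle"), (2, "remote"), (2, "remote"), (2, "remote"),
]

# One "document" per day: a bag of object-use counts. Topic models then
# find co-occurring object groups, i.e. candidate usage patterns.
docs = {}
for day, obj in events:
    docs.setdefault(day, Counter())[obj] += 1
```

Treating days as documents and objects as words is one common framing; the study could equally segment by hour or by room, which would change what the discovered "topics" mean.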
Publication Details
  • UbiComp 2018 (IMWUT)
  • Oct 1, 2018

Abstract

Continuous monitoring with unobtrusive wearable social sensors is becoming a popular method for assessing individual affect states and team effectiveness in human research. A large number of applications have demonstrated the effectiveness of wearable sensing in corporate settings, for example in short periodic social events or on a university campus. However, little is known about how we can automatically detect individual affect and group cohesion over long-duration missions. Predicting negative affect states and low cohesiveness is vital for team missions: knowing team members’ negative states allows timely interventions to enhance their effectiveness. This work investigates whether sensing social interactions and individual behaviors with wearable sensors can provide insights for assessing individual affect states and group cohesion. We analyzed wearable sensor data from a team of six crew members deployed on a four-month simulated space exploration mission at a remote location. We propose to recognize team members’ affect states and group cohesion as a binary classification problem, using novel behavior features that represent dyadic interactions and individual activities. Our method aggregates features from individual members to the group level to predict team cohesion. Our results show that the behavior features extracted from the wearable social sensors provide useful information for assessing personal affect and team cohesion. Group task cohesion can be predicted with high performance, exceeding 0.8 AUC. Our work demonstrates that we can extract social interactions from sensor data to predict group cohesion in longitudinal missions. We also found that quantifying behavior patterns, including dyadic interactions and face-to-face communication, is important in assessing team process.
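The abstract reports group task cohesion prediction exceeding 0.8 AUC. For readers unfamiliar with the metric, AUC equals the probability that a randomly chosen positive example is scored above a randomly chosen negative one; the sketch below computes it by pairwise ranking (the labels and scores are invented for illustration, not the study's data):

```python
def auc(labels, scores):
    """Area under the ROC curve via pairwise ranking; ties count half."""
    pos = [s for y, s in zip(labels, scores) if y == 1]
    neg = [s for y, s in zip(labels, scores) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical per-session predictions of high (1) vs low (0) task cohesion.
labels = [0, 0, 1, 1]
scores = [0.1, 0.4, 0.35, 0.8]
```

Here `auc(labels, scores)` is 0.75: of the four positive/negative pairs, three are ranked correctly. An AUC above 0.8, as in the abstract, means the classifier orders most high-cohesion sessions above low-cohesion ones.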