Long-Van Nguyen-Dinh

ETH Zürich
Dr Long-Van Nguyen-Dinh
Institut f. Elektronik
ETZ H 95
Gloriastrasse 35
8092 Zürich

Phone: +41 44 632 51 41
E-Mail: 

Dr. Long-Van Nguyen-Dinh studied Computer Science at HCMC University of Technology (Vietnam), with a focus on database system design. She continued her studies and completed her master's degree in Computer Science in 2011 at Purdue University (USA) with a VEF fellowship. During her time at Purdue, she worked as a research assistant at the Database Lab, building a streaming spatio-temporal database on top of Hadoop MapReduce. From May 2011 until September 2011, she interned at Google in the AdSense group, using outlier detection techniques to identify potential users for a specific service. In December 2015 she defended her PhD with a thesis entitled "Wearable activity recognition with crowdsourced annotations". Her thesis focused on leveraging crowdsourcing to reduce the labeling effort in activity awareness systems and on novel noise-robust methods for training activity recognition systems with crowdsourced data. Additionally, she proposed a new one-time point annotation technique in which labelers specify only a single time point while an activity occurs.

Research Interests

Her current research interests include wearable computing, activity recognition, machine learning, and deep learning.

Research Projects

Long-Van's research currently focuses on deep learning for activity recognition. She has also been involved in the following research projects:

Smart-DAYS: We investigated new labeling techniques for activity recognition in order to reduce the effort of collecting training data. We outsourced annotation tasks, traditionally performed carefully by a small number of experts, to a crowd of ordinary people. A crowdsourced training dataset for activity recognition can be expanded easily to open-ended systems. However, the quality of crowdsourced data is imperfect and suffers from annotation noise (boundary jitter and label noise). Therefore, we proposed noise-robust methods (SegmentedLCSS and WarpingLCSS) to handle noisy crowdsourced data when training activity recognition systems.
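
The core idea behind these LCSS-based methods can be illustrated with a small sketch: a longest-common-subsequence match between a quantized sensor stream and an activity template tolerates boundary jitter and spurious samples. The quantization scheme and the example signals below are illustrative assumptions, not the published SegmentedLCSS/WarpingLCSS implementation.

```python
# Minimal sketch of the longest-common-subsequence (LCSS) similarity that
# methods such as SegmentedLCSS / WarpingLCSS build on. Quantization levels
# and example signals are illustrative assumptions.
import numpy as np

def quantize(signal, n_levels=8):
    """Map a 1-D sensor stream to discrete symbols via uniform binning."""
    lo, hi = np.min(signal), np.max(signal)
    bins = np.linspace(lo, hi, n_levels + 1)[1:-1]
    return np.digitize(signal, bins)

def lcss_length(a, b):
    """Classic dynamic-programming LCSS between two symbol sequences."""
    dp = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            if a[i - 1] == b[j - 1]:
                dp[i, j] = dp[i - 1, j - 1] + 1
            else:
                dp[i, j] = max(dp[i - 1, j], dp[i, j - 1])
    return dp[len(a), len(b)]

def lcss_similarity(template, segment):
    """Similarity in [0, 1]; tolerant to boundary jitter because the
    subsequence match can skip spurious samples at the segment borders."""
    return lcss_length(template, segment) / min(len(template), len(segment))

# Hypothetical usage: compare a labelled gesture template to a jittered window.
template = quantize(np.sin(np.linspace(0, 3, 60)))
window = quantize(np.sin(np.linspace(-0.2, 3.3, 75)))  # imprecise boundaries
print(lcss_similarity(template, window))
```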

Daily Life Activity Awareness on Smartphones by Leveraging Crowd-sourced Data: In this project, we built real-time, audio-based daily context recognition on smartphones with zero training-data collection effort from the user. We mined existing crowd-sourced audio repositories on the web for freely annotated training data. We investigated the pros and cons of crowdsourced and user-centric data, and combined their best properties to achieve high recognition performance while keeping the user's labeling effort to a minimum.
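
As a rough illustration of this pipeline, the sketch below summarizes each audio clip with MFCC features and trains a single classifier on crowd-sourced and user-centric clips pooled together. librosa, scikit-learn, the file paths, and the choice of a random forest are assumptions made for illustration; the deployed smartphone system may use different features and models.

```python
# Illustrative sketch: combine crowd-sourced and user-centric audio clips
# to train a daily-context classifier. Paths and labels are hypothetical.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def mfcc_features(path, sr=16000):
    """Load a clip and summarize it as mean MFCCs (one fixed-length vector)."""
    y, sr = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return mfcc.mean(axis=1)

def build_dataset(clips):
    """clips: list of (wav_path, label) pairs from any source."""
    X = np.stack([mfcc_features(p) for p, _ in clips])
    y = np.array([label for _, label in clips])
    return X, y

# Hypothetical data: freely annotated web audio plus a few user-recorded clips.
crowd_clips = [("web/street_001.wav", "street"), ("web/office_004.wav", "office")]
user_clips = [("user/office_home.wav", "office")]

X_crowd, y_crowd = build_dataset(crowd_clips)
X_user, y_user = build_dataset(user_clips)

# Pool both sources; the small user-centric set personalizes the crowd model.
X = np.vstack([X_crowd, X_user])
y = np.concatenate([y_crowd, y_user])
clf = RandomForestClassifier(n_estimators=100).fit(X, y)
```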

One-Time Point Annotations: In this project, we proposed a one-time point annotation technique in which labelers do not have to select the start and end times of an activity carefully, but simply mark a single time point while a gesture is happening.
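
A minimal sketch of how such a point label could be turned into a training example is shown below: a window of sensor samples is cut around the single annotated time point. The fixed window length is an illustrative assumption; the actual method estimates the extent of the activity around the marked point more carefully.

```python
# Minimal sketch: turn one-time point annotations into training windows.
# The fixed half-width and the example labels are illustrative assumptions.
import numpy as np

def point_to_window(sensor_stream, point_idx, half_width=50):
    """Cut a symmetric window of samples around the single annotated time point."""
    start = max(0, point_idx - half_width)
    end = min(len(sensor_stream), point_idx + half_width)
    return sensor_stream[start:end]

# Hypothetical usage: one accelerometer axis and one point label per gesture.
stream = np.random.randn(10_000)              # stand-in for a recorded sensor axis
point_annotations = [(1200, "open_door"), (4730, "drink")]
training_set = [(point_to_window(stream, idx), label)
                for idx, label in point_annotations]
```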

Teaching

Courses: Teaching Assistant for Wearable Systems I (FS12 & FS16) taught by Prof. Tröster.

Student Projects: If you are interested in working with me, please drop me a line or stop by my office. My current main focus is Deep Learning for Activity Recognition. I am also open to your own ideas and would be very happy to discuss them.

Publications


Completed Student Projects

 
