Sensing & Interpreting Data

Thursday 27th April, Session 2

Identifying requirements for mapping physiological measurements to distress
Genovefa Kefalidou, University of Nottingham

Advances in wearable devices that record physiological changes provide researchers with greater opportunities to detect stress (and other psychological states) in real time. This paper describes research exploring the use of heart rate monitors to gather physiological data from participants over a number of days. From this research, we make a number of observations from a user perspective and discuss implications for the use of these devices in real-world contexts.
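
The abstract describes data gathering rather than a specific detection algorithm, so the following is a minimal illustrative sketch only: it flags heart rate readings that rise well above a personal rolling baseline. The function name, window size and threshold are assumptions, not details from the paper.

```python
# Illustrative sketch only: the paper does not publish its analysis pipeline.
# Flags possible distress when heart rate rises well above a personal
# rolling baseline. Thresholds and window sizes here are invented.
from statistics import mean, stdev

def distress_flags(heart_rates, window=60, z_threshold=2.0):
    """Return a True/False flag per sample, comparing each reading
    to a rolling baseline of the preceding `window` samples."""
    flags = []
    for i, hr in enumerate(heart_rates):
        baseline = heart_rates[max(0, i - window):i]
        if len(baseline) < 10:          # not enough history yet
            flags.append(False)
            continue
        mu, sigma = mean(baseline), stdev(baseline)
        flags.append(sigma > 0 and (hr - mu) / sigma > z_threshold)
    return flags

# Example: a resting trace followed by a sudden sustained elevation
trace = [68, 70, 69, 71, 70] * 12 + [95, 97, 99, 101, 100]
print(sum(distress_flags(trace)), "samples flagged")
```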

Lower blink counts are associated with high taskload, visual tasks
Rebecca Charles, Cranfield University

Physiological measures have been increasing in popularity due to the growing availability of equipment that allows measurement in real time. Eye blinks are an easy measure to collect using video capture. Our findings indicate that blink counts effectively differentiate between taskloads and task types during a computer-based task: blink counts were significantly lower during tasks involving high visual load than during non-visually demanding tasks, and lower numbers of blinks were observed under higher taskloads across all visual tasks.
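
As an illustration of how blink counts can be extracted from video, the sketch below counts blinks in a per-frame eye-openness signal (for example, an eye aspect ratio computed from facial landmarks). This is not the study's pipeline; the threshold and minimum blink duration are invented values.

```python
# Illustrative sketch only: the study's video-analysis method is not
# described in the abstract. Given a per-frame eye-openness signal,
# count blinks as short dips below a threshold.
def count_blinks(eye_openness, threshold=0.2, min_frames=2):
    """Count blinks in a sequence of per-frame eye-openness values.
    A blink is a run of at least `min_frames` consecutive frames
    below `threshold`."""
    blinks, run = 0, 0
    for value in eye_openness:
        if value < threshold:
            run += 1
        else:
            if run >= min_frames:
                blinks += 1
            run = 0
    if run >= min_frames:   # signal ends mid-blink
        blinks += 1
    return blinks

# Example: open eyes (~0.3) with two brief closures
signal = [0.31, 0.30, 0.05, 0.04, 0.29, 0.30, 0.06, 0.05, 0.32]
print(count_blinks(signal))  # -> 2
```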

Optimum Kinect Setup for Real-Time Ergonomic Risk Assessment on the Shop Floor
Chika Mgbemena, Cranfield University

This study provides ergonomists and other researchers with the optimum placement of fixed Kinect v2 sensors for accurate, real-time ergonomic evaluations during continuous shop floor observation. It investigated the feasibility of detecting a manual shop floor activity using the Kinect v2 sensor in order to determine the sensor positions that yield the most accurate data collection on the shop floor.
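
By way of illustration of the downstream use of such data, the sketch below converts 3D joint positions of the kind the Kinect v2 skeleton stream provides into a joint angle suitable for ergonomic scoring. This is not code from the study; the joint names and example coordinates are invented.

```python
# Illustrative sketch only: computing a joint angle from 3D joint
# positions such as those returned by a Kinect v2 skeleton stream.
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by segments b->a and b->c,
    with each joint given as an (x, y, z) position in metres."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(p * q for p, q in zip(v1, v2))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / norm))))

# Example: elbow flexion from shoulder, elbow and wrist positions
shoulder, elbow, wrist = (0.0, 1.4, 2.0), (0.0, 1.1, 2.0), (0.3, 1.1, 2.0)
print(round(joint_angle(shoulder, elbow, wrist)))  # -> 90
```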

Using myoelectric signals for gesture detection: a feasibility study
Farshid Amirabdollahian, University of Hertfordshire

With technological advances in sensing human motion, and its potential to drive and control mechanical interfaces remotely, a multitude of input mechanisms are used to link actions between the human and the robot. In this study we explored the feasibility of using the human arm's myoelectric signals to identify a number of gestures automatically. We used the k-nearest neighbours algorithm to train a classifier and later identify gestures, achieving an accuracy of around 65%. This indicates potential feasibility while highlighting areas for improvement in both the accuracy and the utility/usability of such approaches.
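
A minimal sketch of the general approach, not the authors' pipeline: a k-nearest neighbours classifier trained on synthetic per-channel EMG features. The feature choice and all data below are assumptions made for illustration.

```python
# Illustrative sketch only: gesture classification from myoelectric
# (EMG) features with k-nearest neighbours, on synthetic data.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for windowed EMG features: 3 gestures, 4 channels,
# each gesture with a different mean activation pattern across channels.
means = np.array([[0.2, 0.8, 0.3, 0.1],
                  [0.7, 0.2, 0.6, 0.4],
                  [0.3, 0.3, 0.2, 0.9]])
X = np.vstack([m + 0.15 * rng.standard_normal((100, 4)) for m in means])
y = np.repeat([0, 1, 2], 100)   # gesture labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train)
print(f"accuracy: {clf.score(X_test, y_test):.2f}")
```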

The Self Learning Car – A Human Factors Approach to Machine Learning
Joseph Smyth, WMG, University of Warwick

The Self-Learning Car project by Jaguar Land Rover uses machine learning techniques to learn and subsequently automate certain driver-focused vehicle features. A human factors approach is taken to review the current SLC system. The review finds that the current machine learning method is not sufficient to capture a true understanding of interaction, routine and feature use, and a new method is proposed. Issues surrounding trust and acceptance of automation are also explored and recommendations made.
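
Purely as a hypothetical illustration (the abstract does not disclose Jaguar Land Rover's method), the sketch below shows the kind of rule such a system might infer: propose automating a feature only when observed use in a context is frequent and consistent, a threshold that speaks to the trust and acceptance issues raised above.

```python
# Hypothetical illustration only, not the project's actual method:
# propose automating a feature only when the driver's observed use in a
# given context is frequent (support) and consistent (confidence).
from collections import Counter

def propose_automation(observations, min_support=10, min_confidence=0.9):
    """observations: list of (context, feature_on) pairs, e.g.
    (('morning', 'cold'), True). Returns contexts where automatic
    activation would be proposed."""
    counts, activations = Counter(), Counter()
    for context, feature_on in observations:
        counts[context] += 1
        if feature_on:
            activations[context] += 1
    return [c for c in counts
            if counts[c] >= min_support
            and activations[c] / counts[c] >= min_confidence]

obs = ([(('morning', 'cold'), True)] * 12
       + [(('evening', 'mild'), True)] * 3
       + [(('evening', 'mild'), False)] * 5)
print(propose_automation(obs))  # -> [('morning', 'cold')]
```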

Date & place

25 - 27 April 2017
Staverton Estate, Daventry, Northamptonshire

Sponsors

Human Applications
Osmond Ergonomics
Greenstreet Berman
K Sharp
Elsevier
HITS
Towergate Insurance
Humanscale
Scandinavian Business Seating
Want to sponsor this event?
Contact Adam Potter at Redactive Media on 0207 880 7555 or email adam.potter@redactive.co.uk.