SmartATRS, the case study for this research, is a smartphone system that controls the Automated Transport and Retrieval System (ATRS). It was developed by Dr Whittington as part of a final year project at Bournemouth University to replace the existing keyfobs, which were found to be challenging for people with reduced finger dexterity.
ATRS is a technically advanced laser guidance system for a powered wheelchair (powerchair). The system comprises a compact LiDAR device and a robotics unit fitted to the powerchair for locating the exact position of a platform lift fitted in the rear of the vehicle. The vehicle is also fitted with a motorised driver’s seat and an automated tailgate. Using a joystick attached to the driver’s seat, the user manoeuvres the powerchair to the rear of the vehicle until the LiDAR is able to see two highly reflective fiducials fitted to the lift. From then on, the docking of the powerchair is completely autonomous. Although there is an autonomous aspect to ATRS, it is regarded as an interactive system, requiring user interaction to operate the seat, tailgate and lift.
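The docking trigger described above, detecting two highly reflective fiducials in a LiDAR scan, can be sketched as follows. This is an illustrative assumption only: the scan format, intensity threshold, clustering rule and function names are invented for this sketch and are not the real ATRS implementation.

```python
import math

# Illustrative thresholds (assumptions, not real ATRS parameters)
INTENSITY_MIN = 0.9   # reflectance above which a return counts as "highly reflective"
CLUSTER_GAP_M = 0.15  # max spacing between points belonging to one fiducial

def locate_lift(scan):
    """scan: list of (angle_rad, range_m, intensity) tuples, sorted by angle.
    Returns the (x, y) midpoint between the two fiducials, or None if both
    are not yet visible (in which case docking would remain manual)."""
    # Keep only highly reflective returns, converted to Cartesian points
    hits = [(r * math.cos(a), r * math.sin(a))
            for a, r, i in scan if i >= INTENSITY_MIN]
    # Greedily cluster adjacent points into candidate fiducials
    clusters = []
    for p in hits:
        if clusters and math.dist(p, clusters[-1][-1]) <= CLUSTER_GAP_M:
            clusters[-1].append(p)
        else:
            clusters.append([p])
    if len(clusters) != 2:
        return None  # both fiducials must be seen before autonomous docking starts
    # Average each cluster, then take the midpoint between the two centres
    centres = [(sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c))
               for c in clusters]
    (x1, y1), (x2, y2) = centres
    return ((x1 + x2) / 2, (y1 + y2) / 2)
```

With both fiducials in view the function returns a target point the robotics unit could steer towards; with fewer than two, it returns `None`, matching the article’s description that autonomy begins only once both fiducials are visible.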
User feedback and safety features that were not present in the original keyfobs were incorporated into SmartATRS. Seven command buttons are used to activate each ATRS function, together with a large Emergency Stop button that terminates any operating function. Five navigation icons allow the system to connect to additional interfaces that can be used to control home functions, e.g. automated gates and garage doors. SmartATRS can be controlled either through a touch interface or through a powerchair joystick.
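The interface logic above can be sketched as a small controller: each command button starts one ATRS function, and Emergency Stop terminates whatever is running. The button names, class and method names here are hypothetical, chosen only to mirror the functions the article mentions.

```python
# Hypothetical button identifiers mirroring the seven ATRS functions
# named in the article (seat, tailgate, lift, docking); not the real API.
ATRS_FUNCTIONS = (
    "seat_out", "seat_in",
    "tailgate_open", "tailgate_close",
    "lift_down", "lift_up",
    "dock_powerchair",
)

class SmartATRSController:
    """Minimal sketch: one function operates at a time; Emergency Stop
    terminates any operating function, as the safety feature requires."""

    def __init__(self):
        self.active_function = None

    def press(self, button):
        if button == "emergency_stop":
            stopped = self.active_function
            self.active_function = None  # terminate whatever is running
            return f"stopped {stopped}" if stopped else "nothing running"
        if button not in ATRS_FUNCTIONS:
            raise ValueError(f"unknown button: {button}")
        self.active_function = button
        return f"running {button}"
```

The key design point the article implies is that Emergency Stop is not one of the seven function buttons: it is a single, always-available action that overrides any state, which is why it is modelled here as a special case before normal dispatch.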
An evaluation was conducted to compare the usability of the existing keyfobs with SmartATRS operated through both touch and joystick. Analysis using the System Usability and Adjective Rating Scales revealed that the keyfobs achieved ‘OK Usability’, touch achieved ‘Excellent Usability’ and joystick achieved ‘Good Usability’. The NASA Task Load Index (TLX) results identified that touch-based interaction placed lower mental and physical demands on users than the keyfobs. Joystick interaction produced medium levels of mental and physical demand, owing to a steeper learning curve caused by the coordination it requires.
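The System Usability Scale scores behind ratings such as ‘OK’, ‘Good’ and ‘Excellent’ come from the standard SUS formula (Brooke, 1996): ten statements rated 1–5, where odd-numbered items contribute (score − 1), even-numbered items contribute (5 − score), and the sum is multiplied by 2.5 to give a 0–100 score. The function below implements that standard formula; any example responses are invented, not the study’s data.

```python
def sus_score(responses):
    """Standard System Usability Scale score (0-100).

    responses: ten Likert ratings from 1 (strongly disagree) to
    5 (strongly agree), in questionnaire order (item 1 first).
    """
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs exactly ten ratings between 1 and 5")
    total = sum(
        (r - 1) if i % 2 == 0 else (5 - r)  # index 0, 2, ... are odd-numbered items
        for i, r in enumerate(responses)
    )
    return total * 2.5
```

A fully neutral questionnaire (all 3s) scores 50, which sits near the ‘OK’ band of the Adjective Rating Scale; higher bands such as ‘Good’ and ‘Excellent’ correspond to progressively higher SUS scores.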
SmartATRS can also be controlled through head movements on iOS devices by using the Switch Control accessibility feature. A second evaluation was conducted to compare the usability of head interaction with touch interaction. Analysis using the System Usability and Adjective Rating Scales revealed that head interaction achieved ‘Poor Usability’ compared with the ‘Good Usability’ of touch interaction, because the participants did not have the full range of neck movement required to operate Switch Control. Analysis using NASA TLX revealed that head-based interaction produced significantly higher mental and physical demands. However, for some participants who were not able to use a standard touch interface, head interaction was a more efficient solution.
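The NASA TLX demand comparisons reported in both evaluations rest on the instrument’s standard weighted-workload calculation: six subscales are each rated 0–100, weights come from 15 pairwise comparisons between subscales, and overall workload is the weight-adjusted mean. The function below implements that standard formula; the sample ratings in the test are invented, not the study’s data.

```python
# The six standard NASA TLX subscales
SUBSCALES = ("mental", "physical", "temporal", "performance", "effort", "frustration")

def tlx_workload(ratings, weights):
    """Weighted NASA TLX overall workload (0-100).

    ratings: dict mapping each subscale to a 0-100 rating.
    weights: dict mapping each subscale to the number of times it was
    chosen in the 15 pairwise comparisons (tallies must sum to 15).
    """
    if set(ratings) != set(SUBSCALES) or set(weights) != set(SUBSCALES):
        raise ValueError("ratings and weights must cover all six subscales")
    if sum(weights.values()) != 15:
        raise ValueError("weight tallies must sum to 15 pairwise comparisons")
    return sum(ratings[s] * weights[s] for s in SUBSCALES) / 15
```

Because the weights are per-participant, two interaction methods can receive the same raw ratings yet different overall workloads, which is why TLX can separate, for example, the mental and physical demands of head interaction from those of touch.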
Dr Paul Whittington
Lecturer in Assistive Technology
Dr Huseyin Dogan
Associate Professor & Acting Deputy Head of Department