ORCID Profile
0000-0001-8319-3589
Current Organisation
University of Tasmania
In Research Link Australia (RLA), "Research Topics" refer to ANZSRC FOR and SEO codes. These topics are either sourced from ANZSRC FOR and SEO codes listed in researchers' related grants or generated by a large language model (LLM) based on their publications.
Sustainability Accounting and Reporting | Consumer-Oriented Product or Service Development | Computer-Human Interaction | Marketing | Livestock Product Traceability and Quality Assurance | Technological and Organisational Innovation
Publisher: ACM
Date: 07-12-2015
Publisher: ACM
Date: 28-11-2011
Publisher: IEEE
Date: 11-2012
Publisher: MDPI AG
Date: 16-03-2018
DOI: 10.3390/SYM10030069
Publisher: Australasian Society for Computers in Learning in Tertiary Education
Date: 15-05-2021
DOI: 10.14742/AJET.6243
Abstract: This systematic review aimed to identify how tele-mentoring systems that incorporate augmented reality (AR) technology are being used in healthcare environments. A total of 12 electronic bibliographic databases were searched using the terms “augmented reality”, “tele-mentoring” and “health”. The PRISMA checklist was used as a guide for reporting. The mixed method appraisal tool was used to assess the quality of the included experiments. The data were then analysed using a concept-centric approach and categorised primarily with regards to system performance and task performance measures. A total of 11 randomised controlled trials and 14 non-randomised designs were included for review. Both mentees and mentors assessed the system and task performance according to 25 categories. The feedback of mentees using AR tele-mentoring systems was generally positive. The majority of experiments revealed that the AR system was an effective tele-mentoring device overall and resulted in the effective performance of a procedure. Benefits included improvements in trainees’ confidence, task completion time and reductions in task errors and shifts in focus. However, the systems had limitations, including heaviness of the equipment, inconvenience, discomfort and distraction of wearing devices, limited battery life, the latency of video and audio signals and limited field of view. Implications for practice or policy: Health practitioners can apply AR technology to receive and follow real-time annotated instructions verbally and visually from remote experts. Technical developers may consider improving AR devices in terms of lighter weight, larger field of view, more ergonomic design, more stable network connection and longer battery life. 
Further AR-related experiments may need to explore AR tele-mentoring systems’ utility across healthcare environments with larger samples, real patient populations in remote settings, cost-benefit analysis and impacts on short- and long-term patient outcomes.
Publisher: ACM
Date: 24-11-2014
Publisher: Springer International Publishing
Date: 2018
Publisher: ACM
Date: 24-11-2014
Publisher: IEEE
Date: 10-2013
Publisher: IEEE
Date: 10-2017
Publisher: Wiley
Date: 10-01-2019
DOI: 10.1002/CAE.22087
Publisher: ACM Press
Date: 2016
Publisher: JMIR Publications Inc.
Date: 02-10-2023
DOI: 10.2196/47228
Publisher: JMIR Publications Inc.
Date: 14-03-2023
Publisher: BCS Learning & Development
Date: 09-2014
Publisher: Informa UK Limited
Date: 28-11-2022
Publisher: ACM
Date: 26-04-2014
Publisher: ACM
Date: 28-11-2011
Publisher: Springer International Publishing
Date: 2015
Publisher: IEEE
Date: 09-2015
DOI: 10.1109/ISMAR.2015.9
Publisher: MIT Press - Journals
Date: 02-2003
DOI: 10.1162/105474603763835305
Abstract: The operation and performance of a six degree-of-freedom (DOF) shared-aperture tracking system with image overlay is described. This unique tracking technology shares the same aperture or scanned optical beam with the visual display, virtual retinal display (VRD). This display technology provides high brightness in an AR helmet-mounted display, especially in the extreme environment of a military cockpit. The VRD generates an image by optically scanning visible light directly to the viewer's eye. By scanning both visible and infrared light, the head-worn display can be directly coupled to a head-tracking system. As a result, the proposed tracking system requires minimal calibration between the user's viewpoint and the tracker's viewpoint. This paper demonstrates that the proposed shared-aperture tracking system produces high accuracy and computational efficiency. The current proof-of-concept system has a precision of ±0.05 and ±0.01 deg. in the horizontal and vertical axes, respectively. The static registration error was measured to be 0.08 ± 0.04 and 0.03 ± 0.02 deg. for the horizontal and vertical axes, respectively. The dynamic registration error or the system latency was measured to be within 16.67 ms, equivalent to our display refresh rate of 60 Hz. In all testing, the VRD was fixed and the calibrated motion of a robot arm was tracked. By moving the robot arm within a restricted volume, this real-time shared-aperture method of tracking was extended to six-DOF measurements. Future AR applications of our shared-aperture tracking and display system will be highly accurate head tracking when the VRD is helmet mounted and worn within an enclosed space, such as an aircraft cockpit.
Publisher: Springer Berlin Heidelberg
Date: 2013
Publisher: ACM
Date: 02-12-2014
Publisher: IEEE
Date: 09-2014
Publisher: ACM
Date: 27-11-2017
Publisher: IEEE
Date: 11-2017
Publisher: ACM
Date: 02-11-2015
Publisher: MDPI AG
Date: 12-2018
DOI: 10.3390/SYM10120680
Abstract: Control of robot arms is often required in engineering and can be performed by using different methods. This study examined and symmetrically compared the use of a controller, eye gaze tracker and a combination thereof in a multimodal setup for control of a robot arm. Tasks of different complexities were defined and twenty participants completed an experiment using these interaction modalities to solve the tasks. More specifically, there were three tasks: the first was to navigate a chess piece from a square to another pre-specified square; the second was the same as the first task, but required more moves to complete; and the third task was to move multiple pieces to reach a solution to a pre-defined arrangement of the pieces. Further, while gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment shows that the multimodal setup improves performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
Publisher: ACM
Date: 02-11-2015
Publisher: ACM
Date: 02-12-2014
Publisher: IEEE
Date: 07-2017
Publisher: ACM
Date: 25-11-2013
Start Date: 07-2014
End Date: 07-2021
Amount: $2,500,000.00
Funder: Australian Research Council