ORCID Profile
0000-0002-1644-6934
Current Organisations
Dublin City University, University of South Australia
In Research Link Australia (RLA), "Research Topics" refer to ANZSRC FOR and SEO codes. These topics are either sourced from ANZSRC FOR and SEO codes listed in researchers' related grants or generated by a large language model (LLM) based on their publications.
Information Systems | Decision Support and Group Support Systems
Publisher: Springer International Publishing
Date: 2016
Publisher: IEEE
Date: 03-2019
Publisher: IEEE
Date: 03-2022
Publisher: ACM
Date: 14-12-2021
Publisher: ACM
Date: 26-04-2014
Publisher: IEEE
Date: 10-2022
Publisher: Elsevier BV
Date: 07-2017
Publisher: ACM
Date: 07-12-2015
Publisher: ACM
Date: 28-11-2016
Publisher: IEEE
Date: 10-2013
Publisher: IEEE
Date: 03-2022
Publisher: ACM
Date: 19-11-2013
Publisher: IEEE
Date: 10-2013
Publisher: IEEE
Date: 11-2012
Publisher: IEEE
Date: 06-2011
Publisher: IEEE
Date: 10-2017
Publisher: Wiley
Date: 15-07-2020
DOI: 10.1002/JSID.954
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 11-2018
Publisher: Wiley
Date: 06-2012
Publisher: The Eurographics Association
Date: 2017
Publisher: The International Community for Auditory Display
Date: 07-2016
Abstract: Attention redirection trials were carried out using a wearable interface incorporating auditory and visual cues. Visual cues were delivered via the screen on the Recon Jet – a wearable computer resembling a pair of glasses – while auditory cues were delivered over a bone conduction headset. Cueing conditions included the delivery of individual cues, both auditory and visual, and in combination with each other. Results indicate that the use of an auditory cue drastically decreases target acquisition times. This is true especially for targets that fall outside the visual field of view. While auditory cues showed no difference when paired with any of the visual cueing conditions for targets within the field of view of the user, for those outside the field of view a significant improvement in performance was observed. The static visual cue paired with the binaurally spatialised, dynamic auditory cue appeared to provide the best performance in comparison to any other cueing conditions. In the absence of a visual cue, the binaurally spatialised, dynamic auditory cue performed the best.
Publisher: Computers, Materials and Continua (Tech Science Press)
Date: 2023
Publisher: ACM
Date: 08-05-2021
Publisher: IEEE
Date: 03-2022
Publisher: ACM
Date: 15-11-2013
Publisher: IEEE
Date: 03-2019
Publisher: ACM
Date: 08-05-2021
Publisher: MDPI AG
Date: 11-10-2022
Abstract: Emerging applications of immersive virtual technologies are providing architects and designers with powerful interactive environments for virtual design collaboration, which has been particularly beneficial since 2020 while the architecture, engineering and construction (AEC) industry has experienced an acceleration of remote working. However, there is currently a lack of critical understanding about both the theoretical and technical development of immersive virtual environments (ImVE) for supporting architectural design collaboration. This paper reviewed recent research (since 2010) relating to the topic in a systematic literature review (SLR). Through the four steps of identification, screening, eligibility check, and inclusion of the eligible articles, in total, 29 journal articles were reviewed and discussed from 3 aspects: ImVE in the AEC industry, ImVE for supporting virtual collaboration, and applications of ImVE to support design collaboration. The results of this review suggest that future research and technology development are needed in the following areas: (1) ImVE support for design collaboration, particularly at the early design stage; (2) cognitive research about design collaboration in ImVE, toward the adoption of more innovative and comprehensive methodologies; and (3) further enhancements to ImVE technologies to incorporate more needed advanced design features.
Publisher: ACM
Date: 08-12-2021
Publisher: IEEE
Date: 03-2008
Publisher: ACM
Date: 18-04-2015
Publisher: IEEE
Date: 10-2013
Publisher: IEEE
Date: 11-2012
Publisher: Elsevier BV
Date: 09-2023
Publisher: IEEE
Date: 10-2013
Publisher: ACM
Date: 12-11-2019
Publisher: ACM
Date: 02-05-2019
Publisher: IEEE
Date: 10-2013
Publisher: IEEE
Date: 11-2020
Publisher: IEEE
Date: 03-2019
Publisher: IEEE
Date: 06-2016
DOI: 10.1109/SVR.2016.25
Publisher: The Eurographics Association
Date: 2020
Publisher: IEEE
Date: 2004
Publisher: IEEE
Date: 10-2018
Publisher: ACM
Date: 18-11-2009
Publisher: IEEE
Date: 09-2020
Publisher: Springer International Publishing
Date: 2020
Publisher: ACM
Date: 23-10-2018
Publisher: ACM
Date: 24-11-2014
Publisher: IEEE
Date: 10-2017
Publisher: IEEE
Date: 2011
Publisher: Association for Computing Machinery (ACM)
Date: 07-11-2022
DOI: 10.1145/3555564
Abstract: In a remote collaboration involving a physical task, visualising gaze behaviours may compensate for other unavailable communication channels. In this paper, we report on a 360° panoramic Mixed Reality (MR) remote collaboration system that shares gaze behaviour visualisations between a local user in Augmented Reality and a remote collaborator in Virtual Reality. We conducted two user studies to evaluate the design of MR gaze interfaces and the effect of gaze behaviour (on/off) and gaze style (bi-/uni-directional). The results indicate that gaze visualisations amplify meaningful joint attention and improve co-presence compared to a no gaze condition. Gaze behaviour visualisations enable communication to be less verbally complex, therefore lowering collaborators' cognitive load while improving mutual understanding. Users felt that bi-directional behaviour visualisation, showing both collaborators' gaze states, was the preferred condition since it enabled easy identification of shared interests and task progress.
Publisher: Frontiers Media SA
Date: 08-02-2019
Publisher: IEEE
Date: 10-2013
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 05-2023
Publisher: ACM
Date: 07-12-2015
Publisher: ACM Press
Date: 2016
Publisher: Springer Science and Business Media LLC
Date: 15-07-2020
Publisher: ACM
Date: 28-11-2016
Publisher: IEEE
Date: 10-2018
Publisher: ACM
Date: 25-07-2022
Publisher: Royal Society of Chemistry (RSC)
Date: 2012
DOI: 10.1039/C2LC20973J
Abstract: In this article we introduce a novel technology that utilizes specialized water dissolvable thin films for valving in centrifugal microfluidic systems. In previous work (William Meathrel and Cathy Moritz, IVD Technologies, 2007), dissolvable films (DFs) have been assembled in laminar flow devices to form efficient sacrificial valves where DFs simply open by direct contact with liquid. Here, we build on the original DF valving scheme to leverage sophisticated, merely rotationally actuated vapour barriers and flow control for enabling comprehensive assay integration with low-complexity instrumentation on "lab-on-a-disc" platforms. The advanced sacrificial valving function is achieved by creating an inverted gas-liquid stack upstream of the DF during priming of the system. At low rotational speeds, a pocket of trapped air prevents a surface-tension stabilized liquid plug from wetting the DF membrane. However, high-speed rotation disrupts the metastable gas/liquid interface to wet the DF and thus opens the valve. By judicious choice of the radial position and geometry of the valve, the burst frequency can be tuned over a wide range of rotational speeds nearly 10 times greater than those attained by common capillary burst valves based on hydrophobic constrictions. The broad range of reproducible burst frequencies of the DF valves bears the potential for full integration and automation of comprehensive, multi-step biochemical assay protocols. In this report we demonstrate DF valving, discuss the biocompatibility of using the films, and show a potential sequential valving system including the on-demand release of on-board stored liquid reagents, fast centrifugal sedimentation and vigorous mixing thus providing a viable basis for use in lab-on-a-disc platforms for point-of-care diagnostics and other life science applications.
Publisher: ACM
Date: 04-12-2018
Publisher: IEEE
Date: 2017
Publisher: ACM
Date: 11-11-2014
Publisher: ACM
Date: 02-05-2019
Publisher: Springer International Publishing
Date: 2020
Publisher: ACM
Date: 11-12-2011
Publisher: ACM
Date: 18-04-2015
Publisher: Springer Science and Business Media LLC
Date: 02-06-2018
Publisher: IEEE
Date: 10-2019
Publisher: ACM
Date: 27-04-2022
Publisher: ACM
Date: 16-12-2009
Publisher: Association for Computing Machinery (ACM)
Date: 07-2005
Abstract: Users experience and verify immersive content firsthand while creating it within the same virtual environment.
Publisher: ACM
Date: 27-11-2017
Publisher: IEEE
Date: 10-2018
Publisher: JMIR Publications Inc.
Date: 18-09-2020
DOI: 10.2196/18965
Abstract: Throughout March 2020, leaders in countries across the world were making crucial decisions about how and when to implement public health interventions to combat the coronavirus disease (COVID-19). They urgently needed tools to help them to explore what will work best in their specific circumstances of epidemic size and spread, and feasible intervention scenarios. We sought to rapidly develop a flexible, freely available simulation model for use by modelers and researchers to allow investigation of how various public health interventions implemented at various time points might change the shape of the COVID-19 epidemic curve. “COVOID” (COVID-19 Open-Source Infection Dynamics) is a stochastic individual contact model (ICM), which extends the ICMs provided by the open-source EpiModel package for the R statistical computing environment. To demonstrate its use and inform urgent decisions on March 30, 2020, we modeled similar intervention scenarios to those reported by other investigators using various model types, as well as novel scenarios. The scenarios involved isolation of cases, moderate social distancing, and stricter population “lockdowns” enacted over varying time periods in a hypothetical population of 100,000 people. On April 30, 2020, we simulated the epidemic curve for the three contiguous local areas (population 287,344) in eastern Sydney, Australia that recorded 5.3% of Australian cases of COVID-19 through to April 30, 2020, under five different intervention scenarios and compared the modeled predictions with the observed epidemic curve for these areas. COVOID allocates each member of a population to one of seven compartments. The number of times individuals in the various compartments interact with each other and their probability of transmitting infection at each interaction can be varied to simulate the effects of interventions.
Using COVOID on March 30, 2020, we were able to replicate the epidemic response patterns to specific social distancing intervention scenarios reported by others. The simulated curve for three local areas of Sydney from March 1 to April 30, 2020, was similar to the observed epidemic curve in terms of peak numbers of cases, total numbers of cases, and duration under a scenario representing the public health measures that were actually enacted, including case isolation and ramp-up of testing and social distancing measures. COVOID allows rapid modeling of many potential intervention scenarios, can be tailored to diverse settings, and requires only standard computing infrastructure. It replicates the epidemic curves produced by other models that require highly detailed population-level data, and its predicted epidemic curve, using parameters simulating the public health measures that were enacted, was similar in form to that actually observed in Sydney, Australia. Our team and collaborators are currently developing an extended open-source COVOID package comprising a suite of tools to explore intervention scenarios using several categories of models.
Publisher: IEEE
Date: 07-2019
Publisher: IGI Global
Date: 2006
DOI: 10.4018/978-1-59904-066-0.CH013
Abstract: This chapter describes designing interaction methods for Tangible Augmented Reality (AR) applications. First, we describe the concept of a Tangible Augmented Reality interface and review its various successful applications, focusing on their interaction designs. Next we classify and consolidate these interaction methods into common tasks and interaction schemes. And finally, we present general design guidelines for interaction methods in Tangible AR applications. The authors hope that these guidelines will help developers design interaction methods for Tangible AR applications in a more structured and efficient way, and bring Tangible AR interfaces closer to our daily lives with further research.
Publisher: MDPI AG
Date: 26-08-2013
DOI: 10.3390/S130911336
Publisher: IEEE
Date: 06-2017
Publisher: ACM
Date: 08-05-2021
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 11-2016
Publisher: ACM
Date: 22-07-2016
Publisher: ACM
Date: 04-12-2018
Publisher: IEEE
Date: 2007
Publisher: ACM
Date: 07-05-2016
Publisher: ACM
Date: 02-11-2015
Publisher: Now Publishers
Date: 2015
DOI: 10.1561/1100000049
Publisher: IEEE
Date: 09-2014
Publisher: IEEE
Date: 09-2014
Publisher: ACM
Date: 02-11-2015
Publisher: IEEE
Date: 05-2021
Publisher: IEEE
Date: 10-2022
Publisher: Elsevier BV
Date: 04-2009
Publisher: ACM
Date: 14-12-2021
Publisher: Springer Science and Business Media LLC
Date: 25-07-2020
Publisher: MDPI AG
Date: 21-03-2023
DOI: 10.3390/MTI7030032
Abstract: Virtual Reality (VR) technology is gaining in popularity as a research tool for studying human behavior. However, the use of VR technology for remote testing is still an emerging field. This study aimed to evaluate the feasibility of conducting remote VR behavioral experiments that require millisecond timing. Participants were recruited via an online crowdsourcing platform and accessed a task on the classic cognitive phenomenon “Inhibition of Return” through a web browser using their own VR headset or desktop computer (68 participants in each group). The results confirm previous research that remote participants using desktop computers can be used effectively for conducting time-critical cognitive experiments. However, inhibition of return was only partially replicated for the VR headset group. Exploratory analyses revealed that technical factors, such as headset type, were likely to significantly impact variability and must be mitigated to obtain accurate results. This study demonstrates the potential for remote VR testing to broaden the research scope and reach a larger participant population. Crowdsourcing services appear to be an efficient and effective way to recruit participants for remote behavioral testing using high-end VR headsets.
Publisher: ACM
Date: 26-11-2012
Publisher: ACM
Date: 17-08-2020
Publisher: IEEE
Date: 10-2011
Publisher: ACM
Date: 14-11-2019
Publisher: SAGE Publications
Date: 09-2016
Abstract: Visual search performance was studied using auditory cues delivered over a bone conduction headset. Two types of auditory cues were employed to evaluate the effectiveness of such cues in an attention redirection task. Participants were required to locate and shoot targets at one of four locations on a screen when one of the two audio cues was delivered. Reaction and target acquisition times were significantly reduced when the binaurally spatialised cues were used compared to unlocalisable, monophonic cues. This appears to suggest that an auditory cue with directional information is far superior at aiding search tasks or alerting the user to redirect attention in the real-world space in comparison to a centered ‘monophonic’ cue. The results demonstrate the effectiveness of a binaurally spatialised, dynamic cue and point to its potential use in an information rich environment to provide useful and actionable information.
Publisher: ACM
Date: 12-2022
Publisher: ACM Press
Date: 2016
Publisher: ACM
Date: 19-11-2013
Publisher: ACM Press
Date: 2005
Publisher: ACM
Date: 19-11-2013
Publisher: ACM
Date: 02-05-2019
Publisher: ACM
Date: 02-12-2012
Publisher: IEEE
Date: 09-2016
Publisher: The Eurographics Association
Date: 2017
Publisher: ACM
Date: 12-11-2019
Publisher: IEEE
Date: 2006
DOI: 10.1109/VR.2006.150
Publisher: ACM
Date: 02-11-2015
Publisher: The Eurographics Association
Date: 2017
Publisher: American Society of Civil Engineers (ASCE)
Date: 2023
Publisher: The Eurographics Association
Date: 2017
Publisher: Wiley
Date: 06-10-2010
Publisher: Springer Science and Business Media LLC
Date: 03-04-2023
DOI: 10.1007/S10055-023-00759-2
Abstract: Virtual reality (VR) is a promising tool for training life skills in people with intellectual disabilities. However, there is a lack of evidence surrounding the implementation, suitability, and effectiveness of VR training in this population. The present study investigated the effectiveness of VR training for people with intellectual disabilities by assessing (1) their ability to complete basic tasks in VR, (2) real-world transfer and skill generalisation, and (3) the individual characteristics of participants able to benefit from VR training. Thirty-two participants with an intellectual disability of varying severity completed a waste management training intervention in VR that involved sorting 18 items into three bins. Real-world performance was measured at pre-test, post-test, and delayed time points. The number of VR training sessions varied as training ceased when participants met the learning target (≈ 90% correct). A survival analysis assessed training success probability as a function of the number of training sessions with participants split by their level of adaptive functioning (as measured on the Adaptive Behaviour Assessment System Third Edition). The learning target was met by 19 participants (59.4%) within ten sessions ( Mdn = 8.5, IQR 4–10). Real-world performance significantly improved from pre- to post-test and pre- to delayed test. There was no significant difference from post- to delayed test. Further, there was a significant positive relationship between adaptive functioning and change in the real-world assessment from the pre-test to the post- and delayed tests. VR facilitated the learning of most participants, which led to demonstrations of real-world transfer and skill generalisation. The present study identified a relationship between adaptive functioning and success in VR training. The survival curve may assist in planning future studies and training programs.
Publisher: ACM
Date: 18-11-2009
Publisher: Korean Society for Internet Information (KSII)
Date: 31-12-2018
Publisher: ACM
Date: 17-11-2013
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Date: 2020
Publisher: Springer Science and Business Media LLC
Date: 17-11-2021
Publisher: IEEE
Date: 10-2019
Publisher: Elsevier BV
Date: 02-2021
Publisher: ACM
Date: 27-11-2017
Publisher: ACM
Date: 27-11-2017
Publisher: IEEE
Date: 09-2017
Publisher: IEEE
Date: 10-2019
Publisher: ACM
Date: 02-12-2012
Publisher: Frontiers Media SA
Date: 14-06-2021
DOI: 10.3389/FRVIR.2021.697367
Abstract: Gaze is one of the predominant communication cues and can provide valuable implicit information such as intention or focus when performing collaborative tasks. However, little research has been done on how virtual gaze cues combining spatial and temporal characteristics impact real-life physical tasks during face to face collaboration. In this study, we explore the effect of showing joint gaze interaction in an Augmented Reality (AR) interface by evaluating three bi-directional collaborative (BDC) gaze visualisations with three levels of gaze behaviours. Using three independent tasks, we found that all BDC gaze visualisations are rated significantly better at representing joint attention and user intention compared to a non-collaborative (NC) condition, and hence are considered more engaging. The Laser Eye condition, spatially embodied with gaze direction, is perceived significantly more effective as it encourages mutual gaze awareness with a relatively low mental effort in a less constrained workspace. In addition, by offering additional virtual representation that compensates for verbal descriptions and hand pointing, BDC gaze visualisations can encourage more conscious use of gaze cues coupled with deictic references during co-located symmetric collaboration. We provide a summary of the lessons learned, limitations of the study, and directions for future research.
Publisher: ACM
Date: 27-11-2017
Publisher: ACM
Date: 20-04-2018
Publisher: ACM
Date: 14-11-2019
Publisher: ACM
Date: 17-11-2019
Publisher: Elsevier BV
Date: 11-2019
Publisher: The Eurographics Association
Date: 2017
Publisher: IEEE
Date: 10-2010
DOI: 10.1109/CW.2010.68
Publisher: Royal Society of Chemistry (RSC)
Date: 2012
DOI: 10.1039/C2LC40781G
Abstract: This work describes the first use of a wireless paired emitter detector diode device (PEDD) as an optical sensor for water quality monitoring in a lab-on-a-disc device. The microfluidic platform, based on an ionogel sensing area combined with a low-cost optical sensor, is applied for quantitative pH and qualitative turbidity monitoring of water samples at point-of-need. The autonomous capabilities of the PEDD system, combined with the portability and wireless communication of the full device, provide the flexibility needed for on-site water testing. Water samples from local fresh and brackish sources were successfully analysed using the device, showing very good correlation with standard bench-top systems.
Publisher: IEEE
Date: 10-2017
Publisher: The Eurographics Association
Date: 2017
Publisher: The Eurographics Association
Date: 2006
DOI: 10.2312/EGS.20061045
Publisher: MDPI AG
Date: 09-12-2020
DOI: 10.3390/INFORMATICS7040055
Abstract: Hyperscanning is a technique which simultaneously records the neural activity of two or more people. This is done using one of several neuroimaging methods, such as electroencephalography (EEG), functional magnetic resonance imaging (fMRI), and functional near-infrared spectroscopy (fNIRS). The use of hyperscanning has seen a dramatic rise in recent years to monitor social interactions between two or more people. Similarly, there has been an increase in the use of virtual reality (VR) for collaboration, and an increase in the frequency of social interactions being carried out in virtual environments (VE). In light of this, it is important to understand how interactions function within VEs, and how they can be enhanced to improve their quality in a VE. In this paper, we present some of the work that has been undertaken in the field of social neuroscience, with a special emphasis on hyperscanning. We also cover the literature detailing the work that has been carried out in the human–computer interaction domain that addresses remote collaboration. Finally, we present a way forward where these two research domains can be combined to explore how monitoring the neural activity of a group of participants in VE could enhance collaboration among them.
Publisher: The Eurographics Association
Date: 2017
Publisher: IEEE
Date: 06-2017
Location: Ireland
Location: Ireland
Start Date: 2023
End Date: 2023
Amount: $480,234.00
Funder: Australian Research Council
View Funded Activity