ORCID Profile
0000-0002-8903-0067
Current Organisation
James Cook University
Publisher: Informa UK Limited
Date: 04-08-2015
DOI: 10.1080/02701367.2015.1053104
Abstract: This study examined the effects of strength training on alternating days and endurance training on consecutive days on running performance for 6 days. Sixteen male and 8 female moderately trained individuals were evenly assigned into concurrent-training (CCT) and strength-training (ST) groups. The CCT group undertook strength training on alternating days combined with endurance training on consecutive days for 6 days. One week later, the CCT group conducted 3 consecutive days of endurance training only to determine whether fatigue would be induced with endurance training alone (CCT-Con). Endurance training was undertaken to induce endurance-training stimulus and to measure the cost of running (CR), rating of perceived exertion (RPE), and time to exhaustion (TTE). The ST group undertook 3 strength-training sessions on alternating days. Maximal voluntary contraction (MVC), rating of muscle soreness (RMS), and rating of muscle fatigue (RMF) were collected prior to each strength and endurance session. For the CCT group, small differences were primarily found in CR and RPE (ES = 0.17-0.41). However, moderate-to-large reductions were found for TTE and MVC (ES = 0.65-2.00), whereas large increases in RMS and RMF (ES = 1.23-2.49) were found prior to each strength- and endurance-training session. Small differences were found in MVC for the ST group (ES = 0.11) and during CCT-Con for the CCT group (ES = 0.15-0.31). Combining strength training on alternating days with endurance training on consecutive days impairs MVC and running performance at maximal effort and increases RMS and RMF over 6 days.
Publisher: Wiley
Date: 25-06-2015
DOI: 10.1071/HE14052
Publisher: Oxford University Press (OUP)
Date: 15-12-2022
Abstract: Prevention of musculoskeletal injury is vital to the readiness, performance, and health of military personnel, with specialized systems (e.g., force plates) used to assess injury risk and/or physical performance of interest. This study aimed to identify the reliability of one specialized system during standard assessments in military personnel. Sixty-two male and ten female Australian Army soldiers performed a two-leg countermovement jump (CMJ), one-leg CMJ, one-leg balance, and one-arm plank assessments using a Sparta Science force plate system across three testing sessions. Sparta Science (e.g., total Sparta, balance and plank scores, jump height, and injury risk) and biomechanical (e.g., average eccentric rate of contraction, average concentric force, and sway velocity) variables were recorded for all sessions. Mean ± SD, intraclass correlation coefficients (ICCs), coefficient of variation, and bias and limits of agreement were calculated for all variables. Mean results were similar between sessions 2 and 3 (P > .05). The relative reliability for the Sparta Science (ICC = 0.28-0.91) and biomechanical variables (ICC = 0.03-0.85) was poor to excellent. The mean absolute reliability (coefficient of variation) for Sparta Science variables was similar to or lower than that of the biomechanical variables during the CMJ (1-10% vs. 3-7%), one-leg balance (4-6% vs. 9-14%), and one-arm plank (5-7% vs. 12-17%) assessments. The mean bias for most variables was small relative to the mean, while the limits of agreement varied, with most unacceptable (±6-87% of the mean). The reliability of most Sparta Science and biomechanical variables during standard assessments was moderate to good. The typical variability in metrics documented will assist practitioners with the use of emerging technology to monitor and assess injury risk and/or training interventions in military personnel.
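The agreement statistics reported in this abstract (bias, 95% limits of agreement, and within-subject coefficient of variation across repeated sessions) follow standard definitions. The sketch below illustrates one common way to compute them from two testing sessions; all numbers are invented for illustration and are not the study's data.

```python
# Hedged sketch: test-retest agreement statistics (bias, limits of
# agreement, within-subject CV) from two sessions. Data are made up.
import statistics

session1 = [30.1, 28.4, 33.0, 29.5, 31.2]  # e.g. jump height (cm), session 2
session2 = [30.8, 27.9, 33.6, 30.1, 30.7]  # same participants, session 3

diffs = [b - a for a, b in zip(session1, session2)]
bias = statistics.mean(diffs)                         # mean between-session difference
sd_diff = statistics.stdev(diffs)
loa = (bias - 1.96 * sd_diff, bias + 1.96 * sd_diff)  # 95% limits of agreement

# Within-subject CV (%): SD of each participant's pair over its mean, averaged
cvs = [statistics.stdev([a, b]) / statistics.mean([a, b]) * 100
       for a, b in zip(session1, session2)]
cv = statistics.mean(cvs)

print(f"bias = {bias:.2f}, LoA = ({loa[0]:.2f}, {loa[1]:.2f}), CV = {cv:.1f}%")
```

With wide limits of agreement relative to the mean (as the abstract reports for many variables), individual retest scores can differ substantially even when the group means agree.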
Publisher: Springer Science and Business Media LLC
Date: 03-2019
Publisher: Springer Science and Business Media LLC
Date: 12-07-2017
DOI: 10.1007/S40279-017-0758-3
Abstract: A single bout of resistance training induces residual fatigue, which may impair performance during subsequent endurance training if inadequate recovery is allowed. From a concurrent training standpoint, such carry-over effects of fatigue from a resistance training session may impair the quality of a subsequent endurance training session for several hours to days with inadequate recovery. The proposed mechanisms of this phenomenon include: (1) impaired neural recruitment patterns; (2) reduced movement efficiency due to alteration in kinematics during endurance exercise and increased energy expenditure; (3) increased muscle soreness; and (4) reduced muscle glycogen. If endurance training quality is consistently compromised during the course of a specific concurrent training program, optimal endurance development may be limited. Whilst the link between acute responses of training and subsequent training adaptation has not been fully established, there is some evidence suggesting that cumulative effects of fatigue may contribute to limiting optimal endurance development. Thus, the current review will (1) explore cross-sectional studies that have reported impaired endurance performance following a single, or multiple bouts, of resistance training; (2) identify the potential impact of fatigue on chronic endurance development; (3) describe the implications of fatigue on the quality of endurance training sessions during concurrent training; and (4) explain the mechanisms contributing to resistance training-induced attenuation on endurance performance from neurological, biomechanical and metabolic standpoints. Increasing the awareness of resistance training-induced fatigue may encourage coaches to consider modulating concurrent training variables (e.g., order of training mode, between-mode recovery period, training intensity, etc.) to limit the carry-over effects of fatigue from resistance to endurance training sessions.
Publisher: MDPI AG
Date: 03-01-2023
Abstract: Objective: To examine the feasibility and effect of an individualised and force-plate guided training program on physical performance and musculoskeletal injury risk factors in army personnel. Design: Pre-post, randomised control. Methods: Fourteen male and five female Australian Army soldiers were randomised into two groups and performed 5-weeks of physical training. The control group (n = 9) completed standard, group-designed, physical training whilst the experimental group (n = 8) completed an individualised training program. Physical (push-ups, multi-stage fitness test, three repetition maximum (3RM) for squat, strict press, deadlift and floor press), occupational (weight-loaded march time), and technological assessments (two-leg and one-leg countermovement jumps (CMJ), one-leg balance, one-arm plank) were conducted prior to and following the training period. Comparisons between groups and changes within groups were conducted via Mann–Whitney U tests. Results: Compared to the control group, the experimental group exhibited a significantly smaller improvement for weight-loaded march time (−0.7% ± 4.0% vs. −5.1% ± 3.0%, p = 0.03) and a greater improvement for deadlift-3RM (20.6% ± 11.9% vs. 8.4% ± 6.8%, p = 0.056). All other outcomes were similar between groups. Visually favourable alterations in the two-leg CMJ profile with no reports of injuries were noted for the experimental group. Conclusions: Individualised physical training was feasible within an army setting and, for the most part, produced similar physical, occupational and technological performances to that of standard, group-designed physical training. These preliminary results provide a foundation for future research to expand upon and clarify the benefits of individualised training programs on long-term physical performance and injury risk/incidence in active combat army personnel.
Publisher: Termedia Sp. z.o.o.
Date: 05-06-2023
DOI: 10.5114/JHK/163180
Abstract: This study examined the intra-session reliability of sprint performance on a non-motorized treadmill amongst healthy, active male and female adults. One hundred and twenty participants (males, n = 77; females, n = 45) completed two familiarization sessions, followed by a third session that consisted of three trials (T1, T2, T3) of maximal sprints (4-s), interspersed by three minutes of recovery. Combined male and female data exhibited moderate-to-excellent test-retest reliability (intra-class correlation coefficient, ICC), minimal measurement error (coefficient of variation, CV) and trivial differences between trials (effect size, ES) for speed, power, total work and acceleration (ICC = 0.82–0.98, CV = 1.31–8.45%, ES = 0.01–0.22). The measurement error improved from the T1 vs. T2 comparison (CV = 1.62–8.45%, ES = 0.12–0.22) to T2 vs. T3 (CV = 1.31–6.56%, ES = 0.01–0.07) and was better for females (CV = 1.26–7.94%, ES = 0.001–0.26) than males (CV = 1.33–8.53%, ES = 0.06–0.31). The current study demonstrated moderate-to-excellent reliability and good-to-moderate measurement error during a 4-s sprint on a non-motorized treadmill. However, sex had a substantial impact, with females exhibiting better values. Practitioners should employ at least two separate trials within a session, in addition to multiple familiarization sessions, to achieve reliable non-motorized treadmill sprint performances.
Publisher: Human Kinetics
Date: 03-2023
Abstract: Purpose: This crossover trial compared the effects of varying feedback approaches on sprint performance, motivation, and affective mood states in female athletes. Methods: Eligibility criteria were being competitive female athletes, where participants completed sprint tests in 4 randomized feedback conditions on grass, including augmented feedback (sprint time; AUG-FB), technical feedback (cues; TECH-FB), a competition-driven drill (CDD) sprinting against an opponent, and a control condition (no feedback; CON). Participants completed a 20-m sprint (maximum sprint), 30-m curved agility sprint, and a repeated sprint ability test, with sprint times, motivation level, and mood states recorded. The participants were blinded from the number of trials during the repeated sprint ability test. Results: Twelve rugby league players completed all feedback conditions. The maximum sprint times were faster for AUG-FB (3.54 [0.16] s) and CDD (3.54 [0.16] s) compared with TECH-FB (3.64 [0.16] s), while there were no differences compared with CON (3.58 [0.17] s). The curved agility sprint times were faster for AUG-FB (5.42 [0.20] s) compared with TECH-FB (5.61 [0.21] s) and CON (5.57 [0.24] s), although CDD (5.38 [0.26] s) produced faster sprint times than TECH-FB. Effort and value were higher with AUG-FB (6.31 [0.68] and 6.53 [0.05]) compared with CON (5.99 [0.60] and 4.75 [2.07]), while CON exhibited lower enjoyment ratings (4.68 [0.95]) compared with other feedback conditions (AUG-FB: 5.54 [0.72]; CDD: 5.56 [0.67]; TECH-FB: 5.60 [0.56]). Conclusions: Providing AUG-FB prior to sprint tasks enhances more immediate performance outcomes than TECH-FB. AUG-FB also benefited athlete enjoyment, task effort, and coaching value. Female athletes should receive AUG-FB in testing and training environments to improve immediate physical performance and motivation.
Publisher: Canadian Science Publishing
Date: 2014
Abstract: This study investigated the effects of endurance training only (E, n = 14) and same-session combined training, when strength training is repeatedly preceded by endurance loading (endurance and strength training (E+S), n = 13) on endurance (1000-m running time during incremental field test) and strength performance (1-repetition maximum (1RM) in dynamic leg press), basal serum hormone concentrations, and endurance loading-induced force and hormone responses in recreationally endurance-trained men. E was identical in the 2 groups and consisted of steady-state and interval running, 4–6 times per week for 24 weeks. E+S performed additional mixed-maximal and explosive-strength training (2 times per week) immediately following an incremental running session (35–45 min, 65%–85% maximal heart rate). E and E+S decreased running time at week 12 (–8% ± 5%, p = 0.001 and –7% ± 3%, p < 0.001) and 24 (–13% ± 5%, p < 0.001 and –9% ± 5%, p = 0.001). Strength performance decreased in E at week 24 (–5% ± 5%, p = 0.014) but was maintained in E+S (between-groups at week 12 and 24, p = 0.014 and 0.011, respectively). Basal serum testosterone and cortisol concentrations remained unaltered in E and E+S but testosterone/sex hormone binding globulin ratio decreased in E+S at week 12 (–19% ± 26%, p = 0.006). At week 0 and 24, endurance loading-induced acute force (–5% to –9%, p = 0.032 to 0.001) and testosterone and cortisol responses (18%–47%, p = 0.013 to p < 0.001) were similar between E and E+S. This study showed no endurance performance benefits when strength training was performed repeatedly after endurance training compared with endurance training only. This was supported by similar acute responses in force and hormonal measures immediately post-endurance loading after the training with sustained 1RM strength in E+S.
Publisher: SAGE Publications
Date: 20-06-2023
DOI: 10.1177/17479541231181549
Abstract: Cricket is a unique international sport in which environmental and task constraints have been shown to have a significant impact on batting and bowling performance. The aim of this systematic review was to determine the effect of task and environmental constraints on cricket performance. A systematic literature search was conducted across Scopus, PubMed, Web of Science, CINAHL, and SportDiscus. Studies were deemed eligible if they reported the effects of pitch type, pitch length, or equipment (e.g. cricket bat, batting pads, ball type, etc.) on cricket performance. A total of 20 studies met the inclusion criteria, with Kmet scores ranging between 75% and 92%. The results from this study demonstrate that environmental constraints such as pitch type and task constraints such as equipment modification (e.g. type of cricket bat, batting pads, ball) and pitch length can influence cricketers' batting and bowling performance. Scaling cricket bats and reducing pitch length were acutely beneficial to cricket batting, while ball type, pitch length and soil properties were impactful on bowling performance. Importantly though, the impact of constraint manipulation seemed to be influenced by the skill level of the performer. The findings from this study may help coaches and practitioners improve skill acquisition, through constraint manipulation, to develop highly adaptive cricket batting and bowling skill.
Publisher: MDPI AG
Date: 19-09-2020
DOI: 10.3390/JPM10030135
Abstract: Aim: This systematic review aimed to explore the literature to identify in which types of chronic diseases exercise with supplemental oxygen has previously been utilized and whether this type of personalized therapy leads to superior effects in physical fitness and well-being. Methods: Databases (PubMed/MEDLINE, CINAHL, EMBASE, Web of Knowledge and Cochrane Library) were searched in accordance with PRISMA. Eligibility criteria included adult patients diagnosed with any type of chronic disease engaging in supervised exercise training with supplemental oxygen compared to normoxia. A random-effects model was used to pool effect sizes by standardized mean differences (SMD). Results: Out of the identified 4038 studies, 12 articles were eligible. Eleven studies were conducted in chronic obstructive pulmonary disease (COPD), while one study included coronary artery disease (CAD) patients. No statistical differences were observed for markers of physical fitness and patient-reported outcomes on well-being between the two training conditions (SMD = −0.10; 95% CI: −0.27, 0.08; p = 0.26). Conclusions: We found that chronic exercise with supplemental oxygen has mainly been utilized for COPD patients. Moreover, no superior long-term adaptations in physical fitness, functional capacity or patient-reported well-being were found, questioning the role of this method as a personalized medicine approach. Prospero registration: CRD42018104649.
Publisher: Informa UK Limited
Date: 22-10-2020
DOI: 10.1080/02640414.2019.1683385
Abstract: A common practice in resistance training is to perform sets of exercises at, or close to, failure, which can alter movement dynamics. This study examined ankle, knee, hip, and lumbo-pelvis dynamics during the barbell back squat under a moderate-heavy load (80% of 1 repetition maximum (1RM)) when performed to failure. Eleven resistance-trained males performed three sets to volitional failure. Sagittal plane movement dynamics at the ankle, knee, hip, and lumbo-pelvis were examined; specifically, joint moments, joint angles, joint angular velocity, and joint power. The second repetition of the first set and the final repetition of the third set were compared. Results showed that while the joint movements slowed (p < 0.05), the joint ranges of motion were not altered. There were significant changes in most mean joint moments (p < 0.05), indicating altered joint loading. The knee moment decreased while the hip and lumbo-pelvis moments underwent compensatory increases. At the knee and hip, there were significant decreases in concentric power output (p < 0.05). Whilst performing multiple sets to failure altered some joint kinetics, the comparable findings in joint range of motion suggest that technique was not altered. Therefore, skilled individuals appear to maintain technique when performing to failure.
Publisher: MDPI AG
Date: 19-07-2021
Abstract: The current study examined the acute effects of a bout of resistance training on cricket bowling-specific motor performance. Eight sub-elite, resistance-untrained, adolescent male fast bowlers (age 15 ± 1.7 years; height 1.8 ± 0.1 m; weight 67.9 ± 7.9 kg) completed a bout of upper and lower body resistance exercises. Indirect markers of muscle damage (creatine kinase [CK] and delayed onset of muscle soreness [DOMS]), anaerobic performance (15-m sprint and vertical jump), and cricket-specific motor performance (ball speed, run-up time, and accuracy) were measured prior to and 24 (T24) and 48 (T48) hours following the resistance training bout. The resistance training bout significantly increased CK (~350%; effect size [ES] = 1.89–2.24), DOMS (~240%; ES = 1.46–3.77) and 15-m sprint times (~4.0%; ES = 1.33–1.47), whilst significantly reducing vertical jump height (~7.0%; ES = 0.76–0.96) for up to 48 h. Ball speed (~3.0%; ES = 0.50–0.61) and bowling accuracy (~79%; ES = 0.39–0.70) were significantly reduced, whilst run-up time was significantly increased (~3.5%; ES = 0.36–0.50) for up to 24 h. These findings demonstrate that a bout of resistance training evokes exercise-induced muscle damage amongst sub-elite, adolescent male cricketers, which impairs anaerobic performance and bowling-specific motor performance measures. Cricket coaches should be cautious of incorporating bowling sessions within 24 h following a bout of resistance training for sub-elite adolescent fast bowlers, particularly for those commencing a resistance training program.
Publisher: MDPI AG
Date: 10-11-2022
DOI: 10.3390/JCM11226680
Abstract: Background: The purpose of this study was to describe the femoral component rotation in total knee arthroplasty (TKA) using a tibia-first, gap-balancing, “functional alignment” technique. Methods: Ninety-seven patients with osteoarthritis received a TKA using computer navigation. The tibial resection was performed according to the kinematic alignment (KA) principles, while the femoral rotation was set according to the gap-balancing technique. Preoperative MRIs and intraoperative resection depth data were used to calculate the following rotational axes: the transepicondylar axis (TEA), the posterior condylar axis (PCA) and the prosthetic posterior condylar axis (rPCA). The angles between the PCA and the TEA (PCA/TEA), between the rPCA and the PCA (rPCA/PCA) and between the rPCA and the TEA (rPCA/TEA) were measured. Data regarding patellar maltracking and PROMs were collected for 24 months postoperatively. Results: The mean PCA/TEA, rPCA/TEA and rPCA/PCA angles were −5.1° ± 2.1°, −4.8° ± 2.6° and −0.4° ± 1.7°, respectively (the negative values denote the internal rotation of the PCA to the TEA, rPCA to TEA and rPCA to PCA, respectively). There was no need for lateral release and no cases of patellar maltracking. Conclusions: A tibia-first, gap-balancing, “functional alignment” approach allows incorporating a gap-balancing technique with kinematic principles. Sagittal complexities in the proximal tibia (variable medial and lateral slopes) can be accounted for, as the tibial resection is completed prior to setting the femoral rotation. The prosthetic femoral rotation is internally rotated relative to the TEA, almost parallel to the PCA, similar to the femoral rotation of the KA-TKA technique. This technique did not result in patellar maltracking.
Publisher: Informa UK Limited
Date: 03-10-2014
DOI: 10.1080/17461391.2012.726653
Abstract: Strength training has been shown to cause acute detrimental effects on running performance. However, there is limited investigation on the effect of various strength training variables, whilst controlling eccentric contraction velocity, on running performance. The present study examined the effects of intensity and volume (i.e. whole body vs. lower body only) of strength training with slow eccentric contractions on running economy (RE) [i.e. below anaerobic threshold (AT)] and time-to-exhaustion (TTE) (i.e. above AT) 6 hours post. Fifteen trained and moderately endurance trained male runners undertook high-intensity whole body (HW), high-intensity lower body only (HL) and low-intensity whole body (LW) strength training sessions with slow eccentric contractions (i.e. 1:4 second concentric-to-eccentric contraction) in random order. Six hours following each strength training session, a RE test with TTE was conducted. The results showed that HW, HL and LW sessions had no effect on RE and that LW session had no effect on TTE (P ≥ 0.05). However, HW and HL sessions significantly reduced TTE (P < 0.05). These findings demonstrate that a 6-hour recovery period following HW, HL and LW sessions may minimize attenuation in endurance training performance below AT, although caution should be taken for endurance training sessions above AT amongst trained and moderately endurance trained runners.
Publisher: Canadian Science Publishing
Date: 06-2013
Abstract: This study examined the acute effect of strength and endurance training sequence on running economy (RE) at 70% and 90% ventilatory threshold (VT) and on running time to exhaustion (TTE) at 110% VT the following day. Fourteen trained and moderately trained male runners performed strength training prior to running sessions (SR) and running prior to strength training sessions (RS), with each mode of training session separated by 6 h. RE tests were conducted at baseline (Base-RE) and the day following each sequence to examine cost of running (CR), TTE, and lower extremity kinematics. Maximal isometric knee extensor torque was measured prior to and following each training session and the RE tests. Results showed that CR at 70% and 90% VT for SR-RE (0.76 ± 0.10 and 0.77 ± 0.07 mL·kg^-0.75·m^-1) was significantly greater than Base-RE (0.72 ± 0.10 and 0.70 ± 0.11 mL·kg^-0.75·m^-1) and RS-RE (0.73 ± 0.09 and 0.72 ± 0.09 mL·kg^-0.75·m^-1) (P < 0.05). TTE was significantly less for SR-RE (237.8 ± 67.4 s) and RS-RE (275.3 ± 68.0 s) compared with Base-RE (335.4 ± 92.1 s) (P < 0.01). The torque during the SR sequence was significantly reduced at every time point following the strength training session (P < 0.05). However, no significant differences were found in torque following the running session (P > 0.05), although it was significantly reduced following the strength training session (P < 0.05) during the RS sequence. These findings show that running performance is impaired to a greater degree the day following the SR sequence compared with the RS sequence.
Publisher: Informa UK Limited
Date: 19-02-2015
DOI: 10.1080/02640414.2015.1014830
Abstract: This study investigated whether anticipation and search strategies of goalkeepers are influenced by temporally and spatially manipulated video of a penalty. Participants were clustered into three groups depending on skill: goalkeepers (n = 17), field players (n = 20) and control group (n = 20). An eye tracker was worn whilst watching 40 videos of a striker kicking to four corners of a goal in random order. All 40 videos were temporally occluded at foot-to-ball contact, and the non-kicking leg of 20 videos was spatially manipulated. Results showed that goalkeepers had significantly better predictions than the two groups with no differences between the two testing conditions. According to effect size, the percentage of fixation location and viewing time of the kicking leg and ball were greater for the goalkeepers and field players group than the control group irrespective of testing conditions. The fixations on the kicking leg and ball in conjunction with comparable predictions between spatially manipulated and control conditions suggest that goalkeepers may not rely on the non-kicking leg. Furthermore, goalkeepers appear to use a global perceptual approach by anchoring on a distal fixation point/s of the penalty taker whilst using peripheral vision to obtain additional information.
Publisher: Springer Science and Business Media LLC
Date: 04-2015
DOI: 10.1007/S00421-015-3159-Z
Abstract: This study examined the effects of two typical strength training sessions performed 1 week apart (i.e. repeated bout effect) on sub-maximal running performance and hormonal responses. Fourteen resistance-untrained men (age 24.0 ± 3.9 years; height 1.83 ± 0.11 m; body mass 77.4 ± 14.0 kg; VO2peak 48.1 ± 6.1 mL·kg^-1·min^-1) undertook two bouts of high-intensity strength training sessions (i.e. six-repetition maximum). Creatine kinase (CK), delayed-onset muscle soreness (DOMS), counter-movement jump (CMJ) as well as concentrations of serum testosterone, cortisol and testosterone/cortisol ratio (T/C) were examined prior to and immediately post, 24 (T24) and 48 (T48) h post each strength training bout. Sub-maximal running performance was also conducted at T24 and T48 of each bout. When measures were compared between bouts at T48, the degree of elevation in CK (-58.4 ± 55.6%) and DOMS (-31.43 ± 42.9%) and the acute reduction in CMJ measures (4.1 ± 5.4%) were attenuated (p < 0.05) following the second bout. Cortisol was increased until T24 (p < 0.05). Sub-maximal running performance was impaired until T24, although changes were not attenuated following the second bout. The initial bout appeared to provide protection against a number of muscle damage indicators, suggesting a greater need for recovery following the initial session of typical lower body resistance exercises in resistance-untrained men, although sub-maximal running should be avoided following the first two sessions.
Publisher: Springer Science and Business Media LLC
Date: 04-06-2019
Publisher: Springer Science and Business Media LLC
Date: 26-02-2022
DOI: 10.1007/S40279-022-01640-Z
Abstract: Several studies have examined the effect of creatine monohydrate (CrM) on indirect muscle damage markers and muscle performance, although pooled data from several studies indicate that the benefits of CrM on recovery dynamics are limited. This systematic review and meta-analysis determined whether the ergogenic effects of CrM ameliorated markers of muscle damage and performance following muscle-damaging exercises. In total, 23 studies were included, consisting of 240 participants in the CrM group (age 23.9 ± 10.4 years, height 178 ± 5 cm, body mass 76.9 ± 7.6 kg, females 10.4%) and 229 participants in the placebo group (age 23.7 ± 8.5 years, height 177 ± 5 cm, body mass 77.0 ± 6.6 kg, females 10.0%). These studies were rated as fair to excellent following the PEDro scale. The outcome measures were compared between the CrM and placebo groups at 24–36 h and 48–90 h following muscle-damaging exercises, using standardised mean differences (SMDs) and associated p-values via forest plots. Furthermore, sub-group analyses were conducted by separating studies into those that examined the effects of CrM as an acute training response (i.e., after one muscle-damaging exercise bout) and those that examined the chronic training response (i.e., examining the acute response after the last training session following several weeks of training). According to the meta-analysis, the CrM group exhibited significantly lower indirect muscle damage markers (i.e., creatine kinase, lactate dehydrogenase, and/or myoglobin) at 48–90 h post-exercise for the acute training response (SMD = −1.09, p = 0.03). However, indirect muscle damage markers were significantly greater in the CrM group at 24 h post-exercise (SMD = 0.95, p = 0.04) for the chronic training response. Although not significant, a large difference in indirect muscle damage markers was also found at 48 h post-exercise (SMD = 1.24) for the chronic training response.
The CrM group also showed lower inflammation for the acute training response at 24–36 h post-exercise and 48–90 h post-exercise with a large effect size (−1.79 ≤ SMD ≤ −1.38). Similarly, the oxidative stress markers were lower for the acute training response in the CrM group at 24–36 h post-exercise and 90 h post-exercise, with a large effect size (SMD = −1.37 and −1.36, respectively). For delayed-onset muscle soreness (DOMS), the measures were lower for the CrM group at 24 h post-exercise with a moderate effect size (SMD = −0.66) as an acute training response. However, the inter-group differences for inflammation, oxidative stress, and DOMS were not statistically significant (p > 0.05). Overall, our meta-analysis demonstrated a paradoxical effect of CrM supplementation post-exercise, where CrM appears to minimise exercise-induced muscle damage as an acute training response, although this trend is reversed as a chronic training response. Thus, CrM may be effective in reducing the level of exercise-induced muscle damage following a single bout of strenuous exercises, although training-induced stress could be exacerbated following long-term supplementation of CrM. Although long-term usage of CrM is known to enhance training adaptations, whether the increased level of exercise-induced muscle damage as a chronic training response may provide potential mechanisms to enhance chronic training adaptations with CrM supplementation remains to be confirmed.
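The pooled SMDs reported in this meta-analysis come from combining per-study effect sizes under a weighting scheme. As a minimal illustration of how such pooling is commonly done (a DerSimonian-Laird random-effects sketch; the study values below are invented, not taken from this meta-analysis):

```python
# Hedged sketch: random-effects pooling of standardised mean differences
# (DerSimonian-Laird tau^2). Per-study SMDs and variances are invented.
import math

smds = [-1.2, -0.9, -1.3, -0.7]   # per-study standardised mean differences
vars_ = [0.20, 0.15, 0.25, 0.18]  # per-study sampling variances

# Fixed-effect weights and Cochran's Q (heterogeneity statistic)
w = [1 / v for v in vars_]
fe = sum(wi * y for wi, y in zip(w, smds)) / sum(w)
q = sum(wi * (y - fe) ** 2 for wi, y in zip(w, smds))
df = len(smds) - 1

# Between-study variance tau^2 (DerSimonian-Laird, truncated at zero)
c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)

# Random-effects weights, pooled SMD, and 95% confidence interval
w_re = [1 / (v + tau2) for v in vars_]
pooled = sum(wi * y for wi, y in zip(w_re, smds)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
ci = (pooled - 1.96 * se, pooled + 1.96 * se)
print(f"pooled SMD = {pooled:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

When heterogeneity is low (Q below its degrees of freedom), tau^2 truncates to zero and the random-effects result collapses to the fixed-effect estimate; with heterogeneous studies, tau^2 widens the confidence interval.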
Publisher: Wiley
Date: 02-06-2018
DOI: 10.1111/CPF.12443
Abstract: The purpose of this systematic review was to evaluate the effects of smokeless forms of nicotine on physiological responses and exercise performance. Methodology and reporting were based on the PRISMA statement. The intervention was defined as any product containing nicotine that did not require smoking. Searches were conducted across two electronic databases with supplementary approaches utilized. Studies were selected following set inclusion and exclusion criteria and checked by two independent authors. A modified PEDro scale was utilized to rate study quality with studies averaging 9·3/13. Six studies assessed exercise performance with endurance-based parameters reported as significantly improved with nicotine in one study, while anaerobic parameters were unaffected or decreased compared to placebo except in one study which reported enhanced leg extensor torque but no effect on countermovement jump or Wingate anaerobic capacity. Sixteen of 28 studies investigating physiological responses reported that nicotine significantly increased heart rate compared to placebo or control. Blood pressure and blood flow were also reported as significantly increased in multiple studies. While there is strong evidence of nicotine-induced changes in physiological function that would benefit physical performance, beneficial effects have only been reported on leg extensor torque and endurance performance by one study each. Subsequently, there is need for more research with strong methodological quality to definitively evaluate nicotine's potential as an ergogenic aid.
Publisher: Elsevier BV
Date: 03-2023
Publisher: Springer Science and Business Media LLC
Date: 16-08-2018
DOI: 10.1007/S40279-018-0964-7
Abstract: Several studies have examined the effects of balance training in elderly individuals following total knee arthroplasty (TKA), although findings appear to be equivocal. This systematic review and meta-analysis examined the effects of balance training on walking capacity, balance-specific performance and other functional outcome measures in elderly individuals following TKA. Data sources: Pubmed, PEDro, Cinahl, SportDiscus, Scopus. Eligibility criteria: Data were aggregated following the population-intervention-comparison-outcome (PICO) principles. Eligibility criteria included: (1) randomised controlled trials; (2) studies with comparative groups; (3) training interventions were incorporated post-TKA; and (4) outcome measures included walking capacity, balance-specific performance measures, subjective measures of physical function and pain, and knee range-of-motion. Population: elderly individuals (65+ years) who underwent total knee arthroplasty. Intervention: balance interventions that consisted of balance exercises, compared to control interventions that did not involve balance exercises, or involved them to a lesser extent. Participants also undertook usual physiotherapy care in conjunction with either the balance and/or control intervention. The intervention duration ranged from 4 to 32 weeks with outcome measures reported immediately following the intervention. Of these, four studies also reported follow-up measures ranging from 6 to 12 months post-intervention. Study appraisal: PEDro scale. Quantitative analysis was conducted by generating forest plots to report on standardised mean differences (SMD, i.e. effect size), test statistics for statistical significance (i.e. Z values) and inter-trial heterogeneity by inspecting I². Balance training exhibited significantly greater improvement in walking capacity (SMD = 0.57; Z = 6.30; P < 0.001). A number of outcome measures indicated high inter-trial heterogeneity, and only articles published in English were included.
Balance training improved walking capacity, balance-specific performance and functional outcome measures for elderly individuals following TKA. These findings may improve clinical decision-making for appropriate post-TKA exercise prescription to minimise falls risks and optimise physical function.
Publisher: Informa UK Limited
Date: 09-2013
DOI: 10.1080/14763141.2012.760204
Abstract: The purpose of this study was to compare kinematics and muscle activity between chin-ups and lat-pull down exercises and between muscle groups during the two exercises. Normalized electromyography (EMG) of biceps brachii (BB), triceps brachii (TB), pectoralis major (PM), latissimus dorsi (LD), rectus abdominis (RA), and erector spinae (ES) and kinematics of the back, shoulder, and seventh cervical vertebra (C7) were analysed during chin-ups and lat-pull down exercises. Normalized EMG of BB and ES and kinematics of the shoulder and C7 for chin-ups were greater than for lat-pull down exercises during the concentric phase (p < 0.05). For the eccentric phase, RA during lat-pull down exercises was greater than during chin-ups, and the kinematics of C7 during chin-ups were greater than during lat-pull down exercises (p < 0.05). For chin-ups, BB, LD, and ES were greater than PM during the concentric phase, whereas BB and LD were greater than TB, and LD was greater than RA during the eccentric phase (p < 0.05). For the lat-pull down exercise, BB and LD were greater than PM, TB, and ES during the concentric phase, whereas LD was greater than PM, TB, and BB during the eccentric phase (p < 0.05). Subsequently, the chin-up appears to be the more functional exercise.
Publisher: MDPI AG
Date: 24-08-2022
Abstract: To examine the repeated bout effect (RBE) following two identical resistance bouts and its effect on bowling-specific performance in male cricketers. Male cricket pace bowlers (N = 10), who had not undertaken resistance exercises in the past six months, were invited to complete a familiarisation and resistance maximum testing, before participating in the study protocol. The study protocol involved the collection of muscle damage markers, a battery of anaerobic tests (jump and sprint), and a bowling-specific performance test at baseline, followed by a resistance training bout, and a retest of physical and bowling-specific performance at 24 h (T24) and 48 h (T48) post-training. The study protocol was repeated 7–10 days thereafter. Indirect markers of muscle damage were lower (creatine kinase: 318.7 ± 164.3 U·L−1; muscle soreness: 3 ± 1), whilst drop jump was improved (~47.5 ± 8.1 cm) following the second resistance training bout when compared to the first resistance training bout (creatine kinase: 550.9 ± 242.3 U·L−1; muscle soreness: 4 ± 2; drop jump: ~43.0 ± 9.7 cm). However, sport-specific performance via bowling speed declined (Bout 1: −2.55 ± 3.43%; Bout 2: 2.67 ± 2.41%) whilst run-up time increased (Bout 1: 2.34 ± 3.61%; Bout 2: 3.84 ± 4.06%) after each bout of resistance training. Findings suggest that while the initial resistance training bout reduced muscle damage indicators and improved drop jump performance following the second resistance training bout, this RBE trend was not observed for bowling-specific performance. It was suggested that pace bowlers with limited exposure to resistance training should minimise bowling-specific practice for 1–2 days following the initial bouts of their resistance training program.
Publisher: F1000 Research Ltd
Date: 17-06-2020
DOI: 10.12688/F1000RESEARCH.23129.1
Abstract: Background: Rugby league involves repeated, complex, change-of-direction movements, although there are no test protocols that specifically assess these physical fitness profiles. Thus, the current study examined the convergent validity and reliability of a repeated Illinois Agility (RIA) protocol in adolescent Rugby League players. Methods: Twenty-two junior Rugby League players completed 4 sessions, each separated by 7 days. Initially, physical fitness characteristics at baseline (i.e., multi-stage fitness test, countermovement jump, 30-m sprint, single-effort agility and repeated sprint ability [RSA]) were assessed. The second session involved a familiarisation of the RIA and repeated T-agility test (RTT) protocols. During the third and fourth sessions, participants completed the RIA and RTT protocols in a randomised, counterbalanced design to examine the validity and test-retest reliability of these protocols. Results: For convergent validity, significant correlations were identified between RIA and RTT performances (r = .80, p < .05). For contributors to RIA performance, significant correlations were identified between all baseline fitness characteristics and RIA (r = .71, p < 0.05). Reliability of the RIA protocol was near perfect, with excellent intra-class correlation coefficients (0.87-0.97), good ratio limits of agreement (×/÷ 1.05-1.06) and low coefficients of variation (1.77-1.97%). Conclusions: The current study has demonstrated the RIA to be a simple, valid and reliable field test that can provide coaches with information about their athletes’ ability to sustain high intensity, multi-directional running efforts.
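The reliability statistics reported for the RIA protocol (coefficient of variation and ratio limits of agreement) can be illustrated with a short sketch. This is illustrative only: the test-retest times below are hypothetical, and the log-transformed typical-error approach used here is an assumption, as the abstract does not state the exact computation.

```python
from math import sqrt, log, exp
from statistics import stdev

def typical_cv_percent(test1, test2):
    """Within-subject coefficient of variation (%) estimated from the
    spread of test-retest differences on the log scale (typical error)."""
    diffs = [log(b) - log(a) for a, b in zip(test1, test2)]
    typical_error = stdev(diffs) / sqrt(2)
    return 100 * (exp(typical_error) - 1)

def ratio_limits_of_agreement(test1, test2):
    """95% ratio limits of agreement, expressed as a x/÷ factor."""
    diffs = [log(b) - log(a) for a, b in zip(test1, test2)]
    return exp(1.96 * stdev(diffs))

# Hypothetical repeated-agility total times (s) for six players on two days
day1 = [16.2, 15.8, 17.1, 16.5, 15.9, 16.8]
day2 = [16.0, 16.1, 17.3, 16.4, 16.2, 16.6]

cv = typical_cv_percent(day1, day2)          # within-subject variability, %
loa = ratio_limits_of_agreement(day1, day2)  # agreement factor, x/÷
```

Lower CV values and ratio limits closer to 1.0 indicate more stable test-retest performance.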
Publisher: F1000 Research Ltd
Date: 08-10-2021
DOI: 10.12688/F1000RESEARCH.23129.2
Abstract: Background: Rugby league involves repeated, complex, and high intensity change-of-direction (COD) movements, with no existing test protocols that specifically assess these multiple physical fitness components simultaneously. Thus, the current study examined the convergent validity of a repeated Illinois Agility (RIA) protocol with the repeated T-agility protocol, and the repeatability of the RIA protocol in adolescent Rugby League players. Furthermore, aerobic capacity and anaerobic and COD performance were assessed to determine whether these physical qualities were important contributors to the RIA protocol. Methods: Twenty-two junior Rugby League players completed 4 sessions, each separated by 7 days. Initially, physical fitness characteristics at baseline (i.e., Beep test, countermovement jump, 30-m sprint, single-effort COD and repeated sprint ability [RSA]) were assessed. The second session involved a familiarisation of the RIA and repeated T-agility test (RTT) protocols. During the third and fourth sessions, participants completed the RIA and RTT protocols in a randomised, counterbalanced design to examine the validity and test-retest reliability of these protocols. Results: For convergent validity, significant correlations were identified between RIA and RTT performances (r = .80, p < .05). For contributors to RIA performance, significant correlations were identified between all baseline fitness characteristics and RIA (r = .71, p < 0.05). Reliability of the RIA protocol was near perfect, with excellent intra-class correlation coefficients (0.87-0.97), good ratio limits of agreement (×/÷ 1.05-1.06) and low coefficients of variation (1.8-2.0%). Conclusions: The current study has demonstrated the RIA to be a simple, valid and reliable field test for RL athletes that can provide coaches with information about their team’s ability to sustain high intensity, multi-directional running efforts.
Publisher: F1000 Research Ltd
Date: 22-11-2021
DOI: 10.12688/F1000RESEARCH.23129.3
Abstract: Background: Rugby league involves repeated, complex, and high intensity change-of-direction (COD) movements, with no existing test protocols that specifically assess these multiple physical fitness components simultaneously. Thus, the current study examined the convergent validity of a repeated Illinois Agility (RIA) protocol with the repeated T-agility protocol, and the repeatability of the RIA protocol in adolescent Rugby League players. Furthermore, aerobic capacity and anaerobic and COD performance were assessed to determine whether these physical qualities were important contributors to the RIA protocol. Methods: Twenty-two junior Rugby League players completed 4 sessions, each separated by 7 days. Initially, physical fitness characteristics at baseline (i.e., Multi-stage Shuttle test, countermovement jump, 30-m sprint, single-effort COD and repeated sprint ability [RSA]) were assessed. The second session involved a familiarisation of the RIA and repeated T-agility test (RTT) protocols. During the third and fourth sessions, participants completed the RIA and RTT protocols in a randomised, counterbalanced design to examine the validity and test-retest reliability of these protocols. Results: For convergent validity, significant correlations were identified between RIA and RTT performances (r = .80, p < .05). For contributors to RIA performance, significant correlations were identified between all baseline fitness characteristics and RIA (r = .71, p < 0.05). Reliability of the RIA protocol was near perfect, with excellent intra-class correlation coefficients (0.87-0.97), good ratio limits of agreement (×/÷ 1.05-1.06) and low coefficients of variation (1.8-2.0%). Conclusions: The current study has demonstrated the RIA to be a simple, valid and reliable field test for RL athletes that can provide coaches with information about their team’s ability to sustain high intensity, multi-directional running efforts.
Publisher: Informa UK Limited
Date: 19-04-2021
Publisher: Springer Science and Business Media LLC
Date: 07-06-2021
DOI: 10.1007/S40279-021-01486-X
Abstract: The relationship between exercise-induced muscle damage (EIMD) indicators and acute training loads (TL) is yet to be reviewed extensively in semi-elite and elite athlete populations. The objectives of this systematic review and meta-analysis were threefold: (1) to evaluate studies of EIMD following the initial period of the preseason in semi-elite and elite athletes; (2) to examine acute physiological and performance responses across two periods of the season with similar TL; and (3) to examine acute physiological and performance responses to acute changes in TL during the season. The CINAHL, PubMed, Scopus, SPORTDiscus and Web of Science databases were systematically searched for studies that: (1) investigated semi-elite or elite athletes in team or individual sports following a periodised training programme and (2) measured acute responses to training. Studies were excluded if: (1) conducted in animals; (2) non-English language; or (3) a conference abstract, review or case report. The Kmet Quality Scoring of Quantitative Studies tool was used for study appraisal. Data were quantitatively analysed by generating forest plots to report test statistics for statistical significance and inter-trial heterogeneity. Of the included studies (n = 32), athletes experienced greater creatine kinase (CK) concentrations (Z = 4.99, p < 0.00001). This review included varying ages, sexes, sports and competition levels. The group level meta-analysis failed to identify within-athlete or position-specific differences across time. Blood biomarkers of EIMD may not differ across periods of similar TL; however, they can be considered a sensitive monitoring tool for assessing responses following acute TL changes in semi-elite and elite athletes.
Publisher: Wiley
Date: 20-04-2023
DOI: 10.1111/AJR.12986
Abstract: Despite the importance of child road traffic death, a limited number of studies have investigated rural child road traffic death in high-income countries. This review estimated the impact of rurality on child road traffic deaths and other potential risk factors in high-income countries. We searched Ovid, MEDLINE, CINAHL, PsycINFO and Scopus databases and extracted studies focusing on the association between rurality and child road traffic death published between 2001 and 2021. Available data were extracted and analysed to evaluate the impact of rurality on child road traffic death and to explore other risk factors of child road traffic deaths. We identified 13 studies on child road traffic death between 2001 and 2021. Eight studies reported the impact of rurality on child road traffic death, and all of them found that the mortality rate and injury rate of children were significantly higher on rural roads than on urban roads. The impact of rurality varied between studies, from 1.6 times to 15 times higher incidence of road traffic death in rural areas. Vehicle type, speeding cars, driver loss of control, alcohol and drug use, and road environment were identified as risk factors of child road traffic death. Conversely, ethnicity, seat belts, non-deployed airbags, child restraints, a strict driver licence system, camera laws and accessibility of trauma centres were considered protective factors. Other factors, including age, gender and teen passengers, appeared ambiguous for child road traffic death. Rurality is one of the most important risk factors of child road traffic death. Therefore, we should consider the impact that rurality has on child road death and resolve the gap between rural and urban areas in order to prevent child road traffic death effectively. The findings of this literature review will assist policy-makers to prevent child road traffic death by focusing on rural regions.
Publisher: Informa UK Limited
Date: 15-07-2021
Publisher: Springer Science and Business Media LLC
Date: 06-10-2019
DOI: 10.1007/S11136-018-2001-6
Abstract: The current review was conducted to identify all self-report questionnaires on functional health status (FHS) and/or health-related quality-of-life (HR-QoL) in adult populations with dysphonia (voice problems), and to evaluate the psychometric properties of the retrieved questionnaires. A systematic review was performed in the electronic literature databases PubMed and Embase. The psychometric properties of the questionnaires were determined using the COnsensus-based Standards for the selection of health Measurement INstruments (COSMIN) taxonomy and checklist. Responsiveness was outside the scope of this review and as no agreed 'gold standard' measures are available in the field of FHS and HR-QoL in dysphonia, criterion validity was not assessed. Only questionnaires developed and published in English were included. Forty-eight studies reported on the psychometric properties of 15 identified questionnaires. As many psychometric data were missing or resulted from biased study designs or statistical analyses, only preliminary conclusions can be drawn. Based on the current available psychometric evidence in the literature, the Voice Handicap Index seems to be the most promising questionnaire, followed by the Vocal Performance Questionnaire. More research is needed to complete missing data on psychometric properties of existing questionnaires in FHS and/or HR-QoL. Further, when developing new questionnaires, the use of item response theory is preferred above classical testing theory, as well as international consensus-based psychometric definitions and criteria to avoid bias in outcome data on measurement properties.
Publisher: Elsevier BV
Date: 06-2022
DOI: 10.1016/J.JSE.2022.01.133
Abstract: Preoperative skin preparations for total shoulder arthroplasty (TSA) are not standardized for Cutibacterium acnes eradication. Topical benzoyl peroxide (BPO) and benzoyl peroxide with clindamycin (BPO-C) have been shown to reduce the bacterial load of C acnes on the skin. Our aim was to investigate whether preoperative application of these topical antimicrobials reduced superficial colonization and deep tissue inoculation of C acnes in patients undergoing TSA. In a prospective, single-blinded randomized controlled trial, 101 patients undergoing primary TSA were randomized to receive either topical pHisoHex (hexachlorophene [1% triclosan sodium benzoate, 5 mg/mL and benzyl alcohol, 5 mg/mL]) (n = 35), 5% BPO (n = 33), or 5% BPO with 1% clindamycin (n = 33). Skin swabs obtained prior to topical application and after topical application before surgery, as well as 3 intraoperative swabs (dermis after incision, on joint capsule entry, and dermis at wound closure), were cultured. The primary outcome was positive culture findings and successful decolonization. All 3 topical preparations were effective in decreasing the rate of C acnes. The application of pHisoHex reduced skin colonization by 50%, BPO reduced skin colonization by 73.7%, and BPO-C reduced skin colonization by 81.5%. The topical preparation of BPO-C was more effective in decreasing the rate of C acnes at the preoperative and intraoperative swab time points compared with pHisoHex and BPO (P = .003). Failure to eradicate C acnes with topical preparations consistently resulted in deep tissue inoculation. There was an increase in the C acnes contamination rate on the skin during closure (33%) compared with skin cultures taken at surgery commencement (22%). Topical application of BPO and BPO-C preoperatively is more effective than pHisoHex in reducing colonization and contamination of the surgical field with C acnes in patients undergoing TSA.
Publisher: Hogrefe Publishing Group
Date: 16-11-2020
DOI: 10.1024/0300-9831/A000689
Abstract: This systematic review and meta-analysis examined the effects of selected root plants (curcumin, ginseng, ginger and garlic) on markers of muscle damage and muscular performance measures following muscle-damaging protocols. We included 25 studies (parallel and crossover design) with 353 participants and used the PEDro scale to appraise each study. Forest plots were generated to report on standardised mean differences (SMD) and p-values at 24 and 48 hours following the muscle-damaging protocols. The meta-analysis showed that the supplemental (SUPP) condition had significantly lower levels of indirect muscle damage markers (creatine kinase, lactate dehydrogenase and myoglobin) and muscle soreness at 24 hours and 48 hours (p < 0.01) than the placebo (PLA) condition. The inflammatory markers were significantly lower for the SUPP condition than the PLA condition at 24 hours (p = 0.02), although no differences were identified at 48 hours (p = 0.40). There were no significant differences in muscular performance measures between the SUPP and PLA conditions at 24 hours and 48 hours (p > 0.05) post-exercise. According to our qualitative data, a number of studies reported a reduction in oxidative stress (e.g., malondialdehyde, superoxide dismutase) with a concomitant upregulation of anti-oxidant status, although other studies showed no effects. Accordingly, selected root plants minimised the level of several biomarkers of muscle damage, inflammation and muscle soreness during periods of exercise-induced muscle damage. However, the benefits of these supplements in ameliorating oxidative stress, increasing anti-oxidant status and accelerating recovery of muscular performance appear equivocal, warranting further research on these outcome measures.
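The standardised mean differences summarised in forest plots like these follow the usual effect-size arithmetic: the between-condition mean difference divided by a pooled standard deviation. A minimal sketch, using hypothetical creatine kinase values rather than any study's actual data:

```python
from math import sqrt

def standardised_mean_difference(mean_a, sd_a, n_a, mean_b, sd_b, n_b):
    """Cohen's d with a pooled SD: (mean_a - mean_b) / SD_pooled."""
    pooled_sd = sqrt(((n_a - 1) * sd_a**2 + (n_b - 1) * sd_b**2)
                     / (n_a + n_b - 2))
    return (mean_a - mean_b) / pooled_sd

# Hypothetical creatine kinase values (U/L) at 24 h post-exercise;
# a negative SMD favours the supplemental (SUPP) condition.
d = standardised_mean_difference(mean_a=320.0, sd_a=160.0, n_a=12,
                                 mean_b=550.0, sd_b=240.0, n_b=12)
# d ≈ -1.13, a "large" effect by conventional thresholds
```

Meta-analyses often apply a small-sample correction (Hedges' g), which shrinks d slightly toward zero; the abstracts here report SMD without specifying which variant was used.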
Publisher: Elsevier BV
Date: 12-2012
Publisher: Informa UK Limited
Date: 11-01-2021
Publisher: Center for Open Science
Date: 20-05-2021
Abstract: Background: Both athletes and recreational exercisers often perform relatively high volumes of aerobic and strength training simultaneously. However, the compatibility of these two distinct training modes remains unclear. Objective: This systematic review assessed the compatibility of concurrent aerobic and strength training compared with strength training alone, in terms of adaptations in muscle function (maximal and explosive strength) and muscle mass. Subgroup analyses were conducted to examine the influence of training modality, training type, exercise order, training frequency, age, and training status. Design: A systematic literature search was conducted according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA). PROSPERO: CRD42020203777. Data sources: PubMed/MEDLINE, ISI Web of Science, Embase, CINAHL, SPORTDiscus and Scopus were systematically searched (12th of August 2020, updated on the 15th of March 2021). Eligibility criteria: Population: healthy adults of any sex and age; Intervention: supervised, concurrent aerobic and strength training of at least 4 weeks; Comparison: identical strength training prescription, with no aerobic training; Outcome: maximal strength, explosive strength, and muscle hypertrophy. Results: A total of 43 studies were included. The estimated standardised mean differences (SMD) based on the random-effects model were −0.06 (95% CI: −0.20, 0.09; p = 0.446), −0.28 (95% CI: −0.48, −0.08; p = 0.007) and −0.01 (95% CI: −0.16, 0.18; p = 0.919) for maximal strength, explosive strength, and muscle hypertrophy, respectively. Attenuation of explosive strength was more pronounced when concurrent training was performed within the same session (p = 0.043) compared to when sessions were separated by at least 3 h (p > .05). No significant effects were found for the other moderators, i.e. type of aerobic training (cycling vs. running), frequency of concurrent training (≥ 5 vs. < 5 weekly sessions), training status (untrained vs. active) and mean age (< 40 vs. ≥ 40 years). Summary/Conclusion: Concurrent aerobic and strength training does not compromise muscle hypertrophy and maximal strength development. However, explosive strength gains may be attenuated, especially when aerobic and strength training are performed in the same session. These results appeared to be independent of the type of aerobic training, frequency of concurrent training, training status, and age.
Publisher: Human Kinetics
Date: 11-2023
Abstract: Purpose: The purpose of the study was to examine whether various athletic performances, anthropometric measures, and playing experience differentiate selected and nonselected ultimate Frisbee players trialing to compete in the world championship. Methods: Forty-three Australian male ultimate Frisbee players (age = 21.2 [1.2] y; height = 1.7 [6.8] m; body mass = 69.7 [8.2] kg; playing experience = 3.5 [1.5] y) participated in a 30-m sprint test, single-leg run-up jump approach (both left [Jump LL] and right leg [Jump RL]), a stationary bilateral vertical jump (Jump BIL), and a change-of-direction speed test. Following a selection camp, players were subdivided according to their selection or nonselection into the team. Results: A multivariate analysis of variance revealed that height, 10-m sprint time, acceleration, Jump LL, Jump RL, and Jump BIL were significantly greater for selected players than nonselected players (P < .05). Area under the curve (AUC) was greatest for Jump RL (AUC = 79%; optimal cutoff value of 37.5 cm; sensitivity and specificity values of 77% and 71%, respectively), Jump LL (AUC = 74%; optimal cutoff value of 38.5 cm; sensitivity and specificity values of 77% and 77%, respectively), and Jump BIL (AUC = 78%; optimal cutoff value of 40.5 cm; sensitivity and specificity values of 71% and 79%, respectively). The largest AUC (AUC = 81%; 95% CI 0.66–0.97; P = .001) was found when combining the explanatory variables that demonstrated moderate to large effect sizes (ie, height, playing experience, 10-m sprint, acceleration, Jump LL, Jump RL, and Jump BIL), with sensitivity of 93% and specificity of 71%. Conclusion: These athletic performance and anthropometric characteristics differentiating selected and nonselected players may help inform targeted training and player-development strategies.
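Optimal cutoff values with paired sensitivity and specificity, as reported for the jump tests above, are commonly found by scanning candidate thresholds and maximising Youden's J (sensitivity + specificity − 1). The sketch below uses made-up jump heights, not the study's data, and the exhaustive-scan approach is an assumption about method:

```python
def youden_cutoff(positives, negatives):
    """Scan candidate thresholds and return (J, cutoff, sensitivity,
    specificity) for the cutoff maximising Youden's J; scores at or
    above the cutoff are classified as positive (i.e. selected)."""
    best = None
    for cut in sorted(set(positives) | set(negatives)):
        sens = sum(v >= cut for v in positives) / len(positives)
        spec = sum(v < cut for v in negatives) / len(negatives)
        j = sens + spec - 1
        if best is None or j > best[0]:
            best = (j, cut, sens, spec)
    return best

# Hypothetical right-leg jump heights (cm): selected vs. non-selected players
selected = [42, 39, 41, 38, 44, 40, 36]
nonselected = [35, 33, 37, 31, 36, 34]
j, cutoff, sens, spec = youden_cutoff(selected, nonselected)
```

With these hypothetical data the scan lands on a cutoff of 38 cm; in practice, AUC and thresholds would be derived from the full ROC curve rather than raw scores.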
Publisher: Springer Science and Business Media LLC
Date: 08-2018
DOI: 10.1007/S00590-018-2280-1
Abstract: Accelerated rehabilitation protocols for medial opening wedge high tibial osteotomy (MOW HTO) using intraosseous implants have not previously been described. The present study provides early clinical and radiological outcomes of MOW HTO using a polyetheretherketone (PEEK) intraosseous system, in combination with an early weight-bearing protocol. Twenty consecutive knees (17 patients) underwent navigated MOW HTO using a PEEK implant with accelerated rehabilitation. Time to union and maintenance of correction were assessed radiographically for 12 months post-operative. Patient outcomes were monitored for a mean follow-up of 38 months (range 23-42) using standardised instruments (WOMAC, IKDC and Lysholm scores). All knees were corrected to valgus. The mean time to unassisted weight-bearing was 55 days (SD 24, range 21-106). Bone union occurred in 95% of knees by 6 months, with correction maintained for 15 knees at 12 months post-operative. Knees for which correction was lost within 1 year of surgery had significantly greater preoperative varus alignment. Implant survivorship was 95% and 80% at 12 and 38 months post-operative, respectively. Significant improvements in patient-reported satisfaction, knee function and return to daily activities from preoperative to 38 months post-operative were reported (WOMAC 36 v 0; IKDC 35.6 v 96; Lysholm 44.5 v 100). Accelerated rehabilitation following MOW HTO with an intraosseous PEEK implant did not delay bone union, with significantly improved functional outcomes within 3 months post-operative. Early findings suggest that this approach may be suitable for a defined patient subset, with consideration for the extent of preoperative genu varum.
Publisher: Springer Science and Business Media LLC
Date: 09-05-2023
DOI: 10.1007/S40279-023-01839-8
Abstract: Several studies have utilised isometric, eccentric and downhill walking pre-conditioning as a strategy for alleviating the signs and symptoms of exercise-induced muscle damage (EIMD) following a bout of damaging physical activity. This systematic review and meta-analysis examined the effects of pre-conditioning strategies on indices of muscle damage and physical performance measures following a second bout of strenuous physical activity. PubMed, CINAHL and Scopus. Studies meeting the PICO (population, intervention/exposure, comparison, and outcome) criteria were included in this review: (1) general population or “untrained” participants with no contraindications affecting physical performance; (2) studies with a parallel design to examine the prevention and severity of muscle-damaging contractions; (3) outcome measures were compared using baseline and post-intervention measures; and (4) outcome measures included any markers of indirect muscle damage and muscular contractility measures. Individuals with no resistance training experience in the previous 6 or more months. A single bout of pre-conditioning exercises consisting of eccentric or isometric contractions performed a minimum of 24 h prior to a bout of damaging physical activity was compared to control interventions that did not perform pre-conditioning prior to damaging physical activity. Kmet appraisal system. Quantitative analysis was conducted using forest plots to examine standardised mean differences (SMD, i.e. effect size), test statistics for statistical significance (i.e. Z-values) and between-study heterogeneity by inspecting I². Following abstract and full-text screening, 23 articles were included in this paper.
Based on the meta-analysis, the pre-conditioning group exhibited lower levels of creatine kinase at 24 h (SMD = −1.64; Z = 8.39; p < 0.00001), 48 h (SMD = −2.65; Z = 7.78; p < 0.00001), 72 h (SMD = −2.39; Z = 5.71; p < 0.00001) and 96 h post-exercise (SMD = −3.52; Z = 7.39; p < 0.00001) than the control group. Delayed-onset muscle soreness was also lower for the pre-conditioning group at 24 h (SMD = −1.89; Z = 6.17; p < 0.00001), 48 h (SMD = −2.50; Z = 7.99; p < 0.00001), 72 h (SMD = −2.73; Z = 7.86; p < 0.00001) and 96 h post-exercise (SMD = −3.30; Z = 8.47; p < 0.00001). Maximal voluntary contraction force was maintained and returned to normal sooner in the pre-conditioning group than in the control group at 24 h (SMD = 1.46; Z = 5.49; p < 0.00001), 48 h (SMD = 1.59; Z = 6.04; p < 0.00001), 72 h (SMD = 2.02; Z = 6.09; p < 0.00001) and 96 h post-exercise (SMD = 2.16; Z = 5.69; p < 0.00001). Range of motion was better maintained by the pre-conditioning group compared with the control group at 24 h (SMD = 1.48; Z = 4.30; p < 0.00001), 48 h (SMD = 2.20; Z = 5.64; p < 0.00001), 72 h (SMD = 2.66; Z = 5.42; p < 0.00001) and 96 h post-exercise (SMD = 2.5; Z = 5.46; p < 0.00001). Based on qualitative analyses, pre-conditioning activities were more effective when performed 2–4 days before the muscle-damaging protocol compared with immediately prior, or 1–3 weeks prior, to the muscle-damaging protocol. Furthermore, pre-conditioning activities performed using eccentric contractions over isometric contractions, with higher volumes, greater intensity and more lengthened muscle contractions, provided greater protection from EIMD. Several outcome measures showed high inter-study heterogeneity. The inability to account for differences in durations between pre-conditioning and the second bout of damaging physical activity was also limiting.
Pre-conditioning significantly reduced the severity of creatine kinase release, delayed-onset muscle soreness, loss of maximal voluntary contraction force and the decline in range of motion. Pre-conditioning may prevent severe EIMD and accelerate recovery of muscle force generation capacity.
Publisher: Springer Science and Business Media LLC
Date: 31-01-2020
Publisher: Springer Science and Business Media LLC
Date: 27-07-2017
Publisher: Georg Thieme Verlag KG
Date: 29-02-2012
Abstract: Whilst various studies have examined lower extremity joint kinematics during running, there is limited investigation on joint kinematics at steady-state running and at intensities close to exhaustion. Subsequently, the purpose of this study was to determine whether the reliability of kinematics in the lower extremity and thorax is affected by varying the running speeds during a running economy test. 14 trained and moderately trained runners undertook 2 running economy tests with each test incorporating 3 intensity stages: 70-, 90- and 110% of the second ventilatory threshold, respectively. The participants ran for 10 min during each of the first 2 stages and to exhaustion during the last stage. Kinematics of the ankle, knee, hip, pelvis and thorax were recorded using a 3-dimensional motion analysis system. Intra-class correlation coefficient (ICC), limits of agreement (LOA) and coefficient of variation (CV) were used to calculate reliability. The ICC, LOA and CV of the lower extremity and thoracic kinematic variables ranged from 0.33-0.97, 1.03-1.39 and 2.0-18.6, respectively. Whilst the reliability did vary between the kinematic variables, the majority of results showed minimal within-subject variation and moderate to high reliability. In conclusion, examining thoracic and lower extremity kinematics is useful in determining whether running kinematics is altered with varying running intensities.
Publisher: Public Library of Science (PLoS)
Date: 25-01-2016
Publisher: PeerJ
Date: 28-03-2016
DOI: 10.7717/PEERJ.1841
Abstract: This study examined the effects of cold-water immersion (CWI) and cold air therapy (CAT) on maximal cycling performance (i.e. anaerobic power) and markers of muscle damage following a strength training session. Twenty endurance-trained but strength-untrained male (n = 10) and female (n = 10) participants were randomised into either: CWI (15 min in 14 °C water to iliac crest) or CAT (15 min in 14 °C air) immediately following strength training (i.e. 3 sets of leg press, leg extensions and leg curls at 6 repetition maximum, respectively). Creatine kinase, muscle soreness and fatigue, isometric knee extensor and flexor torque and cycling anaerobic power were measured prior to, immediately after and at 24 (T24), 48 (T48) and 72 (T72) h post-strength exercises. No significant differences were found between treatments for any of the measured variables (p > 0.05). However, trends suggested recovery was greater in CWI than CAT for cycling anaerobic power at T24 (10% ± 2%, ES = 0.90), T48 (8% ± 2%, ES = 0.64) and T72 (8% ± 7%, ES = 0.76). The findings suggest the combination of hydrostatic pressure and cold temperature may be more favourable for recovery from strength training than cold temperature alone.
Publisher: Informa UK Limited
Date: 20-04-2022
Publisher: Springer Science and Business Media LLC
Date: 07-03-2019
DOI: 10.1007/S40279-019-01072-2
Abstract: Whilst the "acute hypothesis" was originally coined to describe the detrimental effects of concurrent training on strength development, similar physiological processes may occur when endurance training adaptations are compromised. There is a growing body of research indicating that typical resistance exercises impair neuromuscular function and endurance performance during periods of resistance training-induced muscle damage. Furthermore, recent evidence suggests that the attenuating effects of resistance training-induced muscle damage on endurance performance are influenced by exercise intensity, exercise mode, exercise sequence, recovery and contraction velocity of resistance training. By understanding the influence that training variables have on the level of resistance training-induced muscle damage and its subsequent attenuating effects on endurance performance, concurrent training programs could be prescribed in such a way that minimises fatigue between modes of training and optimises the quality of endurance training sessions. Therefore, this review will provide considerations for concurrent training prescription for endurance development based on scientific evidence. Furthermore, recommendations will be provided for future research by identifying training variables that may impact on endurance development as a result of concurrent training.
Publisher: MDPI AG
Date: 04-09-2020
Abstract: This study examined the acute effects of resistance training (RT) on volleyball-specific performance. Sixteen female volleyball players undertook their initial, pre-season RT bout. Countermovement jump (CMJ), delayed onset of muscle soreness (DOMS), and sport-specific performances (i.e., run-up jump, agility, and spiking speed and accuracy) were measured before, 24 (T24), and 48 (T48) hours after RT. A significant increase in DOMS was observed at T24 and T48 (~207.6% ± 119.3%, p < 0.05, ES = 1.8 (95% CI: 0.94–2.57)), whilst agility was significantly impaired at T48 (1.7% ± 2.5%, p < 0.05, ES = 0.30 (95% CI: −0.99–0.40)). However, there were no differences in CMJ (~−2.21% ± 7.6%, p > 0.05, ES = −0.11 (95% CI: −0.80–0.58)) or run-up jump (~−1.4% ± 4.7%, p > 0.05, ES = −0.07 (95% CI: −0.76–0.63)). Spiking speed was significantly reduced (−3.5% ± 4.4%, p < 0.05, ES = −0.28 (95% CI: −0.43–0.97)), although accuracy improved (38.3% ± 81.4%, p < 0.05) at T48. Thus, the initial, pre-season RT bout compromised agility and spiking speed for several days post-exercise. Conversely, spiking accuracy improved, suggesting a speed–accuracy trade-off. Nonetheless, at least a 48-h recovery may be necessary after the initial RT bout for athletes returning from the off-season or injury.
No related grants have been discovered for Kenji Doma.