ORCID Profile
0000-0003-1307-2997
Current Organisations
University of Western Australia
University of Tasmania
Western Australian Department of Primary Industries and Regional Development
Publisher: CSIRO Publishing
Date: 2015
DOI: 10.1071/CP14221
Abstract: A survey was conducted of commercial broadacre paddocks in the south-west cropping zone of Western Australia from 2010 to 2013. In total, 687 paddock years of data were sampled from 184 paddocks. The land use of each paddock was recorded together with measurements of weed density, the incidence of soilborne pathogen DNA, and soil inorganic nitrogen (nitrate and ammonium). The dynamics of these biophysical variables were related to the crop and pasture sequences employed. Wheat was the most frequent land use (60% of paddock years), followed by canola and pasture (12% each), and lupins and barley (6% each). Four crop species, wheat, canola, barley and lupins, accounted for 84% of land use. By region, wheat, canola, barley and lupin accounted for 90% of land use in the Northern Agricultural Region (NAR), 83% in the Central Agricultural Region (CAR) and 78% in the Southern Agricultural Region (SAR). Conversely, pasture usage in the SAR was 21%, compared with 12% in the CAR and 7% in the NAR. Over the surveyed paddocks, weed density, soilborne pathogens and soil N were maintained at levels suitable for wheat production. The inclusion of land uses other than wheat at the frequency reported maintained the condition of these biophysical variables.
Publisher: CSIRO Publishing
Date: 22-06-2022
DOI: 10.1071/CP21778
Abstract: Context Rotations in rainfed farming systems of southwest Australia have shifted towards intensified cropping and it is necessary to reassess soilborne pathogens and plant parasitic nematodes within this context. Aims We tested the hypothesis that these recent changes in rotations and agronomy have altered the efficacy with which rotations reduce the incidence of common root pathogens and plant parasitic nematodes. Methods We tracked changes in common pathogen DNA in soil and the incidence and severity of crop root damage in 184 paddocks, over 6 years from 2010 to 2015, and related this to farmer practices. Key results Overall, severe root damage was rare, with 72% of plant samples showing no damage or only a trace and only 1% severely damaged. We found that the reduction of paddocks in pasture and resultant very low weed populations, combined with early sowing, reduced persistence of pathogens and nematode pests. But some aspects of crop management had the opposite effect: high rates of herbicide, increased frequency of cereals and canola at the expense of lupin and increased N fertiliser use. Conclusions Current agronomic practices and the frequency of non-host crops in rotations appear to be effective in controlling common root pathogens and plant parasitic nematodes. But the aspects of agronomic management that increased populations of pathogens should be applied cautiously. Implications Studies such as this that link multiple productivity constraints, such as pathogens and nematode pests, weeds and nutrients, to management practices are important to understand the sustainability of current or proposed production methods.
Publisher: CSIRO Publishing
Date: 02-05-2022
DOI: 10.1071/CP21745
Abstract: Rotations and associated management practices in rainfed farming systems of southwest Australia have shifted towards intensified cropping. Survey data from 184 fields spanning 14 Mha of southwest Australia were used to document water use efficiency (WUE) and water-limited yield potential (WLYP) of commercial crops and crop sequences and identify biophysical variables influencing WUE. WUE achieved in commercial wheat crops was 10.7 kg grain/ha.mm. Using a boundary function Ywl = 25 × (WU − 45), farmers achieved 54% of WLYP. Climate variables affected WUE more than management and biotic variates, the highest latitude region having WUE of 9.0 kg grain/ha.mm, compared to 11.8 kg grain/ha.mm for regions further south. Increased soil nitrogen and nitrogen fertiliser increased WUE, as did sowing earlier in keeping with farmers in southern Australia sowing crops earlier and trebling fertiliser nitrogen usage since 1990. Wheat yield and WUE increased a small amount after break crop or pasture (12.5 kg grain/ha.mm) compared to wheat grown after wheat (11.2 kg grain/ha.mm), due to good weed and root pathogen control, and high fertiliser nitrogen application. However, WUE of wheat declined to 8.4 kg grain/ha.mm when more than three wheat crops were grown in succession. Farmers continue to improve WUE with increased inputs and new technologies replacing some traditional functions of break crops and pasture. However, break crops and pastures are still required within the rotation to maintain WUE and break effects need to be measured over several years.
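The abstract's boundary function Ywl = 25 × (WU − 45) and its WUE figures are straightforward to apply. A minimal Python sketch, where the 250 mm water-use figure is an illustrative assumption rather than survey data:

```python
# Sketch of the water-limited yield boundary function reported for
# southwest Australian wheat: Ywl = 25 * (WU - 45), where WU is
# growing-season water use (mm) and 45 mm approximates the water lost
# before any grain is produced. Example inputs are illustrative.

def water_limited_yield(water_use_mm: float) -> float:
    """Water-limited yield potential (kg grain/ha) from the boundary function."""
    return max(0.0, 25.0 * (water_use_mm - 45.0))

def wue(grain_yield_kg_ha: float, water_use_mm: float) -> float:
    """Water use efficiency in kg grain/ha.mm."""
    return grain_yield_kg_ha / water_use_mm

# Example: a crop using 250 mm of water.
wu = 250.0
ywl = water_limited_yield(wu)   # 25 * 205 = 5125 kg/ha potential
actual = 0.54 * ywl             # survey average: farmers achieved 54% of WLYP
print(round(ywl), round(actual), round(wue(actual, wu), 1))
```

At 54% of the boundary, the implied WUE of roughly 11 kg grain/ha.mm is consistent with the commercial figures quoted in the abstract.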
Publisher: CSIRO Publishing
Date: 25-03-2021
DOI: 10.1071/CP20403
Abstract: Balancing nutrient inputs and exports is essential to maintaining soil fertility in rainfed crop and pasture farming systems. Soil nutrient balances of land used for crop and pasture production in the south-west of Western Australia were assessed through survey data comprising biophysical measurements and farm management records (2010–15) across 184 fields spanning 14 Mha. Key findings were that nitrogen (N) inputs via fertiliser or biological N2 fixation in 60% of fields, and potassium (K) inputs in 90% of fields, were inadequate to balance exports despite increases in fertiliser usage and adjustments to fertiliser inputs based on rotations. Phosphorus (P) and sulfur (S) balances were positive in most fields, with only 5% returning losses kg P or 7 kg S/ha. Within each of the three agroecological zones of the survey, fields that had two legume crops (or pastures) in 5 years (i.e. 40% legumes) maintained a positive N balance. At the mean legume inclusion rate observed of 20% a positive partial N budget was still observed for the Northern Agricultural Region (NAR) of 2.8 kg N/ha.year, whereas balances were negative within the Central Agricultural Region (CAR) by 7.0 kg N/ha.year, and the Southern Agricultural Region (SAR) by 15.5 kg N/ha.year. Hence, N budgets in the CAR and SAR were negative by the amount of N removed in ~0.5 t wheat grain, and continuation of current practices in CAR and SAR fields will lead to declining soil fertility. Maintenance of N in the NAR was achieved by using amounts of fertiliser N similar to other regions while harvesting less grain. The ratio of fertiliser N to legume-fixed N added to the soil in the NAR was twice that of the other regions. Across all regions, the ratio of fertiliser N to legume-fixed N added to the soil averaged ~4.0:1, a major change from earlier estimates in this region of 1:20 under ley farming systems. 
The low contribution of legume N was due to the decline in legume inclusion rate (now 20%), the low legume content in pastures, particularly in the NAR, and improved harvest index of lupin (Lupinus angustifolius), the most frequently grown grain legume species. Further quantifications of the effects of changing farming systems on nutrient balances are required to assess the balances more accurately, thereby ensuring that soil fertility is maintained, especially because systems have altered towards more intensive cropping with reduced legume production.
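The partial nitrogen budget discussed above is simple arithmetic: inputs from fertiliser and legume fixation minus exports in harvested grain. A minimal sketch with illustrative numbers (the grain N concentration and example inputs are assumptions, not values from the survey):

```python
# Minimal partial-N-budget sketch: inputs (fertiliser N + legume-fixed N)
# minus exports (N removed in harvested grain). All figures below are
# illustrative assumptions, not data from the paper.

GRAIN_N_PCT = 0.021  # assumed wheat grain N concentration (~2.1%)

def partial_n_budget(fert_n_kg_ha, fixed_n_kg_ha, grain_yield_kg_ha):
    """Partial N balance in kg N/ha (positive = N accumulating in soil)."""
    exported = grain_yield_kg_ha * GRAIN_N_PCT
    return fert_n_kg_ha + fixed_n_kg_ha - exported

# A 2.5 t/ha wheat crop removes ~52.5 kg N/ha at this concentration;
# 40 kg fertiliser N plus 5 kg fixed N leaves the budget in deficit.
print(round(partial_n_budget(40, 5, 2500), 1))
```

A deficit of this size, repeated yearly, is the mechanism behind the abstract's conclusion that continuing current practice in the CAR and SAR will run down soil fertility.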
Publisher: CSIRO Publishing
Date: 2016
DOI: 10.1071/CP15224
Abstract: Canola (Brassica napus L.) is widely grown throughout all rainfall zones in south-western Australia. Yields are low by world standards, and variable in low-rainfall ( mm annual rainfall) and medium-rainfall (350–450 mm) zones, so that minimising production costs is a major consideration for growers in these areas. One of the major input costs is nitrogen (N) fertiliser. Fifteen N rate × application time × canola plant-type experiments were conducted in the low- and medium-rainfall zones between 2012 and 2014. In most experiments, five rates of N were tested, of ranges 0–75, 0–100, or 0–150 kg N/ha. Nitrogen was applied at four different times (seeding, or 4, 8 or 12 weeks after sowing) or split between these timings. Each experiment compared triazine-tolerant (TT), open-pollinated (OP) canola with Roundup Ready (RR) hybrid canola, and one experiment included TT hybrid and RR OP canola types. On average, RR hybrid produced 250 kg/ha, or 23% more seed and 2.2% more oil than TT OP canola, and the average gross margin of RR hybrid was AU$65/ha more than TT OP. However, seed yield and gross margin differences between RR hybrid and TT OP canola were reduced when seed yields were kg/ha. Canola growth (dry matter) and seed yield responded positively to N fertiliser in most experiments, with 90% of maximum seed yield achieved at an average of 46 kg N/ha (s.e. 6). However, 90% of maximum gross margin was achieved at a lower average N rate of 17 kg N/ha, due primarily to the relatively small yield increase compared with the reduction in concentration of oil in the seed with N applied. Because canola growers of south-western Australia are now paid an uncapped premium for canola grain with oil concentration %, decreases in oil percentage have a significant financial effect, and recommended rates of N should be lower than those calculated to optimise seed yield. 
In 80% of cases, the first 10 kg N/ha applied provided a return on investment in N of $1.50 for every $1 invested. The next 20 kg N/ha applied provided a return on investment of $1.25 for every $1 invested 80% of the time, and further increases would most likely break even. The timing of N application had a minor effect on yield, oil and financial returns, but delaying N application would allow farmers to reduce risk under poor conditions by reducing or eliminating further inputs. Overall, our work demonstrates that a conservative approach to N supply, mindful of the combined impacts of N on yield and oil, is necessary in south-western Australia and that split and delayed applications are a viable risk-management strategy.
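The return-on-investment figures above come down to the value of extra seed against the cost of each N increment. A sketch of that marginal calculation, with assumed prices and an assumed yield response (none of these numbers are from the paper):

```python
# Illustrative marginal return-on-investment for an N fertiliser increment:
# value of the extra canola seed divided by the cost of the extra N.
# Prices and the yield response are assumptions, not the paper's data.

N_COST_PER_KG = 1.2   # assumed $/kg N
SEED_PRICE = 0.55     # assumed $/kg canola seed

def marginal_roi(extra_yield_kg_ha, n_increment_kg_ha):
    """Dollars returned per dollar spent on an N increment."""
    revenue = extra_yield_kg_ha * SEED_PRICE
    cost = n_increment_kg_ha * N_COST_PER_KG
    return revenue / cost

# E.g. if the first 10 kg N/ha adds ~33 kg/ha of seed, each $1 of N
# returns about $1.50; later increments add less seed, so ROI falls.
print(round(marginal_roi(33, 10), 2))
```

Because the response curve flattens (and oil concentration falls) with added N, each successive increment returns less per dollar, which is why the economically optimal N rate sits well below the yield-maximising rate.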
Publisher: CSIRO Publishing
Date: 2020
DOI: 10.1071/CP19509
Abstract: Six years of survey data taken from 184 paddocks spanning 14 million ha of land used for crop and pasture production in south-west Western Australia were used to assess weed populations, herbicide resistance, integrated weed management (IWM) actions and herbicide use patterns in a dryland agricultural system. Key findings were that weed density within crops was low, with 72% of cropping paddocks containing fewer than 10 grass weeds/m2 at anthesis. Weed density and herbicide resistance were not correlated, despite the most abundant grass weed species (annual ryegrass, Lolium rigidum Gaudin) testing positive for resistance to at least one herbicide chemistry in 92% of monitored paddocks. A wide range of herbicides were used (369 unique combinations), suggesting that the diversity of herbicide modes of action may be beneficial for reducing further development of herbicide resistance. However, there was a heavy reliance on glyphosate, the most commonly applied active ingredient. Of concern, in respect to the evolution of glyphosate-resistant weeds, was that 45% of glyphosate applications to canola were applied as a single active ingredient and area sown to canola in Western Australia expanded from 0.4 to 1.4 million hectares from 2005 to 2015. In order to minimise the weed seed bank within crops, pastures were used infrequently in some regions, and in 50% of cases pastures were actively managed to reduce weed seed set by applying a non-selective herbicide in spring. The use of non-selective herbicides in this manner also kills pasture plants; consequently, self-regenerating pastures were sparse and contained few legumes where cropping intensity was high. Overall, the study indicated that land use selection and utilisation of associated weed management actions were being used successfully to control weeds within the survey area.
However, to successfully manage herbicide-resistant weeds, land use has become less diverse, with pastures utilised less and crops with efficacious weed control options utilised more. Further consideration needs to be given to the impacts of these changes in land use on other production factors, such as soil nutrient status and plant pathogens, to assess the sustainability of these weed management practices in a wider context.
Publisher: Scientific Societies
Date: 08-2015
DOI: 10.1094/PHYTO-07-14-0203-R
Abstract: Root diseases have long been prevalent in Australian grain-growing regions, and most management decisions to reduce the risk of yield loss need to be implemented before the crop is sown. The levels of pathogens that cause the major root diseases can be measured using DNA-based services such as PreDicta B. Although these pathogens are often studied individually, in the field they often occur as mixed populations and their combined effect on crop production is likely to vary across diverse cropping environments. A 3-year survey was conducted covering most cropping regions in Western Australia, utilizing PreDicta B to determine soilborne pathogen levels and visual assessments to score root health and incidence of individual crop root diseases caused by the major root pathogens, including Rhizoctonia solani (anastomosis group [AG]-8), Gaeumannomyces graminis var. tritici (take-all), Fusarium pseudograminearum, and Pratylenchus spp. (root-lesion nematodes) on wheat roots for 115, 50, and 94 fields during 2010, 2011, and 2012, respectively. A predictive model was developed for root health utilizing autumn and summer rainfall and soil temperature parameters. The model showed that pathogen DNA explained 16, 5, and 2% of the variation in root health whereas environmental parameters explained 22, 11, and 1% of the variation in 2010, 2011, and 2012, respectively. Results showed that R. solani AG-8 soil pathogen DNA, environmental soil temperature, and rainfall parameters explained most of the variation in the root health. This research shows that interactions between environment and pathogen levels before seeding can be utilized in predictive models to improve assessment of risk from root diseases to assist growers to plan more profitable cropping programs.
Location: Australia
No related grants have been discovered for Martin Harries.