ORCID Profile
0000-0001-5687-2535
Current Organisations
University Hospitals Birmingham NHS Foundation Trust, Delft University of Technology
Publisher: Elsevier BV
Date: 08-2014
Publisher: Copernicus GmbH
Date: 11-07-2017
DOI: 10.5194/HESS-21-3427-2017
Abstract: Abstract. The diversity in hydrologic models has historically led to great controversy on the correct approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper, we revisit key modeling challenges on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, provide examples of modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
Publisher: Copernicus GmbH
Date: 05-09-2017
DOI: 10.5194/AMT-2017-287
Abstract: Abstract. In the last decade there has been a growing interest from the hydrometeorological community regarding rainfall estimation from commercial microwave link (CML) networks. Path-averaged rainfall intensities can be retrieved from the signal attenuation between cell phone towers. Although this technique is still in development, it offers great opportunities for the retrieval of rainfall rates at high spatiotemporal resolutions very close to the Earth's surface. Rainfall measurements at high spatiotemporal resolutions are highly valued in urban hydrology, for instance, given the large impact that flash floods exert on society. Flash floods are triggered by intense rainfall events that develop over short time scales. Here, we present one of the first evaluations of this measurement technique for a subtropical climate. Rainfall estimation for subtropical climates is highly relevant, since many countries with few surface rainfall observations are located in such areas. The test bed of the current study is the Brazilian city of São Paulo. The open-source algorithm RAINLINK was applied to retrieve rainfall intensities from (power) attenuation measurements. The performance of RAINLINK estimates was evaluated for 5 of the 250 CMLs in the São Paulo metropolitan area for which we received data, for 81 days between October 2014 and January 2015. We evaluated the retrieved rainfall intensities and accumulations from CMLs against those from a dense automatic gauge network. Results were found to be promising and encouraging, especially for short links, for which high correlations (~ 0.9) and low biases (~ 30 % and lower) were obtained.
Publisher: Copernicus GmbH
Date: 27-04-2021
DOI: 10.5194/HESS-25-2261-2021
Abstract: Abstract. The biophysical processes occurring in the unsaturated zone have a direct impact on the water table dynamics. Representing these processes through the application of unsaturated zone models of different complexity has an impact on the estimates of the volumes of water flowing between the unsaturated zone and the aquifer. These fluxes, known as net recharge, are often used as the shared variable that couples unsaturated to groundwater models. However, as recharge estimates are always affected by a degree of uncertainty, model–data fusion methods, such as data assimilation, can be used to inform these coupled models and reduce uncertainty. This study assesses the effect of unsaturated zone model complexity (conceptual versus physically based) on updating groundwater model outputs through the assimilation of actual evapotranspiration rates, for a water-limited site in South Australia. Actual evapotranspiration rates are assimilated because they have been shown to be related to the water table dynamics and thus form the link between remote sensing data and the deeper parts of the soil profile. Results have been quantified using standard metrics, such as the root mean square error and Pearson correlation coefficient, and reinforced by calculating the continuous ranked probability score, which is specifically designed to determine a more representative error in stochastic models. It has been found that, once properly calibrated to reproduce the actual evapotranspiration–water table dynamics, a simple conceptual model may be sufficient for this purpose; thus, using one configuration over the other should be motivated by the specific purpose of the simulation and the information available.
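The assimilation step described in the abstract above can be illustrated, in its simplest scalar form, by a Kalman-style update that weights a modelled state against an observation-derived estimate by their error variances. This is only a conceptual sketch of the data-assimilation principle, not the paper's actual coupled-model or ensemble setup; all numbers are invented for illustration.

```python
def kalman_update(x_model, var_model, y_obs, var_obs):
    """Scalar Kalman / optimal-interpolation update: combine a model
    state and an observation, weighted by their error variances."""
    k = var_model / (var_model + var_obs)   # Kalman gain
    x_analysis = x_model + k * (y_obs - x_model)
    var_analysis = (1.0 - k) * var_model    # variance always shrinks
    return x_analysis, var_analysis

# Model predicts a water table depth of 5.0 m (variance 1.0); an
# observation-derived estimate says 4.0 m (variance 1.0). With equal
# variances the analysis splits the difference and halves the variance.
print(kalman_update(5.0, 1.0, 4.0, 1.0))
```

With unequal variances the analysis moves towards whichever source is more certain, which is the essence of using remote-sensing evapotranspiration to constrain a groundwater model.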
Publisher: Copernicus GmbH
Date: 20-07-2017
DOI: 10.5194/HESS-21-3701-2017
Abstract: Abstract. In this synthesis paper addressing hydrologic scaling and similarity, we posit that roadblocks in the search for universal laws of hydrology are hindered by our focus on computational simulation (the third paradigm) and assert that it is time for hydrology to embrace a fourth paradigm of data-intensive science. Advances in information-based hydrologic science, coupled with an explosion of hydrologic data and advances in parameter estimation and modeling, have laid the foundation for a data-driven framework for scrutinizing hydrological scaling and similarity hypotheses. We summarize important scaling and similarity concepts (hypotheses) that require testing, describe a mutual information framework for testing these hypotheses, describe boundary condition, state, flux, and parameter data requirements across scales to support testing these hypotheses, and discuss some challenges to overcome while pursuing the fourth hydrological paradigm. We call upon the hydrologic sciences community to develop a focused effort towards adopting the fourth paradigm and apply this to outstanding challenges in scaling and similarity.
Publisher: Elsevier BV
Date: 03-2021
Publisher: Copernicus GmbH
Date: 08-09-2016
DOI: 10.5194/HESS-20-3631-2016
Abstract: Abstract. In the current human-modified world, or Anthropocene, the state of water stores and fluxes has become dependent on human as well as natural processes. Water deficits (or droughts) are the result of a complex interaction between meteorological anomalies, land surface processes, and human inflows, outflows, and storage changes. Our current inability to adequately analyse and manage drought in many places points to gaps in our understanding and to inadequate data and tools. The Anthropocene requires a new framework for drought definitions and research. Drought definitions need to be revisited to explicitly include human processes driving and modifying soil moisture drought and hydrological drought development. We give recommendations for robust drought definitions to clarify timescales of drought and prevent confusion with related terms such as water scarcity and overexploitation. Additionally, our understanding and analysis of drought need to move from single driver to multiple drivers and from uni-directional to multi-directional. We identify research gaps and propose analysis approaches on (1) drivers, (2) modifiers, (3) impacts, (4) feedbacks, and (5) changing the baseline of drought in the Anthropocene. The most pressing research questions are related to the attribution of drought to its causes, to linking drought impacts to drought characteristics, and to societal adaptation and responses to drought. Example questions include (i) What are the dominant drivers of drought in different parts of the world? (ii) How do human modifications of drought enhance or alleviate drought severity? (iii) How do impacts of drought depend on the physical characteristics of drought vs. the vulnerability of people or the environment? (iv) To what extent are physical and human drought processes coupled, and can feedback loops be identified and altered to lessen or mitigate drought?
(v) How should we adapt our drought analysis to accommodate changes in the normal situation (i.e. what are considered normal or reference conditions) over time? Answering these questions requires exploration of qualitative and quantitative data as well as mixed modelling approaches. The challenges related to drought research and management in the Anthropocene are not unique to drought, but do require urgent attention. We give recommendations drawn from the fields of flood research, ecology, water management, and water resources studies. The framework presented here provides a holistic view on drought in the Anthropocene, which will help improve management strategies for mitigating the severity and reducing the impacts of droughts in future.
Publisher: Copernicus GmbH
Date: 09-02-2017
DOI: 10.5194/HESS-2017-54
Abstract: Abstract. In just the past five years, the field of Earth observation has evolved from the relatively staid approaches of government space agencies into a plethora of sensing opportunities afforded by CubeSats, Unmanned Aerial Vehicles (UAVs), and smartphone technologies that have been embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically on the order of one billion dollars per satellite and with concept-to-launch timelines on the order of two decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturise sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing 3–5 m resolution sensing of the Earth on a daily basis. Start-up companies that did not exist five years ago now operate more satellites in orbit than any space agency and at costs that are a mere fraction of an agency mission. With these advances come new space-borne measurements, such as high-definition video for understanding real-time cloud formation, storm development, flood propagation, precipitation tracking, or for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths, floods, and estimated evaporation at sub-meter resolution, pushing back on spatiotemporal constraints and delivering new process insights.
At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen science to record photos of environmental conditions, estimate daily average temperatures from battery state, and enable the measurement of other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the Internet of Things as a new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is not clear. Nor is it clear whether this deluge of data will be usefully exploited, either because the measurements are superfluous, inconsistent, not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink on how we utilise and exploit these new observation platforms to enhance our understanding of the Earth system.
Publisher: American Geophysical Union (AGU)
Date: 09-2013
DOI: 10.1002/WRCR.20407
Publisher: Springer Science and Business Media LLC
Date: 12-09-2017
Publisher: Copernicus GmbH
Date: 26-07-2018
Abstract: Abstract. In the last decade there has been a growing interest from the hydrometeorological community regarding rainfall estimation from commercial microwave link (CML) networks. Path-averaged rainfall intensities can be retrieved from the signal attenuation between cell phone towers. Although this technique is still in development, it offers great opportunities for the retrieval of rainfall rates at high spatiotemporal resolutions very close to the Earth's surface. Rainfall measurements at high spatiotemporal resolutions are highly valued in urban hydrology, for instance, given the large impact that flash floods exert on society. Flash floods are triggered by intense rainfall events that develop over short time scales. Here, we present one of the first evaluations of this measurement technique for a humid subtropical climate. Rainfall estimation for subtropical climates is highly relevant, as many countries with few surface rainfall observations are located in such areas. The test bed of the current study is the Brazilian city of São Paulo. The open-source algorithm RAINLINK was applied to retrieve rainfall intensities from (power) attenuation measurements. The performance of RAINLINK estimates was evaluated for 145 of the 213 CMLs in the São Paulo metropolitan area for which we received data, for 81 days between October 2014 and January 2015. We evaluated the retrieved rainfall intensities and accumulations from CMLs against those from a dense automatic gauge network. Results were found to be promising and encouraging when it comes to capturing the city-average rainfall dynamics. Mixed results were obtained for individual CML estimates, which may be related to erroneous metadata.
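The retrieval above rests on the near power-law relation between specific attenuation k (dB km-1) and rainfall rate R (mm h-1), k = aR^b, which RAINLINK (an R package) inverts after corrections such as removing the dry-weather baseline and wet-antenna attenuation. A minimal Python sketch of just the inversion step; the coefficients here are illustrative assumptions, since in reality a and b depend on link frequency and polarization (ITU-R tabulates them):

```python
def rain_rate_from_attenuation(rain_loss_db, link_length_km, a=0.33, b=0.74):
    """Invert the power law k = a * R**b to estimate the path-averaged
    rainfall rate R (mm/h) from rain-induced path attenuation (dB).

    a and b are illustrative placeholders, not values from the paper;
    real retrievals look them up per link frequency and polarization.
    """
    if link_length_km <= 0:
        raise ValueError("link length must be positive")
    # Specific attenuation in dB/km; negative (noise) losses clipped to 0.
    k = max(rain_loss_db, 0.0) / link_length_km
    return (k / a) ** (1.0 / b)

# A 5 km link with 3 dB of rain-induced loss, under these assumed
# coefficients:
print(round(rain_rate_from_attenuation(3.0, 5.0), 2))
```

The baseline-separation and wet-antenna steps that RAINLINK performs before this inversion are omitted here; they are where much of the retrieval uncertainty discussed in the abstract arises.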
Publisher: American Geophysical Union (AGU)
Date: 12-2018
DOI: 10.1029/2018WR023393
Publisher: Copernicus GmbH
Date: 09-02-2017
Publisher: Springer Science and Business Media LLC
Date: 03-2021
DOI: 10.1530/ERP-21-0001
Publisher: American Diabetes Association
Date: 29-09-2020
DOI: 10.2337/DB20-0647
Abstract: Obesity is a major risk factor for insulin resistance (IR) and its attendant complications. The pathogenic mechanisms linking them remain poorly understood, partly due to a lack of intermediary monogenic human phenotypes. Here, we report on a monogenic form of IR-prone obesity, Alström syndrome (ALMS). Twenty-three subjects with monogenic or polygenic obesity underwent hyperinsulinemic-euglycemic clamping with concomitant adipose tissue (AT) microdialysis and an in-depth analysis of subcutaneous AT histology. We have shown a relative AT failure in a monogenic obese cohort, a finding supported by observations in a novel conditional mouse model (Almsflin/flin) and ALMS1-silenced human primary adipocytes, whereas selective reactivation of ALMS1 gene in AT of an ALMS conditional knockdown mouse model (Almsflin/flin Adipo-Cre+/−) restores systemic insulin sensitivity and glucose tolerance. Hence, we show for the first time the relative AT failure in human obese cohorts to be a major determinant of accelerated IR without evidence of lipodystrophy. These new insights into adipocyte-driven IR may assist development of AT-targeted therapeutic strategies for diabetes.
Publisher: Copernicus GmbH
Date: 12-06-2019
Abstract: Abstract. Knowledge of the full rainfall Drop Size Distribution (DSD) is critical for characterising liquid water precipitation for applications such as rainfall retrievals using electromagnetic signals and atmospheric model parameterisation. Southern Hemisphere temperate latitudes have a lack of DSD observations and their integrated variables. Laser-based disdrometers rely on the attenuation of a beam by falling particles and are currently the most commonly used type of instrument to observe the DSD. However, there remain questions on the accuracy and variability in the DSDs measured by co-located instruments, whether identical models, different models, or instruments from different manufacturers. In this study, raw and processed DSD observations obtained from two of the most commonly deployed laser disdrometers, namely the Parsivel1 from OTT and the Laser Precipitation Monitor (LPM) from Thies Clima, are analysed and compared. Four co-located instruments of each type were deployed over 3 years from 2014 to 2017 in the proximity of Melbourne, a region prone to coastal rainfall in Southeast Australia. This dataset includes a total of approximately 1.5 million recorded minutes, including over 40,000 minutes of quality rainfall data common to all instruments, equivalent to a cumulative amount of rainfall ranging from 1093 to 1244 mm (depending on the instrument records) for a total of 318 rainfall events. Most of the events lasted between 20 and 40 min for rainfall amounts of 0.12 mm to 26.0 mm. The co-located LPM sensors show very similar observations, while the co-located Parsivel1 systems show significantly different results. The LPM recorded one to two orders of magnitude more small drops (diameters below 0.6 mm) than the Parsivel1, with differences increasing at higher rainfall rates. The LPM integrated variables showed systematically lower values compared to the Parsivel1.
Radar reflectivity-rainfall rate (ZH-R) relationships and resulting potential errors are also presented. Specific ZH-R relations for drizzle and convective rainfall are also derived based on DSD collected for each instrument type. Variability of the DSD as observed by co-located instruments of the same manufacturer had little impact on the estimated ZH-R relationships for stratiform rainfall, but differs when considering convective rainfall relations or ZH-R relations fitted to all available data. Conversely, disdrometer-derived ZH-R relations as compared to the Marshall-Palmer relation ZH = 200R^1.6 led to a bias in rainfall rates of up to 21.6 mm h−1 for reflectivities of 50 dBZ. This study provides an open-source high-resolution dataset of co-located DSD observations to further explore sampling effects at the micro-scale, along with rainfall microphysics.
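The Marshall-Palmer relation ZH = 200R^1.6 quoted above can be inverted to map radar reflectivity to a rain rate; reflectivity is commonly reported in dBZ, i.e. 10·log10 of Z in mm^6 m^-3. A short sketch of that standard conversion:

```python
def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert a Z-R power law Z = a * R**b (defaults are the
    Marshall-Palmer coefficients) to obtain the rain rate R (mm/h)
    from reflectivity in dBZ."""
    z = 10.0 ** (dbz / 10.0)  # dBZ -> linear reflectivity (mm^6 m^-3)
    return (z / a) ** (1.0 / b)

# 50 dBZ under Marshall-Palmer corresponds to roughly 49 mm/h, which
# gives scale to the up-to-21.6 mm/h bias quoted in the abstract.
print(round(rain_rate_from_dbz(50.0), 1))
```

Passing the instrument-specific (a, b) pairs derived in the study instead of the defaults yields the corresponding disdrometer-based estimates.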
Publisher: PeerJ
Date: 30-07-2020
DOI: 10.7717/PEERJ.9558
Abstract: River discharges are often predicted based on a calibrated rainfall-runoff model. The major sources of uncertainty, namely input, parameter and model structural uncertainty must all be taken into account to obtain realistic estimates of the accuracy of discharge predictions. Over the past years, Bayesian calibration has emerged as a suitable method for quantifying uncertainty in model parameters and model structure, where the latter is usually modelled by an additive or multiplicative stochastic term. Recently, much work has also been done to include input uncertainty in the Bayesian framework. However, the use of geostatistical methods for characterizing the prior distribution of the catchment rainfall is underexplored, particularly in combination with assessments of the influence of increasing or decreasing rain gauge network density on discharge prediction accuracy. In this article we integrate geostatistics and Bayesian calibration to analyze the effect of rain gauge density on river discharge prediction accuracy. We calibrated the HBV hydrological model while accounting for input, initial state, model parameter and model structural uncertainty, and also taking uncertainties in the discharge measurements into account. Results for the Thur basin in Switzerland showed that model parameter uncertainty was the main contributor to the joint posterior uncertainty. We also showed that a low rain gauge density is enough for the Bayesian calibration, and that increasing the number of rain gauges improved model prediction until reaching a density of one gauge per 340 km2. While the optimal rain gauge density is case-study specific, we make recommendations on how to handle input uncertainty in Bayesian calibration for river discharge prediction and present the methodology that may be used to carry out such experiments.
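As a toy illustration of the Bayesian calibration idea only (not the paper's HBV setup, which treats input, initial state, parameter and structural uncertainty jointly), the sketch below samples the posterior of a single parameter of an invented linear model under an additive Gaussian error, using a random-walk Metropolis sampler with a flat prior. All data are synthetic.

```python
import math
import random

random.seed(42)

# Invented "observations": discharge q generated by q = theta * p + noise,
# with the true theta near 2. These numbers are purely illustrative.
p = [1.0, 2.0, 3.0, 4.0, 5.0]
q_obs = [2.1, 3.9, 6.2, 7.8, 10.1]
sigma = 0.3  # assumed standard deviation of the additive error term

def log_likelihood(theta):
    """Gaussian log-likelihood for the additive error model."""
    return sum(-0.5 * ((q - theta * x) / sigma) ** 2
               for x, q in zip(p, q_obs))

def metropolis(n_iter=5000, theta0=1.0, step=0.1):
    """Random-walk Metropolis sampler with a flat prior on theta."""
    samples, theta, ll = [], theta0, log_likelihood(theta0)
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        if math.log(random.random()) < log_likelihood(prop) - ll:
            theta, ll = prop, log_likelihood(prop)  # accept proposal
        samples.append(theta)
    return samples

chain = metropolis()
posterior_mean = sum(chain[1000:]) / len(chain[1000:])  # discard burn-in
# Should land near the least-squares value of ~2.0 for this data.
print(round(posterior_mean, 2))
```

Real calibrations extend this to many parameters, non-Gaussian or multiplicative error terms, and (as in this paper) a geostatistical prior on the rainfall input itself.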
Publisher: American Geophysical Union (AGU)
Date: 03-2021
DOI: 10.1029/2020AV000258
Abstract: The monitoring of wildfire smoke is important to help mitigate impacts on people such as by sending early warnings to affected areas. Received signal levels (RSLs) from radio links have been used as an opportunistic way to accurately measure rainfall and humidity. Radio links provide integrated measurements along their paths and are an exceptional untapped resource to complement air quality stations in areas affected by smoke events, or in developing countries without air quality monitoring capability. This study analyzed radio link signal fluctuations during smoke events associated with the 2019–2020 Australian wildfires. Concurrently, the atmospheric boundary layer was characterized using atmospheric soundings and surface observations, as well as air quality proxies such as particulate matter concentrations less than 2.5 μm (10 μm), or PM 2.5 (PM 10 ). Observations showed that dry air containing large amounts of smoke within a surface layer above the ground acted as a lid, reducing dispersion, trapping and maintaining high ground‐level concentrations of smoke. These conditions also created anomalous propagation conditions for radio links and operational weather radars. Power‐law relations between signal fluctuations and PM 10 and PM 2.5 were derived based on the link data collected and the closest air quality station observations. While there was variability in retrieval performance across smoke events, the best performance determination coefficients exceeded 0.5, with an RMSE on the order of less than 50 μg m −3 for concentrations of more than 400 μg m −3 . Mid‐range link lengths (5–20 km) provided the best results.
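Power-law relations of the kind derived in this study, written generically as PM = a·σ^b for a signal-fluctuation measure σ, are typically fitted by linear regression in log-log space. A hedged sketch with purely synthetic numbers; the study's actual coefficients are not reproduced here:

```python
import math

def fit_power_law(x, y):
    """Fit y = a * x**b by ordinary least squares in log-log space.
    Returns (a, b). All inputs must be positive."""
    lx = [math.log(v) for v in x]
    ly = [math.log(v) for v in y]
    n = len(x)
    mx, my = sum(lx) / n, sum(ly) / n
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b

# Synthetic example: fluctuation sigma (dB) vs PM10 (ug/m3), generated
# exactly from PM10 = 150 * sigma**0.8 (invented numbers), so the fit
# recovers the generating coefficients.
sigma = [0.5, 1.0, 2.0, 4.0]
pm10 = [150.0 * s ** 0.8 for s in sigma]
a, b = fit_power_law(sigma, pm10)
print(round(a, 1), round(b, 2))
```

On real data the fit is noisy, which is reflected in the per-event spread of determination coefficients reported above.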
Publisher: Oxford University Press (OUP)
Date: 2004
DOI: 10.1016/J.EHJ.2003.10.020
Abstract: The efficacy of cardioversion (DCCV) for restoration of sinus rhythm (SR) in persistent atrial fibrillation (AF) is limited by a high relapse rate. Relapse may be reduced by amiodarone but no placebo-controlled trials of efficacy have been performed and the appropriate duration of therapy is unknown. In this double-blind study, 161 subjects with persistent AF were randomized to one of three groups: placebo (n=38); amiodarone 400 mg BD for 2 weeks prior to DCCV and 200 mg daily for 8 weeks followed by placebo for 44 weeks (n=62, short-term amiodarone); or amiodarone 400 mg BD for 2 weeks then 200 mg daily for 52 weeks (n=61, long-term amiodarone). Spontaneous reversion to SR occurred before DCCV in 21% (26/123) patients on amiodarone and none of the 38 patients on placebo (absolute difference 21%, 95% confidence interval (CI): 10 to 29%, P=0.002). At 8 weeks following DCCV, 51% (63/123) patients on amiodarone remained in SR compared to 16% (6/38) taking placebo (difference -35%, 95% CI: -48 to -18%, P<0.001). At 1 year, 49% (30/61) patients on long-term amiodarone were in SR compared to 33% (21/62) taking short-term amiodarone (difference -15%, 95% CI: -31 to 2%, P=0.085). There was no difference in adverse event rate or quality of life scores between groups. Amiodarone pre-treatment before electrical DCCV for persistent AF allows chemical conversion in one-fifth of patients without altering the efficacy of subsequent DC conversion. Amiodarone is more effective than placebo in the maintenance of SR when continued for 8 weeks following successful DCCV. More patients taking long-term amiodarone remained in SR at 52 weeks, but more had serious adverse effects requiring discontinuation of therapy. Eight weeks of adjuvant therapy with amiodarone following successful DCCV may be the preferred option.
Publisher: Springer Science and Business Media LLC
Date: 02-2016
DOI: 10.1038/NGEO2646
Publisher: Copernicus GmbH
Date: 06-07-2020
Abstract: Abstract. The bio-physical processes occurring in the unsaturated zone have a direct impact on the water table dynamics. Conceptual models, with a simplified representation of the unsaturated zone dynamics, are often selected for coupling to groundwater models, while physically-based models are widely used, particularly at the field scale, for an accurate representation of the water transport. The recharge rates estimated by these Unsaturated Zone Models (UZMs) can then be used as input for groundwater models. Because recharge estimates are always affected by uncertainty, model-data fusion methods, such as data assimilation, can be used to reduce the uncertainty in the model results. In this study, the required complexity (i.e. conceptual versus physically-based) of the unsaturated zone model to update groundwater models through the assimilation of evapotranspiration (ET) rates is assessed for a water-limited site in South Australia. ET rates are assimilated because they have been shown to be related to the groundwater table dynamics, and thus form the link between remote sensing data and the deeper parts of the soil profile. It has been found that, under the test site conditions, a conceptual UZM can be used to improve groundwater model results through the assimilation of ET rates.
Publisher: American Geophysical Union (AGU)
Date: 29-06-2020
DOI: 10.1029/2019WR026255
Abstract: Commercial microwave links (CMLs) have proven useful for providing rainfall information close to the ground surface. However, large uncertainties are associated with these retrievals, partly due to challenges in the type of data collection and processing. In particular, the most common case is when only minimum and maximum received signal levels (RSLs) over a given time interval (hereafter 15 min) are stored by mobile network operators. The average attenuation and the corresponding rainfall rate are then calculated based on a weighted average method using the minimum and maximum attenuation. In this study, an alternative to using a constant weighted average method is explored, based on a machine learning model trained to produce actual attenuation from minimum/maximum values. A rainfall retrieval deep learning model was designed based on a long short‐term memory (LSTM) model architecture and trained with disdrometer data in a form that is comparable to the data provided by mobile network operators. A first evaluation used only disdrometer data to mimic both attenuation from a CML and corresponding rainfall rates. For the test data set, the relative bias was reduced from 5.99% to 2.84% and the coefficient of determination ( R 2 ) increased from 0.86 to 0.97. The second evaluation used this disdrometer‐trained LSTM to retrieve rainfall rates from an actual CML located nearby the disdrometer. A significant improvement in the overall rainfall estimation compared to existing microwave link attenuation models was observed. The relative bias reduced from 7.39% to −1.14% and the R 2 improved from 0.71 to 0.82.
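The constant weighted-average baseline that the LSTM is compared against can be sketched as follows: the interval-mean attenuation is approximated as a fixed convex combination of the minimum and maximum attenuation stored for each 15 min interval. The weight value below is an assumption (α ≈ 0.33 appears in the CML literature, e.g. as a RAINLINK default), not a value taken from this paper:

```python
def mean_attenuation(a_min, a_max, alpha=0.33):
    """Approximate the interval-mean attenuation (dB) as a fixed
    convex combination of the stored min and max attenuation.

    alpha=0.33 is an assumed literature value, not from this paper;
    the paper's LSTM effectively learns a data-dependent replacement
    for this constant weighting."""
    if not 0.0 <= alpha <= 1.0:
        raise ValueError("alpha must be in [0, 1]")
    return alpha * a_max + (1.0 - alpha) * a_min

# Interval with 1 dB minimum and 4 dB maximum attenuation:
print(mean_attenuation(1.0, 4.0))
```

Because rain bursts are short relative to 15 min, the mean usually sits closer to the minimum, which is why α below 0.5 is the common choice.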
Publisher: Copernicus GmbH
Date: 31-05-2016
Abstract: Abstract. In the current human-modified world, or "Anthropocene", the state of water stores and fluxes has become dependent on human as well as natural processes. Water deficits (or droughts) are the result of a complex interaction between meteorological anomalies, land surface processes, and human inflows, outflows and storage changes. Our current inability to adequately analyse and manage drought in many places points to gaps in our understanding and to inadequate data and tools. The Anthropocene requires a new framework for drought definitions and research. Drought definitions need to be revisited to explicitly include human processes driving and modifying soil moisture drought and hydrological drought development. We give recommendations for robust drought definitions to clarify timescales of drought and prevent confusion with related terms such as water scarcity and overexploitation. Additionally, our understanding and analysis of drought need to move from single driver to multiple drivers and from uni-directional to multi-directional. We identify research gaps and propose analysis approaches on (1) drivers, (2) modifiers, (3) impacts, (4) feedbacks, and (5) changing baseline of drought in the Anthropocene. The most pressing research questions are related to the attribution of drought to its causes, to linking drought impacts to drought characteristics, and to societal adaptation and responses to drought. 
Example questions include: (i) what are the dominant drivers of drought in different parts of the world?, (ii) how do human modifications of drought enhance or alleviate drought severity?, (iii) how do impacts of drought depend on the physical characteristics of drought versus the vulnerability of people or the environment?, (iv) to what extent are physical and human drought processes coupled, and can feedback loops be identified and altered to lessen or mitigate drought?, (v) how should we adapt our drought analysis to accommodate "changes in the norm" (i.e. what are considered normal conditions) over time? Answering these questions requires exploration of qualitative and quantitative data as well as mixed modelling approaches. The challenges related to drought research and management in the Anthropocene are not unique to drought, but do require urgent attention. We give recommendations drawn from the fields of flood research, ecology, water management, and water resources studies. The framework presented here provides a holistic view on drought in the Anthropocene, which will help improve management strategies for mitigating the severity and reducing the impacts of droughts in future.
Publisher: Copernicus GmbH
Date: 13-01-2017
Abstract: Abstract. In this review of hydrologic scaling and similarity, we posit that roadblocks in the search for universal laws of hydrology are hindered by our focus on computational simulation (the third paradigm), and assert that it is time for hydrology to embrace a fourth paradigm of data-intensive science. Advances in information-based hydrologic science, coupled with an explosion of hydrologic data and advances in parameter estimation and modelling, have laid the foundation for a data-driven framework for scrutinizing hydrological scaling and similarity hypotheses. We summarize important scaling and similarity concepts (hypotheses) that require testing, describe a mutual information framework for testing these hypotheses, describe boundary condition, state/flux, and parameter data requirements across scales to support testing these hypotheses, and discuss some challenges to overcome while pursuing the fourth hydrological paradigm. We call upon the hydrologic sciences community to develop a focused effort towards adopting the fourth paradigm and apply this to outstanding challenges in scaling and similarity.
Publisher: Copernicus GmbH
Date: 24-05-2017
DOI: 10.5194/HESS-21-2579-2017
Abstract: Abstract. Wetlands are important reservoirs of water, carbon and biodiversity. They are typical landscapes of lowland regions that have high potential for water retention. However, the hydrology of these wetlands in tropical regions is often studied in isolation from the processes taking place at the catchment scale. Our main objective is to study the hydrological dynamics of one of the largest tropical rainforest regions on an island using a combination of satellite remote sensing and novel observations from dedicated field campaigns. This contribution offers a comprehensive analysis of the hydrological dynamics of two neighbouring, poorly gauged tropical basins: the Kapuas basin (98 700 km2) in West Kalimantan and the Mahakam basin (77 100 km2) in East Kalimantan, Indonesia. Both basins are characterised by vast areas of inland lowlands. Hereby, we put specific emphasis on key hydrological variables and indicators such as discharge and flood extent. The hydroclimatological data described herein were obtained during fieldwork campaigns carried out in the Kapuas over the period 2013–2015 and in the Mahakam over the period 2008–2010. Additionally, we used the Tropical Rainfall Measuring Mission (TRMM) rainfall estimates over the period 1998–2015 to analyse the distribution of rainfall and the influence of the El Niño-Southern Oscillation. Flood occurrence maps were obtained from the analysis of the Phased Array type L-band Synthetic Aperture Radar (PALSAR) images from 2007 to 2010. Drought events were derived from time series of simulated groundwater recharge using time series of TRMM rainfall estimates, potential evapotranspiration estimates and the threshold level approach. The Kapuas and the Mahakam lake regions are vast reservoirs of water of about 1000 and 1500 km2 that can store as much as 3 and 6.5 billion m3 of water, respectively. These storage capacity values can be doubled considering the area of flooding under vegetation cover.
Discharge time series show that backwater effects are highly influential in the wetland regions, which can be partly explained by inundation dynamics shown by flood occurrence maps obtained from PALSAR images. In contrast to their nature as wetlands, both lowland areas have frequent periods with low soil moisture conditions and low groundwater recharge. The Mahakam wetland area regularly exhibits low groundwater recharge, which may lead to prolonged drought events that can last up to 13 months. It appears that the Mahakam lowland is more vulnerable to hydrological drought, leading to more frequent fire occurrences than in the Kapuas basin.
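The threshold level approach used above to derive drought events can be sketched as follows. This is a minimal illustration, not the study's implementation: the quantile-based threshold choice and the input series are assumptions for demonstration only.

```python
import numpy as np

def drought_events(recharge, q=0.2):
    """Identify drought events with the threshold level approach.

    A drought spell is a run of consecutive time steps in which the
    series stays below a fixed threshold (here the q-quantile of the
    record; the quantile choice is an assumption, not taken from the
    study). Returns a list of (start_index, duration) pairs.
    """
    threshold = np.quantile(recharge, q)
    below = np.asarray(recharge) < threshold
    events, start = [], None
    for i, b in enumerate(below):
        if b and start is None:
            start = i                      # drought spell begins
        elif not b and start is not None:
            events.append((start, i - start))  # spell ends
            start = None
    if start is not None:                  # spell still open at series end
        events.append((start, len(below) - start))
    return events
```

Applied to a monthly recharge series, the longest returned duration corresponds to the prolonged droughts reported above (up to 13 months in the Mahakam lowland).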
Publisher: Copernicus GmbH
Date: 28-03-2022
DOI: 10.5194/EGUSPHERE-EGU22-7026
Abstract: Radar rainfall nowcasting, an observation-based rainfall forecasting technique that statistically extrapolates current observations into the future, is increasingly used for short-term forecasting (the first few hours ahead). These first hours ahead are a key time scale for, e.g., (flash) flood warnings, and they are generally not sufficiently well captured by the rainfall forecasts of numerical weather prediction (NWP) models.
A recent development in nowcasting is the transition to more community-driven, open-source models. The Python library pySTEPS is an example of this. One of its main features is an efficient Python implementation of the probabilistic nowcasting scheme STEPS. pySTEPS generates an ensemble of rainfall forecasts by perturbing a deterministic extrapolation nowcast with spatially and temporally correlated stochastic noise. It considers the dynamical scaling of the rainfall predictability by decomposing the rainfall fields into a multiplicative cascade and applies different stochastic perturbations for each scale. This results in large-scale features that evolve more slowly than the small-scale features.
Despite pySTEPS' representation of the uncertainty associated with growth and decay of rainfall in the first 1–2 hours of the nowcast, it quickly loses skill after 2 hours, or even less for convective rainfall events or small radar domains. To extend the skillful lead time to the desired time scale of 6 hours or more, a blending with NWP rainfall forecasts is necessary. We have implemented an adaptive scale-dependent blending in pySTEPS based on earlier work in the STEPS scheme. In this blending implementation, the blending of the extrapolation nowcast, NWP and noise components is performed level by level, which means that the blending weights vary per cascade level.
These scale-dependent blending weights are computed from the recent skill of the forecast components and converge to a climatological value, which is computed from a 1-month rolling window and can be adjusted to the (operational) needs of the user. To constrain the (dis)appearance of rain in the ensemble members to regions around the rainy areas, we have developed a Lagrangian blended probability matching scheme and an incremental masking strategy.
We present a validation of the blending approach in a hydrometeorological test bed using Belgian radar and NWP data for the Belgian and Dutch catchments Dommel, Geul and Vesdre. We compare the resulting ensemble rainfall and discharge forecasts of the blending implementation with ensemble nowcasts from pySTEPS, ALARO (NWP) forecasts and a linear blending strategy.
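The level-by-level blending described above can be sketched as follows. This is a simplified illustration, not the actual pySTEPS implementation: the skill inputs are assumed to be given scores in [0, 1] for one cascade level, and the variance-preserving noise weight is a simplifying assumption.

```python
import numpy as np

def blend_cascade_level(nowcast, nwp, noise, skill_nowcast, skill_nwp):
    """Blend one cascade level of the extrapolation nowcast, NWP and noise.

    skill_* are hypothetical recent-skill scores in [0, 1] for this level;
    in the scheme described above they are derived from verification against
    observations and relax toward climatological values with lead time.
    The noise weight fills the variance not explained by the skilful
    components, so the squared weights sum to (at most) one.
    """
    w_nowcast, w_nwp = skill_nowcast, skill_nwp
    w_noise = np.sqrt(max(0.0, 1.0 - w_nowcast**2 - w_nwp**2))
    return w_nowcast * nowcast + w_nwp * nwp + w_noise * noise
```

Because the weights differ per cascade level, skilful large scales can be dominated by the nowcast or NWP while poorly predicted small scales are progressively replaced by correlated noise.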
Publisher: Copernicus GmbH
Date: 16-01-2017
Abstract: The diversity in hydrologic models has historically led to great controversy on the “correct” approach to process-based hydrologic modeling, with debates centered on the adequacy of process parameterizations, data limitations and uncertainty, and computational constraints on model analysis. In this paper we revisit key modeling challenges, outlined by Freeze and Harlan nearly 50 years ago, on requirements to (1) define suitable model equations, (2) define adequate model parameters, and (3) cope with limitations in computing power. We outline the historical modeling challenges, summarize modeling advances that address these challenges, and define outstanding research needs. We illustrate how modeling advances have been made by groups using models of different type and complexity, and we argue for the need to more effectively use our diversity of modeling approaches in order to advance our collective quest for physically realistic hydrologic models.
Publisher: Copernicus GmbH
Date: 28-07-2017
DOI: 10.5194/HESS-21-3879-2017
Abstract: In just the past 5 years, the field of Earth observation has progressed beyond the offerings of conventional space-agency-based platforms to include a plethora of sensing opportunities afforded by CubeSats, unmanned aerial vehicles (UAVs), and smartphone technologies that are being embraced by both for-profit companies and individual researchers. Over the previous decades, space agency efforts have brought forth well-known and immensely useful satellites such as the Landsat series and the Gravity Recovery and Climate Experiment (GRACE) system, with costs typically of the order of 1 billion dollars per satellite and with concept-to-launch timelines of the order of 2 decades (for new missions). More recently, the proliferation of smartphones has helped to miniaturize sensors and energy requirements, facilitating advances in the use of CubeSats that can be launched by the dozens, while providing ultra-high (3–5 m) resolution sensing of the Earth on a daily basis. Start-up companies that did not exist a decade ago now operate more satellites in orbit than any space agency, and at costs that are a mere fraction of traditional satellite missions. With these advances come new space-borne measurements, such as real-time high-definition video for tracking air pollution, storm-cell development, flood propagation, precipitation monitoring, or even for constructing digital surfaces using structure-from-motion techniques. Closer to the surface, measurements from small unmanned drones and tethered balloons have mapped snow depths, floods, and estimated evaporation at sub-metre resolutions, pushing back on spatio-temporal constraints and delivering new process insights.
At ground level, precipitation has been measured using signal attenuation between antennae mounted on cell phone towers, while the proliferation of mobile devices has enabled citizen scientists to catalogue photos of environmental conditions, estimate daily average temperatures from battery state, and sense other hydrologically important variables such as channel depths using commercially available wireless devices. Global internet access is being pursued via high-altitude balloons, solar planes, and hundreds of planned satellite launches, providing a means to exploit the internet of things as an entirely new measurement domain. Such global access will enable real-time collection of data from billions of smartphones or from remote research platforms. This future will produce petabytes of data that can only be accessed via cloud storage and will require new analytical approaches to interpret. The extent to which today's hydrologic models can usefully ingest such massive data volumes is unclear. Nor is it clear whether this deluge of data will be usefully exploited, either because the measurements are superfluous, inconsistent, not accurate enough, or simply because we lack the capacity to process and analyse them. What is apparent is that the tools and techniques afforded by this array of novel and game-changing sensing platforms present our community with a unique opportunity to develop new insights that advance fundamental aspects of the hydrological sciences. To accomplish this will require more than just an application of the technology: in some cases, it will demand a radical rethink on how we utilize and exploit these new observing systems.
Publisher: Oxford University Press (OUP)
Date: 08-06-2020
Abstract: Cardiac involvement in Fabry disease (FD) occurs prior to left ventricular hypertrophy (LVH) and is characterized by low myocardial native T1 with sphingolipid storage reflected by cardiovascular magnetic resonance (CMR) and electrocardiogram (ECG) changes. We hypothesize that a pre-storage myocardial phenotype might occur even earlier, prior to T1 lowering. FD patients and age-, sex-, and heart rate-matched healthy controls underwent same-day ECG with advanced analysis and multiparametric CMR [cines, global longitudinal strain (GLS), T1 and T2 mapping, stress perfusion (myocardial blood flow, MBF), and late gadolinium enhancement (LGE)]. One hundred and fourteen Fabry patients (46 ± 13 years, 61% female) and 76 controls (49 ± 15 years, 50% female) were included. In pre-LVH FD (n = 72, 63%), a low T1 (n = 32/72, 44%) was associated with a constellation of ECG and functional abnormalities compared to normal T1 FD patients and controls. However, pre-LVH FD with normal T1 (n = 40/72, 56%) also had abnormalities compared to controls: reduced GLS (−18 ± 2 vs. −20 ± 2%, P < 0.001), microvascular changes (lower MBF 2.5 ± 0.7 vs. 3.0 ± 0.8 mL/g/min, P = 0.028), subtle T2 elevation (50 ± 4 vs. 48 ± 2 ms, P = 0.027), and limited LGE (%LGE 0.3 ± 1.1 vs. 0%, P = 0.004). ECG abnormalities included shorter P-wave duration (88 ± 12 vs. 94 ± 15 ms, P = 0.010) and T-wave peak time (Tonset – Tpeak 104 ± 28 vs. 115 ± 20 ms, P = 0.015), resulting in a more symmetric T wave with lower T-wave time ratio (Tonset – Tpeak)/(Tpeak – Tend) (1.5 ± 0.4 vs. 1.8 ± 0.4, P < 0.001) compared to controls. FD has a measurable myocardial phenotype pre-LVH and pre-detectable myocyte storage with microvascular dysfunction, subtly impaired GLS and altered atrial depolarization and ventricular repolarization intervals.
Publisher: American Geophysical Union (AGU)
Date: 12-2018
DOI: 10.1029/2018WR023580
Publisher: Copernicus GmbH
Date: 19-11-2019
DOI: 10.5194/HESS-23-4737-2019
Abstract: Knowledge of the full rainfall drop size distribution (DSD) is critical for characterising liquid water precipitation for applications such as rainfall retrievals using electromagnetic signals and atmospheric model parameterisation. Observations of the DSD and its integrated variables are scarce at Southern Hemisphere temperate latitudes. Laser-based disdrometers rely on the attenuation of a beam by falling particles and are currently the most commonly used type of instrument to observe the DSD. However, there remain questions on the accuracy and variability of the DSDs measured by co-located instruments, whether they are identical models, different models, or instruments from different manufacturers. In this study, raw and processed DSD observations obtained from two of the most commonly deployed laser disdrometers, namely the Parsivel1 from OTT and the Laser Precipitation Monitor (LPM) from Thies Clima, are analysed and compared. Four co-located instruments of each type were deployed over 3 years from 2014 to 2017 in the proximity of Melbourne, a region prone to coastal rainfall in south-eastern Australia. This dataset includes a total of approximately 1.5 million recorded minutes, including over 40 000 min of quality rainfall data common to all instruments, equivalent to a cumulative amount of rainfall ranging from 1093 to 1244 mm (depending on the instrument records) for a total of 318 rainfall events. Most of the events lasted between 20 and 40 min, for rainfall amounts of 0.12 to 26.0 mm. The co-located LPM sensors show very similar observations, while the co-located Parsivel1 systems show significantly different results. The LPM recorded 1 to 2 orders of magnitude more small droplets (drop diameters below 0.6 mm) than the Parsivel1, with differences increasing at higher rainfall rates. The LPM integrated variables showed systematically lower values compared to the Parsivel1.
Radar reflectivity–rainfall rate (ZH–R) relationships and resulting potential errors are also presented. Specific ZH–R relations for drizzle and convective rainfall are also derived based on the DSD collected for each instrument type. Variability of the DSD as observed by co-located instruments of the same manufacturer had little impact on the estimated ZH–R relationships for stratiform rainfall, but differs when considering convective rainfall relations or ZH–R relations fitted to all available data. Conversely, disdrometer-derived ZH–R relations, as compared to the Marshall–Palmer relation ZH = 200R^1.6, led to biases in rainfall rates of up to 21.6 mm h−1 at reflectivities of 50 dBZ. This study provides an open-source high-resolution dataset of co-located DSD observations to further explore sampling effects at the microscale, along with rainfall microstructure.
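Inverting a ZH–R power law such as the Marshall–Palmer relation ZH = 200R^1.6 cited above can be sketched as follows; the function name is illustrative, and only the coefficients come from the abstract.

```python
import numpy as np

def rain_rate_from_dbz(dbz, a=200.0, b=1.6):
    """Invert a Z-R power law Z = a * R**b to estimate rainfall rate.

    dbz : radar reflectivity in dBZ; Z = 10**(dbz/10) in mm^6 m^-3.
    Defaults are the Marshall-Palmer coefficients (a=200, b=1.6).
    Returns the rainfall rate R in mm h^-1.
    """
    z = 10.0 ** (np.asarray(dbz) / 10.0)  # dBZ -> linear reflectivity
    return (z / a) ** (1.0 / b)
```

At 50 dBZ this relation gives roughly 49 mm h−1, so the reported bias of up to 21.6 mm h−1 between disdrometer-derived and Marshall–Palmer estimates is a substantial fraction of the retrieved rate.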
Location: United Kingdom of Great Britain and Northern Ireland
Location: United States of America
No related grants have been discovered for Richard Steeds.