Probabilistic Methods (Dose Response Panel)

Project ID

4902

Category

Other

Added on

Sept. 24, 2024, 10:51 a.m.

Journal Article

Abstract  Extensive exposure to per- and polyfluoroalkyl substances (PFAS) has been observed in many countries. Current deterministic frameworks for risk assessment lack the ability to predict the likelihood of effects and to assess uncertainty. When exposure exceeds tolerable intake levels, these shortcomings hamper risk management and communication.

Journal Article

Abstract  Model averaging for dichotomous dose–response estimation is preferred to estimating the benchmark dose (BMD) from a single model, but challenges remain in implementing these methods for general analyses before model averaging is feasible to use in many risk assessment applications, and there is little work on Bayesian methods that include informative prior information for both the models and the parameters of the constituent models. This article introduces a novel approach that addresses many of these challenges while providing a fully Bayesian framework. Furthermore, in contrast to methods that use Markov chain Monte Carlo, we approximate the posterior density using maximum a posteriori estimation. The approximation allows for an accurate and reproducible estimate while maintaining the speed of maximum likelihood, which is crucial in many applications such as processing massive high-throughput data sets. We assess this method by applying it to empirical laboratory dose–response data and measuring the coverage of confidence limits for the BMD. We compare the coverage of this method to that of other approaches using the same set of models. Through the simulation study, the method is shown to be markedly superior to the traditional approach of selecting a single preferred model (e.g., from the U.S. EPA BMD software) for the analysis of dichotomous data and is comparable or superior to the other approaches.
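
To make the model-averaging idea concrete, here is a minimal illustrative sketch (hypothetical dose-response data, simplified priors, and a BIC-style Laplace weight, not the authors' implementation): two dichotomous models are fitted by maximum a posteriori estimation and their BMD estimates are combined with posterior model weights.

```python
# Illustrative sketch (hypothetical data, simplified priors), not the authors'
# implementation: MAP fits of two dichotomous models, BIC-style Laplace weights,
# and a model-averaged benchmark dose (BMD) at 10% extra risk.
import numpy as np
from scipy.optimize import minimize, brentq
from scipy.special import expit

doses = np.array([0.0, 10.0, 30.0, 100.0])   # assumed dose groups
n     = np.array([50, 50, 50, 50])           # animals per group
y     = np.array([2, 5, 14, 33])             # responders per group
BMR   = 0.10                                 # benchmark response (extra risk)

def p_loglogistic(d, theta):
    g, a, b = expit(theta[0]), theta[1], np.exp(theta[2])
    return g + (1 - g) * expit(a + b * np.log(np.maximum(d, 1e-8)))

def p_weibull(d, theta):
    g, a, b = expit(theta[0]), np.exp(theta[1]), np.exp(theta[2])
    return g + (1 - g) * (1 - np.exp(-b * np.maximum(d, 1e-8) ** a))

def neg_log_post(theta, model):
    p = np.clip(model(doses, theta), 1e-9, 1 - 1e-9)
    loglik = np.sum(y * np.log(p) + (n - y) * np.log(1 - p))
    logprior = -0.5 * np.sum((theta / 5.0) ** 2)     # weak N(0, 5^2) priors
    return -(loglik + logprior)

def bmd(model, theta):
    p0 = model(0.0, theta)
    extra_risk = lambda d: (model(d, theta) - p0) / (1 - p0) - BMR
    return brentq(extra_risk, 1e-6, 1e4)

models = [p_loglogistic, p_weibull]
fits = [minimize(neg_log_post, x0=np.array([-3.0, 0.0, 0.0]), args=(m,),
                 method="Nelder-Mead", options={"maxiter": 5000})
        for m in models]

# Laplace/BIC-style posterior model weights from the MAP objective values
k, N = 3, n.sum()
scores = np.array([-f.fun - 0.5 * k * np.log(N) for f in fits])
weights = np.exp(scores - scores.max())
weights /= weights.sum()

bmds = np.array([bmd(m, f.x) for m, f in zip(models, fits)])
print("per-model BMDs:", bmds.round(2), "posterior weights:", weights.round(3))
print("model-averaged BMD:", float(np.dot(weights, bmds)))
```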

Journal Article

Abstract  A probabilistic and interdisciplinary risk-benefit assessment (RBA) model integrating microbiological, nutritional, and chemical components was developed for infant milk, with the objective of predicting the health impact of different scenarios of consumption. Infant feeding is a topic of particular interest in RBA as breast milk and powdered infant formula have both been associated with risks and benefits related to chemicals, bacteria, and nutrients, hence the model considers these three facets. Cronobacter sakazakii, dioxin-like polychlorinated biphenyls (dl-PCB), and docosahexaenoic acid (DHA) were three risk/benefit factors selected as key issues in microbiology, chemistry, and nutrition, respectively. The present model was probabilistic with variability and uncertainty separated using a second-order Monte Carlo simulation process. In this study, advantages and limitations of undertaking probabilistic and interdisciplinary RBA are discussed. In particular, the probabilistic technique was found to be powerful in dealing with missing data and in translating assumptions into quantitative inputs while taking uncertainty into account. In addition, separation of variability and uncertainty strengthened the interpretation of the model outputs by enabling better consideration and distinction of natural heterogeneity from lack of knowledge. Interdisciplinary RBA is necessary to give more structured conclusions and avoid contradictory messages to policymakers and also to consumers, leading to more decisive food recommendations. This assessment provides a conceptual development of the RBA methodology and is a robust basis on which to build.
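
A minimal sketch of the second-order (two-dimensional) Monte Carlo idea used here, with entirely hypothetical distributions: the outer loop propagates uncertainty about population-level parameters, the inner loop propagates inter-individual variability, so a variability percentile can be reported together with an uncertainty interval.

```python
# Minimal sketch of a second-order (two-dimensional) Monte Carlo simulation,
# assuming hypothetical distributions: the outer loop samples *uncertain*
# parameters, the inner loop samples *variability* across individuals.
import numpy as np

rng = np.random.default_rng(1)
n_uncertainty, n_variability = 200, 5000

p95_exposure = []
for _ in range(n_uncertainty):
    # Uncertainty: imperfect knowledge of population-level parameters (assumed values)
    mu    = rng.normal(loc=np.log(0.5), scale=0.15)   # log-mean of intake
    sigma = rng.uniform(0.4, 0.8)                     # log-sd of intake
    conc  = rng.normal(loc=2.0, scale=0.3)            # mean concentration, ng/g

    # Variability: individual-to-individual differences at fixed parameters
    intake   = rng.lognormal(mean=mu, sigma=sigma, size=n_variability)  # g/kg bw/day
    exposure = intake * conc                                            # ng/kg bw/day
    p95_exposure.append(np.percentile(exposure, 95))

p95_exposure = np.array(p95_exposure)
print("95th-percentile exposure (variability), median estimate:",
      np.median(p95_exposure))
print("95% uncertainty interval around that percentile:",
      np.percentile(p95_exposure, [2.5, 97.5]))
```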

Journal Article

Abstract  BACKGROUND: Consumption of meat prepared by barbecuing is associated with risk of cancer due to formation of carcinogenic compounds including benzo[a]pyrene (BaP). Assessment of a population's risk of disease and people's individual probability of disease given specific consumer attributes may direct food safety strategies to where impact on public health is largest. The aim of this study was to propose a model that estimates the risk of cancer caused by exposure to BaP from barbecued meat in Denmark, and to estimate the probability of developing cancer in subgroups of the population given different barbecuing frequencies.

METHODS: We developed probabilistic models applying two-dimensional Monte Carlo simulation to take into account the variation in exposure given age and sex, the variation in individuals' sensitivity to developing cancer after exposure to BaP, and the uncertainty in the dose-response model. We used the Danish dietary consumption survey, monitoring data of chemical concentrations, data on consumer barbecuing frequency, and animal dose-response data.

FINDINGS: We estimated an average extra lifetime risk of cancer due to BaP from barbecued meat of 6.8 × 10⁻⁵ (95% uncertainty interval 2.6 × 10⁻⁷ to 7.0 × 10⁻⁴) in the Danish population. This corresponds to approximately one to 4,074 extra cancer cases over a lifetime, reflecting wide uncertainty. The impact per barbecuing event on the risk of cancer was higher for men and women of low body weight than for those of higher body weight. However, the differences due to sex and body weight between subgroups are dwarfed by the uncertainty.

INTERPRETATION: This study proposes a model that can be applied to other substances and routes of exposure, and allows for deriving the change in risk following a specific change in behaviour. The presented methodology can serve as a valuable tool for risk management, allowing for the formulation of behaviour advice targeted to specific sub-groups in the population.

Journal Article

Abstract  Background: Socioeconomic analysis is currently used in the European Union as part of the regulatory process under the Registration, Evaluation and Authorisation of Chemicals (REACH) Regulation, with the aim of assessing and managing risks from dangerous chemicals. The political impact of the socio-economic analysis is potentially high in the authorisation and restriction procedures; however, current socio-economic analysis dossiers submitted under REACH are very heterogeneous in terms of methodology used and quality. Furthermore, the economic literature is not very helpful for regulatory purposes, as most published calculations of health costs associated with chemical exposures use epidemiological studies as input data, but such studies are rarely available for most substances. Almost all of the data used in the REACH dossiers come from toxicological studies. Methods: This paper assesses the use of the integrated probabilistic risk assessment, based on toxicological data, for the calculation of health costs associated with endocrine disrupting effects of triclosan. The results are compared with those obtained using the population attributable fraction, based on epidemiological data. Results: The results based on the integrated probabilistic risk assessment indicated that 4,894 men could have reproductive deficits based on the decreased vas deferens weights observed in rats, 0 cases of changed T3 levels, and 0 cases of girls with early pubertal development. The results obtained with the population attributable fraction method showed 7,199,228 cases of obesity per year, 281,923 girls per year with early pubertal development, and 88,957 to 303,759 cases per year with increased total T3 hormone levels. The economic costs associated with increased BMI due to TCS exposure could be calculated. Direct health costs were estimated at €5.8 billion per year. Conclusions: The two methods give very different results for the same effects. The choice of a toxicological-based or an epidemiological-based method in the socio-economic analysis will therefore significantly impact the estimated health costs and consequently the political risk management decision. Additional work should be done to understand the reasons for these significant differences.

Journal Article

Abstract  National recommendations for numeric human health ambient water quality criteria (AWQC) for toxic substances are derived by the US Environmental Protection Agency (USEPA) using a deterministic approach that combines point estimates for exposure, toxicity, and acceptable risk. In accordance with the Clean Water Act, states, territories, and authorized tribes must either adopt these recommendations or modify and replace them with criteria using an alternative, scientifically defensible method. Recent reports have criticized the deterministic approach, stating that it suffers from compounded conservatism by selecting upper percentiles or maximum values for multiple inputs and that it cannot directly determine what portion of the population a given criterion protects. As an alternative, probabilistic risk assessment (PRA) has been promoted as a more transparent and robust method for deriving AWQC. Probabilistic risk assessment offers several advantages over the deterministic approach. For example, PRA uses entire data distributions rather than upper-percentile point estimates to specify exposures, thereby reducing compounded conservatism. Additionally, because it links acceptable risk targets with specific segments of the exposed population, PRA-based AWQC demonstrably protects multiple subsets of the population. To date, no study has quantitatively compared deterministic and PRA approaches and resulting AWQC using national inputs consistent with USEPA guidance. This study introduces a PRA method for deriving AWQC and presents case studies to compare probabilistically derived AWQC with USEPA's 2015 recommendations. The methods and results of this work will help federal and state regulators, water quality managers, and stakeholders better understand available approaches to deriving AWQC and provide context to assumption- and method-specific differences between criteria. Integr Environ Assess Manag 2023;19:501-512. © 2022 The Authors. Integrated Environmental Assessment and Management published by Wiley Periodicals LLC on behalf of Society of Environmental Toxicology & Chemistry (SETAC).
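
The contrast between the two derivation styles can be illustrated with a toy calculation (assumed exposure factors and distributions, not USEPA's actual inputs): a deterministic criterion built from point estimates versus a probabilistic criterion that targets a chosen percentile of the exposed population.

```python
# Illustrative sketch (assumed inputs, not USEPA's values): a deterministic
# ambient water quality criterion (AWQC) versus a probabilistic (PRA) derivation
# that uses exposure distributions and targets a chosen population percentile.
import numpy as np

rng = np.random.default_rng(7)

RfD, RSC = 0.005, 0.20           # mg/kg-day reference dose and relative source contribution (assumed)
BW_pt, DWI_pt, FCR_pt, BAF = 80.0, 2.4, 0.022, 500.0   # point estimates: kg, L/day, kg fish/day, L/kg

# Deterministic criterion (mg/L): point estimates throughout
awqc_det = RfD * RSC * BW_pt / (DWI_pt + FCR_pt * BAF)

# Probabilistic criterion: sample exposure-factor distributions (assumed shapes),
# compute each person's "allowable" concentration, then take the percentile of
# the population the criterion is intended to protect (here the 5th).
n = 100_000
BW  = rng.lognormal(np.log(80.0), 0.2, n)      # body weight, kg
DWI = rng.lognormal(np.log(1.2), 0.6, n)       # drinking water intake, L/day
FCR = rng.lognormal(np.log(0.010), 0.9, n)     # fish consumption rate, kg/day
allowable = RfD * RSC * BW / (DWI + FCR * BAF)
awqc_pra = np.percentile(allowable, 5)         # protective for ~95% of the population

print(f"deterministic AWQC: {awqc_det:.5f} mg/L")
print(f"PRA-based AWQC (95% of population protected): {awqc_pra:.5f} mg/L")
```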

Journal Article

Abstract  Safety sciences must cope with uncertainty of models and results as well as information gaps. Acknowledging this uncertainty necessitates embracing probabilities and accepting the remaining risk. Every toxicological tool delivers only probable results. Traditionally, this is taken into account by using uncertainty / assessment factors and worst-case / precautionary approaches and thresholds. Probabilistic methods and Bayesian approaches seek to characterize these uncertainties and promise to support better risk assessment and, thereby, improve risk management decisions. Actual assessments of uncertainty can be more realistic than worst-case scenarios and may allow less conservative safety margins. Most importantly, as soon as we agree on uncertainty, this defines room for improvement and allows a transition from traditional to new approach methods as an engineering exercise. The objective nature of these mathematical tools makes it possible to assign each methodology its fair place in evidence integration, whether in the context of risk assessment, systematic reviews, or the definition of an integrated testing strategy (ITS) / defined approach (DA) / integrated approach to testing and assessment (IATA). This article gives an overview of methods for probabilistic risk assessment and their application to exposure assessment, physiologically-based kinetic modelling, probability of hazard assessment (based on quantitative and read-across based structure-activity relationships, and mechanistic alerts from in vitro studies), individual susceptibility assessment, and evidence integration. Additional aspects are opportunities for uncertainty analysis of adverse outcome pathways and their relation to thresholds of toxicological concern. In conclusion, probabilistic risk assessment will be key for constructing a new toxicology paradigm - probably!

Journal Article

Abstract  OBJECTIVES: Assessment factors (AFs) are commonly used for deriving reference concentrations for chemicals. These factors take into account variabilities as well as uncertainties in the dataset, such as inter-species and intra-species variabilities, exposure duration extrapolation, or extrapolation from the lowest-observed-adverse-effect level (LOAEL) to the no-observed-adverse-effect level (NOAEL). In a deterministic approach, the value of an AF is the result of a debate among experts and, often, a conservative value is used as a default choice. A probabilistic framework to better take into account uncertainties and/or variability when setting occupational exposure limits (OELs) is presented and discussed in this paper. MATERIAL AND METHODS: Each AF is considered a random variable with a probabilistic distribution. A short literature review was conducted before setting default distribution ranges and shapes for each AF commonly used. Random sampling, using Monte Carlo techniques, is then used to propagate the identified uncertainties and compute the final OEL distribution. RESULTS: Starting from the broad default distributions obtained, experts narrow them to their most likely ranges, according to the scientific knowledge available for a specific chemical. Introducing distributions rather than single deterministic values allows disclosing and clarifying the variability and/or uncertainties inherent to the OEL construction process. CONCLUSIONS: This probabilistic approach yields quantitative insight into both the possible range and the relative likelihood of values for model outputs. It thereby provides better support for decision-making and improves transparency. Int J Occup Med Environ Health 2018;31(4):475-489.
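
A minimal sketch of this approach, with assumed lognormal distributions rather than the paper's calibrated ones: each assessment factor is sampled and propagated by Monte Carlo to yield a distribution for the OEL, from which a percentile can be chosen with an explicit protection level.

```python
# Minimal sketch (assumed distributions, not the paper's calibrated ones): each
# assessment factor is treated as a random variable and propagated by Monte
# Carlo to yield a distribution of the occupational exposure limit (OEL).
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

POD = 10.0                                              # point of departure, mg/m3 (assumed)
af_duration     = rng.lognormal(np.log(2.0), 0.5, n)    # subchronic-to-chronic extrapolation
af_interspecies = rng.lognormal(np.log(2.5), 0.6, n)    # animal-to-human extrapolation
af_intraspecies = rng.lognormal(np.log(3.0), 0.5, n)    # worker-to-worker variability

oel = POD / (af_duration * af_interspecies * af_intraspecies)

print("median OEL:", np.median(oel))
print("5th-95th percentile OEL:", np.percentile(oel, [5, 95]))
# A lower percentile (e.g. the 5th) can be chosen as a conservative OEL,
# making the protection level explicit rather than hidden in default factors.
```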

Journal Article

Abstract  Pregnant women, infants, and children are particularly vulnerable to perfluoroalkyl substances (PFASs), yet little is known about related health risks. Here, we aimed to study the four main PFASs: perfluorooctanesulfonic acid (PFOS), perfluorooctanoic acid (PFOA), perfluorononanoic acid (PFNA), and perfluorohexanesulfonic acid (PFHxS), and assess the mixture risks of co-exposure to PFASs for pregnant women and children as well as for infants associated with maternal PFAS exposure at national and global scales, based on biomonitoring data on serum. We conducted a literature search and aggregated 69 data sources across 22 countries/regions from 2010 to 2020 to profile the serum concentrations of these four PFASs in pregnant women and children. Based on toxicity assessments by regulatory authorities, we determined conservative reference levels (RfLs) in the serum for the primary adverse effects of PFASs, including hepatic, developmental, and immune effects. The cumulative hazard quotient (HQ) was combined with probabilistic analysis to compare serum levels with RfLs and to quantify mixture risks. Our analysis revealed that PFOS was the dominant PFAS in maternal and child serum worldwide, with median levels 2.5-10 times higher than those of PFOA, PFNA, and PFHxS. The estimated global median serum levels of PFOS were 6.17 ng/mL for pregnant women and 4.85 ng/mL for children, and their immune effects in pregnant women and children are concerning as their cumulative HQs could exceed 1. For infants, the cumulative HQs for both developmental and immune effects could also be > 1, suggesting that maternal exposure to PFASs during pregnancy and breastfeeding may pose concerns for infant development and immunity. Our national and global serum database and risk assessment offer additional insights into PFAS exposures and mixture risks in susceptible populations, serving as a reference for evaluating the effectiveness of ongoing regulatory mitigation measures.
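
A simplified sketch of a cumulative hazard quotient calculation of this kind, using assumed serum distributions and hypothetical reference levels (only the PFOS median is loosely taken from the abstract):

```python
# Illustrative sketch with assumed numbers (not the study's data): cumulative
# hazard quotients (HQs) for a PFAS mixture, comparing serum concentrations
# against hypothetical effect-specific serum reference levels (RfLs) with a
# simple probabilistic treatment of exposure variability.
import numpy as np

rng = np.random.default_rng(11)
n = 50_000

# Assumed lognormal serum distributions (ng/mL); spreads are hypothetical.
serum = {
    "PFOS":  rng.lognormal(np.log(6.17), 0.7, n),
    "PFOA":  rng.lognormal(np.log(2.0),  0.7, n),
    "PFNA":  rng.lognormal(np.log(0.8),  0.7, n),
    "PFHxS": rng.lognormal(np.log(1.0),  0.7, n),
}
# Hypothetical serum reference levels for one effect category (ng/mL).
rfl = {"PFOS": 10.0, "PFOA": 5.0, "PFNA": 4.0, "PFHxS": 8.0}

hq_total = sum(serum[c] / rfl[c] for c in serum)   # cumulative hazard quotient
print("median cumulative HQ:", np.median(hq_total))
print("fraction of the population with cumulative HQ > 1:", np.mean(hq_total > 1))
```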

Journal Article

Abstract  Growing toxicologic evidence suggests that emerging perfluoroalkyl substances (PFASs), like chlorinated polyfluoroalkyl ether sulfonate (Cl-PFESA), may be as toxic as, or more toxic than, perfluorooctanesulfonate (PFOS) and perfluorooctanoic acid (PFOA). However, further investigations are needed in terms of human health risk assessment. This study examined the effects of emerging and legacy PFAS exposure on newborn thyroid homeostasis and compared the thyroid disruption caused by 6:2 Cl-PFESA and PFOS using a benchmark dose approach. The health effects of mixture and individual exposure were estimated using the partial least-squares (PLS) model and linear regression, respectively. A Bayesian benchmark dose (BMD) analysis determined the BMD value for adverse effect comparison between 6:2 Cl-PFESA and PFOS. The median (interquartile range) concentrations of 6:2 Cl-PFESA (0.573 [0.351-0.872] ng/mL), PFOS (0.674 [0.462-1.007] ng/mL), and PFOA (1.457 [1.034-2.405] ng/mL) were found to be similar. The PLS model ranked the PFASs by variable importance in projection (VIP) score as follows: 6:2 Cl-PFESA > PFOS > PFOA. Linear regression showed that 6:2 Cl-PFESA had a positive association with free triiodothyronine (FT3, P = 0.006) and triiodothyronine (T3, P = 0.014), while PFOS had a marginally significant positive association with FT3 alone (P = 0.042). The BMD analysis indicated that the estimated BMD10 for 6:2 Cl-PFESA (1.01 ng/mL) was lower than that for PFOS (1.66 ng/mL) in relation to a 10% increase in FT3. These findings suggest that 6:2 Cl-PFESA, an alternative to PFOS, has a more pronounced impact on newborns' thyroid homeostasis compared to PFOS and other legacy PFASs.

Journal Article

Abstract  We propose Bayesian semiparametric mixed effects models with measurement error to analyze the literature data collected from multiple studies in a meta-analytic framework. We explore this methodology for risk assessment in cadmium toxicity studies, where the primary objective is to investigate dose-response relationships between urinary cadmium concentrations and β2-microglobulin. In the proposed model, a nonlinear association between exposure and response is described by a Gaussian process with shape restrictions, and study-specific random effects are modeled to have either normal or unknown distributions with Dirichlet process mixture priors. In addition, nonparametric Bayesian measurement error models are incorporated to flexibly account for the uncertainty resulting from the usage of a surrogate measurement of a true exposure. We apply the proposed model to analyze cadmium toxicity data imposing shape constraints along with measurement errors and study-specific random effects across varying characteristics, such as population gender, age, or ethnicity.

Book/Book Chapter

Abstract  Provision of safe drinking water (DW) is one of the major requisites for human health, related to four Sustainable Development Goals (SDGs) of the United Nations 2030 Agenda: SDGs 3 (Good health), 6 (Clean water and sanitation), 11 (Sustainable cities) and 12 (Responsible production and consumption). However, this is hindered by the presence, especially in highly-anthropized contexts, of contaminants of emerging concern (CECs) in DW that may pose a risk to human health. The present study aims at developing a holistic framework to support both (i) decision-makers for CEC prioritization in DW regulation and (ii) water utilities for the selection of appropriate monitoring and treatment interventions for the optimization of the DW supply system. In detail, a quantitative chemical risk assessment (QCRA), including uncertainties related to both exposure and hazard assessments, was developed. Then, it was combined with testing and modeling of CEC fate in treatment processes and in the distribution network, obtaining a robust tool to achieve the above-mentioned SDGs.

Journal Article

Abstract  The benchmark dose (BMD) approach for the exposure limit in the risk assessment of cancer and non-cancer endpoints is well established; it is often based on dose-response modeling of the most critical or the most sensitive outcome. However, neither the most critical endpoint nor the most sensitive endpoint may necessarily be representative of the overall toxic effects. To have a whole picture, it is preferable to express responses for different endpoints with equivalent severity levels and integrate them into one analysis framework. In this paper, we derive BMD in the case of multivariate ordered categorical responses such as none, mild, adverse, and severe based on structural equation models (SEMs). First, for each of the ordered categorical responses, we obtain a latent continuous variable based on fictitious cutoffs of a standard normal distribution. Second, we use SEMs to integrate the multiple continuous variables into a single latent continuous variable and derive the corresponding BMD. We employed a Bayesian statistical approach using Markov chain Monte Carlo simulations to obtain the parameter estimates of the latent variables, SEMs, and the corresponding BMD. We illustrate the proposed procedure by simulation studies and analysis of an experimental study of acrylamide exposure in mice with multivariate endpoints of different severity levels.

Dissertation

Abstract  In the modern world, humans are exposed to a wide range of chemicals throughout their life. Human risk assessment of chemicals is of considerable public health importance and provides means to derive safe levels of acute and chronic exposure for subgroups of the human population including neonates, children, the elderly, and populations of different geographical ancestry and genetic polymorphisms. The application of pathway-related kinetic data to address human variability in the quantification of hazard has the potential to reduce uncertainty and better characterize variability compared with the use of traditional default uncertainty factors. This thesis aims to 1) quantify human variability by means of Bayesian meta-analysis for a range of phase I and phase II metabolic pathways and transporters using pharmacokinetic markers of acute and chronic exposure or enzyme activity data from available probe substrates, and 2) derive pathway-related variability distributions and pathway-related uncertainty factors for their future integration in physiologically based kinetic models for human risk assessment of chemicals.

Journal Article

Abstract  Fitting theoretical models to experimental data for dose-response screenings of nanoparticles yields values of several hazard metrics that can support risk management. In this paper, we describe a Bayesian approach to the analysis of dose-response data for nanoparticles that takes into account multiple sources of uncertainty. Specifically, we develop a Bayesian model for the analysis of data for the cytotoxicity of ZnO nanoparticles that follow the log-logistic equation. This model reproduces the unequal variance across doses observed in the experimental data, incorporates information about the sensitivity of the cytotoxicity assay used (i.e. resazurin), and complements experimental data with historical information about the system. The model determines probability distributions for the toxicity potency (EC50) and the exponential decay (the slope s); these distributions provide a direct measure of uncertainty in terms of probabilistic credibility intervals. By substituting these distributions in the log-logistic equation, we determine upper and lower limits of the benchmark dose (BMD), corresponding to upper and lower limits of credibility intervals with 95% probability given the experimental data, multiple sources of uncertainty, and historical information. With a view to reducing the costs and time of dose-response screenings, we use the Bayesian model for the cytotoxicity of ZnO nanoparticles to identify the experimental design that uses the minimum number of data points while reducing uncertainty in the estimation of both fitting parameters and BMD.
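
As an illustration of this type of analysis, the sketch below fits a log-logistic viability curve to synthetic data with a basic random-walk Metropolis sampler and reports credible intervals for EC50 and for the BMD at a 10% decrease in viability; the data, priors, and noise model are all assumptions, not the paper's.

```python
# Minimal sketch (synthetic data, simplified priors): Bayesian fit of a
# log-logistic cytotoxicity curve by random-walk Metropolis, with a credible
# interval for the benchmark dose (BMD) at a 10% decrease in viability.
import numpy as np

rng = np.random.default_rng(5)

dose = np.repeat([0.5, 1, 2, 5, 10, 20, 40], 3)             # µg/mL, assumed design
true_viab = 1.0 / (1.0 + (dose / 8.0) ** 1.5)
viab = true_viab + rng.normal(0, 0.05 * (0.5 + true_viab))  # dose-dependent noise

def log_post(theta):
    log_ec50, log_s, log_sig = theta
    ec50, s, sig = np.exp(log_ec50), np.exp(log_s), np.exp(log_sig)
    mu = 1.0 / (1.0 + (dose / ec50) ** s)
    loglik = -0.5 * np.sum(((viab - mu) / sig) ** 2) - viab.size * np.log(sig)
    logprior = -0.5 * ((log_ec50 - np.log(10)) ** 2
                       + log_s ** 2 + (log_sig + 3) ** 2)    # weak normal priors
    return loglik + logprior

# Random-walk Metropolis sampler
theta = np.array([np.log(10.0), 0.0, np.log(0.1)])
lp = log_post(theta)
samples = []
for i in range(20_000):
    prop = theta + rng.normal(0, 0.08, size=3)
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    if i >= 5_000:                           # discard burn-in
        samples.append(theta.copy())
samples = np.array(samples)

ec50, s = np.exp(samples[:, 0]), np.exp(samples[:, 1])
bmd = ec50 * (1.0 / 9.0) ** (1.0 / s)        # dose giving a 10% drop in viability
print("EC50 95% credible interval:", np.percentile(ec50, [2.5, 97.5]))
print("BMD10 (BMDL, BMDU):", np.percentile(bmd, [2.5, 97.5]))
```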

Journal Article

Abstract  A probabilistic, evidence-based, quantitative microbial risk assessment model was developed to investigate public health risks associated with Campylobacter spp. contamination in broiler chicken supply systems in the United States, covering a farm-to-fork continuum. The objective of this study was to evaluate the effectiveness of intervention strategies in processing plants to protect the safety of chicken consumption and associated consumer health. A baseline model was constructed based on most common industrial practices with minimal interventions for model development and validation purposes. Sensitivity analysis was conducted to determine the most important input parameters for the model and identify critical control points along the supply chain. The effectiveness of possible intervention measures applicable during processing, including alternative processing strategies, chemical processing aids and physical methods to reduce risks of Campylobacter contamination in broiler chicken and campylobacteriosis among consumers were compared using scenario analysis. Input parameter distributions for the model were populated by the results of a previous systematic review and meta-analysis study, rather than evidence collected from the literature by convenience, to reduce possible bias in risk estimations. The final risk estimate was expressed as the number of campylobacteriosis cases per 100,000 persons per year and the intervention effectiveness was expressed as the relative change in campylobacteriosis risk if an intervention had been implemented compared with the baseline. The model estimated an occurrence of 274 (95% CI: 0–561) cases per 100,000 persons per year for baseline. Consumers' food safety practices and operations at processing plants are among the most significant factors to be targeted for reduction in consumers’ exposure to Campylobacter through broiler consumption. Scenario analysis results indicate that chemical processing aids (individually or in tandem) can offer significant reduction in risk estimates. The model is expected to provide a framework for risk managers making risk-based decisions on changes in current poultry processing practices or implementation of alternative intervention strategies.

Journal Article

Abstract  Frameworks for deriving occupational exposure limits (OELs) and OEL-analogue values (such as derived-no-effect levels [DNELs]) in various regulatory areas in the EU and at national level in Germany were analysed. Reasons for differences between frameworks and possible means of improving transparency and harmonisation were identified. Differences between assessment factors used for deriving exposure limits proved to be one important reason for diverging numerical values. Distributions for exposure time, interspecies and intraspecies extrapolation were combined by probabilistic methods and compared with default values of assessment factors used in the various OEL frameworks in order to investigate protection levels. In a subchronic inhalation study showing local effects in the respiratory tract, the probability that assessment factors were sufficiently high to protect 99% and 95% of the target population (workers) from adverse effects varied considerably from 9% to 71% and 17% to 87%, respectively, between the frameworks. All steps of the derivation process, including the uncertainty associated with the point of departure (POD), were further analysed with two examples of full probabilistic assessments. It is proposed that benchmark modelling should be the method of choice for deriving PODs and that all OEL frameworks should provide detailed guidance documents and clearly define their protection goals by stating the proportion of the exposed population the OEL aims to cover and the probability with which they intend to provide protection from adverse effects. Harmonisation can be achieved by agreeing on the way to perform the methodological steps for deriving OELs and on common protection goals.

Journal Article

Abstract  The rapid progress of AI impacts diverse scientific disciplines, including toxicology, and has the potential to transform chemical safety evaluation. Toxicology has evolved from an empirical science focused on observing apical outcomes of chemical exposure, to a data-rich field ripe for AI integration. The volume, variety and velocity of toxicological data from legacy studies, literature, high-throughput assays, sensor technologies and omics approaches create opportunities but also complexities that AI can help address. In particular, machine learning is well suited to handle and integrate large, heterogeneous datasets that are both structured and unstructured, a key challenge in modern toxicology. AI methods like deep neural networks, large language models, and natural language processing have successfully predicted toxicity endpoints, analyzed high-throughput data, extracted facts from literature, and generated synthetic data. Beyond automating data capture, analysis, and prediction, AI techniques show promise for accelerating quantitative risk assessment by providing probabilistic outputs to capture uncertainties. AI also enables explanation methods to unravel mechanisms and increase trust in modeled predictions. However, issues like model interpretability, data biases, and transparency currently limit regulatory endorsement of AI. Multidisciplinary collaboration is needed to ensure development of interpretable, robust, and human-centered AI systems. Rather than just automating human tasks at scale, transformative AI can catalyze innovation in how evidence is gathered, data are generated, hypotheses are formed and tested, and tasks are performed to usher in new paradigms in chemical safety assessment. Used judiciously, AI has immense potential to advance toxicology into a more predictive, mechanism-based, and evidence-integrated scientific discipline to better safeguard human and environmental wellbeing across diverse populations.

Journal Article

Abstract  High-throughput toxicogenomics as an advanced toolbox of Tox21 plays an increasingly important role in facilitating the toxicity assessment of environmental chemicals. However, toxicogenomic dose-response analyses are typically challenged by limited data, which may result in significant uncertainties in parameter and benchmark dose (BMD) estimation. Integrating historical data via prior distributions using a Bayesian method is a useful but not-well-studied strategy. The objective of this study is to evaluate the effectiveness of informative priors in genomic dose-response modeling and BMD estimation. Specifically, we aim to identify plausible informative priors and evaluate their effects on BMD estimates at both gene and pathway levels. A general informative prior and eight time-specific (from 3 h to 29 d) informative priors for seven commonly used continuous dose-response models were derived. Results suggest that the derived informative priors are sensitive to the specific data sets used for elicitation. Real data-based simulations indicate that BMD estimation with the time-specific informative priors can achieve increased or equivalent accuracy, significantly decreased uncertainty, and a slightly enhanced correlation with the points of departure estimated from apical end points, compared with the counterparts with noninformative priors. Overall, our study systematically examined the effects of historical data-based informative priors on BMD estimates, highlighting the benefits of plausible informative priors in advancing the practice of toxicogenomics.
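
A toy example of the underlying effect, unrelated to the study's actual models: with only a few observations, an informative prior on a dose-response slope visibly narrows the posterior relative to a vague prior (conjugate normal model with known noise, closed-form update).

```python
# Toy sketch (not the study's models): with limited data, an informative prior
# on a dose-response slope narrows the posterior, shown here in closed form
# for a conjugate normal model with known observation noise.
import numpy as np

dose = np.array([0.0, 0.5, 1.0, 2.0])         # assumed design, few observations
resp = np.array([0.1, 0.6, 1.1, 1.9])         # hypothetical responses
sigma = 0.5                                    # assumed known noise sd

def posterior_slope(prior_mean, prior_sd):
    # Conjugate update for resp = slope * dose + noise
    prec = 1.0 / prior_sd**2 + np.sum(dose**2) / sigma**2
    mean = (prior_mean / prior_sd**2 + np.sum(dose * resp) / sigma**2) / prec
    return mean, np.sqrt(1.0 / prec)

for label, (m0, s0) in {"noninformative": (0.0, 100.0),
                        "informative (historical data)": (1.0, 0.3)}.items():
    mean, sd = posterior_slope(m0, s0)
    print(f"{label:32s} posterior slope = {mean:.2f} +/- {sd:.2f}")
```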

Journal Article

Abstract  Valley Fever is a respiratory disease caused by inhalation of arthroconidia, a type of spore produced by fungi within the genus Coccidioides, which are found in dry, hot ecosystems of the Western Hemisphere. A quantitative microbial risk assessment (QMRA) for the disease has not yet been performed due to a lack of dose-response models and a scarcity of quantitative occurrence data from environmental samples. A literature review was performed to gather data on experimental animal dosing studies, environmental occurrence, human disease outbreaks, and meteorological associations. As a result, a risk framework is presented with information for parameterizing QMRA models for Coccidioides spp., with eight new dose-response models proposed. A probabilistic QMRA was conducted for a Southwestern US agricultural case study, evaluating eight scenarios related to farming occupational exposures. Median daily workday risks for developing severe Valley Fever ranged from 2.53 × 10⁻⁷ (planting by hand while wearing an N95 facemask) to 1.33 × 10⁻³ (machine harvesting while not wearing a facemask). The literature review and QMRA synthesis confirmed that exposure to aerosolized arthroconidia has the potential to result in high attack rates but highlighted that the mechanistic relationships between environmental conditions and disease remain poorly understood. Recommendations for Valley Fever risk assessment research needed to reduce disease risks are discussed, including interventions for farmers.
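
A simplified sketch of such a QMRA calculation, using an exponential dose-response model with hypothetical parameter values (not the paper's eight fitted models) and Monte Carlo sampling of the inhaled dose:

```python
# Illustrative sketch with hypothetical parameter values (not the paper's fitted
# ones): a probabilistic QMRA using an exponential dose-response model for
# inhaled arthroconidia, with Monte Carlo sampling of the daily workday dose.
import numpy as np

rng = np.random.default_rng(13)
n = 100_000

# Inhaled dose (spores/workday) = air concentration x breathing rate x duration
conc      = rng.lognormal(np.log(0.05), 1.0, n)   # spores per m^3, assumed
breathing = rng.normal(1.6, 0.2, n)               # m^3 per hour, heavy work, assumed
hours     = 8.0
dose = conc * breathing * hours

k = 0.01   # hypothetical exponential dose-response parameter
p_illness = 1.0 - np.exp(-k * dose)

print("median daily risk:", np.median(p_illness))
print("95th percentile daily risk:", np.percentile(p_illness, 95))
# Interventions (e.g., an N95 respirator) can be explored by scaling the dose:
p_with_mask = 1.0 - np.exp(-k * dose * 0.05)      # assumed 95% dose reduction
print("median daily risk with respirator:", np.median(p_with_mask))
```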

Journal Article

Abstract  Toxicologists are often concerned with determining the dosage to which an individual can be exposed with an acceptable risk of adverse effect. These types of studies have been conducted widely in the past, and many novel approaches have been developed. Parametric techniques utilizing ANOVA and non-linear regression models are well represented in the literature. The biggest drawback of parametric approaches is the need to specify an adequate model. Recently, there has been an interest in nonparametric approaches to tolerable dosage estimation. In this work, we focus on the monotonically decreasing dose-response model where the response is a percent of control. This imposes two constraints on the nonparametric approach: the dose-response function must be monotonic and always positive. Here, we propose a Bayesian solution to this problem using a novel class of non-parametric models. The set of basis functions developed in this research is the Alamri Monotonic spline (AM-spline) basis. Our approach is illustrated using two simulated datasets and two experimental datasets from pesticide-related research at the US Environmental Protection Agency.
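
For intuition, the sketch below fits a monotone, strictly positive dose-response curve to synthetic percent-of-control data using a hinge basis with sign-constrained coefficients on the log scale; this is a simplified frequentist stand-in for the general idea, not the Bayesian AM-spline construction of the paper.

```python
# Minimal sketch (synthetic data, simplified basis), not the paper's AM-spline:
# a monotonically nonincreasing and strictly positive dose-response fit obtained
# by constraining hinge-basis coefficients on the log scale.
import numpy as np
from scipy.optimize import lsq_linear

rng = np.random.default_rng(2)
dose = np.linspace(0, 100, 25)
resp = 100 * np.exp(-0.03 * dose) + rng.normal(0, 3, dose.size)   # percent of control
resp = np.clip(resp, 1e-3, None)

knots = np.linspace(0, 100, 8)[1:-1]                 # interior knots, assumed
X = np.column_stack([np.ones_like(dose)] + [np.maximum(dose - k, 0) for k in knots])

# log(response) = b0 - sum_j b_j * (dose - knot_j)_+  with b_j >= 0 guarantees a
# monotonically nonincreasing and strictly positive fitted curve.
signs = np.r_[1.0, -np.ones(knots.size)]             # flip signs so coefficients are >= 0
lb = np.r_[-np.inf, np.zeros(knots.size)]
ub = np.full(knots.size + 1, np.inf)
fit = lsq_linear(X * signs, np.log(resp), bounds=(lb, ub))

fitted = np.exp((X * signs) @ fit.x)
print("fitted response at selected doses:", np.round(fitted[::6], 2))
```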

Journal Article

Abstract  2,2',6-Tribromobisphenol A (Tri-BBPA), the main debrominated congener of tetrabromobisphenol A (TBBPA), is ubiquitous in the environment and the human body but has unknown toxicity. Tri-BBPA was synthesized and applied to investigate its sub-chronic exposure effects on 28 organ coefficients and clinical health indicators related to liver function, kidney function, and cardiovascular system function in female mice. Results showed that the liver was the target organ of Tri-BBPA exposure. Compared to the control group, the changes in liver coefficient, cholinesterase, total protein, albumin, gamma-glutamyl transpeptidase, lactate dehydrogenase, and creatine kinase levels ranged from -61.2% to 35.5% in the high-exposed group. Creatine kinase was identified as a critical effect indicator of Tri-BBPA exposure. Using the Bayesian benchmark dose derivation method, a lower reference dose than that of TBBPA was established for Tri-BBPA (10.6 μg/kg-day). Serum metabolomics revealed that Tri-BBPA exposure may primarily damage the liver by disrupting tryptophan metabolism related to L-alanine, tryptamine, 5-hydroxyindoleacetic acid, and 5-methoxyindoleacetate in liver cells, leading to liver dysfunction. Notably, epilepsy, schizophrenia, early preeclampsia, and late-onset preeclampsia were among the top six enriched diseases, suggesting that the nervous system may be particularly affected by Tri-BBPA exposure. Our findings hinted at a non-negligible health risk of exposure to debrominated products of TBBPA.
