In pharmacology, bioavailability (BA or F) is a subcategory of absorption: the fraction of an administered dose of a drug that reaches the systemic circulation. Bioavailability is one of the essential tools of pharmacokinetics, as it must be considered when calculating dosages for non-intravenous routes of administration. Bioavailability is also an important process controlling chemical uptake from environmental media, and oral bioavailability is particularly important in drug lead optimization.
Medium-specific default bioavailability values are available from EPA for lead, cadmium, and manganese. If risks predicted using the default bioavailability assumptions are below a level of concern at all parts of the site, and if there is no reason to believe the default bioavailability value has been substantially underestimated, it is generally appropriate to conclude that no further investigation of site-specific bioavailability is needed, although the information may be useful for characterizing uncertainty.
This conclusion, however, is predicated on the assumption that the default bioavailability value is an upper-bound, health-protective estimate. Hence, it is possible that the actual RBA at the site could be higher or lower than the default. If there is reason to believe that the lead RBA at a site might be substantially higher than the default value, further evaluation is warranted.
The assessment could proceed to Step 2. [Decision-framework flowchart: Step 2 asks whether the default BA is adequately protective; if yes, collection of site-specific BA data is not recommended. Step 3c asks whether the added value exceeds the costs of obtaining the bioavailability data.]
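To make concrete how the default bioavailability assumption propagates into a screening-level risk estimate, here is a minimal Python sketch of a generic soil-ingestion dose equation. The function name, parameter names, and default values are illustrative assumptions for this sketch, not values prescribed by the guidance itself.

```python
def soil_ingestion_dose(
    c_soil_mg_kg,         # metal concentration in soil (mg/kg)
    rba,                  # relative bioavailability adjustment (unitless, 0-1)
    ir_soil_mg_day=100,   # soil ingestion rate (mg/day) -- illustrative value
    ef_days_yr=350,       # exposure frequency (days/year) -- illustrative value
    ed_years=6,           # exposure duration (years) -- illustrative value
    bw_kg=15,             # body weight (kg) -- illustrative value
    at_days=6 * 365,      # averaging time (days) -- illustrative value
):
    """Average daily dose (mg/kg-day) from incidental soil ingestion,
    scaled by the relative bioavailability of the metal in soil."""
    mg_soil_to_kg = 1e-6  # ingestion rate is in mg of soil; concentration is per kg
    intake = c_soil_mg_kg * ir_soil_mg_day * mg_soil_to_kg * ef_days_yr * ed_years
    return (intake * rba) / (bw_kg * at_days)

# Screening with a default RBA vs. a hypothetical lower site-specific value:
dose_default = soil_ingestion_dose(400.0, rba=0.6)
dose_site = soil_ingestion_dose(400.0, rba=0.3)
```

Because the RBA enters the equation as a simple multiplier, halving the assumed bioavailability halves the estimated dose, which is why the choice between a default and a site-specific value can determine whether predicted risks fall above or below a level of concern.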
This is not intended to address the collection of samples for research. This is important to help avoid substantial delays that might arise from a delayed or late decision to collect additional site-specific information in support of a site-specific bioavailability assessment.
It should be apparent very early in the risk assessment process whether any metals may be risk drivers and where additional bioavailability information could have a significant impact. Determine whether or not EPA has identified one or more validated methods for estimating site-specific bioavailability.
Step 2 of the recommended procedure evaluates whether or not EPA has identified one or more validated methodologies for estimating the absolute or relative bioavailability of the metal of concern at the site. ICCVAM has developed validation criteria and regulatory acceptance criteria for test methods used to generate information to support regulatory decisions, and the Agency believes that these regulatory validation approaches are generally applicable to the assessment of bioavailability methods.
Validation can be achieved by demonstrating that a method is reliable and relevant for its proposed use, while regulatory acceptance can be accomplished when a regulatory authority formally accepts the method for its intended use. If EPA has not identified a validated methodology, we recommend that further pursuit of site-specific values generally not proceed without the development and validation of a suitable method. The latter efforts usually would not be undertaken as part of site-specific risk assessment efforts, but original research on the development of alternative bioavailability methods is encouraged, where resources are available.
Evaluate the costs and potential value added by obtaining the data. Estimate the costs. In this recommended step, information is collected on the cost, including both time and money, that would be required to obtain reliable site-specific bioavailability data.
This should include the level of effort that would be needed to plan for and collect appropriate site samples for analysis, the time and cost of performing the bioavailability measurements using the validated method(s), and the effort needed to summarize, evaluate, and apply the results to the risk assessment process. Estimate the potential value added by obtaining the data. Estimate the range of bioavailability values that are plausible.
In this recommended step, information should be assembled from the site under consideration, or from other similar sites, that may be useful in judging whether the bioavailability of the metal in soil at the site could be substantially different from the default value used in the screening-level calculations performed in Step 1.
Examples of the types of information that may be relevant include data derived from measurements made at the site or from knowledge about the sources of soil contamination. In general, these would include the organic content and the nature of the organic fraction of the soil. The specific types of information that would be relevant for a particular metal of concern should be assessed from the available scientific literature.
A recent review of these topics can be found in NRC. Based on the available information, the range of bioavailability values that might be plausible at the site should be estimated. These estimates may be based, in large part, on observations at other sites and on professional judgment applied to extrapolations to the site of interest. The objective is to provide plausible bounds on the absolute or relative bioavailability of the metal at the site, which may then be used in estimating costs and potential value added by collecting site-specific bioavailability data.
Estimate the added value. Step 3b2 of the recommended decision framework estimates the added value that might be realized if reliable site-specific bioavailability data were obtained. For example, cost savings could be realized if the site-specific bioavailability values were in the lower part of the plausible range. This estimation could be accomplished by first using the plausible range of bioavailability values to estimate the current and potential future human health risk.
Then one could determine the extent of the site soils that would fall above a level of concern using the default bioavailability assumption, and compare that to the area that would be above a level of concern based on the potential alternative lower assumed value.
The difference in areas of concern is then multiplied by the estimated cost of remediation per unit area, and the result is a crude estimate of the potential cost savings from reduced remediation. Collection of site-specific bioavailability data could also provide additional value through improved confidence in the estimate and enhanced information for risk communication.
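The crude cost comparison described above reduces to a few lines of arithmetic. The sketch below is illustrative only; the function name and inputs are hypothetical, and it deliberately omits the time component of cost, which the guidance treats qualitatively.

```python
def net_value_of_ba_data(
    area_default,               # area above level of concern under default BA (m^2)
    area_site,                  # area above level of concern under the lower plausible value (m^2)
    remediation_cost_per_area,  # estimated remediation cost per unit area ($/m^2)
    data_collection_cost,       # cost of obtaining site-specific BA data ($)
):
    """Crude estimate of the net value of collecting site-specific
    bioavailability data: reduced-remediation savings minus data costs."""
    gross_savings = (area_default - area_site) * remediation_cost_per_area
    return gross_savings - data_collection_cost

# Hypothetical large site: even a modest reduction in the area of concern
# can outweigh the cost of the bioavailability study.
net = net_value_of_ba_data(50_000, 46_000, 100, 250_000)
```

A positive result suggests data collection may be worthwhile on cost grounds alone; a negative result does not end the analysis, since improved confidence and better risk communication also carry value.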
In the final part of recommended Step 3, the estimated costs (time and money) of obtaining the data are compared to the added value that may be realized, and a decision is reached based on the cost comparison. For example, at a site where the area of concern based on default bioavailability assumptions is relatively small, the cost of cleanup might be the same as or less than the cost of obtaining the data.
Conversely, at a large site, the potential cost savings might outweigh the cost of data collection if collection of site-specific bioavailability data resulted in even a small decrease in the extent of the site determined to be above a level of concern.
In the absence of cost savings, the value of continuing with the bioavailability study may still be worth the added expense, for example through improved confidence in the risk estimates or enhanced information for risk communication. It is also important to consider whether additional data collection activities can be completed within an adequate time frame.
Depending upon the type of information needed, data collection could take a few weeks to several months. If additional collection of site-specific bioavailability data is not feasible, either due to resource or schedule constraints, then the plausible range of bioavailability values and their potential impacts on risk estimates should be discussed in the uncertainty section of the human health risk assessment.
However, if the cost comparison and feasibility evaluation support collection and analysis of additional data, then the assessment could proceed.
Document site-specific implementation of the validated method. In this recommended step, a site-specific risk assessment should document (1) the rationale for use of the selected validated method at the site; (2) the basis for the selection of soil samples assayed for the purpose of predicting bioavailability at each area of concern; and (3) the approaches (conceptual and quantitative) used to integrate the site bioavailability information into the risk characterization. We also recommend that the risk assessment document the basis for selecting the appropriate sample size needed to ensure that the bioavailability assay yields a reliable estimate of absolute or relative bioavailability.
The first part of the site-specific documentation addresses the rationale for use of the selected validated method. This site-specific documentation should satisfy the data quality objectives and the validation criteria for methodology acceptance.
The site-specific documentation should also summarize the pertinent results of these evaluations and why they support the use of the method for the assessment of site-specific bioavailability. Limitations of the selected method for the intended application, in comparison to alternatives, should be documented as well. The second part of the site-specific documentation should address the approach used to translate the results from bioavailability assays into estimates of absolute or relative bioavailability of the metal in the receptors of concern at the site.
For example, if statistical transformations of the data, such as regression models, were used in translating the data output from the methodology into bioavailability estimates, these statistical models should be documented (see U.S. EPA for an example of a regression model applied to the output of an in vitro solubility assay for lead).
The third part of the documentation should address the selection and procurement of samples that allow prediction of bioavailability at each area of concern. The ultimate goal of the bioavailability assessment is to arrive at a bioavailability adjustment(s) that can be applied to risk estimations for all or part of the site.
In some cases, the bioavailability of the metal of concern may be similar across the entire site, and a single sample (usually a composite sample) may be adequate for derivation of a site-specific RBA estimate. In other cases, the bioavailability of the metal of concern may vary within or between sub-areas of the site due to differences in soil characteristics, metal concentrations, form of metal, aging, land use, or other factors.
In these cases, bioavailability should be assessed in representative samples collected from each sub-area of potential concern. In all cases, the documentation for the selection of samples to be assessed should address the adequacy of the sample size and sample locations for assessing both within-area and between-area variability, and explain how the estimates of variability will be integrated into the bioavailability assessment at each area.
The TRW is available for consultation and review of site-specific implementation plans as needed. For additional information on sampling, see U.S. EPA guidance.
Collect soil samples and assess bioavailability. Step 5 of the recommended process is the collection of the soil samples and measurement of bioavailability in those samples using the selected methodology. Sample collection, laboratory procedures, data handling, and archiving should be consistent with Agency guidance for data quality objectives and assurance (U.S. EPA). In the case that a validated in vitro method is used to estimate bioavailability, it is recommended that the protocol specified in the methodology be followed for making the extrapolation from in vitro data to in vivo values.
That is, there is no a priori assumption that all validated in vitro methods must yield results that are identical to in vivo values. Rather, it is assumed that a mathematical equation will exist such that the in vitro result entered as input will yield an estimate of the in vivo value as output. In general, the mathematical equation that links in vitro results to in vivo results will yield an estimate of the expected average value of the in vivo bioavailability.
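As a concrete sketch of such an equation, a linear regression of the kind EPA has published for lead (in vivo RBA predicted from an in vitro bioaccessibility, or IVBA, result) can be written as follows. The slope and intercept below are hypothetical placeholders, not the coefficients of any validated method; the actual regression must be taken from the validated protocol itself.

```python
def predicted_rba(ivba_fraction, slope=0.88, intercept=-0.03):
    """Predict the expected average in vivo RBA from an in vitro
    bioaccessibility result. slope and intercept are hypothetical
    illustrative values, not coefficients from a validated method."""
    if not 0.0 <= ivba_fraction <= 1.0:
        raise ValueError("IVBA must be expressed as a fraction between 0 and 1")
    rba = slope * ivba_fraction + intercept
    return min(max(rba, 0.0), 1.0)  # clamp to the physically meaningful range

rba_estimate = predicted_rba(0.7)
```

Note that the regression returns an expected average, not a bound, which is why the surrounding text cautions that the true in vivo value may be lower or higher than the prediction.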
Thus, the true in vivo bioavailability value may be either lower or higher than the best estimate predicted from the in vitro value. Risk assessors and risk managers should exercise their judgment in deciding whether to use the average value, a range of values, or a conservative point estimate when applying the results to the risk assessment. Step 6: Integrate results of bioavailability estimates into risk characterization. In Step 6 of the recommended procedure, the results of the site-specific bioavailability assessment should be incorporated into the characterization of the site risks.
This approach is consistent with other EPA risk assessment guidance (U.S. EPA), which recommends that, in general, reliable site-specific parameter values are preferred over default values that may not represent site-specific conditions. The uncertainty assessment section of the risk characterization should discuss the basis for confidence in the site-specific estimates of bioavailability, the limitations in the estimates, and any issues related to extrapolating these values over time.
The uncertainty assessment should also provide at least a qualitative, but preferably a quantitative, assessment of uncertainty in the site-specific bioavailability estimates, as well as the potential impacts of this uncertainty on the risk characterization. A final source factor of importance is the size of the ingested load. This may make relatively little difference for certain nutrients but can be very important for others. For example, the absorption fraction for calcium varies inversely with the logarithm of the load size (Heaney).
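The inverse-log relationship for calcium can be illustrated with a toy model. The coefficients below are hypothetical, chosen only to show the shape of the relationship; they are not Heaney's fitted values.

```python
import math

def calcium_absorption_fraction(load_mg, a=0.60, b=0.055):
    """Toy inverse-log model: the absorption fraction declines linearly
    with the natural logarithm of the ingested load. a and b are
    hypothetical coefficients, not fitted values from the literature."""
    if load_mg <= 0:
        raise ValueError("load must be positive")
    return max(a - b * math.log(load_mg), 0.0)

# In this model, each tenfold increase in load lowers the absorption
# fraction by the same fixed amount (b * ln 10):
frac_50 = calcium_absorption_fraction(50)
frac_500 = calcium_absorption_fraction(500)
```

One practical consequence of this shape is that total absorbed calcium still rises with load size, but less than proportionally, so the same daily intake taken in divided doses is absorbed somewhat more completely than a single large dose.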
Subject factors have limited relevance to the pharmaceutical or supplement manufacturer because they are uncontrollable. However, knowledge of subject factors is important in interpreting, for example, age-related changes in apparent bioavailability, as well as in reconciling results from seemingly disparate studies.
They may also be relevant in formulating niche products, targeted, for example, to the elderly or to pregnant women, etc. One of the more important of these subject factors is mucosal mass. Although this variable cannot currently be measured in intact humans, its effect on absorptive performance is a well-demonstrated phenomenon in experimental animals and is seen for essentially all nutrients, both poorly and well absorbed.
Also important are intestinal transit time and the rate of gastric emptying (Barger-Lux et al.). Another factor, well understood if not often acknowledged, is the up- and down-regulation of absorption by physiological controls, reflecting the experience of the subject with the nutrient concerned.
For example, the absorption fraction for calcium will tend to be lower for individuals on high calcium intakes than for those on low intakes. One consequence is that the absorption fraction observed in a single-meal test in an individual with a low calcium intake cannot be extrapolated to what would happen in the same individual taking the supplement regularly, under which circumstances absorption may be down-regulated.
A related factor, also generally well recognized for nutrients, is the nutritional status of the subject being tested with respect to the nutrient concerned, noted earlier. Thus, absorption of calcium and iron will be greater in individuals who are deficient in these minerals than in individuals who are replete.
Important coingested factors include anti-absorbers in other foods ingested at the same meal. Thus, as has been well described, phytic acid in whole grain cereals may interfere with iron and zinc absorption, and wheat bran with calcium absorption (Weaver et al.).
On the other side of this issue, some substances enhance absorption, as seen in the effect of ascorbic acid on iron absorption. Further, there is the enhancing effect of the meal itself (Heaney et al.). The effect is probably a composite of prolonged gastric emptying from a meal source (as contrasted with the dumping that may occur with a supplement tablet taken on an empty stomach), as well as interactions between food macromolecules and calcium particles in ways that enhance the presentation of the calcium to the absorptive surface.
Once again, although the effect is well established for several nutrients, the precise mechanism remains unclear. Finally, there is competition for limited absorptive transport capacity with other chemically similar substances, for example, the well-studied competition between calcium and strontium, as well as the very large and beneficial interference by calcium with lead absorption.
Such a concern is a kind of quality assurance issue, and measurement of bioavailability serves to demonstrate that the product does at least a part of what it purports to do. As such, every responsible supplement manufacturer and food fortifier ought to assume the burden of demonstrating that the respective product exhibits appropriate bioavailability.
But there is a second bioavailability issue, particularly for poorly absorbed nutrients such as calcium, and that is the pursuit of a kind of holy grail of enhanced bioavailability. This quest stands behind both the usually exaggerated marketing claims of superior performance for one salt or one formulation relative to another and the search within the industry for additives that might enhance the absorption of calcium, thereby conferring, it is assumed, a market advantage on the product concerned.
In general, this emphasis seems inappropriate and misdirected from both cost benefit and nutritional considerations. Only when products exhibit very large differences in absorbability or are priced about the same will the cost-benefit analysis reveal that the better absorbed product is actually a better bargain.
In the final analysis, the simplest and cheapest way to absorb more calcium is to ingest more calcium. Also, nutritionally, there seems very little advantage to improving absorbability, because unabsorbed calcium exhibits valuable functionality in its own right. Calcium remaining in the food residue forms complexes with harmful substances left over from digestion, such as oxalic acid, unabsorbed fatty acids and bile acids.
This complexation is the mechanism by which high calcium diets reduce the risk of kidney stones and colon cancer. Calcium phosphate has been shown to be more efficacious at preventing colon cancer in animal models than the same amount of calcium as the carbonate (Lupton). Theoretically, sources with high intrinsic absorbability, ingested at low load sizes, could meet the body's skeletal needs for calcium, but they would leave unmet the detoxification function that unabsorbed calcium serves within the intestinal lumen itself.
In brief, there is little or no nutritional advantage to ingesting one's calcium in a form with absorbability higher than that of natural calcium sources. Many factors can influence both the actual absorbability of nutrient sources and the endpoints by which it is measured.
With respect to actual bioavailability, the formulation by which calcium is added to the diet, either pharmaceutical or food, may be the most important controllable factor and also the one producing the greatest effect. At the existing state of understanding of the chemistry of the chyme and of the mechanisms of the absorption process, predicting bioavailability is chancy, and there is today no substitute for direct bioavailability testing.
Finally, ultimate bioavailability of a nutrient source can only be known when testing is performed under fully adapted conditions. This latter point is not applicable to the demonstration of product quality or bioequivalence, but it is important for the understanding of the impact of supplement use on the nutritional status of a population.
Abstract: For non-metabolizable supplemental nutrients, bioavailability is effectively equivalent to absorbability.
References:
Barger-Lux et al. An investigation of sources of variation in calcium absorption efficiency.
The skeleton as an ion exchange system.
Bo-Linn et al. An evaluation of the importance of gastric acid secretion in the absorption of dietary calcium.
Carr and Shangraw.
DeGrazia et al. A double isotope method for measurement of intestinal absorption of calcium in man.
Halleux and Schneider. Iron absorption by intestinal epithelial cells: CaCo2 cells cultivated in serum-free medium, on polyethylene terephthalate microporous membranes, as an in vitro model.
Heaney et al. Absorption of calcium as the carbonate and citrate salts, with some observations on method.
Heaney et al. Bioavailability of the calcium in fortified soy imitation milk, with some observations on method.
Heaney and Recker.
Heaney et al. a.
Heaney et al. b.

In the context of environmental risk assessment, relative bioavailability is the ratio of the absorbed fraction from the exposure medium in the risk assessment (e.g., soil) to the absorbed fraction from the dosing medium used in the toxicity study. Bioaccumulation is the total accumulation of contaminants in the tissue of an organism through any route, such as food items as well as from the dissolved phase in water.
Bioconcentration is accumulation of a chemical directly from the dissolved phase through the gills and epithelial tissues of an aquatic organism. Biomagnification is the process by which bioaccumulation causes an increase in tissue concentrations from one trophic level to the next from food to consumer. Bioavailable fraction is that portion of the bulk concentration that is available to be accumulated into an organism under a defined set of conditions. For instance, for a metal it could be the freely dissolved ion of the metal.
Other forms of the metal, bound in precipitates or covalently or hydrogen-bonded to other ions, would not be available.
The available fraction is a proportion ranging from 0 to 1. The available fraction determines the reactive portion of the total mass of material, much like the activity coefficient relates activity to concentration. Bioaccessibility describes the fraction of the chemical that desorbs from its matrix (e.g., soil or sediment). The bioaccessible fraction is not necessarily equal to the RAF or RBA but depends on the relation between results from a particular in vitro test system and an appropriate in vivo model.
Relative absorption factor (RAF) describes the ratio of the absorbed fraction of a substance from a particular exposure medium relative to the fraction absorbed from the dosing vehicle used in the toxicity study for that substance (the term relative bioavailability adjustment, RBA, is also used to describe this factor).
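The RAF/RBA definition above reduces to a one-line ratio; this sketch merely makes the arithmetic and the guard against a zero denominator explicit. The function name and the numbers in the example are hypothetical.

```python
def relative_absorption_factor(af_exposure_medium, af_dosing_vehicle):
    """RAF (also called RBA): absorbed fraction from the exposure medium
    (e.g., soil) divided by the absorbed fraction from the dosing vehicle
    used in the underlying toxicity study."""
    if af_dosing_vehicle <= 0.0:
        raise ValueError("absorbed fraction from the dosing vehicle must be > 0")
    return af_exposure_medium / af_dosing_vehicle

# If 30% of a metal is absorbed from site soil but 50% was absorbed from
# the dosing vehicle in the critical study, the RAF is 0.6:
raf = relative_absorption_factor(0.30, 0.50)
```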
Absorption describes the transfer of a chemical across the biological membrane into the blood circulation. Biostabilization refers to the biodegradation of the more labile HOC (hydrophobic organic compound) fraction, leaving a residual that is much less available and mobile. Another view of bioavailability is represented by a chemical crossing a cell membrane, entering a cell, and becoming available at a site of biological activity.
Others might think of bioavailability more specifically in terms of contaminant binding to or release from a solid phase. Figure is a depiction of bioavailability processes in soil or sediment; it incorporates exposure by release of solid-bound contaminant and subsequent transport, direct contact of a bound contaminant, uptake by passage through a membrane, and incorporation into an organism.
This may include geological processes like weathering and scouring, chemical processes like redox reactions or complexation, and biochemical processes through the action of biosurfactants or hydrolytic enzymes. Binding may occur by adsorption on solid surfaces, by absorption within a phase like natural organic matter, or by a change in form as in covalent bonding. Transport may result from diffusion and advection to target receptors such as microbes, plants, and humans. Thus, bioavailability processes A and B comprise exposure via various chemical and biochemical phenomena that affect release and subsequent transport of dissolved contaminants.
It should be noted that processes A, B, and C can occur internal to an organism, such as in the gut lumen, although they are depicted in Figure as occurring in the external environment. The bioavailability process depicted as D in Figure entails movement across membranes. Here the contaminant passes from the external environment through a physiological barrier and into a living system. An example is transport.
Exposure to both dissolved and solid-bound contaminants can lead to chemical interaction with the membrane of an organism and subsequent uptake or absorption (these terms are used synonymously). For example, after passage across a biological membrane, the chemical can exert a toxic effect within a particular tissue, among many possibilities. It should be noted that A, B, and C in Figure are sometimes considered to be fate and transport processes (which they are) rather than bioavailability processes.
On the other hand, process D is more traditionally associated with bioavailability in contemporary risk assessment. Figure makes it clear that soils and sediments can affect exposure in various ways, both external and internal to the organism. For example, solid phases influence the extent of contaminant transfer from one medium to another, thereby determining soluble chemical concentrations. There is also differential uptake of contaminants into animals and plants depending on whether they are solubilized or solid-bound.
Although of great importance in determining the overall effect of a contaminant on an organism, E processes—the toxic action or metabolic effect of a chemical—are not defined as bioavailability processes per se because soil and sediment are no longer a factor.
However, because E processes are often measured endpoints, they are described at length in Chapters 3 and 4. Bioavailability processes have definable characteristics that provide the foundation for this report.
Second, bioavailability processes are quantifiable through the use of multiple tools. Third, bioavailability processes incorporate a number of steps (see Figure), not all of which would be applicable for all compounds or all settings. Thus, bioavailability processes modify the amount of chemical in soil or sediment that is actually taken up and available to cause biological responses.
That soils and sediments can affect chemical interactions with plants and pests has been known for some time by farmers and those involved in agricultural services. However, in the past few decades the phenomenon has gained attention with respect to releases of hazardous chemicals to the environment. First, interest in bioavailability has been driven by a desire to reduce the uncertainties in estimating exposures as part of human and ecological risk assessment. That is, a better understanding of bioavailability processes could help identify sediment- or soil-specific factors that might influence exposure.
A second impetus comes from the remediation of contaminated sites, including observations that the effectiveness of bioremediation and other treatment technologies can be limited by the availability of chemicals in soils or sediments. In some cases, the greatest opportunity for risk reduction may be to treat or contain the bioavailable fraction of the hazardous chemicals in soils and sediments and then to rely on natural attenuation approaches to treat the long-term, slow release of residual contaminants.
Thus, there is considerable interest in setting cleanup goals based on the bioavailable amount rather than the entire contaminant mass. The brief history below acknowledges the varied use of the term and the extent to which bioavailability processes have been considered in different contexts.
For example, pre-Columbian natives in South America were known to extract a powerful muscle-paralyzing agent—curare—from various Strychnos plants. They had no means of knowing that this alkaloid possesses a quaternary nitrogen atom, and that the charge on this nitrogen atom prevents its movement across the gastrointestinal lining. They understood quite well, however, that this poison was harmless when ingested, but very effective when injected.
As a result, they could immobilize prey with curare-tipped arrows, dispatch the prey, and safely eat the meat. From the fifth century BC to the fifteenth century AD, red clay from a specific hill on the Greek island of Lemnos was regarded as a sacred antidote for poisoning (Thompson). Called terra sigillata, it was considered effective against all poisons, no doubt acting as an adsorbent and preventing uptake in the gastrointestinal tract.
The use of charcoal as an adsorbent to reduce the effect of poisons can be traced back to even earlier times, with its mention recorded in an ancient Egyptian papyrus.
In the nineteenth century, when toxicologists had the fortitude to serve as their own experimental subjects, P. Tourney demonstrated the effectiveness of charcoal before the French Academy of Medicine by ingesting ten times the lethal dose of strychnine combined with charcoal, and surviving (Holt and Holz). One of the most fundamental concepts in toxicology is that an adverse effect is dependent upon the dose of the toxic substance, or toxicant, reaching a target organ or tissue.
With the exception of chemicals that react with the organism on contact, such as corrosive agents, the toxicant must be absorbed into the systemic circulation to reach its biological target. From a toxicological perspective then, bioavailability implies movement of a chemical into the systemic circulation because to a large extent this is a good indication of the biologically effective dose. Because the disciplines of toxicology and pharmacology share many basic principles, this is essentially the same way bioavailability has been defined in medicine, except of course that the focus is on the absorption of drugs from dosage forms instead of chemicals from environmental media.
Both toxicologists and medical doctors are cognizant of the importance of events outside the body and that physical—chemical properties of the toxicant or drug and its interactions with its surroundings can affect the rate and extent of absorption. In fact, much of what is termed pharmaceutics involves an understanding of these phenomena as they pertain to drugs and manipulation of drugs and their microenvironment to therapeutic advantage.
Also, toxicologists are well aware that a variety of events in the environment can affect the rate and form in which chemicals are delivered to the body. Nevertheless, the defining aspect of bioavailability, as the term is used in both toxicology and medicine, is the movement of chemical from outside the body into the systemic circulation.
Bioavailability is also an important consideration in nutrition. Here the focus is on absorption of nutrients from the gastrointestinal tract, and the term bioavailability can have different meanings in different situations. For example, nutrients such as amino acids in proteins must be liberated through digestive enzyme activity in the gut.
In this context, bioavailability may become synonymous with digestibility. Other nutrients, such as most vitamins, require metabolic activation in order to have nutritional value. For these substances, bioavailability is sometimes defined to include both absorption and the metabolic activation process.
For still other nutrients that do not require digestion or metabolic activation, bioavailability is regarded simply as the process of absorption of the substance from the gut into the systemic circulation, as in toxicology and medicine. In considering the toxicological use of the term, it is important to recognize that systemic absorption is not necessarily equivalent to general uptake or absorption into the body, particularly from the gastrointestinal tract.
Mammalian anatomy is responsible for this complication. Chemicals absorbed from the gastrointestinal tract enter the hepatic portal circulation and must pass through the liver before reaching the general circulation. The liver and, to some extent, the gastrointestinal epithelium may metabolize the chemical, converting it to substances with greater, lesser, or qualitatively different biological activity. This view of bioavailability, in terms of what reaches the systemic circulation as opposed to what merely crosses a biological membrane, includes both absorption and metabolism components, and components both internal and external to the body.
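The first-pass effect described above can be sketched as a simple mass balance: the fraction reaching the systemic circulation is the fraction absorbed across the gut wall, reduced by whatever the liver removes on first pass. The numeric fractions below are illustrative assumptions, not values from the text.

```python
def systemic_bioavailability(f_absorbed, hepatic_extraction):
    """Fraction of an oral dose reaching the general circulation.

    f_absorbed: fraction crossing the gut wall into portal blood.
    hepatic_extraction: fraction removed by the liver on first pass.
    """
    return f_absorbed * (1.0 - hepatic_extraction)

# Hypothetical example: 80% absorbed from the gut, 50% removed by the
# liver on first pass, so only 40% reaches the systemic circulation.
f = systemic_bioavailability(0.8, 0.5)
print(round(f, 2))  # 0.4
```

The point of the sketch is that a chemical can be well absorbed from the gut yet still have low systemic bioavailability if first-pass metabolism is extensive.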
It can also lead to some ambiguity in how bioavailability is operationally defined for a given chemical. However, in some instances it is important to describe bioavailability in ways that include metabolites, such as when metabolites are formed that contribute significantly to the biological dose of the chemical. This is analogous to the expanded definition of bioavailability in nutrition to include metabolic activation of vitamins.
Regardless of how it is defined, a clear articulation of the basis for the bioavailability determination (with or without metabolites) is required in order to interpret the results. The recognition that the total soil concentration of a compound is not equivalent to the bioavailable or effective concentration is well established in the agricultural sciences.
This is well known not only for plant nutrients but also for water, where physical processes such as water tension or matrix potential control the fraction of total water that is plant-available. Attempts to maximize yields and optimize economic return have resulted in extensive research to describe the behavior of necessary plant nutrients in soil systems.
These have been validated with field trials for multiple crops under varied soil, climate, and moisture regimes. The bioavailable nutrient pool varies significantly by soil type and by plant species (Chaney). This reflects the different complexing capacities of different soil orders as well as different plant mechanisms for accessing soil nutrients (Marschner). Availability can also depend on the source of the nutrient.
For example, nitrogen can be added to soils as manure N, ammoniacal N, nitrate N, and N-P materials; each of these sources will have different release characteristics that vary by soil type, soil moisture, plant growth stage, and soil microbial activity (Pierzynski et al.). The range of factors that affect nutrient availability, and the methods that have been developed to predict effective nutrient concentrations, potentially can be used as a model for the development of appropriate protocols to assess bioavailability processes for contaminants in soils and sediments.
Although the majority of these protocols have been developed to predict the phytoavailability of nutrients under potentially deficient conditions, they bear directly on developing an understanding of the bioavailable fraction of soil contaminants. In many cases, moreover, plants aggressively alter the rhizosphere environment to facilitate nutrient uptake, and in doing so may inadvertently access soil-bound contaminants.
While this research has significantly increased knowledge of bioavailability processes and led to the development of tools to measure the bioavailable fraction, it is not yet at the point where the phytoavailability of nutrients across a range of soils and crops can always be accurately predicted.
Heterogeneity in soil colloids and adsorption surfaces and differences in soil pH, organic matter, and pore spaces preclude the ability to definitively predict the fate of nutrients in soil systems. This is further complicated by differences in uptake efficiencies across plant species. Nonetheless, the factors involved in nutrient uptake may help to clarify the processes that are involved in determining the bioavailability of contaminants in soil systems. The concept of bioavailability also has a history in the application of pesticides, particularly herbicides, to agricultural soils.
As with the uptake of nutrients by plants, the efficacy of an applied herbicide, fungicide, or insecticide depends on a range of soil properties, primarily soil organic matter content and texture. Specific properties of the pesticide will also affect its behavior in the soil system, including the size of the molecule, its structure and functional groups, its polarity, and the resulting dissociation constants and partitioning coefficients. Thus, different application rates are recommended for different soil types and compounds.
In addition, the potential for herbicide residues to damage successive croppings will vary because of changes in the persistence of the compound in different soils. This has been understood and incorporated in product development for several decades (Hance; Bailey and White; Walker et al.). Generally, herbicides must be dissolved in soil solution to be effective. As the soil organic matter concentration or soil clay content increases, the portion of the herbicide that is sorbed also increases (Stevenson). In soils of high organic matter, such as peats, herbicides may be completely ineffective when applied at typical economic rates.
For soils with very low organic matter concentrations, application may not be recommended because too much of the compound may be present in soil solution, increasing the potential for crop damage as well as leaching. Other factors, such as moisture content, soil texture, and timing of rainfall after application, will also affect the efficacy of the compound (Mueller-Warrant). These factors have been sufficiently recognized within the industry that compound labels will generally recommend different application rates based on soil type.
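The partitioning behavior described above can be illustrated with a linear sorption model, in which the distribution coefficient Kd is approximated as Koc times the soil's organic carbon fraction. All parameter values below (the Koc, organic carbon fractions, and soil-to-water ratio) are illustrative assumptions, not values from the text.

```python
def dissolved_fraction(koc, f_oc, soil_g_per_ml=5.0):
    """Fraction of a herbicide in soil solution under linear sorption.

    koc: organic-carbon-normalized partition coefficient (mL/g).
    f_oc: fraction of organic carbon in the soil.
    soil_g_per_ml: soil-to-water ratio (g soil per mL solution).
    """
    kd = koc * f_oc                      # mL/g
    return 1.0 / (1.0 + kd * soil_g_per_ml)

# Higher organic matter means more herbicide sorbed, less in solution:
for f_oc in (0.005, 0.02, 0.10):        # low, moderate, peaty soil
    print(f_oc, round(dissolved_fraction(koc=200.0, f_oc=f_oc), 3))
```

Under these assumed parameters the dissolved fraction drops from roughly 17 percent in a low organic matter soil to about 1 percent in a peaty soil, which is consistent with the text's observation that herbicides can be ineffective in high organic matter soils at typical application rates.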
For example, application rate recommendations for S-metolachlor are based on soil texture and percent organic matter, with recommended rates varying accordingly. Bioavailability is also an issue when dealing with residues of agricultural chemicals applied in the past, particularly insecticides. Owing to the widespread use and economic importance of pesticides, their long-term persistence in soil has been studied for more than half a century.
Methods to assess pesticide concentrations in soil have evolved to recover as much added compound as possible with ever-increasing precision and accuracy. Today there is a debate as to whether analytical methods designed to measure the total concentration adequately reflect the risk from such pesticides. Early evidence showed that pesticides persist in soil for a long time.
Plots were established to study the long-term persistence and rates of disappearance of several chlorinated hydrocarbon insecticides applied to soil, including dieldrin, chlordane, and DDT (Nash and Woolson). Their results showed that 39 percent of the original DDT remained after 17 years. These soil plots gave an upper-limit persistence owing to the amount and means of pesticide addition and the management of the test plots with minimal tillage.
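Assuming simple first-order decay (an assumption that the slowing loss rates described below call into question), the reported 39 percent remaining after 17 years implies a half-life of roughly 12.5 years:

```python
import math

def first_order_half_life(fraction_remaining, years):
    """Half-life implied by first-order decay, C/C0 = exp(-k*t)."""
    k = -math.log(fraction_remaining) / years
    return math.log(2) / k

# 39% of the applied DDT remained after 17 years (Nash and Woolson).
print(round(first_order_half_life(0.39, 17.0), 1))  # 12.5 (years)
```

Because aged residues typically disappear more slowly than a single first-order constant predicts, this figure is best read as a rough lower bound on how long the remainder will persist.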
Alexander arithmetically plotted selected data sets of Nash and Woolson for DDT, heptachlor, and dieldrin, suggesting a gradual decrease in the rate of loss of contaminant mass, with some of the later data points changing little with time. Other evidence for the long-term persistence of DDT and its residues in soil is presented by Boul et al.
Other studies have tried to demonstrate a link between persistence and bioavailability by measuring contaminant assimilation into animals or effects on crops for soils with aged compounds versus soils with freshly added compounds.
For example, Morrison et al. measured the assimilation of aged pesticide residues by earthworms. Box describes a series of studies on pesticide persistence in soil and resulting bioavailability. The attention given to bioavailability in the environmental arena is relatively recent compared to disciplines like toxicology and agronomy. This attention has been driven in large part by hazardous materials and site cleanup legislation and concerns about the exposure to and risk from hazardous chemicals.
For example, chemicals that are encapsulated, insoluble, or strongly bound to solids may not be prone to biological uptake or exert a biological response, while chemicals that are freely dissolved or weakly bound are more readily taken up and more likely to elicit a response.
Comparison was made with soil freshly spiked with pesticide. Their data showed that although aging reduced uptake into earthworms, some of the pesticide was still assimilated by the earthworms even after an aging period of 49 years. In an analogous study, Robertson and Alexander showed a significant reduction in mortality of insects to DDT- and dieldrin-amended soils aged for 30 days compared to freshly added insecticides.
Toxicity decreased with further aging, eventually showing no mortality. About 85 and 92 percent of the contaminant was recovered from the soil by extraction at the two respective sampling times. Similar results are reported for herbicides, where the toxicity was less than anticipated based on total sample analysis. Aged simazine residues were shown to be biologically unavailable to sugarbeets and to microbial degraders, whereas recently added simazine caused damage to sugarbeets and was substantially degraded by microbes.
In summary, pesticides can persist in soils for up to 50 years and perhaps much longer. Based on tests with microorganisms, worms, insects, and plants, pesticides may or may not exhibit greatly reduced bioavailability as measured by degradation, uptake, or toxicity over the long term. Typically, modern analytical methods are designed to report the total amount of all forms of a compound present in a sample.
Thus, the difference between the total amount of a compound detectable using modern analytical techniques and the bioavailable amount of the compound has become a central issue in the environmental arena.
The earliest studies of contaminant bioavailability from soil for the purposes of refining human exposure assessment focused on dioxins and furans (Bonaccorsi et al.).
These were soon followed by similar studies on the oral bioavailability of polychlorinated biphenyls (PCBs) and polyaromatic hydrocarbons (PAHs) from soil (Fries et al.). Starting in the late s, bioavailability studies increasingly focused on metals, particularly lead. This prompted the development of lead bioavailability models in rats (Freeman et al.). The success of this approach for lead resulted in the development of analogous models for arsenic in swine and monkeys (Freeman et al.). Mercury bioavailability also has been the subject of recent investigations, as reviewed in Davis et al.
Several review documents compile the results from these site-specific bioavailability studies (Battelle and Exponent; NEPI). Despite this work, for many scenarios there is limited agreement on how to quantify all relevant bioavailability processes at hazardous waste sites, partly because too few compounds have been tested to make generalizations. A large body of information comes from empirical observations suggesting that bioavailability processes are important for assessing the risk of compounds in soil.
In particular, for organic chemicals a pattern of chemical disappearance composed of a more rapid initial phase, followed by a period in which little or no degradation of chemical can be detected, is commonly observed. In the case where the compounds are known to be biodegradable, the lack of disappearance in the second phase is taken to mean the compounds are unavailable to microorganisms. In addition, it is argued that the observed slowing in the biodegradation rate of organic compounds in aged samples imposes a limit on what may be achieved by bioremediation.
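This biphasic disappearance pattern is often represented as the sum of a fast-degrading (labile) pool and a slow-degrading (sequestered) pool, each decaying at first order. The pool sizes and rate constants below are illustrative assumptions chosen only to reproduce the qualitative shape.

```python
import math

def fraction_remaining(t, f_labile=0.7, k_fast=0.5, k_slow=0.005):
    """Fraction of chemical remaining at time t (e.g., days) when a
    labile pool degrades quickly and a sequestered pool barely degrades."""
    return (f_labile * math.exp(-k_fast * t)
            + (1.0 - f_labile) * math.exp(-k_slow * t))

# Rapid early loss, then an apparent plateau of poorly available residue:
for t in (0, 10, 100, 365):
    print(t, round(fraction_remaining(t), 3))
```

With these assumed parameters, about 70 percent of the chemical disappears within a few weeks while the remainder persists for years, mirroring the empirical observation that aged residues resist further degradation.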
Indeed, in many cases it has been observed that organic-solid partitioning or the aging of organic pollutants in soil and sediment systems results in residues that are recalcitrant to further microbial attack despite favorable environmental conditions (Mihelcic and Luthy; Alexander; Ramaswami and Luthy). Beyond empirical observations, more quantitative attempts to document bioavailability processes at hazardous waste sites use a variety of techniques, including mass transfer measurements, geochemical analyses, microbial responses, extractants that mimic the digestive action of organisms, accumulation or uptake tests (as in the lead model discussed above), and bioassays of acute and chronic responses (for a detailed discussion of tests, see Chapter 4).
Accumulation into earthworms is one commonly used measure. Where concern has focused on the potential risk associated with longer exposures to low levels of contamination, tests that measure sublethal endpoints such as growth and reproduction have been applied (Dillon et al.). These approaches offer the advantage of providing a closer link to effects on higher levels of biological organization. Although bioassays of uptake and effect are most applicable to the test organism (usually microorganisms, clams, worms, and plants), the results may also be relevant to other animals and humans.
In addition to the metals-contaminated mining sites mentioned previously, bioavailability also has been seriously considered at former manufactured gas plant MGP facilities, which made gaseous fuels from coal and oil prior to the widespread distribution of natural gas following WWII. These plants operated from 50 to years ago, and wastes remain at thousands of sites around the world.
Bioavailability processes have emerged as important for assessing environmental exposures and for remediating contaminated soils and sediments at MGP sites. The focus has been primarily on the bioavailability of coal tar constituents, specifically PAHs. The implications of bioavailability for biological treatment of these materials also have been evaluated.
For example, some treatment technologies have focused on methods of increasing the availability of coal tar constituents. In other cases, the goal has been to demonstrate that contaminants in the treated soils or sediments are no longer in an available form and thus pose less risk.
This is the case for MGP purifier waste, which contains elevated levels of cyanide compounds that happen to be much less bioavailable than simple cyanide salts Ghosh et al. Another area where bioavailability processes are a primary focus of environmental risk assessment is in the management of coal ash. Ash is one of the largest solid waste residuals associated with energy production from fossil fuels.
The Electric Power Research Institute has conducted a substantial amount of the research associated with these materials, including the development of geochemical models for predicting the leaching and transport behavior of the metals in ash. These recent assessments include evaluations of exposure to ecological receptors and incorporate bioavailability processes as reflected by biological uptake factors. Finally, bioavailability processes are an important component of U.S. Environmental Protection Agency (EPA) regulations concerning the beneficial use of biosolids, which are the residual materials generated by municipal wastewater treatment and applied to land for their fertilizer value.
As discussed in greater detail in Chapter 2, the Part Sludge Rule contains risk-based standards. Initially, the proposed regulations called for limits on the amount of sludge that could be applied to land, based on metal toxicity to certain plants (Marks et al.). As more field studies using biosolids were conducted, the basis for these limits evolved. For all exposure pathways other than human ingestion of biosolids, Part regulations currently permit the use of data from such field studies to determine these concentration thresholds and to set application rates of biosolids such that metal limits are not exceeded.
Risk assessments provide the foundation for decisions about exposure to chemicals and cleanup of soils and sediments at contaminated sites. Bioavailability processes are important for evaluating exposures of humans and ecological receptors to persistent compounds. Indeed, risk management decisions related to judging the acceptability of dioxins in soils can be traced back to evaluations that explicitly considered bioavailability (Kimbrough et al.).
Since that time, some progress has been made in explicitly incorporating bioavailability concepts into risk assessment, particularly for lead contamination of soils and for dermal exposure pathways see Chapter 2. In general, though, most bioavailability processes are not transparently dealt with during risk assessment, and are instead part of certain assumptions, adjustments, or correction factors, which may or may not be based on experiments.
Following is a brief overview of how bioavailability concepts are incorporated into human health and ecological risk assessment. A more thorough examination of the topic is given in Chapter 2.
Bioavailability processes leading up to absorption (processes A-C in the figure) are also included in human health risk assessments, but typically are not identified as such. When bioavailability is considered as the fraction of the chemical that is absorbed into systemic circulation, two operational definitions are important: absolute and relative bioavailability.
The amount of chemical that is ingested, lies on the surface of the skin, or is inhaled is called the applied dose. Absolute bioavailability is the fraction of the external (applied) dose that reaches the systemic circulation (the internal dose).
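These two operational definitions can be written as simple ratios: absolute bioavailability as a dose-normalized comparison of systemic exposure (commonly measured as area under the plasma concentration-time curve, AUC) between the test route and an intravenous reference, and relative bioavailability as the ratio of absorbed fractions between two forms of the chemical. All numbers below are hypothetical.

```python
def absolute_bioavailability(auc_oral, dose_oral, auc_iv, dose_iv):
    """F = (AUC_oral / dose_oral) / (AUC_iv / dose_iv).

    The intravenous route is 100% available by definition, so the
    dose-normalized AUC ratio gives the fraction absorbed systemically.
    """
    return (auc_oral / dose_oral) / (auc_iv / dose_iv)

def relative_bioavailability(f_test, f_reference):
    """RBA: absorbed fraction from a test medium (e.g., a metal in soil)
    relative to a reference form (e.g., a soluble salt of the metal)."""
    return f_test / f_reference

# Hypothetical values: dose-normalized oral AUC is 30% of the IV AUC,
f_abs = absolute_bioavailability(auc_oral=30.0, dose_oral=100.0,
                                 auc_iv=100.0, dose_iv=100.0)
# and a metal in soil is half as available as its soluble salt.
rba = relative_bioavailability(f_test=0.06, f_reference=0.12)
print(round(f_abs, 2), round(rba, 2))  # 0.3 0.5
```

Relative bioavailability is the quantity most often adjusted in site-specific soil risk assessments, since default exposure parameters are typically derived from studies using soluble forms of the chemical.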