The uncertainty factor affects the accuracy and precision of analytical data in extractable and leachable studies.

Extractable and Leachable (E&L) data is used across a range of industries, most often to determine the risk to patient and consumer safety, or to assess the potential impact on products of the materials (predominantly polymeric and elastomeric) used to manufacture, store, and deliver them.

As with all analytical data, there can be a degree of uncertainty associated with the results, depending on a number of factors relating to the processes, methods, or equipment used to generate them.

Within the field of extractable and leachable analysis, there is the added complexity that methods are required to detect a wide range of unidentified analytes at very low concentrations in the test article.

This can give rise to a number of different sources of variability that affect the level of uncertainty associated with a result. These were discussed at Around the World of E&L, a recent virtual event that can now be accessed on-demand.

Analytical Evaluation Threshold (AET)

It is typical within the field of E&L to set a threshold (based on toxicological concern levels) above which analytes should be identified and reported.

Due to potential variability in the response of different analytes by any given detector, it is common for an uncertainty factor to be applied when calculating the threshold. This minimises the risk of under-reporting analytes.

This is exemplified by the analytical evaluation threshold (AET) calculation presented in ISO 10993-18:2020:

AET = (DBT × (A / (B × C))) / UF

Where:

  • DBT = dose-based threshold (µg/day)
  • A = the number of test articles used to generate the extract
  • B = the volume of extract (mL)
  • C = the number of test articles a patient or consumer is exposed to in a day
  • UF = uncertainty factor
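
As a worked illustration, the short sketch below applies this calculation; all of the input values (a DBT of 1.5 µg/day, ten articles extracted into 100 mL, one article used per day, and a UF of 2) are hypothetical rather than taken from the standard.

```python
def aet_ug_per_ml(dbt, articles_extracted, extract_volume_ml,
                  articles_per_day, uncertainty_factor):
    """AET = (DBT × (A / (B × C))) / UF, per ISO 10993-18:2020."""
    return (dbt * (articles_extracted
                   / (extract_volume_ml * articles_per_day))) / uncertainty_factor

# Hypothetical inputs: DBT = 1.5 µg/day, A = 10, B = 100 mL, C = 1, UF = 2
print(aet_ug_per_ml(1.5, 10, 100, 1, 2))  # 0.075 µg/mL
```

Note how the uncertainty factor sits in the denominator: doubling the UF halves the AET, pushing identification and reporting down to lower concentrations.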

The value of the uncertainty factor often depends on the technique being used and on how the response of the surrogate standard compares with the responses of the analytes of interest.

If a surrogate standard with a good (i.e. average) response is selected, it may be necessary to apply a large uncertainty factor when calculating the AET so that poorly responding analytes are not under-reported.

Alternatively, selecting a surrogate standard with a poor response compared to the majority of analytes of interest allows a lower uncertainty factor to be applied while still reducing the chance of under-reporting. Under-reporting analytes could negatively impact patient safety (see Figure 1).



Figure 1: Average responding surrogate, UF 10 (blue line); lowest responding surrogate, UF 1 (green line)
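
To put rough numbers on the trade-off Figure 1 illustrates, the same hypothetical inputs from the earlier sketch can be run under both options:

```python
# Hypothetical inputs from the earlier sketch: DBT = 1.5 µg/day,
# A = 10 articles, B = 100 mL, C = 1 article/day.
unadjusted = 1.5 * (10 / (100 * 1))  # 0.15 µg/mL before any UF

print(unadjusted / 10)  # average responding surrogate, UF 10 -> 0.015 µg/mL
print(unadjusted / 1)   # lowest responding surrogate,  UF 1  -> 0.15 µg/mL
```

The larger uncertainty factor drives reporting down to a tenfold lower concentration, which is where the precision considerations discussed below come into play.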


Both of these options have potential impacts on the variability of the analytical data, but for different reasons.

By selecting a surrogate standard that allows a small uncertainty factor, its response is highly likely to be much lower than that of the majority of analytes of interest, which impacts the accuracy of the results: quantifying a normally responding analyte against a weakly responding surrogate overestimates its concentration.
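
A minimal sketch of that accuracy bias, assuming simple one-point semi-quantitation against the surrogate (both response factors are invented for illustration):

```python
# Semi-quantitation: estimated concentration = peak area / surrogate response factor.
rf_analyte = 2.0e4    # hypothetical analyte response (area per µg/mL)
rf_surrogate = 0.5e4  # hypothetical weak surrogate response (area per µg/mL)

true_conc = 0.10                 # µg/mL
area = rf_analyte * true_conc    # detector signal for the analyte
estimated = area / rf_surrogate  # quantified against the surrogate

print(estimated)  # 0.40 µg/mL: a fourfold overestimate of the true concentration
```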

Conversely, the bigger the uncertainty factor, the lower the AET, which potentially impacts precision. Surrogate standards measured at a low response (low signal-to-noise) are likely to be less precise (higher %RSD), a phenomenon described by the Horwitz equation.
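
The Horwitz equation predicts how relative standard deviation grows as concentration falls. A quick sketch of its standard form, with concentration expressed as a dimensionless mass fraction (the example concentrations are illustrative only):

```python
import math

def horwitz_rsd_percent(mass_fraction):
    """Predicted %RSD = 2 ** (1 - 0.5 * log10(C)), C a dimensionless mass fraction."""
    return 2 ** (1 - 0.5 * math.log10(mass_fraction))

print(horwitz_rsd_percent(1e-6))  # 1 ppm  -> 16 %RSD
print(horwitz_rsd_percent(1e-8))  # 10 ppb -> 32 %RSD
```

Dropping the AET by two orders of magnitude roughly doubles the predicted %RSD in this model.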

Picking an appropriate surrogate standard that doesn't negatively impact the uncertainty of the data can prove a challenging undertaking. Introducing multiple surrogate standards for different classes of compounds, or utilising relative response libraries, can start to reduce some of this uncertainty, as sketched below.
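
As a sketch of the relative response idea only, with entirely hypothetical compound classes and factors:

```python
# Hypothetical relative response factors (analyte response / surrogate response)
# for broad compound classes, used to correct surrogate-based estimates.
rrf_library = {"antioxidant": 1.8, "plasticiser": 1.1, "oligomer": 0.4}

def corrected_conc(conc_vs_surrogate, compound_class):
    """Divide the surrogate-based estimate by the class RRF; default to 1.0."""
    return conc_vs_surrogate / rrf_library.get(compound_class, 1.0)

print(corrected_conc(0.40, "oligomer"))  # 1.0 µg/mL after correction
```

In practice, such libraries are built from measured response data for representative compounds in each class.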

Our mission is to ensure consumer and patient safety, making this an important consideration in the work we do.


Have Your Products Tested »