First, let us cover the concepts of accuracy and precision. Accuracy refers to the closeness of a measured value to a standard or known value. Precision refers to the closeness of two or more measurements to each other. For example, say you weighed a block that was known to be 100 kilograms (kg) and you received measurements of 98 kg, 101 kg, 99 kg, 100 kg, and 100 kg. That data would likely be considered both accurate and precise. However, if you received measurements of 90 kg, 108 kg, 112 kg, 93 kg, and 99 kg, the data may be considered accurate because its average is near the known 100 kg, but it is not precise because the measurements span a 22 kg range. On the other hand, measurements of 50 kg, 51 kg, 49 kg, 50 kg, and 49 kg would be considered precise data that is NOT accurate. It is important to note that laboratory data needs to be both precise and accurate.
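The distinction above can be sketched in a few lines of Python. This is a minimal illustration, not a laboratory standard: "accurate" is taken to mean the mean is close to the known value, "precise" that the spread is small, and the 2 kg tolerances are arbitrary thresholds chosen only for this example.

```python
from statistics import mean, stdev

# Hypothetical helper illustrating accuracy vs. precision.
# "Accurate": mean is within accuracy_tol of the known value.
# "Precise": standard deviation is within precision_tol.
# Both tolerances are arbitrary example values, not lab criteria.
def assess(measurements, known, accuracy_tol=2.0, precision_tol=2.0):
    accurate = abs(mean(measurements) - known) <= accuracy_tol
    precise = stdev(measurements) <= precision_tol
    return accurate, precise

print(assess([98, 101, 99, 100, 100], 100))  # accurate and precise
print(assess([90, 108, 112, 93, 99], 100))   # accurate, not precise
print(assess([50, 51, 49, 50, 49], 100))     # precise, not accurate
```

Running the three example datasets from the text reproduces the three cases: both, accurate-only, and precise-only.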
On a typical calibration curve, any value located outside the area of hard quantitation is considered an estimated value, not a hard quantitated one. Values falling between the lowest and highest calibration points are bracketed and considered hard quantitated. All values below the low calibration point, the RL (Reporting Limit), are extrapolated from the known calibration points and are therefore estimates.
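The bracketing rule can be sketched as a simple range check. The calibration points and concentrations below are hypothetical values invented for illustration; real curves would also carry instrument responses and a fitted regression.

```python
# Hypothetical calibration points as (concentration, instrument response).
cal_points = [(10.0, 120.0), (50.0, 610.0), (100.0, 1190.0)]

def classify(conc, cal_points):
    """Return 'hard quantitated' if conc is bracketed by the lowest and
    highest calibration points, otherwise 'estimated'."""
    low = min(c for c, _ in cal_points)   # low cal point (Reporting Limit)
    high = max(c for c, _ in cal_points)  # high cal point
    return "hard quantitated" if low <= conc <= high else "estimated"

print(classify(42.0, cal_points))   # bracketed by the curve
print(classify(4.0, cal_points))    # below the RL: extrapolated
print(classify(150.0, cal_points))  # above the high point: extrapolated
```

Anything the check labels "estimated" is extrapolated beyond the bracketed region of the curve.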
Notice the method detection limit (MDL) below the low calibration point (Reporting Limit). This point is a statistically derived number that indicates where positive detection of an analyte begins and ambient noise ends. However, it is important to remember that detection of an analyte is different from measurement, or hard quantitation, of an analyte. As the name implies, the MDL is simply the lowest level detectable by that particular method, and a level of uncertainty surrounds that point. Data falling between the MDL and the low calibration point (Reporting Limit) is estimated, not hard quantitated. The MDL is the level at which we can say with 99% confidence that a “hit” (reading) of that particular analyte is greater than zero. Since the MDL is a measure of confidence in detection of an analyte, it has lower and upper statistical bounds within which we are 99% confident detection occurred. It is because of these bounds that putting a “less than” sign before the MDL when no analyte detection occurred is technically incorrect. For the same reason, it is bad practice to use MDL numbers as regulatory values against which to compare data, because they are not fixed values and can be a range. Only data between two calibration points can be hard quantitated.
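To make "statistically derived" concrete, the widely used EPA-style procedure (40 CFR Part 136, Appendix B) computes the MDL as MDL = t × s, where s is the standard deviation of at least seven replicate low-level spike measurements and t is the one-tailed Student's t value at 99% confidence with n − 1 degrees of freedom. The sketch below assumes that formula; the replicate readings are hypothetical.

```python
from statistics import stdev

# One-tailed Student's t at 99% confidence, keyed by degrees of freedom.
T_99 = {6: 3.143, 7: 2.998, 8: 2.896}

def mdl(replicates):
    """EPA-style MDL: t * s over replicate low-level spikes (n >= 7)."""
    s = stdev(replicates)
    return T_99[len(replicates) - 1] * s

# Seven hypothetical replicate spike readings, in ppb.
spikes_ppb = [1.1, 0.9, 1.2, 1.0, 0.8, 1.1, 0.9]
print(round(mdl(spikes_ppb), 3))  # ~0.444 ppb
```

Because the result depends on the spread of a particular set of replicates, two labs running the same method can legitimately derive different MDLs, which is exactly why the text warns against treating the MDL as a fixed regulatory value.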
A surrogate is a compound chemically similar to the one being investigated but not expected to occur in environmental samples. Surrogates are intentionally added to a sample, and their performance is compared with that of the analytes. Surrogates are used to judge the laboratory's execution of the method used to measure the analyte. A surrogate serves two purposes: 1) it is added to a sample prior to the preparation steps to judge whether the laboratory carried out the method properly, and 2) it helps evaluate whether matrix issues occurred while carrying out the method procedures. Surrogate data is presented alongside analyte data to give feedback on the laboratory's performance. For example, if a surrogate was added to a sample at a known 100 parts per billion (ppb) and the surrogate was then measured at 100 ppb, surrogate recovery would be 100%. A 100% surrogate recovery represents sound laboratory procedures for that method and generally means the analyte data is reliable. If surrogate recovery is less than 100%, it is likely that unsound laboratory procedures or reactions between the surrogate and the matrix decreased the recovery; in this situation it is generally assumed that the laboratory's analyte reading is below the actual concentration. On the other hand, if surrogate recovery is higher than 100%, the laboratory's analyte reading is likely above the actual concentration.
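The recovery arithmetic and its interpretation can be sketched as follows. The 80–120% acceptance window is an arbitrary example, not a method-specified limit, and the numbers are hypothetical.

```python
# Surrogate recovery: measured amount as a percent of the spiked amount.
def surrogate_recovery(measured_ppb, spiked_ppb):
    return measured_ppb / spiked_ppb * 100.0

def likely_bias(recovery_pct, low=80.0, high=120.0):
    """Rough interpretation per the text: low recovery suggests the
    analyte reading is biased low; high recovery suggests biased high.
    The 80-120% window is an illustrative threshold only."""
    if recovery_pct < low:
        return "analyte likely under-reported"
    if recovery_pct > high:
        return "analyte likely over-reported"
    return "recovery acceptable"

r = surrogate_recovery(72.0, 100.0)  # 72% recovery on a 100 ppb spike
print(r, likely_bias(r))
```

A 72% recovery would therefore flag the associated analyte result as likely biased low, mirroring the reasoning in the paragraph above.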
An internal standard is a compound chemically similar to the analytes being determined, spiked at a known concentration directly into a test sample at the analytical instrument used to perform the test. Unlike a surrogate, the instrument's response to the known level of internal standard is used to quantitate the analyte in the sample. Internal standards are used to compare an instrument's performance against a known standard. Internal standards do NOT account for the procedure a laboratory must follow to prepare a sample for reading. Instead, internal standards only measure an instrument's performance in detecting analytes of a known value.
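A common way this quantitation works, sketched below with hypothetical peak areas and concentrations, is through a response factor (RF): the RF is first established from a calibration standard where both concentrations are known, and the analyte in the sample is then quantitated relative to the internal standard spiked into that same sample.

```python
def response_factor(area_analyte, conc_analyte, area_is, conc_is):
    """RF from a calibration standard with known analyte and
    internal-standard (IS) concentrations."""
    return (area_analyte * conc_is) / (area_is * conc_analyte)

def quantitate(area_analyte, area_is, conc_is, rf):
    """Analyte concentration inferred from the peak-area ratio against
    the internal standard, using the established RF."""
    return (area_analyte * conc_is) / (area_is * rf)

# Hypothetical calibration standard: 50 ppb analyte, 40 ppb IS.
rf = response_factor(area_analyte=5000.0, conc_analyte=50.0,
                     area_is=4000.0, conc_is=40.0)   # RF = 1.0
# Hypothetical sample spiked with 40 ppb IS at the instrument.
sample_conc = quantitate(area_analyte=3000.0, area_is=4000.0,
                         conc_is=40.0, rf=rf)        # 30.0 ppb
print(rf, sample_conc)
```

Note that nothing in this calculation touches the sample-preparation steps, which is exactly why internal standards speak to instrument performance rather than to the laboratory's handling of the method.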