
Analytical Services and QA/QC

Lynda Bloom, Analytical Solutions Ltd.


Prepared for the Society of Exploration Geologists, April, 2002
1.0 Introduction
The principal objective of exploration geochemical surveys is to locate mineral deposits
at the lowest possible cost. It is necessary to have reliable survey results so that areas
with no apparent geochemical response can be abandoned with confidence.
Misinterpretation, sampling inconsistencies or poor quality analytical data can lead to
expenditures on areas with false anomalies, which is a waste of time and resources.
The focus of this paper is the control and improvement of analytical data quality for
exploration geochemical surveys. Many types of geochemical surveys
(hydrogeochemistry, biogeochemistry, selective extractions and rare earth
lithogeochemistry) take advantage of modern technology, such as ICP-MS, and require
measurement of elements at the sub-ppb level. These measurements may test the limits
of the technology, and contamination is a serious concern.
In order to interpret these analytical data, it is necessary to have an understanding of the
errors associated with sample handling, preparation and laboratory procedures. Once it is
recognized that all data have an associated error, quality assurance can be implemented to
measure these errors.

Figure 1: Herman Lake Sheet Bromine Values

Figure 1 is a map of bromine values in lake sediments from an Ontario Geological Survey of the Herman Lake map sheet, covering an area of 40 km by 20 km. The east half of the sheet was sampled at a different time than the west half and the samples were analyzed approximately one year apart. The baseline bromine values are different in the east and west halves of the survey area due to changes in instrumentation, although the same analytical method was requested for both parts of the survey. This example demonstrates the importance of understanding the analytical processes, measuring errors and monitoring the results.

2.0 Definitions
Quality assurance has a broad definition outside the mining industry and has been defined as "all those planned or systematic actions necessary to provide adequate confidence that a product or service will satisfy given needs" (Kirschling, 1991). Quality control is one aspect of quality assurance. The difference between the two concepts is described by Vaughn (1990): "Assurance in the quality context is the relief of concern about the quality of a product. Sampling plans and audits, the quality control devices, are designed to supply part of this assurance."
The terms commonly used to discuss geochemical data are defined below.
Precision: the reproducibility of a result. The results can be said to be of low precision
when multiple analyses of the same sample or duplicate analyses of single samples show
a wide variation in results.
Accuracy: the relationship between the expected result (particularly of standards) and the
result actually achieved from the analysis.
Bias: the amount by which the analysis varies from the correct result. The amount of
bias can only really be determined by a large number of repeat analyses of known
standards over a period of time.
A demonstration of the difference between accuracy and precision is provided in Table 1.
Table 1: Accuracy vs. Precision

                     Precise but    Accurate but    Accurate and
                     Inaccurate     Imprecise       Precise
    Value 1              130            150              95
    Value 2              135             95             100
    Value 3              130             80             100
    Value 4              125            105             105
    Value 5              125             95              95
    Value 6              135            105             105
    Average              130            100             100
    Range                                50
    Expected Value       100            100             100
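To make the distinction numerical, the minimal sketch below (Python, standard library only) computes bias and spread for the "Accurate and Precise" column of Table 1 against the expected value of 100; the same arithmetic applies to any set of repeat determinations of a standard.

```python
import statistics

expected = 100.0                        # accepted value of the standard
repeats = [95, 100, 100, 105, 95, 105]  # "Accurate and Precise" column of Table 1

mean = statistics.mean(repeats)
bias = mean - expected                          # accuracy: systematic offset from the accepted value
spread = max(repeats) - min(repeats)            # precision: range of the repeat determinations
rsd = 100 * statistics.stdev(repeats) / mean    # precision as relative standard deviation, %

print(f"mean = {mean:.0f}, bias = {bias:+.0f}, range = {spread}, RSD = {rsd:.1f}%")
```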

Detection Limit: refers to the limit to which an analytical signal can be measured and
distinguished from the background noise with a specific level of confidence in the
observation. The term is probably the most widely abused and least useful of any in
laboratory literature.
Detection limits quoted by instrument manufacturers are often determined by analyzing single-element samples in pure, simple matrices. These figures do not relate to the analysis of geological samples and are not useful guidelines for geologists.
There are a number of ways of determining the detection limit of a particular analytical process, some of which are illustrated below, but they should all indicate one thing: the lowest value that can reliably be determined.
The most common definition of detection limit is the lowest concentration of an element
that can be detected with a 95% probability. This is calculated by taking a large number
of instrumental readings at or near the blank level, calculating the standard deviation and
determining the value corresponding to 2 standard deviations plus the blank value.
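A minimal sketch of that calculation, assuming a set of repeated instrument readings on a blank solution (the readings below are hypothetical):

```python
import statistics

# Hypothetical repeated instrument readings of a blank solution, in ppm
blank_readings = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 0.9, 1.1, 1.0]

blank_mean = statistics.mean(blank_readings)
blank_sd = statistics.stdev(blank_readings)

# Detection limit as defined above: the blank value plus two standard deviations
detection_limit = blank_mean + 2 * blank_sd
print(f"blank = {blank_mean:.2f} ppm, sd = {blank_sd:.2f} ppm, detection limit = {detection_limit:.2f} ppm")
```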
For atomic absorption spectrometry, the detection limit is essentially a fixed value for any
particular element and once the instrument has been optimized, should not change.
However, variations may occur over time with power fluctuations, temperature changes
or electrical problems.
For some techniques, such as x-ray fluorescence spectrometry or neutron activation analysis, detection limits for an element can generally be improved by increasing the counting time. As a rule, the detection limit improves with the square root of the counting time, so roughly quadrupling the counting time will halve the detection limit. For example, if the counting time increases from 5 minutes to 25 minutes, the detection limit should improve from 10 ppm to roughly 5 ppm. Improved detection limits, and therefore longer counting times, result in higher per-sample costs. Detection limits may not be improved in cases where there are significant interferences from other elements.
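A small sketch of the square-root rule described above (interference-free case; the 10 ppm starting point is taken from the example in the text):

```python
import math

def scaled_detection_limit(dl_ref: float, t_ref: float, t_new: float) -> float:
    """Counting-statistics approximation: the detection limit improves with the
    square root of the counting time, all else being equal."""
    return dl_ref * math.sqrt(t_ref / t_new)

print(scaled_detection_limit(10.0, 5.0, 20.0))  # 5.0 ppm at four times the counting time
print(scaled_detection_limit(10.0, 5.0, 25.0))  # ~4.5 ppm at 25 minutes
```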
The detection limit of a particular process has a significant effect upon the precision (and accuracy) of analytical results at levels that approach the detection limit. It is generally assumed that the precision of values between the detection limit and 10 times the detection limit is ±50%.
It is important to choose an analytical method with a detection limit that is lower than the
expected geochemical background. Preferably, the detection limit should be at most
1/10th the geochemical background of the area.
Sensitivity: is often used to refer to the detection limit of the technique but there is only a
secondary relationship between the two terms. The sensitivity of a technique relates to
the slope of the calibration graph; i.e., it is the slope of the relationship between
instrumental signal and analytical concentration.

As a rule of thumb, the higher the sensitivity, the lower the detection limit, and vice
versa. However, some instruments allow the operator to increase the sensitivity by
varying amplification factors. While this increases the slope of the calibration curve, it
does not improve detection limits as it also increases noise.
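A sketch of sensitivity as the slope of the calibration graph, assuming a simple linear calibration; the standard concentrations and signals below are hypothetical.

```python
import numpy as np

# Hypothetical calibration standards: concentration (ppm) vs. instrument signal (counts)
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0])
signal = np.array([2.0, 52.0, 101.0, 255.0, 508.0])

slope, intercept = np.polyfit(conc, signal, 1)
print(f"sensitivity (slope) = {slope:.1f} counts per ppm, intercept = {intercept:.1f} counts")

# Amplifying the signal multiplies the slope, but it multiplies the noise by the same
# factor, so the detection limit (roughly noise divided by slope) is not improved.
```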
Drift: occurs when the instrumental response to a given concentration of analyte or to
background conditions varies progressively with time.
This change need not be linear or even progressive, just observable. It is often associated
with changes in ambient temperature or warming up of the electronic components of the
instrument during operation.
Progressive blocking of nebulizers or atomization sources is a physical cause of drift. If this is not recognized, the results will be biased. Internal standards and normalization standards can be used to minimize the effects of drift.
Noise: refers to short-term instability of instrumentation and consequently, of the signal.
When variation in atomization and nebulization conditions, sources, or the components
themselves gives rise to signal variability, this is referred to as noise.
Instrument noise is a major factor in determining detection limits: the noisier the instrument, the higher the detection limit. This parameter can be used to monitor degradation of instruments or components.
Analytical range: of a technique or method is the concentration range over which valid
data can be collected within pre-determined statistical parameters.
Analytical range is affected by such instrumental factors as integration time,
pre-concentration factors, and solution matrix (changes of matrix, especially, will vary
analytical ranges by orders of magnitude). Analytical range and signal integration time
are inter-related and longer integration times usually extend the analytical range to lower
levels.
Sensitivity decreases with increasing concentration for most spectrophotometric methods.
It is necessary to dilute sample solutions or re-calibrate equipment when concentrations
are greater than the upper limit of detection. Dilution of samples in production
laboratories is generally imprecise and data are less likely to be reproducible towards the
upper limit of detection. It is recommended that high grade samples be resubmitted for
analysis by assay techniques that are suitable for the concentration range.
Certified Reference Materials (CRM): are homogeneous materials that have been
analyzed by a large number of different analysts, usually internationally and using a wide
variety of analytical techniques, to provide a representative concentration value for
specific analytes. The "expected", "acceptable" or "working" values are almost always quoted as total element concentrations. There are very few CRMs for the partial
extractions or digestions commonly used in the mining industry, such as the ubiquitous
aqua regia digestion.
Reference (and certified reference) materials should have:
- the same matrix as the samples to be analyzed
- the same levels of trace elements
- the same speciation (valency and binding) as in the sample matrix, or similar mineralogy.
They should also be homogeneous (i.e., the difference between representative sample measurements must be smaller than the overall uncertainty limits of the measurements). Note that homogeneity for one analyte does not imply that the material is also satisfactory for a wide range of elements. CRMs should be physically and chemically stable for an indefinite period of time, which is particularly problematic for sulphide-rich samples that could oxidize. CRMs that are not stable should have an expiration date.
The various applications of CRMs can be summarized as:
- calibration of equipment
- achievement of traceability of calibration
- improvement of measurement quality
- verification of accuracy of results.
They can also be used in statistical quality control procedures, although this option is expensive because large quantities of material are required. Control samples must have some of the characteristics of a reference material.
Secondary standards: are in-house standards usually used for quality control purposes.
The accurate quantification of analytes is of relatively minor importance as long as the
same result is obtained on a day-to-day basis.
This type of material is considerably cheaper than CRMs and can therefore be used far
more often. It is also more than likely that the secondary standard is more representative
of the material being analyzed by the laboratory than the CRM and therefore is more
useful.
Calibration standards: are appropriate standards, usually made from spectrographically
pure chemicals, prepared in such a way as to be used to directly calibrate instrumentation
being used for analysis.
Interference: is the effect of constituents in the sample directly upon analytes and/or
upon a measured parameter, causing a bias in results when the latter are compared to
equivalent results from samples not containing those constituents. Interference effects
can lead to either an increase or a decrease in the measured parameter and, in some extreme cases, can prevent the acquisition of any valid data.

Reagent Blank: refers to the concentration of the elements of interest in a solution that has been taken through the same analytical procedure as the samples being analyzed, but without any sample added. This measures the potential contamination
contributed by reagents. The analytical blank is the above plus the signal component
attributable to the instrument noise.
Errors in analytical data: All analytical data are subject to errors or to bias. Errors can be divided into the following categories:

- Random errors: This type of error is endemic to analytical chemistry and is part of the functioning of every instrument and technique ever developed. It arises from unstable power supplies, non-reproducible atomization and other fluctuations. While it is not possible to remove random error, it is possible to minimize its magnitude and measure its degree.

- Systematic errors: Such errors can arise from solution matrix effects specific to a particular technique, instrument-specific inter-element effects or relative bias associated with variations between sample type and available appropriate standards. Personal bias can also be a source of systematic error, especially when there is a preconceived idea as to the required concentrations in CRMs and in-house standards run with sample batches. One of the few checks to determine systematic error is the use of standards or controls; however, it is advisable not to acquaint the analyst with either the location of these samples in the batch or the expected concentration of analytes in them. In this way, bias can be removed. This approach has a fundamental advantage, in that wrongly certified values soon become known.

- Gross errors: These errors result in completely incorrect results being obtained. They are the result of such things as mislabelling of samples, incorrect preparation procedures, vessel contamination, incorrect instrument set-up, bad calculations, etc. Such errors are usually random, can be identified, and are corrected quickly with the use of quality control procedures such as the submission of blanks, duplicates and controls with samples.

3.0 Six Sigma Approach or DMAIC


Michael Thompson, one of the co-designers of the Thompson-Howarth precision plot,
said, "All analytical measurements are wrong; it's just a matter of how large the errors
are, and whether they are acceptable." In this instance, "errors" refers to the inaccuracies
and imprecision of the data. These are not "mistakes" but result from the naturally
occurring limitations of selecting small representative samples from large volumes of
material and from the sensitivity of analytical methods.

There are many sources of error to be taken into account when assessing a geochemical
or assay database. These may include sample inhomogeneity, contamination, data
accuracy and analytical precision. Each project will have different challenges so that the
program to measure these errors will vary in its design.
Six Sigma is an organizational quality system that can be applied to geochemical data. The Six Sigma approach was popularized by General Electric and has been adopted by other large corporations to save millions of dollars. Applying the Six Sigma system in this context helps maximize the effectiveness of quality control programs and ensures that the appropriate measures are adopted.
The key steps of a Six Sigma improvement project are referred to as DMAIC:
- Define
- Measure
- Analyze
- Improve
- Control

3.1 Define
Although it is important to measure all sources of error, there are financial and practical constraints. The first step of the process is to define what has to be measured and how often, which is usually done in conjunction with understanding the consequences of introducing errors.
Different types of projects have different requirements. For lithogeochemical surveys, for example, it is important to measure the accuracy of Ti and Zr analyses, since subtle variations in these data will be used to identify rock types and degree of alteration. A fluctuation in the accuracy of the data between batches is unacceptable. A rigorous system of reference materials is required.
Similarly, the accuracy of low-level fire assay determination of gold in soils may be
important. Figure 2 is an example of a soil survey in Africa in a lateritic terrain over an
area of approximately 6 km by 6 km. Gold values range from detection limit (5 ppb) to
several grams per tonne.
Approximately 5,000 soil samples were collected at two different times, with Phase 2
sampling, in the southeast corner of the project area, completed almost a year later than
Phase 1. The background Au values in Phase 2 are elevated relative to the background
values in Phase 1. The selection of anomalous areas is biased because of the shift in
background values. It is most likely that the problem arose due to a shift in the accuracy
of the fire assay determinations for Au. A system of reference materials, blanks and field
duplicates would have identified the shift in background Au values and the laboratory
could have been requested to repeat the determinations.

Figure 2: Distribution of Au in soils over an area of 6 km by 6 km, showing Phase 1, the break between Phases 1 and 2, and Phase 2.


Selective extractions (such as MMI, Enzyme Leach, cold hydroxylamine, sodium
pyrophosphate and others) are generally expected to have high signal-to-noise ratios.
The ratio of anomalous to threshold values may be a factor of ten or higher. In this case,
the precision of each determination is not as important as for other types of surveys. It is
necessary to monitor the accuracy of the determinations, especially whether the
extractions were performed consistently and if the chemistry of the sample affected metal
extractability. A system of reference materials is required but crosscheck analyses may
not be useful. Due to the low elemental concentrations, sources of contamination need to
be controlled and monitored.
In some surveys, it is important to monitor gross errors more closely than systematic
errors. Gross errors include switching samples during analysis, samples numbered
incorrectly in the field, and other randomly introduced human errors. Regional stream
sediment surveys may be designed with the collection of one sample per 10 square
kilometres or more. If two consecutive samples are mixed up and only one is anomalous,
then follow-up work is concentrated in the wrong area. Not only is the cost of the follow-up work lost but the prospective ground is never tested. Gross errors in regional surveys are a serious problem and require a quality control program to address them.
Many of the most rigorous quality control programs are designed for advanced drill
programs. In an unusual case, Cu values were reported by two different laboratories on
the same pulp for almost 10,000 samples using acid digestion with instrumental finishes.
Most of the Cu values agreed within ±10%, except for a group of 115 samples with differences of over 50% between values (Figure 3).
Figure 3: Comparison of Cu (%) determinations by Labs 1 and 2, with ±10% lines. The majority of the 10,000 data points fall between the ±10% lines and are not shown on this graph.
The differences are attributable to gross errors in almost every case. In most cases, it
appears that samples were switched when they were weighed, at either of the two
laboratories. One of the two laboratories is highly computerized so it is unlikely that
errors were made in data transfer or data transcription. The other laboratory is less
computerized and some errors are related to data entry.
The error rate is in the order of 0.1% for this comparison of Cu values on the same pulp.
There are relatively few processes involved in these determinations, relative to fire assay
for example. Samples were weighed, acids added, the test tube racks were presented to
the atomic absorption spectrometer or ICP, and data were collected and transferred to
reports. For methods that require greater sample handling, such as fire assay, higher rates
of gross errors are expected.
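A sketch of the kind of pairwise check used to flag such gross errors; the 50% relative-difference cut-off follows the example above, but the paired values themselves are hypothetical.

```python
import numpy as np

# Hypothetical paired Cu (%) determinations on the same pulps by two laboratories
cu_lab1 = np.array([0.52, 0.31, 1.10, 0.75, 0.08])
cu_lab2 = np.array([0.50, 0.33, 0.22, 0.78, 0.09])

pair_mean = (cu_lab1 + cu_lab2) / 2
rel_diff = 100 * np.abs(cu_lab1 - cu_lab2) / pair_mean   # relative difference, %

suspect = rel_diff > 50          # candidate gross errors, e.g. switched samples
print(f"{suspect.sum()} of {len(pair_mean)} pairs differ by more than 50%")
print(f"gross-error rate = {100 * suspect.sum() / len(pair_mean):.1f}%")
```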


Gross errors that could be introduced in sample drying, crushing and pulverizing were not
measured when comparing the Cu determinations on the same pulp. There are many
steps in these processes where gross errors can be introduced.
Where the errors are due to two consecutively numbered samples being switched, there is
essentially no impact on ore reserve calculations since the samples are most likely from
the same mineralized zone. However, if the samples were from a regional survey the
consequences might be more severe.
This example demonstrates that a quality control program may need to measure
reproducibility (i.e., precision) and the number of defects or gross errors in the data set.
One other design criterion is compliance with regulations. Some jurisdictions and
corporations have requirements to report quality control measures. These are not
technical considerations but need to be incorporated into the quality control program.
At the Define stage of the process, the key items to be monitored are determined and the
risks associated with errors are recognized. The next step is to identify what needs to be
Measured.
3.2 Measure
To determine what needs to be measured, it is necessary to have a thorough
understanding of the processes. At each stage of sample preparation, weighing,
digestion, etc., there is the potential for errors.
In the Define stage, it was determined which criteria were important. Knowing, for
example, that it is critical to monitor accuracy, the flow sheet in Figure 4 can be
examined for processes where inaccuracy could be introduced. Once these processes are
identified, it is possible to take measurements that will identify when the process is out
of control. Flow charts of all processes, from sampling to data acquisition, can be
developed for each project and evaluated to identify sources of contamination, inaccuracy
and sample inhomogeneity.
The quality control program is custom designed for a project. The level of confidence in
the laboratory, anticipated grades, distribution of the mineralization and other factors
determine the approach selected.


Figure 4: Flow sheet for drill core sampling, sample preparation and analysis, showing the points at which blanks, standards and duplicates are inserted. A 3-inch diameter core is sampled over 0.5 m intervals (~1.5 kg); the core is split in half and the remaining half archived, and in 1 case in 25 the second half of the core is sampled and treated as the first half. Samples are dried at 110 degrees C for 24 hours, crushed to 80% passing 2 mm (with a coarse blank inserted 1 in 50 and the crusher cleaned), split to 500 g (with purchased or prepared standards inserted), pulverized to -88 microns and split to a 100 g aliquot for chemical analysis, with the 400 g balance and the -2 mm reject archived. In 1 case in 25 a 500 g split of the -2 mm material is submitted, 1 pulp in 25 is renumbered and submitted for duplicate analysis, and 1 in 25 is randomly selected for submission to a secondary laboratory. The legend distinguishes steps where contamination, sampling error and accuracy are monitored; quality control results are reviewed before the analytical data are merged with sample locations and descriptions in the database.


Some basic concepts such as the insertion of blanks and field duplicates are described
below for a typical regional geochemical survey. These descriptions assume that samples
are submitted to the same laboratory for preparation and analysis. Some mining
companies are using one laboratory for sample preparation, inserting quality control
samples and then submitting all the pulps to another laboratory for analysis. This
approach has the advantage that blanks, standards and duplicates can be submitted
"blind". When this approach is used, some of the issues discussed below are not relevant.
Blanks: It is recommended that a non-mineralized material be inserted on a routine basis. This material is submitted unprepared; it is not crushed or pulverized before insertion. Higher-than-expected analytical results would indicate that the material was contaminated during sample preparation or laboratory analysis, or perhaps replaced with a mineralized sample. Incorrect results would then mean that a batch of samples would have to be re-analyzed or changes made to standard procedures.
The blank material should closely resemble the material being submitted for analysis.
For a stream sediment survey, a bulk sample would be collected from an area where there
is no known mineralization. Similarly, a bulk sample would be collected from a pit in a
barren area to acquire material that resembles the sampling horizon in a soil survey.
Blanks should be submitted so that they are not distinctive from the other samples in the
shipment.
It is necessary to pre-assign sample numbers for blanks so that these numbers are not
used inadvertently in the field.
Duplicate samples: Duplicate samples are used to (a) monitor sample batches for
potential sample mix-ups and (b) monitor the data variability as a function of both
laboratory error and sample homogeneity.
Collection of duplicate samples from the same site is a useful means of monitoring both
site homogeneity and analytical precision. The degree of reproducibility will determine
how sensitive the data are to site variation and therefore improve the interpretation of
anomalies. Very poor reproducibility of results could indicate that sample mix-ups have
occurred in the laboratory.
Field duplicates are generated by collecting a sample twice from the same site using the
same procedure each time. The duplicate samples should not be labelled with
consecutive numbers. The sample numbers should be separated by at least 20 sample
numbers so that the two samples will be analyzed in different laboratory batches. The
duplicate sample can be labelled with a random number from the sample tag book; the
duplicate number is recorded at the same location as the original sample.
Alternatively, field crews can be issued with a separate sample tag book with a different
series of numbers, which is specifically used for duplicate samples.
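One possible implementation of such a numbering scheme is sketched below. The insertion rates (a blank every 20 routine samples, a field duplicate every 25) and the use of a reserved tag block for blanks and duplicates are illustrative assumptions, not a prescribed standard.

```python
import random

def build_submission(first_tag, n_routine, blank_every=20, dup_every=25):
    """Assign pre-planned sample tags for a shipment, inserting blanks and field duplicates.

    Routine samples take consecutive tags; blanks and duplicates draw tags from a reserved
    block well above the routine range, so a duplicate tag differs from its parent by far
    more than 20 numbers and blank tags cannot be used inadvertently in the field.
    """
    reserved = list(range(first_tag + n_routine + 100, first_tag + n_routine + 400))
    random.shuffle(reserved)

    submission = []                                  # (tag, sample_type, parent_tag)
    for i in range(n_routine):
        tag = first_tag + i
        submission.append((tag, "routine", None))
        if (i + 1) % blank_every == 0:
            submission.append((reserved.pop(), "blank", None))
        if (i + 1) % dup_every == 0:
            submission.append((reserved.pop(), "field duplicate", tag))
    return submission

for tag, kind, parent in build_submission(5000, 50):
    if kind != "routine":
        print(tag, kind, parent or "")
```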

After approximately 25 duplicate pairs have been generated, the data should be reviewed
to determine if the data precision is acceptable. It may be necessary to increase the size
of the sample collected or alter laboratory procedures if sample inhomogeneity is
suspected to be a problem. If the laboratory is suspected of numerous errors, it may be
necessary to select a different laboratory for the project.
Duplicate samples should also be generated at each stage where a sample is split. Sample
preparation duplicates are generated by pulverizing two splits of the crushed sample. If
there is a two-stage crushing procedure, preparation duplicates should be generated for
each step of the procedure. Laboratories routinely analyse pulps in duplicate and these data
can be requested from the laboratory.
It is also recommended that the second half of split drill core be routinely sampled and
assayed to determine the variability in two halves of the same core. However, many
companies are reluctant to utilize both halves of the drill core and this is usually a
management decision.
Control samples: For most regional geochemical surveys, blank samples and field
duplicates are included routinely. In some cases, it is also preferable to insert control
samples on a routine basis. Drilling campaigns will also include the use of control
samples as the accuracy of the assays must be documented for ore reserve calculations.
Control samples are a preferred method for monitoring the consistency of a laboratory.
Usually a homogeneous, fine grained pulp is submitted routinely to a laboratory for
analysis. Standard reference materials are available from various government institutions (CANMET in Canada, the National Institute of Standards and Technology, formerly the National Bureau of Standards, in the U.S.) but are generally too expensive to be used on a routine basis. More commonly, 5 to 10 kg of pulp is prepared from material at the project site, covering the range of expected values.
No more than seven control samples are necessary but it is important to prepare an
adequate volume of material. It is useful to have a series of control samples to cover the
range of anticipated values and also so that it is more difficult for the laboratory to
anticipate the correct value. However, if too many different control samples are
introduced it is difficult to accumulate the necessary statistics on data variability and it is
more likely that mistakes will occur when recording which control samples were inserted.
It is common to submit material that is considered suitable for control samples to five or
six laboratories to determine an accurate value for each material. Multiple aliquots of the
control samples should be submitted along with purchased certified reference materials.
A number of commercially available control standards can be purchased at costs in the order of $50-100 per kilogram, or about 5% of the cost of certified reference materials. These cover a wide variety of materials at different grades and styles of
mineralization. These standards are useful for short drill programs where project-derived
controls are not available. They can also be effective where there are concerns that a

laboratory is familiar with the standards being submitted and is optimizing the results
reported.
Randomization: Some companies have implemented a practice of renumbering all
samples using a random numbering system (Plant, 1973). This approach will identify
laboratory drift and bias more readily than submission of samples that were collected
along sample lines or throughout a drill hole, and numbered consecutively. Some
geologists are reluctant to use this technique as errors may occur in reassigning the
correct sample numbers and locations.
Crosschecks or umpire assays: Many companies have adopted a practice of
resubmitting 5-10% of all sample pulps for analysis to a second laboratory. This
approach identifies variations in analytical procedures between laboratories, possible
sample mix-ups, and whether substantial biases have been introduced during the course
of the project. These are routine checking programs and are far superior to the more
common practice of submitting a selection of samples at the end of a project to an
alternative laboratory for analysis. Unfortunately, if a problem is identified at the end of
a project, the decisions based on the assays, such as drill hole locations, anomaly follow-up, etc., may already have been made and the budget spent.
Selection bias can be introduced if check samples are not selected randomly from the
entire analytical range (Long, 1999). Standards should always be included with the
submission of check samples so that if a bias is identified between the results of two
laboratories, it can be determined which laboratory produced the correct assays.
There is a difference between submitting pulps for check assays and submitting rejects.
Rejects refers to a second split of the crusher product (usually 90% passing 2 mm).
When rejects are submitted it is difficult to discern subtle analytical biases as the
sampling errors are likely greater than the potential analytical errors. Submitting rejects
is a worthwhile test of the splitting procedures at a laboratory but not a good test of
analytical accuracy.
Laboratory Communication: An important feature of project management is to
maintain close communications with the laboratory being used for analyses. Questions
should be asked concerning the type of sample preparation equipment, cleaning methods
and standard operating procedures. It is possible to request documentation of the
analytical procedures from a laboratory and incorporate the information in an appendix of
a report. Documentation of the procedures is particularly important for a long-term
project where procedural changes from year to year may significantly bias the database of
information. It is always recommended to include clear and precise analytical
instructions with every sample batch submitted. Where possible, a laboratory tour should
be arranged. It is becoming more common to sign laboratory contracts that specify
analytical methods, price, turnaround time and quality control expectations. A contract
can clarify when a laboratory will repeat assays free-of-charge if there are quality control
failures.

In summary, a variety of quality control measures can be used to monitor data quality.
Some of the approaches include:
- insertion of control samples
- insertion of international reference materials
- submission of field duplicates
- submission of sample preparation duplicates (approximately 90% passing 2 mm)
- randomization of sample numbers before submission to a laboratory
- comparison of multi-element trends for elements determined by different laboratory procedures
- comparison of results for the same element determined by different methods
- routine insertion of an unprepared, barren sample (blank)
- routine insertion of a pulverized barren sample (blank)
- analysis of 5-10% of sample pulps at an umpire laboratory
- analysis of 5-10% of sample rejects at an umpire laboratory.

3.3 Analyze
Overall quality of the data can be improved by quickly identifying and remedying the
problems. Results for control samples and blanks should be reviewed as soon as every
laboratory certificate is received. Control charts are used to monitor the data and decide
immediately whether the results are acceptable.
3.3.1 Control Charts
Each time the laboratory reports a value for one of the control samples, including the
blank, the value is plotted on a control chart. These charts can be computer plotted but it
is just as useful to plot the results by hand on a piece of graph paper. The graph paper is
prepared with the mean, mean plus two standard deviations and the mean minus two
standard deviations drawn as lines across the chart. The mean plus two standard
deviations is the Upper Control Limit and the mean minus two standard deviations is the
Lower Control Limit.
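A minimal control-chart sketch, assuming the accepted mean and standard deviation of the control sample are already established; the reported values below are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

mean, sd = 2.45, 0.08                 # accepted value and standard deviation (e.g. g/t Au), hypothetical
ucl, lcl = mean + 2 * sd, mean - 2 * sd

# Values reported for the control sample in successive laboratory batches (hypothetical)
reported = np.array([2.41, 2.50, 2.47, 2.62, 2.38, 2.44, 2.55, 2.29, 2.46])
batch = np.arange(1, len(reported) + 1)

out_of_control = (reported > ucl) | (reported < lcl)
print("batches to query:", batch[out_of_control])

plt.plot(batch, reported, "o-")
for level in (mean, ucl, lcl):        # mean, Upper Control Limit, Lower Control Limit
    plt.axhline(level, linestyle="--", linewidth=0.8)
plt.xlabel("Batch")
plt.ylabel("Reported value for control sample")
plt.show()
```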
The mean and standard deviation are derived from the multiple analyses performed at
several laboratories to establish acceptable values for the control samples. It is important
to understand the statistics associated with the estimation of the accepted value in order to
evaluate the results.
The mean of the element concentrations for a CRM is derived from the multiple analyses performed at numerous laboratories to establish acceptable (or expected) values for control samples. The determination of the mean is complicated when there are apparent outliers or sets of data from specific laboratories that appear to be biased. The International Organization for Standardization (ISO) recommends that outliers should not be excluded on purely statistical evidence until they have been thoroughly investigated and, where possible, reasons for the discrepancies identified. A variety of statistical tests for outliers exist (Verma, 1997) and are applied by the supplier of the CRM.

The establishment of an appropriate range of acceptable values is more difficult. Gold standard MA-1b, produced by the Canadian Certified Reference Materials Project, a division of Natural Resources Canada, is used to demonstrate this point.
The Certificate of Analysis for MA-1b reads as follows:

REFERENCE GOLD ORE MA-1b
Au: Recommended Value 17.0 µg/g; 95% Confidence Interval ±0.3 µg/g

This is the label on the purchased bottles, and many purchasers assume that 95 out of 100 times a laboratory's results for MA-1b should therefore fall between 16.7 and 17.3 µg/g Au.
However, it is stipulated in the literature that accompanies the bottle that "the uncertainty estimates the expected range of reproducibility of this mean within 95% probability were the measurement program to be repeated many times." In fact, the 95% Confidence Limit quoted denotes that if the certification program were to be conducted 100 times, the overall mean in 95 cases would be expected to fall within the prescribed limits.
The certification program for MA-1b included 175 acceptable analytical determinations
by 28 laboratories. When certified reference materials or standards are inserted with
samples there is only one determination and the 95% Confidence Limit quoted is not
applicable for measuring the acceptability of the reported value.
Along with the MA-1b documentation is an additional table of statistics that is
reproduced below.

Distribution of results by method

    Method     No. of Sets   No. Results   Mean (µg/g)    CI     SLc     Src    CV, %
    FA/G            20           113          16.96      0.30    0.61    0.37    1.93
    FA/AAS           8            44          17.26      0.85    0.99    0.42    2.30
    INAA             3            20          17.21      1.85    0.66    0.65    3.88
    FA/INAA          1             4          16.23        -       -     0.36    2.25
    FA/ICP           1             5          17.36        -       -     0.29    1.66
    Overall         33           186          17.05      0.26    0.70    0.42    2.22


SLc is the between-set standard deviation and Src is the within-set standard deviation. The bottles of MA-1b are labelled with the overall mean, 17.0 µg/g, and the 95% Confidence Limit, ±0.30 µg/g.
The more meaningful statistics for explorationists, when a single determination by a
single laboratory is being evaluated, are the overall between- and within-set standard
deviations. The within-set standard deviation is a reflection of the homogeneity of the
material in the bottle received by the participating laboratory in combination with that
laboratory's ability to reproduce the analytical method routinely. The between-set
standard deviation is likely the most useful statistic as this takes into account slight biases
between laboratories, the differences between the sub-samples received by the
laboratories, in addition to the factors described for the within-set standard deviation.
The mean ± two standard deviations approximates the expected range of values for 95% of the cases. Using SLc, the between-set standard deviation, of 0.70 µg/g and the calculated mean value of 17.0 µg/g, the results for MA-1b are expected to fall within 15.6 to 18.4 µg/g Au. Based on the 95% Confidence Limit as indicated on the label of the bottle, the range of results for MA-1b is 16.7 to 17.3 µg/g Au, which is a considerably narrower range of acceptable values.
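The arithmetic behind the two ranges, as a worked sketch using the values quoted above:

```python
mean = 17.0        # recommended value for MA-1b, ug/g Au
ci95 = 0.3         # 95% confidence interval on the certification mean, ug/g
s_between = 0.70   # overall between-set standard deviation (SLc) from the certificate table, ug/g

# Range implied by the label: applies to the certification mean, not to a single assay
print("label range:       ", round(mean - ci95, 1), "to", round(mean + ci95, 1))                    # 16.7 to 17.3

# Range expected to contain ~95% of single determinations: mean +/- 2 between-set sd
print("single-assay range:", round(mean - 2 * s_between, 1), "to", round(mean + 2 * s_between, 1))  # 15.6 to 18.4
print("relative tolerance: +/-", round(100 * 2 * s_between / mean, 1), "%")                         # 8.2%
```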
The range established by using SLc, the between-set standard deviation, is equivalent to ±8.2% of the certified value of 17.0 g/t Au. "If the user must demonstrate that the method gives an accuracy to an uncertainty better than ±8%, she should select a different certified gold ore which has less between-set variability. On the other hand, if the user's method gives no better than ~±8%, MA-1b is a suitable reference material" (Steger, 1998).
This is a very complicated subject, and using SLc, the between-set standard deviation, to estimate the allowable range of values is only a first approximation.
The International Organization for Standardization has several committees that have been working on these questions for 25 years; their recommendations are covered by ISO Guides 30 to 35. Several statistical approaches to evaluating the acceptance of single and replicate assays are documented; these seem to involve equations (such as the one shown below) in which different sets of conditions need to be met.
| XC − XL | ≤ 2 √( σLm² + σRm² )
Further details on the evaluation of results for certified reference materials are described
in a condensed version of ISO Guide 33 available from the Canadian Certified Reference
Materials Project at CANMET, Ottawa, Canada (ccrmp@nrcan.gc.ca).
This does not take into account the fact that 5% of the cases will be outside this acceptable
range, based on the definition of the mean plus or minus two standard deviations.

Many suppliers of reference materials report the error as the 95% Confidence Limit or
some other statistic to demonstrate that the materials are homogeneous and the round
robin was conducted properly. These values should not be used to decide on which
assays should be rejected for a quality control program.
It may also be necessary to consider whether the analytical methods being used are
equivalent to those used for the establishment of acceptable values.
When a number of different reference materials are used for the same project, it may be
difficult to interpret trends based on control charts for individual standards. The results
for any number of standards can be compiled on one graph by plotting the reported value
as a percentage of the expected value (Figure 5). Chemists more often plot Z-scores
against time, where the Z-score is the reported value less the expected value, divided by the standard deviation. Division by the standard deviation provides additional insight
as to whether the differences are significant.

Figure 5: Compilation of results for multiple controls for the same project. Au assays for 22 standards, ranging in gold grade from 0.7 to 11 g/t, are plotted as a percentage of the expected value against time sequence by batch and sample number (N = 307), with a moving average; batches 752 and 761 are flagged.
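A sketch of the two compilation statistics described above (per cent of expected value, as plotted in Figure 5, and the Z-score), assuming each record carries the reported value and the accepted mean and standard deviation of whichever standard was inserted; the numbers are hypothetical.

```python
import numpy as np

# Hypothetical results for several different gold standards inserted over a project
reported = np.array([1.02, 0.68, 5.30, 2.51, 10.6])   # reported values, g/t Au
expected = np.array([1.00, 0.70, 5.10, 2.45, 11.0])   # accepted values of the standards
sd       = np.array([0.05, 0.04, 0.20, 0.08, 0.40])   # accepted standard deviations

pct_of_expected = 100 * reported / expected           # the quantity plotted against time in Figure 5
z_score = (reported - expected) / sd                  # difference in units of standard deviations

print("per cent of expected:", np.round(pct_of_expected, 1))
print("Z-scores:            ", np.round(z_score, 2))
```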

When unacceptable values are found, it is appropriate to contact the laboratory and
request additional analyses. The point is not simply to obtain the correct value for the control sample; the information is used to ensure that the samples analyzed in the same batch
have been reported properly. As a guideline, if a control sample is inserted with every 20
samples then the 10 samples before the out-of-control sample and 10 samples after it
should be requested for re-analysis. If there are other reasons to suspect the results in a
laboratory batch, for example results for the duplicates, then additional tests may be
required.
Control charts should be prepared for each control sample or blank and for each element.
In the case where a 30+ element ICP scan has been requested, the preparation of these
graphs becomes an onerous task. It is acceptable to plot a selection of elements but it is
important that at least several elements are monitored for each analytical method.
3.3.2 Plotting Duplicates
It is also necessary to review the data for various types of duplicates. The preparation of
simple X-Y plots of the two results using a spreadsheet program is the easiest way to do
this. The same scale is selected for the X- and Y- axes; where there is a broad range of
values it may be preferable to use a logarithmic scale for the axes.
All values should plot close to the X = Y line and precision envelopes can be drawn so
that points that fall outside these envelopes are automatically recorded as unacceptable
(Figure 6).
Figure 6: Precision envelopes. Duplicate Cu determinations (ppm) plotted on log-log axes against the X = Y line with +10% and -10% envelopes (N = 488).
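A sketch of the duplicate X-Y check with ±10% envelopes, assuming the paired results are held in two arrays; the duplicate data below are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical duplicate pair results (ppm Cu)
x = np.array([12.0, 55.0, 230.0, 1800.0, 9500.0])   # original determination
y = np.array([11.0, 61.0, 215.0, 1950.0, 6400.0])   # duplicate determination

outside = (y > 1.1 * x) | (y < 0.9 * x)              # pairs falling outside the +/-10% envelope
print("pairs outside +/-10%:", int(np.count_nonzero(outside)))

line = np.array([1.0, 1e5])
plt.loglog(x, y, "o")
plt.loglog(line, line, "-", label="X = Y")
plt.loglog(line, 1.1 * line, "--", label="+10%")
plt.loglog(line, 0.9 * line, "--", label="-10%")
plt.xlabel("Cu, original (ppm)")
plt.ylabel("Cu, duplicate (ppm)")
plt.legend()
plt.show()
```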


Alternatively, Howarth-Thompson plots (Thompson, 1992) are used when there is a broad range of values to evaluate and when precision is to be calculated. These plots have the advantage of graphically displaying the differences between values so that both the absolute difference and the percentage difference can be easily determined.
The graph is constructed by:
(i) calculating the average of the two results
(ii) calculating the difference between the two results
(iii) converting the differences into absolute values
(iv) plotting the average against the difference on log-log axes of an X-Y plot, as sketched below.
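A sketch of steps (i) to (iv), together with the signed-difference variant on arithmetic axes used to look for bias (as in Figure 7); the duplicate data are hypothetical.

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical duplicate determinations (ppm Cu)
d1 = np.array([15.0, 120.0, 850.0, 4200.0, 22000.0])
d2 = np.array([13.0, 131.0, 780.0, 4350.0, 20500.0])

pair_mean = (d1 + d2) / 2          # step (i)
diff = d1 - d2                     # step (ii)
abs_diff = np.abs(diff)            # step (iii)

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 4))

# Step (iv): mean vs. absolute difference on log-log axes
ax1.loglog(pair_mean, abs_diff, "o")
ax1.set_xlabel("Mean of duplicates (ppm)")
ax1.set_ylabel("|difference| (ppm)")

# Variant for detecting bias: signed difference on arithmetic axes
ax2.plot(pair_mean, diff, "o")
ax2.axhline(0.0)
for pct in (0.05, 0.10):
    ax2.plot(pair_mean, pct * pair_mean, "--")
    ax2.plot(pair_mean, -pct * pair_mean, "--")
ax2.set_xlabel("Mean of duplicates (ppm)")
ax2.set_ylabel("Difference (ppm)")

plt.tight_layout()
plt.show()
```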
Variations of this graph can be used to compare data where it is important to monitor
whether there is a positive or negative bias between two sets of results, for example, the
results from two laboratories. In this case, the difference between two results is plotted
against the average of two results and arithmetic axes are used (Figure 7). The same data
set is displayed in Figures 6 and 7 but it is easier to visualize the differences in Figure 7.

Figure 7: Mean vs. the difference plot. The difference between laboratory duplicate Cu results (ppm) is plotted against the mean of each pair on arithmetic axes, with the zero-difference (X = Y) line and ±5% and ±10% envelopes (N = 480).


A comparison of results for duplicate samples or crosschecks will identify laboratory
problems and also provide an estimate of sample homogeneity. In certain cases it may be
proven that the laboratory is providing reliable data but that field sampling programs
and/or sample preparation procedures need to be modified in order to improve the
precision of the data.
3.4 Improve
A quality control program is designed to measure the variation in precision, accuracy,
sample representivity and other parameters, as required. The quality control data provide
the numerical basis to improve processes and plan for improvements.

Necessary action could include a request for re-assaying, cancellation of a laboratory contract or changes to sampling or analytical procedures. For example, it may be
necessary to alter sample collection, crushing, splitting and grinding procedures based on
the results of quality control data in order to improve sample representivity. The
additional cost of using more laborious splitting procedures or pulverizing a larger
sample aliquot could be justified based on scientific data.
It may be recognized that lower detection limits and therefore different analytical
methods and instrumentation are required.
The interpretation of geochemical data requires an understanding of the precision and
accuracy of the data. In the simplest case, if an anomalous value is determined to be
greater than 1000 ppm and the precision of the data is ±50%, then any value greater than 500 ppm is technically anomalous. If the precision of the data is ±10%, then only values above 900 ppm are anomalous. The need to strictly apply precision limits is
alleviated somewhat by evaluating trends in the data and integration with other data sets.
However, the use of ratios or other multi-element calculated scores further complicates
the issue as errors may be cumulative.
The quality control requirements developed at the Define stage of the process should
have incorporated the concerns that could arise during interpretation. However, if
problems are recognized with the interpretation and integration with other data sets,
additional improvements to the sampling and analytical processes may be necessary.
3.5 Control
For ongoing control of the data, it may be necessary to refine measurements and
investigate alternative methods of displaying the data. A critical concern is that the
quality control data are reviewed regularly. It is important to automate as much of the
process as possible or use a third party to evaluate the data and act as an intermediary
with the laboratory. Company management needs to take a lead role. The cost of quality
control procedures including the additional assays has to be included in the budget. Any
project review should start with a brief examination of the quality control measures. A
project review should not proceed if it cannot be demonstrated that the data are valid.
4.0 Conclusions
The Six Sigma DMAIC approach is an attractive model for the implementation of a
quality control program and the evaluation of the data. The approach develops a series
of measurements that are used to control data quality and to make improvements to the
processes. Improvements to sampling, preparation, analysis and quality control can be
implemented on the basis of numerical data and evaluated for their cost-effectiveness.

SELECTED BIBLIOGRAPHY
Amor, S., Bloom, L. and Ward, P., 1998, Practical application of exploration
geochemistry. Proceedings of a short course presented by the Prospectors and
Developers Association of Canada, Toronto.
Bloom, L., 1999, Third party vetting of geochemical programs or return on quality, in
Quality Control in Mineral Exploration: A short course presented during the 19th
International Geochemical Exploration Symposium, April 11, 1999.
Bloom, L., 1998, The Role of Economic Geologists in Evaluating Assay Data Quality.
Proceedings of a short course presented by the GAC and PDAC, November, 1998.
Bloom, L., 1993, Man-made parameters in elemental analysis: SME Pre-print 93-79: p.5.
Burn, R.G., 1981, Data reliability in ore reserve assessments: Mining Magazine, October,
p. 289-299.
Clifton, H.E., Hunter, R.E., Swanson, F.J. and Phillips, R.L., 1969, Sample size and
meaningful gold analysis: United States Geological Survey, Professional Paper 625-C.
Fletcher, W.K., 1981, Quality control in the laboratory, in Govett, G.J.S., ed., Analytical
Methods in Geochemical Prospecting: Handbook of Exploration Geochemistry, v.1, p.
25-46, Elsevier.
Fletcher, W.K., 1987, Analysis of soil samples, in Fletcher et al., eds., Exploration
Geochemistry: Design and Interpretation of Soil Surveys, (1987): Soc. Econ. Geol., p.
79-96.
Garrett, R.G., 1969, The determination of sampling and analytical errors in exploration
geochemistry: Econ. Geol., v. 64, p. 568-571.
Govindaraju, K., 1994, Compilation of working values and sample description of 383
geostandards: Geostandards Newsletter, v.18, p. 1-158.
Gy, P., 1976, The sampling of particulate materials - a general theory: Symposium on
sampling practices in the mineral industries, The Australian Inst. of Mining and
Metallurgy, Victoria, Australia, p. 17-33.
Hall, G.E.M., 1996, Twenty-five years in geoanalysis, 1970-1996: Jour. Geochem. Expl.,
v. 57, Nos. 1-3, 1-8.
Hall, G.E.M. and Bonham-Carter, G., 1988, Review of methods to determine gold,
platinum and palladium in production-oriented laboratories, with application of a
statistical procedure to test for bias: Jour. Geochem. Expl., v. 30, p. 255-286.

Hall, G.E.M., Vaive, J.E., Coope, J.A. and Weiland, E.F., 1989, Bias in the analysis of geological materials for gold using current methods: Jour. Geochem. Expl., v. 34, p. 157-171.
Hill, W.E., 1974, The use of analytical standards to control assaying projects: Vancouver
IGES, p. 651-657.
Howarth, R.J. and Thompson, M., 1976, Duplicate analysis in geochemical practice, Part
II: Analyst 101 (1206), p. 699-709.
Kane, J.S., 1992, Reference samples for use in analytical geochemistry: Their
availability, preparation and appropriate use: Jour. Geochem. Expl., v. 44, p. 37-63.
Kirschling, G., 1991, Quality Assurance and Tolerances: Springer Verlag, 335 p.
Kretz, R., 1985, Calculation and illustration of uncertainty in geochemical analysis: Jour.
Geol. Educ., v. 33, p. 40-44.
Levinson, A.A., Bradshaw, P.M.D. and Thomson, I., 1987, Discrepancies in analytical
determinations of gold, Arizona, U.S.A., in Levinson, A.A., Bradshaw, P.M.D. and
Thomson, I., eds., Practical Problems in Exploration Geochemistry: Applied Publishing,
p. 148-149.
Long, S., 1999, Impact of selection bias, in Quality Control in Mineral Exploration: a
short course presented during the 19th International Geochemical Exploration
Symposium, April, 1999.
Mining Standards Task Force, Toronto Stock Exchange, 1999. Mineral Exploration Best
Practices Guidelines, October 1999.
Plant, J.A., 1973, Random numbering system for geochemical samples: IMM Trans. 82,
B64-B65.
Plant, J.A., Jeffrey, K., Gill, E. and Fage, C., 1975, The systematic determination of
accuracy and precision in geochemical exploration data: Jour. Geochem. Expl., v. 4(4), p.
467-486.
Potts, P.J., Tindle, A.G. and Webb, P.C., 1992, Geochemical reference material
compositions: Rocks, minerals, sediments, soils, carbonates, refractories and ore used in
research and industry: CRC Press Inc.
Pyzdek, T., 1989, What every engineer should know about quality control. ASQC
Quality Press, N.Y. 251 p.

Ramsey, M.H., Thompson, M. and Hale, M., 1992, Objective evaluation of precision
requirements for geochemical analysis using robust analysis of variance: Jour. Geochem.
Expl., v. 44, p. 23-36.
Stanley, C., 1999, Treatment of geochemical data: Some pitfalls in graphical analysis, in
Quality Control in Mineral Exploration: A short course presented during the 19th
International Geochemical Exploration Symposium, April 11, 1999.
Steger, H.F., 1998, Uses of matrix reference materials: CCRMP Division Report MMSL 98-024 (OP&J), Project MMSL No. 600637; prepared for presentation at the IUPAC/ISO/REMCO workshop on Reference Materials, Berlin, April 22-23, 1999, and the workshop of the conference of the Canadian Mineral Analysts, Kirkland Lake, ON, September 17, 2001, and for publication in the respective workshop proceedings.
Thompson, M., 1983, Control procedures in geochemical analysis, in Howarth, R.J., ed.,
Statistics and Data Analysis in Geochemical Prospecting: Handbook of Exploration
Geochemistry, v.2, p. 39-58, Elsevier.
Thompson, M., 1992, Data quality in applied geochemistry: the requirements and how to
achieve them: Jour. Geochem. Expl., v. 44, p. 3-22.
Thompson, M. and Howarth, R.J., 1978, A new approach to the estimation of analytical
precision: Jour. Geochem. Explor., v. 9, p. 23-30.
Vaughn, R.C., 1990. Quality Assurance. Iowa State University Press, Ames, Iowa.
Verma, S.P., 1997. Sixteen statistical tests for outlier detection and rejection in evaluation
of international geochemical reference materials: example of Microgabbro PM-S.
Geostandards Newsletter, Vol.21, No.1, pp. 59-75.
