Is an 80th Percentile Design Point Logical?


D David
FAusIMM, Technical Director Process, AMEC, Level 7, 197 St Georges Terrace, Perth WA 6000. Email: dean.david@amec.com

ABSTRACT
Clearly, a plant designed only to treat average ore at the nameplate rate will fail to achieve nameplate in any typical year. To provide the capability needed to achieve nameplate, it is common process engineering design practice to require that the plant be able to treat ore with an 80th percentile value (of hardness or competence, for example) at the nameplate rate. In the author's experience, apart from a couple of notable cases, this principle has been applied reasonably well, and with widespread success. However, in a number of recent instances it became clear that the use of the 80th percentile number would have resulted in significant under-design of the plant. This paper makes the case that the 80th percentile, as a design principle, can have serious flaws and that its use needs to be assessed on a case-by-case basis. The discussion in this paper is the first step in developing a new design principle, and the associated methodology for selecting a design value, that will ensure plants are designed to achieve their nameplate capacities.

INTRODUCTION
Designing a process plant involves many conventions that are often taken for granted. One of these is that the 80th percentile value of a key measure, like the Bond ball mill work index (BWI), will provide an unquestionable margin of design safety in the plant. The normal procedure is to take a set of test results, for example 20 individual BWI values from around the orebody, arrange them from smallest to largest and take the 16th value (the fourth largest). This is placed in the design criteria as the 80th percentile, usually alongside the average BWI value, and is subsequently used to design the ball mill. Many years later that number can assume almost legendary status as THE work index of the deposit, or it may have assumed infamous status as the main cause of the failure of the plant to achieve nameplate throughput or grind size.
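For illustration, the conventional selection can be expressed in a few lines of Python. This is a minimal sketch; the 20 BWI values are invented placeholders, not data from any project.

```python
# Conventional 80th percentile selection from a set of test results.
# The BWI values below are hypothetical placeholders.
import numpy as np

bwi = np.array([12.1, 13.4, 11.8, 14.2, 12.9, 13.7, 11.5, 12.6, 14.8, 13.1,
                12.3, 13.9, 11.9, 12.8, 14.5, 13.3, 12.0, 13.6, 12.5, 14.0])

design_bwi = np.percentile(bwi, 80)  # interpolated 80th percentile
print(f"Average BWI:         {bwi.mean():.1f} kWh/t")
print(f"80th percentile BWI: {design_bwi:.1f} kWh/t")
```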
The purpose of this paper is to explore circumstances where the 80th percentile value will not provide
a definitive design point. Guidelines for avoiding being fooled by a false design value are provided,
together with the basis for alternative design methodologies.

PROBLEMATIC DATA SETS


The annual plan
If the 20 values of a particular measurement are provided from a 20 year annual mine plan, then failure to design for the variability of that measure is virtually assured. One of the most common values that dangerously makes its way into design criteria from mine plans is head grade.
A recent analysis of copper head grade for a project determined the variability of a number of related
data sets ranging from the most inherently variable to the least. The most variable data sets are those
that look at the orebody in individual parcels that may represent minutes or hours of plant feed. Two
examples of such data (present in virtually all projects) are the drill database and the block model. At
the other end of the variability scale is the annual mine plan. To arrive at the annual mine plan it is
normal to use the individual ore blocks contained in the block model in an orderly and controlled
fashion. The use of blocks (in an open pit example) is orderly because it commences at the surface
and must progress downward in some logical mining sequence. The use of blocks is controlled
because it is typical to attempt to obtain a target head grade to the process plant when making
selections from the blocks that are immediately available to be mined. In many cases there is also a
stockpiling system in place (usually incorporated into the block sequencing procedures) between the
mine and process plant to allow grade control to be achieved when the mine cannot directly provide it.
Usually the measure being controlled is the head grade of the most valuable component, and this is also the measure for which believable variability information is required for design.
On arrival at a copper mine site for commissioning, the author was greeted with the bunded flotation area of the plant freshly filled with copper concentrate, mostly from the cleaners and recleaners. The launders and pumps were continuing to overflow. On asking what had caused the problem, the response was that the plant had been processing 8% Cu in feed for the last day, and that little could be done about the problem immediately as the stockpile was full of 8% copper ore!
The design criteria for this project were, in theory, prepared on an even more conservative basis than
the 80th percentile because the maximum grade had been used (100th percentile). This value was
1.6% Cu and, obviously, nowhere near what was being dealt with by the plant. Although this value
was in the design criteria, the maximum case mass balance assumed a head grade of only 1.39% Cu,
the 80th percentile value. Both the 100th percentile and the 80th percentile values were from the annual
mine plan.
The variability levels for the various data sets described at the commencement of this section are shown in Figure 1. The "Upper Limit 95%" and "Lower Limit 95%" lines represent the range that contains 95% of the data within that particular set. Note that this is not the same project that had the 8% Cu in flotation feed problem.
[Figure: variability bands (Upper Limit 95%, 80th Percentile, Average, Lower Limit 95%) of % Cu in feed, plotted for data sets of increasing aggregation: Core, Met Test Data, Block, Shift, Day, Week, Month and Year.]

Figure 1: Variability Bands for Grade Data Sets from a Single Orebody

The 80th percentile value for the shift data set is 0.91% Cu, and this would be a reasonable value to use for design purposes. The shift data set also predicts that the plant can expect to treat 1% Cu (or greater) in feed for 2.5% of the time (the Upper Limit 95% is actually the 97.5th percentile value), which is 27 × 8-hour shifts per year, or more than two per month. Clearly, choosing to use the 80th percentile annual, monthly or weekly mine plan copper grades for design purposes would dangerously underestimate the real copper grade variability that the plant must be able to process.
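The shift count quoted above follows from simple arithmetic; a minimal sketch, assuming three 8-hour shifts per day and year-round operation:

```python
# Time spent above the 97.5th percentile grade, expressed as shifts per year.
shifts_per_day = 3        # 8-hour shifts
days_per_year = 365
fraction_above = 0.025    # 2.5% of the time above the Upper Limit 95% grade

shifts_above = fraction_above * shifts_per_day * days_per_year
print(f"Shifts per year above 1% Cu: {shifts_above:.0f}")  # ~27
```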
Interestingly, the core database 80th percentile would also underestimate the shift 80th percentile value, while both the metallurgical test data set and the block data 80th percentile values would be acceptable estimates. However, all three small-sample data sets overestimate the Upper Limit 95% value by at least 100% compared to the shift data.

When put in the context of real operational requirements, it becomes clear that the annual plan is not a valid source of process design information.

The composited sample


To save money, a recent client had conducted only five sets of metallurgical tests at laboratory scale for a definitive feasibility study (DFS) level design. Minimal background information, apart from the data itself, was provided for a review. The five BWI values provided all lay between 11.1 kWh/t and 11.8 kWh/t. The standard deviation (SD) between the five tests was 0.3 kWh/t, approximately what is normally considered to be the inherent repeatability of the test itself. The SD of the DWI (drop weight index) results was only 0.4 kWh/m3, again similar to the level of repeatability expected of that test.
The 80th percentile of the five BWI values was only 2% greater than the average value, and the maximum value was only 3.5% above the average. For the DWI result set, the 80th percentile value was only 10% higher than average and the maximum value was not much higher at 13% above average. There were two possibilities to explain these results: either the orebody was the first one in the author's experience where the material was all the same, or something very strange had happened in the constitution of the samples.
In response to a request for more information, the core intervals making up each of the samples were provided. The source of the problem was composite preparation: all semblance of variability had been eliminated from the samples. Each test sample was a composite of a minimum of 60 core intervals from a minimum of four different drill holes. The stated aim of this particular piece of work had been to define the properties of five different areas in the orebody. Obviously, it was confirmed that the five areas can be considered virtually identical, on average, with respect to ball milling. However, the test data set contains compositing equivalent to, at least, the level of an annual mine plan (and perhaps even up to a five year mine plan basis). As demonstrated in the last section, annual plan data is unsuitable because it contains a level of variability far below what is needed for safe design.
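The smoothing effect of compositing can be illustrated with a quick simulation; a sketch with invented numbers, treating a composite's test result as approximately the mean of its constituent intervals (a simplification, since the material is physically blended before a single test):

```python
# How compositing smooths variability: the SD of the mean of n intervals
# falls roughly as 1/sqrt(n). All values are invented.
import numpy as np

rng = np.random.default_rng(7)
intervals = rng.normal(11.4, 2.0, (5, 60))   # 5 composites x 60 core intervals

print(f"SD of individual intervals: {intervals.std():.2f} kWh/t")
print(f"SD of the 5 composites:     {intervals.mean(axis=1).std():.2f} kWh/t")
# Expect ~2.0 / sqrt(60) = 0.26 kWh/t, of the same order as the
# suspiciously low SDs quoted above.
```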
Applying the relationship between annual variability and shift variability from Figure 1, it is possible to provide an estimate of what the true variability of BWI values might have been, had multiple contiguous samples been tested individually. From Figure 1, the 80th percentile value from annual data is only 10% above the average value, while the 80th percentile on a shift basis is 74% above the average value. Therefore, if the difference between the annual BWI 80th percentile and the average BWI value is multiplied by 7.4, the variability will more closely represent a valid design 80th percentile. The outcome of this calculation is a more believable 80th percentile BWI value of 13.2 kWh/t, 16% above the average value of 11.4 kWh/t. Given the uncertainty in what the original data set actually represents, the real 80th percentile value for a production shift could be even higher.
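The re-scaling described above can be sketched in a few lines; the small difference from the quoted 13.2 kWh/t arises from rounding in the published inputs:

```python
# Re-scale the composited 80th percentile using the Figure 1 relationship:
# shift-basis excess over the mean is ~7.4x the annual-basis excess (74% / 10%).
avg_bwi = 11.4                   # kWh/t, average of the composited results
pct80_annual = avg_bwi * 1.02    # composited 80th percentile, 2% above average
scale = 74 / 10                  # shift excess / annual excess, from Figure 1

pct80_design = avg_bwi + scale * (pct80_annual - avg_bwi)
print(f"Re-scaled 80th percentile BWI: {pct80_design:.1f} kWh/t")  # ~13.1
```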
This example clearly shows the danger of relying on composited samples for design purposes. The
circumstances behind these composites are extreme, but it must be recognised that any compositing
reduces the inherent variability that will exist in the test results. Knowing how samples have been
selected and prepared is essential to understanding the design implications of test data.
As a variation on the compositing theme, in many projects tests have been carried out on annual composites (e.g. Year 1 composite, Year 2 composite, etc). Taking an 80th or 100th percentile value from the set of annual composite results will, again, dangerously underestimate the variability the process design needs to cope with.

Multi-modal ore properties


Few orebodies consist of a single lithology or a single geological classification of material. In many
instances (but not all) the comminution and separation properties of the different geological units can
be distinctly different and need to be understood separately. If the geological differences also
correspond to metallurgical differences then the geological ore types are valid geometallurgical ore
types. In addition, each geometallurgical ore type will display a range of properties, and the
proportions of each ore type in plant feed will vary from shift to shift and year to year.
A classic example would be where the orebody has an oxidised cap, a transition zone and fresh rock at
depth. The approach often seen by the author is for the fresh rock to be represented by 20 or more
samples, the transition by five samples and the oxide by two, neatly matching their proportion in the
orebody.
Provided the samples have been selected correctly, it is certainly possible to derive a reasonable 80th
percentile value for the fresh ore. Conversely, it is not even possible to estimate the degree of
variability in key properties (let alone the 80th percentile values) that characterise the oxide cap using
only two samples.
If the oxide cap represents 100% of plant feed for the first six months, then understanding the properties of that material is of reasonably high importance. The plant will be commissioned on that ore and it may play a significant role when bankers' tests and warranty tests are being conducted. In this situation, a minimum of ten spatially distributed samples of oxide ore need to be tested before the variability can be estimated. There is usually one guarantee with oxide ore: it will be more variable than the fresh rock from which it is derived.
The transition ore presents similar problems, not least of which is definition. An ore is called
transition because it is part way between the original rock (in this case fresh ore) and the geological
layer above it (in this case the oxide cap). Invariably, the transition ore will contain examples of the
end members (fresh and oxide) together with everything in between. The high degree of variability in
typical transition ore demands that a reasonable number of samples be tested, provided of course that
the transition ore type represents a substantial plant feed component for a long enough time period to
be considered separately in the design process.
The more components in the ore the more important it is to have detailed mine plans guiding the
design process. The most common misconception for the inexperienced (or those in a hurry) is that
the proportions of each ore type in plant feed can be taken directly from the monthly or annual mine
plan. The only way a realistic appreciation of the variability in ore type proportions can be gained is to
understand how such ore gets from the mine to the plant.
For example: an orebody has two ore types with distinctly different comminution properties, and these ore types are planned to be delivered to the plant, on an annual basis, in a 50:50 ratio. Simplistically, the ore type properties can be averaged and then used for design. However, discussions with the geologists reveal that the first ore type is on one side of the orebody and the other ore type is on the other side of the open pit. Discussions with the miners reveal that they intend to have only one shovel in ore at any one time and that there is no intention to set up blending stockpiles, as it is too expensive to double-handle all the ore. The 50:50 ratio is not a 50:50 blend and has now become sequential processing of one ore type, followed by the other. The resulting design requirements are totally different to the requirements for processing a 50:50 blend.
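A hypothetical sketch of the difference, with invented hardness distributions for the two ore types (and linear blending of BWI assumed purely for illustration):

```python
# Sequential campaigning vs a true 50:50 blend: effect on the design point.
# All numbers are invented; BWI blending is assumed linear for simplicity.
import numpy as np

rng = np.random.default_rng(1)
type_a = rng.normal(10.0, 1.0, 5000)   # softer ore type, BWI in kWh/t
type_b = rng.normal(16.0, 2.0, 5000)   # harder ore type, BWI in kWh/t

blend = 0.5 * type_a + 0.5 * type_b              # shift-by-shift 50:50 blend
sequential = np.concatenate([type_a, type_b])    # one type, then the other

print(f"Blend 80th pct:      {np.percentile(blend, 80):.1f} kWh/t")       # ~13.9
print(f"Sequential 80th pct: {np.percentile(sequential, 80):.1f} kWh/t")  # ~16.5
```

In the sequential case the mill must handle extended campaigns of the harder ore type on its own, so sizing to the blended distribution under-designs the plant.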
Where does the 80th percentile come into this discussion? The 80th percentile of the blend is obviously
of little use. Each ore type now needs to be understood individually in terms of variability, and the
number of test samples required has probably doubled. If the 80th percentile value is to be the basis of
design then the properties of each ore type need to be measured to the degree where a reliable 80th
percentile value can be extracted from the results for each ore type.
However, before proceeding to design the plant to treat each ore type separately, a constructive discussion is needed across all three disciplines to explore the implications of alternative mining strategies, for example introducing blending before the primary crusher, or reducing the shovel size and having two (or more) ore faces supplying plant feed at all times. Beware of the argument, which often comes up in such discussions, that blending is happening in the coarse ore stockpile after primary crushing. Unless the coarse ore stockpile is a bed blending arrangement, where ore is stacked in layers and reclaimed across the layers, it can be safely assumed that no blending of any consequence occurs in the coarse ore stockpile.

VARIABILITY EXPECTATIONS
Steve Morrell (Morrell 2011) published the distribution of variability levels that exists within the SMC (SAG mill competence) test database. Typically, SMC test samples are taken from contiguous core intervals and are tested in numbers large enough (per orebody) to derive reasonable statistics. The variabilities of the DWI (drop weight index, in kWh/m3) results from the 650 orebodies represented in the database (at the time of writing) were distributed as shown in Figure 2.

Figure 2: Distribution of Orebody COV* Values for Competence Measurement (after Morrell 2011)
*The coefficient of variation (COV) is simply the standard deviation divided by the average value, expressed as a percentage.

For an orebody with an average DWI value of 5 kWh/m3 and a COV value of 25%, the 80th percentile
value must lie between 25% and 50% greater than the mean value (between one and two standard
deviations greater than the mean). An 80th percentile value of 7 kWh/m3 would not be unreasonable
for this example. However, the COV could range from 5% to 60%, so 80th percentile values of 10
kWh/m3 and 5.4 kWh/m3 are also not unreasonable for a 5 kWh/m3 ore.
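The band quoted above can be reproduced with the author's rule of thumb; a sketch, taking the 80th percentile as one to two standard deviations above the mean, with SD = mean × COV:

```python
# 80th percentile band implied by a mean DWI and a range of COV values,
# using the one-to-two standard deviation rule of thumb from the text.
mean_dwi = 5.0  # kWh/m3
for cov in (0.05, 0.25, 0.60):
    low = mean_dwi * (1 + 1 * cov)   # mean + 1 SD
    high = mean_dwi * (1 + 2 * cov)  # mean + 2 SD
    print(f"COV {cov:.0%}: 80th percentile between {low:.2f} and {high:.1f} kWh/m3")
```

For a COV of 25% this gives 6.25 to 7.5 kWh/m3, consistent with the 7 kWh/m3 quoted above.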
All critical measures used for design will have a similar (but usually less broadly spread) range of possible variability values. Without enough actual measurements from tests performed in the correct manner on appropriate samples, it is not possible to achieve a reliable estimate of variability for a particular property in an orebody. This caveat extends to the 80th percentile estimation method applied in the compositing exercise discussed earlier in the paper: although the final 80th percentile estimate was superior to the estimate from the original data, it was still far from definitive.

AN ALTERNATIVE APPROACH
As has been demonstrated in the examples above, the 80th percentile value can be totally misleading and dangerous as a design point if the data set from which it is derived is not suitable for design purposes. In any data set typically available as a foundation for design, the average value will be a much more reliable number than the 80th percentile or the SD. In the example given for the effect of compositing, an estimate was made of a believable 80th percentile value based on an assumption about the variability differences known to exist between data sets calculated on varying time scales of production. Notably, the adjustment to the 80th percentile value was made relative to this well-defined value, the average of the data set. This method, scaling the variability (or 80th percentile) between data sets, is one alternative approach to selecting the design point.
However, although the concept of the 80th percentile is one that provides some comfort in design, it is also arbitrary. Why not use the 90th percentile, the 75th percentile, or some other value? In a recent design project, a review criticised the fact that the 80th percentile had not been used and that reliance had been placed on a riskier value, the 75th percentile. Although the 75th percentile was the design point agreed with the client, the results database was revisited and the design outcome recalculated using the 80th percentile value. In this instance the difference was less than 0.5% in the mill power, insignificant within the accuracy of the design. For this particular deposit the SD of the value in question was very low, but the variability in the data set was considered valid for design. The data set consisted of about 50 results, all from individual contiguous samples selected according to an AMEC sample selection plan. The insensitivity, in this instance, of the design outcome to the selected percentile value was of concern, mainly because the design point was not all that much greater than the data set average. This particular instance has led to a re-evaluation of the basis of design towards using the mean value, rather than any high-percentile derivative (80th or 75th) of the data set.
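The insensitivity noted above is easy to reproduce; a sketch with invented values, assuming mill power scales roughly in proportion to the work index:

```python
# With a low SD, the choice between the 75th and 80th percentile barely
# moves the design point. All values are invented placeholders.
import numpy as np

rng = np.random.default_rng(3)
results = rng.normal(12.0, 0.4, 50)   # ~50 results with a low SD, as in the text

p75, p80 = np.percentile(results, [75, 80])
print(f"75th pct: {p75:.2f}  80th pct: {p80:.2f}  "
      f"difference: {100 * (p80 / p75 - 1):.2f}%")   # well under 1%
```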
Once the focus is on the mean value, the concept of confidence can be introduced to the design process. It is possible mathematically to derive the upper and lower confidence limits for a given mean value of a data population by knowing the number of samples in the population, the SD of the population and the degree of confidence that is required in the mean value. The higher the required confidence level, the wider the range of possible mean values for any data set. A range of possible mean values exists because, if exactly the same sample selection process were conducted on the same orebody, but different core intervals were chosen, the mean result from testing this second sample set is almost certain to be different from the mean result from testing the first sample set. Therefore, it is possible to know what the highest likely mean value is (the upper confidence limit of the mean), and Microsoft Excel provides a function to simplify this estimation. As the SD is an integral part of the calculation, it is still necessary to re-estimate the SD if compositing or annual planning has smoothed the available data set.
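A minimal sketch of the calculation, using the t-distribution (the half-width computed here mirrors what Excel's CONFIDENCE.T function returns); the sample values are invented placeholders:

```python
# Upper confidence limit of the mean from a small sample of test results.
import numpy as np
from scipy import stats

bwi = np.array([11.9, 13.4, 12.1, 14.0, 12.7, 13.2, 11.6, 12.9, 13.8, 12.4])
n, mean, sd = len(bwi), bwi.mean(), bwi.std(ddof=1)  # sample SD

confidence = 0.90  # two-sided 90% confidence => 5% risk on each side
t_crit = stats.t.ppf(1 - (1 - confidence) / 2, df=n - 1)
half_width = t_crit * sd / np.sqrt(n)

print(f"Mean BWI:                     {mean:.2f} kWh/t")
print(f"Upper confidence limit (UCL): {mean + half_width:.2f} kWh/t")
```

Designing to the upper confidence limit guards against the chance that the tested samples happened to understate the true mean of the orebody.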
Confidence is the inverse of risk. A confidence level of 90% means that the risk of the available result being wrong (outside of some accuracy target if the measurement exercise were repeated) is 10%. The 10% is actually made up of two components: being outside the accuracy limits on the high side, and being outside the limits on the low side. If the low side is problematic but the high side is not, then there is only a 5% chance of a problematic result. This represents one chance in 20 that the available measurement is lower than it should be and will lead to a design at risk of failure. Typically a 90% confidence limit or a 95% confidence limit is applied in these calculations, and any remaining risk is mitigated by adding a design margin.
A poor data set will provide a wide confidence limit band for the mean; a good data set will provide a tight one. For any data set, the safest estimate of the mean for design purposes is either the upper or lower confidence limit (depending on how the value in question drives the design). As these values are estimates of the mean (and not of the range of the full population), the resulting design value will be much less than the 80th percentile or the Upper Limit 95% for the population. The important thing about this value is its believability (the confidence it provides).
With such a value to work with, it is now possible to construct a design methodology based on estimation of the SD coupled with a statistically derived upper confidence limit of the mean value. It will be argued (in a follow-up paper) that such an approach is more robust than the tried, and sometimes wanting, 80th percentile-based method.

CONCLUSION
The 80th percentile value can be a useful design number, provided all the correct prerequisites are present in the population from which it is derived. Firstly, the samples that the individual test results
represent must consist of individual lengths of contiguous core, or something close to this ideal. Secondly, there must be some understanding of how the variability of the available data set is related to the variability that needs to be catered for in plant design. If a pathway to estimating shift-by-shift production variability (for any variable) is available, then this can provide useful guidance and should be followed. Thirdly, there must be some relationship between the available data and the time sequence of presentation of ore to the plant. For example, if the ore on which the plant is to be commissioned is not clearly understood, then an adequate number of samples of commissioning ore need to be tested before commencing definitive design. Fourthly, the relationship between the mining practices to be employed and the distributions of the various ores in the deposit needs to be understood at an operational level, and not as a smoothed monthly or annual ideal.
It is essential that the process design engineers have access to all raw test data, that they understand the basis of sample selection, that they understand any compositing that has been performed on the samples and why it was done that way, and that they have confidence in the testing that has been conducted. It is also essential to link the test results to the ore types and to the time sequence in which the ore is likely to be mined. Finally, the process engineers must understand the operational reality of ore access and delivery to the plant, including any assumptions regarding blending. A good start to gaining these necessary understandings is to ask the geologists for the drill database and to ask the mining engineers for a sequenced block list. The resulting conversations are usually very enlightening.
If a valid 80th percentile design value is not obtainable from the available raw data, then a relatively
simple pathway for estimating a useful 80th percentile value (based on the relative variabilities within
available data sets) has been demonstrated.
To increase the confidence that can be placed in a design outcome, the basis for a new and robust design methodology has been proposed, linked to a worst case estimate of the mean value (at a desired accuracy level) and with a defined confidence level based on the client's appetite for risk. This will be elaborated in a follow-up paper.

ACKNOWLEDGEMENTS
Anonymous acknowledgement is given to those projects that supplied the examples, good and bad, used in this paper. Special acknowledgement is given to the geologist who advised (in an operating pit) that there was no difference between the 'brown gooey ore' and the 'crunchy blue rock': 'they are both the same geological classification'. Acknowledgement is also given to Steve Morrell, who has followed (and sometimes built) the pathway to understanding variability.

REFERENCES
Morrell, S, 2011. Mapping orebody hardness variability for AG/SAG/crushing and HPGR circuits, in Proceedings International Autogenous Grinding, Semiautogenous Grinding and High Pressure Grinding Roll Technology 2011, Paper 154 (eds: K Major, B C Flintoff, B Klein and K McLeod).
