
Computers and Structures

Manuscript Draft
Manuscript Number: CAS-D-15-00814
Title: Efficient optimization of reliability-constrained structural
design problems including interval uncertainty
Article Type: Research Paper


Efficient optimization of reliability-constrained structural design problems including interval uncertainty
Yan Liu (a), Han Koo Jeong (b), Matthew Collette (a,*)
(a) Department of Naval Architecture and Marine Engineering, University of Michigan, United States
(b) Department of Naval Architecture, Kunsan National University, Korea
* Corresponding author. Email address: mdcoll@umich.edu (Matthew Collette)
Preprint submitted to Computers and Structures, September 8, 2015

Abstract
A novel interval uncertainty formulation for exploring the impact of epistemic uncertainty on reliability-constrained design performance is proposed.
An adaptive surrogate modeling framework is developed to locate the lowest reliability value within a multi-dimensional interval. This framework is
combined with a multi-objective optimizer, where the interval width is considered as an objective. The resulting Pareto front examines how uncertainty
reduces performance while maintaining a reliability threshold. Two case studies are presented: a cantilever tube under multiple loads and a composite
stiffened panel. The proposed framework is robust and able to resolve the
Pareto front with fewer than 200 reliability evaluations.
Keywords: surrogate model, optimization, interval uncertainty, reliability,
composite structures
1. Introduction
Managing uncertainty, especially in the early stages of structural design,
is critical to many novel designs and materials. While robust design and
reliability-based optimization frameworks have been successfully developed
(see e.g. [1, 2, 3, 4] for recent work in this field), most of these formulations
require a precise stochastic definition of the uncertainty involved. However,
early stage structural design with novel materials or applications is marked by
comparatively limited and vague information about the design and associated
uncertainties. Uncertainty associated with limited knowledge is epistemic in
nature and is normally reducible by investment in engineering investigation.
Consequently, traditional reliability-based design tools may not be well suited
to model the design situation as the epistemic uncertainty normally cannot
be stochastically defined.
Reliability analysis with non-probabilistic interval uncertainty [5, 6] has
been studied as an approach to modeling epistemic uncertainty. While this
removes the need to define a precise stochastic distribution, reasonable uncertainty bounds still must be developed. The approach proposed here takes
a different route. Reliability-based design optimization and interval uncertainty approaches are combined. Then, the widths of the uncertainty intervals are treated as an objective in a multi-objective optimization approach while maintaining consistent reliability levels. The resulting Pareto fronts show the impact of lack of knowledge on design performance and allow the
engineering design team to prioritize where to invest time in reducing epistemic uncertainty. The core contribution required to make such an approach
practical is a novel adaptive surrogate modeling technique for efficient interval reliability analysis. This surrogate approach allows the combination of
reliability models, interval uncertainty, and multi-objective optimization to
remain computationally feasible.
The central concept of the proposed framework is that interval uncertainty is a useful representation of uncertainty in early-stage design knowledge. Conventional structural optimization approaches [7] often use a stochastic form to account for uncertainty in the design. However, in the authors' field of marine structures, such models are difficult to apply early in the design process, as precise stochastic uncertainty information does not yet exist. Any error in the assumed distribution can be harmful later in the design [8].
This is especially true for marine structures, where early structural weight estimates that are too low can cause extensive design re-work or in-service structural
failures [9]. It is argued that interval uncertainty can be used to address this
concern where no assumption of distribution is needed [10]. Among other
non-traditional uncertainty models [11], interval uncertainty was selected as
in most cases the non-deterministic parameters and variables are only known
within intervals [12]. This work is the first to explore the coupling of structural optimization with interval uncertainty embedded in the reliability analysis. The interval uncertainty ranges will be treated as design variables, and the optimizer will determine feasible design configurations with
respect to various uncertainty intervals. The aim is to resolve the trade-off between interval uncertainties and design performance through the optimization, thus unveiling the overall impact of early stage design uncertainty on the design.
A complication in selecting interval uncertainty is that working with intervals in optimization can be computationally expensive. In design optimization involving interval uncertainty, a max-min optimization [13] is often needed. In max-min approaches, the optimizer searches for a solution that
has the best worst-case performance in the uncertainty interval. In terms of
reliability analysis involving interval uncertainty, the worst case of interval
analysis needs to satisfy the specified reliability constraint. As reliability
simulation itself normally involves a search for the most probable point of failure [14], reliability analysis with interval uncertainty becomes a nested optimization in which the worst-case performance needs to be located. Considering that such an analysis is repeatedly requested in population-based heuristic design optimization approaches, the computation can soon become intractable. This work presents a method that is capable of locating the worst-case reliability result while remaining computationally efficient, therefore allowing
the aforementioned interval uncertainty study to be carried out within a
moderate computation budget in reliability-based design optimization.
Many previous authors have studied how to efficiently solve reliability-based design optimization (RBDO) problems. One line of work focuses on the reliability problem formulation, where the performance measure approach [15],
sequential RBDO [16] and single-loop RBDO [17] have been proposed. However, they are not ideal as solutions to interval uncertainty problems where
worst-case reliability needs to be computed. A more direct way to reduce computational cost is to draw on the field of surrogate modeling.
In recent years, development in this direction has been reported [18]. Among
the surrogate methods proposed, Kriging [19] shows promise as an approximation tool in reliability simulation. Kriging was first applied to structural reliability problems by Kaymaz [20], and recent developments can be found in
works of Bichon et al. [21], Echard et al. [22], and Dubourg et al. [23]. These
studies on Kriging methods for reliability mainly focused on approximating the limit state function and then applying the Monte Carlo Simulation (MCS) method for the reliability analysis. Such a methodology has also been applied to reliability analysis with interval uncertainty [24].
While Kriging-assisted MCS methods may be a viable strategy for a single reliability analysis, adopting this methodology in population-based optimization algorithms, such as evolutionary algorithms, can be problematic, as the large number of reliability analyses required can quickly render the MCS method extremely costly even with the help of Kriging.
This manuscript introduces a new Kriging modeling strategy tailored to the specific needs of design optimization with interval uncertainty. The proposed Kriging method is advantageous in that it can give a predicted worst-case performance along with the estimated function response; therefore, the inner-loop search in the interval uncertainty domain can be avoided. The strategy is achieved by exploiting the Kriging model information to establish a local quadratic approximation for reliability in the interval variable domain. With the quadratic approximation, the worst-case performance can be directly located, avoiding a search operation. Additionally, two refinement criteria for the Kriging model are implemented to ensure the quality of the worst-case prediction from the proposed surrogate model and the accuracy of the
quadratic approximation. It is demonstrated in this manuscript that the
proposed method can accurately estimate the worst-case reliability performance for a multi-objective evolutionary algorithm with a high level of efficiency.
The rest of the paper is organized as follows. Section 2 reviews interval uncertainty and interval reliability analysis. Subsequently, Section 3 introduces the proposed surrogate modeling method and the multi-objective optimization framework for the trade-off study. In Section 4, the proposed method is examined on an interval benchmark problem [5], and the validated method is then applied to a CFRP top-hat stiffened panel design [25]. Discussion
and concluding remarks are given in the last section.
2. Review of interval reliability analysis
This work treats uncertainties that are due to lack of information via an
interval formulation. There are two critical components to such an approach:
the definition of the interval model and the application of this model in
reliability analysis. Each of these components is reviewed in turn in this
section.
2.1. Interval uncertainty
Interval uncertainty provides an appropriate alternative to stochastic uncertainty, as no information regarding the stochastic distribution is required. Interval modeling [26] is usually applied in a simple closed form:

Y = [Y_l, Y_h] = \{ Y \in \mathbb{I} \mid Y_l \le Y \le Y_h \}    (1)

The interval uncertainty model is concerned with investigating the whole range of potential values bounded by an upper bound and a lower bound. There is no assumption of a probabilistic distribution within these bounds; specifically, a uniform distribution is not assumed. The interval definition can also be interpreted in another form:

Y \in [Y^0 - \Delta, Y^0 + \Delta]    (2)

where \Delta is defined as the maximum deviation of the uncertainty from its nominal value, representing the range of the interval uncertainty. In this form, it is clear that interval uncertainty can reflect the tolerance for error in the design [27]. The impact of the interval ranges \Delta on the design performance is the aim of this work. This trade space can be studied by making a measure of \Delta one of the objectives of the optimization problem, as reviewed in Section 3.
2.2. Worst-case reliability index with interval uncertainty
Without interval uncertainty, reliability analysis is concerned with calculating the probability of failure for a limit state function:

P_f = \Pr(g(X) \le 0) = \int_{g(X) \le 0} f_X(X) \, dX    (3)

where g denotes the limit state, with the system safe if g(X) > 0, X is a vector of random variables accounting for stochastic uncertainties, and f_X(X) is the joint probability density function. Required and achieved values of P_f are normally given in terms of the safety index \beta:

P_f = \Phi(-\beta)    (4)

As direct integration of the expression given in Equation 3 is difficult, many approximate methods of estimating \beta have been proposed. Here, for simplicity, the first-order reliability method (FORM) [14] is used. In FORM, the reliability index is computed by the following procedure:

\beta = \min_{U} \|U\|
s.t. \ g(U) = 0    (5)

where U is the vector of standard normal variables transformed from the random variables, u_i = \Phi^{-1}(F_{x_i}(x_i)). The reliability index can also be interpreted as the minimum distance from the origin to the limit state in U space.
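As an illustration of the formulation in Eq. 5, the following minimal Python sketch computes the reliability index by a constrained norm minimization in U space with SciPy's SLSQP solver. The helper name form_beta and the linear example limit state are illustrative assumptions of this sketch, not the FORM implementation used later in the paper (which relies on the PyRe module).

import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def form_beta(g_u, n_dim):
    """Solve Eq. 5: beta = min ||u|| subject to g(u) = 0 in standard normal space."""
    u0 = np.full(n_dim, 0.1)  # start off the origin so the norm gradient is defined
    res = minimize(lambda u: np.linalg.norm(u), u0, method="SLSQP",
                   constraints=[{"type": "eq", "fun": g_u}])
    return np.linalg.norm(res.x), res.x  # reliability index and most probable point

# Example: linear limit state g(U) = 3 - u1 - u2, whose exact beta is 3/sqrt(2).
beta, u_star = form_beta(lambda u: 3.0 - u[0] - u[1], n_dim=2)
p_f = norm.cdf(-beta)  # corresponding failure probability from Eq. 4

In practice, dedicated FORM iterations such as the HL-RF algorithm are typically used instead; the sketch only mirrors the optimization statement of Eq. 5.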
Given that the information about stochastic uncertainty models is usually incomplete in the early stage of innovative marine structure design, the calculated reliability index may not be accurate enough to ensure sufficient safety in the design [8]. Interval uncertainty can be introduced into reliability analysis to model the epistemic contribution to uncertainty and address this concern [28]. Generally, when an interval uncertainty parameter is involved in the limit state function, the reliability index output becomes an interval: [β_L, β_H]. In reliability-based optimization involving interval uncertainty, the worst-case reliability index β_L is used for the safety examination.
Here, an interval parameter Y is used to illustrate the process of computing β_L. After the transformation of the random variables X to U space, the limit state function g(U, Y) = 0 is defined with both the standard normal variables U and the interval parameter Y. Though monotonicity analysis for reliability with interval parameters has been proposed [29], in most cases an optimization search is needed to locate β_L in the interval reliability output. The procedure is defined with the equation below:
\beta_L = \min_{U, Y} \|U(Y)\|
s.t. \ g(U, Y) = 0
      Y^L \le Y \le Y^U    (6)

The above procedure is a nested optimization process where the outer loop minimizes the reliability index by locating the worst-case combination of the interval variables Y, while the inner loop is a reliability evaluation via any standard reliability method. In this work, the FORM procedure defined in Eq. 5 is used for the reliability index evaluation.
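The nested character of Eq. 6 can be made concrete with the short sketch below: an outer bounded minimization over the interval variables Y wraps an inner FORM evaluation. The callable form_beta_given_y is a hypothetical stand-in for a full FORM analysis at fixed Y; each outer iteration therefore triggers at least one complete reliability analysis, which is exactly the cost the surrogate of Section 3 is designed to avoid.

import numpy as np
from scipy.optimize import minimize

def worst_case_beta(form_beta_given_y, y_lower, y_upper, n_starts=5, seed=0):
    """Outer loop of Eq. 6: minimize beta(Y) over the bounded interval domain."""
    rng = np.random.default_rng(seed)
    y_lower = np.asarray(y_lower, float)
    y_upper = np.asarray(y_upper, float)
    best_beta, best_y = np.inf, None
    for _ in range(n_starts):  # multi-start to guard against local minima in Y
        y0 = rng.uniform(y_lower, y_upper)
        res = minimize(form_beta_given_y, y0, method="L-BFGS-B",
                       bounds=list(zip(y_lower, y_upper)))
        if res.fun < best_beta:
            best_beta, best_y = res.fun, res.x
    return best_beta, best_y  # beta_L and the worst-case interval combination

Because L-BFGS-B estimates gradients of beta with respect to Y by finite differences, every outer iteration multiplies the number of FORM runs further, illustrating why this direct approach becomes intractable inside a population-based optimizer.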
2.3. Impacts of interval reliability analysis
A reliability-based design optimization that accounts for the worst-case reliability using FORM analysis can be expressed as follows:

\min \ f(\mu(X))
s.t. \ \beta_L = \min_{Y} f_{FORM}(X, Y) \ge \beta_t
      Y \in [Y^0 - \Delta, Y^0 + \Delta]    (7)

where f is a performance function associated with the design variables and \beta_t is the target reliability index constraint. f_{FORM} denotes the process by which the reliability index is evaluated. Additional constraints that do not involve Y or \Delta, and which may be deterministic, could also be added.
While most studies on interval reliability assume a fixed interval domain, this work argues that various range values \Delta of the interval uncertainty need to be examined. The range \Delta reflects the epistemic uncertainty present. However, epistemic uncertainty is normally reducible by investing engineering time and expense to generate and analyze additional data. From the project management perspective, it is interesting to explore what the impact of epistemic uncertainty is on design performance and what performance gain could be achieved by reducing it. It is proposed that the width \Delta can be set as an objective in a multi-objective optimization along with conventional objectives such as weight, cost, etc. The resulting Pareto trade space will show the engineers the impact of epistemic uncertainty on performance.
Computational efficiency is the primary issue that needs to be addressed for the proposed interval study to be practical. Due to the optimization search for the worst-case combination of the interval variables Y, the number of reliability analyses needed can increase significantly and can make the computation intractable for design optimization. This manuscript proposes a technique that is capable of conducting the worst-case search in the interval domain efficiently, clearing the obstacle to the proposed trade study on interval uncertainty.
The following section will introduce an adaptive surrogate modeling strategy to locate the worst-case reliability and a multi-objective optimization
framework in order to investigate the trade-off between interval uncertainty
and design performance.
3. Efficient trade-off analysis with Adaptive Kriging modeling
3.1. Overview
In this section, an efficient implementation of the interval reliability analysis is put forward in a multi-objective optimization framework. Since the design optimization cycle time is closely related to the number of reliability simulations in Eq. 7, in this study the FORM reliability simulation is approximated by the proposed Kriging surrogate model within the optimization. Moreover, the surrogate method proposed here specifically addresses problems where a worst-case search is needed in the interval variable domain. In each surrogate prediction of the reliability performance of an individual, an estimated worst-case reliability index is provided simultaneously by the proposed method. Thus, a higher level of efficiency is achieved by removing the inner-loop search. The surrogate-assisted RBDO that accounts for the worst-case reliability can be expressed as follows:
\min \ f(\mu(X))
s.t. \ \hat{\beta}_L(X, Y) = \hat{y}(X, Y^*) \ge \beta_t    (8)

where [X, Y^*] is the estimated worst-case scenario of the design point [X, Y] from the surrogate model, and \hat{y} denotes that the surrogate model is used to approximate the FORM simulation. The Kriging theory and the derivation of the worst-case estimator are presented in sequence. Afterwards, the multi-objective optimization for the interval uncertainty trade study is introduced.
3.2. Kriging modeling
Kriging is a powerful surrogate model that has been widely used to approximate computationally expensive simulations. In-depth Kriging theory can be found in the works of Sacks [19] and Simpson [30]. As a brief explanation, Kriging predicts the function value at an unobserved point based on a set of sampling points through a realization of a regression model and a stochastic process:

y(x) = f^T \beta + z(x)    (9)

where f contains regression basis functions of the user's choice and \beta are the regression coefficients. In this work, ordinary Kriging is used; thus, f is a vector of all 1.0 with length n_s. The stochastic process z is assumed to have zero mean and a covariance of:

E[z(x_i) z(x_j)] = \sigma^2 R(\theta, x_i, x_j)    (10)

where \sigma^2 is the process variance and R(\theta, x_i, x_j) is the correlation model. A commonly used Gaussian correlation model is adopted here:

R(\theta, x_i, x_j) = \exp\left( -\sum_{k=1}^{n_v} \theta_k |x_{ik} - x_{jk}|^2 \right)    (11)

where \theta is a correlation parameter vector that is found by optimizing a maximum likelihood function. After that, the Kriging predictor for an unobserved point x can be expressed as follows:

\hat{y}(x) = f^T \hat{\beta} + r^T(x) R^{-1} (Y - F \hat{\beta})    (12)

where \hat{\beta} is computed by least squares regression and the vector r measures the correlation between the prediction point x and the sampled points [x_1, ..., x_{n_s}]. The mean square error s^2 of the predictor can also be provided:

s^2(x) = \hat{\sigma}^2 \left[ 1 + u^T (F^T R^{-1} F)^{-1} u - r^T R^{-1} r \right]    (13)

where u = F^T R^{-1} r - f. As an indication of the prediction quality, s^2 will be used later to adaptively improve the modeling.
From Eq. 12 it follows that the gradient of the Kriging predictor can be derived as follows:

J_{\hat{y}} = J_f(x)^T \hat{\beta} + J_r(x)^T R^{-1} (Y - F \hat{\beta})    (14)

where J_f and J_r are the Jacobians of f and r, respectively. J_f is a zero matrix [0_{n_v \times 1}], as f is a vector of constants in ordinary Kriging.
The ith row of the Hessian matrix of the Kriging predictor,

H_{i,:} = \left[ \frac{\partial^2 \hat{y}}{\partial x_i \partial x_1}, \frac{\partial^2 \hat{y}}{\partial x_i \partial x_2}, ..., \frac{\partial^2 \hat{y}}{\partial x_i \partial x_{n_v}} \right]    (15)

can also be derived as follows:

H_{i,:} = \left[ \frac{\partial J_r(x)}{\partial x_i} \right]^T R^{-1} (Y - F \hat{\beta})    (16)

where:

\left[ \frac{\partial J_r(x)}{\partial x_i} \right]_{mn} = \frac{\partial^2 R(\theta, x, x_m)}{\partial x_i \partial x_n}, \quad m = 1, ..., n_s, \ n = 1, ..., n_v    (17)

These properties will be used to accelerate the worst-case performance prediction search, as explained in the next section.
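The following NumPy sketch illustrates an ordinary Kriging predictor with the Gaussian correlation of Eq. 11 and the analytic first and second derivatives used in Eqs. 14-17. It assumes a fixed, pre-tuned correlation vector theta and omits the maximum likelihood fit and the variance estimate of Eq. 13; it is a simplified stand-in for the Kriging implementation used in the paper, not a reproduction of it.

import numpy as np

class OrdinaryKriging:
    """Ordinary Kriging with a Gaussian kernel and fixed theta (Eqs. 9-12)."""
    def __init__(self, X, y, theta, nugget=1e-10):
        self.X = np.atleast_2d(np.asarray(X, float))   # (ns, nv) sample locations
        self.y = np.asarray(y, float)                  # (ns,) sampled responses
        self.theta = np.asarray(theta, float)
        ns = self.X.shape[0]
        diff = self.X[:, None, :] - self.X[None, :, :]
        R = np.exp(-(diff**2 * self.theta).sum(axis=2)) + nugget * np.eye(ns)
        self.R_inv = np.linalg.inv(R)
        ones = np.ones(ns)
        self.beta_hat = (ones @ self.R_inv @ self.y) / (ones @ self.R_inv @ ones)
        self.gamma = self.R_inv @ (self.y - self.beta_hat)  # R^-1 (Y - F beta_hat)

    def _corr(self, x):
        d = np.asarray(x, float) - self.X                    # (ns, nv) differences
        return np.exp(-(d**2 * self.theta).sum(axis=1)), d   # r(x) and differences

    def predict(self, x):
        r, _ = self._corr(x)
        return self.beta_hat + r @ self.gamma                # Eq. 12

    def gradient(self, x):
        r, d = self._corr(x)
        dr = -2.0 * self.theta * d * r[:, None]              # d r_m / d x_k
        return dr.T @ self.gamma                             # Eq. 14 (constant basis)

    def hessian(self, x):
        r, d = self._corr(x)
        nv = self.X.shape[1]
        H = np.zeros((nv, nv))
        for gm, rm, dm in zip(self.gamma, r, d):
            outer = np.outer(self.theta * dm, self.theta * dm)
            H += gm * (4.0 * outer - 2.0 * np.diag(self.theta)) * rm  # Eqs. 15-17
        return H

Because ordinary Kriging uses a constant regression basis, the J_f(x)^T beta term of Eq. 14 vanishes and only the correlation derivatives contribute to the gradient and Hessian.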
3.3. Worst-case prediction and adaptive refinement
In this study, the Kriging surrogate model is used as a worst-case estimator to replace the optimization search that is conventionally required to locate the worst case Y^*. Y^* is expressed as follows:

Y^* = \arg\min_{Y \in I} f(X, Y)    (18)

where I = [Y^L, Y^U] is the bounded interval variable domain. In the proposed surrogate method, Y^* is estimated in a single Kriging evaluation, which therefore avoids an additional nested optimization search.
For every candidate point D that has n design variables and m interval variables, D = [X_1, ..., X_n, Y_1, ..., Y_m], the predicted response \hat{y}_D, Jacobian matrix J_D, and Hessian matrix H_D can be provided by the surrogate based on Section 3.2. Within J_D and H_D, the components associated with the interval variables Y are sorted out and assembled to form local Jacobian and Hessian matrices for the interval domain:

J_Y = \left[ \frac{\partial \hat{y}}{\partial Y_1}, ..., \frac{\partial \hat{y}}{\partial Y_m} \right]^T

H_Y = \begin{bmatrix} \frac{\partial^2 \hat{y}}{\partial Y_1^2} & \cdots & \frac{\partial^2 \hat{y}}{\partial Y_1 \partial Y_m} \\ \vdots & \ddots & \vdots \\ \frac{\partial^2 \hat{y}}{\partial Y_m \partial Y_1} & \cdots & \frac{\partial^2 \hat{y}}{\partial Y_m^2} \end{bmatrix}    (19)

Afterwards, a quadratic model for the interval domain can be established using J_Y, H_Y and \hat{y}_D:

\hat{y}(Y + s_Y) \approx \frac{1}{2} s_Y^T H_Y s_Y + J_Y^T s_Y + \hat{y}_D    (20)

where s_Y is the Newton step to the minimum of the quadratic, which gives the estimate of the worst-case performance. s_Y can be obtained by:

s_Y = -H_Y^{-1} J_Y    (21)

Assuming the quadratic approximation implied by this approach matches the underlying Kriging model, the worst case can be directly determined as Y^* = Y + s_Y. This method is much more efficient for the worst-case search as it eliminates the inner optimization run originally required in Eq. 18.
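A sketch of the resulting worst-case estimator is given below, building on the Kriging helper sketched in Section 3.2. The clipping of Y^* to the interval bounds and the fallback when H_Y is singular are robustness assumptions added for this sketch; the safeguards actually used in the paper are the refinement criteria described next.

import numpy as np

def estimate_worst_case(krig, design_point, interval_idx, y_lower, y_upper):
    """Estimate [X, Y*] from a single Kriging evaluation via Eqs. 19-21.

    krig         : surrogate exposing predict()/gradient()/hessian()
    design_point : concatenated [X, Y] at which the quadratic model is built
    interval_idx : indices of the interval variables Y inside design_point
    """
    d = np.asarray(design_point, float)
    J = krig.gradient(d)[interval_idx]                       # J_Y of Eq. 19
    H = krig.hessian(d)[np.ix_(interval_idx, interval_idx)]  # H_Y of Eq. 19
    try:
        s = -np.linalg.solve(H, J)                           # Newton step, Eq. 21
    except np.linalg.LinAlgError:
        s = -J                                               # fall back to a negative-gradient step
    y_star = np.clip(d[interval_idx] + s, y_lower, y_upper)  # keep Y* inside the interval
    worst = d.copy()
    worst[interval_idx] = y_star
    return worst, krig.predict(worst)                        # [X, Y*] and predicted beta_L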
However, there are two levels of nested assumptions in this approach.
First, it is assumed that the quadratic approximation of the Kriging surface
is accurate. Second, it is assumed that the Kriging model is a faithful model
of the underlying reliability simulations. To ensure that these assumptions
are both reasonable, two online refinement criteria are used. These criteria
are evaluated every time a worst-case performance prediction is made, and if either is violated, an updated prediction is generated. This scheme will ensure that an accurate worst-case performance prediction is provided for the optimization while refining the Kriging model to increase the quality of the surrogate model as the optimization runs. The refinement criteria are as
follows:
1. Based on the predicted worst-case design, a repeated worst-case prediction is made, reestablishing the quadratic approximation at the worst-case point found. The outcomes of these two searches need to be sufficiently close;
2. The U metric proposed in [22] for Kriging model accuracy is used here:

U = \frac{\hat{y}(D_{worst}) - \beta_t}{s(D_{worst})}    (22)

where s is the Kriging standard deviation derived from Eq. 13. This metric compares the distance from the reliability constraint boundary to the predicted error in the Kriging model. Points close to the constraint boundary or with large prediction errors are highlighted for refinement. A minimum value of 3.0 is required for the U metric in this study.
The first criterion ensures that the worst-case prediction is stable via an iterated prediction check. The second criterion establishes a lower confidence bound [31] with respect to the target value β_t. The U metric utilizes the Kriging variance information to ensure the accuracy of the prediction. A minimum value of 3.0 suggests that the probability of making a mistake on ŷ ≥ β_t is Φ(−3) = 0.135%. In each worst-case estimation, these two criteria are examined, and any violation is used as guidance to update the surrogate model.
The procedure is outlined in Algorithm 1.
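A compact sketch of the two acceptance checks is given below. The predict and mse methods are assumed interfaces of the surrogate (mse corresponding to s^2 in Eq. 13), while the 10^-5 closeness tolerance and the threshold of 3.0 follow Algorithm 1.

from math import sqrt

def accept_prediction(krig, d_worst_1, d_worst_2, beta_t, tol=1e-5, u_min=3.0):
    """Return True if the surrogate worst-case prediction can be used without refinement."""
    y1 = krig.predict(d_worst_1)
    y2 = krig.predict(d_worst_2)
    stable = abs(y2 - y1) <= tol                           # criterion 1: iterated prediction check
    u_metric = (y2 - beta_t) / sqrt(krig.mse(d_worst_2))   # criterion 2: Eq. 22
    return stable and u_metric >= u_min

Any candidate failing either check is evaluated with the exact reliability analysis and added to the Kriging training set, as outlined in Algorithm 1.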
3.4. Multi-objective optimization coupling interval variables
The goal of the proposed method is to explore the impact of reducing the
width of interval uncertainties on design performance. The resulting trade
space will help prioritize areas for engineering investment to reduce epistemic uncertainty. With the developed worst-case estimation technique, this
trade-off study of interval uncertainty can be resolved in a multi-objective
optimization framework. This work proposes to couple interval uncertainty
into structural design optimization by treating interval widths as design
variables, and an interval reduction metric as an objective function. Some interval reduction measures can be found in the work of Li et al. [32]. In this paper, a simple inverse metric of the product of the interval variable ranges is used as an indication of the cost needed to gain the relevant information and reduce the interval uncertainty:

Cost = \frac{1}{\prod_{i=1}^{I} \Delta_i}    (23)

Algorithm 1 Worst-case prediction and updating scheme.

BEGIN
Initialize: Train an initial Kriging model
for Candidate design D_i = [X, Y] do
    Predict the worst-case performance D_worst^(1) = [X, Y^*] of D_i;
    Repeat the process by predicting D_worst^(2) of D_worst^(1);
    Compute the U metric, and compare the two predicted worst cases;
    if |ŷ(D_worst^(2)) − ŷ(D_worst^(1))| ≤ 10^−5 and U ≥ 3.0 then
        Do not update Kriging
    else
        Evaluate β(D_i) using exact reliability analysis
        Update Kriging model
        Estimate D_worst again
    end if
    Set reliability index of D_i as ŷ(D_worst)
end for
END
The interval reduction cost function and structural performance function
will be optimized within NSGA-II, a multi-objective genetic algorithm optimizer proposed by Deb et al. [33]. The NSGA-II algorithm is an elitist non-dominated sorting genetic algorithm that has proven to be powerful in capturing a set of diversified Pareto optimal solutions. In NSGA-II, the children population is created from the parent chromosomes by both crossover and mutation. The simulated binary crossover algorithm is adopted here for crossover with an exponent of 4.0. The random mutation rate is set at a low probability of occurrence of 0.1%.
Relying on the proposed adaptive surrogate modeling strategy to estimate the worst-case reliability, the optimizer can quickly evaluate the adequacy of a structural design with different uncertainty interval widths, and thus keeps the interval computation tractable. The surrogate-assisted optimization process is summarized in Algorithm 2 below.

Algorithm 2 Adaptive surrogate-assisted MOGA with interval variables.

BEGIN
Initialize: Use the Latin Hypercube Sampling method to collect initial sample points, evaluate the sampled points using FORM, and construct a Kriging surrogate model;
Initialize NSGA-II.
while Termination criteria of GA is not met do
    for Individual D_i in population do
        Evaluate fitness values of D_i using objective functions [f, Cost];
        Estimate the worst-case reliability index β_L of D_i by the surrogate
        Check the updating criteria described in Algorithm 1
        if Status is updating Kriging then
            Evaluate β(D_i) using exact reliability analysis
            Update the sampling database
            Refine the Kriging model and estimate β_L again
        end if
        Set constraint violation as max[0, β_t − β_L]
    end for
    Apply GA operators to create the next generation of the population
end while
END
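The per-individual evaluation inside the GA loop can be summarized by the sketch below. All of the names (surrogate, exact_form, objectives, archive) are illustrative placeholders for the components described above rather than the paper's actual code; the constraint handling follows the max[0, β_t − β_L] rule of Algorithm 2.

def evaluate_individual(design, surrogate, exact_form, objectives, beta_t, archive):
    """One individual evaluation inside the NSGA-II loop of Algorithm 2 (illustrative)."""
    f_vals = objectives(design)                      # [f, Cost] objective values
    beta_L, worst = surrogate.worst_case(design)     # single Kriging-based estimate
    if not surrogate.accept(worst, beta_t):          # refinement criteria of Algorithm 1
        beta_exact = exact_form(design)              # expensive FORM call, used sparingly
        archive.append((design, beta_exact))
        surrogate.retrain(archive)                   # refine the Kriging model
        beta_L, worst = surrogate.worst_case(design)
    violation = max(0.0, beta_t - beta_L)            # constraint handling of Algorithm 2
    return f_vals, violation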

4. Case studies
In this section, the proposed method is examined with an engineering design problem [5] and a marine composite stiffened panel design problem [25].
Through these case studies, the functionality of the worst-case performance surrogate prediction is examined, and the ability of the proposed algorithm to deliver a closely converged Pareto front compared to high-fidelity simulation results is shown.
4.1. Cantilever tube
A cantilever tube problem proposed in Du's [5] interval reliability study is used in this work to test the framework. The cantilever tube shown in Fig. 1 is subjected to external forces F_1, F_2, P and a torsion T. The limit state function is expressed as follows:

g(X, Y) = S_y - \sigma_{max}    (24)

where S_y is the yield strength and \sigma_{max} is the maximum von Mises stress, given by:

\sigma_{max} = \sqrt{\sigma_x^2 + 3 \tau_{zx}^2}    (25)

The stresses are determined from classical structural mechanics:

\sigma_x = \frac{P + F_1 \sin\theta_1 + F_2 \sin\theta_2}{A} + \frac{M c}{I}
M = F_1 L_1 \cos\theta_1 + F_2 L_2 \cos\theta_2
A = \frac{\pi}{4} \left[ d^2 - (d - 2t)^2 \right], \quad c = \frac{d}{2}
I = \frac{\pi}{64} \left[ d^4 - (d - 2t)^4 \right], \quad \tau_{zx} = \frac{T d}{4 I}    (26)
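The limit state of Eqs. 24-26 can be evaluated with a few lines of Python, as sketched below. The mm-N-MPa unit system is an assumption made for this sketch (consistent with the quantities in Table 1); the function evaluates g for a single realization of the random and interval variables and is the kind of callable passed to a FORM analysis.

import numpy as np

def tube_limit_state(t, d, L1, L2, F1, F2, P, T, Sy, theta1_deg, theta2_deg):
    """Eqs. 24-26 for the cantilever tube (assumed units: mm, N, N*mm, MPa, degrees)."""
    th1, th2 = np.radians(theta1_deg), np.radians(theta2_deg)
    A = np.pi / 4.0 * (d**2 - (d - 2.0*t)**2)              # cross-sectional area
    I = np.pi / 64.0 * (d**4 - (d - 2.0*t)**4)             # second moment of area
    c = d / 2.0
    M = F1 * L1 * np.cos(th1) + F2 * L2 * np.cos(th2)      # bending moment
    sigma_x = (P + F1 * np.sin(th1) + F2 * np.sin(th2)) / A + M * c / I
    tau_zx = T * d / (4.0 * I)                             # torsional shear stress
    sigma_max = np.sqrt(sigma_x**2 + 3.0 * tau_zx**2)      # von Mises stress, Eq. 25
    return Sy - sigma_max                                  # Eq. 24: failure when g <= 0

# Example evaluation at the Table 1 mean values and mid-interval angles of Table 2:
g_val = tube_limit_state(t=5.0, d=42.0, L1=120.0, L2=60.0, F1=3000.0, F2=3000.0,
                         P=12000.0, T=90000.0, Sy=220.0,
                         theta1_deg=5.0, theta2_deg=10.0)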
The random variable and interval variable settings are given in Table 1 and Table 2. Note that the interval variable ranges are fixed in the original problem [5] for a single analysis. In this work, the test multi-objective optimization problem with varying interval ranges is formulated as follows:

\min \ Volume(t, d) = \pi (d - t) t L_1
\min \ Cost(\Delta_1, \Delta_2)
s.t. \ \hat{\beta}_L(X, Y) = \hat{y}(X, Y^*) \ge \beta_t
where \ [t, d] = [\mu(X_1), \mu(X_2)]
      Y^0 - \Delta \le Y \le Y^0 + \Delta    (27)
Figure 1: Cantilever tube problem [5].

Table 1: Random variables

Variables   Parameter 1   Parameter 2   Distribution
X1 (t)      5 mm          0.1 mm        Normal
X2 (d)      42 mm         0.5 mm        Normal
X3 (L1)     119.75 mm     120.25 mm     Uniform
X4 (L2)     59.75 mm      60.25 mm      Uniform
X5 (F1)     3.0 kN        0.3 kN        Normal
X6 (F2)     3.0 kN        0.3 kN        Normal
X7 (P)      12.0 kN       1.2 kN        Gumbel
X8 (T)      90.0 Nm       9.0 Nm        Normal
X9 (Sy)     220.0 MPa     22.0 MPa      Normal

Table 2: Interval variables

Variables   Intervals
Y1 (θ1)     [0°, 10°]
Y2 (θ2)     [5°, 15°]
Table 3: Design variables

Design variables   Description                      Lower bound   Upper bound
t                  Mean value of thickness X1       3 mm          6 mm
d                  Mean value of diameter X2        38 mm         44 mm
Δ1                 Interval variable range of Y1    0             10
Δ2                 Interval variable range of Y2    0             10
The mean values of the thickness and diameter are set as design variables along with the two interval variable ranges. The design variable domain is defined in Table 3. The interval uncertainty lengths Δ1 and Δ2 range from 0 to 10, where a zero range means the interval variables are reduced to the deterministic values [Y1, Y2] = [0°, 5°], and the largest ranges mean the interval variables cover the same ranges shown in Table 2. In conducting the analysis, the initial surrogate model is constructed from 150 points sampled evenly in the design space by the Latin Hypercube method. Reliability simulations are conducted using the PyRe (Python Reliability) module.
To validate the proposed method, the worst-case reliability in the interval range is first located by Algorithm 1 directly (i.e., with no design optimization). This computation is also performed by Du [5], which allows for comparison with the current method. The design point [t, d, Δ1, Δ2] = [5.0, 42.0, 10, 10] is chosen as the prediction point, which means the random variables X and interval variables Y are the same as Tables 1 and 2 in the literature. The predicted worst-case interval variable combination is Y^* = [3.9289°, 7.8074°] and the associated reliability index is 3.5933. A comparison of the result with Du's method is shown in Table 4. It is clear that the worst-case prediction successfully estimated the worst-case design compared to the nonlinear optimization results from Du's study. The computational efficiency of the surrogate model is also promising: 952 limit state function evaluations are made for the 150 reliability analyses during the sampling stage. Though this is higher than the 147 function calls used by Du, the surrogate is now ready for optimization while Du's work only finds a single value.
Table 4: Worst-case reliability

                               Proposed surrogate prediction   Du 2007
p_f^max (= Φ(−β_L))            1.63 × 10^−4                    1.63 × 10^−4
Function evaluations of g      952*                            147

*: Note this surrogate was built for optimization, hence it contains more function calls than Du's single-value computation.

Next, the test problem stated in Eq. 27 is optimized using the proposed multi-objective optimization framework in Algorithm 2. As a reference, the test problem is also optimized within NSGA-II with all the reliability analyses conducted directly via FORM simulation. The two optimizations both used 100 generations, a population size of 100, and a target reliability index constraint of 3.0. A direct comparison between the two Pareto fronts is shown in Fig. 2. Note that the product of the interval widths is plotted directly in Fig. 2. This is the inverse of the objective space the problem was solved in via Eq. 23.
Both Pareto fronts show a clear trade-off between interval uncertainty
range and design performance, though the absolute impact on structural
material volume is small. The increase of the interval range product Δ1Δ2 causes a penalty on the design performance, measured by the volume of the cantilever tube. The penalty stops growing after a certain value of the interval range is reached. This is
due to the nonlinearity of interval uncertainties in the original problem: if a
local minimum of performance is already contained in the interval, expanding
the interval does not further reduce performance unless a new minimum is
included. The front found by the surrogate closely follows the front found
by direct FORM simulation. For this problem, this demonstrates that the
proposed approach is efficient and practical.
A detailed comparison illustrated in Fig. 3 discloses the relative importance of the two interval uncertainties involved. It can be seen that the Pareto optimal points favor large Δ1 values and small Δ2 values, which indicates that reducing the interval uncertainty Y2 is more worthwhile for improving the design performance compared to Y1. It also indicates that there is a critical value of Δ2 at which the design performance worsens rapidly.
The worst-case prediction of the surrogate is examined in the optimization run by computing the bias of the prediction:

bias = \frac{|Y^*_{predicted} - Y^*_{actual}|}{Y_{max}}    (28)

where the bias is defined as the normalized difference between the predicted worst case Y^* and the actual Y^* from the high-fidelity analysis. An average bias value of 2.7 × 10^−4 is recorded for the 10,100 individual evaluations in the evolutionary algorithm. This indicates that the proposed surrogate model's worst-case prediction is sufficiently accurate for interval analysis throughout the optimization run. The closely converged Pareto front from the surrogate method in Fig. 2 also suggests that the proposed surrogate model is capable of locating the worst-case performance in an accurate and efficient way. Overall, the quadratic approximation to locate the worst-performing point directly from the Kriging surrogate, and the coupling of this technique with the NSGA-II, appear to successfully solve this two-interval problem.

Figure 2: Pareto fronts comparison (volume in cm³ versus the interval range product Δ1Δ2; high-fidelity FORM analysis and proposed surrogate method).

Figure 3: Impact of interval range on performance.
4.2. Top-hat stiffened panel structure design
Marine composite structures are often used for high-speed, innovative
vessels. Compared to conventional steel vessels, there is larger uncertainty in
both the loading of the structure and the material variability of the composites for such vessels. Marine composite structural design then requires significantly larger loading and material uncertainty factors than steel structures.
In this case study, interval uncertainty is applied to represent the large variability in the loading component, while conventional reliability methods are used to capture the material variability of the composite. Through the
proposed method, a trade study between interval uncertainty range and design performance of a top-hat stiffened panel structure is performed. Such an
analysis is similar to an early-stage vessel design problem, in which a robust
estimation of structural weight is needed while loading remains uncertain.

Figure 4: Configuration of an FRP top-hat stiffened grillage plate. (a) Picture of an FRP top-hat stiffened grillage; (b) schematic of a multi-stiffener grillage.

A simplified grillage structure with carbon fiber-reinforced plastic (CFRP) materials [25] is adopted here to demonstrate an application of the proposed method. The typical configuration of this type of structure used in marine structures is shown in Fig. 4.

Figure 5: A sectional view of the CFRP single skin with top-hat stiffeners.

The size of this top-hat stiffened grillage plate
is 4.2m in length and 3.2m in width, and spacing values for longitudinal
and transverse frames are 800mm and 700mm, respectively. This gives the
grillage plate three longitudinal and five transverse top-hat stiffeners.
The sectional view of a top-hat shaped stiffener with effective breadth is shown in Fig. 5. This top-hat stiffener, including the base plate, consists of a crown, webs, and a non-structural former. For simplicity, these components are constructed to have geometrical symmetry, and the longitudinal configuration is the same as that in the transverse direction. The depth of the stiffened plate is fixed at 180.00 mm. Furthermore, it is assumed that the CFRP laminates are monolithic; hence no attempt is made to ascribe ply thickness details, make-up, fiber volume fractions, etc. Such decisions would be taken up at
a later stage after gross decisions about the choice of particular structural
topology and material are made.
Under lateral pressure, both the strength and the deflection of such panels are important to the structural design. For this work, the maximum panel deflection is considered as the limit state for the structure. The deflection limit state function is defined as follows:

g(X, Y) = w_{max} - w(X, Y)    (29)

where w_{max} is the allowed maximum deflection, taken as L/150 in this study. The deflection w of the grillage is estimated by using Navier's energy method [34], where the deflection is determined by equating the total strain energy to the work done by the load. The equation is given below:

w = \sum_{m=1}^{\infty} \sum_{n=1}^{\infty} \frac{\dfrac{16 P l^4 b}{E I_r \pi^6} \, \sin\dfrac{m \pi x}{l} \, \sin\dfrac{n \pi y}{b}}{m n \left[ m^4 (r + 1) + \dfrac{I_r b^3 n^4 (s + 1)}{I_s l^3} \right]}    (30)

where I_r and I_s are the moments of inertia of the longitudinal and transverse stiffeners, respectively; m and n are wave numbers; E is the equivalent Young's modulus; and a reasonable pressure load P is estimated from the ABS High Speed Craft rules [35].
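A direct numerical evaluation of the series in Eq. 30 is sketched below. The truncation of the series, the restriction to odd harmonics (the only ones excited by a uniform lateral pressure), and the reading of r and s as the stiffener counts entering the two bracketed terms are assumptions of this sketch; the paper states only the series itself.

import numpy as np

def grillage_deflection(x, y, P, l, b, E, Ir, Is, r, s, n_terms=25):
    """Evaluate the Navier series of Eq. 30 at a point (x, y) on the grillage."""
    w = 0.0
    for m in range(1, 2 * n_terms, 2):        # odd harmonics only (assumed uniform pressure)
        for n in range(1, 2 * n_terms, 2):
            amplitude = 16.0 * P * l**4 * b / (E * Ir * np.pi**6)
            denom = m * n * (m**4 * (r + 1.0)
                             + Ir * b**3 * n**4 * (s + 1.0) / (Is * l**3))
            w += amplitude * np.sin(m * np.pi * x / l) * np.sin(n * np.pi * y / b) / denom
    return w

# For a uniform load the maximum deflection used in Eq. 29 occurs at the panel
# center, i.e. w(X, Y) = grillage_deflection(l / 2.0, b / 2.0, ...).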
The descriptions of the distributions for each component in Eq. 29 are presented in Table 5. In this work, the geometric design variables are treated as random variables X, while the interval variable Y is used to change the COV of the loading estimate in the limit state equation. The mean value of the equivalent Young's modulus E is taken as 140 GPa.
Table 5: Distributions of variables in the CFRP problem

Variables   Description                    Mean   COV    Distribution
tw          Web thickness                  μ1     0.03   Normal
tcr         Crown thickness                μ2     0.03   Normal
tp          Plate thickness                μ3     0.03   Normal
E           Equivalent Young's modulus     μE     0.05   Normal
P           Pressure load                  pd     0.2    Normal

The multi-objective optimization problem for the interval trade-off study is formulated in Eq. 31, within which the independent design variables are defined in Table 6.

\min \ Weight(X)
\min \ Cost(\Delta)
s.t. \ \hat{\beta}_L(X, Y) = \hat{y}(X, Y^*) \ge \beta_t
where \ X = [\mu_1, \mu_2, \mu_3]
      Y^0 - \Delta \le Y \le Y^0 + \Delta    (31)
Table 6: Design variables of the CFRP problem

Design variables   Description               Lower bound   Upper bound
μ1                 Mean value of tw          1.0 mm        15 mm
μ2                 Mean value of tcr         1.8 mm        15 mm
μ3                 Mean value of tp          5 mm          45 mm
Δ                  Interval variable range   0.01          0.1

The above problem is optimized with both the proposed surrogate method presented in Algorithm 2 and an all-FORM analysis, which differ in how β_L is computed. The initial surrogate model is constructed using 150 sampling points. The NSGA-II parameters are set as 100 generations and a
population size of 100. For a more rigorous comparison, two different target reliability criteria β_t are investigated, at 2.5 and 3.0, respectively. The corresponding optimized results are shown in Fig. 6. Again, the Cost objective (Eq. 23) is inverted to the interval range for a better view of the trade space.

Figure 6: Pareto fronts comparison under different target reliability indices (weight in kg versus interval range Δ; β_t = 3.0 and β_t = 2.5, each computed with the all-FORM analysis and the proposed method).

It is clearly shown in Fig. 6 that, at different target reliability index levels, the proposed surrogate method successfully demonstrates its ability in worst-case estimation, converging very closely to the high-fidelity results achieved from the all-FORM analysis. In this composite structure case study, the absolute value of the penalty for information uncertainty is higher than in the cantilever tube case. The increased interval uncertainty range causes approximately a 20% weight increase on the design in the worst scenario. The strong linear trend is related to the linear relationship between pressure and deflection in Eq. 30. The potential weight penalty is critical for composite material applications in naval ship designs, where the weight requirement is stringent. Therefore, this trade-off information can be highly valuable in early
stage structural weight estimation for these innovative designs.
The proposed method is also highly computationally efficient compared to direct reliability analysis. A comparison of the number of FORM reliability simulations required for these optimization runs is presented in Table 7. Without using a surrogate, over 10^5 FORM reliability analyses are required to converge the Pareto fronts. Thus, the direct simulation method can easily become inapplicable when the reliability simulation becomes more complex and time-consuming. However, the proposed method used only 150 FORM calls initially, and then another 7-8 for refinement as the optimization proceeded. On average the proposed method used far less than one FORM call in locating each worst case β_L, and about 1.6 FORM calls per final point found on the Pareto front. Examining the statistics of the iterative quadratic search and refinement for the worst-case performance shows that the highest number of iterations was three Kriging predictions. Directly finding the reliability from an MCS-based method on a Kriging surrogate can take up to 10^6 Kriging predictions in computing β_L for a single individual evaluation. Coupled with the results of the cantilever tube presented previously, the proposed method has been shown to be both robust and efficient on these problems.
Table 7: Computation cost comparison

Cases                           Overall reliability            Average f_FORM calls
                                simulation f_FORM calls        in computing β_L
β_t = 3.0  FORM analysis        203079                         20.10
β_t = 3.0  Proposed method      158                            0.016
β_t = 2.5  FORM analysis        200101                         19.81
β_t = 2.5  Proposed method      157                            0.015

5. Conclusion
This manuscript proposed and demonstrated a novel interval-uncertainty
optimization approach to explore the value in reducing epistemic uncertainty
in design. Initially, a combined interval-reliability model was presented. In
this approach, the limit state equation can contain both stochastic (aleatory)
variables and interval-uncertainty variables. Then, a surrogate modeling
approach was demonstrated that extended existing Kriging modeling approaches by directly locating minima on the Kriging surface via the Jacobian
and Hessian of the model. This approach was used to remove the innermost
loop in the optimization - the location of the worst reliability performance
within a given interval. Adaptive refinement criteria to ensure the accuracy
of this approximation were also presented. Next, the approach was coupled
to a conventional NSGA-II optimizer, and an information cost function was defined in terms of the product of the interval uncertainty widths. By including
this information cost function as an objective, the impact of epistemic uncertainty on the design performance was revealed in the Pareto trade-spaces
that resulted from the NSGA-II. The proposed approach was demonstrated on a cantilever tube and a composite panel design problem. For both problems, the proposed approach was shown to be accurate and efficient, using fewer than 200 direct reliability analyses while finding Pareto fronts similar to those found without the use of any surrogate.
With the help of the presented method, decision makers are better equipped to explore novel structural designs in the early stage. As shown in the composite panel case study results, the interval uncertainty study can fully prepare the designer to design against the worst-case scenario and make structural weight estimations under incomplete information. Additionally, this
interval uncertainty trade study can quickly inform the designer regarding
whether engineering improvement is critically needed for the design project.
Acknowledgments
The team at the University of Michigan was supported by Ms. Kelly
Cooper, Office of Naval Research, under grant N00014-11-1-0845. Her continued support is greatly appreciated. Additionally, Han Koo Jeong wishes to express his gratitude to Kunsan National University for the financial support received through Kunsan National University's Long-term Oversea Research Program for Faculty Member in the year 2014.
6. Additional Resources
The code used to implement the optimization approach and a sample
problem is available under the MIT Open Source License at DeepBlue. Note that for the purposes of peer review, these files are available by request and will be assigned a final URL when the article is published.
References
[1] James N. Richardson, Rajan Filomeno Coelho, and Sigrid Adriaenssens. Robust topology optimization of truss structures with random loading and material properties: A multiobjective perspective. Computers & Structures, 154:41-47, July 2015.
[2] Pingfeng Wang, Zequn Wang, Byeng D. Youn, and Soobum Lee. Reliability-based robust design of smart sensing systems for failure diagnostics using piezoelectric materials. Computers & Structures, 156:110-121, August 2015.
[3] Yangjun Luo, Mingdong Zhou, Michael Yu Wang, and Zichen Deng. Reliability based topology optimization for continuum structures with local failure constraints. Computers & Structures, 143:73-84, September 2014.
[4] Zeng Meng, Gang Li, Bo Ping Wang, and Peng Hao. A hybrid chaos control approach of the performance measure functions for reliability-based design optimization. Computers & Structures, 146:32-43, January 2015.
[5] Xiaoping Du. Interval reliability analysis. In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, pages 1103-1109. American Society of Mechanical Engineers, 2007.
[6] C. Jiang, W. X. Li, X. Han, L. X. Liu, and P. H. Le. Structural reliability analysis based on random distributions with interval parameters. Computers & Structures, 89(23-24):2292-2302, December 2011.
[7] Yaochu Jin and Jürgen Branke. Evolutionary optimization in uncertain environments - a survey. IEEE Transactions on Evolutionary Computation, 9(3):303-317, 2005.
[8] Yakov Ben-Haim and Isaac Elishakoff. Convex models of uncertainty in applied mechanics. Elsevier, 2013.
[9] R. Keane Jr. Reducing Total Ownership Cost: Designing Inside-Out of the Hull. Naval Engineers Journal, 124(4):67-80, December 2012.
[10] Yan Liu and Matthew Collette. Efficient optimization framework for robust and reliable structural design considering interval uncertainty. In Analysis and Design of Marine Structures V, pages 555-563. CRC Press, February 2015.
[11] Bernd Möller and Michael Beer. Engineering computation under uncertainty - Capabilities of non-traditional models. Computers & Structures, 86(10):1024-1041, May 2008.
[12] Scott Ferson, Cliff A. Joslyn, Jon C. Helton, William L. Oberkampf, and Kari Sentz. Summary from the epistemic uncertainty workshop: consensus amid diversity. Reliability Engineering & System Safety, 85(1-3):355-369, July 2004.
[13] Yew-Soon Ong, P.B. Nair, and K.Y. Lum. Max-min surrogate-assisted evolutionary algorithm for robust design. IEEE Transactions on Evolutionary Computation, 10(4):392-404, August 2006.
[14] Abraham M. Hasofer and Niels C. Lind. Exact and invariant second-moment code format. Journal of the Engineering Mechanics Division, 100(1):111-121, 1974.
[15] Jian Tu, Kyung K. Choi, and Young H. Park. A new study on reliability-based design optimization. Journal of Mechanical Design, 121(4):557-564, 1999.
[16] Xiaoping Du and Wei Chen. Sequential optimization and reliability assessment method for efficient probabilistic design. Journal of Mechanical Design, 126(2):225-233, 2004.
[17] Jinghong Liang, Zissimos P. Mourelatos, and Jian Tu. A single-loop method for reliability-based design optimisation. International Journal of Product Development, 5(1):76-92, 2008.

[18] Bruno Sudret. Meta-models for structural reliability and uncertainty quantification. arXiv preprint arXiv:1203.2062, 2012.
[19] Jerome Sacks, William J. Welch, Toby J. Mitchell, and Henry P. Wynn. Design and analysis of computer experiments. Statistical Science, pages 409-423, 1989.
[20] Irfan Kaymaz. Application of kriging method to structural reliability problems. Structural Safety, 27(2):133-151, 2005.
[21] B. J. Bichon, M. S. Eldred, L. P. Swiler, S. Mahadevan, and J. M. McFarland. Efficient Global Reliability Analysis for Nonlinear Implicit Performance Functions. AIAA Journal, 46(10):2459-2468, 2008.
[22] B. Echard, N. Gayton, and M. Lemaire. AK-MCS: An active learning reliability method combining Kriging and Monte Carlo Simulation. Structural Safety, 33(2):145-154, March 2011.
[23] V. Dubourg, B. Sudret, and J.-M. Bourinet. Reliability-based design optimization using kriging surrogates and subset simulation. Structural and Multidisciplinary Optimization, 44(5):673-690, November 2011. arXiv:1104.3667.
[24] Xufeng Yang, Yongshou Liu, Yi Gao, Yishang Zhang, and Zongzhan Gao. An active learning kriging model for hybrid reliability analysis with both random and interval variables. Structural and Multidisciplinary Optimization, pages 1-14, 2014.
[25] K. Maneepan, H.K. Jeong, R.A. Shenoi, et al. Optimization of FRP top-hat stiffened single skin and monocoque sandwich plates using genetic algorithm. In The Fifteenth International Offshore and Polar Engineering Conference. International Society of Offshore and Polar Engineers, 2005.
[26] David Moens and Dirk Vandepitte. Interval sensitivity theory and its application to frequency response envelope analysis of uncertain structures. Computer Methods in Applied Mechanics and Engineering, 196(21):2486-2496, 2007.
[27] C. Jiang, H.C. Xie, Z.G. Zhang, and X. Han. A new interval optimization method considering tolerance design. Engineering Optimization, (ahead-of-print):1-14, 2014.
[28] Xiaoping Du, Agus Sudjianto, and Beiqing Huang. Reliability-based design with the mixture of random and interval variables. Journal of Mechanical Design, 127(6):1068-1076, 2005.
[29] Chao Jiang, Xu Han, W.X. Li, J. Liu, and Z. Zhang. A hybrid reliability approach based on probability and interval for uncertain structures. Journal of Mechanical Design, 134(3):031001, 2012.
[30] T.W. Simpson, J.D. Peplinski, P.N. Koch, and J.K. Allen. Metamodels for computer-based engineering design: survey and recommendations. Engineering with Computers, 17(2):129-150, 2001.
[31] Dennis D. Cox and Susan John. SDO: A statistical method for global optimization. Multidisciplinary Design Optimization: State of the Art, pages 315-329, 1997.
[32] M. Li, N. Williams, and S. Azarm. Interval Uncertainty Reduction and Single-Disciplinary Sensitivity Analysis With Multi-Objective Optimization. Journal of Mechanical Design, 131(3):031007, February 2009.
[33] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182-197, 2002.
[34] O.K. Bedair. Analysis of stiffened plates under lateral loading using sequential quadratic programming (SQP). Computers & Structures, 62(1):63-80, 1997.
[35] American Bureau of Shipping. Rules for Building and Classing High-Speed Naval Craft. Houston, 2015.
