
KF Qualitätsmanagement, Vertiefungskurs:
Messung und statistische Analyse von Kundenzufriedenheit
(Measurement and Statistical Analysis of Customer Satisfaction)

Outline

- Customer satisfaction measurement
- The Structural Equation Model (SEM)
- Estimation of SEMs
- Evaluation of SEMs
- Practice of SEM-Analysis

The ACSI Model

[Path diagram of the ACSI model]
Ref.: http://www.theacsi.org/model.htm

ACSI-Model: Latent Variables

- Customer Expectations: combine customers' own experiences with information about the product or service obtained via media, advertising, salespersons, and word-of-mouth from other customers
- Perceived Quality: overall quality, reliability, the extent to which a product/service meets the customer's needs
- Customer Satisfaction: overall satisfaction, fulfillment of expectations, comparison with the ideal
- Perceived Value: overall price given quality and overall quality given price
- Customer Complaints: percentage of respondents who reported a problem
- Customer Loyalty: likelihood to purchase at various price points

ACSI Scores: Personal Computers

Baseline* and Q2 1995 to Q2 2004 scores, with the percent change against the previous year and against the baseline:

                                  Base-  Q2   Q2   Q2   Q2   Q2   Q2   Q2   Q2   Q2   Q2   % Change    % Change
                                  line*  1995 1996 1997 1998 1999 2000 2001 2002 2003 2004 prev. year  baseline
MANUFACTURING/DURABLES            79.2   79.8 78.8 78.4 77.9 77.3 79.4 78.7 79.0 79.2 78.3   -1.1%      -1.1%
Personal Computers                78     75   73   70   71   72   74   71   71   72   74      2.8%      -5.1%
Apple Computer, Inc.              77     75   76   70   69   72   75   73   73   77   81      5.2%       5.2%
Dell Inc.                         NM     NM   NM   72   74   76   80   78   76   78   79      1.3%       9.7%
Gateway, Inc.                     NM     NM   NM   NM   76   76   78   73   72   69   74      7.2%      -2.6%
All Others                        NM     70   73   72   69   69   68   67   70   69   71      2.9%       1.4%
Hewlett-Packard Company – HP      78     80   77   75   72   74   74   73   71   70   71      1.4%      -9.0%
Hewlett-Packard Company – Compaq  78     77   74   67   72   71   71   69   68   68   69      1.5%     -11.5%

The European Customer Satisfaction Index (ECSI)

Ref.: http://www.swics.ch/ecsi/index.html

ACSIe-Model for Food Retail

Hackl et al. (2000)
[Path diagram: the latent variables Expectations, Perceived Quality, Emotional Factor, Value, Customer Satisfaction, and Loyalty, with estimated path coefficients]

Austrian Food Retail Market

- Pilot for an Austrian National CS Index (Zuba, 1997)
- Data collection: December 1996 by Dr Fessel & GfK (professional market research agency)
- 839 interviews, 327 complete observations
- Austria-wide active food retail chains (1996: about 50% of the 10.5 billion EUR market)
  - Billa: well-assorted medium-sized outlets
  - Hofer: limited range at good prices
  - Merkur: large-sized supermarkets with comprehensive range
  - Meinl: top in quality and service

The Data

Indicators and latent variables:
- Expectations (E): total expected quality (EGESQ), expected compliance with demands (EANFO), expected shortcomings (EMANG)
- Perceived Quality (Q): total perceived quality (OGESQ), perceived compliance with needs (OANFO), perceived shortcomings (OMANG)
- Value (P): value for price (VAPRI), price for value (PRIVA)
- Customer Satisfaction (CS): total satisfaction (CSTOT), fulfilled expectations (ERWAR), comparison with ideal (IDEAL)
- Voice (V): number of oral complaints (NOBES), number of written complaints (NOBRI)
- Loyalty (L): repurchase probability (WIEDE), tolerance against price change (PRVER)

The Emotional Factor

Principal component analysis of the satisfaction drivers
- staff (availability, politeness)
- outlet (make-up, presentation of merchandise, cleanliness)
- range (freshness and quality, richness)
- price-value ratio (value for price, price for value)
- customer orientation (access to outlet, shopping hours, queuing time for checkout, paying modes, price information, sales, availability of sales)
identifies (Zuba, 1997)
- staff, outlet, range: "Emotional factor"
- price-value ratio: "Value"
- customer orientation: "Cognitive factor"

Structural Equation Models

Combine three concepts
- Latent variables: Pearson (1904), psychometrics; factor analysis model
- Path analysis: Wright (1934), biometrics; a technique to analyze systems of relations
- Simultaneous regression models: econometrics

Customer Satisfaction

Is the result of the customer's comparison of
- his/her expectations with
- his/her experiences
and has consequences for
- loyalty
- future profits of the supplier

Expectation vs. Experience

- Expectation reflects
  - customers' needs
  - the offer on the market
  - the image of the supplier
  - etc.
- Experiences include
  - perceived performance/quality
  - subjective assessment
  - etc.

CS-Model: Path Diagram

[Path diagram with the latent variables Expectations, Perceived Quality, Customer Satisfaction, and Loyalty]

A General CS-Model

[Path diagram with the latent variables Expectations, Perceived Quality, Customer Satisfaction, Voice, Loyalty, and Profits]

CS-Model: Structure

EX: expectations, PQ: perceived quality, CS: customer satisfaction, LY: loyalty

Recursive structure: triangular form of relations. An X marks a direct path from the row variable to the column variable:

from \ to   EX   PQ   CS   LY
EX          -    X    X    0
PQ          0    -    X    0
CS          0    0    -    X
LY          0    0    0    -

CS-Model: Equations

PQ = α1 + γ11 EX + ζ1
CS = α2 + β21 PQ + γ21 EX + ζ2
LY = α3 + β32 CS + ζ3

A simultaneous equations model in latent variables
- Exogenous: EX
- Endogenous: PQ, CS, LY
- Error terms (noises): ζ1, ζ2, ζ3

Simple Linear Regression

Model: Y = α + γX + ζ
Observations: (xi, yi), i = 1, …, n
Fitted model: Ŷ = a + cX
OLS-estimates a, c minimize the sum of squared residuals
  Σi (yi − ŷi)² = S(α, γ) → min over α, γ
and are given by
  c = sxy / sx²,  a = ȳ − c x̄
sxy: sample covariance of X and Y, sx²: sample variance of X

Criteria of Model Fit

- R²: coefficient of determination, the squared correlation between Y and Ŷ: R² = r²Yŷ
- t-Test: test of H0: γ = 0 against H1: γ ≠ 0
  t = c / s.e.(c), where s.e.(c) is the standard error of c
- F-Test: test of H0: R² = 0 against H1: R² ≠ 0
  F = (R² / 2) / [(1 − R²) / (n − 2)]
  follows for large n the F-distribution with 2 and n − 2 df

Multiple Linear Regression

Model: Y = α + X1γ1 + … + Xkγk + ζ = α + x'γ + ζ
Observations: (xi1, …, xik, yi), i = 1, …, n
In matrix notation: y = α + Xγ + ζ
  y, ζ: n-vectors, γ: k-vector, X: (n×k)-matrix
Fitted model: ŷ = a + Xc
OLS-estimates a, c:
  c = (X'X)⁻¹X'y,  a = ȳ − c1 x̄1 − … − ck x̄k
R² = r²yŷ, F-Test and t-Test as in the simple case
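The OLS formulas on these two slides can be checked numerically. The following minimal numpy sketch (simulated data, illustrative variable names only, not part of the lecture material) computes the slope and intercept estimates, R², the slope t-values, and the overall F-statistic for a regression with k = 2 regressors:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data for illustration: y depends on two regressors.
n, k = 200, 2
X = rng.normal(size=(n, k))
y = 1.0 + X @ np.array([0.8, -0.5]) + rng.normal(scale=0.5, size=n)

# OLS on centred data: c = (X'X)^-1 X'y, a = ybar - c1*x1bar - ... - ck*xkbar
Xc = X - X.mean(axis=0)
yc = y - y.mean()
c = np.linalg.solve(Xc.T @ Xc, Xc.T @ yc)   # slope estimates
a = y.mean() - X.mean(axis=0) @ c           # intercept

y_hat = a + X @ c
resid = y - y_hat

# R^2: squared correlation between y and y_hat
R2 = np.corrcoef(y, y_hat)[0, 1] ** 2

# t-values: c_j / s.e.(c_j)
s2 = resid @ resid / (n - k - 1)            # residual variance
cov_c = s2 * np.linalg.inv(Xc.T @ Xc)
t_values = c / np.sqrt(np.diag(cov_c))

# Overall F-test of H0: R^2 = 0
F = (R2 / k) / ((1 - R2) / (n - k - 1))
print(c, a, R2, t_values, F)
```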


Simultaneous Equations Models

A 2-equations model:
PQ = α1 + γ11 EX + ζ1
CS = α2 + β21 PQ + γ21 EX + ζ2

In matrix notation: Y = BY + ΓX + ζ with
  Y = (PQ, CS)',  X = (1, EX)',  ζ = (ζ1, ζ2)'
  B = ( 0    0 ),   Γ = ( α1  γ11 )
      ( β21  0 )        ( α2  γ21 )
β21, γ11, γ21 are the path coefficients.

Simultaneous Equations Models

Model: Y = BY + ΓX + ζ
  Y, ζ: m-vectors, B: (m×m)-matrix, Γ: (m×K)-matrix, X: K-vector
Some assumptions:
  ζ: E(ζ) = 0, Cov(ζ) = Σ
  Exogeneity: Cov(X, ζ) = 0
Problems:
- Simultaneous equation bias: OLS-estimates of the coefficients are not consistent
- Identifiability: can the coefficients be consistently estimated?

Path Analytic Model

PQ = γ11 EX + ζ1
CS = β21 PQ + γ21 EX + ζ2

[Path diagram: EX (with error δ1) has direct paths γ11 to PQ and γ21 to CS; PQ has a direct path β21 to CS; ζ1 and ζ2 are the equation errors]

Var(δ1) = σ²EX
Var( (ζ1, ζ2)' ) = ( σ1²  0  )
                   ( 0    σ2² )

Path Analysis

- Wright (1921, 1934)
- A multivariate technique
- Model: variables may be
  - structurally related, or
  - structurally unrelated, but correlated
- Decomposition of covariances allows covariances to be written as functions of the structural parameters
- Definition of direct and indirect effects

Example

Decomposition of the covariance between CS and EX, using σYX = Σ i∈(Y<X) θYi σiX:

σCS,EX = γ21 σ²EX + β21 σPQ,EX
       = γ21 σ²EX + γ11 β21 σ²EX

With standardized variable EX:
ρCS,EX = γ21 + γ11 β21

[Path diagram as on the previous slide: EX → PQ (γ11), EX → CS (γ21), PQ → CS (β21)]

Direct and Indirect Effects

ρCS,EX = γ21 + γ11 β21: direct effect plus indirect effect

- Direct effect: coefficient that links an independent with a dependent variable; e.g., γ21 is the direct effect of EX on CS
- Indirect effect: effect of one variable on another via one or more intervening variable(s), e.g., γ11 β21
- Total indirect effect: sum of the indirect effects between two variables
- Total effect: sum of the direct and total indirect effects between two variables
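The decomposition into direct, indirect and total effects can be written compactly with the inner-model matrices: the total effects of X on Y are (I − B)⁻¹Γ. Below is a small numpy check for the two-equation example; the coefficient values are made up for illustration, they are not estimates from the lecture.

```python
import numpy as np

# Inner model Y = B Y + Gamma X + zeta for Y = (PQ, CS)', X = EX
# (coefficient values are illustrative only)
beta21, gamma11, gamma21 = 0.6, 0.7, 0.3
B = np.array([[0.0, 0.0],
              [beta21, 0.0]])
Gamma = np.array([[gamma11],
                  [gamma21]])

# Total effects of X on Y: (I - B)^-1 Gamma = direct + indirect effects
total = np.linalg.inv(np.eye(2) - B) @ Gamma
direct = Gamma
indirect = total - direct

# For CS (second row): direct = gamma21, indirect = gamma11 * beta21,
# and the total effect equals the slide's rho_{CS,EX} = gamma21 + gamma11*beta21
# when the variables are standardized.
print(direct[1, 0], indirect[1, 0], total[1, 0])
```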


Decomposition of Covariance σYX

σYX = Σ i∈(Y<X) θYi σiX

i ∈ (Y < X): variable on a path from X to Y
θYi: path coefficient of variable i to Y

First Law of Path Analysis

Decomposition of the covariance σYX between Y and X:
σYX = Σ i∈(Y<X) θYi σiX

Assumptions:
- Exogenous (X) and endogenous variables (Y) have mean zero
- Errors or noises (ζ)
  - have mean zero and equal variances across observations
  - are uncorrelated across observations
  - are uncorrelated with the exogenous variables
  - are uncorrelated across equations

Identification

PQ = γ11 EX + ζ1
CS = β21 PQ + γ21 EX + ζ2

In matrix notation: Y = BY + ΓX + ζ with
  B = ( 0    0 ),   Γ = ( γ11 ),   Φ = (σ²EX),   Ψ = ( σ1²  0  )
      ( β21  0 )        ( γ21 )                       ( 0    σ2² )

Number of parameters: p = 6
The model is identified if all parameters can be expressed as functions of the variances/covariances of the observed variables.

Identification, cont'd

Y1 = γ11 X + ζ1
Y2 = β21 Y1 + γ21 X + ζ2,   p = 6

The six variances/covariances of the observed variables:
  σ1X  = γ11 σX²
  σ2X  = β21 σ1X + γ21 σX²
  σ21  = β21 σY1² + γ21 σ1X
  σX²  = σX²
  σY1² = γ11 σ1X + σ1²
  σY2² = β21 σ21 + γ21 σ2X + σ2²

The first 3 equations allow a unique solution for the path coefficients, the last three for the variances of δ and ζ.

Condition for Identification

- Just-identified: all parameters can be uniquely derived from functions of the variances/covariances
- Over-identified: at least one parameter is over-determined, i.e., can be derived from more than one such function
- Under-identified: insufficient number of variances/covariances

Necessary, but not sufficient condition for identification: the number of variances/covariances is at least as large as the number of parameters.
A general and operational rule for checking identification has not been found.

Latent variables and Indicators

Latent variables (LVs), also called constructs or factors, are unobservable, but we might find indicators or manifest variables (MVs) for the LVs that can be used as measures of the latent variable.
Indicators are imperfect measures of the latent variable.

Indicators for "Expectation"

From: Swedish CSB Questionnaire, Banks: Private Customers
E1, E2, E3: "block" of reflective indicators (MVs) for the LV Expectation

E1: When you became a customer of AB-Bank, you probably knew something about them. How would you grade your expectations on a scale of 1 (very low) to 10 (very high)?
E2: Now think about the different services they offer, such as bank loans, rates, … Rate your expectations on a scale of 1 to 10.
E3: Finally, rate your overall expectations on a scale of 1 to 10.

Notation

[Measurement model diagram: X1, X2, X3 load on the latent variable ξ with loadings λ1, λ2, λ3 and measurement errors δ1, δ2, δ3]

X1 = λ1 ξ + δ1
X2 = λ2 ξ + δ2
X3 = λ3 ξ + δ3

ξ: latent variable, factor
Xi: indicators, manifest variables
λi: factor loadings
δi: measurement errors, noise

Some properties:
- LV: unit variance
- noise δi: has mean zero, variance σi², uncorrelated with the other noises

Notation

X1 = λ1 ξ + δ1
X2 = λ2 ξ + δ2
X3 = λ3 ξ + δ3

In matrix notation: X = Λξ + δ
with vectors X, Λ, and δ, e.g., X = (X1, X2, X3)'

ξ: latent variable, factor
Xi: indicators, manifest variables
λi: factor loadings
δi: measurement error, noise

CS-Model: Path Diagram

[Path diagram: indicators E1–E3 measure EX, Q1–Q3 measure PQ, C1–C3 measure CS (measurement errors δ1–δ3, ε1–ε6); the inner model has paths γ11 (EX → PQ), γ21 (EX → CS), β21 (PQ → CS) and errors ζ1, ζ2]

SEM-Model: Path Diagram

[Path diagram: indicators X1–X3 measure the exogenous LV ξ; Y1–Y3 measure η1 and Y4–Y6 measure η2; inner paths γ11 (ξ → η1), γ21 (ξ → η2), β21 (η1 → η2); errors δ1–δ3, ε1–ε6, ζ1, ζ2]

Inner model: η = Bη + Γξ + ζ
Measurement models: X = Λx ξ + δ,  Y = Λy η + ε
X, δ: 3-component vectors; Y, ε: 6-component vectors

SEM-Model: Notation

Inner relations, inner model:
  η = Bη + Γξ + ζ
  B = ( 0    0 ),   Γ = ( γ11 ),   Φ = (σ²EX),   Ψ = ( σ1²  0  )
      ( β21  0 )        ( γ21 )                       ( 0    σ2² )

Outer relations, measurement model:
  X = Λx ξ + δ,  Y = Λy η + ε
  Λx' = (λ11  λ12  λ13),   Λy' = ( λ11  λ12  λ13  0    0    0   )
                                 ( 0    0    0    λ21  λ22  λ23 )
  Θδ = Cov(δ),  Θε = Cov(ε)

Statistical Assumptions

- Error terms of the inner model (ζ)
  - have zero means
  - have constant variances across observations
  - are uncorrelated across observations
  - are uncorrelated with the exogenous variables
- Error terms of the measurement models (δ, ε)
  - have zero means
  - have constant variances across observations
  - are uncorrelated across observations
  - are uncorrelated with the latent variables and with each other
- Latent variables are standardized

Covariance Matrix of Manifest Variables

Unrestricted covariance matrix (order K = kx + ky):
  Σ = Var{(X', Y')'}

Model-implied covariance matrix:
  Σ(θ) = ( A1   A2 ),   θ = (B, Γ, Φ, Ψ, Λx, Λy, Θδ, Θε)
         ( A2'  A3 )
  A1 = Λx Φ Λx' + Θδ
  A2 = Λx Φ Γ' [(I − B)⁻¹]' Λy'
  A3 = Λy (I − B)⁻¹ (ΓΦΓ' + Ψ) [(I − B)⁻¹]' Λy' + Θε

Estimation of the Parameters

- Covariance fitting methods
  - search for values of the parameters θ so that the model-implied covariance matrix fits the observed unrestricted covariance matrix of the MVs
  - LISREL (LInear Structural RELations): Jöreskog (1973), Keesling (1972), Wiley (1973)
  - Software LISREL by Jöreskog & Sörbom
- PLS techniques
  - partition of θ into estimable subsets of parameters
  - iterative optimizations provide successive approximations for LV scores and parameters
  - Wold (1973, 1980)

Discrepancy Function

The discrepancy or fitting function
  F(S; Σ) = F(S; Σ(θ))
is a measure of the "distance" between the model-implied covariance matrix Σ(θ) and the estimated unrestricted covariance matrix S.

Properties of the discrepancy function:
- F(S; Σ) ≥ 0
- F(S; Σ) = 0 if S = Σ

Covariance Fitting (LISREL)

- Estimates of the parameters are derived by minimizing F(S; Σ(θ)) over θ
- Minimization of
    F(S; Σ) = log|Σ| − log|S| + trace(SΣ⁻¹) − K
  (K: number of indicators) gives ML-estimates if the manifest variables are independently, multivariate normally distributed
- Iterative algorithm (Newton-Raphson type)
  - Identification
  - Choice of starting values is crucial
- Other choices of F result in estimation methods like OLS and GLS; ADF (asymptotically distribution free)

PLS Techniques

- Estimate factor scores for the latent variables
- Estimate structural parameters (path coefficients, loading coefficients), based on the estimated factor scores, using the principle of least squares
- Maximize the predictive accuracy
- "Predictor specification", viz. that E(η|ξ) equals the systematic part of the model, implies E(ζ|ξ) = 0: the error term has (conditional) mean zero
- No distributional assumptions beyond those on 1st and 2nd order moments
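As a minimal sketch of covariance fitting (not the LISREL implementation), the ML discrepancy above can be minimized numerically with scipy for a simple one-factor measurement model; the parameterization Σ(θ) = λλ' + diag(error variances) and all numbers are assumptions chosen for illustration:

```python
import numpy as np
from scipy.optimize import minimize

def f_ml(S, Sigma):
    """ML discrepancy F(S; Sigma) = log|Sigma| - log|S| + trace(S Sigma^-1) - K."""
    K = S.shape[0]
    _, logdet_Sigma = np.linalg.slogdet(Sigma)
    _, logdet_S = np.linalg.slogdet(S)
    return logdet_Sigma - logdet_S + np.trace(S @ np.linalg.inv(Sigma)) - K

def sigma_of(theta):
    """One-factor model for 3 indicators: Sigma = lam lam' + diag(error variances)."""
    lam, sd = theta[:3], theta[3:]
    return np.outer(lam, lam) + np.diag(sd ** 2 + 1e-8)

# Simulated data: one latent variable with loadings (0.9, 0.8, 0.7)
rng = np.random.default_rng(1)
N = 500
xi = rng.normal(size=N)
X = np.outer(xi, [0.9, 0.8, 0.7]) + rng.normal(scale=0.5, size=(N, 3))
S = np.cov(X, rowvar=False)

res = minimize(lambda th: f_ml(S, sigma_of(th)), x0=np.full(6, 0.5), method="Nelder-Mead")
print(res.x[:3])            # loading estimates (up to a common sign)
print((N - 1) * res.fun)    # chi-square statistic X^2 = (N - 1) F(S; Sigma(theta_hat))
```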


The PLS-Algorithm

Step 1: Estimation of factor scores
  1. Outer approximation
  2. Calculation of inner weights
  3. Inner approximation
  4. Calculation of outer weights
Step 2: Estimation of path and loading coefficients by minimizing Var(ζ) and Var(δ)
Step 3: Estimation of location parameters (intercepts)
  - B0 from η = B0 + Bη + Γξ + ζ
  - Λ0 from X = Λ0 + Λx ξ + δ

Estimation of Factor Scores

Factor ηi: realizations Yin, n = 1, …, N
  Yin(o): outer approximation of Yin
  Yin(i): inner approximation of Yin
Indicator Yij: observations yijn; j = 1, …, Ji; n = 1, …, N

1. Outer approximation: Yin(o) = Σj wij yijn, scaled such that Var(Yi(o)) = 1
2. Inner weights: vih = sign(rih) if ηi and ηh are adjacent, otherwise vih = 0; rih = corr(ηi, ηh) ("centroid weighting")
3. Inner approximation: Yin(i) = Σh vih Yhn(o), scaled such that Var(Yi(i)) = 1
4. Outer weights: wij = corr(Yij, Yi(i))

Start: choose arbitrary values for the wij
Repeat 1. through 4. until the outer weights converge
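Step 1 can be sketched in a few lines of numpy. The code below implements the four sub-steps with centroid inner weights and Mode A (correlation) outer weights; the blocks, the adjacency matrix and the simulated data are illustrative assumptions, not the lecture's data.

```python
import numpy as np

def standardize(v):
    return (v - v.mean()) / v.std()

def pls_scores(blocks, adjacency, max_iter=100, tol=1e-6):
    """Step 1 of the PLS algorithm: factor scores from centroid inner weights
    and Mode A outer weights.
    blocks: list of (N x J_i) indicator matrices, one block per LV
    adjacency: (m x m) 0/1 matrix, 1 if two LVs are connected in the inner model."""
    blocks = [np.apply_along_axis(standardize, 0, B) for B in blocks]
    weights = [np.ones(B.shape[1]) for B in blocks]          # arbitrary start
    for _ in range(max_iter):
        # 1. outer approximation, standardized to unit variance
        Y_out = np.column_stack([standardize(B @ w) for B, w in zip(blocks, weights)])
        # 2. centroid inner weights: sign of the correlation for adjacent LVs
        V = np.sign(np.corrcoef(Y_out, rowvar=False)) * adjacency
        # 3. inner approximation, standardized
        Y_in = np.apply_along_axis(standardize, 0, Y_out @ V.T)
        # 4. Mode A outer weights: correlation of each indicator with its
        #    block's inner approximation
        new_weights = [np.array([np.corrcoef(B[:, j], Y_in[:, i])[0, 1]
                                 for j in range(B.shape[1])])
                       for i, B in enumerate(blocks)]
        change = max(np.max(np.abs(w - nw)) for w, nw in zip(weights, new_weights))
        weights = new_weights
        if change < tol:
            break
    scores = np.column_stack([standardize(B @ w) for B, w in zip(blocks, weights)])
    return scores, weights

# Illustration with three blocks (EX, PQ, CS) of three indicators each
rng = np.random.default_rng(2)
N = 300
ex = rng.normal(size=N)
pq = 0.7 * ex + rng.normal(scale=0.7, size=N)
cs = 0.5 * ex + 0.5 * pq + rng.normal(scale=0.6, size=N)

def make_block(lv):
    return np.column_stack([lv + rng.normal(scale=0.6, size=N) for _ in range(3)])

blocks = [make_block(ex), make_block(pq), make_block(cs)]
adjacency = np.array([[0, 1, 1], [1, 0, 1], [1, 1, 0]])   # EX-PQ, EX-CS, PQ-CS
scores, w = pls_scores(blocks, adjacency)
print(np.corrcoef(scores, rowvar=False).round(2))
```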


Example

[Path diagram as before: blocks E1–E3 for EX, Q1–Q3 for PQ, C1–C3 for CS; inner paths γ11 (EX → PQ), γ21 (EX → CS), β21 (PQ → CS), all expected to be positive (+)]

Example, cont'd

Starting values: wEX,1, …, wEX,3, wPQ,1, …, wPQ,3, wCS,1, …, wCS,3

Outer approximation:
  EXn(o) = Σj wEX,j Ejn; similarly PQn(o), CSn(o); standardized

Inner approximation (the + signs are the centroid inner weights, here all positive):
  EXn(i) = + PQn(o) + CSn(o)
  PQn(i) = + EXn(o) + CSn(o)
  CSn(i) = + EXn(o) + PQn(o)
  standardized

Outer weights:
  wEX,j = corr(Ej, EX(i)), j = 1, …, 3; similarly wPQ,j, wCS,j

Choice of Inner Weights

Centroid weighting scheme: Yin(i) = Σh vih Yhn(o)
  vih = sign(rih) if ηi and ηh are adjacent, vih = 0 otherwise, with rih = corr(ηi, ηh);
  these weights are obtained if the vih are chosen to be +1 or −1 and Var(Yi(i)) is maximized

Weighting schemes:

  scheme       ηh predecessor   ηh successor
  centroid     sign(rih)        sign(rih)
  factor, PC   rih              rih
  path         bih              rih

bih: coefficient in the regression of ηi on ηh

Measurement Model: Examples

Latent variables from the Swedish CSB Model
1. Expectation
   E1: new customer feelings
   E2: special products/services expectations
   E3: overall expectation
2. Perceived Quality
   Q1: range of products/services
   Q2: quality of service
   Q3: clarity of information on products/services
   Q4: opening hours and appearance of location
   Q5: etc.

Measurement Models

Reflective model: each indicator reflects the latent variable (example 1)
  Yij = λij ηi + εij
  Yij is called a reflective or effect indicator (of ηi)
Formative model (example 2):
  ηi = πy' Yi + δi
  πy is a vector of ki weights; the Yij are called formative or cause indicators
Hybrid or MIMIC model (for "multiple indicators and multiple causes")

- The choice between formative and reflective depends on the substantive theory
- Formative models are often used for exogenous variables, reflective and MIMIC models for endogenous variables

Estimation of Outer Weights

- "Mode A" estimation of Yi(o): reflective measurement model
  The weight wij is the coefficient from a simple regression of Yi(i) on Yij: wij = corr(Yij, Yi(i))
- "Mode B" estimation of Yi(o): formative measurement model
  The weight wij is the coefficient of Yij from a multiple regression of Yi(i) on Yij, j = 1, …, Ji
  multicollinearity?!
- MIMIC model
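The two weighting modes can be contrasted in a short numpy sketch for a single block, given its inner approximation. The data and names are illustrative; Mode B is implemented as an ordinary multiple regression via least squares.

```python
import numpy as np

def mode_a_weights(Y_block, y_inner):
    """Reflective block: w_ij = corr(Y_ij, inner approximation)."""
    return np.array([np.corrcoef(Y_block[:, j], y_inner)[0, 1]
                     for j in range(Y_block.shape[1])])

def mode_b_weights(Y_block, y_inner):
    """Formative block: coefficients of a multiple regression of the
    inner approximation on the block's indicators (OLS via lstsq)."""
    Z = Y_block - Y_block.mean(axis=0)
    w, *_ = np.linalg.lstsq(Z, y_inner - y_inner.mean(), rcond=None)
    return w

# Illustration: three correlated indicators and an inner proxy variable
rng = np.random.default_rng(3)
f = rng.normal(size=200)
Y_block = np.column_stack([f + rng.normal(scale=0.5, size=200) for _ in range(3)])
y_inner = f + rng.normal(scale=0.5, size=200)
print(mode_a_weights(Y_block, y_inner))   # similar weights for all indicators
print(mode_b_weights(Y_block, y_inner))   # smaller weights: the indicators are collinear
```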


Properties of Estimators

- A general proof for convergence of the PLS-algorithm does not exist; practitioners experience no problems
- Factor scores are inconsistent but "consistent at large": consistency is achieved with increasing sample size and block size
- Loading coefficients are inconsistent and seem to be overestimated
- Path coefficients are inconsistent and seem to be underestimated

ACSI Model: Results

[Path diagram of the ACSI model (Expectations, Perceived Quality, Value, Customer Satisfaction, Voice, Loyalty) with EQS-estimates and PLS-estimates of the path coefficients]

Evaluation of SEM-Models

- Depends on the estimation method
  - Covariance-fitting methods: distributional assumptions, optimal parameter estimates, factor indeterminacy
  - PLS path modeling: non-parametric, optimal prediction accuracy, LV scores
- Step 1: Inspection of the estimation results (R², parameter estimates, standard errors, LV scores, residuals, etc.)
- Step 2: Assessment of fit
  - Covariance-fitting methods: global measures
  - PLS path modeling: partial fitting measures

Inspection of Results

- Covariance-fitting methods: global optimization
  - Model parameters and their standard errors; do they confirm the theory?
  - Correlation residuals: sij − σij(θ̂)
  - Graphical methods
- PLS techniques: iterative optimization of the outer models and the inner model
  - Model parameters
  - Resampling procedures like blindfolding or jackknifing give standard errors of the model parameters
  - LV scores
  - Graphical methods

Fit Indices

- Covariance-fitting methods: covariance fit measures such as
  - Chi-square statistics
  - Goodness of Fit Index (GFI), AGFI
  - Normed Fit Index (NFI), NNFI, CFI
  - etc.
  The basis is the discrepancy function.
- PLS path modeling: prediction-based measures
  - Communality
  - Redundancy
  - Stone-Geisser's Q²

Chi-square Statistic

- Test of H0: Σ = Σ(θ) against a non-specified alternative
- Test statistic X² = (N − 1) F(S; Σ(θ̂))
- If the model is just identified (c = p): X² = 0  [c = K(K+1)/2, p: number of parameters in θ]
- Under the usual regularity conditions (normal distribution, ML-estimation), X² is asymptotically χ²(c − p)-distributed
- A non-significant X² indicates that the over-identified model does not differ from a just-identified version
- Problem: X² increases with increasing N
- Some prefer X²/(c − p) to X² (reduced sensitivity to sample size); rule of thumb: X²/(c − p) < 3 is acceptable

Goodness of Fit Indices

Goodness of Fit Index (Jöreskog & Sörbom):
  GFI = 1 − F[S, Σ(θ̂)] / F[S, Σ(0)]
- Portion of the observed covariances explained by the model-implied covariances
- "How much better does the model fit compared to no model at all"
- Ranges from 0 (poor fit) to 1 (perfect fit)
- Rule of thumb: GFI > 0.9
- AGFI penalizes model complexity:
  AGFI = 1 − (c / (c − p)) · F[S, Σ(θ̂)] / F[S, Σ(0)]

Other Fit Indices

- Normed Fit Index, NFI (Bentler & Bonett)
  - Similar to GFI, but compares with a baseline model, typically the independence model (indicators are uncorrelated)
  - Ranges from 0 (poor fit) to 1 (perfect fit)
  - Rule of thumb: NFI > 0.9
- Comparative Fit Index, CFI (Bentler)
  - Less dependent on sample size than NFI
- Non-Normed Fit Index, NNFI (Bentler & Bonett)
  - Also known as the Tucker-Lewis Index
  - Adjusted for model complexity
- Root mean squared error of approximation, RMSEA (Steiger):
  RMSEA = √( F[S, Σ(θ̂)] / (c − p) )
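Given the two discrepancy values, the covariance-fit indices above reduce to a few lines of arithmetic. The sketch below collects them in one helper; the numeric inputs at the end are placeholders, only to show the computation, and RMSEA is computed as defined on the slide.

```python
import numpy as np

def fit_indices(F_hat, F_null, N, K, p):
    """Chi-square, GFI, AGFI and RMSEA from discrepancy-function values.
    F_hat:  F(S; Sigma(theta_hat)) for the fitted model
    F_null: F(S; Sigma(0)) for the 'no model at all' baseline
    N: sample size, K: number of indicators, p: number of free parameters."""
    c = K * (K + 1) // 2               # number of distinct variances/covariances
    df = c - p
    chi2 = (N - 1) * F_hat
    return {
        "chi2": chi2,
        "df": df,
        "chi2/df": chi2 / df,          # rule of thumb: < 3 is acceptable
        "GFI": 1 - F_hat / F_null,
        "AGFI": 1 - (c / df) * F_hat / F_null,
        "RMSEA": np.sqrt(F_hat / df),  # as defined on the slide
    }

# Placeholder inputs, only to show the calculation
print(fit_indices(F_hat=0.35, F_null=4.2, N=250, K=15, p=39))
```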

Assessment of PLS Results

- Not a single optimization but many optimization steps; not a global measure but many measures of various aspects of the results
- Indices for assessing the predictive relevance
  - Portions of explained variance (R²)
  - Communality, redundancy, etc.
  - Stone-Geisser's Q²
- Reliability indices
- NFI, assuming normality of the indicators
  - Allows comparisons with covariance-fitting results

Some Indices

Assessment of diagonal fit (proportion of explained variances)
- SMC (squared multiple correlation coefficient) R²: (average) proportion of the variance of the LVs that is explained by other LVs; concerns the inner model
- Communality H²: (average) proportion of the variance of the indicators that is explained by the LV directly connected to them; concerns the outer model
- Redundancy F²: (average) proportion of the variance of the indicators that is explained by the predictor LVs of their own LV
- r²: proportion of explained variance of the indicators

Some Indices, cont'd

Assessment of non-diagonal fit
- Explained indicator covariances:
  rs = 1 − c/s  with c = rms(C), s = rms(S); C: estimate of Cov(ε)
- Explained latent variable correlation:
  rr = 1 − q/r  with q = rms(Q), r = rms(Cov(Y)); Q: estimate of Cov(ζ)
- reY = rms(Cov(e, Y)), e: outer residuals
- reu = rms(Cov(e, u)), u: inner residuals

rms(A) = (Σi Σj aij²)^1/2: root mean squared covariances (diagonal elements of a symmetric A are excluded from the summation)

Stone-Geisser's Q²

- Similar to R²:
  Q² = 1 − E/O
  E: sum of squared prediction errors; O: sum of squared deviations from the mean
- Prediction errors from resampling (blindfolding, jackknifing)
- E.g., communality of Yij, an indicator of ηi:
  Q²ijc = 1 − Σn [yijn − λ̂ij Yin]² / Σn [yijn − ȳij]²
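The bookkeeping behind Q² = 1 − E/O can be sketched with a simplified blindfolding scheme for one indicator: every D-th observation is omitted, the loading is re-estimated from the remaining cases, and the omitted values are predicted from loading times LV score. This is only an illustration of the idea (full PLS software re-estimates the whole model per omission round); data and names are assumptions.

```python
import numpy as np

def q2_communality(y, lv_scores, D=7):
    """Stone-Geisser Q^2 for one indicator y of a latent variable with
    case values lv_scores; blindfolding with omission distance D."""
    n = len(y)
    E = O = 0.0
    for d in range(D):
        omit = np.arange(d, n, D)                      # every D-th observation
        keep = np.setdiff1d(np.arange(n), omit)
        # re-estimate the loading without the omitted cases
        lam = (y[keep] @ lv_scores[keep]) / (lv_scores[keep] @ lv_scores[keep])
        y_pred = lam * lv_scores[omit]                 # model prediction
        E += np.sum((y[omit] - y_pred) ** 2)           # squared prediction errors
        O += np.sum((y[omit] - y[keep].mean()) ** 2)   # trivial prediction by the mean
    return 1 - E / O

# Illustration: one indicator loading 0.8 on a standardized LV score
rng = np.random.default_rng(4)
eta = rng.normal(size=210)
y = 0.8 * eta + rng.normal(scale=0.6, size=210)
print(q2_communality(y, eta))   # roughly 0.64 for this simulated indicator
```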

Lohmöller's Advice

- Check the fit of the outer model
  - Low unexplained portion of the indicator variances and covariances
  - High communalities in reflective blocks, low residual covariances
  - Residual covariances between blocks close to zero
  - Covariances between outer residuals and latent variables close to zero
- Check the fit of the inner model
  - Low unexplained portion of the latent variable variances and covariances
- Check the fit of the total model
  - High redundancy coefficient
  - Low covariances of inner and outer residuals

ACSI Model: Results

[Path diagram of the ACSI model with EQS-estimates and PLS-estimates of the path coefficients]

Diagnostics: EQS

           ACSI    ACSIe
  χ²       247.5   378.7
  df        81     173
  NNFI     0.898   0.930
  RMSEA    0.079   0.060

Diagnostics: PLS (centroid weighting)

           ACSI   ACSIe   Hui    Schenk
  R²       0.29   0.35    0.43   0.40
  Q²       0.36   0.41    0.58   0.49
  rr       0.47   0.55    0.58   0.59
  H²       0.71   0.64    0.64   0.64
  F²       0.22   0.24    0.30   0.26
  r²       0.63   0.63    0.57   0.60
  reY      0.26   0.24    0.19   0.09
  reu      0.19   0.17    0.16   0.08

Practice of SEM Analysis

- Theoretical basis
- Data
  - Scaling: metric or nominal (in LISREL not standard)
  - Sample size: a good choice is 10p (p: number of parameters); fewer than 5p cases might result in unstable estimates; a large number of cases will result in large values of X²
  - Reflective indicators are assumed to be uni-dimensional; it is recommended to use principal axis extraction, Cronbach's alpha and similar tools to confirm the suitability of the data (a small sketch follows below)
- Model
  - Identification must be checked for covariance fitting methods
  - Indicators for an LV can be formative or reflective; formative indicators are not supported in LISREL

Practice of SEM Analysis, cont'd

- Model
  - LISREL allows for more general covariance structures, e.g., correlation of measurement errors
- Estimation
  - Repeat the estimation with varying starting values
- Diagnostic checks
  - Use graphical tools like plots of residuals etc.
  - Check each measurement model
  - Check each structural equation
  - Lohmöller's advice
- Model trimming
- Stepwise model building (Hui, 1982; Schenk, 2001)
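As a minimal sketch of the reliability check mentioned above, Cronbach's alpha for one reflective block can be computed directly from the indicator data (simulated here for illustration):

```python
import numpy as np

def cronbach_alpha(block):
    """Cronbach's alpha for an (N x J) block of indicators."""
    J = block.shape[1]
    item_var = block.var(axis=0, ddof=1).sum()     # sum of item variances
    total_var = block.sum(axis=1).var(ddof=1)      # variance of the sum score
    return J / (J - 1) * (1 - item_var / total_var)

# Illustration: three indicators of one latent variable
rng = np.random.default_rng(5)
f = rng.normal(size=300)
block = np.column_stack([f + rng.normal(scale=0.6, size=300) for _ in range(3)])
print(cronbach_alpha(block))   # high alpha for a homogeneous reflective block
```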


LISREL vs PLS

- Models
  - PLS assumes a recursive inner structure
  - PLS allows for higher complexity w.r.t. B, Γ, and Λ; LISREL w.r.t. Ψ and Θ
- Estimation method
  - Distributional assumptions are not needed in PLS
  - Formative measurement models in PLS
  - Factor scores in PLS
  - PLS: biased estimates, consistency at large
  - LISREL: ML-theory
  - In PLS: diagnostics are much richer
- Empirical facts
  - LISREL in general needs larger samples
  - LISREL needs more computation

The Extended Model

[Path diagram of the extended model (Expectations, Perceived Quality, Emotional Factor, Value, Customer Satisfaction, Loyalty) with EQS-estimates and PLS-estimates of the path coefficients]

Diagnostics: EQS

           ACSI    ACSIe
  χ²       247.5   378.7
  df        81     173
  NNFI     0.898   0.930
  RMSEA    0.079   0.060

Diagnostics: PLS (centroid weighting)

           ACSI   ACSIe   Hui    Schenk
  R²       0.29   0.35    0.43   0.40
  Q²       0.36   0.41    0.58   0.49
  rr       0.47   0.55    0.58   0.59
  H²       0.71   0.64    0.64   0.64
  F²       0.22   0.24    0.30   0.26
  r²       0.63   0.63    0.57   0.60
  reY      0.26   0.24    0.19   0.09
  reu      0.19   0.17    0.16   0.08

Model Building: Hui's Approach

[Path diagram: inner model selected by Hui's approach, with estimated path coefficients]

Model Building: Schenk's Approach

[Path diagram: inner model selected by Schenk's approach, with estimated path coefficients]

Data-driven Specification

- No solid a priori knowledge about the relations among the variables
- Stepwise regression
  - Search for the "best" model
  - Forward selection
  - Backward elimination
  - Problem: omitted variable bias
- General to specific modeling

The end

Stepwise SE Model Building

- Hui (1982): models with interdependent inner relations
- Schenk (2001): guarantees a causal structure, i.e., a triangular matrix B of path coefficients in the inner model η = Bη + ζ

Stepwise SE Model Building

Hui's algorithm

Stage 1
1. Calculate case values Yin for the LVs ηi as the principal component of the corresponding block; calculate R = Corr(Y)
2. Choose for each endogenous LV the one with the highest correlation to form a simple regression
3. Repeat until a stable model is reached
   a. PLS-estimate the model, calculate case values, and recalculate R
   b. Drop from each equation LVs with t-value |t| < 1.65
   c. Add in each equation the LV with the highest partial correlation with the dependent LV

Stepwise SE Model Building

Hui's algorithm, cont'd

Stage 2
1. Use the rank condition for checking identifiability of each equation
2. Use 2SLS for estimating the path coefficients in each equation

Hui's vs. Schenk's Algorithm

Hui's algorithm is not restricted to a causal structure; it allows cycles and an arbitrary structure of the matrix B.
Schenk's algorithm
- uses an iterative procedure similar to that used by Hui
- makes use of a priori information about the structure of the causal chain connecting the latent variables
- the latent variables are to be sorted

Stepwise SE Model Building

Schenk's algorithm
1. Calculate case values Yin for the LVs ηi as the principal component of the corresponding block; calculate R = Corr(Y)
2. Choose the pair of LVs with the highest correlation
3. Repeat until a stable model is reached
   a. PLS-estimate the model, calculate case values, and recalculate R
   b. Drop LVs with a non-significant t-value
   c. Add the LV with the highest correlation with the already included LVs
A simplified sketch of this stepwise idea is given after the data overview below.

Data, special CS dimensions

Staff (2 indicators): availability¹ (PERS), politeness¹ (FREU)
Outlet (3): make-up¹ (GEST), presentation of merchandise¹ (PRAE), cleanliness¹ (SAUB)
Range (2): freshness and quality (QUAL), richness (VIEL)
Customer orientation (7): access to outlet (ERRE), shopping hours (OEFF), queuing time for checkout¹ (WART), paying modes¹ (ZAHL), price information¹ (PRAU), sales (SOND), availability of sales (VERF)
¹ Dimension of the "Emotional Factor"


References

C. Fornell (1992), "A National Customer Satisfaction Barometer: The Swedish Experience". Journal of Marketing, (56), 6-21.
C. Fornell and Jaesung Cha (1994), "Partial Least Squares", pp. 52-78 in R.P. Bagozzi (ed.), Advanced Methods of Marketing Research. Blackwell.
J.B. Lohmöller (1989), Latent Variable Path Modeling with Partial Least Squares. Physica-Verlag.
H. Wold (1982), "Soft Modeling: The Basic Design and Some Extensions", in: Vol. 2 of Jöreskog and Wold (eds.), Systems under Indirect Observation. North-Holland.
H. Wold (1985), "Partial Least Squares", pp. 581-591 in S. Kotz, N.L. Johnson (eds.), Encyclopedia of Statistical Sciences, Vol. 6. Wiley.
