
Simple Regression: (one independent/one dependent)

Type of Variable: 2 scale variables
Example of Variable: Total Purchase Intention / Total Brand Attitude

Multiple Regression: (2 independent/one dependent)

Type of Variable: 3 scale variables
Example of Variable: Total Purchase Intention, etc.

Correlation:

Type of Variable: at least 2 scale variables
Example of Variable: Total Purchase Intention, etc.

Independent T Test (from different samples)

Type of Variable: 1 nominal variable (with 2 categories), 1 scale variable
Example of Variable: Gender; Will Buy/Won't Buy

One-way ANOVA

Type of Variable: 1 categorical/ordinal variable (>2 categories), 1 scale variable
Example of Variable: Occupation; Total Purchase Intention

Paired T Test (From same sample)

Type of Variable: 1 nominal variable, 1 scale variable

ANOVA repeated measures

Type of Variable: 1 nominal variable, 1 scale variable
Example of Variable: T1, T2, T3

Factor analysis & reliability analysis

 These are quality-control analyses used to confirm whether all dimensions load on their respective construct
 For cross-loadings, use rotation
 They give you confidence that what you are trying to measure is actually what you are measuring

Factor loading

Analyze > Dimension Reduction > Factor > select variables > Descriptives > KMO & Bartlett's test of sphericity > Continue > Rotation > Promax & Loading plots > Continue > Options > Suppress small coefficients (.40) > Continue > OK

Analysis

 Check the plot and warnings to see whether there is more than one factor
 Negative signs in the component matrix show negatively coded variables (recode them via Transform > Recode into Same Variables > select variable > Old and New Values > OK)
 If KMO > 0.7 the data are adequate for factor analysis (the threshold is 0.5 if there are only 2 questions); Bartlett's sig should be < 0.05
 In the total variance table, a component whose eigenvalue is greater than 1 is considered a separate factor

Reliability

Analyze > Scale > Reliability Analysis > select questions > OK

 Cronbach's alpha tests reliability; if alpha > 0.7 the construct is reliable
 If alpha is negative, you may need to recode reverse-scored items
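As a cross-check on the SPSS output, Cronbach's alpha can be computed by hand. This is a minimal pure-Python sketch; the item scores below are made-up 5-point responses, not data from the source:

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of totals).

    items: one list of scores per questionnaire item, all the same length.
    """
    k = len(items)
    totals = [sum(scores) for scores in zip(*items)]       # each respondent's total
    item_var_sum = sum(variance(item) for item in items)   # sum of item variances
    return k / (k - 1) * (1 - item_var_sum / variance(totals))

# Hypothetical responses from five respondents to three items:
alpha = cronbach_alpha([[3, 4, 4, 5, 5],
                        [3, 4, 5, 5, 4],
                        [2, 4, 4, 5, 5]])
print(round(alpha, 2))  # 0.9 -> above the 0.7 threshold, so the scale is reliable
```

If one item is reverse-scored, the same function will show alpha dropping, which is the cue to recode that item first.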

Now redo Factor Analysis

Never compute your variables until you have confirmed the construct's dimensionality and its reliability

Computing Variable

Transform > Compute Variable > enter target variable name > move the items across (use +) > OK

Computation of regression variable

Analyze > Dimension Reduction > Factor > select variables > Scores > Save as variables (Regression method) > Continue > OK

 It will create a new variable; rename it
 Then run a correlation between the computed variable and the regression (factor-score) variable; if the correlation is > .99 use either one, otherwise use the regression variable

Binary Variable

Turning a scale variable into binary form to run a chi-square test on two binary variables

 First find median:

Analyze > Descriptive Statistics > Frequencies > select variable > Statistics > Median > OK

 Once median is known compute a new bin variable

Transform > Compute Variable > set the new variable name > enter 1 > If > define the condition: variable_name <= median

Repeat the process to set the second value (2) for cases above the median
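The median-split recode above can be sketched in plain Python (the variable name and scores are invented): cases at or below the median get code 1, the rest get code 2:

```python
from statistics import median

def median_split(scores):
    """Recode a scale variable into a binary variable around its median."""
    m = median(scores)
    return [1 if s <= m else 2 for s in scores]

purchase_intention = [2, 3, 5, 7, 8]     # hypothetical scale scores
print(median_split(purchase_intention))  # [1, 1, 1, 2, 2]  (median is 5)
```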

Chi-square

Analyze > Descriptive Statistics > Crosstabs > select rows and columns > Statistics > (Chi-square, Phi and Cramer's V) > Continue > Cells (Percentages) > Continue > Display clustered bar charts > OK

 You can add a layer by adding a variable to the Layer box in Crosstabs > OK; the output is then displayed by layer (keep in mind the data are not split)
 Best for two variables with two categories each (it can be done for more than two as well)
 Uses only categorical/binary data
 Analyze the significance level and the graphical representation
 Phi and Cramer's V coefficients are read the same way as correlation: strength of association
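The chi-square statistic and Cramer's V that SPSS reports can be reproduced by hand. A rough sketch with an invented 2x2 table (for a 2x2 table, phi equals Cramer's V):

```python
from math import sqrt

def chi_square_cramers_v(table):
    """Pearson chi-square for a contingency table, plus Cramer's V.

    Expected count = row total * column total / grand total;
    V = sqrt(chi2 / (n * (k - 1))), where k = min(rows, columns).
    """
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    n = sum(row_totals)
    chi2 = sum((obs - rt * ct / n) ** 2 / (rt * ct / n)
               for row, rt in zip(table, row_totals)
               for obs, ct in zip(row, col_totals))
    k = min(len(table), len(table[0]))
    return chi2, sqrt(chi2 / (n * (k - 1)))

# Invented counts: gender (rows) vs. will buy / won't buy (columns)
chi2, v = chi_square_cramers_v([[30, 10], [10, 30]])
print(round(chi2, 2), round(v, 2))  # 20.0 0.5
```

The significance of chi2 would still be read from a chi-square table (or SPSS); only the statistic and effect size are computed here.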

Paired T test

We use "paired t-test" when you have a two observations and they are in a scale format and they are
coming from the same source. It cannot handle more than two data points. Eg t1 & t2

p-value < 0.05 => significant

Formula (to test the magnitude of the effect):

eta squared = t^2 / (t^2 + N - 1)

Interpretation: small effect = 0.01; moderate effect = 0.06; large effect = 0.14

Result: A paired-samples t-test was conducted to evaluate the impact of the intervention on students' scores on the Fear of Statistics Test (FOST). There was a statistically significant decrease in FOST scores from Time 1 (M=40.17, SD=5.16) to Time 2 (M=37.5, SD=5.15), t(29)=5.39, p<0.05. The eta squared statistic (0.5) indicates a large effect.
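The t statistic and the eta-squared formula above can be verified with a small pure-Python sketch. The score lists below are invented; note that with the FOST figures, t(29) = 5.39 gives eta squared = 5.39^2 / (5.39^2 + 29) ≈ 0.50, a large effect:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(time1, time2):
    """Paired-samples t and eta squared = t^2 / (t^2 + N - 1)."""
    diffs = [a - b for a, b in zip(time1, time2)]
    n = len(diffs)
    t = mean(diffs) / (stdev(diffs) / sqrt(n))  # t on the pairwise differences
    return t, t ** 2 / (t ** 2 + n - 1)

# Invented before/after scores from the same five respondents:
t, eta_sq = paired_t([5, 6, 7, 8, 9], [4, 4, 6, 7, 7])
```

The p-value for t would still come from a t table with N - 1 degrees of freedom.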

Independent sample t test

Analyze > Compare Means > Independent-Samples T Test > select test variable (scale) and grouping variable (nominal/categorical) > Define Groups

 We use "independent t-test" when you have a two observations and they are in a scale format
and they are coming from the different source. It cannot handle more than two data points.
 p-value 0.05 =>> significance
 Formula: (To test the magnitude of the significance) eta squared = (t^2) / (t^2+N1 + N2-2)
 For sig check Levine’s sig if >0.05 equal variance assumed and check sig in respective row.
 Significance (Levine’s –and corresponding sig)
 Effect Size (ETA squared – T 2 / T2 + (N1+N2-2)
o Small (.01)
o Moderate (0.06)
o large (0.14)
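The equal-variances t statistic and its eta squared can be sketched the same way (the group scores are invented):

```python
from math import sqrt
from statistics import mean, variance

def independent_t(group1, group2):
    """Independent-samples t (equal variances assumed) and
    eta squared = t^2 / (t^2 + N1 + N2 - 2)."""
    n1, n2 = len(group1), len(group2)
    # Pooled variance, weighting each group by its degrees of freedom:
    pooled = ((n1 - 1) * variance(group1) + (n2 - 1) * variance(group2)) / (n1 + n2 - 2)
    t = (mean(group1) - mean(group2)) / sqrt(pooled * (1 / n1 + 1 / n2))
    return t, t ** 2 / (t ** 2 + n1 + n2 - 2)

# Invented scores for two different samples (e.g. two gender groups):
t, eta_sq = independent_t([1, 2, 3], [4, 5, 6])
```

This is the "equal variances assumed" row of the SPSS output; the Levene's-test check above decides whether that row applies.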

ANOVA repeated measure

Analyze > General Linear Model > Repeated Measures > name the within-subject factor > set the number of levels > Define

 To check significance, look at Wilks' Lambda in the multivariate tests table and its significance value
 Eta squared (larger = better)
 Check whether the relation is significant with respect to each group and its eta (either larger or smaller than the whole)
 The graph will give you a clear and concise picture

Split file

To check correlation by different groups

Data>Split File> organize output by groups

 then run correlation and the results would be shown in groups

Double-click on the result for pairwise comparisons

One way ANOVA

Analyze > Compare Means > One-Way ANOVA > Post Hoc (check Tukey) > Options (check Descriptive & Means plot) > OK

For the highest group mean, check for positive pairwise differences


ANALYSIS:

 Check significance: < 0.05
 Check each pairwise (I-J) difference's corresponding significance value
 Lowest optimism: all negative differences in its row; highest optimism: all positive differences in its row
 Eta squared = Sum of Squares between groups / Total Sum of Squares
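The eta-squared line above (sum of squares between groups over the total sum of squares) can be checked by hand; a sketch with three invented groups:

```python
def anova_eta_squared(groups):
    """One-way ANOVA effect size: SS between groups / total SS."""
    scores = [x for g in groups for x in g]
    grand_mean = sum(scores) / len(scores)
    ss_total = sum((x - grand_mean) ** 2 for x in scores)
    # Each group contributes n_g * (group mean - grand mean)^2:
    ss_between = sum(len(g) * (sum(g) / len(g) - grand_mean) ** 2 for g in groups)
    return ss_between / ss_total

# Invented optimism scores for three occupation groups:
print(anova_eta_squared([[1, 2, 3], [4, 5, 6], [7, 8, 9]]))  # 0.9
```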

Correlation

Analyze>correlate>bivariate

 Pearson correlation for scale variables
 Spearman correlation for ordinal/ranked data
 Check significance, magnitude, and direction
 (Write results this way) Analysis:
o Magnitude/strength (e.g. 0.10-0.29 = small; 0.30-0.49 = medium; 0.50-1.0 = large)
o Direction (positive/negative)
o Significance (statistically significant or not: < 0.05)
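As a check on the bivariate output, Pearson's r can be computed directly (the paired scores below are invented):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation: cross-product of deviations over the product of spreads."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / sqrt(sxx * syy)

print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # 1.0  (perfect positive)
print(pearson_r([1, 2, 3], [3, 2, 1]))        # -1.0 (perfect negative)
```

The sign gives the direction and the magnitude is read against the small/medium/large bands above.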

Regression

Analyze > regression>Linear

 Check adjusted R-square, model significance, standardized coefficients (beta), and the significance of the betas
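For a single predictor, the regression output can be reproduced with ordinary least squares. A sketch with invented data; note that with one predictor the standardized beta equals Pearson's r, and adjusted R-squared = 1 - (1 - R^2)(n - 1)/(n - 2):

```python
def simple_regression(x, y):
    """Ordinary least squares with one predictor: intercept, slope and R-squared."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    syy = sum((b - my) ** 2 for b in y)
    slope = sxy / sxx
    r_squared = sxy ** 2 / (sxx * syy)  # equals Pearson's r squared
    return my - slope * mx, slope, r_squared

# Invented brand-attitude (x) vs. purchase-intention (y) scores:
intercept, slope, r2 = simple_regression([1, 2, 3, 4], [3, 5, 7, 9])
print(intercept, slope, r2)  # 1.0 2.0 1.0
```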

Friedman Test

Analyze > Nonparametric Tests > Related Samples > select fields > Run

Double-click the output to get pairwise comparisons
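The Friedman chi-square that this dialog produces can be sketched by hand. A rough version that assumes no tied scores within a subject (the data are invented):

```python
def friedman_chi2(data):
    """Friedman chi-square for k related conditions.

    data: one list of k scores per subject. Ranks each subject's scores
    (assumes no ties), then chi2 = 12 * sum(Rj^2) / (n*k*(k+1)) - 3*n*(k+1),
    where Rj is the rank sum for condition j.
    """
    n, k = len(data), len(data[0])
    rank_sums = [0] * k
    for row in data:
        for rank, j in enumerate(sorted(range(k), key=lambda j: row[j]), start=1):
            rank_sums[j] += rank
    return 12 * sum(r ** 2 for r in rank_sums) / (n * k * (k + 1)) - 3 * n * (k + 1)

# Three subjects measured under three conditions, each improving T1 < T2 < T3:
print(friedman_chi2([[1, 2, 3], [1, 2, 3], [1, 2, 3]]))  # 6.0
```

Its significance would be read against a chi-square distribution with k - 1 degrees of freedom.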
