
PURDUE UNIVERSITY
GRADUATE SCHOOL
Thesis Acceptance

Graduate School ETD Form 9 (01/07)

This is to certify that the thesis prepared

By: Margaret L. Dalrymple

Entitled: The Value of Evaluation: A Case Study of Evaluating Strategic Plan Initiatives

Complies with University regulations and meets the standards of the Graduate School for originality and quality

For the degree of: Doctor of Philosophy

Final examining committee members:
Judith M. Gappa, Chair
Deborah E. Bennett, Co-Chair
Rabindra N. Mukerjea
Sidney M. Moon
Andrea M. Trice

Approved by Major Professor(s): Judith M. Gappa
Approved by Head of Graduate Program: Kevin R. Kelly
Date of Graduate Program Head's Approval: July 23, 2007











THE VALUE OF EVALUATION:

A CASE STUDY OF EVALUATING STRATEGIC PLAN INITIATIVES






A Dissertation

Submitted to the Faculty

of

Purdue University

by

Margaret L. Dalrymple





In Partial Fulfillment of the

Requirements for the Degree

of

Doctor of Philosophy



August 2007
Purdue University
West Lafayette, Indiana
UMI Number: 3304589

Copyright 2008 by ProQuest Information and Learning Company.
All rights reserved. This microform edition is protected against
unauthorized copying under Title 17, United States Code.

UMI Microform 3304589
ProQuest Information and Learning Company
300 North Zeeb Road
P.O. Box 1346
Ann Arbor, MI 48106-1346



















For my parents,

Ogden and Janet Dalrymple









ACKNOWLEDGMENTS


I owe a great number of people my sincere gratitude for their support,
encouragement, and assistance. I owe the greatest of professional debts to my advisor, Dr. Judith Gappa, whose sound advice, erudite guidance, and solid encouragement kept me on my path to completing this work. I was very lucky to have such a superb dissertation committee, each member of which served as an excellent mentor and gave exceptional guidance. Special thanks go to Professor Rabindra Mukerjea, Dr. Deborah Bennett, Dr. Andrea Trice, and Dr. Sidney Moon for their contributions to this work.
I would also like to extend my appreciation to all of the participants at the
universities whose strategic plans this study considers. Although they will not be named
in order to preserve confidentiality, their efforts in assisting me with this research are by
no means forgotten. I am grateful for their time and wisdom. I could not have completed
this research without the expert assistance of colleagues whose guidance lent polish to the
final product: Dr. Carol Kominski, Mr. Darrin Harris, Dr. Victor Borden, Ms. Julie
Carpenter-Hubin, Dr. Michael Dooris, and Dr. Louise Sandmeyer.
I thank my doctoral cohorts, Dr. Andrew Robison, Dr. Sara Stein Koch, Dr. Steve
Wanger, and others for their support, collaboration, and friendship over the course of
these past few years.


Dr. Jacquelyn Frost deserves special thanks for her professional guidance,
mentorship, and friendship throughout this entire journey. Her steadfast support has kept
me motivated, and I will forever be grateful for her faith and trust in me.
I am also deeply grateful to Esther Weiner for her support at a critical point
in my life, and to my sister Georgina Dalrymple for her timely childcare assistance and
constant encouragement.
Last, but by no means least, I thank my family; they are my greatest supporters.
Thank you to my children for their incredible patience and ability to keep me grounded.
They may be small in size, but their unceasing love has been a source of overwhelming
strength. To my husband, Steven, I owe the greatest debt. This accomplishment would
never have come to pass without his love, guidance, and understanding. Thank you.








TABLE OF CONTENTS



Page
LIST OF FIGURES . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . viii
ABSTRACT. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix
CHAPTER I: INTRODUCTION . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .1
Background of the Study: Overview of Strategic Planning in Higher Education . .2
Brief History of Strategic Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .3
Strategic Planning in Higher Education . . . . . . . . . . . . . . . . . . . . . . . . . . . .4
Using a Logic Model as a Template for the Evaluation . . . . . . . . . . . . . . . . . . . . . 8
The Research Study . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .8
Statement of the Research Problem . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .9
Definitions of Key Terms and Variables . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .10
Overview of the Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .11
Purpose and Significance of the Study. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .12
Limitations of the Study. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .14
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .15

CHAPTER II: LITERATURE REVIEW. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .16
Search Process . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .17
Literature Review on Strategic Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .19
Phase I: Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .21
Components of a Strategic Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .24
Phase II: Resource Development . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .26
Phase III: Implementation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .28
Phase IV: Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .29
Literature Review of Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 30
Assessment Models . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
Key Elements of Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 35
Literature Review on the Methods of Assessing a Strategic Plan . . . . . . . . . . . . .39
Literature Review on Logic Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .40
Case Studies . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .44
Program Evaluations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .48
Institutional Effectiveness . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 56



CHAPTER III: METHODOLOGY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .59
Overview of Research Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Rationale for Using a Case Study Design . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .60
Using a Logic Model to Frame the Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . .63
Overall Procedure . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .63
Site and Participant Selection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64
Data Collection . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .66
Document Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .66
Survey Instrument . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .67
Semi-structured Interviews . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 69
Data Analysis and Validation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 71
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 75

CHAPTER IV: RESEARCH RESULTS . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .76
University A . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .78
Sources of Evidence and Logic Model . . . . . . . . . . . . . . . . . . . . . . . . . . .79
Evaluation Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 82
Integrating Budgeting with Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . . 88
Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .90
Culture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .92
University B. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Sources of Evidence and Logic Model. . . . . . . . . . . . . . . . . . . . . . . . . . . 94
Evaluation Methodology. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 96
Leadership. . . . . . . . . . . . . . . . . .. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .101
Integrating Budgeting with Planning. . . . . . . . . . . . . . . . . . . . . . . . . . . .104
Culture . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .106
Consistency of the Strategic Plan . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 107
University C . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .108
Sources of Evidence and Logic Model . . . . . . . . . . . . . . . . . . . . . . . . . .110
Evaluation Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .113
Integrating Budgeting with Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . .118
Culture and Leadership . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .120
Linking the Strategic Plan to Other Processes . . . . . . . . . . . . . . . . . . . . .121
Length of the Planning Cycle . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 121
University D . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .122
Sources of Evidence and Logic Model . . . . . . . . . . . . . . . . . . . . . . . . . . 123
Evaluation Methodology . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 127
Integrating Budgeting with Planning . . . . . . . . . . . . . . . . . . . . . . . . . . . .131
Communication . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .134
Cross Case Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .135
Summary and Discussion of the Results within a Logic Model . . . . . . .135



Summary and Discussion of the Common Themes . . . . . . . . . . . . . . . . .140
A Conceptual Model for Evaluating Strategic Planning Processes . . . . . . . . . . .146
Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 151

CHAPTER V: SUMMARY AND DISCUSSION. . . . . . . . . . . . . . . . . . . . . . . . . . . . . .153
Review of Problem Statement and Research Methodology. . . . . . . . . . . . . . . . .153
Discussion of Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .154
Limitations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .158
Significance . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .159
Relationship of the Current Study to Prior and Future Research . . . . . . . . . . . . 160
Concluding Remarks . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .162

BIBLIOGRAPHY . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 164

APPENDICES
Appendix A: Flowchart of the Case Study Methodology . . . . . . . . . . . . . . . . . .173
Appendix B: List of Documents for Analysis . . . . . . . . . . . . . . . . . . . . . . . . . . .174
Appendix C: Strategic Planning Evaluation Survey . . . . . . . . . . . . . . . . . . . . . .175
Appendix D: Survey Invitation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .177
Appendix E: Interview Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .179
Appendix F: Interview Invitation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 180
Appendix G: Expert Panel Evaluation Form . . . . . . . . . . . . . . . . . . . . . . . . . . . .181
Appendix H: Documents Reviewed . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 183
Appendix I: Interview Subjects . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .185
Appendix J: Interview Questions Incorporated into Logic Model . . . . . . . . . . .186
Appendix K: University A Survey Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . .188
Appendix L: University B Survey Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . .190
Appendix M: University C Survey Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . .192
Appendix N: University D Survey Results . . . . . . . . . . . . . . . . . . . . . . . . . . . . .194

VITA . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .196






LIST OF FIGURES



Figure Page

Figure 2.1: Strategic Planning Key Components . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 26
Figure 2.2: Basic Logic Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 41
Figure 2.3: Formative and Summative Evaluation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .42
Figure 2.4: Comprehensive Logic Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 44
Figure 2.5: Diagram of the Institutional Planning Process at Parkland College . . . . . . . .55

Figure 3.1: Expert Panel Members . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 74

Figure 4.1: Sources of Evidence Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .77
Figure 4.2: Matrix of University A's Evidence by Logic Model . . . . . . . . . . . . . . . . . . .81
Figure 4.3: Matrix of University B's Evidence by Logic Model . . . . . . . . . . . . . . . . . . .95
Figure 4.4: Matrix of University C's Evidence by Logic Model . . . . . . . . . . . . . . . . . .112
Figure 4.5: Matrix of University D's Evidence by Logic Model . . . . . . . . . . . . . . . . . . 125
Figure 4.6: Comparison Matrix of each Institution's Evidence by Logic Model . . . . . .137
Figure 4.7: Universities' Results Charted on the Logic Model . . . . . . . . . . . . . . . . . . . .138
Figure 4.8: The Conceptual Evaluation Model - Evaluation of Resources Needed . . . .147
Figure 4.9: The Conceptual Evaluation Model - Evaluation of Intended Results . . . . . .148







ABSTRACT



Dalrymple, Margaret L. Ph.D., Purdue University, August 2007. The Value of
Evaluation: A Case Study of Evaluating Strategic Plan Initiatives. Major Professor:
Judith M. Gappa.



The current environment of oversight and accountability, along with declining
state support for higher education, has prompted many universities to utilize strategic
planning in order to achieve their objectives. Yet strangely, relatively little research has
been done on evaluation models for assessing strategic planning in higher education
institutions. This study endeavors to shed some much-needed light on how higher
education institutions have assessed their university-wide strategic plans by focusing on
their evaluation methods.
This study focuses on the methodology of various approaches, and does not
determine whether those strategic plans have been effective or not. Rather, the
dissertation examines the methods used in collecting data and evidence during the
evaluation phase, the final stage of strategic planning.
The dissertation follows the design of a multiple case study, the cases selected
being higher education institutions located within the United States, affiliated with the
Association of American Universities (AAU), and in the initial stage of the evaluation of
their university-wide strategic plans. Within the multiple cases, the study's data (coded using a logic model) is derived from detailed document analysis, a survey, and
interviews. Results of , the study reveal several key factors in a systematic evaluation of
strategic planning initiatives: communication, leadership, culture, and integration of
budgeting and planning.
Exemplar characteristics drawn from the study's findings result in a conceptual evaluation model. This model has five elements: a summarization and exploration of the
relational issues of influences and resources; a review of activities and processes; an
analysis of data; an assessment of the consequences, outcomes, and/or the effectiveness of
the plan; and stakeholder participation in the evaluation. These elements provide the basis
for an evaluation methodology that a higher education institution can apply to its strategic
plan initiative. The conceptual evaluation model may thus provide academic practitioners
with a comprehensive tool with which to evaluate their own institutions' strategic plans.
The findings will potentially benefit academic leaders, strategic planners, and
institutional researchers by identifying the critical components of evaluation of a
university-wide strategic plan.







CHAPTER I: INTRODUCTION



"People commonly use statistics like a drunk uses a lamp post: for support rather than for illumination." Though aimed squarely at the politicians of his time, Andrew Lang's witty observation continues to resonate. Indeed, one might even apply it to today's higher education institutions when it comes to evaluating their strategic plans.
For even though the purpose of strategic planning is to assist an institution in achieving
its vision, it is nevertheless true that some institutions tend to use data to support their
initiatives rather than to evaluate the actual merit of their plans.
In this day and age of assessment and accreditation, much has been written about
the need to measure the effectiveness of a higher education institution. What better way
to do this than to look at the level of success of an institution's strategic plan? Ironically, though, evaluations of a strategic plan's success (or failure) are not often undertaken.
Perhaps it is because this type of analysis seems overwhelming and difficult to
conceptualize, or because no standard techniques of evaluation have yet been devised. Or
perhaps it is because strategic plans are so complex since institutions employ them to take
into consideration their unique environmental, fiscal, governance, geographic, and
demographic issues. However, assessment of a strategic plan is absolutely necessary in
order to address the question of how effective it has been.


The desire for oversight and accountability combined with declining government
support for higher education has created an environment that demands that institutions
employ effective strategic plans to achieve their mandated objectives. But therein lies the
greatest challenge: in such an environment of accountability, institutions must be able to
showcase (or own up to) the actual level of effectiveness or success they have achieved in
implementing their strategic plans. While developing a strategic plan is relatively
commonplace now, evaluating whether it is truly effective is far less common, and this
failure has even derailed several institutions' overall plans (Mintzberg, 1994;
Schmidtlein, 1990). The challenge thus lies in moving from the implementation of a
strategic plan to an evaluation of it that is detailed, inclusive, and responsive, and that can
also inform institutional budgeting and marketing. This apparent lack of evaluation
demonstrates the need for assessment of strategic plans in higher education institutions.
This chapter begins by describing the background and rationale for this study. It
then presents the research study and defines key terms. Next an overview of the
methodology is presented. The chapter concludes with the purpose and significance of
the study as well as its limitations.




Background of the Study: Overview of Strategic Planning in Higher Education

It is no coincidence that higher education institutions have embraced strategic
planning at the same time that a number of changes have occurred in funding, staffing
and enrollment. Both the number of students and faculty rose dramatically in the last half
of the 20th century (Keller, 1983; Nunez, 2003). Following suit, physical facilities have
expanded as well. More recently, changes in traditional funding sources and poor
economic conditions have required institutions to look both internally and externally for
new sources of funding in order to address these changing conditions (Nunez, 2003, p. 1).
The most widely used approach to deal with such new circumstances is strategic planning
(Glaister & Falshaw, 1999; Nunez, 2003).


Brief History of Strategic Planning. The etymology of strategy is from the Greek words stratos (army) and agein (to lead). Strategic planning originated in the military, which defined it as "the science of planning and directing large-scale military operations, of maneuvering forces into the most advantageous position prior to actual engagement with the enemy" (Torres, 2001, p. 15). However, it is in the private sector
that the concept has come into its own in the last several decades. It evolved there as a
planning model for defining the purpose and policies of a company and its business.
Eventually it was modified into management of risk, industry growth, and market share
(Torres, 2001). Early studies on strategic planning in the private sector focused on
manufacturing, banking, and transportation industries (Anderson, 2000). In 1973, Rue
conducted a study to empirically evaluate the planning practices and attitudes of
practitioners in private sector organizations. The results revealed that the majority of
firms engaged in moderate to extensive strategic planning activities and few firms had no
planning processes. Rue (1973) emphasized that strategic planning had grown in
popularity and size in the early 1970s, as had the need to involve staff specialists,
planning departments, and outside consultants. More recent research regarding the
current status of strategic planning is limited. In one recent study, Glaister and Falshaw
(1999) examined the extent to which private firms used strategic planning and their views
and attitudes toward it. Glaister and Falshaw asserted that, while strategic planning
suffered a downturn in popularity in the late 1970s due in large part to "the inability of strategic planning tools to deliver what was expected of them," strategic planning had regained its reputation by the 1990s (p. 108). This was due to its emphasis on
resource management, which bolstered strong performance in many firms through
resource acquisition.


Strategic Planning in Higher Education. Strategic planning started to be
accepted in higher educational institutions in the 1980s. The 1983 publication of George
Keller's book Academic Strategy: The Management Revolution in American Higher
Education provides a reference point in time as to when institutions began to take a closer
look at strategic planning. Keller (1983) suggested that the following four internal and
external factors encouraged institutions to adopt planning:
1. Changing external conditions, including changing demography.
2. Demands for accountability from outside agencies.
3. Decreasing financial strength and increasing operational costs, coupled with
diminishing financial support from the government.
4. Progress in information technology.
The trends that Keller outlined continue today. The drive to measure institutional
efficiency and performance has gained dramatic momentum in response to national
economic needs and new governmental demands (Alexander, 2000; Epper & Russell,
1996; Ewell, 2004; McLeod & Cotton, 1998; Nunez, 2003). Epper and Russell (1996)
reviewed 20 years' worth of data from state coordinating and governing boards of higher education. Within the state appropriations to higher education, their review found the emerging issues of institutional effectiveness and accountability (Epper & Russell, 1996).
The adoption of the idea of strategic planning did not become commonplace in
higher education institutions until the late 1990s. Most of those same institutions,
however, failed to follow through with the whole cycle, and therefore missed many of its
potential benefits. Instead, they ended up with a large report gathering dust on the shelf, viewed as yet another management fad that created a lot of busy work. Those familiar with that type of strategic planning experience (as well as some others) have been understandably skeptical of the effectiveness of strategic planning. Nevertheless, it is still seen as necessary and desirable, especially in the context of today's financial climate
(Birnbaum, 2000; Mintzberg, 1994; Schmidtlein, 1990).
The recent performance-based accountability movement is based on the
perception that the more traditional measures of institutional performance, such as peer
review and market choice, are not adequate indicators of institutional value. Public
expenditures for higher education have become a way for governing bodies to legitimize
performance-based funding. The accountability movement has also become a convenient
means for comparing and ranking institutions against one another. "League tables, scorecards, national rankings, and report cards are some of the popular devices developed by governing officials to compare institutional performance measurements" (Alexander, 2000, p. 419). Strategic planning is thus one possible response to the recent
movement toward accountability in government, as well as to the nations general
economic condition. Strategic planning addresses issues that are in current policy
debates: quality measurements, institutional comparisons, and reallocation strategies
(Alexander, 2000).
The decline in state and federal funding for public institutions as well as the
increasing costs of technology, student services, and institutional financial aid have all
created budget shortfalls. Both private and public institutions have traditionally relied on
financial support from federal and state governments. Due to decreases in these
governmental sources of funding, increased reliance on revenue generated from tuition
and fees and endowment income is fast becoming the norm. Smith (2004) noted that
tuition has consistently risen several percentage points above the rate of inflation for
several years now. More attention is being paid to private fundraising and nontraditional
funding opportunities, such as corporate funding and research income. Furthermore,
strategic planning has been used increasingly as a tool to assist institutions in becoming
more efficient in the setting of budget priorities and the expenditure of scarce resources.
It has even been advocated as necessary in order to ensure institutional survival and
vitality (Schmidtlein, 1990).
Currently, strategic planning is the most common tool used among higher
education institutions to identify and respond to their external environments, to clarify
institutional strengths and weaknesses, and to develop action plans to achieve defined
objectives (Kotler & Murphy, 1981; Nunez, 2003; Shirley, 1988; Winstead & Ruff,
The benefits and importance of strategic planning are reflected in its prominent
place in the current literature on higher education. The ability to align the organization
with its environment is one of its most basic strengths (Nunez, 2003; Shirley, 1988).
Strategic plans, correctly done, help direct an institution to develop a course of action that
provides a roadmap to success. Strategic planning "can help facilitate communication and participation, accommodate divergent interests and values, foster wise and reasonably analytic decision making, and promote successful implementation" (Nunez, 2003, p. 4). As Dooris (2002-2003, p. 27) eloquently states, the challenge is "to use planning well and wisely."
Strategic planning is most effective when implemented as an integral part of
institutional decision-making, not as a separate formal, comprehensive activity. Creating
a plan that is flexible and individually designed helps institutions achieve many benefits:
the creation of a vision, programs, services, and the acquisition of greater financial
strength. Strategic plans also provide important process benefits: improved institutional
cultures or climates, better communication among units, and an increased understanding
and consensus (Schmidtlein, 1990). Evaluating the results of strategic plan
implementation can demonstrate how well the strategic plan is fulfilling its stated
purpose. The results from the assessment of the strategic plan can empower and inform
the next round of strategic planning and the setting of future priorities. The cyclical
sequence of the strategic planning process can thus help an institution reach its full
potential.





Using a Logic Model as a Template for the Evaluation
An effective program evaluation does more than collect, analyze, and provide
data; it makes it possible to continually learn about and improve the program. A logic
model is a common evaluation tool that facilitates effective program planning,
implementation, and evaluation, and is often seen as a diagram that shows the major
components of a program, project, plan, or initiative (W.K. Kellogg Foundation, 2004).
Using a logic model produces an inventory of the resources and activities, shows how
these lead to the desired results, and serves as a method for assessment (W.K. Kellogg
Foundation, 2004).
This study uses a logic model to identify the various sections of an institution's strategic plan and to discern whether the institution's evaluation methodology has covered all of the components within a logic model. Since a logic model monitors an entire program's process, the model serves as a template for this study as it reviews whether the participating institutions assessed all aspects of their respective strategic plans. A more detailed description of the logic model is provided in the literature review in chapter two. The model was used to frame the evaluation questions used in this study, which are outlined in chapter three.
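To make the template idea concrete, the following minimal sketch (in Python, with invented plan entries; it is an illustration of the idea, not the study's actual instrument) represents the basic logic model chain of resources, activities, outputs, and outcomes, and flags any components that an institution's evaluation methodology has not covered:

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    # The four-component chain of a basic logic model; each entry is a
    # short description of one element of the strategic plan.
    resources: list = field(default_factory=list)
    activities: list = field(default_factory=list)
    outputs: list = field(default_factory=list)
    outcomes: list = field(default_factory=list)

    def uncovered_components(self, evaluated):
        """Return the component names an evaluation did not address."""
        all_components = {"resources", "activities", "outputs", "outcomes"}
        return sorted(all_components - set(evaluated))

# Hypothetical plan content, for illustration only.
plan = LogicModel(
    resources=["strategic plan funding"],
    activities=["senior faculty hiring initiative"],
    outputs=["new senior hires per year"],
    outcomes=["a more diverse university community"],
)

# An evaluation that examined only activities and outcomes leaves two
# logic model components unassessed.
print(plan.uncovered_components({"activities", "outcomes"}))
# -> ['outputs', 'resources']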


The Research Study
This study looked at strategic planning at a macro level, as a campus-wide
initiative. Rather than focusing on planning activities (as most of the research does) this
study reviewed how institutions assessed their strategic plans. In doing so, it simply
focused on the methodology of various approaches, and did not determine whether those
strategic plans were effective or not. It paid particular attention to the methods used in
collecting data and evidence during the final stage of strategic planning, the evaluation
phase.
The purpose of this research study was to examine the methodologies employed
by public, research-intensive higher education institutions in the evaluation of their strategic plan initiatives, the results of which provide information about the various
methods of assessment of strategic plans in academia and a methodological model for
practitioners on evaluating their strategic plans.


Statement of the Research Problem. This study attempted to answer two
general questions:
1. What, if any, methodological processes do research universities use to
evaluate their strategic planning process? That general question incorporates
several related questions:
a. What are their methods of data collection and analysis?
b. How is the evaluation process implemented?
c. Who are the main participants?
2. How does the strategic plan evaluation methodology identify the extent to
which the targeted strategic plan funding sources match with the strategic plan
goals?
a. Does it identify the sources of the funding received?
b. Did the allocation of the strategic plan funds received match the originally
planned strategic plan goals and funding sources? (If it is determined that they
did not match, what was the difference and why did it happen?)


Definitions of Key Terms and Variables
The rhetoric used in describing strategic planning can be not only confusing but also ambiguous. The vocabulary may be the same, but meanings may differ between
studies, institutions, or people. Below are commonly used words and the specific
definitions used throughout this study. They are included not only because some terms
may be unfamiliar to the reader, but also because certain key terms may have multiple
meanings.
ASSESSMENT: the process of providing credible evidence of the outcomes of higher
education, undertaken for the purpose of improving programs and services within an
institution. For the purpose of this study, the term evaluation will share the same
meaning.
BENCHMARKS: external measures that may be comparisons of fundamental data items
among similar institutions.
EVALUATION: the process of providing credible evidence of outcomes for the purpose
of improvement. For the purpose of this study, this term will be used interchangeably
with assessment.
INSTITUTIONAL EFFECTIVENESS: the degree to which assessment and evaluation
findings from various academic and administrative units can be shown to demonstrate accountability and be used to evaluate and improve the quality of services, and to determine whether or not the impact of the units is consistent with the institution's mission and vision.
METRICS: internal measures of fundamental data items that may provide the means for
trend analysis.
PERFORMANCE INDICATORS: internal or external measures on which to base
reporting, funding, and/or budgeting decisions. May also be labeled metrics or
benchmarks.
STAKEHOLDER: a person or group with a direct interest, involvement, or investment in
the higher education institution.
STRATEGIC PLANNING: planning for future directions based on environmental
scanning, performance indicators, and input from stakeholders. It is most often used to
articulate the institution's mission and vision and may be one element used in
determining overall institutional effectiveness.


Overview of the Methodology
A case study design was selected for this research. The study followed a multiple
case design, meaning that the same study contains more than a single case. A multiple
case design provides for the replication of cases and is therefore often considered more
compelling and regarded as more robust (McLaughlin, McLaughlin, & Muffo, 2001;
Herriott & Firestone, 1983; Yin, 1994, p. 45). The multiple cases were higher education institutions selected on the basis of two fundamental criteria: location within the United States and affiliation with the Association of American Universities (AAU). Furthermore, the
selected institutions had a strategic plan initiative; were at least in the initial stage of its
evaluation; were public institutions; and were willing to participate in this
study.
Within the multiple cases, the data collection used document analysis, a survey,
and interviews. The document analysis focused on documents relating to the evaluation
of each institution's strategic plan. Staff or administrators in areas that are significantly involved in their institution's strategic plan initiative, together with the participants engaged in the assessment of the institution's strategic plan, were surveyed about the methodology used. The interview sample for this case study consisted of key staff or administrators who were involved with the strategic plan evaluations. By using multiple sources of data, this study
followed triangulation, a basic principle for the collection of case study evidence.
Triangulation enhances the validity of the data collection process by using multiple
sources of data to allow the researcher to develop multiple measures of the same
phenomenon. Quantitative and some qualitative data analysis techniques were used to
analyze the results obtained through the data collection instruments. A detailed
description of the methodology is presented in chapter three.
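As a concrete illustration of the triangulation principle, the brief sketch below (hypothetical records; the source labels mirror the study's three instruments, but the evidence items themselves are invented) treats a logic model component as corroborated only when evidence about it comes from more than one independent source:

from collections import defaultdict

# Invented evidence items, each coded to a logic model component.
evidence = [
    {"source": "document",  "component": "resources", "note": "budget allocation memo"},
    {"source": "survey",    "component": "resources", "note": "funding adequacy ratings"},
    {"source": "interview", "component": "resources", "note": "planner on reallocations"},
    {"source": "interview", "component": "outcomes",  "note": "provost on diversity gains"},
]

# Group the independent sources observed for each component.
sources_by_component = defaultdict(set)
for item in evidence:
    sources_by_component[item["component"]].add(item["source"])

# A component supported by a single source is flagged for further data
# collection; multiple sources yield a triangulated measure.
for component, sources in sorted(sources_by_component.items()):
    status = "triangulated" if len(sources) > 1 else "single-source"
    print(f"{component}: {status} ({', '.join(sorted(sources))})")
# -> outcomes: single-source (interview)
#    resources: triangulated (document, interview, survey)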


Purpose and Significance of the Study
The purpose of this research was to study methodologies used by higher education
institutions to evaluate their strategic plans. As discussed in chapter two, relatively little
has been published on evaluation models for assessing the effectiveness of strategic
planning in a higher education institution. Instead, there is only limited anecdotal
evidence and intermittent, vague references to the necessity of assessing a strategic plan.
Indeed, there is insufficient information about assessment or evaluation (beyond
anecdotes) even within the available higher education strategic planning literature
(Birnbaum, 2000; Cope, 1981; Chaffee, 1985; Keller, 1997). There is widespread
agreement that the literature is nearly nonexistent on empirical findings about evaluating
higher education strategic planning (Birnbaum, 2000; Schmidtlein & Milton, 1988-1989;
Keller, 1997; Nunez, 2004; Paris, Ronca, & Stransky, 2005).
In addressing this dilemma, this study is significant for several reasons. First, the
results of this study may lead to an improved understanding of evaluation methodologies
applied to strategic planning initiatives in higher educational institutions. This study
addresses the current lack of research in this area and provides empirical evidence
regarding the assessment of strategic plans. Strategic planning in higher education is
sufficiently new and unique that the study seems likely to advance knowledge of the
strategic planning field. Little empirical evidence exists to support or refute strategic
planning's usefulness in higher education (Schmidtlein & Milton, 1988-1989). This study
adds to the existing knowledge by investigating the concept of strategic planning
assessment within higher education. Therefore, it should add significantly to the body of
literature in the content field.
Second, this study will result in more than an untested theory; it examines
examples of evaluation methodologies currently in practice. Although higher education
literature identifies strategic planning as a crucial activity, evaluation or assessment
methods are still evolving and a definitive model has yet to be established. This study,
then, provides real-life examples that are being tested and used, thereby providing a descriptive framework of the selected institutions' evaluation methodologies.
Third, the study provides insight for academic practitioners into the practical
implications of evaluating their strategic plans. In the current environment of
accountability, institutions must be able to demonstrate the effectiveness and/or success
they have achieved by implementing their strategic plans. Awareness of possible
outcomes of evaluating a strategic plan can inform the next cycle of strategic planning
and the setting of future goals. Themes or trends identified here lead to the development of a new conceptual model for evaluating a strategic plan, defined in chapter four. It provides
an empirical base for future planners to modify and improve their planning initiatives.
The strategic planning literature, reviewed in chapter two, highlights the lack of
existing knowledge about the evaluation of a strategic plan. Adding to that body of literature by providing knowledge of the necessity and types of evaluation methodologies will therefore provide valuable insights for higher educational leaders, strategic planners, and
stakeholders.


Limitations of the Study
There are certain limitations to this study. First, the institutional sites and
participants are from public research higher education institutions. Therefore, the results
from this study may not be generalizable to non-research or private institutions.
Moreover, the scarcity of similar studies limits the ability to compare the findings of this study with previous studies.


Second, there exist within strategic planning itself numerous definitions and interpretations. Therefore, the participants may have various understandings of strategic planning and evaluation methods. In addition, participants from one institution may have very different perceptions of strategic planning than those from another, thereby affecting their interpretations.


Summary
This introductory chapter has provided the basis for the research study. It has
included an overview of the research problem, the research study, definitions of key
terms, a brief description of methodology, and a statement about the significance and
limitations of the study. The following chapter reviews the literature on strategic planning
within higher education. The chapter also examines literature pertaining to higher
education assessment and the evaluation of strategic plans.





CHAPTER II: LITERATURE REVIEW



Much has been written about strategic planning. Indeed, it has been the subject of
myriad journal articles, conference papers, and other scholarly works (Schmidtlein,
1990). But even though strategic planning is increasingly common in higher education
today, empirical evidence of successful implementations, outputs, or results of strategic
planning is hard to find, not least because most of the literature focuses on the planning
and implementation phases. Strategic planning is frequently used in higher education
institutions in order to address the current environment of oversight and accountability.
New and innovative ways of reacting to these oversight and accountability demands are
critical for the long-term success of institutions. Though many institutions have employed
various methods of strategic planning, it is nevertheless true that as demands increase, so
too does the need to improve these methods in order to employ processes that are proper,
effective, and proven (Nunez, 2003).
This chapter discusses the literature pertaining to both strategic planning within
higher education, and assessment of comprehensive higher education programs or
processes. More specifically, it addresses the bodies of literature relevant to this study
which proposes to examine the evaluation methodology of strategic plan initiatives at
public research higher education institutions. The review of the literature has three
objectives:
1. To review the literature on strategic planning and how higher education has
utilized the strategic planning concept. (Literature Review on Strategic
Planning)
2. To review the literature on the purposes of assessment, the primary audience
for assessment results, and the models for assessment. (Literature Review of
Assessment)
3. To review the literature on research methodologies used to assess strategic
planning in higher education. (Literature Review on the Methods of Assessing
a Strategic Plan)
The chapter begins with a brief description of the search process utilized to find
the relevant works to be reviewed.


Search Process
The literature search entailed a thorough investigation using a multitude of
resources. After searching through the Purdue University catalog, certain databases
pertaining to higher education were explored: ArticleFirst (articles in Business,
Humanities, Medicine, Popular Culture, Science, Social Science, Technology); Education
Resource Information Center (ERIC - all aspects of education research and practice); and
WorldCat (a catalog of books and other materials in libraries worldwide). Relevant
dissertations were reviewed through the Dissertation Abstract Index (an index of theses
and dissertations from American and international universities). Key journals were also
scrutinized, including The Chronicle of Higher Education, Research in Higher
Education, New Directions for Institutional Research, and New Directions for Higher
Education. When searching, multiple descriptors were used (e.g., higher education,
institutional research, strategic planning, assessment, evaluation).
In addition, several associations' websites were examined for references and
resources. These associations are listed below:
American Association for Higher Education (AAHE)
American Association of State Colleges and Universities (AASCU)
Association of American Universities Data Exchange (AAUDE)
Association of Institutional Research (AIR)
Association for the Study of Higher Education (ASHE)
Council for Higher Education Accreditation (CHEA)
Center for Higher Education Policy Studies (CHEPS)
Education Commission of the States (ECS)
National Association of College and University Business Officers (NACUBO)
National Association of State Universities and Land-Grant Colleges (NASULGC)
National Education Association (NEA) Higher Education
National Center for Higher Education Management Systems (NCHEMS)
National Consortium for Continuous Improvement in Higher Education
(NCCI)
Society of College and University Planning (SCUP)
State Higher Educational Executive Officers (SHEEO)
Western Interstate Commission for Higher Education (WICHE)


Materials and papers presented at national conferences of these associations were
drawn upon as well. For example, the AIR annual forums, the 29th Annual AAHE conference, and the 2004 Assessment Institute were reviewed.
The most important resources found were in the bibliographies of relevant
articles, documents, and books. Results from the electronic searches of ERIC, together
with the bibliographical review of the relevant resources, provided the greatest amount of
valuable literature, even if it was less than anticipated.
In carrying out the literature review it was discovered that both assessment and
strategic planning entered into higher education about twenty years ago. However, both
are rapidly evolving concepts, constantly changing and adapting to prevailing trends in
higher education. The most recent documents therefore often proved to be among the
most valuable (if also at times obscure) sources.


Literature Review on Strategic Planning
Strategic planning is an instrument that sets priorities, designs action steps, and
assesses progress. What constitutes strategic planning is best described through phases, or
the series of steps that assist an organization in understanding what it is, and in
developing future directions. However, the majority of strategic planning literature
focuses on its proper implementation. Kotler and Murphy (1981, p. 471) defined strategic
planning as "the process of developing and maintaining a strategic fit between the organization and its changing marketing opportunities." Rowley, Lujan, and Dolence
(1997) noted that a key purpose of strategic planning is to enable an organization to align
itself most effectively with its external environment. Keller (1983) and Winstead and
Ruff (1986) added that strategic planning is a continuous systematic external look that
focuses on organizational strengths and weaknesses. Expanding on the definition, Shirley
(1988, p. 5) states that strategic planning articulates institutional mission and vision.
Nunez (2003, p. 3) summarizes strategic planning as "a disciplined process intended to examine internal strengths and weaknesses, evaluate the external environment and position the institution accordingly, produce decisions and actions, and help the institution to maximize its strengths, minimize its weaknesses, and capitalize upon opportunities."
Strategic planning clarifies institutional purpose and direction by creating goals
and strategies for achievement through a vision that unites the institution's mission, target
audiences, and outcomes. The key components of strategic planning follow this basic
hierarchical outline:
Mission: a concise statement that clarifies the institution's unique purpose
Vision: a clear and persuasive statement of future aspirations
Goals: characteristics needed in order to fulfill the mission and realize the vision
Strategies/Objectives: specific action-oriented strategies
Measurements: data to measure achievement and provide accountability.
There are various models, components, or steps in developing a strategic plan
(Boyd & Reuning-Elliot, 1998; Glaister & Falshaw, 1999; Kaplan & Norton, 1996;
Kotler & Murphy, 1981; Rowley, Lujan, & Dolence, 1997), but for the purpose of this
study it is defined in four basic phases:
Phase I: planning (environmental assessment, clarification of the institutional mission, vision, goals, strategies/objectives, and measurements)
Phase II: resource development (internal assessment)
Phase III: implementation (coordinated unit plans and data analysis)
Phase IV: assessment (outcomes, priorities, and identifying future strategic directions).


Phase I: Planning. The strategic planning model is best known for its planning
phase. Much has been written on this phase of the model (Howell, 2000; Keller, 1983;
Kotler & Murphy, 1981; Rowley, Lujan, & Dolence, 1997). The planning phase can be
difficult for some to understand due to the inherent differences between traditional
planning models and strategic planning. Rowley, Lujan, and Dolence (1997) have
outlined key distinctions between traditional planning and strategic planning, drawing
special attention to the important difference of focus. Strategic planning is externally
focused whereas the traditional approach has an internal emphasis. Other differences
include:
1. Alignment - traditional planning sets goals and objectives to achieve these
goals, whereas strategic planning aligns the organization with its environment;
2. Specificity versus direction - strategic planning is less specific and puts
emphasis on broad directions rather than exact targets;
3. Focus - traditional planning is parochial and gives attention to units within an
organization, whereas strategic planning keeps its focus on the organization as
a whole; and
4. Time-relatedness - strategic planning is an ongoing process, whereas
traditional approaches are episodic (Rowley, Lujan, & Dolence, 1997).
One difficulty in adopting strategic planning is that institutions often interpret
strategic planning as one uniformly defined planning process. Schmidtlein (1990)
believed that comprehensive planning approaches frequently resulted in frustration and
disappointment when formulaic plans were followed without taking into consideration
the distinctive cultures of various institutions. Institutions with successful strategic plans
have modified the processes to best fit their needs. Instead of uncritically adopting a
particular planning approach, they often engage in "a broad range of activities both in their attempts to address the potential impact of changing conditions on their mission and operations and in developing strategies to address these impacts" (Schmidtlein, 1990, p. 1). Institutions with successful plans thus based their planning approach on their unique institutional characteristics.
The traditional corporate version of strategic planning can be modified to fit the
distinctive power structures of academe. While academic issues usually lie within the
purview of the faculty, administrators have responsibility for the budget, for complying
with legal and managerial requirements, and for dealing with external demands.
Consequently, implementing academic decisions, especially within the strategic planning
process, requires consultation and consensus among faculty and administrators. Faculty
involvement and leadership in planning that deals with academic issues is clearly an
important and critical element, since academic matters are unlikely to be implemented if
they lack faculty support. Given the central role faculty play in making academic
decisions, fostering faculty initiative is an important condition for successful planning
(Schmidtlein, 1990).
Rowley, Lujan, and Dolence (1997) suggest that when an institution begins to
plan strategically, a SWOT analysis should be undertaken. A SWOT analysis involves
reviewing an organization's internal Strengths and Weaknesses and external Opportunities and Threats. The analysis of external opportunities and threats is essential to understanding the local culture (i.e., the immediate educational, economic, demographic, physical, political, social, and technological environment). The internal
strengths and weaknesses should include an assessment of campus roles, values, cultures,
mission, and vision. The vision of the institution centers on its role within the higher
educational system and society. The institution's mission communicates its general
character, defines appropriate programs and activities, and provides future strategic
direction (Schmidtlein, 1990). Also during this phase, Schmidtlein recommends that a strategic planning committee be appointed to develop the planning document, the time lines, and the roles of the various stakeholders, and to set criteria for evaluating the plan. As
all of this requires information to be shared among the campus community, the
importance of communication throughout the planning process can hardly be
overemphasized.
It is clear that strategic plans often fail to move beyond the planning phase. Most
failures stem from using comprehensive processes unsuited for the institution or from a
lack of internal political support (Schmidtlein, 1990; Taylor & Karr, 1999). Part of the
problem with planning can be the size and complexity of the process: the overlapping committees, the paperwork, and the process's slow pace (Dooris, Kelley, & Trainer, 2004;
Taylor & Karr, 1999). Therefore, in order to accomplish strategic planning, it is
important to move beyond planning and determine how to measure progress.


Components of a Strategic Plan. To support the vision and mission, measurable
goals and strategies must be well defined. Goals provide the overarching aim. Strategies
or objectives are the assessable statements supporting the goal, clearly defining what it is
the institution wants to accomplish. In order to determine success or failure, the
objectives should be clearly stated (Cordeiro & Vaidya, 2002, p.30). Measurements
provide direction (Rieley, 1997a; Atkinson, Waterhouse, & Wells, 1997); as such, they
articulate the strategic plans goals and strategies/objectives. (These may also be referred
to as metrics, benchmarks, or key performance indicators.) Measurements allow an
institution to track how well it is actually doing in its strategic plan effort. Measurements
enable an institution to better understand what creates success, and to effectively manage
processes to achieve that success (Atkinson, Waterhouse, & Wells, 1997). Measurements
are internally focused measures that monitor the health, effectiveness, and efficiency of
the organization consistent with the goals and objectives established by the strategic plan
(Cordeiro & Vaidya, 2002; Rowley, Lujan, & Dolence, 1997). Ultimately, the
measurements relate back to the institution's mission.
While each college or university determines its own measurements, there are
consistent types of data that can be used across the board. The general categories and
measures used on many campuses divide the data into input measures and outcome
measures. Subcategories of information regarding students, faculty, staff, learning, and
university resources are found under these input and outcome headings. Typical
measurements include admission standards, enrollment, financial aid, retention of
students, graduation rates, degrees awarded, faculty productivity, research initiatives,
alumni and community relations. Input measures usually focus on the characteristics of
the students and faculty, physical plant, and the evaluation of resource allocations.
Outcome measures are typically focused on student outcomes, teaching, and faculty
research. It is important to note that certain critical elements are sometimes lacking, such
as information on student learning assessment practices, technology in the classroom,
interdisciplinary programs, administrative operational efficiency, and the impact of
university or faculty research (Ray, 1998). Often this is due to the limited availability of
data to track those particular initiatives.
These measurements are used to provide a baseline of key data points, as well as
to act as markers of improvement during the life of the plan. Often the term
"benchmarks" is used to describe measurements that are comparisons of fundamental
data items among peer institutions. Using the peer institutions, the benchmarks act as a
"reference point for best practices" (Anderes, 1999, p. 119). Generally, the peer institutions
listed are considered aspirational, possessing various qualities or strengths not
presently found at the institution developing the strategic plan. The measurements can be
qualitative or quantitative, and play a major role in the strategic plan.
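As a purely illustrative sketch, the comparison behind a benchmark can be made concrete in a few lines of Python. Everything here is hypothetical: the peer names, the chosen data item (six-year graduation rate), and the numbers are invented for illustration, not drawn from any institution in the literature reviewed.

```python
from statistics import median

# Hypothetical peer values for one fundamental data item:
# the six-year graduation rate. All names and figures are invented.
peer_graduation_rates = {
    "Peer A": 0.82,
    "Peer B": 0.78,
    "Peer C": 0.88,  # an aspirational peer
    "Peer D": 0.75,
}

our_rate = 0.73  # the institution developing the strategic plan

# The benchmark acts as a reference point; here, the peer median.
benchmark = median(peer_graduation_rates.values())
gap = benchmark - our_rate

print(f"Peer median graduation rate: {benchmark:.0%}")
print(f"Our rate: {our_rate:.0%} (gap to benchmark: {gap:+.0%})")
```

The same pattern extends to any quantitative measurement; qualitative measurements, by contrast, require narrative comparison rather than arithmetic.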
The following hypothetical example may serve to illustrate a strategic plan's
key components:

Mission: The core elements of our institution are: becoming a national leader in the
quality of our academic programs; being recognized for the quality of the
learning experience for our students; creating an environment that respects
and nurtures all members of the community; and expanding the land-grant
mission to meet 21st-century needs.
Vision: To be one of the nation's best universities in integrating teaching, research,
and service.
Goals: Create a diverse university community.
Strategies: Increase the number of women and minority faculty at a senior level each
year for five years through hiring assistance initiatives.
Measure: The annual number of full-time women and minority faculty for each year of
the strategic plan initiative.

Figure 2.1: Strategic Planning Key Components
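The hierarchy illustrated in Figure 2.1 can also be expressed as a simple nested data structure. The Python sketch below is one minimal way to encode it, assuming nothing beyond the figure itself; the class and field names are illustrative shorthand, not a standard planning schema.

```python
from dataclasses import dataclass, field

@dataclass
class Strategy:
    description: str  # assessable statement supporting the goal
    measure: str      # the metric tracked for each year of the plan

@dataclass
class Goal:
    aim: str  # the overarching aim
    strategies: list[Strategy] = field(default_factory=list)

@dataclass
class StrategicPlan:
    mission: str
    vision: str
    goals: list[Goal] = field(default_factory=list)

# Populated (abridged) with the hypothetical example from Figure 2.1.
plan = StrategicPlan(
    mission="Expand the land-grant mission to meet 21st-century needs.",
    vision="Be one of the nation's best universities in integrating "
           "teaching, research, and service.",
    goals=[Goal(
        aim="Create a diverse university community.",
        strategies=[Strategy(
            description="Increase the number of women and minority faculty "
                        "at a senior level each year for five years.",
            measure="Annual number of full-time women and minority faculty.",
        )],
    )],
)

print(plan.goals[0].strategies[0].measure)
```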



Phase II: Resource Development. Strategic planning articulates the institution's
mission. The consensus on a mission creates a set of shared assumptions about the future
that serves as the basis for budget decisions. The strategic plan is a comprehensive
document, and thus not only provides future directions, but also encompasses current
practices in the total university. Because the budget is a collective document as well, it
should be synchronized with the strategic plan.
A significant part of the strategic planning process requires that resources be
identified in the campus budget to implement the plan (Cordeiro & Vaidya, 2002;
Dickmeyer, 2004; Prager, McCarthy & Sealy, 2002; Rieley, 1997a). It is at this point that
many institutions, while embracing the basic connection between the strategic plan and
the budget, nevertheless fail to adequately coordinate the two. Often budgetary decisions
"are administratively expedient under the pressure of budget requirements, but not
necessarily cognizant of priorities established by institutional planning" (Anderes, 1996,
p. 129). Institutions clearly need to consider the programs and resources that will be
required in order to move in desired future directions (Schmidtlein, 1990). As objectives
are developed, resources should be assigned to accomplish each objective. A plan that
is not connected directly to the budget process could become a disjointed, frustrating
effort with minimal success and no long-term gains (Rieley, 1997a; Taylor & Karr,
1999).
Anderes (1996) provides many reasons to link the two. First and foremost,
budgets can act as basic extensions of planned priorities, implemented within the general
intent of the strategic plan. The budget can likewise legitimize the plan. Often the success
of a plan is measured by whether or not its objectives and priorities have been included in
the budget development process, and whether or not they received adequate funding. By
coordinating the plan with the budget, strategic planning priorities are identified and
evidence of achievement can be established, thereby reducing the likelihood of decisions
being made outside the strategic plans priorities or simply to satisfy stakeholder
expectations. Most of all, it provides greater confidence in the institutional direction
when a coherent strategic plan supports the budget (Anderes, 1996).
Anderes (1996) recommends that the integration of the budget with the strategic
plan begin in the earliest stages of the process. Other key conditions for effective
implementation of the two include active leadership, broad participation by key internal
and external stakeholders, a clear intention to integrate planning priorities into budget
development and allocations of funding, and the understanding of the need for
assessment, which includes the tracking of funding (Anderes, 1996).
Anderes (1996) suggests that when the budget is coordinated with the strategic
plan, all units are able to justify their budget requests so that they relate to strategic plan
initiatives. The credibility of a budget request can be judged on how well it represents
strategic planning priorities. Planning priorities are legitimized by the allocation of
resources and their feasibility is usually based on their ability to garner funding. The
clearest indicator in determining whether planning and budgeting are connected is the
degree to which planning priorities are ultimately funded.
Resource development is a significant part of the strategic planning process. It is
necessary to evaluate the resources needed for achievement, allocate funding for new
ideas, and attend to the financial implications. Coordination between the budget and
strategic plan can help achieve the objectives, promote collaboration and partnership
among units, and provide a means of measuring progress (Anderes, 1996; Prager,
McCarthy & Sealy, 2002; Rieley, 1997a). The implementation of the strategic plan relies
upon identifying resources that can be allocated within the budget through incremental
changes.


Phase III: Implementation. Leadership is key in the implementation phase
(Keller, 1983; Nunez, 2003; Schmidtlein, 1990; Taylor & Karr, 1999). Administrative
leadership is needed to set priorities, make choices, and negotiate with external
stakeholders. It must be recognized, however, that a lack of internal political support can
easily result in a plan's failure (Schmidtlein, 1990). Faculty initiative and leadership are clearly
necessary as well. Effective communication, both within the institution and with the
community at large (the external environment), can help both insiders and outsiders
understand what the university is trying to accomplish through the new plan, and thus
foster a better understanding of the institution's core values (Taylor & Karr,
1999).
The implementation phase involves well-thought-out communication. A culture
of resource development, planning, and allocation driven by strategic plan priorities can
be adopted. Cistone and Bashford (2002) recommend that each administrative and
academic unit develop its own strategic plan in keeping with the institution's overall
mission, goals, and resources. Units should develop plans that are unique to their purpose
but compatible with the institution's. The more useful unit plans are to the units
themselves, the more useful they are to the institution as a whole (Cistone & Bashford,
2002). These plans may vary by unit throughout the institution; however, they
ultimately must align with the overall institutional mission. The idea is that the
unit strategic plans are grounded in the institution's common framework of data analysis,
the result being a coordinated development of assessment measurements for all
plans. This ensures that a cohesive and comprehensive institutional strategic plan will be
the result.


Phase IV: Assessment. The final, but often missing, phase is the assessment
analysis of the strategic plan. Too often, a heavy emphasis is placed on the beginning
phases of strategic planning, leaving inadequate attention for the evaluation of the
plan's validity. By the time an assessment is done, it is often based on wishful thinking
rather than on actual evidence of improved outcomes (Mintzberg, 1994). The assessment
can evaluate progress toward goals using appropriate measurements, which in turn will
help the institution reflect on its progress and guide it toward its future.
If integrated into the work of the institution, the evaluation is not merely an add-
on for some other purpose such as accreditation. When appropriate, current data
collection efforts and management policies may be integrated into the plan. Cistone and
Bashford (2002) propose that the assessment method be reliable, valid, and consistent
over time. It should use data that are systematically collected and easily accessible. The
data collection for the measurements will provide the key elements that support the
analysis. If the method makes sense, is reasonably easy to use, and produces results, it
will likely be useful to the institution. Once the analysis is complete, it is advisable to seek
an endorsement from the institution's top leaders, along with a disclosure of the results to
the public. By performing the evaluation, the institution will be able to correlate its
performance to its established purpose as stated in its strategic plan (Cistone & Bashford,
2002).


Literature Review of Assessment
In the quest for literature on assessment in higher education, one finds that,
depending on the author, assessment and evaluation may or may not mean the same
thing. Some distinguish between the two terms, while others do not. Generally,
assessment refers to information obtained from students and other stakeholders that may
be used to improve academic programs within the university. The problem lies with how
one defines evaluation. Some use program evaluation interchangeably with assessment,
and others view evaluation as the process to determine the achievements of staff for
awarding promotions, tenure, and merit pay (Banta, 2004, p. 1). For the sake of clarity,
this study used Trudy Banta's definition of assessment as "the process of providing
credible evidence of the outcomes of higher education that is undertaken for the purpose
of improving programs and services within an institution" (2004, p. 1). This definition
goes beyond the common belief that assessment is solely concerned with student learning
outcomes, and illustrates that assessment can be done on various levels, from general
education all the way up to campus environments as a whole.
The assessment movement has been around since the early to mid-1980s. When
encountering the term "assessment," most think primarily of assessing what students have
learned. However, assessment can be used in a broader sense, from group work, to unit
review, to university-wide assessment. As Banta (2004) puts it:
Aggregating the work of all students in a classroom will provide information to
inform classroom assessment. Aggregating student work across various classes or
modules can provide assessment (evaluation) of the impact on learning of an
entire course of study. Looking at student products across the disciplines in a
college provides assessment at that level. Assessment findings from various
academic units within a university can provide a measure of institutional
effectiveness that can be used to demonstrate accountability at the state, regional,
or national level. (p. 1)
Ever-increasing attention is now being given to assessment by accreditation
associations, state agencies, and local governing boards. "Demands for objective
information regarding verification of institutional mission performance are forcing
institutions to establish new assessment systems or to refocus existing systems to respond
to these new requirements" (McLeod & Cotton, 1998, p. 39). Yet those who have been
working with assessment since the beginning think assessment should be primarily about
improving student learning and only secondarily about determining accountability
(Angelo, 1999; Ewell, 2004; Ewell, 2005; Kuh, 2005; Linn, 2000).
The assessment movement as a whole has been guided by a few core principles:
- Assessment should focus on learning outcomes that will restructure curricula and pedagogy.
- Assessment should be seen as a type of scholarship, and the results of assessment should lead to continuous improvement.
- Assessment should ultimately affect the larger process of institutional effectiveness (Ewell, 2004, p. 3).
Ewell (2004) argues that what has happened instead is that the model of
assessment has become excessively mechanical and process-centered. "We let assessment
get excessively distanced from the day-to-day business of teaching and learning... [M]any
faculty started avoiding the topic entirely... [and] assessment simultaneously became
something that external bodies started forcefully asking institutions to do" (Ewell,
2004, p. 4). The most popular approach in today's political climate is to use assessment
for accountability. Ewell (2004) believes that if assessment is used for accountability, it
could lead to abuses of standardized testing regimens and negative consequences if the
institution fails to make the grade. It could even be misused as a means of deliberate
comparison between institutions (Ewell, 2004, p. 5).


Assessment Models. Since institutions vary in size, mission, type of student,
governance, funding, and comparative quality, their general assessment plans should be
tailored to reflect their respective uniqueness. "Just as no standard assessment model will
work for every institution, no one institution should blindly adopt the assessment plan of
another" (McLeod & Cotton, 1998, p. 39). There are, however, certain key elements that
should be considered in devising an assessment plan for any institution. First and
foremost, an assessment plan is created and communicated. Effective assessment,
according to Palomba and Banta (1999), begins with an overall plan for assessment. This
would suggest where and when students will be evaluated, the type of evaluation
instruments to be used, and how to respond to the results (Banta, 2004, p. 5). As Ewell
(2002) explains, these assessment plans sometimes stem from a requirement by an
accreditor or state authority. These plans usually include an institutional statement of
principles, stated learning goals, a charge to departments to find or develop a suitable
assessment method, and a schedule for data collection and reporting. The decision to
undertake the assessment process should be given as a clear charge and communicated
from the top level. Top endorsement gives it validity and recognition as an important
function. A committee or major officer is typically assigned the responsibility of
directing the planning process (McLeod & Cotton, 1998).
The next important step is the involvement of others. Palomba and Banta (1999)
affirm that all stakeholders should be included in the assessment process. In a
comprehensive assessment program, it is important to involve as many stakeholders as
possible. Such groups include students, faculty, student affairs professionals,
administrators, graduates, and employers (Banta, 2004, p. 5). Involvement can happen
throughout the process: from setting goals and objectives for courses and curricula, to
selecting or designing assessment methods, to collecting data, to reporting and
communicating the results. Generally, when an institution plans to implement
assessment, committees composed of faculty from multiple disciplines are formed to plan
and oversee the work. These committees or task forces should be given clear but brief
action items (McLeod & Cotton, 1998; Ewell, 2002). Resource support for the committee
should be provided in the form of funds, staff, and data as well. The product of the
committee or task force should be a written operational proposal for assessment, to be
approved by the responsible committee or major officer. Once approved, implementation may
begin. A tabular or matrix format is often used to keep track of implementation and
reporting, and the assessment methods may include standardized examinations, faculty-
made tests, surveys, focus groups, portfolios, and work samples (Ewell, 2002; McLeod &
Cotton, 1998).
Another characteristic of effective assessment is the periodic evaluation of the
assessment program itself, preferably through peer review by the task force itself or by
another group. There should be a regular and cyclical evaluation of the assessment plan, a
review of its utility and effectiveness, with modifications as prompted by the findings
when necessary (McLeod & Cotton, 1998; Palomba & Banta, 1999). Successful
assessment initiatives depend upon a supportive climate with effective teamwork in
planning, implementation, and utilization of findings. Periodically, campus leaders
should follow up to make sure that the findings are being used to improve practice.




Key Elements of Assessment. While the elements of assessment described below
refer to student learning, the concepts can be applied to broader assessment issues.
Successful assessment begins with identifying the purpose and defining the mission of
the assessment process itself. The assessment activities are guided by the articulation of
certain goals and objectives (Palomba & Banta, 1999; Terenzini & Upcraft, 1996). Goals
express the intended results in general terms, whereas objectives describe the concepts in
more specific terms. Objectives specifically state what needs to be assessed, and therefore guide
what tools are most accurate and suitable for the measurement (Erwin, 1991).
The type of outcome most important to the purpose will determine the
appropriate assessment approach (Terenzini & Upcraft, 1996). For example, when
assessing student learning, there are three major categories of intended learning
outcomes: cognitive, affective, and psychomotor. Cognitive outcomes refer to thinking
skills. Most often Blooms (1956) taxonomy is used to determine cognitive ability. There
are six levels arranged in order of increasing complexity:
1. Knowledge is defined as remembering information without necessarily
understanding it and is the lowest level of cognitive outcomes. It includes
behaviors such as defining, listing, identifying, and labeling.
2. Comprehension involves understanding learned material and includes
behaviors such as generalizing, discussing, and interpreting.
3. Application refers to the ability to use learned material in new and concrete
situations. It includes behaviors such as showing, manipulating, and solving
problems.
4. Analysis is the ability to break down information into its component parts to see
interrelationships and ideas. Related behaviors include discriminating,
illustrating, and comparing.
5. Synthesis is the ability to put parts together to form a new whole. It involves
using creativity to compose or design something new.
6. Evaluation refers to the ability to judge the value of a material for a given
purpose. Behaviors related to evaluation include appraising, concluding, and
supporting. (Bloom, 1956)
Affective outcomes refer to attitudes and values and are directed toward a person,
object, place, or idea. Examples of intended affective outcomes include being sensitive to
the values of others, becoming aware of one's own talents and abilities, and developing
an appreciation for lifelong learning. Psychomotor outcomes refer to skill and
performance: skills that involve perception, dexterity, and coordination (Palomba &
Banta, 1999).
The type of desired outcomes will determine the assessment method that is most
appropriate, and the method depends on whether the data are quantitative or qualitative.
An extensive range of possible assessment techniques exists. The instrument chosen
needs to yield results that can be statistically analyzed, are valid and reliable, have norms
against which an institution's results may be compared, and are easily taken, scored, and
analyzed. If such an instrument is not available, then one must be developed (Terenzini &
Upcraft, 1996).
When considering what instrument and method to use, it is essential to consider
whether the measure is indirect or direct. Direct measures are what most people think of
with regard to assessment: assignments, exams, projects, portfolios, and papers that
enable the instructor to see what students actually know and can do. They are key to
evaluating the acquisition of knowledge and skills. However, they do not tell why certain
components of students' knowledge are strong or weak. Indirect measures enable the
instructor to appraise the process of learning through questionnaires, interviews, and
focus groups. It is important to include indirect measures, since they help us to understand
why weaknesses in learning are occurring and what might be done to address them
(Banta, 2004). Banta (2004) advocates that good assessment includes both direct and
indirect measures.
The collection of data may be done at the program, school, or campus-wide level.
Campus-wide information ordinarily needs to be enriched with program-specific
information. For program-level assessment, there should be a mix of data collection
methods focused at the programmatic and course levels. It is also possible to
have the classroom instructors utilize their assessment methods to demonstrate the way
their courses are contributing to the program. The collection of data may use a cross-
sectional approach, comparing different groups of students at a common point in time, or
tracking and comparing successive cohorts of students. Longitudinal studies or samples
may be used as well. The most important consideration is that the information be valid
and useful for decision-making and improvement (Palomba & Banta, 1999, p. 110).
Raw data are unlikely to have much impact. To make the information useful, it is
necessary to present a relevant analysis, draw comparisons, create a report, and distribute
it to the appropriate audience. When collecting campus-wide information, various
formats will appeal to various audiences. A comprehensive report of the entire project
will be necessary, but an executive summary style report that provides an overview is
also essential. Certain audiences may appreciate theme reports, such as study behaviors
or career themes. Institutional report cards may be created to include evidence about what
students know and can do in certain summary measures. Institutional report cards may
even include performance indicators. Reports may also be created for specific audiences,
such as internal committees, accrediting bodies, and state agencies. The elements of a
report include an introduction and project objectives, methodology, results, and a
conclusion and recommendations.
The most valuable aspects of assessment include the clarity it brings to goals and
objectives, the encouragement it gives to a systematic look at issues, and the emphasis it
places on improvement. "Assessment information is of little use if it is not shared with
the appropriate audiences and used in meaningful ways" (Palomba & Banta, 1999, p.
297). The audiences may be internal, e.g., faculty, committees, or administrators.
audience may include external bodies such as regional accreditors, professional
accreditors, and state commissions or governments. The usefulness of the assessment is
tied directly to the purpose of the study. Thus if the appropriate groundwork is laid, the
results can be utilized with significant effect.
The elements of assessment used as an example above were written for student
learning, but assessment can be done on various levels and areas. Assessment can be
done on general education, on a campus environment as a whole, or on other specific
topics within smaller units. Campus services, for example, can be evaluated. In
Upcraft and Schuh's (1996) book Assessment in Student Affairs, the authors provide
several reasons why those who provide services may want to assess their units. It can
answer the questions of whether the unit is providing high-quality service, meeting its
goals and objectives, and operating cost-effectively and affordably, and whether the service
is reaching the appropriate audiences. However, no methods of evaluation or assessment
for strategic planning were found anywhere in the literature. The following sections will
provide an examination of what was presented in the literature.


Literature Review on the Methods of Assessing a Strategic Plan
The following section describes the results of the literature search for examples of
institutional assessment practices for strategic planning. Literature on strategic planning
within the private sector was used sparingly due to the significant organizational
differences between higher education and corporations. Not only are the power structures
distinct (as discussed in the previous section), but most corporations are primarily profit-
driven. Therefore, not only is the strategic plan implementation often less convoluted in
the latter, but the evaluation of its effectiveness is more clear-cut. Needless to say, higher
education is more complicated. To date, no clear models have been found that adequately
evaluate strategic planning in a higher education institution. Instead, the literature offers
anecdotal evidence, usually amounting to occasional brief and ambiguous references
to the necessity of assessing a strategic plan, along with information on program evaluations and
institutional effectiveness. The anecdotal evidence does not provide any sort of model,
and sporadic references to assessing a strategic plan are often vague and lack sufficient
detail. Although both program evaluations and institutional effectiveness methods could
be modified to evaluate strategic planning, they obviously fall short of being a proper
analysis. Whereas program evaluations are difficult to apply to the entire institution,
institutional effectiveness schemes are too general.
Even within the available higher education strategic plan literature, there is little
information about assessing or evaluating beyond individual case studies and promotional
pieces (Birnbaum, 2000; Chaffee, 1985; Cope, 1981; Dooris, Kelley & Trainer, 2004;
Keller, 1997; Paris, Ronca, & Stransky, 2005). The issue is that much of what has been
written is basically "prescriptive advocacy for a particular approach based on little or no
systematic analysis of actual campus-planning environments and experiences"
(Schmidtlein & Milton, 1988-1989, p. 5). There is widespread agreement that the
literature on empirical findings about successful higher education strategic planning is
nearly nonexistent (Birnbaum, 2000; Dooris, Kelley & Trainer, 2004; Keller, 1997;
Nunez, 2004; Paris, Ronca, & Stransky, 2005; Schmidtlein & Milton, 1988-1989).


Literature Review on Logic Model
Due to the limited amount of information in the literature on assessing a strategic
plan, the logic model will serve as an evaluation guide for this study. A logic model is
often described as a "road map" or a graphic framework used in planning, implementing,
and evaluating programs (Bickman, 1987; W.K. Kellogg Foundation, 2004; Wholey,
Hatry, & Newcomer, 1994). The W.K. Kellogg Foundation has applied logic modeling
to its social improvement programs, resulting in several publications that have
greatly contributed to the body of knowledge regarding logic modeling. In general, a
logic model is described there as "a systematic and visual way to present and share your
understanding of the relations among the resources you have to operate your program, the
activities you plan, and the changes or results you hope to achieve" (W.K. Kellogg
Foundation, 2004, p. 1). The basic logic model uses pictures or words to describe the
sequence of activities and how they are linked to the expected results. It especially
highlights the relationship between the planned work and the intended results (see Figure 2.2).

The components of the basic logic model are the inputs, activities, outputs,
outcomes, and the impact. The inputs are the human, financial, organizational, and
community resources invested in the program. The activities are the
tools, services, processes, technology, and actions that are an intentional part of the
program implementation. These first two components, inputs and activities, are part of
the planned work that describes the resources needed to implement the program and the
intentions of the program. The outputs are the results of the activities, and the outcomes
are the specific changes resulting from the program, including both short-term and long-term
outcomes. Short-term outcomes include specific changes in attitudes, behaviors,
knowledge, and skill, and are usually expressed at the individual level. Long-term
outcomes are the expected results by the end of a certain number of years, which
typically build on the progress expected by the short-term outcomes. The impact is the
fundamental intended or unintended change occurring in the organization, community, or
system because of the program activities. These last three components -- outputs,
outcomes, and impact -- are the intended results of the program (Bickman, 1987; W.K.
Kellogg Foundation, 2004; Wholey, Hatry, & Newcomer, 1994).

Figure 2.2: Basic Logic Model (W.K. Kellogg Foundation, 2004). [Diagram: your planned work (inputs, activities) flows into your intended results (outputs, outcomes, impact).]
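As a rough sketch only, the five components and their grouping into planned work and intended results can be written out as an ordered structure. The example entries below are invented for illustration and are not Kellogg Foundation content.

```python
# The five components of the basic logic model, in sequence (Figure 2.2).
# Entries are hypothetical illustrations of each component.
logic_model = {
    # planned work
    "inputs":     ["faculty time", "plan funding", "institutional data systems"],
    "activities": ["hiring assistance initiatives", "annual data collection"],
    # intended results
    "outputs":    ["searches supported", "reports produced"],
    "outcomes":   ["short term: more diverse applicant pools",
                   "long term: more senior women and minority faculty"],
    "impact":     ["a more diverse university community"],
}

planned_work = {k: logic_model[k] for k in ("inputs", "activities")}
intended_results = {k: logic_model[k]
                    for k in ("outputs", "outcomes", "impact")}

for component, examples in logic_model.items():
    print(f"{component:>10}: {'; '.join(examples)}")
```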
Since a logic model illustrates the purpose and content of the program, it is a
useful tool to develop evaluation questions that cover the entire process of the program.
Within a logic model evaluation, there are two different types of questions -- formative
and summative. Formative questions monitor progress toward objectives, provide
information to improve the program, and help refine the data collection and activities.
Formative information should be periodic and reported often. Summative evaluation
helps to determine if the intended goal was achieved. As such, it tends to take the form of
before-and-after snapshots reported after the conclusion of the program to document the
effectiveness and lessons learned from the experience of the program. The table below
lists some of the benefits of formative and summative evaluation questions.

Formative Evaluation - Improve:
- Provides information that helps you improve your program; generates periodic reports; information can be shared quickly.
- Focuses most on program activities, outputs, and short-term outcomes, for the purpose of monitoring progress and making mid-course corrections when needed.
- Helpful in bringing attention to suggestions for improvement.

Summative Evaluation - Prove:
- Generates information that can be used to demonstrate the results of the program to stakeholders and the community.
- Focuses most on the program's long-term outcomes and impact; although data may be collected throughout the program, the purpose is to determine the value and worth of the program based on the results.
- Helpful in describing the quality and effectiveness of your program by documenting its impact on stakeholders and the community.

Figure 2.3: Formative and Summative Evaluation (Bond, Boyd, & Montgomery, 1997;
W.K. Kellogg Foundation, 2004)



Formative questions help to improve the program, while summative questions help to
prove whether the program worked as planned. Both kinds of questions generate
information that determines the extent to which the program has had the expected result.
These types of questions are important not only for improving the program and
determining whether the program is successful, but also for illustrating the success of the
program to others (Bond, Boyd, & Montgomery, 1997; W.K. Kellogg Foundation, 2004).
The various logic model components interrelate with a broader framework of
context, implementation, and results (W.K. Kellogg Foundation, 2004). Questions that
explore the relational issues of the influences and resources -- such as the economic,
social, or political environment of the community -- are questions that are contextual in
nature, and fit within the input component of a logic model. These types of questions help
explain some of the strengths and weaknesses of the program, as well as the effect of
unanticipated and external influences on it. Questions that evaluate the implementation
and the extent to which actions were executed as planned are in the activities and outputs
components of a logic model. Questions within the outcome and impact components of
the logic model are under the broader framework of results. They ask to what extent
progress is being made toward the desired outcomes. These questions try to document the
changes that occur in the community as an outcome of the program. Typically, answers to
these questions are about the effectiveness and satisfaction of the activities (W.K.
Kellogg Foundation, 2004). Figure 2.4 shows a diagram of how the logic model
interrelates with this broader framework, as well as with the formative and summative
evaluation questions.
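Read as a lookup table, this framework pairs each logic model component with its broader frame and with the type of evaluation question usually asked there. The sketch below encodes that reading; the sample questions are paraphrases of the discussion above, not items quoted from the Kellogg materials, and short-term outcomes would in practice also draw formative questions.

```python
# component -> (broader frame, usual question type, sample question)
evaluation_questions = {
    "inputs":     ("context", "formative",
                   "What economic, social, or political influences "
                   "shaped the program?"),
    "activities": ("implementation", "formative",
                   "Were the planned actions executed as intended?"),
    "outputs":    ("implementation", "formative",
                   "To what extent were the planned products delivered?"),
    "outcomes":   ("results", "summative",
                   "What progress is being made toward the desired outcomes?"),
    "impact":     ("results", "summative",
                   "What changes occurred in the community "
                   "because of the program?"),
}

for component, (frame, qtype, question) in evaluation_questions.items():
    print(f"{component:<10} [{frame}/{qtype}] {question}")
```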




Case Studies. Some institutions have published their own case studies about
evaluating their strategic plans. Most offer a short description of their annual reviews,
evaluations, or audits of strategic planning. Rieley (1997b) suggests that the audit phase
of strategic planning should use quality improvement tools to see if the needs of the
stakeholders were met in the prescribed timeline and to their satisfaction, and then to put
all of this information in an "effectiveness matrix" (Rieley, 1997b, p. 180). However,
Rieley fails to describe this effectiveness matrix, an omission all too typical in the
literature. Many case studies do not give detailed descriptions of their evaluation
approaches, preferring rather to describe their experiences in short passages. The case
studies below offer examples of strategic planning at private, public, small, and large
institutions. Some have recently adopted strategic planning, and others have been
perfecting it over a number of years.
For example, Pennsylvania State University (PSU) has twenty years of experience
with strategic planning, yet its own self-studies have systematically emphasized the
implementation process, thus providing little information on assessment. It should be
noted, however, that the university's strategic plan has evolved over the years to
eventually include tracking performance indicators as a way to evaluate its progress.
Dooris (2002-2003) has shown that PSU initiated its strategic planning process in 1983 as
a one-size-fits-all program with an annual planning and budget cycle. In the 1990s it then
incorporated continuous quality improvement (CQI) principles for benchmarking
critical processes. PSU also incorporated a system of budget recycling that provided both
the incentives and the means to reallocate resources from administrative activities to academic
functions. In 1997, assessment was deliberately incorporated into the plan by merging the
leadership and support responsibilities for assessment activities into a single
administrative center. PSU thereby implemented performance measures to help achieve
the goals of the plan and in 1998 began to produce an annual report of university-level
performance indicators. It also required all planning units to incorporate a similar
approach in their unit-level plans. Strategic indicators are then tracked to chart
progress toward meeting the overarching institutional goals (Dooris, 2002-2003).
Planning units define and compile their own strategic indicators, which are tied to
university and unit goals. Some examples of the university's planning indicators are the
average combined SAT scores of first-year students, graduation rates, research
expenditures, program rankings, and the number of faculty selected as fellows in professional
organizations (Dooris, 2002-2003, p. 30). In an earlier article about their experience
at PSU, Dooris and Lozier (1990) argue that strategic planning is not so much about a
finished product as about the process along the way. They describe it as a learning tool
whose impact occurs gradually. It is "built on awareness, on the dissemination of
information throughout the organization, on building on ideas that work, and on
discarding or modifying ideas that do not" (Dooris & Lozier, 1990, p. 17).
The PSU approach, which can be characterized as the cyclical process of
strategic planning, has also been embraced by the College of St. Scholastica (Duluth,
Minnesota). Annual reviews, updating, and revisions are an essential part of the process
and illustrate the impact of previously planned actions and changes. The St. Scholastica
planning committee reviews the five-year goals and revises them as needed, reflecting
progress made toward the goals over the past year, and develops new goals to address any
new weaknesses identified or to emphasize any new strengths (Cahoon, 1997).
The Ohio State University (OSU) offers an example of the typical trend to use
benchmarks to mark progress. Its benchmarking consists of collecting strategic
information about the institution along with comparable information for nine
benchmark universities. The benchmark universities were selected based on
organizational similarity (such as histories and resource allocation), quality, and higher
reputation rankings. OSU believes that the use of benchmarks provides an opportunity for
performance advancement in its own programs, initiatives, and policies by stimulating
thought and comparison (Ohio State University, 2004). Some of these strategic indicators
include assessment of faculty experience and student satisfaction, qualifications of
students, retention and graduation rates, diversity in demographics, and research
outcomes. These strategic indicators are then used to provide evaluation material for the
annual report and for the overall academic plan.
The University of Wisconsin-Madison has used its North Central Association
(NCA) reaccreditation study as the framework of its campus-wide strategic plan. Each of
the plans strategies has at least one point-person assigned to it to supervise its progress.
These strategies are then used to guide budget reallocations and reductions. Annual
reports from the schools, colleges, departments and administrative units are likewise
organized according to the strategies articulated in the strategic plan. Each year at the
Deans' Council retreat, the deans review progress on the strategic plan and identify issues
that require their concerted leadership (Paris, 2005). In 2004, Paris, Ronca, and Stransky
at the University of Wisconsin-Madison researched the impact of the campus-wide
strategic plan, the impact of unit plans, and the factors that most contributed to effective
implementation at their institution (Paris, 2005). Their online survey was administered to
campus leaders, including administrators, deans, directors and departmental chairs. The
majority agreed that the campus plan set clear goals, sharpened focus, and prioritized
needs. Approximately one third felt that the campus plan fostered collaboration among
groups. In those schools, colleges, administrative units or academic departments that
reported high levels of goal achievement, the approach tended to feature annual goals that
were explicit about collective intentions, measures of success, and periodic checks for
monitoring progress (Paris, 2005). Paris, Ronca, and Stransky (2005) believed that the
results of their research demonstrated that respondents who reported success take a
participatory approach to planning, and that there is a clear relationship between active
participation and a strategic plan's overall success.
Parkland College in Illinois offers another example of how an institution
evaluates its strategic plan. However, at that institution, strategic planning is a component
of institutional effectiveness. Parkland will therefore be discussed in that section.


Program Evaluations. Due to the lack of studies evaluating institutional strategic
plans, much of the literature reviewed was about program evaluation. In one such article,
Chmielewski, Casey, and McLaughlin (2001) described three methods for evaluation.
The first method utilized the model called the Balanced Scorecard, developed by
Harvard Business School professor Robert Kaplan and consultant David Norton in 1992. The
idea behind this approach is to "weigh the significance of groups of activity and
performance measures in order to achieve a specific result" (Chmielewski, Casey, &
McLaughlin, 2001, p. 6). The balanced scorecard model recommends using four sets of
measures: 1) financial measures; 2) customer satisfaction/knowledge; 3) internal business
processes; and 4) organizational learning and growth (Chmielewski, Casey, &
McLaughlin, 2001; Kaplan & Norton, 1992). However, as this model was developed for
the business sector and focuses on a specific program review, Chmielewski, Casey, and
McLaughlin (2001) found that it was not as useful or appropriate for university-wide
management (p. 7). As Stewart and Carpenter-Hubin (2000-2001) observed, the balanced
scorecard mechanism simply provides a common framework of reference in strategic
decisions and clarifies the choices and challenges involved. Overall, it provides a system
where rewards can be linked to accomplishment of objectives and resources can be more
easily allocated to the priorities of the institution (Stewart & Carpenter-Hubin, 2000-
2001).
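As a generic illustration of the four sets of measures, and not a reproduction of Kaplan and Norton's actual instrument, a university-flavored scorecard might be grouped as follows; all of the measures listed are hypothetical.

```python
# The four balanced scorecard perspectives (Kaplan & Norton, 1992),
# populated with hypothetical university-oriented measures.
balanced_scorecard = {
    "financial":                   ["cost per student credit hour",
                                    "research expenditures"],
    "customer satisfaction":       ["student satisfaction surveys",
                                    "alumni and employer feedback"],
    "internal business processes": ["time to degree",
                                    "administrative processing time"],
    "learning and growth":         ["staff development hours",
                                    "new program development"],
}

for perspective, measures in balanced_scorecard.items():
    print(f"{perspective}: {', '.join(measures)}")
```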
Using the balanced scorecard model as a starting point, Chmielewski, Casey, and
McLaughlin created their own Program Portfolio Model that contains five interrelated
components. Four components are directly related: program overview, operations and
activities, outcomes and effectiveness, and organizational capacity and growth. The fifth
component, impact/value-added, is the product of the other four components. The
program overview examines the history, mission, goals, and objectives of the program.
Operations and activities examines how well the program is achieving its goals by asking what
aspects are contributing to, or acting as barriers against, accomplishing the goals. Organizational
capacity and growth studies change and whether the program is able to grow and
develop. Outcomes and effectiveness examines the results, and the impact/value-added
aspect asks whether the program has benefited the university community. In order to
make this program portfolio review useful, Chmielewski, Casey, and McLaughlin (2001,
p.10) argued that it should incorporate some major activities that were required from
academic program review, learning outcomes, assessments, North Central Association
accreditation, and institutional planning and management. As Chmielewski, Casey, and
McLaughlin (2001) put it:
Not only would assembling a program portfolio assist departments in completing
the program review self-study, external accreditation reports, and the yearly
assessment reports, the program portfolio would also provide a way for
departments to share information with university strategic planning committees.
Thus, the program portfolio is one instrument or tool that could serve multiple
functions and simplify many of the information sharing and reporting tasks within
the university and to external accrediting agencies. (p.12)
Aguirre and Hawkins (1996) also discuss applying integrated assessment and
strategic planning processes to comply with requirements for accountability rather than
developing a new model. Evaluating a strategic plan does not necessarily mean creating a
new process, however; methods and results from other processes or projects can be
modified as a starting point for assessing a strategic plan.
The third evaluation method that Chmielewski, Casey, and McLaughlin (2001)
discussed was Freed, Klugman, and Fife's Eight Quality Principles (1997), in which
certain program traits are used as a way to grade the program. These eight qualities are:
1. Vision, mission, and outcomes driven: does the organization have a clear
sense of direction?
2. Systems dependent: are all actions part of interactive and interdependent
processes?
3. Leadership, creating a quality culture: supportive leaders who understand that
quality principles are part of the organization's culture.
4. Systematic individual development: knowledge and skills are continuously
updated through education, training, and career planning.
5. Decisions based on fact: success depends on the appropriate information being
gathered and considered.
6. Delegation of decision making: those involved in the daily operations have the
best knowledge of that operation and should be involved in making decisions.
7. Collaboration; planning for change: those involved in the organization's
outcomes should work together to define the process.
8. Leadership: supporting a quality culture, since change is inevitable.
(Chmielewski, Casey, & McLaughlin, 2001, p.20; Freed, Klugman, & Fife,
1997).
Institutions typically coordinate their program reviews. The system can be
developed in such a way that it assists with the evaluation of a strategic plan by providing
evidence of improvement, or the lack thereof. Cohen (1997) describes how one such
institution, Washburn University, has created a program review system. All units (both
academic and non-academic) undergo program review on a five-year cycle. The most
important aspect of the process and the basis of the review is the self-study prepared by
each unit. In addition to providing standard data, each units self-study must contain a
brief summarization of its mission; its goals and measurable objectives; an examination
of the programs and/or procedures of the unit; a discussion of the evaluation system being
used; and its strengths and weaknesses. For academic units, this includes budget data,
staffing, student credit hours, enrollment, the number of majors, the number of graduates,
and the cost per student credit hour. Non-academic units have customized self-studies to
fit their individual needs. At Washburn, a self-study manual is available and includes
factors of assessment, the university's mission statement, environmental scan data, tips
on how to prepare a self-study, and information on different types of outcome
assessments. The self-studies are reviewed by the area directors, and then by a
University Review Committee, which prepares a report that includes strengths and
weaknesses of the unit, suggestions for improvement, an overall program rating, and
some future budgetary recommendations. This report is shared with the area director and
the unit itself, and is ultimately submitted to the president. Each year special emphasis is
given to possible improvements, and adjustments are made with regard to previous
weaknesses. This material is then tabulated and circulated to provide a better idea of what
changes have taken place on campus (Cohen, 1997). This type of program evaluation
provides documentation of progress for strategic plan goals and objectives.
When considering program reviews, one usually thinks of academic areas
focusing on their institution's mission of instruction, research, and/or service. Yet
program reviews can be adapted for non-academic areas as well. Schwarzmueller and
Dearing (1997), for instance, describe their institution's model for non-instructional
program review. Common performance indicators defined by their professional
organizations, such as the National Association of College and University Business
Officers (NACUBO) or the Association of Governing Boards (AGB), may be used to
provide normative data to allow for comparisons. At their institution all major areas
participate in a self-study and external evaluation once every five years. The self-study
looks at the history of the department and its purpose; an analysis of the goals and
objectives (in measurable results); a review of the activities and functions of the
department (with standards set by professional bodies); a review of staff resources and an
analysis of individuals' professional development; an analysis of equipment and
facilities; and the department's relationship to its external environment (relationships
with students, other departments, and the community) (Schwarzmueller & Dearing, 1997,
p. 117). The final report summarizes the analyses and presents the unit's strengths and
weaknesses and its recommendations, including financial modifications, if any. The final report forms
the basis for budgeting over the next five years. This model is designed to help each area
evaluate its performance in terms of productivity, efficiency, effectiveness, quality, and
innovation. The non-instructional program reviews are joined with the assessment
program and instructional program review to comprise a system for assessing
institutional effectiveness.


Institutional Effectiveness. There are some articles that discuss institutional
effectiveness and its relationship to strategic planning, noting that the two can work in
unison to achieve continuous improvement. Institutional effectiveness (another constantly
evolving concept) is generally said to be the degree to which an institution accomplishes
desired effects (Cameron, 1985; Taylor, 1989). For the purposes of this study,
institutional effectiveness is treated as the evaluation of those parts of the institution that
enable students to learn, and is a broad concept that involves the entire institution in an
evaluation and improvement process (Boyle, Jonas, & Weimer, 1997; Kater & Lucius,
1997). The National Alliance of Community and Technical Colleges (NACTC) describes
institutional effectiveness as "the process of articulating the mission of the college,
setting goals emanating from that mission, defining how the college and the community
will know when goals are being met, and using the data from assessment in an ongoing
cycle of goal setting and planning" (Easterling, Johnson, & Wells, 1996, p. 152).
Easterling, Johnson, and Wells (1997) recommend that a self-assessment be completed
every five years, and that each academic and administrative unit undertake the process as
well -- the process being a team effort, with all members and levels of a department
contributing to the report. Jonas and Zakel (1997) briefly touch upon how the institutional
effectiveness model and its six core indicators provide a framework for assessing how
well their institution is carrying out its mission and realizing its vision, which in turn
provides the foundation for the development and review of the overall strategic plan.
Their six core indicators include: 1) access to success, 2) lifelong learning, 3) student
development, 4) community focus, 5) quality of workplace, and 6) stewardship. The
strategic planning initiatives are thus evaluated on an annual basis in relationship to these
core indicators.
Leake and Kristovich (2002) of Parkland College (Champaign, Illinois) have also
written about the institutional effectiveness plan at their institution. Their process
involves having support units (departments and offices) measure how well they are
achieving their stated goals and then document how they contribute to the greater mission
of the institution. In their view, having every unit conduct a regular assessment ultimately
results in university-wide evaluation. As Leake and Kristovich (2002) put it:
The support units are reviewed in a cycle that involves evaluating the goals of the
unit with respect to the college's mission and purposes statement, defining
objectives, evaluating data, and applying the results of the evaluation to improve
the service quality of the unit and support education improvement. Each unit
develops its own mission statement, measure[s] goals, gathers data, evaluates
outcomes, and uses the results for continuous improvement, with the support and
guidance of the instructional effectiveness committee. Every support unit
completes an assessment record form with its objectives, assessments, and
analyses. (p. 252)
Parkland's model for institutional effectiveness consists of three components:
strategic planning, operational planning, and budget planning, which all combine to form
a continuous feedback loop (see Figure 2.5).


Figure 2.5. Diagram of the Institutional Planning Process at Parkland College
The strategic plan is a projection of where the institution would like to be in five years.
Parkland College's Planning Handbook (2004) likewise offers a brief statement on how
the college and its units are to evaluate their own component plans. The strategic plan
is to be evaluated every other year using the evaluation of previous planning
effectiveness, environmental scanning of key data and trends, and input from
stakeholders, including faculty, staff, community leaders, and organizations (Parkland
College Office of Institutional Research and Evaluation, 2004, p.2). The evaluation
results form the basis for the next plan's development. Unfortunately, no more detailed
information is provided regarding the evaluation of their strategic plan, an omission
quite typical of the strategic plans reviewed for this study. The operational plan then sets
out tasks or actions to be accomplished by the various areas for the next fiscal year.
Once the operational plans have been developed, annual budget planning begins.
In the case of Parkland College, the goal is to integrate the vision, goals, and
strategies of the college's strategic plan into the comprehensive budget. The action items
are placed into three budgetary categories: personnel requests, instruction and equipment
requests, and other action proposals (Parkland College Office of Institutional Research
and Evaluation, 2004, p. 4). Appropriate upper-level administrators then prioritize and
rank the budgetary requests, tabling unfunded requests for future action.


Summary
Although no clear model for assessing a strategic plan is currently available, common
elements modified from the reviewed literature and incorporated into a logic model may
guide an evaluation of a strategic plan. The common elements, organized into the
framework of a logic model, would include the following (restated as a brief sketch after this list):
- A summarization of the mission and vision: this formative evaluation would fall under the input component.
- A reflection on the progress made toward the goals: this examination and discussion of strengths and weaknesses would provide information for the input component.
- A review of internal business processes, such as staff resources, operations, activities, and functions, would provide a formative evaluation of the activities component of a logic model.
- An analysis of the goals and objectives in measurable units, with the use of internal and external measures (including financial measures, equipment, and facilities), would be a formative evaluation of the output component.
- The consequence or value added -- outcomes and effectiveness; organizational learning and growth -- would present a summative evaluation of the short-term and long-term outcomes.
- A review of the current goals would provide a summative evaluation for the outcome component of a logic model.
- Asking stakeholders (students, faculty, staff, community leaders, and organizations) for their thoughts and analysis, including their satisfaction and knowledge, would provide a summative evaluation that lies within the impact component of a logic model.
- A revision of goals or the development of new goals would be an element of the impact component.
- Submission of the planning committee's final report to the president, and then to the community and stakeholders, would fall under the outcomes and impact components of a logic model.
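Restated as data, purely as a summary device rather than a validated instrument, the elements above map onto the logic model roughly as follows:

```python
# (common element, logic model component, question type), per the list above.
evaluation_elements = [
    ("Summarize the mission and vision",               "inputs",     "formative"),
    ("Reflect on progress made toward the goals",      "inputs",     "formative"),
    ("Review internal business processes",             "activities", "formative"),
    ("Analyze goals and objectives in measurable units", "outputs",  "formative"),
    ("Assess value added: outcomes and effectiveness", "outcomes",   "summative"),
    ("Review the current goals",                       "outcomes",   "summative"),
    ("Ask stakeholders for their satisfaction and analysis", "impact", "summative"),
    ("Revise goals or develop new goals",              "impact",     "summative"),
    ("Submit the final report to the president, community, and stakeholders",
     "outcomes/impact", "summative"),
]

for element, component, qtype in evaluation_elements:
    print(f"{component:<16} {qtype:<10} {element}")
```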
This literature search has shown that evaluating the impact of a strategic plan should
be comprehensive and insightful, making it vital that the evidence collected be clear,
manageable, understandable, and well articulated. Although the review of literature lacks
both consistent theoretical methodologies and empirical evidence, the use of strategic
planning in higher educational institutions continues. While developing a strategic plan is
relatively commonplace now, evaluating it is far less common. This apparent lack of
evaluation demonstrates the need for assessment of strategic plans in higher education.
Empirical evidence will provide planners the information needed to modify and improve
future planning initiatives.
This chapter has provided the literature review for the proposed research study. It
has included a review of the higher education literature on strategic planning,
assessment, the logic model, and various examples of assessing a strategic plan.
The following chapter will describe the proposed methodology for studying the
evaluation methods of strategic plans at higher education institutions.






CHAPTER III: METHODOLOGY



This chapter explains the methodology employed for this research study. The
chapter begins with an overview of the research questions and the rationale for the case
study design. The research methodology follows, with explanations of the data
collection, data analysis, and data validation procedures. Descriptions of the research
participants, data collection instruments, and research procedures are also provided. The
chapter concludes with a summary of the methodology.


Overview of Research Questions
The purpose of this study was to examine evaluation methodologies used for
assessing strategic plan initiatives at public research universities within the
Association of American Universities. The results of this research provide
information about the various methods of assessing strategic plans in academia, as well
as a methodological model for practitioners evaluating their strategic plans. This study
reviewed those institutions that had:
Implemented university-wide strategic plans (plans for the institution as a
whole, rather than for specific areas or departments),
Seen their plans endorsed by their Board of Trustees or Regents (their
governing body), and
Begun the evaluation of their plan.
The study was guided by the following research questions:
1. What are the methodological processes that research universities use, if any,
to evaluate their strategic planning process?
a. What are their methods of data collection and analysis?
b. How is the evaluation process implemented?
c. Who are the main participants in the strategic planning evaluation
process?
2. How does the strategic plan evaluation methodology identify the targeted
strategic plan funding sources to match the strategic plan goals?
a. Does it identify the sources of the funding received?
b. Did the allocation of the strategic plan funds received match the originally
planned strategic plan goals and funding sources? If not, what was the
difference and why did it happen?


Rationale for Using a Case Study Design
A case study design was selected for this research due to the nature of the
questions and how the research would be conducted. A case study analysis is an ideal
methodology when a holistic, in-depth investigation is needed (Feagin, Orum, &
Sjoberg, 1991). Merriam describes a case study as "an examination of a specific
phenomenon such as a program, an event, a person, a process, an institution, or a social
group" (1988, pp. 9-10). By using multiple sources of data, case studies are intended to
bring out the details. They are intensive studies that provide a rich source of information
and can focus on rare, unique, and extreme cases, or on revelatory cases where an
observer may have access to a phenomenon that was previously inaccessible.
The rationale for using a case study methodology for this research was that it
lent itself to a revelatory case (Yin, 1994). That is, the researcher had an opportunity
to observe and analyze "a phenomenon previously inaccessible to scientific investigation"
(p. 40). Because the researcher had access to a situation previously not studied, the case
study methodology was worth pursuing: the descriptive information alone would be
revelatory.
This study followed a multiple case study design, which provided for the
replication of the cases. A multiple case design simply means that the same study may
contain more than a single case. Evidence from multiple cases is often considered more
compelling and regarded as more robust (Herriott & Firestone, 1983; Yin, 1994, p. 45).
Each case serves a specific purpose within the overall scope of the study, thereby
following replication logic (Yin, 1994, p. 45). If similar results are obtained from all
these cases, replication is said to have taken place.
In each of these situations, an individual case or subject is considered analogous
to a single experiment. Each case must be carefully selected so that it either predicts
similar results or produces contrasting results for predictable reasons. Conducting several
individual case studies, arranged effectively within a multiple-case design, is akin to
conducting experiments on related topics. If all the cases turn out as predicted, they
provide compelling support for the initial proposition. If the cases are in some way
contradictory, the initial proposition must be revised and retested with another
set of cases. This logic is similar to the way scientists deal with contradictory
experimental findings (Yin, 1994).
Case studies are multi-perspective analyses; case study analysis is known as a
triangulated research strategy. The need for triangulation arises from the ethical need to
confirm the validity of the processes (Tellis, 1997; Yin, 1994). The findings of a case
study based on multiple sources of data are perceived as more accurate and convincing
than those of a study based on a single source of evidence (Tellis, 1997; Yin, 1994). Yin (1994)
lists six sources of evidence for data collection in the case study protocol: documentation,
archival records, interviews, direct observation, participant observation, and physical
artifacts, the most common being interviews, observations, and document analysis
(Bogden & Biklen, 1992; Merriam, 1988; Park, 1997; Tellis, 1997; Yin, 1994).
One disadvantage of case studies is that some see the method as unscientific,
i.e., relying on nonrandom samples that yield non-generalizable conclusions. Other
drawbacks may be that the researcher is biased and/or that the subjects are usually aware
they are being studied. However, case studies are designed to bring out the details from
the viewpoint of the participants by using multiple sources of data (Tellis, 1997). The
overall nature of this research lends itself to qualitative study: to investigate, explain,
document, and predict the evaluation methodology of strategic plan initiatives at public
research institutions.






Using a Logic Model to Frame the Evaluation
A logic model illustrates the purpose and content of a program and makes it easier
to develop meaningful evaluation questions from a variety of vantage points: content,
implementation, and results (W.K. Kellogg Foundation, 2004). It helps identify areas of
strength and/or weakness, thereby enriching the assessment of the program. It points out
what activities need to be monitored and what kind of measurements might indicate
progress toward results. Using the logic model in this study's methodology therefore
provided for a more comprehensive evaluation process.
Specifically, the logic model guided the data collection and analysis. The survey
and interview questions were coordinated with the components of the logic model. The
questions were structured to provide data matched to each component of the
logic model (see Appendices C and G). Because the data were synchronized to the
logic model components, the analysis provided an examination of the institutions'
evaluation methodologies to see if they assessed their entire strategic plan processes. This
ensured that this study's review incorporated the entire strategic plan process regardless
of whether the institution's own evaluation of its strategic plan initiative did so. Using a
logic model to frame the assessment thereby produced a more holistic approach for the
evaluation of the cases.
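
To illustrate how questions can be coordinated with the model (the question texts below are invented for this sketch and are not the Appendix C items), each question is tagged with its component and the instrument is checked for full coverage of the model:

    # Illustrative only: tag each question with a logic model component,
    # then verify the instrument covers every component of the model.
    COMPONENTS = {"input", "activities", "output", "outcomes", "impact"}

    # Hypothetical question-to-component mapping (not the actual instrument).
    question_map = {
        "Were adequate resources invested in the plan?": "input",
        "Were implementation processes and tools reviewed?": "activities",
        "Were data gathered on the plan's activities?": "output",
        "What specific changes resulted from the plan?": "outcomes",
        "What lasting organizational change occurred?": "impact",
    }

    covered = set(question_map.values())
    missing = COMPONENTS - covered
    print("components covered:", sorted(covered))
    print("components missing:", sorted(missing) if missing else "none")
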


Overall Procedure
This multiple case study design contains more than a single case; in other words,
it includes more than one institution that has implemented a university-wide strategic
plan and is evaluating the results. Therefore, the first step was to select the sites and participants.
Since the data collection employed a survey and interviews as sources of evidence, a
survey instrument was constructed and interview questions were created. Once these
steps were completed, the data collection began. A flowchart of the overall procedure is
located in Appendix A.


Site and Participant Selection. This case study sought to gather in-depth data
about higher education institutions' methodologies for assessing their strategic plans. The
sites were selected based on these fundamental criteria:
1. The institution had a strategic plan initiative that had been in place for a
number of years, and the institution was at least beginning the evaluation
of its initiative. This assumes that the institution had completed its
planning, resource development, and implementation phases.
2. The institution was a public entity whose data are in the public domain, easing
the tension that sometimes surrounds a study involving salary and financial
data, which may be key measures used in a strategic plan.
In order to protect the participants and offer confidentiality, the institutions were
not named in this dissertation.
In order to compare and contrast similar institutions, the multiple cases were
selected to fit the same criteria. Institutions with the following key similarities were
reviewed:


1. Located within the United States (due to accreditation standards and
governmental funding procedures)
2. Member of the Association of American Universities (AAU) (due to similar
characteristics). Institutions are invited to be members of the AAU based upon
the breadth and quality of their university programs of research and graduate
education, institutional missions, characteristics, and trajectory. For AAU
membership, specific indicators are reviewed: competitively funded federal
research support; membership in the National Academies (National Academy
of Sciences, National Academy of Engineering, and the Institute of Medicine);
National Research Council faculty quality ratings; faculty arts and humanities
awards, fellowships, and memberships; citations; USDA, state, and industrial
research funding; doctoral education; number of postdoctoral appointees; and
undergraduate education (AAU, 2005).
A minimum of four institutions fitting these parameters and willing to participate
in this study were selected. This number provided a workable sample, given the scope
and resources of this research study.
The researcher sent an e-mail request to the list-serve of the Association of
American Universities Data Exchange (AAUDE) representatives asking if their
institution had a university-wide strategic plan. The representatives are typically
employed in the budget, planning, or institutional research offices. There were 14
responses. The researcher then reviewed the websites of the remaining 49 institutions. If
key web pages (the president's or chancellor's web page, the provost's or vice president
for academic affairs' web page, the Office of Institutional Research web page, or a page
or site referring to strategic planning) did not reveal a strategic plan, multiple descriptors
were used in the institution's web search engine (e.g., strategic plan, strategic directions,
academic plan) to find a university-wide strategic plan for that institution. The researcher
contacted the Director of Institutional Research if clarification was needed on either
identifying the strategic plan or the specific nature of the plan. The list of AAU
institutions was condensed to six institutions that met the research criteria. Based upon
the correspondence and available information, four institutions were selected.


Data Collection
Within each selected institution, the data collection for this study followed the
basic principle for collecting case study evidence: triangulation. Triangulation is the
common practice of using multiple sources of data to allow the researcher to develop
multiple measures of the same phenomenon, thereby enhancing the validity of the data
collection process. This study used document analysis, a survey questionnaire, and
personal interviews as its three sources of data.


Document Analysis. The first source of evidence for this study was the analysis
of public documents containing strategic plan data and information, e.g., documents in
the public domain such as memoranda, agendas, or reports. The researcher perused the
institutions' websites for strategic planning documents. The strategic plan, annual reports
on the strategic plan, committee meeting minutes, budgetary or funding documents, and
any other related documents on the assessment or outcomes of the strategic plan that were
publicly available were reviewed (see Appendix B). If the documents were not available
on the website, or additional documents were needed, the researcher contacted the
institution to collect the material by postal mail or in person during the campus visit.
Once gathered, the documents were analyzed and coded for correspondence to the logic
model components and for thematic content.
The procedure for the document analysis was as follows (an illustrative coding
sketch appears after the list):
1. The website was reviewed for documents.
2. Each relevant document was printed and dated.
3. If key documents were missing, the institution was contacted for a paper copy
of the document (if available).
4. The documents were analyzed by reviewing for thematic content, and coded.
5. Emergent patterns and themes were noted.
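
As a minimal sketch of steps 4 and 5 (the excerpts, component assignments, and theme labels below are invented; in the study these judgments were made by the researcher while reading), coded excerpts can be tallied by logic model component and by emergent theme:

    from collections import Counter

    # Invented coded excerpts: (document, component, theme).
    coded_excerpts = [
        ("strategic plan", "input", "resources"),
        ("annual report", "output", "measures"),
        ("annual report", "outcomes", "funding"),
        ("news release", "impact", "review"),
        ("meeting minutes", "activities", "implementation"),
        ("annual report", "output", "measures"),
    ]

    # Tally evidence per logic model component (cf. the matrices in chapter four).
    by_component = Counter(component for _, component, _ in coded_excerpts)
    # Note emergent themes across documents.
    by_theme = Counter(theme for _, _, theme in coded_excerpts)

    for component in ("input", "activities", "output", "outcomes", "impact"):
        print(f"{component}: {by_component.get(component, 0)} reference(s)")
    print("themes:", by_theme.most_common())
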


Survey Instrument. The second source of evidence was data collected through a
survey of key participants. The Strategic Planning Evaluation Survey instrument (see
Appendix C) emerged from the literature regarding strategic planning. The Likert-type
scale survey questions were written by Dr. William Nunez for his study of Faculty and
Academic Administrator Support for Strategic Planning in the Context of Postsecondary
Reform, and were in turn derived from previous research (Nunez, 2003). Expert
review and a factor analysis were used by Dr. Nunez to ensure the validity of the survey
questions. The open-ended questions, specifically included for this study, were designed
by the researcher.
The survey inquired about the methodology used and the participants engaged in
the assessment of the institution's strategic plan. Each question was linked to a
component of a logic model.
Staff and administrators in the Offices of the President, the Provost, and
Institutional Research, along with other presidential cabinet members and strategic plan
task force or committee members involved with their strategic plan's evaluation, were
queried. It is highly likely that members of the President's and Provost's staffs are
significantly engaged in their institution's strategic plan initiative. Additionally,
Institutional Research practitioners are typically responsible for collecting and reporting
the data on the strategic plan and play a key role in the development and execution of the
evaluation process. Therefore, surveying these offices within the selected institutions was
ideal for observing the evaluation methodology of the strategic plan. The specific
individuals and their e-mail addresses were found through their university's directory. An
e-mail invitation with a link to an online survey was sent to them (see Appendix D). The
online survey used the Hosted Survey software, and responses were anonymous within
each institution.
The procedure for the survey was as follows (a descriptive-statistics sketch
appears after the list):
1. The mailing list was compiled.
2. The survey invitation letter was e-mailed.
3. A reminder letter with a link to the survey was e-mailed seven days after the
initial mailing.
4. A second reminder letter with a link to the survey was e-mailed 14 days after
the initial mailing.
5. A final reminder letter with a link to the survey was e-mailed 21 days after the
initial mailing.
6. The completed surveys were analyzed by entering the responses to the Likert-
scale questions (converted into a numeric scale) into a Microsoft Excel
worksheet.
7. A SAS (Statistical Analysis System) program was written, stipulating the
descriptive statistical analyses used to assist with the analysis of the data.
8. The responses to the open-ended questions were reviewed for thematic
content and coded.
9. Emergent patterns and themes from the responses to the open-ended questions
were noted.
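
The study performed steps 6 and 7 in Excel and SAS; purely for illustration, the same descriptive step can be sketched in Python. The responses and the 1-to-5 coding below are assumptions, not the study's data:

    import statistics

    # Hypothetical Likert responses for one survey item, coded 1 to 5
    # (1 = strongly disagree ... 5 = strongly agree).
    responses = [5, 4, 4, 5, 3, 4, 5, 2, 4, 4]

    n = len(responses)
    mean = statistics.mean(responses)
    sd = statistics.stdev(responses)
    # Share of respondents answering 4 or 5, i.e., the "agree or strongly
    # agree" percentage reported in the results chapter.
    agree = sum(1 for r in responses if r >= 4) / n

    print(f"n = {n}, mean = {mean:.2f}, sd = {sd:.2f}, agree = {agree:.0%}")
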


Semi-structured Interviews. The third source of evidence was semi-structured
interviews of administrators at the selected institutions who played a key role in the
strategic plan's evaluation. Standardized interview questions were employed in order to
reduce interviewer effect, facilitate data collection, provide continuity, and assist data
analysis. As with the survey questions, each interview question was linked to a
component of a logic model. The standardized interview questions facilitated consistency
in the data collection process (see Appendix E). The interviews were semi-structured in
that the questions and their sequence were specified in advance in outline form, yet the
interviewer decided the wording of the questions during the course of the interview. The
questions provided for a more comprehensive data collection, yet the interviews remained
fairly conversational and situational (Patton, 1987). As Patton (1987) suggests, this style
of interviewing allows the interviewer to build a conversation within a particular
predetermined subject.
The interview sample for this case study was purposefully limited to staff and
administrators from the institutions, in order to focus on the institutions' evaluation
processes. Depending on the organizational structure of the institution, individuals from
the Offices of the President, the Provost, Institutional Research, the president's cabinet,
the Strategic Plan Task Force, or other committees involved with the strategic plan
evaluation were selected for interviews.
Each potential interview participant was contacted initially by mail, then by a
follow-up telephone call to schedule an interview (see Appendix F). The cooperation of
the participant was then confirmed. If that person agreed to participate, the researcher
scheduled a 60-minute interview at the participant's choice of location. Interviews were
conducted by the researcher and were recorded on audiotape.
With permission, the interviews were recorded to capture full and exact
quotations for analysis and reporting. The researcher also took notes during each
interview. After the interview, the researcher transcribed the recording and sent the
transcription to the interviewee to make certain the content was correct and to act as a
validation check. The interviews were analyzed for thematic content.
The procedure for selecting and interviewing participants was as follows:


1. Using the results from the document analysis and the survey, a list of possible
participants was drawn up.
2. Each possible participant was contacted first by mail, then by a follow-up
telephone call to schedule an interview.
3. The cooperation of the participant was confirmed.
4. The interview was conducted.
5. After the interview, the notes and recording were transcribed word for word.
6. The transcription was sent for validation to the interviewee.
7. The interview summaries were approved by the participant.
8. The summaries were analyzed by reviewing for thematic content, and coded.
9. Emergent patterns and themes were noted.


Data Analysis and Validation
Yin (1994) suggested that case studies should have a general analytic strategy to
guide decisions regarding what will be analyzed and for what reason. The analytic
strategy for this study was to use a logic model to guide the descriptive framework of the
steps or methods of the institutions' evaluations. To the extent that the evaluation of a
strategic plan can be described within a logic model, as a sequence of decisions and/or
activities, the research focused on the types of methods employed for the
evaluation.
The analysis of the study began with the case data. The case data consisted of all
the information gathered at each institution: the documents, interview data, and survey
results. The case data were accumulated, organized, and classified into a manageable case
record. The case records were used to write the narrative, a descriptive picture of how
higher education institutions evaluate strategic plans. Once the evidence was displayed in
the logic model components, the narratives were presented thematically in chapter four
to provide a holistic portrayal of these institutions' evaluation methodologies
(Patton, 1987).
To establish the quality of the research, validity and reliability tests were applied
to the study. The study was based upon multiple sources of data, which, in itself,
provided the triangulation that confirmed the construct validity of the processes.
Furthermore, the use of multiple cases within this research provided external validity as
well. The use of replication (multiple cases) established that the study's findings can be
generalized (Yin, 1994). The consistency in the overall patterns of the data from different
sources, and reasonable explanations for differences in data from various sources,
contributed to the internal validity of the findings. In addition, the use of protocols for
each data collection technique added to the credibility and reliability of the study.
Within the data sources themselves, the documents were examined through content
analysis. As Patton (1987) describes, content analysis involves "identifying coherent and
important examples, themes, and patterns in the data" (p. 149). Inductive analysis was
also used in identifying emergent patterns, themes, and categories that came out of the
data. Using the other two data collection techniques (interviews and survey) helped
strengthen and corroborate the document analysis.
The second source of data, the survey instrument, emerged from the literature
regarding strategic planning. The Likert-type scale survey questions were written by Dr.
William Nunez for his study of Faculty and Academic Administrator Support for
Strategic Planning in the Context of Postsecondary Reform, and were in turn derived
from previous research (Nunez, 2003). In his study, expert review and a factor analysis
were used to ensure the validity of the survey questions. In addition, Dr. Nunez
performed a test of reliability: Cronbach's alpha, with a coefficient of .70, was used on the
survey instrument. It provided an estimate of the internal consistency reliability by
determining how all items on the survey related to all other items and to the total survey,
and therefore measured how reproducible the survey instrument's data were.
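For reference, Cronbach's alpha is computed from the item variances and the variance of respondents' total scores: alpha = (k / (k - 1)) * (1 - sum of item variances / variance of totals), where k is the number of items. A minimal sketch follows, using an invented response matrix rather than Dr. Nunez's data:

    import statistics

    def cronbach_alpha(rows):
        """rows: one list of item scores per respondent."""
        k = len(rows[0])                      # number of survey items
        columns = list(zip(*rows))            # per-item response columns
        item_vars = [statistics.variance(col) for col in columns]
        totals = [sum(row) for row in rows]   # each respondent's total score
        return (k / (k - 1)) * (1 - sum(item_vars) / statistics.variance(totals))

    # Invented 5-respondent by 4-item response matrix, for illustration only.
    responses = [
        [4, 5, 4, 4],
        [3, 4, 3, 4],
        [5, 5, 4, 5],
        [2, 3, 2, 3],
        [4, 4, 4, 4],
    ]
    print(f"alpha = {cronbach_alpha(responses):.2f}")
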
In order to measure the content validity of the third source of data, the interviews,
expert evaluation of the questions was sought. A panel of higher education experts in
strategic planning and survey methodology was asked to review the interview questions.
Each panel member received a copy of the interview questions and the first chapter of the
dissertation proposal. Using an assessment document (see Appendix G) that incorporated
a five-point Likert-type scale (excellent = 5, poor = 1) for each question, the panel
members were asked to evaluate the questions and rate the degree to which the items
were appropriate as measures. The panel members were also given the opportunity to
provide comments. The researcher reviewed the rankings and comments and
revised or removed items based on the panel members' feedback.



Name                 Position                                           Institution
Dr. Carol Kominski   Associate Vice President for Institutional        Texas Woman's University
                     Effectiveness and Research
Dr. Victor Borden    Associate Vice President, University Planning,    Indiana University
                     Institutional Research and Accountability
Mr. Darrin Harris    Consultant for the Office of Quality              University of Wisconsin
                     Improvement

Figure 3.1: Expert Panel Members

Once the interviews were conducted, the individual interview summaries were sent
to the participants for review, acting as a member check. Thematic codes were
developed based on the review of the interviews for each institution. The interviews and
thematic codes were read independently by another reader. The researcher and second
reader met to discuss the codes and compare responses, resulting in an inter-coder rating
of 98%. The data obtained through the interviews were corroborated by the other two data
collection techniques. A few other methods were employed as well in order to determine
the trustworthiness of the interview results. On-site interviews offered an opportunity to
spend time in each institution's culture, which provided a contextual framework as well
as the opportunity to identify characteristics or aspects of the culture that contributed to
the study of that institution's strategic plan process. In addition, by interviewing multiple
people at each institution, the multiple perspectives and interpretations strengthened the
validity of the findings.
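The 98% figure above is a simple percent agreement between the two coders. A sketch of that computation follows, with invented code assignments; Cohen's kappa, which corrects for chance agreement, would be a common alternative statistic:

    # Percent agreement between two coders over the same interview segments.
    # The thematic code assignments below are invented for illustration.
    coder_a = ["measures", "reports", "funding", "culture", "measures", "reports"]
    coder_b = ["measures", "reports", "funding", "culture", "funding", "reports"]

    matches = sum(a == b for a, b in zip(coder_a, coder_b))
    agreement = matches / len(coder_a)
    print(f"inter-coder agreement: {agreement:.0%}")  # 83% for this toy data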





Summary
As discussed in this chapter, a revelatory case study can provide an examination of
a situation that has not previously been scientifically investigated. This research, through
the development of an in-depth case study, has gone beyond the typical study of the
implementation of a strategic plan in a higher education institution, proceeding to a more
substantive analysis of how strategic plan initiatives are evaluated. By reviewing the
evaluation methods and procedures of strategic planning processes in higher education
institutions, themes and trends were identified that encouraged the development of a
new evaluation model, described in chapter five. The results of this study thus lead to an
improved understanding of the evaluation methodologies employed to assess a strategic
planning initiative at higher education institutions and provide real-life examples that are
being tested and used, thereby offering a descriptive framework. The study also
provides insight for practitioners of strategic planning into the practical implications of
evaluating their strategic plans.







CHAPTER IV: RESEARCH RESULTS



The research for this study has focused on the methodologies employed by public
research-intensive higher education institutions in the evaluation of their strategic plan
initiatives. Four institutions were selected as case studies. Their selection was based on
two fundamental criteria: a location within the United States and an affiliation with the
Association of American Universities (AAU). Furthermore, each selected institution had
a university-wide strategic plan initiative that had been in place for a number of years;
was in the initial stage of its evaluation; and was willing to participate in this
study. The four case studies were all large, public, research-extensive institutions with
enrollments of over 25,000 students. They all employed 15,000 or more people
and had annual budgets of over $1.2 billion. To protect the confidentiality of the
participants, and in accordance with standard research practices, the names of the
institutions will not be revealed.
The results reported in this chapter stem from the combination of three types of
data collection: document analysis, surveys, and interviews. This chapter, therefore,
features detailed examinations of the documents, tables and figures from the survey
results, and direct quotes from the participants. The data were synchronized according to
the logic model components to ensure that the research results would provide a
comprehensive analysis of the evaluation process. The following table provides an
overview of the sources of evidence from the four universities (see Figure 4.1).
University   Documents Reviewed   Survey Participants    Interview Participants
A            N = 24               N = 40                 1. Provost
                                  45% response rate      2. Director of Strategic Planning and Assessment
                                                         3. Vice Provost for Research
                                                         4. Vice Provost for Engagement
                                                         5. Director of Budget and Fiscal Planning
                                                         6. Director of Institutional Research
B            N = 33               N = 37                 1. Executive Vice President of Business and Treasurer
                                  24% response rate      2. Executive Vice Provost for Academic Affairs
                                                         3. Director of Institutional Research
C            N = 30               N = 26                 1. Executive Vice President of Business and Treasurer
                                  35% response rate      2. Executive Director of the Office of Planning and
                                                            Institutional Assessment
                                                         3. Director of Planning Research and Assessment
D            N = 38               N = 39                 1. Executive Associate Provost
                                  23% response rate      2. Interim Director of Financial Planning and Budget

Figure 4.1: Sources of Evidence Overview
This chapter is organized according to the universities studied. For each
university, the section begins with a description of the sources of evidence, with the
attending data coded into the logic model components. A matrix is provided as a visual
reference indicating how strong or weak the evidence sources were in each of the
logic model components. The appendices present the lists of documents, detailed survey
results in tables, and lists of the positions of those who responded to the survey and took
part in interviews. Although all the quotations from the strategic planning documents are
open to the public and available on websites and in published reports, the results have
been written in such a way as to protect the confidentiality of the survey and interview
participants. Neither the institutions nor the participants are named, in order to safeguard
their confidentiality.
A description of each university's evaluation methodology is presented after the
matrix. For each university, the evaluation process is defined by describing the key
participants (or main evaluators), the measures, and the reports. Each university
section is also organized by themes taken from the data to further guide the reader.
Although there are some similar themes across the four case studies, each university had
some unique characteristics within its strategic planning process that influenced the
evaluation of its plan. The case studies are written in such a way that each one tells the
story of how the institution has evaluated its strategic plan.
The synthesis of the results is presented in the cross-case analysis. The chapter
concludes with a conceptual model, which was produced from this research.


University A
With an enrollment of approximately 40,000 and an annual budget of $1.4 billion,
University A serves as the leading public higher education institution of its state. Along
with its growing regional and national stature, though, have come heightened public
oversight and demands for accountability, which in turn have led to an increased reliance
on strategic planning in recent years. The university is currently in the fifth year of its
six-year strategic plan initiative, which has called for $156 million annually in new
resources to be used to support and facilitate learning, research, and community
engagement. The resources identified to support the strategic plan were: state
appropriations, federal appropriations, fees and tuition, sponsored funding, internal
reallocation, private giving, revenues from licenses and patents, and revenues from other
sources.
Although data have been collected and annually reported on each of the strategic
plan metrics and benchmarks for the last five years, this university is now beginning to
summarize and evaluate the results for a comprehensive review of its plan through some
additional activities and reports.
University A's strategic plan included a mission, vision, and three major goals: 1)
to achieve and sustain preeminence in discovery; 2) to attain and preserve excellence in
learning through programs of superior quality and value in every academic discipline;
and 3) to effectively address the needs of society through engagement. Each goal had
several characteristics, resulting in key strategies that had specific measures. In addition
to the three main goals, overarching strategies with measures of their own figured into the
process as well. Some of these measures were then compared to eleven peer institutions.
In all, there were 49 metrics and 24 benchmarks.


Sources of Evidence and Logic Model
Sources of evidence for University A included public documents, the Strategic
Planning Evaluation Survey, and interviews of key personnel. The following documents
were analyzed: the strategic plan, the annual reports, news releases, the strategic plan
website and its supporting documents, as well as the Board of Trustees' website, the
president's website, and the Office of Institutional Research's website (see Appendix H).
The Strategic Planning Evaluation Survey (see Appendix C) was e-mailed to forty upper-
level administrators at University A, including the original Strategic Plan Task Force and
the Strategic Plan Review Committee, both of which had memberships representing the
vice presidents, deans, faculty, administrators, and clerical and support staff. The survey
was also e-mailed to those deans and vice presidents not serving on the other two
committees, as well as the faculty senate chairman, the managing director of the
treasurer's office, and the director of institutional research. A total of eighteen responses
were received, providing a response rate of 45% (see Appendix K). In the open-ended
question asking respondents to state the office or area where they were employed, ten
replied that they were in academic areas, and the other eight indicated an affiliation with
administrative units. The survey questions themselves were linked to the logic model
components. Interviews of six members of the upper administration at University A were
also conducted, with participants from the Offices of the President, the Provost,
Institutional Research, and Budget and Fiscal Planning (see Appendix I). Consistency in
data collection was achieved by asking standardized interview questions linked to the
logic model components (see Appendices J and K). A visual reference of how strong or
weak the research sources were for each of the logic model components is presented in
the logic model evidence matrix of University A (see Figure 4.2).



Input: Reflections made on the use of resources, including human, financial,
organizational, and community, that have been invested in the strategic plan
initiative.
    Document analysis: 5 references
    Survey results: four items with responses ranging from 72% to 100% for
        somewhat agree or strongly agree
    Interview results: 14 responses

Activities: Review of the processes and actions that were an intentional part
of the implementation, such as tools, technology, and services.
    Document analysis: 2 references
    Survey results: six questions with responses ranging from 77% to 94% for
        agree or strongly agree
    Interview results: 11 responses

Output: Whether the evaluation methodology included gathering data or results
on the activities.
    Document analysis: 10 references
    Survey results: three questions with responses ranging from 83% to 94% for
        agree or strongly agree
    Interview results: 13 responses

Outcomes: Short-term and long-term outcomes consisting of specific changes
resulting from the outputs, including attitudes, behaviors, knowledge, and/or
skills.
    Document analysis: 16 references
    Survey results: four questions with responses ranging from 83% to 94% for
        agree or strongly agree
    Interview results: 24 responses

Impact: Fundamental intended or unintended changes occurring in the
organization or community.
    Document analysis: 9 references
    Survey results: seven questions with responses ranging from 83% to 94% for
        agree or strongly agree
    Interview results: 21 responses

Figure 4.2: Matrix of University A's Evidence by Logic Model Component


Evaluation Methodology
At the outset of its strategic plan, a task force was assembled to establish an
outline for what the university wanted to achieve, how it would achieve it, what measures
would be used to determine progress, and what measures would be used to determine
when the objectives had been reached. Significantly, though, there was no mention of
exactly how the university would assess the plan at the completion of the initiative. As
University A's administrators are just now beginning a comprehensive evaluation of their
strategic plan, there has been no public summary or comprehensive report to date.
However, there have been several news releases (e.g., newspaper articles and web news
announcements) discussing the impending review of the strategic plan.
In response to the Strategic Planning Evaluation Survey's questions about the
evaluation methodology, 88% responded that they had evaluated and offered advice on
the strategic plan, and 95% agreed that the strategic plan activities had helped to
accomplish University A's mission. An open-ended question asked respondents to
describe briefly the evaluation methodology at their institution. Only one stated that
he/she did not know of any methods or steps. The most common response (N=14, 77%)
was that University A used measures and reported its progress in an annual report.
During the interviews, participants were asked to discuss how the institution evaluated
the strategic plan. Most replied that the specific measures were key to evaluating the
strategic plan. As some of these measures were used as benchmarks against other peer
institutions, these comparisons were spoken of as a way to evaluate the plan, as illustrated
by these quotes:


Some of them are very simple quantitative measures where we have benchmarked
ourselves against our peers on certain metrics. We simply look at their numbers
every year, we look at our numbers, and compare to see if we have made any
movement.

First, you have to look at what the goal really met and the characteristics that
define attainment of the goal. Then, looking at those characteristics and the
measures that go with each, report on the progress. Also, it would be compared
with the peer group to see if it has moved in a positive direction to the peers. If
it appears progress has been made, but you have also fallen behind your peers
compared to when you started, then I would say it was not sufficient progress.

Four survey respondents (22%) mentioned that a review committee was assigned
to evaluate the plan. This was certainly borne out in the following detailed response:
[An] annual assessment of progress with a report [including]
metrics/benchmarking data, [an] investment return analysis, [and] a five-year
comprehensive review (retrospective and prospective) focusing on the "big
picture" with salient aspects of goal achievement and recommendations...
conducted by [an] across-campus review committee with a report to the president
and the Board of Trustees.

Thus, even though a formal review has not been made public, there is evidence clearly
acknowledging the beginnings of an evaluation.


Main Evaluators. After the fifth year of the strategic plan, a review committee of
approximately fifteen university employees was formed. Faculty, administrators, staff,
and student representatives served on this committee, which articulated a comprehensive
view of the strategic plan that was passed along, in turn, to the president and Board of
Trustees. However, there was a mixed reaction when the participants were asked who
was involved in the evaluation of the strategic plan. The three general responses given
indicated that the review committee, the upper administration, or all levels of the
university were involved in evaluating the plan. This mixed reaction may be attributable
to a lack of publicity regarding the role of the review committee and/or the fact that its
findings were not made public.
Seven respondents (39%) in the survey stated that the upper administration was
responsible for the evaluation. Some of the typical responses in the interviews were:
I'm assuming that the president's cabinet has discussed the progress of the
strategic plan on a regular basis.
That happens at a really high level. Who is able to say [if] things are not
going as planned? Probably the provost's office.
One administrator summed it up this way:

Clearly, the [president's] office, and the Office of Institutional Research [are
actively involved]. The deans are very sensitive to this and they are actively
involved with this every day. I would say in some colleges department heads are
involved; they may have made it part of their evaluation thinking as well. It is
probably not part of the everyday thinking of many faculty members, which is
probably okay. I don't know if it needs to be. It is those of us that are struggling
with the overall directions and policies of the institution that need to worry about
strategic planning.

The other participants thought that all levels of the university were involved in the
evaluation of the plan. One-half (N=9, 50%) of the survey respondents stated that the
evaluation was at all levels. As one subject commented: "I think the whole leadership of
the university is involved, down to the department level." Another noted in more detail:
"It is at all levels because we collect data from the bottom up. Even at the departmental
level they [provide] information. It is pulled [together], all of the academic
information and the non-academic information."


Reports. Each November, the president of University A reported on the status of
the strategic plan to the Board of Trustees. In addition to this presentation, an annual
report was prepared. This report was a formal three-part document about progress made
on the strategic plan, comprising the President's Report (providing both a general
overview of progress and a message from the president); a one-page executive summary,
Progress on the Strategic Plan; and a detailed report providing an update on each metric
and benchmark, titled Benchmarks and Metrics Report. This annual report was
distributed widely and made available to the public, thus providing the campus and
community with an update regarding progress made toward achieving the strategic plan.
This report was often brought up by those interviewed, most of whom recounted
that University A periodically presented specific strategic plan data to the Board of
Trustees, in addition to furnishing key stakeholders and the public with an annual
published report:
We have an annual grand accounting to the Board of Trustees.

It is an annual ongoing process. Once a year we report to our Board of Trustees.
We collect metrics on every one of the strategic plan's initiatives. It's a continual
ongoing process of collecting data, information, and then making reports for the
Board of Trustees.

On an annual basis a three-part report is prepared for the Board of Trustees.
There is a corporate view that provides a broad overview of the status of the plan
for the past year. There is a more detailed summary of the specific goal areas
[and] a more detailed report prepared by my office. In addition to that, there is the
[report] that is really the nitty-gritty detail that is given to the Board members, the
President's cabinet, and to the upper-level administration of the university.


Measures. There were a number of measures that University A used to track
progress. These measures were collected and reported every year for the annual report.


Within the survey, 94% of the respondents agreed that University A had a formal process
for gathering and sharing internal and external data. The process of defining, collecting,
analyzing, and reporting these measures seemed to be well understood. Consequently, it
is not surprising that people could elaborate on the process when asked about the
measures. In the interviews, most said that the original measures were still the primary
factors with regard to evaluation, even if they had been modified a bit. Two
participants elaborated on what they meant by modifying the measures:
As we went through and refined things, we did not make any drastic changes in
terms of the metrics. When the committee first created the metrics, some ideas
were not necessarily practical or possible. There was some struggle to interpret
what they may have meant and to qualify their idea. Although those metrics
were kept since the committee decided that they wanted them.

When the strategic plan was first developed, the measures weren't precisely
stated. The benchmarks and metrics were subject to definition. There were a
few rare cases in which we had to modify the definitions based on the availability
of information or on further review of what the measure really meant and what
was of value to us. However, for the most part, I would say we maintained the
same set of measures.


Mid-course Adjustment. Three years into University A's strategic plan, a
mid-course adjustment was performed due to three factors: the need to modernize
administrative computing applications, the need for attention to the repair and
rehabilitation of existing buildings, and the decision to raise the goal of the capital
campaign due to its early successes. The mid-course adjustment thus modified the goals
and resources of the strategic plan, prompting one of the participants to note:




We have on several occasions made course corrections. I'll give you two
examples. The campaign had an original goal to raise $1.3 billion. Very
quickly we realized we could do better than that, and we raised the goal to $1.5
billion. That was an early correction. We initially said we would add three
hundred faculty to the campus over a five-year period. It became apparent to me
that it would take closer to seven years. The course correction was to adjust
that out to seven years. Actually, we took the campaign out to seven years as well.
It just made sense to do that.

Another administrator reflected on the sensible need for a mid-course correction:

Why didn't we realize that [repair and rehabilitation] problem and make it part of
the strategic initiative earlier? You've seen the numbers; this didn't happen
overnight. It should have been part of the examination when we established the
plan. Or maybe they weren't open to it when the plan was created. Maybe that is a
flaw in the process.

Almost as if responding to this view, another interviewee had this to say:

Any strategic plan has to be able to absorb mid-course corrections without having
to revise the plan entirely. There needs to be some resiliency with the plan so
that if external conditions shift, you are able to make adjustments without having
to change your plan. In our case, an example was once again due to the
insufficiency of the state funds. The university did not receive funds for facilities.
Because of that we had to create a new mechanism which included some
additional fees, but also included new reallocation of funds to remedy that over
time. A positive example is the fundraising campaign. We were meeting the goals
and so we increased them.

The idea behind a mid-course correction, then, was that it allowed for flexibility in
dealing either with unforeseen obstacles threatening to derail the plan, or with surprising
opportunities that could just as easily make the plan irrelevant. In the words of one
interviewee:
As long as you remember that the decisions can be tweaked, modified, you go in a
general direction, a strategic direction for the institution. For me that is the real
strategic element of what we do. It's the process that is more important than the
specific goals, since they may not be right.






Integrating Budgeting with Planning
When participants were asked if the evaluation of the strategic plan was an
integral part of the planning process, all responded resoundingly in the affirmative:
Yes, it's critical. Reviewing the measures over time and establishing baseline data is
a critical and essential component [of the evaluation]. If those measures hadn't
been looked at on an ongoing basis, I don't think we would have been as effective in
evaluating our plan as we have been.

Oh yes. It drives everything we do. All decisions are at least put into the context of
the strategic plan.

University A certainly came to recognize the sometimes-fragile interdependence of
various funding sources. From the beginning, the strategic plan listed the key sources of
funds supporting the plan: state appropriations, federal appropriations, fees and
tuition, sponsored funding, internal reallocation, private giving, revenues from licenses and
patents, and revenues from other sources. When the interview participants were asked if
the resources needed to implement the strategic plan were received as expected,
some firmly answered yes. Others, however, added that state funding had not matched
expectations:
We laid out a resource-planning scheme, and we have exceeded in some areas and
fallen short in one particular area that has to do with the state support. The state's
economy was affected by post-9/11... We did not receive what we had assumed
or anticipated that the state would be able to provide. The state did provide us
some new funding for new initiatives, but not as much as we had expected. On the
other hand, the private fundraising had exceeded the initial goals and is still
exceeding the goal.

The state appropriations were not what we expected them to be. Additional
funding for those areas as I mentioned before [in reference to the mid-course
correction] was needed, so we had to add a special fee for students to come up
with that money for repair and rehabilitation, for example. To the extent that we
could allocate what was available to us, [it] was pretty much kept in line. However,
because of the additions, additional funding was required, and because we
didn't get the appropriations we had expected...


When asked if the allocation of those funds followed the original plan, all interviewees
agreed:
It was programmed ahead of time, and we basically marched right in tune. Other
than our mid-course correction, we stuck to our plan of where we were able to
raise the resources for our allocations. We did a pretty good job at that.

Yes, they did, except where we didn't get the state funding. A few initiatives were
slowed, but in several instances, we had a backflow that bridged the gap with
other sources of funds. So for the most part, yes.

Yes, with one exception. We assumed at the beginning that we had five sources of
funding. We assumed that we would fundraise; that has worked fine. We assumed
that we would reallocate; that has worked fine. We had assumed we would
increase our sponsored program activity; that has worked fine. We assumed we
would raise tuition; that worked fine. We assumed that the state would, at least,
stay the course, and that hasn't worked at all. I've been here six years, and if we
get the budget increases that we are recommended this year, that will be the first
time in six years that we have seen any budget increase from the state. So all the
progress we've made, which has been substantial, is from these other four
sources.

In the responses to the survey, 83% thought the results of the strategic plan had
been used to evaluate the distribution of resources. The increase in revenues from the
various sources, along with the redirection of existing resources, was used to fund new
opportunities with the greatest impact on strategic priorities. The strategic plan
established the framework for setting the annual priorities and guiding the major
budgetary decisions. Many of those interviewed spoke of new initiatives established by
the academic colleges that were related to the strategic goals of the university. To get
university funding for these new initiatives, the colleges had to demonstrate how they had
first internally reallocated funds and how their initiative would advance the university's
overall strategic goals. One described it in this way:




We have a requirement of a 2% reallocation of funds... It's an idea of recycling
existing money, which can then be targeted and focused toward the strategic
objectives... When [the president] is evaluating the incremental requests for
funds, his first question is: What have you done for yourself? He is more inclined
to fund those activities where someone has come up with a part, providing their
own share from whatever funds they can reallocate. In the fact that those
questions get asked in the context of the strategic plan, it's just another thing that
reinforces the effectiveness of the plan. It is an ongoing thing; it is integrated
with the budget planning.

One upper-level administrator described her experience with these types of strategic
initiatives:
I would take a program that had potential, had demonstrated that, was willing to
reallocate its own resources and take steps, have a plan, and show me how it
would do it. I would help them. And we have tried to do it in a few cases. With
mixed success... I think it has worked very well, and in a few cases it worked less
well.

All of the interviewees stated that there was a direct relationship between funding
decisions and the strategic planning initiatives. This was most plainly stated by one
participant:
The strategic plan goals, priorities, and strategies are intrinsically connected with
the budget allocation process on an annual basis. We do an annual review of the
actual funding allocation to the goals and the corresponding changes in the
metrics associated with the goals. Every year the budget document is compared
showing direct connections. In other words, all the dollars have been laid out in
terms of strategic plan goals and priorities.


Communication
Besides the annual report and presentation to the Board of Trustees, the president
also shared the results of the progress on the strategic plan with the campus and
community through various venues. Typically, his public speeches included several
references to the strategic plan. There was also a strategic plan website linked to the
university's main webpage, the president's website, and that of the Office of Institutional
Research as well. Moreover, news articles in the local, student, and in-house newspapers
periodically reported on the progress of the strategic plan. One administrator said that
when people at other institutions are asked if their university has a strategic plan, they
"might be able to answer yes or no, [but] would be hard-pressed to answer what the major
elements were. If you asked people around here, they would probably be able to tell you."
Another respondent agreed:
I just think one of the main reasons, from my vantage point, that the strategic
planning has been so successful at [University A] has been because of all the
various communication efforts that were made. I think that has to do with the
leadership of [the president] and his community visits. I don't think there is a
person on campus, in one fashion or another, that doesn't know that we have been
undergoing a strategic planning effort over these years, and I think that has been a
beneficial thing.

Some saw the strategic plan as an effective communication tool in itself, since it clearly
articulated University A's mission and vision. This view was reflected in several
interview responses:
It is a very effective tool to get people thinking. Strategic planning is not about
getting everybody to do what you say. You get them pointed in the same
direction.

It is a good communication mechanism with the central administration. That is
part of the thing about strategic planning; it helps articulate to the stakeholders
what the key elements [are] and what we are trying to target.

By using the strategic plan to convey the direction it wanted to go, University A
effectively facilitated positive interactions with the state legislature and increased its
general fundraising efforts. One respondent with specific budgetary and financial
planning duties for the university reiterated this point:


With the strategic planning activity, and the marketing of it, we have actually
increased the amount of resources the institution has, even in these dire times.
The president was insightful; he did his homework and marketing up front... Did
the [strategic planning] activity give us any more money? I am absolutely certain
it did.



Culture

Some of the interviewees noted that University A experienced an overall change
in culture after having adopted strategic planning. Through elements of planning such as
implementation, resource allocation, and the integration of budgeting and planning, a new
mindset was created and quickly became pervasive across the campus. One administrator
described the transformation this way:
Compared to six or seven years ago, that is a dramatic change. We learned about
this from the review committee that created the comprehensive report. The
strategic plan has transformed the culture of the university and moved it to a
different level. Everyone asks for data before making a decision, even at the
deans' level. It is data-driven decisions.

Thus, while effective and inspiring leadership is certainly a necessary component in
motivating an institution to adopt strategic planning, it is also clear that a university's
culture must likewise adapt to the point where core internal constituencies are willing
(and in some cases eager) to take the initiative as well. As one respondent put it: "If you
don't get the buy-in, then it will not work. There needs to be an excitement, a passion for
this, which I think we have seen at this university."
The chairman of the Board of Trustees was noted in an in-house news article as
having said that a formal review of the strategic plan had been undertaken, and that there
had been a preliminary discussion of future goals for the next plan. His statement implied
that in developing the requirements for the next plan, significant consideration had been
accorded to the review of the current plan. In addition, the president of University A had
been traveling around the state during the final year of the plan to receive comments on
how well the strategic plan had been received by the community, and to gather
suggestions on what goals the university should pursue in the future. These comments
and actions, along with the other reports generated (particularly that of the review
committee), illustrate that University A has channeled great effort toward a systematic
evaluation of its strategic plan. This characterization is also borne out in the fact that the
data points for this particular case study readily correspond with the logic model
components. With this in mind, the comments of one interview participant are
particularly apt: "It has been quite an exciting six years. This is one of the few
strategic plans that I have had the pleasure of working on and implementing that has
worked."


University B
Like the first institution, University B has risen to regional and national
prominence in recent decades. It features an annual budget of $3.7 billion and enrolls just
over 50,000 students. And like University A, it, too, has recently turned to strategic
planning to solidify and enhance its position in the landscape of American higher
education. Created in 2000, University B's strategic plan called for an investment of $750
million over the next five years focusing on four key goals: 1) becoming a national leader
in the quality of academic programs, 2) being universally recognized for the quality of
the learning experience, 3) creating an environment that truly values and is enriched by
diversity, and 4) expanding the land-grant mission to address society's most compelling
needs. University B hoped to fund the plan from four sources: additional state support,
funds redirected from university resources, private fundraising, and increased tuition.


Sources of Evidence and Logic Model
Among the sources of evidence for University B are the following public
documents: news releases, speeches, the strategic plan, the annual updates, the annual
reports on strategic indicators, the strategic plan website (with attendant documentation),
as well as the websites of the Board of Trustees, the president's office, and the Office of
Institutional Research (see Appendix H). Another source of evidence is the Strategic
Planning Evaluation Survey (see Appendix C). It was e-mailed to thirty-seven upper-
level administrators and deans at University B, resulting in nine completed responses, a
response rate of 24% (see Appendix L). In the open-ended question asking respondents to
indicate the area in which they were employed, four responded that they were in
administrative units and five replied that they were in academic areas. Three upper-level
administrators were also interviewed at University B. These participants were from the
Offices of the Senior Vice President for Business and Finance, the Provost, and
Institutional Research (see Appendix I). The logic model evidence matrix of University B
(Figure 4.3) provides a visual reference of how strong or weak each of the research
sources was for the logic model components (see Appendices J and L).




Input (reflections on the use of resources, including human, financial, organizational, and community resources, invested in the strategic plan initiative): document analysis, 10 references; survey, four items with responses ranging from 67% to 100% for "somewhat agree" or "strongly agree"; interviews, 7 responses.

Activities (review of the processes and actions that were an intentional part of the implementation, such as tools, technology, and services): document analysis, 15 references; survey, six questions with responses ranging from 55% to 100% for "agree" or "strongly agree"; interviews, 11 responses.

Output (whether the evaluation methodology included gathering data or results on the activities): document analysis, 32 references; survey, three questions with responses ranging from 78% to 89% for "agree" or "strongly agree"; interviews, 13 responses.

Outcomes (short-term and long-term outcomes consisting of specific changes resulting from the outputs, including attitudes, behaviors, knowledge, and/or skills): document analysis, 18 references; survey, four questions with responses ranging from 78% to 100% for "agree" or "strongly agree"; interviews, 10 responses.

Impact (fundamental intended or unintended changes occurring in the organization or community): document analysis, 5 references; survey, seven questions with responses ranging from 89% to 100% for "agree" or "strongly agree"; interviews, 16 responses.

Figure 4.3: Matrix of University B's Evidence by Logic Model


Evaluation Methodology
A public document featured on the President's website stated that University B
had established "a set of institutional goals and a matrix for evaluation" for its strategic
plan. Even though this was the one and only time a formal evaluation was mentioned,
89% of those responding to the present study's Strategic Planning Evaluation Survey
indicated that specific activities and outcomes had occurred at the university through an
evaluation of the strategic planning results. Most of those interviewed, however,
described this evaluation as an informal process. One described it this way:
I think it happens because in the sessions at the leadership retreat, if we thought
that it was on the wrong track we would just change it. So it is really not the same
as doing thoughtful evaluation. I think it is just informal, really.

The strategic plan was originally a five-year plan, but was subsequently extended and
expanded for another five years. No formal evaluation was carried out after the first five-
year period, and the university is currently in year seven of this overall strategic plan.
Commenting on the plan's timeline, one interviewee casually remarked: "That plan was
approved in December 2000, and there are some thoughts about having the plan
reviewed, going through some type of evaluation process. But that hasn't gotten very far
yet." Another added:
Well, our strategic plan is pretty generic, which is probably why we've been able
to have the same one in place for seven years. Saying that we're going to have a
"world-class faculty" is kind of like motherhood in a research university. So I
don't think that we've really gone back and evaluated the academic plan. It's just
generally accepted that these are the goals that we want.



Measures. The four goals were addressed through six characteristics and
fourteen supporting objectives. Each of these fourteen objectives had a
general implementation plan and cost associated with it. Each year an update report
provided information about the progress made toward achieving the goals. A different
report, named the Scorecard, compared University B to its benchmark universities, nine
aspirational peer institutions. Within this report, thirty-nine measures tracked the
progress made on the six characteristics of the plan.
All of the survey respondents agreed that data collection and analysis were an
integral part of the institution's strategic planning process. The strategic plan measures
were often brought up in the interviews and described as the sole process of evaluating
the university-level plan. One interviewee reiterated that "[the director of institutional
research] presents institutional-level metrics that we compare ourselves with to our peers.
That contextualizes the success, progress, achievements, [and] objectives of the academic
plan." The measures had stayed relatively the same since their inception, with only minor
changes, as reflected in the following observations:
The [strategic] plan measures haven't changed. We might have tinkered with
them a little bit, just because sometimes the data change. [For example], with
student surveys, we changed from doing the HERI College Student Survey to
doing NSSE. So you have things like that, but we're trying to get at the same
issues, basically.

In all cases, I would say that the changes have been evolutionary, as people have
learned more about the process of measuring performance, collecting metrics, and
all of that.

Another interviewee said, "One thing people have all realized is that it is the plan that
works. So we haven't made any changes at the macro level. The academic plan is still
pretty much the same academic plan that we have always had."



Dialogues and Other Related Methods. Another way in which University B
measured its performance on the strategic plan was to engage in discussions with the
colleges. Colleges scheduled a biennial dialogue with the provost and senior leadership
to ensure that they were engaging in activities that supported the overall strategic plan, as
well as to determine their priorities for the following two years. The biennial dialogue
used data situated around the components of the strategic plan to solicit questions,
comments, and/or observations. This prompted each dean to develop a two-year
leadership action plan for his or her college and for himself or herself as the leader of the
college. The provost linked these leadership plans to the annual performance review
of each dean.
Each academic department and administrative support unit was to incorporate and
integrate, as deemed appropriate, the strategic plan goals into its own respective plan.
Support units submitted performance indicators annually and their budget requests were
evaluated with regard to their support of the strategic plan. Academic and support units
engaged in program reviews and had to demonstrate the ways in which they had helped
the university to achieve its goals.


Reports. Each year the institution issued a progress report on its measures. An
annual report provided information on progress made toward achieving its goals,
including specific information by goal and strategy. An additional report, the Scorecard,
compared the university to its aspirational peers according to a set of benchmark
measures. Each year, both sets of reports were made available on the university's main website.


Of those responding to this study's survey, 89% agreed that the institution had a formal
process for gathering and sharing internal and external evaluation data. As one
interviewee described it: "The progress should really show up on the [strategic] plan
Scorecard."
Although there was not a specific Board of Trustees meeting identified for a
review of the annual report of the strategic plan, it was often referenced in other speeches
or reports given to the Board. As one person put it, "Everything is always related to the
[strategic] plan." In addition, the president and provost would give a state of the
university address each year to the University Senate and campus community that would
relate activities back to the strategic plan.


Main Evaluators. Although 89% of the survey respondents indicated that they
had evaluated and offered advice about the strategic plan at their institution, only 67% of
the respondents said that faculty, staff, and administrators were actively encouraged to
participate in the institution's long-range planning process. This latter sentiment was
more consistent with some of the comments from the interviews. When asked who
was involved in the evaluation of the strategic plan, most pointed principally toward the
upper administration:
It would be the people at the leadership retreat, and they are sort of passively
involved.

On the academic side, it is the provost, [who] involves the vice provosts in the
various areas, [and] they engage with the deans and the governance structure.



[It involves] all the people that are involved in the leadership retreat, the cabinet,
the council, the faculty leaders, and the Board of Trustees.

On the other hand, the biennial dialogues with the colleges were often mentioned in the
open-ended survey responses. Because the colleges were engaged in this endeavor and
the support units reported on their own performance indicators, also pegged to the
strategic plan, some participants in this research considered this clear evidence that all
levels had been incorporated into the evaluation of the plan.


Leadership Retreat and Agenda. Two years after the plan was implemented, the
president of University B left for another institution. The Board of Trustees clearly
expected the next president to retain the strategic plan, with allowances for adaptation to
the new president's style. This was recounted by one respondent who closely followed this
transition:
The hope is when [the president] came here he would stay five to seven years, but
he didn't. Before [the next president] was hired, we went through quite a bit of
angst to get that strategic plan in place. We didn't want to start from scratch. We
wanted [the next president] to pick it up and move it forward in the direction that
had [a new] imprint on it, but to still be consistent with the broader plan.

As part of that process, the incoming president instituted the annual Leadership
Retreat. Each summer the senior administration would meet at a retreat to determine the
next year's priorities. From this meeting would emerge the Leadership Agenda, which
identified short-term priorities, specific action steps, and resource allocations that would
guide the institution throughout the ensuing academic year. These priorities would not be novel in
and of themselves, but rather be embedded within the strategic plan. One respondent
clarified it thus:
The idea was that you take the [strategic] plan and each year you would pick
something to emphasize, and that would be the leadership agenda. It's more the
tactical implementation part of the strategic planning process.

Another described it this way:

The President holds a leadership retreat annually. ... In the leadership retreat, then,
the President and senior administrators talk about what parts of the [strategic] plan
they want to focus on for this year. So it's kind of become, and this is a cliché,
a "living document," but I think it really is true in this case. So we have this
generic document that we don't really revisit or evaluate; we just accept [it] as being
appropriate goals. Then annually we talk about which ones we are going to work
towards. And so, in a sense, that's a kind of an evaluation. It's a re-thinking about
priorities and what's important, given the new context or whatever's happened.

So although they had a strategic plan, the Leadership Agenda guided the institution's
immediate emphases. It was the Leadership Agenda that was most discussed and acted
upon. Therefore, even if the strategic plan goals were often described as broad, the
Leadership Agenda helped to identify more specifically focused action items.


Leadership
Typically, strategic plans are spearheaded by the president, who is strongly
influenced by the Board of Trustees and the Provost. These leaders direct the rest of the
university to engage in a strategic plan. If there are significant changes in this leadership,
the strategic plan is often discarded or radically modified. This is especially the case
when the strategic plan is widely thought to be the creation of the outgoing leader. To
some degree this seems to have been the case at University B, as some interviewees
expressed sentiments that the strategic plan was considered to be part of the president's
endeavor, thus possibly spelling its doom once the outgoing president left. One
administrator noted that "[t]here is some feeling when this strategic plan was put into
place that it was more the President's plan. There was some concern about the level of
consultation that created it." Similarly, a college dean said that "[t]he current strategic
plan was developed, however, with little initial input from outside the President, Provost,
and Vice Presidents, and required some modification after it had been prepared."
Significantly, though, University B had undergone several leadership transitions during
the course of its strategic plan, and was able to retain most elements of the plan and carry
it forward. Respondents summed it up this way:
We have had the same [strategic] plan across two presidents. [The second
president] was brought in to stay the course on this [strategic] plan, but to have
[the] imprint from the leadership agenda perspective.

When [the second president] was brought in, I think [the second president] was
given the plan and the Board of Trustees was pretty instrumental in saying, "This
is our plan, this is what we want to follow."

Interestingly, University B's current president and provost will also be leaving before the
plan's culmination. The common assumption is that the next president and provost will
continue the plan, much as the current president and provost were expected to do.
Reflecting this context, one administrator mused on the future:
Well, it is kind of hard to say that someone would come in and say "world-class
faculty, not important; enhance the quality of teaching, not so much." ... So, I don't
know. It might be a matter of emphasis. One of our goals of the [strategic] plan is
to help build [the state's] future. That is where we think about things like
commercialization, outreach and engagement, all of those kinds of things. Will
outreach and engagement be such a big deal to the next person? Maybe, maybe
not.

Another described the change of leadership this way:



Now, a lot depends on the personal temperament you have for president. So when
[the second president] followed [the first president] ... [egos did not] get in the way
of doing what needs to get done. [The second president] was willing to pick up
what a previous president had done and improve upon it. If this next president is
some kind of an egomaniac, like a Donald Trump type of personality, that can all
unravel pretty quickly. So a lot of it depends on the nature of the leadership. Same
[idea goes for] our provost. [The second provost] was committed to picking up
where our previous provost had gone with that. So hopefully the next provost will
do the same thing.

Driving home the point of hoped-for continuity, one respondent reflected on the situation:
"I think a new leader will want to put [his or her] mark on it. Maybe not say, 'Is this
working?' but 'Are there ways we can expand it or enhance it?'"
Similarly, University B has also undergone a massive transition in its Board of
Trustees. The Board was composed of nine members (most appointed by the governor),
many of whom rotated off each year. However, a year and a half ago the legislature
changed the law and expanded the board from nine to seventeen. Thus, two people out of
the nine began to rotate off under the previous system, but then eight new members were
added, making the majority of the board new within the last eighteen months. One
administrator was philosophical about the change, noting that "this is where you see the
value of [a] plan like this. ... It allows you to at least start on the same page." Another
interviewee was a bit less sanguine:
[W]e are going through a change in our Board of Trustees, and sometimes these
things can guide what things they are interested in or not interested in, and now
we are going through a change of presidents, so my guess is that is going to have
some influence.





Integrated Budgeting and Planning
One news article prominently stated that University B's strategic plan had
established the framework for setting the annual priorities, thus guiding major budgetary
decisions. The strategic plan itself had stated preliminary cost estimates and possible
available resources for support of the plan. It acknowledged that these preliminary
scenarios were between $54.5 million and $117.5 million in continuing funds and
between $400 million and $555 million in one-time funds to support the strategic plan
initiatives. However, one interviewee added this clarification:
It is not really the case that when we built the [strategic] plan we put price tags on
each thing and then looked for ways to fund it all. Because a lot of the plan wasn't
so much about meeting the target, but getting better. There are some areas where
we can attach dollars. ... But whenever we had had specific dollars attached, those
dollars have been allocated. For the most part it has been less specific than that.

The university had expected to fund the plan with state support, funds redirected from
university resources, private fundraising, and increased tuition. However, two events
significantly affected this funding plan. The state economy went into a recession for a
number of years, and the university had to petition the state in order to increase its tuition.
While the state has a tuition cap, University B was nevertheless able to gain a temporary
waiver and increase tuition beyond the cap for two years. The following comments are
representative of those made by administrators:
We got more [money] in some areas and less in others. I did a projection of what
the strategic plan called for, and the biggest change was that we didn't get the
state funds that we thought we would. We adopted the strategic plan in December
of 2000, and the economy crashed the next year. That affected state resources. On
the other hand, we ended up getting more resources in some of the other areas. So
in the end, it was sort of a wash, as I recall.

I think we were a little bit naïve in the amount of resources we would be able to
generate, particularly from the state.



The university had also implemented a new budget model around the same time that the
strategic plan was adopted. As one interviewee commented: "When we were
redoing the budget system, we rightfully said, 'How can you do a budget system if you
don't have [a strategic] plan?' The answer is [that] we are doing those things in parallel."
This new budget model has been described as an adaptation of Responsibility Centered
Management, a type of incentive-based budget. As described in a news release, the new
model restructured the colleges' budgets to ensure their base budgets were aligned with
the needs of the strategic plan; in other words, to create budget practices "with regard to
the allocation of new revenues and provide incentives to reduce cost and generate
additional revenues needed to address academic priorities." There was, however, still a
central pool of money available for financing certain strategic initiatives. Budget requests
were thus evaluated with regard to their support of the goals of the strategic plan. The
resulting situation was described by one administrator like this:
There is a percentage of the central pool that comes off to the provost
automatically, for the Provost's strategic reserve fund. That fund is what is used
to invest in strategic ways in collaboration with the other vice presidents to
achieve the academic plan.

It was noted that the strategic plan helped determine the distribution of resources:
"We use the strategic plan to decide where resources are distributed." Integrating the
planning with the funding was considered a critical component of the success of the plan:
"I think the important part in making this work is aligning resources with the academic
priorities." In fact, all of the survey respondents agreed that the benefits of strategic
planning were worth the resources that were invested.


[In the budget document for the Board of Trustees] It shows how we link our
budget decisions to our academic objectives. We don't spend any money at this
university unless it is contextualized within the [strategic] plan. So this whole
[report] is [strategic] plan related. This is another way we link what we do to the
[strategic] plan.

The level of financing for the strategic plan was reported annually to the Board of
Trustees, and used to help guide the budget process for the following year. Normally, the
provost and the senior vice president for business and finance would make a presentation
on both the financial indicators (which are not a part of the strategic plan) and the
strategic indicators. The presentation was used to help guide and inform the budget
process and the leadership agenda for the following year. It was an annual process tied to
the budget and to the academic planning process. An official with specific responsibilities
for university business and financing concluded, "I think the success [of the strategic
plan] or lack thereof is always contextualized around what our fiscal capacity is."


Culture
Interview participants felt that after seven years of having the strategic plan in
place, it had significantly affected the culture of the university. Keeping the strategic plan
at the forefront and constantly referring to it in communications and in unit-level
planning had made everyone aware of the plan. These two comments expressed it best:
One of the things that was kind of interesting when we were interviewing for a
provost (the last time, before [the current Provost] had the position), another one
of the candidates was on campus and said, "Wow, this is amazing. Everywhere I
go people know what your academic plan is about."




I think one of the biggest culture changes has been that everyone knows a
direction we are all moving in. Giving it a name, knowing that it is up on the
web, you can look at it, the measures are there. We are being really honest; it is
not a PR kind of thing. I think it has built an awareness and caused people to kind
of coalesce around the goals.


Consistency of the Strategic Plan
The interviewees also reflected broadly on the experience of strategic planning at
their campus. Keeping focused on the same plan from year to year seemed to have helped
them remain attuned to their priorities and have a sense of progress. The following quotes
aptly illustrate this sentiment:
So the degree [to] which [a strategic] plan can help focus and also have some
degree of consistency from year to year really helps in moving a place forward
academically, even though you don't have all the resources you think you can
dream up. That has been the real value of this.

You have a bit of tension between the people who like the big hairy audacious
goals and like to pronounce victory by throwing a lot of money at something and
think that is the way to be successful. I think the evidence shows the way to be
successful is to know what you are doing and steadily make progress, so over a
period of the years you make dramatic progress.

These comments are very consistent with the results from the survey question that asked
whether strategic planning had played an important role in the success of the institution.
The majority readily agreed that it had. Yet they also tended to register concern with
regard to future improvement. One administrator singled out a particular area for
improvement in the next strategic plan:
But we don't measure anything about fiscal accountability. So maybe to recognize
that while academics are the purpose of all this being here, there are really
competing non-academic activities that have to happen to make the university
work, and those should be evaluated regularly at the center too, as part of the
same process.



These comments illustrate that even though University B had not undertaken a systematic
evaluation of its plan (particularly regarding finances), at least some stakeholders had
given it careful consideration.
Originally University B's strategic plan was a five-year vision. The president who
initially headed up the initiative was quoted in a news release as saying:
We consider it to be a living document. We don't consider it to be perfect. We
intend to continue to revise and update this plan as we move forward. I encourage
all of you to read it and help us improve it. What didn't we get right?

Considering his subsequent change of professional plans, the university was clearly
obliged to take him at his word, and indeed continued on with the plan long after his
departure, and far beyond the initial five-year vision. Although there was never a
systematic evaluation of the plan, it was informally assessed, as several interview
subjects and survey participants noted. It has thus been possible to link responses
and references found in the documents, survey responses, and interviews to each
component of the logic model, demonstrating that this informal process served as a
default evaluation method.


University C
While its overall budget and enrollment figures ($3.3 billion and approximately
40,000, respectively) generally correspond to the other institutions considered so far,
University C does feature one unique characteristic with respect to the present study:
strategic planning in one form or another has been practiced at this bastion of American
public higher education for decades. In 1997 it established a five-year university-level
plan, followed in turn by a three-year university plan. During the course of this study,
University C was in the final year of its third university-level plan (another three-year
plan), and was gearing up for the next planning cycle. The expectation was that it would
be returning to a five-year cycle. The strategic plan itself set forth six goals:
Enhance academic excellence through the support of high-quality teaching,
research, and service;
Enrich the educational experience of all [University C] students by becoming
a more student-centered university;
Create a more inclusive, civil, and diverse university learning community;
Align missions, programs, and services with available fiscal resources to
better serve our students and their communities;
Serve society through teaching, research and creative activity, and service;
and
Develop new sources of non-tuition income and reduce costs through
improved efficiencies.
The strategic plan process at University C is often referred to as a top-down/bottom-up
process. The top-down characteristic is the administration's mandate that every budget
unit must submit a strategic plan. General guidelines are distributed, asking each unit to
discuss: 1) its future directions, strategies, and measures of performance; 2) its strategic
performance indicators; and 3) an internal recycling plan that describes how the unit will
recycle 1% of its operating budget each year and redirect those funds to the highest
priorities. The bottom-up characteristic stems from the fact that the university-level plan
is only finalized once the unit plans have been reviewed. Each goal of the university-level
plan therefore has multiple strategies shaped by the strategic plans from the various
budget units. After the unit plans have been submitted, they are read, with an eye toward
identifying common themes, which in turn influence the way in which the university-
level plan is composed.
In addition to the university's strategic plan, University C pays close attention to a
set of strategic performance indicators. The same set of indicators has been tracked for
several years now, and the university publishes them in a companion document to the
university-level strategic plan.


Sources of Evidence and Logic Model
In order to study University C's strategic plan, several sources of materials were
obtained and reviewed, including the following public documents: news releases,
speeches, newsletters, journal articles, the strategic plan, the annual updates, and the
annual strategic indicators reports. University websites were likewise taken into careful
account, including those for the strategic plan, the Board of Trustees, and the Offices of
the President, Executive Vice President, Provost, Planning and Institutional Assessment,
and Finance and Budget (see Appendix H). In addition, the Strategic Planning Evaluation
Survey was e-mailed to twenty-six upper-level administrators and deans at University C.
Of the nine that responded (making for a 35% response rate), five stated that they were in
administrative units, two replied that they were in academic areas, and two did not
answer the open-ended question regarding their employment area (see Appendix
M). Three upper administrators from the Offices of the Senior Vice President for Finance
and Business, as well as Planning and Institutional Assessment, were interviewed on the
campus of University C (see Appendix I). The executive vice president and provost
declined to be interviewed, stating that the executive director of the Office of Planning
and Institutional Assessment would be more appropriate for the interview.
For University C, the logic model evidence matrix presented below provides a
visual reference of each of the research sources by logic model component (see Figure
4.4 and Appendices J and M).




Input (reflections on the use of resources, including human, financial, organizational, and community resources, invested in the strategic plan initiative): document analysis, 6 references; survey, four items with responses ranging from 78% to 100% for "somewhat agree" or "strongly agree"; interviews, 5 responses.

Activities (review of the processes and actions that were an intentional part of the implementation, such as tools, technology, and services): document analysis, 11 references; survey, six questions with responses ranging from 78% to 100% for "agree" or "strongly agree"; interviews, 12 responses.

Output (whether the evaluation methodology included gathering data or results on the activities): document analysis, 26 references; survey, three questions with responses ranging from 89% to 100% for "agree" or "strongly agree"; interviews, 8 responses.

Outcomes (short-term and long-term outcomes consisting of specific changes resulting from the outputs, including attitudes, behaviors, knowledge, and/or skills): document analysis, 26 references; survey, four questions with responses ranging from 78% to 100% for "agree" or "strongly agree"; interviews, 11 responses.

Impact (fundamental intended or unintended changes occurring in the organization or community): document analysis, 15 references; survey, seven questions with responses ranging from 78% to 100% for "agree" or "strongly agree"; interviews, 12 responses.

Figure 4.4: Matrix of University C's Evidence by Logic Model


Evaluation Methodology
Due in part to the top-down/bottom-up structure of strategic planning at
University C, the most common response among survey and interview respondents
regarding the evaluation process was that to evaluate the university plan, one really
needed to go back to the unit-level plans to determine whether articulated goals had been
accomplished. One interviewee provided this illustration:
Each strategy in this [university] plan comes from a unit-level plan. For
example, "advance excellence in legal education" was in the [law school's]
strategic plan. So at the end of the cycle, you go back, and when you read the
unit-level plans again, you'll find out if [the law school] did this or not.

The unit-level plans thus have strategic performance indicators that resonate with the six
overarching goals of the university plan. It was explained that since the university goals
are broader than the unit goals, the strategic performance indicators are more
meaningful at the unit level. Therefore, to examine the effectiveness of the university
plan, one must review the units' strategic performance indicators and the data directly
relating to those indicators. An administrator reiterated this by saying that "[t]he strength
of the university plan really rests within the unit-level plans and how they measure
progress toward goals."
The ideal of this integrated approach notwithstanding, several stakeholders at
University C have described the evaluation process for the university-level strategic plan
as informal at best and nonexistent at worst. Most often, the evaluation process was
referenced primarily at the unit level. The Education Criteria for Performance Excellence
from the 2004 Baldrige National Quality Program were often cited in internal documents
to help units evaluate their plans. These evaluations were usually tied to the preparation
and development of the next strategic plan for the unit. In creating their next plans, the
provost's guidelines instructed units to discuss their current internal and external
challenges that had been identified through benchmarking, stakeholders' feedback,
market analysis, and other assessment data. In various presentations to the Board of
Trustees, units made it clear that they regularly take into consideration student interest,
faculty expertise, developments in the discipline, and societal and state needs when
reviewing their plans. Significantly, the only comment to be found in the documents
regarding the evaluation process for the university was in a speech from the president to
the Board of Trustees, in which he said that as the campus was entering the final year of
the plan, the university was "reflecting on the results." Other than that simple statement,
the evaluation process at the university level was not mentioned in any of the other
documents reviewed for this study. Yet within the survey itself, 78% of respondents
agreed that specific changes had occurred through a systematic evaluation of the plan's
results. Based upon this information, one might say that the evaluation for the university-
level plan could be characterized as an informal process of tracking overall performance
indicators. The following quotes are representative of the comments made by the
interview participants:

We don't have a process for evaluating how well we are performing at the university
level. We do have it for the unit level. Our trustees are aware of this, and they see
it; they sort of know what's in it. It's not a formal step.

We have evaluated the planning process, and we have evaluated the unit plans,
but I'm not sure if we have evaluated this [university-level] plan.

I think we can do a better job at systematically evaluating the plan. We are
constantly looking at our strategic indicators. Based on how the plan is developed,
it really is at the unit level.



Well, it is not the evaluation of [the] plan so much as it is a statement of whether the
goals that were outlined in the plan have been accomplished. It's not whether it
was a good plan because it had a mission, a vision, and a value statement; it's
"Did the unit do what they said they were going to do?" That is really
determined at the unit level.

To be honest, I don't know how important it is to assess, at the university level,
whether we are achieving these goals. They are such high-level goals. Are we
aligning our missions and programs at the university level, or are we creating a
civil and inclusive climate at the university level? It is so big and broad that it is
hard to know.


Measures. In several documents it was stated that the strategic performance
indicators at University C were an attempt to answer the question, "How well are we
doing?" The indicators, or measures, are tracked to chart progress toward meeting the
university's goals. In addition, each unit prepared its own strategic performance
indicators relating to both the university and unit goals. The strategic performance
indicators are analyzed every year.
All of the survey respondents agreed that University C had a formal process for
gathering and sharing internal and external data. It seemed that the process by which data
on indicators were collected, analyzed, and reported was well defined. Therefore, it is not
surprising that most interviewees mentioned the indicators as a way to assess the progress
of the plan. Even the brochure for the performance indicators stated that the measures
were a way to "carefully evaluate and improve [the university's] performance." One
person commented in the interview:
Well, what we would say is that we track the strategic performance indicators
every year. That is how we evaluate the plan. ... You've seen the strategic
performance indicators. They are mostly input and output; there are not many
outcome measures. We could do a better job on that for sure.


Others likewise commented that even though the indicators were typically considered to
be the best way to track progress, it was recognized that other methods might exist,
although none were mentioned. For example, one individual said, "I would say at the end
of the day we are not hamstrung to say those metrics alone are the only way that we judge
how well we are doing." Apparently some had maintained that such indicators were the
best, and perhaps only, way to measure progress. An administrator commented:
Yeah, there are people who would refute whether the metrics are really giving a
true indication of whether you are making progress or not on a goal. You could
debate that. That's fair.
However, the indicators do seem to have remained the favorite method by which to track
progress, as they have now been used for almost ten years, and have since increased in
number. One interviewee stated it this way:
We have a set of strategic indicators, which evolve over time. I think there are 28
of them [in the previous report]. So what has happened with these strategic
indicators [is that] they just keep growing all the time. I think we have something
like 40 now. At some point it stops being strategic. You are just measuring
everything, like a university fact book.

Another commented:

I don't think we have kept the same indicators, but we have had this approach for
a long time that there will be indicators that are organized around the goal. In that
way the approach has been similar even though the indicators have multiplied.


Main Evaluators. When asked in the survey who the main participants were in
the evaluation of the strategic plan, the majority (57%) said that it was the
upper administration. This corresponds to the comments made by the interviewees as
well:



The provost [and the Office of Institutional Planning and Assessment]
I would say informally the leadership of the university. But formally, I would say
no, there is not a process for that.
A lot of the evaluations of the plans and other high-level decision-making
depend on what the President is observing, people like that.
Perhaps, since most did not know of an evaluation process for the university plan, or
thought of it as an informal process, it was not well understood how the process was
carried out and who the main participants were.


Reports. The strategic plan and the strategic indicators reports were easily
available on the web. Also, both reports were highly polished, and their paper brochures
were of professional quality. Yet whether an annual report or presentation was given to
the Board of Trustees was not evident. The only official Board of Trustees statement
regarding the strategic plan was the approval of the plan in the minutes of the board
meeting. One administrator described it this way:
We share, of course, the strategic plan with the Board ... [and] get the Board's input as
we draft the plan. The Board sees it and gives us their reaction and input to it. So
the Board has some involvement and certainly awareness of the plan. But to
answer the question, no, we don't go to the Board every November with a specific
report on the plan.
There were other presentations or reports that mentioned working toward the goals, such
as the State of the University address. Also, the senior vice president for finance and
business mentioned that his presentations about the budget to the Board included how the
university strategic plan had guided the budget decisions. He said it is "more linked with
the discussions about budget and the priorities of budget and capital than it is an isolated
report."


Integrating Budgeting and Planning
The strategic plan does not specifically identify exact levels of investment or
what the investment elements might be. The funding is described as following the
directions laid out in the plan. There are central monies that support the strategic
initiatives. For example, since there is a major maintenance backlog, every year some of
this money is used for maintenance.
In the past, the university had required a small percentage of the units' total
budgets for the central pool. Because the university plan was a conglomerate of the unit
plans, some of the central monies were also used to support unit plan initiatives. One
administrator articulated it in this way:
There were the president's strategic initiatives. There was a central pool of money
to support the life science consortium, the materials consortium, the
environmental consortium, and what you would see in the plans is that positions
could be co-funded and you [could] get money centrally for some of those
positions. So there was a relation between central monies and what you saw in
[units'] strategic plans.
For some time, the university had a planning council made up of deans, faculty
members, administrators, and some students. The planning council would review each
unit plan, consider the requests for resources, and recommend funding. Although this was
perceived as a powerful committee, the actual funding amounts were small in relation to
the total university budget. The executive director of planning stated it like this:


The other important consideration with all this is we don't have that much
discretionary money anyhow. If people were looking to planning primarily as a
way to get more money, to affect the allocation of resources, that was the wrong
way to think about it. They should be thinking about writing a plan for
themselves, not because it is going to have any impact on how the provost or
president decides to spend the central monies.
The committee was disbanded in 2002, due to a restructuring of the review process.
Given its limited state resources, University C had to exercise internal budget
reductions and reallocations to fund strategic priorities and critical operating needs. More
recently, it has changed this practice to ask each unit how it would internally recycle at
least one percent of its operating budget each year. Units are no longer required to
reallocate funds for the central pool. Guidelines to the units stated that the internal
reallocation would continue for five years. However, given the unpredictability of
state funding for higher education, there could be a need for central reallocation in
some years.
In the budget presentations to the Board, the budget was referenced to the
strategic plan. Beginning in 1992, a deliberate process of budget recycling and
reallocation was incorporated in the strategic planning process. Since that time, in line
with its budget priorities, the university has reallocated funds from its administrative and
support activities toward the university's goals. In a news release in 2000, the provost stated:
Budgetary information is factored into the planning process to provide a realistic
context for proposed changes and initiatives, and strategic plans are taken into
account directly in budget allocation decisions.
By the time that news release came out in 2000, the university was in its ninth
consecutive year of internal budget reallocations, and that practice continues today.



Culture and Leadership
The provost, in a 2001 news release, stated that over the years strategic planning
had become "very much a part of the university's culture. I'm pleased with the attention and
care that the university community has given to this process and I'm proud of the
outcomes we're seeing." After so many years of practice, strategic
planning had become one of the university's customs. One person described
the situation this way:
There is definitely a culture of planning here. I've been around here for 26 years,
so I was here before it all started. I can remember at the beginning there was a lot
of skepticism, "isn't this one more management fad?," or that they would wait this
out. Or people didn't really understand it. That is just not the case anymore.
People know that this is just how we manage the university. Like it or not, they
are going to be involved in planning. That is just the way it happens, and partly
this has been the experience people have gone through; it is like a generational
thing. A lot of people who were here 25 years ago aren't here anymore. Also, with
the people that came in [after that], this is all they have ever known (at this
institution). It is the way we do it.
However, it is more likely that the reason strategic planning has become such a way of
life at University C is that its upper administration has had such a long history with
the university. With the university leadership as the driving force behind strategic
planning, it had become a tradition. An administrator explained it this way:
It is in the culture at [University C]. [The senior vice president for finance and
business] has been here about 30 years. [The provost] has been here 30 years.
[The president] started here as an assistant professor, then became an associate
dean, left and came back. He has been here about 10 or 12 years. What you [have]
got is this culture of leadership. ... I mean, they know the institution so well.








Linking the Strategic Plan to Other Processes
Participants in this study often mentioned that the success of the plan was evident
in the way it had been incorporated into other processes. Two examples of linking the
strategic plan to other processes were the performance appraisal system and
professional development opportunities. One described the relationship as follows:

That may seem a bit circuitous, but if you are looking at the plan, one of the ways
you can tell if a plan has been implemented is [to] look at the training and
development opportunities for people, and whether they are congruent with what
we say is important in the strategic plan. You look at how people's performance is
evaluated, [and ask] is that congruent with what we say is important in the plan?

In an effort to determine if the university was truly following the strategic plan, a survey
was distributed to the faculty and staff every three years to see if they believed that the
strategic plan values were in fact practiced at the university. By linking the plan and its
implementation to each individual's performance appraisal and professional
development, the plan was integrated throughout the university.


Length of the Planning Cycle
Since University C had such a long history with strategic planning, it had tried
different variations of planning, including the length of the planning cycle. For a while,
it did strategic planning on a five-year cycle. However, five years was considered too long
given the environmental changes that could happen during that time period. Therefore, it
moved to a three-year cycle, believing that the shorter planning cycle was probably more
realistic. Yet it was mentioned during the interviews that the three-year cycle seemed to
be too short and required constant planning. There was some disagreement as to how
long the next strategic plan would be. Both five years and three years were mentioned.
In sum, at University C strategic planning had become a continuous process, a
routine, and, over time, units had grown quite proficient at planning. However, the
university-level plan seemed to be still evolving and maturing. Even though a systematic
and periodic evaluation of the university plan was not evident, its top-down/bottom-up
process created an unofficial assessment of the plan. This was evident from references
stated in the documents reviewed and responses given in the survey and interviews.
Though periodic reviews at the university level may have been informal, each
component of the logic model was engaged.


University D
As the fourth and final institution under consideration here, University D likewise
fits the parameters of the study's cohort. Enrolling almost 30,000 students and operating
with an annual budget of just over $1.2 billion, this leading public university put its first
strategic plan into place in July 2003. The plan was generally academic in nature, and
was one plan among many ongoing initiatives at the university. It enumerated six goals:
Provide the strongest possible academic experience at all levels;
Integrate interdisciplinary research, teaching and public service;
Improve faculty recruitment, retention, and development;
Increase diversity of faculty, students, and staff;
Enhance public engagement; and
Extend global presence, research and teaching.
The strategic plan included more than seventy-five action steps and over thirty
illustrative benchmarks. Within the plan, each action step included a listing of the
administrative or academic unit responsible for overseeing implementation. In
conjunction with the strategic plan, another report on the Measures of Excellence
provided additional data points to help track progress. University D was in the final year
of this strategic plan during the course of the present study.
The strategic plan had a five-year timeframe, with a framework interconnected with
all of the unit levels. Each year the provost issued guidelines to the deans and vice
chancellors setting forth expectations for their required annual planning and budget
reports. One of these expectations was a description of how the unit had addressed the
strategic plan's goals in the course of the previous year.


Sources of Evidence and Logic Model
Similar to the other case studies, the following public documents were examined
for University D: the strategic plan, annual updates, data reports, news releases, meeting
minutes, presentations, and speeches. The websites for the following offices were
likewise consulted: the strategic plan, the Board of Trustees, the president, the executive
vice chancellor and provost, the Office of Institutional Research and Assessment, the
Office of Financial Planning and Budgets, as well as the university's accreditation
website (see Appendix H). The Strategic Planning Evaluation Survey was distributed to
thirty-nine individuals who were either on the academic planning task force or were
executive staff, deans, or management staff in institutional research. The survey netted
nine responses, making for a 23% response rate (see Appendix N). Of these nine
responses, four were from administrative units, three were from academic areas, and
one opted not to answer that particular question. The executive associate provost and the
interim director for financial planning and budgets were both interviewed at University D
(see Appendix I). Figure 4.5 below documents the research sources by the logic model
components for University D (see Appendices J and N).



Input (reflections on the use of resources, including human, financial, organizational, and community resources, invested in the strategic plan initiative): document analysis, 3 references; survey, four items with responses ranging from 78% to 89% for "somewhat agree" or "strongly agree"; interviews, 2 responses.

Activities (review of the processes and actions that were an intentional part of the implementation, such as tools, technology, and services): document analysis, 13 references; survey, six questions with responses ranging from 55% to 100% for "agree" or "strongly agree"; interviews, 3 responses.

Output (whether the evaluation methodology included gathering data or results on the activities): document analysis, 30 references; survey, three questions with responses ranging from 44% to 100% for "agree" or "strongly agree"; interviews, 5 responses.

Outcomes (short-term and long-term outcomes consisting of specific changes resulting from the outputs, including attitudes, behaviors, knowledge, and/or skills): document analysis, 20 references; survey, four questions with responses ranging from 56% to 100% for "agree" or "strongly agree"; interviews, 6 responses.

Impact (fundamental intended or unintended changes occurring in the organization or community): document analysis, 5 references; survey, seven questions with responses ranging from 44% to 100% for "agree" or "strongly agree"; interviews, 6 responses.

Figure 4.5: Matrix of University D's Evidence by Logic Model


Some degree of difficulty was encountered in obtaining interview and survey
participants at University D. One possible contributing factor was that, unlike at the
other institutions, the researcher did not have a main campus contact at
University D to help facilitate such interactions. Without this main contact, obtaining
participants and scheduling interviews was somewhat more complicated, in that it
required much more detailed explanation as well as repeated requests. The limited
accessibility of the interviewees did not lend itself to a campus visit, which meant that
interviews had to be conducted over the telephone. In addition, finding participants
willing to be interviewed proved challenging. The executive vice chancellor and provost,
for instance, declined invitations, as did both the assistant provost for institutional
research and assessment and the director of reporting in the Office of Institutional
Research and Assessment. Interestingly, all of these figures simply said that the executive
associate provost would be the best person to speak to regarding the strategic plan. When
asked in his interview whom else he could recommend, the executive associate provost
explained that there had been quite a bit of turnover, making it hard to refer anyone else
for an interview, as most others in key positions were simply too new to be able to make
informed comments on the university's strategic planning process.
Another possible factor accounting for the low level of interest in speaking about
the strategic plan may be that the strategic plan itself does not enjoy a high profile at
University D. Indeed, one of the interviewees even mentioned that the university should
have done a better job communicating and promoting the strategic plan by demonstrating
how intricately it was bound up with daily management decisions. This was also reflected
in the public documents reviewed, most of which simply incorporated the strategic plan
into long lists of initiatives currently being undertaken by the university. For example,
during the same time period covered by the strategic plan, the university was also
pursuing a major development campaign, a financial plan, a project to increase state
resident financial aid, an accreditation enhancement plan, a business efficiency project,
and an improved workplace task force, just to name a few. And whereas in the other case
studies these types of initiatives were usually explicitly tied to the strategic plan, such a
connection was not as evident at University D. In fact, rather than the strategic plan being
the driving force behind these initiatives, it seems to have been thought of more as a
parallel activity; hence, perhaps, the lower interest level and response rate for the
present study. Nevertheless, even if one might have wished for a higher quantity of such
information, this concern is alleviated by the high quality of the sources that were
available.


Evaluation Methodology
Although 44% of those who responded to the survey agreed that University D had
a systematic process of strategic planning, one-third (33%) of respondents actively
disagreed with that statement. In the document review, most references to the strategic
plan were related to the data. Less often cited were specific changes or reflections on
the plan's actual effectiveness. Although the plan had illustrative benchmarks for its six
goals, the emphasis was on the several action steps (or strategies) for each goal. Within
each action step was a series of recommendations that named a responsible office. The
benchmarks provided were described as "examples that could help measure
improvement."


Measures. Most often, measures were referred to when discussing the assessment
of the plan. Eight of the nine survey respondents agreed that data collection and analysis
were an integral part of the institution's strategic planning process. The strategic plan had
benchmarks, and these were listed in a companion report called the Measures of
Excellence. The Board of Trustees decided to develop this report at about the same time
the strategic plan was implemented, in an effort to start obtaining data to track progress.
As noted in the Board minutes, these measures serve as indicators of accomplishments
and quality in education; faculty; staff; public service and engagement;
internationalization; finance and facilities; and athletics. Ten peer institutions were
identified as a reference point for a number of the measures. The Measures of Excellence
report thus tracks data to monitor the university's performance on its strategic plan goals,
as well as additional data for other initiatives and activities. Some of the measures also
serve as illustrative benchmarks for the strategic plan. One administrator described it in
this way:
These are not tied to a specific strategic plan, but are indicators that they came up
with (about forty) about how [University D] is doing. There is some overlap. For
example, both the [strategic plan] and the Measures of Excellence look at
undergraduate retention and graduation rates. But there are some things there in
the Measures of Excellence that go beyond the academic side that have to do with
infrastructure, capital, budgeting, with development of research facilities.



There were other factors besides such measures that played a role in the evaluation
process, including "unique opportunities." These were described as events not necessarily
anticipated at the creation of the strategic plan, but ones nonetheless related to the spirit
of the plan. For example, the plan has a long-term goal of increasing the research
capability of the university. Recently, the chancellor decided that the goal for sponsored
research would be to reach a billion dollars per year. Although that goal is not written
anywhere in the plan, it is consistent with the plan, even if not expressly stated. An
interviewee went on to explain this approach in more detail:
They have to do with the changing economic situation and also with just the
unpredictable nature of things that happen ... Yeah, I think that [it has] to have a
little wiggle-room. It's not going to be enough to say that it's not in the plan, so
we can't do it. There's always going to be opportunities that come up.

Main Evaluators. In the survey, just slightly more than half (55%) agreed that
faculty, staff, and administrators were actively involved in the strategic planning process.
In fact, the strategic plan was rarely referenced in unit-level websites and documents.
One interviewee said that "to a lesser extent, the faculty [are participants], only because
we publicize it, put it on our webpage, and talk about it now and then." The participants
in interviews expressed opinions consistent with this attitude. The only main participants
in the evaluation who were mentioned were the executive associate provost, the provost,
the chancellor, and the Board of Trustees. Most respondents to an open-ended question in
the survey said that just the top administrators were involved; only two respondents
stated that the strategic plan was a campus-wide effort.


It should be noted that the main architect of the plan, the provost, had left
University D after three years to become the president of another institution. Perhaps the
provost had also served as the main driver of the plan, and his absence resulted in the lack
of a unified understanding of the evaluation of the plan. However, as his successor was an
internal hire who had been involved in the plan's development, that person's
comprehensive understanding of the plan no doubt played a key part in the plan's
subsequent evaluation.


Reports. Each year a progress report was presented to the Board of Trustees.
Annual updates of the Measures of Excellence were also presented to the Board at the
same time. These presentations were only made available in an electronic format on the
Board of Trustees' website; published paper reports or data reports were not distributed to
the campus or the community. Each year the chancellor would give a state of the
university address that occasionally mentioned the strategic plan, but, notably, none of
these addresses expressly highlighted it. Reporting of progress on the strategic plan was
essentially directed to the Board only. As one interviewee put it:
So those are really the two things that we use -- the annual update or benchmarks
on the five-year [strategic] plan and the annual Measures of Excellence, both of
which are done in the form of reports to the Board of Trustees with presentations.
Apparently, then, this style of reporting proved ineffective in informing the campus and
community of the scope and purpose of the institution's strategic plan. The Board of
Trustees may have been well informed, but the rest of the stakeholders were not. This
state of affairs was clearly borne out in the difficulty encountered when trying to get
people to talk about the strategic plan at University D. Specifically, those individuals who
declined to be interviewed all stated that they did not have sufficient knowledge to
respond to this research study.


Integrating Budgeting and Planning
From the beginning, University D's strategic plan was developed to further refine
the planning and budget process. One early draft stated that the plan would serve as a
primary guide for resource allocation. It was expressly noted in several documents that
the strategic plan would shape budget decisions. One news article similarly stated,
"While the plan was shaped by many different hands, it is bound together in a common
purpose shared by all: to tie budget decisions made each year to the University's overall
goal of becoming the country's leading public university." The chancellor reaffirmed this
view in a news release, proclaiming that this was indeed one of the purposes of the plan.
A news article in the campus paper described it this way:
Considering the fact that this is the first [strategic] plan that has ever been done at
[University D], it's reasonable to assume not everyone knows what it is supposed
to do, or more important, appreciates the central role it will play on campus
through most of the decade. It is about money and mission and how best to
connect the two in the allocation of limited resources over the next five years.

And yet, when promoting the strategic plan, the provost reported in a news article
that the campus community should view the plan as more than just its implications for
budget allocations for academic priorities. To that end, he added, the strategic plan was being
implemented in tandem with a five-year financial plan developed by the Office of
Finance and Administration. The financial plan was described as a framework to inform
the campus about university-wide goals. This financial plan also would provide an
alignment of resources to meet the highest priorities in future budgets. The financial plan
identified funding options, such as private gifts and endowment income, student tuition
and fees, enrollment-growth dollars allocated by the state, and monies generated by
research. While admitting that it was impossible to know for certain that the funding
would occur as predicted, the provost believed the assumptions were conservative. As it
turned out, however, the funds received did not match expectations, leading to notably
mixed results in the area of funding the plan. Fundraising offered specific examples of
both. The plan called for expanding the honors program, for instance, but due to
inadequate monies raised, that goal could not be met. In other cases, though, the opposite
proved true. For example:
We got a $5 million gift from a donor, and there was a [strategic] plan initiative
about faculty development. We took the $5 million (it was generally stated to
support faculty) and used it specifically to implement some goals that were set
forth in the [strategic] plan for faculty development.

In a state of the university address, the chancellor said, "The Board of Trustees was
strong in its determination that we really put our money where our mouth is: that we are
clear and direct in acquiring and moving resources to support our highest priorities."
Therefore, the university-level strategic plan instructed the units to include resource
allocation plans with their strategic plans. These reallocations would then be used to
support targeted strategic plan initiatives. The provost was quoted in a news article as
having remarked, "In the annual budget process, I ask each dean, vice chancellor, and
director to address the six [goals] of the [strategic] plan and relate these to their budget
requests." A planning task force was developed to manage both the solicitation and
evaluation of the unit proposals. It was the planning task force's responsibility to decide
which of the proposals should be referred and recommended for funding, based upon the
innovation of each proposal and how compatible it was with the goals of the strategic
plan. Through an annual budget and planning process, each of the deans and vice-
chancellors then met with the provost, executive associate provost, the associate provost
for finance, and the vice chancellor for finance and administration. The template for the
unit report included the relationship of each unit's plan to the six goals of the university
strategic plan, the steps to be taken in order to accomplish stated goals, and the
budget requests needed to help implement them. Some units were successful in their
requests and duly received funding; others were not. In his interview, the executive
associate provost observed: "Conversely, programs or units that don't fit directly
with the [strategic] plan and don't seem to address it, we actually cut their budgets."
One very important result of integrating the strategic plan with budget requests
was an increased understanding across the university community with regard to budget
decisions. The following comments were made during the interviews:
A specific change that has occurred is a much greater transparency and
understanding of the budget process. That's really important. It used to be that it
was like a black box: requests would go in, and stuff would come out, and
nobody would know how the decisions were made. I think one of the best things
[the provost] was able to accomplish was that, even in a time of budget cuts,
people understood why cuts were made and the criteria used.

People understand the budget process because it's open, it's transparent, we have
criteria. We say, "Here's how we're going to make these decisions."
Interviewees also mentioned that increasing the understanding of the budget decisions
served to improve the relationship between the administration and faculty. Moreover, by
tying the strategic plan and the budget together, the odds were greatly increased that the
plan would actually be implemented and used:
The concern that [the provost] had was that [it] would just sit on the shelf. And I
think the way we've avoided that is by tying it, making it an integral part of the
budget and planning process, the annual budget process. So that was really
important.
In sum, the greatest result of the plan seems to have been the integration of planning with
budgeting, and how that development has facilitated an increased awareness of the
overall process.


Communication
One of the explicit goals of University D's strategic plan was to foster
communication across the campus, and particularly to encourage campus constituents to
engage in an ongoing dialogue in order to ensure the plans steady progress and to
monitor any new developments that might impact it. Despite this desire, however, it is
clear that communication efforts foundered, resulting in only modest references to the plan in
public documents and even, perhaps, in the unwillingness of faculty and staff to
participate in this study. The lack of reports and publicity did little to engage the campus
community. As one interviewee observed:
I don't think we did enough to communicate to the students and to the faculty on a
regular basis about how strategic planning was being used to shape the campus on
kind of a day-in, day-out basis. I think we do a really good job of communicating
that up to the Board of Trustees. We will get little articles in the in-house
newsletter, or the student newsletter, but not enough. That's something we should
have done more.

In fact, an equal number of survey respondents (44%, N=4) both agreed and disagreed
that strategic planning played an important role in the success of the institution. In
addition, two people even agreed with the statement that the strategic planning activities
did little to help the institution to fulfill its mission. None of the other universities'
survey responses in this research were so demonstrably divided. The lack of
communication may therefore have contributed significantly to the lack of a unified
understanding of the plan and its outcomes.
In sum, strategic planning was new to University D. Even though the university was in
the last year of the plan, the plan had begun only four years earlier. Poor communication,
limited public presentations, and the fact that the plan had to compete with other
initiatives all seem to have factored into the participants' nebulous understanding of the
plan's evaluation process. But even if there is no evidence thus far of a systematic
evaluation of the strategic plan at University D, a strategic plan initiative, as it evolves
and matures in coming years, will most likely continue to have a major presence at the
university.


Cross Case Analysis
The following provides an analysis of all four cases. First, the results are
examined within the logic model components. Second, the key themes that emerged from
the cases are discussed.


Summary and Discussion of the Results within a Logic Model
Data from each of the four case studies were evaluated and sorted into the five
components of a logic model. These components are: input, activities, outputs,
outcomes, and impact. Both the survey and interview questions were linked to the logic
model components, thereby facilitating the coding of the results of the research. The
documents were also reviewed and coded into the appropriate logic model component.
The data were then tallied for each component to show how strong or weak the research
sources were for each one of the logic model components. For each case study institution,
a logic model evidence matrix was developed. The following is a comparison of the four
cases using the evaluation matrix (see Figure 4.6).



Cases

Logic Model Component | University A | University B | University C | University D

Input | 19 references; survey responses ranging from 72% to 100% in agreement | 17 references; 67% to 100% | 11 references; 78% to 100% | 5 references; 78% to 89%

Activities | 13 references; 77% to 94% | 26 references; 55% to 100% | 23 references; 78% to 100% | 16 references; 55% to 100%

Output | 23 references; 83% to 94% | 45 references; 78% to 89% | 32 references; 89% to 100% | 35 references; 44% to 94%

Outcomes | 40 references; 83% to 94% | 28 references; 78% to 100% | 37 references; 78% to 100% | 26 references; 56% to 94%

Impact | 30 references; 83% to 94% | 21 references; 89% to 100% | 27 references; 78% to 100% | 11 references; 44% to 100%

(Each cell gives the number of document references and the range of survey responses in agreement.)

Figure 4.6: Comparison Matrix of Each Institution's Evidence by Logic Model
The logic model provides an inventory of the resources and activities that lead to the
relevant results, thus serving as a method for assessment. By monitoring an entire
university's evaluation process, the logic model can help determine whether the
institution has assessed all aspects of its strategic plan. It also identifies areas of strength
and/or weakness.
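
The tallying step lends itself to a brief illustration. The following Python sketch is purely hypothetical (the study does not describe using any such script); it simply shows how coded references, once labeled by logic model component, might be counted to reveal strong and weak components:

from collections import Counter

# Each coded piece of evidence is represented here as a
# (source, component) pair; the entries are hypothetical examples.
LOGIC_MODEL_COMPONENTS = ["input", "activities", "output", "outcomes", "impact"]

coded_evidence = [
    ("document", "output"),
    ("interview", "outcomes"),
    ("survey", "input"),
    ("document", "output"),
]

# Tally references per component; sparse components flag possible
# gaps in the institution's evaluation of its strategic plan.
tally = Counter(component for _source, component in coded_evidence)
for component in LOGIC_MODEL_COMPONENTS:
    print(f"{component:>10}: {tally[component]} reference(s)")

A matrix such as Figure 4.6 is simply this count repeated for each institution.
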
A basic comparison of the institutions' results can be made by reviewing the
amount of materials, references, statements from documents, open-ended survey
questions, and interviews for each logic model component (see Figure 4.7).
[Figure 4.7 is a bar chart plotting the number of references (0 to 50) for each logic model component (Input, Activities, Output, Outcomes, Impact), with one bar per institution (A, B, C, and D) for each component.]

Figure 4.7: Universities' Results Charted on the Logic Model
For example, three of the four institutions had the fewest references in the input
component. This demonstrates a lack of emphasis in the evaluations on reflections about
the relational issues invested in the strategic plan initiative and the appropriateness of the
mission and vision. Similarly, a comparison within the activities component shows that
there were more references than with the input component. However, the amount of data
for the activities component was limited. Assessments of the processes or actions of the
strategic plan initiative (as categorized by the activities component) were occasionally
referred to in reviews or modifications of internal business practices. When reviewing the
distribution of the references within the logic model, the impact component is a middle
value. Although evaluating the impact of a strategic plan seems like it would be the
primary factor, for the four institutions studied it was not. Perhaps the evaluations of the
strategic plans were too young, and too focused on the present, to allow for or
encourage reflection on long-term consequences.
evaluations in these four case studies was in either the output or outcomes components of
the logic model. The interview participants at these institutions certainly had little
difficulty in rattling off the progress made on the measures and emphasizing the
immediate outcomes reflected in those positive numbers. Reports on the performance
measures were typical, and the publication materials and presentations gave specific
examples of changes that had occurred at the institutions because of their respective
strategic plans.
When comparing the survey results of the four case studies, some differences
emerged among the cases in the levels of agreement with the survey questions. For
instance, University D had the lowest level of agreement with the survey questions in
all of the logic model components, except for the questions linked to the input component.
(For these input questions, University D tied with University C in having the highest level
of agreement.) The lower agreement levels at University D suggest a lack of consensus
among the participants and indicate that the evaluation process is not as developed as it
is at the other three institutions. By
contrast, University B had the highest level of agreement with the survey questions. This is
not surprising, however, given the fact that strategic planning had been part of this
institutions culture for many years, and that most of the survey participants had a long
history with the institution.


In comparing the four institutions, University D had the fewest materials,
references, and statements; yet it followed the same pattern of distribution across the
logic model as the other institutions. The smaller number of University D's references
was most likely due to the fact that it had started its strategic plan four years earlier and
therefore had the least amount of time to evaluate its plan. Conversely, University B had
the largest number of references for its evaluation process and followed the same pattern
of distribution. Using the logic model as a template in the review has thus provided a
more comprehensive overview of the evaluation process of the strategic plans.


Summary and Discussion of Common Themes
The results of the data analysis showed that although each case was distinctive,
there were common evaluation methodologies and themes among all of the institutions.
The following section describes and analyzes these common methodologies and themes.


Measures. The prevalent evaluation tool for all of the institutions was a heavy
reliance on measures. These performance measures were most often cited as evidence of
their evaluation. Obviously, the use of measures is an effective and demonstrative way to
show change. Yet more often than not, these measures were quantitative in nature; less
emphasis was given to qualitative data. Perhaps this is because, given today's increasing
interest among higher education institutions in using dashboards (visual representations
of the most important performance measures), qualitative data does not lend itself to
that format.
Omitting qualitative data from the measures, however, amounts to a missed opportunity
to illustrate other important aspects of the institution. It is important to have performance
measures, but the measures themselves are not the only way to evaluate a plan.


Main Evaluators. Another aspect of the evaluation methodology common to all
institutions studied was a strategic plan committee. In each case, the committee was
formed only for the creation of the plan and was neither responsible for, nor a participant
in, the evaluation of the plan. The one slight exception to this was University A. Not only had
the university initially convened a task force for the creation of the strategic plan, it had
also assembled a second committee, called the Strategic Plan Review Committee, in the
beginning of the final year of the plan. The strategic plan review committee's sole
purpose was to focus on providing a retrospective view of the effectiveness of the plan
and a prospective view of the priorities and characteristics to be considered for the next
plan. In addition, University A included community visits as another evaluation
technique. These community visits were another tool in the methodology to get feedback
from the stakeholders in the city, region, and state. University A thus recognized the
importance of including in the evaluation process representatives from across the campus,
as well as other stakeholders, as a vital addition to the upper administration itself. The
evaluation should also include some of the people who developed the plan. This
representational evaluation can be achieved in various ways, such as through a review
committee or task force, surveys, focus groups, or open forums, to name just a few.
Asking stakeholders for their reflections, thoughts, or satisfaction on the progress made
toward stated goals creates a discussion of strengths and weaknesses. The importance of
making this inclusive and part of the strategic planning process was generally recognized.
That said, it remains true that the main evaluators identified for
each institution were for the most part members of the upper administration. All levels of
the universities and other stakeholders may have been involved with the development
and/or implementation of the plan, but may not necessarily have been consulted in its
evaluation. Only University A included within its evaluation a review committee and the
president's community visits (i.e., visits to regional cities, towns, and businesses to
communicate the progress made on the strategic plan). But even so, these were in
addition to the review by upper administrators. The upper administration and Board of
Trustees were generally the primary or principal evaluators of the institutions' strategic
plans.


Communication. The most common way of communicating the evaluation of the
strategic plan was the tracking of measures at each institution. These results were
normally disseminated through publications, most notably annual reports. In each case an
annual report was indeed disseminated, although with variations among institutions with
regard to intended audiences and distribution. For example, University A presented its
annual report to the Board and distributed it to stakeholders. It was widely available
throughout the campus and to the community. By contrast, University Ds annual
presentation was primarily for its Board of Trustees and was not circulated beyond the
Board. Also, the presidents at Universities A and B both included in their public talks


143
progress made on the strategic plans. In contrast, the presidents at Universities C and D
mentioned the strategic plans more in passing in their addresses, not particularly
highlighting them as the impetus for change. Thus, even though the institutions saw the
value in communication, their approaches to it were very different. These differing
communication styles contributed to the differing results.


Culture. In three of the four cases, participants stated that strategic planning had
become ingrained into the culture. At Universities A, B, and C, participants often cited
how the culture had changed due to the adoption of a strategic plan, and how the
institutions' employees had shifted their attitudes and behaviors and moved toward
accepting the plan and working toward a shared vision. "Data-driven" and "strategic
priorities" were common phrases used by these participants to describe the new way of
thinking at their institutions.
By contrast, University D's strategic plan had been introduced only four years
earlier, and its influence on the culture was never mentioned. It may have been that the
strategic plan at University D had not had sufficient time to become ingrained into the
culture of the institution. Or it could be that since the strategic plan was rarely highlighted
in communications (at least in comparison to the other institutions), it did not become as
prevalent in the conversations and customs at large. At this institution, the strategic plan
had a lower profile and did not seem to generate very much attention or recognition of its
wider importance.


As discussed previously in this chapter, the culture of strategic planning can
influence its effectiveness, particularly in encouraging participation and collaboration.
Using elements of planning such as implementation, resource allocation, and the
integration of budgeting and planning, a new mindset can be created, becoming pervasive
across the campus. Establishing this culture of supportive cooperation for strategic
planning may thereby enhance the likelihood that a meaningful evaluation of the plan
will be carried out.


Leadership. Leadership was frequently cited as having played a major role in the
evaluation of the strategic plan at some of the institutions. For example, although
University B underwent many changes in leadership, it was nevertheless consistently able
to carry on with its strategic plan. In fact, several participants noted that the new leaders
purposely continued to adhere to (and in some cases even enhance) the strategic plan.
With their support and guidance, the strategic plan therefore not only survived, but also
adapted to a changing environment. Even though the evaluation was frequently
characterized as informal at University B, such supportive leadership was considered a
key factor in its overall success.
Leadership was mentioned at University C as well, even though it had a very
different situation. University C has had the same set of leaders for many years. The
participants stated that their strategic planning process and the evaluation of the plans
(both historically and currently) were strengthened by the consistency of the leadership.
Having well-established and erudite leadership in strategic planning created a favorable
environment in which the plan could be evaluated. One participant stated that because the
president was so knowledgeable about the plan, he easily drew talking points from it for
his public speeches, and that he and other leaders had a pretty good sense of whether
people [were] doing what they said they would be doing or not. Although the evaluation
methodology was not considered systematic, there was an understanding that it was
accomplished, even if casually.


Integration of Budgeting and Planning. Another important (but often
overlooked) aspect of the evaluation of the strategic plan is the review of the internal
business processes, such as staff resources, operations, activities, and functions. The
goals and objectives of the strategic plan articulate the core values, mission, and
vision. Therefore, the activities stemming from that plan should be linked to other
important organizational processes, such as budgeting and professional development
activities (Dooris & Sandmeyer, 2006). That is, the activities should reflect the core
values of the institution. For example, linking the budget process to the strategic plan is
essential. Effective resource allocation includes top-down guidance informed by bottom-
up knowledge and realities. Effective resource allocation applies measures consistently
and is able to respond to analysis of those measures to make monetary adjustments if
necessary. Incremental (or across-the-board) budgeting generally does not take into
account the priorities of the institution as described in the strategic plan, so if those
priorities are not funded, then the institution will not be able to sustain the plan, no matter
how inclusive it is.


The integration of budgets and strategic planning was discussed at each of the
four campuses under consideration. All of the universities recognized that in order for
their strategic plans to be successful, the costs of those plans needed to be estimated up
front, and the plans needed to be funded. At each institution, units had to demonstrate
how they had first internally reallocated funds and how their initiatives had advanced the
university's strategic goals in order to get additional funding from the university. One of
the greatest benefits of the strategic plan for each institution was the integration of the
budgeting and planning processes, and how that integration served to increase awareness
of the values emphasized by the strategic plan itself.


A Conceptual Model for Evaluating a Strategic Plan Initiative
Even though strategic plans in higher education are as unique as the institutions
they reflect, there are basic evaluation elements that are generally applicable to all
institutions' strategic planning processes. The exemplar characteristics from the findings
help create the conceptual model. There are five elements of evaluation in the conceptual
model; these elements provide the basis of an evaluation methodology that an institution
can apply to its strategic plan initiative, and each element prompts questions that fit into
the larger framework or template. These
example questions provide institutions with the opportunity to explore and evaluate the
various components or processes within a strategic plan initiative. The conceptual
evaluation model is provided below in two tables. The first figure lists the first two
evaluation elements, which target the resources needed (see Figure 4.8). The second
figure provides the last three elements, which assess the intended results of the
evaluation (see Figure 4.9).

Evaluation of Resources Needed

Evaluation Element | Interrelated with Logic Model | Sample Questions

First element. A summarization of the mission and vision; exploration of the relational issues of influences and resources, such as the economic, social, or political environment of the community, and the appropriateness and fit of the vision. These types of questions help explain some of the effects of unanticipated and external influences. | Input | Internal: Have expectations or opportunities in learning, research, or service been considered? External: Have the changes in the local, state, national, or global environment been considered?

Second element. A review of internal business processes, such as staff resources, operations, activities, and functions; questions that ask about the extent to which actions were executed as planned. | Activities | Internal: Do the internal business processes, such as budgeting or professional development, articulate the values and goals of the plan? Can they demonstrate a link to the strategic plan? External: Did the resource development and allocation have enough flexibility to adapt to the conditions yet meet the needs?

Figure 4.8: The Conceptual Evaluation Model - Evaluation of Resources Needed




Evaluation Element Interrelated
with Logic
Model
Sample Questions
Third element. An analysis of the data;
goals and objectives in measurable
units, which include the internal and
external measures. The measures
should include financial, equipment
and facilities, along with the learning,
research, and service data.
Output If targets were set for the
measures, the key question is
if those targets were met.

Internal. Measures that track
fundamental data on the
institution.

External. Measures that
provide comparisons of
fundamental data among
similar institutions.
Fourth element. An assessment of the
consequence, value added, the
outcomes, and/or the effectiveness of
the plan. These questions try to
document the changes that occur at the
institution as an impact of the plan.

Outcomes
and Impact
Internal. To what extent is
progress being made toward
the desired outcomes? Has
a culture of planning
developed or evolved? or
Does the leadership support
the core principles?

External. Are the outcomes
and impact of the strategic
plan recognized by external
agencies?
E
v
a
l
u
a
t
i
o
n

o
f

I
n
t
e
n
d
e
d

R
e
s
u
l
t
s

Fifth element. Stakeholder participation
in the evaluation by providing their
perceptions of the effectiveness of the
strategic plan.
Impact Internal. This would include
involvement from
stakeholders - students, staff,
faculty, regional and state
leaders, as well as input from
other organizations, by
asking about their
perceptions of the
effectiveness of the plan.

External. This would include
perceptions from external
sources that could provide a
validation, such as
perceptions of university
presidents around the world.

Figure 4.9: The Conceptual Evaluation Model - Evaluation of Intended Results


This conceptual evaluation model is closely integrated with the logic model. The
logic model provides a succinct framework for evaluating programs or processes. The
conceptual evaluation model uses the logic model components (inputs, activities, outputs,
outcomes, and impact) as a base, which helps unify the model and make it
comprehensive. Resources invested into the program or process, such as human,
financial, organizational, and community resources, are inputs. The tools, services, or
actions make up the activities component. The results of the activities are the outputs,
while the specific changes in attitudes, behaviors, knowledge, and skills arising from the
process are the outcomes. The fundamental intended or unintended change occurring in
the organization, community, or system because of the program or process is the impact
component of the logic model (Bickman, 1987; W.K. Kellogg Foundation, 2004;
Wholey, Hatry, & Newcomer, 1994). When reviewing elements of the strategic plan that fall under the first
two components (inputs and activities), the evaluation is assessing the part of the process
that involves the resources needed to implement the plan and the intentions of the plan.
When reviewing elements that are within the last three components (outputs, outcomes,
and impact), the evaluation is assessing the intended results of the plan. The conceptual
evaluation model thus demonstrates how an institution should go beyond the output
measures and incorporate all components of the logic model in evaluating the outcome
and impact of the strategic plan.
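
To make this mapping concrete, the following is a minimal, purely illustrative Python sketch. The data structure is an assumption introduced here for illustration (it paraphrases the elements and sample questions of Figures 4.8 and 4.9) and is not an artifact of the study itself:

# A hypothetical representation of the conceptual evaluation model:
# each element maps to one or more logic model components and carries
# paraphrased internal and external sample questions.
conceptual_model = {
    "First element": (["input"],
                      "Have expectations or opportunities in learning, research, or service been considered?",
                      "Have changes in the local, state, national, or global environment been considered?"),
    "Second element": (["activities"],
                       "Do internal business processes articulate the values and goals of the plan?",
                       "Did resource development and allocation have enough flexibility?"),
    "Third element": (["output"],
                      "Were the targets set for the measures met?",
                      "How do the measures compare with those of similar institutions?"),
    "Fourth element": (["outcomes", "impact"],
                       "To what extent is progress being made toward the desired outcomes?",
                       "Are the outcomes and impact recognized by external agencies?"),
    "Fifth element": (["impact"],
                      "What are stakeholders' perceptions of the plan's effectiveness?",
                      "Do external sources validate those perceptions?"),
}

# Walk the model to generate an evaluation checklist covering all
# five logic model components, not just the output measures.
for element, (components, internal_q, external_q) in conceptual_model.items():
    print(f"{element} ({', '.join(components)}):")
    print(f"  Internal: {internal_q}")
    print(f"  External: {external_q}")

The point of such a structure is simply that an evaluation derived from it necessarily touches every component of the logic model, which is the comprehensiveness the conceptual model calls for.
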
The evaluation elements and process should be seen as part of the strategic plan
initiative from its conception. Incorporating the evaluation methodology into the strategic
planning process will enable it to be more holistic and inclusive. The feedback from the
model's questions will not only answer the question of whether the plan was effective,
but may also help inform future planning. The final report at the end of the planning
cycle should be reviewed not only by the president and Board of Trustees, but also by all
stakeholders. Stakeholders should likewise play a role in responding to the evaluation.
Integrating this type of communication into the evaluation (and thus ultimately into the
strategic plan itself), may thereby lead to a more comprehensive strategic planning
process.
In addition to the logic model as a base, the questions within the conceptual
evaluation model are both formative- and summative-type questions. Formative questions
help to improve the strategic plan process, while summative questions help to illuminate
whether the plan was effective. Examples of some of the formative questions are those
that ask about the summarization of the mission and vision, or the analysis of the
measures. Summative questions generally ask about the short- and long-term outcomes of
the plan; asking stakeholders (e.g., students, faculty, staff, community leaders, and
organizations) for their thoughts and analysis, including their satisfaction and knowledge,
is an illustration of this.
As Banta (2004) advocates, good assessment includes both direct and indirect
measures. Both of these have been incorporated into the conceptual evaluation model as
well. The direct measures are the outputs, such as the specific metrics and benchmarks or
targets. The indirect measures are the results from community visits, focus groups, and
surveys. Questions within the conceptual evaluation model should also include both
quantitative and qualitative methods. For instance, measurements of alumni support or
projections of student enrollment are quantitative measures, while a review committee or
feedback opportunities for stakeholders provide qualitative measures. The measures
themselves should focus on the "significant few" rather than the "trivial many" (Dooris &
Sandmeyer, 2006). A meaningful evaluation measures what is truly important, avoiding
the temptation to focus just on easy-to-calculate metrics. It is therefore vital to measure
consistently across periods, both internally (by tracking historical trends) and externally
(by benchmarking against peers), with as much emphasis as possible on outcomes.
The value of strategic planning is not the plan itself, but the philosophy and
communication that it engenders and encourages. Once the plan has been implemented,
institutions should use it to foster a continual dialogue on the institution's mission and
vision. As the strategic plan initiative matures, this dialogue, in the form of an evaluation,
should help to determine whether the initiative was effective, as well as to inform future
planning efforts.


Summary
This chapter has reported the results of the revelatory case studies in order to
examine how some higher education institutions evaluate their strategic plan initiatives.
The results of the study included an analysis of public documents, descriptive statistics on
the survey, and interview transcriptions, all linked to the logic model. This approach has
thus provided a template through which each institution's evaluation of its strategic plan
initiatives may be assessed. The most obvious finding is that although all four institutions
had some sort of evaluation routine, only University A had a systematic evaluation
methodology, with clear ramifications for the plan's overall success and, even more
significantly, for the task of demonstrating the plan's usefulness. By reviewing the
overall results of the cases, a conceptual model for evaluating a strategic plan was
presented in this chapter as well. A more detailed discussion is presented in the next and
final chapter.






CHAPTER V: SUMMARY AND DISCUSSION



In summarizing the dissertation's research and results, this chapter explicates and
reiterates the significance of the present study. After a brief review of the research
problem and methodology, the chapter discusses the implications. The chapter concludes
with a description of the studys relationship to previous research, suggestions for
additional research, and closing remarks.


Review of Problem Statement and Research Methodology
As noted in chapter two, the current environment of oversight and accountability,
along with declining state support for higher education, has prompted many universities
to utilize strategic planning in order to achieve their mandated objectives. Yet strangely,
relatively little research has been done on evaluation models for assessing strategic
planning in higher education institutions. This study has therefore endeavored to shed
some much-needed light on how higher education institutions have assessed their
university-wide strategic plans. This has been accomplished by focusing on their
evaluation methods. Guiding this study were two main questions: "What, if any,
methodological processes do research universities use to evaluate their strategic planning
processes?" and "How does the strategic plan evaluation methodology identify the extent
to which the targeted strategic plan funding sources correlate with the strategic plan
goals?"
In focusing on those core questions, this dissertation has used a case study
methodology appropriate for multiple institutions. Data collection included documents,
results of a survey, and semi-formal interviews. The concept of triangulation (i.e., using
multiple sources of data to measure the same phenomenon) guided the data collection and
analysis. The data obtained from the research stemmed from documents about the
evaluation of each institution's strategic plan, descriptive statistics derived from survey
results, and direct quotes and notes from interviews with key administrators. Such data
triangulation has substantially enhanced the validity of the study. The data were then
coded, analyzed, and reported using qualitative and quantitative procedures.


Discussion of Results
When reviewing the overall results, exemplar characteristics emerged from the
findings and were incorporated into the conceptual model. These important findings
should be taken into consideration when evaluating a strategic plan. For instance,
University A employed several tactics that led to a systematic evaluation. The creation of
a review committee, the use of community visits, and the consistent communication of
the strategic plan were all very successful. These three tactics
involved and motivated various campus constituencies, which in turn were key to
evaluating the plan. By engaging the stakeholders, University A was able to get a better
understanding of the perceptions of the outcomes and impact of its strategic plan.
Although Universities B and C were not as effective in their evaluations as University A,
they, too, offered strategies that proved to be valuable. Both cases offered good examples
of how leadership drives the strategic plan. The advantage to having good leadership is
that it stabilizes the institution, even during times of change. Even though University D
was the weakest case, it nevertheless provided insight into what to include in an
evaluation. The strategic plan did not have a high profile at University D, and was
simply one of many initiatives that were being implemented. In addition, the relatively
low priority given to communicating and promoting the strategic plan was reflected in the
lower interest level in the strategic plan itself. The findings from University D
emphasized the issues that can arise from a lack of clear communication.
It is apparent from the findings that the more closely budgeting is integrated with
the strategic plan, the more likely it is for the university to be able to follow its plan and
reach its goals. Being specific about potential funding sources, yet allowing for
flexibility, gave University A the direction and latitude it needed in order to successfully
meet its goals. Each university attested that integrating budgeting with strategic planning
helped to create a sense of transparency in the budgeting decisions, which in turn was one
of the greatest benefits of the strategic plan. Defining both resource development and the
strategic allocation of those resources thus allows for both greater clarity and
direction.
Even though all four universities studied here had similar strategic plans, the
particular cultures at the universities influenced the establishment of their respective
strategic plans. At Universities A, B, and C, participants often commented on the campus
culture and its adaptation to strategic planning. Participants at these universities had a
more unified understanding of their plans, and seemed to share an attitude of acceptance,
and even, in some cases, a common galvanizing goal of working toward a shared vision.
Conversely, at University D, those who declined to participate in the survey or interviews
most often stated that they were too unfamiliar with the strategic plan to be able to
contribute to this study. As discussed in chapter four, the culture of a university can thus
clearly influence the effectiveness of its strategic plan, most notably in the realms of
participation and collaboration. Establishing a culture that is both supportive and
cooperative with regard to strategic planning can thus enhance the likelihood that the plan
will be evaluated in a meaningful fashion.
A very simple but critical component of a successful implementation and
evaluation of a strategic plan is the use of communication. Continual communication of
the outcomes and impact of the strategic plan is one of the easiest ways to promote
involvement and motivation. And yet, this very simple action is often overlooked. The
best example of effective communication can be found in University A. An annual report
was presented to the Board and was distributed publicly. In addition, speeches,
presentations, and news releases almost always related the topic at hand back to the
strategic plan. Each action or outcome was likewise related to the plan. By contrast,
University D's annual presentation was primarily for its Board of Trustees, and little
effort was made to share this information beyond this audience. Thus, the various
communication styles either assisted in or detracted from the institutions' promotion,
involvement, and participation in the strategic plan.
It was also apparent from the findings that leadership was a critical factor in the
evaluation of the strategic plan. However, leadership does not necessarily mean having
the same president, provost, or members of the Board of Trustees. Although the value of
stable leadership should not be underestimated, a change of leadership does not necessarily
spell doom for a strategic plan. For instance, University B underwent many changes in
leadership, and it was noted that the strategic plan helped to maintain a sense of
consistency at the institution during the change of leadership. Another example of
leadership playing a major role was at University C. The same set of leaders had been at
University C for many years. It was often commented that the strategic planning process
and the evaluation of the plans (both historic and current) were strengthened by stability
in the leadership. Consistent leadership in strategic planning can thus be said to play a
major role in fomenting a favorable environment in which a strategic plan may be
evaluated.
It is worthwhile to draw added attention to some of the contextual factors whose
importance came into focus in the course of the study. For example, several of the
interviews showed that while strategic planning rhetoric may have been similar among all
the institutions, each campus still featured its own distinctive vocabulary, particularly
with regard to certain key concepts or buzz words. Such subtle distinctions in language
were usually found in the documents as well. Once the researcher learned to recognize
these differences, the interviews became more conversational, and required less attention
to definitions or explanations.
It is also worth noting just how important a source the detailed interviews proved
to be. Those interviewed were often quite candid, providing many helpful illustrations of
issues facing their institution. It was quite informative for the researcher to learn about
the institutions in this manner, and it provided a more nuanced contextual backdrop
against which to view and better understand the various strategic plans.


Limitations. Despite the potential benefits to those involved in strategic planning
in higher education, the present study does have its limitations. As noted in chapter one,
the institutions under consideration here were all public research universities. Therefore,
the results from this study may not necessarily be applicable to non-research or private
institutions. In addition, the perceptions and rhetoric at the institutions studied may vary
somewhat from that of other institutions, thereby impacting their interpretation of the
findings.
Beyond this, some other constraints of the present study should be taken into
account when planning future research. Having a main campus contact at the
institution can greatly assist with data collection, since such a contact person can provide
critical documents and offer guidance on the selection of survey participants and
interview subjects. This type of study depends significantly upon the reflections and
observations made by high-level leaders. Typically, however, these individuals tend to
be very busy and even inaccessible. The campus contact can greatly aid the research
process by putting the researcher in touch with such key individuals. It is likewise very
helpful when the campus contact provides a cover letter to the survey participants on his
or her campus, as this would seem to boost the response rate. Having a campus contact
can help to legitimize the study in the eyes of that institution's participants.


Another factor to take into consideration was that all four institutions in this study
were in the final year of their strategic plan initiatives. Therefore, questions related to the
overall impact of the plan were usually viewed as somewhat ahead of the curve, as such
impacts (whether intended or unintended) are often not recognizable until seven to ten
years after the plan's conclusion. Thus, since these four cases were still completing their
strategic plans, references to impact were generally more anticipatory than declarative.
It is also important to note that the conceptual model is focused on adaptability.
That is, the model provides an evaluation couched in general terms, since higher
education institutions, and their respective strategic plans, are each distinct. Nonetheless,
there are still basic evaluation criteria generally applicable to all institutions' strategic
planning processes. The conceptual model therefore provides a template that can be
adapted to fit the institution.


Significance. The purpose of this research was to study methodologies used by
higher education institutions to evaluate their strategic plans. Therefore, first and
foremost, this study may provide practitioners in the field of strategic planning or
institutional research with valuable information for evaluating their own strategic plans. It
provides a descriptive framework enriched by illustrations drawn from actual experiences
at the selected institutions. These evaluation methodologies in practice, and more
importantly, the conceptual model itself, may thus offer academic practitioners firm
strategies to employ at their own institutions.


The main significance of the creation of the conceptual model may simply be that
it provides a tool others can use. Future academic practitioners may therefore want to
incorporate the conceptual model when developing their own strategic plan, as the
conceptual model provides an empirical base for future planners to modify and improve
those planning initiatives.


Relationship of the Current Study to Prior and Future Research
As described in chapter two, there is widespread agreement that literature on
empirical findings about successful higher education strategic planning is nearly
nonexistent (Birnbaum, 2000; Dooris, Kelley, & Trainer, 2004; Keller, 1997; Nunez,
2004; Paris, Ronca, & Stransky, 2005; Schmidtlein & Milton, 1988-1989). One reason
for this deficiency may be that no strategic planning evaluation model has been
specifically proposed for higher education institutions. Yet by performing an evaluation
of the strategic plan, a university may well be able to correlate its performance to
established purposes. After all, the most valuable aspects of evaluation as an integral part
of strategic planning include the clarity it brings to goals and objectives, the
encouragement it gives to taking a systematic look at issues, and the emphasis it places
on institutional improvement.
This study has aimed to expand the knowledge of how strategic plans are actually
evaluated, and in doing so, to contribute to the development of the conceptual evaluation
model. As the first study to examine evaluation methodologies for university-wide
strategic plans, this dissertation provides interested parties with a meaningful foundation
in an area vital to the current and future success of universities and colleges. But even
though the conceptual evaluation model appears to be transferable to all types of higher
education institutions, additional case study research is essential to validate or disprove
the model developed in this study. Expanding the research to
include other types of higher education institutions, for example, would most likely
strengthen the proposed model. For instance, many private liberal arts colleges are
incorporating more and more assessment activities into their strategic plans.
Studying their methodologies may enhance the evaluation methodologies used by larger
universities. Another possible avenue for further investigation would be to evaluate the
conceptual model presented here in relation to the concepts of Total Quality Management
or to the Balanced Scorecard. Both of these management approaches use various
perspectives to center on performance and strategy for the institution's long-term success.
The conceptual model could certainly be applied in reviewing these approaches.
Specifically applying the conceptual evaluation model to institutions that have an Office
of Institutional Effectiveness could serve as the basis of another related research study.
Such offices commonly review assessment and evaluation findings from the various
academic and administrative units to demonstrate accountability, evaluate and improve
the quality of services, and determine whether or not the impact of the units is consistent
with the institution's mission and vision. It could prove very interesting to test the
conceptual model in such an organizational structure. Alternatively, the conceptual
evaluation model presented here could be compared to or contrasted with some of the
assessment elements of the accrediting agencies or to the Baldrige National Quality
Program. One other possible permutation of this line of study could be to incorporate the


conceptual model in the development of a new strategic plan. By following the lifespan
of that plan, the research could then focus on the influence that systematic evaluation
brings to bear on the effectiveness of the strategic plan itself.
The current environment emphasizing accountability in institutions of higher
learning all but dictates that such institutions demonstrate the effectiveness of their
strategic plans. It is hoped, then, that additional research will replicate and enhance the
conceptual model presented here, and thereby further advance knowledge about the
evaluation of strategic planning efforts. In the meantime, the conceptual evaluation model
may provide academic practitioners with a comprehensive tool with which to evaluate
their strategic plans, significantly integrating both theoretical and practical perspectives
into the process.


Concluding Remarks
More than anything, strategic planning provides an opportunity to strengthen an
institution by fostering a vital dialogue about its core mission, and by
helping those most interested in the well-being of the institution to achieve a vision
featuring commonly held values. To foster such a conversation, an evaluation of the
strategic plan should include both qualitative and quantitative methods, and should
recognize the essential elements throughout the process. Evaluating these elements also
informs the next conversation about future goals. Strategic planning, if correctly done,
helps an institution to develop and continue courses of action that are a roadmap to
success. The key is to use the right elements in the evaluation and to develop a holistic


evaluation process. For when a strategic plan becomes muddled in an excess of data, it
ceases to be strategic; and conversely, when a strategic plan merely reports data, it fails
to be a plan. Above all, a strategic plan should be meaningful; and as such, it is well
worth remembering Albert Einstein's memorable admonition: "Not everything that can
be counted counts, and not everything that counts can be counted."















BIBLIOGRAPHY



AAU (2005). Retrieved December 22, 2005, from http://www.aau.edu/aau/Policy.pdf

Anderes, T. (1996). Connecting Academic Plans to Budgeting: Key Conditions for
Success. In B. P. Nedwek (Ed.), Doing Academic Planning: Effective Tools for
Decision Making (pp. 129-134). Ann Arbor, MI: Society for College and
University Planning. (ERIC Document Reproduction Service No. ED 451 785)

Anderes, T. (Fall 1999). Using Peer Institutions in Financial and Budgetary Analyses.
New Directions for Higher Education, 107, 117-123.

Andersen, T. J. (2000). Strategic planning, autonomous actions and corporate
performance. Long Range Planning, 33, 184-200.

Aguirre, F., & Hawkins, L. (1996). Why reinvent the wheel? Let's adapt our institutional
assessment model. Paper presented at the New Mexico Higher Education
Assessment Conference, Albuquerque, New Mexico. (ERIC Document
Reproduction Service No. ED 393 393)

Alexander, F. K. (Jul/Aug 2000). The changing face of accountability. The Journal of
Higher Education, 71(4), 411-431.

Angelo, T. A. (1999). Doing assessment as if learning matters most. AAHE Bulletin,
51(9), 3-6.

Atkinson, A. A., Waterhouse, J. H., & Wells, R. B. (Spring 1997). A stakeholder
approach to strategic performance measurement. Sloan Management Review,
38(3), 25-37.

Baldrige National Quality Program (2007). Education criteria for performance
excellence. Retrieved May 5, 2007, from
http://www.quality.nist.gov/PDF_files/2007_Education_Criteria.pdf

Banta, T. W. (2004). Developing assessment methods at classroom, unit, and university-
wide levels. Retrieved November 2, 2004, from
http://www.enhancementhemes.ac.uk/uploads%5Cdocuments%5Cbantapaperrevi
sed.pdf


Banta, T. W., Lund, J. P., Black, K. E., & Oblander, F. W. (1996). Assessment in
practice: Putting principles to work on college campuses. San Francisco: Jossey-
Bass Inc.

Bickman, L. (Ed.). (1987). Using program theory in evaluation. New Directions for
Program Evaluation Series, 33. San Francisco: Jossey-Bass Inc.

Birnbaum, R. (2000). Management Fads in Higher Education: Where They Come From,
What They Do, Why They Fail. San Francisco: Jossey-Bass Publishers.

Bloom, B. S. (1956). Taxonomy of Educational Objectives: The Classification of
Educational Goals. Handbook I: Cognitive Domain. White Plains, N.Y.:
Longman.

Bogdan, R. C., & Biklen, S. K. (1992). Qualitative research for education: An
introduction to theory and methods. Needham Heights, MA: Simon and Schuster.

Bond, S. L., Boyd, S. E., & Rapp, K. A. (1997). Taking stock: A practical guide to
evaluating your own programs. Chapel Hill, NC: Horizon Research, Inc.
Retrieved May 5, 2006, from http://www.horizon-research.com

Boyd, B. K., & Reuning-Elliot, E. (1998). A measurement model of strategic planning.
Strategic Management Journal, 19, 181-192.

Boyle, M., Jonas, P. M., & Weimer, D. (1997). Cyclical self-assessment: Measuring,
monitoring, and managing strategic planning. In S. E. Van Kollenburg (Ed.), A
Collection of Papers of Self-Study and Institutional Improvement (pp. 190-192).
Chicago: North Central Association of Colleges and Schools, Commission on
Institution of Higher Education. (ERIC Document Reproduction Service No. ED
408 880)

Cahoon, M. O. (1997). Planning our preferred future. In S. E. Van Kollenburg (Ed.), A
Collection of Papers of Self-Study and Institutional Improvement (pp. 187-189).
Chicago: North Central Association of Colleges and Schools, Commission on
Institution of Higher Education. (ERIC Document Reproduction Service No. ED
408 880)

Cameron, K. S. (1985). Institutional effectiveness in higher education: An introduction.
Review of Higher Education, 9(1), 1-4.

Chaffee, E. E. (1985). The concept of strategy: From business to higher education. In J.
C. Smart (Ed.), Higher Education: Handbook of Theory and Research (pp. 133-
172). New York: Agathon Press.




Chmielewski, T. L., Casey, J. C., & McLaughlin, G. D. (June 2001). Strategic
management of academic activities: Program portfolios. Paper presented at the
Annual Meeting of the Association for Institutional Research, Long Beach, CA.
(ERIC Document Reproduction Service No. ED 456 782)

Cistone, P. J., & Bashford, J. (Summer 2002). Toward a Meaningful Institutional
Effectiveness Plan. Planning for Higher Education, 30(4), 15-23.

Cohen, S. H. (1997). A program assessment system that really works in improving the
institution. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study
and Institutional Improvement (pp. 104-107). Chicago: North Central Association
of Colleges and Schools, Commission on Institution of Higher Education. (ERIC
Document Reproduction Service No. ED 408 880)

Cope, R. G. (1981). Strategic planning, management, and decision making. ASHE-ERIC
Higher Education Research Report, no. 9. Washington, D.C.: American
Association for Higher Education. (ERIC Document Reproduction Service No.
ED 217 825)

Cordeiro, W. P., & Vaidya, A. (Summer 2002). Lessons learned from strategic planning.
Planning for Higher Education, 30(4), 24-31.

Dickmeyer, N. (2004). Tips on Integrating Planning and Decision Making. National
Association of College and University Business Officers. Retrieved July 14, 2004,
from http://www.nacubo.org/x3466.xml

Dooris, M. J. (Dec 2002-Feb 2003). Two Decades of Strategic Planning. Planning for
Higher Education, 31(2), 26-32.

Dooris, M. J., & Lozier, G. G. (Fall 1990). Adapting Formal Planning Approaches: The
Pennsylvania State University. New Directions for Institutional Research:
Adapting Strategic Planning to Campus Realities, 67, 5-21.

Dooris, M. J., Kelley, J. M., & Trainer, J. F. (Fall 2004). Strategic Planning in Higher
Education. In M. J. Dooris, J. M. Kelley, and J. F. Trainer (Eds.), Successful
Strategic Planning (pp. 5-10). San Francisco: Jossey-Bass Inc.

Dooris, M. J., & Sandmeyer, L. (October 2006). Planning for Improvement in the
Academic Department. Effective Practices for Academic Leaders, 1(10), 1-16.

Easterling, D., Johnson, B., & Wells, K. (1996). Creating the link between institutional
effectiveness and assessment. In S. E. Van Kollenburg (Ed.), A Collection of
Papers of Self-Study and Institutional Improvement (pp. 151-156). Chicago:
North Central Association of Colleges and Schools, Commission on Institution of
Higher Education. (ERIC Document Reproduction Service No. ED 394 393)


Ehrenberg, R. G., & Rizzo, M. J. (July-August 2004). Financial Forces and the Future of
American Higher Education. Academe, 90(4), 29-31.

Einstein, A. ThinkExist.com. Retrieved May 10, 2007, from http://thinkexist.com/
quotation/not_everything_that_counts_can_be_counted-and_not/15536.html

Epper, R. M., & Russell, A. B. (Oct 1996). Trends in State Coordination and
Governance: Historical and Current Perspectives. State Higher Education
Executive Officers Association. (ERIC Document Reproduction Service No. ED
409 807)

Ewell, P. T. (2002). An emerging scholarship: A brief history of assessment. In T. W.
Banta (Ed.), Building a Scholarship of Assessment (pp. 3-25). New York: John
Wiley & Sons, Inc.

Ewell, P. T. (2004). The examined life: Assessment and the ends of general education.
Paper presented at the 2004 Assessment Institute, Indianapolis, IN.

Ewell, P. T. (2005). Can assessment serve accountability? It depends on the question. In
J. C. Burke (Ed.), Achieving Accountability in Higher Education (pp. 104-124).
San Francisco: Jossey-Bass Inc.

Erwin, T. D. (1991). Assessing student learning and development: A guide to the
principles, goals, and methods of determining college outcomes. San Francisco:
Jossey-Bass Inc.

Feagin, J., Orum, A., & Sjoberg, G. (Eds.). (1991). A case for case study. Chapel Hill,
NC: University of North Carolina Press.

Freed, J. A., Klugman, M. R., & Fife, J. D. (1997). A culture for academic excellence:
Implementing the quality principles in higher education. ASHE-ERIC Higher
Education Report, 25(1). (ERIC Document Reproduction Service No. ED 406
963)

Glaister, K. W., & Falshaw, J. R. (1999). Strategic planning: Still going strong? Long
Range Planning, 32, 107-116.

Herriott, R. E., & Firestone, W. A. (1983). Multisite qualitative policy research:
Optimizing description and generalizability. Educational Researcher, 12, 14-19.

Howell, E. (2000). Strategic planning for a new century: Process over product. ERIC
Clearinghouse for Community Colleges Digest. Office of Educational Research
and Improvement. (ERIC Document Reproduction Service No. ED 447 842)





Jonas, S., & Zakel, L. (1997). Improving Institutional Effectiveness. In S. E. Van
Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional
Improvement (pp. 202-206). Chicago: North Central Association of Colleges and
Schools, Commission on Institution of Higher Education. (ERIC Document
Reproduction Service No. ED 408 880)

Kaplan, R. S., & Norton, D. P. (January 1992). The balanced scorecard: Measures that
drive performance. Harvard Business Review, 71-79.

Kaplan, R. S., & Norton, D. P. (1996). The Balanced Scorecard: Translating Strategy
into Action. Boston: Harvard Business School Press.

Kater, S., & Lucius, C. (1997). As clear as mud? The difference between assessing
institutional effectiveness and student academic achievement. In S. E. Van
Kollenburg (Ed.), A Collection of Papers of Self-Study and Institutional
Improvement (pp. 88-90). Chicago: North Central Association of Colleges and
Schools, Commission on Institution of Higher Education. (ERIC Document
Reproduction Service No. ED 408 880)

Keller, G. (1983). Academic Strategy: The Management Revolution in American Higher
Education. Baltimore: Johns Hopkins University Press.

Keller, G. (1997). Examining what works in strategic planning. In M. W. Peterson et al.
(Eds.), Planning and management for a changing environment: A handbook on
redesigning postsecondary institutions (pp. 52-60). San Francisco: Jossey-Bass,
Inc.

Kotler, P., & Murphy, P. E. (1981). Strategic planning for higher education. Journal of
Higher Education, 52, 470-489.

Kuh, G. D. (2005). Imagine asking the client: Using student and alumni surveys for
accountability in higher education. In J. C. Burke (Ed.), Achieving Accountability
in Higher Education (pp. 148-172). San Francisco: Jossey-Bass Inc.

Lang, A. (2001). In G. Lieberman (Ed.), 3,500 Good Quotes for Speakers: A treasury
of pointed observations, epigrams and witticisms to add spice to your speeches
(p. 62). New York, NY: Broadway Books.

Leake, D. C., & Kristovich, S. A. R. (2002). Instructional Support Units: The final
frontier - The voyages of a two-year community college in instructional
effectiveness. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study
and Institutional Improvement (pp. 251-255). Chicago: North Central Association
of Colleges and Schools, Commission on Institution of Higher Education. (ERIC
Document Reproduction Service No. ED 469 349)



Linn, R. L. (March 2000). Assessments and accountability. Educational Researcher,
29(2), 4-16.

Merriam, S. B. (1988). Case study research in education: A qualitative approach. San
Francisco, CA: Jossey-Bass.

McLaughlin, J. S., McLaughlin, G. W., & Muffo, J. A. (2001). Using qualitative and
quantitative methods for complementary purposes: A case study. In R. D. Howard
& K. W. Borland, Jr. (Eds.), Balancing qualitative and quantitative information
for effective decision support (pp. 15-44). San Francisco, CA: Jossey-Bass.

McLeod, M., & Cotton, D. (Spring 1998). Essential decisions in institutional
effectiveness assessment. Visions: The Journal of Applied Research for the
Florida Association of Community Colleges, 39-42.

Mintzberg, H. (1994). The Rise and Fall of Strategic Planning. New York: Free Press.

Nunez, W. J. (December 2003). Faculty and academic administrator support for
strategic planning in the context of postsecondary reform. Unpublished doctoral
dissertation, University of Louisville.

Nunez, W. J. (May 2004). Strategic planning in higher education: Assessing faculty and
administrative support in a reform environment. Paper presented at the 2004
Association for Institutional Research, Boston, MA.

Ohio State University (2004). Strategic Indicators 2004. Retrieved December 11, 2004,
from http://oaa.osu.edu/irp/stratind/2004SIreport.pdf

Palomba, C. A., & Banta, T. W. (1999). Assessment essentials: Planning, implementing,
and improving assessment in higher education. San Francisco: Jossey-Bass Inc.

Park, J. E. (1997). A case study analysis of strategic planning in a continuing higher
education organization. Unpublished doctoral dissertation, Pennsylvania State
University, Pennsylvania.

Parkland College Office of Institutional Research and Evaluation (Fall 2004). Parkland
College Planning Handbook. Retrieved December 30, 2004, from
http://www.parkland.edu/ie

Paris, K. (2004). Moving the Strategic Plan off the Shelf and into Action at the University
of Wisconsin-Madison. In M. J. Dooris, J. M. Kelley, and J. F. Trainer (Eds.),
Successful Strategic Planning (pp. 121-127). San Francisco: Jossey-Bass Inc.




Paris, K. (2005). Does it Pay to Plan? What We Learned about Strategic Planning in a
Big Ten University. Retrieved February 23, 2006, from http://www.ncci-cu.org

Paris, K. A., Ronca, J. M., & Stransky, E. N. (2005). A Study of Strategic Planning on the
University of Wisconsin-Madison Campus. Paper presented to the Wisconsin
Center for the Advancement of Postsecondary Education, University of
Wisconsin System Board of Regents.

Patton, M. Q. (1987). How to use qualitative evaluation and research methods. Newbury
Park, CA: Sage Publications.

Patton, M. Q. (1990). Qualitative evaluation and research methods. Newbury Park, CA:
Sage Publications.

Prager, McCarthy, & Sealy. (2002). Ratio Analysis in Higher Education: New Insights
for Leaders of Public Higher Education. Retrieved May 24, 2004, from
http://www.kpmg.org/

Ray, E. (1998). University Performance Indicators and the Benchmarking Process. In J.
Carpenter-Hubin (Ed.), Strategic Indicators and Benchmarking. Retrieved
September 7, 2004, from http://www.pb.uillinois.edu/AAUDE/specialtopics.cfm

Rieley, J. B. (Apr. 1997a). A comprehensive planning model. The Center for Continuous
Quality Improvement, Milwaukee Area Technical College. (ERIC Document
Reproduction Service No. ED 409 956)

Rieley, J. B. (1997b). Doing effective strategic planning in a higher education
environment. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study
and Institutional Improvement (pp. 175-181). Chicago: North Central Association
of Colleges and Schools, Commission on Institution of Higher Education. (ERIC
Document Reproduction Service No. ED 408 880)

Rowley, D. J., Lujan, H. D., & Dolence, M. G. (1997). Strategic change in colleges and
universities. San Francisco: Jossey-Bass Inc.

Rue, L. W. (1973). The how and who of long-range planning. Business Horizons, 16,
23-30.

Schmidtlein, F. A. (1989-1990). Why Linking Budgets to Plans Has Proven Difficult in
Higher Education. Planning for Higher Education, 18(2), 9-23.

Schmidtlein, F. A., & Milton, T. H. (1988-1989). College and university planning:
Perspectives from a nation-wide study. Planning for Higher Education, 17(3),
1-19.




Schmidtlein, F. A., & Milton, T. H. (Fall 1990). Adapting Strategic Planning to Campus
Realities. New Directions for Institutional Research: Adapting Strategic Planning
to Campus Realities, 67, 1-2.

Schwarzmueller, E. B., & Dearing, B. (1997). A model for non-instructional program
review. In S. E. Van Kollenburg (Ed.), A Collection of Papers of Self-Study and
Institutional Improvement (pp. 116-119). Chicago: North Central Association of
Colleges and Schools, Commission on Institution of Higher Education. (ERIC
Document Reproduction Service No. ED 408 880)

Shirley, R. C. (1988). Strategic planning: An overview. New Directions for Higher
Education, 16(4), 5-14.

Smith, M. F. (July-August 2004). Growing Expenses, Shrinking Resources: The States
and Higher Education. Academe, 90(4), 33-35.

Stewart, A. C., & Carpenter-Hubin, J. (Winter 2000-2001). The balanced scorecard:
Beyond reports and rankings. Planning for Higher Education, 29(2), 37-42.

Taylor, A. L. (1989). Institutional effectiveness and academic quality. In C. Fincher
(Ed.), Assessing institutional effectiveness: Issues, methods, and management
(pp. 11-18). Athens, GA: Georgia University Institute of Higher Education.

Taylor, A. L., & Karr, S. (1999). Strategic planning approaches used to respond to issues
confronting research universities. Innovative Higher Education, 23(3), 221-234.

Tellis, W. (1997). Application of a case study methodology. The Qualitative Report, 3.
Retrieved June 22, 2005, from http://www.nova.edu/ssss/QR/QR3-3/tellis2.html

Terenzini, P. T., & Upcraft, M. L. (1996). Assessing program and service outcomes. In
M. L. Upcraft & J. H. Schuh (Eds.), Assessment in student affairs: A guide for
practitioners (pp. 217-239). San Francisco: Jossey-Bass Inc.

Torres, C. A. R. (2001). An assessment process for strategic planning in a higher
education institution. Unpublished doctoral dissertation, Dowling College.

Upcraft, M. L., & Schuh, J. H. (1996). Assessment in student affairs: A guide for
practitioners. San Francisco: Jossey-Bass Inc.

W. K. Kellogg Foundation (January 2004). Logic Model Development Guide: Using
Logic Models to Bring Together Planning, Evaluation, and Action. Retrieved
May 2, 2006, from http://www.wkkf.org




Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.). (1994). Handbook of Practical
Program Evaluation. San Francisco: Jossey-Bass Inc.

Winstead, P. C., & Ruff, D. G. (1986). The evolution of institutional planning models in
higher education. Orlando, FL: Association for Institutional Research. (ERIC
Document Reproduction Service No. ED 280 412)

Yin, R. K. (1994). Case Study Research: Design and Methods. Thousand Oaks, CA: Sage
Publications.





















APPENDICES


APPENDIX A



FLOWCHART OF THE CASE STUDY METHODOLOGY
The flowchart depicts the following sequence:

1. Proposal approval from committee
2. IRB review and approval
3. Preview AAU public institutions and select peers
4. Develop interview questions
5. Expert panel review
6. Case study procedure (Universities A, B, C, and D):
survey (administer, then analyze results); document review;
semi-structured interviews with upper-level administrators;
interviewees check transcript notes
7. Review data and determine if there is enough data to triangulate
(No: return to step 6; Yes: continue)
8. Data analysis
9. Review data and themes
10. Final analysis
11. Write dissertation


APPENDIX B



LIST OF DOCUMENTS FOR ANALYSIS



1. List of documents on the institution's website or on the institution's specific
website for its strategic plan.
2. The strategic plan.
3. List of offices or individuals that are main participants, such as those listed in
committees or task forces.
4. The document that specifies the measures (benchmarks, metrics, targets).
5. The document that specifies the timeline of the plan.
6. The document that specifies the strategic planning process at that institution.
7. Documents on the implementation of the plan, which may include an evaluation
strategy.
8. Reports of progress, or annual reports.
9. Strategic plan budgetary or funding reports.
10. Any public documents that are related to the strategic plan evaluation, such as
governance committee minutes or task force committee minutes.


APPENDIX C



STRATEGIC PLANNING EVALUATION SURVEY

Each of items 1-12 is rated on the following scale:
Strongly Agree; Somewhat Agree; Neither Agree nor Disagree; Somewhat Disagree;
Strongly Disagree; Don't Know / No Opinion

1. I have evaluated and offered advice about the strategic plan at my university.

2. Procedures for assessing goal attainment are clearly stated at my institution.

3. Institutional research (data collection and analysis) is an integral part of my
institution's strategic planning process.

4. My institution dedicates resources to strategic planning activities.

5. My institution has a systematic process of strategic planning.

6. I serve (or have served) on a strategic planning committee or workgroup at my
institution.

7. I am (or have been) involved in strategic planning activities at my institution.

8. Specific changes at my institution have occurred through systematic evaluation
of our strategic planning results.

9. The benefits of strategic planning are not worth the resources we invest in our
process.

10. Our strategic planning process has been developed to "fit" the institution.

11. My institution is committed to allocating resources to remedy areas of weakness
found through the strategic planning process.

12. I use the results of strategic planning activities to evaluate the distribution of
resources.





13. Are you involved in the collection of data for the strategic plan? (Yes / No)




14. If yes, what aspects of the data collection do you provide? (Check all that apply.)
Provide Data; Data Collection; Data Analysis; Interpret Results; Final Evaluation



15. In what office, area, or unit are you?








16. What offices, areas, or units are the main participants in the evaluation of the
strategic plan?








17. Briefly describe the method or steps your institution employs to evaluate the
strategic plan.







Thank you for your participation!


APPENDIX D



SURVEY INVITATION



February 2007

<<Participant Name>>
<<Participant Address>>
<<City>><<State>><<Zip Code>>

Dear <<Participant Name>>:

All too often, strategic planning in academia ultimately proves futile due to the lack of
deliberate assessment of progress made on the plan's goals, even though this critical
phase can help the institution to evaluate the effectiveness of its plan, reevaluate the
plan's evolution, and continue to build upon its successes. I am therefore currently
conducting dissertation research on the evaluation of strategic plan initiatives.
Specifically, my research project will take the form of a case study focusing on a handful
of AAU institutions with campus-wide strategic plans. As such, it will identify
methodological strengths and challenges from which new strategies may be identified,
and ways in which institutions may improve their own strategic planning processes.

You have been selected to participate in this research study because of your experience
with your institution's strategic plan initiative. Not only will your perceptions directly
inform this study, they will also, in turn, contribute to the growing literature on strategic
planning. In addition, they may also prove useful for a number of planning officers in the
Committee on Institutional Cooperation (CIC) who have expressed strong interest in the
results.

I hope you will be willing to complete a short survey asking for your opinions regarding
the various components involved in evaluating your institution's strategic plan. The time
required to participate is minimal -- approximately 10 minutes or less. You may complete
the survey through the following website:

URL: http://web.ics.purdue.edu/~mld/planevaluation.htm

Due to the relatively small number of people with experience in strategic planning, a high
response rate is critical to the success of my research. Therefore, I sincerely hope you
will choose to participate. If, however, you do not wish to participate in this study, you
can avoid future correspondence by contacting me via e-mail at mld@purdue.edu or by
phone at 765-494-7108. Please note that your responses will be completely anonymous.


Whether you agree to participate or not, I will not identify you or your institution in any
paper or presentation.


Thank you in advance for your assistance with this research.

Sincerely,





Margaret Dalrymple
Ph.D. Candidate in Higher Education Administration
Department of Educational Studies
Purdue University


APPENDIX E



INTERVIEW QUESTIONS



1. How would you describe the methods or steps your institution employs in
evaluating the strategic plan?
2. Are the originally formulated strategic plan measures still the primary
determining factors in the evaluation of the strategic plan?
3. Are there other factors that figure into the evaluation of the strategic plan?
4. Are the procedures for assessing goal attainment well understood by the main
participants?
5. Who is actively involved in the evaluation of the strategic plan?
6. Have the results of the strategic planning activities been used to evaluate the
distribution of resources?
7. Did the strategic plan funds received match the original plan of the specific
resources identified in the strategic plan?
8. Did the allocation of the strategic plan funds received match the originally
planned strategic plan goals?
9. From your perspective, is your institution committed to allocating resources to
remedy areas of weakness identified through the evaluation of the strategic plan?
10. Having now systematically evaluated your strategic plan results, what specific
changes have occurred at your institution as a result?
11. Now in hindsight, is there anything in particular that you would have done
differently?


APPENDIX F



INTERVIEW INVITATION


February 2007

<<Participant Name>>
<<Participant Address>>
<<City>><<State>><<Zip Code>>

Dear <<Participant Name>>:

As a doctoral candidate in the Department of Educational Studies at Purdue University, I
am currently conducting dissertation research on the evaluation of strategic plan
initiatives. In reviewing the methodologies used to evaluate campus-wide strategic plans
at a handful of AAU institutions, my study will identify relative strengths and challenges
in the process and suggest ways in which similar institutions might improve their own
strategic planning processes. I believe this study will prove widely beneficial by
contributing to the growing literature on strategic planning, detailing various evaluation
methods currently in use, and offering a potential template for future evaluation methods.
Indeed, several planning officers in the Committee on Institutional Cooperation (CIC)
have already expressed interest in the results.

I am therefore writing to you today to ask if you would be willing to share some of the
insights you have gained in evaluating your campus strategic plan. Your perceptions
would directly inform this study and, in turn, contribute to the growing literature on
strategic planning. Specifically, I would like to conduct a confidential semi-formal
interview with you, or a designee, to discuss your opinions on the various components of
the evaluation of your institution's strategic plan.

Thank you very much for considering my request. I will be in touch with you in the near
future to discuss your thoughts on this matter and, should you be amenable, to schedule
an interview time.

Sincerely,


Margaret Dalrymple
Ph.D. Candidate in Higher Education Administration
Department of Educational Studies
Purdue University



APPENDIX G



EXPERT PANEL EVALUATION FORM

Please review the interview questions for their appropriateness and clarity and provide
your comments, suggestions, and overall evaluation. Your feedback is an important
component of the validity of this study.
Scale for the evaluation:
5=Excellent, 4=Good, 3=Average, 2=Below Average, 1=Poor
5 4 3 2 1
1. How would you describe the methods or steps your institution
employs in evaluating the strategic plan?


2. Are the originally formulated strategic plan measures still the
primary determining factors in the evaluation of the strategic plan?


3. Are there other factors that figure into the evaluation of the
strategic plan?


4. What are the procedures for assessing goal attainment?


5. Is the evaluation of the strategic plan, including the data
collection and analysis, an integral part of the institution's planning
process?


6. Who is actively involved in the evaluation of the strategic plan?


7. Have the results of the strategic planning activities been used to
evaluate the distribution of resources?


8. Did the strategic plan funds received match the original plan of
the specific resources identified in the strategic plan?


9. Did the allocation of the strategic plan funds received match the
originally planned strategic plan goals?

10. From your perspective, is your institution committed to
allocating resources to remedy areas of weakness identified through
the evaluation of the strategic plan?


11. Having now systematically evaluated your strategic plan
results, what specific changes have occurred at your institution as a
result?




12. Now in hindsight, is there anything in particular that you would
have done differently?


Please provide any comments or suggestions.


APPENDIX H



DOCUMENTS REVIEWED


University A
The original plan
Annual reports (five years)
Thirteen news releases
Websites: The strategic plan
Board of Trustees
Office of the President
Office of the Provost
Office of Institutional Research

University B
The original plan
Annual reports (six years)
One journal article
Three Board of Trustees speeches
Three university-wide memorandums
Ten news releases
Minutes from a Board of Trustees meeting
Minutes from a Faculty governance meeting
Minutes from a Question and Answer meeting with the President
Presentation to the Board of Trustees
Websites: The Board of Trustees
Office of the President
Office of Academic Affairs
Office of Business and Finance
Office of Institutional Research and Planning

University C
The original plan
Annual reports (four years)
A university-wide memorandum
Minutes from a Board of Trustees meeting
Two journal articles
Two Board of Trustees speeches
Two news releases
Three newsletters
Three brochures
Six presentations to Board of Trustees meetings


Websites: The Board of Trustees
Office of the President
Office of the Executive Vice President and Provost
Office of Planning and Institutional Assessment
Office of Finance and Budget

University D
The original plan
Draft of the original plan
Annual reports (three years)
Two data reports
Two presentations to Board of Trustees meetings
Minutes from four Board of Trustees meetings
Seven Presidential speeches
Twelve news releases
Websites: The Board of Trustees
Office of the Chancellor
Office of the Executive Vice Chancellor and Provost
Office of Institutional Research and Assessment
Office of Financial Planning and Budget
The accreditation re-affirmation process



APPENDIX I



INTERVIEW SUBJECTS


University A
Provost
Director of Strategic Planning and Assessment
Vice Provost for Research
Vice Provost for Engagement
Director of Budget and Fiscal Planning
Director of Institutional Research

University B
Executive Vice President of Business and Treasurer
Executive Vice Provost for Academic Affairs
Director of Institutional Research

University C
Executive Vice President of Business and Treasurer
Executive Director of the Office of Planning and Institutional Assessment
Director of Planning Research and Assessment

University D
Executive Associate Provost
Interim Director of Financial Planning and Budget


APPENDIX J



INTERVIEW QUESTIONS INCORPORATED INTO LOGIC MODEL

Column headings: Interview Questions; Logic Model Component
1 How would you describe the methods or steps
your institution employs in evaluating the
strategic plan?
Output - an analysis of goals and
objectives in measurable units
2 Are the originally formulated strategic plan
measures still the primary determining factors in
the evaluation of the strategic plan?
Input - a reflection of the progress
made toward the goals.
3 Are there other factors that figure into the
evaluation of the strategic plan?
Input - influences and resources
4 What are the procedures for assessing goal
attainment?
Output - an analysis of goals and
objectives in measurable units
5 Is the evaluation of the strategic plan, including
the data collection and analysis, an integral part
of the institution's planning process?
Outcome - short-term outcomes that
include specific changes in attitudes,
behaviors, knowledge, and skills.
6 Have the results of the strategic planning
activities been used to evaluate the distribution
of resources?
Outcome - long-term outcomes
7 Who is actively involved in the evaluation of the
strategic plan?
Outcome - short-term outcomes that
include specific changes in attitudes,
behaviors, knowledge, and skills.
8 Did the strategic plan funds received match the
original plan of the specific resources identified
in the strategic plan?
Activities - describes the resources
needed to implement the program
and the intentions of the program.
9 Did the allocation of the strategic plan funds
received match the originally planned strategic
plan goals?
Activities - describes the resources
needed to implement the program
and the intentions of the program.
10 From your perspective, is your institution
committed to allocating resources to remedy
areas of weakness identified through the
evaluation of the strategic plan?
Outcome - long-term outcomes


11 Having now systematically evaluated your
strategic plan results, what specific changes have
occurred at your institution as a result?
Impact - reflection on the
effectiveness and magnitude of the
strategic plan.
12 Now in hindsight, is there anything in particular
that you would have done differently?
Impact - reflection on the
effectiveness of, and satisfaction with,
the strategic plan.
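
For readers who want to work with the mapping above programmatically, the following
minimal Python sketch shows one way to encode it so that responses can be grouped by
logic model component. The sketch is illustrative only and was not part of the study's
instruments; the dictionary contents mirror the table above, while the helper function
and its names are hypothetical.

    # Hypothetical encoding of the Appendix J mapping: question number -> component.
    QUESTION_COMPONENT = {
        1: "Output", 2: "Input", 3: "Input", 4: "Output",
        5: "Outcome", 6: "Outcome", 7: "Outcome",
        8: "Activities", 9: "Activities", 10: "Outcome",
        11: "Impact", 12: "Impact",
    }

    def group_by_component(responses):
        """Group responses ({question number: answer text}) by component."""
        grouped = {}
        for number, answer in responses.items():
            component = QUESTION_COMPONENT.get(number, "Unmapped")
            grouped.setdefault(component, []).append((number, answer))
        return grouped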




APPENDIX K



UNIVERSITY A SURVEY RESULTS


Column headings: Percent that Responded with Somewhat Agree or Strongly Agree;
Number of Respondents
Input Questions
Our strategic planning process has been developed to "fit"
the institution.
72% 13
The vision of my institution is clearly defined. 100% 18
The institutional mission is central to the strategic planning
process.
88% 16
Faculty, staff, and administrators are encouraged to
participate in the long-range planning process of the
institution.
88% 16
Activities Questions
Procedures for assessing goal attainment are clearly stated
at my institution.
94% 17
I serve (or have served) on a strategic planning committee
or workgroup at my institution.
94% 17
I am (or have been) involved in strategic planning activities
at my institution.
88% 16
Faculty, staff, and administrators are actively involved in
the strategic planning process.
77% 14
I have engaged in specific planning exercises to aid in my
institution's strategic planning activities.
77% 14
I have helped formulate outcome measures to measure
progress.
77% 14
Output Questions
My institution has a systematic process of strategic
planning.
94% 17
Day-to-day decisions made by institutional decision-makers
are consistent with long-range goals and objectives.
83% 15
My institution has a formal process for gathering and
sharing internal and external data.
94% 17


Outcome Questions
Institutional research (data collection and analysis) is an
integral part of my institution's strategic planning process.
94% 17
My institution dedicates resources to strategic planning
activities.
94% 17
My institution is committed to allocating resources to
remedy areas of weakness found through the strategic
planning process.
94% 17
The results of strategic planning activities are used to
evaluate the distribution of resources.
83% 15
Impact Questions
I have evaluated and offered advice about the strategic plan
at my university.
88% 16
Specific changes at my institution have occurred through
systematic evaluation of our strategic planning results.
88% 16
The benefits of strategic planning are not worth the
resources we invest in our process. (Item reverse coded)
94% 17
Strategic planning plays an important role in the success of
my institution.
94% 17
Efforts to evaluate the strengths, weaknesses, opportunities,
and threats of my institution are worthwhile.
94% 17
Resources dedicated to strategic planning activities are
investments in the long-term health of my institution.
83% 15
Strategic planning activities do little to help our institution
to fulfill its mission. (Item reverse coded)
88% 16
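
As a minimal sketch of the arithmetic behind this table and the parallel tables in
Appendices L through N, the Python snippet below tabulates a percent-agreement figure
from raw Likert responses. The response labels come from the survey in Appendix C;
the function itself, the exclusion of "Don't Know / No Opinion" answers from the base,
and the rounding convention are assumptions for illustration, not a description of the
study's actual tabulation.

    AGREE = {"Strongly Agree", "Somewhat Agree"}
    DISAGREE = {"Strongly Disagree", "Somewhat Disagree"}

    def percent_agree(responses, reverse_coded=False):
        """Percent and count of respondents endorsing one survey item.

        For reverse-coded items, disagreement counts as endorsement,
        as the tables in Appendices K through N note.
        """
        valid = [r for r in responses if r != "Don't Know / No Opinion"]
        target = DISAGREE if reverse_coded else AGREE
        count = sum(1 for r in valid if r in target)
        return (round(100 * count / len(valid)) if valid else 0, count)

    # Illustrative data only: 13 of 18 respondents endorsing an item
    # yields (72, 13), the form of the first row above.
    item = (["Strongly Agree"] * 9 + ["Somewhat Agree"] * 4
            + ["Neither Agree nor Disagree"] * 5)
    print(percent_agree(item))  # -> (72, 13)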



APPENDIX L



UNIVERSITY B SURVEY RESULTS


Column headings: Percent that Responded with Somewhat Agree or Strongly Agree;
Number of Respondents
Input Questions
Our strategic planning process has been developed to "fit"
the institution.
89% 8
The vision of my institution is clearly defined. 89% 8
The institutional mission is central to the strategic planning
process.
100% 9
Faculty, staff, and administrators are encouraged to
participate in the long-range planning process of the
institution.
67% 6
Activities Questions
Procedures for assessing goal attainment are clearly stated
at my institution.
100% 9
I serve (or have served) on a strategic planning committee
or workgroup at my institution.
55% 5
I am (or have been) involved in strategic planning activities
at my institution.
89% 8
Faculty, staff, and administrators are actively involved in
the strategic planning process.
89% 8
I have engaged in specific planning exercises to aid in my
institution's strategic planning activities.
89% 8
I have helped formulate outcome measures to measure
progress.
78% 7
Output Questions
My institution has a systematic process of strategic
planning.
78% 7
Day-to-day decisions made by institutional decision-makers
are consistent with long-range goals and objectives.
89% 8
My institution has a formal process for gathering and
sharing internal and external data.
89% 8


Outcome Questions
Institutional research (data collection and analysis) is an
integral part of my institution's strategic planning process.
100% 9
My institution dedicates resources to strategic planning
activities.
89% 8
My institution is committed to allocating resources to
remedy areas of weakness found through the strategic
planning process.
78% 7
The results of strategic planning activities are used to
evaluate the distribution of resources.
89% 8
Impact Questions
I have evaluated and offered advice about the strategic plan
at my university.
89% 8
Specific changes at my institution have occurred through
systematic evaluation of our strategic planning results.
89% 8
The benefits of strategic planning are not worth the
resources we invest in our process. (Item reverse coded)
100% 9
Strategic planning plays an important role in the success of
my institution.
89% 8
Efforts to evaluate the strengths, weaknesses, opportunities,
and threats of my institution are worthwhile.
100% 9
Resources dedicated to strategic planning activities are
investments in the long-term health of my institution.
89% 8
Strategic planning activities do little to help our institution
to fulfill its mission. (Item reverse coded)
100% 9




APPENDIX M



UNIVERSITY C SURVEY RESULTS


Column headings: Percent that Responded with Somewhat Agree or Strongly Agree;
Number of Respondents
Input Questions
Our strategic planning process has been developed to "fit"
the institution.
89% 8
The vision of my institution is clearly defined. 78% 7
The institutional mission is central to the strategic planning
process.
89% 8
Faculty, staff, and administrators are encouraged to
participate in the long-range planning process of the
institution.
100% 9
Activities Questions
Procedures for assessing goal attainment are clearly stated
at my institution.
78% 7
I serve (or have served) on a strategic planning committee
or workgroup at my institution.
100% 9
I am (or have been) involved in strategic planning activities
at my institution.
89% 8
Faculty, staff, and administrators are actively involved in
the strategic planning process.
100% 9
I have engaged in specific planning exercises to aid in my
institution's strategic planning activities.
100% 9
I have helped formulate outcome measures to measure
progress.
78% 7
Output Questions
My institution has a systematic process of strategic
planning.
100% 9
Day-to-day decisions made by institutional decision-makers
are consistent with long-range goals and objectives.
89% 8
My institution has a formal process for gathering and
sharing internal and external data.
100% 9


Outcome Questions
Institutional research (data collection and analysis) is an
integral part of my institution's strategic planning process.
100% 9
My institution dedicates resources to strategic planning
activities.
100% 9
My institution is committed to allocating resources to
remedy areas of weakness found through the strategic
planning process.
78% 7
The results of strategic planning activities are used to
evaluate the distribution of resources.
78% 7
Impact Questions
I have evaluated and offered advice about the strategic plan
at my university.
100% 9
Specific changes at my institution have occurred through
systematic evaluation of our strategic planning results.
78% 7
The benefits of strategic planning are not worth the
resources we invest in our process. (Item reverse coded)
89% 8
Strategic planning plays an important role in the success of
my institution.
89% 8
Efforts to evaluate the strengths, weaknesses, opportunities,
and threats of my institution are worthwhile.
89% 8
Resources dedicated to strategic planning activities are
investments in the long-term health of my institution.
89% 8
Strategic planning activities do little to help our institution
to fulfill its mission. (Item reverse coded)
89% 8




APPENDIX N



UNIVERSITY D SURVEY RESULTS


Column headings: Percent that Responded with Somewhat Agree or Strongly Agree;
Number of Respondents
Input Questions
Our strategic planning process has been developed to "fit"
the institution.
78% 7
The vision of my institution is clearly defined. 89% 8
The institutional mission is central to the strategic planning
process.
89% 8
Faculty, staff, and administrators are encouraged to
participate in the long-range planning process of the
institution.
78% 7
Activities Questions
Procedures for assessing goal attainment are clearly stated
at my institution.
89% 8
I serve (or have served) on a strategic planning committee
or workgroup at my institution.
89% 8
I am (or have been) involved in strategic planning activities
at my institution.
100% 9
Faculty, staff, and administrators are actively involved in
the strategic planning process.
55% 5
I have engaged in specific planning exercises to aid in my
institution's strategic planning activities.
89% 8
I have helped formulate outcome measures to measure
progress.
78% 7
Output Questions
My institution has a systematic process of strategic
planning.
44% 4
Day-to-day decisions made by institutional decision-makers
are consistent with long-range goals and objectives.
56% 5
My institution has a formal process for gathering and
sharing internal and external data.
100% 9


Outcome Questions
Institutional research (data collection and analysis) is an
integral part of my institution's strategic planning process.
89% 8
My institution dedicates resources to strategic planning
activities.
100% 9
My institution is committed to allocating resources to
remedy areas of weakness found through the strategic
planning process.
56% 5
The results of strategic planning activities are used to
evaluate the distribution of resources.
78% 7
Impact Questions
I have evaluated and offered advice about the strategic plan
at my university.
89% 8
Specific changes at my institution have occurred through
systematic evaluation of our strategic planning results.
78% 7
The benefits of strategic planning are not worth the
resources we invest in our process. (Item reverse coded)
78% 7
Strategic planning plays an important role in the success of
my institution.
44% 4
Efforts to evaluate the strengths, weaknesses, opportunities,
and threats of my institution are worthwhile.
100% 9
Resources dedicated to strategic planning activities are
investments in the long-term health of my institution.
78% 7
Strategic planning activities do little to help our institution
to fulfill its mission. (Item reverse coded)
89% 8
















VITA



MARGARET L. DALRYMPLE

3009 Georgeton Road
West Lafayette, IN 47906


EDUCATION

PH.D. IN HIGHER EDUCATION ADMINISTRATION 2007
Purdue University
West Lafayette, Indiana

M.A. IN SOCIOLOGY 1992
University of Colorado
Colorado Springs, Colorado

B.A. IN SOCIOLOGY AND PSYCHOLOGY 1990
Augustana College
Sioux Falls, South Dakota

EXPERIENCE

ASSISTANT DIRECTOR OF INSTITUTIONAL RESEARCH 2005 to Present
Office of Institutional Research, Purdue University
West Lafayette, Indiana

SENIOR INSTITUTIONAL RESEARCH ANALYST 2003 to 2005
Office of Institutional Research, Purdue University
West Lafayette, Indiana

ASSOCIATE REGISTRAR FOR RESEARCH 2000 to 2003
Office of the Registrar, Purdue University
West Lafayette, Indiana




RESEARCH ANALYST 1995 to 2000
Office of the Registrar, Purdue University
West Lafayette, Indiana

RESEARCH ASSISTANT 1992
Center for Social Science Research, University of Colorado
Colorado Springs, Colorado

EXPERTISE

SPECIAL PROJECTS
Investment Return Analysis
Designer and coordinator of data collection for Purdue University West
Lafayette Campus
Strategic Plan Metrics and Benchmarks Progress Report
Coordinator of data collection for Purdue University System
President Forums
Coordinator of data collection and designer for Purdue University West
Lafayette Campus
Board of Trustees Governance Report
Coordinator of data collection and designer for Purdue University System
Graduating Students Learning Outcomes Survey
Coordinator and administrator of the Purdue University survey (West
Lafayette Campus)
Higher Education Research Institute (HERI) College Student Survey (CSS)
Administrator for Purdue University West Lafayette Campus
Higher Education Research Institute (HERI) Cooperative Institutional Research
Program (CIRP)
Survey Analyst for Purdue University West Lafayette Campus
National Study of Faculty and Students (NSoFaS)
Coordinator of data collection for Purdue University West Lafayette
Campus
Enrollment Projections
Generated fall and spring enrollment projections for Purdue University
West Lafayette Campus
Integrated Postsecondary Education Data System (IPEDS) Enrollment Survey,
Completions Survey, and Graduation Rates Survey
Completed for Purdue University System
Indiana Commission of Higher Education (ICHE) Legislature Request for
institutional data
Completed for Purdue University System



PROFESSIONAL ACTIVITIES

Member-at-Large: Membership Committee Chair,
Indiana Association for Institutional Research,
2006-07
Presentation Proposals Reviewer for Track Four: Informing Institutional Management
and Planning
Association for Institutional Research National Conference, Seattle, Washington,
May 2008
Presentation Proposals Reviewer for Track Five: Higher Education Collaborations, Policy
Issues, and Accountability
Association for Institutional Research National Conference, Chicago, Illinois,
June 2006
Presentation Proposals Reviewer for Track Five: The Practice of Institutional Research
Association for Institutional Research National Conference, Tampa, Florida,
June 2003
Newcomers Orientation Workshop Session Presenter
Indiana Association for Institutional Research Conference, Nashville, Indiana,
April 2003
Indiana Association for Institutional Research Conference, Nashville, Indiana,
March 2001
Association for Institutional Research National Conference, Cincinnati, Ohio,
May 2000
Indiana Association for Institutional Research Conference, Nashville, Indiana,
March 2000
Newcomers Workshop Committee Member
Association for Institutional Research National Conference, Toronto, Ontario,
Canada,
June 2002
Session Facilitator
Association for Institutional Research National Conference
2005, 2003, 2001, 2000, 1999, & 1998

MEMBERSHIPS IN PROFESSIONAL ASSOCIATIONS:

Association for Institutional Research (AIR)
Association for the Study of Higher Education (ASHE)
Association of American University Data Exchange (AAUDE)
Indiana Association for Institutional Research (INAIR)

PRESENTATIONS

INAIR Best Presentation: Communicating the Strategic Plan
Association for Institutional Research National Conference
Kansas City, Missouri, June 2007



Lessons Learned about Communicating the Strategic Plan
Association for Institutional Research National Conference
Chicago, Illinois, May 2006

The Missing Link: Evaluating a Strategic Plan Initiative
Association for Institutional Research National Conference
San Diego, California, May 2005

Getting the Word Out: Communicating the Strategic Plan
Indiana Association for Institutional Research Conference
Greencastle, Indiana, March 2005

"The Perceptions of Indiana Institutional Researchers on the Role of Institutional
Research
in the Strategic Planning Process"
Indiana Association for Institutional Research Conference
Terre Haute, Indiana, April 2004

"Beyond Guesswork: One University's Example of Projecting Undergraduate
Enrollment
Association for Institutional Research National Conference
Tampa, Florida, May 2003

"The Adventures of Enrollment Projections"
Indiana Association for Institutional Research Conference
Nashville, Indiana, April 2003

"Ingredients for Enrollment Projections: Lots of Statistics, a Bit of Gut-feeling,
Some Guesswork, and a Little Luck"
Indiana Association for Institutional Research Conference
Nashville, Indiana, April 2003

"Effective Report Preparation: Streamlining the Reporting Process"
Indiana Association for Collegiate Registrar and Admission Officers Conference
Indianapolis, Indiana, October 1999

"Effective Report Preparation: Streamlining the Reporting Process"
Association for Institutional Research National Conference
Seattle, Washington, May 1999

"Effective Report Preparation: Streamlining the Reporting Process"
Indiana Association for Institutional Research Conference
Nashville, Indiana, March 1999



"A New Focus for Institutional Researchers: Developing and Using a Student Decision
Support System"
Association for Institutional Research National Conference
Minneapolis, Minnesota, May 1998

"A New Focus for Institutional Researchers: Developing and Using a Student Decision
Support System"
Indiana Association for Institutional Research Conference
Nashville, Indiana, March 1998

"A Study of the Delimination of the Dissertation Process"
Western Social Science Association Conference
Denver, Colorado, April 1991

HONORS AND AWARDS

Years of Service Recognition 2006
Office of the Provost, Purdue University
Best Presentation 2005
Indiana Association for Institutional Research
Individual Professional Development Grant 1999
Administrative and Professional Staff Advisory Committee, Purdue University
Student Services New Professional Award Nominee 1997
Office of the Registrar, Purdue University

PUBLICATIONS

"A New Focus for Institutional Researchers: Developing and Using a Student Decision
Support System," Association for Institutional Research Professional File (Winter
1999)

Book review of Ambassadors of U.S. Higher Education: Quality Credit-Bearing
Programs Abroad, Eds. J. Dupree and M. P. Lenn, in College & University 74, 2
(Winter 1999)
