
Volume 15, Number 4

February 15, 2011

A Simple Way to Score Administrative Quality


Jennifer Patterson Lorenzetti

Kaye Shelton, dean of online education at Dallas Baptist University, has been involved in online education since 1998. In that time, she has seen public demand for accountability in higher education increase, along with online education's efforts to demonstrate its quality. Demonstrating quality has been foremost in her mind as she looks at the online programs at Dallas Baptist University, where 35 to 40 percent of the institution's enrollment takes at least one online course. "My focus was quality; we have to have quality to be competitive," she says.

However, in 1998 there were no best practices for determining quality, and for many years no one could offer a checklist that online programs could use to demonstrate the quality of their programs and plan for improvement. That changed when Shelton undertook the creation of just such an instrument, one that every online program, regardless of the size, focus, or governance of its institution, could use to measure quality. Starting with the IHEP study Quality on the Line: Benchmarks for Success in Internet-Based Distance Education and 43 administrators of online education programs from a variety of institution types, Shelton conducted a six-round Delphi study to come up with a comprehensive list of measures of quality in online education. The panel was drawn from a list of 76 experts respected in the field suggested by the Sloan Consortium. Of the 43 who joined the study, 86 percent had nine or more years' experience in online education. It took six rounds of consensus-building over 180 days to flesh out the original 24 benchmarks for quality and come to agreement on the final list; the only compensation the experts received was a $25 Amazon gift card. With experts from various institution types participating in the Delphi study, Shelton was confident that the criteria that emerged were equally applicable to institutions of all sizes and types.

Using the instrument


The final recommendations are already in usable form, but Shelton is making them even more accessible by writing a handbook for their use. The handbook will include indicators, supporting studies, and best practices for each dimension of the instrument; institutions will be able to work through the instrument as a self-study, adding documentation that supports their self-analysis.

in this issue
A Simple Way to Score Administrative Quality . . . Cover
Monthly Metric: Who's marketing their program on Google? . . . 3
Administration: Online format saves academic program . . . 4
Faculty Development: From the people who brought you Quality Matters . . . 5
In the News: For-profits gunning for the GAO . . . 6

A Magna Publication


Although the instrument generates a final score that indicates the degree of adherence to the measures of quality, the process of working through the analysis is the most valuable part, and institutions should not expect to achieve a perfect score. For example, Shelton herself has won two course design awards for her work and has a 92 percent student course completion rate, but "when I went to self-assess, I could not get a perfect score," she says. One place she falls short of perfection is in Social and Student Engagement, where the first metric reads, "Students should be provided a way to interact with other students in an online community." Shelton has ways for students to interact at the course level, but she has not developed any at the program level. Finding places for improvement such as this one gives her goals to add to her program's strategic plan.

For institutions wanting to use this instrument, Shelton advises plunging right in. "Go to the web site, read the instrument, and start working through the items," she says. "Make notes of why you gave yourself a [certain] score."

Proving ourselves
Although all institution types have faced increasing pressure to prove the quality of their programs, online education has drawn the most attention of late. "Accreditors want to know this," says Shelton. "As online educators, we have had to prove ourselves two to three times as much." Using an instrument like this is a great way to get ahead of the curve and start amassing evidence before accreditors or the public demand it. By working through this instrument as a self-assessment, online programs can see where and why they are succeeding and make plans for further improvement. And when the time comes to demonstrate quality to others, the evidence will already be there.

Scoring (based on all scorecards being used; to see all scorecards, visit www.magnapubs.com/files/newsletters/der/scorecard.pdf):

210 = perfect score
189-209 (90-99%) = exemplary (little improvement is needed)
168-188 (80-89%) = acceptable (some improvement is recommended)
147-167 (70-79%) = marginal (significant improvement is needed in multiple areas)
126-146 (60-69%) = inadequate (many areas of improvement are needed throughout the program)
125 and below (59% and below) = unacceptable

Sample scorecard.
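The published scale is a simple threshold lookup, so programs tallying their own totals can automate the rating step. The following Python sketch is our own illustration, not part of the article or the instrument; the function name and structure are hypothetical, and only the thresholds are taken from the printed scale.

# Minimal sketch (not from the article): map a raw scorecard total
# to the rating bands in the printed scale. Function name is hypothetical.
def rate_scorecard(total: int, max_score: int = 210) -> str:
    """Return the rating band for a scorecard total out of 210."""
    if not 0 <= total <= max_score:
        raise ValueError(f"total must be between 0 and {max_score}")
    if total == max_score:
        return "perfect score"
    pct = total / max_score * 100
    if pct >= 90:
        return "exemplary (little improvement is needed)"
    if pct >= 80:
        return "acceptable (some improvement is recommended)"
    if pct >= 70:
        return "marginal (significant improvement is needed in multiple areas)"
    if pct >= 60:
        return "inadequate (many areas of improvement are needed throughout the program)"
    return "unacceptable"

print(rate_scorecard(193))  # exemplary (little improvement is needed)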


