Jennifer Maddrell
instruments of instruction based on learner analysis, task analysis, and environment design and
evaluation.”
Some who hold this conception have argued that instruction is a scientific discipline and
instructional design is a technology which incorporates known and verified instructional
strategies (Merrill et al., 1996). Others forward a conditions-goals-methods instructional design
framework which suggests designers follow functional prescriptions toward attainment of the
instructional goal (Reigeluth, 1983). Inherent in this framework is the assumption of a
prescriptive knowledge base that can be “codified, owned, controlled, and communicated
unambiguously to others” (Wilson, 1997, p. 301).
Instructional Design as a Problem Solving Process and Decision Making Activity
Instructional design conceived of as a Problem-Solving Archetype began in the early
1970s and is characterized by both the cognitive activity required of the designer and the
application of the designer's acquired skills and experience (Davies, 1978). Unlike a process of
rule using and procedure following as described above, instructional design as a decision making
activity is conceived of as a cognitive problem solving process (Jonassen, 2008). In contrast to a
conception of instructional design as the application of unambiguous and objectivist
prescriptions, instructional design as a decision making process focuses on the identification and
accommodation of given constraints; instructional design practice heuristics offer guidance, but
not prescriptions for decision making (Silber, 2007; Zemke & Rossett, 2002).
Some who share this viewpoint see instructional design as a process of collective decision
making involving a community of interested participants which include not only the designer, but
also experts in other areas and the stakeholders who work together on the instructional design
(Willis, 1998). While some outright condemn this collective negotiation of the instructional
design process (Merrill et al., 1996), others view this change in conception as an evolution in
the application of traditional instructional design models which places additional and expanded
emphasis on the analysis of the instructional context and on iterative design decision making
(Dick, 1996).
Instructional Design as a Project Development Process
The instructional design task is sometimes generically described in terms of phases in the
instructional project development process, including analysis, design, development,
implementation, and evaluation, often referred to under the acronym ADDIE (Molenda, n.d.).
Such a focus on the major phases in the instructional project development process has prompted
some to suggest that the instructional designer's task is as much about project planning and
management as it is a process to build instruction (Zemke & Rossett, 2002). This view is
partially supported by findings which suggest instructional project success is linked to a range of
factors related to the project's planning and management, including access to and management of
tangible resources (funding, development tools, and delivery equipment) and implementation
support (trainer support and examination procedures) (Klimczak & Wedman, 1997).
Hybrid Viewpoint
Still others take a hybrid viewpoint and suggest that the instructional design task should
be viewed as both a time tested tradition and a knowledge base, with a designer's toolkit
including all of the previously mentioned facts, concepts, skills, and strategies (Rowland, 2004).
Within this hybrid viewpoint, the designer relies on a blend of specialized design skills, design
heuristics, models, and practical considerations when facing multi-layered design decisions, each
with its unique sets of goals, principles, tools, and processes (Gibbons, 2003).
Implications for Teaching Instructional Design
The range of conceptions of the instructional designer's task suggests a lack of clarity with
regard to how the instructional design task should be taught. If there is a lack of agreement on
what the tasks are, how can there be agreement with regard to instructional designer training?
As should be considered following any task analysis, the field must come to terms with which
tasks are most relevant. What should be the training priority? Based on the noted task
conceptions, should instructional design programs focus on the application of rule-based
instructional design models? Media design and development? Time tested best practices?
Problem solving and decision making skills? Project development and management skills? All of
the above?
One way to answer these questions is to evaluate how effectively existing instructional
design and technology programs are preparing their graduates to handle the tasks they are
required to perform on the job. Such an evaluation was the focus of a recent survey of the
Instructional Design and Development, Training and Performance, and Distance Learning
division members of the Association for Educational Communications and Technology (Larson,
2005). The good news is that all respondents felt “somewhat” to “fully prepared” by their
instructional design and technology programs for general instructional design practice. However,
the issues the respondents felt unprepared to handle included a range of topics which are
most likely not a significant part of most instructional design and technology programs,
including (a) freedom to challenge decisions of supervisors, (b) the nature of internal politics, (c)
the availability of project resources for work assignments, (d) directive versus participative
management styles, (e) workload, (f) trade-offs between quality, timeliness, and cost in work
assignments, and (g) the amount of freedom given to make decisions.
These findings from a small sample of instructional designers may offer support to those
who argue that too great a focus on learning rule-based steps in the instructional design process
comes at the expense of learning how to tackle constraint-filled and complex instructional design problems
(Silber, 2007). The findings also raise an important follow-up question. If an instructional designer
fails to implement an instructional design task, is it an inherent fault within the learned
instructional design model and strategies or is it a deficiency in the designer's ability to
implement? Some suggest that unsuccessful instructional design projects are less an inherent
fault of instructional design models and strategies than a lack of implementation expertise
(McCombs, 1986). If this is true, then the case could be made that a designer's training must
include skill development in the application of the models and heuristics, including the
development of higher-order analysis and problem-solving skills associated with instructional
design plan implementation (McCombs).
In addition, if the survey results are representative of the larger instructional design
community, then a gap is suggested between what is learned in the classroom and what is
required on the job. This supports those who argue that there is a level of
instructional design expertise that rule-based systems are incapable of fully capturing (Wilson,
1997). Such a position suggests a closer focus on novice to expert development which extends
beyond a solid understanding of instructional models, heuristics, and strategies to the
development of additional skills and knowledge which allow the designer to recognize pitfalls
and take corrective action when faced with unanticipated obstacles and constraints (Shambaugh
& Magliaro, 2001). Sharing this viewpoint, Rowland (2004) suggests a broad set of designerly
core competencies, including skills such as judgment to solve ill-defined problems, creativity in
using formal techniques, composition of novel rather than general prescriptions, and mindful
reflection in action rather than a mechanical execution of tasks.
Similarly, others argue that if instructional designers are skillful, they are able to balance
many knowledge and information sources, including the needs and characteristics of clients
(Schiffman, 1995). Underlying this argument is the assumption that the number of instructional
design solutions is inexhaustible and the instructional designer cannot rely on knowledge of
optimal procedures, but rather on his or her problem solving skills to satisfy the given
instructional situation as dictated by the inherent constraints and opportunities of the
instructional project (Silber, 2007; Jonassen, 2008). As such, the designer must learn the skill of
satisficing (doing the best job possible given the constraints of the situation) rather than relying on
the application of optimal rule-based solutions (Jonassen).
Conclusion
The field has forwarded multiple conceptions of the instructional designer's task which
impact not only how the field is defined, but more importantly how designers are trained to
execute their jobs. While training based on observable behaviors and optimal procedures will
provide a framework for the task to be accomplished, more recent conceptions of the job suggest
additional skills and competencies are required to effectively perform the instructional designer's
tasks and to implement instructional projects. Preliminary surveys of the field appear to support
this viewpoint.
References
Davies, I. K. (1978). Educational technology: Archetypes, paradigms and models. In J. A. Hartley
& I. K. Davies (Eds.), Contributions to educational technology (Vol. 2, pp. 9-29). London:
Kogan Page.
Dick, W. (1996). The Dick and Carey model: Will it survive the decade? Educational Technology
Research and Development, 44(3), 55-63. doi: 10.1007/BF02300425.
Gibbons, A. (2003). What and how do designers design? TechTrends, 47(5), 22-25. doi:
10.1007/BF02763201.
Heinich, R. (1973). Is there a field of educational communications and technology? Audiovisual
Instruction, 18(5), 44-46.
Jonassen, D. H. (2008). Instructional design as design problem solving: An iterative process.
Educational Technology Magazine: The Magazine for Managers of Change in Education,
48(3), 21-26.
Jonassen, D. H., Tessmer, M., & Hannum, W. H. (1999). Task analysis methods for instructional
design. Mahwah, N.J.: L. Erlbaum Associates.
Klimczak, A., & Wedman, J. (1997). Instructional design project success factors: An empirical
basis. Educational Technology Research and Development, 45(2), 75-83. doi:
10.1007/BF02299525.
Larson, M. (2005). Instructional design career environments: Survey of the alignment of
preparation and practice. TechTrends, 49(6), 22-32. doi: 10.1007/BF02763727.
McCombs, B. (1986). The ISD model: Review of those factors critical to its successful
implementation. ECTJ, 34(2), 67-81.
Merrill, M. D., Drake, L., Lacy, M., & Pratt, J. (1996). Reclaiming instructional design.
Educational Technology, 36(5), 5-7.
Militello, L. G., & Hutton, R. J. B. (1998). Applied cognitive task analysis (ACTA): A
practitioner’s toolkit for understanding cognitive task demands. Ergonomics, 41(11),
1618-1641.
Molenda, M. (n.d.). In Search of the Elusive ADDIE Model. Retrieved from
http://www.indiana.edu/~molpage/In%20Search%20of%20Elusive%20ADDIE.pdf.
Reigeluth, C. M. (1983). Instructional design: What is it and why is it? In C. M. Reigeluth (Ed.),
Instructional-design theories and models (pp. 3-31).
Rowland, G. (2004). Shall we dance? A design epistemology for organizational learning and
performance. Educational Technology Research and Development, 52(1), 33-48.
Schiffman, S. S. (1995). Instructional systems design: Five views of the field. In G. J. Anglin
(Ed.), Instructional technology: Past, present, and future (2nd ed., pp. 131-142).
Englewood, CO: Libraries Unlimited.
Shambaugh, N., & Magliaro, S. (2001). A reflexive model for teaching instructional design.
Educational Technology Research and Development, 49(2), 69-92.