Written by: Alireza Hejazi, APF Emerging Fellow
Teaching foresight is both enjoyable and challenging. New and experienced teachers alike are constantly faced with making foresight theory and practice meaningful for their students. Developing and running a foresight course is a demanding job, but evaluating it can be even more thought-provoking. Looking at a foresight course from different points of view, foresight instructors may find this question meaningful: “How should we evaluate a foresight course to ensure the credibility of its learning outcomes?” This blog post reviews three stages of evaluation that deserve foresight coaches’ care and appropriate action.
Many observers believe that an evaluation agenda can be developed only after running an educational program. However, if foresight instructors inspect these three points in their syllabi with the support of an expert, they will save much energy, time, and funding for future reviews and corrections: (1) establishing instructional objectives, (2) planning instructional strategies, and (3) assessing learning outcomes. Without enough care for these three items, every educational initiative is doomed to failure.
Instructional objectives are “statements describing what the student will be able to do after completing a unit of instruction” (Kibler, Cegala, Barker & Miles, 1974, p. 2). Instructional objectives are typically articulated on the course syllabus, and many teachers provide detailed instructional objectives for specific units covered in a course. They help students know what to expect. In using instructional objectives, teachers are better able to articulate what they teach, and can better help students meet those objectives. For example, we can tell our students that they will be able to lead a scenario learning process for a leadership team that tests their strategy against a range of possible future developments.
Instructional strategies commonly used in foresight courses include futurist lectures, discussions, group activities, reflection papers, and presentations. The choice of instructional strategy depends on the particular goals of a specific lesson or unit. In the domain of strategic foresight, the common educational base indicates that instructional strategies should be developed so that students become skillful at learning and practicing foresight knowledge, engaging in both written and oral academic discourse, working fluently with foresight data, building environmental scanning systems, developing scenarios, and solving problems effectively. All of these require providing students with the particular opportunities, models, and guidance needed to develop each of those sets of skills.
Learning outcomes are determined more by the motivation, skills, and behaviors of the student and less by differences among instructional strategies. In other words, no single instructional strategy is inherently more effective than all other strategies. Lerner et al. (1985) found that there must be a “goodness of fit” between the instructional situation and the student. Not surprisingly, some students are in situations where they “fit well” with their instructional situation, and those students excel academically; other students have a poor fit with the instructional environment and are at risk academically.
Bringing that observation into the foresight field, an instructor may find certain instructional strategies effective in advancing specific learning outcomes. For example, while discussions reflect learners’ understanding and analysis of futures concepts, reflection papers and presentations show how competent they are at producing foresight outputs. A foresight teacher can facilitate the assessment of learning outcomes by creating a table of authorities that identifies the objectives covered by the assessment tool as well as the questions corresponding to each objective. Using a flexible variety of questions in the assessment tool (to be changed occasionally) and talking with students in a friendly way about the test are also useful techniques.
New foresight coaches can always check the practicality of their educational programs by conducting a pilot course project, and they may benefit from the ideas and views of experienced foresight teachers and gurus about their project.
A foresight course can be monitored effectively by asking questions like these: Is the specific need of learners in learning foresight being addressed? Are the general and special teaching methods applied effectively? Is the instructor confident about the data presented to the students? What is going right, and what is being practiced wrongly by both the teacher and the students? What major conclusions do the students draw in their discussions? Are their conclusions supported by the teaching and learning materials? How are educational data being used by the students? Are there other possible explanations for students’ understandings and reflections? What are they?
At the basic level, foresight instructors might be able to answer some of the above questions, but at the expert level, they and their students need to be monitored by expert observers. A good way to do this is to invite expert foresight teachers to inspect our courses and to gather their ideas. Their appraisal would be a wealth of knowledge that can advance our teaching effort in constructive ways. Being open to critiques and welcoming the necessary reforms and improvements in the course will enrich our educational experience and satisfy our students’ expectations. The following table summarizes the stages of evaluation, the involved parties, and the sources of evaluation.
Table 1. Stages of evaluating a foresight course
In addition to the involved parties and sources of evaluation mentioned above, a foresight course should also be evaluated against courses conducted in similar areas such as strategic planning and management. Foresight teachers may draw on a wide range of knowledge and experience on strategic matters shared online by many teachers around the world. The best source of evaluation that is always available to an instructor is the students’ feedback. If they report issues like the following, the instructor needs to seriously revise the course material or teaching system: “You’ve left me behind. I can’t follow. The level of jargon in this course is beyond my understanding. I cannot use the learning management system (LMS) easily. I don’t enjoy reading this.” Down the road, everything should be tuned to students’ needs and level of understanding.
An eagle knows when a storm is approaching long before it breaks. It flies to some high spot and waits for the winds to come. When the storm hits, it sets its wings so that the wind will pick it up and lift it above the storm. While the storm rages below, the eagle soars above it. The eagle does not escape the storm; it simply uses the storm to lift it higher, rising on the winds that bring the storm.
Managing a foresight course can appear as a storm, and a foresight coach should be as clever as an eagle. When the course is completed and the students have graduated, it is a good time to look back and find the weak and strong points of our foresight educational program. Problems that students reported during the course, such as working with the learning management system (LMS), using foresight methods and tools, using and applying foresight data, and preparing assigned outputs, along with other unpredicted difficulties that appeared during the course, may all come upon us like a storm. We can rise above them by setting our course up to higher levels of learning and teaching foresight. The storms do not have to overcome us; we can let our checking do the balancing work for us and lift us above them. The instructor’s experience, coupled with feedback from the students and from the experts who monitored the course, makes a compound that can enrich our educational effort.
Revisiting and post-evaluating a foresight course can be done over the long, middle, and short runs. In the long term, we should consider where our course fits into the curricular goals and course sequences. Perhaps the broad goals of our foresight course should be redefined, and a rearrangement of textbooks and study materials may be necessary. For example, a goal such as leading a departmental team to develop strategic plans should include developing a mission, vision, and goals appropriately matched to the near-term competitive, customer, and industry environment. In the middle term, learning objectives should be articulated for the course, and appropriate readings, videos, slides, websites, and other materials need to be re-identified. The nature of assignments and activities should also be determined according to objectives, assessments, and instructional activities. Finally, in the short term, the calendar of activities, the syllabus, and the LMS should be checked and updated.
Kibler, R., Cegala, D., Barker, L., & Miles, D. (1974). Objectives for instruction and evaluation. Boston: Allyn and Bacon, Inc.
Lerner, J. V., Lerner, R. M., & Zabski, S. (1985). Temperament and elementary school children’s actual and rated academic performance: A test of a “goodness-of-fit” model. Journal of Child Psychology and Psychiatry and Allied Disciplines, 26, 125-126.
About the author
Alireza Hejazi is a PhD candidate in Organizational Leadership at Regent University and a member of APF Emerging Fellows. His works are available at: http://regent.academia.edu/AlirezaHejazi