
SIMULATIONS IN DISCOVERY LEARNING                                                

Computer Simulations in Assisted Discovery Learning

Marta R. Stoeckel

Boise State University


Abstract

Discovery learning theory calls for students to discover the rules and relationships behind a concept with little or no explicit instruction.  Computer simulations are particularly well-suited to the application of this theory, providing students an opportunity to explore and infer the underlying model.  Many criticisms have been directed at discovery learning, typically based on the challenges inherent in this approach.  These criticisms have influenced simulation designers, leading to the integration of scaffolding into many simulations to produce environments intended for assisted, rather than pure, discovery.  This paper will begin with an overview of discovery learning theory, followed by an examination of the criticisms of the theory, especially the particular difficulties many students face when attempting to engage in discovery learning.  Next, this paper will examine the connections between discovery learning and computer simulations.  Finally, it will present several examples of how simulations have applied research findings on effective assisted discovery learning to provide scaffolds for learners.

Keywords: discovery learning, assisted discovery, simulation


        As constructivism has become an increasingly important view in education, open-ended, student-centered approaches such as discovery learning have taken the place of much explicit instruction, in which students are directly provided with the target information.  Discovery learning, in particular, is viewed as an effective method to convey not only content but also the process of scientific inquiry (Bruner, 1961).  However, discovery learning is not without its critics.  Many have pointed out the significant cognitive and metacognitive demands discovery learning places on students.  Analyses of the challenges students face in discovery learning have led to an understanding of the kinds of scaffolds that balance supporting students in difficult tasks with generating the high level of cognitive engagement that makes discovery learning successful.  This understanding of how to support discovery learning is especially apparent in the design of many computer simulations, which make explicit the scaffolds being provided to students.

What is Discovery Learning?

In his 1961 paper The Act of Discovery, Jerome Bruner called for the student to be permitted “...to put things together for himself, to be his own discoverer” (p. 21).  This paper is generally viewed as the origin of the discovery theory of learning (Mayer, 2004).  As with other constructivist approaches, discovery learning calls for learners to construct, test, and refine mental models in order to acquire knowledge (Ertmer & Newby, 1993), but discovery learning specifically calls for students to use experimentation and exploration to discover relationships, rules, and ideas (Mayer, 2004).

While there is no single, agreed-upon definition of discovery learning within the field, the critical characteristic is that the target information is not provided to the learner in advance, requiring them to discover it for themselves (Alfieri, Brooks, Aldrich, & Tenenbaum, 2011).  Piaget’s assertion that “each time one prematurely teaches a child something he could have discovered from himself, the child is kept from inventing it and, consequently, from understanding it completely” (as cited in Klahr & Nigam, 2004, p. 1) summarizes the central belief underlying all forms of discovery learning.

Because this definition is so general, the term discovery learning may be applied to a wide range of instructional practices and conditions (Alfieri, Brooks, Aldrich, & Tenenbaum, 2011).  In fact, discovery learning can be viewed as a spectrum from pure (also known as unassisted) discovery, in which students are presented with a problem and receive little to no guidance regarding how to solve it, to guided (also known as assisted or enhanced) discovery in which some form of scaffolding is provided to students (Mayer, 2004).  Examples of pure discovery are very rare, with most instructors favoring guided discovery (Alfieri, Brooks, Aldrich, & Tenenbaum, 2011).  The scaffolding provided in guided discovery may take a wide range of forms, including structured inquiry, thoughtful questioning from the instructor, or other supports (Hammer, 1997).  

Regardless of the degree of assistance, most examples of discovery learning are connected to the process of scientific discovery (De Jong & Van Joolingen, 1998).  This results in a general framework in which learners develop a plan, collect data, analyze data, and develop a model in a repeating cycle.  While pure discovery lacks explicit structures for this process, the most successful students apply some version of it regardless.  Assisted discovery frequently includes scaffolding to ensure the majority of students apply a version of the scientific discovery process.

Challenges in Discovery Learning

        Much has been written on whether discovery learning is effective.  Bruner (1961) laid out four primary benefits to which proponents of discovery learning theories continue to refer.  First, discovery learning trains students to organize new information and seek out patterns, a skill with a wide range of applications.  Second, by placing the locus of control on the learner, discovery learning produces intrinsic rewards which, in turn, increase motivation.  Third, many pursuits require an understanding of the process of discovery, and discovery learning approaches provide students opportunities to practice the arts of inquiry, thereby developing an understanding of the process of discovery that cannot be produced solely by studying statistics or logic.  Finally, the organization of information and cognitive structures inherent in discovery learning eases the retrieval of memories, meaning the learner is more likely to recall information associated with discovery learning than information learned through rote memorization.

        While Bruner’s arguments are certainly powerful, there are significant problems with pure discovery learning.  Mayer (2004) cites several examples of studies which found pure discovery to be less effective than other instructional methods.  In each study, students participated in a lesson to learn a specific rule, but many of the students in the pure discovery condition never encountered, let alone learned, the intended target rule.  Without parameters to constrain their exploration, it was impossible for many of the students to reach the intended destination.

        Even when students encounter the intended content, discovery learning places significant metacognitive demands on students.  Students must self-regulate their learning by setting goals, generating a plan, monitoring their progress, and making adjustments as needed, but few learners will engage in these processes on their own (Azevedo, Behnagh, Duffy, Harley, & Trevors, 2012).  Bruner (1961) himself recognized the importance of these metacognitive skills in observations of a game similar to 20 questions in which children asked a series of yes-or-no questions to determine the cause of a car running off a road.  Those children who took an organized approach, beginning with general questions and using the answers to work toward specifics, were the most consistently successful, yet many of the children took a “pot-shot” approach, throwing out hypotheses with no discernible pattern rather than using previous answers to refine an existing hypothesis.  Many students simply lack the ability to independently apply the metacognitive skills discovery learning requires.

        Discovery learning also requires learners to recall, synthesize, and apply domain knowledge throughout the process (De Jong & Van Joolingen, 1998).  Students must be able to identify the prior knowledge relevant to the problem and apply it to generate a meaningful hypothesis and design a worthwhile experiment.  In the data analysis phase of discovery, prior knowledge forms a lens through which the data must be viewed in order to find meaning.  This level of synthesis and application is extremely challenging for novices, which can result in students floundering during discovery learning without appropriate supports.

        Beyond these general issues, the specific steps in the discovery process can prove extremely challenging for most students.  Generating a hypothesis is a crucial, but extremely difficult, process in discovery learning with several specific challenges (Chinn & Brewer, 1993).  First, many students, including those at the university level, simply do not understand what a hypothesis truly is (De Jong & Van Joolingen, 1998).  As part of this misunderstanding, many learners will be influenced by irrelevant factors, such as a fear of being wrong, when generating a hypothesis.  Students also struggle with using experimental data to effectively evaluate a hypothesis.  Dunbar (1993) found that many learners may keep a refuted hypothesis simply because they cannot come up with an alternative, while Chinn and Brewer (1993) describe a range of actions a learner may take when data do not support a hypothesis, including outright rejection of the conflicting data.  An opposite response has also been observed; some learners lack confidence in their hypothesis and may change it even when the experiment does not call for a rejection of the hypothesis (Klahr & Dunbar, 1988).  In an extreme example of this, young subjects ascribed excessive importance to individual data points and, therefore, repeatedly changed their hypotheses during an experiment (De Jong & Van Joolingen, 1998).

        At least as challenging as generating a hypothesis is the design of an experiment.  Klahr, Dunbar, and Fay identify several key features of a good experiment, including that it must be simple and easy to monitor, it must give clear results, it must focus on only one dimension of a hypothesis, and it must exploit surprising results (as cited in De Jong & Van Joolingen, 1998, p. 14).  Given that these criteria can be challenging for professional researchers to meet, many students struggle with experimental design.  These struggles fall into several categories.  First, rather than attempting to test a hypothesis, students may focus on creating some desirable outcome, leading to a very narrow exploration of the variables (De Jong & Van Joolingen, 1998).  Related to this, students are often subject to confirmation bias and may design experiments with the intent of confirming, rather than rejecting, a hypothesis, again resulting in a narrow or incomplete exploration.  Many student experiments are also either inconclusive or inefficient.  A well-designed experiment manipulates only one variable at a time and uses a systematic pattern to examine the full range of variables available or applicable to the problem.  Many students instead use a random approach, making it difficult to ensure the full range of variables is examined, or manipulate multiple variables simultaneously, making it impossible to identify clear relationships in the data.

        Once the experiment is complete, students must engage in the cognitively challenging task of data analysis and interpretation.  Experimental data are often complex, and deriving meaning from them can require cognitively demanding processes such as interpreting graphs or identifying patterns (De Jong & Van Joolingen, 1998).  In addition, students are likely to misinterpret their data in a way that confirms their hypothesis (Chinn & Brewer, 1993).  Any issues in the experimental design will further compound the challenges in data interpretation due to incomplete or poorly organized data or data in which multiple variables were manipulated simultaneously.

        These concerns and challenges do not mean that discovery learning should be dismissed.  In a meta-analysis conducted by Alfieri, Brooks, Aldrich, and Tenenbaum (2011), while pure discovery approaches tended to result in lower student achievement than other approaches, assisted discovery consistently produced greater gains than other instructional methods.  This finding supports the assertion that most students lack the metacognitive skills to engage productively in pure discovery.  When provided with appropriate scaffolding, however, the high levels of cognitive engagement required by discovery learning can result in meaningful learning.  It is also worth noting that students who experience assisted discovery learning may, over time, develop skills and strategies that allow them to succeed in settings closer to pure discovery.

Simulations in Discovery Learning

        Simulations are a natural fit for discovery learning.  A simulation provides an imitation of a phenomenon or process based on a model which the learner must induce by manipulating the environment and examining the results (Jonassen & Easter, 2012).  While traditional laboratory activities may be more effective for developing certain types of knowledge, simulations can provide opportunities for exploration which would not otherwise be available (Snir, Smith, & Grosslight, 1993).  In particular, simulations are valuable tools for cases where the required equipment would be prohibitively expensive, the experiment requires excessive manual labor, or the experiment would constitute a safety hazard.  Simulations also provide the opportunity to adjust the time scale of an effect, allowing students to observe and measure phenomena which occur too quickly or too slowly to examine in the lab, or to remove undesired variables, such as friction, simplifying the process of identifying the target concept from within the data (Hennessy et al., 2007).  Finally, simulations provide the opportunity to adjust variables, such as gravity, that cannot normally be manipulated in the classroom or to push variables into extreme conditions which cannot be produced in the classroom without damaging equipment or causing a safety hazard (Wieman & Perkins, 2006).

        Perhaps the most interesting way simulations support discovery learning is through conceptual enhancements.  Conceptually-enhanced simulations provide representations of normally unobservable aspects of a phenomenon, allowing students to simultaneously explore the content of the simulation at both the concrete and theoretical level (Snir, Smith, & Grosslight, 1993).  This makes it possible for students to efficiently construct and test models that address the underlying mechanisms, rather than restricting explorations to observable effects.  For example, the Bending Light simulation (phet.colorado.edu) in Figure 1 allows learners to display light as a wave, rather than a simple ray, making normally unobservable details, such as the wave’s change in speed in a new medium, visible to the user.


Figure 1. Bending Light simulation.

Assisted Discovery in Simulations

        Because of the well-documented challenges with pure discovery learning, many simulation designers consider the specific challenges students face in order to incorporate effective scaffolding into educational simulations.  These supports take a wide variety of forms, but consistently result in greater student learning than simulations intended for pure discovery.

        Several studies have looked specifically at the effects of supporting student recall of background knowledge.  Hulshof and De Jong (2006) developed and tested an optics simulation which provided “just-in-time” knowledge tips as students worked through the simulation.  Background information related to the learner's current activity becomes available at the relevant moment and adapts as the student progresses.  This not only eases cognitive load by providing a reference, but also reduces the need for the student to filter knowledge to identify information relevant to the current phase of discovery.  Reid, Zhang, and Chen (2003) also examined the effects of explicit background knowledge activation, placing multiple choice questions related to the content in the introduction to a simulation on buoyancy and providing a static reference with definitions of concepts which could be accessed at any point during the simulation.  In both cases, students with the background knowledge supports demonstrated significantly higher gains than students using the simulation without those scaffolds.
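
        To make this mechanism concrete, the sketch below illustrates one way such a just-in-time hint lookup could be organized.  It is a hypothetical illustration written in Python; the activity names, hint text, and function names are assumptions for the example and are not taken from Hulshof and De Jong's (2006) optics simulation.

    # Hypothetical sketch of a just-in-time hint lookup; the activity names and
    # hint text are illustrative, not taken from Hulshof and De Jong (2006).
    JUST_IN_TIME_HINTS = {
        "vary_angle": [
            "The angle of incidence is measured from the normal to the surface.",
            "Snell's law relates the angles of incidence and refraction.",
        ],
        "vary_medium": [
            "Each medium has an index of refraction that affects the speed of light.",
        ],
    }

    def hints_for(current_activity, progress_level):
        """Return only the background information relevant to the learner's
        current activity, revealing more detail as progress_level rises."""
        available = JUST_IN_TIME_HINTS.get(current_activity, [])
        return available[:progress_level + 1]

    # A learner just beginning to vary the angle of incidence sees only the
    # first, most general hint.
    print(hints_for("vary_angle", progress_level=0))

        Because the hints are keyed to the learner’s current activity, the scaffold filters the domain knowledge for the student rather than requiring the student to filter it themselves.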

        Van Joolingen (1998) describes a hypothesis scratchpad designed to assist students in effectively generating and evaluating hypotheses.  The scratchpad provides a template for hypothesis generation, making explicit the required pieces of a hypothesis.  The scratchpad also includes a tracker which keeps a history of the hypotheses generated by the learner and summarizes the results of testing each hypothesis.  While Van Joolingen does not provide data to support the use of the hypothesis scratchpad, the structured format, tools for linking a hypothesis to experimental data, and record of prior hypotheses certainly have the potential to address many of the struggles students face in generating and evaluating hypotheses.
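
        To illustrate what such a scratchpad might record, the sketch below shows one possible structure in Python.  It is a hypothetical illustration; the field names and the example hypothesis are assumptions for the example and are not drawn from Van Joolingen's (1998) implementation.

    # Hypothetical sketch of a hypothesis scratchpad; the field names and the
    # example hypothesis are illustrative, not Van Joolingen's (1998) design.
    from dataclasses import dataclass, field

    @dataclass
    class Hypothesis:
        independent_variable: str   # what the learner plans to manipulate
        dependent_variable: str     # what the learner expects to change
        predicted_relation: str     # e.g., "increases", "decreases", "no effect"
        test_results: list = field(default_factory=list)

    class Scratchpad:
        """Keeps a history of every hypothesis the learner has generated."""
        def __init__(self):
            self.history = []

        def add(self, hypothesis):
            self.history.append(hypothesis)

        def record_test(self, hypothesis, supported, note):
            # Link experimental evidence back to the hypothesis it tests.
            hypothesis.test_results.append({"supported": supported, "note": note})

    pad = Scratchpad()
    h = Hypothesis("angle of incidence", "angle of refraction", "increases")
    pad.add(h)
    pad.record_test(h, supported=True, note="Larger angles bent more in trial 3.")

        By making the parts of a hypothesis explicit and preserving a record of how each one fared, a structure of this kind could address both the difficulty of formulating a hypothesis and the tendency to abandon or retain hypotheses for the wrong reasons.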

        Scaffolding for experimental design can also be a powerful tool in simulations.  Reid, Zhang, and Chen (2003) were able to produce gains over the pure discovery condition simply by providing students with text to remind them of the principles of good experimental design and prompts to set an objective before beginning each set of trials.  Manlove, Lazonder, and De Jong (2006) used a similar approach, providing an outline of the process of inquiry with the addition of prompts for students to monitor their progress using the outline and to record key information.  In spite of the minimal interaction provided by these tools, students demonstrated significant gains.

        Van Joolingen (1998) provides examples of more interactive tools for supporting experimental design.  One of these is a tool for recording and storing the results of experiments the student performs in the simulation, reducing the need for a learner to recall what they previously found.  Using this tool, students can sort experiments by any of the variables, review results, or even replay an experiment entirely.  This simulation also includes an intelligent feedback system, which can directly tell students which recorded experiments are relevant to evaluating a hypothesis or point out potential problems in an experimental design.  These tools directly address some of the common issues students have in experimental design.
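
        The sketch below suggests how such an experiment store might be organized.  It is a hypothetical Python illustration; the variable names and the simple replay function are assumptions for the example rather than features taken directly from Van Joolingen's (1998) system.

    # Hypothetical sketch of an experiment log with sorting and replay; the
    # variable names are illustrative, not Van Joolingen's (1998) design.
    experiments = []

    def record_experiment(settings, outcome):
        """Store the variable settings and measured outcome of one trial."""
        experiments.append({"settings": settings, "outcome": outcome})

    def sort_by(variable):
        """Let the learner reorder past trials by any manipulated variable."""
        return sorted(experiments, key=lambda e: e["settings"].get(variable, 0))

    def replay(index):
        """Return a past trial's settings so the simulation can rerun it."""
        return experiments[index]["settings"]

    record_experiment({"angle": 45, "medium": "water"}, {"refracted_angle": 32})
    record_experiment({"angle": 30, "medium": "water"}, {"refracted_angle": 22})
    print(sort_by("angle"))    # trials reordered from smallest to largest angle

        Even this simple structure relieves the learner of the need to remember earlier results, freeing working memory for comparing trials and evaluating hypotheses.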

        Simulations can also provide support for the metacognitive skills discovery learning requires.  Manlove, Lazonder, and De Jong (2006) used the Co-Lab simulation environment to provide students with a set of goals and sub-goals as they completed a discovery task.  Students were prompted to use these goals to guide their process and monitor their progress during the simulation.  ThinkerTools and Inquiry Island encourage similar progress monitoring, as well as other metacognitive processes, through the use of intelligent agents (Azevedo, Behnagh, Duffy, Harley, & Trevors, 2012).  Students interact with these agents as they progress through the simulation and are prompted to monitor their work and self-reflect as a result.
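
        A minimal sketch of such a goal-monitoring prompt appears below.  It is a hypothetical Python illustration; the goal statements and the prompt wording are assumptions for the example and are not drawn from Co-Lab, ThinkerTools, or Inquiry Island.

    # Hypothetical sketch of a goal-monitoring prompt; the goal text and the
    # wording are illustrative, not taken from the systems cited above.
    goals = [
        {"goal": "State a testable hypothesis", "done": False},
        {"goal": "Design an experiment that varies only one variable", "done": False},
        {"goal": "Summarize what the data show about the hypothesis", "done": False},
    ]

    def progress_prompt():
        """Remind the learner of the next unfinished goal to encourage self-monitoring."""
        for entry in goals:
            if not entry["done"]:
                return "Before continuing, check your progress on: " + entry["goal"]
        return "All goals are complete. Summarize what you discovered."

    goals[0]["done"] = True
    print(progress_prompt())

        Prompts of this kind externalize the planning and monitoring steps that many students will not perform on their own.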

Conclusions

        Discovery learning approaches present students with significant challenges, such as synthesizing prior knowledge, generating hypotheses, designing experiments, and analyzing data, in addition to metacognitive demands such as self-regulating and monitoring progress.  When students are provided with appropriate scaffolding to assist in these tasks, however, the high degree of cognitive engagement demanded by discovery learning can produce significant results.  A number of computer simulations have integrated effective scaffolds to produce increased student learning.  The value of assisted discovery learning should continue to inform the design of educational simulations.  Knowledge of what constitutes effective assisted discovery is also a valuable tool for the classroom teacher seeking to utilize simulations, as it can guide an informed selection of simulations.  Some of the scaffolds used in simulations could also be provided by an instructor if necessary.  Regardless of whether the theory is applied by the instructor or the simulation designer, the informed application of scaffolding in discovery learning can result in a high degree of cognitive engagement from students and significant, meaningful learning.


References

Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1.  Retrieved from http://epubs.surrey.ac.uk/804096/1/Tenenbaum%202011%20Does%20Discovery-Based%20Instruction%20Enhance%20Learning.pdf.

Azevedo, R., Behnagh, R., Duffy, M., Harley, J., & Trevors, G. (2012).  Metacognition and self-regulated learning in student-centered learning environments.  In D. Jonassen & S. Land (Eds.), Theoretical foundations of learning environments (2nd ed., pp. 170-196).  New York, NY: Routledge.

Bruner, J. S. (1961). The act of discovery. Harvard Educational Review, 31, 21–32.  Retrieved from https://esci310-civicscienceeducation.wikispaces.com/file/view/The+Act+of+Discovery-Bruner.pdf/93415772/The%20Act%20of%20Discovery-Bruner.pdf.

Chinn, C. A., & Brewer, W. F. (1993). The role of anomalous data in knowledge acquisition: A theoretical framework and implications for science instruction.  Review of Educational Research, 63(1), 1-49.  Retrieved from http://www.researchgate.net/publication/49176136_The_role_of_anomalous_data_in_knowledge_acquisition__a_theoretical_framework_and_implications_for_science_instruction/file/72e7e52a1108424f79.pdf.

De Jong, T., & Van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179-201.  Retrieved from http://hal.archives-ouvertes.fr/docs/00/19/06/80/PDF/deJong-Ton-1998b.pdf.

Dunbar, K. (1993). Concept discovery in a scientific domain. Cognitive Science, 17(3), 397-434.  Retrieved from http://onlinelibrary.wiley.com/store/10.1207/s15516709cog1703_3/asset/s15516709cog1703_3.pdf;jsessionid=B77C667C4FD5EA8F32CF6CAEE03E2101.f02t02?v=1&t=hxlzf5sc&s=69b25d9efcf529c27d3cb68f1f57ee7419420f3f.

Ertmer, P.A., & Newby, T.J. (1993). Behaviorism, cognitivism, constructivism: Comparing critical features from an instructional design perspective. Performance Improvement Quarterly, 6(4), 50-72. Retrieved from http://edtech.mrooms.org/file.php/740/EDTECH504_Module2/504Module2_ErtmerNewby.pdf.

Hammer, D. (1997). Discovery learning and discovery teaching. Cognition and Instruction, 15(4), 485-529.  Retrieved from https://castl.duq.edu/Conferences/Library03/PDF/Disc_Learn/Hammer_D.pdf.

Hennessy, S., Wishart, J., Whitelock, D., Deaney, R., Brawn, R., Velle, L. L., ... Winterbottom, M. (2007). Pedagogical approaches for technology-integrated science teaching. Computers & Education, 48(1), 137-152. Retrieved from http://www.pgce.soton.ac.uk/ict/SecondaryICT/PDFs/pedagogyfortechnologyintegratedscienceteaching.pdf.

Hulshof, C. D., & De Jong, T. (2006). Using just-in-time information to support scientific discovery learning in a computer-based simulation. Interactive Learning Environments, 14(1), 79-94.  Retrieved from http://telearn.archives-ouvertes.fr/docs/00/19/06/88/PDF/Hulshof-Casper-2006.pdf.

Jonassen, D. & Easter, M. (2012).  Conceptual change and student-centered learning environments.  In D. Jonassen & S. Land (Eds.), Theoretical foundations of learning environments (2nd ed., pp. 95-112).  New York, NY: Routledge.

Klahr, D., & Dunbar, K. (1988). Dual space search during scientific reasoning.  Cognitive Science, 12(1), 1-48.  Retrieved from http://onlinelibrary.wiley.com/store/10.1207/s15516709cog1201_1/asset/s15516709cog1201_1.pdf?v=1&t=hxlzk9b0&s=e1af08130303427bee6b1daf92ff8148b43d2bd5.

Klahr, D., & Nigam, M. (2004). The equivalence of learning paths in early science instruction: Effects of direct instruction and discovery learning. Psychological Science, 15(10), 661-667. Retrieved from http://www.psychology.nottingham.ac.uk/staff/dmr/c8ccde/Readings%20for%20Learning%20about%20science/KlahrNigam.2-col.pdf.

Manlove, S., Lazonder, A. W., & De Jong, T. (2006). Regulative support for collaborative scientific inquiry learning. Journal of Computer Assisted Learning, 22(2), 87-98.  Retrieved from http://140.115.126.240/mediawiki/images/2/2e/Regulative_support_for_collaborative_scientific_inquiry_learning.pdf.

Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14-19.  Retrieved from http://anitacrawley.net/Resources/Articles/Mayer%20Should%20there%20be%20a%20three%20strike%20rule%20against%20pure%20discovery%20learning.pdf.

Reid, D. J., Zhang, J., & Chen, Q. (2003). Supporting scientific discovery learning in a simulation environment. Journal of Computer Assisted Learning, 19(1), 9-20.  Retrieved from http://tccl.rit.albany.edu/papers/ReidZhang2003.pdf.

Snir, J., Smith, C., & Grosslight, L. (1993). Conceptually enhanced simulations: A computer tool for science teaching. Journal of Science Education and Technology, 2(2), 373-388.  Retrieved from http://www.jstor.org/stable/40186191.

Van Joolingen, W. (1998). Cognitive tools for discovery learning. International Journal of Artificial Intelligence In Education (IJAIED), 10, 385-397.  Retrieved from http://hal.archives-ouvertes.fr/docs/00/19/73/49/PDF/vanJoolingen99.pdf.

Wieman, C. E., & Perkins, K. K. (2006). A powerful tool for teaching science. Nature Physics, 2(5), 290-292. Retrieved from http://222.178.111.6/PhET2011/publications/NaturePhysics_Final.pdf.