Teaching with simulations

Nico Rutten

Doctoral committee

chair:

Prof. dr. ir. A. J. Mouthaan – University of Twente

promoter:

Prof. dr. W. R. van Joolingen – University of Twente

assistant promoter:

Dr. J. T. van der Veen – University of Twente

members:

Prof. dr. W. Admiraal – Leiden University

Prof. dr. H. Eijkelhof – University of Utrecht

Prof. dr. J. M. Pieters – University of Twente

Prof. dr. J. H. Walma van der Molen – University of Twente

Dr. A. W. Lazonder – University of Twente

Dr. E. B. Moore – University of Colorado, Boulder, USA

CTIT Ph.D. thesis series no. 14-317

Centre for Telematics and Information Technology

P.O. Box 217, 7500 AE  Enschede, the Netherlands

Nico Rutten

Teaching with simulations

PhD thesis, University of Twente, Enschede, the Netherlands.

ISSN 1381-3617 (CTIT Ph.D. thesis series no. 14-317)

DOI print: 10.3990/1.9789402119589 ISBN print: 978-94-0211-958-9

DOI e-book: 10.3990/1.9789402118438 ISBN e-book: 978-94-0211-843-8

keywords: classroom study, computer simulation, computer supported inquiry learning, educational technology, interactive learning environment, Peer Instruction, science education, teaching

cover background: Hyena Reality www.freedigitalphotos.net

cover design: Studioivo www.studioivo.nl

publisher: CTIT www.utwente.nl/ctit

printer: Brave New Books www.bravenewbooks.nl

language editor: Emily Fox

layout: Nico Rutten

© Copyright 2014, Nico Rutten, [email protected]

Teaching with simulations

Dissertation

to obtain

the degree of doctor at the University of Twente,

on the authority of the rector magnificus,

prof. dr. H. Brinksma,

on account of the decision of the graduation committee

to be publicly defended

on Friday 29th of August 2014 at 14:45

by

Nicolaas Petrus Gerardus Rutten

born on October 13th, 1978

in Bleiswijk, the Netherlands

promoter: Prof. dr. W. R. van Joolingen

assistant promoter: Dr. J. T. van der Veen

This dissertation has been approved by the promoter and assistant promoter.


for Hanneke, Steven & Arthur


“Precisely this abstractness,

this separation from reality,

is what helps to get the idea.”

a remark from one of the teachers

interviewed in the observational study

“It’s the question that drives us.”

Trinity in The Matrix


Preface

The title of this dissertation ‘Teaching with simulations’ is symbolic in three ways.

First, it symbolizes the organizational structure of my PhD project, which started off as a collaboration between the department of Instructional Technology and the ELAN Institute for Teacher Education, both part of the faculty of Behavioral Science at the University of Twente. Jan van der Veen provided my daily supervision at the ELAN Institute. The collaboration proved fruitful: soon after I started my PhD project, Wouter van Joolingen switched from the Instructional Technology department to the ELAN Institute to occupy a full professorship. However, he has recently decided to leave the ELAN Institute to work at the Freudenthal Institute.

The dissertation title also symbolizes the challenge facing this research area. The ideal situation for investigating the learning effects of computer simulations is to have full control over all variables that could possibly influence learning effects. The ideal situation for investigating teaching is to impose as little researcher control over teaching practices as possible, in order for these to be more ecologically valid. These ideals collide. Therefore, it is very hard to design studies to investigate teaching with simulations that are both experimentally and ecologically valid.

The third symbolic meaning of ‘Teaching with simulations’ is this dissertation’s purpose: to bridge the research areas of pedagogy and instructional science. Completely bridging the gap between these fields is beyond our reach, because of the conflicting ideals mentioned above. However, I do believe that the outcomes of walking this bridge yield ideas for other researchers tackling similar research questions.

Nico Rutten, Enschede, 2014

List of Figures and Tables


Figures


1-1 Terminal velocity simulation, studied by Hennessy et al. (2007)

1-2 Animation of charge within an electric circuit, studied by Hennessy et al. (2007)

1-3 PhET simulation Gas properties, studied by Wieman and Perkins (2006)

1-4 PhET simulation Circuit Construction Kit, studied by Finkelstein et al. (2005)

1-5 PhET simulation Radio waves & Electromagnetic fields, studied by Finkelstein et al. (2006)

1-6 A framework relating learners’ needs, suggested tools, and the teacher’s role, as originally published by Salinas (2008)

2-1 Geographical origin of the studied publications

2-2 User interface of the computer-simulated forest game, studied by Riess and Mischo (2010)

2-3 Relations accounted for in the computer simulation studied by Riess and Mischo (2010)

2-4 The Virtual Chemistry Laboratory, studied by Dalgarno et al. (2009)

2-5 The Virtual Laboratories Electricity environment, studied by Zacharia (2007)

2-6 Example of chromosome analysis in Karyolab, studied by Gibbons et al. (2004)

2-7 An axon-length experiment in the diffusion lab, studied by Meir et al. (2005)

2-8 Representations of simulated motion phenomena, studied by Ploetzner et al. (2009)

2-9 Students working with Graph Plotter, studied by Mitnik et al. (2009)

2-10 Students collaborating to construct a layer cake structure in SMALLab, studied by Birchfield and Megowan-Romanowicz (2009)

2-11 Scientific Discovery Learning supported by heuristics that are both implicit and explicit, studied by Veermans et al. (2006)

2-12 Simulations interface of the concrete task (upper panel) and abstract task (lower panel), studied by Lazonder et al. (2009)

2-13 Screenshot from Taiga, studied by Barab et al. (2009)

2-14 River City interface, studied by Ketelhut et al. (2010)

2-15 A 3D model of the TEAL space, studied by Dori and Belcher (2005)

2-16 The TEAL classroom in action, studied by Dori and Belcher (2005)

3-1 Geographical location of the 19 schools of the 24 participating teachers

3-2 One of the participating teachers teaching with a simulation

4-1 An example of a ConcepTest, used by Crouch and Mazur (2001)

4-2 An example of a picture used in the FCI

4-3 Students using voting devices to register their responses to a question in class

4-4 An example of instruction in the Accustomed condition: a real-life demonstration of forces and motion on a table serving as a ramp

4-5 An example of instruction in the Peer Instruction condition: students use their voting devices to answer a question projected on the right screen about the simulation shown on the left interactive whiteboard

4-6 Peer Instruction implementation

A-1 Our colleagues at the ELAN Institute

Tables


2-1 Enhancement of traditional instruction with computer simulation

2-2 Comparison between different kinds of visualization

2-3 Comparison between different kinds of instructional support

2-4 Classroom settings and lesson scenario

3-1 Coding scheme for lesson observations

3-2 Teacher questions

3-3 Pattern matrix for the exploratory factor analysis

3-4 Learning goal congruence & frequencies of coded teacher utterances

3-5 Pearson correlations

4-1 Characteristics of the teachers and their students

4-2 Coding scheme for lesson observations

4-3 Schematic outline of the lesson series and measures

4-4 Teacher questions

4-5 Teachers’ predictions of which condition has the highest learning gains

4-6 Scheduling of the lessons in both conditions

4-7 Analysis of questionnaire responses

4-8 Students’ answers given by using their voting devices during Peer Instruction

Chapter 1

Introduction

Having learners learn in the same way that scientists conduct science is an approach known as ‘inquiry learning’, and, as has been widely investigated and confirmed, it can be supported very well by computer simulations. Most of this research has focused on students interacting with a computer simulation individually or in small groups. The main subject of this dissertation is using computer simulations during whole-class teaching. Do the benefits of learning with computer simulations also apply when these tools are used while teaching in a whole-class setting? This question remains largely unanswered in the extant literature of this research field. In this introductory chapter we first elaborate on what is known so far about learning with computer simulations (section 1.1). We then explain what ‘having learners learn the same way that scientists conduct science’ means (section 1.2), and how teaching with technology can be facilitated and improved (section 1.3). This chapter concludes with our rationale for how the studies in this dissertation have been set up (section 1.4). The first study (Chapter 2) is a literature review focused on finding out what is already known about the learning effects of computer simulations in science education. We subsequently studied teaching practices (Chapter 3) to investigate relations between implementations of computer simulations in physics and the attitudes of teachers and students about using computer simulations as support for teaching. In our experimental study (Chapter 4) we compared two pedagogical approaches to implementing computer simulations in physics, replicated this study five times, and analyzed the interaction between the students and their teacher.


1.1 What is Known about Learning with Computer Simulations?

1.1.1 Inquiry learning with computer simulations

Simulations are more suitable for understanding relations between concepts than for learning facts. Therefore, they are especially appropriate for increasing conceptual knowledge (Urhahne, Nick, & Schanze, 2009). With simulations such as those illustrated in this chapter (see Figures 1-1, 1-3, 1-4 and 1-5), learners can learn in a way that is similar to the way scientists do research. This is called the inquiry approach to learning (de Jong & van Joolingen, 1998). For example, by having students predict how a process will unfold and then test these predictions, an ‘internal discourse’ can be encouraged to occur within the students’ minds (Windschitl, 1998). Although there is no doubt that such predictions allow deeper understanding to develop, the question remains how to prevent making predictions from becoming a superficial exercise or being skipped altogether. Some learning environments avoid this by prompting students to make predictions at certain points (T. Bell, Urhahne, Schanze, & Ploetzner, 2010). One of the most difficult aspects of involving students in inquiry learning is posing questions that are both meaningful and open to scientific inquiry.

Figure 1-1 Terminal velocity simulation, studied by Hennessy et al. (2007). Printed with permission.
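To make the predict-and-test cycle concrete, the sketch below shows how a terminal velocity simulation like the one in Figure 1-1 might compute the quantity a student is asked to predict. It is a minimal illustration under assumed parameter values; the function name and all numbers are ours, not the environment studied by Hennessy et al. (2007).

```python
# Illustrative sketch of a terminal-velocity simulation; not the actual
# environment studied by Hennessy et al. (2007). All parameter values
# are assumptions chosen for demonstration.

def simulate_fall(mass=80.0, drag_coeff=0.25, g=9.81, dt=0.01, t_end=30.0):
    """Euler integration of dv/dt = g - (k/m) * v**2 for a falling object."""
    v, t = 0.0, 0.0
    while t < t_end:
        a = g - (drag_coeff / mass) * v ** 2  # net acceleration with quadratic drag
        v += a * dt
        t += dt
    return v

# Terminal velocity is reached when drag balances gravity: v_t = sqrt(m*g/k).
# A student can first commit to a predicted v_t, then run the model to test it.
print(f"velocity after 30 s: {simulate_fall():.1f} m/s")  # ~56 m/s = sqrt(80*9.81/0.25)
```

A student who first commits to a predicted terminal velocity and then runs the model experiences exactly the kind of confrontation between expectation and outcome described above.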

Inquiry-based teaching seeks students’ active engagement in the process of learning, as opposed to simply demonstrating to them how things work. Within science education a shift is discernible from such an ‘exemplary scientific practice’ toward a more ‘naturalistic practice’, in which students’ ability to actually interact with concrete reality is valued (Hennessy et al., 2007). Shifting the focus from valuing the recall of facts toward valuing inquiry-based science activities leads not only to increased conceptual understanding, but also to improved self-confidence and improved science process skills and achievement (Zacharia, 2003). Even though the importance of learning by inquiry is widely recognized, it is still hard to find a commonly accepted definition of it (T. Bell et al., 2010). Despite its acknowledged importance, the question of how scientific inquiry is actually implemented in science education has rarely been considered (Björkman & Tiemann, 2013), and the concept of inquiry instruction appears to be frequently misunderstood by science teachers, for example, by simply equating it with hands-on instruction (Maeng, Mulvey, Smetana, & Bell, 2013). The implementation of inquiry-based instruction varies considerably among teachers, and internationally there seem to be considerable differences in how scientific inquiry is implemented in schools (Björkman & Tiemann, 2013). Its success depends on the structure that teachers provide for the learning activities and on their experience with inquiry projects (H. Y. Chang, 2013; Fogleman, McNeill, & Krajcik, 2011).

In order to make a lasting change in teaching and learning regarding the use of computer simulations, including ways of visualizing them in a high-tech fashion, it is necessary to create a specific didactic and curricular approach. In doing so, the importance of the choice between simulating a phenomenon and having students interact with it in reality should not be overestimated. A more important question, in the context of learning physics, is in what ways the physical or virtual phenomenon can be manipulated (Zacharia & Olympiou, 2011). In fact, referring to physical experiments with the term ‘hands-on’ is misleading, as exercises can be ‘hands-on’ or ‘hands-off’ regardless of whether their mode of presentation is physical or virtual. The extent to which an exercise addressing a given phenomenon allows for manipulation is more important than whether this is done physically or virtually. Teaching with simulations allows for shifting control of the course of the learning process from the teacher to the student, to a great extent. Instead of directly instructing students about specific content, the teacher provides students with an environment where they can explore and discover (Akpan, 2001). Stimulating learning by supporting an active approach is not so much a matter of behavioral activity (e.g., hands-on activity or discussion) as of cognitive activity (e.g., selecting, organizing, and integrating knowledge) (Mayer, 2004).

The notion of the student as independent knowledge builder should not be romanticized. No matter how instruction is organized, a large part of learning arises from authoritative sources; even if not from a teacher, then from books, television programs and the like. When learning by exploration and experimentation is not working, its success generally depends on more intensive guidance by a teacher, rather than less (Scardamalia & Bereiter, 1991). An important role for the teacher is to lead plenary class discussions that reveal confusion about concepts and the relations between them, and to clarify these. Experienced teachers can quickly notice alternative conceptions and use these in their teaching; for example, they can set up a scenario to purposefully create cognitive conflict (Hennessy, 2006). Sound curricula consist of direct instruction as well as learning by inquiry. Inquiry learning is especially appropriate for acquiring deep, intuitive, conceptual knowledge, whereas direct instruction is best used for learning factual and procedural knowledge (de Jong, 2006). It can also be appropriate to provide direct instruction in inquiry learning environments in the form of lectures, at points when students are receptive to such an approach. Achievement can increase when direct instruction is given at moments when students indicate a need for it; direct instruction delivered whenever the teacher sees fit appears to be less effective (H. Y. Chang, 2013).

Even though learning with computer simulations can enhance learning results, this does not necessarily imply that the inquiry learning process per se was successful. If a student succeeds in reaching a certain state in a simulation, this does not necessarily mean that the desired conceptual knowledge has been acquired as well (de Jong & van Joolingen, 1998). Overly prescriptive simulation exercises can cause students to have a view of science that resembles the simplistic confirmatory nature of many science lab activities (Windschitl & Andre, 1998). Chen (2010) argues that the majority of virtual laboratories that are available online are based on an oversimplified view of inquiry learning. This view assumes a deductive relationship between a tested hypothesis and evidence, instead of a view that stresses the necessity of inspecting the hypotheses, the evidence and the experimental conditions as a whole. Chen therefore warns that replacing physical laboratories with virtual ones could eventually lead to students having naïve thinking paths based on oversimplified logic, which departs from the goal of science education.

We consider teaching to be ‘interactive’ if students’ contributions are expected, encouraged and extended (Beauchamp & Kennewell, 2010). Interactivity is also a characteristic of simulations themselves, and it is one of the most distinctive and powerful aspects of the application of computer simulations in science education. The possibility of interacting with simulations allows users to obtain insight into the cause-and-effect relationships in the simulated model (Amory, Naicker, Vincent, & Adams, 1999). Interactivity of instruction is a critical element for learning, as it stimulates student engagement with learning activities and collaboration between the students in class. In turn, such active collaboration between students allows teachers to adjust the pace, the style, and the subject of lectures to the needs of students; to identify and clarify confusion; and to make sure all is well understood before proceeding to the next topic (Blasco-Arcas, Buil, Hernández-Ortega, & Sese, 2013). Collaborative learning is related to increased use of higher-order thinking strategies and critical thinking skills, an effect caused by more active participation, verbalization of methods and strategies, and increased motivation and time on task. Variation of instructional methods increases the likelihood of meeting the needs of all students, as some learn better in one way than in another (Kewley, 1998). What matters is that the teacher is available when students are most receptive to teacher guidance, and can help them reformulate their thinking. It is important for teacher guidance to build upon students’ own ideas; in this way, the teacher can stress similarities and differences between such informal ideas and scientific conceptions. This supports students in constructing more abstract, general, and declarative knowledge networks (Hennessy, 2006).

1.1.2 Learning from visualized phenomena

Science instruction consisting of only lectures and demonstrations is considered suboptimal for students’ development of conceptual scientific understanding (Crouch & Mazur, 2001; Wieman & Perkins, 2006). Even though students often consider demonstrations to be their favorite part of science instruction (Crouch, Fagen, Callan, & Mazur, 2004), objective tests show that teachers can considerably overestimate the extent to which students actually learn from their demonstrations (Zacharia & Anderson, 2003). A lack of learning effectiveness for frontal teaching with demonstrations may be caused by a lack of active engagement by the students with the content to be learned. An alternative explanation could be the difficulty students have in filtering the information they are confronted with during a demonstration to identify what is essential for understanding and what is not. As an expert, a teacher succeeds in this filtering in a largely automatic fashion, and depending on their proficiency, teachers can help students with the filtering process. A student can do this to a much lesser extent, i.e., filter information so as to preserve what is essential for understanding the laws underlying the demonstrated phenomenon (Wieman & Perkins, 2005). This process of filtering information by relevance is related to the concept of fidelity, which refers to the varying levels of realism that are possible when simulating reality. “Simulation = (reality) – (task irrelevant elements)”, as Gagné eloquently stated (Lunetta & Hofstein, 1981). As this reduction of reality’s complexity allows for teaching with a simplified version of reality (see Figure 1-2), simulations can facilitate learning by focusing students’ attention more on the targeted phenomena (de Jong & van Joolingen, 1998).

Figure 1-2 Animation of charge within an electric circuit, studied by Hennessy et al. (2007). Printed with permission.

On the one hand, low fidelity simulation allows for filtering out of irrelevant details and focusing the student’s attention on the content to be learned. On the other hand, high fidelity simulation can stimulate recognition of the simulated phenomenon in its natural setting and subsequent reasoning about it (Zacharia & Olympiou, 2011). Use of simulations allows for perceptual grounding for concepts that are otherwise too abstract to comprehend (Goldstone & Son, 2005). In turn, this allows students to predict what will happen with abstract ideas in a more concrete way (Akpan, 2001). Simulations make it possible to show what is normally invisible, and to make explicit connections among multiple representations (Wieman, Adams, Loeblein, & Perkins, 2010). By visualizing relations, simulations can support the development of insight into complex phenomena (Akpan, 2001). Using simulations can bridge the gap between concrete and abstract ways of reasoning in a way that was not formerly available in the science classroom (Y.-F. Lee & Guo, 2008). This can ensure that students are better prepared when they ultimately move into the non-simulated context (Lindgren & Schwartz, 2009).

By reducing the amount of technical jargon and mathematics surrounding complex scientific phenomena, computer simulations have made certain topics accessible to a much wider audience (Wieman & Perkins, 2006). However, inappropriately designed simulation environments, such as overly complex ones, can hinder learning or even mislead students and have negative effects (Y.-F. Lee & Guo, 2008). Even if simulations are well-designed from an expert point of view, this does not ensure effective learning, as what is happening on a computer screen is often perceived differently by novice students (Wieman & Perkins, 2006). Care is needed to prevent students from developing misconceptions caused by taking simulated abstract concepts too literally or by assuming that every variable can be controlled easily (Hennessy, 2006).

1.1.3 Inquiry-based teaching

A more teacher-directed pedagogical approach appears to be a poor fit with what contemporary technologies such as simulations have to offer. These technologies allow for an active, student-centered way of learning; however, contemporary teaching often has a frontal, lecture-based character (Salinas, 2008). A teacher-directed stance toward implementing computer simulations in science education runs the risk of reducing learning to a step-by-step cookbook approach that exactly prescribes what students are to do. In comparison to such a teacher-directed approach, a constructivist approach is probably more effective, allowing students the opportunity to construct, test, and evaluate their own hypotheses (Windschitl, 1998). Simulations appear to fit well within science education reforms, for example, as support for Interactive Lecture Demonstrations or Peer Instruction (Finkelstein et al., 2006).

The central idea of constructivism is that knowledge construction takes place in one’s own mind (Dori & Belcher, 2005). Even though constructivism has many forms, the underlying premise is that learning is an active process in which learners engage in sense making to construct coherent and organized knowledge (Mayer, 2004). A basic assumption of teaching according to the constructivist approach is that knowledge cannot simply be transmitted by the teacher and received by the student: students must be involved in the construction of their own knowledge (Dori & Belcher, 2005). Such an approach to teaching is accompanied by a partial transfer of control over the learning process to the learners, which in many groups of students increases challenge, motivation and engagement (Hennessy et al., 2007). An empirically-based model that builds on these ideas is the learner-centered model. Establishing a more central role for students within the learning process can be accomplished by: providing them with the choice to select subjects that are more personally relevant to them; offering flexibility in how they spend their time by having them work at their own pace; giving them more responsibility over their own learning processes; and ensuring increased understanding by focusing more on critical thinking skills instead of memorization (Salinas, 2008). As working at one’s own pace provides a learner with the opportunity to integrate information before proceeding to the next phase, the information to be learned can be divided into digestible chunks (Betrancourt, 2005). When students do not actively participate in the process of knowledge construction, the risk is that the knowledge to which they are directly exposed will be isolated, distorted, or forgotten. Consequently, students are primarily able to apply the learned information in familiar contexts, without the ability to transfer it beyond them. Nurturing transfer is an important role of education (Y.-F. Lee & Guo, 2008).

Pedagogical insights appear to progress more slowly than the development of technological applications for educational use. A common resulting practice is that teachers adapt new technologies to existing pedagogies that originated from experience with more conventional means (Hennessy, 2006). This is not necessarily problematic, as a teacher may already be accustomed to such behaviors as talking through laboratory experiments according to the inquiry cycle. However, teaching with new technologies according to existing practices raises the risk that additional affordances will remain unutilized. In addition to recognizing the possibilities of new technologies for replacing existing practices, an attempt should be made to identify their unique capacities for learning that goes beyond existing pedagogies (Winn, 2005). Teacher education is an important starting point for engaging pre-service teachers in finding out how technology can be integrated as part of a curriculum, instead of being considered an additional component of an existing lesson (R.-J. Chen, 2010). Clarification of effective pedagogical approaches to teaching with simulations can offer researchers and practitioners heuristics and recommendations that can also be integrated into teacher education programs (Khan, 2011). The research literature suggests that supporting teachers in the integration of technology for inquiry learning can lead to their students acquiring deeper conceptual understanding of science, higher motivation to learn science, and improved inquiry learning practices (Maeng et al., 2013).

1.1.4 Computer simulations and laboratory activities

Because of the many advantages of simulations, it is possible that students using simulations will learn more than students using a comparable laboratory environment (Finkelstein et al., 2005). Simulations can be used in class when equipment is not available or is impractical to set up, and for doing experiments that would otherwise be impossible; variables can also easily be changed in a simulation in response to students’ questions, which is not always possible with real equipment (Wieman et al., 2010). Students can practice lab techniques before engaging in lab experience with real equipment (Akpan, 2001). They can also practice with simulations at home to repeat or extend classroom experiments for additional clarification (Wieman et al., 2010). Classroom use of simulations can also support student motivation (Khan, 2011) and interest (Akpan, 2001).

Computer simulations and laboratory activities can easily be combined in teaching practice. Such variation in teaching approaches increases the likelihood of satisfying the diversity of interests and learning needs among students, which facilitates learning compared to teaching with only one approach (Powell & Lee, 2004). Combining teacher-directed ways of teaching with high-tech tools can synergistically enhance the educational experience (Finkelstein et al., 2006). The objective should not be to replace laboratory experiments with simulations: each approach has its own benefits and drawbacks, so it does not make sense to claim that one is better than the other. Nevertheless, it is possible to imagine situations in which using simulations is the only option, because the ‘real’ experience is impossible (Hatherly, Jordan, & Cayless, 2009). Furthermore, the importance of deciding whether a learning experience should take place in a virtual or a real way must not be exaggerated. Whether a learning experience is generated by a computer or is produced on a laboratory bench is less important than whether the student considers the experience meaningful and learns from the process (Barko & Sadler, 2013).

Figure 1-3 PhET simulation Gas properties, studied by Wieman and Perkins (2006). Printed with permission.

Compared to laboratory experiments, simulations have the disadvantages that they can only show outcomes that are pre-programmed and can only be manipulated to a limited extent; moreover, they do little to develop the skill of handling lab equipment (Karlsson, Ivarsson, & Lindström, 2013). Yet obstacles to learning with laboratory activities are the limited availability of facilities, lack of time, and large classes. Besides that, the manipulation of equipment itself is time-consuming, and conducting lab experiments often comes down to ritualistically following a list of tasks, preventing students from engaging with the larger purpose of the exercise (Hofstein & Lunetta, 2004). In contrast, simulated experiments require less space, time, and money (Nickerson, Corter, Esche, & Chassapis, 2007): there is no need for lab equipment; the school schedule is less of a constraint; and it is possible to teach a larger group of students, who may also be more geographically dispersed (Karlsson et al., 2013). Most websites that are recommended for use in combination with physics laboratories are based on simulations (Chen et al., 2012). Nevertheless, an advantage of physical laboratories is that handling equipment provides tactile information, which according to theories of embodied cognition leads to deeper conceptual information processing. Another advantage is that the authentic delays between trials can stimulate careful planning of the next experiment and taking time for evaluation. Virtual laboratories, on the other hand, allow students to link processes that are normally invisible to observable processes and symbolic equations, stimulating an abstract way of reasoning that overarches different representations (de Jong, Linn, & Zacharia, 2013). Simulation allows students to explore models and processes of much higher complexity than is possible in laboratory settings at schools, without the accompanying dangers and costs (Hennessy, 2006).

In order to set the cognitive stage for students’ future learning, simulations can be used effectively for the introduction of challenging or unfamiliar concepts (Brant, Hooper, & Sugrue, 1991; Windschitl, 1998). A simulation used as a preparatory activity can function as a conceptual model that helps students to better understand and encode information. Students who are confronted with such a model during a preparatory activity are eventually more likely to remember the information and to reason with the learned principles in transfer situations (Akpan, 2001). Used after laboratory experiments or instruction, simulations can support integration of acquired basic conceptual understanding into meaningful associations (Brant et al., 1991; Windschitl, 1998). However, confronting students with a simulation only afterward can leave them with difficulty in assimilating the information in terms of the simulated model; moreover, students who are confronted with information without having seen the simulation first can have less success in encoding the information in cognitive structures than students who learned with simulations in advance (Akpan, 2001). In a recent review, Smetana and Bell (2012) recommend having simulations precede hands-on explorations if the goal is to acquire or improve science process skills, and having them follow if the goal is to enhance conceptual understanding. In practice, many teachers appear to start using technology only after having spent several lessons introducing and exploring a subject (Hennessy, 2006).

1.1.5 Providing instructional support

Ongoing development of simulations during the past decades has led to an increase in their complexity: their initially more phenomenological character, with determining the effects of changing variables as the main purpose, has evolved into a primarily inquiry-oriented character, in which operating virtual equipment can support complex thought experiments (Y.-F. Lee & Guo, 2008). By having students freely explore a simulation, their ‘messing about’ may lead to discovering the opportunities and constraints of a system (Finkelstein et al., 2005). Offering learners an open learning environment allows them to actively engage in formulating principles and procedures, and to develop higher-order skills themselves (Njoo & de Jong, 1993). ‘Openness’ of a learning environment, however, should not be interpreted as offering learners total freedom. Even though the possibility exists that something will actually be learned, empirical research of the past half-century shows that providing minimal guidance in learning is less effective and efficient than support that is focused specifically on the cognitive processes necessary for learning (Kirschner, Sweller, & Clark, 2006). Supporting discovery learning by providing sufficient guidance is of paramount importance (Alfieri, Brooks, Aldrich, & Tenenbaum, 2011; Mayer, 2004). In relation to the inquiry cycle, learning support can be specifically geared toward the phases of interpretation, experimentation, and reflection (Reid, Zhang, & Chen, 2003). Educators are advised to use this teaching approach only when sufficient time is available for the discovery processes to take place, as learning by inquiry can be time-consuming (Swaak, de Jong, & van Joolingen, 2004; Windschitl, 2000).

An important question is the extent to which the inquiry learning process needs to be supported. The issue is to find the balance between providing enough freedom for exploring the learning environment and becoming cognitively active in the process of sense making on the one hand, and providing sufficient support for cognitive activity to result in meaningful learning on the other (Mayer, 2004). If instructions are minimal, students may not succeed in exploring certain concepts while using the simulation (Meir et al., 2005). They may use the simulation only as far as necessary to answer the questions, a practice that can result in a poorly developed knowledge framework. One risk of providing more elaborate instructions is that, besides following these instructions, students may show reduced initiative to try things on their own in the simulation. Elaborately described instructions may be experienced as a kind of barrier between the student and the simulation, and give students the feeling that the simulation belongs not to them, but to the teacher (Adams, Paulson, & Wieman, 2008). Research on inquiry learning shows that providing support with an optional or just-in-time character is more effective than imposing more restrictive supports or directives (Lindgren & Schwartz, 2009).

Figure 1-4 PhET simulation Circuit Construction Kit, studied by Finkelstein et al. (2005). Printed with permission.

Learning with computer simulations can be supported from within the simulation itself as well as by a teacher. In a computer simulation, guidance and feedback can be provided by such means as hints, rollover tips, or corrective feedback (de Jong & van Joolingen, 1998). The recent research literature shows a shift toward increased attention to pedagogical issues concerning effective support for learning with computer simulations (Smetana & Bell, 2012). The importance of this shift is emphasized by our own review, in which we show that the majority of researchers investigating the learning effects of computer simulations during the past decade have ignored this pedagogical context (Rutten, van Joolingen, & van der Veen, 2012). We therefore fully endorse Hennessy’s (2011) suggestion not to consider technology as the starting point, but instead to begin by observing the patterns of interaction and activity that already exist in the classroom, and to investigate how these patterns can be supported by the affordances of technological tools.

1.1.6 The role of the teacher

It is somewhat unclear how teachers can support learning with simulations in a way that results in robust understanding of a subject (H. Y. Chang, 2013). The teacher’s familiarity with computers, attitude about the role of technology in teaching and learning, and preferred style of teaching all play a major role in whole-class teaching with computer simulations (Smetana & Bell, 2012). The role of the teacher in teaching with simulations is to stimulate pedagogical interaction involving the teacher, the students and the computer simulation. As this role is of paramount importance for the success of this mode of teaching, systematic research on how to provide optimal guidance for this teacher role is required (van Joolingen, de Jong, & Dimitrakopoulou, 2007). Up to now, the role of the teacher has been insufficiently investigated within the general research area of educational technology (Hennessy, 2006; Urhahne, Schanze, Bell, Mansfield, & Holmes, 2010), as well as with regard to computer simulations in particular (Hsu & Thomas, 2002). Moreover, little is known about how simulations can be integrated within computer-enhanced curricula (Papadouris & Constantinou, 2009; Zacharia & Anderson, 2003). Much research seems to assume that ICT implementation in schools automatically leads to changes in teaching methods and learning arrangements, without actually investigating these changes (R.-J. Chen, 2010). As pedagogical developments are unable to keep pace with the rapid development of educational technology (Hennessy, 2006), classroom application of computer simulations often means repackaging their use to fit within prevailing instructional practices (Windschitl, 2000). A consequence is that the chosen pedagogical approach is often out of step with how students are already used to working with the technology outside school (Campbell, Wang, Hsu, Duffy, & Wolf, 2010). The knowledge gaps in this research area are: what are the most effective instructional settings for teaching and learning with computer simulations, and what roles should the teacher take on in supporting the learning processes (Rutten et al., 2012; Smetana & Bell, 2012)?

A clear role for the teacher lies in creating a pedagogical context focused on an inquiry culture, because even though computer-supported learning environments are highly sophisticated nowadays, such a culture will not develop without the teacher’s systematic effort (Hennessy, 2006; Windschitl, 2000). As the person who monitors the timeline of a lesson, the teacher should allocate time to explicit instruction on exactly how a simulation works in order to ensure proper use (Marshall & Young, 2006), provide students with the opportunity to pose and pursue their own “what if…” scenarios (Hennessy, 2006), and find the right balance between covering the topics required by national standards and allowing time to help students integrate their ideas and engage in scientific inquiry (Linn, Lee, Tinker, Husic, & Chiu, 2006).

When teaching physics with simulations or demonstrations, it is possible that the informal ideas (alternative conceptions) that students already have are inappropriate or even interfere with the development of suitable discourse (Roth, McRobbie, Lucas, & Boutonné, 1997). For effective integration of computer simulations in their lessons, teachers should not only know about the range of possible alternative conceptions, but should also be able to recognize these in their particular students (Webb, 2005). As teachers are familiar with the learning content and are proficient at handling concepts, it may be particularly difficult for them to understand what students experience as hard to comprehend (Baser, 2006). But as soon as it is clear what types of alternative conceptions students have, computer simulations are a very suitable tool for deliberately creating cognitive conflict: simulations can be set up to represent situations that contradict students’ existing conceptions, after which they have the opportunity to reflect upon this and resolve the conflict (Trundle & Bell, 2010). Disequilibration was believed by both Piaget and Vygotsky to be a necessary process for learning: without unexpected events nothing rises to consciousness (Kewley, 1998).

Windschitl and Andre (1998) argue that teaching from an inquiry approach is especially suitable for students with more advanced epistemological beliefs, whereas students with less developed epistemological beliefs benefit more from an objectivist treatment. In relation to differing group achievement levels, Hennessy and colleagues (2007) argue that in less proficient groups the main focus should be on understanding the visualized phenomena, while in more able groups it is also important to focus attention on the premises on which the simulation is based. Besides prior achievement level, the cognitive impact of a simulation can also be influenced by the student’s gender (Y.-F. Lee & Guo, 2008).

Figure 1-5 PhET simulation Radio waves & Electromagnetic fields, studied by Finkelstein et al. (2006). Printed with permission.

1.2 Learning Activities Supporting Inquiry

As Bell, Urhahne, Schanze and Ploetzner (2010) point out, there is a variety of conceptualizations of the process of inquiry learning in this research area. Bell et al. (2010) describe nine distinct inquiry processes in their compilation. Our description below of five inquiry learning activities is inspired by their compilation, as well as by descriptions of learning activities by the project Science Created by You: http://scy-net.eu/scenarios/index.php/The_Scenario_Repository. Our attempt to merge inquiry processes in a conceptually consistent way resulted in the following five learning activities: Orienting & Asking questions, Hypothesis generation & Design, Planning & Investigation, Analysis & Interpretation, and Conclusion & Evaluation.

1.2.1 Orienting & Asking questions

The process of inquiry can be focused on answering a question, but also on other goals, such as investigating a controversial dilemma or solving a problem. A teacher can introduce this with a classroom discussion and support it with, for example, narratives, videos, or simulations. While the teacher provides a framework for the learning activities, the students’ involvement in discussions can be monitored. The students can take notes, ask questions, and discuss the contents. In order to be able to formulate learning goals, it must be clear what knowledge and skills the students already have, where there are gaps, and where information can be found to fill these gaps. Formulating questions can be facilitated by structuring the question (problem/case) by identifying relevant limitations and variables. Knowing when the learning activity has been successfully completed necessitates clarifying the goals that should be achieved, or the criteria that should be met.

1.2.2 Hypothesis generation & Design

Together with students’ prior knowledge and the notes they have taken, the structure of the question (problem/case) forms the basis for formulating hypotheses, which can be considered supposed relations between measurable dependent and independent variables. Depending on the focus of inquiry, the process of generating hypotheses and designs can take several approaches. An experiment can be set up to test hypotheses. For problem resolution, relating a hypothesis to the data allows for checking whether the hypothesis solves the problem. Another approach to investigating hypotheses is to design a model by building a physical or virtual artifact. For example, students can design a house to investigate influences on CO2 emissions. The appropriateness of such models can be evaluated by relating them to the notes that students took during the learning process.

1.2.3 Planning & Investigation

Clearly formulated hypotheses facilitate planning the work process. Planning includes determining the order of activities and intermediate goals, and how these activities can be divided among the participants. Investigations can be performed by conducting experiments or designing artifacts, using physical or virtual tools.

1.2.4 Analysis & Interpretation

Teachers can support the students’ process of data investigation by helping them organize the collected data and interpret them by identifying key issues. When solving problems, solutions found by experts can also be examined and compared with the students’ own solutions to the same problem. For the investigation of controversial cases, different perspectives on approaching the case can be analyzed, and the value of different information sources can be evaluated. These processes can generate new questions for further inquiry.

1.2.5 Conclusion & Evaluation

Arriving at conclusions in the inquiry process can mean achieving consensus about a solution to a problem, producing a common artifact, or synthesizing views to arrive at a mutual decision. As it is important not only to arrive at a conclusion, e.g., to solve a problem, but also to have actually learned something, reflection is necessary to allow for recognition of similar problems (questions/cases) in the future, transfer of knowledge to such situations, and the ability to apply the learned strategy. Besides evaluating one’s own outcomes, it can also be interesting to evaluate others’ outcomes and determine the extent to which they meet the criteria set. Comparing the collected data to such criteria can necessitate subsequent refinement of the conceptual model. When determining whether learning goals have been achieved, it can be valuable for future inquiry activities to identify which factors have been facilitators of or barriers to attaining the goals.

1.3 Teaching with Technology

1.3.1 Facilitating teaching with technology

Even though hardware and software are widely available for teaching and their quality has improved dramatically, technology appears to be used mainly for presentation purposes, instead of for engaging students in learning activities (R.-J. Chen, 2010). For learning in virtual environments, it is more important that learning activities be carefully designed than that an exotic interface allow for intuitive interaction (Mikropoulos & Natsis, 2011). Involving students in inquiry learning in a creative, student-centered way is just as possible in a whole-class setting with only one computer and one projection screen: a strong inquiry lesson does not require advanced technology (Maeng et al., 2013). In general, the focus is too much on using the software tools, and too little thinking goes into how to integrate these tools into the process of teaching in order to provide added educational value and achieve learning goals (Papadouris & Constantinou, 2009). Merely exposing students to technology does not achieve the goal of engaging students in scientific inquiry activities to develop deep scientific understanding (Schrum et al., 2007).

Teaching within ICT-rich learning environments allows for making use of their extra ‘affordances’, where ‘affordance’ in this context refers to what the learning environment offers the student (Webb & Cox, 2004). The use of these new affordances within ICT-rich learning environments leads to higher complexity in teachers’ pedagogical reasoning concerning both the planning and the teaching of lessons, and it requires adaptation of teachers’ values and beliefs (Webb, 2005). Teachers need additional knowledge, understanding, and experience with ICT to be able to: determine the affordances of ICT tools that are related to their teaching goals; develop suitable tasks that make the specific affordances available; and respond adequately to students’ reactions in the classroom. During teaching, teachers can facilitate student learning in three ways: by providing the appropriate affordance; by increasing the level of affordance that a tool provides, for example, by having students make predictions with a simulation; and by providing additional information about an affordance, for example, by explaining or demonstrating certain features of a tool (Webb & Cox, 2004). To support student learning, a teacher first needs to realize what misconceptions students can have about a certain subject. One example of a lack of coherence resulting in misconceptions is that students consider slowing down, speeding up, moving at a constant speed, and standing still to be independent states of motion, not governed by a fixed relationship between force and acceleration. Another example is that students think, on the one hand, that slowing down at a constant rate necessitates a constant opposing force, but on the other hand, that a constantly increasing force is needed for an object to constantly accelerate (Thornton & Sokoloff, 1998). Teachers need to be able to determine the extent to which their own students hold such misconceptions. When considering the use of ICT tools, the teacher needs to be capable of determining whether the affordances of a tool allow for removing students’ misconceptions. To decide whether or not to teach with specific ICT tools, the teacher must be able to compare the affordances of the tool with the affordances of alternative ways of teaching (Webb, 2005).
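The inconsistency in the second example can be made explicit with Newton’s second law; the brief restatement below is our own illustration, not taken from Thornton and Sokoloff (1998).

```latex
% Newton's second law: net force is proportional to acceleration.
F_{\mathrm{net}} = m\,a
% Slowing down at a constant rate is a constant (negative) acceleration,
% so a constant opposing force suffices, consistent with the first belief.
% By the same law, a constant positive acceleration also requires only a
% constant net force, not a steadily increasing one:
a = \mathrm{const} \;\Longrightarrow\; F_{\mathrm{net}} = m\,a = \mathrm{const}
```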

Salinas (2008) provides a framework that shows how different aspects of teaching with technology are intertwined: it shows relations between the learning needs of students, the technology that could be used, the relevant level of Bloom’s Taxonomy, and the most appropriate role for the teacher in supporting the learning process (see Figure 1-6). According to Salinas, today’s teachers are not trained to change their roles in ways that allow for optimal facilitation of learning with technology. Learning with technology can support learning at higher levels of Bloom’s Taxonomy, but it necessitates appropriate role adaptation by the teacher, namely, changing from a leader into a guide/facilitator. As Salinas argues: “Only by using technology not as an aid to teaching or as a sophisticated toy, but as a fully integrated educational tool, will our students learn not only how to read, write, and do math, but also how to explore, create new knowledge, and solve the problems that certainly await them in the 21st century” (p. 8).

Figure 1-6 A framework relating learners’ needs, suggested tools, and the teacher’s role, as originally published by Salinas (2008). Printed with permission.

1.3.2 Improving teaching with technology

According to Schrum (1999), three kinds of experience are crucial for supporting preservice teachers in learning about technology and about how they can integrate it into their teaching. First, they should be exposed to different kinds of technological tools in skill-based courses. Second, they should learn how these technological tools can be integrated into subject areas in methods courses. And third, they should be placed in a technology-rich environment in actual teaching practice, where they can receive support while implementing technology-rich lessons. In other words, acquisition of technological skills by preservice teachers requires not only the necessary knowledge, but also opportunities to practice implementing these skills, in order to eventually improve students’ learning results (R.-J. Chen, 2010).

Dexter and Riedel (2003) suggest several approaches to developing preservice teachers’ beliefs and self-efficacy concerning the integration of technology into their teaching. Teacher expertise can be augmented by having teachers collaborate in workshops. Moreover, collaborating teachers can be stimulated to start technology-rich projects together with preservice teachers. A third suggestion is to observe others’ teaching practices in order to incorporate concrete examples of technology integration in a curriculum into one’s own teaching approach (R.-J. Chen, 2010).

Merely bringing teachers together in computer workshops is insufficient to accomplish actual changes in teachers’ pedagogical practices. A blended approach has a higher chance of success: short workshops alternating with periods in school, with the teachers communicating with each other and exchanging learning materials (Voogt, Almekinders, van den Akker, & Moonen, 2005). Unfortunately, it still happens that teachers are not involved in the development of technology-integrated education. If in such cases the teachers are considered the cause of the failure of a technological innovation, this is rather unfair (Urhahne et al., 2010).

According to McCrory (2008) there are four knowledge aspects that are essential within the science domain: knowledge of science, knowledge of students’ preconceptions, knowledge of science-specific pedagogy, and knowledge of ICT. Webb (2008) argues that science teachers need to learn how to link the affordances of specific ICT tools to prevailing misconceptions of their students in order to effectively integrate ICT tools into learning activities. Voogt investigated the effectiveness of ICT professional development arrangements for teachers in which the principles stated by McCrory and Webb were integrated. The teachers appeared to be able to plan and perform ICT-supported science lessons; however, they needed more time to actually integrate ICT into their daily teaching practices (Voogt, 2010).

1.4 Our Research on Teaching with Computer Simulations

Investigating computer simulations under ecologically valid conditions involves confronting several practical barriers. Finding teachers willing to participate in our studies was hard, because of the time investment required on top of their usual workload and the small benefit of participating, other than a small gift and the recordings of the lessons for self-reflection. Teachers who are willing to have our research implemented in their regular lessons are necessarily confronted with the scheduling structure required by our research design. Donnelly and colleagues (Donnelly, O’Reilly, & McGarr, 2013) mention several complicating variables for the implementation of rigid experimental designs in a secondary school-based context that also apply to our research studies: restraining the control group in an experimental intervention focused on learning gains will meet with ethical resistance from teachers, principals, and parents; teachers are not required to participate in scientific research; funds to financially compensate participating teachers are unavailable; setting up the studies requires taking travel time into account because of the geographical spread of participating teachers; and daily school life involves frequent irregularities due to student absenteeism, school events, teacher training days, school holidays, and so forth. Combining the standards of experimental research with the possibilities within teachers’ daily teaching practices inevitably necessitates a compromise. Allowing for practical feasibility in our studies has consequently resulted in the research being performed at different points during the school year and in varying durations of implementation of the experimental study.

All observations, interviews and experiments described in this dissertation were conducted in collaboration with physics teachers at secondary schools in the Netherlands. The results of TIMSS-Advanced (Meelissen & Drent, 2009), an international comparative study of secondary education, reveal that Dutch physics teachers feel very well-equipped to teach the subject matter. Eighty-six percent of the teachers use a computer during the lesson for whole-class instruction or demonstration purposes. Compared to the other countries participating in the TIMSS, Dutch students have the least opportunity to perform experiments during the physics lesson and are the most likely to be given tests consisting entirely of open questions, most of which focus on application of knowledge rather than on its reproduction.

The teachers in our studies mostly use the simulation suite created by the Physics Education Technology project: http://phet.colorado.edu (2014). The PhET simulation suite consists of more than 100 freely available online computer simulations that teachers can use in their teaching. We noticed that these simulations are widely used in Dutch secondary physics education. The widespread introduction of interactive whiteboards in Dutch science classrooms facilitated the embrace of PhET simulations by physics teachers. The fact that nearly all of them have been voluntarily translated into Dutch illustrates the appreciation that science teachers in the Netherlands have for these simulations. In line with most educational simulations, the PhET sims allow interaction through changing variables to which the simulation responds dynamically, which enables learning from a constructivist approach; this range of interaction possibilities is productively constrained in order to prevent the student from being overwhelmed and going astray; and the simulations allow for visualization of phenomena and processes that are normally invisible. Moreover, the PhET design team put a lot of effort into making the simulations engaging, and also into building in a considerable amount of guidance and feedback, including ways to address common misconceptions (Finkelstein et al., 2006; Wieman et al., 2010). The implementation of each generation of PhET simulations is preceded by an iterative cycle of testing in and out of class, student interviews, and making necessary improvements (Finkelstein et al., 2005). Based on their own research, the authors claim that their simulations fit well with the present environment of internet and games that young people grow up in; that their simulations allow for conveying ideas in very different and powerful ways; and that they can be pedagogically more effective than demonstrations or laboratory experiments. Concerning the ways students interact with PhET simulations, the authors report that students are often inclined to explore extreme cases in the simulation by themselves; that they seem more inclined to explore the simulations than laboratory experiments; and that students are rarely misled by visualizations on an unrealistic scale, e. g., large blue electrons crawling over the screen (Wieman & Perkins, 2006).

1.4.1 Research questions

The overarching research question on which this dissertation focuses is:

· How can whole-class science teaching benefit from computer simulations?

To answer this research question we studied the literature, observed lessons, interviewed teachers, had students fill in questionnaires, and conducted experiments.

The purpose of the first study in this dissertation was to find out what is already known about teaching with computer simulations. We therefore conducted a review study focused on the following research questions:

· How can traditional science education be enhanced by the application of computer simulations?

· How are computer simulations best used in order to support learning processes and outcomes?

After having studied the literature, we turned to investigating teaching practices to experience first-hand how physics teachers teach with computer simulations, and to learn from them and their students how they think about teaching with these tools. For this purpose, we arranged lesson observations and interviews with 24 physics teachers of schools located throughout the Netherlands. In this study we investigated this research question:

· How do physics teachers use computer simulations in whole-class teaching?

We endeavored to gain insight into teaching with simulations by delving into the research literature, by observing lessons in which it occurred, and by inquiring about it with the teachers and their students. This gradually led to ideas about how teaching with simulations works. We wanted to put these beliefs to the test by conducting an experiment. One option for setting up an experiment is to have groups of students come to the university and participate under highly controlled conditions. However, such an approach would undermine ecological validity. We therefore chose to conduct the experiments ‘in the field’, that is, at the schools. As the groups of students remained intact, the study is quasi-experimental. Inspired by the results of the review study and the observational study, the purpose of the experimental study was to investigate whether the enhanced learning outcomes of inquiry-based learning with computer simulations also apply at the whole-class level when inquiry-based teaching follows a student-centered approach, that is, Peer Instruction. We focused on the following research questions:

· How does an inquiry-based teaching approach support learning with computer simulations in a whole-class setting?

· How does pedagogical interaction in whole-class teaching with computer simulations influence learning gains?

We chose the research design that is frequently used in the studies that we have reviewed: a comparison of the learning effects for several groups of students over several lessons, measuring the students’ knowledge about a subject at certain timepoints. We replicated this pre-post research design five times. We analyzed the interaction between the teachers and their students to study the impact of teacher support in different contexts.
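To make the core of this design concrete, the following minimal Python sketch, with invented scores and group labels purely for illustration (it is not the analysis code used in our studies), compares the knowledge gains of two intact groups with an independent-samples t-test:

    # Illustrative pre-post comparison between two intact groups.
    # All numbers are invented; the actual studies used more elaborate analyses.
    from scipy import stats

    pre_a, post_a = [4, 5, 6, 5, 4, 6], [7, 8, 8, 7, 6, 9]  # condition A
    pre_b, post_b = [5, 4, 6, 5, 5, 4], [6, 5, 7, 7, 6, 5]  # condition B

    # Gain score per student: posttest minus pretest.
    gains_a = [post - pre for pre, post in zip(pre_a, post_a)]
    gains_b = [post - pre for pre, post in zip(pre_b, post_b)]

    # Compare the mean gains of the two groups.
    t, p = stats.ttest_ind(gains_a, gains_b)
    print(f"t = {t:.2f}, p = {p:.3f}")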

Chapter 2

The Learning Effects of Computer Simulations in Science Education

This article reviews the (quasi-)experimental research of the past decade on the learning effects of computer simulations in science education. The focus is on two questions: how use of computer simulations can enhance traditional education, and how computer simulations are best used in order to improve learning processes and outcomes. We report on studies that investigated computer simulations as a replacement of or enhancement to traditional instruction. In particular, we consider the effects of variations in how information is visualized, how instructional support is provided, and how computer simulations are embedded within the lesson scenario. The reviewed literature provides robust evidence that computer simulations can enhance traditional instruction, especially as far as laboratory activities are concerned. However, in most of this research the use of computer simulations has been approached without consideration of the possible impact of teacher support, the lesson scenario, and the computer simulation’s place within the curriculum.

2.1 Introduction

The increasing availability of computers and related equipment such as smartboards and mobile devices, as well as the fact that computer simulations have become available for a wide range of science subjects (e. g., the PhET sims at http://phet.colorado.edu, 2014), have led to simulations becoming an integral part of many science curricula. This raises the question of how simulations are best used to contribute to improved learning of science. Research into the use of computer simulations has a long history, as de Jong and van Joolingen (1998) pointed out in their review. In the present review, we investigate the state of the art in simulations for science education, focusing on the ways simulations can be used to enhance traditional instruction and on the ways they can be embedded in instructional support to promote learning processes. We determined the reported effects of those interventions on learning process and learning outcome.

According to de Jong and van Joolingen (1998) a computer simulation is “a program that contains a model of a system (natural or artificial; e. g., equipment) or a process”. The use of such simulations in the science classroom has the potential to generate higher learning outcomes in ways not previously possible (Akpan, 2001). In comparison with textbooks and lectures, a learning environment with a computer simulation has the advantages that students can systematically explore hypothetical situations, interact with a simplified version of a process or system, change the time-scale of events, and practice tasks and solve problems in a realistic environment without stress (van Berkum & de Jong, 1991). A student’s discovery that predictions are confirmed by subsequent events in a simulation, when the student understands how these events are caused, can lead to refinement of the conceptual understanding of a phenomenon (Windschitl & Andre, 1998). Possible reasons instigating teachers to use computer simulations include: the saving of time, allowing them to devote more time to the students instead of to the set-up and supervision of experimental equipment; the ease with which experimental variables can be manipulated, allowing for stating and testing hypotheses; and the provision of ways to support understanding with varying representations, such as diagrams and graphs (Blake & Scanlon, 2007).

By placing emphasis on the learner as an active agent in the process of knowledge acquisition, computer simulations can support authentic inquiry practices that include formulating questions, hypothesis development, data collection, and theory revision. Proceeding through a simulation can gradually lead learners to infer the features of the simulation’s conceptual model, which may lead to changes in the learners’ original concepts (de Jong & van Joolingen, 1998). By actively involving learners in exploring and discovering, computer simulations can be powerful learning tools, as learning involving doing is retained longer than learning via listening, reading, or seeing (Akpan, 2001). Even though the tendency towards more learner-centered instead of teacher-centered education has caused this discovery learning approach to be popular (Veermans et al., 2006), the extent to which control can be turned over from the teacher to the learner does have its limits. If there is insufficient support for the processes of discovery learning within a computer simulation, learners have difficulties in generating and adapting hypotheses, designing experiments, interpreting data and regulating learning (de Jong & van Joolingen, 1998). Minimization of guidance clearly leads to a deterioration of the effectiveness of inquiry learning. Even though providing learning support restricts the students’ possibilities of freely exploring the simulation environment to a certain extent, the scaffolding it provides improves their performance in simulation-based learning (van Berkum & de Jong, 1991).

In recent years learning technologies have lost their initial prestige, because they were often introduced with mythical overstatements regarding their effects on learning processes and outcomes, and were subsequently unable to live up to those expectations (Dillenbourg, 2008). What does research of the past decade on computer simulations in science learning say about their educational effectiveness in this regard? Is their application advisable, and is their effectiveness as a pedagogical intervention robust?

In our investigation of the interventions considered in research on educational computer simulations in science learning, we first endeavored to cluster the themes of those interventions into categories. Subsequently, we examined whether the effectiveness of computer simulations from the perspective of those themes could be deduced from the research. In other words, the present study focuses on answering the following research questions:

· How can traditional science education be enhanced by the application of computer simulations?

· How are computer simulations best used in order to support learning processes and outcomes?

We were also interested in the extent to which researchers considered the role of teachers in guiding the students’ learning processes while working on the simulation.

Our categorized themes are based on the results of our literature search and serve merely as a possible way of organizing these results. Therefore, we certainly do not claim this categorization to be comprehensive. However, a clearer picture of the effectiveness of computer simulations from the perspective of these themes can serve as a basis for deriving teacher guidelines for providing effective guidance to students while working with computer simulations.

2.2 Method

2.2.1 Data collection

To answer the research questions, three databases were searched for relevant research articles: ERIC (2014), Scopus (2014) and ISI Web of Knowledge (2014). Searching these databases started on September 21st, 2009 and was repeated to track changes until the final check on April 14th, 2011. We limited our search to the past decade (publications from the period 2001-2010). Journal articles and reviews were searched by using the following keywords: [“computer simulation” OR “interactive learning environment”], [science OR physics OR chemistry OR biology OR mathematics] and [(education* OR instruction*) AND (teach* OR train*)]. The ERIC search resulted in 333 articles. The Scopus search, additionally limited by [Social Sciences OR Psychology], resulted in 163 articles. The search of ISI Web of Knowledge, additionally limited by [Education & Educational Research OR Psychology OR Behavioral Sciences], resulted in 89 articles. By comparing the total of 585 publications, we found 75 duplicates. The exclusion of these duplicates resulted in a total of 510 unique publications.
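As an illustration of this search-and-deduplication step, the following Python sketch assembles the combined boolean search string and filters out duplicates across the three result sets. The record fields and the deduplication key are hypothetical simplifications of what was in fact a manual comparison, not an actual database API:

    # Illustrative sketch: combined boolean query and cross-database
    # deduplication. Record fields are hypothetical simplifications.
    query = ('("computer simulation" OR "interactive learning environment") '
             'AND (science OR physics OR chemistry OR biology OR mathematics) '
             'AND ((education* OR instruction*) AND (teach* OR train*))')

    def deduplicate(records):
        """Keep one record per (first author, year, title) combination."""
        seen, unique = set(), []
        for rec in records:  # each rec: {'author': ..., 'year': ..., 'title': ...}
            key = (rec['author'].lower(), rec['year'], rec['title'].lower())
            if key not in seen:
                seen.add(key)
                unique.append(rec)
        return unique

    # ERIC (333) + Scopus (163) + ISI Web of Knowledge (89) = 585 records;
    # removing the 75 duplicates found this way leaves 510 unique publications.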

To decide whether publications were actually about computer simulations as we conceive them, we determined whether de Jong and van Joolingen’s (1998) definition of computer simulation (as stated earlier) applied. We consider the possibility of interacting with a simulation to be an essential characteristic, distinguishing this construct from instructional animations (which have been excellently reviewed by Höffler and Leutner, 2007). We focused on students aged 12 to 20, as these are the most important years for the acquisition of basic scientific knowledge. Studies on computer simulations in areas other than science were excluded, except for the other affiliated STEM disciplines: technology, engineering and mathematics. Studies about modeling were also excluded, as modeling can be considered to be a distinct research area. We are particularly interested in studies in which the computer simulation serves an educational purpose. Therefore, we selected those studies in which the use of the computer simulation is aimed at changing knowledge and/or skills, and in which these changes are quantitatively measured and a comparison between groups and/or between pretest and posttest is made. As our interest in the measurement of learning effects led us to focus on (quasi-)experimental studies, applying this criterion inevitably excluded works that address relevant questions but do not focus on an empirical intervention, such as research on the Molecular Workbench by the Concord Consortium (Pallant & Tinker, 2004; Tinker & Xie, 2008) and research on the Physics Education Technology (PhET) project (Finkelstein et al., 2005; Finkelstein et al., 2006; Wieman & Perkins, 2005, 2006). Studies that merely describe the design of a computer simulation, or that use subjective judgment (as in feedback questionnaires) as the only instrument of measurement, were excluded. To ensure the quality of the publications, we excluded studies in journals that are not registered in the ISI Web of Knowledge (2014) database. Applying all exclusion criteria left a total of 51 publications: 48 empirical studies and 3 reviews. Figure 2-1 gives the geographical origins of the publications studied.

Figure 2-1 Geographical origin of the studied publications.

2.2.2 Qualitative analysis

After investigating the contents of each publication, we endeavored to find ways to categorize the studies by looking at coherence between the interventions on which the studies focused. We distinguished the following themes that served to interrelate the interventions addressed across this body of studies: variations in representation; degree of immersion (the extent to which users of a virtual environment actually believe they are inside this environment); instructional support; gaming; level of engagement; teacher guidance; and collaboration. Studies that make an overall comparison between using computer simulations and traditional instruction, and that therefore focus less specifically on certain themes, are reviewed separately. After having scored each study’s intervention for thematic relatedness, we concluded that considering gaming and engagement as separate categories would be unjustified, as each of these themes is the focus of intervention in merely two studies. This review is organized according to the remaining themes as grouped under four major categories: enhancement of traditional instruction with computer simulation (including the themes traditional instruction and laboratory activities), different kinds of visualization (including representation and immersion), different kinds of instructional support, and classroom settings and lesson scenario (including engagement, teacher guidance and collaboration). We distinguish visualization from representation, as using different media (e. g., stereoscopic glasses and a computer screen) allows for visualizing the same representation in various ways. Representations displaying processes that change with respect to time are referred to as dynamic (Ainsworth & VanLabeke, 2004).

2.2.3 Statistical analysis

Where relevant and possible, effect sizes were calculated. We used the methods of Thalheimer and Cook (2002) for calculating Cohen’s d from the data presented in the studies. Even though Thalheimer and Cook (2002) offer five different ways of calculating Cohen’s d, the data from 19 studies did not allow for its calculation, which makes drawing conclusions at a meta-analytic level unwarranted. Among the 43 studies that report an intervention effect, the reasons for missing d’s are: the t-tests or F-tests, or the data on which these could be based, are missing (12 studies); or statistical data are reported only at a detailed level and do not allow for calculation of a d that can be associated with the authors’ overall conclusions (7 studies).
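For readers who wish to reproduce such effect sizes, the following minimal Python sketch implements two standard computational routes of the kind collected by Thalheimer and Cook (2002): Cohen’s d from group means and standard deviations via the pooled standard deviation, and d recovered from an independent-samples t value and the two group sizes. The example numbers are invented:

    # Two standard ways of computing Cohen's d from reported statistics.
    from math import sqrt

    def cohens_d_from_means(m1, s1, n1, m2, s2, n2):
        """d = difference of the group means divided by the pooled SD."""
        pooled_sd = sqrt(((n1 - 1) * s1**2 + (n2 - 1) * s2**2) / (n1 + n2 - 2))
        return (m1 - m2) / pooled_sd

    def cohens_d_from_t(t, n1, n2):
        """d recovered from an independent-samples t value and group sizes."""
        return t * sqrt((n1 + n2) / (n1 * n2))

    # Invented example: experimental group M = 7.4 (SD = 1.8, n = 30) vs.
    # control group M = 6.1 (SD = 1.5, n = 30).
    print(round(cohens_d_from_means(7.4, 1.8, 30, 6.1, 1.5, 30), 2))  # 0.78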

2.3 Results

In reviewing the publications, all studies were assigned to four major categories, which were subdivided into specific themes: Enhancement of traditional instruction with computer simulation, subdivided into ‘computer simulation and traditional instruction’ and ‘laboratory activities and computer simulation’; Comparison between different kinds of visualization, subdivided into ‘different kinds of representation’ and ‘varying degrees of immersion’; Comparison between different kinds of instructional support, subdivided into ‘supporting Scientific Discovery Learning’ and ‘instructional support approached from other perspectives’; and Classroom settings and lesson scenario, subdivided into ‘varying levels of engagement’ and ‘the roles of teacher guidance and collaboration’. Needless to say, most studies do not focus on just one theme, but rather attempt to take several themes into account. Therefore, our categorization of the studies is based on each study’s focal intervention. For example, the studies by Gelbart et al. (2009), Mitnik et al. (2009), Manlove et al. (2006), and Saab et al. (2007) all take collaboration into account, but none of them is placed under the category of collaboration: while all of these studies had collaborating students in the experimental group as well as in the control group, they focused their intervention itself on another theme.

2.3.1 Enhancement of traditional instruction with computer simulation

The 17 studies in this section have in common that they investigated the effects of computer simulations as a supplement or alternative to traditional teaching, as opposed to comparing different kinds of simulations with each other. We first discuss studies that are relatively diverse in their research approach, and then move on to discuss studies that investigated the specific topic of computer simulations in the realm of laboratory activities. Table 2-1 provides specific details on the studies reviewed in this section.

Computer simulation and traditional instruction

Jimoyiannis and Komis (2001) compared a group of students who received traditional classroom instruction with a group who were exposed to both traditional instruction and computer simulations. They investigated the effect of this intervention on students’ understanding of basic kinematics concepts concerning simple motions through the Earth’s gravitational field. The students who used the computer simulation in addition to traditional instruction achieved significantly higher results on the research tasks. Therefore, the researchers suggest that computer simulations can be used as a complement to or alternative for other forms of instruction in order to facilitate students’ understanding of velocity and acceleration. Stern, Barnea and Shauli (2008) similarly compared two groups of students, both of which were taught a curriculum unit on the kinetic molecular theory. The experimental group subsequently spent additional class periods using the computerized simulation “A Journey to the World of Particles”. The students in the experimental group scored significantly higher than the students in the control group (Cohen’s d = 0.81) on a test measuring their understanding of the theory. However, overall achievement was very low and long-term learning differences were negligible. The authors attribute this to a lack of sound teaching strategies, i. e., addressing students’ prior knowledge and guiding their interpretations of learning experiences.

The study by McKagan, Handley, Perkins, and Wieman (2009) investigated the effects of reforming a physics course; among other changes, a computer simulation was implemented in the curriculum on the photoelectric effect. As demonstrated by improved exam achievement, the reformed curriculum led to an improved ability to predict the results of experiments on the photoelectric effect. However, students’ ability to connect observations and inferences logically did not improve. According to the authors this might be a symptom of students’ more general lack of reasoning skills for drawing logical inferences from observations. As the implemented learning techniques were not investigated separately, claims about the effectiveness of computer simulations can hardly be made on the basis of this study.

In comparing a research simulation with regular class work within the domain of genetics, Gelbart, Brill and Yarden (2009) found a significantly positive influence of the computer simulation on learning outcomes. Students’ understanding was measured by testing their ability to respond correctly to true/false statements (d = 0.87) and to provide explanations for their choices (d = 0.80). Moreover, two learning types could be distinguished among the students, based on differences in how they take advantage of opportunities to become conversant with research practices: students who are more research-oriented appear to be more able to expand their knowledge in comparison with more task-oriented students. Riess and Mischo (2010) also assessed the learning effectiveness of a computer simulation by investigating students’ task performance as well as their ability to provide explanations for their answers. Students received the task of cultivating a simulated section of forest in order to experience short- and long-term effects of treating a forest as a cultivated ecosystem (see Figure 2-2 and Figure 2-3). Results show that systems thinking can be most effectively fostered by providing a combination of specific lessons and opportunities to explore a computer simulation (d = 0.37 for achievement and d = 0.13 for justification).

Figure 2-2 User interface of the computer-simulated forest game, studied by Riess and Mischo (2010). Printed with permission.

Figure 2-3 Relations accounted for in the computer simulation studied by Riess and Mischo (2010). Printed with permission.

In their study, Duran, Gallardo, Toral, Martinez-Torres, and Barrero (2007) focused on both the affective and cognitive domains in order to investigate the effects of a computer simulation on students’ motivation and interaction. They replaced part of the traditional method in a subject titled “Electrical Machines and Installations” with a software-based method that made use of a computer simulation. This appeared to stimulate discussions among the students themselves as well as with the teacher during the brainstorm session. Although the results for the cognitive domain could not be clearly interpreted, the results for the affective domain indicate that the new method has a profound influence on student satisfaction. The authors ascribe this improvement to the use of real-world examples and showing real-time simulations during lectures. Additionally, the new method improves participation and students’ initiative compared to traditional instruction. We will elaborate further on this study in section 2.3.4. The study by Kiboss, Ndirangu and Wekesa (2004) similarly took both cognitive and affective gains into account. Their Computer-Mediated Simulation program on the biology subject of cell theory led to improvements in academic achievement (d = 1.54), students’ perceptions of classroom environment (d = 2.78), and their attitudes toward the subject (d = 2.16, all very large effects).

Laboratory activities and computer simulation

Multiple studies focused on using simulations as a means of preparing students for laboratory activities. In the study by Martinez-Jiménez et al. (2003), students in both the control and experimental groups performed an experiment on the extraction of caffeine from tea. A pre-laboratory simulation program introduced the experiment for the experimental group. Student performance was evaluated on carrying out the experiment, laboratory report quality, experiment problem-solving, and the results of a written test. The researchers found that using the preparatory simulation led to better comprehension of the techniques and basic concepts used in laboratory work, and that the students with the greatest learning deficiencies profited most from using the pre-laboratory program. In a study by Baltzis and Koukias (2009), students taking a course on analog electronics were encouraged to complete a circuit simulation task individually prior to performing a laboratory experiment in pairs. This intervention led to increased interest in the course and an overall improvement of academic performance.

Another study on pre-laboratory exercises was conducted by Winberg and Berg (2007), who considered the questions that students ask their teachers during the laboratory exercise as an indicator for cognitive focus, and took the spontaneous use of chemistry knowledge during interviews as an indicator of the usability of knowledge. The results of their experiments suggest that introducing laboratory work with a preparatory computer simulation leads to students asking more theoretical questions during laboratory work and showing more chemistry knowledge while being interviewed. The authors therefore conclude that preparatory exercises intended to help students integrate their theoretical, conceptual knowledge into schemata can allow room for reflection, but may also contribute to students having a better sense of direction during their laboratory work. In a similar fashion, Limniou, Papadopoulos, Giannakoudakis, Roberts and Otto (2007) show that replacing part of a laboratory session on the topic of viscosity with a collaborative pre-lab simulation exercise can improve content knowledge.

In a more recent study, Dalgarno et al. (2009) compared the ability of a 3-dimensional Virtual Laboratory (VL) and a Real Laboratory (RL) to function as a tool for familiarizing students with the spatial structure of a laboratory and the apparatus and equipment it contains. After the VL-group had explored the simulation (see Figure 2-4) and the RL-group had been taken on a tour of the actual laboratory, all students were tested on their recall of the laboratory layout and their familiarity with apparatus. The researchers conclude that the Virtual Laboratory is an effective tool for familiarization with the laboratory setting.

Figure 2-4 The Virtual Chemistry Laboratory, studied by Dalgarno et al. (2009). Printed with permission.

In studying the differences between using so-called Real Experimentation (RE) and Virtual Experimentation (VE), Zacharia (2007) compared a control group only using RE with an experimental group using a combination of RE and VE (see Figure 2-5). The results indicate that replacing RE with VE during a specific part of the experiment has a positive influence on students’ conceptual understanding of electrical circuits, as measured by conceptual tests (d = 0.70). The effectiveness of virtual laboratories versus real laboratories as learning mechanisms has also been investigated by Gibbons, Evans, Payne, Shah and Griffin (2004) in a bioinformatics class (see Figure 2-6). Based on their first study, they assert that virtual laboratories can save significant amounts of time for students without affecting learning (d = 3.56 for practice and d = 2.36 for assessment, both very large effects on study time, where practice and assessment refer to exercises with and without immediate feedback, respectively). In their second study they found that virtual laboratories do not necessarily lead to performance improvements: the potential of virtual laboratories to outperform traditional laboratories depends on the nature of the presented material (d = 0.71 for topic 1 and d = 0.40 for topic 2). In a study on the topic of protein structure, White, Kahriman, Luberice and Idleh (2010) compared traditional teaching with a 3D visualization and a simulation that shows the consequences for protein folding of altering amino acid sequences. The authors conclude that the learning effectiveness of their visualization and simulation activities is comparable, and that these activities are more effective than traditional instruction.

Figure 2-5 The Virtual Laboratories Electricity environment, studied by Zacharia (2007). Printed with permission.

Figure 2-6 Example of chromosome analysis in Karyolab, studied by Gibbons et al. (2004). Printed with permission.

Meir, Perry, Stal, Maruca and Klopfer (2005) investigated another virtual laboratory called OsmoBeaker, which allows students to perform inquiry-based experiments on diffusion and osmosis at the molecular level (see Figure 2-7). Even though their simulated laboratories lead to improved understanding and can help to overcome student misconceptions, the authors emphasize the key role that written instructions accompanying the simulations play in promoting learning, as simply presenting a simulation environment to students is not enough.

Figure 2-7 An axon-length experiment in the diffusion lab, studied by Meir et al. (2005). Printed with permission.

Chang, Chen, Lin, and Sung (2008) compared using computer simulations with traditional laboratory learning, as well as different supportive learning models with each other. They also investigated whether abstract reasoning abilities would have an impact on the extent to which students could learn from simulations. The results show that learning about optical lenses by using simulations leads to a significantly greater improvement in learning outcomes in comparison with traditional laboratory practice (although all effect sizes are small: d = 0.12 for experiment prompting vs. lab, d = 0.17 for hypothesis menu vs. lab, and d = 0.11 for step guidance vs. lab). Students with better abstract reasoning abilities appear to benefit more from simulation-based learning (d = 0.06, a negligible effect). The authors conclude that helping students with the construction of hypotheses is a good way to support simulation-based learning in general. However, they warn that offering support during experimental procedures limits the students’ freedom insofar as they are to follow the offered steps, which can weaken their learning results.

Summary and discussion

The reviewed studies that compared the application of computer simulations with traditional instruction seem to indicate that traditional instruction can be successfully enhanced by using computer simulations. Within traditional education they can be a useful add-on, for example serving as a pre-laboratory exercise or visualization tool. In most cases simulation conditions showed improved learning outcomes, with effect sizes up to 1.54. With regard to the cognitive domain, use of computer simulations appears to facilitate students’ conceptual understanding (Jimoyiannis & Komis, 2001; Meir et al., 2005; Stern et al., 2008; Zacharia, 2007), requires less time (Gibbons et al., 2004), and improves the ability to predict the results of experiments (McKagan et al., 2009). With regard to the affective domain, computer simulations can positively influence students’ satisfaction, participation and initiative (Duran et al., 2007) and improve their perception of the classroom environment (Kiboss et al., 2004). Studies that specifically focused on using computer simulations as pre-laboratory exercise tools conclude that they can effectively support familiarization with the laboratory (Dalgarno et al., 2009), improve students’ cognitive focus (Winberg & Berg, 2007), lead to better comprehension of the techniques and basic concepts used in laboratory work (Martinez-Jiménez et al., 2003), and increase interest in the course and improve academic results (Baltzis & Koukias, 2009; Limniou et al., 2007). Martinez-Jiménez et al. (2003) additionally report that those students with the greatest learning deficiencies profited most from working with the pre-laboratory program.

The results of research on the enhancement of traditional instruction with computer simulation are promising, as the majority of studies report improvements for the cognitive and affective domains. However, a word of caution is warranted: short-term increased understanding does not necessarily lead to meaningful learning over the long term, as Stern et al. (2008) point out, and most studies in this category investigated only short-term results. In order to ensure that meaningful learning takes place, it is necessary to attune teaching strategies and the curriculum to the use of simulations and vice versa (Stern et al., 2008), for example by encouraging students to follow a research-oriented approach (Gelbart et al., 2009) or by focusing curriculum on the development of scientific reasoning skills (McKagan et al., 2009).

Table 2-1 Enhancement of traditional instruction with computer simulation

| primary author (year) | science discipline | cognitive topic | N | interventions | results/conclusions | effect size (Cohen’s d) |
| Computer simulation and traditional instruction |
| Gelbart, H. (2009) | biology | genetics | 95 | computer simulation vs. regular class work | better understanding | 0.87 (statements), 0.80 (explanations) |
| | | | | learning approach: research-oriented vs. task-oriented | more knowledge expansion | * |
| Riess, W. (2010) | biology | ecosystem forest | 424 | simulation and lessons vs. traditional teaching | higher learning gains | 0.37 (achievement), 0.13 (justification) |
| Laboratory activities and computer simulation |
| Gibbons, N. J. (2004) | biology | chromosome analysis (study 1); bioinformatics (study 2) | study 1: 47; study 2: 30 | virtual approach vs. real approach | decreased study time | 3.56 (practice), 2.36 (assessment) |
| | | | | | increased assessment scores | 0.71 (topic 1), 0.40 (topic 2) |
| White, B. (2010) | biology | protein structure | 477 | computer simulation vs. visualization | no difference in learning gains | n.a. |
| Meir, E. (2005) | biology | diffusion (study 1); osmosis (study 2) | study 1: 15; study 2: 31 | virtual laboratory (pretest-posttest) | better understanding | * |
| Martinez-Jimenez, P. (2003) | chemistry | caffeine extraction from tea | 274 | traditional methods with vs. without Virtual Chemistry Laboratory | better comprehension of the techniques and basic concepts | * |
| | | | | interaction effect: Virtual Chemistry Laboratory AND greatest learning deficiencies | highest improvement | * |
| Winberg, T. M. (2007) | chemistry | acid-base titration | study 1: 175; study 2: 58 | computer simulation vs. laboratory exercise | posing more theoretical questions during laboratory work | * |
| | | | | | exhibiting a more complex, correct use of chemistry knowledge during interviews | * |
| Limniou, M. (2007) | chemistry | viscosity | 88 | laboratory with vs. without pre-lab simulation exercise | better understanding | * |
| Dalgarno, B. (2009) | chemistry | laboratory familiarization | study 1: 22; study 2: 95 | virtual laboratory vs. real laboratory | no difference in effectiveness for gaining familiarity with the laboratory | n.a. |
| Baltzis, K. B. (2009) | engineering | analog electronics | 518 | laboratory with vs. without simulation | better academic results and increased interest in the course | * |
| Zacharia, Z. C. (2007) | physics | electrical circuits | 90 | laboratory with vs. without simulation | better conceptual understanding | 0.70 |
| Chang, K. E. (2008) | physics | optical lens | study 1: 153; study 2: 231 | simulation-based learning vs. laboratory learning | better learning outcomes | 0.12 (prompting), 0.17 (hypothesis menu), 0.11 (step guidance) |
| | | | | abstract reasoning abilities: high vs. low | more benefit from simulation-based learning | 0.06 |
| | | | | experiment prompting OR hypothesis menu vs. step guidance | better learning results | 0.03 (prompt. vs. step), 0.06 (menu vs. step) |

Note. All studies are presented in order of discussion in this review, grouped by science discipline. Calculation of Cohen’s d: n.a. = not applicable; * = calculation not possible with the methods of Thalheimer and Cook (2002); for some values, equal group sizes were assumed.


2.3.2 Comparison between different kinds of visualization

Beyond studying whether computer simulations can enhance traditional instruction, many researchers focused their attention on the question of how to implement a computer simulation. Studies addressed issues including ways of visualizing simulation processes, specific support measures, and configurations of the classroom setting and lesson scenario. In this section, 7 studies that compare different kinds of visualizations are reviewed. We begin by discussing variations in representation and continue with studies that varied students’ degree of immersion by using different media. Table 2-2 provides specific details on the studies reviewed in this section.

Different kinds of representation

In an investigation of whether students’ understanding of line graphs could be improved by means of dynamic representations, Ploetzner, Lippitsch, Galmbacher, Heuer, and Scherrer (2009) compared groups that varied in the availability of simulated motion phenomena, dynamic line graphs, dynamic iconic representations and dynamic stamp diagrams (see Figure 2-8). In an initial study, students appeared to be unable to make use of the dynamic representations for improving their understanding of line graphs. As the researchers hypothesized that the lack of effectiveness was caused by insufficient support, they performed a second study in which they complemented the design of dynamic representations with pedagogical measures: during learning, students could ask questions and receive assistance from a teacher as well as from peers. Combining representations with supportive pedagogical measures led to more successful learning than dynamic representations alone.

Figure 2-8 Representations of simulated motion phenomena, studied by Ploetzner et al. (2009). Printed with permission.

Trey and Khan (2008) paid special attention to using computer-based analogies to simulate unobservable scientific phenomena. Two groups of students were introduced to a computer simulation on chemical equilibrium behavior. The simulation shows a dynamic analogy of Le Châtelier’s Principle on the stability of equilibrium situations. During the simulation one of the groups worked with the simulated analogical example; students in the other group were asked to recall a verbal and pictorial static analogy presented in the form of text and pictures, which both groups had seen earlier. The results suggest that analogies that are dynamic, interactive and integrated in a computer simulation can have a stronger effect on learning outcomes than analogies that have been shown as text and static pictures (d = 1.45, a very large effect).

The study by Goldstone and Son (2005) focused on the best way to present simulation materials along the concrete-idealized dimension. This dimension refers to the amount of detail and the extent to which graphical elements contain sufficient information to identify the real-world, concrete entity being represented. They conducted two experiments comparing four conditions in which the concreteness of the first of two simulations was manipulated: consistently concrete elements, consistently abstract elements, concreteness fading, and concreteness introduction. The results indicate that there was a difference across conditions between students’ performance on the simulation itself and on a transfer test. Although performance on the simulation itself was best supported by concrete elements, idealized graphics appeared to be more effective in supporting transfer to an abstractly related simulation. The authors recommend combining both concrete and idealized formats, a conclusion that is consistent with theories that predict more general schemas when the schemas are multiply instantiated (e. g., Gick & Holyoak, 1980, 1983). The most effective sequence appears to be to start with concrete representations and let these become more idealized over time.

Varying degrees of immersion

The computer simulations discussed above were presented to the students on a computer screen. To investigate the influence of immersion, the researchers in the following studies paid special attention to the effects of presenting simulations via alternative media.

Using a 3D virtual environment called “Virtual Water”, Trindade, Fiolhais, and Almeida (2002) compared viewing a simulation on-screen with viewing it through stereoscopic glasses. They also investigated the influence of students’ spatial ability on conceptual understanding of the contents of the simulation. Their study reveals that 3D virtual environments can support achievement of a better conceptual understanding of some content (especially content that allows for more interactivity) by students with high spatial abilities. However, stereoscopic visualizations did not seem to contribute much to conceptual understanding, even though the stereoscopic view did indeed provide some sense of immersion.

In a study by Moreno and Mayer (2004) students learned to design the roots, stem and leaves of a plant so that it could survive in five different virtual reality environments. Two interventions were developed based on two different methodological approaches: based on an instructional media approach, the game was presented via desktop computer (low immersion) or head-mounted display (high immersion); based on an instructional approach, students were spoken to in a personalized (e. g., including I and you) or nonpersonalized (e. g., third-person monologue) manner. Results indicate that students learn more deeply when they are spoken to in a conversational style rather than a formal style (d = 0.94 for retention and d = 1.79 for problem solving, a very large effect). However, the cutting-edge educational technology did not lead to better performance on tests of retention and transfer, even though students reported higher levels of physical presence with high rather than low immersion. The authors therefore recommend using high-immersion virtual reality only when the immersion is the focal point of instruction, and not adding it merely as a way to induce physical presence for its own sake.

In the present literature review, the study by Mitnik, Recabarren, Nussbaum, and Soto (2009) is the only one in which the group of students working with the computer simulation served as the control group. This study focused on the effect of an educational activity with robots for improving the ability to produce and interpret line graphs (see Figure 2-9). Students had to graph different linear movements performed by a mobile robot. The experimental group worked on the activity in a face-to-face computer-supported collaborative learning situation, with wirelessly interconnected handhelds and robots. Results indicate that the robot activity was nearly twice as effective (d = 1.14, a very large effect) as the computer simulation activity in improving students’ ability to interpret line graphs. The robot activity was considered more motivating and fostered collaboration among students. According to the authors, the motivation of the students in the experimental group, unlike that of the control group, was based not on novelty but on immersion in the activity, which stimulated students’ commitment and involvement throughout the entire experiment.

Figure 2-9 Students working with Graph Plotter, studied by Mitnik et al. (2009). Printed with permission.

Another example of the abandonment of the computer screen as simulation medium is SMALLab: a semi-immersive mixed-reality learning environment that integrates computer-generated data with real-world components (see Figure 2-10). In investigating the potential impact of SMALLab use on student-driven collaborative learning in the domain of earth science, Birchfield and Megowan-Romanowicz (2009) found that mixed-reality technology can cause both student discussion exchanges and learning outcomes to increase.

Figure 2-10 Students collaborating to construct a layer cake structure in SMALLab, studied by Birchfield and Megowan-Romanowicz (2009). Printed with permission.

Summary and discussion

The increasing quality of visualizations, boosted by ongoing ICT developments, does not necessarily translate into better learning. Although some effects of visualization were found, with a maximum effect size of 1.14, most studies showed no effect. Regarding the comparison of different types of representation, the research shows that concrete representations provide the best support within a simulation itself, whereas idealized graphics most effectively support transfer to an abstractly related simulation (Goldstone & Son, 2005). Not only is it recommended to combine different types of representation, it is also essential to combine representations with supportive measures, as insufficient support can hamper effectiveness (Ploetzner et al., 2009). It should be kept in mind that these recommendations may depend on the domain under consideration and the exact tasks the participants are given.

Some researchers took the comparison between different representations a step further. Instead of using the computer screen as the only medium of presentation, they used different kinds of technology, allowing for investigation of the influence of immersion. Providing this sense of immersion by using a stereoscopic view contributes little (Trindade et al., 2002) or not at all (Moreno & Mayer, 2004) to test performance. Moreno and Mayer found that the style in which students are addressed is a more important factor of influence on learning. Achieving conceptual understanding within 3D virtual environments does appear to be facilitated by higher spatial abilities (Trindade et al., 2002). Using robots seems to be an effective way not only to immerse students in an educational activity, but also to increase learning results, as it improves students’ ability to interpret line graphs (Mitnik et al., 2009). Mixed-reality technology has the potential to support student discussion interchanges and learning outcomes (Birchfield & Megowan-Romanowicz, 2009).

Overall, it seems that improvements of learning outcomes by fostering a sense of immersion are better supported by mixing technology with reality (i. e., by using robots or SMALLab), in comparison to immersing students in virtual reality (i. e., by using a head-mounted display or stereoscopic glasses).

Table 2-2 Comparison between different kinds of visualization

| primary author (year) | science discipline | cognitive topic | N | interventions | results/conclusions | effect size (Cohen’s d) |
| Different kinds of representation |
| Trey, L. (2008) | chemistry | Le Châtelier’s Principle | 15 | computer simulation with vs. without dynamic analogy | enhanced learning of unobservable phenomena in science | 1.45 |
| Goldstone, R. L. (2005) | general science | competitive specialization | study 1: 84; study 2: 88 | concreteness fading OR concreteness introduction vs. consistently idealized OR consistently concrete | better performance on the simulation itself | 0.65 (study 1), 0.58 (study 2) |
| | | | | | better transfer to another simulation | 0.86 (study 1), 0.66 (study 2) |
| | | | | concreteness fading vs. concreteness introduction | better performance | * |
| Ploetzner, R. (2009) | physics | kinematics | study 1: 111; study 2: 24 | dynamic visualizations with vs. without pedagogical measures | improved understanding of line graphs | * |
| Varying degrees of immersion |
| Moreno, R. (2004) | biology | botany | 48 | personalized vs. nonpersonalized agent messages | better performance on retention and problem-solving tests | 0.94 (retention), 1.79 (problem solving) |
| | | | | immersion: head-mounted display vs. desktop delivery | higher reported level of physical presence, but no difference in performance on tests of retention and transfer | n.a. |

Note. All studies are presented in order of discussion in this review, grouped by science discipline. Calculation of Cohen’s d: n.a. = not applicable; * = calculation not possible with the methods of Thalheimer and Cook (2002); for some values, equal group sizes were assumed.

2.3.3 Comparison between different kinds of instructional support

In their review of discovery learning in simulation environments, de Jong and van Joolingen (1998) recommend the analysis of learning problems and the evaluation of ways to support learning as principal items for the research agenda. In the present review, instructional support clearly emerges as the most investigated theme, with 19 studies reviewed in this section. We begin by reviewing studies that explicitly relate their theoretical background to Scientific Discovery Learning, and subsequently discuss studies that are based on other theories. Table 2-3 provides specific details on the studies reviewed in this section.

Supporting Scientific Discovery Learning

By combining four different scaffolding components (structural, reflective, subject-matter and enrichment) in four different configurations (ranging from low to full support), Fund (2007) investigated the influence of these support programs on students’ knowledge and understanding. In particular, the structural component, which supplied a general framework for solving problems, yielded significant differences, having a consistent and potent impact on learning outcomes. However, a combination of structural and reflective components was necessary for improved learning outcomes. Both reflective and subject-matter components had cumulative benefits over time. The reflective component appeared to stimulate meta-cognitive processes, possibly because the obligation to write down solutions triggered an internal dialog. A study by Zhang, Chen, Sun, and Reid (2004) also compared different kinds of supportive measures for inquiry learning. They propose a three-fold approach for supporting scientific discovery learning: it should take place in a meaningful, systematic and reflective manner.

By reviewing computer simulations from the vantage of research on perception and spatial learning, Lindgren and Schwartz (2009) introduced four learning effects to clarify aspects of simulation design: picture superiority, noticing, structuring, and tuning. The authors conclude that simulations facilitate improved learning and adaptation for students upon entry into the non-simulation environment. However, they warn that attempting to make the resemblance between the simulation and non-simulation environments as high as possible might undermine the simulation’s pedagogical properties, such as well-chosen images, contrasting cases, and the recognition of structure.

In a review of research, Blake and Scanlon (2007) introduce a set of features for the effective use of simulations for science teaching in the context of distance learning. They conclude that to be scientifically useful, simulations should be based on realistic events and data. Other useful features are the use of multiple representations and graphs as well as the possibility of watching graphs develop in real-time during the experiment. The authors recommend that all simulations should be provided with means for customizing the activities to the students’ ability levels, and that a narrative should be provided for the students to follow, either within the simulation itself or by the use of accompanying notes.

Veermans, van Joolingen, and de Jong (2006) compared the effects of scaffolding via implicit and explicit heuristics (see Figure 2-11). In an implicit-heuristics learning environment, the heuristics were used in offering support but without offering the heuristics themselves, while in an explicit-heuristics learning environment the heuristics themselves were also made explicit to the students. Results indicate that students in both conditions improved in their domain knowledge. Process analyses suggest that offering explicit heuristics facilitates more self-regulation in students.

Figure 2-11 Scientific Discovery Learning supported by heuristics that are both implicit and explicit, studied by Veermans et al. (2006). Printed with permission.

To examine the effects of integrating and/or linking multiple dynamic representations on learning outcomes, van der Meij and de Jong (2006) experimented with three different conditions in a learning environment on the physical subject of moments: separated, non-linked representations (S-NL), separated, dynamically linked representations (S-DL), and integrated, dynamically linked representations (I-DL). The best results were generally seen for participants in the condition in which the representations were integrated and dynamically linked. Participants in that condition also experienced the learning environment as easiest to work with. Overall, participants learned from working with the learning environment. However, simply linking representations dynamically did not lead to improved learning results in comparison with non-linking. The authors believe that the requirement to mentally translate between different representations is a good way to acquire deeper knowledge in a domain.

By assigning students to one of three inquiry learning tasks in an unknown domain, Lazonder, Wilhelm, and van Lieburg (2009) investigated whether it is sufficient for students merely to have knowledge about the variables in the simulation, or whether a basic understanding of how these variables are interrelated is also necessary (see Figure 2-12). The concrete task contained known variables, from which hypotheses about their relations could be deduced. The intermediate task used known variables, but the deduction of their interrelatedness was not possible. The abstract task contained unknown variables for which hypotheses about their relations could not be deduced. Results show that the concrete participants performed more successfully (d = 0.82, concrete vs. intermediate) and more efficiently, whereas no difference could be detected between the intermediate and abstract participants’ achievement. The authors conclude that a basic understanding of the interrelatedness between variables is necessary for supporting the processes and outcomes of inquiry learning.

Figure 2-12 Simulation interface of the concrete task (upper panel) and the abstract task (lower panel), studied by Lazonder et al. (2009). Printed with permission.

Manlove, Lazonder, and de Jong (2006) researched the possibilities of offering online support for regulating collaborative inquiry learning. Students worked in small groups with a computer simulation to conduct scientific inquiry within the physical subject of fluid dynamics. Both the control group and the experimental group could use a planning support tool. The tool for the experimental group contained additional regulatory guidelines, including a hierarchy of goals and subgoals, hints and explanations, and a template for the final report. The fully specified tool appeared to provide better support for both learning outcomes (d = 0.98) and initial planning (d = 3.50, a very large effect). Although offering regulatory guidelines during collaborative scientific discovery learning leads to improved planning activities, the results for monitoring activities are less conclusive. In a more recent study by Manlove, Lazonder, and de Jong (2009) that also focused on regulatory software scaffolds during scientific inquiry learning, the researchers were especially interested in whether paired and single students differ in their use of regulative scaffolds. The results show that the pairs scored significantly better than individual students on learning outcomes (d = 0.23 for report structure, d = 0.83 for report content, and d = 1.16 for model quality, a very large effect). Contrary to the researchers’ expectations, collaboration did not seem to have any effect on the use of regulatory scaffolds, which remained fairly low for both pairs and singles. According to the authors, this might be related to a persistent problem: the availability of an instrument implies neither that students will make use of it nor that their use of the instrument is effective.

Saab, van Joolingen, and van Hout-Wolters (2007) investigated the effects on collaborative discovery learning of instructions based on the RIDE rules, which are derived from four principles that they deduced from the literature on collaborative processes: Respect, Intelligent collaboration, Deciding together, and Encouraging. Analyses show that the RIDE instruction can lead to more constructive communication and improved discovery learning activities. However, direct effects on scientific discovery learning outcomes were not found.

Instructional support approached from other perspectives

In an attempt to investigate how much guidance should be offered to students working with a computer simulation on the subject of enzyme kinetics, Gonzalez-Cruz, Rodriguez-Sotres, and Rodriguez-Penagos (2003) compared groups that were supported with three different levels of instructions (detailed, intermediate, or minimal) with a group that solved problems in class and a group without additional support sessions. Results show that students using the instruction support program benefited significantly compared to students who did not make use of it. In the short term, the intermediate instructions were more effective in helping the students to prepare reports, while in the long term the intermediate and minimal instructions performed equally well. The authors state that offering students some freedom while they use the computer simulation is more beneficial, as long as the tutor still reviews and comments on their work afterward. The strategy they therefore recommend is the intermediate level of instruction, where both freedom and structure are offered.

Mayer’s spatial contiguity principle was the basis for Lee’s (2007) investigation of the effects of a visual treatment on students’ understanding and transfer of chemistry knowledge. The principle was applied with regard to the distance between images on the screen and by adding visual-cue scaffolding to the simulation. The number of icons and the distance between interrelated icons were varied in the experimental condition, and a monitoring tool was added. The extent to which the visual treatments interacted with students’ spatial ability was also investigated. The visual treatment led to significantly higher results on comprehension and transfer tests (d = 0.32 for comprehension and d = 0.17 for transfer; both small effects). An interaction effect was also found: students with low spatial abilities performed better in the treatment group compared to the control group, whereas students with high spatial abilities achieved equal results regardless of condition.

The effectiveness of using a computer simulation in promoting scientific understanding was researched by R. L. Bell and Trundle (2008), who integrated the planetarium software program “Starry Night Backyard” with instruction on moon phases. The computer simulation served as a reliable way to consistently collect data, instead of playing a dominant role in the foreground as an instructional feature. The authors therefore consider the results of this study as support for the assumption that educational technologies should facilitate established effective instructional materials, instead of replacing them. A follow-up study (Trundle & Bell, 2010) shows that learning about moon shapes and the ability to explain the cause of moon phases can be supported equally well by this simulation, by observations from nature, or by a combination of both. However, learning by using only the computer simulation resulted in a higher gain of knowledge about lunar sequences, compared to learning based on observations from nature alone. Gazit, Yair, and Chen (2005) also performed research on the development of students’ conceptual understanding of astronomical phenomena, focusing on the real-time learning processes of students using a “Virtual Solar System”. Even though they conclude that their simulation can support the development of scientific understanding, they stress that students’ high interactive performance might not be sufficient for conceptual development to take place if adequate orientation and navigation tools are lacking.

By comparing two groups of students that used a computer simulation in the course “Probing your Surroundings”, Clark and Jorde (2004) investigated the extent to which the revision of students’ ideas about thermal equilibrium could be facilitated by the integration of a tangible model in a visualization. While trying to figure out why objects feel the way they do, students in the augmented visualization condition could use a tangible model, in addition to the thermal equilibrium visualization that was also available to the control group. This tangible model consisted of a different representation of the problem and a picture of a hand with an arrow next to it that, when an object was clicked, showed the heat flow to and from the hand depending on the temperature gradient between the hand and the object. Students in the experimental group appeared to be better able to understand thermal equilibrium, demonstrating a better ability to explain why objects feel the way they do (d = 0.94), as well as being better able to predict the temperature of objects in different surroundings (d = 0.51).

In an attempt to find the optimal timing for offering different kinds of information to students using a computer-based simulation, Kester, Kirschner and van Merriënboer (2004) compared four information presentation formats. These formats varied two factors: the timing of offering supportive information (before or during task practice) and the timing of offering procedural information (before or during task practice). Students’ information searching behavior revealed that they most needed supportive information before the task and procedural information during task practice. The authors argue that it is possible to determine optimal moments of presentation for these different kinds of information based on the specific task requirements during simulations.

The next two studies approach the provision of support by introducing a gaming element. Even though we included these studies in our review because they meet our inclusion criteria for computer simulations, we recognize that their focus shades into gaming, which is a distinct research area.

The aim of a study by Papastergiou (2009) was to determine the learning effect and motivational appeal of an educational game about learning computer memory concepts. Two educational applications were compared, one with and the other without a gaming approach. The gaming approach appeared to be effective in improving students’ knowledge about computer memory (d = 0.64), as well as more motivating (d = 0.76) than the non-gaming approach. Even though boys were more involved in computer games than girls as far as enjoyment, experience and domain knowledge were concerned, there was no significant difference in the extent to which boys and girls learned by using the game. Boys and girls also experienced the game as equally motivating. Based on the results of this study the author concludes that digital game-based learning can be used as an educational environment within high school education, because educational games can increase knowledge of subject-matter, as well as improve students’ enjoyment, involvement and interest in the learning process.

The application of games as a curricular scaffold was also studied by Barab et al. (2009), for a 3D game-based curriculum designed to teach water quality concepts (see Figure 2-13). The conditions they compared (expository textbook dyad, simplistic framing dyad, immersive world dyad, and immersive world single-user) allow their results to be related to several themes already covered in our review. Concerning visualization, both immersive-world conditions performed significantly better on proximal items (d = 1.39 for IW,D vs. ET,D and d = 1.25 for IW,S vs. ET,D; both very large effects) than the non-immersive expository textbook condition. This study can also be related to the research by Moreno and Mayer (2004), as Barab et al. (2009) compared their immersive world conditions, in which information was written in the first person, to a simplistic framing condition in which information was written in the third person. As already discussed in subsection 2.3.2, Moreno and Mayer (2004) found that manner of address had a significant influence on learning, with a conversational style leading to deeper learning compared to a formal style. Consistent with those findings, Barab et al. (2009) found that immersive world dyads outperformed simplistic framing dyads (d = 1.09) and expository textbook dyads (d = 1.60, a very large effect) on an open-ended transfer task. However, the simultaneous variation of immersion and narrative voice between conditions does not allow for an exact determination of the effect of narrative voice.

Figure 2-13 Screenshot from Taiga, studied by Barab et al. (2009). Printed with permission.

“River City” is another example of a 3D immersive virtual environment (see Figure 2-14). This multi-user environment allows students to move around through a virtual town and perform collaborative inquiry in order to discover why the residents are getting ill. Extensive research on the learning effects of this environment shows that students’ engagement and learning outcomes can be improved to an extent that is comparable to what can be achieved with physical experimentation (Ketelhut & Nelson, 2010). However, a comparison between different curriculum variations showed that the extent to which inquiry learning is supported depends on whether learning outcomes are tested by using multiple-choice questions or writing letters to the mayor of River City (Ketelhut et al., 2010). The authors suggest that the latter method might be more appropriate for scientific inquiry assessment, as it is a more authentic activity than a multiple-choice test.

Figure 2-14 River City interface, studied by Ketelhut et al. (2010). Printed with permission.


Summary and discussion

The above collection of research publications illustrates the kaleidoscope of possibilities for providing instructional support. The research on how to support Scientific Discovery Learning uses a variety of approaches. Zhang et al. (2004) propose three perspectives (meaningful, systematic, and reflective) and recommend approaching the development of learning support in simulation environments from all three perspectives. According to Fund (2007), who compared four scaffolding components (structural, reflective, subject-matter, and enrichment), the structural component has the most potent impact on learning outcomes. Nevertheless, several researchers recommend offering both structure and freedom, as restricting freedom too much can weaken learning results (K. E. Chang et al., 2008; Gonzalez-Cruz et al., 2003).

Research into the basic conditions for supporting computer simulation learning processes shows positive effects of support, with effect sizes up to 3.50. The following recommendations can be given: students’ self-regulation is best facilitated by providing heuristics explicitly instead of implicitly (Veermans et al., 2006), and the best timing for providing information is before task practice as far as supportive information is concerned and during task practice for procedural information (Kester et al., 2004). It is necessary for learners to have a basic understanding of the variables that are involved (Lazonder et al., 2009). Where different representations are used, requiring mental translation between them supports the acquisition of deeper domain knowledge (van der Meij & de Jong, 2006).

Research on learning with computer simulations in collaboration shows that relations between collaborative learning processes and individual learning outcomes are not straightforward. Even though the RIDE-instructions by Saab et al. (2007) improved discovery learning activities, the students’ learning results did not improve. In comparison to working on discovery learning activities individually, collaborating in pairs can indeed significantly improve learning outcomes (Manlove et al., 2009). Providing collaborating groups of students with additional regulatory guidelines can also improve learning results (Manlove et al., 2006).

A recurrent finding in the research on instructional support is that it is of utmost importance to provide students with a learning environment in which freedom and structured support are well balanced, which is in line with research on the ineffectiveness of minimally guided instruction (Kirschner et al., 2006). Even though the research on providing instructional support has been fruitful and allows us to deduce useful recommendations, the majority of studies approached the provision of support only from within the simulation, e.g., by adding a human-teacher-like support system (Fund, 2007). This leaves the question of how optimal instructional support could be provided by teacher guidance or curricular embedding largely unanswered.

Table 2-3 Comparison between different kinds of instructional support

primary author (year of publication) | science discipline | cognitive topic | N | interventions | results/conclusions | effect size (Cohen’s d)

Supporting Scientific Discovery Learning

Fund, Z. (2007) | general science | various science problems | 473 | support programs: with structure component vs. without structure component | more effective work patterns, better knowledge construction and understanding | *
 | | | | support programs: structure component with reflective component vs. structure component without reflective component | more constructivist knowledge acquisition and deeper understanding | *
Lazonder, A. W. (2009) | general science | meaning of and relations between variables | 57 | concrete task vs. intermediate task | more successful performance | 0.82
 | | | | intermediate task vs. abstract task | equally successful performance | n.a.
Zhang, J. (2004) | physics | floating and sinking | study 1: 80; study 2: 30 | experimental support AND interpretative support AND reflective support vs. no support | better (meaningful, systematic, and reflective) discovery learning | *
Veermans, K. (2006) | physics | collisions | 30 | heuristics: implicit with explicit vs. implicit without explicit | more self-regulation | *
van der Meij, J. (2006) | physics | moments | 72 | representations: integrated, dynamically linked vs. separate, non-linked | low complexity part: no difference in learning results on domain knowledge | n.a.
 | | | | (same comparison) | high complexity part: better learning results on domain knowledge | *
 | | | | representations: separate, dynamically linked vs. separate, non-linked | no difference in learning outcomes | n.a.
 | | | | representations: integrated, dynamically linked vs. separate, dynamically linked OR separate, non-linked | easiest working experience | 0.99 (I,DL vs. S,DL); 0.96 (I,DL vs. S,NL)
Manlove, S. (2006) | physics | fluid dynamics | 17 | support tool with regulatory guidelines vs. support tool without regulatory guidelines | better learning outcomes | 0.98
 | | | | (same comparison) | better initial planning | 3.50
Manlove, S. (2009) | physics | fluid dynamics | 30 | collaborative use of regulative software scaffolds vs. individual use of regulative software scaffolds | higher learning outcomes and no difference in frequency and duration of regulative scaffold use | 0.23 (report structure); 0.83 (report content); 1.16 (model quality)
Saab, N. (2007) | physics | collisions | 38 pairs | RIDE collaboration instruction vs. control | more, and more effective, discovery learning activities and more constructive communication, but no difference in discovery learning results | n.a.

Instructional support approached from other perspectives

Ketelhut, D. J., & Nelson, B. C. (2010) | biology | epidemiology | 500 | experimentation: virtual vs. physical | no difference in engagement or learning outcomes | n.a.
Gonzalez-Cruz, J. (2003) | chemistry | enzyme kinetics | 119 | guidance: intermediate instructions vs. detailed instructions OR minimal instructions OR additional class session OR control | short term: better preparation of reports | *
 | | | | guidance: intermediate instructions vs. minimal instructions | long term: no difference in performance | *
Lee, H. (2007) | chemistry | Boyle’s Law and Charles’ Law | 257 | visual treatment: spatial contiguity AND visual-cue scaffolding vs. control | high spatial ability: no difference in performance on comprehension and transfer tests | n.a.
 | | | | (same comparison) | low spatial ability: better performance on comprehension and transfer tests | 0.32 (comprehension); 0.17 (transfer)
Papastergiou, M. (2009) | computer science | computer memory | 88 | gaming vs. non-gaming | more effective in promoting conceptual knowledge | 0.64
 | | | | (same comparison) | more motivational (enjoyment, engagement and interest) | 0.76
Barab, S. A. (2009) | general science | water quality | 51 | immersive world dyad OR immersive world single-user vs. expository textbook dyad | better performance on proximal items | 1.39 (IW,D vs. ET,D); 1.25 (IW,S vs. ET,D)
 | | | | immersive world dyad vs. expository textbook dyad | better performance on distal items | 1.51
 | | | | immersive world dyad vs. simplistic framing dyad OR expository textbook dyad | better performance on open-ended transfer task | 1.09 (IW,D vs. SF,D); 1.60 (IW,D vs. ET,D)
Bell, R. L. (2008) | physics | moon phases | 50 | computer simulation (pretest-posttest) | better scientific understanding | *
Trundle, K. C. (2010) | physics | moon phases | 157 | observations: computer simulation vs. computer simulation AND from nature vs. from nature | no difference in gain of knowledge about lunar shapes or the ability to explain the cause of moon phases | n.a.
 | | | | observations: computer simulation AND from nature vs. computer simulation OR from nature | no difference in gain of knowledge about lunar sequences | n.a.
[_Note. The reviews by T. Bell et al. (2010), Blake & Scanlon (2007), and Lindgren & Schwartz (2009) are not included. All studies are presented in order of discussion in this review, grouped by science discipline. Empty leading cells indicate a continuation of the study listed above. Calculation of Cohen’s d: n.a. = not applicable; * = not possible, based on the methods by Thalheimer & Cook (2002); where necessary, equal group sizes were assumed._]
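Where studies report group means, standard deviations, and group sizes, effect sizes of this kind can be recomputed. The sketch below implements the standard pooled-standard-deviation formula for Cohen’s d; it is a minimal illustration with invented numbers, not the exact computational procedure of Thalheimer & Cook (2002).

    import math

    def cohens_d(mean_1, mean_2, sd_1, sd_2, n_1, n_2):
        """Cohen's d for two independent groups, using the pooled standard deviation."""
        pooled_var = ((n_1 - 1) * sd_1 ** 2 + (n_2 - 1) * sd_2 ** 2) / (n_1 + n_2 - 2)
        return (mean_1 - mean_2) / math.sqrt(pooled_var)

    # Invented statistics, purely for illustration: an experimental group scoring
    # 7.1 (SD 1.8) against a control group scoring 6.2 (SD 1.9), 30 students each.
    print(round(cohens_d(7.1, 6.2, 1.8, 1.9, 30, 30), 2))  # -> 0.49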

2.3.4 Classroom settings and lesson scenario

Most reviewed interventions up to this point zoomed in on the use of computer simulations per se. When used in an educational context, the role of computer simulations within the classroom lesson has several aspects worthy of attention. First, we focus on the influence of student engagement on learning. We subsequently discuss how learning can be supported by scripting diverging activities during the lesson scenario. Table 2-4 provides specific details on the studies reviewed in this section.

Varying levels of engagement

In a study by Wu and Huang (2007), students’ behavioral, emotional and cognitive engagement were investigated by comparing classrooms having differing instructional approaches. In a student-centered class, the topic of forces and motion was introduced by the teacher and students subsequently worked with computer simulations and completed assignments in pairs. Although the scientific concepts were also introduced by the teacher in the teacher-centered class, students did not have computers at their disposal: instead the teacher used a projector linked to a laptop to demonstrate the simulations and to guide the students in completing their learning activities. The researchers found that students’ prior achievement levels could interact with instructional approaches, as low-achieving students benefited more from the teacher-centered approach (d = 1.07). Even though the emotional engagement of the students in the student-centered classroom was greater, the level of emotional engagement did not appear to have an impact on students’ achievement.

However, Laakso, Myller, and Korhonen (2009) found a result that appears to contradict this finding. They compared groups of students who worked on simulated algorithm exercises, using different levels of the “Extended Engagement Taxonomy”. These levels range from presenting a visualization to other students (the highest level of engagement) down to just viewing a visualization or not viewing at all (the lowest levels). Merely viewing algorithm animations appeared to be insufficient for learning, even when it was possible to share understandings and misunderstandings with a partner while watching the visualization. According to the authors, learning environments should be designed specifically to act on the higher levels of the engagement taxonomy, because such learning environments can make learning activities more active and more student-centered. As these two studies on the theme of engagement not only operationalized the construct differently but also used different measurement procedures, it is difficult to pinpoint what caused the conflicting results. While Wu and Huang (2007) consider engagement to be a multifaceted construct implying behavioral, emotional, and cognitive participation in learning experiences, Laakso et al. (2009) base their engagement levels on screen captures and voice recordings of students’ activities. Moreover, whereas the study by Wu and Huang (2007) contains a detailed description of teacher guidance, the study by Laakso et al. (2009) lacks specification of how the learning processes were supported by the teacher.

The roles of teacher guidance and collaboration

An example of actively involving teacher guidance is the attempt by Dori and Belcher (2005) to make the learning process more social by adopting a method that includes peer collaboration and the use of a Personal Response System. They investigated the effects in the social, cognitive, and affective domains of studying in small groups while working with a computer simulation in the so-called TEAL project: an environment for Technology-Enabled Active Learning (see Figure 2-15 and Figure 2-16). Following mini-lectures on electromagnetism in class, the students were requested to respond to multiple-choice questions in real time, after which the students’ response distribution was shown in bar graphs on a classroom screen. When there was no agreement, the teacher asked the students to try to come to an agreed-upon answer by peer discussion in groups of three students. Repeating the same multiple-choice question to the class afterward often resulted in more consensus on the supposedly correct answer. Compared to traditional teaching, the TEAL students significantly improved their conceptual understanding. In the small-scale experiment, the majority of participants also recommended the course to fellow students, pointing to the advantages of interactivity, visualization, and hands-on experiments. The authors conclude that the TEAL environment can have a strong positive influence on students’ learning results, because this technology supports active learning. Likewise, an implementation of the TEAL environment at a university in Taiwan (Shieh, Chang, & Tang, 2010) over two semesters led to higher learning gains in the second semester (d = 0.39) compared to traditional teaching.

Figure 2-15 A 3D model of the TEAL space, studied by Dori and Belcher (2005). Printed with permission.


Figure 2-16 The TEAL classroom in action, studied by Dori and Belcher (2005). Printed with permission.

Duran et al. (2007) describe a software-based method in which the simulations were not just used as demonstrations, but were embedded within a method that promotes students’ understanding and participation (as discussed earlier in this review). After the presentation of a real-world scenario and theoretical explanations of the main concepts of a chapter, students were challenged to predict the evolution of the scenario. In the brainstorming session that followed, the students discussed the scenario’s evolution in groups. In this phase, the teacher could circulate among the students to clarify doubts and guide discussions. At the end of the discussions the ideas of the groups were collected. Subsequently, the simulation was run to check whether the students’ predictions were right, instead of the correct answers being presented by the teacher. Finally, after discussion and theoretical explanation, the collected ideas were contrasted with the results of the phenomena shown in the simulation stage.

In a study focused on differences in communication venue, Limniou, Papadopoulos, and Whitehead (2009) compared a course in which the students used synchronous face-to-face communication with a course in which the teacher and students communicated asynchronously through a “WebCT environment”. The teachers had different roles in these two environments: in the synchronous condition the teacher had a more active role, guiding the students toward learning outcomes through face-to-face discussion and interaction, whereas in the virtual environment the role of the teacher was more supportive, aimed at asking questions and collecting resources to support independent learning. As both approaches led to the same learning outcomes, the authors assert that this gives teachers more freedom in choosing a feasible approach depending on university facilities, the staff’s time, and the students’ familiarity with virtual learning environments.

Summary and discussion

Because researchers used different operationalizations of the construct of engagement, its influence remains unclear, although effect sizes up to 1.07 were found. Laakso et al. (2009) stress that learning activities should be as active and student-centered as possible, promoting the use of learning environments that function on the higher levels of engagement. In the study by Wu and Huang (2007), however, student achievement was not affected by emotional engagement. Besides having an impact on learning processes and outcomes, embedding computer simulations in a didactic environment can influence the role of the teacher and classroom communication. Collaboratively working on computer simulations may allow more active learning, which in turn improves students’ conceptual understanding (Dori & Belcher, 2005; Shieh et al., 2010). Even when different teaching approaches lead to the same learning outcomes, the choice of a specific approach can still have an impact on the role of the teacher in terms of the need to provide guidance to the students (Limniou et al., 2009).

To gain knowledge about the optimal application of computer simulations within science education, research should be approached from both a zoomed-in perspective, by manipulating variables within a simulation, and a zoomed-out perspective, by taking the broader pedagogical context into account. The scarcity of studies that zoomed out, however, shows that most researchers have investigated the effectiveness of computer simulations without including such factors of influence as teacher guidance, classroom session scenarios, or curricular characteristics.

Table 2-4 Classroom settings and lesson scenario

primary author (year of publication) | science discipline | cognitive topic | N | interventions | results/conclusions | effect size (Cohen’s d)

Varying levels of engagement

Laakso, M. J. (2009) | computer science | binary heap | 75 | engagement levels: changing vs. viewing | better learning performance | 0.68
Wu, H. K. (2007) | physics | force and motion | 54 | classes: student-centered vs. teacher-centered | more emotional engagement, but no difference in learning achievements | n.a.
 | | | | interaction effect: teacher-centered AND low-achieving students | more benefit from the teacher-centered approach | 1.07

The roles of teacher guidance and collaboration

Limniou, M. (2009) | chemistry | acid-base titration | study 1: 80; study 2: 80 | virtual simulation vs. traditional laboratory | no difference in learning outcomes | n.a.
Dori, Y. J. (2005) | physics | electromagnetism | 811 | Technology Enabled Active Learning vs. traditional teaching | better conceptual understanding and learning outcomes | *
Shieh, R. S. (2010) | physics | mechanics | study 1: 113 | (same comparison) | study 1 (first semester): no difference in learning gains | n.a.
 | | | | (same comparison) | study 2 (second semester): higher learning gains | 0.39

Note. All studies are presented in order of discussion in this review, grouped by science discipline. Calculation of Cohen’s d: n.a. = not applicable; * = not possible, based on the methods by Thalheimer & Cook (2002).

&2.4& &Conclusions&

We started this review with two main questions. The first of these regards the extent to which traditional science education can be enhanced by using computer simulations, and the second regards how simulations and their instructional support are best shaped and implemented to optimize the use of simulations themselves. The reviewed articles provide information from an experimental perspective. In this section we discuss the major trends and results found.

With respect to the use of simulations to enhance or replace traditional means of teaching, the results are unequivocal: simulations have gained a place in the classroom as robust additions to the repertoire of teachers, either as an addition to available traditional teaching methods or as a replacement of parts of the curriculum. All reviewed studies that compare conditions with or without simulations report positive results for the simulation condition when simulations were used to replace or enhance traditional lectures. Effect sizes up to 1.5 for posttest scores and above 2 for scores related to motivation and attitude were found for this situation. The replacement of laboratory activities by simulations, or their use as preparatory laboratory activities, is a special case. Here a large gain in learning efficiency can be reported: very large effect sizes for time on task are found, with simulation-based instruction on the favored side. Using simulations as a preparatory activity for real laboratory activities is also effective: positive effects are found for the comprehension of the lab task as well as for practical laboratory skills during the real lab activity.

The latter finding brings us to an important issue. The acquisition of laboratory skills is often a learning goal in itself, one that cannot be completely replaced by simulations. However, it becomes clear that, as in domains where simulation has already been widely accepted as a training facility, such as flight simulation, simulations can play an important role in making lab activities more effective when offered as pre-lab training.

The second research question has two parts: in what ways were simulations enhanced to try to improve their success and what were the effects of these enhancements? In the studies that we reviewed two main themes were investigated: the way the simulation results are presented visually as well as instructional support provided to the learner during work on the simulation.

With respect to visualization, most studies reviewed considered the representation of simulation output data. No unequivocal results were reported in the studies reviewed, partly because the number of studies that could be found was relatively small and because they vary in the kind of representation they investigated. There is one outstanding result: the visualization of invisible phenomena investigated by Trey and Khan (2008). A large learning effect on these unobservable phenomena was found, which is in line with one of the main advantages of computer simulation hypothesized by de Jong and colleagues (van Berkum & de Jong, 1991; van Joolingen & de Jong, 1991). With respect to the level of immersion in 3D simulations, no study found a large effect for immersion as such. Effects were found for additional instruction, as well as for the use of robots in learning about graphs (Mitnik et al., 2009). That immersion as such does not clearly contribute to learning effects probably stems from the lack of a clear function for the immersion. Embodiment of abstract concepts could provide such a function (Tall, 2008) and deserves attention in future research.

At the level of instructional support, a large variety of supportive measures has been studied in the research reviewed. About half of these studies concern scientific discovery learning and the processes it requires. A review by Alfieri, Brooks, Aldrich, and Tenenbaum (2011) showed the necessity of providing learners in a discovery environment, although not necessarily one with simulations, with instructional support. The types of support can still be classified along the lines identified by de Jong and van Joolingen (1998): support for transformative learning processes, for instance hypothesis generation and the design of experiments, as well as for regulative processes aimed at the planning and monitoring of learning activities. The eight studies in this category report positive results on a variety of variables. Few report effect sizes or enough data to compute an effect size, but the overall impression is that effects are moderate. It is also noteworthy that most of these studies report effects on the learning process and the direct outcomes of these processes, indicating that strong results on posttests could not be achieved. This does not disqualify these studies, but points to a long-standing problem associated with discovery learning: discovery methods by their nature take some time to have an effect, as dual learning goals are being pursued, namely learning of domain knowledge as well as learning of discovery skills.

The other instructional interventions, those not aimed at discovery learning, show positive effects, but it is impossible to deduce general trends from them, as the domains and the kinds of intervention vary. Here it becomes clear that the space of options for intervention, as well as the design space for the simulations themselves, is very large, making it difficult to extract general trends. The fact that quite large effect sizes can be obtained by varying the design and instructional support shows the importance of careful use of simulations and careful instructional design. General guidelines are of only limited value here, making the design of simulation-based instruction an engineering science. The reviewed studies show that the effects of well-designed simulation-based instruction are potentially high. The main factors that need to be considered are the way the learner is addressed and involved, the way information from the simulation is presented and integrated, what additional information is presented, and how this presentation is timed.

The effects of computer simulations in science education are caused by the interplay between the simulation, the nature of the content, the student, and the teacher. A point of interest for the research agenda in this area, as mentioned by de Jong and van Joolingen (1998) in their review, is to investigate the place of computer simulations in the curriculum. Most of the studies we reviewed, however, investigated the effects of computer simulations on learning ceteris paribus, consequently ignoring the influence of the teacher, the curriculum, and other such pedagogical factors. In order for educational innovations such as computer simulations to be successful, teachers need to be provided with the necessary skills and knowledge to implement them (Pelgrum, 2001). Without proper teacher skills, the full potential of simulations, such as their suitability for practicing inquiry skills, may remain out of reach. Instead, they may be used as demonstration experiments or be completely controlled by the teacher (Lindgren & Schwartz, 2009). Reducing the use of computer simulations to a step-by-step cookbook approach undermines their potential to afford students an opportunity to freely create, test, and evaluate their own hypotheses in a more richly contextualized environment (Windschitl & Andre, 1998).

Whereas the outcomes of the studies reviewed by de Jong and van Joolingen (1998) were not unequivocally in favor of simulations, the majority of studies we reviewed suggest an improvement in effectiveness over the past decade. We believe this has mainly been caused by an ongoing synergy between technological advancements and improvements in instructional support. Although the (quasi-)experimental research of recent years has certainly been fruitful, we recommend that the focus on different kinds of visualizations and supportive measures be extended to a more comprehensive view, for example by including the lesson scenario and the computer simulation’s place within the curriculum as factors of influence. Additionally, embedding the role of the teacher would be a promising step forward in the establishment of a pedagogical framework for the application of computer simulations in science education.

Chapter 3
&Inquiry-Based Teaching with Computer Simulations in Physics&

In this study we zoom out from individual or small-group learning with computer simulations to the pedagogical context of whole-class teaching with computer simulations. We investigate relations between the attitudes and learning goals of teachers and their students regarding the use of computer simulations in whole-class teaching, and how teachers implement these simulations in their teaching practices. We observed lessons presented by 24 physics teachers in which they used computer simulations. Students completed questionnaires about the lesson, and each teacher was interviewed afterwards. These three data sources capture the implementation by the teacher, and the learning goals and attitudes of students and their teachers regarding teaching with computer simulations. For each teacher, we calculated an Inquiry-Cycle-Score reflecting the extent to which the teacher successfully attempted to perform inquiry activities during the lesson, and a Student-Centeredness-Score reflecting the level of active student participation. We correlated these scores with scores reflecting the congruence of teacher and student learning goals and with scores reflecting attitudes toward teaching with computer simulations. Statistical analyses revealed positive correlations between the inquiry-based character of a teaching approach and students’ attitudes regarding its contribution to their motivation and insight; a negative correlation between the student-centeredness of a teaching approach and its inquiry-based character; and a positive correlation between a teacher’s attitudes about inquiry-based teaching with computer simulations and learning goal congruence between the teacher and his/her students.


&3.1& &Introduction&

Computer simulations offer an excellent opportunity for conducting scientific inquiry, allowing students to develop their scientific literacy (de Jong & van Joolingen, 1998; Rutten et al., 2012). However, this requires computer simulations that allow the kind of interaction that lets learners use the simulation as a source for genuine inquiry activities (van Joolingen et al., 2007). In practice this requirement is often not met, and classroom use of computer simulations often does not go beyond the level of illustration (Windschitl, 2000). Classroom application of computer simulations can provide teachers with the opportunity to stimulate their students to express their ideas about a given domain. This may clarify students’ ideas and misconceptions for the teacher. Supplementing classroom use of computer simulations with a teacher-led discussion allows for guiding the learners’ attention to important aspects of the research process and connecting the different stages of inquiry learning (Gelbart et al., 2009). However, the most effective classroom use of computer simulations (Adams et al., 2008), how the learning processes can be mediated by the teacher (Hennessy, 2006), and how to integrate computer simulations into a physics curriculum (Zacharia & Anderson, 2003) appear to have been sparsely researched. Moreover, much research on educational technology seems to be based on the assumption that its use takes place in isolation from other activities: in the absence of a teacher’s guidance, and detached from curriculum and assessment structures (Hennessy, 2006).

Salinas (2008) provides a model that shows how new technologies can have added value when applied in teaching practices. This model proposes connections between the learners’ needs, the levels of Bloom’s Taxonomy, the role of the instructor, and the appropriate technology to be used. Simulations can provide a suitable technology that can support the level of interaction needed for inquiry learning, but that needs to be augmented by an appropriate pedagogy. In such a pedagogy, implementation of interactive activities can change the teacher’s role from being a mere transmitter of information to becoming a facilitator of higher-order thinking skills (Gokhale, 1996). Salinas’ framework and inquiry-based teaching and learning with computer simulations are in line with the ideas of constructivism and social constructivism. Where in constructivism the central idea is that understanding is constructed in one’s own mind by ‘learning by doing’, the focus in social constructivism is more on creating knowledge in a group setting by knowledge sharing and distribution (Dori & Belcher, 2005).

Clear learning goals are crucial for effective teaching (Marzano, 1998). Salinas argues that his model facilitates the choice of appropriate technologies for achieving learning goals. However, as Salinas also argues, one of the main obstacles to the implementation of his model is that the teacher needs to adapt his/her role appropriately: the learning goals that are associated with higher levels of Bloom’s Taxonomy call for the teacher to take on a role that is less directive and more supportive. In many current learning approaches for technology-supported instruction, insufficient attention is given to the role that the teacher should fulfill (Urhahne et al., 2010). According to Salinas (2008), educators do not have an adequate understanding of what pedagogical principles should underlie the incorporation of such new technologies.

Teachers influence how students learn by varying the types of questions they ask. King (1990, 1992) distinguishes recall questions and critical thinking questions, where recall questions require students to recall information that was presented earlier, and critical thinking questions require them to analyze, apply and evaluate it. Creemers and Kyriakides (2006) make a related distinction between product questions and process questions, where product questions expect merely a single response from a student, while process questions expect students to explain their answer as well. Effective teachers not only pose many questions to their students to involve them in discussion; they also ask relatively many process questions (Creemers & Kyriakides, 2006).

When investigating teachers’ questioning in teaching, it is informative not only to look at how often certain types of questions are posed, but also to determine how these questions are sequenced and followed up. A possible question sequence in the context of inquiry learning with computer simulations is: predict, observe, explain (Hennessy et al., 2007). When students merely observe a computer simulation as a whole class and subsequently listen to the explanation by the teacher, the learning situation is comparable to a traditional demonstration (Crouch et al., 2004). Students’ ability to predict outcomes does not appear to depend greatly on whether they have observed a demonstration or not. What appears to have a greater impact on that skill is whether students had the opportunity to predict the outcomes before observing, and the opportunity to discuss the outcomes with each other afterwards, before the teacher’s explanation. Considering the learning gains this yields, the extra time needed for predicting and discussing seems more than worth it (Crouch, Watkins, Fagen, & Mazur, 2007). The value of prediction questions may lie in stressing what is important, and in the construction of a mental framework needed for exploring a phenomenon. Without such a framework, the level of detail in a situation can be too high for a student to remember relevant scientific ideas (Adams et al., 2008).

An important factor influencing learning is the attitude toward a given instructional medium and instructional approach (Pyatt & Sims, 2012; Trundle & Bell, 2010). An attitude is a tendency to evaluate an object in terms of favorable or unfavorable attribute dimensions (Ajzen, 2001), which distinguishes the concept from beliefs or opinions (van Aalderen-Smeets, Walma van der Molen, & Asma, 2012). In turn, teaching with computer simulations can positively impact attitudes toward learning (Khan, 2011; Vogel et al., 2006; Zacharia & Anderson, 2003).

The purpose of the present study is to investigate relations between the attitudes and learning goals of teachers and their students regarding the use of computer simulations in whole-class teaching, and how teachers implement these simulations in their teaching practices. We observed physics teachers, each teaching one lesson using one or more computer simulations. Salinas (2008) argues that if teachers do succeed at appropriately tailoring their role to an introduced technology such as simulations, a ‘new learning environment’ can emerge: one that is characterized not only by different roles for the teacher and students, but also by higher student motivation and improved learning outcomes. We want to know whether such presumed relations can be revealed by investigating the inquiry-based character of teacher implementations and the learning goals and attitudes of students and their teachers related to teaching with computer simulations. With regard to learning goals, we focused on congruence between teacher and student learning goals, as this is an important aspect of the learning process: learning can be disrupted by a lack of congruence between students’ self-regulation and teachers’ external regulation of learning processes (Vermunt & Verloop, 1999). Because our research approach, which incorporates both inquiry-based learning with computer simulations and its contextual teaching factors, is unprecedented, we believe our study contributes to the existing research literature.

&3.2& &Method&

We focused our study on physics lessons conducted in Dutch secondary education. A physics lesson was eligible for observation when a physics teacher planned to use at least one computer simulation during whole-class interaction with the students. The participating teachers mostly taught with simulations from the PhET simulations suite available online (2014). Sometimes the teachers used simulations from other websites, but in all cases these represented a physics phenomenon in simplified form and allowed interaction with the underlying model through the possibility of influencing several variables (de Jong & van Joolingen, 1998). Although each simulation has its own peculiarities, we treated those in this study as physics simulations in general, because our focus was not on the specific characteristics of the simulations themselves, but on the teachers using them according to the predict-observe-explain principle. The teachers mostly used simulations on mechanics, electricity, or wave phenomena.

3.2.1 Participants

The participants in the study were 24 Dutch secondary education physics teachers, and the students in their classes. Most of the teachers were male (3 female, 21 male). Their ages ranged from 23 to 59 (M = 39.4; SD = 8.97), and their years of experience from 1 to 35 (M = 11.0; SD = 9.82). The classes consisted of 7 to 27 students (M = 20.9; SD = 4.55). The students’ ages ranged from 12 to 19 (M = 15.9; SD = 1.40). The teachers were recruited by direct mailings, calls on online groups and professional newsletters, and at a professional physics education conference. As our recruitment was most successful at this conference, participating teachers probably have a higher than average interest in didactic development. Schools were distributed across the Netherlands (see Figure 3-1).

Figure 3-1 Geographical location of the 19 schools of the 24 participating teachers. Our university is located in Enschede, the Netherlands.

3.2.2 Data sources

For our investigation of the attitudes and learning goals of the teachers and their students, and the implementation of the computer simulation in class, we collected data from teacher interviews, lesson observations, and student questionnaires. To prevent the interviews from influencing what was done during class, lesson observations always preceded the interviews. In an observation, the teacher used at least one computer simulation during the lesson (see Figure 3-2); the teacher and the students were video recorded; the students completed questionnaires during the last 15 minutes of the lesson; and after the lesson, the teacher was interviewed for about half an hour.

Figure 3-2 One of the participating teachers teaching with a simulation. Printed with permission.

Lesson observations

During the lesson, the researcher was positioned at the back of the classroom with two cameras, filming how the teacher conducted the lesson. At the front of the classroom, two cameras recorded students’ participation in the lesson. Recording began the moment the students entered the classroom and stopped when they started completing their questionnaires at the end of the lesson.

Questionnaires

Students were given questionnaires at the end of the lesson. Our questionnaire is based on four questionnaires (Apperson, Laws, & Scepansky, 2006; Davies, 2002; Maor & Fraser, 2005; Mucherah, 2003) that inquire about topics related to using computer simulations, and the experiences of students and teachers. We selected questions from these questionnaires that are related to the teacher role, to control over the lesson discourse, and to engagement in the lesson. The questionnaires consisted of 27 items (see Table 3-3) that could be answered on a 5-point Likert scale, and one open-ended question asking about what they considered were the three most important things to be learned during that lesson. This open question was also posed to the teacher. The purpose of this question was to determine learning goal congruence.
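The scoring of this open question is addressed in the data analysis. Purely to illustrate the idea of learning goal congruence, a hypothetical overlap measure between a teacher’s and a student’s stated goals could look as follows; the function names, normalization, and matching rule are our own invention, not the procedure used in this study.

    def normalize(goal):
        """Trivial normalization so superficially different phrasings can match."""
        return " ".join(goal.lower().split())

    def goal_congruence(teacher_goals, student_goals):
        """Hypothetical measure: proportion of the teacher's learning goals
        that also appear among the goals stated by a student."""
        teacher = {normalize(g) for g in teacher_goals}
        student = {normalize(g) for g in student_goals}
        return len(teacher & student) / len(teacher) if teacher else 0.0

    # Invented example: two of the teacher's three goals are matched.
    print(round(goal_congruence(
        ["Ohm's law", "reading circuit diagrams", "series vs. parallel circuits"],
        ["ohm's law", "series vs. parallel circuits"]), 2))  # -> 0.67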

Teacher interviews

After the observation of a physics lesson and completion of the questionnaires by the students, the teacher was interviewed. This semi-structured interview took about 30 minutes. To allow for full transcription afterwards, the interview was audio recorded. Along with several demographic questions, the interview consisted of twelve questions about teaching with computer simulations in a whole-class setting.

3.2.3 Data analysis

Lesson observations

We analyzed the lesson observations to find out whether the teaching approach resembled inquiry-based teaching. Table 3-1 shows the scheme that we used to code the questions asked by the teacher. Any episode during which the teacher addressed the whole class was eligible for coding.

Table 3-1 Coding scheme for lesson observations

Teacher questions that are related to physics

What kind of question is it?
- Recall: questions that students should be able to answer with the knowledge they already have. Example: “In what unit is this variable measured?”
- Prediction: students are asked to predict how a phenomenon will develop further before this has actually happened. Example: “What happens if that variable is doubled?”
- Observation: the teacher inquires about what students are observing at that moment. Example: “And what do you see right now?”
- Explanation: students are asked to explain why a phenomenon has developed in a certain way. Example: “Now how do you explain this result?”

Who answers the question?
- teacher: the teacher’s question is answered by the teacher himself/herself.
- student: the teacher’s question is answered by a student.

Teacher questions that are not related to physics (code: other)
- Students are personally addressed. Example: “Alison?”
- Student answers are repeated back in the form of a question. Example: “You’re saying a lower frequency?”
- The teacher checks whether subject matter is understood. Example: “Is that clear?”
- The learning process is regulated. Example: “What have we seen today?”

All transcribed lesson observations were coded by the first author. Six of the transcripts were double-coded by the second author and a student assistant. Because we worked with three coders, we determined inter-rater reliability using Krippendorff’s alpha. We calculated the reliability of our approach for coding the teacher questions in three ways: our ability to discriminate between actual physics content questions (recall, prediction, observation, and explanation) and other questions; our ability to discriminate who answered questions (answered by the teacher, answered by the student, and other); and, our ability to discriminate between the different kinds of questions (recall, prediction, observation, explanation, and other).
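
Krippendorff’s alpha accommodates more than two coders and missing codings, which makes it suitable for this three-coder setup. As an illustration only, the sketch below computes it with the open-source Python krippendorff package; the coder data, the numeric recoding of the question categories, and the choice of this particular library are assumptions made for the example, not the study’s actual tooling.

```python
# pip install krippendorff numpy  (assumption: any alpha implementation would do)
import numpy as np
import krippendorff

# Hypothetical data: one row per coder, one column per coded teacher question;
# 0-4 stand for recall/prediction/observation/explanation/other, and
# np.nan marks questions that a coder did not code.
reliability_data = np.array([
    [1, 2, 3, 0, 4, 1, np.nan, 2],
    [1, 2, 3, 0, 4, 0, 2.0,    2],
    [1, 2, 1, 0, 4, 1, 2.0,    np.nan],
])
alpha = krippendorff.alpha(reliability_data=reliability_data,
                           level_of_measurement="nominal")
print(f"Krippendorff's alpha (nominal): {alpha:.2f}")
```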

Our purpose in analyzing the observed lessons was to determine the extent to which the teaching approach seen resembles inquiry-based teaching. Two aspects of the pedagogical approach that we consider essential for inquiry-based teaching are student-centeredness and resemblance to the inquiry cycle. For each lesson observation we calculated two scores reflecting these aspects: a Student-Centeredness-Score (SCS) and an Inquiry-Cycle-Score (ICS). We calculated these scores only when the teacher asked at least three physics content questions. Student-centeredness is the percentage of physics-related teacher questions that are answered by the students: SCS = 100 * N teacher questions answered by a student / (N teacher questions answered by the teacher + N teacher questions answered by a student). We determined the resemblance of the teaching approach to the inquiry cycle by taking an inventory of the extent to which the cycle of predict-observe-explain was followed during the lesson. The phases of this P-O-E cycle correspond respectively to the inquiry cycle phases of ‘stating a hypothesis’, ‘performing an experiment’, and ‘drawing conclusions’, and consequently do not cover the phase of ‘orientation’. Hennessy et al. (2007) argue that this P-O-E cycle is one of the pedagogical principles on which research on the use of ICT in science has been based.

We assigned scores to sequences of coded questions by applying the following hierarchical scoring system: explanation (E) = 1; observation-explanation (O-E) = 2; prediction (P) = 3; prediction-observation (P-O) = 4; prediction-explanation (P-E) = 5; prediction-observation-explanation (P-O-E) = 6. This scoring system is based on the following rationale: without the phase of prediction, inquiry-based teaching is out of the question; observation can be considered as learning at a lower level of Bloom’s Taxonomy compared to prediction and explanation; but observation does make an inquiry cycle more complete compared to a cycle in which explicit observation is lacking. For the determination of these sequences the recall and other questions are ignored, and consecutive questions of the same type are combined, such as an observation followed by an observation. Sequences only count when they are not overlapped by a sequence of higher weight. For example, the sequence P-P-P-O-E-E does not count as separate sequences of P, P-O, O-E, or E, as these are all overlapped by one P-O-E sequence. A sketch of this computation follows below.
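
To make both scores concrete, here is a minimal Python sketch that follows the procedure as we read it above. That the per-lesson ICS sums the weights of all maximal sequences is an inference from the description and from the magnitudes reported in Table 3-2, not a rule stated verbatim in the text.

```python
def student_centeredness_score(n_answered_by_student, n_answered_by_teacher):
    """SCS: percentage of physics-related questions answered by students."""
    return 100 * n_answered_by_student / (
        n_answered_by_student + n_answered_by_teacher)

# Weights of the maximal question sequences, as defined above.
WEIGHTS = {("E",): 1, ("O", "E"): 2, ("P",): 3,
           ("P", "O"): 4, ("P", "E"): 5, ("P", "O", "E"): 6}

def inquiry_cycle_score(codes):
    """ICS for one lesson, given its coded teacher questions in order."""
    poe = [c for c in codes if c in ("P", "O", "E")]  # r/other are ignored
    # Combine consecutive questions of the same type (e.g., O after O).
    collapsed = [c for i, c in enumerate(poe) if i == 0 or c != poe[i - 1]]
    score, i = 0, 0
    while i < len(collapsed):
        # Match the longest pattern first, so shorter sequences overlapped
        # by a higher-weight sequence are not counted separately.
        for length in (3, 2, 1):
            pattern = tuple(collapsed[i:i + length])
            if pattern in WEIGHTS:
                score += WEIGHTS[pattern]
                i += len(pattern)
                break
        else:
            i += 1  # a lone O matches no pattern and carries no weight
    return score

# The worked example from the text: P-P-P-O-E-E collapses to one P-O-E.
assert inquiry_cycle_score(list("rPPPOEEr")) == 6
```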

Questionnaires

At the end of each lesson the students completed questionnaires. These questionnaires consisted of 27 items on a 5-point Likert scale, and one open-ended question. We performed exploratory factor analysis to analyze the responses to the items.

The open-ended question asked students what they considered were the three most important things to be learned during that lesson. Each class’s teacher also answered this question. The purpose of this open question was to investigate learning goal congruence between the teacher and his/her students during the specific lesson. To measure learning goal congruence we used cosine similarity (Manning & Schütze, 1999). In computing cosine similarity, word frequencies in two texts are represented by vectors in a high-dimensional space. The cosine of the angle between these vectors is taken as a measure of similarity, yielding a number between 0 (no similarity at all) and 1 (perfect similarity). We computed both the congruence of learning goals within the group of students themselves, and the congruence between the learning goals of students and those of their teacher. We refer to these kinds of congruence as learning goal congruence group and learning goal congruence teacher, respectively. Before computing cosine similarity we converted the answers given by the students and teachers in the following steps: we retained only the nouns in the student and teacher responses; we replaced synonyms, plurals, and diminutives with a single canonical form; and we removed capitalization, special punctuation, and multiple instances of the same word in one answer. We removed students with an empty answer sequence before calculating the mean learning goal congruence group over an entire class, because retaining empty answer sequences would otherwise inflate congruence scores. There were no teachers with an empty answer sequence.
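
For illustration, cosine similarity over word-frequency vectors can be computed directly. The sketch below assumes two answers that have already been through the preprocessing steps just described; the example words are hypothetical.

```python
from collections import Counter
from math import sqrt

def cosine_similarity(answer_a, answer_b):
    """Cosine of the angle between two word-frequency vectors."""
    freq_a, freq_b = Counter(answer_a.split()), Counter(answer_b.split())
    dot = sum(freq_a[w] * freq_b[w] for w in freq_a.keys() & freq_b.keys())
    norm_a = sqrt(sum(v * v for v in freq_a.values()))
    norm_b = sqrt(sum(v * v for v in freq_b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

# Hypothetical preprocessed answers (nouns only, normalized):
print(cosine_similarity("force acceleration mass", "force mass gravity"))  # ~0.67
```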

Teacher interviews

Teacher utterances were eligible for coding when an utterance reflected the teacher’s thoughts on the effects of whole-class teaching and learning with computer simulations. Utterances about the effects of teaching with computer simulations were divided into those attributing positive effects, those attributing negative effects, and those stating that it does not make a difference compared to other educational means. After having inventoried these utterances in all of the interviews, we considered the extent to which these views on the effects of computer simulations on teaching and learning could be further specified according to more specific codes. In determining these codes we attempted to maintain a balance between conceptual delineation of differing themes and grouping together of conceptually related themes. This resulted in a coding scheme with 10 negative-effect codes, 1 neutral code, and 23 positive-effect codes (see our Results Supplement, which is available as supplementary material accompanying the online article).

An example will clarify how the coding process functioned in practice. Consider the following teacher statement: “By using computer simulations, I can quickly show something on my IWB, and by influencing variables the students directly notice what happens”. This statement was chunked and coded as follows: [By using computer simulations, I can quickly show something on my IWB] = positive effect, and [by influencing variables the students directly notice what happens] = positive effect. Application of our coding scheme linked these statements to the following codes: [By using computer simulations, I can quickly show something on my IWB] = positive – time saving, and [by influencing variables the students directly notice what happens] = positive – direct feedback. As we are interested in inquiry-based teaching with computer simulations, the codes most related to this concept were aggregated into one measure, which we refer to as: N inquiry utterances. We selected the codes for this aggregated measure by determining whether a code refers to affordances that support inquiry activities.
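
The aggregation into N inquiry utterances then reduces to counting the inquiry-related codes. In the sketch below, the code labels are assumptions modeled on the five codes listed in section 3.3.3 and on the “positive – …” naming used in the example above.

```python
from collections import Counter

# Assumed labels, following the five codes listed in section 3.3.3.
INQUIRY_CODES = {
    "positive - adjustable variables",
    "positive - direct feedback",
    "positive - illustration/visualization",
    "positive - supporting predictions",
    "positive - visualization of invisible phenomena",
}

def n_inquiry_utterances(coded_utterances):
    """Count coded interview utterances that support inquiry activities."""
    counts = Counter(coded_utterances)
    return sum(counts[code] for code in INQUIRY_CODES)

# Two of these three hypothetical coded utterances are inquiry-related:
print(n_inquiry_utterances(["positive - time saving",
                            "positive - direct feedback",
                            "positive - adjustable variables"]))  # 2
```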

3.2.4 Relating pedagogical aspects

We assume that students will have a more positive attitude toward teaching with computer simulations when the teacher successfully adapts technology implementation and teaching approach to each other. Another assumption is that the impression one gets by interviewing a teacher about his/her attitude toward teaching with computer simulations to a certain extent resembles what a teacher actually does during a lesson observation. Our third assumption is related to the important role the teacher plays in aligning the use of computer simulations to learning goals (Smetana & Bell, 2012). We assume that when a teacher succeeds at such congruence, this will be reflected in positive attitudes expressed by the teachers and their students about teaching with computer simulations. In short, we expect that what a teacher does during implementation will be related to the learning goals and attitudes of the teachers and their students. To investigate whether this is supported by our data, we relate the types of data to each other by calculating correlations.

3.3 Results

3.3.1 Lesson observations

Calculations of the inter-rater reliability of our approach for coding the teacher questions revealed that our ability to discriminate between actual physics content questions and other questions is .79 (95%CI: .65-.90), between by whom questions are answered is .75 (95%CI: .66-.83), and between the different kinds of questions is .61 (95%CI: .55-.66). According to the criteria proposed by Strijbos and Stahl (2009; 2007), these reliability results can respectively be interpreted as: excellent, excellent, and good. Initially, our dataset consisted of 25 lesson observations and interviews of teachers. Retaining only those observations with three or more physics content questions resulted in the removal of one teacher from the dataset.

Table 3-2 provides an overview of inquiry-based question sequences in each of the observed lessons. Two scores reflect each lesson’s student-centeredness (SCS) and the extent to which the inquiry-cycle is evident (ICS).


Table 3-2 Teacher questions

[For each of the 24 observed lessons: the Inquiry-Cycle-Score (ICS), the percentage of questions answered by the teacher and by the students, and the physics content questions posed by each teacher in chronological order.*]

Note. *Each code refers to a type of teacher question, abbreviated as follows: r = recall, P = prediction, O = observation, and E = explanation. Questions in bold refer to teacher questions answered by a student; teacher questions that are not bold were answered by the teacher him-/herself. A darker shade of gray represents higher resemblance to the inquiry cycle.


3.3.2 Questionnaires

We analyzed the students’ answers to the 27 questionnaire items by performing exploratory factor analysis based on Alpha factoring with an Oblimin with Kaiser Normalization rotation method. This method of oblique rotation was chosen because we expected the factors to be related. The correlation matrix revealed that there are no variables that correlate too highly with other variables (R > .8), which would have been an indication of multicollinearity. The KMO-measure of sampling adequacy is .906, which is very good (Kaiser, 1974). The significant result of Bartlett’s test of sphericity (χ²(351) = 3750.29; p < .001) indicates that the correlation matrix is appropriate for factor analysis. The data screening did not result in the exclusion of variables. Because a clear point of inflexion in the curve could be discerned in the scree plot, we chose to extract seven factors (Field, 2009) (see Table 3-3), which together explain 40.0% of the total variance.
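
For readers who want to reproduce this kind of analysis, the sketch below runs the corresponding checks and extraction with the Python factor_analyzer package. Two caveats: the input file is hypothetical, and factor_analyzer does not implement Alpha factoring, so minres stands in here for the extraction method used in the study.

```python
# pip install factor_analyzer pandas
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity, calculate_kmo)

items = pd.read_csv("questionnaire_items.csv")  # hypothetical: 27 item columns

chi_square, p_value = calculate_bartlett_sphericity(items)
_, kmo_total = calculate_kmo(items)
print(f"Bartlett: chi2 = {chi_square:.2f}, p = {p_value:.4f}; KMO = {kmo_total:.3f}")

fa = FactorAnalyzer(n_factors=7, rotation="oblimin", method="minres")
fa.fit(items)
loadings = pd.DataFrame(fa.loadings_, index=items.columns)
# Suppress loadings below |.2|, as in Table 3-3.
print(loadings.where(loadings.abs() >= 0.2).round(3))
```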

To measure reliability we calculated Cronbach’s alpha for each factor (see Table 3-3). As Cronbach’s alpha should be at least .7, the scales for factors 2, 3, 4, and 6 are not used for further analyses. The internal consistency of factor 1 is good, and for factors 5 and 7 acceptable (Field, 2009). We conceptually labeled factor 1 as representing motivation, as it combines items on supporting attention, enjoyment, and learning. Factor 5 combines items on realizing complexity, support of learning, and wonderment and is therefore related to insight. However, as all related items load negatively on the factor, we use the negative of the score of factor 5 as representing insight. Finally, factor 7 represents inspiration, as items on supporting interest, creative thinking, and the learning process load high on this factor. Table 3-3 shows additional factor details and the items that are related to each factor.
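
Cronbach’s alpha follows directly from the item variances of a scale; a minimal sketch:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """item_scores: respondents x items array for one scale."""
    x = np.asarray(item_scores, dtype=float)
    k = x.shape[1]                              # number of items in the scale
    sum_of_item_variances = x.var(axis=0, ddof=1).sum()
    variance_of_total = x.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - sum_of_item_variances / variance_of_total)

# Usage with random placeholder data (real Likert item scores would go here):
rng = np.random.default_rng(0)
print(cronbach_alpha(rng.integers(1, 6, size=(200, 6))))
```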

Table 3-3 Pattern matrix for the exploratory factor analysis

- I pay more attention during the lesson when computer simulations are used. (factor 1: .688)
- The use of a computer simulation makes it easier to stay focused on the lesson. (factor 1: .599)
- I like physics better when computer simulations are used. (factor 1: .596; factor 7: .322)
- The use of computer simulations supports my learning during the lesson. (factor 1: .475)
- It’s fun when computer simulations are used in teaching. (factor 1: .472; factor 7: .361)
- Using computer simulations helps me to improve my knowledge of physics. (factor 1: .431; factor 5: -.272)
- I was always able to predict the course of the simulation. (factor 2: .606)
- There is a clear link between the simulation used and the subject of the lesson. (factor 2: .453; factor 4: .291)
- I know what I have to remember from this simulation. (factor 2: .373)
- Our teacher encourages us to express our opinions. (factor 3: -.549)
- I feel free enough to express my opinions during discussions. (factor 3: -.479)
- -/- (Class discussions are inhibited by the use of computer simulations.) (factor 4: .433)
- Computer simulations clarify complex matters. (factor 1: .284; factor 4: .306; factor 5: -.294)
- Our teacher uses computer simulations to clarify a subject to the class. (factor 4: .217)
- I also use computer simulations outside formal class times. (no loading ≥ |.2|)
- Computer simulations make me realize how complex reality really is. (factor 5: -.601)
- Using computer simulations helps me to better understand the subject. (factor 1: .345; factor 5: -.454)
- By using the simulation I learned how reality works. (factor 5: -.414)
- Computer simulations help me to wonder about new things. (factor 5: -.388; factor 7: .317)
- The use of computer simulations helps with getting better grades. (factor 5: -.308)
- There is enough time to discuss with each other during the simulation. (factor 6: .753)
- During the simulation I have enough time to think by myself what will happen. (factor 6: .520)
- The use of computer simulations increases my interest in the subject of the lesson. (factor 1: .316; factor 7: .524)
- When a computer simulation is used, I get new ideas. (factor 5: -.302; factor 7: .412)
- Using a computer simulation makes me think about the subject. (factor 5: -.311; factor 7: .406)
- Computer simulations should be used more often in other subjects. (factor 1: .214; factor 7: .381)
- The use of computer simulations can really get class discussions going. (loadings -.342 and .375; factor columns not recoverable)

Note. Factor loadings < |.2| are suppressed; extraction method: Alpha Factoring; rotation method: Oblimin with Kaiser Normalization; labels for factors with Cronbach’s alpha > .7: factor 1 = motivation; factor 5 = -/- (insight); factor 7 = inspiration.

Table 3-4 shows the means of the learning goal congruence for each class. The measure M learning goal congruence group does not take into account what the teacher answered regarding the three most important things to be learned during that lesson. This measure only compares the students’ answers with each other. The measure M learning goal congruence teacher compares the students’ answers with the answers of their teachers.

3.3.3 Teacher interviews

Our Results Supplement accompanying the online version of this article shows a complete inventory of the results of coding the teacher interviews, from both a quantitative and a qualitative perspective. It provides insight into what kind of teacher utterances the specific codes represent. The selection of codes that we report on in Table 3-4 specifically refers to the affordances of computer simulations for supporting inquiry-based teaching: adjustable variables, direct feedback, illustration/visualization, supporting predictions, and visualization of invisible phenomena. We aggregated these codes into one measure: N inquiry utterances.

3.3.4 Relating pedagogical aspects

To allow for calculation of correlations between variables, these variables need to be measured at the same level (i. e., at the class or student level). All questionnaire variables are measured at the student level, and all variables based on the lesson observations and teacher interviews are measured at the class level. We therefore converted the questionnaire variables to the class measurement level by calculating the means per class for each variable. Table 3-5 shows a correlation matrix for all variables at the class measurement level.
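
A minimal sketch of that conversion and correlation step with pandas, assuming hypothetical input files with one row per student and one row per class:

```python
import pandas as pd

# Hypothetical files: student_factors.csv has class_id plus the three factor
# scores per student; class_measures.csv has the observation and interview
# variables per class.
students = pd.read_csv("student_factors.csv")
class_means = students.groupby("class_id")[
    ["motivation", "insight", "inspiration"]].mean()

classes = pd.read_csv("class_measures.csv", index_col="class_id")
merged = classes.join(class_means)        # all variables now at class level
print(merged.corr(method="pearson").round(2))
```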

Table 3-4 Learning goal congruence & frequencies of coded teacher utterances

[Per teacher*: M learning goal congruence group**, M learning goal congruence teacher***, and frequencies of coded teacher utterances.]

Note. *Teachers are ordered according to total coded teacher utterances relating to inquiry-based teaching with computer simulations; **M learning goal congruence of students with each other; ***M learning goal congruence of students with their teacher.

Table 3-5 Pearson correlations

[Pearson correlation matrix over the class-level variables: Student-Centeredness-Score, Inquiry-Cycle-Score, M factor motivation, M factor insight, M factor inspiration, M learning goal congruence group, M learning goal congruence teacher, and N inquiry utterances.]

Note. Significant correlations between variables from different data sources are printed in bold; N = 24; *p < .05 (2-tailed); **p < .01 (2-tailed).

We are primarily interested in significant correlations between variables derived from different data sources. Our findings in Table 3-5 show that there are three correlations that meet this criterion: the Student-Centeredness-Score significantly correlates with M factor motivation (r = .43, p (2-tailed) < .05), the Inquiry-Cycle-Score significantly correlates with M factor insight (r = .50, p (2-tailed) < .05), and N inquiry utterances significantly correlates with M learning goal congruence teacher (r = .55, p (2-tailed) < .01). Another noteworthy correlation between variables belonging to the same data source is the significant negative correlation between the Student-Centeredness-Score and the Inquiry-Cycle-Score (r = -.45, p (2-tailed) < .05).

3.4 Discussion of Limitations and Implications

When coding the questions posed by the teacher, we determined the type of each question (i.e., recall, prediction, observation, explanation, or other), by whom it was answered, and the sequences in which these questions were structured (i.e., the extent to which they approached the inquiry cycle). In this data analysis process, the questions were coded, the sequences in which they were structured were weighted, and eventually two scores were calculated for each lesson. We acknowledge that in this process the information regarding what the questions were actually about was lost. Therefore, it is possible that a sequence that we consider P-O-E could, for example, span different conceptual domains. In this study, we chose to focus on the typological level to prevent the introduction of an extra source of subjectivity, which would result from additional coding on a conceptual level.

The system we used to assign weights to different sequences of questions is based on the assumption that the most preferable teaching approach follows the complete cycle of prediction-observation-explanation. However, as Chen (2010) argues, portraying inquiry learning as a deductive relationship between the tested hypothesis and evidence runs the risk of conveying an oversimplified view of scientific inquiry. Preferably, the tested hypothesis, the evidence, and the associated experimental conditions are inspected as a whole, supporting epistemological authenticity. Our system of weighting sequences is suitable for measuring the extent to which the P-O-E cycle is approached, but capturing this epistemological authenticity at a higher level requires searching for alternative data analysis approaches.

The factor analysis we performed on the answers to the questionnaires resulted in three reliable factors: motivation, insight and inspiration. However, these constructs are based on a relatively small number of items (6, 5, and 5 respectively). Performing follow-up studies with questionnaires that are specifically developed for measuring these constructs would do more justice to their multi-faceted nature.

We used an open-ended question to inquire about what students and their teachers considered to be the three most important things to learn during the lesson. By calculating the cosine similarity between the answers of the students and their teachers, we wanted to find out whether there was learning goal congruence. It seems reasonable to expect that when answering the question about the three most important things to learn, students will think more about the concrete lesson they have just received, while teachers will take more of a bird’s-eye view of the goals of the curriculum. If so, then that would lead to higher learning goal congruence between the students themselves, compared to the congruence between students and their teacher. Table 3-4 does show that the scores on M learning goal congruence group are structurally higher than the scores on M learning goal congruence teacher, but this does not mean this bird’s-eye-view assumption is correct, as the learning goal congruence scores for group and teacher are calculated in slightly different ways, making it impossible to compare them directly. Nevertheless, these two measures significantly correlate with each other (r = .55, p (2-tailed) < .01), which is an interesting finding in itself: apparently, our approach to calculating learning goal congruence among the students provides a good indication of congruence between the learning goals of the students and their teacher.

In our review of the literature on the learning effects of computer simulations (Rutten et al., 2012) we found that most research of the past decade focused on individuals or small groups interacting with simulations, thereby ignoring the role of the teacher and lesson scenarios. Whereas such studies provide information that is relevant to the design and learning effects of simulations, they lack ecological validity in the sense that they do not include the classroom dynamics in realistic teaching situations. The present study aims at filling this gap by relating the attitudes and learning goals of teachers and their students with what teachers actually do while teaching with computer simulations, using multiple data sources. We ‘zoomed out’ to the context of teaching practices and found four relations between pedagogical aspects related to inquiry-based teaching with computer simulations:

1. Student-centered implementation of computer simulations in teaching relates to students’ positive attitude about its contribution to their motivation.

2. Implementation of computer simulations in teaching that resembles the inquiry cycle relates to students’ positive attitude about its contribution to their insight.

3. Student-centered implementation of computer simulations in teaching relates to low resemblance to the inquiry cycle, and vice versa.

4. Learning goal congruence between a teacher and his/her students relates to the teacher’s positive attitude about inquiry-based teaching with computer simulations.

In the present study we did not experimentally manipulate teaching practices. Consequently, we cannot make claims about causal relations. However, the approach taken in this study allowed us to observe teachers while they enacted their preferred way of teaching, unconstrained by imposed experimental conditions on teaching practices. During the lessons observed for the present study, the teachers were free in their choice of computer simulations and in their approach to teaching with them, supplying ecological validity. Our study shows that simply observing teaching practices supported by computer simulations without experimentally manipulating them allows interesting relations between pedagogical aspects to surface. Nevertheless, it remains interesting and necessary to investigate whether intervening in simulation-based teaching will expose similar relations.

Students have a more positive attitude toward teaching with computer simulations, in terms of contributions to their motivation and insight, when the teaching approach has an inquiry-based character. An explanation based on Salinas’ framework (2008) is that in such a case the role of the teacher is apparently successfully tailored to the introduced technology, as inquiry-based teaching implicitly means a more student-centered teaching approach. According to the framework, such a teaching approach precisely suits the affordances of computer simulations for optimally supporting the achievement of learning goals. However, our results show that teachers generally implement computer simulations either by having their teaching approach resemble the inquiry cycle or by teaching in a student-centered way, but rarely by incorporating both aspects of inquiry-based teaching. Apparently, it is rather difficult to combine teaching according to the inquiry cycle with a shift of control to the students by having them answer the questions posed by the teacher. This combination seems hard, compared to having students answer questions without an inquiry-related character, e.g., recall questions. Future research could focus on whether informing teachers about the importance of learning goal congruence and the positive effects of teaching with computer simulations according to an inquiry-based approach results in inquiry-based teaching in practice, in more positive attitudes about inquiry-based teaching with computer simulations, and in higher learning outcomes.

3.5 Acknowledgements

The authors would like to thank all the teachers and their students who participated in this study, and are grateful for the insightful comments from the referees. We would also like to thank Anjo Anjewierden for calculating cosine similarity, and the Fischer Working Group on Physics Education at the University of Duisburg-Essen for thinking along with us about analyzing the lesson observations.

Chapter 4 Understanding the Effects of Inquiry-Based Teaching with Computer Simulations

Going beyond simply measuring the effectiveness of an inquiry-based teaching approach with computer simulations, we search for mechanisms in the pedagogical context that have an impact on learning outcomes. We compare the effectiveness of two pedagogical approaches by having teachers teach with computer simulations in an Accustomed condition and a Peer Instruction condition. We replicated this quasi-experiment with five teachers. We investigate the pedagogical context by analyzing interaction between teachers and their students. Our analysis of teacher-student interaction reveals the aspects of the pedagogical context that have an impact on learning outcomes in this series of separate quasi-experiments. We conclude by suggesting possible mechanisms occurring in the pedagogical interaction that can explain our findings, opening up possibly fruitful directions for future research.

4.1 Introduction

Research over the past decade on the learning effects of computer simulations in science education has shown that computer simulations can improve the effectiveness of instruction (Rutten et al., 2012). Most of this research focused on learning by individual students or students in small groups. In such settings, it is relatively easy to influence and control learning processes while carrying out inquiry learning tasks. It is not a given that such results can be transferred from these small-scale settings to learning in whole classrooms, which typically have one teacher and between 20 and 30 students. Whereas the learning processes in small-scale settings are usually supported and controlled from within the simulation or by using additional teaching material, in full classrooms an important role falls to the teacher. The teacher can act as a facilitator of knowledge construction, and can encourage the students to discuss and reflect on their learning (Khan, 2011; Maeng et al., 2013). A previous study shows that teaching with computer simulations at the whole-class level using an inquiry-based approach relates positively to students’ attitudes about the contribution of teaching with computer simulations to their motivation and insight (Rutten, van der Veen, & van Joolingen, submitted). Attitude towards a given instructional medium and instructional approach is an important factor influencing the degree of conceptual change (Pyatt & Sims, 2012; Trundle & Bell, 2010).

For these reasons, it is worthwhile to study in more detail how whole-class teaching for inquiry learning with computer simulations can be supported. In such studies, it is important to address both the teacher and the students. Students can be supported by triggering the relevant learning processes by asking questions, providing feedback or providing other means of support. Providing structures and examples for performing these triggering actions can assist teachers in the support of these inquiry processes. The properties of these types of support for teachers and students in the whole class context are not well known.

Any means of supporting teachers in their support of inquiry-based learning should be evaluated at several levels. First, the effect on teacher behavior should be checked: does a support measure affect the behavior of the teachers in terms of performance of supportive actions towards the students? Second, the effects of the teachers’ behavior on the learning activities of the students need to be studied. These two aspects together form the pedagogical interaction in the classroom. Finally, the effects of this pedagogical interaction on the final learning outcomes in terms of test results, grades or other measures should be determined. All three levels are important for understanding in full the impact that teaching interventions can have on the learning processes in classrooms.

We present a study in which teachers are supported in stimulating their learners to perform inquiry processes using computer simulations. The method we use is based on Peer Instruction (Crouch & Mazur, 2001), which will be elaborated below. The main objectives of the study are:

1. to compare different approaches to whole-class teaching with computer simulations, and

2. to investigate the impact of pedagogical interaction on learning outcomes.

In our study we used Peer Instruction as a method for engaging students in the processes relevant for inquiry and compared it to teachers’ accustomed way of teaching with simulations. Peer Instruction can be considered to be a student-centered teaching approach, as it contains built-in phases during which the students are in control of their learning processes. It is a teaching approach that was originally developed by Eric Mazur in response to his observation that students appeared to learn very little, if anything, from traditional lectures (Crouch et al., 2004). Most research on Peer Instruction focuses on the actual Peer Instruction phase by measuring the learning effects of students convincing each other. In our study we explicitly include the teacher in the loop of inducing and evaluating episodes of Peer Instruction (Rutten et al., 2012; Zingaro & Porter, 2014). By focusing on learning achievement measured directly after the Peer Instruction phase, it is possible to overlook the subsequent teacher-led discussion in which the teacher can fulfill a complementary role as domain expert within the Peer Instruction learning process. Therefore, scores for the correctness of answers to questions posed directly after the Peer Instruction phase should probably be considered as an underestimate of student learning (Zingaro & Porter, 2014).

Central to Peer Instruction are so-called ConcepTests that are used to actively engage students and to find out what they struggle with (Crouch & Mazur, 2001). A ConcepTest is a short conceptual question, usually in multiple-choice format, about the subject under discussion at that point (see Figure 4-1). Engagement is stimulated by having the students answer the questions individually and then try to convince each other that their answer is right (Crouch & Mazur, 2001; Crouch et al., 2007). In fact, when trying to convince each other it is not even necessary to arrive at a consensus viewpoint for learning to occur, as discussing each other’s ideas and being confronted with different viewpoints and ways of reasoning can be a fruitful learning result in itself (Hennessy, 2011). Based on their own research, Crouch, Fagen, Callan, and Mazur (2004) argue that the extra in-class time needed to have students predict answers is time very well spent, as it yields improved understanding and higher engagement.

Figure 4-1 An example of a ConcepTest, used by Crouch and Mazur (2001). Printed with permission.

While students try to convince each other the teacher is free to walk around the classroom and get a sense of how students are reasoning toward their answers. In this way, the teacher can learn from his/her students the best way to teach them. Crouch, Watkins, Fagen and Mazur (2007) recommend not grading answers to ConcepTests, as a collaborative atmosphere is paramount. Because Peer Instruction supports learning in an interactive, non-competitive and conceptual way, the authors suggest that this learning approach might make physics courses more welcoming for female students. Based on communications from hundreds of instructors world-wide who have implemented Peer Instruction in their teaching, the authors conclude that this teaching method is readily adaptable to the local context (Crouch & Mazur, 2001).

In the study we used a quasi-experimental design for studying aspects of the pedagogical interaction and its effect in six pairs of classes, taught by five different teachers. This allows us to study the impact of the teacher support in different contexts, and to understand its effectiveness or lack thereof not only at the level of outcomes, but also at the level of the interaction between teacher and students. Peer Instruction-enriched lessons were compared with teachers’ accustomed ways of teaching, using the same simulations in both conditions.

4.2 Method

To investigate the impact of teaching with computer simulations according to an inquiry-based approach, we set up a pre-post quasi-experiment. In this experiment teachers delivered a simulation-based lesson on Newtonian mechanics in two parallel classes, covering two days of instruction (about 80 minutes total). In one class they followed their usual (“accustomed”) way of teaching; in the other class they followed our script based on Peer Instruction for inquiry. We do not treat this as a single experimental or quasi-experimental study because we compared each teacher’s accustomed way of teaching with a scripted condition, and the accustomed teaching modes may vary from teacher to teacher. Instead we present each comparison between accustomed and scripted condition as a separate quasi-experiment. We will discuss the quasi-experiments in relation to each other in order to come to general conclusions.

4.2.1 Participants

Five teachers participated in our study. Each of them taught at least two parallel classes in upper secondary education. This allowed us to compare how each teacher taught when using two different approaches. A total of 218 students participated; this total includes only those who were present at the pretest, the intervention lessons, and the posttest. The students’ average age was 15.7 years (SD = 0.77) and ranged from 14 to 18. Table 4-1 provides an overview of the characteristics of the teachers and their students in each of the parallel classes.

Table 4-1 Characteristics of the teachers and their students

[Characteristics of the five teachers and of the students in each pair of parallel classes, per condition and educational level*.]

Note. *The educational level HAVO has five grades and literally translates as "higher general continued education"; the educational level VWO has six grades and literally translates as "preparatory scientific education". Data for students who missed any part of the study except for the delayed posttest were excluded from the analysis.

4.2.2 Investigating pedagogical interaction

Lesson observations

We analyzed lesson observations to reveal the impact of pedagogical interaction between the students and their teacher on the outcomes of the pre-post quasi-experiments. Table 4-2 shows the scheme that we used to code questions asked by the teacher in order to find out whether the teaching approach corresponds with inquiry-based teaching. Any episode during which the teacher addressed the whole class was eligible for coding.

All transcribed lesson observations were coded by the first author. Eight out of the 24 total transcripts were independently double-coded by a PhD student. The selection of these transcripts was balanced across conditions, teachers, and timepoint of data collection. We calculated the reliability of our approach for coding the teacher questions in three ways: our ability to discriminate consistently between actual physics content questions (recall, prediction, observation, and explanation) and other questions; our ability to discriminate who answered questions (answered by the teacher, answered by the student, and other); and our ability to discriminate between the different kinds of questions (recall, prediction, observation, explanation, and other). Calculations of inter-rater reliability based on Cohen’s kappa revealed that the reliability of our discrimination between actual physics content questions and other questions is .87, for who answered questions it is .86, and for the different kinds of questions it is .73.
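
For two raters, Cohen’s kappa is available off the shelf; a minimal sketch with hypothetical codes for ten questions:

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical codes assigned by the two raters to the same ten questions.
rater_1 = ["P", "O", "E", "r", "other", "P", "E", "r",     "O", "P"]
rater_2 = ["P", "O", "E", "r", "other", "P", "E", "other", "O", "E"]
print(cohen_kappa_score(rater_1, rater_2))
```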


Table 4-2 Coding scheme for lesson observations

Teacher questions that are related to physics

What kind of question is it?
- Recall: questions that students should be able to answer with the knowledge they already have. Example: “In what unit is this variable measured?”
- Prediction: students are asked to predict how a phenomenon will develop further before this has actually happened. Example: “What happens if that variable is doubled?”
- Observation: the teacher inquires about what students are observing at that moment. Example: “And what do you see right now?”
- Explanation: students are asked to explain why a phenomenon has developed in a certain way. Example: “Now how do you explain this result?”

Who answers the question?
- teacher: the teacher’s question is answered by the teacher himself/herself.
- student: the teacher’s question is answered by a student.

Teacher questions that are not related to physics (code: other)
- Students are personally addressed. Example: “Alison?”
- Student answers are repeated back in the form of a question. Example: “You’re saying a lower frequency?”
- The teacher checks whether subject matter is understood. Example: “Is that clear?”
- The learning process is regulated. Example: “What have we seen today?”

For each lesson observation we calculated two scores: a Student-Centeredness-Score (SCS) and an Inquiry-Cycle-Score (ICS). Student-centeredness is defined as the percentage of physics-related teacher questions that are answered by the students (and not by the teacher himself or herself). The Inquiry-Cycle-Score was computed based on teachers’ actions related to a cycle of predict-observe-explain. Hennessy et al. (2007) argue that this P-O-E cycle is one of the pedagogical principles upon which research on the use of ICT in science has been based. Each question teachers asked was coded as either r, P, O or E. Sequences of questions were assigned Inquiry Cycle scores as follows: E = 1; O-E = 2; P = 3; P-O = 4; P-E = 5; P-O-E = 6. This scoring system is based on the following rationale: without the phase of prediction we cannot speak of inquiry-based teaching. Observation makes an inquiry cycle more complete compared to a cycle in which explicit observation is lacking. For the determination of these sequences recall and other questions are ignored, and consecutive repetitions of the same question type are combined, such as an observation followed by an observation. Shorter sequences only count when they are not part of a longer sequence of higher weight. For example, the sequence P-P-P-O-E-E does not count as separate sequences of P, P-O, O-E, or E, as these are all overlapped by one P-O-E sequence.

Teacher predictions

Once a teacher had finished teaching lessons 1-4 for both conditions, the teacher was asked to predict which class learned most about Newtonian mechanics during the series of lessons. They were also asked to describe what they thought were the most important contributing factors. Possible factors could address differences between classes and their circumstances, but could also be about differences in individual student characteristics.

4.2.3 Research design

Lesson series and measurements

We measured learning effects by administering assessments and questionnaires and registering voting system data (see Figure 4-3). Students completed assessments and questionnaires during a pretest, a posttest, and a delayed posttest. The teacher also completed a questionnaire. The total series of lessons spanned 4½ lessons for all participating students: in lesson 1 the pretest was administered; in lessons 2 and 3 the intervention lessons were conducted; in lesson 4 the posttest was administered; and one month later the delayed posttest was administered in lesson 5. Table 4-3 shows a schematic outline of the lesson series.

Table 4-3 Schematic outline of the lesson series and measures

- lesson 1: pretest (FCI*)
- lessons 2 and 3: intervention (Accustomed vs. Peer Instruction; voting system data registered in the Peer Instruction condition)
- lesson 4: posttest (FCI, TCSQ, MSLQ)
- lesson one month later: delayed posttest (FCI)

Note. *FCI: Force Concept Inventory; TCSQ: Teaching with Computer Simulations Questionnaire; MSLQ: Motivated Strategies for Learning Questionnaire.

We used the Force Concept Inventory (Hestenes, Wells, & Swackhamer, 1992) to measure improvements in conceptual insight in Newtonian mechanics (see Figure 4-2). This instrument is available online in many languages at http://modeling.asu.edu/R&E/Research.html (2014) and has been widely used. At pretest and delayed posttest students completed only the FCI. At immediate posttest students completed not only the FCI, but also a questionnaire containing six motivational scales from the Motivated Strategies for Learning Questionnaire (Pintrich & de Groot, 1990; Pintrich, Smith, García, & McKeachie, 1991) and a questionnaire that we developed in a previous study (Rutten et al., submitted) to investigate views on teaching with computer simulations, which we refer to as the TCSQ: the Teaching with Computer Simulations Questionnaire. Because the FCI data were collected at different timepoints and the students are nested within their teachers’ classes, it would seem logical to analyze the data with a multi-level growth model. However, sufficient power to detect cross-level effects would require 20 or more groups (Kreft, 1998), considerably more than the twelve classes in our study.

Figure 4-2 An example of a picture used in the FCI. Printed with permission.

Simulations, lesson design and intervention

Lessons 1, 4 and 5 were designed to be exactly the same for all participating students, as the pretest, posttest and delayed posttest were administered during these lessons. The lessons that differ across the two conditions are lessons 2 and 3. We implemented our experimental intervention by having the same teacher teach with computer simulations during these lessons in two different ways in the two parallel classes. In the remainder of this article these two ways of teaching are referred to as the Accustomed condition and the Peer Instruction (PI) condition.

Figure 4-3 Students using voting devices to register their responses to a question in class. Printed with permission.

We used simulations available from the PhET simulations website (2014). The simulations used in the lessons were selected based on the criteria that they aim at resolving misconceptions tested in the FCI (Hestenes, 2012), that a Dutch version is available and that inquiry-based conceptual questions are available (Loeblein, 2014). We chose the following PhET simulations: Projectile motion, Forces in 1 dimension, and The ramp – forces and motion.

The teacher presented the simulations on an interactive whiteboard (available in all classrooms). In the Accustomed condition, the teacher decided how the lesson would unfold (see Figure 4-4). In the experimental (scripted) condition, the lesson unfolded along a pattern designed for Peer Instruction (see Figure 4-6). The teacher was asked to follow the script, instead of using his or her own preferred way of teaching. In this Peer Instruction condition the lesson unfolded using a PowerPoint presentation on a separate screen in front of the class. The researcher operated this presentation in collaboration with the teacher. This PowerPoint presentation consisted of a series of twenty conceptual questions about Newtonian mechanics. We based these questions (with permission) on the inquiry-based concept questions published by Trish Loeblein on the PhET website (Loeblein, 2014). We implemented a voting process by showing the multiple-choice conceptual questions in the PowerPoint presentation and using a voting system plugin for PowerPoint (see Figure 4-5). This allowed the students to see how many students had yet to vote, and to see the distribution of votes after the time for voting had elapsed. Students could answer each multiple-choice question by using a personalized voting device. The PowerPoint presentation we used in this experiment can be found at: http://bit.ly/ConcepTests.

Figure 4-4 An example of instruction in the Accustomed condition: a real-life demonstration of forces and motion on a table serving as a ramp. Printed with permission.

Figure 4-5 An example of instruction in the Peer Instruction condition: students use their voting devices to answer a question projected on the right screen about the simulation shown on the left interactive whiteboard. Printed with permission.

Figure 4-6 Peer Instruction implementation.

Using this set-up we implemented Peer Instruction as described by Crouch and colleagues (2007); Figure 4-6 illustrates the connection of the different phases of instruction. The basic pattern ran as follows: in the individual-thinking phase the students were instructed to think individually about the question presented and vote. If the percentage of votes for the correct answer fell between 35% and 70%, a convince-your-neighbors phase followed. During this phase students were instructed to discuss the question in groups of two or three. They were instructed to tell each other what answer they gave and try to convince the other(s) that that answer is correct. After this phase students voted again. Based on literature on Peer Instruction (Crouch et al., 2007) we set the duration of the individual-thinking phase at one minute and of the convince-your-neighbors phase at two minutes.
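
The branching logic of this script fits in a few lines. The sketch below encodes the 35-70% band and the phase durations given above; treating the band boundaries as inclusive is an assumption.

```python
def conceptest_flow(pct_correct_first_vote):
    """Phases that follow the individual-thinking vote on one ConcepTest."""
    phases = ["individual thinking (1 min)", "first vote"]
    if 35 <= pct_correct_first_vote <= 70:   # inclusive bounds assumed
        phases += ["convince your neighbors (2 min)", "second vote"]
    # In either case the teacher can close with a teacher-led discussion.
    phases.append("teacher-led discussion")
    return phases

print(conceptest_flow(52))   # triggers the convince-your-neighbors phase
print(conceptest_flow(85))   # skips it: most of the class already agrees
```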

4.3 Results

4.3.1 The pedagogical interaction

Table 4-4 provides an overview of inquiry-based question sequences during each of the observed lessons. Each lesson is scored for student-centeredness (SCS) and the extent to which the inquiry-cycle is evident (ICS). The teaching approach of teacher E seems to be quite consistent regarding both types of score, no matter whether he is teaching in the Accustomed condition or in the PI condition. The intervention appears to have quite an impact on the ICS and SCS scores of all other teachers, with the most extreme example being teacher D, with an ICS of 45 in the Accustomed condition and 8 in the PI condition.

Table 4-5 shows the teachers’ predictions, given directly after the posttest was administered, of which condition would have the highest learning gains. It also elaborates on what factors the teachers deemed to be most influential regarding these learning effects. Teachers A, B, and C state helpful and detrimental factors for both conditions, whereas teacher D only states helpful factors for the Accustomed condition, and teacher E only for the Peer Instruction condition.

4.3.2 Learning outcome measures

Table 4-6 shows the scheduling of the series of experiments. Day 1 for each teacher has been synchronized in this table, while in reality experiments started on different days during the year. Table 4-7 shows the analysis of students’ responses to the questionnaires. The effect sizes in this table seem to suggest that the implementation of the series of pre-post quasi-experiments resulted in contradictory effects with respect to the outcome measure. For example, a comparison between the effect sizes of the interventions for teachers A and D shows effects that are opposite. Table 4-8 provides an overview of how students answered questions in the Peer Instruction condition. The bottom half of this table shows how the correctness of answering ConcepTests is influenced by Peer Instruction: how often are ConcepTests answered correctly after simply having thought about them individually, and how often after a round of Peer Instruction? The most interesting row in this table is the row showing wrong answers before Peer Instruction and right answers after Peer Instruction. Porter and colleagues (Porter, Bailey Lee, Simon, & Zingaro, 2011) call this group the Potential Learning Group (PLG). Apparently, this is the group that has learned most effectively from Peer Instruction. Teacher D has the lowest percentage of PLG cases, and teacher C the highest. Our Results Supplement accompanying the online version of this article provides more complete information on how students responded to each of the ConcepTests.
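
Two quantities in these tables can be pinned down with short definitions. The sketch below computes Cohen’s d on gain scores (the pooled-SD variant is an assumption, as the text does not specify one; the sign convention follows the note to Table 4-7) and flags the Potential Learning Group transition of Porter et al. (2011):

```python
import numpy as np

def cohens_d_on_gains(gains_pi, gains_accustomed):
    """Effect size on gain scores; positive favors Peer Instruction."""
    g1 = np.asarray(gains_pi, dtype=float)
    g2 = np.asarray(gains_accustomed, dtype=float)
    n1, n2 = len(g1), len(g2)
    pooled_sd = np.sqrt(((n1 - 1) * g1.var(ddof=1) +
                         (n2 - 1) * g2.var(ddof=1)) / (n1 + n2 - 2))
    return (g1.mean() - g2.mean()) / pooled_sd

def in_potential_learning_group(correct_before_pi, correct_after_pi):
    """PLG: wrong on the first vote, right after Peer Instruction."""
    return (not correct_before_pi) and correct_after_pi

print(in_potential_learning_group(False, True))   # True: a PLG case
```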

Table 4-4 Teacher questions

[Per teacher, educational level, condition, and lesson: the Inquiry-Cycle-Score (ICS), the percentage of questions answered by the teacher and by the students, and the physics content questions posed by each teacher in chronological order.*]

Note. *Each code refers to a type of teacher question, abbreviated as follows: r = recall, P = prediction, O = observation, and E = explanation. Questions in bold refer to teacher questions answered by a student; teacher questions that are not bold were answered by the teacher him-/herself. A darker shade of gray represents higher resemblance to the inquiry cycle.


Table 4-5 Teachers’ predictions of which condition has the highest learning gains

Teacher A (prediction: PI)
- Accustomed, helpful: “I could focus on what I consider to be important.”
- Accustomed, helpful: “This is the ‘softer’ group that is scared of formulas. They were glad to receive the material visualized.”
- Accustomed, helpful: “The ‘discovery’ of F = ma was a success.”
- Accustomed, detrimental: “They had to think less about the concepts than the other group.”
- PI, detrimental: “My feeling says that it went too fast and superficially.”
- PI, helpful: “It appealed to them and they were alert, as they wanted to know whether they were right.”
- PI, helpful: “The good tempo might also contribute to captivating them.”

Teacher B (prediction: no difference)
- Accustomed, helpful: “This group now has another teacher and will therefore have been a bit more attentive to what happened in class.”
- PI, helpful: “This group seems more attentive to me as something new happens in class: the voting devices.”
- PI, detrimental: “I considered this lesson to be very passive and preprogrammed.”
- PI, helpful: “The peak of these lessons was at the times the students had to convince each other.”

Teacher C (prediction: PI)
- Accustomed, detrimental: “Besides focusing on the concepts, I had this group focus on constructing and calculating, which results in their being educated more in line with what is expected from them later on at the exams. They were, however, less well-prepared to answer the posttest questions.”
- Accustomed, detrimental: “I almost entirely taught in a frontal manner.”
- Accustomed, detrimental: “There was no opportunity to check and discuss their homework, resulting in less reflection than normal.”
- PI, helpful: “As this group received two lessons that were completely focused on understanding Newtonian mechanics, they were better prepared to answer the posttest questions.”

Teacher D (prediction: Accustomed)
- Accustomed, helpful: “After having conducted a lesson with the voting system, I could quickly adapt the lesson to teach in the way my students are accustomed to.”
- PI: no factors stated.

Teacher E (prediction: PI)
- VWO Accustomed: no factors stated.
- VWO PI, helpful: “This group has better students.”
- VWO PI, helpful: “I believe these lessons went more smoothly.”
- VWO PI, helpful: “They seem to be less affected by puberty, causing the yields of lessons to be higher.”
- HAVO Accustomed: no factors stated.
- HAVO PI, helpful: “As the voting system forces every student to participate, there is no ‘opt-out’.”

Table 4-6 Scheduling of the lessons in both conditions

[table: for each class (year and level), the experimental condition and the days on which the pretest, intervention, posttest, and delayed posttest took place]

Note. pre = pretest; exp = intervention; post = posttest; d. post = delayed posttest; *The educational level HAVO has five grades and literally translates as "higher general continued education"; the educational level VWO has six grades and literally translates as "preparatory scientific education".

Table 4-7 Analysis of questionnaire responses

[table: questionnaire and assessment measures per teacher (A-E), compared across the Accustomed and Peer Instruction conditions, with effect sizes per measure]

Note. A darker shade refers to a higher score across the row for that measure; *Effect sizes are calculated with Cohen’s d based on gain scores. Positive effect sizes refer to effects in favor of the Peer Instruction condition.
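
For reference, one common formulation of Cohen’s d on gain scores that is consistent with this note (a sketch assuming a pooled standard deviation of the gain scores; the study’s exact computation is not restated here) is

$$ d = \frac{\bar{\Delta}_{\mathrm{PI}} - \bar{\Delta}_{\mathrm{Acc}}}{s_{\mathrm{pooled}}}, \qquad s_{\mathrm{pooled}} = \sqrt{\frac{(n_{\mathrm{PI}} - 1)\, s_{\Delta,\mathrm{PI}}^{2} + (n_{\mathrm{Acc}} - 1)\, s_{\Delta,\mathrm{Acc}}^{2}}{n_{\mathrm{PI}} + n_{\mathrm{Acc}} - 2}} $$

where $\Delta$ is a student’s posttest score minus his/her pretest score, so that a positive d favors the Peer Instruction condition.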


Table 4-8 Students’ answers given by using their voting devices during Peer Instruction

[table: students’ voting-device answers per teacher (A-E), including the transitions between wrong and right answers across the two voting rounds]

Note. The individual-thinking phase was only followed by the convince-your-neighbor phase when the question was initially answered correctly by between 35% and 70% of the students in the classroom.
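
To make the branching rule in this note concrete, the following is a minimal sketch in Python. The function name and phase labels are illustrative rather than taken from the study’s materials, and the two branches outside the 35-70% band follow the standard Peer Instruction recipe (Crouch & Mazur, 2001), not anything stated in the note itself.

```python
def next_phase(votes, correct_option):
    """Decide the next step after the individual-thinking vote.

    votes: options chosen by the students, e.g. ['A', 'C', 'B', ...]
    correct_option: the key of the right answer, e.g. 'B'
    (Illustrative sketch; names and the outer branches are assumptions.)
    """
    fraction_correct = votes.count(correct_option) / len(votes)
    if fraction_correct < 0.35:
        return 'revisit concept'         # too few correct: re-explain first
    if fraction_correct <= 0.70:
        return 'convince your neighbor'  # the band in which peer discussion pays off
    return 'move on'                     # broad consensus: confirm briefly and continue
```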


&4.4& &Conclusions and Discussion&

The research questions of this study relate to (1) understanding the relative effectiveness of two different teaching approaches to support learning with computer simulations in a whole-class setting, and (2) investigating the impact of pedagogical interaction during whole-class teaching with computer simulations on learning gains. From the data collected in the different classes we can see that, although we carefully implemented the Peer Instruction condition, there is wide variation in the pedagogical interaction that resulted from this implementation. Table 4-4 illustrates how diversely teachers can respond to an experimental implementation. With respect to the Inquiry-Cycle-Score (ICS), teacher E teaches more or less the same way in both conditions at both educational levels, whereas teacher D shows the most extreme difference: 45 in the Accustomed condition and only 8 in the Peer Instruction condition. It makes sense to have a higher ICS in the Accustomed condition than in the Peer Instruction condition, because with Peer Instruction, time is occupied by the researcher’s PowerPoint questions and the students’ processes of Peer Instruction and voting. Nevertheless, Table 4-4 clearly shows great disparity in the impact of the experimental implementation on interaction between teachers and students. A key role could be played by a sense of ownership of, or control over, the lesson. According to such a view, teacher E retains control over the lesson no matter the condition in which he is teaching, resulting in relatively comparable interaction patterns and learning effects. However, even though teacher D also has a relatively high ICS and SCS in the Accustomed condition, he does not exert control over the Peer Instruction lessons, apparently considering the researcher to be in charge. This sense of ownership could be crucial for how the pedagogical interaction unfolds, and thereby for its learning outcomes. Investigations of teaching approaches that use scripted lessons should take this possibility into account, as a scripted lesson might not lead to the desired behavioral changes in the teacher, precisely because this sense of ownership is lost.

Our findings presented in Table 4-7 illustrate the expression that ‘one experiment is no experiment at all’: differences found between conditions for certain teachers are the opposite for others. These findings suggest that our questionnaire and assessment data by themselves provide insufficient information to explain the students’ learning gains or lack thereof. This pre-post quasi-experimental research design is often used for investigating the learning effects of computer simulations in science education (Rutten et al., 2012): the effects of an intervention are measured by comparing several classes of students at different timepoints. The design is used in a variety of instructional settings: working with computer simulations individually, in small groups, or in a whole-class setting where the teacher or a student operates the simulation in front of the class. The present study shows that supplementing such a pre-post design with process analyses focused on teacher-student interaction can help yield a better understanding of the pedagogical mechanisms that are at play.

The effect sizes in Table 4-7 show that the learning effects are not consistently in favor of one condition. However, a comparison between Table 4-5 and Table 4-7 reveals that the teachers’ predictions and high effect sizes (|d| > .5) mostly coincide. This suggests that, after having finished the experiment, teachers have a better understanding of which teaching approach resulted in the highest learning gains than can be gleaned from questionnaires and assessments alone. We consider the reasonable accuracy of these teacher predictions to be support for supplementing the questionnaire and assessment results with our examination of lesson observations, in order to extend the analysis of products at different timepoints to an analysis of the learning process itself. The teachers’ predictions in Table 4-5 are elaborated by the factors that the teachers consider to influence learning gains. Several of the factors mentioned might generalize beyond this study to other studies in which learning effects are investigated under a researcher-imposed structure: teachers feeling less able to focus on what they consider important, and teachers experiencing the researcher-imposed condition as preprogrammed. A researcher who imposes a certain lesson structure can also be experienced as a kind of invader, possibly tempting the teacher to oppose the intervention out of rivalry. Teacher D’s high effect sizes in the Accustomed condition could be explained by his approach of combining the best of both worlds, as he explains: “After having conducted a lesson with the voting system, I could quickly adapt the lesson to teach in the way my students are accustomed to”.
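
As an illustrative reading of this coincidence criterion, a prediction can be said to ‘match’ when the large effect, if any, points to the predicted condition. The helper below is hypothetical; the study reports this comparison only qualitatively.

```python
def prediction_matches_effect(predicted, d, threshold=0.5):
    """Check a teacher's Table 4-5 prediction against a Table 4-7 effect size.

    predicted: 'PI', 'Accustomed', or 'no difference'
    d: Cohen's d on gain scores; positive values favor Peer Instruction
    (Hypothetical helper, not part of the study's analysis.)
    """
    if abs(d) <= threshold:  # no large effect in either direction
        return predicted == 'no difference'
    favored = 'PI' if d > 0 else 'Accustomed'
    return predicted == favored
```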

Considering the Peer Instruction condition, Table 4-8 provides insight into the extent to which the conceptual complexity of the ConcepTests was suitable for eliciting a change in learners’ knowledge. The most informative row of data in Table 4-8 is the transition from students answering a question wrong to answering it right after Peer Instruction. Such occurrences suggest that the ConcepTest hit the target: students needed just a little (Peer Instruction) support to achieve sufficient comprehension to answer the question correctly. An alternative explanation for a high number of wrong-to-right transitions is better student ability to perform Peer Instruction itself, as this draws on abilities such as understanding what someone else does not understand, verbalizing one’s own understanding, and convincing others of one’s own views.
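
Such transitions can be tallied from the two voting rounds on one ConcepTest along the lines of the sketch below; the code, the vote lists, and the option keys are illustrative, not the study’s actual analysis scripts.

```python
from collections import Counter

def tally_transitions(first_votes, second_votes, correct_option):
    """Count right/wrong transitions between the two votes on one ConcepTest."""
    counts = Counter()
    for before, after in zip(first_votes, second_votes):
        counts[('right' if before == correct_option else 'wrong',
                'right' if after == correct_option else 'wrong')] += 1
    return counts

# Example: tally_transitions(['A', 'B', 'B'], ['B', 'B', 'B'], 'B')
# -> Counter({('right', 'right'): 2, ('wrong', 'right'): 1})
```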

In our review study (Rutten et al., 2012) we concluded that most of the studies that we reviewed “investigated the effects of computer simulations on learning ceteris paribus, consequently ignoring the influence of the teacher, the curriculum, and other such pedagogical factors”. Research can be fruitful when it allows for description of the results at a higher level of abstraction, and in turn for practical applicability by translation to other concrete situations (Rol & Cartwright, 2012). Our present study shows that, to understand the effect of computer simulations as a means of instruction, it is important to go deeper than learning outcomes alone. When the learning situation is viewed only at this abstract level, too many factors of the context at hand are left out of account, resulting in an abstract principle about computer simulations that cannot be concretized in other contexts (Rol & Cartwright, 2012). We therefore recommend that further research on the learning effects of computer simulations, and of technology in general, take the concrete factors of the pedagogical context into account and incorporate these into research designs.

According to Crasnow (2012), social-scientific research of the past decades shows a methodological shift: from investigating specific cases to a more statistical approach, based on the idea that this allows for finding principles that are generally applicable. The assumed usefulness of such a statistical approach is that finding effects also brings the possible causes of these effects into better focus. However, the decontextualization of the research participants that is necessary for determining such general principles limits the insight those principles can provide. Furthermore, it is possible that the effects found do not even exist as decontextualized principles, if they are partially caused by the specific characteristics of the context. Crasnow (2012) therefore argues that to understand reality it is not only necessary to search for general, decontextualized principles, but also to supplement this search with observations of participants within their context. Paraphrasing Crasnow (2012), this means not only searching for the effects of causes, but also for the causes of effects. The widely used research design on which the pre-post quasi-experiment of this study was based is used by researchers to find effects, often in the sense of generally applicable principles. By replicating this research design several times and analyzing the interaction at a deeper level, we showed that the effects found per experiment do not qualify for general applicability. Our analyses of the pedagogical interaction allowed us to search for the causes of effects: for the variables that influenced the effects in the contexts at hand.

&4.5& &Suggestions for Future Research&

In the present study, we supplemented the pre-post research design that is often used in this field in two ways: by replicating it five times, and by not only measuring learning at different timepoints but also zooming in on the learning processes themselves. For testing the effectiveness of pedagogical approaches, it is beneficial to conduct such process analyses of teacher-student interaction as a supplement to measuring learning outcomes with pre- and posttests. We suggest the following working hypotheses as possible mechanisms of pedagogical interaction that influenced the learning processes in the present study:

1. When more structure is imposed in one condition than in the other, the teacher can lose a sense of ownership in the structured condition, possibly causing the teacher to take a more passive stance.

2. When the approach in one condition is less familiar to the teacher than the approach in the other, this difference in familiarity can influence how the less familiar pedagogical approach is executed.

3. Both imposed structure and unfamiliarity can cause the teacher to resist implementing a pedagogical approach out of a sense of rivalry.

We recommend that researchers conducting similar studies should not only describe the pedagogical intervention itself, but should also thoroughly elaborate on how their implementation strategy takes into account influential mechanisms of pedagogical interaction.

&4.6& &Acknowledgement&

The authors would like to thank Dr. M. E. G. M. (Menno) Rol for his valuable feedback on an earlier version of this article.

&Chapter 5&
&Conclusions and Discussion&

In this chapter we answer the research questions, interrelate the studies conducted, and provide suggestions for future research.

&5.1& &What have we Learned about Teaching with Simulations?&

The studies in this dissertation show that whole-class teaching can benefit from computer simulations. This instructional setting is suitable for the application of simulations as support for a student-centered and inquiry-based approach. Our studies shed light on the pedagogical context of the application of simulations by zooming in on the interaction between the teachers and their students. The results of our studies allow us to answer our research questions. Below, we begin by answering the research questions of the separate studies, and conclude by answering the overarching question on which this dissertation focuses.

· How can traditional science education be enhanced by the application of computer simulations? (Chapter 2)

Instruction can be successfully enhanced by the application of computer simulations. Most of the studies we reviewed in which computer simulations were used as a replacement for or enhancement of lectures or practicals showed improved learning results for the simulation condition. Besides positive effects on learning, studies also reported improvements in motivation and attitude.

Computer simulations can be used to support inquiry-based instruction. The studies we reviewed show moderate effects on a variety of variables. A possible explanation for these effects being moderate is that inquiry learning requires both learning about the domain at hand and acquiring inquiry learning skills. Improved visualizations do not necessarily translate into better learning. For example, Moreno and Mayer (2004) found that visualizing a simulation via a head-mounted display led to increased immersion in the learning environment, but this did not in turn result in increased learning outcomes.

· How are computer simulations best used in order to support learning processes and outcomes? (Chapter 2)

It is particularly effective to use simulations in preparation for laboratory experiments. In the studies we reviewed, we found high effect sizes when laboratory activities were replaced or enhanced by computer simulations. Although there is extensive research literature on learning with computer simulations, relatively little research focuses on how to teach with these tools. In most research, the impact of classroom scenarios and teacher guidance is ignored.

· How do physics teachers use computer simulations in whole-class teaching? (Chapter 3)

Use of computer simulations in whole-class teaching can range from illustrating theory to supporting inquiry learning processes. The latter is the focus of our method of analyzing the interaction between the teachers and their students: to what extent is this interaction student-centered, and does it resemble the inquiry cycle? We rarely observed these two aspects in combination, that is, teacher-student interaction that is both student-centered and resembles the inquiry cycle. Perhaps this means that these aspects are difficult to combine; possibly, this combination is something teacher training could explicitly address. We measured students’ attitudes about how much teaching with computer simulations contributes to their motivation and insight. These attitudes relate to the student-centeredness of a teaching approach and its inquiry-based character. The extent to which there is congruence regarding learning goals between a teacher and his/her students relates to that teacher’s endorsement of teaching according to an inquiry-based approach.

· How does an inquiry-based teaching approach support learning with computer simulations in a whole-class setting? (Chapter 4)

Our comparison of an inquiry-based approach with the teachers’ accustomed approach to teaching with computer simulations shows that learning gains can be achieved. However, our series of pre-post quasi-experiments did not consistently favor one of the approaches. To better understand these contradictory pre-post findings, we analyzed the teacher-student interactions during the interventions, in order to turn this black box into glass.

· How does pedagogical interaction in whole-class teaching with computer simulations influence learning gains? (Chapter 4)

The analyses of the pedagogical interaction between the teachers and their students revealed that the teachers responded in diverse ways to our intervention. This diversity was also related to a varied impact on learning gains. The process analyses of the teacher-student interaction during each of the replicated pre-post quasi-experiments taught us that such analyses provide information that is necessary for a deeper understanding of pedagogical interaction and learning. A contextual mechanism that we would not have noticed without such analyses is the diversity of ways in which teachers perform a scripted lesson. Our method of interaction analysis supports understanding the impact of this mechanism. Further research is required to better understand the specific characteristics of such mechanisms. These could, for example, be related to the teacher’s loss of a sense of ownership, or to feeling tempted to oppose an intervention out of rivalry.

The following question spans the research questions of this dissertation:

· How can whole-class science teaching benefit from computer simulations?

The studies in this dissertation support taking several steps toward answering this overarching research question:

1. Even though instruction can be enhanced by using computer simulations, the pedagogical context of whole-class teaching has rarely been investigated in the research literature.

2. Our studies show that teachers can use computer simulations to support student-centered inquiry learning at the whole-class level.

3. Students’ learning gains between different timepoints provide insufficient information to understand whole-class processes of teaching and learning.

4. Our method of analyzing whole-class teaching by scrutinizing teacher-student interaction provides valuable information.

5. Supplementing a pre-post research design with process analyses of teacher-student interaction can help yield better understanding of the outcomes of a quasi-experimental intervention focused on comparing pedagogical approaches.

6. Our improved understanding allows us to provide suggestions on how research into pedagogical approaches can be enhanced. It also makes us realize the complexity of such research, and that there are no one-dimensional answers in these discussions.

&5.2& &Interrelating the Studies Conducted&

This dissertation is based on three main studies: a review study, an observational study, and a quasi-experimental study. In our review study of 51 publications on the learning effects of computer simulations in science education, we learned how such research is generally set up and what can be concluded from these studies when they are taken together. In most of these studies, learning effects are measured by determining student achievement at certain timepoints. In our observational study we developed a method of analysis that zooms in on the learning processes by focusing on teacher-student interaction. In our quasi-experimental study we compared two teaching approaches and zoomed in on the teacher-student interaction using the method that we developed in our observational study. By replicating the pre-post quasi-experiment five times, it became clear that the results of different runs could differ or even contradict each other. Analyzing the interaction between the teachers and their students during the intervention helped us to better understand the contextual mechanisms at play.

Nowadays, a lot of value is attached to the label ‘evidence-based’ when implementing educational innovations. Those responsible for implementing these innovations consider such evidence to be more valuable when it is ecologically valid, meaning that its effectiveness has been established beyond laboratory settings. Moreover, it should be published in peer-reviewed articles in renowned journals. The researchers responsible for producing and publishing this evidence take the research designs that they encounter in such journals as inspiration for setting up their own designs. However, the studies on which they base their research designs are often conducted only once. By performing our quasi-experimental research within existing teaching practices we achieved our goal of ecological validity. However, by also repeating the experiment several times, the limited replicability of such a research approach came to light. As the research on which evidence-based educational innovations are based is rarely replicated, it seems appropriate to interpret the predicate ‘evidence-based’ more cautiously.

&5.3& &Suggestions for Future Research&

Our research can inspire future research on teaching with simulations, but there are also possibilities for applying our findings to other domains. Nowadays, there is a plethora of technologies that can be introduced into the classroom: hardware such as mobile phones and tablets, software such as voting systems and social networks, and developments such as personalization and ubiquitous learning. The question that the educational implementation of innovations boils down to is: now that we have the technology, how can it be used effectively in teaching? Are there ways to support whole-class teaching, learning among peers, or learning individually? The studies in this dissertation can serve both as an illustration of the complexity of these questions and as an inspiration for other researchers to tackle related research questions.

From the perspective of simulations: the studies we reviewed mostly measured short-term learning effects, so incorporating delayed posttests seems a relevant suggestion for future research. It could also be interesting to investigate possible supportive roles for simulations within the instructional format of the Flipped Classroom. The purpose of this approach is to give students a more active role in the learning process by relocating certain learning activities outside class time, specifically those that do not require face-to-face time with a teacher (Sams & Bergmann, 2013). An example of such relocation is recording the teacher’s direct instruction on video, allowing students to watch it at home at their own pace. During class time, direct instruction is reduced to allow for more student-centered learner activities: answering questions, discussions, presentations, and so forth. These can also include working with simulations. Computer simulations are not only an appropriate tool to support students’ preparatory study efforts, but can also be used during class time activities, for example, on an iPad. Our method of analyzing the teacher-student interaction can be relevant for studying teaching practices with gaming or modeling, for example. It can also be extended to encompass the analysis of student-student interaction during Peer Instruction.

In our review study (Rutten et al., 2012), we discussed not only computer simulations that are shown on screen, but also simulations that are visualized by other media, for example, virtual reality via a head-mounted display. In the observational and experimental studies in this dissertation, all participating teachers showed their simulations in front of the class using an interactive whiteboard or beamer projection. One can wonder whether additional learning gains are achievable by using other technologies to visualize the simulation, that is, by not restricting simulation visualization to the classroom’s frontal screen. In our review study (Rutten et al., 2012) we concluded that interacting with a simulation while using a head-mounted display or stereoscopic glasses does not lead to improved learning. However, several studies using other technologies, namely robots and a semi-immersive mixed-reality environment, showed more promising learning results. One possible explanation is that in those cases the technology brings not only a different way of perceiving, but also a different way of relating to and moving through one’s surroundings, giving the virtual or mixed reality a genuinely embodied character. Investigating the effects of immersion or embodied cognition in augmented or mixed reality, and how to embed such technologies in a pedagogical context, could also be an interesting path for future research.

&References&

Adams, W. K., Paulson, A., & Wieman, C. E. (2008). What levels of guidance promote engaged exploration with interactive simulations? AIP Conference Proceedings, 1064, 59-62.

Ainsworth, S., & VanLabeke, N. (2004). Multiple forms of dynamic representation. Learning and Instruction, 14(3), 241-255.

Ajzen, I. (2001). Nature and operation of attitudes. Annual Review of Psychology, 52(1), 27-58.

Akpan, J. P. (2001). Issues associated with inserting computer simulations into biology instruction: a review of the literature. Electronic Journal of Science Education, 5(3). Retrieved from http://ejse.southwestern.edu/article/viewArticle/7656/5423

Alfieri, L., Brooks, P. J., Aldrich, N. J., & Tenenbaum, H. R. (2011). Does discovery-based instruction enhance learning? Journal of Educational Psychology, 103(1), 1-18.

Amory, A., Naicker, K., Vincent, J., & Adams, C. (1999). The use of computer games as an educational tool: identification of appropriate game types and game elements. British Journal of Educational Technology, 30(4), 311-321.

Apperson, J. M., Laws, E. L., & Scepansky, J. A. (2006). The impact of presentation graphics on students’ experience in the classroom. Computers & Education, 47(1), 116-126.

Baltzis, K. B., & Koukias, K. D. (2009). Using laboratory experiments and circuit simulation IT tools in an undergraduate course in analog electronics. Journal of Science Education and Technology, 18(6), 546-555.

Barab, S. A., Scott, B., Siyahhan, S., Goldstone, R., Ingram-Goble, A., Zuiker, S. J., et al. (2009). Transformational play as a curricular scaffold: Using videogames to support science education. Journal of Science Education and Technology, 18(4), 305-320.

Barko, T., & Sadler, T. D. (2013). Practicality in virtuality: Finding student meaning in video game education. Journal of Science Education and Technology, 22(2), 124-132.

Baser, M. (2006). Effects of conceptual change and traditional confirmatory simulations on pre-service teachers’ understanding of direct current circuits. Journal of Science Education and Technology, 15(5), 367-381.

Beauchamp, G., & Kennewell, S. (2010). Interactivity in the classroom and its impact on learning. Computers & Education, 54(3), 759-766.

Bell, R. L., & Trundle, K. C. (2008). The use of a computer simulation to promote scientific conceptions of moon phases. Journal of Research in Science Teaching, 45(3), 346-372.

Bell, T., Urhahne, D., Schanze, S., & Ploetzner, R. (2010). Collaborative inquiry learning: models, tools, and challenges. International Journal of Science Education, 32(3), 349-377.

Betrancourt, M. (2005). The animation and interactivity principles in multimedia learning. The Cambridge Handbook of Multimedia Learning, 287-296.

Birchfield, D., & Megowan-Romanowicz, C. (2009). Earth science learning in SMALLab: A design experiment for mixed reality. International Journal of Computer-Supported Collaborative Learning, 4(4), 403-421.

Björkman, J., & Tiemann, R. (2013). Teaching patterns of scientific inquiry: A video study of chemistry lessons in Germany and Sweden. Science Education Review Letters, 2013, 1-7.

Blake, C., & Scanlon, E. (2007). Reconsidering simulations in science education at a distance: Features of effective use. Journal of Computer Assisted Learning, 23(6), 491-502.

Blasco-Arcas, L., Buil, I., Hernández-Ortega, B., & Sese, F. J. (2013). Using clickers in class. The role of interactivity, active collaborative learning and engagement in learning performance. Computers & Education, 62(0), 102-110.

Brant, G., Hooper, E., & Sugrue, B. (1991). Which comes first the simulation or the lecture? Journal of Educational Computing Research, 7(4), 469-481.

Campbell, T., Wang, S. K., Hsu, H.-Y., Duffy, A. M., & Wolf, P. G. (2010). Learning with web tools, simulations, and other technologies in science classrooms. Journal of Science Education and Technology, 19(5), 505-511.

Chang, H. Y. (2013). Teacher guidance to mediate student inquiry through interactive dynamic visualizations. Instructional Science, 41(5), 895-920.

Chang, K. E., Chen, Y. L., Lin, H. Y., & Sung, Y. T. (2008). Effects of learning support in simulation-based physics learning. Computers & Education, 51(4), 1486-1498.

Chen, R.-J. (2010). Investigating models for preservice teachers’ use of technology to support student-centered learning. Computers & Education, 55(1), 32-42.

Chen, S. (2010). The view of scientific inquiry conveyed by simulation-based virtual laboratories. Computers & Education, 55(3), 1123-1130.

Chen, S., Lo, H.-C., Lin, J.-W., Liang, J.-C., Chang, H.-Y., Hwang, F.-K., et al. (2012). Development and implications of technology in reform-based physics laboratories. Physical Review Special Topics – Physics Education Research, 8(2), 020113.

Clark, D., & Jorde, D. (2004). Helping students revise disruptive experientially supported ideas about thermodynamics: Computer visualizations and tactile models. Journal of Research in Science Teaching, 41(1), 1-23.

Crasnow, S. (2012). The role of case study research in political science: Evidence for causal claims. Philosophy of Science, 79(5), 655-666.

Creemers, B. P. M., & Kyriakides, L. (2006). Critical analysis of the current approaches to modelling educational effectiveness: The importance of establishing a dynamic model. School Effectiveness and School Improvement: An International Journal of Research, Policy and Practice, 17(3), 347-366.

Crouch, C. H., Fagen, A. P., Callan, J. P., & Mazur, E. (2004). Classroom demonstrations: Learning tools or entertainment? American Journal of Physics, 72(6), 835-838.

Crouch, C. H., & Mazur, E. (2001). Peer Instruction: ten years of experience and results. American Journal of Physics, 69(9), 970-977.

Crouch, C. H., Watkins, J., Fagen, A. P., & Mazur, E. (2007). Peer Instruction: Engaging students one-on-one, all at once. Research-Based Reform of University Physics, 1(1), 40-95.

Dalgarno, B., Bishop, A. G., Adlong, W., & Bedgood Jr, D. R. (2009). Effectiveness of a virtual laboratory as a preparatory resource for distance education chemistry students. Computers & Education, 53(3), 853-865.

Davies, C. (2002). Student engagement with simulations: a case study. Computers & Education, 39(3), 271-282.

de Jong, T. (2006). Technological advances in inquiry learning. Science, 312(5773), 532-533.

de Jong, T., Linn, M. C., & Zacharia, Z. C. (2013). Physical and virtual laboratories in science and engineering education. Science, 340(6130), 305-308.

de Jong, T., & van Joolingen, W. R. (1998). Scientific discovery learning with computer simulations of conceptual domains. Review of Educational Research, 68(2), 179-201.

Dexter, S., & Riedel, E. (2003). Why improving preservice teacher educational technology preparation must go beyond the college’s walls. Journal of Teacher Education, 54(4), 334-346.

Dillenbourg, P. (2008). Integrating technologies into educational ecosystems. Distance Education, 29(2), 127-140.

Donnelly, D., O’Reilly, J., & McGarr, O. (2013). Enhancing the student experiment experience: visible scientific inquiry through a virtual chemistry laboratory. Research in Science Education, 43(4), 1571-1592.

Dori, Y. J., & Belcher, J. (2005). How does technology-enabled active learning affect undergraduate students’ understanding of electromagnetism concepts? Journal of the Learning Sciences, 14(2), 243-279.

Duran, M. J., Gallardo, S., Toral, S. L., Martinez-Torres, R., & Barrero, F. J. (2007). A learning methodology using Matlab/Simulink for undergraduate electrical engineering courses attending to learner satisfaction outcomes. International Journal of Technology and Design Education, 17(1), 55-73.

Education Resources Information Center. (2014). Retrieved 17.04.14.: http://www.eric.ed.gov

Field, A. P. (2009). Discovering statistics using SPSS: SAGE Publications, Ltd.

Finkelstein, N. D., Adams, W. K., Keller, C. J., Kohl, P. B., Perkins, K. K., Podolefsky, N. S., et al. (2005). When learning about the real world is better done virtually: a study of substituting computer simulations for laboratory equipment. Physical Review Special Topics – Physics Education Research, 1(10103), 1-8.

Finkelstein, N. D., Adams, W. K., Keller, C. J., Perkins, K. K., & Wieman, C. E. (2006). High-tech tools for teaching physics: The physics education technology project. MERLOT Journal of Online Learning and Teaching, 2(3), 110-120.

Fogleman, J., McNeill, K. L., & Krajcik, J. (2011). Examining the effect of teachers’ adaptations of a middle school science inquiry-oriented curriculum unit on student learning. Journal of Research in Science Teaching, 48(2), 149-169.

Force Concept Inventory. (2014). Retrieved 17.04.14.: http://modeling.asu.edu/R&E/Research.html

Fund, Z. (2007). The effects of scaffolded computerized science problem-solving on achievement outcomes: A comparative study of support programs. Journal of Computer Assisted Learning, 23(5), 410-424.

Gazit, E., Yair, Y., & Chen, D. (2005). Emerging conceptual understanding of complex astronomical phenomena by using a virtual solar system. Journal of Science Education and Technology, 14(5), 459-470.

Gelbart, H., Brill, G., & Yarden, A. (2009). The impact of a web-based research simulation in bioinformatics on students’ understanding of genetics. Research in Science Education, 39(5), 725-751.

Gibbons, N. J., Evans, C., Payne, A., Shah, K., & Griffin, D. K. (2004). Computer simulations improve university instructional laboratories. Cell Biology Education, 3(4), 263-269.

Gick, M. L., & Holyoak, K. J. (1980). Analogical problem solving. Cognitive Psychology, 12(3), 306-355.

Gick, M. L., & Holyoak, K. J. (1983). Schema induction and analogical transfer. Cognitive Psychology, 15(1), 1-38.

Gokhale, A. A. (1996). Effectiveness of computer simulation for enhancing higher order thinking. Journal of Industrial Teacher Education, 33(4), 36-46.

Goldstone, R. L., & Son, J. Y. (2005). The transfer of scientific principles using concrete and idealized simulations. Journal of the Learning Sciences, 14(1), 69-110.

Gonzalez-Cruz, J., Rodriguez-Sotres, R., & Rodriguez-Penagos, M. (2003). On the convenience of using a computer simulation to teach enzyme kinetics to undergraduate students with biological chemistry-related curricula. Biochemistry and Molecular Biology Education, 31(2), 93-101.

Hatherly, P. A., Jordan, S. E., & Cayless, A. (2009). Interactive screen experiments – Innovative virtual laboratories for distance learners. European Journal of Physics, 30(4), 751-762.

Hennessy, S. (2006). Integrating technology into teaching and learning of school science: A situated perspective on pedagogical issues in research. Studies in Science Education, 42(1), 1-48.

Hennessy, S. (2011). The role of digital artefacts on the interactive whiteboard in supporting classroom dialogue. Journal of Computer Assisted Learning, 27(6), 463-489.

Hennessy, S., Wishart, J., Whitelock, D., Deaney, R., Brawn, R., la Velle, L., et al. (2007). Pedagogical approaches for technology-integrated science teaching. Computers & Education, 48(1), 137-152.

Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force concept inventory. The Physics Teacher, 30(3), 141-158.

Höffler, T. N., & Leutner, D. (2007). Instructional animation versus static pictures: A meta-analysis. Learning and Instruction, 17(6), 722-738.

Hofstein, A., & Lunetta, V. N. (2004). The laboratory in science education: Foundations for the twenty-first century. Science Education, 88(1), 28-54.

Hsu, Y.-S., & Thomas, R. A. (2002). The impacts of a web-aided instructional simulation on science learning. International Journal of Science Education, 24(9), 955-979.

ISI Web of Knowledge. (2014). Retrieved 17.04.14.: http://www.isiknowledge.com

Jimoyiannis, A., & Komis, V. (2001). Computer simulations in physics teaching and learning: A case study on students’ understanding of trajectory motion. Computers & Education, 36(2), 183-204.

Kaiser, H. (1974). An index of factorial simplicity. Psychometrika, 39(1), 31-36.

Karlsson, G., Ivarsson, J., & Lindström, B. (2013). Agreed discoveries: Students’ negotiations in a virtual laboratory experiment. Instructional Science, 41(3), 455-480.

Kester, L., Kirschner, P., & van Merrienboer, J. (2004). Information presentation and troubleshooting in electrical circuits. International Journal of Science Education, 26(2), 239-256.

Ketelhut, D. J., & Nelson, B. C. (2010). Designing for real-world scientific inquiry in virtual environments. Educational Research, 52(2), 151-167.

Ketelhut, D. J., Nelson, B. C., Clarke, J., & Dede, C. (2010). A multi-user virtual environment for building and assessing higher order inquiry skills in science. British Journal of Educational Technology, 41(1), 56-68.

Kewley, L. (1998). Peer collaboration versus teacher-directed instruction: How two methodologies engage students in the learning process. Journal of Research in Childhood Education, 13(1), 27-32.

Khan, S. (2011). New pedagogies on teaching science with computer simulations. Journal of Science Education and Technology, 20(3), 215-232.

Kiboss, J. K., Ndirangu, M., & Wekesa, E. W. (2004). Effectiveness of a computer-mediated simulations program in school biology on pupils’ learning outcomes in cell theory. Journal of Science Education and Technology, 13(2), 207-213.

King, A. (1990). Enhancing peer interaction and learning in the classroom through reciprocal questioning. American Educational Research Journal, 27(4), 664-687.

King, A. (1992). Facilitating elaborative learning through guided student-generated questioning. Educational Psychologist, 27(1), 111-126.

Kirschner, P. A., Sweller, J., & Clark, R. E. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41(2), 75-86.

Kreft, I. G. (1998). Introducing multilevel modeling: Sage.

Laakso, M.-J., Myller, N., & Korhonen, A. (2009). Comparing learning performance of students using algorithm visualizations collaboratively on different engagement levels. Educational Technology & Society, 12(2), 267-282.

Lazonder, A. W., Wilhelm, P., & van Lieburg, E. (2009). Unraveling the influence of domain knowledge during simulation-based inquiry learning. Instructional Science, 37(5), 437-451.

Lee, H. (2007). Instructional design of web-based Simulations for learners with different levels of spatial ability. Instructional Science, 35(6), 467-479.

Lee, Y.-F., & Guo, Y. (2008). Explore effective use of computer simulations for physics education. Journal of Computers in Mathematics and Science Teaching, 27(4), 443-466.

Limniou, M., Papadopoulos, N., Giannakoudakis, A., Roberts, D., & Otto, O. (2007). The integration of a viscosity simulator in a chemistry laboratory. Chemistry Education Research and Practice, 8(2), 220-231.

Limniou, M., Papadopoulos, N., & Whitehead, C. (2009). Integration of simulation into pre-laboratory chemical course: Computer cluster versus WebCT. Computers & Education, 52(1), 45-52.

Lindgren, R., & Schwartz, D. (2009). Spatial learning and computer simulations in science. International Journal of Science Education, 31(3), 419-438.

Linn, M. C., Lee, H.-S., Tinker, R., Husic, F., & Chiu, J. L. (2006). Inquiry learning: teaching and assessing knowledge integration in science. Science, 313(5790), 1049-1050.

Loeblein, P. (2014). Concept questions for Physics using PhET (Inquiry Based). Retrieved 17.04.14.: http://phet.colorado.edu/files/activities/3112/Loeblein%20physics%20clicker%20questions.pptx

Lunetta, V. N., & Hofstein, A. (1981). Simulations in science education. Science Education, 65(3), 243-252.

Maeng, J. L., Mulvey, B. K., Smetana, L. K., & Bell, R. L. (2013). Preservice teachers’ TPACK: Using technology to support inquiry instruction. Journal of Science Education and Technology, 22(6), 838-857.

Manlove, S., Lazonder, A. W., & de Jong, T. (2006). Regulative support for collaborative scientific inquiry learning. Journal of Computer Assisted Learning, 22(2), 87-98.

Manlove, S., Lazonder, A. W., & de Jong, T. (2009). Collaborative versus individual use of regulative software scaffolds during scientific inquiry learning. Interactive Learning Environments, 17(2), 105-117.

Manning, C. D., & Schütze, H. (1999). Foundations of statistical natural language processing: MIT Press.

Maor, D., & Fraser, B. (2005). An online questionnaire for evaluating students’ and teachers’ perceptions of constructivist multimedia learning environments. Research in Science Education, 35(2), 221-244.

Marshall, J. A., & Young, E. S. (2006). Preservice teachers’ theory development in physical and simulated environments. Journal of Research in Science Teaching, 43(9), 907-937.

Martinez-Jiménez, P., Pontes-Pedrajas, A., Polo, J., & Climent-Bellido, M. S. (2003). Learning in chemistry with virtual laboratories. Journal of Chemical Education, 80(3), 346-352.

Marzano, R. J. (1998). A theory-based meta-analysis of research on instruction: Mid-continent Regional Educational Laboratory Aurora, CO.

Mayer, R. E. (2004). Should there be a three-strikes rule against pure discovery learning? American Psychologist, 59(1), 14-19.

McCrory, R. (2008). Science, technology, and teaching: The topic-specific challenges of TPCK in science. Handbook of Technological Pedagogical Content Knowledge (TPCK) for Educators, 193-206.

McKagan, S. B., Handley, W., Perkins, K. K., & Wieman, C. E. (2009). A research-based curriculum for teaching the photoelectric effect. American Journal of Physics, 77(1), 87-94.

Meelissen, M. R. M., & Drent, M. (2009). Nederland in TIMSS-Advanced: leerprestaties van 6 vwo-leerlingen in Wiskunde B en Natuurkunde. Enschede: Universiteit Twente, Vakgroep Onderwijsorganisatie en -management.

Meir, E., Perry, J., Stal, D., Maruca, S., & Klopfer, E. (2005). How effective are simulated molecular-level experiments for teaching diffusion and osmosis? Cell Biology Education, 4(FALL), 235-248.

Mikropoulos, T. A., & Natsis, A. (2011). Educational virtual environments: A ten-year review of empirical research (1999-2009). Computers & Education, 56(3), 769-780.

Mitnik, R., Recabarren, M., Nussbaum, M., & Soto, A. (2009). Collaborative robotic instruction: A graph teaching experience. Computers & Education, 53(2), 330-342.

Moreno, R., & Mayer, R. E. (2004). Personalized messages that promote science learning in virtual environments. Journal of Educational Psychology, 96(1), 165-173.

Mucherah, W. M. (2003). The influence of technology on the classroom climate of social studies classrooms: a multidimensional approach. Learning Environments Research, 6(1), 37-57.

Nickerson, J. V., Corter, J. E., Esche, S. K., & Chassapis, C. (2007). A model for evaluating the effectiveness of remote engineering laboratories and simulations in education. Computers & Education, 49(3), 708-725.

Njoo, M., & de Jong, T. (1993). Exploratory learning with a computer simulation for control theory: Learning processes and instructional support. Journal of Research in Science Teaching, 30(8), 821-844.

Pallant, A., & Tinker, R. (2004). Reasoning with atomic-scale molecular dynamic models. Journal of Science Education and Technology, 13(1), 51-66.

Papadouris, N., & Constantinou, C. P. (2009). A methodology for integrating computer-based learning tools in science curricula. Journal of Curriculum Studies, 41(4), 521-538.

Papastergiou, M. (2009). Digital Game-Based Learning in high school Computer Science education: Impact on educational effectiveness and student motivation. Computers & Education, 52(1), 1-12.

Pelgrum, W. J. (2001). Obstacles to the integration of ICT in education: results from a worldwide educational assessment. Computers & Education, 37(2), 163-178.

Physics Education Technology website. (2014). Retrieved 17.04.14.: http://phet.colorado.edu

Pintrich, P. R., & de Groot, E. V. (1990). Motivational and self-regulated learning components of classroom academic performance. Journal of Educational Psychology, 82(1), 33-40.

Pintrich, P. R., Smith, D., García, T., & McKeachie, W. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, Michigan, 10-29.

Ploetzner, R., Lippitsch, S., Galmbacher, M., Heuer, D., & Scherrer, S. (2009). Students’ difficulties in learning from dynamic visualisations and how they may be overcome. Computers in Human Behavior, 25(1), 56-65.

Porter, L., Bailey Lee, C., Simon, B., & Zingaro, D. (2011). Peer Instruction: Do students really learn from peer discussion in computing? Paper presented at the Proceedings of the seventh international workshop on Computing education research.

Powell, J. V., & Lee, S. (2004). Teaching techniques and computerized simulation in early childhood classrooms. Journal of Educational Technology Systems, 32(1), 71-100.

Pyatt, K., & Sims, R. (2012). Virtual and physical experimentation in inquiry-based science labs: attitudes, performance and access. Journal of Science Education and Technology, 21(1), 133-147.

Reid, D. J., Zhang, J., & Chen, Q. (2003). Supporting scientific discovery learning in a simulation environment. Journal of Computer Assisted Learning, 19(1), 9-20.

Riess, W., & Mischo, C. (2010). Promoting systems thinking through biology lessons. International Journal of Science Education, 32(6), 705-725.

Rol, M., & Cartwright, N. (2012). Warranting the use of causal claims: a non-trivial case for interdisciplinarity. Theoria, 27(2), 189-202.

Roth, W. M., McRobbie, C. J., Lucas, K. B., & Boutonné, S. (1997). Why may students fail to learn from demonstrations? A social practice perspective on learning in physics. Journal of Research in Science Teaching, 34(5), 509-533.

Rutten, N., van der Veen, J. T., & van Joolingen, W. R. (submitted). Inquiry-based teaching with computer simulations in physics.

Rutten, N., van Joolingen, W. R., & van der Veen, J. T. (2012). The learning effects of computer simulations in science education. Computers & Education, 58(1), 136-153.

Saab, N., van Joolingen, W. R., & van Hout-Wolters, B. (2007). Supporting communication in a collaborative discovery learning environment: The effect of instruction. Instructional Science, 35(1), 73-98.

Salinas, M. F. (2008). From Dewey to Gates: A model to integrate psychoeducational principles in the selection and use of instructional technology. Computers & Education, 50(3), 652-660.

Sams, A., & Bergmann, J. (2013). Flip your students’ learning. Educational Leadership, 70(6), 16-20.

Scardamalia, M., & Bereiter, C. (1991). Higher levels of agency for children in knowledge building: A challenge for the design of new knowledge media. The Journal of the Learning Sciences, 1(1), 37-68.

Schrum, L. (1999). Technology professional development for teachers. Educational Technology Research and Development, 47(4), 83-90.

Schrum, L., Thompson, A., Maddux, C., Sprague, D., Bull, G., & Bell, L. (2007). Editorial: Research on the effectiveness of technology in schools: The roles of pedagogy and content. Contemporary Issues in Technology and Teacher Education, 7(1), 456-460.

Scopus. (2014). Retrieved 17.04.14.: http://www.scopus.com

Shieh, R. S., Chang, W. J., & Tang, J. (2010). The impact of implementing Technology-Enabled Active Learning (TEAL) in university physics in Taiwan. Asia-Pacific Education Researcher, 19(3), 401-415.

Smetana, L. K., & Bell, R. L. (2012). Computer simulations to support science instruction and learning: a critical review of the literature. International Journal of Science Education, 34(9), 1337-1370.

Stern, L., Barnea, N., & Shauli, S. (2008). The effect of a computerized simulation on middle school students’ understanding of the kinetic molecular theory. Journal of Science Education and Technology, 17(4), 305-315.

Strijbos, J.-W. (2009). A multidimensional coding scheme for VMT. In G. Stahl (Ed.), Studying Virtual Math Teams (Vol. 11, pp. 399-419): Springer US.

Strijbos, J.-W., & Stahl, G. (2007). Methodological issues in developing a multi-dimensional coding procedure for small-group chat communication. Learning and Instruction, 17(4), 394-404.

Swaak, J., de Jong, T., & van Joolingen, W. R. (2004). The effects of discovery learning and expository instruction on the acquisition of definitional and intuitive knowledge. Journal of Computer Assisted Learning, 20(4), 225-234.

Tall, D. (2008). The transition to formal thinking in mathematics. Mathematics Education Research Journal, 20(2), 5-24.

Thalheimer, W., & Cook, S. (2002). How to calculate effect sizes from published research articles: A simplified methodology. Retrieved 08.06.10: http://work-learning.com/effect_sizes.htm

Thornton, R. K., & Sokoloff, D. R. (1998). Assessing student learning of Newton’s laws: the force and motion conceptual evaluation and the evaluation of active learning laboratory and lecture curricula. American Journal of Physics, 66(4), 338-351.

Tinker, R. F., & Xie, Q. (2008). Applying computational science to education: The Molecular Workbench paradigm. Computing in Science & Engineering, 10(5), 24-27.

Trey, L., & Khan, S. (2008). How science students can learn about unobservable phenomena using computer-based analogies. Computers & Education, 51(2), 519-529.

Trindade, J., Fiolhais, C., & Almeida, L. (2002). Science learning in virtual environments: a descriptive study. British Journal of Educational Technology, 33(4), 471-488.

Trundle, K. C., & Bell, R. L. (2010). The use of a computer simulation to promote conceptual change: A quasi-experimental study. Computers & Education, 54(4), 1078-1088.

Urhahne, D., Nick, S., & Schanze, S. (2009). The effect of three-dimensional simulations on the understanding of chemical structures and their properties. Research in Science Education, 39(4), 495-513.

Urhahne, D., Schanze, S., Bell, T., Mansfield, A., & Holmes, J. (2010). Role of the teacher in computer-supported collaborative inquiry learning. International Journal of Science Education, 32(2), 221-243.

van Aalderen-Smeets, S. I., Walma van der Molen, J. H., & Asma, L. J. F. (2012). Primary teachers’ attitudes toward science: A new theoretical framework. Science Education, 96(1), 158-182.

van Berkum, J. A., & de Jong, T. (1991). Instructional environments for simulations. Education and Computing, 6(3), 305-358.

van der Meij, J., & de Jong, T. (2006). Supporting students’ learning with multiple representations in a dynamic simulation-based learning environment. Learning and Instruction, 16(3), 199-212.

van Joolingen, W. R., & de Jong, T. (1991). Characteristics of simulations for instructional settings. Education & Computing, 6(3-4), 241-262.

van Joolingen, W. R., de Jong, T., & Dimitrakopoulou, A. (2007). Issues in computer supported inquiry learning in science. Journal of Computer Assisted Learning, 23(2), 111-119.

Veermans, K., van Joolingen, W., & de Jong, T. (2006). Use of heuristics to facilitate scientific discovery learning in a simulation learning environment in a physics domain. International Journal of Science Education, 28(4), 341-361.

Vermunt, J. D., & Verloop, N. (1999). Congruence and friction between learning and teaching. Learning and Instruction, 9(3), 257-280.

Vogel, J. J., Vogel, D. S., Cannon-Bowers, J., Bowers, C. A., Muse, K., & Wright, M. (2006). Computer gaming and interactive simulations for learning: A meta-analysis. Journal of Educational Computing Research, 34(3), 229-243.

Voogt, J. (2009). How different are ICT-supported pedagogical practices from extensive and non-extensive ICT-using science teachers? Education and Information Technologies, 14(4), 325-343.

Voogt, J. (2010). Teacher factors associated with innovative curriculum goals and pedagogical practices: differences between extensive and non-extensive ICT-using science teachers. Journal of Computer Assisted Learning, 26(6), 453-464.

Voogt, J., Almekinders, M., van den Akker, J., & Moonen, B. (2005). A ‘blended’ in-service arrangement for classroom technology integration: impacts on teachers and students. Computers in Human Behavior, 21(3), 523-539.

Webb, M. E. (2005). Affordances of ICT in science learning: implications for an integrated pedagogy. International Journal of Science Education, 27(6), 705-735.

Webb, M. E. (2008). Impact of IT on science education. In International Handbook of Information Technology in Primary and Secondary Education (pp. 133-148): Springer.

Webb, M. E., & Cox, M. (2004). A review of pedagogy related to information and communications technology. Technology, Pedagogy and Education, 13(3), 235-286.

White, B., Kahriman, A., Luberice, L., & Idleh, F. (2010). Evaluation of software for introducing protein structure: Visualization and simulation. Biochemistry and Molecular Biology Education, 38(5), 284-289.

Wieman, C. E., Adams, W. K., Loeblein, P., & Perkins, K. K. (2010). Teaching physics using PhET simulations. The Physics Teacher, 48(4), 225-227.

Wieman, C. E., & Perkins, K. K. (2005). Transforming physics education. Physics Today, 58(11), 26-41.

Wieman, C. E., & Perkins, K. K. (2006). A powerful tool for teaching science. Nature Physics, 2(5), 290-292.

Winberg, T. M., & Berg, C. A. R. (2007). Students’ cognitive focus during a chemistry laboratory exercise: Effects of a computer-simulated prelab. Journal of Research in Science Teaching, 44(8), 1108-1133.

Windschitl, M. (1998). A practical guide for incorporating computer-based simulations into science instruction. The American Biology Teacher, 60(2), 92-97.

Windschitl, M. (2000). Supporting the development of science inquiry skills with special classes of software. Educational Technology Research and Development, 48(2), 81-95.

Windschitl, M., & Andre, T. (1998). Using computer simulations to enhance conceptual change: The roles of constructivist instruction and student epistemological beliefs. Journal of Research in Science Teaching, 35(2), 145-160.

Winn, W. (2005). What we have learned about VR and learning and what we still need to study. Paper presented at the Virtual Reality international conference, Laval, France.

Wu, H. K., & Huang, Y. L. (2007). Ninth-grade student engagement in teacher-centered and student-centered technology-enhanced learning environments. Science Education, 91(5), 727-749.

Zacharia, Z. C. (2003). Beliefs, attitudes, and intentions of science teachers regarding the educational use of computer simulations and inquiry-based experiments in physics. Journal of Research in Science Teaching, 40(8), 792-823.

Zacharia, Z. C. (2007). Comparing and combining real and virtual experimentation: an effort to enhance students’ conceptual understanding of electric circuits. Journal of Computer Assisted Learning, 23(2), 120-132.

Zacharia, Z. C., & Anderson, O. R. (2003). The effects of an interactive computer-based simulation prior to performing a laboratory inquiry-based experiment on students’ conceptual understanding of physics. American Journal of Physics, 71(6), 618-629.

Zacharia, Z. C., & Olympiou, G. (2011). Physical versus virtual manipulative experimentation in physics learning. Learning and Instruction, 21(3), 317-331.

Zhang, J. W., Chen, Q., Sun, Y. Q., & Reid, D. J. (2004). Triple scheme of learning support design for scientific discovery learning based on computer simulation: experimental research. Journal of Computer Assisted Learning, 20(4), 269-282.

Zingaro, D., & Porter, L. (2014). Peer Instruction in computing: The value of instructor intervention. Computers & Education, 71(0), 87-96.

&Summary&

This dissertation is about teaching with computer simulations. It consists of the following studies: a review study, an observational study, and an experimental study.

&Motivation for this Dissertation&

This dissertation focuses on whole-class science teaching with computer simulations. Computer simulations display dynamic, visual representations of natural phenomena and can make a great contribution to the science classroom. Simulations can be used in multiple ways. Teachers who have an interactive whiteboard at their disposal can support their teaching with simulations in front of the class. It is also possible to let students work with simulations individually, in a computer room, using laptops or, more recently, tablets. However, the required planning and time are a barrier to learning with simulations individually or in small groups. Use of computer simulations in whole-class teaching, controlled primarily by the teacher, has its own particular dynamics compared to more small-scale settings. This thesis explores those dynamics in relation to control, initiative, and activity. In particular, involving students in whole-class interaction around simulations is studied.

From the extant research literature it is already known that computer simulations can support inquiry learning, in which learners learn in the same way that scientists do research. This means that students learn by exploring phenomena in the world; they ask questions, make discoveries, and test those discoveries to learn more about them. By their very nature, simulations are suitable for supporting this learning approach, because they can be used to test hypotheses by changing variables and observing what happens (de Jong, 2006). The research that demonstrates the effectiveness of inquiry learning with computer simulations has mostly been based on students working individually or in small groups. The use of computer simulations in whole-class settings appears to have been scarcely researched. As the benefits of computer simulations as support for learning individually or in small groups are known, it is of interest to investigate whether whole-class teaching can benefit as well. With this question in mind, we reviewed the related research literature, observed and interviewed teachers who teach with computer simulations, and performed an intervention study in which teachers were supported with a Peer Instruction approach.

&Review Study&

In our literature review we investigated the research literature of the past decade to find out what is known about the effectiveness of applying computer simulations in science subjects: is their use advisable, and to what extent is their educational effectiveness robust? In doing so we built upon earlier work by de Jong and van Joolingen (1998), who found that simulations can be an effective means to support inquiry learning, provided that learners receive proper support for the inquiry learning processes. The results of our review confirmed these findings and showed that instruction in science education can indeed be enhanced by using computer simulations as support. Besides improving learning gains and motivation, it is, for example, also possible to achieve comparable learning effects in a much shorter time frame. In most of the studies we reviewed in which computer simulations were used as a supplement to or replacement for lectures or practicals, this led to positive effects for the simulation condition. The reviewed studies show that learning effects can be high for well-designed instruction with simulations as support, for example, for inquiry learning. Important factors to take into account are the way students are addressed and engaged, how information is presented and integrated, and the timing of information presentation.

Technological and pedagogical developments allow for further improvement of simulations. Nowadays, simulated learning environments look much better visually than in the last decade of the previous century. It is also possible to consider different modes of perception, for example with stereoscopic glasses. In most of the studies reviewed, such improvements concerned the way in which the results of a simulation are shown; however, they generally did not lead to higher learning outcomes. Likewise, inducing a sense of immersion in a learning environment by applying virtual reality techniques does not necessarily lead to higher learning outcomes. It may be interesting to combine simulation and reality by making it possible to physically move around through a simulated environment.

Computer simulations are particularly effective when used as preparation for laboratory activities. The reviewed studies reported high learning effects, both when simulations replaced laboratory activities and when they were used as a preparatory activity. One thing most of the reviewed studies have in common is that they focus on the interaction of the learner, individually or in small groups, with the simulation. Many of the studies focused on how learning processes can be supported from within the simulation, for example, by regulating the learning process. In inquiry learning this means monitoring what has been learned and planning the best way to continue. Very few of the reviewed studies considered the role of a teacher in a whole-class setting. Although individual use of simulations that is well supported by the learning environment clearly has value for the learning process, whole-class discussion that is initiated and led by the teacher has its own place in the science curriculum. Teachers can use simulations as a central object in such discussions, and a teacher can fulfill an important role in supporting inquiry learning processes in the context of a whole-class learning activity around computer simulations.

&Observational Study&

After having investigated the research literature, we shifted our focus to teaching practices to see how teachers use simulations pedagogically and to find out more about how teaching with simulations is experienced. Before trying to influence the teachers’ role, we investigated the current use of simulations in physics classrooms across the Netherlands. In an observational study we observed physics teachers who teach with computer simulations about various physics topics. The purpose of this study was to investigate relations between the way teachers implement computer simulations in their teaching practices and the attitudes and learning goals of teachers and their students related to whole-class use of computer simulations. Our method of analyzing interaction during a lesson reveals the student-centeredness and the inquiry-based character of a teacher’s pedagogical approach. These two scores are based on the number and type of questions teachers ask their students, and on whether those questions are actually answered by students or by the teacher him- or herself.
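
To make the scoring idea concrete, the following is a minimal sketch of how two such scores could be computed from coded lesson events. The dissertation does not spell out the exact formulas here; the event coding, attribute names, and the use of simple proportions are illustrative assumptions, not the method as implemented.

```python
# Hedged sketch of the lesson-interaction scoring idea described above.
# The coding scheme, names, and proportions are illustrative assumptions.

from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class TeacherQuestion:
    inquiry_type: bool         # e.g., asks for a prediction or an explanation
    answered_by_student: bool  # False if the teacher answers it him- or herself

def interaction_scores(questions: List[TeacherQuestion]) -> Tuple[float, float]:
    """Return (student_centeredness, inquiry_basedness) as proportions."""
    if not questions:
        return 0.0, 0.0
    n = len(questions)
    student_centeredness = sum(q.answered_by_student for q in questions) / n
    inquiry_basedness = sum(q.inquiry_type for q in questions) / n
    return student_centeredness, inquiry_basedness

# Example: four coded teacher questions from one observed lesson.
lesson = [
    TeacherQuestion(inquiry_type=True, answered_by_student=True),
    TeacherQuestion(inquiry_type=True, answered_by_student=False),
    TeacherQuestion(inquiry_type=False, answered_by_student=True),
    TeacherQuestion(inquiry_type=False, answered_by_student=False),
]
print(interaction_scores(lesson))  # (0.5, 0.5)
```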

When a lesson is student-centered according to our method, students perceive teaching with computer simulations as contributing to their motivation. When the teaching approach has an inquiry-based character, students perceive it as contributing to their insight. It seems difficult for teachers to teach in a student-centered and inquiry-based way simultaneously. We also asked students and teachers about their (perceived) learning goals for the simulation-based lessons. When the learning goals of the teacher and the students were in line with each other, the teacher also tended to have a positive attitude toward inquiry-based teaching with computer simulations.

&Experimental Study&

To better understand teaching with computer simulations at the whole-class level, we conducted an experiment with an intervention. In order to enhance the inquiry-based nature of the simulation-based lesson, we developed a lesson plan based on the ideas of Peer Instruction (Crouch & Mazur, 2001). In this approach, learners discuss content on the basis of ConcepTests that relate to the simulation shown. The teacher asks students to predict the behavior of the simulation, and students answer using an online voting system. When between 35% and 70% of the students answered correctly, they were all asked to try to convince their neighbors of their point of view, followed by a second voting round.
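
As an illustration, the decision rule after the first voting round can be sketched as follows. Only the 35-70% branch is described in the text above; the two outer branches follow the standard Peer Instruction scheme of Crouch and Mazur (2001), and the function name and thresholds are hypothetical labels, not part of the lesson materials.

```python
# Hedged sketch of the ConcepTest voting logic. The middle branch follows the
# lesson plan described above; the outer branches follow the standard Peer
# Instruction scheme (Crouch & Mazur, 2001). Names are illustrative.

def after_first_vote(votes, correct_option, low=0.35, high=0.70):
    """Decide the follow-up action once all students have voted."""
    fraction_correct = sum(v == correct_option for v in votes) / len(votes)

    if fraction_correct < low:
        # Too few correct answers: revisit the concept, then pose the
        # ConcepTest again.
        return "revisit_concept"
    elif fraction_correct <= high:
        # Students try to convince their neighbors of their point of view,
        # followed by a second voting round.
        return "peer_discussion_then_revote"
    else:
        # Most students already answer correctly: briefly confirm the
        # prediction with the simulation and move on.
        return "confirm_and_continue"

# Example: 12 of 25 students (48%) predict the simulation's behavior correctly.
votes = ["A"] * 12 + ["B"] * 8 + ["C"] * 5
print(after_first_vote(votes, "A"))  # peer_discussion_then_revote
```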

We compared this method with parallel classes in which the same teachers used their accustomed mode of teaching, without our intervention. We studied the process in the classroom using the same observation method as in the previous study, and checked whether and how the intervention actually led to discussion and to changes in students’ answers. Moreover, we administered the Force Concept Inventory (Hestenes et al., 1992) as a measure of improvement in conceptual insight.
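
The text here does not specify how improvement on the Force Concept Inventory was quantified. A common convention with pre/post FCI scores is Hake’s normalized gain, sketched below purely as an illustration of such a measure; it is not necessarily the analysis used in this dissertation.

```python
# Illustrative only: Hake's normalized gain, a common measure of improvement
# on pre/post concept inventories such as the FCI. Not necessarily the
# analysis used in the experimental study.

def normalized_gain(pre_percent: float, post_percent: float) -> float:
    """g = (post - pre) / (100 - pre), with scores as percentages correct."""
    if pre_percent >= 100.0:
        return 0.0  # perfect pretest score leaves no room for gain
    return (post_percent - pre_percent) / (100.0 - pre_percent)

print(normalized_gain(45.0, 67.0))  # 0.4, a 'medium gain' in Hake's terms
```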

To gain insight into the robustness of effects and the possible impact of contextual factors, we replicated this experiment five times after its first run and analyzed the interaction between the teacher and the students. This analysis clarifies how a teacher works with a simulation and also how a teacher reacts to an intervention. Despite our careful implementation of the Peer Instruction approach, the pedagogical interaction between the teacher and the students varied considerably in practice. Our study uncovers factors that play a role in teaching with simulations and their learning effects. Research studies focused on pedagogical interventions should take into account how teachers react to prescribed lesson scenarios. It is possible that a prescribed lesson does not lead to the desired behavioral change in a teacher because of a loss of a sense of ownership or control over teaching. Accordingly, in separate runs of our experiment the interaction between the teacher and the students looked reasonably the same across conditions for one teacher, but looked very different for another. When testing the effectiveness of pedagogical approaches, it is important to have insight into how such teacher behavior differs between conditions.

&General Conclusion&

Using computer simulations in a whole-class setting can enrich teaching practices in diverse ways by actively engaging students in an inquiry approach. When measuring the effectiveness of this practice, it is important to look not only at the learning outcomes achieved at certain time points, but also at the learning processes themselves and at the role fulfilled by the teacher.

&Samenvatting&

This dissertation is about teaching with computer simulations. It comprises the following studies: a review study, an observational study, and an experimental study.

&Motivation for this Dissertation&

This dissertation focuses on the whole-class use of computer simulations to support pre-university science education. Simulations can display dynamic, visual representations of natural phenomena and can thereby make a great contribution to science education. Simulations can be used in multiple ways. Teachers who have an interactive whiteboard at their disposal can support their teaching with simulations in front of the class. It is also possible to have students work with simulations individually in a computer room, using laptops or, more recently, tablets. However, the planning and time involved form a barrier to learning with simulations individually or in small groups. Whole-class use of computer simulations, with control resting primarily with the teacher, has its own characteristic dynamics compared to settings in which students work individually or in small groups. This dissertation explores those dynamics in relation to control, initiative, and activity. In particular, it investigates ways of involving students in whole-class interaction around simulations.

From the existing research literature it is already known that computer simulations are suitable for supporting inquiry learning, in which students learn in the way that researchers do research. This means that students learn by exploring phenomena in the world; they ask questions, make discoveries, and test those discoveries to learn more about them. Simulations are eminently suitable for supporting this form of learning, because they can be used to test hypotheses by changing variables and observing what happens (de Jong, 2006). The research demonstrating the effectiveness of inquiry learning with simulations is mostly based on students working individually or in small groups. The use of computer simulations in the whole-class context appears to have been scarcely researched. As the benefits of support with computer simulations for learning individually or in small groups are known, it is of interest to investigate whether whole-class teaching can benefit as well. With this question in mind, we reviewed the related research literature, observed and interviewed teachers who teach with simulations, and performed an intervention study in which teachers were supported with a Peer Instruction approach.

&Review Study&

In our literature review we investigated what is known from research of the past decade about the effectiveness of applying computer simulations in the science subjects: is their use advisable, and to what extent is their educational effectiveness robust? In doing so we built upon earlier work by de Jong and van Joolingen (1998), who found that simulations can be used effectively to support inquiry learning, as long as students receive proper support for the inquiry learning processes. The results of our review confirmed these findings and showed that instruction in the science subjects can indeed be improved by using computer simulations as support. Besides improving learning gains and motivation, it is, for example, also possible to achieve comparable learning effects in a much shorter time. In most of the studies we reviewed in which computer simulations were used as a supplement to or replacement for lectures or practicals, this resulted in positive effects for the simulation condition. The reviewed studies show that the learning effects of well-designed instruction with simulations can be high, for example in support of inquiry learning. Important factors to take into account are the way students are addressed and engaged, how information is presented and integrated, and how the presentation of information is timed.

Technological and pedagogical developments make it possible to improve simulations even further. Compared to the last decade of the previous century, simulated learning environments nowadays look visually much better. Other modes of perception can also be considered, such as stereoscopic glasses. In most of the studies this concerned the way in which the results of a simulation are shown; however, such improvements generally do not lead to higher learning outcomes. Fostering a sense of immersion in a learning environment by using virtual reality techniques does not necessarily lead to higher learning outcomes either. It may be interesting to combine simulation and reality by making it possible to physically move through a simulated environment.

Using computer simulations to prepare for laboratory activities is particularly effective. The reviewed studies reported large learning effects, both when practicals were replaced by simulations and when simulations were used as a preparatory activity. What most of the reviewed studies have in common is that they focus on the interaction between the simulation and the learner, individually or in small groups. Many of the studies focused on how learning processes can be supported from within the simulation itself, for example to regulate the learning processes. In inquiry learning this means monitoring what has been learned and planning how best to continue. A teacher can fulfill an important role in supporting inquiry learning processes in the context of whole-class learning activities around computer simulations.

&Observational Study&

After having delved into the literature, we shifted our focus to teaching practice to see how teachers use simulations pedagogically and to find out more about how teaching with simulations is experienced. Before trying to influence the role of the teacher, we investigated current use during physics lessons across the Netherlands. In an observational study we observed physics teachers who teach with computer simulations about various physics topics. The purpose of this study was to investigate relations between the way teachers implement computer simulations in their teaching practice and the attitudes and learning goals of teachers and their students regarding the whole-class use of computer simulations. Our method for analyzing interaction during a lesson reveals the extent to which the lesson is student-centered and taught in an inquiry-based way. These two scores are based on the number and type of questions teachers ask their students, and on the extent to which these are actually answered by the students or by the teacher him- or herself.

When the student is central according to our method, students perceive teaching with computer simulations as contributing to their motivation. When teaching is done in an inquiry-based way, students perceive it as contributing to their insight. It appears to be difficult for teachers to teach in an inquiry-based and student-centered way at the same time. We also asked students and teachers about their (perceived) learning goals for lessons supported by simulations. When the learning goals of the teacher and the students were in line with each other, the teacher also turned out to have a positive attitude toward inquiry-based teaching with computer simulations.

&Experimental Study&

To better understand whole-class teaching with computer simulations, we conducted an experiment with an intervention. To strengthen the inquiry-based character of the simulation-based lesson, we developed a lesson plan based on the ideas of Peer Instruction (Crouch & Mazur, 2001). In this approach, students discuss with each other on the basis of ConcepTests related to the content of the simulation shown. The teacher asks the students to predict the behavior of the simulation, and students answer using a voting system. When between 35% and 70% of the students answered correctly, everyone was asked to try to convince their neighbor of their point of view, after which a second voting round followed.

We compared this approach with parallel classes in which the same teachers taught in their own accustomed way, without our intervention. We studied the process in the classroom, using the same observation method as in the previous study, and monitored the way the intervention actually led to discussion and to changes in students’ answers. In addition, we used the Force Concept Inventory (Hestenes et al., 1992) to measure improvement in conceptual insight.

To gain insight into the robustness of effects and the possible influence of contextual factors, we repeated this experiment five times after its first run and analyzed the interaction between the teacher and the students. This analysis clarifies how a teacher works with the simulation and, at the same time, how the teacher reacts to an intervention. Despite our careful implementation of a Peer Instruction approach, the pedagogical interaction between the teacher and the students turned out to vary considerably in practice. Our study brings to light which factors play a role in teaching with simulations and its learning effects. Research focused on pedagogical interventions should take into account the way a teacher reacts to a prescribed lesson. It is possible that a prescribed lesson does not lead to the desired behavioral change in a teacher because of a loss of a sense of ownership or control over teaching. As a result, in separate runs of our experiment the interaction between the teacher and the students looked reasonably the same across conditions for one teacher, but differed considerably for another. When testing the effectiveness of pedagogical approaches, it is important to have insight into how such teacher behavior differs between conditions.

&General Conclusion&

Using computer simulations in a whole-class setting can enrich teaching practice in diverse ways by having students engage actively in an inquiry-based manner. When investigating the effectiveness of this practice, it is important to pay attention to the learning outcomes achieved at certain moments, but also to the learning processes themselves and to the role fulfilled by the teacher.

&Appendices&

overview of publications and presentations: http://bit.ly/NicoRutten

personal resume: http://bit.ly/resumeRutten

review study:

PowerPoint (English) http://slidesha.re/simrevEN

PowerPoint (Dutch) http://slidesha.re/simrevNL

publication (English) http://bit.ly/simrevart

Zentation (English) http://bit.ly/simrevPL

observational study:

recruitment (Dutch) http://bit.ly/simobservatie

publication (English) http://bit.ly/simobsart

experimental study:

teacher manual (Dutch) http://bit.ly/simexphan

PowerPoint (Dutch) http://bit.ly/ConcepTests

forum (Dutch) http://bit.ly/ELANsims

publication (English) http://bit.ly/simexpart

graphical overviews:

mindmap (English) http://bit.ly/bubblvars

mindmap (English) http://bit.ly/edusims

Prezi (English) http://bit.ly/prezivars

Figure A-1 Our colleagues at the ELAN Institute.

&Acknowledgements&

I owe a great deal of gratitude to all the people who supported me in finishing this dissertation.

First of all, I want to thank my supervisors Jan and Wouter. I admire their positive approach to supporting me. I’ve experienced their supervision as the right balance between the freedom to find my own way and being there when I needed them. Thank you for this collaborative scientific journey.

The week I got hired for my PhD project was the same week my wife and I got married. Between then and now, our two little boys Steven and Arthur have enriched our lives. I want to thank my mother-in-law Thea and her sister Ans for taking care of them so often. I’m grateful for the support provided by my parents Gerard and Annie. Thank you, Arthur and Steven, for all appropriate distractions. Above all, I want to thank my wife Hanneke for always standing by my side.

Working at the ELAN Institute and at the University of Twente in general has been very enjoyable. Thanks go out to all my colleagues at ELAN, Instructional Technology, and former Educational Science for all the entertainment, philosophical reflections, and personal attention. Thank you, PhD roommates Daan and Talitha, for all the time that we’ve spent together. A special thanks goes out to Floor and Mireille for standing by my side as paranymphs at my dissertation defense.

Conducting the studies in this dissertation would not have been possible without the participation of many teachers and students. Thank you all for your voluntary efforts in my PhD project. And thank you, Emily, for helping me to write it all down.

Nico Rutten

Enschede, 2014

&Biography&

Nico Rutten was born in Bleiswijk, a village in the west of the Netherlands near Rotterdam. In Bleiswijk he went to the primary school Klimophoeve. For secondary education he went to Sint-Laurenscollege in Rotterdam, where he graduated cum laude at the highest level. Subsequently, he studied Social Work for two years in The Hague, and eventually also moved there. After receiving his undergraduate degree he moved to the south of the country to study Psychology in Maastricht. Within the Cognitive Psychology track he chose Psychopathology as his specialization. After having taken courses for three years in Maastricht, he moved to the east of the country to take optional courses in Nijmegen at the department of Psychology of Culture and Religion. During the two years that he lived in Nijmegen he also participated in 15 voluntary projects across Europe organized by SIW (www.siw.nl) and Youth at Risk (youthatrisk.org.uk). Nico graduated with his master’s degree in Psychology in 2004.

During his studies Nico worked as a tutor at the universities of Maastricht and Nijmegen. After graduation he worked as a youth care worker in Delft, as a care planner in Gouda, and as a business information administrator in Leiden. Together with his wife he lived in Utrecht for two years. After moving to the east of the country to settle down in Enschede, he worked as a business information administrator at a hospital in Hengelo. In 2009, he started a PhD project at the University of Twente focused on teaching with simulations. In January 2012, he succeeded in publishing, together with his supervisors, his first article in an international scientific journal (bit.ly/simrevart). During his PhD project he frequently presented his work at international conferences (bit.ly/NicoRutten). Besides his PhD project, he has participated in the project Inspiring Science Education since April 2013 (inspiring-science-education.org). On August 29th, 2014, Nico defends his dissertation, entitled ‘Teaching with simulations’.

This dissertation is about inquiry-based teaching with computer simulations in science education. Research of the past decade shows that instruction can be enhanced by using computer simulations; for example, they can be an effective means of preparing for laboratory activities. How to use computer simulations in teaching practices, however, is relatively unknown. How to teach with simulations is the main topic of the studies in this dissertation. Computer simulations allow for student-centered, inquiry-based teaching and learning. Our studies show that simulations are experienced as appropriate not only for learning individually or in small groups, but also at the whole-class level. Investigating teaching with computer simulations at the whole-class level is complex, due to the wide range of influencing variables. Our method for investigating the student-centered, inquiry-based character of teacher-student interaction reveals that zooming in on contextual mechanisms is essential for understanding teaching with computer simulations.

***

[1] This chapter is based on: Rutten, N., van Joolingen, W. R., & van der Veen, J. T. (2012). The learning effects of computer simulations in science education. Computers & Education, 58(1), 136-153.

[2] This chapter formed the basis for: Rutten, N., van der Veen, J. T., & van Joolingen, W. R. (2015). Inquiry-based whole-class teaching with computer simulations in physics. International Journal of Science Education, 37(8), 1225-1245.

[3] This chapter formed the basis for: Rutten, N., van Joolingen, W. R., & van der Veen, J. T. (2016). Investigating an intervention to support computer simulation use in whole-class teaching. Learning: Research and Practice, 2(1), 1-17.

