Teaching to investigate in Year 11 science, constrained by assessment

Azra Moeed

Abstract

School science learning aims to engage students in understanding science concepts, developing procedural understanding, and developing an understanding of the nature of science. In recent times, teachers of senior science in New Zealand have had to adapt to two significant systemic policy changes that have impacted on their practice. The first was a new curriculum that required the teaching of science investigation, and the second was internal assessment of science investigation for the National Certificate of Educational Achievement (NCEA). This paper presents findings of research that investigated the case of science investigation from the teachers’ perspective. The data were collected through a questionnaire, interviews and classroom observations. The findings suggest that teachers changed their practice of teaching science investigation in response to the change in assessment policy. The consequence of this change was that students were mostly trained in a fair testing type of investigation in order to gain NCEA credits and grades.

There is considerable agreement amongst educational researchers that learning science in secondary school involves developing conceptual understanding, procedural understanding, and an understanding of the nature of science (Hodson, 1998; Monk, 2006). The concerns, doubts, and relative importance of each of these aspects in relation to students’ learning are debated in the literature (Abrahams & Millar, 2008; Hart, Mulhall, Berry, Loughran, & Gunstone, 2000; Millar, 2004; Monk, 2006; Wellington, 1998; Woolnough & Allsop, 1985). The teaching of all three aspects is required by Science in the New Zealand Curriculum (SiNZC) (Ministry of Education, 1993). It was in the context of SiNZC that the present study was conducted.

Research findings suggest that properly developed investigative skills and meaningful learning from these activities are not so prevalent (Hodson, 1990; Hofstein, Kipnis & Kind, 2008; Roberts & Gott, 2004). For this research, the doing of science, and in particular how teachers teach students to investigate and assess their learning through internal assessment of science investigation in Year 11, were key foci.

In recent times, teachers of senior secondary science in New Zealand have had to adapt to two significant systemic policy changes that have impacted on their practice. The first change was the move to a single science curriculum statement for all levels that replaced the previous syllabi and prescriptions. SiNZC (Ministry of Education, 1993) was a substantive document that heralded a major change in philosophy and set a new direction for science teaching and learning.

The curriculum statement made the teaching of science investigation a mandatory requirement. Briefly, the document set achievement objectives for each level and indicated progression from one level to the next. Although there was evidence that New Zealand teachers have always done experiments and practicals in science classes, a progression in students’ investigative skills became an expected outcome.

SiNZC (Ministry of Education, 1993) was underpinned by a constructivist theory of learning. According to this theory, learning is considered to be an active rather than passive process, and each individual constructs their own understanding based on their experiences. Students link new learning with their existing knowledge and beliefs, which they modify if necessary (Driver, Asko, Leach, Mortimer, & Scott, 1994). Constructivist pedagogies are learner-centred (Windschitl, 2002). Baviskar, Hartle, and Whitney (2008) described four features of a pedagogy based on constructivism. These include: “eliciting prior knowledge, creating cognitive dissonance, application of new knowledge with feedback and reflecting on learning” (p. 4). The Learning in Science projects at Waikato University showed that these pedagogical approaches were implemented in New Zealand classrooms in the 1980s and 1990s with varying degrees of success (Hipkins et al., 2002).

No sooner had teachers come to grips with these curriculum changes than the second systemic change took place with the establishment of the New Zealand Qualifications Authority (NZQA) in 2001. As part of the National Qualifications Framework, there was a change to standards-based assessment, and a new National Certificate of Educational Achievement (NCEA) for senior secondary education was put in place. In Year 11, NCEA level 1 replaced School Certificate; in Year 12, level 2 replaced Sixth Form Certificate; and in Year 13, level 3 replaced the University Bursaries Examination.

These changes were implemented in three successive years from 2002.

Prior to the introduction of NCEA for Year 11, science investigation was not assessed internally except in a few schools that offered modular science.

This paper presents findings of research that investigated the case of science investigation from the teacher’s perspective. When the study was conducted in 2006, the new assessment regime had been in place for five years, allowing teachers time to gain experience in its implementation. This research was therefore timely for gaining insight into any influence on teaching and learning of science investigation that this change in assessment practice might have had.

It will be argued that teachers changed their practice of teaching science investigation and pragmatically chose to teach the kind of investigation that would be assessed for NCEA credits and grades at the cost of students learning that science is predicated upon investigation. The following section presents the literature in relation to practical work and, more specifically, science investigation and what students learn from it.


Place of practical work in school science

Internationally, many reasons are offered for the inclusion of practical work in school science. Practical work includes experiments and investigation (Millar, 2010). Wellington (1998) offered cognitive, affective, and skill-development reasons for doing practical work, a categorisation that has found some agreement in science education. Cognitive reasons for practical work include improvement of students’ understanding of science ideas (Millar, 2004; Millar & Driver, 1987; Wellington, 1998), and that practical work may help to confirm the theory students had learnt (Gott & Duggan, 1996), although Millar (2004) warns that “cognitive learning outcomes are not likely to be achieved as a result of engagement in a single practical activity” (p. 9). Abrahams and Millar (2008) found that “teachers assumed that explanatory ideas ‘emerge’ from observations and add that this does not happen, no matter how carefully these are guided or constrained” (p. 1965). Affective reasons offered for practical work include student enjoyment of the activity and development of positive attitudes towards science (Wellington, 1998). According to Abrahams (2011), even though teachers say that practical work is motivational, their students thought that engagement in practical work was a more attractive alternative to bookwork. Skill development, or gaining procedural knowledge, is argued as a reason for engaging students in practical work (Millar & Driver, 1987; Wellington, 1998; Woolnough & Allsop, 1985). A much narrower form of practical work that involves following set steps to achieve the expected results has been called “recipe practical” and is a common practice in New Zealand schools (Hipkins et al., 2002).

This research focuses on investigation, and the next section presents various types of investigation. The non-alignment of the requirement of learning with the assessment of science investigation for NCEA is critiqued and it is argued that this mismatch of expectations has created tension for science teachers.


Science investigation: What is it, and what do students learn from it?

Scientific investigation is a holistic approach to learning science through practical work (Woolnough, 1991). Gott and Duggan (1996) state that “the aim of science investigation is to provide students opportunities to use concepts and cognitive processes and skills to solve problems” (p. 26). Students gain most from science investigation when they “discuss expectations, observations, conclusions, theories, and explanations before, during, and after conducting the activity” (Patrick & Yoon, 2004, p. 319). Millar (2004) agrees with the importance of discussion before and after the investigation. Learning investigation needs to be seen as a recursive process (Ministry of Education, 1993) rather than a linear and sequential process, with the investigator going backward and forward as the investigation proceeds. The degree to which the student has control over defining the problem, choosing the methods, and arriving at solutions dictates whether a practical activity is an open investigation or a closed practical activity (Millar, 2010; Roberts, 2009; Simon, Jones, Fairbrother, Watson, & Black, 1992).

In 1999, Watson, Goldsworthy and Wood-Robinson proposed six different types of investigations for school science, covering a range of skills to give students the opportunity to gain an understanding of science ideas and how science works. These are:

1. Classifying and identifying
2. Fair testing
3. Pattern seeking
4. Investigating models
5. Exploring
6. Making things or developing systems.

Of these, fair testing was important in the present research as it was emphasised as a type of investigation in SiNZC (Ministry of Education, 1993) and it was the type of investigation assessed through Achievement Standard AS1.1 (NZQA, 2005). The Education Review Office (1996) reported that the fair-testing approach is common, with many primary schools choosing to involve all students in participating in science fairs1 for which most investigations carried out by the students were fair-testing types. The Education Review Office reported that in some cases this was the only science taught. Watson et al. (1999) found that in the United Kingdom the national curricula have an “over-heavy” emphasis on fair testing and that this is detrimental to other kinds of investigation such as “classifying, identifying, pattern seeking, exploring, investigating and making things and developing systems” (p. 85).

Before arguing the consequences of the dominance of fair testing on teaching in Year 11, science investigation needs to be defined from the perspectives of science educators, SiNZC and the Achievement standard AS1.1.

Millar (2010) defines science investigation as:

Practical activity in which students are not given a complete set of instructions to follow (a ‘recipe’), but have some freedom to choose the procedures to follow, and to decide how to record, analyse and report the data collected. They may also (though this will not be taken as a defining characteristic) have some freedom to choose the question to be addressed and/or the final conclusion to be drawn. Like ‘experiments’, ‘investigations’ are a sub-set of ‘practical work’. (p. 2)

According to Roberts and Gott (2003) and Abrahams and Millar (2008), students need both understanding of science concepts (substantive knowledge) and skills (understanding of science procedures) to successfully carry out a science investigation. Roberts (2009) proposed that:

Genuine open-ended investigations ... are those in which pupils are unaware of any correct answer, where there are many different routes to a valid solution, where choices have to be made about equipment selection, where different sources of uncertainty lead to variation in the data and where students reflect and modify their practice in the light of the evidence they have collected. The evidence produced, then, is messy rather than the laundered version common in practical work contrived to illustrate ideas to students. (p. 31)


Such investigation, she argued, allows the student to be creative in their problem solving. In a class where such creativity is allowed, no two investigations would be the same as all students would have the licence to come up with their own approach. In an empirical study, Roberts and Gott (2010) found that an understanding of substantive ideas is not sufficient, and procedural understanding is important and requires “explicit teaching of the concepts of evidence, and particularly ideas associated with uncertainty in data sets” (p. 377).

Focussing on using science investigation to develop conceptual understanding, science educators propose that carrying out a complete investigation of this kind enables students not just to do science but also to learn the science concepts and understand the nature of science (Hodson, 1990; Roberts & Gott, 2006). In school, science investigations may be carried out to confirm a theory. For example, students may investigate the reaction between several metals and oxygen to confirm the theory of oxidation and reduction.

Recently, in Australia, in response to a loss of interest in taking science in secondary schools, Tytler (2007) suggested “Re-imagining science education” and presented what in his view were strands of a re-imagined curriculum. Relevant to the present study was what he considers investigative science should look like:

Science investigations should be more varied, with explicit attention paid to investigative principles. Investigative design should encompass a wide range of methods and principles of evidence including sampling, modelling, field-based methods, and the use of evidence in socio-scientific issues. Investigations should frequently flow from students’ own questions. Investigations should exemplify the way ideas and evidence interact in science. (p. 64)

Science investigation: The curriculum requirement

SiNZC (Ministry of Education, 1993) required the learning of skills and complete investigation. It states:

Carrying out an investigation in science involves an interaction of many complex skills. These include focusing, planning, information gathering, processing, interpreting, and reporting. Students may be investigating by carrying out a practical investigation of the “real world”, by carrying out an investigation of appropriate reference material, or by integrating these approaches. (p. 43)

In relation to science investigation, the explanatory notes state:

1. The ability to carry out a complete investigation is the key expected outcome of this achievement aim.

2. It is expected that the students will develop any specific investigative skills they need when they are carrying out a complete investigation.

3. The processes of investigation are not sequential. The process may begin at any point in the table above and will tend to move backwards and forwards. Students should be reflecting on their decisions, actions, and findings and modifying their plans and actions as they are proceeding. (p. 47)

Additionally, focusing and planning, information gathering, processing and interpreting, and reporting constitute the four achievement objectives.

Science investigation: The assessment requirement

The internal assessment of science investigation for NCEA level 1 required students to:

Carry out a practical science investigation, with direction, by planning the investigation, collecting and processing the data, and interpreting and reporting the findings. (NZQA, 2005, p.3)

Ideally, school science investigation would involve practical work in which the student seeks an answer to a question they have identified or a problem they are interested in solving. Students are given few instructions and, putting into practice their procedural and conceptual knowledge, plan and carry out the investigation. They evaluate their procedure as the investigation progresses and make any changes required. Although the answer to the question they were asking may be known to scientists and the teacher, for the student it provides an opportunity to find out for themselves. However, in school science, with large classes of students, teachers find it difficult to manage such investigation and often the whole class carries out the same investigation. The fair testing type of investigation is most common in schools even though the learning from such investigation may be limited (Hume & Coll, 2008; Moeed, 2010). Increasingly, there is a call in the science education community to get away from a “routinised” fair testing type of investigation and give the students the opportunity to carry out open-ended investigation that allows them to be creative in their problem solving.

This paper argues that Year 11 science teachers reconciled the tension between the curriculum requirement of an open-ended investigation and the assessment of a fair testing type of investigation by teaching what would be assessed for NCEA credits and grades.

Methodology

Qualitative research is interpretive, has a naturalistic orientation, and allows the use of multiple methods (Denzin & Lincoln, 1994, 2005). Qualitative research seeks to understand situations in their unique contexts and through the interactions that take place in those settings (Merriam, 1998). Taking an interpretive, case study approach, this research investigated the phenomenon of science investigation. The research was focussed on gaining an understanding of the phenomenon of interest in its unique context. The methodology was appropriate because the research involved the implementation of a policy, and sought to understand the impact of a change on the participants (Mertens, 2005; Patton, 2002).

Data sources in this case study included a regional survey of all Year 11 science teachers in Wellington, interviews with all Year 11 science teachers in one school (the study school), and a Year 11 nested study of one class (the study class) for the 2006 academic year.

All teachers who taught Year 11 science were invited to respond to a questionnaire. The postal survey had a response rate of 62%. A typical state, coeducational, medium-sized, decile 6 school was selected, and the criteria for selection of the study class in the same school were that the teacher of this class had taught before and after the introduction of NCEA, and had not been taught by the researcher in her capacity as a teacher educator. For the entire academic year, the class was observed on the one day each week that the teacher specified was the most likely day he would do investigative work with his class. The study class teacher was interviewed three times during the year, and in addition he audio recorded his reflections on the lessons. An observation schedule was used and a running record was kept for each observed lesson. Three interviews were held with a focus group of six students from the study class. The questions in the survey and interviews were framed around learning, motivation to learn, and assessment of science investigation, which were the foci of the overall research project. Documents including student workbooks and science department management documents, as well as the schools’ NCEA results, were also analysed. The management document was analysed to gain insight into the department structure and to determine how the curriculum was to be delivered, with a focus on teaching of investigation. Student workbooks were checked for completion of investigation-related homework and teacher feedback. The results for the study class and the study school were compared with the national results for NCEA level 1 science for internal and externally assessed achievement standards. All instruments, questionnaires, interview questions and observation schedules were pilot tested (for details, see Moeed, 2010).

The following results are from the teachers’ perspectives and specifically seek to answer the question:

How do Year 11 science teachers practise science investigation?

Results

The results are presented sequentially from the regional teachers’ survey, study school teacher interviews, and the study class teacher interview, observations, and document analysis. The data were analysed and triangulated, and the emergent themes are discussed later. Pseudonyms are used throughout this paper. Quotations from the survey are coded with a number; for example, RST 025 means regional science teacher number 25. Quotes from science teacher interviews have a pseudonym, for example Stella.


Types of investigation taught in Year 11 science

Regional teachers were asked what types of investigations they taught in Year 11 science. A table was provided with a list of types of investigations, and teachers were asked to select the types of investigation they taught in each subject. The total number for each kind of investigation identified is presented under ‘all subjects’ in Table 1.

Table 1: Number of each type of investigation carried out in each subject in Year 11 science (101 teachers responded)

Types of science investigations | All subjects | Chemistry | Physics | Biology | Geology | Astronomy
Fair testing | 228 | 80 | 78 | 63 | 5 | 2
Pattern seeking | 224 | 75 | 62 | 37 | 31 | 19
Classifying & identifying | 195 | 57 | 21 | 58 | 57 | 22
Exploring | 169 | 37 | 40 | 43 | 26 | 23
Investigating models | 165 | 47 | 39 | 26 | 17 | 36
Making things | 139 | 36 | 33 | 35 | 22 | 13
Developing systems | 59 | 16 | 19 | 10 | 7 | 7
Other types | 6 | 1 | 2 | 3 | 0 | 0
All types | 1185 | 349 | 294 | 275 | 165 | 122

How regional teachers prepared students for AS1.1 assessment

All participating teachers were doing AS1.1 with their science class. How they prepared students was asked as an open-ended question, and their first two responses were coded (see Table 2). More than a quarter of the responses (28%) indicated that teachers prepared their students for AS1.1 by doing tasks similar to those used for assessment and by using the template from the Ministry of Education website Te Kete Ipurangi (TKI).2 Another quarter of the responses indicated that teachers used fair testing type tasks. Only 16% of responses recorded that teachers used formative assessment and gave student feedback as to how the students could improve. Other responses indicated that they prepared their students by teaching them the skills of planning, interpreting and processing information, and reporting. Some responses indicated that the teachers started preparing students from Year 9, familiarising them with the terminology used for AS1.1.

Table 2: Teachers’ reported student preparation for AS1.1 (97 teachers responded)

Student preparation | No. of responses (n=189) | %
Doing tasks similar to those assessed | 53 | 28
Practise fair testing | 47 | 25
Formative assessment and giving feedback | 30 | 16
By teaching skills needed for investigation | 22 | 11
Start preparing students from Year 9 | 18 | 10
Teach the science concepts | 17 | 9
Do lots of practical work | 2 | 1

Teachers’ reasons for task selection for teaching and assessment

Teachers were asked to respond to a checklist of reasons with the option of checking as many responses as applicable and to note “others” if required (see Table 3). The data showed that some teachers considered the expense of using a particular task and chose inexpensive tasks (15% of responses). Their students’ understanding of the science concepts was a consideration in 13% of responses, followed by the availability of equipment (12% of responses).

A typical response from a teacher at a low decile school was:

We have little technician support, not enough funding for resources and photocopying and the students cannot afford to pay for workbooks. We have to give our students the best deal under such conditions. (RST 025)

The accessibility of a task or a moderated task on TKI was also a consideration. Student interest in the task was a reason offered by some teachers for task selection. The data showed a prevalence of management-related reasons in teachers choosing the assessment task for AS1.1.


Table 3: Teachers’ reported reasons for choosing the assessment task for investigation (101 teachers responded)

Reasons | No. of responses (n=390) | %
Inexpensive | 59 | 15
Helps student understanding of concepts | 53 | 13
Requires little equipment | 45 | 12
Students find it easy | 43 | 11
Exemplar on TKI | 40 | 10
Moderated exemplar on TKI | 23 | 6
Students find it engaging | 36 | 9
Easy to differentiate | 36 | 9
Others:
Manageable | 24 | 6
Convenient | 14 | 4
Others decide | 14 | 4
Other | 3 | 1

Study school science teachers’ goals for student learning through science investigation

In their interviews, when discussing goals for student learning through investigation, all teachers (n=10) focused on the fair testing type of investigation that was assessed in AS1.1. Although most talked about the importance of learning skills, four saw teaching assessment-associated vocabulary to enhance student achievement in the internal assessment as a key goal. Two teachers believed that this should be done before Year 11. Stella said:

Start the children off in Year 9. Teach them the vocabulary used as far back as that. Knowing that they need to back up their results with evidence and then discuss them.

Beth said she was concerned about the fair testing type of investigation she was teaching and was dissatisfied because she was unsure about what students learnt from it, as the investigations had such obvious answers. She also noted the need to stress what students should write for a Merit (explain) or Excellence (discuss) grade in assessment:

I think with year 10, and I need to be a bit more careful, we don’t stress the importance of the differences between discuss and explain and such like, whereas at year 11 we start stressing it an awful lot. So, I think I would be really hammering that into the kids this year. (Beth)

How study school teachers prepared students for AS1.1 assessment

In their interviews, study school teachers said they prepared students for internal assessment through teaching process skills, fair testing, and assessment-related vocabulary, and familiarising them with the assessment process (Table 4).

Table 4: Study school teachers’ reported student preparation for AS1.1

Student preparation | Number of teachers (n=10)
Teach skills needed for investigation (planning, gathering information, processing, interpreting information, and reporting) | 10
Practise fair testing type of investigation | 9
Teach vocabulary | 6
Teach what to write for Achieved, Merit and Excellence grades | 4
Teach that science is “real”, “relevant” | 2
Teach that scientists investigate all the time | 1

More specifically, Lillian thought that her students were learning to go ‘through the hoops’ and were ‘being trained’ to do this kind of investigation:

Mostly you can train anybody to do it. It’s orders. It’s a training exercise. This is what you need. Write this, write this, write that. There’s your Excellence. It’s kids jumping through hoops, and okay it gets them five credits or four credits. Lovely, they pass everyone. That’s great, but I’m not entirely sure that you’ve taught them a lot of anything. (Lillian)


Asked what types of investigations the teachers taught in Year 11, none of the teachers interviewed reported any investigation other than fair testing. None of the teachers was satisfied with the process for the assessment of AS1.1, which followed the requirement of fair testing – controlling variables and following steps to get to an answer already known. Their reasons were different, but each expressed a genuine concern for students that was obvious during the interviews. They were despondent, upset, not impressed, uneasy, questioned the fairness, and were pragmatic, saying “this assessment had to be done” (Mandy).

Study class results

Of the 12 investigations observed, two were exploration (in one, students investigated models) and the rest were fair testing types of investigations. The investigation tasks used were set out in the student workbooks (Abbott, Cooper, & Hume, 2005).

In most cases, students engaged in the investigation, but at the end of the lesson the teacher ran out of time to find out what the students had learnt. Students were asked to write up the investigation for homework. Sighting students’ workbooks showed that few students completed this write-up (4−6 students in the class of 26). For the mock exam (trial run), 11 students handed in a plan for the teacher to mark.

During the first interview with the study class teacher, he described what he wanted the students to learn about investigation:

The ability to plan, to complete the fair test, understand the variables, appropriately handle equipment, measure accurately, cooperate with each other, analyse the data, plot a graph without making too many blunders, write a conclusion which links what they have learnt with the science behind it. We also ask them to evaluate what went wrong and what can we do next time.


NCEA results for the study class

In the study year (2006), the national and study school average for Achieved grade or better for AS1.1 was 83% (NZQA, 2006). The study class average for AS1.1 was slightly higher at 88%. In the external achievement standards less than 50% of both the school and study class students gained Achieved or better grades in NCEA level 1 science, compared to the national results of over 50%.

Discussion

In relation to teacher practice of science investigation, the triangulation of data from the multiple sources led to the emergence of four themes: Fair testing, a common investigation; training to investigate; changes in teaching practice after the introduction of NCEA; and a pragmatic approach to investigation. Each of these themes is presented next.

Fair testing, a common investigation

Regional Year 11 science teachers, when provided with a list of types of investigations that they may have carried out, more often selected fair testing as the type of investigation they did with their class rather than any other type of investigation. All study school Year 11 teachers interviewed said that they taught the students to carry out a fair testing type of investigation. For example, they talked about the need for students to “control variables” or correctly use “the template” which is designed and used for assessment of this type of investigation.

Regionally, the frequency of selection of “fair testing” was closely followed by pattern seeking and classifying. However, in the study class, pattern seeking and classifying types of investigation were not observed and neither did the study school teachers mention these in the interviews. Experiencing fair testing in Year 11 science is in agreement with Hume and Coll’s (2008) case study findings in New Zealand, which indicated that students in Year 11 were acquiring a narrow view of science investigation as fair testing, and that although learning was taking place, students’ responses demonstrated rote learning and low-level thinking.


Regionally, and in the study school, more fair testing types of investigations were carried out when teaching physics or chemistry topics than biology or astronomy topics. According to Tytler (2007), such an imbalance occurs because it is easier to control variables in physics and chemistry. Investigating in mostly physics and chemistry contexts is problematic as potentially it could lead to students thinking that investigation is only done in these subjects (Lunetta, Hofstein, & Clough, 2007; Tytler, 2007).

Fair testing was specified in SiNZC through achievement objectives in the Developing Investigative Skills and Attitudes integrating strand (Ministry of Education, 1993). The controlling of variables was an objective in the Making Sense of the Physical World contextual strand of the curriculum and thus is likely to have led to a teacher focus on fair testing. There was similar “over-heavy” emphasis on fair testing in the United Kingdom national curricula (Watson, Goldsworthy, & Wood-Robinson, 2000).

The study school science department’s documents reflected the implementation of the curriculum where the examples cited in the unit plans specified teaching of fair testing. This emphasis on fair testing in the study school unit plans is congruent with Hume and Coll’s (2008) finding that the decision to focus on fair testing is made at the departmental level. Other types of investigation also required by the curriculum, such as pattern seeking (Ministry of Education, 1993) and classifying, were not mentioned in the school documents. The resources provided for the teachers on the Ministry of Education website also have more fair testing types of investigation than other types.

A particularly influential factor for fair testing being the main focus for students’ learning of how to investigate in science is that the assessed investigation for NCEA level 1 is a fair testing type of investigation. Fair testing is therefore required to be taught and not surprisingly is found to be the focus of teaching and assessment.

Although other types of investigation, including pattern seeking, classifying and exploring, are included in the curriculum, they are not assessed for NCEA. The issues this raises are that if other types of investigation are not formally assessed they are less likely to be taught. More importantly, if students mostly experience fair testing, they are likely to have a limited view of science investigation.

Training to investigate

It appears that Year 11 science teachers focused on training their students to undertake the fair testing type of investigation in preparation for internal assessment of science investigation. The approaches the regional teachers said they used included “repetition”, “doing tasks similar to those assessed” and “practising fair testing”. This approach was also taken by the study school science teachers who said they were “training” their students to investigate and “getting them to go through the hoops”. Some of these teachers reported an emphasis on students learning the skills needed to investigate. Thus procedural knowledge, rather than procedural and conceptual understanding, was deemed appropriate preparation for assessment through AS1.1. Science teachers in the study school said that this was contrary to how they would ideally teach science investigation, but in the interest of students’ achievement and because students had to be assessed they were pragmatic and continued to teach “what would be assessed”; the view was that there was no choice. These results were supported through observation of the study class, where students carried out several fair testing types of investigation, practised the skills of planning, were repeatedly taught about controlling variables, and gathered and recorded data. This training was reinforced by constantly using the template designed for AS1.1. Cleaves and Toplis (2007) also found students in the United Kingdom were trained to investigate for assessment.

Training for assessment involved an emphasis on what the students needed to write to achieve a particular grade, a practice noted also by Cleaves and Toplis (2007). The NCEA grades require a student to be able to describe their investigation to get an Achieved grade, explain their answer to get a Merit grade, and discuss their results to get an Excellence grade. Constant reinforcement by teachers of what students should do to achieve led to students wanting to know what they should learn and write to get the credits for better grades.


The focus for the students to achieve, and the support for performance goals (Covington, 1998), was so much a part of teacher thinking in the study school that some teachers described students as the “Merit and Excellence kids” and the study class teacher said that there were “no Excellence students” in his class. Most students who wanted to do well in the NCEA and achieve a good grade were willing to attend extra revision lessons in order to achieve Merit or Excellence. However, in the study class, those students who had not experienced success in the learning tasks and thought they could not gain an Achieved grade, gave up. It appeared that there was little support for these students.

Learning to investigate was largely memorisation in the study class (although the students may have understood, this was not demonstrated in the observed lessons). This was evident when the teacher asked questions and the students gave “rote learnt” responses. This is contrary to the philosophy that underpins the curriculum and promotes learning for understanding. The study class teacher regularly opened the lessons with a quiz, but did not build in any feedback to implement a constructivist teaching approach (Baviskar et al., 2008), which is advocated by the science curriculum (Ministry of Education, 1993). There was limited evidence of finding out what the students already knew in order to address students’ alternative conceptions.

In Abrahams and Millar’s (2008) view and according to research findings by Roberts (2009), both conceptual and procedural understandings are needed to carry out science investigation. Instead of developing these two kinds of understandings to investigate, students were trained to perform in the assessment of science investigation.

Changes in teaching practice after the introduction of NCEA

Teaching of science investigation in Year 11 changed after the introduction of assessment of practical investigation. Teachers offered several reasons for the change in practice, attributing it to the change in assessment policy that required internal assessment of science investigation. Whether they did more, the same, or fewer investigations, the main reason offered for the change in practice was the assessment requirements for NCEA rather than student learning (Moeed & Hall, 2011). Another reason they gave was that complete investigation, a requirement of assessment, was time consuming and took up to three lessons. In the United Kingdom, teachers had raised similar concerns (Roberts & Gott, 2006).

Most teachers followed the complete investigation process as outlined in AS1.1, which is linear and sequential. Complete investigation is defined in the curriculum, but the difference from the complete investigation for AS1.1 is that the curriculum sets out a recursive process where the student goes backwards and forwards as the investigation progresses and solves problems as they arise.

A pragmatic approach to investigation

Teachers responded to the assessment requirement of science investigation by taking a pragmatic approach and tailoring their teaching and assessment process to their specific needs. Task selection for teaching and assessment of science investigation (AS1.1) appeared to be dependent upon the availability of resources, manageability, and ease of administration. Resourcing needs included science technician support, physical resources including access to the laboratory, the equipment needed to carry out the investigation, the consumables, and access to text books which were sometimes shared between classes (as seen in the study class). Manageability aspects included teaching time, class size, and being able to manage difficult classes. Administration-related issues reported in science teachers’ interviews included the ease of administration of assessment for a large number of classes, the timetable constraints of the examination week, setting up the assessment venue for all classes to be assessed, and supervision.

The manageability of tasks with large groups of students and the related safety issues was a factor evident in the study class where, for example, the teacher was dispensing a variety of fuels to the students for an investigation. His focus was on safety issues; consequently he was unable to get around the entire class to support students’ learning.

Lunetta (2003) suggested that teachers spent too much time on managerial functions rather than on ways of teaching that challenge students’ thinking. Another significant consideration for regional survey respondents in the choice of task for AS1.1 was the convenience of having tasks available on the Ministry of Education website. The availability of moderated tasks reduced the prolonged process of writing a task and having it externally moderated.

During the curriculum stocktake, McGee et al. (2003) found that teachers reported lack of resources, time constraints, inadequate facilities, and little technician support to repair and maintain equipment and set up laboratories as reasons for not using an investigative approach in their teaching. Further, in relation to NCEA levels 1 and 2, secondary school teachers reported that they found resourcing of their science programmes challenging. According to McGee et al., the issue was not just resource availability but a lack of time for teachers to adapt a resource to fit their requirements.

The main considerations in the choice of task for assessment, and AS1.1 in particular, were expense and helping students understand concepts. Teacher survey responses suggested that low-decile schools that could not afford the resources sometimes selected tasks that were less resource intensive. Teachers, however, identified many competing reasons for selecting a particular task for AS1.1 and it was clear that the final decision would have required careful balancing of priorities.

Administration issues impacted on the assessment choices made by the teachers. In large schools with many Year 11 science classes, administration of AS1.1 is a logistical exercise. The teacher in charge of practical assessment in the study school pointed out that within the constraints of the examination timetable, and competing demands on technician time and resources, selecting the assessment task for AS1.1 was a pragmatic way of managing and administering the assessment in the school hall. This meant that students collected data in full view of other students, raising issues of validity (Moeed & Hall, 2011).

In the USA, Lunetta (2003) reported challenging factors for managing teaching and assessment of science investigation as large classes, inflexible timetables, and perceived focus on the examination, which is similar to the issues reported by participants in this study.


Conclusion and implications

This case study of science investigation found that:

• Teachers taught science investigation as a linear and sequential process.

• Teachers reported that their practice of teaching science investigation in Year 11 had changed since the introduction of the NCEA. They taught the fair testing type of investigation in Year 11 as required for internal assessment.

• Teachers prepared students for the NCEA AS1.1 assessment through doing tasks similar to those used for the formal assessment.

• Although SiNZC (Ministry of Education, 1993) was underpinned by a constructivist view of learning, there was little evidence of teachers applying constructivist approaches to teaching in Year 11 science.

• Teachers trained Year 11 students to succeed in the assessment of AS1.1 science investigation through repetition.

• Teachers adopted a pragmatic approach to the selection of tasks based on availability of resources, manageability, and ease of administration.

A significant systemic change in education took place during this research with The New Zealand Curriculum (Ministry of Education, 2007) being implemented in schools. In that document, the aim in relation to investigation is that students will:

Carry out science investigations using a variety of approaches: classifying and identifying, pattern seeking, exploring, investigating models, fair testing, making things, or developing systems. (n.p.)

There was a significant change in the achievement aims, which now require the teaching of a variety of types of investigations and which, in light of the findings of this research, could potentially move teaching away from a narrow focus on fair testing types of investigation. However, the aims are not listed in the curriculum document itself and are only available in the online version of the document or on a separate foldout. In the absence of achievement aims in the hard copy of the curriculum document, the requirement for carrying out a variety of approaches to investigation can be overlooked.

The alignment of achievement standards to the curriculum for level 1 took place in 2010 and the assessment of practical investigation continues. The Ministry of Education has retained the internal assessment of science investigation with direction at level 1 in the form of Achievement Standards, Physics AS1.1, Chemistry AS1.1, and Biology AS1.1 (Ministry of Education, 2010).

The implication of the findings of this research is that assessment of investigation through AS1.1 has narrowed the teaching of science investigation to a fair testing approach. The continuation of internal assessment of investigation is not conducive to teaching and learning and requires a change in policy. In light of international research and the findings of this study, it is imperative to research further the impact of continued assessment of science investigation at level 1. We need to look for creative ways of assessing science investigation that do not compromise teaching and learning.


References

Abbott, G., Cooper, G., & Hume, A. (2005). Year 11 science (NCEA level 1) workbook. Hamilton, New Zealand: ABA.

Abrahams, I. (2011). Practical work in school science: A minds-on approach. London: Continuum.

Abrahams, I., & Millar, R. (2008). Does practical work really work? A study of the effectiveness of practical work as a teaching and learning method in school science. International Journal of Science Education, 30(14), 1945−1969.

Baviskar, S. N., Hartle, R. T., & Whitney, T. (2008). Essential criteria to characterize constructivist teaching: Derived from a review of literature and applied to five constructivist-teaching method articles. International Journal of Science Education, 1, 1–10.

Cleaves, A., & Toplis, R. (2007). Assessment of practical and enquiry skills: Lessons to be learnt from pupils’ views. School Science Review, 88(325), 91−96.

Covington, M. V. (1998). The will to learn: A guide for motivating young people. New York: Cambridge University Press.

Denzin, N. K., & Lincoln, Y. S. (1994). Introduction: Entering the field of qualitative research. Thousand Oaks, CA: Sage.

Denzin, N. K., & Lincoln, Y. S. (Eds.). (2005). The Sage handbook of qualitative research (3rd ed.). Thousand Oaks, CA: Sage.

Driver, R., Asko, H., Leach, J., Mortimer, E., & Scott, P. (1994). Constructing scientific knowledge in the classroom. Educational Researcher, 23(7), 5−12.

Education Review Office. (1996). Science in schools: Implementing the 1995 science curriculum. Wellington: Author.

Gott, R., & Duggan, S. (1996). Practical work: Its role in the understanding of evidence in science. International Journal of Science Education, 18(7), 791−806.

Hall, C. (2005). National Certificate of Educational Achievement (NCEA): Is there a third way? In J. Codd & K. Sullivan (Eds.), Education policy directions in Aotearoa New Zealand (pp. 235−265). Southbank, Victoria: Thomson/Dunmore Press.

Hart, C., Mulhall, P., Berry, A., Loughran, J., & Gunstone, R. (2000). What is the purpose of this experiment? Or can students learn something from doing experiments? Journal of Research in Science Education, 37(7), 655−675.

Hipkins, R., Bolstad, R., Baker, R., Jones, A., Barker, M., Bell, M., et al. (2002). Curriculum, learning and effective pedagogy: A literature review in science education. Auckland, New Zealand: Ministry of Education.

Hodson, D. (1990). A critical look at practical work in school science. School Science Review, 70, 33−40.

Hodson, D. (1998). Science fiction: The continuing misrepresentation of science in the school curriculum. Curriculum Studies, 6(2), 191−216.

Hofstein, A., Kipnis, M., & Kind, P. (2008). Learning in and from science laboratories: Enhancing students’ meta-cognition and argumentation skills. In C. L. Petroselli (Ed.), Science education issues and developments (pp. 59−94). London: Nova Science.

Hume, A., & Coll, R. (2008). Student experiences of carrying out a practical science investigation under direction. International Journal of Science Education, 30(9), 1201−1228.

Lunetta, V. N. (2003). The school science laboratory: Historical perspectives and contexts for contemporary teaching. In B. J. Frazer & K. G. Tobin (Eds.), International handbook of science education: Part one (pp. 249−261). London: Kluwer Academic.

Lunetta, V. N., Hofstein, A., & Clough, M. P. (2007). Learning and teaching in the school science laboratory: An analysis of research, theory, and practice. In S. K. Abell & N. G. Lederman (Eds.), A handbook of research on science education (pp. 395−441). New Jersey: Lawrence Erlbaum.

McGee, C., Jones, A., Cowie, B., Hill, M., Miller, T., Harlow, A., & MacKenzie, A. (2003). Curriculum stocktake: National school sampling study. Teachers’ experiences in curriculum implementation: Science (Report to Ministry of Education). Wellington: Ministry of Education.

Merriam, S. B. (1998). Qualitative research and case study applications in education. San Francisco: Jossey-Bass.

Mertens, D. M. (2005). Research and evaluation in educational psychology: Integrating diversity with qualitative, quantitative, and mixed methods (2nd ed.). Thousand Oaks, CA: Sage.

Millar, R. (2004). The role of practical work in the teaching and learning of science. Paper presented at the meeting High School Science Laboratories: Role and Vision, National Academy of Sciences, Washington, DC.

Millar, R. (2010). Practical work. In J. Osborne & J. Dillon (Eds.), Good practice in science teaching: What research has to say (2nd ed.). Maidenhead: Open University Press.

Millar, R., & Driver, R. (1987). Beyond processes. Studies in Science Education, 14, 33−62.

Ministry of Education. (1993). Science in the New Zealand curriculum. Wellington: Learning Media.

Ministry of Education. (2007). The New Zealand curriculum. Wellington: Learning Media.

Ministry of Education. (2010). Science matrix. Retrieved August 10, from http://www.tki.org.nz/e/community/ncea/docs/scienceL1_matrix_3jun10.doc

Moeed, A. (2010). Science investigation in New Zealand secondary schools: Exploring the links between learning, motivation and internal assessment in year 11. Doctoral thesis, Victoria University of Wellington, New Zealand.

Moeed, A., & Hall, C. (2011). Learning and assessment of science investigation in year 11. The New Zealand Science Review, 68(3), 95−102.

Monk, M. (2006). How science works? School Science Review, 88(322), 119−121.

New Zealand Qualifications Authority. (2005). Achievement Standard 1.1. Retrieved June 24, 2009, from http://www.nzqa.govt.nz/ncea/assessment/search.do?query=Science&view=achievements&level=01

New Zealand Qualifications Authority. (2006). Statistics for schools. Retrieved October 29, 2009, from http://www.nzqa.govt.nz/qualifications/ssq/statistics/nqf-stats.do?ch=3210&year=2007&nqfLevel=0&st=0&cg=0&la=30&dm=1178&pc=250

Patrick, H., & Yoon, C. (2004). Early adolescents’ motivation during science investigation. The Journal of Educational Research, 97(6), 319−330.

Patton, M. Q. (2002). Qualitative evaluation methods (2nd ed.). Thousand Oaks, CA: Sage.

Roberts, R. (2009). Can teaching about evidence encourage a creative approach in open-ended investigations? School Science Review, 90(332), 31−38.

Roberts, R., & Gott, R. (2003). Assessment of biology investigations. Journal of Biological Education, 37(3), 114−121.

Roberts, R., & Gott, R. (2004). Assessment of Sc1: Alternatives to coursework? School Science Review, 85(131), 103−108.

Roberts, R., & Gott, R. (2006). Assessment of performance in practical science and pupil attributes. Assessment in Education: Principles, Policy & Practice, 13(1), 45−67.

Roberts, R., & Gott, R. (2010). Students’ approaches to open-ended science investigation: The importance of substantive and procedural understanding. Research Papers in Education, 25(4), 377−407.

Simon, S., Jones, A., Fairbrother, R., Watson, J., & Black, P. (1992). Open work in science: A review of existing practice. OPENS project. King’s College University of London: Centre for Education Studies. Retrieved July 12, 2008, from http://www.comune.torino.it/sfep/praise/dwd/documents/references.pdf

Tytler, R. (2007). Re-imagining science education: Engaging students in science for Australia’s future. Australian Education Review. Victoria, Australia: ACER Press.

Watson, R., Goldsworthy, A., & Wood-Robinson, V. (1999). What is not fair with investigations? School Science Review, 80(292), 101−106.

Watson, R., Goldsworthy, A., & Wood-Robinson, V. (2000). Beyond the fair test. In J. Sears & P. Sorenson (Eds.), Issues in science teaching (pp. 70−79). London: Routledge Falmer.

Wellington, J. (1998). Practical work in science: Time for re-appraisal. In J. Wellington (Ed.), Practical work in science: Which way now? (pp. 3−15). London: Routledge.

Windschitl, M. (2002). Framing constructivism in practice as negotiations of dilemmas: An analysis of the conceptual, pedagogical, cultural and political challenges facing teachers. Review of Educational Research, 72(2), 131−175.

Woolnough, B. E. (1991). Setting the scene. In B. E. Woolnough (Ed.), Practical science (pp. 3−9). Milton Keynes: Open University Press.

Woolnough, B. E., & Allsop, T. (1985). Practical work in science. Cambridge: Cambridge University Press.


The author

Dr Azra Moeed is a senior lecturer and curriculum leader of science education in the School of Education Policy and Implementation at Victoria University of Wellington. Her teaching and research interests include science teaching and learning in primary and secondary schools, and science teacher education.

Email: [email protected]

1 Science fairs are competitions for New Zealand students from year 7 to 13 where students investigate a question of their choice and present their results.

2 Te Kete Ipurangi (2005). Watch that car go. Retrieved 8 March, 2010, from http://www.tki.org.nz/e/search/results.php?1%3Aelem=DC.Subject.Classification&1%3Aval=NCEA%3BNCEA%20Science&1%3Avalop=AND&1%3Asearchtype=term&2%3Aelem=TKI.Level&2%3Aval=NCEA+Level+1&2%3Avalop=AND&2%3Asearchtype=term&xsl_lang=en&xsl_path=/search/results_e.php
