
4.9 Data Analysis Procedures

4.9.2 Implementing the Data Analysis

The data analysis process in this study comprised two distinct phases: Phase One, which was concurrent with the data collection, and Phase Two, which occurred after the data collection period had ceased.

Phase One was characterised by an informal analysis of the data concurrent with data collection. Using the concept of "emergent design" (Lincoln & Guba, 1985, p. 259), an iterative cycle of data collection informing data analysis was undertaken (Merriam, 2002; Sarantakos, 2005). Key themes, concerns, and issues were identified in the existing data and then revisited and explored in the subsequent collection of data. Both the accounts and observations kept a steady stream of material coming into the study for interpretation and analysis, which was then used to generate questions for the next round of interviews. This iterative approach sharpened attention and directed it toward specific themes which had been identified by the participants, and it provided a site for initial understandings to be tested and challenged. Rather than impose rigid categories on the data at this early stage, a flexible and informal approach was adopted whereby hunches were followed and explored based on activity theory concepts.

Phase Two of the data analysis occurred after the data collection had been completed. In this phase, text units deemed relevant to the study were selected and categorised using activity theory as a form of typology. Essentially, Phase Two was a process of disaggregating the data into units and then reassembling these units into new structures with newly acquired meanings (Dey, 1993). This is a form of "latent coding" (Sarantakos, 2005, p. 305) in which the text is read, interpreted, selected, and labelled as a particular semantic unit. Textual units were selected from the material obtained from the data collection methods. Boyatzis (1998, p. 63, italics in original) refers to this "unit of coding" as "the most basic segment, or element, of the raw data or information that can be assessed in a meaningful way regarding the phenomenon." The actual size of the text units could be as small as a phrase; however, as a general rule I tended to select more text rather than less in order to preserve some of the context surrounding it. Krippendorf (2004, p. 101) refers to this material as "context units" which surround the actual unit of data being selected and give it meaning.
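To make the relationship between a unit of coding and its context unit concrete, the sketch below shows one way such a coded segment could be represented. It is purely illustrative: the coding in this study was carried out by hand, and the field names, example wording, and label are assumptions introduced here for demonstration rather than part of the original analysis.

from dataclasses import dataclass

@dataclass
class CodedTextUnit:
    """One selected segment of raw data together with the label assigned to it."""
    source: str          # where the segment came from, e.g. an interview or an observation
    unit_of_coding: str  # the smallest meaningful segment (in the sense of Boyatzis, 1998)
    context_unit: str    # surrounding material kept to preserve meaning (Krippendorf, 2004)
    label: str           # the activity theory category assigned during latent coding

# Invented example for illustration only; not taken from the study's data.
example = CodedTextUnit(
    source="student interview",
    unit_of_coding="I was not sure what the online task was asking me to do",
    context_unit="Talk about the first week of the interactive online learning activity",
    label="object of the activity",
)
print(example.label)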

As the case study unit of analysis was the activity system which was directed towards the interactive online learning activity as experienced by the selected EAL students and their teacher(s), any data that were perceived to be connected to the learning activity were selected. Data included local and global perspectives. For example, local data might take the form of a student discussing her thoughts about what the learning activity entailed, while more global perspectives might focus on a teacher's perception that eLearning was poorly supported at the institution. Like layers of an onion, the learning activity under study did not stand in isolation, but intersected with and was embedded within other activity systems.

However, some material from the collection process was rejected for a number of reasons. First, it was discarded if it had no obvious relevance to the unit of analysis (for example, off-topic comments which appeared on the recordings). Second, material obtained through the use of poor interviewing techniques, such as leading or closed-ended questions, was often excluded as the purpose of this inquiry was to understand the participants', not the researcher's, constructions of reality. Third, if a lack of comprehension was suspected, then material was not selected. This was a genuine concern when interviewing EAL students, as there were several occasions when a tangled or inappropriate response was given, suggesting the participant did not understand the question or could not communicate the desired response.


After a text unit had been selected from the data set, it was reconceptualised into new constructs based on activity theory. This was a process of categorisation which "define[s] units by their membership in a class or category – by their having something in common" (Krippendorf, 2004, p. 105, italics in original). As a form of coding, it creates "tags or labels for assigning units of meaning to the descriptive or inferential information compiled during a study" (Miles & Huberman, 1994, p. 56). This stage of the analysis was supported by the creation of an analytic tool entitled "decision guidelines for data analysis" (see Appendix K).

In the creation of the decision guidelines tool, some of the literature surrounding the use of activity theory as a conceptual tool was consulted (Brine & Franken, 2006; Gillette, 1993; Hewitt, 2004; Issroff & Scanlon, 2002; Jonassen & Rohrer-Murphy, 1999; Kaptelinin, Nardi, & MacCauley, 1999; Nardi, 2005; Van Aalst & Hill, 2006; Yamagata-Lynch, 2003). Drawing from these authors, decision guidelines were developed which provided categories or groups into which data could be placed and "clear operational definitions" so that data could be coded in a consistent manner by myself and others over time (Miles & Huberman, 1994, p. 63). The guidelines conceptualised the learning activity from a sociocultural perspective and enhanced the data analysis by improving the consistency of coding decisions.
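The guidelines themselves appear in Appendix K rather than here; the sketch below only illustrates the general form that a set of coding categories paired with operational definitions can take, assuming the categories follow the standard nodes of an activity system. The definitions shown are paraphrased placeholders, not the actual wording of the tool.

# Illustrative only: each category is paired with an operational definition that
# tells a coder when a text unit belongs to it. The wording is invented for
# demonstration and does not reproduce the actual Appendix K tool.
decision_guidelines = {
    "subject": "Data describing the student(s) engaged in the learning activity.",
    "object": "Data describing what the activity is directed towards, its purpose or motive.",
    "mediating tools": "Data about the technologies, language, or materials that mediate the activity.",
    "rules": "Data about explicit or implicit norms and conventions that shape the activity.",
    "community": "Data about the wider group within which the activity is situated.",
    "division of labour": "Data about how tasks and responsibilities are distributed among participants.",
}

def operational_definition(category: str) -> str:
    """Return the written definition a coder consults before assigning a label,
    so that coding decisions rest on a shared definition rather than intuition."""
    return decision_guidelines[category]

print(operational_definition("mediating tools"))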

In addition to mapping activity theory onto the learning activity, the concepts of contradiction (Engeström, 2001) and affordances (Gibson, 1979) were brought to bear on the data, enriching the analysis. Informed by research which focuses on contradictions (Barab, Barnett, Yamagata-Lynch, Squire, & Keating, 2002; Engeström, 2001; Murphy & Rodriguez-Manzanares, 2008; Russell & Schneiderheinze, 2005), the data were considered in relation to tensions and stresses between various contextual factors and how these conflicts affected outcomes.

The concept of affordance (Brine & Franken, 2006; Gibson, 1979; Hutchby, 2001; Steel, 2009) was also used to consider how the mediation of the computing technology shaped the learning activity. Gibson (1979, p. 127, italics in original) states "the affordances of the environment are what it offers the animal, what it provides or furnishes, either for good or ill." Also, affordances have functional, relational, and cultural aspects (Hutchby, 2001). As functional entities, they have properties which "allow certain actions to be readily performed with them, and which therefore push behaviour in certain directions" (Tolmie & Boyle, 2000, p. 120). However, the affordances of an object are also relational in that they may vary for different individuals. For example, a small rock in the Australian desert can offer a lizard protection from the sun, but not the larger kangaroo. In the human domain, affordances can also be shaped by cultural factors. Objects can be associated with values and conventions which control how they are used; therefore, affordances do not necessarily have to be based on the natural features of an object (Hutchby, 2001). It follows from this that activity is not deterministic, solely shaped by the affordances of a tool. Rather, understanding the concept of agency – the way a person uses a tool – is central to understanding the concept of affordance and the nature of participation.

The final stage of the analysis in Phase Two was characterised by a cross-case analysis which identified similarities and differences across various manifestations of the phenomenon (Stake, 2006). This was a process of comparing the data across the three learning activities in the nursing, management, and applied linguistics papers, and synthesising the data into more global descriptive perspectives.
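One simple way to picture how coded material from the three papers could be lined up for such a comparison is sketched below. The labels and groupings are invented for illustration; the actual cross-case synthesis in this study was interpretive rather than numerical.

from collections import Counter

# Invented labels, purely to illustrate lining coded units up case by case;
# the study's cross-case synthesis was interpretive, not a frequency count.
coded_labels_by_case = {
    "nursing": ["mediating tools", "object", "rules", "mediating tools"],
    "management": ["object", "community", "mediating tools"],
    "applied linguistics": ["rules", "division of labour", "object"],
}

# Tallying the categories side by side makes similarities and differences
# across the three learning activities easier to inspect.
for case, labels in coded_labels_by_case.items():
    print(case, Counter(labels).most_common())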
