Methods for Gathering Data

Method cloud: a word cloud of data-gathering methods, created using http://www.tagul.com

METHOD 1: QUESTIONNAIRE

WHAT?    WHY?

In order to examine the development in students’ skills and understanding, as well as any change in attitudes and emotions, over the course of the inquiry, I chose a questionnaire as my primary data-collection method.  As the school has a 1:1 laptop program, and is currently promoting digital literacy, I created an online questionnaire using Google Docs.  Apart from being visually appealing and easy for students to access and use, this tool provided me with the added benefit of automatically saving and collating all responses, so I did not have to worry about losing or sifting through paper responses.  I would highly recommend this tool for surveys, questionnaires and quizzes, with high school students in particular – it can be accessed as part of the Google Drive section of any Google account.

The format and content of this questionnaire are based on the School Library Impact Measure (SLIM) toolkit, developed at the Center for International Scholarship in School Libraries by Todd, Kuhlthau and Heinstrom (2005).  This toolkit was designed as a means of assessing student learning throughout a unit of guided inquiry, and enables teachers “to chart students’ knowledge and experiences throughout the process” (Todd, Kuhlthau, & Heinstrom, 2005, p.5).  I have used a version of the SLIM toolkit employed by Lee FitzGerald (2011, p.29) with Australian Year 11 History students, as that context closely matches that of my ILA students.

One difference between the original SLIM and the newer version is the inclusion of feelings/emotions – I chose to include this, as emotions are a major component of most guided inquiry models, and I was interested to see the emotional change in my ILA students from the start to the end of their inquiry.  Another difference is asking WHAT interests the students about the topic, rather than HOW interested the students are – I included this, as my Year 10 students are capable of articulating this, and their level of interest should be evident in their responses; I also wanted to see whether the nature, as well as the level, of their interest in the topic changed as a result of the inquiry.  The final questionnaire, in line with Todd, Kuhlthau and Heinstrom’s recommendation (2005, p.19), included an extra question – “What did you learn in doing this research unit?”  This was included for students to reflect on and share any new information skills or understandings that they believe they have acquired.  As well as tracking student changes through the research process, all questions are also very useful for teachers with regard to future planning e.g. to see what interests the students, and to identify information literacy areas in need of further development.  Questions used by FitzGerald can be seen in Figure 1 below:

SLIM toolkit

Figure 1 – SLIM toolkit questions, taken from FitzGerald, 2011, p.29

WHO?   WHEN?   HOW?

My ILA is a Year 10 Geography class, researching the health of the Swan River.  This is not a class I teach, and I was present purely as an observer.  As the unit of work was only running for four to five weeks, two of which were school holidays, and given that students were only allocated four lessons to work on their inquiry in class time, I chose to employ only two questionnaires – one at the start of the inquiry and one at the end.  I administered the questionnaire across two classes, each with the same mix of abilities and the same teacher, and collated their responses as one group.  Each class consists of 25 students, but I knew my chances of obtaining two completed questionnaires from every member of one class were slim, given that some were planning early or late holidays, some have music lessons during class time, and some might have to complete other work in class and not have time for the questionnaire.  Out of 50 students, I received completed responses to BOTH surveys from 31 students – a useful number to work with for this analysis.

Each questionnaire was completed during class time, and each one took approximately 15 minutes.  I was present during the completion of each questionnaire to direct the students to the online survey, which I had uploaded onto their Year 10 Geography Moodle page, to personally clarify the task, to answer queries, and to thank them for their assistance and participation.  I explained to the students that their responses would not be shared publicly, and that the information would be used to give teacher librarians (of which I am one) and Humanities teachers a better understanding of how the students feel about the research process, and of which areas of information literacy need to be addressed in greater depth in future planning i.e. for the benefit of students, as well as teachers.

METHOD 2: OBSERVATION

WHAT?    WHY?

As well as conducting the questionnaire, I was also able to observe the students while they undertook their research, along with the methods the teacher used to facilitate the inquiry, including task documents and frameworks.  By observing classes in action, I was able to gain a better understanding of the reasons behind questionnaire responses, and to acquire additional information about how the students undertook their research.

WHO?   WHEN?   HOW?

I observed each of the two Year 10 classes on three occasions during their inquiry task.  I was able to walk from student to student to observe, ask questions and answer questions.  I also presented a brief revision workshop with each class on using the online bibliography generator.  The class teacher was very helpful and supportive of my project, and emailed me any relevant documents and frameworks related to the students’ inquiry task.

References

FitzGerald, L. (2011). The twin purposes of Guided Inquiry: Guiding student inquiry and evidence based practice. Scan, 30(1), 26-41.

Todd, R. J., Kuhlthau, C. C., & Heinstrom, J. E. (2005). School library impact measure (SLIM): A toolkit and handbook for tracking and assessing student learning outcomes of guided inquiry through the school library. New Brunswick, NJ: Center for International Scholarship in School Libraries, Rutgers University.

Results

As one would expect, there are overall changes in students’ responses between the first and final questionnaire, largely due to a deeper understanding of the topic, as a result of the inquiry process.  However, some students remained unengaged throughout the inquiry.  These results will be addressed and analysed separately for each question, below.  Direct quotations from student responses will be presented in their original form i.e. errors in spelling, grammar and punctuation are student errors – this gives a more precise insight into the students themselves.  The topic that the students were given was “The Health of the Swan River.”  Participants were studying Year 10 Geography in a co-educational, metropolitan school.

“Write down what you know about this topic”

Statements about topic

As set out in the SLIM toolkit (Todd, Kuhlthau & Heinstrom, 2005), I first categorised all students’ responses as either factual statements, explanation statements, or conclusion statements.  I then created the above graph to represent my findings.  Students each gave varying numbers of statements, ranging from zero (a couple of students in the first questionnaire) to ten (one student in the final questionnaire).
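This coding-and-tallying step can be sketched as follows. The three categories come from the SLIM toolkit, but the sample coded responses and the `tally` helper are purely illustrative – in practice each statement is read and categorised by hand, and the code only does the counting:

```python
from collections import Counter

# Hypothetical, manually coded statements from two questionnaires.
coded_first = ["factual", "factual", "factual", "explanation"]
coded_final = ["factual", "explanation", "explanation", "conclusion"]

def tally(coded_statements):
    """Count how many statements fall into each SLIM category."""
    counts = Counter(coded_statements)
    return {c: counts.get(c, 0) for c in ("factual", "explanation", "conclusion")}

print(tally(coded_first))  # {'factual': 3, 'explanation': 1, 'conclusion': 0}
print(tally(coded_final))  # {'factual': 1, 'explanation': 2, 'conclusion': 1}
```

Running the tally per questionnaire and per student produces the counts that feed the graph above.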

In the first questionnaire, statements were predominantly factual e.g. “swan river has jellyfish” “people go jetskiing in it” “Captain James Stirling discover it” “The Swan River’s located in Western Australia and it’s a bit polluted.” The number of facts given by students in the final questionnaire was not only larger, but the quality of these statements was also higher and more complex, reflecting recently acquired knowledge e.g. “The Swan River (Derbal Yerrigan) is 72 kilometres long” “animals in it have been found dead” “Algal blooms can be red, green, yellow/brown and found in freshwater or marine environments.”

As can be seen in the graph above, there was a drastic change in the number of explanation statements between the first and final questionnaires, and a significant increase in the number of conclusion statements.  This demonstrates a deeper understanding of the topic, which has been achieved through the geographical inquiry e.g. “Poison Algae is causing not only a higher toxicity level in waters but is also destroying habitats and killing wild life” “It has extreme pollution due to urban drainage and excess chemicals” “Pollution is caused by contaminants in storm-waters, herbicides and pesticides that end up in the Swan River’s waters. Also people throwing away rubbish in the river contaminates the river.”

In his first questionnaire, Student R responded to this question with “N/A.”  In contrast, his final response demonstrated significant new learning in the form of facts, explanations and conclusions: “Feral Fish are a major problem in the Swan River. -They can cause native fish species to die out, spread disease and dig up the river. -They are a threat to the health of the Swan River. -They are dumped into the river by people who don’t want them anymore.”  On the other hand, Student B’s responses showed far less progress – going from “nothing” at the start of the project, to “the water is not very clean” at the end.  Student B appears to lack effort and enthusiasm for the inquiry project – approaches to potentially reduce this issue, which was also experienced by a few other students, will be explored in my Recommendations post.

“How much do you know about this topic?”

Knowledge Scores

The knowledge scores shown in the above graph correspond to the following coding, used for student responses in answer to the above question:

Nothing – 0     Not Much – 1     Some – 2     Quite a Bit – 3     A Great Deal – 4

As can be seen in the above graph, most students knew Not Much about the topic at the outset of the inquiry, but this perception of their knowledge jumped to almost Quite a Bit overall by the end.  Within individual responses, some students’ increases in scores were more dramatic than others’.  For example, Student L moved from Nothing to A Great Deal, and several students moved up at least two categories.  Smaller changes were generally recorded for those students who had already scored fairly highly on the first questionnaire.  However, there were a couple of instances of students who responded with Nothing or Not Much in both questionnaires.  Again, this disengagement will be discussed in my Recommendations post.  While there is a pleasing change between the first and final questionnaires, there is still room for improvement, as ideally the average would be closer to 4: A Great Deal.
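For readers curious about the arithmetic behind the graph, here is a minimal sketch of the coding and averaging. The score mapping is the one set out above; the sample responses are invented for illustration:

```python
# Coding used for "How much do you know about this topic?" responses.
KNOWLEDGE_CODES = {
    "Nothing": 0,
    "Not Much": 1,
    "Some": 2,
    "Quite a Bit": 3,
    "A Great Deal": 4,
}

def average_score(responses):
    """Convert verbal responses to scores and return the group mean."""
    scores = [KNOWLEDGE_CODES[r] for r in responses]
    return sum(scores) / len(scores)

# Invented responses, not the actual survey data.
first = ["Not Much", "Nothing", "Some", "Not Much"]
final = ["Quite a Bit", "A Great Deal", "Some", "Quite a Bit"]
print(average_score(first))  # 1.0
print(average_score(final))  # 3.0
```

The same calculation over the real 31 matched responses yields the averages plotted in the graph.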

“What interests you about this topic?”

Interest themes

This graph demonstrates how students’ interests changed over the period of their research, based on their new learning.  Although a wide variety of responses was given, distinct themes emerged, which can be seen in the graph.  The decrease in the response of Nothing / Don’t Know shows that a higher percentage of students became engaged in the topic, over the course of the process.

While a stated interest in an environmental issue may imply an interest in wildlife, I only added to the Wildlife tally if wildlife was explicitly mentioned.  Responses included under the theme of Wildlife include “I like swans” “there are fish in the river” and “swan river jellyfish.”  Responses categorised under Leisure Activities were similarly mainly short and fact-based e.g. “water skiing” and “it has heaps of boats in it.” However, one student reflected on the impact of the river’s health on leisure activities – “Jet Skiiing is a problem now since the water is dangerous to swim in,” demonstrating higher level thinking.

As with responses to “Write down what you know about this topic,” an increased understanding was evident in responses about interests in the final questionnaire, within the theme of Environmental Issues.  For example, in the first questionnaire, students responded with “The conservation aspect of the topic” and “The fact that it’s addressing the environmental issue of pollution with negative annotations.” After the project, responses were more sophisticated e.g. “The thing that interests me about this topic is how all these small things can have such a major effect on a river and which can then create issues for animals and humans” and “How the algal blooms manage to kill most of the fish in the river.”  This was also the theme with the most dramatic positive change in interest from one questionnaire to the next, demonstrating that education about a specific issue can increase interest in the issue.

Community Action was a new theme in the final questionnaire, which no students mentioned in the first one.  This signifies a deeper understanding of the issues, gained through the inquiry, with some students showing evidence of reaching a transformative perspective.  According to Lupton and Bruce (2010), when information is used in a transformative way (the “transformative window”), “we use information to question the status quo, challenge existing practice, empower oneself and the community” (p. 14).  Examples include “if everyone works together to solve this issue there could be a great improvement in the river’s health” and “How we can change to let the river survive.”

“Write down what you think is EASY about researching your topic” 

Graph Easy to Do

Table of Definitions and Examples of “EASY” Themes

Most of the student responses to this question, in both questionnaires, concerned the location of information.  One of the main changes between responses in the two questionnaires can be seen in the emergence of additional themes.  This was not surprising, as the teacher focussed on specific skills throughout the inquiry, such as developing focus questions, and creating a bibliography.  Based on observation and conversation, most students had very little prior experience in formulating their own questions to guide an inquiry or research project, which explains why this was not mentioned in the first survey.  I will discuss this further in my Recommendations post.

Unlike at the start of their inquiry, by the time students completed the final questionnaire, many of them found it easy to find more detailed information on a specific topic.  This may be partly because students were given some key, useful websites at the start of their research, such as that of the Swan River Trust, which I observed many of the students using as their predominant resource.  Therefore, the high scores for this theme on the final questionnaire are not necessarily due to students becoming more skilful at using expert search strategies, as the results might otherwise imply.

“Write down what you think is DIFFICULT about researching your topic”

Graph Difficult to do

Table of Definitions and Examples of “DIFFICULT” Themes

As expected, fewer students (only 2) found Everything difficult to do by the end of the inquiry, implying that nearly all students felt confident about at least one aspect of the process.  As also seen in results for “What Students Found Easy to Do,” most responses here concerned the location of information.  This demonstrates that this stage is the primary focus of research for most students, possibly due to lack of experience with an inquiry-based approach and process.  Responses appear to suggest that students are not as familiar with other aspects of information literacy, such as evaluating information, formulating questions, synthesising information, and communicating information.

Although a significant proportion of students reported that they found finding specific information “easy to do,” more than half of the students also claimed that finding Useful Info and/or finding Info at Right Level proved to be difficult.  It appears that, while many students were able to locate information on a certain topic without too many problems, the difficulty lay in finding details or answers to their specific questions e.g. “It is difficult to know who is responsible and exactly how do contaminants end up in the river” and “It was difficult to answer ‘where questions’, such as ‘where is the most polluted part of the Swan River?'”  Student N was disappointed that one particular website did not contain all the answers he required: “The Swan River Trust website did not have all the information.”  These responses indicate that many students lack the skills to effectively search for detailed information, particularly online, and that they are not accustomed to finding and corroborating information from a wide range of sources.  Methods to address this weakness will be discussed in the Recommendations post.

“How are you FEELING about your topic?”

Feelings chart

Students were invited to tick as many options as were relevant to them, from the choice of feelings listed in the graph above.  Unfortunately, I did not have the opportunity to also gauge feelings at a mid-point in the inquiry, which would have given a clearer idea of the emotional journey of the students through the process.  Kuhlthau, Maniotes and Caspari (2007) explain that students experience different emotions and feelings at different stages of the inquiry process, represented by the options in the questionnaire.  These authors claim that the dominant feeling at the start of an inquiry is uncertainty, which is certainly reflected in the results above.  They also describe how students become more confident as they prepare to present their new understandings, and become satisfied at the presentation stage (Kuhlthau, Maniotes & Caspari, 2007, p. 20), which also rings true with student responses for this task.  Many more students also felt relieved by the end of their research, but it is not clear whether this is due to completing the work, or finishing the topic.  A number of students remained uncertain and/or anxious after presenting their reports, which may reflect concern about passing or receiving a good mark.  Responses in the Other category included “don’t really care” and “hungry,” again highlighting a small proportion of students who lack engagement with the topic and process.
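Because students could tick multiple feelings, each response is a list of labels rather than a single value, so tallying means flattening before counting. A small sketch of this step, using invented sample data (the feeling labels follow the questionnaire options; the responses themselves are hypothetical):

```python
from collections import Counter

# Invented multi-select responses: each inner list is one student's ticks.
first_responses = [["uncertain"], ["uncertain", "anxious"], ["optimistic"]]
final_responses = [["satisfied", "relieved"], ["confident", "relieved"], ["anxious"]]

def feeling_counts(responses):
    """Flatten multi-select responses and count each feeling."""
    return Counter(feeling for ticks in responses for feeling in ticks)

print(feeling_counts(first_responses))
print(feeling_counts(final_responses))
```

Counting ticks rather than students is why the totals in the graph can exceed the 31 participants.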

“What did you learn in doing this research project?”

Graph learned

This question was only asked in the final questionnaire.  According to the SLIM toolkit (Todd, Kuhlthau & Heinstrom, 2005), this question was included to give students an opportunity to reflect on any newly-acquired skills in information literacy.  However, when conducting the survey, I did not make this requirement explicit to the students, so their responses mainly focus on knowledge, rather than skills.  In future administration of the SLIM toolkit with other students, I will need to make the required nature of responses clearer, in order to better evaluate skills development.  As only 2 students mentioned an information-related skill in this section i.e. Info Lit (“How to correctly link information to a source” and “That dot points are easier to take down then paragraphs”), I chose broader themes, as seen in the graph above.  Disengaged students again made their mark in the Nothing category, with responses ranging from “That I don’t like SOCE” to “I learned that the Swan River is a bit more of a bore then I thought.”

The theme of Facts refers to surface knowledge, or information which is seen through a “Generic window” (Lupton & Bruce, 2010) e.g. “the water is not very clean” “I learned how algae reproduces” and “I learnt a lot about the Elizabeth Quay project.”  The theme of Understanding encompasses statements which reflect a deeper understanding of issues, effects, relationships, and possible solutions, in line with Lupton and Bruce’s (2010) “Situated” and sometimes “Transformative” windows.  Examples of responses which fit into this category include “The way we are living now could harm the river for a very long time” and “Perth’s riverside location means management practices in the Swan Canning Catchment will need to accommodate understanding of climate change impacts throughout the region.”  Ideally, with a successful inquiry, responses in this theme would outweigh those in the Facts category, especially at a Year 10 level.  Results therefore highlight students’ lack of experience with this type of learning and research process, as well as the limitations on learning caused by a strict timetable and term program, as students were only given four class sessions of 55 minutes each to work on their inquiry.  Finally, this project was entirely individual, although student learning can be enhanced through collaboration, as will be explored in the Recommendations post.

References

Kuhlthau, C. C., Maniotes, L. K., & Caspari, A. K. (2007). Guided inquiry: Learning in the 21st century. Westport, CT: Libraries Unlimited.

Lupton, M., & Bruce, C. (2010). Windows on information literacy worlds: Generic, situated and transformative perspectives. In Practising information literacy: Bringing theories of learning, practice and information literacy together (pp. 3-27). Wagga Wagga, Australia: Centre for Information Studies.

Todd, R. J., Kuhlthau, C. C., & Heinstrom, J. E. (2005). School library impact measure (SLIM): A toolkit and handbook for tracking and assessing student learning outcomes of guided inquiry through the school library. New Brunswick, NJ: Center for International Scholarship in School Libraries, Rutgers University.