
Guiding Students in Writing Data Response Answers Using Bloom’s Taxonomy for Critical Thinking


Abstract

This study focuses on improving students' ability to respond to data response questions with two or more variables, in particular their ability to describe and compare the data given. Based on Bloom's Taxonomy, a step-by-step guide was crafted on how to approach this type of question. The methodology combined quantitative analysis of pre- and post-test scores with qualitative analysis of the post-test scripts. For this research, we selected Secondary 5 Normal (Academic) students who showed difficulty in coping with data response questions that have two or more variables. We found that the guide was useful in scaffolding students' written answers. However, while students were able to apply the lower stages of the guide, they were not able to spiral their critical thinking up to the higher levels of Bloom's Taxonomy.

Introduction

The Desired Outcomes of Education (DOE) serve as a guide to teachers in crafting their teaching goals. The development of the attributes stated under the DOE is believed to be key in ensuring that our students are able to thrive in the challenging climate of the 21st century. One way in which Geography contributes to the DOE is by developing students' perspectives on geographical issues through the analysis of data and information to critically arrive at reasoned conclusions. As such, the learning outcomes of the Upper Secondary Geography curriculum (CPDD, 2013) include the development of the following skills:

  1. Extract relevant information from geographical data;
  2. Interpret and recognise patterns in geographical data and deduce relationships;
  3. Analyse, evaluate and synthesise geographical data to make informed and sound decisions.

As the types of geographical data that students have to work with vary, we felt that providing a structured thinking process would scaffold students' writing and help them answer data response questions better.

The focus of the study was on improving students' competency in handling data response questions that have two or more variables. Our students were generally adept at dealing with questions with two variables that only required them to describe data; their answers had a clear structure and were well developed. However, we noticed that when students were asked to describe and/or compare the variables in a data set with two or more variables, they were unable to meet the requirements of the question. Through this research, we wanted to provide a structured thinking process for our students to use when approaching these types of questions. Our main research question was: "To what extent does questioning based on Bloom's Taxonomy improve students' ability to describe and compare data with two or more variables?"

Review of Literature

While there have been various definitions of critical thinking, Bloom's Taxonomy provides a straightforward way in which critical thinking skills can be approached (Duron et al., 2006). Bloom's Taxonomy has six stages, each requiring a progressively higher level of critical thinking (Figure 1).

According to Facione (2000), critical thinking skills can be assessed through tools such as rating forms and rubrics, and also observed through the outcomes of the activities or tasks given to students. Facione provided an example of how critical thinking skills can be analysed when students work on a given data set: the conclusions drawn by students can be evaluated, the evaluation of the conclusion can be explained, or the conclusion can be re-formulated. These are all ways to encourage critical thinking skills. Facione (ibid) further argued for the purposeful inclusion of critical thinking skills in instruction and assignments to provide motivation to use the skills.

Bissell and Lemons (2006) created a method of assessing critical thinking in their biology classroom. They were driven to carry out the research because they recognised the problems with 'discipline-independent' critical thinking assessment tools. Hence, they created a tool that would measure both the content and the cognitive skills they wanted their students to achieve. Their method first involved crafting questions that required both biological knowledge and critical thinking skills, so that they could evaluate the content knowledge they wanted students to know as well as the critical thinking skills required. Second, they developed a rubric that clearly stated the levels of knowledge and critical thinking expected. Among the benefits they observed were the transferability of the rubric beyond curriculum hours and gains in students' metacognition.

For our research, we modified Bissell and Lemons' (ibid) rubric. We focused on demystifying the thinking process, using Bloom's Taxonomy as its structure, so that students would be able to describe and compare variables in data sets. For the purpose of this research, we focused on the stages of Remembering, Understanding and Analysing, with the stage of Applying inherently woven into all three. This is because we believe students inherently apply the knowledge gleaned from the topic area, as the content forms the basis of the questions they are attempting. Hence, the guide (Figure 2) was created with each of the Remembering, Understanding and Analysing stages having clearly defined steps to help students describe and compare data sets with two or more variables.

We did not explicitly write in the content required to answer specific questions, as we wanted the guide to apply across topics in the curriculum. We hoped that, with its introduction and frequent use, the guide would provide students with a scaffold to check their work against. For teachers, there is an additional Marking Rubric column, which can be customised for different data response questions.

The Research Context

This study was conducted in a mainstream secondary school with a class of 36 Secondary 5 Normal (Academic) students. These students had PSLE T-scores ranging from 163 to 190, with a mean of 171. The study was conducted during the students' revision period, and the data response questions were related to the topic of Geography of Food, as this was the topic most recently covered prior to the data collection period.

This research was conducted in three stages. Stage 1 involved a pre-test using a data response question with a data set containing two variables (refer to Figure 3). This was conducted to establish a baseline against which to measure improvements in students' answers.

In Stage 2, the teacher modelled the use of the guide using a new data set with two or more variables. Students were then asked to do corrections on their pre-test based on the guide, with an emphasis on its Remembering and Understanding stages. Stage 3 was the post-test (refer to Figure 4).

The three stages were conducted across three consecutive lessons so that there would be continuity in the learning and sustained exposure to these types of data response questions.

Data was collected at three points: Stage 1 (the pre-test), Stage 2 (corrections of the pre-test) and Stage 3 (the post-test). The scores from Stage 1 were compared against those from Stages 2 and 3, and these comparisons were quantitatively analysed. The post-test answers (Stage 3) were also qualitatively analysed: using the guide as a reference point, we colour-coded the answers according to the various levels in the guide to see whether there was improvement at each level.
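For readers who wish to replicate this kind of score comparison, the sketch below illustrates one way the pass rates and mark changes could be computed. It is a minimal sketch in Python; the passing mark and all student scores shown are hypothetical placeholders, not the study's actual data.

    # Minimal sketch of the pre-/post-test comparison described above.
    # The passing mark and all scores here are hypothetical placeholders.
    PASS_MARK = 4.0  # assumed passing threshold (hypothetical)

    # Hypothetical marks per student: (pre-test, corrections, post-test)
    marks = {
        "S01": (3.0, 4.5, 4.0),
        "S02": (4.5, 5.0, 4.0),
        "S03": (2.5, 4.0, 3.5),
    }

    def pass_rate(scores):
        """Percentage of students scoring at or above the passing mark."""
        passed = sum(1 for s in scores if s >= PASS_MARK)
        return 100.0 * passed / len(scores)

    # Split the per-student tuples into per-stage lists.
    pre, corrections, post = (list(stage) for stage in zip(*marks.values()))
    print(f"Pre-test pass rate:    {pass_rate(pre):.1f}%")
    print(f"Corrections pass rate: {pass_rate(corrections):.1f}%")
    print(f"Post-test pass rate:   {pass_rate(post):.1f}%")

    # Mark changes from pre-test to post-test, used to group students
    # into bands such as "increase of 0.5 to 1 mark".
    for student, (p, _, q) in marks.items():
        print(f"{student}: change of {q - p:+.1f} marks")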

Findings & Discussion

Analysis 1: Stage 1 (Pre-test) vs Stage 2 (Corrections of Pre-test)

During the pre-test, the pass rate was 13.9% (5 of the 36 students); after the corrections of the pre-test, it rose to 80.5% (29 students). The largest improvements were of 1 to 2 marks. Of the 13 students who showed only a small increase of 0.5 to 1 mark at this stage, 8 were students who were already weak at answering such data questions, as seen from their pre-test results, and had failed the pre-test; the remaining 5 had already passed it.

Analysis 2: Stage 3 (Post-test) vs Stage 1 (Pre-test)

While the pre-test had a 13.9% pass rate, the post-test pass rate was 61.1% (22 of the 36 students). However, 16.7% of students (6 students) saw their results drop by 0.5 to 2 marks; these were students who had either done well at the pre-test or had not done well from the start. The majority of students had an increase of 0.5 to 1 mark, and upon further analysis, many of these were the weaker students. Approximately 50% of the students with an increase of 0.5 to 1 mark still failed the post-test despite the improvement.

Qualitative Analysis of Stage 3 (Post-test)

With the guide as a reference point, each script was colour-coded and analysed. We noticed that many students were able to reach the stage of Analysing, to a certain extent: they were able to compare the general trend and provide the necessary data to support their points. This is shown in Sample 1 below, where, like many others, the student was clearly able to go through the basic steps of identifying the examples of a Developed Country (DC) and a Less Developed Country (LDC) from the countries given, identifying the variables, and applying the content knowledge taught on the level of development and consumption of food. This showed us that students were generally able to apply the content knowledge taught to them in the context of this question and could do a basic analysis of the data.

Sample 1: Post-test Answer

The US (an example of a DC) has the highest average total expenditure but the lowest percentage of household expenditure while Kenya (an example of an LDC) has the lowest average total household expenditure but the highest percentage spent on food. Hence, the higher the average total household expenditure, the lower the percentage of household expenditure spent on food.

We also noticed that students tended to state the data instead of analysing it. They did not use the necessary vocabulary, such as 'largest' or 'sharpest increase' (Sample 2). In Sample 2, the student merely stated the data or the relationship; while some analysis was done implicitly, it was not explicitly stated.

Sample 2: Post-test Answer

The Developed Countries have a higher average total household expenditure and hence, a lower percentage of household expenditure spent on food as compared to Less Developed Countries. The DCs such as US, UK and France have a total household income of US$21,788 to US$32,051 and the percentage of household expenditure spent on food is 6% to 14%. While the LDCs have a total household expenditure of US$541 to US$5118 and the percentage of household expenditure spent on food is 20% to 45%. 

We also noticed that students spent much of their answer on the stage of Understanding and then concluded by comparing, thereby reaching the stage of Analysing. Sample 3 suggests that the guide provided students with a clear structure for writing their answers. However, this was done only for the general trend; the other part of the answer, analysing the anomaly in the pattern, was not addressed.

Sample 3: Post-test Answer

LDCs have a lesser average total household expenditure than DCs. For example, LDCs such as South Africa, Brazil, India and Kenya have average household expenditure of $541 to $5118 while DCs such as UK, US and France have average household expenditure of $21,778 to $32,051. LDCs have a higher percentage of household expenditure spent on food than DCs. This can be seen from LDCs such as South Africa, Brazil, India and Kenya spending 20% to 45% on food while DCs such as US, UK and France spent 6% to 14% on food. Overall, DCs have a higher average total expenditure and lower percentage of household expenditure spent on food.

Implications

The improvement shown in Analysis 1 was expected, but it reinforces the point that students were able to apply the guide when answering the questions. That many of them passed the post-test also indicates that the guide provided a thinking framework for approaching such questions, and that the steps in the guide were retained.

However, there are two main areas of concern. First, the stronger students did not show much of a difference in their marks, which raises the question of how useful the guide is in enriching the answers of such students. This suggests that the guide might be more useful in aiding the thinking and understanding of weaker students, while another form of assistance needs to be provided for the better-performing ones.

The second area of concern is that, despite the increase in the percentage of passes from the pre-test to the post-test, some students still did not pass the post-test. We believe this points to insufficient familiarity with the guide: as it had only just been introduced, students may not have remembered all the steps. Hence, in future, the guide could be introduced earlier so that using its levels to structure data response answers becomes second nature. As we look into the progression of skills from Lower Secondary to Upper Secondary, this guide can be introduced to our Lower Secondary students, with the thinking steps gradually extended so that students move up the levels of thinking and, by Upper Secondary, attempt questions that require deeper analysis. With increased familiarity with such a thinking framework, we believe students would develop their metacognition and also refer to the guide outside curriculum time, as per the benefits observed by Bissell and Lemons (2006).

Conclusion

Through this study, we have created a scaffold that helps students handle data response questions by making visible the thinking behind answering them. Although we started out with data response questions with two or more variables in mind, we feel that this guide can also be applied to other types of data response questions that require only the lower-order thinking skills. We see value in continuing to refine the guide and in increasing students' exposure to it, as this would strengthen their critical thinking skills and their ability to handle data response questions.

This research was undertaken as part of the Action Research Skills in General (for Geography teachers) course organised by the Academy of Singapore Teachers (AST). It was supervised by Dr. Tricia Seow.

References

Bissell, A. N., & Lemons, P. P. (2006). A new method for assessing critical thinking in the classroom. AIBS Bulletin, 56(1), 66-72. Retrieved November 2, 2017, from https://academic.oup.com/bioscience/article/56/1/66/224850

Curriculum Planning and Development Division, Ministry of Education. (2013). Teaching and learning guide for geography. Singapore: Curriculum Planning and Development Division, Ministry of Education.

Duron, R., Limbach, B., & Waugh, W. (2006). Critical thinking framework for any discipline. International Journal of Teaching and Learning in Higher Education, 17(2), 160-166. Retrieved November 2, 2017, from http://www.isetl.org/ijtlhe/pdf/IJTLHE17(2).pdf#page=89

Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relationship to critical thinking skill. Informal Logic, 20(1). Retrieved November 2, 2017, from https://ojs.uwindsor.ca/ojs/leddy/index.php/informal_logic/article/view/...

Related Teaching Materials

annex (403.21 KB)
