About the Author(s)


Craig Pournara
Division of Mathematics Education, Faculty of Humanities, University of the Witwatersrand, Johannesburg, South Africa

Lynn Bowie
Division of Mathematics Education, Faculty of Humanities, University of the Witwatersrand, Johannesburg, South Africa

Olico Maths Education, Johannesburg, South Africa

Citation


Pournara, C. & Bowie, L., 2023, ‘What mathematics do Grade 7 learners take to high school in the context of the COVID-19 pandemic?’, South African Journal of Childhood Education 13(1), a1239. https://doi.org/10.4102/sajce.v13i1.1239

Original Research

What mathematics do Grade 7 learners take to high school in the context of the COVID-19 pandemic?

Craig Pournara, Lynn Bowie

Received: 22 June 2022; Accepted: 04 Dec. 2022; Published: 27 Oct. 2023

Copyright: © 2023. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Background: Poor mathematics performance in South Africa is well known. The COVID-19 pandemic was expected to exacerbate the situation.

Aim: To investigate Grade 7 learners’ mathematical knowledge at the end of primary school and to compare mathematical performance of Grade 7 and 8 learners in the context of the pandemic.

Setting: Data were collected in term four of 2020 at 11 primary schools and five secondary schools. All schools drew learners from poor communities in Gauteng.

Methods: A multiple-choice test covering mathematical content from Grades 4–7 was designed and piloted. Learner performance was measured through number of correct responses. Qualitative error analyses were conducted on learners’ choices of distractors.

Results: The difference in performance of the two grade groups was not statistically significant. There were similar response patterns in learners’ choices of distractors with strong evidence of cue-based reasoning and evidence of additive reasoning in items requiring multiplicative reasoning.

Conclusion: Grade 8 learners made very small gains, likely due to reduced learning time. Learner errors show many similarities with the international literature and show that Grade 7 learners are not yet ready for algebra.

Contribution: The findings provide starting points for addressing the most common errors and highlight the need for: greater attention to whole and rational number concepts in Grade 8; strategies for teacher support in teaching primary maths content; and innovative teaching strategies to fast-track learning of this content.

Keywords: diagnostic assessment; error analysis; whole number reasoning; rational number reasoning; multiplicative reasoning.

Background

The COVID-19 pandemic had a substantial impact on mathematics teaching and learning in South African schools in 2020 and 2021, particularly in poor communities where online learning was not possible. This loss of learning opportunities is likely to have knock-on effects for several years to come because of the hierarchical nature of mathematics. In the last quarter of 2020, the Wits Maths Connect Secondary (WMCS) project, in partnership with the Gauteng Department of Education (GDE) and Olico Maths Education, developed and piloted a baseline assessment of Grade 7 learners’ mathematical knowledge to provide an indicator of what mathematics Grade 7 learners were bringing to high school at the start of the 2021 school year. Given the dual purpose of the test as both a diagnostic and a baseline measure, the authors coined the name DiBa Test. In addition, they were confident that an appropriately designed test instrument would provide diagnostic evidence of learners’ errors which could, in turn, inform teaching. In the longer term, they consider the development of such a test instrument to be strategic for wider use provincially and nationally.

A group of Grade 8 learners were included in the pilot to compare with the Grade 7s. This stemmed from concerns that Grade 8s had very limited learning opportunities as a result of school closures from mid-March to late August 2020. By contrast, Grade 7s had returned to school in June 2020, although schools were then closed for another month in late July. Based on the reduced opportunity to learn for Grade 8s, the authors wanted to investigate possible differences in performance across the two groups, suspecting that Grade 7s might outperform the Grade 8s on some items.

Therefore, the research questions that frame this article are as follows:

  • To what extent does learner performance differ across topics in the DiBa Test?
  • What are the similarities and differences in Grade 7 and 8 learners’ performance on the DiBa Test?
  • What insights can be gained from a diagnostic analysis of learners’ performance on items involving whole number, rational number and multiplicative reasoning?

The authors work from the assumption that a diagnostic–baseline test instrument, consisting only of multiple-choice items (MCQs), provides useful insights into what mathematics learners know and can do. They show how the test provides insights into learners’ mathematical proficiency as evidenced through their responses to carefully constructed items with distractors that address common errors. The authors are well aware that learners may guess responses when they do not know the answer but, as the study analysis will show, the trends which emerge from the data are largely consistent and also reflect many of the errors and misconceptions reported in the local and international literature.

The goal of this study is not to bemoan poor learner performance, although there are patches in the article where the reader may feel weighed down by the extent of low performance. Rather, the authors seek to:

  • establish a picture of what learners can do mathematically in the context of the pandemic;
  • identify typical learner errors in whole number, rational number and multiplicative reasoning; and
  • provide recommendations for curriculum and teaching that are informed by the study findings.

Baseline and diagnostic testing are types of formative assessment in that their aim is to inform and support teaching and learning (Black & Wiliam 1998). For high school mathematics teachers, it is important to have a good indication of the strengths and weaknesses that their learners bring with them from primary school. Thus, establishing a baseline of content that has been mastered and what gaps learners bring to high school is important. In addition, working with learners’ errors and misconceptions has been shown to be an effective component of mathematics teaching (Nesher 1987; Ryan & Williams 2007), and thus an important element in diagnostic testing is to uncover those misconceptions.

Literature review

The study’s in-depth analysis of learner performance deals with whole number, rational number (which includes fraction, decimal fraction, ratio, rate and percentage) and multiplicative reasoning. An overview of the relevant literature is provided which informed the item design in 2020 and now informs the discussion of learner test performance. The study focuses particularly on those aspects of primary school number work that are foundational for high school mathematics. These have been grouped into three themes: relational approach to working with numbers, multiplicative reasoning and rational number constructs.

Relational approach to working with numbers

Following Carpenter et al. (2005), this study considers relational thinking as ‘using fundamental properties of number and operations to transform mathematical expressions rather than simply calculating an answer following a prescribed sequence of procedures’ (p. 54). For example, a fundamental property of multiplication is the commutative law, which provides a mathematical basis for explaining why ‘3 groups of 5’ is the same as ‘5 groups of 3’. Empson, Levi and Carpenter (2011) argue that a strong relational understanding of whole numbers is essential for understanding fraction concepts and procedures. Furthermore, they argue that a focus on relational thinking, first with whole numbers and then with rational numbers, is key to establishing a strong basis for algebra.
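As a further illustration (not drawn from the cited studies), relational thinking allows a product to be transformed using fundamental properties, such as distributivity, rather than computed by a rote procedure:

\[
5 \times 19 = 5 \times (20 - 1) = 5 \times 20 - 5 \times 1 = 100 - 5 = 95.
\]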

This approach to number work, which moves beyond procedures and pays attention to the underlying functional relationships and structures, is crucial to learners’ future mathematical success (Cai & Knuth 2011; Watanabe 2011). Effective use of a relational approach to working with numbers in support of algebraic reasoning has been widely documented in the literature. For example, the use of bar models to solve additive word problems in early grades fosters an understanding of addition and subtraction as inverse operations. This bridges working with unknowns in a less abstract setting to later work with variables (Cai, Ng & Moyer 2011). Blanton et al. (2015) argue for the importance of developing young children’s understanding of the equal sign and equality, of generalised arithmetic (e.g. recognising that α + 0 = α for any number α) and of functional thinking (e.g. exploring how quantities co-vary) and have shown that learners from as early as Grade 3 in the USA can develop these early algebra skills while learning about whole numbers.

Multiplicative reasoning

This study adopts Steffe’s (1992) view on multiplicative reasoning as reasoning that supports the distribution of one composite unit over another composite unit. For example, a learner is reasoning multiplicatively if they can treat 10 bananas in a bag as a unit and so be able to see that three bags would contain 30 bananas. Multiplicative reasoning is crucial in the development of learners’ understanding of number at primary school, and it is a necessary foundation for high school mathematics (Askew et al. 2019). It begins in whole number operations in the early years of primary school and then extends to rational numbers, where it underpins ratio and proportion, fractions, decimals and percentages (Lamon 2005). Multiplicative reasoning is also linked to functional thinking and how quantities co-vary. For example, learners need to recognise that if rope costs R12.00 per metre, then one can work out the cost of any length of rope using this rate, including parts of metres. Functional thinking is the basis of many key concepts and topics in the high school curriculum which build from ratios such as gradient, similarity, trigonometry and probability (Bowie et al. 2022a; Pienaar 2014). Multiplicative reasoning is also foundational for understanding patterns such as exponential growth, including compound interest, in high school. However, teaching multiplicative reasoning is difficult (Lamon 2005), and learner success is difficult to achieve (Brown, Küchemann & Hodgen 2010). This is evident in learner performance on the DiBa Test, as will be shown.
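To make the rate example concrete (the specific length below is illustrative and not taken from the test), the cost of 3.5 m of rope at R12.00 per metre is obtained multiplicatively:

\[
\text{cost} = 3.5 \times 12.00 = \text{R}42.00.
\]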

Rational number constructs

Fractions and rational number reasoning are necessary precursors to learners’ success in algebra (Confrey 2012; Siegler et al. 2012; Torbeyns et al. 2015). However, fractions are notoriously difficult to learn and teach (Charalambous & Pitta-Pantazi 2007; Ubah & Bansilal 2018). Some of the complexity of fraction work lies in the fact that fractions comprise five interrelated subconstructs: part–whole, ratio, operator, quotient and measure (Behr et al. 1983; Kieren 1976). Space does not permit a discussion of these constructs.

Several common errors in fractions can be attributed to the over-generalisation of whole-number knowledge. These errors include adding two fractions by adding numerators and adding denominators, for example: $\frac{1}{2} + \frac{1}{3} = \frac{2}{5}$. Learners also assume incorrectly that because 3 is greater than 2, then $\frac{1}{3} > \frac{1}{2}$. The prevalence of such errors has been well-documented in the research literature for nearly 40 years (Bills 2003; Fuchs et al. 2017; Kerslake 1986), and DiBa results confirm this.
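For contrast, a correct computation of such a sum requires a common denominator (a worked illustration using the fractions above):

\[
\frac{1}{2} + \frac{1}{3} = \frac{3}{6} + \frac{2}{6} = \frac{5}{6} \neq \frac{2}{5}.
\]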

Similarly, several errors in working with decimals have their origins in overgeneralising from whole number knowledge (Durkin & Rittle-Johnson 2015). Among these is the ‘longer-is-larger’ error (Steinle & Stacey 2004). For example, learners assume 0.32 is greater than 0.7 because it is ‘longer’. This stems from the whole number fact that 32 is greater than 7. There is also evidence of learners’ confusion about the role of zero. For example, some learners assume that 0.03 is the same as 0.3, while others assume that 0.320 is larger than 0.32 (Durkin & Rittle-Johnson 2012; Irwin 2001).
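These errors become explicit when the decimals are rewritten with equal numbers of digits, which exposes the underlying place value (a worked illustration using the numbers above):

\[
0.7 = 0.70 > 0.32, \qquad 0.03 = \tfrac{3}{100} \neq \tfrac{3}{10} = 0.3, \qquad 0.320 = 0.32.
\]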

Finally, a vital aspect in the development of numerical knowledge is an appreciation that all real numbers have magnitudes and can therefore be placed on the number line (Resnick, Newcombe & Shipley 2016b). Accurate whole number magnitude estimation has been shown to predict fraction magnitude representation (Hansen et al. 2015; Resnick et al. 2016a). Siegler et al. (2010) emphasise the importance of learners’ treating decimals and fractions as numbers, rather than viewing fractions as made up of two whole numbers. They have shown the correlation of fraction and decimal magnitude representation with general mathematics achievement, which supports the inclusion of the number line in both whole and rational number work.

Development of the test instrument

Item selection and design was informed by the literature discussed above, as well as literature relating to the other topics not discussed in this article. Given the widespread low levels of performance in Senior Phase mathematics, the authors decided to include items spanning the curricula of Grades 4–7, with a small number of items on Grade 8 algebra. It was anticipated that this would indicate whether learners had mastered content of earlier grades and if not, what kinds of errors they made. It was also hoped that by including content from earlier grades, it might avoid the flooring effects that dominated South African learner performance in the Trends in Mathematics and Science Study (TIMSS) assessments (Bowie et al. 2022a, 2022b) and in the Annual National Assessments (ANAs), particularly in Grade 6 and Grade 9 (Department of Basic Education [DBE] 2014).

It was decided to use only MCQs because these could be marked quickly and thus reduce turnaround time for reporting results to schools when the test is implemented in the future. Each MCQ item was carefully chosen and designed to include distractors that reflect typical errors and/or misconceptions as identified in teaching practice and in the local and international literature.

A database of items was created, drawing from a range of existing sources, including released items from TIMSS, Olico’s existing item database, items from the DBE diagnostic assessments, ANAs for Grade 6 and DBE Baseline assessments of 2020, as well as items gathered from baseline assessments of local schools. Items were separated into five topics: whole number properties and operations; rational numbers (fractions, decimals, percent, ratio and rate); patterns, functions and introductory algebra; measurement; and geometry.

The topics were weighted differently based on their relative importance as foundational for Grade 8 mathematics. Number concepts and operations (whole numbers and fractions) constituted approximately two-thirds of the items. The pilot instrument contained 66 items with weightings, as indicated in Table 1.

TABLE 1: Topic weightings for DiBa test pilot instrument.

Two examples of test items are given below, dealing with decimal numbers and ratio.1

In Figure 1, the correct answer is 0.91. The distractors reflect typical errors and learner difficulties with decimals. Learners choosing option D consider 0.908 to be the largest decimal because they treat the numbers as they would treat whole numbers. Learners choosing option B display the longer-is-larger error because it has the most digits after the decimal point. Learner difficulties with the meaning of zero in decimals are also captured by 0.908: if learners treat it as 0.98, they may choose 0.536 as the largest number based on whole number reasoning.
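A place-value comparison of the two closest options (a worked illustration) shows why 0.91 is larger:

\[
0.91 = 0.910 > 0.908 \quad\text{since } 910 \text{ thousandths} > 908 \text{ thousandths}.
\]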

FIGURE 1: Test item to identify largest decimal fraction.

For the ratio item, learners were required to calculate the number of pens in a collection; in a teacher’s stationery box, the ratio of pens to pencils is 5:4. If there are 36 pens and pencils altogether in the box, how many pens are there?

The options were:

  • A: 5 The stated ‘number of pens’ in the ratio
  • B: 18 Half of the total, that is 36 ÷ 2
  • C: 16 Number of pencils in the collection
  • D: 20 Number of pens in the collection (correct answer)

This question depends heavily on multiplicative reasoning. In order to identify the correct option, learners first need to recognise that 5 and 4 do not refer to actual numbers of pens and pencils, respectively. In order to select 20, not 16, learners must pay attention to the wording of the item and to the relationships in the notation (i.e. pens to pencils is 5:4) and recognise that the first number is related to pens.
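A worked solution (not shown to learners) makes the multiplicative structure explicit: the ratio 5:4 partitions the 36 items into 9 equal parts.

\[
36 \div (5 + 4) = 4, \qquad \text{pens} = 5 \times 4 = 20, \qquad \text{pencils} = 4 \times 4 = 16.
\]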

One of the design features was to choose numbers in such a way that the distractors could be written as whole numbers and hence not in the same numerical form as the question. The intention was to check whether learners could recognise the correct answer even if its form did not match that of the question. For example, in the item on multiplication of fractions, two distractors were given in fraction form and two were given as whole numbers, one of which was the correct answer.

After completing a draft test instrument (hereafter test X), a parallel version (hereafter test Y) was designed with the same topic weightings. No anchor items were included because the goal was to pilot the items, not the instrument as a whole, and hence the authors wanted to pilot as many items as possible. The sequence of the items was partially adjusted so that the equivalent items on the last two pages of test X were moved into the middle of test Y to ensure that learners had time to attempt them (in case the test proved to be too long for the allocated time).

Piloting of the test instrument

With special permission from the GDE because of COVID-19-related restrictions, the test was administered in late October and early November 2020 in four districts in the Gauteng province. The authors tested 473 Grade 7 learners in 11 schools, and 116 Grade 8 learners in 5 schools. The Grade 7 learners were closest to the target sample for intended future research, that is, Grade 8 learners entering high school. On the other hand, the Grade 8 sample provided an indication of learner performance at the end of the first year of high school, thus potentially providing evidence of change in mathematical performance over the year, albeit one that was substantially disrupted by the COVID-19 pandemic.

Both versions of the test were piloted under typical test conditions. Learners were given approximately one hour to write the test. Feedback from invigilators and analysis of the scripts suggests that this was sufficient time to complete the test for both Grade 7s and 8s.

Learners wrote their responses on a preprepared answer grid, which was then scanned and processed by an online learner management system. Accuracy checks of the scanned images showed that approximately 10% of answer sheets were not scanned with 100% accuracy and therefore required manual capture of the learners’ responses.

Ethical considerations

An application for full ethical approval was made to the Human Research Ethics Committee (non-medical) of the University of the Witwatersrand and ethical approval was received on 16 October 2020 (reference number HR20/10/32). Informed written consent was obtained from the parents of study participants. Informed written assent was obtained from learner participants. Schools were assured of anonymity and that their performance would not be compared with that of other participating schools.

Data cleaning and reporting

In preparing the data for analysis, the authors resequenced the items from Test Y to correspond with those of Test X. All references to item numbers in this article refer to the item’s number in Test X. Data cleaning and processing revealed that the maximum percentage of blank responses was 5.4% per item with a mean of 2.8%. This indicated that learners would not have benefited from additional time. The maximum percentage of ‘bad’ responses, where learners selected more than one distractor, was 2.8% with a mean of 0.9%.

The performance of the items is not reported here; rather, the focus is on learner performance, beginning with overall test performance, then performance per topic and then per item. This includes analysis of performance on each distractor. When comparing performance across grades, the response profile of each grade group to an item was considered; that is, attention was paid to the proportion of each group that chose a particular distractor and the extent to which the trend from most popular to least popular option was the same or different.

In the next section, the article reports first on overall performances. All comparisons across grades are made with caution, given the vastly different sample sizes. In the more detailed discussions of learner performance on specific items, attention is paid to grade differences where appropriate.

Overall learner performance

The overall learner performance was poorer than anticipated, with an average score of 35.8% and approximately 70% of learners achieving 40% or below (see Figure 2). This is worrying, given that the test dealt mainly with mathematical content of Grades 4–7.

FIGURE 2: Distribution of learner test scores.

The average score for the Grade 7 group was 35.3%, with an average of 38.1% for the Grade 8 group. Table 2 shows the overall performance per topic and per grade. It is not surprising that the best overall performance was on whole numbers and whole number operations. Nor is it surprising that performance was poor on rational numbers. The authors ranked all 66 items on percentage of correct responses. Six of the top 10 (i.e. best answered) items dealt with whole numbers and whole number operations, with only two whole number items in the bottom 10. By contrast, there were only three rational number items in the top 10 but five in the bottom 10. This confirms what is already known about learners’ difficulties with fractions. The very low average for measurement echoes similar findings in TIMSS performance (Bowie et al. 2022b). The authors do not focus on measurement items in this article.

TABLE 2: Learner performance per grade per topic.

When comparing performance across grades, the differences in the mean weighted scores for each topic cluster are small, with the exception of patterns, functions and algebra (see Table 2). Two-sample t-tests on overall performance show that the difference between the two grades was not statistically significant (t [130] = −1.149, p = 0.253). Further t-tests indicated no statistically significant difference between the two grades on any topic. These are important findings, suggesting that Grade 8 learners did not make significant progress on the mathematics tested by the instrument over the 2020 academic year. This does not come as a surprise, because Grade 8s had substantially less time at school than Grade 7s, as noted above. However, there were individual items where the difference in performance between the two grades was substantial. Examples of some of these items are provided in the discussion that follows.

Analysis of learner performance on selected topics

In this section, learner performance is reported on items involving whole number, rational number and multiplicative reasoning. Each section begins with overall performance together with a comparison of performance of the two grade groups. Thereafter, error analyses of clusters of items and/or selected individual items are provided.

Whole number properties and operations

Performance on whole number items for the entire group ranged from 77.8% down to 18.7%, indicating a wide variation in learners’ proficiency in different aspects of whole number properties and operations. There were only four items where more than 50% of Grade 7 learners answered correctly, and only six items where more than 50% of Grade 8 learners answered correctly. As noted above, the authors had expected better overall performance on whole number items because almost all of them dealt with content of Grades 4–6.

There is evidence that learners still lack understanding of the fundamentals of whole number operations such as order of operations and place value. For example, learners had to calculate: 8 + 20 ÷ 4. Only 22.2% of all learners selected the correct answer (13), with the most common error being that of left-to-right reasoning where learners ignored the priority of division over addition, thus getting an answer of 7. Another item involved ‘horizontal addition’ to calculate the sum of 29 998 and 5. By far the most common error in both grades (approximately 20%) was to add 8 and 5 and write the sum (13) in place of the final digit of 29 998, giving 299 913. This error suggests many learners do not have a sense of number magnitude and are not paying attention to place value of the digits. A third item involved vertical subtraction with regrouping and borrowing with a partially missing minuend. This item is more cognitively demanding than merely calculating the difference of two 3-digit numbers. The authors discuss the responses in detail because the errors reveal learners’ fragmented understanding of the procedures for column arithmetic.
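Before turning to the subtraction item, the first two calculations can be set out explicitly (a worked illustration of the correct and erroneous reasoning):

\[
8 + 20 \div 4 = 8 + 5 = 13, \quad\text{not } (8 + 20) \div 4 = 7; \qquad 29\,998 + 5 = 30\,003, \quad\text{not } 299\,913.
\]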

Learners were required to work out the digits represented by ⊙ and □:

[Image: column subtraction item with missing digits ⊙ and □]

Only 34.8% of learners selected the correct answer, ⊙ = 8; □ = 5. Learners appear to be focused on ‘filling the blanks’ and thus chose methods of achieving this but showed little evidence of proficiency in column subtraction with 3-digit numbers. Most learners appeared to focus on isolated parts of the problem. Approximately 21% appear to have used subtraction with missing subtrahends but were not consistent in assigning the minuend and the subtrahend in each column. For example, in the units column they seemingly worked upwards and reasoned ‘7 subtract what gives me 2?’ but reasoned downwards in the tens column and, recognising the need to regroup or borrow, said: ‘15 subtract what gives me 6?’ A further 23.9% appeared to treat the problem as one of missing addends, choosing the distractor ⊙ = 1; □ = 5, which they likely obtained by subtracting 6 − 5 = 1 in the tens column and 7 − 2 = 5 in the units column. Alternatively, they may have reasoned ‘what number added to 2 gives me 7? What number added to 5 gives me 6?’ Irrespective of the reasoning employed, learners focused on isolated digit-wise calculations. Learners with a relational understanding of whole numbers would be expected to work with addition and subtraction as inverse operations rather than adopting this fragmented digit-wise approach.

There were three items involving whole number where the Grade 8s outperformed the Grade 7s by 10 percentage points (pp) or more. One of these items involved the square root of an even perfect square, for example $\sqrt{100}$. In the Grade 7 group, 29.0% chose the distractor that halved the number (i.e., 50), which would suggest a bias towards additive rather than multiplicative reasoning. By contrast, only 16.4% of Grade 8s made this error. The other two items involved the lowest common multiple (LCM) and highest common factor (HCF). The most common error for the LCM (more than 40%), across both grades, involved choosing the distractor with the smallest value. Similarly, the most common choices (more than 30%) for the HCF involved the bigger-valued options, which were a common multiple or the product of the two numbers. Seemingly, many learners associated lowest with the smallest value and highest with the largest, thus reflecting a lack of understanding of the notions of LCM and HCF and suggesting cue-based strategies that are activated by attention to specific words.

One of the items in which Grade 7s performed better than Grade 8s involved adding powers: $3^2 + 3^3$. This was somewhat surprising, because there is an emphasis on powers and exponents in the first term of Grade 8. Two distractors were given in power form and two as whole numbers. Only 28.2% of Grade 7 learners chose the correct answer (36), with 54.5% choosing one of the incorrect answers in power form. Almost 60% of Grade 8s chose distractors in exponential form where the exponential law for multiplying powers with the same base ($a^m \cdot a^n = a^{m+n}$) had been incorrectly applied. This may suggest they were looking for answers with the same form as the numbers in the question, but it also suggests that learners did not fully understand the exponential law nor when to apply it.

A noteworthy finding related to this item was that although Grade 7s had not been taught exponential laws, 36.2% added bases and exponents, thus choosing the option $6^5$. This appears to be an intuitive response for learners who have not yet been taught exponential rules. It also reflects a tendency to work additively.
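The three response patterns can be summarised as follows (a worked illustration; the two power-form values plausibly correspond to the power-form distractors):

\[
3^2 + 3^3 = 9 + 27 = 36 \ \text{(correct)}, \qquad 3^{2+3} = 3^5 = 243, \qquad 6^5 \ \text{(adding bases and exponents)}.
\]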

Learners’ responses to the whole number items discussed above show repeated evidence of cue-based reasoning (Boaler 1997) based on visual features of the distractors. Strong patterns of cue-based reasoning were also seen in South African learners’ performance in TIMSS 2019 (Bowie et al. 2022b), and there is further evidence in responses to rational number items below.

Rational number properties and operations

There were only two rational number items where learner performance was above 50%. By contrast, performance was below 30% on nine items. The authors discuss the subtopics within rational numbers separately, showing how whole number reasoning dominates learners’ responses in all subtopics, together with a prevalence of cue-based reasoning based on visual features and/or counting. In several cases, the response profiles of both grades are provided to illustrate the similarities in the profiles on rational number items.

When adding unit fractions with the same denominator, Grade 7s outperformed the Grade 8s by 10 pp (60.0% compared with 50.0%). The most common error by far, chosen by approximately 30% of the whole group, was to add both the numerators and the denominators. Approximately 50% of learners chose a similar distractor when adding fractions with different denominators. This reflects a tendency towards additive reasoning in both items.

Learners were given the following options when multiplying two fractions, such as $\frac{4}{7} \times \frac{21}{2}$:

[Image: answer options for the fraction multiplication item]

The response profile for both grades was very similar. Approximately 50% of learners chose option A, where the denominators had been correctly multiplied and the numerator was a relatively large number but not the product of 4 and 21. While fewer learners chose option B (24.9% in Grade 7 and 19.8% in Grade 8), this was still far higher than the percentage choosing the correct answer (approximately 12%). That learners default to a procedural approach is apparent in their choice of options A and B. It appears that very few learners used relational thinking, that is, noticing that 2 divides into 4 and 7 divides into 21 to give 2 × 3 = 6. The very poor performance on this item was likely influenced by the whole number distractors, with learners expecting an answer in fraction form. Once again, this suggests cue-based reasoning linked to the form of the numbers rather than considering the result of operating on the numbers.
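Using the lookalike numbers implied by the discussion, the relational route cancels common factors across the two fractions before multiplying:

\[
\frac{4}{7} \times \frac{21}{2} = \frac{4}{2} \times \frac{21}{7} = 2 \times 3 = 6.
\]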

Learners’ responses to an item involving conversion from percent to common fraction also reflected cue-based reasoning linked to visual features of the distractors. Learners had to identify the fraction equivalent to 12%. Table 3 shows the distractors and the similar response profiles of each grade. More than 75% of learners in each grade chose a distractor containing 12, suggesting that learners were attending to visible features of the distractors rather than calculating an equivalent fraction. Particularly concerning is the fact that over 60% of learners chose values that were either 10 times larger (option C) or 10 times smaller (option D) than 12%, again indicating little attention to magnitude of numbers.
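A worked conversion shows the expected reasoning and why distractors with the digit 12 in the ‘wrong place’ are 10 times too large or too small (the exact distractor forms appear in Table 3; the fractions below are plausible examples):

\[
12\% = \frac{12}{100} = \frac{3}{25}, \qquad \frac{12}{10} = 120\%, \qquad \frac{12}{1000} = 1.2\%.
\]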

TABLE 3: Response profiles for the common fraction equivalent to 12%.
TABLE 4: Response profiles for identifying the largest decimal number.

The three items on decimals dealt with: identifying the largest of four decimal numbers; identifying a decimal number on a number line; and adding two decimals with different numbers of decimal digits. As indicated above, the distractors reflect typical errors involving both whole number reasoning and decimal reasoning.

While approximately 30% of both groups chose the correct answer (C), more learners, particularly in Grade 7, chose distractor D, thus reflecting whole number reasoning; that is, 908 is the largest number, as discussed above. Approximately 20% of both groups chose the longest number, hence evidence of the longer-is-larger error.

Learners were required to identify the decimal number represented by T on the number line (see Figure 3). This involved coordinating knowledge of decimal place-value and of scale on the number line, in particular recognising that the intervals are 0.01 units. Table 5 shows that approximately 25% of all learners correctly chose 0.36. However, again a large proportion of learners demonstrated difficulty with this number-magnitude task.

FIGURE 3: Representing decimals on a number line.

TABLE 5: Response profile for decimal number on number line.

Approximately 55% of learners chose an option which suggests they were counting steps on the number line (A and D), starting at 0.3 but not paying attention to the scale of the diagram and/or to place value. For example, 31.7% of Grade 7s chose 0.9, suggesting they combined the significant digits 3 and 6 (from 0.3 and the 6 steps) to obtain 0.9, without attending to the 0.01 scale.
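The correct reading coordinates the starting value with the interval size, whereas counting steps of 0.1 is one route to the common error (a worked illustration):

\[
T = 0.3 + 6 \times 0.01 = 0.36, \qquad\text{whereas } 0.3 + 6 \times 0.1 = 0.9.
\]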

Similar reasoning was also evident when adding decimal numbers: 0.5 + 0.03. As shown in Table 6, most learners in each grade chose the correct answer (A). However, the most frequent error was option B. It cannot be known whether learners’ choice reflected cue-based reasoning – seeing an option with the digit 8 – or whether they treated 0.03 as 0.3, thereby ignoring the zeroes, or whether they isolated the significant digits and added them as one would add whole numbers.

TABLE 6: Response profiles for: Add 0.5 + 0.03.

In both decimal items above, there is evidence of partially correct decimal reasoning. For example, learners who chose 0.06 in relation to the number line show some evidence of understanding decimal numbers in relation to scale. However, they did not attend to the larger unit of 0.3 when determining the position of T. Similarly, learners who chose 0.053 show some evidence of recognising place value when working with decimals of different lengths. However, in choosing 0.053, they may be confusing the algorithms for adding and multiplying decimals, where the distractor reflects the sum of the decimal places in the multiplier and multiplicand when multiplying.
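Aligning place values makes the correct sum and the most common error route explicit (a worked illustration; the value 0.8 is the presumed form of option B):

\[
0.5 + 0.03 = 0.50 + 0.03 = 0.53, \qquad \text{treating } 0.03 \text{ as } 0.3 \text{ gives } 0.5 + 0.3 = 0.8.
\]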

Multiplicative reasoning lies at the heart of work with rate and ratio, but learners’ performance on these items reflected cue-based reasoning and counting. For example, learners were asked for the ratio of squares to circles (see Figure 4) with the following distractors: (A) 2:5 (B) 10:15 (C) 2:1 (D) 5:10.

FIGURE 4: Diagram for ratio item.

Fifty-eight per cent of the whole group chose D, presumably because they counted 5 circles and 10 squares and then chose a ratio with these numbers. This would suggest, once again, that they are working with visual cues, that they have not mastered basic ratio tasks and that they are not paying attention to the order of the ratio as 5:10 is the reverse of what was required.

In the item shown in Figure 5, learners were asked which diagram shows three-quarters of the square coloured grey. The response profiles of both grades were very similar. Only 11% of the whole group correctly selected diagram B, whereas 58.4% of learners chose option A and 20.4% chose option C. These responses suggest learners do not appreciate the importance of equal-sized parts in the part–whole relationship between shaded and unshaded portions. This is further evidence of reasoning based on visual cues and counting – looking for four parts with three of them shaded, irrespective of the size of the parts. The responses also show that most learners are not treating three-quarters as a ratio and hence do not consider the six-eighths shading in B as a correct option. This is similar to the ratio of pens to pencils in the task discussed earlier in the article.
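The equivalence that learners needed to recognise in diagram B is a simple multiplicative scaling of the part–whole ratio:

\[
\frac{3}{4} = \frac{3 \times 2}{4 \times 2} = \frac{6}{8}.
\]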

FIGURE 5: Distractors for item involving shading parts of whole.

For the ratio item involving pens and pencils (see above), the response profiles of each grade are given in Table 7.

TABLE 7: Response profile for ratio item pens:pencils.

The item was correctly answered by 42.2% of Grade 8s but only 32.1% of Grade 7s. The most frequent response among Grade 7s was to halve the total number (i.e. 18), whereas only 22.4% of Grade 8s chose this option. Approximately 75% of Grade 8s selected an option which suggests some attempt to calculate ratios from the given information (C and D). By contrast, only 55% of Grade 7s chose one of these options. This may suggest a shift from inappropriate and unsophisticated responses to more appropriate strategies involving multiplicative reasoning from Grade 7 to Grade 8.

The selection of items and learners’ responses presented above reflects a dominance of cue-based responses, many of which are based on visual aspects of the numbers and/or diagrams. There is also evidence of whole number reasoning in dealing with decimal fractions. In most items, the Grade 8s outperformed the Grade 7s, although the response profiles are mostly similar.

Multiplicative reasoning

Items requiring multiplicative reasoning appeared across all topics. The authors focus here on three items which show a predominance of additive reasoning in items requiring multiplicative reasoning.

Learners were given a number pattern with a common ratio of 2: 4; 8; ___; 32; 64.

Nearly 60% of Grade 8s answered this item correctly, compared to only 46.1% of Grade 7s. By far the most common error, made by more than 30% of Grade 7s and more than 20% of Grade 8s, reflected additive reasoning where learners chose the distractor 12, most likely obtained from 4 + 4 = 8 and 8 + 4 = 12. This choice of distractor also suggests that learners were not considering all the given terms because the gap between 12 and 32 is clearly not the same as the gap between 4, 8 and 12.
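Setting the multiplicative and additive readings of the pattern side by side (a worked illustration):

\[
4;\ 8;\ \underline{16};\ 32;\ 64 \quad (\times\, 2 \text{ at each step}), \qquad \text{whereas } 8 + 4 = 12 \text{ treats the pattern additively}.
\]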

There was even stronger evidence of inappropriate additive reasoning in an item dealing with percentages. Learners were asked: ‘A pizza costs R80. You pay 20% less. How much do you pay?’ Approximately 45% of all learners chose the distractor R60, signalling that they were working additively, subtracting R20 from the original pizza price. In contrast, approximately 20% of Grade 7s and less than 30% of Grade 8s chose the correct answer. Both groups of learners performed poorly on all items involving percentages.
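The multiplicative and additive routes for the pizza item can be set out as follows (a worked illustration):

\[
80 - 20\% \times 80 = 80 - 16 = \text{R}64 \ \text{(correct)}, \qquad 80 - 20 = \text{R}60 \ \text{(additive error)}.
\]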

One of the items involving functional relationships required learners to provide the rule associating input values with output values. The rules were deliberately expressed in terms of inputs and outputs (rather than letters) so that learners who were not familiar with algebraic notation would have a better chance of making sense of the rules.

Choose the rule that produces the pattern in the table (see Figure 6).

(The rules are given in Table 8)

FIGURE 6: Input and output values for functional relationship item.

TABLE 8: Response profile for functional relationship represented in a table.

As shown in Table 8, Grade 8s outperformed Grade 7s by approximately 16% in correctly choosing the response that connected inputs and outputs. This too shows evidence of gains from Grade 7 to Grade 8 on tasks involving functional thinking.

However, inappropriate additive reasoning was evident in that more than 25% of learners in each grade chose options A and B. Although B focused on the common difference between outputs, the rule was stated as output = input + 4 which is not true for any number pair. Seemingly, learners did not pay attention to ‘input’ and read the rule as ‘adding four’. Option A (like C) was only true for the first pair of numbers.
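The structure of these errors can be illustrated with a hypothetical input–output table of the same kind (not the actual values in Figure 6): inputs 1, 2, 3, 4 paired with outputs 7, 11, 15, 19. The outputs do increase by 4, yet the rule ‘output = input + 4’ fails for every pair; the rule connecting each input to its output combines multiplication with an additive shift:

\[
\text{output} = 4 \times \text{input} + 3, \qquad \text{e.g. } 4 \times 3 + 3 = 15, \quad\text{whereas } 3 + 4 = 7 \neq 15.
\]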

Discussion and recommendations

In designing the DiBa test, the intention was to avoid a flooring effect by developing and selecting items dealing with mathematical content from Grades 4–7. As can be seen, flooring effects were not avoided on many items. However, learners’ choice of distractors and the related error analysis provide insights into their performance that were not available in previous large-scale national assessments such as the ANAs.

The selection of whole number and rational number items presented here suggests that many learners approach items at face value, paying attention to what is immediately visible, without appropriate consideration of the relationships between symbols and with little awareness of relationships within the context of the expression, diagram and so on. Although multiplicative reasoning underlies large areas of work in high school mathematics, learners’ responses suggest that many still had difficulty reasoning multiplicatively.

There is also evidence that cue-based strategies, such as looking for a particular number (recall the item with 12%) and looking for a particular form of a number (recall the product of common fractions item), may override approaches based on calculation. Consequently, learners may choose an incorrect answer which has the expected form over the correct answer which is not given in the expected form. The prevalence of cue-based strategies may also have been a consequence of a lack of access to calculators. That said, Grade 7 and 8 learners do not have access to calculators for formal assessments. Nevertheless, anecdotal observations in many Grade 7 and 8 classrooms suggest a high dependence on calculators even for simple whole number calculations.

The design feature of ‘unexpected forms’ of numbers may have contributed to poor performance on some items. However, it is the presence of this feature which potentially provides insight into learners’ strategies for choosing distractors. When learners choose incorrect options with the expected form rather than correct answers with an unexpected form, it suggests that they are paying attention to peripheral features of the distractors and that their thinking is not informed by a relational understanding of the numbers in the item. It may also indicate that learners are not actually doing calculations to obtain answers when presented with an MCQ format.

Returning to the research questions framing this article, the authors have shown that, based on overall learner performance, the difference between the grades was not statistically significant. Given that the majority of items dealt with primary school content, this suggests that in general, Grade 8 learners’ knowledge of primary school mathematics did not improve substantially in 2020. There may be several reasons for this, but the most obvious is that Grade 8s had very little opportunity to attend school in 2020, and when they were at school, teachers were hard-pressed to cover as much Grade 8 content as possible with little time to address gaps in learners’ knowledge of primary school mathematics. Similarly, the overall performance of Grade 7s was also impacted by lost teaching and learning time in 2020, although to a lesser extent than Grade 8s.

However, there is substantial evidence that learners are leaving primary school with many gaps in their mathematical knowledge and that these were not necessarily related to the pandemic. For example, the items on decimal fractions involved content prior to Grade 7. That learners are not proficient in primary school mathematics at the end of primary school is not a new finding. However, the error analysis made possible through the MCQ format provides insights into the nature of learners’ errors and hence gives direction for future interventions, at both the primary and high school levels. The authors noted many similarities in the response profiles of both grades to test items discussed in this article. The same is true for the majority of items across all topics in the test. Many of the errors concur with local and international research, but new findings such as learners’ response to the item involving squares and cubes provide insight into learners’ reasoning prior to instruction on exponents.

The DiBa results show the substantial mismatch between curriculum expectations and the levels at which learners are performing. In testing Grade 7 and 8 learners on mathematics from Grade 4 upwards, it was hoped that many more learners would perform better on items from the lower grades. Unfortunately, this was not the case, and so many Grade 7 learners, possibly the majority of Grade 7s, enter high school without the necessary mathematical preparation to cope. The limited time allocated in Grade 8 to recap Grade 7 mathematics does not adequately address these gaps. This study makes three related recommendations for high school mathematics. Of course, many recommendations could be made for primary mathematics, but they are not in focus here.

Firstly, the Grade 8 curriculum should be revised to address key concepts of whole numbers and rational numbers (from primary school) in greater depth. This is not the same as the ‘revision’ of number work that appears in the current curriculum, which takes the form of rapid coverage of a long list of content and does not address conceptual foundations such as place-value and the relative size of numbers (particularly fractions, decimals and very large whole numbers). Secondly, targeted support must be provided for Grade 8 teachers in teaching this content in ways that address the missing fundamentals. Prior work with high school teachers has shown that they are not well equipped with the knowledge and skills to address learners’ gaps in number concepts from primary school. This support should be structured around online resources that are freely available and which teachers can access in their own time. The resources should focus on how and why learners’ mathematical difficulties with numbers arise and then provide strategies to deal with the difficulties. These strategies could be modelled in short video clips that give teachers an idea of how they can intervene in their own classrooms. The resources should also include suitable tasks with well-chosen example sets for teachers to implement. These resources must be designed by experts. Grade 8 teachers, many of whom are early-career teachers with little teaching experience, cannot be expected to produce such materials.

While dedicated time needs to be provided to consolidate whole number and rational number concepts, it is clear that teachers cannot reteach the content from scratch. It also seems reasonable to expect that many learners can grasp the content faster than is expected in the grade in which it is usually taught, because they are older and have learned more mathematics. Thus the third recommendation is that mathematically sound strategies need to be developed to teach the primary school content to older learners in time-efficient ways that focus on the key aspects. In addition, these strategies should build relational understanding and give attention to generalisation and structure to support learners in the transition to algebra. This may involve omitting some of the details from the primary school curriculum. For example, there is little value in spending time practising the multiplication of two 3-digit numbers in high school. Similarly, there is little point in spending excessive time on multiplying mixed numbers, as this has limited relevance to the kinds of algebraic fractions learners will encounter from Grade 9 onwards. Once again, developing such strategies is not trivial, and they will first need to be identified and/or developed, then piloted and refined by experts.

Conclusion

In this article, the authors have presented an analysis of results from the piloting of the DiBa Test in November 2020 with Grade 7 and 8 learners in selected schools in Gauteng. The overall average learner score was below 40%, and there was little difference in overall learner performance between Grade 7 and Grade 8 learners. The most persistent finding is that learners’ choices of distractors were cue-based, focusing on numbers (at face value) without sufficient attention to the context of the numbers and the relationships between numbers in an item. These findings are consistent with previous research conducted by the WMCS project and the local and international literature. The recommendations made in the study are built on the assumption that high school teachers must pay greater attention to learners’ mathematical knowledge gaps but that teachers cannot be expected to do this without substantial support and professional development. This is a matter of urgency, not just to recover from the impact of the COVID-19 pandemic but to address another kind of pandemic that continues to wreak havoc with learners’ future prospects in school, beyond school and in the economy more broadly.

Acknowledgements

The authors are indebted to Jaqui Luksmidas for creating a spreadsheet tool to summarise, analyse and report the quantitative aspects of the data.

Competing interests

The authors declare that they have no financial or personal relationship(s) that may have inappropriately influenced them in writing this article.

Authors’ contributions

C.P. was the principal investigator and L.B. was a co-investigator of this project. They jointly led the development of the test instrument. C.P. performed most of the data analysis and wrote most of the article. L.B. reviewed the analysis, wrote parts of the article and provided comment on all drafts of the article.

Funding information

Financial support was provided by the National Research Foundation (grant no. 71218), the First Rand Foundation and the Zenex Foundation.

Data availability

The data that support the findings of this study are available from corresponding author, C.P., upon request.

Disclaimer

The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.

References

Askew, M., Venkat, H., Mathews, C., Ramsingh, V., Takane, T. & Roberts, N., 2019, ‘Multiplicative reasoning: An intervention’s impact on Foundation Phase learners’ understanding’, South African Journal of Childhood Education 9(1), a622. https://doi.org/10.4102/sajce.v9i1.622

Behr, M., Lesh, R., Post, T. & Silver, E., 1983, ‘Rational number concepts’, in R. Lesh & M. Landau (eds.), Acquisition of mathematics concepts and processes, pp. 91–125, Academic Press, New York, NY.

Bills, C., 2003, ‘Errors and misconceptions in KS3 “Number”’, in J. Williams (ed.), Proceedings of the British Society for Research into Learning Mathematics, vol. 23, pp. 7–12, BSRLM, U.K.

Black, P. & Wiliam, D., 1998, ‘Assessment and classroom learning’, Assessment in Education: Principles, Policy & Practice 5(1), 7–74. https://doi.org/10.1080/0969595980050102

Blanton, M., Stephens, A., Knuth, E., Gardiner, A., Isler, I. & Kim, J., 2015, ‘The development of children’s algebraic thinking: The impact of a comprehensive early algebra intervention in third grade’, Journal for Research in Mathematics Education 46(1), 39–87. https://doi.org/10.5951/jresematheduc.46.1.0039

Boaler, J., 1997, Experiencing school mathematics: Teaching styles, sex and setting, Open University Press, Milton Keynes.

Bowie, L., Venkat, H., Hannan, S. & Namome, C., 2022a, TIMSS 2019 South African item diagnostic report: Grade 5 mathematics, HSRC, Pretoria.

Bowie, L., Venkat, H., Hannan, S. & Namome, C., 2022b, TIMSS 2019 South African item diagnostic report: Grade 9 mathematics, HSRC, Pretoria.

Brown, M., Küchemann, D. & Hodgen, J., 2010, ‘The struggle to achieve multiplicative reasoning 11–14’, in M. Joubert & P. Andrews (eds.), British Congress for Mathematics Education (BCME-7), University of Manchester, pp. 49–56, BSRLM.

Cai, J. & Knuth, E., 2011, Early algebraization, Springer, Berlin, Heidelberg.

Cai, J., Ng, S. & Moyer, J., 2011, ‘Developing students’ algebraic thinking in earlier grades: Lessons from China and Singapore’, in J. Cai & E. Knuth (eds.), Early algebraization, pp. 25–42, Springer, Berlin, Heidelberg.

Carpenter, T., Franke, M., Levi, L. & Zeringue, J., 2005, ‘Algebra in elementary school: Developing relational thinking’, ZDM – Mathematics Education 37, 53–59. https://doi.org/10.1007/BF02655897

Charalambous, Y. & Pitta-Pantazi, D., 2007, ‘Drawing on a theoretical model to study students’ understanding of fractions’, Educational Studies in Mathematics 64, 293–316. https://doi.org/10.1007/s10649-006-9036-2

Confrey, J., 2012, ‘Better measurement of higher cognitive processes through learning trajectories and diagnostic assessments in mathematics: The challenge in adolescence’, in V. F. Reyna, S. B. Chapman, M. R. Dougherty & J. Confrey (eds.), The adolescent brain: Learning, reasoning, and decision making, pp. 155–182, American Psychological Association. https://doi.org/10.1037/13493-006

Department of Basic Education (DBE), 2014, The Annual National Assessment of 2014. Diagnostic report intermediate and senior phases mathematics, DBE, Pretoria.

Durkin, K. & Rittle-Johnson, B., 2012, ‘The effectiveness of using incorrect examples to support learning about decimal magnitude’, Learning and Instruction 22(3), 206–214. https://doi.org/10.1016/j.learninstruc.2011.11.001

Durkin, K. & Rittle-Johnson, B., 2015, ‘Diagnosing misconceptions: Revealing changing decimal fraction knowledge’, Learning and Instruction 37, 21–29. https://doi.org/10.1016/j.learninstruc.2014.08.003

Empson, S., Levi, L. & Carpenter, T., 2011, ‘The algebraic nature of fractions: Developing relational thinking in elementary school’, in J. Cai & E. Knuth (eds.), Early algebraization, pp. 409–428, Springer, Berlin, Heidelberg.

Fuchs, L., Malone, A., Schumacher, R., Namkung, J. & Wang, A., 2017, ‘Fraction intervention for students with mathematics difficulties: Lessons learned from five randomized controlled trials’, Journal of Learning Disabilities 50(6), 631–639. https://doi.org/10.1177/0022219416677249

Hansen, N., Jordan, N., Fernandez, E., Siegler, R., Fuchs, L., Gersten, R. et al., 2015, ‘General and math-specific predictors of sixth-graders’ knowledge of fractions’, Cognitive Development 35, 34–49. https://doi.org/10.1016/j.cogdev.2015.02.001

Irwin, K., 2001, ‘Using everyday knowledge of decimals to enhance understanding’, Journal for Research in Mathematics Education 32(4), 399–420. https://doi.org/10.2307/749701

Kerslake, D., 1986, Fractions: Children’s strategies and errors. A report of the strategies and errors in secondary mathematics project, NFER-NELSON, Windsor.

Kieren, T., 1976, ‘On the mathematical, cognitive, and instructional foundations of rational numbers’, in R. Lesh (ed.), Number and measurement: Papers from a research workshop, pp. 101–144, ERIC/SMEAC, Columbus, OH.

Lamon, S., 2005, Teaching fractions and ratios for understanding. Essential content knowledge and instructional strategies for teachers, Routledge, New York, NY.

Nesher, P., 1987, ‘Towards an instructional theory: The role of student’s misconceptions’, For the Learning of Mathematics 7(3), 33–40.

Pienaar, E., 2014, ‘Learning about and understanding fractions and their role in the high school curriculum’, Unpublished research report for Master of Education in Curriculum Studies (Mathematics Education), Stellenbosch University.

Resnick, I., Jordan, N., Hansen, N., Rajan, V., Rodrigues, J., Siegler, R. et al., 2016a, ‘Developmental growth trajectories in understanding of fraction magnitude from fourth through sixth grade’, Developmental Psychology 52(5), 746–757. https://doi.org/10.1037/dev0000102

Resnick, I., Newcombe, N. & Shipley, T., 2016b, ‘Dealing with big numbers: Representation and understanding of magnitudes outside of human experience’, Cognitive Science 41(4), 1020–1041. https://doi.org/10.1111/cogs.12388

Ryan, J. & Williams, J., 2007, Children’s mathematics 4–15: Learning from errors and misconceptions, Open University Press, Buckingham.

Siegler, R., Carpenter, T., Fennell, F., Geary, D., Lewis, J., Okamoto, Y. et al., 2010, Developing effective fractions instruction for kindergarten through 8th grade: A practice guide (NCEE #2010-4039), viewed 09 February 2022, from https://whatworks.ed.gov/publications/practiceguides.

Siegler, R., Duncan, G., Davis-Kean, P., Duckworth, K., Claessens, A., Engel, M. et al., 2012, ‘Early predictors of high school mathematics achievement’, Psychological Science 23(7), 691–697. https://doi.org/10.1177/0956797612440101

Steffe, L., 1992, ‘Schemes of action and operation involving composite units’, Learning and Individual Differences 4(3), 259–309. https://doi.org/10.1016/1041-6080(92)90005-Y

Steinle, V. & Stacey, K., 2004, ‘A longitudinal study of students’ understanding of decimal notation: An overview and refined results’, in I. Putt, R. Faragher & M. Mclean (eds.), Mathematics for the third millennium, towards 2010 – Proceedings of the 27th annual conference of the mathematics education research group of Australasia, pp. 541–548, MERGA, Townsville.

Torbeyns, J., Schneider, M., Xin, Z. & Siegler, R., 2015, ‘Bridging the gap: Fraction understanding is central to mathematics achievement in students from three different continents’, Learning and Instruction 37, 5–13. https://doi.org/10.1016/j.learninstruc.2014.03.002

Ubah, I. & Bansilal, S., 2018, ‘Pre-service primary mathematics teachers’ understanding of fractions: An action-process-object-schema perspective’, South African Journal of Childhood Education 8(2), a539. https://doi.org/10.4102/sajce.v8i2.539

Watanabe, T., 2011, ‘Shiki: A critical foundation for school algebra in Japanese elementary school mathematics’, in J. Cai & E. Knuth (eds.), Early algebraization, pp. 109–214, Springer, Berlin, Heidelberg.

Footnote

1. The authors use lookalike items, meaning that the items in the article reflect the key features of the actual test items with minor differences such as different numbers and letters.


