About the Author(s)

Jeanette K. Ramollo
Department of Primary Education, Faculty of Humanities, Tshwane University of Technology, Pretoria, South Africa

Anil Kanjee
Department of Primary Education, Faculty of Humanities, Tshwane University of Technology, Pretoria, South Africa


Ramollo, J.K. & Kanjee, A., 2023, ‘Supporting teachers to develop formative assessment knowledge and skills in no-fee schools’, South African Journal of Childhood Education 13(1), a1247. https://doi.org/10.4102/sajce.v13i1.1247

Original Research

Supporting teachers to develop formative assessment knowledge and skills in no-fee schools

Jeanette K. Ramollo, Anil Kanjee

Received: 26 July 2022; Accepted: 09 Nov. 2022; Published: 27 Jan. 2023

Copyright: © 2023. The Author(s). Licensee: AOSIS.
This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.


Background: Formative assessment has been reported to improve learners’ learning in affluent contexts. However, very few studies have reported the impact of formative assessment on teachers’ knowledge and understanding in no-fee public schools located in a low socio-economic context.

Aim: This article investigates the impact of the Assessment for Learning Capacity Development Programme (AfL CDP) on teachers’ formative assessment knowledge and understanding pertaining to the five formative assessment strategies: learning intentions and success criteria, questioning, feedback, peer and self-assessment.

Setting: This study was conducted as part of the Assessment for Learning (AfL) in Africa project in one Gauteng district involving 20 Grade 3 teachers from six no-fee public schools.

Methods: Teachers in this study participated in the AfL CDP, implemented using the reflect, mediate, acquire and adapt, plan, prepare, present, support (ReMAPS) intervention framework. Baseline and endline data were collected using the formative assessment reflection exercises (FARE) before and after the AfL CDP, while t-tests were used to determine differences in performance.

Results: The results revealed significant improvements in teacher formative assessment knowledge and understanding across all five strategies.

Conclusion: The ReMAPS intervention framework, applied in the AfL CDP, proved successful in supporting teachers to improve their formative assessment knowledge and understanding, even when implemented in challenging contexts, and provides a viable, practical model for the implementation of the AfL pedagogical strategy by the Department of Education.

Contribution: This study adds to the body of knowledge by providing research-based findings about how an AfL capacity development programme implemented in a challenging context in South Africa, benefited teachers’ pedagogical knowledge and understanding.

Keywords: formative assessment; professional development programmes; ReMAPS intervention framework; no-fee schools; learning intentions and success criteria; questioning; feedback; peer and self-assessment.


For effective learning to take place, research indicates that teachers must be able to effectively engage with learners and apply relevant pedagogical practices to determine what learners have learned and to identify and address their learning gaps (Franklin & Harrington 2019:2; Pedler, Hudson & Yeigh 2020:51; Wiliam & Thompson 2007:4). In South African schools, however, two key challenges must first be addressed. Firstly, most teachers do not possess the requisite knowledge and skills to address the learning needs of their learners, nor are they able to effectively use assessment data to identify learning gaps. Secondly, guidance and assistance for teachers are limited, especially as most district officials are not able to provide the specific support that teachers require to improve learning and teaching in schools. Moreover, there is a dearth of information regarding challenges in capacity, cost and time for implementing large-scale intervention programmes in South Africa that focus on improving teachers’ pedagogical practice.

In addressing these challenges, researchers from several institutions (Tshwane University of Technology, Cape Peninsula University of Technology, Aga Khan University in Tanzania and the University of Oxford) were awarded funding from the Economic and Social Research Council (ESRC) to implement the Assessment for Learning in Africa (AFLA) project in three sites: Cape Winelands (South Africa), Dar es Salaam (Tanzania) and Tshwane (South Africa). The primary objective of the project was to understand the conditions under which the Assessment for Learning (AfL) approach can be applied and sustained in challenging contexts to enhance learning for all, especially for the poor and marginalised. Titled ‘Assessment for learning in Africa: Improving pedagogy and assessment for numeracy in foundation years’, the project addressed the key research question: How can teaching quality be improved in challenging contexts in South Africa and Tanzania, as measured by an increase in Foundation Phase numeracy learning outcomes? In this article, we focus on findings emanating from the project implemented in Tshwane.

The rationale for the AFLA project emanated from the researchers’ experiences and findings from previous projects implemented within the AfL Niche Area. Since the establishment of the Niche Area in 2012, a number of projects and studies were undertaken to review assessment policies impacting the education system; to explore pre- and in-service teachers’, school leaders’ and education officials’ knowledge and understanding of assessment; to determine the assessment practices of key role-players across the different levels of the education system (classroom, school, district and provincial or national); and to identify relevant professional development models or approaches for supporting students and practising teachers to enhance their use of formative assessment to improve their pedagogical practices (Kanjee 2018; Kanjee & Mthembu 2015; Kanjee & Sayed 2013; Molefe 2015). Several findings from these studies indicated that the provision of relevant capacity development programmes can lead to improved teacher knowledge and effective use of formative assessment practices during lessons; that these practices impact how teachers plan, prepare and present their lessons; and that the effective use of formative assessment enhances learner engagement (Kanjee 2018, 2020; Kanjee & Mthembu 2015; Kanjee & White 2014). While these findings corroborated findings from similar projects conducted in other countries regarding the complexity of implementing formative assessment in schools (Andersson & Palm 2018; Fangxi et al. 2014; Van der Nest, Long & Engelbrecht 2018), a specific challenge identified in the South African context was the large disparity in implementation between teachers in the lower quintile schools and teachers in the higher quintile schools (Kanjee 2020).

In order to better understand the challenges faced by teachers in the lower quintile schools, the AFLA research consortium was established with the Oxford University Centre for Educational Assessment (OUCEA); the Centre for International Teacher Education (CITE) at the Cape Peninsula University of Technology; and the Aga Khan University in Tanzania. This consortium was successful in obtaining a 3-year ESRC grant that was awarded in May 2016.

Conceptual framework

The literature broadly agrees that formative assessment is a pedagogical strategy that improves learning (Andersson & Palm 2018; Bennett 2011; Black & Wiliam 1998). Bennett (2011) posits that there are contrasting views on what formative assessment is, and these contradictions could cause tension in the implementation of formative assessment. For clarity, this article adopts Black and Wiliam’s (2009) definition, according to which classroom practice:

[I]s formative to the extent that evidence about student achievement is elicited, interpreted, and used by teachers, learners, or their peers, to make decisions about the next steps in instruction that are likely to be better, or better founded, than the decisions they would have taken in the absence of the evidence that was elicited. (p. 9)

This definition highlights the need for learners to take charge of their learning and not rely on teachers to be the sole knowledge providers. Moreover, the definition aligns with Wiliam and Thompson’s (2007) formative assessment framework, on which the workshop content was based. According to Wiliam and Thompson (2007), teachers should help learners respond to the following questions: (1) ‘Where am I going?’, (2) ‘How am I right now?’ and (3) ‘How to get there?’

‘Where am I going?’

To respond to the first question, teachers should share and clarify the learning intentions (LIs) and success criteria (SC) with learners. Formative assessment strategy 1 (FAS1) informs learners what to expect during the lesson (Fisher, Frey & Hattie 2016; Popham 2008). Hattie (2009) describes LIs as what learners should know, understand and do in the teaching and learning activity. The SC, on the other hand, are viewed as mechanisms or steps that help learners to observe their progress towards attaining the intended learning (Fisher et al. 2016; Popham 2008). Sharing LIs and SC is therefore a pedagogical practice that improves learner engagement and learning in the teaching and learning activity. Thus, Moss, Brookhart and Long (2011:66) posit that learners who do not know and cannot see what they are going to learn are ‘flying blind’, following only what the teacher directs them to do rather than focusing on the learning itself. Formative assessment knowledge and understanding are the object of this study.

The AFLA professional development programme was intended to mediate teachers’ knowledge and understanding and utilised different mediating tools and artefacts in this regard. Swaffield (2009:4) cautioned that limited training could lead to the implementation of LIs and SC in a ‘procedural and ritualistic manner’ that does not improve learners’ learning. For this reason, this study investigated how the AFLA project enhanced teachers’ knowledge and skills to develop and integrate LIs and SC in their everyday teaching practices.

‘How am I right now?’

To address the second question, eliciting evidence of learning (FAS2), teachers use questioning strategies and techniques, as well as classroom discussions, to equitably engage all learners in the teaching and learning activity. For example, in this study, various techniques were introduced, such as: wait time; miniboards; traffic lights or robot cards; no hands up, except to ask a question; name sticks; basketball discussion; thumbs up or thumbs down; entrance and exit tickets; pair and share; and phone a friend to improve learner engagement (Harris & Brown 2013; Ingram & Elliott 2016; Wiliam 2011; Wylie & Lyon 2015).

The use of wait time as a formative assessment technique enhances teachers’ pedagogical practices and improves the substance of the questions and questioning techniques they utilise to improve teacher–learner engagement (Wiliam 2011; Wylie & Lyon 2015). Literature on wait time highlights that mathematics teaching and learning improves as learners are given time to think and process information based on the questions, thus enhancing the quality of responses provided and engagement within the learning community (Ingram & Elliott 2016; Jamil, Larsen & Hamre 2018; Wiliam 2011).

‘How to get there?’

The third question focused on feedback that improves learning (FAS3) (Hattie & Timperley 2007). Feedback is an important aspect of assessment, as it mediates learning and is ingrained in the teaching and learning process (Black & Wiliam 2009; Hattie & Gan 2011; Hattie & Timperley 2007). Feedback is defined as ‘information on one’s performance or knowledge offered by an agent (e.g. a teacher, a peer, a book, a parent, oneself, or experience)’ (Hattie & Timperley 2007:81). More importantly, feedback should be linked to LIs and SC and tell learners where they are in their learning and how to move forward (Brookhart 2008; Clarke 2014; Hattie & Timperley 2007).

There are different modes of providing feedback, including oral and written feedback (Brookhart 2008; Keeley & Tobey 2011; Wiliam 2011). Various authors report that written feedback comprises scores, grades and written comments (Black & Wiliam 1998; Brookhart 2011; Charteris & Smardon 2015; Wiliam 2011), while Mkhwanazi (2014) discovered that teachers lacked basic feedback knowledge. Teachers in her study utilised ticks and stickers for correct work and crosses for incorrect work, as well as markings and simple evaluative comments like ‘good’, ‘very good’ and ‘you are a star’. According to Tunstall and Gipps (1996), teachers in the early grades utilise stickers to motivate learners in their learning activities. However, this type of feedback is reported not to enhance learners’ learning. Hence, this study investigated Grade 3 teachers’ knowledge and understanding of the different types of feedback; for example, teachers had to distinguish between motivational feedback and effective descriptive feedback.

Peer assessment

Other strategies involved peer assessment (FAS4) (Clarke 2014; Mundia 2012; Ndoye 2017; Wiliam 2011, 2014) and self-assessment (FAS5). According to Ratminingsih, Artini and Padmadewi (2017), peer assessments help learners collaborate and collectively develop acceptable meaning, explanations and strategies and motivate one another to demonstrate, defend and explore their thinking. Learning becomes a social activity and improves learner engagement in the teaching and learning activity (Wiliam 2011). Various studies investigated the use of peer assessment and found that learners learn better from their peers (Double, McGrane & Hopfenbeck 2019; Webb et al. 2017; Wiliam 2011). However, Gielen and her colleagues question the quality of peer assessment feedback. They highlight that learners’ ‘judgment or advice may be partially correct, fully incorrect or misleading’ (2010:305). Thus, professional development programmes should mediate teachers’ knowledge and understanding to use this strategy in the teaching and learning activity.


Regarding self-assessment (FAS5), Ndoye (2017) suggests that learners should be activated as owners of their learning. There is a clear need for learners to be active participants in their learning. Hence, Black and Wiliam (1998:21) highlight that learners should not be viewed as passive recipients of information and that teachers should understand how learners learn. Furthermore, self-assessment should be linked to LIs and SC and introduced to learners appropriately to improve learning (Harris & Brown 2013).

Intervention framework

The Assessment for Learning Capacity Development Programme (AfL CDP) was implemented using the reflect, mediate, acquire and adapt, plan, prepare, present, support (ReMAPS) framework to mediate and support Grade 3 teachers in enhancing their formative assessment knowledge and understanding (Figure 1). The ReMAPS framework is grounded in Vygotsky’s (1978) sociocultural theory. Núñez (2009) and Vygotsky (1978) posit that knowledge and understanding are mediated through cultural artefacts, signs, symbols and tools in the environment, especially when knowledge is shared between and within the community, guided by a more knowledgeable member (Vygotsky 1978). Tadesse et al. (2021) and Vygotsky (1978) explained that human behaviour is transformed when mediated and connected to the individual’s social-cultural environment. This notion suggests that the Capacity Development Programme (CDP) should consider participants’ cultural context and be modified accordingly. In this study, the programme facilitators were the more knowledgeable others. During the workshops and professional learning communities, facilitators mediated and scaffolded teachers’ formative assessment knowledge and contextualised the CDP to cater to participants’ experiences and needs so that the programme could be implemented effectively.

FIGURE 1: ReMAPS framework used to implement the assessment for learning in Africa project.

The implementation of the ReMAPS framework is based on a cyclical approach that required participants to first engage with facilitators and colleagues as they develop their formative assessment knowledge and understanding in a series of workshops. In between workshops, participants are required to apply their new knowledge and understanding in practical school settings to develop their pedagogical skills, reflect and record these experiences. Thereafter, participants attend the next workshop, which begins with sharing and reviewing their experiences with colleagues and facilitators before new knowledge and skills are introduced by the facilitators. The next step of the cycle then begins and continues over the duration of the intervention. In practice, the ReMAPS framework is composed of the following stages: (1) reflect on prior and current knowledge and experience; (2) mediate new knowledge and skills introduced and facilitated by the programme facilitators; (3) acquire and adapt new knowledge and skills to participants’ context; (4a) plan and (4b) prepare for implementation; (4c) present and apply new knowledge and skills during participants’ daily teaching practice (where possible, participants were also required to make presentations during workshops); and (5) support through engagements with various stakeholders in professional learning communities, colleagues, programme facilitators, heads of departments (HoDs) and district officials to enhance the effective implementation of new FA knowledge and skills during the teaching and learning activity.

The AfL CDP was facilitated through 6–7-h workshops conducted on Saturdays over 8 months. In addition, four 3-h ‘support’ sessions were presented between workshops after school hours. Table 1 shows the dates and the content covered in the project. The workshops’ content was based on the formative assessment strategies and techniques proposed by Wiliam and Thompson (2007). Participants were provided with relevant materials aligned to formative assessment and practical activities that accounted for the specifications of the national curriculum and assessment policy statements (for more details, see Kanjee & Bhana 2020). Participants completed the activities during or after workshops or during their professional learning community meetings and were required to complete reflection exercises regarding their experiences and challenges. All the workshops were interactive, and participants collaborated and played an active role in their learning (Darling-Hammond 2006; Desimone & Pak 2017). Andersson (2015) notes that participants’ collaboration improves learning. In this study, participants created professional learning communities and followed the ReMAPS framework stages to incorporate formative assessment in their everyday teaching and learning activity.

TABLE 1: The Assessment for Learning in Africa professional development workshop dates and contents.

During the workshops, programme facilitators modelled all the formative assessment strategies and techniques introduced, linking the theoretical and practical aspects of the knowledge and skills being developed. Furthermore, participants watched, and were provided with, videos demonstrating the different formative assessment strategies and techniques for support and use even after the professional development programme. After the programme facilitators’ presentations and discussions about the new formative assessment strategies and techniques, participants planned their next lesson for classroom implementation. In so doing, participants collaborated, shared ideas and eased the pressure of planning lessons individually. After implementing their newly acquired skills between workshops, that is, over a 3-week period, participants engaged and shared their experiences with the newly acquired skill in the next workshop. Programme facilitators also collected the portfolios to monitor and record participants’ engagement with the material provided.

Research design and methods

The evaluation framework for this study was based on Pawson’s (2006) realist approach, which views evaluation as a process that identifies how the evaluated programme works and how it expects to achieve its objectives. Thus, the methodology for this study was underpinned by a predominantly interpretivist philosophy in which a mixed-methods approach was applied to obtain a blend of qualitative and quantitative data. This article is based primarily on data collected using the baseline and endline reflection exercises, which comprised both open- and closed-ended questions. The reflection exercise sourced participants’ background information and their views about assessment and formative assessment, including sharing LIs and SC, questioning, feedback, peer and self-assessment, and techniques to improve learner engagement during the teaching and learning activity. Baseline reflection data were collected before the workshops began to establish participants’ formative assessment knowledge and understanding. The endline reflection was administered after the completion of the workshops to record any transformation in participants’ formative assessment knowledge and understanding.


The district purposefully selected eight schools and 22 Foundation Phase teachers, including one HoD from each school. The schools were chosen because they were the only schools in the circuit that had not participated in any formative assessment professional development programmes. The schools were mainly quintile 1 schools located in a low socio-economic context. Two schools withdrew immediately after the launch of the capacity development programme, and only 20 teachers participated in the baseline reflection exercise. The schools cited various reasons for withdrawing, including conflicting interests and competing workshops. The endline also included 10 participants who had completed the baseline reflection exercise. All participants were female, with the majority indicating teaching experience ranging from 20 to 27 years, while six had at least 10 years of teaching experience.

Data collection instrument

Data used in this study were collected through baseline and endline formative assessment reflection exercises (FARE) that comprised open- and closed-ended questions. The reflection exercise had four sections and sourced teachers’ (1) background information, (2) views on how children learn, (3) formative assessment knowledge and (4) formative classroom assessment: classroom practices. The quantitative section of the FARE comprised 25 multiple-choice question (MCQ) items that assessed participants’ knowledge and understanding of the five formative assessment strategies and techniques. For this study, only results from the quantitative section are reported.

Data analysis

Analysis was conducted using G*Power (Heinrich Heine University Düsseldorf, Düsseldorf, Germany) (Faul et al. 2009) to determine the minimum sample required for a significance criterion of α = 0.05, power = 0.80 and an effect size of 0.60. The power analysis results indicate that the obtained sample size of N = 20 is more than adequate to test the study hypothesis using a t-test. Given the relatively small sample size, we also tested the normality of the results to identify the appropriate statistical test. The Shapiro–Wilk test did not show evidence of non-normality (W = 0.965, p = 0.242), supporting the use of a parametric test. The responses to the MCQs were analysed using the JASP programme (2021) (University of Amsterdam, Amsterdam, Netherlands). We applied t-tests to compare differences in mean scores between teachers’ baseline and endline formative assessment knowledge and understanding (Creswell & Plano-Clark 2017).
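The analysis pipeline described above (a normality check followed by a t-test on baseline versus endline scores, with an effect size) can be sketched in open-source terms. This is an illustrative reconstruction using synthetic scores, not the study’s actual data, and `scipy` stands in for the JASP and G*Power tools used in the study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Hypothetical percentage scores for 20 teachers at baseline and endline
baseline = rng.normal(40.5, 12, 20)
endline = rng.normal(55.2, 12, 20)

# Shapiro–Wilk test for normality on the pooled scores
w, p_norm = stats.shapiro(np.concatenate([baseline, endline]))

# Independent-samples t-test (df = n1 + n2 - 2 = 38, matching t(38) reporting)
t, p = stats.ttest_ind(endline, baseline)

# Cohen's d: mean difference divided by the pooled standard deviation
pooled_sd = np.sqrt((baseline.var(ddof=1) + endline.var(ddof=1)) / 2)
d = (endline.mean() - baseline.mean()) / pooled_sd

print(f"W = {w:.3f} (p = {p_norm:.3f}), t(38) = {t:.2f} (p = {p:.4f}), d = {d:.2f}")
```

If the Shapiro–Wilk p-value were below 0.05, a non-parametric alternative such as the Mann–Whitney U test (`stats.mannwhitneyu`) would be the appropriate substitute.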


The primary purpose of this analysis was to determine the impact of the AfL CDP on teachers’ knowledge and understanding. The findings are presented according to Wiliam and Thompson’s (2007) five formative assessment strategies, as well as the formative assessment techniques.

The overall impact of assessment for learning capacity development programme

Table 2 presents the overall results showing the change in teachers’ formative assessment knowledge and understanding; subsequently, the results for each strategy are presented. The overall baseline (M = 40.5) and endline (M = 55.2) mean scores indicate a significant improvement (mean difference = 14.64, standard deviation [SD] = 3.32) in teachers’ formative assessment knowledge and understanding, t(38) = 4.42, p < 0.001. The Cohen’s d value of 1.397 indicates a large effect size (Cohen 1988:185) on teachers’ overall knowledge and understanding.
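As a consistency check, the reported overall statistics can be reproduced from the summary values alone, under the assumption that the quoted dispersion of 3.32 is the standard error of the mean difference for two independent groups of 20 (the interpretation under which t(38) = 4.42 and d = 1.397 both line up):

```python
import math

mean_diff = 14.64   # reported endline-minus-baseline difference
se = 3.32           # reported dispersion, treated here as the standard error
n1 = n2 = 20        # teachers at baseline and at endline

t = mean_diff / se                         # ~4.41, close to the reported t(38) = 4.42
pooled_sd = se / math.sqrt(1/n1 + 1/n2)    # back out the pooled SD, ~10.5
d = mean_diff / pooled_sd                  # ~1.39, close to the reported d = 1.397

print(f"t(38) = {t:.2f}, pooled SD = {pooled_sd:.1f}, d = {d:.2f}")
```

The same arithmetic applies to each strategy-level comparison in the sections that follow.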

TABLE 2: Baseline and endline paired samples t-test (95% confidence levels).
Formative assessment strategy 1: Sharing learning intentions and success criteria

The FAS1 results show a significant improvement in teachers’ knowledge and understanding (mean difference = 12.5), with endline scores (M = 64.2) higher than baseline scores (M = 51.7, SD = 5.56), t(38) = 2.25, p = 0.015. The Cohen’s d value of 0.71 indicates a medium effect size (Cohen 1988:185) on teachers’ knowledge and understanding of sharing learning intentions and success criteria. Figure 2 reveals that 25% (n = 5) of the teachers scored above 70% in the endline, compared to 10% (n = 2) in the baseline. The findings further show that 40% (n = 8) of the teachers scored between 61% and 70% in the endline, compared to 35% (n = 7) in the baseline, followed by 30% (n = 6) in the endline and 20% (n = 2) in the baseline scoring between 41% and 50%. None of the teachers scored between 31% and 40% in the endline, compared to 25% (n = 5) in the baseline. In both the baseline and endline, none of the teachers scored between 21% and 30%, while 5% (n = 1) of endline scores, compared to 10% (n = 2) of baseline scores, were below 20%. This improved knowledge suggests that the AFLA professional development programme achieved the object of the activity system. Teachers scored high on the FAS1 questions, suggesting that by the end of the project they understood that they should inform learners what they are expected to learn and do during the lesson (Clarke 2014; Crossouard & Pryor 2008; Snape 2011; Wiliam 2010). Informing learners what they are expected to learn during the lesson is a first step towards achieving the activity theory outcome, namely improving learners’ learning.

FIGURE 2: Formative assessment strategy 1 frequency score distribution.

Formative assessment strategy 2: Questioning

Formative assessment strategy 2 results show a significant improvement in the endline mean (M = 66.3) compared to the baseline mean (M = 51.3, SD = 6.48), t(38) = 2.32, p = 0.013. The Cohen’s d value of 0.732 indicates a medium effect size (Cohen 1988:185) on teachers’ knowledge and understanding of questioning in the teaching and learning activity. Figure 3 indicates that most of the teachers, 65% (n = 13), scored 71% and above in the endline, compared to 35% (n = 7) in the baseline. These findings suggest that the number of teachers who scored above 70% increased by 30% (n = 6) after the professional development programme. In addition, 25% (n = 5) of the teachers scored between 41% and 50% in the endline compared to 35% (n = 7) in the baseline. However, few teachers, 10% (n = 2), scored between 21% and 30% in the endline compared to 30% (n = 6) in the baseline, indicating that the number of teachers with low scores decreased as most teachers’ scores improved. These results revealed that after the professional development programme, teachers’ knowledge and understanding of FAS2 improved. Thus, teachers could mediate FAS2 to elicit evidence of learning through classroom discussion and questioning techniques to improve thinking in the teaching and learning activity (Clarke 2014; Wiliam & Thompson 2007).

FIGURE 3: Formative assessment strategy 2 frequency score distribution.

Formative assessment strategy 3: Feedback

The results for FAS3 also showed a significant increase in mean scores (mean difference = 8.89) after the capacity development programme (M = 32.8) compared to before (M = 23.9, SD = 5.15), t(38) = 1.73, p = 0.046. The Cohen’s d value of 0.546 indicates a medium effect size (Cohen 1988:185) on teachers’ feedback knowledge and understanding. However, it should be noted that the mean scores for both the baseline and endline were relatively low. Figure 4 shows that only 5% (n = 1) of the teachers scored above 71% in the endline, compared to none in the baseline. None of the teachers scored between 61% and 70%. The endline scores further show that none of the teachers scored between 51% and 60%, compared to 5% (n = 1) in the baseline. In addition, 20% (n = 4) of the teachers’ scores ranged between 41% and 50% in both the baseline and endline tests, while most of the teachers, 40% (n = 8), scored between 31% and 40% in the endline compared to 5% (n = 1) in the baseline test. Even though this score category was low, there was an increase of 35%. Moreover, the number of teachers who scored below 20% decreased to 15% (n = 3) in the endline, compared to 40% (n = 8) in the baseline test. While the overall results indicate an improvement, these findings remain concerning, given the limited number of teachers who demonstrated a higher level of knowledge and understanding of this strategy.

FIGURE 4: Formative assessment strategy 3 frequency score distribution.

In practice, this finding indicates only a minimal improvement in teachers’ knowledge of using feedback as a mediating tool. Feedback is an important mediating tool that provides learners with information on how to move forward in their learning. Hattie and Timperley (2007:85) suggest that teachers should respond to the following questions: ‘Where am I going? How am I going? and Where to next?’ They argue that teachers who respond to these questions can provide learners with feedback that improves learning (Hattie & Timperley 2007:85). The limited improvement in knowledge of FAS3 therefore suggests that teachers may struggle to mediate this specific artefact effectively. Consequently, they may be unable to attain the object of the activity system, namely to provide learners with descriptive feedback and enhanced learner engagement in the teaching and learning activity.

Formative assessment strategies 4 and 5: Peer and self-assessment

The FAS4 and FAS5 scores show the most significant knowledge increase (mean difference = 20) after the capacity development programme (M = 51.3) compared to before (M = 31.3, SD = 4.92), t(38) = 4.07, p < 0.001. The Cohen’s d value of 1.286 indicates a large effect size (Cohen 1988:185) on teachers’ knowledge and understanding of peer and self-assessment in the teaching and learning activity. Figure 5 shows that 20% (n = 4) of the teachers scored above 71% in the endline, compared to none in the baseline test. Most of the teachers, 65% (n = 13), scored between 41% and 50% in the endline FARE, compared to 35% (n = 7) in the baseline. Few teachers, 15% (n = 3), scored between 21% and 30% in the endline, compared to 55% (n = 11) in the baseline test. None of the teachers scored below 20% in the endline, compared to 10% (n = 2) in the baseline test. These results show that the AFLA professional development programme changed teachers’ knowledge and understanding of activating learners as instructional resources for one another (Wiliam 2014) and of self-assessment (Ndoye 2017). These findings suggest that teachers were able to effectively mediate this specific artefact, namely FAS4 and FAS5, and were thus well placed to attain the object of the activity system, that is, improved learner engagement in the teaching and learning activity.

FIGURE 5: Formative assessment strategies 4 and 5 frequency score distribution.

Formative assessment techniques

The results for formative assessment techniques as mediating tools and artefacts used to ensure equitable learner engagement showed a significant improvement (M = 23%) between the endline scores (M = 79) and the baseline scores (M = 56, SD = 6.17), t(38) = 4.07, p < 0.001. The Cohen’s d value of 1.179 indicates a large effect size (Cohen 1988:185) on the teachers’ knowledge and understanding of the different FA techniques utilised in this study. The score categories in Figure 6 indicate that most of the teachers, 75% (n = 15), scored above 70% in the endline test, compared to 30% (n = 6) in the baseline test. Furthermore, 15% (n = 3) scored between 51% and 60% in the endline compared to 30% (n = 6) in the baseline test. The remaining 10% (n = 2) scored between 31% and 40% in the endline, compared to 30% (n = 6) in the baseline test, while 10% (n = 2) of the teachers scored the lowest in the baseline FARE. These findings imply that teachers in this study acquired the necessary knowledge to use different FA techniques to improve learner engagement in the teaching and learning activity. With this knowledge, they could achieve the outcome of this activity system, namely enhanced equitable learner engagement during lessons (Harris & Brown 2013; Ingram & Elliott 2016; Wiliam 2011; Wylie & Lyon 2015).

FIGURE 6: Formative assessment techniques frequency score distribution.
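As an illustration only (not part of the original analysis), the reported effect sizes and t statistics are related by the standard formulas d = (M_endline − M_baseline) / SD_pooled and, for two equal groups of size n, t = d√(n/2). A minimal Python sketch, assuming the two groups of 20 teachers reported in this study and using the FAS4/FAS5 figures above as a worked check:

```python
import math

def cohens_d(mean_post, mean_pre, sd_pooled):
    # Cohen's d: standardised difference between two group means
    return (mean_post - mean_pre) / sd_pooled

def t_from_d(d, n_per_group):
    # For two independent groups of equal size n, t = d * sqrt(n / 2)
    return d * math.sqrt(n_per_group / 2)

# FAS4/FAS5 example: the pooled SD implied by the reported d of 1.286
sd_pooled = (51.3 - 31.3) / 1.286
d = cohens_d(51.3, 31.3, sd_pooled)
t = t_from_d(d, 20)
print(round(d, 3), round(t, 2))  # 1.286 4.07, matching the reported t(38) = 4.07
```

This back-of-the-envelope check simply confirms that a Cohen’s d of 1.286 with 20 teachers per measurement occasion corresponds to the reported t value of 4.07.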


The results show that, after participating in the AfL CDP, Grade 3 teachers’ formative assessment knowledge and understanding improved significantly across all strategies as well as the techniques. The effective mediation of participants’ new knowledge and understanding, along with the use of relevant materials that called for the integration of theory and practice advocated in the ReMAPS framework, motivated participants, despite most having more than 20 years of teaching experience, to attend the workshops, which were held for 7 h on Saturdays, and to improve their pedagogical practices. These results are consistent with the findings of Andersson and Palm (2018), who reported that regular ongoing support, rather than a once-off workshop, motivated teachers to attend and to successfully implement formative assessment in the classroom.

It was interesting to note that teachers initially struggled to develop their knowledge and understanding of FAS1. This finding was not unexpected, as Didau (2015) posits that teachers with limited assessment knowledge often struggle to develop LIs and SC. Furthermore, Crichton and McDaid (2016) found that teachers tend to treat LIs as informing learners of what the lesson is about, instead of what learners are expected to know, understand and do during the lesson. To address this challenge, workshop facilitators repeated the same topic in the next session. In addition, each workshop included a review of the strategies introduced in previous sessions in order to consolidate participants’ knowledge and understanding. Hence, it was not surprising that teachers’ understanding and knowledge of FAS1 were transformed after the professional development programme in this study.

Regarding FAS2, the results showed that teachers also acquired the knowledge and skills to facilitate effective classroom discussions during the lesson. The results revealed that teachers were aware that questions should be linked to the LIs and SC (Wiggins & McTighe 2011). In addition, they were aware that questions should be directed to individuals, small groups or the whole class (Beccles et al. 2016). By contrast, in Kanjee’s (2020) study, most teachers used traditional questioning methods, posing questions to the whole class and sourcing responses only from the few learners who raised their hands.

Concerning FAS3, the results also indicate that the AfL CDP positively impacted teachers’ knowledge and understanding. However, it remains unknown whether teachers in this study who demonstrated some improvement will translate their knowledge into practice and provide learners with descriptive (written) feedback that informs them on how to improve their learning (Brookhart 2011; Shrum 2016). In a related study, Kanjee (2020) found a prevalence of evaluative and procedural feedback, and limited descriptive feedback, for both high-performing and low-performing learners.

It should be noted that the number of questions allotted to peer and self-assessment (FAS4 and FAS5) was limited. Nevertheless, the results revealed that teachers made significant knowledge gains. Given this limitation, the question remains whether teachers will implement FAS4 and FAS5 successfully in practice. Based on the context in which peer and self-assessment were observed, Kanjee (2020) asked a similar question and reported that these strategies were implemented in a ritualistic manner, raising concerns about whether learners would benefit from the process.

We were not surprised that a large positive effect size was observed for the formative assessment techniques. The improvement in teachers’ knowledge and understanding of the techniques could have been influenced by their immediate and practical application in engaging more learners in the lesson. For example, with name sticks, the teacher writes learners’ names on sticks and places them in a container from which he or she randomly draws one. When a name stick is drawn, the teacher calls out the name, and the chosen learner is required to respond to a question or instruction (e.g. complete the exercise on the board) (Wiliam 2011). Andersson and Palm (2018) found similar results: teachers in their study used different techniques to assess learners and were able to change their instructional strategies to suit learners’ needs. Therefore, in practice, teachers in this study could use these techniques to engage more learners and amend their lessons to cater to all learners’ learning needs.


Formative assessment professional development highlights the relevance of formative assessment as a mediating tool and artefact and ensures that all learners are equitably engaged during the lesson. The AfL CDP implemented as part of the AFLA project followed all the critical processes for successfully supporting teachers functioning in low-resource environments to enhance their formative assessment knowledge and understanding (Andersson 2015; Brookhart, Moss & Long 2010; Darling-Hammond 2006; Desimone & Pak 2017; Guskey 2002), particularly through the use of the ReMAPS framework (Kanjee 2018). A key benefit of the AfL CDP was that it was undertaken over a long period rather than as a once-off workshop. In addition, programme facilitators modelled all the formative assessment strategies and techniques during the workshops, while the materials provided focused on both theoretical and practical examples based on the South African context (Kanjee & Bhana 2020). Moreover, the videos provided were linked to each formative assessment practice and were also used to model and demonstrate how to implement the mediating tools and artefacts in class and to enhance teachers’ knowledge and understanding. All the materials and resources in the professional development programme promoted self-directed learning. According to Sixel (2013:24), self-directed learning is critical, and professional development programmes should encourage teachers to explore, apply or change mediating tools according to their knowledge and experience for effective change to happen.

The ReMAPS framework used to implement the AfL CDP also required teachers to reflect and to be observed and supported in class. Teachers engaged actively with all the formative assessment mediating tools and artefacts (strategies and techniques). The collaboration between programme facilitators and participants as community members contributed significantly to the expansive learning of the teachers’ formative assessment knowledge and understanding. More importantly, as Opfer, Pedder and Lavicza (2011:446) suggest, transformation is determined by individuals’ beliefs, interests, motivations and social and historical contexts. This study considered teachers’ history and allowed them to contextualise the mediating tools and artefacts to suit their context. Thus, participants were motivated to attend workshops and additional sessions, knowing that they received support from programme facilitators, peers and their professional communities. In this respect, a key contribution of this study is the use of the ReMAPS framework for supporting the Department of Education to implement its AfL pedagogical strategy, which foregrounds the effective use of formative assessment by teachers, in all schools (DBE 2020a, 2020b). However, while the study demonstrated the potential impact of successfully implementing the AfL CDP, additional research is still required on the extent to which this new knowledge and these new skills are translated into practice, and on whether these practices will be sustained after the completion of the workshops. Most critically, further research should examine the extent to which teachers are able to apply their new formative assessment knowledge and skills to identify and address the specific learning needs of all learners, especially learners from poor and marginalised backgrounds.


Competing interests

The authors have declared that no competing interests exist.

Authors’ contributions

Both the authors contributed to the conceptualisation, data collection and writing of the manuscript.

Ethical considerations

Ethical clearance for the Assessment for Learning in Africa (AFLA) project was obtained from the Research and Ethics Committees at both the Tshwane University of Technology and the University of Oxford (ref. no. FCRE/PE/STF/2017/01). Participation was voluntary, and participants signed informed consent forms and were informed that they could leave the project at any time.

Funding information

This study was funded by a grant obtained from the Education Sciences Research Council (ESRC) (United Kingdom) and the Assessment for Learning Research Niche Area at the Tshwane University of Technology.

Data availability

The data that support the findings of this study are available on request from the corresponding author, J.R.


The views and opinions expressed in this article are those of the authors and do not necessarily reflect the official policy or position of any affiliated agency of the authors.


Andersson, C., 2015, ‘Professional development in formative assessment: Effects on teacher classroom practice and student achievement’, Doctoral dissertation, Umeå Universitet.

Andersson, C. & Palm, T., 2018, ‘Reasons for teachers’ successful development of a formative assessment practice through professional development a motivation perspective’, Assessment in Education: Principles, Policy & Practice 25(6), 576–597. https://doi.org/10.1080/0969594X.2018.1430685

Beccles, C., Ikeda, H., Kwaah, C.Y. & Otami, D.C., 2016, ‘Teacher response model for the management of student answers to teacher questions’, International Journal of Education, Learning and Development 4(10), 1–14.

Bennet, R.E., 2011, ‘Formative assessment: A critical review’, Assessment in Education: Principles, Policy & Practice 18(1), 5–25. https://doi.org/10.1080/0969594X.2010.513678

Black, P. & Wiliam, D., 1998, ‘Inside the Black Box: Raising standards through classroom assessment’, Phi Delta Kappan 80(2), 139–148.

Black, P. & Wiliam, D., 2009, ‘Developing theory of formative assessment’, Educational Assessment, Evaluation and Accountability 21(1), 5–31. https://doi.org/10.1007/s11092-008-9068-5

Brookhart, S.M., 2008, How to give effective feedback to your students, ASCD, Alexandria, VA.

Brookhart, S.M., 2011, ‘Educational assessment knowledge and skills for teachers’, Educational Measurement: Issues and Practice 30(1), 3–12. https://doi.org/10.1111/j.1745-3992.2010.00195.x

Brookhart, S.M., Moss, C.M. & Long, B.A., 2010, ‘Teacher inquiry into formative assessment practices in remedial reading classrooms’, Assessment in Education: Principles, Policy & Practice 17(1), 41–58. https://doi.org/10.1080/09695940903565545

Charteris, J. & Smardon, D., 2015, ‘Teacher agency and dialogic feedback: Using classroom data for practitioner inquiry’, Teaching and Teacher Education 50(1), 114–123. https://doi.org/10.1016/j.tate.2015.05.006

Clarke, S., 2014, Active learning through formative assessment, Hodder Education, London.

Cohen, J., 1988, Statistical power analysis for the behavioral sciences, Routledge Academic, New York, NY.

Creswell, J.W. & Plano-Clark, V.L., 2017, Designing & conducting mixed methods research, Sage, London.

Crichton, H. & McDaid, A., 2016, ‘Learning intentions and success criteria: Learners’ and teachers’ views’, The Curriculum Journal 27(2), 190–203. https://doi.org/10.1080/09585176.2015.1103278

Crossouard, B. & Pryor, J., 2008, ‘Becoming researchers: A sociocultural perspective on assessment, learning and the construction of identity in a professional doctorate’, Pedagogy, Culture & Society 16(3), 221–237. https://doi.org/10.1080/14681360802346614

Darling-Hammond, L., 2006, ‘Constructing 21st-century teacher education’, Journal of Teacher Education 57(3), 300–314. https://doi.org/10.1177/0022487105285962

Department of Basic Education (DBE), 2020a, National assessment circular 3 of 2020 implementation of formative assessment in the general education and training (get) band, viewed 29 September 2021, from https://irp-cdn.multiscreensite.com/c0cc1c10/files/uploaded/NA%20Circular%2003%20on%20Formative%20Assessment.pdf.

Department of Basic Education (DBE), 2020b, Teacher guidelines for implementing revised annual teaching plans (ATPs) teacher version, viewed 06 June 2021, from https://kfmulaudzi.files.wordpress.com/2020/07/teacher-guidelines-for-the-implementation-of-the-revised-atps-6-july.pdf.

Desimone, L. & Pak, K., 2017, ‘Instructional coaching as high-quality professional development’, Theory into Practice 56(1), 3–12. https://doi.org/10.1080/00405841.2016.1241947

Didau, D., 2015, What if everything you knew about education was wrong? Camarthen Crown House, Bancyfelin.

Double, K.S., McGrane, J.A. & Hopfenbeck, T.N., 2019, ‘The impact of peer assessment on academic performance: A meta-analysis of control group studies’, Educational Psychology Review 32, 481–509. https://doi.org/10.1007/s10648-019-09510-3

Fangxi, T., Teng, E., Tan, J. & Peng, Y.W., 2014, ‘Holistic assessment implementation in Singapore primary schools – Part II: Developing teacher assessment capacity to improve student learning’, Paper presented at the 40th Conference of the International Association for Educational Assessment, Singapore, pp. 1–8.

Faul, F., Erdfelder, E., Buchner, A. & Lang, A.-G., 2009, ‘Statistical power analyses using G*Power 3.1: Tests for correlation and regression analyses’, Behavior Research Methods 41(4), 1149–1160. https://doi.org/10.3758/BRM.41.4.1149

Fisher, D., Frey, N. & Hattie, J., 2016, Visible learning for literacy, grades K-12: Implementing the practices that work best to accelerate student learning, San Diego State University, San Diego, CA.

Franklin, H. & Harrington, I., 2019, ‘A review into effective classroom management and strategies for student engagement: Teacher and student roles in today’s classrooms’, Journal of Education and Training Studies 7(12), 1. https://doi.org/10.11114/jets.v7i12.4491

Gielen, S., Peeters, E., Dochy, F., Onghena, P. & Struyven, K., 2010, ‘Improving the effectiveness of peer feedback for learning’, Learning and Instruction 20(4), 304–315. https://doi.org/10.1016/j.learninstruc.2009.08.007

Guskey, T.R., 2002, ‘Professional development and teacher change’, Teachers and Teaching: Theory and Practice 8(3/4), 389–391. https://doi.org/10.1080/135406002100000512

Harris, L.R. & Brown, G.T.L., 2013, ‘Opportunities and obstacles to consider when using peer-and – Self assessment to improve student learning: Case studies into teachers’ implementation’, Teaching and Teacher Education 36(1), 101–111. https://doi.org/10.1016/j.tate.2013.07.008

Hattie, J., 2009, Visible learning: A synthesis of over 800 meta-analyses relating to achievement, Routledge, London.

Hattie, J. & Gan, M., 2011, ‘Instruction based on feedback’, in R. Mayer & P. Alexander (eds.), Handbook of research on learning and instruction, pp. 249–271, Routledge, New York, NY.

Hattie, J. & Timperley, H., 2007, ‘The power of feedback’, Review of Educational Research 77(1), 81–112. https://doi.org/10.3102/003465430298487

Ingram, J. & Elliott, V., 2016, ‘A critical analysis of the role of wait time in classroom interactions and the effects on student and teacher interactional behaviours’, Cambridge Journal of Education 46(1), 37–53. https://doi.org/10.1080/0305764X.2015.1009365

Jamil, F.M., Larsen, R.A. & Hamre, B.K., 2018, ‘Exploring longitudinal changes in teacher expectancy effects on children’s mathematics achievement’, Journal for Research in Mathematics Education 49(1), 57–90. https://doi.org/10.5951/jresematheduc.49.1.0057

JASP Team, 2021, JASP (0.15) [Computer software], viewed 17 October 2022, from https://jasp-stats.org.

Kanjee, A., 2018, Enhancing pedagogies of engagement: Using formative assessment for improving teaching and learning, Evaluation report submitted to Zenex Foundation, Assessment for Learning Niche Area, Tshwane University of Technology, Pretoria.

Kanjee, A., 2020, ‘Exploring teachers’ use of formative assessment: Prospects and challenges to improve learning for all’, South African Journal of Childhood Education 10(1), 1–13, a824. https://doi.org/10.4102/sajce.v10i1.824

Kanjee, A. & Bhana, J., 2020, Using formative assessment to improve learning and teaching: Practical guidelines for teacher during the COVID-19 pandemic, Oxford University Press, Cape Town, viewed 20 November 2021, from https://assessmentforlearning.co.za.

Kanjee, A. & Mthembu, T.J., 2015, ‘Assessment literacy among foundation phase teachers: An exploratory study’, South African Journal of Childhood Education 5(1), 142–168. https://doi.org/10.4102/sajce.v5i1.354

Kanjee, A. & Sayed, Y., 2013, ‘Assessment policy in post-apartheid South Africa: Challenges for improving education quality and learning’, Assessment in Education: Principles, Policy and Practice 20(4), 442–469. https://doi.org/10.1080/0969594X.2013.838541

Kanjee, A. & White, C.J., 2014, ‘Evaluation of a national professional development programme to improve teachers formative assessment practices’, Unpublished report, Department of Educational Studies, Tshwane University of Technology, Pretoria.

Keeley, P. & Tobey, C.R., 2011, Mathematics formative assessment, Corwin Press, Thousand Oaks, CA.

Mkhwanazi, H.M.N., 2014, ‘Teachers’ use of formative assessment in the teaching of reading comprehension in Grade 3’, PhD thesis, University of Pretoria, Pretoria.

Molefe, M.M.R., 2015, ‘Developing teachers expertise of assessment for learning’, PhD thesis, Dept. of Primary Education, Tshwane University of Technology.

Moss, C.M., Brookhart, S.M. & Long, B.A., 2011, ‘Knowing your learning target. The first thing students need to learn is what they’re supposed to be learning’, Educational Leadership 68(6), 66–69.

Mundia, L., 2012, ‘The assessment of Math learning difficulties in a primary Grade-4 child with high support needs: Mixed methods approach’, International Electronic Journal of Elementary Education 4(2), 347–366.

Ndoye, A., 2017, ‘Peer/self-assessment and student learning’, International Journal of Teaching and Learning in Higher Education 29(2), 255–269.

Núñez, I., 2009, ‘Activity theory and the utilisation of the activity system according to the mathematics educational community’, Educate Special Issue, December, pp. 7–20.

Opfer, V.D., Pedder, D.G. & Lavicza, Z., 2011, ‘The role of teachers’ orientation to learning in professional development and change: A national study of teachers in England’, Teaching and Teacher Education 27(2), 443–453. https://doi.org/10.1016/j.tate.2010.09.014

Pawson, R., 2006, ‘The promise of systematic review’, in Evidence-based policy: A realist perspective, pp. 2–16, Sage, London.

Pedler, M., Hudson, S. & Yeigh, T., 2020, ‘The teachers’ role in student engagement: A review’, Australian Journal of Teacher Education 45(3), 48–62. https://doi.org/10.14221/ajte.2020v45n3.4

Popham, J., 2008, Transformative assessment, Association for Supervision and Curriculum Development Publications, Alexandria.

Ramollo, J.K., 2022, ‘Enhancing grade 3 mathematics teachers’ formative assessment knowledge and skills in no-fee schools: Case studies from Gauteng Province’, PhD thesis, Tshwane University of Technology, Pretoria.

Ratminingsih, N.M., Artini, L.P. & Padmadewi, N.N., 2017, ‘Incorporating self and peer assessment in reflective teaching practices’, International Journal of Instruction 10(4), 165–184. https://doi.org/10.12973/iji.2017.10410a

Shrum, S.F., 2016, ‘The influence of written formative feedback on student learning in elementary mathematics’, Doctoral dissertation, Walden University.

Sixel, D.M., 2013, ‘Teacher perceptions of professional development required by the Wisconsin quality educator initiative’, PhD theses, University of Wisconsin, Milwaukee.

Snape, P., 2011, ‘Quality learning for technology education: An effective approach to target achievement and deeper learning’, Problems of Education in the 21st Century 38, 95–104.

Swaffield, A., 2009, ‘The misrepresentation of assessment for learning and the woeful waste of a wonderful opportunity’, Presentation at the 2009 National Conference of the Association for Achievement and Improvement through Assessment, Bournemouth, 16–18 September, pp. 1–16.

Tadesse, A., Eskelä-Haapanen, S., Posti-Ahokas, H. & Lehesvuori, S., 2021, ‘Eritrean teachers’ perceptions of learner-centred interactive pedagogy’, Learning, Culture and Social Interaction 28, 100451. https://doi.org/10.1016/j.lcsi.2020.100451

Tunstall, P. & Gipps, C., 1996, ‘“How does your teacher help you to make your work better?” Children’s understanding of formative assessment’, The Curriculum Journal 7(2), 185–203. https://doi.org/10.1080/0958517960070205

Van Der Nest, A., Long, C. & Engelbrecht, J., 2018, ‘The impact of formative assessment activities on the development of teacher agency in mathematics teachers’, South African Journal of Education 38(1), 1–10. https://doi.org/10.15700/saje.v38n1a1382

Vygotsky, L.S., 1978, Mind in society, Harvard University Press, Cambridge, MA.

Webb, N.M., Franke, M.L., Ing, M., Turrou, A.C., Johnson, N.C. & Zimmerman, J., 2017, ‘Teacher practices that promote productive dialogue and learning in mathematics classrooms’, International Journal of Educational Research 97(5), 176–186. https://doi.org/10.1016/j.ijer.2017.07.009

Wiggins, G.P. & McTighe, J., 2011, The understanding by design guide to creating high-quality units, ASCD, Alexandria, VA.

Wiliam, D., 2010, ‘Standardised testing and school accountability’, Educational Psychologist 45(2), 107–122. https://doi.org/10.1080/00461521003703060

Wiliam, D., 2011, ‘What is assessment for learning?’, Studies in Educational Evaluation 37(1), 3–14. https://doi.org/10.1016/j.stueduc.2011.03.001

Wiliam, D., 2014, ‘Formative assessment and contingency in the regulation of learning processes’, Paper presented in a Symposium entitled Toward a Theory of Classroom Assessment as the Regulation of Learning at the Annual Meeting of the American Educational Research Association, Philadelphia, PA, April 2014, pp. 1–13.

Wiliam, D. & Thompson, M., 2007, ‘Integrating assessment with learning: What will it take to make it work?’, in C.A. Dwyer (ed.), The future of assessment: Shaping teaching and learning, pp. 53–82, Lawrence Erlbaum Associates, Mahway, NJ.

Wylie, E.C. & Lyon, C.J., 2015, ‘The fidelity of formative assessment implementation: Issues of breadth and quality’, Assessment in Education: Principles, Policy & Practice 22(1), 140–160. https://doi.org/10.1080/0969594X.2014.990416


1. The Assessment for Learning Niche Area was established to undertake relevant research and contribute to national and international debates on developing effective and enabling systems and practices for supporting learners, teachers, parents, school leaders and education officials in addressing the key challenge of equity and quality in education.

2. That is, schools that are characterised by limited resources, poor facilities and teachers with low qualification levels and that comprise mainly learners who come from poor and marginalised backgrounds.

3. That is, schools that are well resourced, have good facilities, have highly qualified teachers and comprise mainly learners who come from middle and high socio-economic backgrounds.

