An analysis of the results of literacy assessments conducted in South African primary schools

Abstract
Background: South African primary school learners have participated in several national and international literacy (reading and writing) studies that measure learners' achievement in different grades and at different intervals. Numerous scholars have analysed the results of these assessments. We extended their analyses by investigating the grade coverage and the aspects of literacy that were included in these assessments, as well as whether the learners' home language impacts their results.
Aim: The authors aim to determine how reliable the results of these assessments are in improving and informing policies relating to literacy teaching in primary schools and to provide recommendations to improve the administration of literacy assessments.
Method: Literature on various national and international literacy studies that were conducted in South African primary schools from 2000 to 2016 was identified and analysed according to grade, province, languages in which the assessments were conducted, aspects of literacy that were included in the assessments and the accuracy of the results.
Results: The analysis provides evidence that suggests that most literacy assessments target learners in the Intermediate Phase (Grades 4-6) and are not available in all 11 South African official languages. Presently, there are no large-scale literacy assessments for Foundation Phase (Grades 1-3) learners.
Moreover, the results of these assessments do not provide us with reliable information about literacy levels in the country because there are vast discrepancies in assessment scores. Conclusion: This article highlights the importance of obtaining reliable information in determining literacy levels in the country and in informing decisions regarding literacy-related policies.


Introduction
Because of considerable international debate regarding the meaning of 'literacy' (Street 2006:1), it is defined in numerous ways, and these definitions are constantly evolving. Technological advances in recent years have led to a proliferation of 'literacies' referred to as multiliteracies. Multiliteracies theory suggests that the definition of literacy should be extended to reflect 'linguistic and cultural diversity as well as the multiplicity of communication channels through which people may choose to make and transmit meaning' (Fellowes & Oakley 2014:4). Thus, the term 'multiliteracies' has been widely used and incorporates terms such as digital, information, library, computer, media, religious, cultural, health, economic, reading, science and financial literacy (Cambridge Assessment 2013:17). In this article, the authors focus, however, on the cognitive skills of reading and writing that learners have to master at a young age (United Nations Educational, Scientific and Cultural Organisation 2005:149).
In order to read and write, learners must develop skills ranging from the basic lower level processes involved in decoding the print to higher level skills involving syntax, semantics and discourse, and even skills of text representation and integration of ideas with the learners' global knowledge (Nassaji 2011:173). They must also develop physical skills involving forming letters, and higher level skills required for spelling and writing essays (Cook 2008:87).
Basic skills in literacy are prerequisites for academic learning, economic development and stability through the ability to effectively participate in the labour market, meaningful participation in society, lifelong learning, sustainable development, individual well-being and even civilisation (Cambridge Assessment 2013:10; De Vos & Van der Merwe 2014:3; Peregoy & Boyle 2000:237; Trudell et al. 2012:5; Wagner 2010:16). Good reading skills and reading with comprehension, that is, the 'active extraction and construction of meaning from various text types' (McElveny & Trendtel 2014:220), assist learners in accessing the curriculum.

The results of the literacy assessments conducted in South African primary schools have been analysed by the DBE (Department of Basic Education 2010:42-43, 2014:47-59, 2017c; Department of Education 2003:57, 2008), as well as by numerous academics (Howie et al. 2012:6; Mullis et al. 2017:58; Piper 2009:1-7; Spaull 2013:3-4; Venter & Howie 2008:19). However, in the published scholarly literature, there has been a paucity of studies that focus on the analysis of these assessments in terms of their grade coverage, the accuracy of the results and their role in policy construction. Furthermore, the Department of Basic Education (2017b:2) emphasised that although there were and still are various initiatives to support early grade reading, there is little or no evidence of what is working or why.
This study addresses this gap in research by exploring which school grades were covered in the literacy research and the accuracy of the results of literacy assessments in primary schools through the analysis of the quantity and quality of the information that is currently available. The study also explores whether all aspects of literacy were considered in the research studies and whether the learners' home language played a role in the results. This study thus aims to investigate how reliable the results of these assessments are as a starting point for improvements in literacy teaching and learner performance, as well as policy formulation regarding literacy teaching. Not only is a large proportion of South Africa's gross domestic product (GDP) devoted to education but also conducting the various assessments is a costly and time-consuming exercise for researchers, educators and learners alike.

Research questions
The following research questions were addressed in this study:
• Which aspects of literacy were considered in the research studies?
• In which languages were the assessments available or conducted?
• Which grades were assessed in the national (SE and ANA) and international (EGRA, SACMEQ and PIRLS) literacy assessments conducted in South African primary schools?
• How reliable are the results of these assessments in terms of improving and informing policies relating to the teaching of literacy in primary schools?

Background
To improve teaching and learning and inform educational policies in South Africa, five literacy assessments (SE, ANA, EGRA, SACMEQ and PIRLS) have been conducted periodically in primary schools at various grade levels since 2000.

Systemic evaluation
Systemic evaluation was a national-level assessment that was conducted every 3 years (2001, 2004 and 2007), in Grade 3 (2001 and 2007) and Grade 6 (2004), to determine the literacy and numeracy levels of primary school learners (Kanjee & Makgamata 2008:2). It also entailed the evaluation of the extent to which the DBE achieves 'set social, economic and transformational goals' (Department of Education 2003:2). In the Foundation Phase, learners were tested for listening comprehension, reading and writing (Department of Education 2003:8). In an attempt to include the other grades (Grades 1, 2, 4 and 5) in primary schools, SE was eventually replaced by the ANA.

Annual National Assessments
The ANAs were annual, nationally standardised tests of literacy and numeracy attainment for learners in Grades 1-6 and Grade 9. These were written tests based on the content of the first three terms of the CAPS (Department of Basic Education 2014:26). They were intended (Kanjee & Moloi 2016): [T]o provide an objective picture of learners' competency levels, provide an analysis of difficulties experienced by learners and assist schools to design teaching programmes that are targeted at improving learning in classrooms. (pp. 29-30) In 2008 and 2009, trial runs of ANA were conducted in most schools across the country with a focus on exposing educators to better assessment practices. Annual National Assessment 2011, which was administered in February 2011, involved 'universal ANA', whereby all learners from Grades 2-7 were tested in languages and mathematics, and 'verification ANA', in which more rigorous processes were applied to a sample of approximately 1800 schools involving Grade 3 or 6 learners to verify the results emerging from universal ANA (Department of Basic Education 2011:5). The focus was on the levels of learner performance in the preceding year, that is, in Grades 1-6 (Department of Basic Education 2011:5). This was the first year in which ANA produced adequately standardised data that allowed for analysis.
Although ANA presented some information regarding learning in the primary grades, which allowed for the initial detection and remediation of learning difficulties (Spaull 2013:3), some academics expressed their concerns about the assessment. It was criticised for (1) its content and level of testing, (2) encouraging educators and learners to focus on maximising test scores that resulted in educators 'teaching to test', (3) encouraging rote learning and the memorisation of random facts, (4) burdening of educators because of its additional administrative demand, (5) its lack of variety in the range of questions, (6) poor comparability over time and grades, (7) unsatisfactory administration and (8) its lack of independence because of the fact that the DBE set the papers, marked the papers and reported the results (Howie et al. 2012:4; South African Democratic Teachers' Union 2014:1-2; Spaull 2015:6-12). In addition, there has been anecdotal evidence of educators writing the answers on the chalkboard, tests sent home as homework and increased learner absenteeism on test days. In some cases, Grade 1 and 2 educators invigilated their own classes, which could have produced biased results; in other cases, not all data were captured, and in some grades and provinces, the response rate was 60% (Spaull 2015:12-13). The validity of the administration of ANA was dependent on educators, and if they went beyond what they were supposed to do in terms of assisting learners, it could have compromised the assessment function of the tool (Bansilal 2012:3). Spaull (2013:3) argued that while ANA was significant in the improvement of the quality of education in the country, its execution and deficiency in external authentication reduced much of its value. Thus, ANA could not be viewed as a dependable gauge of progress.
In 2014, the South African Democratic Teachers' Union (SADTU) proposed that ANA should be discontinued as an annual assessment and be administered over a period of 3 years to enable systematic monitoring of educational progress, to track educator and learner performance over time and to generate relevant and timely information for the improvement of the education system (South African Democratic Teachers' Union 2014:1). Because of numerous criticisms of ANA by academics and SADTU, the DBE's review of ANA resulted in the need for a new perspective on national SE models in the South African context. The SE model for 2018 and beyond is a triennial SE (conducted every 3 years) on a sample of Grade 3, 6 and 9 learners, and the assessment instruments will allow for international benchmarking and trend analysis across years (Department of Basic Education 2017a:6). It should be noted that up to the end of 2019, no further information had been provided to schools.

Early Grade Reading Assessment
The EGRA, which was developed in 2006 by the Research Triangle Institute (RTI) International, has been implemented in more than 60 countries (as of January 2013), with 23 of them being located in Africa, including Egypt, Gambia, Kenya, Liberia, Malawi, Mali, Mozambique and South Africa (Gove et al. 2013:374; Trudell et al. 2012:6). It is an international diagnostic oral reading assessment that is individually administered to Grades 1-3 learners, and it assesses letter-name and letter-sound knowledge, syllable decoding, familiar and non-familiar word reading, oral reading fluency, and listening and reading comprehension (Govender & Hugo 2018:25-26).

Southern and East African Consortium for Monitoring Educational Quality
The SACMEQ was established in 1995 by several Ministries of Education in Southern and Eastern Africa (Moloi & Strauss 2005:2). Fifteen ministries are now members of SACMEQ. The aim of SACMEQ is to track reading achievement trends of Grade 6 learners, to expand the quality of education in sub-Saharan Africa and to provide educational planners with opportunities to acquire the procedural skills needed to monitor and assess the quality of their basic education systems (Department of Basic Education 2010:7; Parliament of the Republic of South Africa 2016:1; Spaull 2011:13).

Southern and East African Consortium for Monitoring Educational Quality assesses learners' achievement levels in literacy and numeracy, and it is administered only to Grade 6 learners approximately every 7 years (2000, 2007 and 2013). In SACMEQ III and IV, eight levels of reading achievement (pre-reading, emergent reading, basic reading, reading for meaning, interpretive reading, inferential reading, analytical reading and critical reading) were used. Learners at the lowest reading competency levels, namely, the pre-reading and emergent reading levels, were found to be hardly literate, while those at the higher reading competency levels, namely, the analytical reading and critical reading levels, demonstrated advanced and complex reading competencies (Department of Basic Education 2017c:29).
In South Africa, the SACMEQ assessments were accessible in only two languages: English and Afrikaans (Department of Basic Education 2010:16). Consequently, learners who do not speak English or Afrikaans as their home language would have been at a disadvantage. As only 8.1% of the population speak English and 12.2% speak Afrikaans as home languages (Statistics South Africa 2018:8), the majority of the learners are therefore negatively affected.

Progress in International Literacy Study
The PIRLS is an international assessment of reading literacy that is administered to Grade 4 learners every 5 years (Shiel & Eivers 2009:346). Progress in International Literacy Study defines reading literacy as the ability to comprehend and utilise the written language forms that are required by society and/or valued by the learner (Mullis et al. 2004:3;Mullis & Martin 2015:12). Progress in International Literacy Study emphasises that readers actively construct meaning from texts, and it recognises the significance of literacy in empowering learners to develop reflection, critique and empathy (Kennedy et al. 2012:10).
The PIRLS 2006, 2011 and 2016 assessment frameworks focussed on (1) reading purposes, which include reading for literary experience and the ability to obtain and use information; (2) comprehension processes, which require learners to retrieve information that is explicitly stated, make straightforward inferences, understand and assimilate thoughts and information, and study and analyse content, language and written elements; and (3) reading behaviours and attitudes towards reading (Mullis et al. 2004:5, 2007:47, 2017; Mullis & Martin 2015:6). Grade 4 learners were tested particularly because the fourth year of formal schooling is viewed as a significant transition stage in the child's development as a reader, and children at this stage should have 'learned how to read and are now reading to learn' (Mullis & Martin 2015:1). South Africa's first participation in PIRLS was in 2006. This was viewed as the most multifaceted national reading literacy study that was conducted within an international comparative study where languages are concerned (Howie, Venter & Van Staden 2008:551).

Method
This qualitative study uses secondary quantitative data from journal articles and reports to analyse the results of literacy assessments. Literature on the national and international literacy assessments that were conducted in South African primary schools from 2000 to 2016 was identified, with a focus on other scholars' interpretations and analyses of these assessments. The three SE scores; the 2012, 2013 and 2014 ANA marks; the Grade 1 EGRA scores; the SACMEQ II, III and IV results; and the PIRLS 2006, 2011 and 2016 scores were, where possible, analysed according to grade, province, languages in which the assessments were conducted, the aspects of literacy that were included in the assessments and the accuracy of the results of the assessments. Scores over time were compared with other comparable scores for the assessments conducted from 2000 to 2016 to make meaningful comparisons and to verify the effectiveness of the assessments. Available information on the results of these literacy assessments was summarised and presented in the form of tables to compare results and to identify gaps.
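The comparative tabulation described in this section can be sketched as a short script. This is an illustrative sketch only: the record structure is our own, the two SE figures echo the national averages reported in the Results section, and the ANA value is a hypothetical placeholder, not an actual result.

```python
# Sketch of the comparative tabulation used in the analysis. The record
# structure is illustrative; the SE figures echo national averages
# reported in this article, and the ANA score is a placeholder.
records = [
    {"assessment": "SE", "year": 2001, "grade": 3, "score": 39.0},
    {"assessment": "SE", "year": 2007, "grade": 3, "score": 35.9},
    {"assessment": "ANA", "year": 2013, "grade": 3, "score": 50.0},  # hypothetical
]

def summarise(records, key):
    """Group records by `key` and return the mean score per group,
    rounded to one decimal place."""
    groups = {}
    for record in records:
        groups.setdefault(record[key], []).append(record["score"])
    return {k: round(sum(scores) / len(scores), 1) for k, scores in groups.items()}

by_assessment = summarise(records, "assessment")
by_grade = summarise(records, "grade")
```

Tables built this way, one grouping per dimension (grade, province, language), support the kind of cross-assessment comparison and gap identification the Method describes.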
It is accepted that there could be limitations to the study as only the five main literacy assessments (two national and three international assessments) were included. The authors are, however, of the opinion that because these five literacy assessments were conducted on national or international levels, they provide some insight into the literacy situation in South Africa over a period of time.

Ethical consideration
This article followed all ethical standards for research conducted without direct contact with human or animal subjects.

Results
In this section, the results of the various assessments that were administered to primary school learners in South Africa and the views of various scholars will be discussed.

Systemic evaluation
In 2001, a sample of approximately 52 000 Grade 3 learners from all provinces and districts was selected for SE (Department of Education 2003:57). For literacy, learners were assessed on (1) reading and writing and (2) listening comprehension, with a national average of 39% and 68% for each of these areas, respectively. An analysis of the literacy scores revealed a consistent pattern of performance of learners in all provinces, in which learners attained higher scores in the reading tasks than in the writing tasks. Except for KwaZulu-Natal, the mean scores for every province were lower than 50% for the reading assessments and under 35% for the writing assessments.
The 2004 SE in South Africa revealed that only 14% of the learners were outstanding in their language competence, 23% were partially competent, but a large majority (63%) lacked the required competence for their age level (Department of Education 2008:6). In the 2007 SE, the overall literacy score for Grade 3 learners in South Africa was 35.9% (Jhingran 2011:6). Only 44.2% of Grade 3 learners could read and 33.6% of Grade 3 learners could write. This again suggests that writing performance is significantly lower than that of reading.
Overall, SE revealed extremely low levels of reading and writing ability across the country, and highlighted that large numbers of South African children are unable to read (Department of Education 2008:4). Systemic evaluation involved the assessment of primary school learners' literacy achievement at the end of the Foundation Phase (Grade 3) and at the end of the Intermediate Phase (Grade 6). Thus, learners in the other grades were not exposed to any national assessments. Apart from showing that assessment should be conducted in all grades, the SE results also clearly show that qualitative research should be conducted, because statistics alone do not explain or help to address the problem. Such qualitative research should be aimed at understanding why so many learners in South African primary schools fail to master the basic skills in reading and writing.

Annual National Assessments
In 2012

Early Grade Reading Assessment
The achievement scores of learners for the four subtasks of EGRA provided very disheartening results. In-depth analysis revealed that only four of the 650 learners in the sample (0.6%) met the Dynamic Indicators of Basic Early Literacy Skills (DIBELS) international benchmark for learners not at risk for reading difficulties (Piper 2009:7). This implies that 99.4% of the South African learners sampled were at risk. It also suggests that very few of the learners in the baseline sample had much, or any, introduction to basic phonemic awareness and phonic skills. Notably, 65.2% of the learners sampled were unable to identify a single letter sound at the baseline (Piper 2009:1).
Of the 650 learners who attempted the letter sound task, only 524 undertook the common word identification task. Of those who did, the mean score was only 0.18 (Piper 2009:7). When asked to identify commonly used words, 90.2% of the sample did not identify any words at all, and none met the international benchmark for word identification (Piper 2009:1). Only two learners attempted to read the short passage, and one of those was unable to correctly read any words. Only one learner attempted the comprehension questions associated with the passage reading but was unsuccessful.
These statistics provide strong evidence that the reading skills of the learners in the baseline sample were low, and for most learners, they were largely non-existent. Most learners showed few literacy skills at the beginning of Grade 1. An important factor to consider is that data were collected at the very beginning of Grade 1, less than a month into the school year, when learners are still getting accustomed to the formal school environment.
The Story-Powered Schools (SPS) intervention, which included the EGRA assessment, is run by a South African non-profit organisation, Nal'ibali. The SPS programme focusses on nurturing a love for reading, in the home language and English, and on unlocking children's capacity to learn (Menendez & Ardington 2018:1). The SPS programme was implemented in 360 primary schools in isiXhosa and isiZulu, in the Eastern Cape and KwaZulu-Natal. The baseline data collection was conducted over 2 years: in 2017 and 2018. In total, 30 learners from each school were assessed (10 from each of Grades 2-4, comprising five girls and five boys whenever possible). The average scores in the EGRA subtasks were very low in both provinces. One in four learners was a non-reader in his or her home language: he or she could not read a word from a short, grade-level paragraph (Menendez & Ardington 2018:2). Although there was progress from grade to grade, by Grade 4, 13% of the learners were still unable to read one word, and 24% of them could not correctly answer even one comprehension question based on the passage read.
Research like the EGRA could provide useful information to improve literacy teaching in primary schools, as it involved all the grades in the Foundation Phase, and the home languages of some of the learners were taken into consideration. This research study also included certain reading skills that are important for literacy development.
Research like this conducted later in the school year, and performed more frequently, taking all learners' home languages and the language of learning and teaching (LoLT) into consideration, could provide valuable information that could assist in literacy teaching, thus enhancing literacy abilities among young learners in primary school.

The Southern and East African Consortium for Monitoring Educational Quality
The approximate South African sample sizes of the SACMEQ projects differed: there was an increase in the number of schools and learners that participated in SACMEQ III and a decrease in participation in SACMEQ IV. To gain a better understanding of learners' literacy levels, more learners should be exposed to these studies. It is therefore disconcerting to note that learner participation decreased by 2025 learners in SACMEQ IV. In SACMEQ IV, the most learners were sampled from KwaZulu-Natal (1504), Gauteng (1088) and Limpopo (967); the lowest reading scores were in Limpopo (487).
In SACMEQ III, 27% of Grade 6 learners in South Africa were found to be illiterate because they could not read and comprehend a short simple passage (Spaull 2013:4). Only four provinces, namely, the Western Cape (583), Gauteng (573), North West (506) and Northern Cape (506), achieved reading scores above the SACMEQ mean score of 500. The North West had the greatest increase in reading scores (by 78 points). There were also improvements in reading scores in Mpumalanga (46 points), the Free State (45 points), the Northern Cape (36 points) and the Eastern Cape (4 points). In the other provinces, reading scores declined over this period. The majority of the learners in KwaZulu-Natal, the Eastern Cape and Limpopo did not reach acceptable levels of reading achievement (level 4 [reading for meaning] and above) (Department of Basic Education 2010:46).
Learners who 'always' spoke English at home achieved 37.5 points higher in the literacy test than those who did not, and learners who 'sometimes' spoke English at home attained 19 points higher than those who did not (Spaull 2011:14). As argued earlier, given that the SACMEQ tests were only conducted in English (a language that is the home language of only 8.1% of the population) and Afrikaans (a language that is the home language of only 12.2% of the population), learners whose home language is not English or Afrikaans were likely to be at a disadvantage. This could explain why South African learners came 10th out of the 14 education systems assessed for reading, below much poorer countries such as Kenya, Swaziland and Tanzania.
The SACMEQ studies showed clearly that some Grade 6 learners were illiterate. This implies that these learners had not developed basic reading skills (pre-reading, emergent reading, basic reading, reading for meaning and interpretive reading). It is important to note that many learners would have written the assessment in their First Additional Language (FAL).

Progress in International Literacy Study
Approximately 215 000 children from 40 countries and 45 education systems participated in PIRLS 2006, and about 30 000 Grade 4 and Grade 5 learners from 441 South African schools were tested in the 11 official languages of the country (Howie et al. 2008b:2, 6, 55). In most of the countries, learners were assessed at the Grade 4 level, but in South Africa, Grades 4 and 5 learners were tested. The inclusion of Grade 5 learners (Howie et al. 2017a): [W]as based upon apprehension about the South African Grade 4 learners being able to cope with the demands of the assessment and particularly given the fact that Grade 4 is an important and demanding transition year for many moving into LoLT in a second language. (p. 10).
Learners were assessed in the LoLT used by the respective schools from Grade 1 (Howie et al. 2008b:12).
South African learners attained the lowest scores, with nearly 80% unable to reach the Low International Benchmark, implying that they had not achieved basic reading skills (Howie et al. 2012:6). The difference between the two grades tested was 49 points. The South African Grades 4 and 5 learners' average scores were 253 and 302, respectively (Venter & Howie 2008:19). Both these scores are substantially lower than the PIRLS scale average of 500. The most shocking results are those of the Grade 5 learners who scored significantly lower than the Grade 4 learners from the other participating countries.
Approximately 325 000 learners from 50 countries participated in PIRLS 2011 (Howie et al. 2017b:1). In South Africa, about 20 000 Grades 4 and 5 learners from more than 400 schools were tested (Howie et al. 2012:XV). The Grade 4 learners were tested in the 11 official languages using pre-PIRLS (now referred to as PIRLS Literacy) (Howie et al. 2017b:1). Most of the PIRLS Literacy assessment is based on shorter texts with a greater percentage of straightforward questions (Mullis & Martin 2015:5). This makes it easier for Grade 4 learners, who are still developing fundamental skills in reading, to participate in the study.
In pre-PIRLS 2011, Grade 4 learners from South Africa attained an average score of 461, which is lower than the pre-PIRLS 2011 centre point score of 500 (Howie et al. 2012:28). Grade 4 learners who wrote in English or Afrikaans attained average scores of 530 and 525, respectively. These scores were higher than the international centre point of 500. However, the average scores obtained by learners who wrote in the African languages were well below the international centre point (Howie et al. 2012:29). This indicates that there is a greater problem related to the teaching of reading to learners whose home language is one of the African languages. The PIRLS 2011 assessment was written by Grade 5 learners in English and Afrikaans schools (Howie et al. 2017b:1). In PIRLS 2011, South Africa's average score was 323, which is 177 points below the PIRLS centre point score of 500 (Mullis et al. 2017:33).
Approximately 18 000 learners from South Africa participated in PIRLS 2016 (Howie et al. 2017b:1). Grade 4 learners were tested using the PIRLS Literacy texts, which had been translated into all the official languages. Grade 5 learners were tested using PIRLS, and they were assessed in English, Afrikaans and isiZulu. Of the 50 countries, comprising 12 000 schools and 319 000 learners, that participated in PIRLS 2016, South Africa achieved the lowest overall average score of 320, which is 261 points lower than that of the highest achieving country, the Russian Federation (581) (Mullis et al. 2017:3, 20), and 180 points below the PIRLS centre point of 500. Progress in International Literacy Study 2016 revealed that 78% of South African Grade 4 learners are not able to read and comprehend a text (Ollis 2017:1). The average score of the Grade 5 learners was 406, almost 100 points below the PIRLS centre point of 500 (Mullis et al. 2017:21). Many learners in South Africa attend schools where the LoLT differs from their home language. These learners learn to read in a language with which they are not familiar, a process that is very different from learning to read in one's home language (Martin et al. 2003:26). Progress in International Literacy Study is a home language assessment, yet it is written by many learners in an additional language (Howie et al. 2008b:12). These learners cannot be expected to compete with learners writing in their home language. For South African learners who write the tests in an African home language, the assessment comes in Grade 4 or 5, after they have already switched from reading in their home language to reading in English or Afrikaans. Clearly, these learners are disadvantaged to a greater extent than those who continue to read and learn in their home language.

Discussion
In this section, we analyse the results of national and international literacy assessments in terms of grade and phase coverage, the languages in which the assessments are available and the literacy skills that were considered in the research studies. We also look at how reliable the results of these assessments are with regard to policy implications.

Foundation Phase (Grades 1, 2 and 3)
Grade 1 and 2 learners are exposed to the fewest national and international literacy assessments. Between 2000 and 2008, no literacy assessments were conducted at the Grade 1 level. In January 2009, EGRA was administered to Grade 1 learners in only three provinces and in only three languages (Limpopo [Sepedi], Mpumalanga [isiZulu] and North West [Setswana]); in 2015, it was administered in all the official languages, in all the provinces, to Grades 1-3 learners. If EGRA had been conducted in these provinces towards the latter part of the school year, more credible data could have been obtained.
Unlike large-scale assessments (SACMEQ and PIRLS) that do not target learners until Grade 4 or later, EGRA focusses on learning and interventions in the early grades. By Grade 4, learners may already be far behind in reading development. The DBE could therefore work with individuals from higher education institutions and, more importantly, Foundation Phase educators to develop a standardised literacy assessment tool similar to EGRA. Such a tool should incorporate emergent literacy (concepts about print, phonemic awareness and listening comprehension), decoding (letter naming, letter sounds, syllable naming and familiar word reading), fluency (oral reading fluency with comprehension) and writing (spelling, grammar, sentence construction, comprehension and story writing), while taking the learners' diversity into consideration. More reliable data could then be generated. These data, combined with qualitative research aimed at understanding why so many learners in South African schools fail to master basic literacy skills, could play an important role in informing policies on the teaching of literacy in the Foundation Phase.
Since ANA was terminated in 2014 because of the flaws identified in its process, there has been only one literacy assessment (EGRA 2015) for Grade 1 learners. Between 2007 and 2009, EGRA was piloted in only five provinces (Gauteng, Mpumalanga, Eastern Cape, KwaZulu-Natal and Western Cape). As the learner population in South African schools is diverse, the EGRA assessment tool could have produced more reliable data if it had also been piloted in the other four provinces. In addition, if learners from each province were exposed to a similar number of EGRAs each year, this would help to enhance the monitoring of learners' reading progress in the country. The literature shows that learners from the Free State and Northern Cape were exposed to the fewest EGRAs (only once), while learners from the Eastern Cape, KwaZulu-Natal and Mpumalanga had the most exposure (three times, including the pilot programme).
From 2000 to 2010, and from 2016 to date, no literacy assessments (apart from the SPS intervention in only two provinces) were administered to Grade 2 learners. Annual National Assessment, administered from 2011 to 2014, was the only assessment in the intervening years.
The results generated from ANA are flawed, as has been discussed previously, and are now 5 years out of date. As Grades 1 and 2 form the foundation for the development of literacy skills, more attention should be given to conducting assessments during these grades. This will assist in monitoring learners' progress in the early grades.
In 2015, Grade 3 learners from all the provinces were exposed to the EGRA. However, between 2017 and 2018, Grade 3 learners from only two provinces were subjected to it. At present, the DBE seems to focus its attention on learners at the end of the two primary school phases (Grades 3 and 6). Systemic evaluation was conducted in Grade 3 in 2001 and 2007, and ANA was administered from 2011 to 2014.
As the EGRA was administered to Foundation Phase learners in selected provinces in different years, the pilot programme was not extended to learners from all the provinces, and the ANA was heavily criticised by academics, it is unlikely that the results obtained from these assessments provided data reliable enough to inform policies relating to the teaching of literacy in the Foundation Phase.
The best opportunity to teach children reading and writing skills is in the early grades (1-3) (Gove & Cvelich 2010:ii). If this window is missed, children who have not learnt to read and comprehend what they read will continually fall further behind. In this context, it is crucial that there be reliable structures and assessment tools that can identify and remediate reading and writing difficulties as early as possible. The specific needs that young learners might have in order to become literate could thus be identified and addressed, and all Grades 1 and 2 educators could receive in-service training to adequately address these literacy needs in their classrooms.

Intermediate Phase (Grades 4-6)
Grade 6 learners have been exposed to the greatest number of assessments. Systemic evaluation was conducted in 2004, ANA from 2011 to 2014 and SACMEQ in 2000, 2007 and 2013 in Grade 6. In 2013, both ANA and SACMEQ were administered to Grade 6 learners. The top three provinces for ANA 2013 were Free State, Western Cape and Gauteng Province (see Table 2). Southern and East African Consortium for Monitoring Educational Quality IV (2013) produced the same top three provinces, but not in the same order (see Table 3). The bottom three provinces for ANA were Northern Cape, Limpopo Province and Eastern Cape (see Table 2), and for SACMEQ they were North West, Eastern Cape and Limpopo Province (see Table 3). North West was fourth for ANA and seventh for SACMEQ. Although there are some consistencies between the 2013 Grade 6 ANA and 2013 Grade 6 SACMEQ results, in SACMEQ III, South African Grade 6 learners came 10th out of 14 education systems (Spaull 2013:4).

Southern and East African Consortium for Monitoring Educational Quality is a comprehensive reading assessment that focusses on eight levels of reading achievement, but it is available in only two languages (English and Afrikaans). The DBE (Department of Basic Education 2017c:18) reiterated that the introduction of English FAL as a subject in Grade 1 will equip learners who will learn in English from Grade 4 onwards. However, learners whose home language is neither English nor Afrikaans are immediately at a disadvantage. If SACMEQ were to be translated into the remaining nine official languages of South Africa, then children whose home language is not English or Afrikaans would not be at a disadvantage, and stakeholders could learn the true literacy levels of these nine language populations.
Grade 4 learners were assessed in ANA from 2011 to 2014 and in PIRLS assessments in 2006, 2011 and 2016, yet they still achieved low scores. Grade 4 marks the beginning of the Intermediate Phase, where learners should be able to read fluently in order to access information and to learn. This is crucial both for the years that learners spend at school and for any further studies. Grade 5 learners were assessed in ANA from 2011 to 2014 and in PIRLS 2016, which is set at Grade 4 level, and they still scored lower than Grade 4 learners from other countries.
From 2012 to 2014, there were improvements in the ANA results in all the grades in all the provinces (see Tables 1 and 2), and from SACMEQ II to IV, there have been improvements in all the provinces, except Western Cape (see Table 3). There was a decrease in achievement scores for Grade 4 learners from PIRLS 2011 to 2016, and an increase in test scores of Grade 5 learners from PIRLS 2011 to 2016; however, South Africa still came last out of the 50 countries that participated in PIRLS 2016.
In the analysis of South Africa's SACMEQ scores, it is pertinent to note the drastic increase in reading scores from 2007 to 2013 (see Table 3), which warrants further investigation. These results need to be interpreted with caution. As PIRLS is a comprehensive reading assessment that emphasises reading purposes and comprehension, and is available in all 11 official languages, it appears to be more suitable for South African learners. However, the results of the 2016 PIRLS study, the most recent literacy study, amplify the crisis in South Africa's basic education and, particularly, the predicament of early grade learners. Comparisons of the 2014 ANA results (see Tables 1 and 2), the 2013 SACMEQ results (see Table 3) and the 2016 PIRLS results show that all of them are disappointingly low.
The DBE's decision to review and terminate ANA, flawed as it was, has resulted in a 5-year gap in national assessment data. It has, however, highlighted the need for a new perspective on a national SE model in the South African context. The SE model for 2018 and beyond is a triennial SE that will be conducted on a sample of Grades 3, 6 and 9 learners, and its assessment instruments will allow for international benchmarking and trend analysis across years (Department of Basic Education 2017a:6). Up to the end of 2019, no further information had been provided to schools.
Whether the new SE model will improve on ANA and provide more reliable information on primary school learners' literacy levels remains to be seen.
Because some of the assessments were not available in all the official languages, some learners were assessed in their FAL. The credibility of the results generated from these literacy assessments therefore needs to be questioned, and using them to inform policies relating to the teaching of literacy skills in primary schools becomes problematic. Annual National Assessment was terminated in 2014 because of criticisms from various stakeholders, such as SADTU and academics. It is difficult and potentially dangerous to plan policies and interventions for literacy improvement in South Africa based on flawed information generated by out-of-date and highly criticised assessments (ANA 2011-2014). Periodic monitoring and assessment of primary school learners' literacy skills using nationally representative assessments is vital in informing policies related to literacy instruction.

Conclusion
Through the development of national literacy assessments and participation in international literacy studies, the South African DBE has highlighted the fact that literacy levels are low in the country. Regardless of which subject or grade is assessed, most South African learners are performing considerably below the curriculum, frequently unable to attain basic numeracy and literacy skills (Spaull 2013:7). Although there have been improvements in the ANA and SACMEQ results, the veracity of these results has been criticised by academics, and the 'horrifically low levels of reading achievement' (Spaull 2017:2) that are evident in the most recent PIRLS place immense pressure on the DBE. However, merely identifying weaknesses through these assessments will do nothing in itself to improve children's literacy skills unless educators also learn how to remediate the problems and make instructional changes that address the weaknesses children exhibit on the assessments.
In its attempt to document learner performance in literacy skills and to establish system needs for improving instruction, the DBE has been involved in more literacy assessments targeted at learners from Grade 4 onwards, but has not given much attention to the Foundation Phase. Monitoring learning in this phase is critical because it draws attention to discrepancies in learning outcomes in the early grades. It is therefore vital that more attention be given to learners in this phase. Although EGRA assessed learners from Grades 1-3, in some years it was administered in selected provinces only.
The EGRA is suitable for Foundation Phase learners because it covers all the components associated with basic reading skills. However, it assesses only one aspect of literacy, namely reading. The development of an assessment tool that assesses Foundation Phase learners' writing skills, used together with the EGRA, could therefore prove valuable in assessing learners' literacy skills. This, together with qualitative research that focusses on why many learners in South African primary schools fail to master basic literacy skills, could assist in informing policies on literacy instruction in the Foundation Phase.
Grades 4 and 5 learners are exposed to PIRLS, and their results are dismal. Their limited experience of and exposure to large-scale assessments in the Foundation Phase could have contributed to these poor results. By this stage, it is often too late for effective remedial instruction for weak readers.
The DBE indicated that the SE model for 2018 and beyond will be conducted on a sample of Grades 3, 6 and 9 learners, and the assessment instruments will allow for international benchmarking and trend analysis across years (Department of Basic Education 2017a:6), but no information about this new SE model has been conveyed to schools as yet.
The results of these literacy assessments do not provide reliable information about literacy levels in the country with which to effectively inform policy decisions. There are discrepancies in assessment scores when ANA and SACMEQ are compared with PIRLS. Although ANA was discontinued, it provided some indication of learners' literacy levels; at present, there is a gap (between 2015 and 2019) in information regarding learners' achievement in literacy. If the DBE finds more reliable methods of administering national and international literacy assessments to primary school learners, this could assist in eradicating inconsistencies in assessment scores.
Studies suggest that learners whose home language is that of the school will have an easier transition into reading than those who have to learn a new language while they learn to read (Martin et al. 2003:26). Learners who are still developing proficiency in the language of instruction and testing can be at a serious disadvantage. The very poor foundation in home language literacy at the beginning of Grade 4 is particularly concerning, as the LoLT of the majority of South African learners changes from their home language to English in this grade (Menendez & Ardington 2018:48). If the literacy assessments were available in all the official languages, more credible data could be generated to inform literacy instruction policies in primary schools.