
Assessment in Queensland Secondary Schools 1983-1990

Department of Education, Queensland 1990


Assessment in Queensland Secondary Schools 1983-1990

E. Clarke

This history was written in response to a request from Professor Nancy Viviani, Australian National University, who was appointed in February 1990 to review tertiary entrance procedures in Queensland. The history supplements E. Clarke's Assessment in Queensland Secondary Schools: Two decades of change, 1964-1983 (Brisbane: Department of Education, Queensland, May 1987).


ASSESSMENT IN QUEENSLAND SECONDARY SCHOOLS, 1983-1990.

This period was marked by continual reviews and adjustments to the system of assessment and the TE Score System.

EXTERNAL EXAMINATIONS

The Board of Secondary School Studies continued to provide external examinations and issue External Junior and Senior Examination Certificates. The candidates were mature-age students or students who were isolated by distance, or who were ill, injured or incapacitated. In 1984, 595 students received Junior Examination Certificates and 2,852 students received Senior Examination Certificates. In 1988 the numbers were 371 and 4,397 respectively.

In 1985 the external examination syllabuses were redrafted to allow the certificates issued for the 1987 Junior and Senior Examinations to record student achievement in competency-based terms, ranging from Very High Achievement to Very Limited Achievement1.

The Chief Examiners held meetings each year with teachers who prepared candidates for the external examination. These teachers found the meetings to be very helpful2.

PHASING OUT OF MODERATION, 1983-1985.

The Board introduced the competency-based assessment of specific objectives advocated by ROSBA in three phases. Consequently, the process of moderation no longer applied to the Phase I schools, the seventeen schools selected in the Brisbane and Townsville districts. Moderation still applied, however, to the Phase II schools in the Brisbane-South Coast, Townsville and Cairns districts in 1983, and to the rest of the schools, those of Phase III, to the end of 1985.

Even though it was in the throes of implementing a reformed system of assessment, the Board was not able to relax its handling of moderation procedures during these last three years, since these procedures were vital to maintaining the credibility of Board certificates. The established procedures and practices of the moderation system continued. These included special meetings of school principals to discuss matters related to moderation, visits by chief moderators and/or Board representatives to remote area schools to provide assistance and advice, and the continued supply of A Handbook of Procedures for the Moderation of Standards of Assessment. At the annual conferences of district, State and chief moderators held in the last three years of the operation of the moderation system, a major theme was the importance of maintaining the efficiency of the system. Moderation panels of teachers continued to scrutinise student scripts and test instruments carefully for suitability of questions, levels and standards, to ensure that teachers did not try to gain an unfair advantage for their schools. Moderation procedures for the Junior subjects were less stringent than those for Senior subjects3.

STAGES OF IMPLEMENTATION OF ROSBA, 1983-1990

The Pathfinders. Completion of Phases I and II, 1983-1984.

At this point it would be useful to examine more closely the machinery and procedures used by the Board to implement ROSBA. The recommendations of the Scott Committee were, on the advice of a public relations firm, presented to the public as ROSBA (Review of School-Based Assessment) as an effective way of 'marketing' the changes involved. The use of this term has possibly led many to think of a new system instead of reforms to the existing one4.

Because assessment was to become an integral part of the syllabus, it is necessary to follow the syllabus from its inception to its implementation. The existing subject committees of the Board had the major responsibility to ensure that syllabuses were rewritten, stating objectives in terms of student outcomes in the four broad areas of process objectives (cognitive skills to be developed in relation to the discipline), content objectives (factual knowledge), skill objectives (practical skill) and affective objectives (attitudes, values and feelings). These committees also had the responsibility to provide the criteria to assess a student's level of achievement in these objectives, and to suggest some useful assessment techniques. The students were to be informed of the objectives, and it was hoped that the new form of assessment would result in student co-operation to achieve a stated self-determined goal instead of competing for a predetermined limited number of ratings5.

Phase I schools were given the trail-blazing task of implementing these syllabuses for Years 9 and 11 in 1982 and Years 10 and 12 in 1983. Phase II schools carried out the same task in 1983 and 1984.

After receiving their syllabuses, Phase I schools and, later, Phase II schools were required to write work programs for each subject offered, detailing the depth of study of the various objectives and their assessment.

The following guidelines were provided for the assessment part of the work program:

• An appropriate range of assessment techniques should provide a balance in the assessment program. The range should be sufficient to encompass as many as possible of the objectives.

• Suitable criteria should be stated for each assessment technique. These criteria are the qualities sought in student work in response to the assessment instruments.

• The techniques and criteria should be consistent with the stated objectives.

• Sufficient information should be provided concerning the manner of implementation of the assessment. This would include the timing and nature of the assessment instruments, including length (in time, words, pages, etc.) as well as the conditions which apply, e.g. supervised exam, teacher-assisted research work, etc.

• The school should show the mechanism by which it reaches a single measure of exit achievement as generated from all the assessment data6.

One major problem which arose at this stage was that many teachers used Radford item band test instruments and made no attempt to construct assessment instruments which reflected the criteria which were now the basis of assessment7.

The work programs were submitted to District Review Panels which accredited them if they were satisfactory. If they were not satisfactory, the Chairman of the Panel consulted with the school until agreement was reached. State Review Panels viewed representative samples of work programs from each district to maintain comparable standards. They also acted as arbiters in any disputes between district panels and schools. These panels comprised experienced teachers, and each panel was responsible for a specific subject or subject grouping.

The District Review Panels were also responsible for monitoring and reviewing school assessment to ensure the maintenance of standards and comparability between schools in each district, and the State Review Panels had the same responsibility for the districts. Monitoring was carried out at the end of Year 11 (Year 9 was not mandatory) and reviewing at the end of Years 10 and 12. Monitoring was mainly of an advisory nature while reviewing could require a school to make changes in its judgments before it could receive Board certification. For monitoring and reviewing purposes, each school was required to forward a submission to the District Review Panels comprising:

• all assessment instruments in the printed form used for summative assessment of students,

• details of any other methods of collecting data on achievement and an indication of expected student responses,

• sets of student work representative of the range of achievement levels proposed,

• a statement showing the number of students proposed for each level of achievement at the time of the submission.

Results of various tests in schools were usually given in numbers before they were translated to levels of achievement8.

In 1984 the Board provided the following underlying principles of its policy on assessment, which henceforth became an integral part of school syllabuses:

• Exit assessment should be devised to provide the fullest and latest information on a student's achievement in the course of study.

• Assessment of a student's achievement should be in the significant aspects of the course of study identified in the syllabus and the school's work program.

• Information should be gathered through a process of continuous assessment.

• Selective updating of a student's profile of achievement (a student's attainment of the course objectives should develop progressively over the duration of the program. To ensure that information about student achievement is the fullest and latest, certain aspects of the course, such as higher-order processes, are best assessed summatively after their period of development. Those pieces of data which best describe performance which will not improve, such as knowledge of particular content, should be retained).

• Exit achievement levels should be devised from student achievement in all areas identified in the syllabus as being mandatory.

• Balance of assessments is a balance over the course of study and not necessarily a balance within a semester or between semesters (i.e. it is appropriate to have different emphases upon aspects of the total program at various times during the course. Some test instruments may assess wholly content or process or skill, while in other cases content and process may be assessed together)9.

Year 10 certification did not require monitoring in Year 9, and procedures in Year 10 differed slightly from those for Year 12 (See Appendix, Figures 1 and 2). Each September, teachers of Year 10 subjects formed consortia (convenient groupings of schools) and reached a common agreement on submissions to District Panels, whose proposals were reviewed by State Panels. Schools issued certificates on the basis of agreed September proposals following review procedures. These consortium meetings provided an opportunity for professional interchange as well as being a step in certification10.

Unfortunately, in many schools the awarding of exit levels of achievement, which was meant to provide the fullest and latest information on a student's achievement, was initially negated by the weighting of semester results and the use of arbitrary cut-offs on a 100-point scale11.

The Junior and Senior Certificates recorded only Board and Board-registered school subjects. Board school subjects were those subjected to Board accreditation and certification processes, while Board-registered ones were subjected to Board accreditation with the school being responsible for assessment. The latter category was introduced to remedy a situation which had arisen over the previous few years: a category, School subjects, especially those at Year 10 level, appeared on Board certificates, but the Board did not monitor these in any way. Subsequently, the results of these subjects came to lack credibility with parents and employers. The new category of Board-registered school subjects allowed schools to retain flexibility in providing subjects which catered for the needs of their students and, at the same time, to maintain standards through Board accreditation procedures which ensured public confidence in the value of the certificates12.

For Phase II schools, the Board drew attention to the available resources. The written resources were draft ROSBA syllabuses, sample work programs from Phase I schools, exemplar work programs issued by the Department of Education, ROSBA Bulletins 1-5, and Information Statements/Memorandums (during 1984, 224 memorandums were sent out to schools). The people resources listed were the Department of Education consultants, the Board of Secondary School Studies co-ordinators, and other teachers in the school or the district who had previously come to grips with the task13.

The Board also continued an existing information service to students by issuing a number of pamphlets which explained the new form of assessment, the nature of the certificates issued and the Tertiary Entrance Score14.

Continued Discontent in Phase I and II Schools

Dissatisfaction among teachers implementing ROSBA had bubbled up in 1982, and measures taken in January 1983 to eliminate this discontent were inadequate. It was clear by the middle of 1983 that teachers remained discontented.

In October, the Queensland Teachers Journal reported that teachers of the Phase I schools in Townsville were complaining of excessive administrative requirements imposed by Review Panels relating to the submission of student scripts, and both the Townsville and Brisbane Phase I schools proposed industrial action as a counter to what they regarded as unreasonable demands in the assessment procedures.

The Queensland Teachers Union (QTU) held discussions with the Department of Education, putting forward the view that current ROSBA assessment procedures in Years 10, 11 and 12 were unduly onerous and placed unreasonable administrative burdens on teachers. The QTU and the Department reached a compromise. Year 10 scripts would continue to be submitted to re-establish parity of standards and maintain public confidence, but the number of scripts was reduced. Year 11 scripts were not required, and a slight reduction was made in the number of Year 12 scripts. The Assistant Director-General, George Berkeley, said that further experience might lead to other modifications in requirements15.

At the beginning of 1983, the Minister for Education, Lin Powell, had publicly criticised some aspects of ROSBA assessment procedures and had stated that it would be foolish for the Department of Education to implement such a costly system too quickly16. During the Townsville dispute, the Townsville Bulletin reported that Lin Powell conceded that extra work was needed initially for ROSBA assessment procedures, but that he believed that a saving of work and better assessment resulted in the long term. That newspaper also reported that he accused a minority of teachers of making an industrial and political issue out of the problem17.

Some participants in Phase I and II schools voiced their opinions individually. As well as referring to the onerous workloads, they drew attention to other perceived problems in the implementation process, including:

• Over-assessment still existed.

• While the mechanisms of certification had been clearly established, the operating procedures of criteria-based assessment were not so clearly established.

• Panels were accused of not giving positive guidance, acting solely in the role of critics. Some Panels, with over twenty members, were too large. Clearer and more precise guidelines should be provided to Panels.

• Phase II schools were given inadequate feedback of the experience of Phase I schools to assist them at the appropriate times.

• The quality and quantity of in-service education and support were inadequate, with the result that teachers suffered the same trauma as that experienced when the Radford Scheme was first implemented.

• Because of the evolutionary nature of implementation, the trail blazing schools experienced uncertainty, criticism, confusion and pointless hard work.

One commentator referred to the conflicting advice coming from the State Review Panels, co-ordinators, consultants, senior officials of the Department of Education and the Board as a veritable Tower of Babel, while another described the exercises involved as reliving the Charge of the Light Brigade. At the same time, most of these commentators expressed a belief that the implementation of ROSBA was resulting in improved teaching18.

The Queensland Council of Parents and Citizens Associations (QCPCA), in its annual policy statements, supported a policy of school achievement-based assessment. It expressed the belief that tertiary institutions or prospective employers should establish their own methods of assessment, and should not dictate processes of assessment of student achievement, which should be concerned with individual progress and should be non-competitive between student and student. This policy was maintained by the QCPCA through to 199019.

Phase III Implementation, 1984-1986.

In November 1983, the QTU executive organised district meetings to find out the views of its members on ROSBA and assessment in particular. These meetings, held in March 1984 throughout the State, comprised QTU members in secondary schools. By this stage, teachers in Phase III schools were involved in the preparatory stage of ROSBA, prior to its implementation by those schools in 1985 and 1986.

The views expressed at these meetings were summarised in The Queensland Teachers Journal. Several meetings called for the release of the 1983 Campbell report on the implementation of ROSBA in Phase I and II schools (this report was never publicly released). Two district meetings expressed complete opposition to ROSBA while most others accepted ROSBA in principle, but expressed concern about the implementation process. The general attitudes to assessment procedures were as follows:

• lack of confidence in panels which were perceived to be elitist,

• preference for the moderation type of meetings with its interchange of ideas among teachers on assessment methods,

• opposition to monitoring and reviewing processes for Years 10 and 11, especially Year 10 which had not required submission of student scripts for many years,

• agreement on the need for submission of student scripts at each level of achievement for Year 12,

• a belief that teachers' comments on the implementation process were not considered by officers of the Board or the Department.

The meetings also dealt with concerns about procedures related to the TE Score20.

During the 1984 teachers' conference, one of the delegates provided a startling case in support of the overall reduction of scripts required for review and retention. She claimed that, in Toowoomba State High School, with an enrolment of 1,200 students, current ROSBA administrative arrangements would, by her calculations, create an estimated 150,000 pieces of work requiring 30 filing cabinets in one year alone21.

Following the compilation of the results of the QTU district meetings on ROSBA, the QTU conducted negotiations with the Department of Education. As a result, the Department provided extra supply teachers to give some relief to teachers facing the extra workloads imposed by the implementation process22.

Meanwhile, during 1984, the Board continued to issue published information material to teachers, and continued a comprehensive system of in-service education designed to overcome many of the shortcomings exposed during the previous years. In addition to a series of workshops, conferences, seminars and consultant visits to schools, the work of a consultant team of eleven teachers included telephone advice and assistance to schools with problems, including those related to assessment procedures. Finance for the consultant team depended initially on special government funding; its personnel were reduced in 1986 and 1987 and the team was discontinued thereafter23.

In 1984 the Board of Secondary School Studies contacted the Board of Teacher Education requesting that future teachers be competent in ROSBA operations. The main institutions preparing teachers for secondary schools provided such preparation, with varying degrees of depth24. In response to one of the recommendations of the 1983 Campbell Report, the Government provided funds in the 1984-85 Budget for additional support services for the Board25. This enabled the Board to establish in 1985 the Assessment Unit, which operated until the end of 1987, when funds were no longer made available to the Board. This Unit had the responsibility of helping to establish a sound theoretical foundation for a school-based assessment system, and to clarify and make suggestions about the practical aspects of such a system in schools. In 1985, the Unit gave special attention to the issues surrounding criteria-based assessment.

In the first of a series of discussion papers issued by the Assessment Unit in 1986, the author gave a clear picture of the problems confronting teachers who were learning a new philosophy of assessment on the job and grappling with the practical aspects of implementation. He also envisaged what the participants would accomplish:

This change in direction [from norm-referenced assessment, and towards criterion-referenced principles] will be ultimately of greater significance than the mechanics of the changeover. At present, teachers, schools, the in-service team, and review officers are more-or-less submerged in a deluge of paper, meetings, work programs, criteria and standards, assessment instruments, student folios, and some confused signals emanating from a number of sources as to what the whole exercise is about ... When the flood subsides, there will be not only a clearer conception of what standards-based assessment is, but a considerable amount will be known about how to put it into operation26.

Also in 1985 seven review officers were seconded to the Board to assist the work of the Review Panels, and nine District Board Agents were appointed to provide clerical and organisational support for Board activities, including the process of certification27.

In 1985 the Board's Senior Certificates included for the first time results of subjects which students had studied at TAFE institutions and which had been approved by the Board. This led to the establishment of a third category of subjects in addition to Board subjects and Board-registered subjects, i.e. Board-recorded subjects. Such subjects were offered by institutions apart from those offering the conventional secondary education courses. To be accepted by the Board as Board-recorded subjects, the subjects had to meet certain Board criteria28.

The Board continued its intensive publicity campaign throughout the Phase III period. It made use of published material, talks, displays and newspaper advertisements to keep the public informed of how the system of assessment worked and the nature and effect of the changes undertaken29.

Up to 1986 the Board had to face considerable internal and external pressures. It continued to conduct external examinations and supervise school-based assessment using Radford moderation procedures, and, at the same time, it undertook the implementation of new ROSBA procedures involving a considerable amount of re-organisation and in-service education. It took responsibility for the actions of the various committees, sub-committees and panels. It was also subjected to considerable criticism and misrepresentation. It would appear, after taking into consideration the volume of information, such as pamphlets, information statements, memoranda and handbooks, which the Board sent out to teachers, employers, parents and students, that much of this criticism and misunderstanding would not have appeared if the information had been read30.

The Press shed little light on the issues or the problems despite information provided by the Board to press representatives. An illustrative case is provided by The Courier-Mail headline of 20 December 1985, which declared, 'TE system a disaster: Senate Committee'. The report which followed completely misrepresented and distorted information about the implementation of ROSBA in Queensland. It included the statement, 'The report of a Senate Standing Committee on Education has found Queensland's system of secondary school assessment is a disaster, widely misunderstood and lacking community support'.

A reply from John Pitman, Executive Officer of the Board, appeared as a letter to the Editor several days later. This pointed out that the article failed to distinguish between Queensland's assessment system and TE scores, and that the Senate Standing Committee did not make any investigation of Queensland's assessment system31. In spite of this clarification, three weeks later an article in The Sunday Mail on TE scores included the following statement: 'A report recently issued by the committee [The Senate Standing Committee on Education] found Queensland's system of secondary school assessment a disaster, widely misunderstood and lacking community support'32. A week later, the Director-General of Education, George Berkeley, in a letter to the editor of The Sunday Mail33, referred to this statement as 'absolutely untrue' and gave detailed supporting evidence similar to that which John Pitman provided to The Courier-Mail.

Consolidation of ROSBA, 1987-1990.

While the Board claimed in 1987 that the principles and philosophy of ROSBA had been accepted by the majority of Queensland teachers, it believed that to put this acceptance into practice and to achieve the aims of criteria-based assessment in all Queensland secondary schools would require continuous development and refinement over the next decade. At the same time, the Board was convinced that Queensland was a world leader in criteria-based assessment, judging by the interest both interstate and overseas educators were showing in the developments taking place in Queensland34.

One of the Board's successes through the school-based assessment system was the rapid increase in provision of subjects related to the needs of the students. In 1982, 78 Board subjects and 530 Board-registered subjects were recorded as being provided in Queensland secondary schools (Years 8-12). In 1989, these figures were 81 and 1,683 respectively35.

Early in 1989 the QTU, in response to concerns of its members, placed a work ban on the Board's organised activities until full travelling expenses were paid to members participating in these activities, and until adequate teacher relief was provided. This ban was supported by the Queensland Association of Teachers in Independent Schools. Some months later, the ban was lifted after the Union claimed that it had resulted in Departmental acceptance of its claims36.

Assessment remained an issue through to 1990. In 1988, Errol Vieth, a QTU representative on the Board, presented a case study of ROSBA which he claimed revealed that the meaning of 'education' had become synonymous with 'assessment', and that the 'invisible curriculum code' of performance, goals and assessment was made obvious, as well as the degree of hegemonic control which this ideology had over education. He concluded, however, that the case study did not show that a system of external examinations was preferable to ROSBA, despite 'the almost pathological rantings of some elements of the press against systems of assessment which do not have external examinations'. He believed that there were other solutions to the problems37.

Prompted by Vieth's study, three academics from James Cook University conducted a survey on assessment in Townsville secondary schools in 1988. They concluded from their study that the tension between a criterion-referenced system of assessment and the generation of a rank order for TE score purposes was working against the acceptance of ROSBA.

They also reported perceptions that, under ROSBA, competition had not decreased and student levels of achievement had not risen, that over-assessment with a focus on quantity was taking place, and that schools should reduce the quantity of assessment in favour of increased quality, a principle accepted by the Board. The academics stated that the Board needed to strengthen in-service education and publicity measures. They concluded that it was important to emphasise that ROSBA had been successful in introducing the concept of criterion-referenced evaluation into classroom assessment practices, with consequential benefits38.

In 1989 a Departmental research report dealt with the effects of major changes in secondary schools. The report attributed these changes to significant economic, technological and social changes in Australian society and identified the implementation of ROSBA, including assessment, as one of the significant problems involved in secondary school changes. The report maintained that teachers generally supported the concepts behind ROSBA and the consequent changes, but complained of the associated extra work and time. It also maintained that many teachers were frustrated, angry or confused about the changes39.

In the following year, another Departmental research report dealt with teachers' and students' concerns about assessment practices in Years 11 and 12 as one of four major issues in senior schooling. After sampling secondary school teachers' concerns, this report revealed the following:

• There was general support for school-based assessment in principle, and some support for the direction it was taking.

• Many teachers strongly expressed concern about the workload or the pressure on students caused by assessment.

• Some teachers strongly believed that there was too much assessment in some subjects.

• Some strongly complained of excessive clerical workload created for teachers by assessment procedures. (The researcher believed that this was caused by the Board's requirements for information and the need for full documentation of students' achievement to justify the ratings given, and because of the high emphasis placed on comparability of ratings within and between schools made necessary by the TE Score System).

• Some were strongly worried about 'too much variation' or 'lack of comparability' between schools or subjects in assessment practices or standards.

• Some forms of external examinations were supported by some and opposed, sometimes quite strongly, by others.

• Some comments indicated a broad range of disagreement about what makes a desirable schedule of assessment.

Other concerns raised by teachers were:

• Achievement categories were too broad.

• Assessment demoralised many students.

• Assessment pressure interfered with students' participation in school activities.

• Course rigour was determined by assessment.

• Numerical results in assessment were more manageable in large schools and preferred by students over verbal ratings.

• The assessment system was overly complex.

• Manipulation of assessment ratings occurred in schools.

• The assessment system was not well understood.

• Many students were not motivated to achieve.

The same interview study found that while students seemed to accept assessment practices, there were two strong and widespread concerns. These were that at times too many assignments and tests came at once, and that some assessment components carried too high a proportion of the marks for a subject. Other lesser concerns were related to pressure or stress associated with assessment, and perceived levels of consistency, fairness or impartiality among teachers. A minority of students, mainly those with definite occupational and educational plans and studying five or six Board subjects, believed their assessment loads were excessive. Close to one-third of the students raised no concerns40.

THE TERTIARY ENTRANCE SCORE SYSTEM, 1983-1990.


The Calculation of Tertiary Entrance Scores in Operation

The Board was required by regulations to provide the results of candidates of Board examinations in rank order of merit for tertiary entrance and it developed the TE Score System for that purpose. The calculation of this, following the ROSBA changes, can be described simply.

Individual students are awarded marks, known as SSAs (Special Subject Assessments), in each of their subjects. These marks are given by teachers and form the basis for the TE score.

All students who are eligible for a TE score then sit for the ASAT (Australian Scholastic Aptitude Test) examination. The SSA which the teacher has awarded is then adjusted by comparing the different groups of students in a particular school. All those doing Maths I have their marks altered in a particular way, those doing French have their marks altered in a different way, and so on. In this procedure the relative positions of students are not changed, so that if the teacher places Mary Jones ahead of Bob Brown in chemistry, she is still ahead of him after adjustment.

The adjusted marks of each student for all subjects are then totalled. These totals are again adjusted by comparing the overall ASAT results from different schools. If Mary Jones' total mark is higher than Bob Brown's, then her adjusted total is higher than his.

All students in the State are then placed in order of merit, using the final adjusted totals. The top one percent get 990, the next half percent 985, and so on41.
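
The final banding step can be illustrated with a short sketch. This is not the Board's actual program: the top one percent at 990 and the next half percent at 985 come from the description above, while the continuation of that pattern in five-point steps and the student names and marks are assumptions for illustration only.

```python
def assign_te_scores(adjusted_totals):
    """Band final adjusted totals into TE Scores: the top 1% receive 990,
    the next 0.5% receive 985, and (assumed here) each later half-percent
    band scores 5 points lower. The real cut-offs were set by the Board."""
    ranked = sorted(adjusted_totals, key=adjusted_totals.get, reverse=True)
    n = len(ranked)
    scores = {}
    for rank, student in enumerate(ranked):
        proportion_above = rank / n
        if proportion_above < 0.01:
            band = 0                                    # top one percent
        else:
            band = 1 + int((proportion_above - 0.01) / 0.005)
        scores[student] = 990 - 5 * band
    return scores

# Hypothetical candidates with scaled, totalled marks.
print(assign_te_scores({"Mary Jones": 412.5, "Bob Brown": 398.0, "A. Other": 371.2}))
```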

Students have the right to have TE scores verified or recalculated, and individual schools have the right to claim that some standard procedure in the compilation should not be applied to a particular group of students42.

Misunderstanding and Misuse of the TE Score System

Unfortunately, because of the various depths of complexity associated with the operations of the TE Score system (See Appendix, Figures 3 and 4), many people continued to misunderstand and misuse it. In addition to the continuation of earlier misunderstandings, a new one arose when people tried to compare TE Scores from one year to the next. This was an invalid comparison because the TE Score is no more than a place on a rank order of banded or grouped students, and, as the number of TE Score candidates increases, the lowest score issued and the median score decrease.

Many employers continued to misunderstand and rely too heavily on the TE Score as a measure of general aptitude, and ignored Board advice to make greater use of Board Certificates which provided a wider range of information43.

Continuing Criticism and Suggestions for Change

By 1986, Queensland had the highest Year 12 retention rate of all the Australian States and, at the same time, the lowest per capita percentage of tertiary places. John Pitman, Executive Officer of the Board, compared these two pressures to 'a giant vice squeezing in from both sides'44. This trend continued in the following years (See Appendix, Figures 5 and 6). One of the results of this pressure was an increase in the number of students repeating Year 12 to obtain a higher TE Score45. Another result was closer public scrutiny of the operation of the TE Score. The Board contended that this placed excessive pressure on the TE Score 'which at best, is a broad-brush measure of global achievement'46.

In 1986 reports on the TE Score System from various bodies appeared in The Courier-Mail. These were in relation to a request from the Board to interested persons and organisations for submissions on tertiary entrance.

Reports from tertiary institutions revealed differing viewpoints. The University of Queensland stated that the TE Score had been generally a good predictor of future performance, but was critical of ASAT. The Brisbane College of Advanced Education (BCAE) was also critical of some aspects of ASAT and called for a broadening of the process of tertiary entrance, which it believed should be based on performance in approved subjects with reference to prior elected tertiary courses. The Queensland Institute of Technology was basically satisfied with the TE Score System but suggested compulsory external examinations in English and Mathematics which would provide 50% of the marks, with the other 50% coming from internal assessment. These results would then be used in the calculation of the TE Score. These educational institutions did not appear interested in implementing their own entrance system, probably because of the cost of administration47.

The QTU called for the abolition of the single-index TE Score and its replacement by school-based assessment, a process of interviews, and quotas for disadvantaged groups. The Queensland Association of Teachers in Independent Schools (QATIS) recommended that student profiles and interviews should supplement the TE Score; these would provide information on the experience, aptitude and interests of applicants. Both the QTU and the QATIS recommended a common first-year course at a tertiary institution before specialisation48.

The Queensland Trades and Labor Council (QTLC) recommended that school-based assessment should be kept and improved, and that ASAT and TE Scores should be scrapped49, while the Queensland Confederation of Industry (QCI) advocated that students not bound for tertiary study needed an alternative educational system to the TE Score. In the following year, the QCI claimed that it would launch State-wide industry entrance tests for school leavers by the end of 1989 because TE Scores were meaningless to employers. This proposal was not well received by tertiary institutions, the QTU, the QTLC, the Minister for Education and the parliamentary opposition parties. The proposal was not implemented51.

The QCPCA policy expressed in 1986 was that tertiary institutions or prospective employers should establish their own methods of assessment and should not dictate processes of assessment of student achievement52.

A Department of Education research report on issues in senior schooling (1990) included a survey of students' and teachers' perceptions of the TE Score. The research results indicated that the TE Score was regarded by most students as important and useful as an 'all-purpose' credential, but that, otherwise, it was widely disliked and mistrusted by students (See Appendix, Figure 7), for whom it was surrounded by mystery and suspicion. The research results also indicated that teachers generally accepted the TE Score in principle, but questioned its validity and the way employers used it. The report pointed out the features which underlay these negative responses, and stated that the TE Score itself was not the cause of all of the problems; the intensity of the competition for credentials was the underlying cause. The report provided criteria which it believed any new system for tertiary entrance should take into consideration if it were to alleviate the problems. It also expressed the belief that the 1987 recommendations of the Working Party on Tertiary Entrance for amending the tertiary entrance system did not address the undesirable features revealed by the report, nor meet the report's criteria for any new system. The researcher believed that complexity, imposed by the wish to achieve comparability, was at the heart of the problem, and outlined four changes to overcome it53.

Some critics looked beyond perceived weaknesses of the TE Score in operation and directed their attention to what they regarded as the incompatible tasks that the Board was obliged to undertake: the provision of a general education and the selection of students for tertiary education. Mary Kelly, President of the QTU, said in 1986 that 'These two tasks pull in different directions', while Mike Middleton of the BCAE referred to them in the following year as two pressures tearing education apart54. Another critic, Max Howell, was well qualified to philosophise over this problem. He had been on the Radford Committee, on the Board since it was constituted in 1971, and a member of the subcommittee which produced the Scott report. On the eve of his retirement as Headmaster of Brisbane Grammar School at the end of 1989, he stated that professional educators had tried over the last four years to improve the methods of selection without any marked success. He believed that one should not place too much credence in the promises of politicians to solve the problems. In his view, the problem would only be solved when State and Federal governments provided sufficient funding to ensure that all young people who met a qualifying level of performance were given the opportunity for tertiary education55.

A perusal of Queensland newspaper cuttings dealing with the TE Score issue during the period 1983-1990 revealed that on occasions the newspapers provided accurate coverage, but on many other occasions the issue was misreported and sensationalised. Two examples illustrate the latter situation well. The Daily Sun editorial of 26 August 1988 referred to the Australian Council for Educational Research as the Australian Council of Research, which it incorrectly stated was 'controlled locally by the Board of Secondary School Studies'. This editorial also exhibited a complete misunderstanding of ASAT. The Sunday Mail, 2 March 1986, reported a critical statement about the TE Score System by Dr Phil Meade, Principal of the BCAE Mt. Gravatt Campus. Dr Meade's statement was taken out of context and made into the sensational leading sentence of an article. Dr Meade protested to the Editor, who redressed the situation by publishing in the next week's paper Dr Meade's appraisal of TE Scores, which presented his criticisms of the TE Score within a rational framework56. Br Steve McLaughlin, Headmaster of Nudgee College and a member of the Reference Committee appointed in 1990, suffered a similar fate in The Sunday Mail of 18 February 1990, which featured the headline 'Head vows to end "horrible" TE score'. He immediately explained to his staff in a memo the following day how the headline and the report that followed had misquoted and distorted his statements57.

An M.Ed. Studies thesis (1990) analysed 154 newspaper items dealing with the TE Score System from 1986 to 1988. This thesis pointed out that the TE Score System was a perennial issue which guaranteed press coverage. The results of the analysis indicated that most authors of published material on the TE Score System expressed poor opinions of the present Queensland system and, by inference, were unfavourably disposed towards the subject matter that they were discussing. The thesis stated that perceived media domination of content led to the inference that the media were controlling public perceptions of the issue to some extent. It also expressed a conviction that the inability of the media to serve as a forum for serious exploration of solutions was a factor in undermining public confidence58.

At the end of 1989, before the State elections, the political parties expressed their policies on the TE Score system. The National Party Minister for Education, Brian Littleproud, stated that the Board was in the process of changing the system. The Liberal Party spokesperson, Lyle Schuntner, said that his party was committed to the 1987 recommendations of the Working Party on Tertiary Entrance, as published in Tertiary Entrance in Queensland: A Review. The Labor Opposition Leader, Wayne Goss, declared that his party would replace the TE Score system59.

After the Labor Party was successful in the December 1989 State elections, the new Labor Cabinet approved in the same month funding for an extra 1,500 tertiary places, a step which some believed would take some of the heat out of the TE Score controversy60. The new Labor Minister for Education, Paul Braddy, then announced in February 1990 that Cabinet had approved the appointment of Professor Nancy Viviani to review tertiary entrance procedures in Queensland and to present a report and recommendations by 30 June 1990. The Minister stated that there was widespread dissatisfaction with the existing tertiary entrance system and that he was committed to the introduction of a new system for Year 11 students in 199161.

Professor Nancy Viviani, Professor and Head of Political Science, Faculty of Arts, Australian National University, had wide experience of tertiary entrance systems of most Australian States, and was familiar with systems in use in the USA, the UK and Asia. The terms of reference for her review were:

• to review the present system for the compilation of Tertiary Entrance Scores in Queensland,

• to recommend an alternative system which would:

o be fair, equitable and easily understood by students, parents and teachers,

o aim to provide a tertiary entrance profile which includes as separate components school-based assessments of achievements as recorded on the Senior Certificate and independent measures of aptitude for tertiary entrance,

o aim to use measures which depend, and are seen to depend, on each individual student's performance,

o avoid using a single score as an indication of a student's aptitude for tertiary studies,

o avoid the necessity to rescale school assessments using procedures reliant on group performance,

o reduce the pressures imposed by Tertiary Entrance Score requirements on the curriculum in the senior secondary school and on the subject choices of individuals,

o be accessible to students completing Year 12 who wish to compete for tertiary entrance,

• to consult with tertiary institutions concerning the ways in which the alternative system would be used,

• to recommend arrangements through which the alternative system could be administered and operated62.

Cabinet also approved the appointment of a broadly representative reference committee to provide Professor Viviani with information, advice and reactions throughout the review process. The convener of the Reference Committee was Professor K.W. Wiltshire, Department of Government, University of Queensland, and the members were:

• Mr Bob Moritz, Queensland Council of Parents and Citizens' Association,

• Mr Leo Dunne, Parents and Friends Federation, Queensland,

• Mr Brian Flaherty, Independent Parents and Friends Council of Queensland,

• Mrs Dianne Goosem, Queensland Catholic Education Commission,


• Mr Gilbert Case, Associations of Independent Schools in Queensland,

• Mr Richard Warry, Department of Education,

• Mr Stan Sielaff, Department of Employment, Vocational Education, Training and Industrial Relations,

• Ms Mary Kelly, Queensland Teachers Union,

• Mr Doug Watson, Queensland Association of Teachers in Independent Schools,

• Br Vince Connors, Queensland Catholic Education,

• Br Steve McLaughlin, Independent Schools in Queensland,

• Mr Daryl Hanly, State Education in Queensland,

• Mr John Pitman, Board of Senior Secondary School Studies,

• Ms Avril McClelland, Tertiary Admissions Centre,

• Mr Doug Porter, Registrar, University of Queensland,

• Dr Geoff Masters, Australian Council for Educational Research,

• Professor Glen Evans, Dr Graham Maxwell, both from the Faculty of Education, Queensland,

• Professor Jack Walton, James Cook University of North Queensland,

• Mr Bob McQueen, Griffith University Staff Association63.

Board Measures to Improve the System

The Board, required to produce a rank order of merit for tertiary entrance, continued to improve the operation of the existing system and to examine alternative procedures. Each year it provided information which explained the TE Score system to teachers, students and parents. It introduced changes to improve the system and initiated investigations into any legitimate criticisms64.

Evolving from a Board initiative in 1983, the Working Party on Tertiary Entrance was established by the Minister's Joint Advisory Committee on Post-Secondary Education and the Board of Secondary School Studies, with John Pitman as Chairman. The terms of reference were 'to review all aspects of entrance to tertiary institutions in Queensland'. Its first meetings were held in 1985 and its extremely comprehensive report appeared in July 1987 as Tertiary Entrance in Queensland: A Review.

This report made 52 recommendations for change, the majority of which meshed together in a total package. The authors determined that fairness was the most important dimension in their deliberations, sometimes at the expense of mechanical simplicity, because such simplicity often conceals serious injustice. The report dealt with major concerns often expressed about TE Score procedures, including:

• students with similar levels of achievement in the same subjects with quite different TE Scores,

• the alleged advantage enjoyed by students repeating Year 12,

• perceived conflict between the TE Score and criteria-based assessment,

• use of the TE Score by employers,

• the TE Score as a predictor of tertiary success,

• the issuing of TE Scores after the closing date for Queensland Tertiary Admissions Centre (QTAC) preferences,

• tertiary pre-requisites,

• alleged manipulation of data by schools,

• ASAT:

o influence of absentees from ASAT,

o effect of coaching,

o value of ASAT as a scaling device,

o malpractice in administration of ASAT,

o bias in ASAT (cultural, sex, content),

o influence of poorly motivated students doing ASAT,

o inclusion of a written expression component.

A major recommendation was that eligible students receive an Achievement Position Profile comprising (i) a single general-purpose indicator, to be known as an Overall Achievement Position (OAP), which compares eligible students' overall achievements in senior secondary school studies, and (ii) four special-purpose indicators, to be known as Specific Achievement Positions, which compare the achievements of students with the same Overall Achievement Position65.

The first step, the calculation of the OAP, will place students into 20 broad bands, each containing 5% of the eligible students. This will provide a coarse ranking of students. Students in band 1 will be considered to have achieved at a higher level than those in band 2. Similarly, the achievement of students in band 2 will be higher than for those in band 3, and so on to band 20.

Thus, while the OAP provides a coarse distinction among students, it does not provide any information to distinguish among students within the same band. This is achieved by the four special-purpose indicators. Each of these further divides each band into 10 groups, each containing 10% of the students in the band. The purpose of the special-purpose indicators is to provide each faculty within tertiary institutions with a range of measures on which to base the fine decisions within the 'grey' area of selection, i.e. with students close to the borderline for the cut-off (See Appendix, Diagrams 8 and 9).
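
The banding arithmetic described above can be sketched briefly. This is an illustration only, not the Working Party's or the Board's actual procedure: the function names, the zero-based ranks and the handling of the final band are assumptions, and the real scheme would also have specified how ties and borderline cases were treated.

```python
def oap_band(overall_rank, n_eligible):
    """OAP band 1-20: 20 bands, each holding 5% of eligible students
    (overall_rank counts from 0 for the highest-achieving student)."""
    return min(20 * overall_rank // n_eligible + 1, 20)

def specific_position(rank_in_band, band_size):
    """Group 1-10 within one OAP band, based on a special-purpose
    indicator: 10 groups, each holding 10% of the band."""
    return min(10 * rank_in_band // band_size + 1, 10)

# Example: the 130th-ranked student (0-based) out of 2,000 eligible students.
print(oap_band(130, 2000))          # band 2
print(specific_position(30, 100))   # group 4 within that band
```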

In 1988 the Board introduced a major refinement to the TE Score system which had been one of the recommendations of the Working Party on Tertiary Entrance. This was a range of anomaly detection procedures for checking the use of standard scaling procedures in calculating students' TE Scores. Where the data from schools exhibit significant irregularities, anomaly detection procedures are designed to seek objective evidence for a decision to vary the standard procedures, i.e. to make special calculations for those schools66.

To implement another Working Party recommendation, the Board in 1989 introduced a Writing Task, which tested English expression, to enhance the overall validity of ASAT as a scaling device. The Writing Task and ASAT were combined to form the Common Scaling Test (CST), with the Writing Task constituting 25% and ASAT 75% of the CST. Group scores on the CST were then used to scale school-based assessments for the purpose of calculating TE Scores67.
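
The stated weighting can be expressed as a one-line sketch; the assumption here, not stated in the source, is that both components are first expressed on a common scale before being combined.

```python
def common_scaling_test(writing_task, asat):
    """Combine the two components with the stated 25%/75% weighting,
    assuming both are already reported on a common scale."""
    return 0.25 * writing_task + 0.75 * asat
```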

CONTROL OF THE JUNIOR CERTIFICATE

During the late 1970s, the Department of Education began to develop the concept of P-10, which advocated continuity in the curriculum from Pre-school to Year 10. This was included in the proposals of the Departmental blueprint for future developments in education, Education 2000, which appeared in 198568. This document referred to the rapidly diminishing reliance on the Junior Certificate. Following its appearance, a joint effort by the Department and the Board to coordinate ROSBA with plans for the implementation of P-10 soon experienced difficulties. One of the visible signs of these difficulties, which emerged by 1988, was the conflict over the future of the Junior (Year 10) Certificate.

In May 1987, the Minister for Education, Lin Powell, proposed legislation to replace the Board of Secondary School Studies, the Board of Advanced Education and the Board of Teacher Education with a Queensland Course Accreditation Council, which he stated would provide more options and flexibility in post-compulsory education. Powell accused the Board of being slow to implement new courses in response to community needs. The proposed legislation, inter alia, gave power to the Minister to implement P-10 unhampered by ROSBA assessment considerations. However, it created considerable hostility from parliamentary opposition members, independent schools and some academics. The major objection was that it gave too much political power to the Minister over the curriculum69. After changes were made to the proposed legislation, an amendment was made to the Education Act in November 1987 which created a Queensland Post-Compulsory Course Accreditation Council, a Queensland Teachers Registration Council, and an Advisory Council on Education for Economic Development to replace existing boards including the Board of Secondary School Studies.

At that point, the government changed its Premier and its Minister for Education, and the amendments were not proclaimed.

Informed in 1988 that the new National Party Minister for Education, Brian Littleproud, intended to abolish the Board-issued Junior Certificate, the Board established a Junior Certification Task Force to review Junior certification. Based on 224 public submissions and its own deliberations, the Task Force made a principal recommendation that a centrally issued, centrally validated certificate based on a range of comparability techniques be retained70.

In November 1988, Brian Littleproud introduced new Education Bills into Parliament, one of which was designed to replace the Board of Secondary School Studies with the Board of Senior Secondary School Studies and to remove control of the Junior curriculum and the Junior Certificate from the new Board, which would concentrate on senior secondary education. The Minister claimed that the Junior Certificate was no longer of much significance and that a Ministerial Consultative Council on Curriculum would have an advisory role in the Departmental implementation of P-10. He said that the push to implement P-10 was influenced by the Australian Education Council and Federal interest in introducing greater conformity in State school systems. Paul Braddy, a Labor member of parliament, expressed the belief that it was better to retain the existing board and a centrally issued Junior Certificate. Lyle Schuntner, a Liberal member, claimed that there was some degree of conflict between the Board and the Department for control of curriculum development, and he also supported the retention of a certificate issued by the Board71.

This Bill was passed at the end of 1988, and in 1989 a new Board of Senior Secondary School Studies came into operation. The number of members (including the Chairperson) was reduced from twenty-two to seventeen and, in the process, the proportion of representation of tertiary institutions, the Department of Education and non-government schools was reduced, and the Director of the Board was no longer an ex officio member. Employers and community groups involved in education were given direct representation for the first time. The new legislation gave the Minister for Education increased power over the Board, and the power to nominate four practising teachers to each of the Subject Advisory Committees passed from the Board to the Director-General of Education. Furthermore, the Board was given only temporary control of Junior certification. A change in structure provided a Moderation Committee with sub-committees responsible for Review Procedures, External Examinations and Tertiary Entrance72.

CONCLUSION

During the period 1983 to 1986, the Board of Secondary School Studies continued to have difficulty in reconciling its two tasks. These were the provision of an education suitable for a wide range of individual abilities and aptitudes and for the needs of a modern society, and the provision of a rank order of merit for tertiary entrance. While it won growing acceptance of ROSBA, the Board's efforts to attend to accountability, comparability, and standards resulted in complex organisational structures and onerous work loads for teachers involved in assessment procedures.

Another problem arose when increasing numbers of TE Score candidates had to compete for a limited number of places in tertiary institutions. This led to heightened public criticism of the TE Score System, criticism fostered by an unfriendly press. The period also saw the Department of Education taking steps to gain control over Junior Certificates to further planned curriculum reforms, with a consequent change to the nature and powers of the Board responsible for assessment in secondary schools.


REFERENCES

1. Board of Secondary School Studies, Handbook of External Examination Requirements for 1986, p.11.
2. Board of Secondary School Studies Memorandum, 38/83.
3. 'Report of the Board of Secondary School Studies', 1983, 1984, 1985, in Report of the Minister for Education, 1983, p.64; 1984, p.51; 1985, p.78.
4. A. Bennett, 'ROSBA. Where it came from. What it means', The History Teacher, The Journal of the Queensland History Teachers' Association, No.32, 1983, p.28.
5. A. Bennett, op. cit., p.29.
6. Board of Secondary School Studies, Information Statement, 89/83.
7. Information supplied by members of District Review Panels, V. Ingham and J. Cronk, Division of Curriculum Services, Department of Education.
8. 'Report of the Board of Secondary School Studies', 1982, in Report of the Minister for Education, 1982, pp.68-71.
9. Guidelines for the Development of Syllabuses for Board Subjects. Brisbane: Board of Secondary School Subjects, 1984.
10. Board of Secondary School Studies, Handbook of Procedures for Accreditation and Certification, p.42.
11. Information supplied by members of District Review Panels, V. Ingham and J. Cronk, Division of Curriculum Services, Department of Education.
12. 'Report of the Board of Secondary School Studies', 1982 and 1983, in Report of the Minister for Education, 1982, pp.68-9; 1983, pp.64-5. Board of Secondary School Studies Information Statement, 89/1983.
13. Board Information Statement 1/83.
14. Board Memorandum 54/83.
15. Queensland Teachers Journal, Vol.6, No.12, 1983, pp.3,8.
16. Queensland Teachers Journal, Vol.6, No.1, 1983, pp.8-9.
17. Queensland Teachers Journal, Vol.6, No.12, 1983, pp.3,8.
18. See for example Queensland Teachers Journal, Vol.6, No.2, 1983, pp.6,20; Vol.6, No.9, 1983, p.20; Vol.6, No.13, 1983, p.2. A. Bennett, op. cit., p.31. J. Findlay, 'Criteria-based Assessment in Queensland', The Australian Mathematics Teacher, Vol.43, No.3, 1987, pp.5-6. Recollections by Craig Sherrin in Queensland Parliamentary Debates, Vol.310, 1988-89, pp.2969-70.
19. See annual QCPCA Policy Statements in P & C Guide, 1983-9.
20. Queensland Teachers Journal, Vol.7, No.6, 1984, p.7.
21. Queensland Teachers Journal, Vol.7, No.8, 1984, p.8.
22. Queensland Teachers Journal, Vol.7, No.7, 1984, p.6.
23. Queensland Teachers Journal, Vol.6, No.10, 1983, p.8. Information from Information Officer, Board of Secondary School Studies.
24. Board of Secondary School Studies Memorandum, 223/84. Information provided by University of Queensland and BCAE Kelvin Grove Campus, 6 March 1990.
25. Queensland Parliamentary Debates, Vol.300, 1985-6, p.2093.
26. R. Sadler, Assessment Unit, ROSBA's Family Connections, Discussion Paper 1, Board of Secondary School Studies, January 1986, p.7.
27. 'Report of Board of Secondary School Studies, 1985', in Report of the Minister for Education, 1985, p.78.
28. Report of the Board of Secondary School Studies, in Report of the Minister for Education, 1985, p.79. Information provided by Information Officer, Board of Senior Secondary School Studies.
29. Recorded annually in 'Information and Publications' section of Reports of the Board of Secondary School Studies.
30. Board of Secondary School Studies Information Statements, Memoranda, Handbooks, etc., 1980-6. Board of Senior Secondary School Studies Collection.
31. The Courier-Mail, 26 December 1985.
32. The Sunday Mail, 16 February 1986.
33. The Sunday Mail, 23 February 1986.
34. Annual Report of the Board of Secondary School Studies, 1987, p.6.
35. Statement made by Minister for Education, The Courier-Mail, 20 May 1986, and information from Board of Senior Secondary School Studies.
36. Queensland Teachers Journal, Vol.12, No.1, 1989, p.3; No.2, 1989, p.3.
37. E. Vieth, 'ROSBA: The void between hope and happening', Queensland Teachers Journal, Vol.11, No.5, 1988, pp.12-3.
38. Queensland Teachers Journal, Vol.11, No.7, 1988, p.7.
39. E. Hobbs, Research Report: Managing the Effects of Change in Secondary Education, Research Services, Division of Curriculum Services, Department of Education, Queensland, 1989, pp.1,4.
40. E. Hobbs, Research Report: Issues in Senior Schooling, Research Services, Division of Curriculum Services, Department of Education, Queensland, 1990, pp.29, 44-5, 54-5, 57-8.
41. Based on a description by Dr Ken Smith in The Sunday Mail, 9 March 1986.
42. 'Report of the Board of Secondary School Studies', in Report of the Minister for Education, 1984, p.52.
43. The Courier-Mail, 12 September 1989.
44. The Daily Sun, 13 November 1986.
45. The Sunday Mail, 21 February 1988.
46. 'Report of the Board of Secondary School Studies', in Report of the Minister for Education, 1984, p.52.
47. The Courier-Mail, 6, 10, 14 August 1986.
48. The Courier-Mail, 5 July 1986.
49. The Courier-Mail, 16 August 1986.
50. The Courier-Mail, 14 June 1986, 18 June 1987, 12 September 1989.
51. Information supplied by spokesperson of QCI, 2 March 1990.
52. P & C Guide, Vol.39, No.3, 1986, p.11.
53. E. Hobbs, op. cit., pp.22, 54, 56-7.
54. Queensland Teachers Journal, Vol.9, No.4, 1986, p.5. The Courier-Mail, 22 May 1987.
55. The Courier-Mail, 27 November 1989.
56. Information provided by Dr Phil Meade.
57. Memo to Staff, 19 February 1990, from Br McLaughlin.
58. Judith Hewton, 'Entrance to Tertiary Education: Coverage by a Queensland Newspaper, 1986-1988', M.Ed. Studies thesis submitted January 1990, University of Queensland.
59. The Courier-Mail, 20 November 1989.
60. See statements by Max Howell and Sister Elvera Sesta in The Courier-Mail, 19, 22 February 1990.
61. Minister for Education, Press Release.
62. Information provided by B. McBryde, Research Services, Department of Education.
63. Information provided by B. McBryde, Research Services, Department of Education.
64. See Report of Board of Secondary School Studies, 1983-8, for details.
65. Tertiary Entrance in Queensland: A Review. Report of the Working Party on Tertiary Entrance, established by the Minister's Joint Advisory Committee on Post-Secondary Education and the Board of Secondary School Studies at the request of the Queensland Minister for Education, Brisbane, July 1987.
66. Report of the Board of Secondary School Studies, 1988, p.17.
67. The Courier-Mail, 22 August 1989.
68. Education 2000: Issues and Options for the Future of Education in Queensland. Brisbane: Department of Education, 1985, pp.22-5.
69. The Daily Sun, 25 May, 1 June 1987. The Courier-Mail, 1, 3 June, 24 August, 17 November 1987. The Australian, 3 June 1987.
70. Report of the Board of Secondary School Studies, 1988, pp.10-11.
71. Queensland Parliamentary Debates, Vol.310, 1988-9, pp.2108-9, 2966-8, 2975, 2977.
72. Queensland Teachers Journal, Vol.12, No.2, 1989, p.5.


Appendix


Figure 1

Source: Handbook of Procedures for Accreditation and Certification (ROSBA), p.47.

FLOW CHART OF PROCEDURES FOR CERTIFICATION (Senior Subjects)

START: SCHOOL

1. The school prepares submission of student scripts and Level of Achievement on Form R6.

2. (a) The school receives the original and duplicate of the Form R6 with the student materials. No further action is taken at this time. (Some materials are sampled by the State Review Panel and will be returned to the school later.)

2. (b) The school receives the original, duplicate and triplicate of the Form R6 with the student materials. If consultation is sought by the panel or desired by the school, direct contact should be made with the District Review Panel chairman by the school. The school considers the panel advice and submits a subsequent proposal on the Form R6.

3. (a) The school might receive advice from the State Review Panel.

3. (b) The school makes a subsequent proposal to the State Review Panel.

4. Should the school feel that it is still not satisfied with the advice from a panel, a direct appeal may be made to the Board.

5. The school submits an Exit Proposal on the Form R6 with the Form S5.

6. The school issues Senior Certificates to students.

[Flow chart labels: Submission to DRP; Agreement reached on proposal; To be further considered by school; District Review Panel; State Review Panel; District Board Centre; Board of Secondary School Studies.]

NOTE:

1. State Subjects. For certain subjects, there are no District Review Panels because of the small number of schools offering them in the District. The school submissions are reviewed by the State Review Panel.

2. This page shows in detail that part of the certification process which concerns schools. The complete flow chart is contained in the Handbook of Procedures held by the Principal.

Figure: Diagram 10 (A possible sequence)
