At our October 2015 meeting, I shared with the Board of Elementary and Secondary Education three conclusions that I had reached regarding the Massachusetts Comprehensive Assessment System (MCAS)/Partnership for Assessment of Readiness for College and Careers (PARCC) decision.
MCAS has reached a point of diminishing returns. MCAS has served the Commonwealth well. Our K-12 public school students lead the nation in academic achievement and are competitive internationally. That success would not have been possible without a high quality assessment providing feedback on student, school, district, and state achievement and progress. In 2015, MCAS was administered for the 18th year. MCAS was a terrific 20th century assessment, but we have a better understanding now than we did one or two decades ago of learning progression in mathematics, of text complexity and the interplay of reading and writing, and of the academic expectations of higher education and employers.
Now that we have the benefit of two decades of experience, and we have upgraded our learning expectations through revisions to our curriculum frameworks and content standards, it is time to upgrade our assessments to a new generation. As we look to the Commonwealth's next-generation assessment, we have the opportunity to build on this knowledge and experience. Perhaps my greatest concern about continuing with MCAS as it exists now is that we have reached a point of diminishing returns. As I see in my visits to schools and as I hear from educators and parents, too often the response to MCAS is instruction designed to teach students to succeed on the test rather than instruction designed to meet the learning standards.
PARCC is a substantial advancement over our current MCAS test. Our goal in joining the PARCC consortium was to build a better test. We had access to more than $100 million in funding for the development work, as well as expertise from state education departments across the country. Massachusetts played a leading role in the consortium, and the Commonwealth's efforts are reflected in the strong quality of PARCC.
In many ways PARCC sets a higher bar than MCAS for student performance. This is particularly true as students move up the grades into middle and high school. This higher bar is not simply about being harder. PARCC provides more opportunity for critical thinking, application of knowledge, research, and connections between reading and writing. As I travel the Commonwealth, I see more and more schools that have upgraded curriculum and instruction to align with our 2010 frameworks, which in turn are represented in the PARCC assessments. Classroom instruction is increasingly focused on the knowledge and skills in the frameworks rather than test preparation.
I also have observed that the computer-based testing experience is qualitatively different from a paper-and-pencil test. The computer-based environment is a more engaging experience, preferred by students by almost a two to one margin. The introduction of video and audio increases accessibility for many students, including students with disabilities and English language learners. Most importantly, the computer-based setting mirrors the digital world that is ubiquitous in students' current and future lives.
We need to ensure the Commonwealth's control of our standards and assessments. The Board's discussions and the public comments we heard have helped me to understand the importance of ensuring the Commonwealth's control over our standards and assessments as we move forward. While Massachusetts has exercised a leadership role among the consortium states, any path forward to a next-generation test that builds on the PARCC assessment must be a direction that the Commonwealth controls.
For these reasons, I am recommending to the Board that we begin work on a next-generation, computer-based MCAS assessment program. This new test will build on the best elements of both PARCC and MCAS and will allow us to retain final control over our test content, testing policies, and test administration procedures.
The following are the recommendations I am asking the Board to endorse next week:
We will incorporate into an upcoming procurement for a new MCAS contract1 the services needed to develop next-generation English language arts (ELA) and mathematics assessments, to be administered in all schools beginning in the spring of 2017. In order to expedite the development process and minimize costs, we will maximize the use of existing PARCC development, as well as MCAS test items, as appropriate. These will be augmented by additional test items developed to meet our needs. We remain committed to a policy of transparency with regard to releasing test items, as we currently do with MCAS.
Because of the time required to conduct a procurement for a new MCAS testing contractor, spring 2016 will need to be a transitional year for grades 3-8. Districts that administered PARCC in spring 2015 will administer PARCC again, and will again have the option to select the computer-based or paper-based versions. Districts that administered MCAS in spring 2015 will administer MCAS again, unless the district affirmatively elects to switch to PARCC (either computer-based or paper-based). The MCAS tests will be augmented with a limited number of PARCC test items to facilitate statewide comparisons and to provide teachers and students in MCAS districts with some initial exposure to these types of questions.
We will convene technical advisory committees representing Massachusetts K-12 teachers, higher education faculty, and assessment experts to advise on the content and test administration policies of the next-generation assessments. Among the policies to be reviewed are the content and length of our tests; the scheduling of test administration windows; our testing policies for students with disabilities and English language learners; and the requirements for the new high school competency determination.2 We will also discuss the timing for reinstituting a history and social science test.
As an adjunct to the test development process, we will convene review panels composed of Massachusetts K-12 teachers and higher education faculty to review the current ELA and mathematics curriculum frameworks and identify any modifications or additions needed to ensure that the Commonwealth's standards match those of the most aspirational education systems in the world, thus representing a course of study that best prepares students for the 21st century.
We will commit to computer-based testing for our state assessments. A paper-based option will be made available through the spring 2018 administration, with a goal of implementing computer-based testing statewide by spring 2019. We will work with districts to help them identify funding sources for the needed technology.
As we did in spring 2015, districts administering PARCC in grades 3-8 for the first time in spring 2016 will be held harmless for any negative changes in their school and district accountability levels. In spring 2017, when we return to a single test for all districts, every district will be subject to accountability level adjustments.
For ELA and mathematics assessments at the high school level in spring 2016, we will offer only the current MCAS grade 10 tests, in order to focus our efforts on the new test development work. We will consult with our technical advisory committees to propose a broader range of high school testing options beginning in spring 2017. Our current MCAS graduation requirement will remain unchanged at least through the Class of 2019.
We will work to ensure that the new PARCC consortium memorandum of understanding, currently under development, fully protects our ability to use PARCC intellectual property in future Massachusetts-based tests.
We expect to remain an active member of the PARCC consortium. I anticipate that continued membership will give us access to high quality assessment research and new test items, with the costs shared among the participating states. Membership also will provide us with useful multi-state data comparisons. Because we will be contracting with our own testing vendor, we will have the flexibility to leave the consortium at any time that membership is no longer of added value to Massachusetts.
In this memorandum I will review the background on my recommendations; comment on some of the concerns and issues raised; and provide a detailed outline of my proposed path forward.
The landmark 1993 Massachusetts education reform law3 directed the Board of Elementary and Secondary Education4 to develop and administer a statewide assessment system to measure the academic achievement and progress of districts, schools, and individual students. Under the Board's direction, the Department of Elementary and Secondary Education developed the Massachusetts Comprehensive Assessment System (MCAS), which has been administered annually since 1998.
In 2011 Massachusetts joined the Partnership for Assessment of Readiness for College and Careers (PARCC), a multi-state consortium formed to develop a new set of assessments for English language arts and mathematics. In November 2013, the Board voted to conduct a two-year "test drive" of the PARCC assessments, in order to decide whether we should adopt them in place of our existing MCAS assessments in those two subjects. In the spring of 2014, PARCC was field tested in a randomized sample of schools in Massachusetts and in the other consortium states. In the spring of 2015, PARCC was administered in full operational mode. In Massachusetts, districts were given the choice of administering either the computer-based version of PARCC, the paper-based version of PARCC, or MCAS.
During the past several months, you have had the opportunity to review numerous research studies and hear presentations from many experts. At our meeting on Tuesday, November 17, I will ask you to discuss and vote on the findings and recommendations presented in this memorandum. Your decision will determine the direction of student assessment in the Commonwealth for the years ahead.
I want to express my thanks and appreciation to all of those who have assisted us in the development and evaluation of the PARCC assessments, including:
I would also like to thank the many educators, public officials, students, and private citizens who have offered thoughtful comments and feedback during this process, either at one of the Board's five public comment sessions earlier this year or in other venues and meetings. In this memorandum I have tried to address many of the recurring themes and concerns that we have heard. Board members are reminded that we will have one final public comment session, on Monday, November 16, from 4:00 pm to 7:00 pm in the Malden High School auditorium. This final session will give you an opportunity to hear feedback on the recommendations presented in this memorandum.
It is impossible to fully separate the assessment debate from the broader debate, here in Massachusetts and nationally, on curriculum frameworks. I want to start by addressing those issues.
The Massachusetts curriculum frameworks date back to the 1993 education reform law, when the Legislature directed the Board to define the skills and knowledge students should have in each grade and in each subject area. Setting statewide curriculum standards for Massachusetts public schools is a fundamental responsibility of the Board. The statewide standards also provide a consistent basis for measuring school and student performance, and assure continuity for students who move from district to district.
Massachusetts currently has curriculum standards and frameworks in seven areas: arts; comprehensive health; English language arts; foreign languages; history and social science; mathematics; and science and technology/engineering. There are also curriculum standards for the 44 career and vocational technical education programs.6 Each was developed with extensive participation by Massachusetts teachers, curriculum specialists, and subject matter experts. Each set of standards is periodically reviewed and updated.
Curriculum standards or frameworks are not the same as a curriculum. A curriculum is a planned sequence of instructional units drawing upon textbooks and other instructional materials. Daily lesson plans define the specific activities and assignments for each class. Curricular decisions have always been made, and continue to be made, at the local level by school committees, school and district administrators, and classroom teachers. Although some states do have state-mandated curricula and textbooks, that is not true in Massachusetts.
In 2008, the National Governors Association (NGA), the Council of Chief State School Officers (CCSSO),7 and Achieve, Inc., published Benchmarking for Success: Ensuring U.S. Students Receive a World-Class Education. The first recommendation of this bipartisan call to action was: "Upgrade state standards by adopting a common core of internationally benchmarked standards in math and language arts for grades K-12 to ensure that students are equipped with the necessary knowledge and skills to be globally competitive." Governors and chief state school officers were aware that in a world where state and national boundaries are increasingly irrelevant to economic and social opportunity, it made little sense for each state to have its own definition of, and assessment standards for, what it means to be literate and to know math.
In 2008, the Department began a review and update of our English language arts (ELA) and mathematics frameworks. These are the two foundational academic subjects. Without proficiency in ELA and mathematics, students are highly unlikely to succeed after high school. Feedback from the business community and from higher education indicated that too often we were doing an insufficient job in preparing all students in these two subjects. Many other states were facing similar concerns, and that prompted a multi-state effort led by the NGA and the CCSSO. Pooling resources among many states seemed to us to be an efficient and effective means of developing new ELA and mathematics frameworks that would better represent college and career readiness. Common standards across state lines would also benefit students in an increasingly mobile society. Massachusetts participated actively in the development of the so-called common core state standards, and in fact the new standards drew heavily from our state's earlier standards.
In 2010, the Board reviewed the common core work and voted to incorporate it into a new set of Massachusetts ELA and mathematics frameworks, along with some additional standards recommended by Massachusetts educators. Our districts have invested a significant amount of time and effort in implementing these standards, including acquisition of new curriculum and instructional materials and extensive professional development for teachers. Feedback from educators in the field who are familiar with the 2010 frameworks has been very positive. Even among teachers who have concerns about our assessment program, I hear very little criticism of the frameworks themselves.
I believe our students will be best served by continuing to implement the 2010 Massachusetts frameworks. Any wholesale change would be both disruptive and costly to our schools. That is not to say that I believe the frameworks are perfect. We need to draw upon our teachers' experiences using the frameworks over the past five years to identify any particular standards that are not working as well as they should and any gaps that need to be filled. Incremental improvement can be done at the same time that we are reviewing and updating our assessment program, and with minimal disruption to local curricular and instructional efforts.
The 1993 education reform law directed the Board to institute an annual statewide assessment program. This was part of the "grand bargain" incorporated in that landmark statute: clear standards, a significant increase in state funding and other resources, and accountability for results. A lively debate is currently underway, here in Massachusetts and across the nation, on the subject of standardized testing. It is entirely appropriate for us to look at what we are testing, how much time we are spending on testing, whether test results are helping to improve instruction, and whether test preparation activities are crowding out more effective uses of classroom time.
But I disagree with those who would eliminate or suspend our annual statewide assessments. I know of no high performing system that fails to benchmark its performance and hold itself responsible for results. MCAS results have supported our education efforts in a number of ways:
The Commonwealth has a constitutional obligation to ensure that all students have the opportunity to receive an adequate education.8 MCAS results are one of several sources of information the Department and the Board use to identify schools and districts that require some additional assistance or intervention from the state.
High quality assessments send important signals about the kinds of curriculum and instruction, teaching and learning that are reflected in the standards.
Teachers and administrators are provided with detailed analyses of student test results, offering useful information on what parts of their curriculum are effective and where instruction needs to be strengthened.
Test results also allow us to identify higher performing schools and districts and spotlight effective practices.
Parents deserve objective feedback on their children's progress through elementary and secondary school grades. When students are performing below their grade-level expectations, we hope that their MCAS score reports will prompt constructive conversations among parents, teachers, and guidance counselors.
Passing the tenth grade MCAS tests is one of the requirements for a student to receive a Massachusetts high school diploma. Before education reform, too many students, especially in our larger and poorer cities, were receiving diplomas without having even a basic foundation of skills and knowledge.
Finally, test scores help us to demonstrate our achievements and our progress to the Legislature and to the public at large. We spend more than $16 billion a year on K-12 public education in the Commonwealth. We have an obligation to demonstrate to the taxpayers that we are spending that money effectively.
I agree that testing by itself does not improve instruction, but it provides essential information to support those improvement efforts.
The 2001 reauthorization of the federal Elementary and Secondary Education Act (popularly called "No Child Left Behind") added a federal mandate for annual statewide testing. Congress is currently considering proposals for a new reauthorization of this law, some of which reduce the federal testing requirement. If and when a new federal law is passed, it will give us an additional opportunity to review and reflect on our state testing program.
Many comments and concerns we heard at our public comment sessions related to testing in general rather than the strengths and weaknesses of specific tests. Here are my thoughts on some of the comments we heard most frequently.
"Our tests don't measure everything." I agree that we want our schools to foster many skills that are not easily measured on standardized statewide tests, for example, creativity or working with others cooperatively. But I also believe that English language arts and mathematics are foundational for success in all other areas. If our schools are not teaching students to be literate and numerate, they are failing those students, regardless of what other successes they may be having.
"Testing takes up too much time." This has been a very widely expressed concern, not only from the public but from educators as well. We have an obligation to ensure that the time required to administer state tests is the minimum necessary to obtain the information we need. But concern over "too much testing" also reflects on assessments selected by districts themselves, as well as classroom time spent in preparing for tests. Research indicates that the value of these activities varies widely. The Department has been studying the amount of time spent in districts on statewide assessments, and we will continue to be vigilant in this area as we encourage and assist districts in evaluating the usefulness of their own testing programs.
"Statewide tests put too much pressure on students." For students, MCAS is a "high stakes" test only in tenth grade, where it is part of the high school graduation requirement. There are no high stakes for students taking the test in the lower grades, so if these students are feeling undue pressure, it seems likely that it is coming from their teachers, principals, and parents. I understand that some educators feel anxiety when we ask how well their schools are performing, but we should expect that they are not sharing those anxieties with their students.
"Our tests are too difficult for students with disabilities and English language learners." We offer a range of accommodations, special tests, and testing policies for these students to reflect their unique needs. We will continue to work with the advocates for these groups to ensure that our testing program is fair. But I do not want to return to the days when we had low aspirations and expectations for these students.
"Testing in some subjects forces schools to deemphasize others." We currently administer statewide tests in English language arts, mathematics, and science. The 1993 education reform law also calls for tests in history and social science, foreign languages, and the arts. Adding additional tests is feasible but pushes against the concerns over too much testing time. There does appear to be considerable interest in reinstating the history and social science assessment, and I expect that we will have more discussion with the field on this topic in the months ahead.
"Private testing companies could misuse confidential student data." We have contracted with private testing companies for more than two decades to help administer our large-scale assessments, including MCAS. All use of confidential student data is subject to federal and state data privacy laws, and we make every effort to ensure that our contractors use best practices in data security. There is no evidence that any of our current testing contractors have misused confidential data, and it is unlikely that they would stay in business very long if they did.
Background
In 2008, the Department began planning for a next-generation MCAS to replace the existing, ten-year-old tests. Data from our state higher education system regarding the high number of students requiring remedial courses pointed out the need for more rigorous assessments at the high school level to signal readiness for post-secondary work. At all grades, we wanted to provide added focus on critical thinking skills as well as factual knowledge, and we wanted to provide richer feedback to students and teachers on areas of strength and weakness. We wanted to explore options for a computer-based assessment, and we knew that changes would be needed to reflect the new ELA and mathematics frameworks then under development.
Budget constraints arising out of the Great Recession of the late 2000s ended this effort before it got very far. But then the U.S. Department of Education offered funding from the American Recovery and Reinvestment Act to states that were willing to work together in partnership to develop state-of-the-art assessments. Two such multi-state consortia were established and funded: the Smarter Balanced Assessment Consortium (SBAC) and the PARCC consortium. Massachusetts was one of the founding members of the PARCC consortium. Our participation in this partnership offered the opportunity to pool our expertise with other states, share the costs of test development, and realize economies of scale in test administration.9
The governing board of the consortium is composed of the chief state school officer of each member state. I was selected by my colleagues to chair the governing board meetings. Each state also provides the time and expertise of state agency staff, educators from the field, and higher education faculty to participate in various leadership groups, advisory committees, and test development activities. Staff from our Student Assessment Services office have devoted a substantial amount of time to the PARCC project over the past five years.10
Test Content and Administration
Our current MCAS assessment includes ELA and mathematics tests in grades 3 through 8 and grade 10. PARCC also has ELA and mathematics tests in grades 3 through 8, but has a broader range of high school tests. There are ELA tests for grades 9, 10, and 11, and course-specific mathematics tests for algebra I, algebra II, and geometry.11
The content and design of the PARCC test items have proved to be of very high quality. The material is well aligned to the common core state standards and provides a richer assessment of reasoning and critical thinking skills than MCAS. Feedback on test content was generally positive from educators who were familiar with both tests. There is, however, room for improvement. There were some isolated instances of test questions that had editing errors or that simply could have been written more clearly or with vocabulary more appropriate to the grade level. This is not an uncommon occurrence in the initial development of a new test; similar problems cropped up in the first years of our MCAS administration. We also noted that some of the PARCC tests did not balance question difficulty as well as we would like.
The use of time limits, in comparison to the untimed MCAS test, pleased many people because it helped to reduce the amount of time students spent in the test session. Others felt that the limits were a problem for some students. In general, a timed test with reasonably generous time limits is preferable. Whether the PARCC time limits meet that standard or require further adjustment is worth additional study.
The move to computer-based testing (CBT) probably occasioned more comment than the actual content of the test. Last spring's administration demonstrated the significant value of CBT. Test items can include richer and more engaging content and a greater range of accessibility features; tests can be scored more quickly and at a lower cost; and CBT reflects the reality that students in the 21st century are doing more keyboarding than handwriting. We also learned that there is a significant learning curve for test administrators in setting up and administering a computer-based test, but districts that did so in both 2014 and 2015 reported that the process was much smoother the second time. The Pearson testing platform performed extraordinarily well, handling millions of users with only scattered problems. Less satisfactory was the performance of the Pearson call center in handling those scattered problems; improvements are being implemented for 2016.
Until all schools have the necessary technology to administer a CBT, we will need to offer a paper version. But we need to help schools get that technology as soon as possible, not just for assessment but to support more individualized and creative instruction and learning. Today's students need to be technologically literate if they want to succeed in college or the workforce. Schools that do not make the effort to upgrade their technology will find themselves losing students to other schools and districts.
Reporting of Results
PARCC student results are reported in five performance bands, compared to four for MCAS. The standards for each performance band are set by the consortium, allowing for potentially useful comparisons of data among the participating states. In contrast, each state determines how the results will be used in its accountability systems. For example, in states such as Massachusetts that require high school students to pass a state test for graduation, the passing score would also be set by the state.
PARCC is developing an expanded set of reporting tools for use by teachers and administrators. These are intended to provide extensive and useful data to inform curriculum and instruction. Because the complete suite of reports has not yet been made available, we cannot evaluate their usefulness at this time.
In terms of reporting timeliness, first-year results were delayed, as expected, due to the standard-setting process. Results in future years will be available earlier; however, the goal of having results by the end of the school year is not likely to be met in the near term. This is because the two testing windows have been combined into a single window, so the open-ended and essay questions, which take the longest to score, will now be administered later in the year.
Additional Diagnostic Assessment Tools
In addition to the summative annual assessments that have been the focus of our efforts, the PARCC project also includes the development of diagnostic assessment tools that districts will be able to purchase for their own use on a voluntary basis. These tools have not yet been released, and the potential costs have not yet been determined. Because it is too soon to gauge the value of and level of interest in these tools, their availability is not a significant factor in my evaluation.
Costs
The total cost of our statewide assessment system is a small fraction of our total K-12 education spending (less than two-tenths of one percent), so I would argue that our decision should be based primarily on the quality of the assessments, not on transitional increases or decreases in that cost as we migrate to the next-generation tests. That said, the per pupil cost of the PARCC assessments is lower than our current MCAS costs, because (a) the development costs were heavily subsidized by federal and foundation grants; (b) computer-based testing is less expensive to deliver and score; and (c) joining with other states provides economies of scale.12 All testing contracts are subject to periodic cost increases when they are re-bid. The current MCAS testing contract is in its last year; the PARCC testing contract runs through June 2018.13
The development costs for next-generation MCAS ELA and mathematics tests are difficult to project without conducting an actual procurement. Costs will depend in part on the length of the tests; the degree to which existing PARCC and MCAS items can be used; and the speed with which we move to all computer-based testing. Combining the new ELA and mathematics tests in the same contract with the MCAS science and legacy grade 10 tests will provide some economies. We can expect an incremental annual cost of several million dollars, spread over three or four years. Savings from even a partial move to computer-based testing will help to offset the development costs.
Once the procurement is conducted, we will be able to provide the Governor and the Legislature with accurate cost information to inform the state budget development.
Governance and Sustainability
Many of the concerns expressed about the PARCC assessment have focused less on the test itself than on the governance structure of the consortium and on its future prospects.
In addition to Massachusetts, the following are currently active members of the consortium: Colorado, the District of Columbia, Illinois, Maryland, New Jersey, New Mexico, and Rhode Island. Aside from Massachusetts, the other members have all committed to using PARCC as their state assessment and are clearly interested in continuing the enterprise. The memorandum of understanding that governs the consortium is scheduled to be renewed at the end of this calendar year; discussions are already underway on needed changes to update and improve the governance structure. In the event that the consortium disbands for any reason at any time in the future, a process is in place to designate a third party to take over and manage the consortium's intellectual property (test items, scoring rubrics, standards, etc.) for the benefit of the members.
With respect to the consortium's decision making, policies are now set by the governing board, and I would expect that some form of that arrangement will continue. Because Massachusetts has had a leadership role in the consortium, there have been relatively few instances where we disagreed with a policy decision. Nevertheless, we do need to acknowledge that we are only one state with one vote, and there are no guarantees that the other states will always move in the direction that we think is appropriate.
The consortium has engaged a consultant, Bellwether Partners, to study and advise it on its structure going forward. A major focus is the development of options for states (both member and non-member) to access and use the PARCC test content without needing to give the complete assessment or needing to use the designated PARCC testing contractor. A number of states in addition to Massachusetts, as well as other educational entities, are interested in these options. I expect the consortium to issue a statement shortly in which the members express their support for this new direction.
For all of the reasons described above, I am asking for your support for the recommendations presented earlier in this memorandum. A motion for your consideration is attached.
Also attached is an initial draft of the scope and workplan for the proposed next-generation MCAS test development program, prepared by our Student Assessment Services office. If you adopt my recommendations, this will be expanded and refined in consultation with our stakeholders.
The approach I have recommended lets us continue to benefit from a high quality, next-generation assessment in which we have invested a great deal of time and effort. It also ensures that the assessment will reflect the Commonwealth's unique needs and concerns. I look forward to discussing this with you next week.
1 The current MCAS contract with Measured Progress, Inc. expires at the end of December 2016. At a minimum, a successor contract is needed for the science tests and for the continued administration of the legacy ELA and mathematics tests used for the high school competency determination.
2 The Board has previously voted to retain the legacy MCAS test as the high school competency determination through at least the class of 2019. The next-generation test would become the competency determination for the class of 2020.
3 St. 1993, c.71.
4 The Board and Department of Elementary and Secondary Education were called the Board and Department of Education until a statutory change in 2008.
5 As I previously reported to you, Liz Davis very recently left the Department to relocate out of the area. Michol Stapel is currently serving as acting associate commissioner for student assessment.
6 The Board has also adopted the English language development standards from the WIDA consortium, a multi-state curriculum effort focusing on English language learners. Massachusetts is one of 37 states in the WIDA consortium.
7 The chief state school officer is the senior public official responsible for K-12 education policy. In Massachusetts that is the commissioner of elementary and secondary education. In other states, the title is commissioner of education, state superintendent of schools, secretary of education, or some other variant.
8 McDuffy v. Secretary of Education, 415 Mass. 545 (1993).
9 In many ways, this partnership among states parallels the partnerships among Massachusetts municipalities that have been created in recent years to share the costs of various administrative services, for example, regional 911 call centers.
10 It has been suggested by some that our participation in the project, and in particular my participation as a member and chairman of the governing board, creates a conflict of interest. From a legal perspective, the State Ethics Commission has reviewed this matter and determined that there is no conflict. From a policy perspective, my ex officio participation on the governing board is no different than superintendents who serve ex officio on the governing boards of the educational collaboratives to which their district belongs. Further, as the Board has noted, our active participation has enabled Massachusetts to advocate for maintaining high standards in the project. I receive no personal gain, fiscal or otherwise, from my role as chairman. Finally, I have no vote in the Board's decision.
11 PARCC also offered Integrated Mathematics tests in high school, but these are being phased out due to lack of participation.
12 Even though many of the original consortium members have since withdrawn, the total number of students in the remaining states is still larger than any one state.
13 The current MCAS testing contractor is Measured Progress. The current PARCC testing contractor is Pearson LLC.