News You Can Use:

Arkansas Times, “State Concedes Error, Says No Intention to Soften School Testing Standards”: On Tuesday, Arkansas Education Commissioner John Key issued a statement that state officials had mischaracterized test scores and “left a misleading impression that Arkansas was backing away from high standards.” A release to parents earlier this month indicated that a score of 3 on recent PARCC assessments amounted to adequate preparation, even though PARCC officials designate a level 3 score only as “approaching proficiency.” “Any assertion that Arkansas has adopted Level 3 as demonstrating proficiency is inaccurate,” Key wrote. “Our description of Level 3 and above as reflecting students being ‘on track for college and career readiness’ was in error. We should have then, as we will from this point forward, used the actual descriptions from PARCC to accurately reflect the performance of students.”

What It Means: Arkansas officials deserve credit for acknowledging their misstep and taking action to correct it. Their clarification – which reaffirms the state’s commitment to high expectations for students – and the state’s steps to set proficiency benchmarks at high levels demonstrate the importance of providing parents and teachers with honest information about student readiness. Over the past several years, states like Arkansas have gone to great lengths to raise classroom expectations. By expanding the definition of proficiency to include students who are less than proficient, Arkansas would have walked back that work. States that inflate proficiency rates to paint a misleading picture of how well students are doing “should give pause to parents, community leaders and policy makers who expect transparency.”

Mathematica Policy Research, “Predictive Validity of MCAS and PARCC”: As Massachusetts officials prepare to decide whether to use MCAS or PARCC tests to measure student readiness, a Mathematica report comparing the two assessments concludes that both are good predictors of college readiness, but finds that PARCC assessments are “significantly better” at determining students’ likelihood of earning B or better grades at the collegiate level. In math, meeting the PARCC standard for college readiness predicts a higher level of college performance than meeting the MCAS standard for proficiency, the report states. Additionally, students meeting PARCC’s proficiency benchmarks were less likely to need remediation than those who met the proficiency benchmarks on MCAS. The report finds that PARCC assessments measure college readiness as well as the SAT. “In sum, in English language arts, PARCC and MCAS provide equally useful information about college readiness,” the report concludes. “[B]ut, PARCC’s standards for college readiness [in math] is better than MCAS’s proficiency standard at identifying students who do not need remediation and can earn “B” grades in college.”

What It Means: The Mathematica report, the first to analyze both PARCC and MCAS assessments as predictors of college readiness, emphasizes that the PARCC test better measures how well prepared students are for college-level work. The findings underscore that PARCC assessments measure students against rigorous academic expectations, especially in math, ensuring that students who meet them have the skills to succeed in college and careers. The findings should help inform policymakers’ decisions as they determine which test to use going forward.

New Orleans Advocate, “BESE Endorses Plan to Allow Common Core Test Comparisons with 10 States and DC”: On Tuesday, the Louisiana Board of Elementary and Secondary Education voted to approve a plan to set proficiency benchmarks on assessments administered this spring at levels consistent with PARCC’s cut scores, allowing comparison to other states using the same tests. State Superintendent John White said setting achievement goals at comparable levels is critical to improving classroom performance long-term. “Our job is not to graduate students. Our job is to graduate students who can compete with other students across the country,” added Jeanne Burns, associate commissioner of the State Board of Regents. The first comparisons to other states will happen when BESE meets in December. Louisiana is expected to rank low, the article reports. “We want to know how students in our state compare with other states,” said Stephanie Desselle, education director for the Council for a Better Louisiana.

What It Means: BESE’s decision to set proficiency benchmarks on student assessments at levels that align with other states that use PARCC tests ensures that parents and teachers will get honest information about their students’ progress. State officials will be able to make an apples-to-apples comparison of student performance and collaborate across state lines. While initial scores may be “sobering,” by setting proficiency levels high Louisiana will avoid masking poor student performance and give families and teachers accurate information, a necessary first step to begin improving outcomes. As State Superintendent John White explains, “The real question ahead of us is not did our students significantly change but how will we change as adults in using the tests.”


Correcting the Record:

Parsippany Daily Record, “PARCC Proficiency a Matter of Choice”: Results from the first year of PARCC assessments in New Jersey come “in a vacuum of sorts,” preventing “any kind of meaningful comparison,” the editorial board writes. “The only context is provided by PARCC itself, which establishes ‘cut scores’ to determine the dividing line between proficient and non-proficient performance…At least two states – Ohio and Arkansas – have already jiggered their own state-level interpretations that include the top three levels as ‘proficient,’ significantly increasing the number of students considered to have received passing marks.” The New Jersey Board of Education can control perceptions with cut scores, the piece argues, opening the door to “a preconceived desire to label a certain percentage of students as failures in an attempt to buttress [Gov.] Christie’s anti-teacher, anti-public school agenda.” Or, “the goal may be to tamp down any dissenting uproar by lowering the bar for a passing grade.” The editorial concludes, “Don’t trust anyone pretending to draw conclusions from the results of an unproven test lacking any context for comparison. Yes, PARCC is supposed to be more difficult – ‘rigorous’ in educator-speak – but that doesn’t automatically make it a better test, or even a fair one.”

Where They Went Wrong: PARCC assessments measure students against levels that reflect what they need to know and be able to do to succeed in college and careers. By measuring against these higher criteria, PARCC gives parents and teachers an accurate measure of student readiness so they can effectively support students’ needs. A Teach Plus study this year found that 79 percent of teacher participants believe PARCC assessments are better than those their states used before. The editorial board is right to caution against setting proficiency benchmarks too low. Expanding the net of proficiency to include students who are less than proficient, as Ohio did, would walk back efforts to raise classroom expectations. But as Mike Petrilli, president of the Fordham Institute, explains, those cases are the exception, not the rule. Most states are closing Honesty Gaps by implementing assessments aligned to Common Core Standards.


On Our Reading List:

The Seventy Four Million, “Ohio, Arkansas Backpedal on Proficiency. They’re the Exception”: For a long time, a patchwork of academic expectations and inconsistent definitions of student proficiency, which were “all over the map,” sent parents and teachers “false positives” that their kids were on track in school when in fact they weren’t. That realization was one reason states developed Common Core State Standards and high-quality assessments aligned to them, and initial results are now trickling in. While Ohio “blinked, setting a standard for proficiency that is well below ‘college and career ready,’” such states are the “rare exceptions,” explains Mike Petrilli, president of the Fordham Institute. “The rule is that states are moving aggressively – and impressively – in the direction of higher standards and more honest definitions of proficiency.” Over the past several years states have begun to ramp up proficiency benchmarks, getting closer to the truth about student readiness. “To be sure, we still need to keep a watchful eye,” Petrilli cautions, as “there’s reason to fret that some [states] might want to go wobbly.” [As noted above, since publication of this article, Arkansas has clarified its position, stating that students will be considered on track at levels 4 and 5 and reaffirming its commitment to high expectations for students.]

Associated Press, “Utah Asks Judge to Toss Common Core Lawsuit”: Utah Judge Paige Petersen is expected to rule early next month on whether the State Board of Education illegally adopted Common Core State Standards, as alleged by a group of parents, teachers and local school board members. “In our case, the Board didn’t do anything other than listen to outside interests,” says Jerry Salcido, an attorney for the plaintiffs. Representatives of the Board of Education say they did everything by the book. A review by Utah’s attorney general last year, which was ordered by Gov. Gary Herbert, found adoption of the standards was legal and that the state has not lost any control over its standards or curriculum. Judge Petersen will rule on the case November 3.

Times Picayune, “States Do What They Want with Common Core Test Results”: The fact that states are setting cut scores for assessments aligned to Common Core State Standards demonstrates that federal authorities are not “pulling the strings,” writes columnist Jarvis DeBerry. “It seems obvious that Common Core isn’t being run from Washington. It’s being run from Columbus and Sacramento and Raleigh and Tallahassee. But not as it should be.” DeBerry says policymakers have “cranked up their public relations machines” to inflate proficiency rates and give the appearance students are doing better than they are. “Parents, teachers, government officials and the students themselves should want to know how students here measure up to their counterparts in other states. But measuring American students one against the other becomes more difficult – if not impossible – if some states are arbitrarily deciding what constitutes passing.”