
Presentation of “Value Added Assessment (Outcomes)” in the Madison School District, Including Individual School & Demographic Information



Complete Report: 1.5MB PDF File

Value added is a statistical technique for identifying the effects of schooling on measured student performance. The value-added model uses the data available about students–past test scores and student demographics in particular–to control for prior student knowledge, home and community environment, and other relevant factors, and thereby better measure the effects of schools on student achievement. In practice, value added focuses on student improvement on an assessment from one year to the next.
This report presents value-added results for the Madison Metropolitan School District (MMSD) for the two-year period from November 2007 to November 2009, measuring student improvement across the November administrations of the Wisconsin Knowledge and Concepts Examination (WKCE) in grades three through eight. Also presented are results for the two-year periods from November 2005 to November 2007 and from November 2006 to November 2008. This provides some historical context, presenting value added over time as a two-year moving average.
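To make the idea concrete, here is a minimal sketch of a value-added regression on simulated data: current-year scores are regressed on prior-year scores and a demographic control, and each school's dummy coefficient, centered on the district average, serves as that school's value added. This is an illustration only; the actual VARC model is considerably more elaborate, and all names and numbers below are made up.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 600
prior = rng.normal(500, 50, n)                      # prior-year test score
low_income = rng.integers(0, 2, n).astype(float)    # demographic control
school = rng.integers(0, 3, n)                      # 0=A, 1=B, 2=C
true_effect = np.array([0.0, 5.0, -5.0])[school]    # simulated school effects
now = 0.9 * prior + 10 - 8 * low_income + true_effect + rng.normal(0, 10, n)

# Design matrix: one dummy per school (no separate intercept), prior score,
# and the demographic control. Each school dummy's coefficient is that
# school's adjusted mean growth.
dummies = np.eye(3)[school]
X = np.column_stack([dummies, prior, low_income])
beta, *_ = np.linalg.lstsq(X, now, rcond=None)

# Center on the district average so value added is relative, as in the report.
value_added = beta[:3] - beta[:3].mean()
print({s: round(float(v), 1) for s, v in zip("ABC", value_added)})
```

With enough students, the centered estimates recover the simulated +5 and -5 school effects to within sampling error.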
Changes to the Value Added Model
Some details of the value-added system changed in 2010. The two most substantial changes are the inclusion of differential-effects value-added results and the addition of full-academic-year (FAY) attendance to the set of control variables.
Differential Effects
In addition to overall school- and grade-level value-added measures, this year’s value-added results also include value-added measures for student subgroups within schools. The subgroups included in this year’s value-added results are students with disabilities, English language learners, black students, Hispanic students, and students who receive free or reduced-price lunches. The results measure the growth of students in these subgroups at a school. For example, if a school has a value added of +5 for students with disabilities, then students with disabilities at that school gained 5 more points on the WKCE than observationally similar students across MMSD.
The subgroup results are designed to measure differences across schools in the performance of students in that subgroup relative to the overall performance of students in that subgroup across MMSD. Any overall, district-wide effect of (for example) disability is controlled for in the value-added model and is not included in the subgroup results. The subgroup results reflect relative differences across schools in the growth of students in that subgroup.
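The subgroup idea can be sketched the same way: a school-by-subgroup interaction term captures how much more a subgroup grows at a given school than that subgroup grows district-wide, while the subgroup's district-wide main effect is controlled for. The data and effect sizes below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 3000
prior = rng.normal(500, 50, n)
swd = rng.integers(0, 2, n).astype(float)    # students with disabilities
school = rng.integers(0, 2, n)               # two schools, 0 and 1
sch1 = (school == 1).astype(float)

# Simulate a district-wide SwD growth gap of -6 points, plus +5 extra SwD
# growth at school 1 (the differential effect the model should recover).
now = 0.9 * prior - 6.0 * swd + 5.0 * swd * sch1 + rng.normal(0, 10, n)

# Design: intercept, prior score, SwD main effect (the district-wide gap,
# controlled for and excluded from subgroup results), school main effect,
# and the school-1 x SwD interaction -- the subgroup value added.
X = np.column_stack([np.ones(n), prior, swd, sch1, swd * sch1])
beta, *_ = np.linalg.lstsq(X, now, rcond=None)
print("district-wide SwD gap:", round(float(beta[2]), 1))
print("school 1 SwD value added:", round(float(beta[4]), 1))
```

The interaction coefficient, not the district-wide gap, is what the subgroup results report.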

Much more on “Value Added Assessment”, here.




A Look at Madison’s Use of Value Added Assessment



Lynn Welch:

In the two years Madison has collected and shared value-added numbers, it has seen some patterns emerging in elementary school math learning. But when compared with other districts, such as Milwaukee, Kiefer says there’s much less variation in the value-added scores of schools within the Madison district.
“You don’t see the variation because we do a fairly good job at making sure all staff has the same professional development,” he says.
Proponents of the value-added approach agree the data would be more useful if the Wisconsin Department of Public Instruction were to establish a statewide value-added system. DPI is instead developing an assessment system to look at school-wide trends and improve instruction for individual students.
…..
But some question whether value-added data truly benefits all students, or is geared toward closing the gap between high- and low-performing students.
“Will the MMSD use new assessments…of students’ progress to match instruction levels with demonstrated learning levels?” asks Lorie Raihala, a Madison parent who is part of a group seeking better programming for high-achieving ninth- and 10th-graders at West High School. “So far the district has not done this.”
Others are leery of adding another measurement tool. David Wasserman, a teacher at Sennett Middle School and part of a planning group pushing to open Badger Rock Middle School, a green charter (see sidebar), made national news a few years ago when he refused to administer a mandatory statewide test. He still feels that a broad, student-centered evaluation model that takes multiple assessments into account gives the best picture.
“Assessment,” he says, “shouldn’t drive learning.”

Notes and links on “Value Added Assessment”, and the oft-criticized WKCE, on which it is based, here.




More Comments on the Los Angeles Value Added Assessment Report



Melissa Westbrook:

So most of you may have heard that the LA Times is doing a huge multi-part story about teacher evaluation. One of the biggest parts is a listing of every single public school teacher and their classroom test scores (and the teachers are called out by name).
From the article:

Though the government spends billions of dollars every year on education, relatively little of the money has gone to figuring out which teachers are effective and why.
Seeking to shed light on the problem, The Times obtained seven years of math and English test scores from the Los Angeles Unified School District and used the information to estimate the effectiveness of L.A. teachers — something the district could do but has not.
The Times used a statistical approach known as value-added analysis, which rates teachers based on their students’ progress on standardized tests from year to year. Each student’s performance is compared with his or her own in past years, which largely controls for outside influences often blamed for academic failure: poverty, prior learning and other factors.

Interestingly, the LA Times apparently had access to more than 50 elementary school classrooms. (Yes, I know it’s public school but man, you can get pushback as a parent to sit in on a class so I’m amazed they got into so many.) And guess what, these journalists, who may or may not have ever attended a public school or have kids, made these observations:




Governance: Madison School Board Members Proposed 2010-2011 Budget Amendments: Cole, Hughes, Mathiak, Moss & Silveira. Reading Recovery, Teaching & Learning, “Value Added Assessment” based on WKCE on the Chopping Block



Well worth reading. Highlights include:
Maya Cole: suggestions on Reading Recovery spending (60% to 42%: Madison School District’s Reading Recovery Effectiveness Lags “National Average”: Administration seeks to continue its use), an administrative compensation comparison, and a proposal to eliminate the District’s public information position.
Ed Hughes: a suggestion to eliminate the District’s lobbyist (Madison is the only district in the state with a lobbyist) and to trade salary increases for jobs.
Lucy Mathiak: recommendations regarding Teaching & Learning, elimination of the “expulsion navigator position”, and a reduction of administrative travel to fund Instructional Resource Teachers.
Arlene Silveira: a recommendation to reduce supply spending in an effort to fund elementary school coaches, and a $200,000 reduction in consultant spending.
Details via the following links:
Maya Cole: 36K PDF
Ed Hughes: 127K PDF
Lucy Mathiak: 114K PDF
Beth Moss: 10K PDF
Arlene Silveira: 114K PDF
The Madison School District Administration responded in the following pdf documents:

Much more on the proposed 2010-2011 Madison School District Budget here.




Another Look at the Madison School District’s Use of “Value Added Assessment”





Andy Hall:

The analysis of data from 27 elementary schools and 11 middle schools is based on scores from the Wisconsin Knowledge and Concepts Examination (WKCE), a state test required by the federal No Child Left Behind law.
Madison is the second Wisconsin district, after Milwaukee, to make a major push toward value-added systems, which are gaining support nationally as an improved way of measuring school performance.
Advocates say it’s better to track specific students’ gains over time than the current system, which holds schools accountable for how many students at a single point in time are rated proficient on state tests.
“This is very important,” Madison schools Superintendent Daniel Nerad said. “We think it’s a particularly fair way … because it’s looking at the growth in that school and ascertaining the influence that the school is having on that outcome.”
The findings will be used to pinpoint effective teaching methods and classroom design strategies, officials said. But they won’t be used to evaluate teachers: That’s forbidden by state law.
The district paid about $60,000 for the study.

Much more on “Value Added Assessment” here.
Ironically, the Wisconsin Department of Public Instruction stated the following:

“… The WKCE is a large-scale assessment designed to provide a snapshot of how well a district or school is doing at helping all students reach proficiency on state standards, with a focus on school and district-level accountability. A large-scale, summative assessment such as the WKCE is not designed to provide diagnostic information about individual students. Those assessments are best done at the local level, where immediate results can be obtained. Schools should not rely on only WKCE data to gauge progress of individual students or to determine effectiveness of programs or curriculum.”

Related:




“Value Added Assessment” Madison School Board’s Performance & Achievement Committee Looks at “A Model to Measure Student Performance”



Video / 20MB Mp3 Audio

Superintendent Art Rainwater gave a presentation on “Value Added Assessment” to the Madison School Board’s Performance & Achievement committee Monday evening. Art described VAA “as a method to track student growth longitudinally over time and to utilize that data to look at how successful we are at all levels of our organization”. MMSD CIO Kurt Kiefer, Ernie Morgan, Mike Christian and Rob Meyer, a senior scientist at WCER presented this information to the committee (there were two others whose names I could not decipher from the audio).

Related Links:

The fact that the School Board is actually discussing this topic is a positive change from the recent past. One paradox of this initiative is that while the MMSD is apparently collecting more student performance data, some parents (though some teachers still provide full report cards) are actually receiving less via the report card reduction activities (more here and here). Perhaps the school district’s new parent portal will provide more up-to-date student data.
A few interesting quotes from the discussion:

45 min: Kurt has built a very rich student database over the years (it goes back to 1990).
46 min, Superintendent Art Rainwater: We used to always have the opinion here that if we didn’t invent it, it couldn’t possibly be any good, because we’re so smart that we’d have thought of it before anybody else if it was any good. Hopefully, we’ve begun to understand that there are 15,000 school districts in America and that all of them are doing some things that we can learn from.
47 min, Art, continued: It’s a shame Ruth (Robarts) isn’t sitting here, because a lot of the things Ruth used to ask us to do, that we said we just didn’t have the tools for, I think, over time, this will give us the tools that we need. More from Ruth here and here.
55 min: Arlene Silveira asked about staff reaction in Milwaukee and Chicago to this type of analysis.
69 min: Maya asked how the School Board will use this to determine if this program or that program is working. Maya also asked earlier about the data source for this analysis, whether it is the WKCE or the NAEP. Kurt responded that they would use the WKCE (which, unfortunately, seems to change every few years).
71 min, Lawrie Kobza: This has been one of the most interesting discussions I’ve been at since I’ve been on the school board.

Lawrie, Arlene and Maya look like they will be rather active over the next 8 months.




New Jersey’s “Value Added” Teacher Assessment Program



Laura Waters:

Education News looks at the first results of N.J.’s value-added teacher evaluations, which “found that overall, 23.4% of teachers received ‘highly effective’ ratings; 73.9% of teachers were rated ‘effective’; 2.5% were rated ‘partially effective’, a rating which can affect tenure; and .02%, about 200 total teachers in the state, were rated ‘ineffective’.” Also see NJ Spotlight, the Record, and the DOE report.




ASA Statement on Using Value-Added Models for Educational Assessment



American Statistical Association:

Many states and school districts have adopted Value-Added Models (VAMs) as part of educational accountability systems. The goal of these models, which are also referred to as Value-Added Assessment (VAA) Models, is to estimate effects of individual teachers or schools on student achievement while accounting for differences in student background. VAMs are increasingly promoted or mandated as a component in high-stakes decisions such as determining compensation, evaluating and ranking teachers, hiring or dismissing teachers, awarding tenure, and closing schools.

The American Statistical Association (ASA) makes the following recommendations regarding the use of VAMs:

The ASA endorses wise use of data, statistical models, and designed experiments for improving the quality of education.

VAMs are complex statistical models, and high-level statistical expertise is needed to develop the models and interpret their results.

Estimates from VAMs should always be accompanied by measures of precision and a discussion of the assumptions and possible limitations of the model. These limitations are particularly relevant if VAMs are used for high-stakes purposes.

VAMs are generally based on standardized test scores, and do not directly measure potential teacher contributions toward other student outcomes.

VAMs typically measure correlation, not causation: Effects – positive or negative – attributed to a teacher may actually be caused by other factors that are not captured in the model.

Under some conditions, VAM scores and rankings can change substantially when a different model or test is used, and a thorough analysis should be undertaken to evaluate the sensitivity of estimates to different models.
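The ASA’s correlation-versus-causation caution can be demonstrated in a few lines: if an unmeasured factor that happens to be correlated with teacher assignment is omitted from the model, a teacher with no true effect still receives a nonzero estimated “effect.” The scenario below (outside tutoring correlated with classroom assignment) is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4000
teacher = rng.integers(0, 2, n)      # two teachers, both equally effective
# Unmeasured factor: tutored students are likelier to land with teacher 1.
tutoring = (rng.random(n) < np.where(teacher == 1, 0.6, 0.2)).astype(float)
prior = rng.normal(500, 50, n)
# True data-generating process: tutoring helps, teachers are identical.
now = 0.9 * prior + 8.0 * tutoring + rng.normal(0, 10, n)

# A VAM that controls only for prior scores attributes the tutoring gap
# to teacher 1, because tutoring is not in the model.
X = np.column_stack([np.ones(n), prior, (teacher == 1).astype(float)])
beta, *_ = np.linalg.lstsq(X, now, rcond=None)
print("estimated 'teacher 1 effect':", round(float(beta[2]), 1))
```

The estimated effect is well away from zero even though both simulated teachers are identical, which is exactly the ASA's point about effects "caused by other factors that are not captured in the model."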

Much more on value added assessment, here.




Devaluing value-added assessments



Jay Matthews:

I don’t spend much time debunking our most powerful educational fad: value-added assessments to rate teachers. My colleague Valerie Strauss eviscerates value-added several times a week on her Answer Sheet blog with the verve of a samurai, so who needs me?
Unfortunately, value-added is still growing in every corner of our nation, including D.C. schools, despite all that torn flesh and missing pieces. It’s like those monsters lumbering through this year’s action films. We’ve got to stop them! Let me fling my small, aged body in their way with the best argument against value-added I have seen in some time.
It comes from education analyst and teacher trainer Grant Wiggins and his “Granted, but . . .” blog. He starts with the reasons many people, including him and me, like the idea of value-added. Why not rate teachers by how much their students improve over time? In theory, this allows us to judge teachers in low- and high-income schools fairly, instead of declaring, as we tend to do, that the teachers in rich neighborhoods are better than those in poor neighborhoods because their students’ test scores are higher.

Much more on “value added assessment”, here.




Value Added Teacher Assessment



Jason Felch & Jason Song:

Terry Grier, former superintendent of San Diego schools, encountered union opposition when he tried to use the novel method. His fight offers a peek at a brewing national debate.
When Terry Grier was hired to run San Diego Unified School District in January 2008, he hoped to bring with him a revolutionary tool that had never been tried in a large California school system.
Its name — “value-added” — sounded innocuous enough. But this number-crunching approach threatened to upend many traditional notions of what worked and what didn’t in the nation’s classrooms.
It was novel because rather than using tests to take a snapshot of overall student achievement, it used scores to track each pupil’s academic progress from year to year. What made it incendiary, however, was its potential to single out the best and worst teachers in a nation that currently gives virtually all teachers a passing grade.
In previous jobs in the South, Grier had used the method as a basis for removing underperforming principals, denying ineffective teachers tenure and rewarding the best educators with additional pay.
In California, where powerful teachers unions have been especially protective of tenure and resistant to merit pay, Grier had a more modest goal: to find out if students in the San Diego district’s poorest schools had equal access to effective instructors.




“Value-added measures are the Mark of the Devil”



Caitlin Emma:

Eskelsen García already has fiery words for the feds, whom she holds responsible for the growing use of “value-added measures,” or VAMs, algorithms that aim to assess teacher effectiveness by student growth on standardized tests. The idea has gained traction under the Obama administration through waivers from No Child Left Behind and the administration’s signature Race to the Top program. But studies, including some funded by the Education Department, have cast doubt on the validity of the measures.

VAMs “are the mark of the devil,” Eskelsen García said.

The algorithms do aim to account for variables such as student poverty levels. But Eskelsen García said they can’t capture the complete picture.

The year she taught 22 students in one class and the year she taught 39 students in one class — “Is that factored into a value-added model? No,” she said. “Did they factor in the year that we didn’t have enough textbooks so all four fifth-grade teachers had to share them on a cart and I couldn’t send any books home to do homework with my kids?”

“It’s beyond absurd,” she added. “And anyone who thinks they can defend that is trying to sell you something.”

Locally, Madison schools have been spending money and time on value-added assessment for years.




AFT’s Weingarten Backtracks on Using Value-Added Measures for Teacher Evaluations



Stephen Sawchuk:

American Federation of Teachers President Randi Weingarten has announced that she’ll call for the end of using “value added” measures as a component in teacher-evaluation systems.
Politico first reported that the AFT is beginning a campaign to discredit the measures, beginning with the catchy (if not totally original) slogan “VAM is a sham.” We don’t yet know exactly what this campaign will encompass, but it will apparently include an appeal to the U.S. Department of Education, generally a proponent of VAM.
Value-added methods use statistical algorithms to figure out how much each teacher contributes to his or her students’ learning, holding constant factors like student demographics.
In all, though, Weingarten’s announcement is less major policy news than it is something of a retreat to a former position.
When I first interviewed Weingarten about the use of test scores in evaluation systems, in 2008, she said that educators have “a moral, statistical, and educational reason not to use these things for teacher evaluation.”

Much more on “value added assessment”, here.




About Value-Added And “Junk Science”



Matthew DiCarlo:

One can often hear opponents of value-added referring to these methods as “junk science.” The term is meant to express the argument that value-added is unreliable and/or invalid, and that its scientific “façade” is without merit.
Now, I personally am not opposed to using these estimates in evaluations and other personnel policies, but I certainly understand opponents’ skepticism. For one thing, there are some states and districts in which design and implementation have been somewhat careless, and, in these situations, I very much share the skepticism. Moreover, the common argument that evaluations, in order to be “meaningful,” must consist of value-added measures in a heavily-weighted role (e.g., 45-50 percent) is, in my view, unsupportable.
All that said, calling value-added “junk science” completely obscures the important issues. The real questions here are less about the merits of the models per se than how they’re being used.
If value-added is “junk science” regardless of how it’s employed, then a fairly large chunk of social scientific research is “junk science.” If that’s your opinion, then okay – you’re entitled to it – but it’s not very compelling, at least in my (admittedly biased) view.




Using Value-Added Analysis to Raise Student Achievement in Wisconsin



Sarah Archibald & Mike Ford:

Past attempts to improve student assessment in Wisconsin provide reasons to view current efforts with caution. The promise of additional funds, the political cover of broad committees, and the satisfaction of setting less-than-ambitious goals have too often led to student assessment policies that provide little meaningful information to parents, teachers, schools and taxpayers. A state assessment system should provide meaningful information to all of these groups.
Data on student progress can make the work of teachers, students, parents, administrators and policymakers more effective. It can ensure that during the course of the school year, students make progress toward their own growth targets and those who do not are flagged and interventions are done to get those students back on track. It should not come as a surprise that to have meaningful, timely data, one must administer meaningful, timely tests, and Wisconsin is falling short in this department in a number of ways.
School-level value-added analyses of student test scores are already being calculated for all schools with third- to eighth-graders statewide by a respected institution right here in Wisconsin. This information should be used by schools and districts to raise school and teacher productivity. We should continue to explore the use of value-added at the classroom level, a necessary step to implementing the new teacher-evaluation system proposed by the DPI that is statutorily required for implementation in 2014-’15.

Related: www.wisconsin2.org.




I dare you to measure the “value” I add



Summer:

(When I wrote this, I had no idea just how deeply this would speak to people and how widely it would spread. So, I think a better title is I Dare You to Measure the Value WE Add, and I invite you to share below your value as you see it.)
Tell me how you determine the value I add to my class.
Tell me about the algorithms you applied when you took data from 16 students over the course of nearly five years of teaching and somehow used it to judge me as “below average” and “average”.
Tell me how you can examine my skills and talents and attribute worth to them without knowing me, my class, or my curriculum requirements.
Tell me how and I will tell you:

Much more on “value added assessment”, here.




Turning the Tables: VAM (Value Added Models) on Trial



David Cohen:

Los Angeles Unified School District is embroiled in negotiations over teacher evaluations, and will now face pressure from outside the district intended to force counter-productive teacher evaluation methods into use. Yesterday, I read this Los Angeles Times article about a lawsuit to be filed by an unnamed “group of parents and education advocates.” The article notes that, “The lawsuit was drafted in consultation with EdVoice, a Sacramento-based group. Its board includes arts and education philanthropist Eli Broad, former ambassador Frank Baxter and healthcare company executive Richard Merkin.” While the defendant in the suit is technically LAUSD, the real reason a lawsuit is necessary, according to the article, is that “United Teachers Los Angeles leaders say tests scores are too unreliable and narrowly focused to use for high-stakes personnel decisions.” Note that, once again, we see a journalist telling us what the unions say and think without ever bothering to mention why, offering no acknowledgment that the bulk of the research and the three leading organizations for education research and measurement (AERA, NCME, and APA) say the same thing as the union (or rather, that the union is saying the same thing as the testing experts). Upon what research does the other side base its arguments in favor of using test scores and “value-added” measurement (VAM) as a legitimate measure of teacher effectiveness? They never answer, but the debate somehow continues ad nauseam.
It’s not that the plaintiffs in this case are wrong about the need to improve teacher evaluations. Accomplished California Teachers has published a teacher evaluation report that has concrete suggestions for improving evaluations as well, and we are similarly disappointed in the implementation of the Stull Act, which has been allowed to become an empty exercise in too many schools and districts.

Much more on “value added assessment”, here.




The Inevitability of the Use of Value-Added Measures in Teacher Evaluations



Madison School Board Member Ed Hughes:

“Value added” or “VA” refers to the use of statistical techniques to measure teachers’ impacts on their students’ standardized test scores, controlling for such student characteristics as prior years’ scores, gender, ethnicity, disability, and low-income status.
Reports on a massive new study that seem to affirm the use of the technique have recently been splashed across the media and chewed over in the blogosphere. Further from the limelight, developments in Wisconsin seem to ensure that in the coming years value-added analyses will play an increasingly important role in teacher evaluations across the state. Assuming the analyses are performed and applied sensibly, this is a positive development for student learning.
The Chetty Study
Since the first article touting its findings was published on the front page of the January 6 New York Times, a new research study by three economists assessing the value-added contributions of elementary school teachers and their long-term impact on their students’ lives – referred to as the Chetty article after the lead author – has created as much of a stir as could ever be expected for a dense academic study.

Much more on value added assessment, here.
It is important to note that the Madison School District’s value added assessment initiative is based on the oft-criticized WKCE.




The Value-Add Map Is Not the Teaching Territory, But You’ll Still Get Lost without It



Greg Forster:

Since we’re so deep into the subject of value-added testing and the political pressures surrounding it, I thought I’d point out this recently published study tracking two and a half million students from a major urban district all the way to adulthood. (HT Whitney Tilson)
They compare teacher-specific value added on math and English scores with eventual life outcomes, and apply tests to determine whether the results are biased either by student sorting on observable variables (the life outcomes of their parents, obtained from the same life-outcome data) or unobserved variables (they use teacher switches to create a quasi-experimental approach).

Much more on value added assessment, here.




What Value-Added Research Does And Does Not Show



Matthew DiCarlo:

Value-added and other types of growth models are probably the most controversial issue in education today. These methods, which use sophisticated statistical techniques to attempt to isolate a teacher’s effect on student test score growth, are rapidly assuming a central role in policy, particularly in the new teacher evaluation systems currently being designed and implemented. Proponents view them as a primary tool for differentiating teachers based on performance/effectiveness.
Opponents, on the other hand, including a great many teachers, argue that the models’ estimates are unstable over time, subject to bias and imprecision, and that they rely entirely on standardized test scores, which are, at best, an extremely partial measure of student performance. Many have come to view growth models as exemplifying all that’s wrong with the market-based approach to education policy.
It’s very easy to understand this frustration. But it’s also important to separate the research on value-added from the manner in which the estimates are being used. Virtually all of the contention pertains to the latter, not the former. Actually, you would be hard-pressed to find many solid findings in the value-added literature that wouldn’t ring true to most educators.

Much more on value added assessment, here.




LAUSD won’t release teacher names with ‘value-added’ scores



Jason Song:

The Los Angeles Unified School District has declined to release to The Times the names of teachers and their scores indicating their effectiveness in raising student performance.
The nation’s second-largest school district calculated confidential “academic growth over time” ratings for about 12,000 math and English teachers last year. This fall, the district issued new ratings to about 14,000 instructors, which can also be viewed by their principals. The scores are based on an analysis of a student’s performance on several years of standardized tests and estimate a teacher’s role in raising or lowering student achievement.

Much more on value-added assessment, which, in Madison is based on the oft-criticized WKCE.




Value Added Report for the Madison School District



Full Report 1.1MB PDF

Value added is a statistical technique for isolating the contributions of schools to measured student knowledge from other influences such as prior student knowledge and demographics. In practice, value added focuses on the improvement of students from one year to the next on an annual state examination or other periodic assessment. The Value-Added Research Center (VARC) of the Wisconsin Center for Education Research produces value-added measures for schools in Madison using the Wisconsin Knowledge and Concepts Examination (WKCE) as the outcome. The model controls for prior-year WKCE scores, gender, ethnicity, disability, English language learner status, low-income status, parent education, and full-academic-year enrollment to capture the effects of schools on student performance on the WKCE. This model yields measures of student growth in Madison schools relative to each other. VARC also produces value-added measures using the entire state of Wisconsin as a data set, which yields measures of student growth in the Madison Metropolitan School District (MMSD) relative to the rest of the state.
Some of the most notable results are:
1. Value added for the entire district of Madison relative to the rest of the state is generally positive, but it differs by subject and grade. In both 2008-09 and 2009-10, and in both math and reading, the value added of the Madison Metropolitan School District was positive in more grades than it was negative, and the average value added across grades was positive in both subjects in both years. There are variations across grades and subjects, however. In grade 4, value added is significantly positive in both years in reading and significantly negative in both years in math. In contrast, value added in math is significantly positive–to a very substantial extent–in grade 7. Some of these variations may result from the extent to which instruction in those grades facilitates student learning on tested material relative to non-tested material. Overall, between November 2009 and November 2010, value added for MMSD as a whole relative to the state was very slightly above average in math and substantially above average in reading. The section “Results from the Wisconsin Value-Added Model” presents these results in detail.
2. The variance of value added across schools is generally smaller in Madison than in the state of Wisconsin as a whole, particularly in math. In other words, at least in terms of what is measured by value added, schools in Madison differ from each other less than schools elsewhere in Wisconsin do. This appears to be more strongly the case in the middle school grades than in the elementary grades. Some of this result may be an artifact of schools in Madison being relatively large; when schools are large, they encompass more classrooms per grade, so more of the across-classroom variance falls within schools rather than across schools. More of this result may be that while the variance across schools in Madison is entirely within one district, the variance across schools for the rest of the state spans many districts, and differences in district policies will likely generate more variance across the entire state. The section “Results from the Wisconsin Value-Added Model” presents results on the variance of value added from the statewide value-added model. This result is also evident in the charts in the “School Value-Added Charts from the MMSD Value-Added Model” section: the majority of schools’ confidence intervals cross the district average, which means that we cannot reject the hypothesis that those schools’ value added equals the district average.
Even with a relatively small variance across schools in the district in general, several individual schools have values added that are statistically significantly greater or less than the district average. At the elementary level, both Lake View and Randall have values added in both reading and math that are significantly greater than the district average. In math, Marquette, Nuestro Mundo, Shorewood Hills, and Van Hise also have values added that are significantly greater than the district average. Values added are lower than the district average in math at Crestwood, Hawthorne, Kennedy, and Stephens, and in reading at Allis. At the middle school level, value added in reading is greater than the district average at Toki and lower than the district average at Black Hawk and Sennett. Value added in math is lower than the district average at Toki and Whitehorse.
3. Gaps in student improvement persist across subgroups of students. The value-added model measures gaps in student growth over time by race, gender, English language learner status, and several other subgroups. The gaps are overall gaps, not gaps relative to the rest of the state. These gaps are especially informative because they are partial coefficients: they measure the black/white, ELL/non-ELL, or high-school/college-graduate-parent gaps controlling for all available variables, including both demographic variables and schools attended. If one wanted to measure the combined effect of being both ELL and Hispanic relative to non-ELL and white, one would add the ELL/non-ELL gap to the Hispanic/white gap to find the combined effect. The gaps are within-school gaps, based on comparisons of students in different subgroups who attend the same schools; consequently, these gaps do not include any effects of students of different subgroups sorting into different schools, and reflect within-school differences only. There does not appear to be an evident trend over time in the gaps by race, low-income status, and parent education measured by the value-added model. The section “Coefficients from the MMSD Value-Added Model” presents these results.
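Because the reported gaps are partial coefficients from the same model, combined gaps for students in multiple subgroups are simply additive. The numbers below are hypothetical, chosen only to show the arithmetic the report describes:

```python
# Hypothetical partial coefficients from a value-added model (illustrative
# numbers only; the report does not state these values).
ell_gap = -3.0        # ELL vs. non-ELL growth gap, holding other controls fixed
hispanic_gap = -2.5   # Hispanic vs. white growth gap, holding other controls fixed

# Combined gap for a Hispanic ELL student relative to a non-ELL white
# student is the sum of the two partial gaps.
combined = ell_gap + hispanic_gap
print(combined)  # -5.5
```

This additivity is exactly what "partial coefficients" buys: each gap is measured holding the others fixed, so they can be stacked without double counting.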
4. The gap in student improvement by English language learner status, race, or low-income status usually does not differ substantively across schools; the gap between students with disabilities and students without disabilities sometimes does. This can be seen in the subgroup value-added results across schools, which appear in the Appendix. There are some schools where value added for students with disabilities differs substantively from overall value added. Some of these differences may be due to differences in the composition of students with disabilities across schools, although the model already controls for overall differences among students with learning disabilities, students with speech disabilities, and students with all other disabilities. In contrast, value added for black, Hispanic, ELL, or economically disadvantaged students is usually very close to overall value added.
Value added for students with disabilities is greater than the school’s overall value added in math at Falk and Whitehorse and in reading at Marquette; it is lower than the school’s overall value added in math at O’Keefe and Sennett and in reading at Allis, Schenk, and Thoreau. Value added in math for Hispanic students is lower than the school’s overall value added at Lincoln, and greater than the school’s overall value added at Nuestro Mundo. Value added in math is also higher for ELL and low-income students than it is for the school overall at Nuestro Mundo.

Much more on “value added assessment”, here.




‘Value-added’ teacher evaluations: Los Angeles Unified tackles a tough formula



Teresa Watanabe:

In Houston, school district officials introduced a test score-based evaluation system to determine teacher bonuses, then — in the face of massive protests — jettisoned the formula after one year to devise a better one.
In New York, teachers union officials are fighting the public release of ratings for more than 12,000 teachers, arguing that the estimates can be drastically wrong.
Despite such controversies, Los Angeles school district leaders are poised to plunge ahead with their own confidential “value-added” ratings this spring, saying the approach is far more objective and accurate than any other evaluation tool available.
“We are not questing for perfect,” said L.A. Unified’s incoming Supt. John Deasy. “We are questing for much better.”

Much more on “Value Added Assessment“, here.




Study backs ‘value-added’ analysis of teacher effectiveness



Classroom effectiveness can be reliably estimated by gauging students’ progress on standardized tests, Gates foundation study shows. Results come amid a national effort to reform teacher evaluations.
Teachers’ effectiveness can be reliably estimated by gauging their students’ progress on standardized tests, according to the preliminary findings of a large-scale study released Friday by leading education researchers.
The study, funded by the Bill and Melinda Gates Foundation, provides some of the strongest evidence to date of the validity of “value-added” analysis, whose accuracy has been hotly contested by teachers unions and some education experts who question the use of test scores to evaluate teachers.
The approach estimates a teacher’s effectiveness by comparing his or her students’ performance on standardized tests to their performance in previous years. It has been adopted around the country in cities including New York; Washington, D.C.; Houston; and soon, if local officials have their way, Los Angeles.
The $45-million Measures of Effective Teaching study is a groundbreaking effort to identify reliable gauges of teacher performance through an intensive look at 3,000 teachers in cities throughout the country. Ultimately, it will examine multiple approaches, including using sophisticated observation tools and teachers’ assessments of their own performance.

Much more on value added assessment, here.




Adding Value to the Value-Added Debate



Liam Goldrick & Dr. Sara Goldrick-Rab

Seeing as I am not paid to blog as part of my daily job, it’s basically impossible for me to be even close to first out of the box on the issues of the day. Add to that being a parent of two small children (my most important job – right up there with being a husband), and that only adds to my occasional frustration at not being able to weigh in on some of these issues quickly.
That said, here is my attempt to distill some key points and share my opinions — add value, if you will — to the debate that is raging as a result of the Los Angeles Times’s decision to publish the value-added scores of individual teachers in the L.A. Unified School District.
First of all, let me address the issue at hand. I believe that the LA Times’s decision to publish the value-added scores of individual teachers was irresponsible. Given what we know about the unreliability and variability in such scores and the likelihood that consumers of said scores will use them at face value without fully understanding all of the caveats, this was a dish that should have been sent back to the kitchen.
Although the LA Times is not a government or public entity, it does operate in the public sphere. And it has a responsibility as such an actor. Its decision to label LA teachers as ‘effective’ and ‘ineffective’ based on suspect value-added data alone is akin to an auditor secretly investigating a firm or agency without an engagement letter and publishing findings that may or may not hold water.
Frankly, I don’t care what positive benefits this decision by the LA Times might have engendered. Yes, the district and the teachers union have agreed to begin negotiations on a new evaluation system. Top district officials have said they want at least 30% of a teacher’s review to be based on value-added and have wisely said that the majority of the evaluations should depend on classroom observations. Some have argued that such a development exonerates the LA Times. In my mind, any such benefits are purloined and come at the expense of sticking it — rightly in some cases, certainly wrongly in others — to individual teachers who mostly are trying their best.




Value Added Models & Student Information Systems



147K PDF via a Dan Dempsey email:

The following abstract and conclusion is taken from:
Volume 4, Issue 4 – Fall 2009 – Special Issue: Key Issues in Value-Added Modeling
Would Accountability Based on Teacher Value Added Be Smart Policy? An Examination of the Statistical Properties and Policy Alternatives
Douglas N. Harris of University of Wisconsin Madison
Education Finance and Policy Fall 2009, Vol. 4, No. 4: 319-350.
Available here:
http://www.mitpressjournals.org/doi/pdfplus/10.1162/edfp.2009.4.4.319
Abstract
Annual student testing may make it possible to measure the contributions to student achievement made by individual teachers. But would these “teacher value added” measures help to improve student achievement? I consider the statistical validity, purposes, and costs of teacher value-added policies. Many of the key assumptions of teacher value added are rejected by empirical evidence. However, the assumption violations may not be severe, and value-added measures still seem to contain useful information. I also compare teacher value-added accountability with three main policy alternatives: teacher credentials, school value-added accountability, and formative uses of test data. I argue that using teacher value-added measures is likely to increase student achievement more efficiently than a teacher credentials-only strategy but may not be the most cost-effective policy overall. Resolving this issue will require a new research and policy agenda that goes beyond analysis of assumptions and statistical properties and focuses on the effects of actual policy alternatives.
6. CONCLUSION
A great deal of attention has been paid recently to the statistical assumptions of VAMs, and many of the most important papers are contained in the present volume. The assumptions about the role of past achievement in affecting current achievement (Assumption No. 2) and the lack of variation in teacher effects across student types (Assumption No. 4) seem least problematic. However, unobserved differences are likely to be important, and it is unclear whether the student fixed effects models, or any other models, really account for them (Assumption No. 3). The test scale is also a problem and will likely remain so because the assumptions underlying the scales are untestable. There is relatively little evidence on how administration and teamwork affect teachers (Assumption No. 1).

Related: Value Added Assessment, Standards Based Report Cards and Los Angeles’s Value Added Teacher Data.
Many notes and links on the Madison School District’s student information system: Infinite Campus are here.




Who’s teaching L.A.’s kids? A Times “Value Added” analysis, using data largely ignored by LAUSD, looks at which educators help students learn, and which hold them back.



Jason Felch, Jason Song and Doug Smith

The fifth-graders at Broadous Elementary School come from the same world — the poorest corner of the San Fernando Valley, a Pacoima neighborhood framed by two freeways where some have lost friends to the stray bullets of rival gangs.
Many are the sons and daughters of Latino immigrants who never finished high school, hard-working parents who keep a respectful distance and trust educators to do what’s best.
The students study the same lessons. They are often on the same chapter of the same book.
Yet year after year, one fifth-grade class learns far more than the other down the hall. The difference has almost nothing to do with the size of the class, the students or their parents.
It’s their teachers.
With Miguel Aguilar, students consistently have made striking gains on state standardized tests, many of them vaulting from the bottom third of students in Los Angeles schools to well above average, according to a Times analysis. John Smith’s pupils next door have started out slightly ahead of Aguilar’s but by the end of the year have been far behind.

Much more on “Value Added Assessment” and teacher evaluations here. Locally, Madison’s Value Added Assessment evaluations are based on the oft-criticized WKCE.




The hype of ‘value-added’ in teacher evaluation



Lisa Guisbond:

As a rookie mom, I used to be shocked when another parent expressed horror about a teacher I thought was a superstar. No more. The fact is that your kids’ results will vary with teachers, just as they do with pills, diets and exercise regimens.
Nonetheless, we all want our kids to have at least a few excellent teachers along the way, so it’s tempting to buy into hype about value-added measures (VAM) as a way to separate the excellent from the horrifying, or at least the better from the worse.
It’s so tempting that VAM is likely to be part of a reauthorized No Child Left Behind. The problem is, researchers urge caution because of the same kinds of varied results featured in playground conversations.
Value-added measures use test scores to track the growth of individual students as they progress through the grades and see how much “value” a teacher has added.

The Madison School District has been using Value Added Assessment based on the oft-criticized WKCE.




A “Value Added” Report for the Madison School District



Kurt Kiefer:

Attached are the most recent results from our MMSD value added analysis project, an effort in which we are collaborating with the Wisconsin Center for Education Research Value-Added Research Center (WCER VARC). These data include the two-year models for both the 2006-2008 and 2005-2007 school year spans.
This allows us in a single report to view value added performance for consecutive intervals of time and thereby begin to identify trends. Obviously, it is a trend pattern that will provide the greatest insights into best practices in our schools.
As it relates to results, there do seem to be some patterns emerging among elementary schools especially in regard to mathematics. As for middle schools, the variation across schools is once again – as it was last year with the first set of value added results – remarkably narrow, i.e., schools perform very similar to each other, statistically speaking.
Also included in this report are attachments that show the type of information used with our school principals and staff in their professional development sessions focused on how to interpret and use the data meaningfully. The feedback from the sessions has been very positive.

Much more on the Madison School District’s Value Added Assessment program here. The “value added assessment” data is based on Wisconsin’s oft-criticized WKCE.






Table E1 presents value added at the school level for 28 elementary schools in Madison Metropolitan School District. Values added are presented for two overlapping time periods: the period between the November 2005 and November 2007 WKCE administrations, and the more recent period between the November 2006 and November 2008 WKCE. This presents value added as a two-year moving average to increase precision and avoid overinterpretation of trends. Value added is measured in reading and math.
VA is equal to the school’s value added. It is equal to the number of extra points students at a school scored on the WKCE relative to observationally similar students across the district. A school with a zero value added is an average school in terms of value added. Students at a school with a value added of 3 scored 3 points higher on the WKCE on average than observationally similar students at other schools.
Std. Err. is the standard error of the school’s value added. Because schools have only a finite number of students, value added (and any other school-level statistic) is measured with some error. Although it is impossible to ascertain the sign of measurement error, we can measure its likely magnitude by using its standard error. This makes it possible to create a plausible range for a school’s true value added. In particular, a school’s measured value added plus or minus 1.96 standard errors provides a 95 percent confidence interval for a school’s true value added.
N is the number of students used to measure value added. It covers students whose WKCE scores can be matched from one year to the next.
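The confidence interval described above is simple to compute directly. The value added and standard error below are made-up numbers standing in for one row of Table E1:

```python
# 95% confidence interval for a school's true value added, as described in
# the report: measured value added plus or minus 1.96 standard errors.
# Both inputs are hypothetical, not taken from Table E1.
va = 3.0        # school's measured value added (WKCE points)
std_err = 1.2   # standard error of that estimate

low, high = va - 1.96 * std_err, va + 1.96 * std_err
print(round(low, 2), round(high, 2))  # 0.65 5.35

# The interval excludes zero, so this hypothetical school's value added
# would be statistically significantly above the district average at the
# 5% level. An interval that crossed zero would not be distinguishable
# from the district average.
```

This is the same logic behind the report's observation that most schools' confidence intervals cross the district average: with a few hundred students per school, standard errors are often large relative to the school effects themselves.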




Wisconsin Assessment Recommendations (To Replace the WKCE)



Wisconsin School Administrators Alliance, via a kind reader’s email [View the 146K PDF]

On August 27, 2009, State Superintendent Tony Evers stated that the State of Wisconsin would eliminate the current WKCE to move to a Balanced System of Assessment. In his statement, the State Superintendent said the following:

New assessments at the elementary and middle school level will likely be computer- based with multiple opportunities to benchmark student progress during the school year. This type of assessment tool allows for immediate and detailed information about student understanding and facilitates the teachers’ ability to re-teach or accelerate classroom instruction. At the high school level, the WKCE will be replaced by assessments that provide more information on college and workforce readiness.

By March 2010, the US Department of Education intends to announce a $350 million grant competition that would support one or more applications from consortia of states working to develop high-quality state assessments. The WI DPI is currently in conversation with other states regarding forming consortia to apply for this federal funding.
In September 2009, the School Administrators Alliance formed a Project Team to make recommendations regarding the future of state assessment in Wisconsin. The Project Team has met and outlined recommendations that school and district administrators believe can transform Wisconsin’s state assessment system into a powerful tool to support student learning.
Criteria Underlying the Recommendations:

Wisconsin’s new assessment system must be one that has the following characteristics:

  • Benchmarked to skills and knowledge for college and career readiness
  • Measures student achievement and growth of all students
  • Relevant to students, parents, teachers and external stakeholders
  • Provides timely feedback that adds value to the learning process
  • Efficient to administer
  • Aligned with and supportive of each school district’s teaching and learning
  • Advances the State’s vision of a balanced assessment system

Wisconsin’s Assessment test: The WKCE has been oft criticized for its lack of rigor.
The WKCE serves as the foundation for the Madison School District’s “Value Added Assessment” initiative, via the UW-Madison School of Education.




Value-Added Education in the Race to the Top



David Davenport:

Bill Clinton may have invented triangulation – the art of finding a “third way” out of a policy dilemma – but U.S. Secretary of Education Arne Duncan is practicing it to make desperately needed improvements in K-12 education. Unfortunately, his promotion of value-added education through “Race to the Top” grants to states could be thrown under the bus by powerful teachers’ unions that view reforms more for how they affect pay and job security than whether they improve student learning.
The traditional view of education holds that it is more process than product. Educators design a process, hire teachers and administrators to run it, put students through it and consider it a success. The focus is on the inputs – how much can we spend, what curriculum shall we use, what class size is best – with very little on measuring outputs, whether students actually learn. The popular surveys of America’s best schools and colleges reinforce this, measuring resources and reputation, not results. As they say, Harvard University has good graduates because it admits strong applicants, not necessarily because of what happens in the educational process.
In the last decade, the federal No Child Left Behind program has ushered in a new era of testing and accountability, seeking to shift the focus to outcomes. But this more businesslike approach does not always fit a people-centered field such as education. Some students test well, and others do not. Some schools serve a disproportionately high number of students who are not well prepared. Even in good schools, a system driven by testing and accountability incentivizes teaching to the test, neglecting other important and interesting ways to engage and educate students. As a result, policymakers and educators have been ambivalent, at best, about the No Child Left Behind regime.

“Value Added Assessment” is underway in Madison, though the work is based on the oft-criticized state WKCE examinations.




A Look at the University of Wisconsin’s Value Added Research Center:



Todd Finkelmeyer:

Rob Meyer can’t help but get excited when he hears President Barack Obama talking about the need for states to start measuring whether their teachers, schools and districts are doing enough to help students succeed.
“What he’s talking about is what we are doing,” says Meyer, director of the University of Wisconsin-Madison’s Value-Added Research Center.
If states hope to secure a piece of Obama’s $4.35 billion “Race to the Top” stimulus money, they’ll have to commit to using research data to evaluate student progress and the effectiveness of teachers, schools and districts.
Crunching numbers and producing statistical models that measure these things is what Meyer and his staff of 50 educators, researchers and various stakeholders do at the Value-Added Research Center, which was founded in 2004. These so-called “value-added” models of evaluation are designed to measure the contributions teachers and schools make to student academic growth. This method not only looks at standardized test results, but also uses statistical models to take into account a range of factors that might affect scores – including a student’s race, English language ability, family income and parental education level.
“What the value-added model is designed to do is measure the effect and contribution of the educational unit on a student, whether it’s a classroom, a team of teachers, a school or a program,” says Meyer. Most other evaluation systems currently in use simply hold schools accountable for how many students at a single point in time are rated proficient on state tests.

Much more on “value added assessment” here, along with the oft-criticized WKCE test, the soft foundation of much of this local work.




Wisconsin Department of Public Instruction statewide value added project results (Including the Madison & Milwaukee Public Schools)



Kurt Kiefer, Madison School District Chief Information Officer [150K PDF]:

Attached is a summary of the results from a recently completed research project conducted by the Value-Added Research Center (VARC) within the UW-Madison Wisconsin Center for Education Research (WCER). Dr. Rob Meyer and Dr. Mike Christian will be on hand at the September 14 Board of Education meeting to review these findings.
The study was commissioned by the Wisconsin Department of Public Instruction (DPI). Both the Milwaukee Public Schools (MPS) and Madison Metropolitan School District (MMSD) were district participants. The purpose of the study was to determine the feasibility of a statewide value added statistical model and the development of state reporting and analysis prototypes. We are pleased with the results in that this creates yet one more vehicle through which we may benchmark our district and school performance.
At the September 14, 2009 Board meeting we will also share plans for continued professional development with our principals and staff around value added during the upcoming school year.
In November we plan to return to the Board with another presentation on the 2008-09 results that are to include additional methods of reporting data developed by VARC in conjunction with MPS and the DPI. We will also share progress with the professional development efforts.

Related:




The Death of WKCE? Task Force to Develop “Comprehensive Assessment System for Wisconsin”



The Wisconsin Department of Public Instruction [150K PDF], via a kind reader’s email:

“Wisconsin needs a comprehensive assessment system that provides educators and parents with timely and relevant information that helps them make instructional decisions to improve student achievement,” said State Superintendent Elizabeth Burmaster in announcing members of a statewide Next Generation Assessment Task Force.
Representatives from business, commerce, and education will make recommendations to the state superintendent on the components of an assessment system that are essential to increase student achievement. Task force members will review the history of assessment in Wisconsin and learn about the value, limitations, and costs of a range of assessment approaches. They will hear presentations on a number of other states’ assessment systems. Those systems may include ACT as part of a comprehensive assessment system, diagnostic or benchmark assessments given throughout the year, or other assessment instruments and test administration methods. The group’s first meeting will be held October 8 in Madison.

A few notes:





Madison schools need to get real on equity, New value-added approach is needed for improving schools



Madison School Board Member Ed Hughes, writing in this week’s Isthmus:

A couple of weeks ago in these pages, Marc Eisen had some harsh words for the work of the Madison school district’s Equity Task Force (“When Policy Trumps Results,” 5/2/09). As a new school board member, I too have some doubts about the utility of the task force’s report. Perhaps it’s to be expected that while Eisen’s concerns touch on theory and rhetoric, mine are focused more on the nitty-gritty of decision making.
The smart and dedicated members of the Equity Task Force were assigned an impossible task: detailing an equity policy for me and other board members to follow. Equity is such a critical and nuanced consideration in school board decisions that, to be blunt, I’m not going to let any individual or group tell me what to do.
I am unwilling to delegate my responsibility to exercise my judgment on equity issues to a task force, no matter how impressive the group. Just as one school board cannot bind a future school board’s policymaking, I don’t think that the deliberations of a task force can restrict my exercise of independent judgment.
Admittedly, the task force faced a difficult challenge. It was obligated by the nature of its assignment to discuss equity issues in the abstract and offer up broad statements of principle.
Not surprisingly, most of the recommendations fall into the “of course” category. These include “Distribute resources based on student needs” and “Foster high academic expectations for all students.” I agree.

Related:




Florida Teachers’ Union Sues State on Data-Based Teacher Evaluations



Laura Waters:

Motoko Rich in the New York Times describes the federal lawsuit, initiated by seven Florida teachers with support from local NEA affiliates, which contends that the Florida DOE’s system of grading teachers based on student outcomes “violates teachers’ rights of due process and equal protection.”

Much more on “value added assessment”, here. Madison’s value added assessment scheme relies on the oft-criticized WKCE.




UW-Madison lab works with controversial data for Chicago schools



Todd Finkelmeyer:

Nearly 30,000 public school teachers and support staff went on strike in Chicago this past week in a move that left some 350,000 students without classes to attend.
And while this contentious battle between Chicago Public Schools and the Chicago Teachers Union blew up due to a range of issues — including compensation, health care benefits and job security concerns — one of the key sticking points reportedly was over the implementation of a new teacher evaluation system.
That’s noteworthy locally because researchers with UW-Madison’s Value Added Research Center (VARC) have been collaborating with the Chicago Public Schools for more than five years now in an effort to develop a comprehensive program to measure and evaluate the effectiveness of schools and teachers in that district.
But Rob Meyer, the director of VARC, says his center has stayed above the fray in this showdown between the Chicago teachers and the district, which appears close to being resolved.
“The controversy isn’t really about the merits of value-added and what we do,” says Meyer. “So we’ve simply tried to provide all the stakeholders in this discussion the best scientific information we can so everybody knows what they’re talking about.”

Much more on “value added assessment”, here.




New York Releases Teacher Rankings



Lisa Fleisher:

New York City on Friday released internal rankings of about 18,000 public schoolteachers who were measured over three years on their ability to affect student test scores.
The release of teacher’s job-performance data follows a yearlong legal battle with the United Federation of Teachers, which sued to block the release and protect teachers’ privacy. News organizations, including The Wall Street Journal, had requested the data in 2010 under the state Freedom of Information Law.
Friday’s release covers math and English teachers active between 2007 and 2010 in fourth- through eighth-grade classrooms. It does not include charter school teachers in those grades.
Schools Chancellor Dennis Walcott, who has pushed for accountability based on test scores, cautioned that the data were old and represented just one way to look at teacher performance.
“I don’t want our teachers disparaged in any way and I don’t want our teachers denigrated based on this information,” Mr. Walcott said Friday while briefing reporters on the Teacher Data Reports. “This is very rich data that has evolved over the years. … It’s old data and it’s just one piece of information.”

Related:




Teachers Are Put to the Test: More States Tie Tenure, Bonuses to New Formulas for Measuring Test Scores



Stephanie Banchero & David Kesmodel:

Teacher evaluations for years were based on brief classroom observations by the principal. But now, prodded by President Barack Obama’s $4.35 billion Race to the Top program, at least 26 states have agreed to judge teachers based, in part, on results from their students’ performance on standardized tests.
So with millions of teachers back in the classroom, many are finding their careers increasingly hinge on obscure formulas like the one that fills a whiteboard in an economist’s office here.
The metric created by Value-Added Research Center, a nonprofit housed at the University of Wisconsin’s education department, is a new kind of report card that attempts to gauge how much of students’ growth on tests is attributable to the teacher.
For the first time this year, teachers in Rhode Island and Florida will see their evaluations linked to the complex metric. Louisiana and New Jersey will pilot the formulas this year and roll them out next school year. At least a dozen other states and school districts will spend the year finalizing their teacher-rating formulas.
“We have to deliver quality and speed, because [schools] need the data now,” said Rob Meyer, the bowtie-wearing economist who runs the Value-Added Research Center, known as VARC, and calls his statistical model a “well-crafted recipe.”
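Meyer’s “well-crafted recipe” is, at its core, a regression: predict each student’s score from prior scores (and, in practice, demographics), then attribute the average unexplained growth to the school or teacher. A minimal sketch with simulated data — the score scale, school effects, and noise levels here are invented for illustration and are not VARC’s actual model:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated data: three schools, 200 students each.  The "true" school
# effects are hypothetical numbers chosen only for illustration.
true_effect = {0: 5.0, 1: 0.0, 2: -5.0}
school, prior, current = [], [], []
for sid, effect in true_effect.items():
    p = rng.normal(500, 50, 200)                       # prior-year score
    c = 0.8 * p + 110 + effect + rng.normal(0, 10, 200)  # this year's score
    school.append(np.full(200, sid))
    prior.append(p)
    current.append(c)
school = np.concatenate(school)
prior = np.concatenate(prior)
current = np.concatenate(current)

# Control step: predict this year's score from last year's (least squares).
X = np.column_stack([np.ones_like(prior), prior])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
residual = current - X @ coef

# A school's value added = mean growth beyond prediction for its students.
value_added = {sid: residual[school == sid].mean() for sid in true_effect}
for sid in sorted(value_added):
    print(f"school {sid}: value added = {value_added[sid]:+.1f}")
```

Production models like VARC’s add demographic controls, shrinkage, and multi-year averaging; the sketch shows only the residual-growth idea the articles describe.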

Much more on value added assessment, here.




Wisconsin Considering New Ways of Evaluating Teacher Effectiveness



Alan Borsuk:

What does just about every fifth-grader know that stumps experts?
Who the best teachers are in that kid’s school. Who’s hard, who’s easy, who makes you work, who lets you get away with stuff, who gets you interested in things, who’s not really on top of what’s going on. In other words: how good each teacher is.
A lot of the time, the fifth-grader’s opinions are on target.
But would you want to base a teacher’s pay or career on that?
Sorry, the experts are right. It’s tough to get a fair, thorough and insightful handle on how to judge a teacher.
“If there was a magic answer for this, somebody would have thought of it a long time ago,” Bradley Carl of the Wisconsin Center for Education Research at the University of Wisconsin-Madison told a gathering of about 100 educators and policy-makers last week.

The Wisconsin Center for Education Research at the University of Wisconsin-Madison has been working on “Value Added Assessment” using the oft-criticized WKCE.




Madison school officials want new standardized tests



Matthew DeFour:

Madison students are slated to get a double dose of standardized tests in the coming years as the state redesigns its annual series of exams while school districts seek better ways to measure learning.
For years, district students in grades three through eight and grade 10 have taken the Wisconsin Knowledge and Concepts Examination (WKCE), a series of state-mandated tests that measure school accountability.
Last month, in addition to the state tests, eighth- and ninth-graders took one of three different tests the district plans to introduce in grades three through 10. Compared with the WKCE, the tests are supposed to more accurately assess whether students are learning at, above or below grade level. Teachers also will get the results more quickly.
“Right now we have a vacuum of appropriate assessment tools,” said Tim Peterson, Madison’s assistant director of curriculum and assessment. “The standards have changed, but the measurement tool that we’re required by law to use — the WKCE — is not connected.”


I’m glad that the District is planning alternatives to the WKCE.




Beating the odds: 3 high-poverty Madison schools find success in ‘catching kids up’



Susan Troller:

When it comes to the quality of Madison’s public schools, the issue is pretty much black and white.
The Madison Metropolitan School District’s reputation for providing stellar public education is as strong as it ever was for white, middle-class students. Especially for these students, the district continues to post high test scores and turn out a long list of National Merit Scholars — usually at a rate of at least six times the average for a district this size.
But the story is often different for Hispanic and black kids, and students who come from economically disadvantaged backgrounds.
Madison is far from alone in having a significant performance gap. In fact, the well-documented achievement gap is in large measure responsible for the ferocious national outcry for more effective teachers and an overhaul of the public school system. Locally, frustration over the achievement gap has helped fuel a proposal from the Urban League of Greater Madison and its president and CEO, Kaleem Caire, to create a non-union public charter school targeted at minority boys in grades six through 12.
“In Madison, I can point to a long history of failure when it comes to educating African-American boys,” says Caire, who is black, a Madison native and a graduate of West High School. “We have one of the worst achievement gaps in the entire country. I’m not seeing a concrete plan to address that fact, even in a district that prides itself on innovative education.”
What often gets lost in the discussion over the failures of public education, however, is that there are some high-poverty, highly diverse schools that are beating the odds by employing innovative ways to reach students who have fallen through the cracks elsewhere.

Related: A Deeper Look at Madison’s National Merit Scholar Results.
Troller’s article referenced the oft-criticized WKCE (Wisconsin Knowledge & Concepts Examination) state examinations.
Related: value added assessment (based on the WKCE).
Dave Baskerville has argued that Wisconsin needs two big goals, one of which is to “Lift the math, science and reading scores of all K-12, non-special education students in Wisconsin above world-class standards by 2030”. Ongoing use of and progress measurement via the WKCE would seem to be insufficient in our global economy.
Steve Chapman on “curbing excellence”.




Wisconsin Teachers’ Union Proposed Education Reforms



Wisconsin Education Association Council:

State officers of the Wisconsin Education Association Council (WEAC) today unveiled three dramatic proposals as part of their quality-improvement platform called “Moving Education Forward: Bold Reforms.” The proposals include the creation of a statewide system to evaluate educators; instituting performance pay to recognize teaching excellence; and breaking up the Milwaukee Public School District into a series of manageable-sized districts within the city.
“In our work with WEAC leaders and members we have debated and discussed many ideas related to modernizing pay systems, better evaluation models, and ways to help turn around struggling schools in Milwaukee,” said WEAC President Mary Bell. “We believe bold actions are needed in these three areas to move education forward. The time for change is now. This is a pivotal time in public education and we’re in an era of tight resources. We must have systems in place to ensure high standards for accountability – that means those working in the system must be held accountable to high standards of excellence.”
TEACHER EVALUATION: In WEAC’s proposed teacher evaluation system, new teachers would be reviewed annually for their first three years by a Peer Assistance and Review (PAR) panel made up of both teachers and administrators. The PAR panels judge performance in four areas:

  • Planning and preparing for student learning
  • Creating a quality learning environment
  • Effective teaching
  • Professional responsibility

The proposed system would utilize the expertise of the UW Value-Added Research Center (Value Added Assessment) and would include the review of various student data to inform evaluation decisions and to develop corrective strategies for struggling teachers. Teachers who do not demonstrate effectiveness to the PAR panels are exited out of the profession and offered career transition programs and services through locally negotiated agreements.
Veteran teachers would be evaluated every three years, using a combination of video and written analysis and administrator observation. Underperforming veteran teachers would be required to go through this process a second year. If they were still deemed unsatisfactory, they would be re-entered into the PAR program and could ultimately face removal.
“The union is accepting our responsibility for improving the quality of the profession, not just for protecting the due process rights of our members,” said Bell. “Our goal is to have the highest-quality teachers at the front of every classroom across the state. And we see a role for classroom teachers to contribute as peer reviewers, much like a process often used in many private sector performance evaluation models.”
“If you want to drive change in Milwaukee’s public schools, connect the educators and the community together into smaller districts within the city, and without a doubt it can happen,” said Bell. “We must put the needs of Milwaukee’s students and families ahead of what’s best for the adults in the system,” said Bell. “That includes our union – we must act differently – we must lead.”

Madison’s “value added assessment” program is based on the oft-criticized WKCE examinations.
Related: student learning has become focused instead on adult employment – Ripon Superintendent Richard Zimman.




Time for Seattle Public Schools and teachers to partner in steps toward reform



Seattle Times:

Seattle Public Schools is right to push for a better, more honest way of evaluating teachers, even at the risk of a strike.
Tense contract negotiations between the district and the Seattle Education Association underscore the enormous opportunity at stake. Both sides agree the current system used to judge teachers is weak and unreliable. Ineffective teachers are ignored or shuffled to other schools to become other parents’ nightmare. Excellent teachers languish in a system that has no means to recognize or reward them.
The union leadership called for a few tweaks. But the district proposed a revamped system using student growth, as measured by test scores. Supporters of the status quo have tried to downplay the other forms of appraisal that would be used. They include student growth measurements selected by the teacher, principal observations of instruction and peer reviews. Also, student input at the high-school level.

Much more on value added assessment, here.




Serious ideas from State of Education speech. Seriously.



Susan Troller:

For instance, he’s the only state elected official to actually and seriously float a proposal to repair the broken state funding system for schools. He promises the proposal for his “Funding for Our Future” will be ready to introduce to lawmakers this fall and will include details on its impact on the state’s 424 school districts.
Evers also is interested in the potential of charter schools. Let’s be open and supportive about education alternatives, he says, but mindful of what’s already working well in public schools.
And he says qualified 11th and 12th graders should be allowed to move directly on to post-secondary education or training if they wish. Dual enrollment opportunities for high school age students attending college and technical schools will require a shift in thinking that shares turf and breaks down barriers, making seamless education — pre-K through post-secondary — a reality instead of some distant dream, according to Evers.
As to Evers’ comments on teacher testing, he joins a national conversation that has been sparked, in part, by the Obama administration as well as research that shows the single universal element in improved student performance is teacher quality. We recently featured a story about concerns over teacher evaluation based on student performance and test scores, and the issue has been a potent topic elsewhere, as well.

The proof, as always, is in the pudding, or substance.
Melissa Westbrook wrote a very useful and timely article on education reform:

I think many ed reformers rightly say, “Kids can’t wait.” I agree.
There is nothing more depressing than realizing that any change that might be good will likely come AFTER your child ages out of elementary, middle or high school. Not to say that we don’t do things for the greater good or the future greater good but as a parent, you want for your child now. Of course, we are told that change needs to happen now but the reality is what it might or might not produce in results is years off. (Which matters not to Bill Gates or President Obama because their children are in private schools.)
All this leads to wonder about our teachers and what this change will mean. A reader, Lendlees, passed on a link to a story that appeared in the LA Times about their teacher ratings. (You may recall that the LA Times got the classroom test scores for every single teacher in Los Angeles and published them in ranked order.)

Susan Troller notes that Wisconsin’s oft criticized WKCE (on which Madison’s value added assessment program is based) will be replaced – by 2014:

Evers also promised that the much maligned Wisconsin Knowledge and Concepts Exam, used to test student proficiency in 3rd through 6th, 8th and 10th grades, is on its way out. By 2014, there will be a much better assessment of student proficiency to take its place, Evers says, and he should know. He’s become a leading figure in the push for national core education standards, and for effective means for measuring student progress.




Where newspaper goes in rating teachers, others soon will follow



Alan Borsuk

So you want to know if the teacher your child has for the new school year is the star you’re hoping for. How do you find out?
Well, you can ask around. Often even grade school kids will give you the word. But what you hear informally might be on the mark and might be baloney. Isn’t there some way to get a good answer?
Um, not really. You want a handle on how your kid is doing, there’s plenty of data. You want information on students in the school or the school district, no problem.
But teachers? If they had meaningful evaluation reports, the reports would be confidential. And you can be quite confident they don’t have evaluations like that – across the U.S., and certainly in Wisconsin, the large majority of teachers get superficial and almost always favorable evaluations based on brief visits by an administrator to their classrooms, research shows. The evaluations are of almost no use in actually guiding teachers to improve.
Perhaps you could move to Los Angeles. The Los Angeles Times began running a project last Sunday on teachers and the progress students made while in their classes. It named a few names and said it will unveil in coming weeks specific data on thousands of teachers.

Related: Value added assessment.




Needs Improvement: Where Teacher Report Cards Fall Short



Carl Bialik:

Local school districts have started to grade teachers based on student test scores, but the early results suggest the effort deserves an incomplete.
The new type of teacher evaluation makes use of the standardized tests that have become an annual rite for American public-school students. The tests mainly have been used to measure the progress of students and schools, but with some statistical finesse they can be transformed into a lens for identifying which teachers are producing the best test results.
At least, that’s the hope among some education experts. But the performance numbers that have emerged from these studies rely on a flawed statistical approach.
One perplexing finding: A large proportion of teachers who rate highly one year fall to the bottom of the charts the next year. For example, in a group of elementary-school math teachers who ranked in the top 20% in five Florida counties early last decade, more than three in five didn’t stay in the top quintile the following year, according to a study published last year in the journal Education Finance and Policy.

Related: Standards Based Report Cards and Value Added Assessment.
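The quintile churn Bialik describes is what noisy measurement predicts: if year-to-year noise is large relative to true differences between teachers, top-quintile rankings reshuffle even when no teacher changes at all. A small simulation — the signal and noise magnitudes are assumed for illustration, not estimates from the Florida data:

```python
import numpy as np

rng = np.random.default_rng(1)
n_teachers = 1000

# Hypothetical: true teacher effects with sd 1, and yearly measurement
# noise with sd 2 (noise twice the size of the real differences).
true = rng.normal(0.0, 1.0, n_teachers)
year1 = true + rng.normal(0.0, 2.0, n_teachers)
year2 = true + rng.normal(0.0, 2.0, n_teachers)

# Who ranks in the top quintile each year?
top1 = year1 >= np.quantile(year1, 0.8)
top2 = year2 >= np.quantile(year2, 0.8)

frac_stayed = (top1 & top2).sum() / top1.sum()
print(f"Top-quintile teachers still on top the next year: {frac_stayed:.0%}")
```

Under these assumed magnitudes, well over half of the top-quintile teachers fall out of the quintile the following year — on the order of the Education Finance and Policy finding — without any teacher’s real effectiveness having moved.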




Monona Grove School District (WI) uses ACT-related tests to boost academic performance



Susan Troller:

Test early, test often, and make sure the results you get are meaningful to students, teachers and parents.
Although that may sound simple, in the last three years it’s become a mantra in the Monona Grove School District that’s helping all middle and high school students increase their skills, whether they’re heading to college or a career. The program, based on using ACT-related tests, is helping to establish the suburban Dane County district as a leader in educational innovation in Wisconsin.
In fact, Monona Grove recently hosted a half-day session for administrators and board members from Milwaukee and Madison who were interested in learning more about Monona Grove’s experiences and how the school community is responding to the program. In a pilot program this spring in Madison, students in eighth grade at Sherman Middle School will take ACT’s Explore test for younger students. At Memorial, freshmen will take the Explore test.
Known primarily as a college entrance examination, ACT Inc. also provides a battery of other tests for younger students. Monona Grove is using these tests — the Explore tests for grades 8 and 9, and the Plan tests for grades 10 and 11 — to paint an annual picture of each student’s academic skills and what he or she needs to focus on to be ready to take on the challenges of post-secondary education or the work force. The tests are given midway through the first semester, and results are ready a month later.
“We’re very, very interested in what Monona Grove is doing,” says Pam Nash, assistant superintendent for secondary education for the Madison district. “We’ve heard our state is looking at ACT as a possible replacement for the WKCE (Wisconsin Knowledge and Concepts Exam), and the intrinsic reliability of the ACT is well known. The WKCE is so unrelated to the students. The scores come in so late, it’s not useful.

The Madison School District’s “Value Added Assessment” program uses data from the oft-criticized WKCE.




New annual Wisconsin school testing system on hold



Amy Hetzner:

Nearly six months after the state announced it was scrapping its annual test for public school students, efforts to replace it with a new assessment are on hold and state officials now estimate it will take at least three years to make the switch.
The reason for the delay is tied to what is happening in the national education scene.
Wisconsin is among the 48 states that have signed onto the Common Core State Standards Initiative, which expects to complete work on grade-by-grade expectations for students in English and math by early spring. Once that is done, the anticipation is that the state will adopt the new standards, using them to help craft the new statewide test.
Wisconsin officials also are planning to compete for part of $350 million that the U.S. Education Department plans to award in the fall to state consortiums for test development.

The WKCE (Wisconsin Knowledge & Concepts Exam) has been criticized for its lack of rigor. The Madison School District is using the WKCE as the basis for its value added assessment initiative.




Data-Driven Schools See Rising Scores



John Hechinger:

Last fall, high-school senior Duane Wilson started getting Ds on assignments in his Advanced Placement history, psychology and literature classes. Like a smoke detector sensing fire, a school computer sounded an alarm.
The Edline system used by the Montgomery County, Md., Public Schools emailed each poor grade to his mother as soon as teachers logged it in. Coretta Brunton, Duane’s mother, sat her son down for a stern talk. Duane hit the books and began earning Bs. He is headed to Atlanta’s Morehouse College in the fall.
If it hadn’t been for the tracking system, says the 17-year-old, “I might have failed and I wouldn’t be going to college next year.”
Montgomery County has made progress in improving the lagging academic performance of African-American and Hispanic students. See data.
Montgomery spends $47 million a year on technology like Edline. It is at the vanguard of what is known as the “data-driven” movement in U.S. education — an approach that builds on the heavy testing of President George W. Bush’s No Child Left Behind law. Using district-issued Palm Pilots, for instance, teachers can pull up detailed snapshots of each student’s progress on tests and other measures of proficiency.
The high-tech strategy, which uses intensified assessments and the real-time collection of test scores, grades and other data to identify problems and speed up interventions, has just received a huge boost from President Barack Obama and Education Secretary Arne Duncan.

Related notes and links: Wisconsin Knowledge & Concepts (WKCE) Exam, Value Added Assessments, Standards Based Report Cards and Infinite Campus.
Tools such as Edline, if used pervasively, can be very powerful. They can also save a great deal of time and money.




Despite initial low test scores, Madison’s Nuestro Mundo gains fans



Samara Kalk Derby:

It’s Thursday afternoon at Madison’s Nuestro Mundo Elementary School and teacher Christina Amberson, “Maestra Cristina” to her kindergarten students, speaks in rapid-fire Spanish. If you didn’t know better, you would assume Spanish was Amberson’s native language. But her impeccable Spanish is a product of many years of studying and teaching abroad in a number of Spanish-speaking countries.
Children respond only in Spanish. The only time they speak English is when English-speaking children are sitting together at tables. If Amberson overhears, she reminds them to use their Spanish.
Amberson’s kindergartners — a nearly even mix of native Spanish speakers and native English speakers — seem more confident with their language than a typical student in a high school or college Spanish class.
Everything posted on the dual-language immersion school’s bulletin boards or blackboards is in Spanish except for a little section of photos and articles about “El Presidente Barack Obama.”

It is ironic that WKCE results are used in this way, given the Wisconsin DPI’s statement: “Schools should not rely on only WKCE data to gauge progress of individual students or to determine effectiveness of programs or curriculum”. Much more on the WKCE here. The Madison School District is using WKCE data for “Value Added Assessment“.




Wisconsin, Mississippi Have “Easy State K-12 Exams” – NY Times



Sam Dillon:

A state-by-state analysis by The New York Times found that in the 40 states reporting on their compliance so far this year, on average, 4 in 10 schools fell short of the law’s testing targets, up from about 3 in 10 last year. Few schools missed targets in states with easy exams, like Wisconsin and Mississippi, but states with tough tests had a harder time. In Hawaii, Massachusetts and New Mexico, which have stringent exams, 60 to 70 percent of schools missed testing goals. And in South Carolina, which has what may be the nation’s most rigorous tests, 83 percent of schools missed targets.





Study: “Ohio State Tests Invalid for Rating Schools”



Randy Hoover:

This is the table of contents to the final findings from the research study of Ohio school district performance on the OPT and OSRC. This site is the data, graph, links, and comment page for Hoover’s research study of Ohio school district proficiency test and school report card performance accountability. These data and findings have been released to the public as of February 27, 2000. The entire study is available online for your use. If you wish to be included in the emailing list of updates about OPT and OSRC issues, click on the logo at the top of this page and send me your request.
The graphs and data presented here are from the final replication of the study. This final analysis represents the culmination of several hundred hours of work put forth to gain empirical insights into OPT performance across all Ohio school districts. At the time the study was completed there were 611 school districts in the State of Ohio. This study uses data from 593 districts out of the 611 total. 18 districts were not included in the study because of incomplete data or because the districts were too small such as North Bass Island. All data were taken from EMIS online data and no data other than the data presented by the State of Ohio were used. My confidence level is high that there are very few errors in the data array. Though errors are certainly possible, I am confident that if they exist they are minor and do not significantly affect the overall conclusions of this study. (RLH)

Scott Elliott has more.
Related: The Madison School District’s “Value Added Assessment” program uses the Wisconsin Department of Public instruction’s WKCE results. The WKCE’s rigor has been criticized.




How Well Are They Really Doing? Criticism for State’s “Weak Student Tests”



NY Times Editorial:

Congress has several concerns as it moves toward reauthorizing the No Child Left Behind Act of 2002. Whatever else they do, lawmakers need to strengthen the requirement that states document student performance in yearly tests in exchange for federal aid.
The states have made a mockery of that provision, using weak tests, setting passing scores low or rewriting tests from year to year, making it impossible to compare progress — or its absence — over time.
The country will have difficulty moving ahead educationally until that changes.
Most states that report strong performances on their own tests do poorly on the more rigorous and respected National Assessment of Educational Progress, which is often referred to as NAEP and is also known as the nation’s report card. That test is periodically given to a sample of students in designated grades in both public and private schools. States are resisting the idea of replacing their own tests with the NAEP, arguing that the national test is not aligned to state standards. But the problem is that state standards are generally weak, especially in math and science.

Letters, in response to this editorial:

In discussing how some states game their student test results, you state, “The federal government could actually embarrass the laggard states by naming the ones that cling to weak tests.” The evidence on these states has been available for some time.
In 2005, Tennessee tested its eighth-grade students in math and found 87 percent of students performed at or above the proficiency level, while the National Assessment of Educational Progress, or NAEP, test indicated only 21 percent of Tennessee’s eighth graders proficient in math.
In Mississippi, 89 percent of fourth graders performed at or above proficiency on the state reading test, while only 18 percent demonstrated proficiency on the federal test. In Alabama, 83 percent of fourth-grade students scored at or above proficient on the state’s reading test, while only 22 percent were proficient on the NAEP test.
Other states were also found guilty in their determinations of proficient when compared with the federal NAEP test.
The No Child Left Behind Act will never be able to realize its potential as long as entire states are left behind because of the duplicitous efforts of their state officials. If Congress adopted national standards with a corresponding set of national exams in its reauthorization of the law, it could effectively minimize or eliminate these individual state shenanigans.
Paul Hoss
Marshfield, Mass., Aug.
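The Tennessee-style gaps in these letters need no change in the students themselves: one and the same score distribution can read as 87 percent or 21 percent “proficient” depending only on where the cut score sits. A quick illustration — the distribution and both cut scores below are made-up values chosen to reproduce the reported gap:

```python
from statistics import NormalDist

# One hypothetical statewide score distribution: mean 250, sd 40.
scores = NormalDist(mu=250, sigma=40)

# The same students, judged against two different proficiency bars
# (both cut values are assumptions for illustration):
state_cut = 205   # lenient state-test bar
naep_cut = 282    # stringent NAEP-style bar

pct_state = 1 - scores.cdf(state_cut)
pct_naep = 1 - scores.cdf(naep_cut)
print(f"proficient by state cut: {pct_state:.0%}")  # → 87%
print(f"proficient by NAEP cut:  {pct_naep:.0%}")   # → 21%
```

Nothing about achievement differs between the two lines; only the bar moved — which is why comparisons of state proficiency rates against NAEP can diverge so sharply.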

Locally, the Madison School District’s Value Added Assessment Program is based on the state Department of Public Instruction’s standards.




Wisconsin Moving Toward New Academic Standards



Wisconsin Department of Public Instruction:

Wisconsin took an important step Wednesday toward new academic standards which will provide the rigor and relevance students need to succeed in the 21st century.
During the Partnership for 21st Century Skills (P21) Best Practices Forum (Institute.21) in Madison, State Superintendent Elizabeth Burmaster received final recommendations for revising and then implementing Model Academic Standards in English language arts and mathematics.
The recommendations represent the work of leadership and design teams made up of educators, legislators, parents, and business representatives.

Wisconsin’s standards have been criticized by the Fordham Foundation. The Madison School District is planning to use “Value Added Assessment” based on the state standards.




Dane County, WI Schools Consider MAP Assessement Tests After Frustration with State WKCE Exams
Waunakee Urges that the State Dump the WKCE



Andy Hall takes a look at a useful topic:

From Wisconsin Heights on the west to Marshall on the east, 10 Dane County school districts and the private Eagle School in Fitchburg are among more than 170 Wisconsin public and private school systems purchasing tests from Northwest Evaluation Association, a nonprofit group based in the state of Oregon.
The aim of those tests, known as Measures of Academic Progress, and others purchased from other vendors, is to give educators, students and parents more information about students’ strengths and weaknesses. Officials at these districts say the cost, about $12 per student per year for MAP tests, is a good investment.
The tests’ popularity also reflects widespread frustration over the state’s $10 million testing program, the Wisconsin Knowledge and Concepts Examination.
Critics say that WKCE, which is used to hold schools accountable under the federal No Child Left Behind law, fails to provide adequate data to help improve the teaching methods and curriculum used in the classrooms.
They complain that because the tests are administered just once a year, and it takes nearly six months to receive the results, the information arrives in May — too late to be of use to teachers during the school year.
The testing controversy is “a healthy debate,” said Tony Evers, deputy state superintendent of public instruction, whose agency contends that there’s room for both WKCE and MAP.
….
“It’s a test that we feel is much more relevant to assisting students and helping them with their skills development,” said Mike Hensgen, director of curriculum and instruction for the Waunakee School District, who acknowledges he’s a radical in his dislike of WKCE.
“To me, the WKCE is not rigorous enough. When a kid sees he’s ‘proficient,’ he thinks he’s fine.”
Hensgen contends that the WKCE, which is based on the state’s academic content for each grade level, does a poor job of depicting what elite students, and students performing at the bottom level, really know.
The Waunakee School Board, in a letter being distributed this month, is urging state legislators and education officials to find ways to dump WKCE in favor of MAP and tests from ACT and other vendors.

The Madison School District and the Wisconsin Center for Education Research are using the WKCE as a benchmark for “Value Added Assessment”.




“Schools should not rely on only WKCE data to gauge progress of individual students or to determine effectiveness of programs or curriculum”



Peter Sobol on the 2007 Wisconsin DPI State test results (WKCE):

The results for the WKCE test administered in November 2007 were finally released on May 30th. That is more than six months after the test was given. Worse, the data files containing the detailed results that can be used for proper statistical analysis of the results are STILL not available for download. Assessments are information that degrades over time. The fact that it takes six months to get the data out (whatever its other shortcomings) cheats the taxpayers of the full value of their investment.
At the very least the WI DPI should be embarrassed by the fact it takes this long to release the test results. Personally I find it outrageous. I had an email exchange with DPI officials concerning this long delay and the loss of value, this is an excerpt from part of that response (italics mine):

… The WKCE is a large-scale assessment designed to provide a snapshot of how well a district or school is doing at helping all students reach proficiency on state standards, with a focus on school and district-level accountability. A large-scale, summative assessment such as the WKCE is not designed to provide diagnostic information about individual students. Those assessments are best done at the local level, where immediate results can be obtained. Schools should not rely on only WKCE data to gauge progress of individual students or to determine effectiveness of programs or curriculum.

Does anyone else find the fact that the state issues WKCE results to individual students surprising given the above statement?

The Madison School District, together with the Wisconsin Center for Education Research, is using local WKCE results for “Value Added Assessment“.
Much more on the WKCE here.
Minnesota recently administered their first online science test.




Advocating a Standard Graduation Rate & Madison’s “2004 Elimination of the Racial Achievement Gap in 3rd Grade Reading Scores”



Leslie Ann Howard:

Back in 1995, when the Wisconsin State Journal and WISC-TV began a civic journalism project to study the racial achievement gaps in our schools, the statistical measures of student achievement and reading in third grade put the issue in sharp focus.
United Way and our community partners’ efforts, through a variety of strategies including the Schools of Hope tutoring program, relied on those strong, focused statistics to measure the success of our 1-on-1 and 1-on-2 tutoring.
By 2004, Superintendent Art Rainwater was able to announce the elimination of the racial achievement gap in third grade reading scores, because our community had focused on a stable statistical measure for over 10 years.
A standard graduation rate formula would create the same public focus for our nation’s efforts to increase high school graduation rates.

Related:




Some of California’s most gifted students are being ignored, advocates say



Carla Rivera:

If you reviewed Dalton Sargent’s report cards, you’d know only half his story. The 15-year-old Altadena junior has lousy grades in many subjects. He has blown off assignments and been dissatisfied with many of his teachers. It would be accurate to call him a problematic student. But he is also gifted.
Dalton is among the sizable number of highly intelligent or talented children in the nation’s classrooms who find little in the standard curriculum to rouse their interest and who often fall by the wayside.
With schools under intense pressure from state and federal mandates such as No Child Left Behind to raise test scores of low-achieving pupils, the educational needs of gifted students — who usually perform well on standardized tests — too often are ignored, advocates say.
Nationally, about 3 million kindergarten through 12th-grade students are identified as gifted, but 80% of them do not receive specialized instruction, experts say. Studies have found that 5% to 20% of students who drop out are gifted.
There is no federal law mandating special programs for gifted children, though many educators argue that these students — whose curiosity and creativity often coexist with emotional and social problems — deserve the same status as those with special needs. Services for gifted students vary from state to state. In California, about 512,000 students are enrolled in the Gifted and Talented Education program, which aims to provide specialized and accelerated instruction.

Linda Scholl @ Wisconsin Center for Education Research: SCALE Case Study: Evolution of K-8 Science Instructional Guidance in Madison Metropolitan School District [PDF report]

In addition, by instituting a standards-based report card system K-8, the department has increased accountability for teaching to the standards.
The Department is struggling, however, to sharpen its efforts to reduce the achievement gap. While progress has been made in third grade reading, significant gaps are still evident in other subject areas, including math and science. Educational equity issues within the school district are the source of much public controversy, with a relatively small but vocal parent community that is advocating for directing greater resources toward meeting the needs of high achieving students. This has slowed efforts to implement strong academic equity initiatives, particularly at the middle and early high school levels. Nonetheless, T&L content areas specialists continue working with teachers to provide a rigorous curriculum and to differentiate instruction for all students. In that context, the new high school biology initiative represents a significant effort to raise the achievement of students of color and economic disadvantage.

WCER’s tight relationship with the Madison School District has been the source of some controversy.
Related:

Scholl’s error, in my view, is viewing the controversy as an issue of “advocating for directing greater resources toward meeting the needs of high achieving students”. The real issue is raising standards for all, rather than reducing curriculum quality (see the West High School math teachers’ letter to the Isthmus:

Moreover, parents of future West High students should take notice: As you read this, our department is under pressure from the administration and the math coordinator’s office to phase out our “accelerated” course offerings beginning next year. Rather than addressing the problems of equity and closing the gap by identifying minority math talent earlier, and fostering minority participation in the accelerated programs, our administration wants to take the cheaper way out by forcing all kids into a one-size-fits-all curriculum.
It seems the administration and our school board have re-defined “success” as merely producing “fewer failures.” Astonishingly, excellence in student achievement is visited by some school district administrators with apathy at best, and with contempt at worst. But, while raising low achievers is a laudable goal, it is woefully short-sighted and, ironically, racist in the most insidious way. Somehow, limiting opportunities for excellence has become the definition of providing equity! Could there be a greater insult to the minority community?

)
A friend mentioned a few years ago that the problems are in elementary and middle school. Rather than addressing those, the administration is trying to make high school changes.
Thanks to a reader for sending along these links.




Edweek Chat: The Use of International Data to Improve US Schools



4/30/2008 @ 2:30p.m. CST:

Join us for a live Web chat about the impact of A Nation at Risk and the potential for using international comparison data to improve academic standards and student achievement in U.S. schools.
Twenty-five years ago, a federal commission issued the landmark report that declared a “rising tide of mediocrity” in U.S. education posed a threat to America’s prosperity and status in the world. Today, many policymakers and members of the business and education communities are sounding the same alarm bells.
Some experts are recommending that the United States put more stock in measuring itself against other countries, including having individual states benchmark their progress against those countries to get a clear and true picture of the status of American education. Would that help improve education in America? What can the United States do to improve education and continue to compete globally? Are the problems with the U.S. education system, compared with those of other industrialized countries’, overblown? Join us for this discussion.
About the guests:
• Dane Linn is the director of the education division of the National Governors Association, a Washington-based research and advocacy organization that has taken an active role in examining how states might align their academic standards and practices to those of top-performing nations.
• Iris C. Rotberg is the co-director of the Center for Curriculum, Standards, and Technology at George Washington University, in Washington, D.C.
Submit questions in advance.

Related: Fordham Foundation – Wisconsin DPI’s Academic Standards = D-. The Madison School District is implementing “value added assessment” based on the DPI standards.
Watch the Madison School Board’s most recent discussion of “Value Added Assessment“.




Madison School Board Detailed Agenda Posted Online – Including a Proposed Wisconsin Center for Education Research Contract



A reader’s email mentioned that the Madison School Board has begun posting more detailed agenda items on their meeting web page. Monday, March 3’s full agenda includes Superintendent Art Rainwater’s discussion of the proposed Middle School report card changes along with a recommendation to approve an agreement with the Wisconsin Center for Education Research (1.5MB PDF):

The focus of this project is to develop a value-added system for the Madison Metropolitan School District and produce value-added reports using assessment data from November 2005 to November 2007. Since the data from the November 2007 assessment will not be available until March 2008, WCER will first develop a value-added system based on two years of state assessment data (November 2005 and November 2006). After the 2007 data becomes available (about March 1, 2008), WCER will extend the value-added system so that it incorporates all three years of data. Below, we list the tasks for this project and a project timeline.
Task 1. Specify features of MMSD value-added model
Task 2. Develop value-added model using 2005 and 2006 assessment data
Task 3. Produce value-added reports using 2005 and 2006 assessment data
Task 4. Develop value-added model using 2005, 2006, and 2007 assessment data
Task 5. Produce value-added reports using 2005-2007 assessment data
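The core of a value-added system like the one described in the contract is a regression of current-year scores on prior-year scores and demographic controls; a school's value added is then the average residual of its students, i.e., growth beyond what observationally similar students achieved districtwide. A minimal sketch with simulated data (all column names and numbers here are hypothetical, not MMSD's actual model specification):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical student-level data: prior-year scale score, one
# demographic control, and an assignment to one of 3 schools.
n = 300
prior = rng.normal(500, 50, n)       # prior-year WKCE-style scale score
low_income = rng.integers(0, 2, n)   # demographic control (0/1)
school = rng.integers(0, 3, n)

# Simulate current-year scores: growth depends on prior achievement and
# the control, plus a +5 effect for school 2 (its true "value added").
school_effect = np.where(school == 2, 5.0, 0.0)
current = 0.9 * prior + 60 - 4 * low_income + school_effect + rng.normal(0, 5, n)

# Control for prior achievement and demographics via OLS, then attribute
# each school's mean residual to the school as its value-added estimate.
X = np.column_stack([np.ones(n), prior, low_income])
beta, *_ = np.linalg.lstsq(X, current, rcond=None)
residuals = current - X @ beta
value_added = {s: residuals[school == s].mean() for s in range(3)}

print({s: round(v, 1) for s, v in value_added.items()})
```

Because residuals are measured relative to the district average, school 2's estimate comes out above zero and the other schools' below, mirroring the head note's example of a "+5 for students with disabilities" subgroup reading as growth relative to observationally similar students across MMSD.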

August, 2007 presentation to the Madison School Board’s Performance & Achievement Committee on “Value Added Assessment”.




“A Modest Proposal for the Schools:”
Eliminate local control



A provocative title for a must read. It addresses a number of issues, from local outsize influence on school boards to Wisconsin’s low state standards:

Congress erred big-time when NCLB assigned each state to set its own standards and devise and score its own tests … this study underscores the folly of a big modern nation, worried about its global competitiveness, nodding with approval as Wisconsin sets its eighth-grade reading passing level at the 14th percentile while South Carolina sets its at the 71st percentile.

Matt Miller via a kind reader’s email:

It wasn’t just the slate and pencil on every desk, or the absence of daily beatings. As Horace Mann sat in a Leipzig classroom in the summer of 1843, it was the entire Prussian system of schools that impressed him. Mann was six years into the work as Massachusetts secretary of education that would earn him lasting fame as the “father of public education.” He had sailed from Boston to England several weeks earlier with his new wife, combining a European honeymoon with educational fact-finding. In England, the couple had been startled by the luxury and refinement of the upper classes, which exceeded anything they had seen in America and stood in stark contrast to the poverty and ignorance of the masses. If the United States was to avoid this awful chasm and the social upheaval it seemed sure to create, he thought, education was the answer. Now he was seeing firsthand the Prussian schools that were the talk of reformers on both sides of the Atlantic.
In Massachusetts, Mann’s vision of “common schools,” publicly funded and attended by all, represented an inspiring democratic advance over the state’s hodgepodge of privately funded and charity schools. But beyond using the bully pulpit, Mann had little power to make his vision a reality. Prussia, by contrast, had a system designed from the center. School attendance was compulsory. Teachers were trained at national institutes with the same care that went into training military officers. Their enthusiasm for their subjects was contagious, and their devotion to students evoked reciprocal affection and respect, making Boston’s routine resort to classroom whippings seem barbaric.
Mann also admired Prussia’s rigorous national curriculum and tests. The results spoke for themselves: illiteracy had been vanquished. To be sure, Prussian schools sought to create obedient subjects of the kaiser—hardly Mann’s aim. Yet the lessons were undeniable, and Mann returned home determined to share what he had seen. In the seventh of his legendary “Annual Reports” on education to the Commonwealth of Massachusetts, he touted the benefits of a national system and cautioned against the “calamities which result … from leaving this most important of all the functions of a government to chance.”
Mann’s epiphany that summer put him on the wrong side of America’s tradition of radical localism when it came to schools. And although his efforts in the years that followed made Massachusetts a model for taxpayer-funded schools and state-sponsored teacher training, the obsession with local control—not incidentally, an almost uniquely American obsession—still dominates U.S. education to this day. For much of the 150 or so years between Mann’s era and now, the system served us adequately: during that time, we extended more schooling to more people than any nation had before and rose to superpower status. But let’s look at what local control gives us today, in the “flat” world in which our students will have to compete.
The United States spends more than nearly every other nation on schools, but out of 29 developed countries in a 2003 assessment, we ranked 24th in math and in problem-solving, 18th in science, and 15th in reading. Half of all black and Latino students in the U.S. don’t graduate on time (or ever) from high school. As of 2005, about 70 percent of eighth-graders were not proficient in reading. By the end of eighth grade, what passes for a math curriculum in America is two years behind that of other countries.
Dismal fact after dismal fact; by now, they are hardly news. But in the 25 years since the landmark report A Nation at Risk sounded the alarm about our educational mediocrity, America’s response has been scattershot and ineffective, orchestrated mainly by some 15,000 school districts acting alone, with help more recently from the states. It’s as if after Pearl Harbor, FDR had suggested we prepare for war through the uncoordinated efforts of thousands of small factories; they’d know what kinds of planes and tanks were needed, right?
When you look at what local control of education has wrought, the conclusion is inescapable: we must carry Mann’s insights to their logical end and nationalize our schools, to some degree. But before delving into the details of why and how, let’s back up for a moment and consider what brought us to this pass.

Related:





How Effective are Tennessee’s Teacher Preparation Programs?



Education Consumers foundation:

Tennessee’s Value Added Assessment System has been in place since 1995. It enables users to estimate the success of teachers, schools, and districts in lifting student achievement and it does so in a way that permits statistically fair comparisons.

Since 2007, the Tennessee Higher Education Commission has published a report card that uses the TVAAS data to estimate the success of Tennessee’s teacher preparation programs in graduating “highly effective” new teachers. Highly effective new teachers are ones who are able to bring about as much student learning per year as top-performing veteran teachers. Highly ineffective teachers are ones who bring about the same learning gains per year as the State’s least effective veteran teachers. The teacher classification is based on the first 3 years of teaching.
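The classification step described above can be sketched as comparing each new teacher's value-added estimate against cutoffs drawn from the veteran-teacher distribution. The percentile cutoffs, sample sizes, and value-added scale below are illustrative assumptions, not TVAAS's actual methodology:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical value-added estimates (scale-score points per year),
# pooled over each new teacher's first three years of teaching.
teacher_va = rng.normal(0, 4, 50)

# Illustrative cutoffs: compare new teachers to the top- and
# bottom-performing veterans, per the report card's description.
veteran_va = rng.normal(0, 4, 500)
hi_cut = np.percentile(veteran_va, 80)   # top-performing veterans
lo_cut = np.percentile(veteran_va, 20)   # least effective veterans

def classify(va: float) -> str:
    if va >= hi_cut:
        return "highly effective"
    if va <= lo_cut:
        return "highly ineffective"
    return "typical"

labels = [classify(v) for v in teacher_va]
print(sum(l == "highly effective" for l in labels), "of 50 highly effective")
```

A preparation program's report-card rating would then aggregate these labels across its graduates.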

Related: Foundations of Reading.

MTEL

Madison’s long term, disastrous reading results.




Apples to Apples; Comparing Wisconsin public, charter, and private voucher schools



Will Flanders:

It’s an unfortunate reality that demographic factors historically play a large role in student performance; any honest assessment of how schools and school sectors are performing must take those factors into account. Much of the reporting on school performance, though, ignores this reality. This report endeavors to incorporate these factors through rigorous statistical modeling that controls for, and assesses the impact of, several student characteristics. This report has been updated to include data from the 2022-23 report cards.

Among the key findings:

  • Students in the Milwaukee Parental Choice Program continue to outperform their public school peers. Proficiency rates in private choice schools were about 8.6% higher in English/Language Arts (ELA) and 7.0% higher in math on average than proficiency rates in traditional public schools in Milwaukee.
  • Charter school students in Milwaukee continue to outperform their public school peers. District charters saw 6.9% and 6.6% higher proficiency in ELA and math respectively than traditional public schools.
  • Statewide, choice students outperform their public school peers in ELA. Proficiency rates were about 5.4% higher in ELA for students participating in school choice statewide than traditional public school students. No difference was found in math performance.
  • Wisconsin continues to struggle with its achievement gaps. Statewide, a school with 100% low-income students would be expected to have proficiency rates 40.6% lower in ELA and 44.0% lower in math compared to a hypothetical school with zero low-income students. For African American students, that gap is 17.8% in ELA and 20.3% in math. Hispanic students have an achievement gap of approximately 6.3% in math, but no significant gap was found in ELA.
  • Choice and charter schools are more efficient with taxpayer money. Once the demographics of students in the schools are taken into account, choice and charter schools earn more proficiency per $1,000 of spending than traditional public schools in both Milwaukee and the state as a whole.
  • Choice schools offer more value added. 12 of the top 20 schools in the state where student performance exceeds expectations based on demographics are in the state’s choice programs.
  • Rural schools perform worse than schools in any other type of geography. On average, proficiency in Wisconsin’s rural schools is significantly lower in both ELA and math than urban, suburban, or town schools.
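The achievement-gap finding above implies a linear prediction: a school's expected proficiency falls in proportion to its low-income share, reaching the full 40.6-point ELA gap at 100% low income. A small sketch using the report's ELA coefficient (the 70-point baseline is a hypothetical school, not a figure from the report):

```python
# The report's estimated ELA gap at 100% low-income enrollment.
ELA_GAP_AT_100_PCT = 40.6

def predicted_ela(baseline: float, low_income_share: float) -> float:
    """Predicted ELA proficiency rate for a school, given a hypothetical
    zero-low-income baseline and the school's low-income share (0.0-1.0)."""
    return baseline - ELA_GAP_AT_100_PCT * low_income_share

# A 50% low-income school sits half the gap (20.3 points) below baseline.
print(round(predicted_ela(70.0, 0.5), 1))  # → 49.7
```

This is also why the report's "value added" ranking compares schools to demographic expectations rather than to raw proficiency: a school can beat its predicted rate while still trailing the state average.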

Commentary.

Underly and our long term disastrous reading results….

WEAC: $1.57 million for Four Wisconsin Senators

Legislation and Reading: The Wisconsin Experience 2004-

“Well, it’s kind of too bad that we’ve got the smartest people at our universities, and yet we have to create a law to tell them how to teach.”

The data clearly indicate that being able to read is not a requirement for graduation at (Madison) East, especially if you are black or Hispanic”

My Question to Wisconsin Governor Tony Evers on Teacher Mulligans and our Disastrous Reading Results

2017: West High Reading Interventionist Teacher’s Remarks to the School Board on Madison’s Disastrous Reading Results 

Madison’s taxpayer supported K-12 school district, despite spending far more than most, has long tolerated disastrous reading results.

“An emphasis on adult employment”

Wisconsin Public Policy Forum Madison School District Report[PDF]

WEAC: $1.57 million for Four Wisconsin Senators

Friday Afternoon Veto: Governor Evers Rejects AB446/SB454; an effort to address our long term, disastrous reading results

Booked, but can’t read (Madison): functional literacy, National citizenship and the new face of Dred Scott in the age of mass incarceration.

When A Stands for Average: Students at the UW-Madison School of Education Receive Sky-High Grades. How Smart is That?




“What’s wrong with a B? I’d much rather get that than spend six hours every week on business studies”



Lucy Kellaway:

Life at my school is founded on the Gospel values which, I found after a spot of googling, involve the sort of thing even the most devout atheist should be able to sign up to: forgiveness, honesty, trust, family and, above all, love.

I listened with disbelief in the first staff meeting when we were told it was our job to love all our students — especially the ones who were hardest to love. This was a departure from the successful academy school in east London where I trained, when staff would gather together in the name of no excuses, exam results and value-added scores.

This emphasis on love seems to me oddly profound, because from it everything else flows. If you force yourself to care deeply for every one of your students, you work harder for them, you want the best for them. All the other stuff I learnt in teacher training after leaving my job as a columnist at the Financial Times — differentiation and assessment for learning — seems a bit by the by. 

It is not only the Gospel that is making me have a rethink. It is the experience of teaching and living 300 miles from the capital, my home for the past 63 years.

The stats bear this out. According to the University of Essex’s Understanding Society study, the North East is the least mobile place in the country, with 55 per cent of survey respondents living within 15 miles of their mother — more than three times as many as in the capital. And, if my students are any guide, this statistic is not about to change, as few of them plan to leave. They might go abroad for a bit (I tried to warn them that Brexit has made this harder), but after that they want to return home. No one has any interest in moving to London. They know they can’t afford it, and don’t fancy it anyway.




Using Admissions Lotteries to Validate and Improve School Quality Measures



Blueprint Labs:

Parents increasingly rely on data-driven measures of school quality to choose schools, while districts use the same information to guide policy. School quality information often comes in the form of “school report cards” like New York City’s School Quality Snapshot and “school finder” websites, like GreatSchools.org. These school ratings are highly consequential, influencing family enrollment decisions and home prices around the country. State and district restructuring decisions, such as school closures or mergers, also turn on these measures. In short, school performance measures matter.

Our study examines the predictive validity of alternative school performance measures to show how they can be improved in districts that use centralized assignment to match students to schools. Using data from New York City and Denver, we show that a new approach that harnesses student data from centralized assignment, which we call a risk controlled value-added model (RC VAM), improves upon conventional methods. We also study a range of other value-added models. In practice, analysts may not have the data required to compute RC VAM and not all districts assign seats centrally. Our study sheds light on approaches that might best serve as performance measures in the absence of ideal data.

The validity of school ratings is important to both policymakers and parents, who rely on them for consequential decisions. Inaccurate measures of school quality can unfairly reward or punish schools erroneously deemed to be more or less effective. For organizations that engage in the provision of quality measures, the methods developed in our study offer a new tool that can provide fairer assessments of school effectiveness.




Looking Back on DC Education Reform 10 Years After, Part 1: The Grand Tour



Richard P Phelps:

Ten years ago, I worked as the Director of Assessments for the District of Columbia Public Schools (DCPS). My tenure coincided with Michelle Rhee’s last nine months as Chancellor. I departed shortly after Vincent Gray defeated Adrian Fenty in the September 2010 DC mayoral primary.

My primary task was to design an expansion of that testing program that served the IMPACT teacher evaluation system to include all core subjects and all grade levels. Despite its fame (or infamy), the test score aspect of the IMPACT program affected only 13% of teachers, those teaching either reading or math in grades four through eight. Only those subjects and grade levels included the requisite pre- and post-tests required for teacher “value added” measurements (VAM). Not included were most subjects (e.g., science, social studies, art, music, physical education), grades kindergarten to two, and high school.

Chancellor Rhee wanted many more teachers included. So, I designed a system that would cover more than half the DCPS teacher force, from kindergarten through high school. You haven’t heard about it because it never happened. The newly elected Vincent Gray had promised during his mayoral campaign to reduce the amount of testing; the proposed expansion would have increased it fourfold.

VAM affected teachers’ jobs. A low value-added score could lead to termination; a high score, to promotion and a cash bonus. VAM as it was then structured was obviously, glaringly flawed,[1] as anyone with a strong background in educational testing could have seen. Unfortunately, among the many new central office hires from the elite of ed reform circles, none had such a background.




“Why Johnny can’t write”



Heather Mac Donald:

American employers regard the nation’s educational system as an irrelevance, according to a Census Bureau survey released in February of this year. Businesses ignore a prospective employee’s educational credentials in favor of his work history and attitude. Although the census researchers did not venture any hypothesis for this strange behavior, anyone familiar with the current state of academia could have provided explanations aplenty.

One overlooked corner of the academic madhouse bears in particular on graduates’ job-readiness: the teaching of writing. In the field of writing, today’s education is not just an irrelevance, it is positively detrimental to a student’s development. For years, composition teachers have absorbed the worst strains in both popular and academic culture. The result is an indigestible stew of 1960s liberationist zeal, 1970s deconstructionist nihilism, and 1980s multicultural proselytizing. The only thing that composition teachers are not talking about these days is how to teach students to compose clear, logical prose.

Predictably, the corruption of writing pedagogy began in the sixties. In 1966, the Carnegie Endowment funded a conference of American and British writing teachers at Dartmouth College. The event was organized by the Modern Language Association and the National Conference of Teachers of English. The Dartmouth Conference was the Woodstock of the composition professions: It liberated teachers from the dull routine of teaching grammar and logic.

The Dartmouth Conference rejected what was called a “transmission model” of English in favor of a “growth model.” In a transmission mode, teachers pass along composition skills and literary knowledge. In a growth mode, according to Joseph Harris, a professor of English at the University of Pittsburgh, they focus on students’ “experience of language in all forms”—including ungrammatical ones. A big problem with the transmission model of English, apparently, is that it implies that teachers actually know more than their students do. In the growth model, in contrast, the teacher is not an authority figure; rather, he is a supportive, nurturing friend, who works with, rather than challenges, what a student has to say. Dartmouth proponents claimed that improvement in students’ linguistic skills need not come through direct training in grammar and style but, rather, would flower incidentally as students experiment with personal and expressive forms of talk and writing.

The Dartmouth Conference and subsequent writing pedagogy reflected the political culture of the time. It was anti-authoritarian and liberationist; it celebrated inarticulateness and error as proof of authenticity. But it was also a response to the looming problem of race. City University of New York (CUNY) began the nation’s first academic affirmative-action program in 1966; other schools would soon follow suit. The movement to legitimate black English began at that time. Confronted with a barrage of students who had no experience in formal grammar or written language, it was highly convenient for professors to learn that students’ natural way of speaking and writing should be preserved, not corrected.

There is a final ideological strand in composition pedagogy that has its roots in the late 1960s: Marxism. Teachers on the radical left began arguing that the demand for literacy oppresses the masses. Writing in Radical Teacher, Massachusetts Institute of Technology humanities professor Wayne O’Neill explains that “it has become important for the ruling class to exclude the potentially radicalizing elements of higher education from the colleges. Thus everywhere along the scale of education there is a relentless march toward the basics.” James Sledd, professor emeritus of English at the University of Texas at Austin, writes in College English that standard English is “essentially an instrument of domination,” and that coercing students to speak properly conditions them to accept the coercion of capitalism. Richard Ohmann, humanities professor at Wesleyan, has pronounced the “decline of literacy…a fiction, if not a hoax.”

The political process

The Dartmouth Conference gave rise to what became known as the process school of composition. Peter Elbow of Evergreen State College is its most influential practitioner. Not all of Elbow’s ideas are bad. He emphasizes that writing is a continuous process, composed mostly of rewriting. He encourages students to think of their essays in terms of multiple drafts, rather than single-shot efforts. He has vigorously promoted “free writing,” a warm-up exercise in which the author writes continuously for a fixed period of time, uninhibited by grammar, punctuation, or logic.

But the drawbacks of the process school cancel its contributions. Elevating process has driven out standards. Rather than judging a piece of student writing by an objective measure of coherence and correctness, teachers are supposed to evaluate how much the student has grown over the course of a semester. The hottest trend in grading—portfolio assessment—grows out of the process school. Elbow created the method after he saw the “harmful effects of writing proficiency exams.”

Among the most harmful of those effects is apparently the assault on self-esteem that results from a poor grade. In portfolio assessment, students’ evaluations are based on drafts of papers, diary entries, letters, and other informal assignments compiled over the course of a semester, rather than on the freestanding merit of a paper or exam. Often the student “collaborates” with the teacher in assigning a grade to the portfolio. Portfolio assessment allows for the radical reduction of standards, imports greater subjectivity into grading, and is extremely time-consuming.

For the process school, politics undermines pedagogy. Elbow added an additional week of free writing to the start of his courses at Evergreen State College when he saw how useful the practice was in “building community” in the classroom. Elbow rails against grading because it interferes with his ability to connect meaningfully with his students. “Good writing teachers like student writing,” he explains, and “it’s hard to like something if we know we have to give it a D.”

In keeping with the anti-authoritarian commitment of process practitioners, students in a process classroom teach each other. Students form small groups to read aloud and comment on each other’s writing, while the teacher surveys the scene benignly. The students may be admonished to say two good, as well as two critical things about each other’s essay—a task that would tax the invention of Shakespeare. Many of the groups I have observed quickly turned their attention to more compelling matters, like last weekend’s parties or the newest sneakers. And no wonder, given the abysmal prose they are supposed to discuss. The following two paragraphs are from a student’s answer on CUNY’s writing-proficiency exam. The question was: “Do you think the personal life of a political candidate…should be considered a factor in determining his or her ability to do the job?”

“We are living in a world that’s getting worse everyday. And what we are doing nothing, just complaining about the other person’s life. We should stop because if we don’t stop by looking on every candidate lifestyle and focus more on how, we could make it better. We all gonna die of, hungry, because we wouldn’t have nothing to eat and no place to life.

“People tends to make mistake in life. We all are humans. That’s why we should never judge a person for the cover of a book. People change in life, most of them tends to learn from their mistake. We live in a world that we should learn to forgive and forget everyone mistake and move forward.”

While peer teaching may have value for more experienced student-writers, for the incompetent—which includes not just remedial students but increasing numbers of all incoming students—it is an egregious case of the blind leading the blind. It ignores the reason students are in remedial classes in the first place and violates the time-honored principle that one learns to write by reading good, not awful, writing.

The process school’s determination to break down hierarchy extends beyond the teacher-student divide. A pioneering freshman composition course at City College combines students who fail the CUNY writing exam with those who passed. Says Acting Provost Mike Aarons: “The idea behind the program [which is being replicated in other areas of the college] is that the more successful students help the less successful.”

Aarons might have added that another idea behind such programs is radical egalitarianism. Individual effort must go to raising the collectivity, not to raising oneself above the collectivity; individual success betrays the good of the whole. The course received a grant from the Fund for the Improvement of Post-Secondary Education—apparently the federal government likes the idea of fighting elitism as well.

In a process classroom, content eclipses form. The college essay and an 18-year-old’s personality become one and the same. Effie Cochran, an English as a second language professor at Baruch College, gushes: “Here I am—teacher-confessor. All these [gay] people are coming out to me through autobiographical reports who wouldn’t come out to a priest.” One process professor recommends that the profession “pay more attention to the experiences of psychotherapists regarding role-modeling, sexual tension, and transference.”

Students who have been told in their writing classes to let their deepest selves loose on the page and not worry about syntax, logic, or form have trouble adjusting to their other classes. A student at St. Anselm’s College complained to her writing teacher that her humanities professor had prevented her from developing her ideas on Homer, Cicero and the Hebrew prophets. His sin? He had insisted on numerous references to the text and correct English prose. “In humanities,” she whined, “I have to remember a certain format and I have to back up every general statement with specific examples. Oh, and that word, ‘I,’ I just used. You would never see that word in one of my humanities papers.” In process-school jargon, the poor humanities student has been denied “access to a personal language.”

With its emphasis on personal experience and expression, the process school forgets that the ultimate task of college writing is to teach students how to think. In the personal essay, assertions need not be backed up by anything more than the author’s sincerity. According to Rolf Norgaard of the University of Colorado, evaluation then becomes a judgment upon students’ lives, their personalities, their souls. But how can you tell a student, he asks, that her experiences or family life were not terribly original or striking?

The process school of writing has spread well beyond college campuses. Washington Irving Elementary School in Chicago introduced process methods six years ago in the hope of improving students’ catastrophic performance in reading and writing. Teachers tossed out their red pencils and workbooks; from then on, students would simply write, unfettered by such enthusiasm-crushing methods as rote learning. Students worked in groups, grades were out, cooperation was in.

The initial response, euphoria, was short-lived. Student groups rarely completed their assignments. They made little progress in mechanics. Some teachers started giving grades and teaching the basics again. But when they handed out incompletes and tried to hold students to higher standards, they caught heat from both parents and the principal, who told them that their expectations were too high. Lesson: Once out of the bottle, the process genie is hard to get back in.

Derrida’s writing lessons

In the early 1980s, a few process teachers started to sense that something was deeply wrong. While they had been unleashing an orgy of self-expression in their classes, across the hall in the literature department, the hippest teachers were preaching that the self was a fiction, a mere product of language. The process theorists, in other words, stumbled across deconstruction. In the 1970s and 1980s, this was not difficult to do, since just about every field in the humanities during that period scrambled to parrot the impenetrable prose of Jacques Derrida, Paul de Man, and Michel Foucault.

What an embarrassment for the poor process teachers! Deconstructionism declared the self dead, and they had been assiduously cultivating it. And what to do about their favorite genre, the personal essay, which seems to presuppose a writing subject, a concept anathema to deconstructionists?

The solution to this dilemma demonstrates the resourcefulness of college professors today. While some process advocates, such as Elbow, have continued their former ways unchanged, many others have simply grafted deconstructive rhetoric onto a process methodology. The result is pedagogical chaos. Students are writing personal essays, but they are deconstructing them at the same time. Such writing assignments are designed with a single purpose: to make the professor feel that he is at the cutting edge. They have nothing to do with teaching writing.

Witness the rhetorical sleight of hand of Joel Haefner, a professor at Illinois State University. Haefner manages to demonstrate disdain for process pedagogy, while nevertheless preserving it. “Calls to revive the personal essay,” he writes in College English,

“carry a hidden agenda and rest on the shibboleth of individualism, and concomitantly, the ideology of American democracy…As we interrogate our assumption about the essay genre and its role in a “democratic” and “individualistic” pedagogy, we will find, I think, that it makes more sense to see the essay as a cultural product, as a special kind of collective discourse. Hence there is still a place for the “personal” essay in a collaborative pedagogy.”

This tortured reasoning may preserve Haefner’s credibility with the post-structuralists, but its practical result must tie students up in knots. Here are some of Haefner’s deconstructive writing projects that are intended to “critique the fiction of a singular author”: writing groups create a personal essay that purports to be the work of a single author; individual students write a personal essay using “we”; teams rewrite a personal essay from other singular viewpoints; and (this is my favorite) students are encouraged not to create a unified and coherent first-person-singular voice, but, rather, a mix of “I” speakers.

This borders on pedagogical malpractice. Here are students who are unable to write coherent paragraphs, and they are being encouraged to cultivate an incoherent writing voice.

Multicultural writing

But academia can be cruel. No sooner did writing teachers master deconstructive jargon than a new, improved version came along. After years on the top of the charts, deconstructionism has been pushed aside by multiculturalism. Multiculturalism is both the direct offspring of deconstructionism and its nemesis. The current obsession with racial, sexual, and ethnic difference grew directly out of deconstructionism’s obsession with so-called linguistic difference. But, whereas deconstructionism was a mandarin pursuit that had only contempt for political engagement, multiculturalism asserts the centrality of politics to every human endeavor.

For would-be composition theorists, the most important consequence of multiculturalism has been the reemergence of the self as the central focus of concern. But the new multicultural self is defined exclusively by racial, sexual, and ethnic identity. The multicultural writing classroom is a workshop on racial and sexual oppression. Rather than studying possessive pronouns, students are learning how language silences women and blacks.

As New York Times reporter Richard Bernstein described in his recent book, Dictatorship of Virtue, the University of Texas at Austin exploded in controversy in 1990 over a proposed writing course called “Writing about Difference.” The course text was Racism and Sexism: An Integrated Study, by Paula Rothenberg, a national leader in the movement to inject race and gender into every aspect of the curriculum. “One assumption of this book,” writes Rothenberg, “is that racism and sexism pervade American culture, that they are learned at an early age and reinforced throughout life by a variety of institutions that are part of growing up and living in the United States.” Students in the new writing course would use the text’s readings to explore their own role as oppressors or victims.

In a rare victory for common sense, the course was cancelled after a bitter fight. Most colleges have not been so lucky, however. Students in Muhlenberg College’s Third World Experience composition course, for example, study works by third-world authors to learn how colonialism and gender each have their own unique systems of oppression. According to two critics of the course at Muhlenberg, it primarily requires that students “wade through the material, applaud, and announce its authenticity.”

Effie Cochran of Baruch College assigns her remedial-writing students role-playing exercises so that women can vent their anger at the discrimination they suffer in and out of school. Whether these performances improve students’ writing skills is anyone’s guess.

The personal essay remains a cornerstone for the multicultural classroom; it is a special favorite of feminists. But it has been supplemented by “ethnography.” David Bleich’s students at the University of Rochester conduct personal ethnographies on social relations in the classroom, observing how their gender, race, and class allegedly determine their response to literary works. The most frequently assigned topic for student ethnographers, however, is popular culture—in other words, describe and respond to your favorite rock video.

Every writing theory of the past 30 years has come up with reasons why it’s not necessary to teach grammar and style. For the multiculturalists, the main reason is that grammatical errors signify that the author is politically engaged. According to Min-Zhan Lu of Drake University, the “individual consciousness is necessarily heterogeneous, contradictory, and in process. The writer writes at the site of conflict.”

It is the goal of current writing theory to accentuate that conflict. Today’s theorists berate former City College professor Mina Shaughnessy, whose book, Errors and Expectations, heralded the remedial-writing movement, for trying to introduce her students—however gently—to academic prose. Min-Zhan Lu writes: “We need to contest teaching methods which offer to ‘cure’ all signs of conflict and struggle which the dominant conservative ideology of the 1990s seeks to contain.”

There is a basic law at work in current composition theory: As students’ writing gets worse, the critical vocabulary used to assess it grows ever more pompous. James Zebrowski of Syracuse University claims that doing ethnographies makes students “constructors of knowledge.” John Trimbur of Worcester Polytechnic Institute describes what he calls “post-process, post-cognitivist theory”: It “represents literacy as an ideological arena and composing as a cultural activity by which writers position and reposition themselves in relation to their own and others’ subjectivities, discourses, practices, and institutions.” According to Trimbur, “literacy crises result not from declining skills but from the contention of various interested representations of literacy.” In other words, students who can’t read and write are simply offering up another version of literacy, which the oppressive conservative ideology refuses to recognize. Such double-talk harks back to the 1960s, when open-admissions students were described as coming from a culture where “orality” was dominant.

Wanted: writers

The bottom line to all this nonsense is drastically lowered expectations of student skills. Marilyn Sternglass, a composition theorist at City College, argues that students should be able to pick up the topics for CUNY’s writing-proficiency exam before the test is administered because “responding to the questions cold makes too many demands on students. If they concentrate on content, their mechanics will suffer; if they concentrate on mechanics, they lose their train of thought.” It never occurs to her that such a zero-sum tradeoff indicates precisely what the test is supposed to measure: the inability to write.

Professors are expending vast amounts of energy making excuses for their students. At a 1994 composition conference at the CUNY Graduate Center, Geraldine de Luca, director of freshman English at Brooklyn College, railed against grammatical rules. Though teaching rules in response to individual students’ questions, she said, can be “empowering, the rules have a way of taking over. And some teachers think that’s fine: ‘It’s about time they learned some grammar,’ they say. ‘I knew this stuff when I was in the fifth grade.’ But in what time, in what community, in what country?” asked de Luca melodramatically. “Even the concept of error,” she concluded, “is beginning to feel repugnant to me.”

Today, at CUNY and elsewhere, there is a growing movement to abolish the distinction between remedial writing and reading courses and regular freshman courses, on the grounds that placing students in remedial courses injures their self-esteem. Remedial-writing courses at Baruch College and elsewhere are now known as “English as a Second Dialect,” or ESD, courses. Proudly displaying their knowledge of Foucault, composition theorists argue that the category “remedial education” is merely an artificial construct imposed by the ruling class on the oppressed. Marilyn Sternglass of City College quickly corrected me when I asked about students who needed remedial work: “They are ‘judged’ to need remedial classes,” she retorted haughtily.

Professors who exempt students from the very standards that governed them when they were in school feel compassionate, noble, and powerful. But the professors’ power is limited to their world. Though they may be willing to overlook spelling, punctuation, and grammatical errors in favor of a “holistic” approach to student writing, employers are clearly not as generous, as the census survey suggests.

[Heather Mac Donald graduated summa cum laude from Yale, and earned an M.A. at Cambridge University. She holds the J.D. degree from Stanford Law School, and is a John M. Olin Fellow at the Manhattan Institute and a contributing editor to City Journal] – Via Will Fitzhugh.




Madison Schools’ MAP Test Data Sharing Agreement



Madison School District PDF:

Data Sources
a) MMSD will sign NWEA’s release form allowing NWEA to transfer MMSD’s test data to Consultant.

b) In signing this contract, MMSD authorizes DPI to disclose student-level information to the Consultant for the purpose of linking demographic, enrollment, and other necessary data elements to student test scores during the analysis.

i. If data from DPI cannot be used to link student test scores to student demographic data as required by the value-added model, then Consultant will terminate the contract as outlined in the Termination section of the contract.

ii. NWEA student identifiers (full name, date of birth, and any other identifying information) will be sent to DPI, with all test data removed, in order to link the state and test IDs. No MAP test scores will be sent to DPI.

c) All data will be destroyed after 10 years or as required in disclosure forms.

d) The Consultant may add de-identified data to its national student growth reference
group database.

Much more on the “Measures of Academic Progress” (MAP), here.




The New Wisconsin Forward Exam



Wisconsin Reading Coalition:

The Badger Exam lasted just one year, to be replaced this spring with the Wisconsin Forward Exam. Wisconsin contracted with Data Recognition Corporation (DRC) to develop the new test with input from Wisconsin teachers.

In addition to rolling out the new assessment, DPI must complete the important process of setting proficiency standards. We hope they will continue to set proficiency cut scores based on the NAEP standards. For many years, Wisconsin yielded to the temptation to set its standards low, making it appear that we had higher percentages of proficient students than was actually the case. As reported by the Wisconsin Policy Research Institute, the required level for proficiency in 4th grade reading in Wisconsin was actually below the level that NAEP set for the basic performance category.

“In this report, which involved mapping state proficiency standards in reading and mathematics onto the appropriate NAEP scale (2004-05), Wisconsin was among the states at the lower levels.

At grade 4 in reading, Wisconsin proficiency levels rated well below the NAEP Basic cut score and considerably below the NAEP Proficient cut score. Wisconsin ranked 22nd out of 32 states, far behind the leading states of Massachusetts, South Carolina, and Wyoming.”

DPI has also been tasked by the U.S. Department of Education with finding a way to increase the number of students taking the statewide assessment. Wisconsin was one of 12 states that had a lower-than-required participation rate last year. A letter from the Department of Education spelled out the requirements and listed possible strategies to achieve compliance from districts.

Wisconsin’s long-serving WKCE exam was often criticized for its low standards.




Data-Driven Instruction Can’t Work If Instructors Don’t Use The Data



Matthew DiCarlo

In education today, data, particularly testing data, are everywhere. One of many potentially valuable uses of these data is helping teachers improve instruction – e.g., identifying students’ strengths and weaknesses, etc. Of course, this positive impact depends on the quality of the data and how they are presented to educators, among other factors. But there’s an even more basic requirement – teachers actually have to use them.
In an article published in the latest issue of the journal Education Finance and Policy, economist John Tyler takes a thorough look at teachers’ use of an online data system in a mid-sized urban district between 2008 and 2010. A few years prior, this district invested heavily in benchmark formative assessments (four per year) for students in grades 3-8, and an online “dashboard” system to go along with them. The assessments’ results are fed into the system in a timely manner. The basic idea is to give these teachers a continual stream of information, past and present, about their students’ performance.
Tyler uses weblogs from the district, as well as focus groups with teachers, to examine the extent and nature of teachers’ data usage (as well as a few other things, such as the relationship between usage and value-added). What he finds is not particularly heartening. In short, teachers didn’t really use the data.




Teacher Evaluation in Tennessee



Tennessee Department of Education:

In July 2011, Tennessee became one of the first states in the country to implement a comprehensive, student outcomes-based, statewide educator evaluation system. This implementation was a key tenet of Tennessee’s First to the Top Act, adopted by the General Assembly with bipartisan support during 2010’s extraordinary session against the backdrop of the federal Race to the Top competition. This landmark legislation established the parameters of a new teacher and principal evaluation system and committed to implementation during the 2011-12 school year. The act required 50 percent of the evaluation to consist of student achievement data–35 percent based on student growth as represented by the Tennessee Value-Added Assessment System (TVAAS) or a comparable measure and the other 15 percent based on additional measures of student achievement adopted by the State Board of Education and chosen through mutual agreement by the educator and evaluator. The remaining 50 percent of the evaluation is determined through qualitative measures such as teacher observations, personal conferences and review of prior evaluations and work.
An important component of the First to the Top Act was the creation of the Teacher Evaluation Advisory Committee (TEAC), a group of teachers, principals, superintendents, legislators, business leaders, and other community members, which met 21 times over the course of the following year to review and discuss various issues related to policy and implementation. The committee reviewed field tests of four different observation rubrics, which were conducted in the 2010-11 school year in approximately 125 schools across the state. The TEAC supported use of the TEAM (Tennessee Educator Acceleration Model) rubric as the state model and also voted on a number of key components of implementation, including the number and structure of observations for the year. By law, those recommendations were made to the State Board of Education, which was charged with adopting the final guidelines and criteria for the annual evaluation of all teachers and principals. The board ultimately unanimously adopted the TEAC endorsed TEAM model and, in addition, approved three alternative models – 1) Project Coach in Hamilton County; 2) TEM (Teacher Effectiveness Measure) in Memphis City; and 3) TIGER (Teacher Instructional Growth for Effectiveness and Results) in 12, mostly municipal, school systems statewide. The board also approved a menu of achievement measures that could be used as part of the 15 percent measure.
In the summer of 2011, the Tennessee Department of Education contracted with the National Institute for Excellence in Teaching (NIET) to provide a four-day training for all evaluators across the state. NIET trained more than 5,000 evaluators intensively in the state model (districts using alternative instruments delivered their own training). Evaluators were required to pass an inter-rater reliability exam, in which they viewed video recordings of teachers delivering lessons and rated them to ensure they understood the distinction between differing levels of performance.
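The 35/15/50 weighting described above amounts to a simple weighted average. The sketch below is purely illustrative: the function name, the assumption that every component is reported on a common 0-100 scale, and the example scores are mine, not Tennessee’s actual scoring formula.

```python
# Illustrative composite under the weighting described above: 35% TVAAS growth,
# 15% additional achievement measures, 50% qualitative measures. The common
# 0-100 scale and all names here are assumptions for illustration only.

WEIGHTS = {
    "tvaas_growth": 0.35,  # student growth (TVAAS or a comparable measure)
    "achievement": 0.15,   # additional achievement measure, mutually agreed
    "qualitative": 0.50,   # observations, conferences, prior-evaluation review
}

def composite_score(scores):
    """Weighted average of the three components, each on a 0-100 scale."""
    if set(scores) != set(WEIGHTS):
        raise ValueError("all three components are required")
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

# A teacher with strong observations but middling growth data:
print(round(composite_score(
    {"tvaas_growth": 60, "achievement": 70, "qualitative": 85}), 1))  # → 74.0
```

Note how the qualitative half dominates here: raising the observation score by ten points moves the composite by five, while ten points of TVAAS growth moves it by only three and a half.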




Education wake-up call is looming



Sarah Archibald & Michael Ford:

Wisconsin is about to get a wake-up call about the quality of its K-12 education system.
The Department of Public Instruction’s attempt to get a waiver from the federal government’s flawed No Child Left Behind law includes plans to increase testing standards for Wisconsin pupils.
According to the DPI, the effect of this change will be “dramatic” because while Wisconsin students will take the same test they do now, they’ll need much higher scores to be deemed proficient.
Currently, 83% of eighth-graders who take the Wisconsin Knowledge and Concepts Exam are said to be proficient in reading, for example. Under the proposal to index scores to the National Assessment of Educational Progress, that percentage would plummet to just 35%.
In other words, state testing next year will show substantially fewer Wisconsin pupils proficient in core subjects.
More rigorous standards will demand smarter instruction. Fortunately, Wisconsin is well-positioned to use value-added analyses of standardized tests as a tool to improve instructional decisions in ways that benefit students.

Using Value-Added Analysis to Raise Student Achievement in Wisconsin.
www.wisconsin2.org




Teacher evaluation: What it should look like



Valerie Strauss:

A new report from Stanford University researcher Linda Darling-Hammond details what the components of a comprehensive teacher evaluation system should look like at a time when such assessments have become one of the most contentious debates in education today.
Much of the controversy swirls around the growing trend of using students’ standardized test scores over time to help assess teacher effectiveness.
This “value-added” method of assessment — which involves the use of complicated formulas that supposedly evaluate how much “value” a teacher adds to a student’s achievement — is considered unreliable and not valid by many experts, though school reformers have glommed onto it with great zeal.
Any reader of this blog will have seen numerous pieces from educators, mathematicians and others explaining why this method is unfair, as well as pieces on what does work in teacher evaluation.




The Trouble With Humiliating Teachers: Making rankings public undermines the trust educators need to build collaborative teams.



Wendy Kopp:

When I dropped my kids off at school last week, I had a hard time looking their teachers in the eye. The New York City government had just posted their performance assessments online, and though I’m a strong supporter of teacher accountability and effectiveness, I was baffled and embarrassed by the decision.
So-called value-added rankings–which rank teachers according to the recorded growth in their students’ test scores–are an important indicator of teacher effectiveness, but making them public is counterproductive to helping teachers improve. Doing so doesn’t help teachers feel safe and respected, which is necessary if they are going to provide our kids with the positive energy and environment we all hope for.
The release of the rankings (which follows a similar release last year in Los Angeles) is based on a misconception that “fixing” teachers is the solution to all that ails our education system.




LAUSD Principal Focuses On Real Miramonte Criminals: The Children



Tim Cavanaugh:

One of the many privileges of having kids in the Los Angeles Unified School District is the accelerated education they get in official corruption, the stupidity of grownups, union strong-arming and many other topics – any topics other than reading, writing and arithmetic, that is.
The recent sex-abuse arrests of two teachers at Miramonte Elementary have become a feature of playground scuttlebutt and official conniptions. The school my children attend (separated from Miramonte by more than 15 miles, though both schools score in the “Least Effective” category in the L.A. Times’ value-added assessment) is no exception.
Yesterday my daughters brought home copies of a flyer containing the principal’s thoughts on the scandal. I guess this page of skylarking was intended to reassure us or something. I wouldn’t take note of it at all except that one paragraph illustrates the pathology of public employees with stunning clarity:




Trial And Error Is Fine, So Long As You Know The Difference



Matthew DiCarlo:

It’s fair to say that improved teacher evaluation is the cornerstone of most current education reform efforts. Although very few people have disagreed on the need to design and implement new evaluation systems, there has been a great deal of disagreement over how best to do so – specifically with regard to the incorporation of test-based measures of teacher productivity (i.e., value-added and other growth model estimates).
The use of these measures has become a polarizing issue. Opponents tend to adamantly object to any degree of incorporation, while many proponents do not consider new evaluations meaningful unless they include test-based measures as a major element (say, at least 40-50 percent). Despite the air of certainty on both sides, this debate has mostly been proceeding based on speculation. The new evaluations are just getting up and running, and there is virtually no evidence as to their effects under actual high-stakes implementation.




Wisconsin Read to Lead Task Force 8.25.2011 Meeting Summary



Wisconsin Reading Coalition, via a kind Chan Stroman-Roll email:

Summary of the August 25, 2011 Read to Lead Task Force Meeting
Green Bay, WI
The fifth meeting of the Read to Lead task force was held on August 25, 2011, at Lambeau Field in Green Bay. Governor Walker was delayed, so State Superintendent Tony Evers opened the meeting. The main topic of discussion was accountability for reading outcomes, including the strategy of mandatory grade retention. Troy Couillard from DPI also presented an overview of reading reform in Milwaukee Public Schools.
Accountability
Superintendent Evers said that Wisconsin will seek a waiver from the No Child Left Behind proficiency requirements by instituting a new system of accountability. His Educator Effectiveness and Accountability Design teams are working on this, with the goal of having a new accountability system in place by late 2011.
Accountability at the educator level:
The concept of using student achievement or growth data in teacher and principal evaluations is not without controversy, but Wisconsin is including student data in its evaluation model, keeping in mind fairness and validity. The current thought is to base 50% of the educator evaluation on qualitative considerations, using the Danielson Framework http://www.danielsongroup.org (“promoting professional learning through self assessment, reflection on practice, and professional conversations”), and 50% on student data, including multiple measures of performance. 10% of the student data portion of the evaluation (5% of the total evaluation) would be based on whole-school performance. This 5% would be based on a proficiency standard as opposed to a value-added measurement. The 5% is thought to be small enough that it will not affect an individual teacher adversely, but large enough to send a message that all teachers need to work together to raise achievement in a school. The task force was asked if it could endorse whole-school performance as part of teacher evaluation. The task force members seemed to have some support for that notion, especially at the principal level, but had some reservations at the level of the individual teacher.
Kathy Champeau was concerned that some schools do not have the resources to serve some children. She also felt it might not be fair to teachers, as they have no control over other teachers in the school or the principal.
Steve Dykstra said it is important to make sure any value-added system is designed to be fair.
Rachel Lander felt it would be better to use value-added data for whole-school performance rather than a proficiency standard, but supported the importance of schoolwide standards.
Rep. Steve Kestell supported the 5% requirement, and questioned what the qualitative half of the evaluation would be based on. He felt perhaps there could be some schoolwide standards to be met in that part of the evaluation, also.
Tony Evers responded that the Danielson Framework was research-based observations, and that the evaluators would need to be highly trained and consistent in their evaluations.
Tony Pedriana had questions about the type of research on which the Danielson Framework is based.
Evers said he would provide further information to the task force.
Mara Brown said she cannot control what the teacher down the hall does, and that the 5% should apply only to principals.
Linda Pils agreed with the 5%, but felt principals need to be watching and guiding new teachers. She agreed with Dykstra’s comments on measuring growth.
Sen. Luther Olsen was concerned that the 5% portion of a teacher’s evaluation may be the part that tips the balance on job retention for an individual, yet that individual has no control over whole-school performance. He understood the principle of getting everyone involved and committed to a goal, but was concerned with possible consequences.
Mandatory Retention:
The task force was asked to consider whether Wisconsin should implement a mandatory retention policy. If so, what would it look like, and if not, what can be done to make sure students are reading at grade level?
After a guest presentation and discussion, the consensus of the task force was that Wisconsin should not have mandatory retention. Reasons cited were negative effects on later achievement, graduation, self-esteem, and psychological well-being. Third grade was felt to be far too late to start intervention, and there needs to be more emphasis on developing teacher expertise and focusing on the responsibility of teachers, principals, and higher education as opposed to threatening the students with retention. Retention without changing the curriculum for the student the following year is pointless.
Dr. Elaine Allensworth, a director at the Consortium on Chicago School Research, joined the task force by telephone to summarize the outcomes of a mandatory retention project in Chicago. Students more than 1 year below the cut-off level on certain tested skills were retained unless they passed the test after a summer bridge program. Students identified as at-risk were given after-school tutoring during the year. Retention was thought to have three primary mechanisms that would affect student performance: motivation for students, families, and teachers to work harder, supplemental instruction after school and during the summer, and an additional year in the grade for failing students. All students in the school could be affected by the motivation and the supplemental instruction, but only the retained students by the extra year of instruction. The study found that the threat of retention worked as a positive motivator for teachers, parents, and some older students. However, there were also negatives in terms of higher-achieving students receiving less attention, more time on test preparation, and an instructional shift to focus on tested skills. The supplemental instruction, especially the summer bridge program, was the biggest positive of the retention project. There was high participation, increased personal attention, and higher-quality instruction. Retention itself had more negative effects than positive. Academic gains were either non-existent or rapidly disappearing. Multiple-year retentions resulted in a problematic mix of ages in classrooms, students unable to finish high school by age 18, and a negative overall attitude toward school.
Dykstra said it appeared that the impetus to do things differently because of the threat of retention had some benefit, but the actual retention had either no effect or a negative effect. He wondered if there was some way to provide the motivation without retention.
Allensworth agreed that the challenge was to provide a motivation without having a threat.
Pils asked if third graders could even understand the threat of retention.
Allensworth replied that they understood if teachers helped them. She also said that some schools with low-quality instruction had no way to improve student learning even with the threat of retention.
Rep. Jason Fields asked how you could avoid teaching to the test.
Allensworth replied that teaching the skills on the test was productive, but not the excessive time that was spent on test-taking strategies. She also said the tendency to teach more narrowly could cause problems later in high school where students needed to be able to participate in broader learning.
Marcia Henry inquired about students who returned to their old rate of learning when they returned to the regular classroom after successfully completing the summer bridge.
Allensworth replied that the summer program used higher quality curriculum and teachers, there was more time provided with students, and the students were more highly motivated.
Dykstra asked if it was possible to determine how much of the summer gain was due to student motivation, and how much due to teachers or parents.
Allensworth said those factors could not be pulled apart.
Champeau questioned whether the summer bridge program taught to the test.
Allensworth replied that it taught in a good way to the skills that the test assessed.
Brown asked if intervention was provided for the first time in third grade.
Allensworth replied that some schools began providing intervention and retaining in first or second grade.
Dykstra asked if the project created a situation where a majority of the school’s resources were concentrated in third grade, leaving other grades short.
Allensworth said they didn’t look at that, though some schools appeared to put their better teachers at certain grades.
Dykstra thought it was the wrong approach to tie services and supports to a specific grade rather than a specific student.
Are some types of consequences needed to create the urgency and intensity necessary for performance improvement? Should there be mandatory summer school or other motivators? The task force did not seem to arrive at a consensus on this.
Lander said schools need the resources to do early intervention, plus information on what should be done in early intervention, and this is not currently the case in Wisconsin.
Pils questioned where teachers would find the time to provide intervention. She liked the idea of after-school and summer programs as well as reading the classics to kids. Providing a model of best instruction is important for teachers who don’t have that background.
Mary Read commented on Bill Gates’ experience with spending a lot of money for minimal results, and the conclusion that money needs to go into teacher training and proven programs such as the KIPP schools or into a national core curriculum.
Dykstra noted that everyone agrees that teacher training is essential, but there is disagreement as to curriculum and training content. His experience is that teachers are generally unable to pinpoint what is going wrong with a student’s reading. We must understand how poor and widespread current teacher training is, apologize to teachers, and then fix the problem, but not at teachers’ expense.
The facilitators asked what the policy should be. Is there an alternative to using retention? Should teacher re-training be mandatory for those who need the support?
Evers said that a school-by-school response does not work. The reforms in Milwaukee may have some relevance.
Olsen suggested that there are some reading programs that have been proven successful. If a school is not successful, perhaps it should be required to choose from a list of approved instructional methods and assessment tools, show its results, and monitor program fidelity. He feels we have a great resource in successful teachers in Wisconsin and other states, and the biggest issue is agreeing on programs that work for intervention and doing it right the first time.
Kestell said some major problems are teachers with high numbers of failing students, poor teacher preparation, the quality of early childhood education, and over-funding of 4K programs without a mandate on how that money is used. There has been some poor decision-making, and the kids are not responsible for that. We must somehow hold schools, school boards, and individual educators accountable.
Champeau said teachers have no control over how money is spent. This accountability must be at the school and district level. More resources need to be available to some schools depending on the needs of their student population.
Lander: We must provide the necessary resources to identified schools.
Dykstra: We must develop an excellent system of value-added data so we can determine which schools are actually doing well. Right now we have no way of knowing. High-performing schools may actually be under-performing given their student demographics; projected student growth will not be the same in high and low performing schools.
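Dykstra’s point connects back to the value-added model described at the top of this report: compare each school’s actual growth to the growth predicted from its students’ characteristics. A minimal sketch, using only prior test score as a control and invented data (a real model, like MMSD’s, also controls for demographics and full-academic-year attendance):

```python
# Toy value-added calculation: fit a district-wide line predicting current
# score from prior score, then average each school's residuals. A school
# whose students beat the prediction has positive value added.

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x; returns (a, b)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return my - b * mx, b

def value_added(students):
    """students: list of (school, prior_score, current_score) tuples."""
    a, b = fit_line([s[1] for s in students], [s[2] for s in students])
    residuals = {}
    for school, prior, current in students:
        residuals.setdefault(school, []).append(current - (a + b * prior))
    return {school: sum(r) / len(r) for school, r in residuals.items()}

# Both schools start from identical prior scores, but A's students grow 10
# points more than the district trend predicts, and B's 10 points less.
va = value_added([("A", 400, 460), ("A", 450, 505),
                  ("B", 400, 440), ("B", 450, 485)])
```

On this toy data the result is roughly {"A": +10, "B": -10}: the schools serve observationally similar students, yet only A outgrows the district trend. That is the sense in which a school with high raw scores can still be under-performing once its student population is taken into account.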
Pedriana: We have long known how to teach even the most at-risk readers with evidence-based instruction. The truth is that much of our teacher training and classroom instruction is not evidence-based. We need the collective will to identify the evidence base on which we will base our choices, and then apply it consistently across the state. The task force has not yet taken on this critical question.
Pils: In her experience, she feels Wisconsin teachers are among the best in the country. There are some gaps we need to close.
Pedriana: Saying how good we are does not help the kids who are struggling.
Pils: We need to have our best teachers in the inner city, and teachers should not need to purchase their own supplies. We have to be careful with a limited list of approved programs. This may lead to ethics violations.
Pedriana: Referring to Pils’ mention of Wisconsin’s high graduation rates in a previous meeting, what does our poor performance on the NAEP reading test say about our graduation standards?
Michael Brickman (Governor’s aide): There is evidence of problems when you do retention, and evidence of problems when you do nothing. We can’t reduce the failing readers to zero using task force recommendations, so what should we do with students who leave 3rd grade not reading anywhere near grade level? Should we have mandatory summer school?
Henry: Response to Intervention (RTI) is a perfect model for intervening early in an appropriate way. A summer bridge program is excellent if it has the right focus. We must think more realistically about the budget we will require to do this intervention.
Olsen: If we do early intervention, we should have a very small number of kids who are still behind in 3rd grade. Are we teaching the right, most efficient way? We spend a lot of money on K-12 education in Wisconsin, but we may need to set priorities in reading. There is enough money to do it. Reading should be our mission at each grade level.
Facilitator: What will be the “stick” to make people provide the best instruction?
Dykstra: Accountability needs to start at the top in the state’s education system. When the same people continue to make the same mistakes, yet there are no consequences, we need to let some people go. That is what they did in Massachusetts and Florida: start with two or three people in whom you have great confidence, and build from there.
Facilitator: Is there consensus on mandatory summer school for failing students?
Michele Erickson: Summer school is OK if the right resources are available for curriculum and teachers.
Kestell: All grades 4K – 3 are gateway grades. They are all important.
Champeau: Summer school is a good idea, but we would need to solve transportation issues.
Dykstra: We should open up the concept of summer school beyond public schools to any agency that offers quality instruction using highly qualified instructors from outside the educational establishment.
Lander: Supports Dykstra’s idea. You can’t lay summer instruction on schools that can hardly educate during the school year.
Brown: Could support summer school in addition to, but not in place of, early intervention during the school year.
Erickson: Look at the school year first when allocating resources. Summer school is a hard sell to families.
Pedriana: Agrees with Olsen that we probably have sufficient funds for the school year, but we need to spend it more wisely. We cannot expect districts to make the commitment to extra instruction if there is no accountability at the top (including institutions of higher education). We need to resolve the issue of what knowledge and content standards will be taught before we address summer school or other issues.
Milwaukee Public Schools’ tiered RTI system was presented by DPI’s Troy Couillard as an example of an accountability system. MPS chose a new core reading program for 2010-11 after submitting its research base to DPI. Teachers were provided with some in-service training, and there are some site checks for fidelity of implementation. Tier 2 interventions will begin in 2011-12, and Tier 3 interventions in 2012-13. He felt that the pace of these changes, plus development of a data accountability system, student screening with MAP and other testing, progress monitoring, and professional development, has MPS moving much faster than most districts around the country on implementing RTI. DPI embedded RTI in the district’s Comprehensive Literacy Plan. DPI is pushing interventions that are listed on the National RTI site, but teachers are allowed to submit research for things they are using to see if those tools might be used.
Pils: Kids in MPS are already struggling. Reading First would suggest that they have 120 minutes of reading a day instead of the 90 minutes provided in the MPS plan.
Couillard: Tier 2 intervention for struggling students will add onto the 90 minutes of core instruction.
Olsen: Can this system work statewide without DPI monitoring all the districts?
Couillard: Districts are trained to monitor their own programs.
Pils: Veteran schools with proven strategies could be paired with struggling schools as mentors and models.
Pedriana: We have no way of knowing what proven strategies are unless we discuss what scientific evidence says works in reading. The task force must grapple with this question.
Brickman: Read to Lead task force needs to start with larger questions and then move to finer grain; this task force may not be able to do everything.
Pedriana: Is there anything more important for this task force to do than to decide what evidence-based reading instruction is?
Brickman: Task force members may submit suggestions for issues to discuss at the final meeting in September. Tony could submit some sample language on “evidence-based instruction” as a starting point for discussion.
Henry: The worst schools should be required to at least have specific guidelines, whether it is a legislative or DPI issue. Teacher retraining (not a 1-day workshop) is a necessity. Teachers are unprepared to teach.
Olsen: Wisconsin has always been a local control state, but one of the outcomes of the task force may be that we have a method for identifying schools that are not doing well, and then intervene with a plan. The state is ultimately responsible for K-12 education. Districts should take the state blueprint or come up with their own for approval by the state.
Erickson: Can we define what will work so districts can just do it?
Evers: MPS experience shows there is a process that works, and districts can do their own monitoring.
Dykstra: Sees value in making a list of things that districts are not allowed to do in reading instruction; also value in making a list of recommended programs based on alignment with the convergence of the science of reading research. That list would not be closed, but it should not include programs based on individual, publisher-funded studies that do not align with the convergence of the science. This could be of benefit to all districts. Even those doing relatively well could be doing better. Right now there is no list, and no learning targets. The MPS plan contains the Wisconsin Model Early Learning Standards, which contain errors. DPI needs to correct that information and distribute it right now. That would be a good example of accountability at the state level.
Couillard: The new statewide data collection system will help districts monitor their own data.
Champeau: School needs change depending on demographics. The goal should be to build decision-making capacity at the local level, not dictation from outside. We should be talking more about people than programs. Have MPS teachers been doing a better job? What will they do if their program goes away? We need to work on the underlying expertise and knowledge base.
Facilitator: There appears to be agreement that the state can intervene in failing districts.
Lander: We might have some consensus as to what teachers need to know, and then go into schools to see if they know it. If not, we need to teach them.
Pedriana: What is so bad about providing a program, with training, of course? It would help people.
Facilitator: There is consensus around training of teachers.
Dykstra: Some of the distinction between training and programs is artificial. You need both.
Other things the state could require: weighting of reading in evaluation systems, grading of schools, etc.
Dykstra: If giving schools grades, they should get separate grades for how they do in teaching separate content areas. In addition, everything should be reported in the best value-added system we can create, because it’s the only way to know if you’re doing a good job.
Pils: Doesn’t like grading of schools. She has a whole folder on cheating in districts that have grading of schools and high stakes tests.
Evers: Do we just want to measure what schools are doing, or do we want to use it to leverage change?
Erickson: Wisconsin has gone from 3rd to 30th on the NAEP, so of course we should be seeking change.
Walker: The idea is not to pick on failing schools, but to help them. We must be able to deploy the resources to the things that work in accordance with science and research to teach reading right.
Dykstra: We should seek small kernels of detailed information about which teachers consistently produce better results in a given type of school for a given type of student. There is a problem with reliability when using MAP data at an individual student level.
Supt. Evers talked about the new state accountability system as being a better alternative to No Child Left Behind. Governor Walker said the state is not just doing this as an alternative to NCLB, but in response to comments from business that our graduates are not well-prepared. Parents want to know what all schools are doing.
Olsen: We need a system to monitor reading in Wisconsin before we get into big trouble. Our changing population is leading us to discover challenges that other states have dealt with for years.
Kestell: The accountability design team is an excellent opportunity to discuss priorities in education; a time to set aside personal agendas and look for solutions that work.
Next Meeting/Status of Report
Michael Brickman will try to send out a draft of a report the week of August 29 with his best interpretation of task force consensus items. The final meeting will be Sept. 27, perhaps in Madison, Eau Claire, or Wausau. Some task force issues will need to be passed on to other task forces in the future.

Related: A Capitol Conversation on Wisconsin’s Reading Challenges and Excellence in Education explains Florida’s reading reforms and compares Florida’s NAEP progress with Wisconsin’s at the July 29th Read to Lead task force meeting and www.wisconsin2.org.

July 29 Wisconsin Read to Lead task force meeting

Julie Gocey, via email:

The fourth meeting of the Governor’s Read to Lead task force took place in Milwaukee on Friday, July 29. The meeting was filmed by Wisconsin Eye, but we have not seen it offered yet through their website. We will send out a notice when that occurs. As always, we encourage you to watch and draw your own conclusions.
Following is a synopsis of the meeting, which centered on reading improvement success in Florida and previously discussed task force topics (teacher preparation, licensing, professional development, screening/intervention, early childhood). In addition, Superintendent Evers gave an update on activity within DPI. The discussion of the impact of societal factors on reading achievement was held over to the next meeting, as was further revisiting of early childhood issues.

In addition to this summary, you can access Chan Stroman’s Eduphilia tweets at http://twitter.com/#!/eduphilia
Opening: Governor Walker welcomed everyone and stressed the importance of this conversation on reading. Using WKCE data, which has been criticized nationally and locally for years as being derived from low standards, the Governor stated that 80% of Wisconsin students are proficient or advanced in reading, and he is seeking to serve the other 20%. The NAEP data, which figured prominently in the presentation of the guest speakers, tell a very different story. Superintendent Evers thanked the task force members and indicated that this is all about “connecting the dots” and putting all of the “puzzle pieces” together. The work of this task force will impact the work going on in other education-focused committees.
The Florida Story: Guest speakers were Patricia Levesque, the Executive Director of the Foundation for Excellence in Education and the Foundation for Florida’s Future, and Mary Laura Bragg, the director of Florida’s statewide reading initiative, Just Read, Florida! from 2001 to 2006.
In a series of slides, Levesque compared Wisconsin, Florida, and national performance on the NAEP reading test over the past decade. Despite challenges in terms of English language learners, a huge percentage of students on free/reduced lunch, and a minority-majority demographic, Florida has moved from scraping the bottom on the NAEP to the top group of states. Over the same time period, Wisconsin has plummeted in national ranking, and our students now score below the national average in all subgroups for which NAEP data are disaggregated. 10 points on the NAEP scale is roughly equivalent to one grade level in performance, and Florida has moved from two grade levels below Wisconsin to 1/2 grade level above. For a full discussion of Wisconsin’s NAEP performance, see our website, http://www.wisconsinreadingcoalition.org.
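The grade-level comparisons in this paragraph follow mechanically from the stated rule of thumb (about 10 NAEP scale points per grade level). A trivial helper, with invented scores, shows the conversion:

```python
# Convert a NAEP scale-score gap into approximate grade levels, using the
# ~10-points-per-grade-level rule of thumb cited above. Scores are invented.
POINTS_PER_GRADE_LEVEL = 10

def grade_level_gap(score_a, score_b):
    """Positive result: group A leads group B by that many grade levels."""
    return (score_a - score_b) / POINTS_PER_GRADE_LEVEL

print(grade_level_gap(225, 220))  # 0.5 (half a grade level ahead)
```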
Levesque and Bragg also described the components of the reading initiative in Florida, which included grading all schools from A to F, an objective test-based promotion policy from third to fourth grade, required state-approved reading plans in each district, trained reading coaches in schools, research assistance from the Florida Center for Reading Research, required individual student intervention plans for struggling students, universal K-2 screening for reading problems, improved licensure testing for teachers and principals, the creation of a reading endorsement for teaching licenses, and on-line professional development available to all teachers. As noted above, achievement has gone up dramatically, the gap between demographic groups has narrowed, early intervention is much more common, and third grade retention percentages continue to fall. The middle school performance is now rising as those children who received early intervention in elementary school reach that level. Those students have not yet reached high school, and there is still work to be done there. To accomplish all this, Florida leveraged federal funds for Title 1 and 2 and IDEA, requiring that they be spent for state-approved reading purposes. The Governor also worked actively with business to create private/public partnerships supporting reading. Just Read, Florida! was able to engineer a statewide conference for principals that was funded from vendor fees. While Florida is a strong local control state, reading is controlled from the state level, eliminating the need for local curriculum directors to research and design reading plans without the resources or manpower to do so. Florida also cut off funding to university professors who refused to go along with science-based reading instruction and assessment.
Florida is now sharing its story with other states, and offering assistance in reading plan development, as well as their screening program (FAIR assessment system) and their online professional development, which cost millions to develop. Levesque invited Wisconsin to join Indiana and other states at a conference in Florida this fall.
Questions for, or challenges to, the presenters came from three task force members.

  • Rachel Lander asked about the reading coaches, and Bragg responded that they were extensively trained by the state office, beginning with Reading First money. They are in the classroom modeling for teachers and also work with principals on understanding data and becoming building reading leaders. The coaches now have an association that has acquired a presence in the state.
  • Linda Pils stated her belief that Wisconsin outperforms Florida at the middle school level, and that we have higher graduation rates than Florida. She cited opinions that third grade retention has some immediate effect, but the results are the same or better for non-retained students later, and that most retained students will not graduate from high school. She also pointed out Florida’s class size reduction requirement, and suggested that the NAEP gains came from that. Levesque explained that the retention studies to which Pils was referring were from other states, where retention decisions were made subjectively by teachers, and there was no requirement for science-based individual intervention plans. The gains for retained students in Florida are greater than for matched students who are not retained, and the gains persist over time. Further, retention did not adversely affect graduation rates. In fact, graduation rates have increased, and dropout rates have declined. The University of Arkansas is planning to do a study of Florida retention. The class size reduction policy did not take effect in Florida until last year, and a Harvard study concluded that it had no effect on student reading achievement. Task force member Steve Dykstra pointed out that you cannot compare the NAEP scores from two states without considering the difference in student demographics. Wisconsin’s middle school scores benefit from the fact that we have a relative abundance of white students who are not on free/reduced lunch. Our overall average student score in middle school may be higher than Florida, but when we compare similar cohorts from both states, Florida is far ahead.
  • Tony Pedriana asked what kinds of incentives have been put in place for higher education, principals, etc. to move to a science-based system of instruction. The guests noted that when schools are graded, reading performance receives double weight in the formula. They also withheld funding for university programs that were not science-based.

DPI Update: Superintendent Evers indicated that DPI is looking at action in four areas: teacher licensure, the Wisconsin Model Early Learning Standards, the use of a screener to detect reading problems, and implementation of the Common Core State Standards.

  • The committee looking at licensing is trying to decide whether they should recommend an existing, off-the-shelf competency exam, or revise the exam they are currently requiring (Praxis 2). He did not indicate who is on the committee or what existing tests they were looking at. In the past, several members of the task force have recommended that Wisconsin use the Foundations of Reading test given in Massachusetts and Connecticut.
  • DPI is revising the WMELS to correct definitions and descriptions of phonological and phonemic awareness and phonics. The changes will align the WMELS with both the Report of the National Reading Panel and the Common Core State Standards. Per the suggestion of Eboni Howard, a guest speaker at the last meeting, they will get an outside opinion on the WMELS when they are finished. Evers did not indicate who is doing this work.
  • DPI is looking at the possibility of using PALS screening or some other tool recommended by the National RTI Center to screen students in grades K-2 or K-3. Evers previously mentioned that this committee had been meeting for 6-7 months, but he did not indicate who is on it.
  • Evers made reference to communication that was circulated this week (by Dr. Dan Gustafson and John Humphries) that expressed concern over the method in which DPI is implementing the Common Core. He stated that districts have been asking DPI for help in implementing the CC, and they want to provide districts with a number of resources. One of those is the model curriculum being developed by CESA 7. DPI is looking at it to see how it could help the state move forward, but no final decision has yet been made.

Task force member Pam Heyde, substituting for Marcia Henry, suggested that it would be better to look at what Florida is doing rather than start from ground zero looking at guidelines. Patricia Levesque confirmed that Florida was willing to assist other states, and invited Wisconsin to join a meeting of state reading commissioners in October.
Teacher Preparation: The discussion centered around what needs to change in teacher preparation programs, and how to fit this into a four-year degree.
Steve Dykstra said that Texas has looked at this issue extensively. Most schools need three courses to cover reading adequately, but it is also important to look at the texts that are used in the courses. He referenced a study by Joshi that showed most of the college texts to be inadequate.
Dawnene Hassett, UW-Madison literacy professor in charge of elementary teacher reading preparation, was invited to participate in this part of the discussion. She indicated we should talk in terms of content knowledge, not number of credits. In a couple of years, teachers will have to pass a Teacher Performance Assessment in order to graduate. This was described as a metacognitive exercise using student data. In 2012-13, UW-Madison will change its coursework, combining courses in some of the arts, and dropping some of the pedagogical, psychological offerings.
Tony Pedriana said he felt schools of education had fallen down on teaching content derived from empirical studies.
Hassett said schools teach all five “pillars” of reading, but they may not be doing it well enough. She said you cannot replicate classroom research, so you need research “plus.”
Pils was impressed with the assistance the FCRR gives to classroom teachers regarding interventions that work. She also said spending levels were important.
Dykstra asked Mary Laura Bragg if she had worked with professors who thought they were in alignment with the research, but really weren’t.
Bragg responded that “there’s research, and then there’s research.” They had to educate people on the difference between “research” from vendors and empirical research, which involves issues of fidelity and validation with different groups of students.
Levesque stated that Florida increased reading requirements for elementary candidates from 3 to 6 credits, and added a 3 credit requirement for secondary candidates. Colleges were required to fit this in by eliminating non-content area pedagogy courses.
Kathy Champeau repeated a concern from earlier meetings that teacher candidates need the opportunity to practice their new knowledge in a classroom setting, or they will forget it.
Hassett hoped the Teacher Performance Assessment would help this. The TPA would probably require certain things to be included in the teacher candidate’s portfolio.
Governor Walker said that the key to the effectiveness of Florida’s retention policy was the intervention provided to the students. He asked what they did to make sure intervention was successful.
Levesque replied that one key was reading coaches in the classroom. Also, district reading plans, individual intervention plans, student academies, etc. all need to be approved by the state.
There was consensus that there should be a difference in reading requirements for elementary vs. secondary teachers. There was no discussion of preparation for reading teachers, reading specialists, or special education teachers.
Licensing: The discussion centered around what teacher standards need to be tested.
Dykstra suggested that the Knowledge and Practice Standards for Teachers of Reading, written by Louisa Moats, et al, and published by the International Dyslexia Association in 2010, would be good teacher standards, and the basis for a teacher competency exam. There was no need for DPI to spend the next year discussing and inventing new teacher standards.
Champeau said that the International Reading Association also has standards.
Pedriana asked if those standards are based on research.
Dykstra suggested that the task force look at the two sets of standards side-by-side and compare them.
Professional Development: The facilitators looked for input on how professional development for practicing teachers should be targeted. Should the state target struggling teachers, schools, or districts for professional development?
Rep. Jason Fields felt all three needed to be targeted.
Heyde asked Levesque for more details on how Wisconsin could do professional development, when we often hear there is no money.
Levesque provided more detail on the state making reading a priority, building public/private partnerships, and being more creative with federal grant money (e.g., the 20% of each grant that is normally carved out by the state for administration). There should be a clear reading plan (Florida started with just two people running their initiative, and after a decade only has eight people), and all the spending should align with the plan to be effective. You cannot keep sending money down the hole. Additional manpower was provided by the provision that all state employees would get one paid hour per week to volunteer on approved reading projects in schools, and also by community service requirements for high school students.
Bragg suggested using the online Florida training modules, and perhaps combining them with modules from Louisiana.
Dykstra also suggested taking advantage of existing training, including LETRS, which was made widely available in Massachusetts. He also stressed the importance of professional development for principals, coaches, and specialists.
Bragg pointed out that many online training modules are free, or provided for a nominal charge that does not come close to what it would cost Wisconsin to develop its own professional development.
Lander said there were many Wisconsin teachers who don’t need the training, and it should not be punitive.
Champeau suggested that Florida spends far more money on education than Wisconsin, based on information provided by the NAEP.
Levesque clarified that Florida actually is below the national average in cost per student. The only reason they spend more than Wisconsin is that they have more students.
Rep. Steve Kestell stated that teachers around the entire state have a need for professional development, and it is dangerous to give it only to the districts that are performing the worst.
Sarah Archibald (sitting in for Sen. Luther Olsen) said it would be good to look at the value added in districts across the state when trying to identify the greatest needs for professional development. The new statewide information system should provide us with some of this value added information, but not at a classroom teacher level.
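The value-added measure discussed here can be sketched in a few lines. The following is a hedged illustration only: it assumes a plain least-squares regression of current scores on prior scores, with a school's mean residual serving as its value-added estimate. The actual MMSD model controls for demographics, full-academic-year attendance, and other factors, and every name and number below is invented.

```python
# Minimal value-added sketch: regress current-year scores on prior-year
# scores, then average residuals within each school. A school whose
# students beat the districtwide prediction gets a positive value added.
import numpy as np

def value_added_by_school(prior, current, school_ids):
    """Return {school: mean residual} from an OLS fit of current on prior."""
    X = np.column_stack([np.ones_like(prior), prior])      # intercept + prior score
    beta, *_ = np.linalg.lstsq(X, current, rcond=None)     # districtwide fit
    residuals = current - X @ beta                         # actual minus predicted
    return {s: residuals[school_ids == s].mean() for s in np.unique(school_ids)}

# Invented example data: two students at each of three schools.
prior = np.array([400., 420., 410., 430., 405., 415.])
current = np.array([425., 440., 445., 455., 415., 420.])
schools = np.array(["A", "A", "B", "B", "C", "C"])
va = value_added_by_school(prior, current, schools)
```

With an intercept in the regression, the residuals sum to zero across the district, so value added is always relative: some schools land above the districtwide prediction and others below it.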
Evers commented that the state could require new teacher Professional Development Plans to include or be focused on reading.
Pils commented that districts can have low and high performing schools, so it is not enough to look at district data.
Champeau said that administrators also need this professional development. They cannot evaluate teachers if they do not have the knowledge themselves.
Dykstra mentioned a Florida guidebook for principals with a checklist to help them. He is concerned about teachers who develop PDPs with no guidance and spend a lot of time and money on poor training. There is a need for a clearinghouse for professional development programs.
Screening/Intervention: One of the main questions here was whether the screening should be universal using the same tools across the state.
Champeau repeated a belief that there are districts who are doing well with the screening they are doing, and they should not be required to change or add something new.
Dykstra responded that we need comparable data from every school to use value added analysis, so a universal tool makes sense. He also said there was going to be a lot of opposition to this, given the statements against screening that were issued when Rep. Keith Ripp introduced legislation on this topic in the last biennium. He felt the task force has not seen any screener in enough detail to recommend a particular one at this time.
Heyde said we need a screener that screens for the right things.
Pils agreed with Dykstra and Heyde. She mentioned that DIBELS is free and doesn’t take much time.
Michele Erickson asked if a task force recommendation would turn into a mandate. She asked if Florida used a universal screener.
Levesque replied that Florida initially used DIBELS statewide, and then the FCRR developed the FAIR assessments for them. The legislature in Florida mandated the policy of universal kindergarten screening that also traces students back to their pre-K programs to see which ones are doing a better job. Wisconsin could purchase the FAIR assessments from Florida.
Archibald suggested phasing in screening if we could not afford to do it all at once.
Evers supports local control, but said there are reasons to have a universal screener for data systems, to inform college programs, and to implement professional development.
Lander asked what screening information we could get from the WKCE.
Evers responded that the WKCE doesn’t start until third grade.
Dykstra said we need a rubric about screening, and who needs what type and how often.
Pedriana said student mobility is another reason for a universal screener.
There was consensus that early screening is important. Certainly by 4K or 5K, but even at age three if a system could be established. Possibilities mentioned were district-run screenings or pediatrician screenings.
Walker reminded the task force that it only makes sense to screen if you have the ability to intervene with something.
Mara Brown wasn’t sure that a universal screener would tell her anything more about her students than she already knows.
Levesque said she could provide a screening roadmap rubric for the task force.
No one on the task force had suggestions for specific interventions. The feeling was that it is more important to have a well-trained teacher. Both Florida and Oregon started evaluating and rating interventions, but stopped because they got bogged down. Wisconsin must also be careful about evaluations by What Works Clearinghouse, which has some problems.
Pedriana asked if the task force is prepared to endorse a model of instruction based on science, where failure is not an option.
The facilitator said this discussion would have to wait for later.
Early Childhood: The task force agreed that YoungStar should include more specific literacy targets.
Rep. Kestell felt that some districts are opening 4K programs primarily for added revenue, and that there is wide variability in quality. There is a need to spend more time on this and decide what 4K should look like.
Evers said we should use the Common Core and work backward to determine what needs to be done in 4K.
Wrap-Up: Further discussion of early childhood will be put over to the next meeting, as will the societal issues and accountability. A meeting site has not yet been set, but Governor Walker indicated he liked moving around the state. The Governor’s aides will follow up as to locations and specific agenda. The next meeting will be Thursday, August 25. All meetings are open to the public.

Related: An Open Letter to the Wisconsin Read To Lead Task Force on Implementing Common Core Academic Standards; DPI: “Leading Us Backwards” and how does Wisconsin Compare? www.wisconsin2.org.
Much more on Wisconsin’s Read to Lead Task Force, here.




What’s the Best Way to Grade Teachers?



Kristina Rizga:

>Last year, battles over charter schools dominated much of education coverage. This year, the controversy over “teacher evaluations” is poised to be the biggest fight among people with competing visions for improving public schools. For a primer on how these new teacher assessments work, don’t miss Sam Dillon’s recent piece in the New York Times. Reporting from Washington, D.C., Dillon found that last year the city fired 165 teachers using a new teacher evaluation system; this year, the number will top 200.

D.C. relies on a relatively new evaluation system called Impact, a legacy of its former school chief Michelle Rhee, who noticed that, despite the district’s low test scores, most teachers were getting nearly perfect evaluations. Rhee and the proponents of this new evaluation system feel that the old system relied too much on the subjective evaluations by the principal or a few experienced teachers. Opponents of the old system say these internal measurements are not data-driven or rigorous enough to allow principals and districts to identify struggling teachers who need assistance or to find the successful ones who deserve to be recognized and empowered.

Impact or other new evaluation systems are currently being implemented in around 20 states. The basic idea is to use performance-based evaluations that add external measures such as test scores to the internal measures mentioned above. Sparked by President Obama’s Race to the Top grants, these "value-added" evaluations rely heavily on kids’ test scores in math and reading. Teachers whose subjects are not measured by test scores are observed in the classroom. For example, D.C. teachers get five yearly classroom observations, three by principals and two by "master educators" from other schools.




Q & A: Charter School Proposal for Madison Preparatory Academy for Young Men



570K PDF:

APPENDIX MMM-7-21 January 31, 2011
Urban League of Greater Madison
SUMMARY
On December 6, 2010, the Urban League of Greater Madison presented an initial proposal for the establishment of Madison Preparatory Academy for Young Men (a non-instrumentality all-boys secondary charter school) to the Planning and Development Committee of the MMSD Board of Education. During the discussion that followed, Board members agreed to submit follow-up questions to the Urban League, to which the Urban League would respond before the next meeting of the Planning and Development Committee. Questions were submitted by Ed Hughes and Lucy Mathiak. Furthermore, Arlene Silveira submitted questions presented to her by several community members. Below each numbered Board member question, you will find the ULGM response.
1. Ed Hughes: Do you have a response to the suggestion that your proposal may violate Wis. Stat. sec. 118.40(4)(c) other than that you also intend sometime in the future to develop and operate a school for girls? If so, what is the response?
ULGM: Please refer to our letter to MMSD Board of Education members that responded to the ACLU’s opposition to Madison Prep. The answer to your question is contained in that letter. We have attached the letter to this document for your review.
2. Ed Hughes: To the extent the information is available to you, please list the 37 or so non-instrumentality charter schools currently operating in Wisconsin.
ULGM: The following list of non-instrumentality charter schools currently operating in Wisconsin was compiled from the 2010-2011 Charter Schools Yearbook published by the Department of Public Instruction. You can find the complete Yearbook online at: http://dpi.wi.gov/sms/pdf/2010.llyearbook.pdf
1. Barron, North Star Academy
2. Cambridge, JEDI Virtual High School
3. City of Milwaukee, Central City Cyberschool
4. City of Milwaukee, Darrell Lynn Hines (DLH) Academy
5. City of Milwaukee, Downtown Montessori Academy
6. City of Milwaukee, King’s Academy
7. City of Milwaukee, Milwaukee Academy of Science
8. Grantsburg, Insight School of Wisconsin
9. Hayward, Hayward Center for Individualized Learning
10. Hayward, Waadookodaading Charter School
11. McFarland, Wisconsin Virtual Academy
12. Milwaukee, Carmen High School of Science and Technology
13. Milwaukee, Highland Community School
14. Milwaukee, Hmong American Peace Academy (HAPA)
15. Milwaukee, International Peace Academy
16. Milwaukee, La Causa Charter School
17. Milwaukee, Milwaukee Community Cyber (MC2) High School
18. Milwaukee, Next Door Charter School
19. Milwaukee, Wings Academy
20. Milwaukee, Wisconsin Career Academy
21. Nekoosa, Niikuusra Community School
22. New Lisbon, Juneau County Charter School
23. New Richmond, NR4Kids Charter School
24. Sheboygan, Lake Country Academy
25. UW-Milwaukee, Bruce Guadalupe Community School
26. UW-Milwaukee, Business & Economics Academy of Milwaukee (BEAM)
27. UW-Milwaukee, Capitol West Academy
28. UW-Milwaukee, Milwaukee College Preparatory School
29. UW-Milwaukee, Milwaukee Renaissance Academy
30. UW-Milwaukee, School for Early Development & Achievement (SEDA)
31. UW-Milwaukee, Seeds of Health Elementary School
32. UW-Milwaukee, Tenor High School
33. UW-Milwaukee, Urban Day Charter School, Inc
34. UW-Milwaukee, Veritas High School
35. UW-Milwaukee, Woodlands School
36. UW -Milwaukee, YMCA Young Leaders Academy
37. UW-Parkside, 21st Century Preparatory School
38. Weyauwega-Fremont, Waupaca County Charter School
3. Ed Hughes: Do you have copies of any of the contracts Wisconsin non-instrumentality charter schools have entered into with their school districts? If so, please list the contracts and provide a copy of at least one of them.
ULGM: See attached contracts for Lake Country Academy in Sheboygan and the Wisconsin Virtual Academy in McFarland, which are both non-instrumentality charter schools.
4. Ed Hughes: To the extent the information is available to you, please list the amount of per-student payment each non-instrumentality charter school in Wisconsin is contractually entitled to receive from its sponsoring school district.
ULGM: We have requested information from the DPI on the current per-student payments to each non-instrumentality charter school in Wisconsin, but we understand that DPI does not now have the information consolidated in one database. We expect that the per-student payment information will be available from DPI by January 17, and we will submit that information to the board and administration as soon as it becomes available from the DPI. The per-pupil payment to each district-authorized charter school in Wisconsin, including instrumentality and non-instrumentality charter schools, is determined through negotiations and mutual agreement between the school district, as the charter school authorizer, and the charter school developer/operator.
5. Ed Hughes: Please identify the minimum per-student payment from the school district that would be required for Madison Prep to be financially feasible from your perspective. If you don’t have a specific figure, provide your best estimate of the range in which that figure is likely to fall.
ULGM: The MMSD Superintendent and Assistant Superintendent-Business are in agreement with us that more time is needed to present a projected minimum payment from the school district. DPI’s School Finance Data Warehouse indicates that MMSD reported $14,432 in revenue per student and spent $13,881 per student in 2008-09. We are certain that we will not request more per student than what MMSD spends annually.
6. Lucy Mathiak: Do you know what Madison Prep will cost the district? And do you know where the money will come from?
ULGM: We have an idea of what our school will cost, but as stated in the answer to question number 5, we are working through several costs and line items with MMSD’s Superintendent and Assistant Superintendent-Business. In Wisconsin, public charter schools are funded primarily by school districts or the state legislature (non-school district authorized schools). Generally, private funding is limited to 5% of costs during the budgeting process. However, we will raise significantly more in private funding during the pre-implementation and implementation years of the school than we will in out years.
7. Lucy Mathiak: How does the financial commitment asked of the district compare to the financial commitment to its existing schools?
ULGM: Assuming you mean existing traditional public schools, we will require more information from MMSD’s administration to make this comparison. Given that Madison Prep will be a new school and a non-instrumentality, there will be costs that Madison Prep has that the school system does not, and vice versa. However, we are firmly committed to ensuring our school is operated within the annual per pupil cost MMSD now spends to educate students in middle and high schools.
8. Community Member, via Arlene Silveira: First of all, has the funding that is indicated as part of the proposal actually been acquired or promised? The proposal indicates $100,000/ year from the Madison Community Foundation, but I can’t find any information from MCF itself about funding Madison Prep. All I can see is that they donated to the Urban League’s capital and Workforce campaigns. Will you check into this? Also, the proposal indicates $250,000/ year for 3 years from Partners for Developing Futures. Last year, despite having received 25 applications for funding from “education entrepreneurs,” this organization did not fund any of them due to the quality of the applications. How is the Madison Prep planning team able to claim this as a source of funding? Have promises been made?
ULGM: The Madison Community Foundation and Partners for Developing Futures were listed as potential revenue sources; these dollars were not committed. Our business plan followed the same approach as most business plans for start-up initiatives: listing prospective revenue sources. However, we do intend to pursue funding through these and other sources. Our private fundraising goals and needs in our five-year budget plan are reasonable.
9. Lucy Mathiak: What additional resources are needed to make the Madison Prep model work?
ULGM: Our school is designed as a demonstration school to be replicable, in whole or in part, by MMSD and other school systems. Therefore, we will not request more than the district’s own annual costs per pupil at the middle and high school levels.
10. Lucy Mathiak: What resources are in hand and what resources will you need to raise?
ULGM: We presently have $50,000 to support the planning of the school, with the offer of additional support. However, we will secure additional private and public funding once the Board of Education formally approves the DPI planning grant application/detailed proposal for Madison Prep.
11. Lucy Mathiak: If there is a proposed endowment, what is the amount of the endowment in hand, the estimated annual rate of return, and the estimated income available for use?
ULGM: New charter schools generally do not budget for endowment in their first few years of operation. We intend to build an endowment at some point and have line items for this in Madison Prep’s budget, but these issues will be decided by the Board of Directors of the school, for which we will not begin recruiting until the Board of Education approves our DPI planning grant application/detailed proposal.
12. Ed Hughes: Which parts of your proposal do you require non-instrumentality status to implement?
ULGM: Non-instrumentality status will be vital to Madison Prep’s ability to offer an extended school day, extended school year, as well as the expectations we have of teachers to serve as mentors and coaches to students. The collective bargaining contract between the Board of Education and Madison Teachers, Inc. would not allow for this added instructional time. Yet this added instructional time will be necessary in order for students to meet Madison Prep’s ambitious achievement goals. In addition, our professional development program will also require more hours of training. We also intend to implement other special activities for students and faculty that would not be allowed under MMSD and MTI’s collective bargaining agreement.
13. Ed Hughes: What will be the school’s admission policy? Please describe any preferences that the admission policy will include. To what extent will students who live outside ofthe Madison school district be considered for admission?
ULGM: Madison Prep will comply with all federal and state regulations relating to charter school admissions. In its inaugural school year (2012-2013), Madison Prep will be open to any 6th and 7th grade male student residing within the boundaries of MMSD.
All interested families will complete an Enrollment Form at the Urban League’s offices, online, during community meetings and outreach activities, through local partners, or during a visit to the school (after it opens). If Madison Prep receives fewer than 45 enrollment forms for either grade (6 and 7) in the first year, all students who applied will be admitted. If the school receives more than 45 enrollment forms for either grade level in the first year, or enrollment forms exceed the seats available in subsequent years, Madison Prep will hold a public random lottery at a location that provides enough space for applicant students and families. The lottery will be held in accordance with DPI guidelines for random lotteries. If Madison Prep does not fill all available seats, it will continue its grassroots recruitment efforts until it reaches its enrollment goal.
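The admissions rule described above reduces to a simple procedure: admit everyone when applications fit within the available seats, otherwise draw a uniform random lottery. The sketch below is an illustration only, assuming 45 seats per grade as stated; the function and parameter names are invented, and the actual lottery would follow DPI guidelines.

```python
# Hypothetical sketch of the described admissions rule.
import random

SEATS_PER_GRADE = 45  # stated capacity per grade in year one

def run_admissions(applicants, seats=SEATS_PER_GRADE, seed=None):
    """Admit all applicants if they fit; otherwise draw a random lottery."""
    if len(applicants) <= seats:
        return list(applicants)           # everyone who applied is admitted
    rng = random.Random(seed)             # seed only for reproducible demos
    return rng.sample(list(applicants), seats)  # uniform draw of `seats` winners
```

A seat count below the application count is the only condition that triggers the lottery; under-subscribed grades simply keep recruiting.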
14. Community Member, via Arlene Silveira: We know that Madison Prep won’t accept girls. Will it accept boys with autism or Asperger’s? If a boy has a learning disability, will he be allowed to attend? What if this learning disability makes it not possible for him to perform above grade level on a standardized test? Will he be allowed in? And can they kick him out if his test scores aren’t advanced/proficient?
ULGM: Please see our answer to question #13. To be clear, Madison Prep will accept students with special learning needs, including students who speak English as a second language. As always, IEP teams will determine on a case-by-case basis if Madison Prep is an appropriate placement for special education students. No Madison Prep student will ever be expelled for academic performance.
15. Ed Hughes: An attraction of the proposed school is that it could provide the kind of intense academic and other sorts of support that could change the trajectories of its students from failure to success. How will you ensure that your school serves primarily students who require the sort of approach the school will offer in order to be successful?
ULGM: Please see our answer to question #13 and question #16 below. We will go to great lengths to inform parents about Madison Prep as an option for their child, and to recruit students and families to our school. We will over-market our efforts in low-income communities and through media, sports clubs, community centers, churches, employers, and other vehicles that reach these students and their parents. We are also exploring the legality of our ability to set an income goal or threshold for student admissions. Nonetheless, we believe that any young man, regardless of their family background, would be well served by Madison Prep.
16. Ed Hughes: To the extent you know them, describe what the school’s student recruitment and marketing strategies will be.
ULGM: Madison Prep’s marketing plan will support three priorities and goals:
1. Enrollment: Recruiting, retaining, and expanding student enrollment annually. Share Madison Prep with as many parents and students as possible and establish a wait-list of at least 20 students at each grade level by June 1 each year (with the exception of year one).
2. Staffing: Recruiting and retaining a talented, effective, and committed faculty and staff. Field qualified applicants for each position in a timeframe that enables us to hire by June 30 each year.
3. Public Image and Support: Building, maintaining, and solidifying a base of support among local leaders, financial contributors, key partners, the media, and the general public.
To ensure the public is well acquainted with the school, Madison Prep, with the support of the Urban League of Greater Madison, will make use of a variety of marketing strategies to accomplish its enrollment, staffing, fundraising, and publicity goals. Each strategy will be phased in, from pre-launch of the school through the first three years of operation. These marketing strategies are less expensive and more sustainable with the budget of a new charter school than television, radio, and popular print advertisements. They also deliver a great return on investment if executed effectively. Each strategy will enable Madison Prep, with its limited staff, to promote itself to the general public and hard-to-reach communities, build relationships, sustain communications and achieve its goals.
A. Image Management: Madison Prep’s logo and images of young men projecting the Madison Prep brand will be featured on the school’s website, in informational and print materials, and on inexpensive paraphernalia (lapel pins, emblems, ink pens, etc.). Students will be required to wear uniforms that include a red or black blazer featuring the Madison Prep emblem, a sweater, a red or black tie, white shirt, black or khaki pants, and black or brown dress shoes. They will also have a gym uniform and athletic team wear that features the Madison Prep emblem. Additionally, Madison Prep will ensure that its school grounds, educational facility, and learning spaces are clean, orderly and well-maintained at all times, and that these physical spaces reflect positive images of Madison Prep students, positive adult males, community leaders, families, and supporters. Madison Prep’s Core Values will be visible throughout the school as well, and its students, faculty, staff, and Board of Directors will reflect an image in school and in public that is consistent with the school’s Core Values and Leadership Dimensions.
B. Grassroots Engagement: Madison Prep’s founders, Board members, volunteers, and its key staff (once hired) will go door-to-door in target neighborhoods, and other areas within MMSD boundaries where prospective candidates can be found, to build relationships with young men, families, and local community resource persons and advocates to recruit young men to attend Madison Prep. Recruiters will be dressed in the Madison Prep uniform (either a polo shirt, sweater or suit jacket/tie, each showing the Madison emblem, and dress slacks or skirt) and will visit homes in two person teams.
Madison Prep will also partner with City Council members, Advisory Neighborhood Commissioners, and local libraries to host community meetings year-round to promote the school in target neighborhoods and military bases. It will also promote the school to citizens in high traffic residential areas of the city, including metro stops, restaurants, community centers, community health agencies, and at public events. Madison Prep will engage the religious community as well, promoting the school to church leaders and requesting to speak before their congregations or have the church publicize the school during their announcements on Sundays and ministry activities during the week. Area businesses, hospitals, government agencies, foster care agencies, and mentorship programs will be asked to make information available to their patrons, clients, and families. Madison Prep will also seek to form partnerships with the Police Department and Court System to ensure judges, attorneys, neighborhood police officers, and family advocates know about the school and can make referrals of young men they believe will benefit from joining Madison Prep’s school community.
C. Online Presence & Partnerships: Madison Prep will launch a website and update its current Facebook and Twitter pages prior to the school opening to expand its public presence. The Facebook page for Madison Prep presently has more than 100 members, has been operational for less than 2 months, and has not yet been widely marketed. The page is used to raise awareness, expand support, communicate progress, announce activities and events, and promote small-donor fundraising campaigns. The website will be used to recruit students and staff, and eventually serve as an entry-point to a members-only section on the Internet for faculty, students, and parents. Madison Prep will also seek to establish strategic alliance partnerships with service associations (100 Black Men, Sororities and Fraternities, Civic Clubs or Organizations, etc.), enlisting their participation in the school’s annual events. In addition, Madison Prep will establish partnerships with other public and private schools in the Madison area to recruit students, particularly elementary schools.
D. Viral Marketing: Madison Prep will use email announcements and social networking sites to share its mission, activities, employment opportunities, and successes with its base of supporters and will inspire and encourage them to share the information with their friends, colleagues, parents and young men they know who might be interested in the school. Madison Prep will add to its base of supporters through its other marketing strategies, collecting names and contact information when and where appropriate.
E. Buzz Marketing: Madison Prep will use subtle forms of marketing to recruit students and faculty, increase its donor and support base, and develop a positive public image. The school will maintain an influential board of directors and advisors, will engage notable people and organizations in the school, and will publicize these assets to the general public. The school will also prepare key messages and strategically involve its students, staff, and parents in key events and activities to market its brand -high achieving, thoughtful, forward thinking, confident and empowered young men who are being groomed for leadership and success by equally talented, passionate and committed adults. The messages, images, and quality of interactions that the broader community has with members of the greater Madison community will create a positive buzz about the school, its impact, and the success of its students.
F. School Visits & Activity Participation: Each year, from the week after Thanksgiving through the end of the school year, Madison Prep will invite prospective students and parents, funders, and members of the community to visit the school. A visit program and weekly schedule will be established to ensure that the school day and learning is not interrupted by visitors. Madison Prep will also establish an open visit policy for parents, and will create opportunities for them to leverage their ongoing involvement with the school and their young men. Through nurturing positive relationships with parents, and establishing an environment where they are wanted and respected, Madison Prep will create spokespersons in the community who help grow its student body and community support. Finally, Madison Prep will host an annual community event that engages its school community with the greater Madison community in a day of fun, competitive events for families, and will serve as a resource to parents whose children do not attend Madison Prep by inviting them to participate in its Destination Planning workshops.
G. Popular Media: Madison Prep will allocate resources to market itself on Urban and News Radio during the peak student recruitment season in two phases. Phase 1 will take place in November 2011 and Phase 2 advertising will take place between January and May 2012. To defray costs, Madison Prep will enlist the support of local and national celebrities for feature interviews, spotlights, and PSAs with Madison Prep’s Leadership to promote the school.
17. Community Member, via Arlene Silveira: It looks like the charter school is aiming for 50% of its population to be low-income. The middle school my children will go to, Sherman, is 71% low income. Blackhawk is at 62%. Wright is 83%. Sennett is 65%. Cherokee is at 63%. Toki is at 51%. Can we, in good conscience, start a new school, designed to help low-income students, that has a lower percentage of low-income students than six of our existing middle schools?
ULGM: The Urban League has set the 50% low-income target as a floor, not as a ceiling. In fact, we expect that more than 50% of Madison Prep students will qualify for free or reduced lunch.
Furthermore, we have chosen to use the 50% figure to allow us to be conservative in our budgeting process. No matter what the level of low-income students at Madison Prep, 50% or higher, the student achievement goals and overall program quality will remain unchanged.
18. Ed Hughes: Have you considered limiting admission to students who have scored minimal or basic on their WKCE tests?
ULGM: No. Madison Prep will be open to any male student who wishes to attend, regardless of past academic performance.
19. Ed Hughes: Some have suggested that Madison Prep could skim off the most academically motivated African-American students from the District’s middle and high schools, leaving fewer role models and academic peers for the African-American boys who remain in our existing schools. What is your response to that concern?
ULGM: The notion that charter schools skim off the most motivated students is a common misconception. First, this argument is not logical. Parents and caregivers of children who are academically motivated and doing well in traditional public schools have little incentive to change their students’ educational environment. Those kids will likely stay put. When a parent, teacher, social worker, or school counselor recognizes that a child isn’t doing well in the traditional school and seeks an alternative, the charter school sought as that alternative gains no advantage in the process. In fact, research suggests the opposite. A 2009 study by researchers at Michigan State University, the University of Wisconsin, and Mathematica Policy Research examined charter schools from across the country to test the “skimming” theory. The researchers found no evidence of skimming. In fact, they found students who go to charter schools typically have LOWER test scores than their counterparts in traditional public schools. (Read the full paper at http://www.vanderbilt.edu/schoolchoice/conference/papers/Zimmer_COMPLETE.pdf)
20. Ed Hughes: Have you extended preliminary or informal offers of employment at Madison Prep to anyone? If so, identify to whom the preliminary or informal offers were made and for which positions.
ULGM: No.
21. Ed Hughes: What will be your strategy for recruiting teachers? What qualifications will you establish for teachers? Please describe the general range of salary and benefits you expect to offer to teachers.
ULGM: Teacher Recruitment - The overarching goal of teacher recruitment will be to hire a highly qualified, passionate, hard-working, diverse staff. The recruitment effort will include casting a wide net that allows Madison Prep to draw from the pool of local teachers as well as teachers statewide and nationwide who will embrace the opportunity to help build a school from the ground up. We will recruit through typical means (postings on our website, WECAN, charter school association job pages) as well as through recruitment fairs outside of the state. Our hiring process will take place in early and mid-spring rather than late spring and summer so that we may have a competitive edge in recruiting the teachers who are the best fit for Madison Prep. While the Head of School will be responsible for the hiring of teachers, he/she will engage a committee of teachers, community members, parents, and students in the process of selecting teachers and other staff. In addition to a thorough interview, teacher candidates will be required to teach a sample lesson to a group of students as well as other interview committee members. Teacher Qualifications - All teachers at Madison Prep will be licensed by the Department of Public Instruction.
General Salary Range and Benefits* - For the 2012-2013 school year, the salary for Master Teachers (of which there will be two) is currently projected to be $61,406 with a signing bonus of $2,000 and a maximum performance bonus of $2,750. The salary for general education teachers is currently projected to be $50,055 for the 2012-2013 school year, with a signing bonus of $2,000 and a maximum performance bonus of $1,750. Madison Prep intends to provide a full range of benefits to its teachers. *Salary and bonus figures are subject to change.
22. Ed Hughes: MMSD already has a charter middle school with a very diverse student population - James C. Wright Middle School. If the school district chose to continue James C. Wright as an instrumentality charter school but modeled on your Madison Prep proposal, which components of your proposal do you think could be implemented at the school and which components of your proposal could not?
ULGM: The Urban League is not in a position to determine how the fundamental elements of the Madison Prep proposal could or could not be implemented at James C. Wright Middle School. That determination would have to be made by the district administration and community at Wright.
23. Community Member, via Arlene Silveira: Here is the annual report from one of the Urban League charter schools that the proposal cites as a model for Madison Prep:
http://www.doe.mass.edu/charter/reports/2009/annual/0471.doc This is a report from the school’s 10th year in existence. Please note the test achievement goals and scores on page 4 and compare them with the extremely overconfident goals of the Madison Prep proposal. If Madison Prep is serious about attaining the goal of 75% of their students scoring 22 or higher on the ACT or 1100 or higher on the SAT, how do they plan to achieve this, and what will happen with those students who fail to meet this standard? What will happen to the teachers who don’t meet their quota of student test scores above this level? Please investigate these questions in detail and within the framework of Madison Prep processes from admissions through expulsion.
ULGM: The reference to the New Leadership Charter School in Springfield, Massachusetts in the Madison Prep initial proposal was meant to show the precedent for the establishment of charter schools by Urban League affiliates; the New Leadership Charter School is NOT a model for Madison Prep, nor was this ever stated in the initial proposal. That said, Madison Prep IS serious about our student achievement goals related to the ACT and SAT. We plan to meet these goals through, as the proposal states, an all-male student body, the International Baccalaureate curriculum, a college preparatory educational program, Harkness teaching, an extended school day and year, mentoring and community support, and a prep year. Students will be carefully assessed for years leading up to these tests to ensure their preparedness. When formative assessments indicate re-teaching is needed in order to meet the goal, students will receive further individualized instruction. Madison Prep teachers will not have student test score “quotas.”
24. Lucy Mathiak: What would a timeline for the counterpart girls’ school look like?
ULGM: We would like to initiate the process for the girls’ school in the fall of 2012, with an opening aimed at 2014-2015.

I continue to believe that the fate of this initiative will be a defining moment for the Madison School District. If approved and implemented, it will, over time, affect other traditional schools within the District. If it is rejected, a neighboring District will likely step in.
Finally, I found the Urban League’s response to Ed Hughes’ question #5 interesting:

DPI’s School Finance Data Warehouse indicates that MMSD reported $14,432 in revenue per student and spent $13,881 per student in 2008-09. We are certain that we will not request more per student than what MMSD spends annually.




What the LA Seniority Settlement Does and Doesn’t Do



There has been much concern that the proposed LA seniority settlement somehow eliminates seniority. Let’s be clear: this settlement does not eliminate seniority, either at the protected school sites or in the district. It simply means that some schools would be protected from experiencing mass layoffs when budget cuts are required. Even these schools will not be exempt from cuts. When the district cuts, say, 5 percent of staff districtwide, this settlement means that only 5 percent of teachers at a protected site can be cut, and those teachers would be selected based on seniority. What will this mean for other schools in the district? It will mean that more senior teachers will be laid off in the wealthier parts of the district, but isn’t that fair? And how will those layoff decisions be made? That’s right, based on seniority. So this settlement simply spreads the pain a little more evenly across the district, but still bases decisions on seniority. When there are budget cuts, shouldn’t all the schools in the district feel some impact on their teaching staff?
Now, on the question of whether there should be broader reforms to teacher seniority policies, I think the answer is yes. But first, districts must have much better teacher evaluation systems in place. For example, I am a fan of the TAP model, in which the teacher evaluation system includes at least six classroom observations a year by a combination of administrators and master teachers using an agreed-upon evaluation rubric. These measures are combined with value-added assessments and other outcome-based measures. Once more rigorous teacher evaluation systems are in place, such a system can protect teachers from the arbitrary judgment of a principal.




We can measure teacher effectiveness, objectively and transparently (Richmond County Daily Journal)



Terry Stoops

Is it possible to measure teacher effectiveness? For decades, public school principals have subjected teachers to a battery of observations and evaluations purportedly designed to assess the quality of classroom instruction. Rather than yield appreciable information, however, these kinds of teacher assessments merely served as one of the few formal requirements needed to attain lifetime job security, also known as tenure.
On the other hand, the “value-added” method of teacher evaluation continues to show promise as an objective and reliable assessment of teacher quality. Value-added analysis uses standardized tests to estimate teacher effectiveness. This powerful evaluation method employs advanced statistical techniques to project the future performance of individual students based on their past performance. The difference between the projected and actual performance of students determines the value added or subtracted by the teacher.
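The arithmetic at the heart of this method can be sketched in a few lines. The example below is a deliberately naive illustration with invented scores: it projects each student's score as last year's score plus the district-wide average gain, whereas real value-added models use far richer statistical controls for demographics and prior history:

```python
# Toy illustration of value-added scoring: the value a teacher adds is
# the average gap between each student's actual and projected score.
# The projection here is naive (prior score plus the district-wide
# average gain); real models control for demographics and much more.

def project(prior_score, avg_gain):
    """Project this year's score from last year's plus the average gain."""
    return prior_score + avg_gain

def value_added(students, avg_gain):
    """Mean difference between actual and projected scores."""
    gaps = [s["actual"] - project(s["prior"], avg_gain) for s in students]
    return sum(gaps) / len(gaps)

# Invented scale scores for three students.
students = [
    {"prior": 480, "actual": 502},  # gained 22 points
    {"prior": 455, "actual": 468},  # gained 13 points
    {"prior": 510, "actual": 531},  # gained 21 points
]

# If the district-wide average gain is 15 points, this class beat its
# projections by about 3.67 points on average.
print(round(value_added(students, avg_gain=15), 2))
```

A positive result is read as value added by the teacher; a negative one as value subtracted, which is exactly the comparison the column describes.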
Value-added analysis has upended the conventional wisdom on teacher quality. For years, public school advocacy groups complained that the most talented teachers snub minority and low-income schools by migrating to less challenging and higher paying schools in culturally and economically homogeneous suburbs.

Terry Stoops is director of education studies at The John Locke Foundation.




An analysis of Tennessee School Performance



Education Consumers Foundation:

Tennessee schools are measured on two things: achievement, seen in standardized assessment and ACT results; and growth, reported through the state’s value-added assessment system. For the first time, parents and other Tennessee citizens can plot the performance of their child’s school and others across the district or state through the ECF’s interactive Growth vs. Achievement Charts.
To view charts for each major grade level grouping, visit the following links:




The L.A. Times Flunks L.A. Schoolteachers The newspaper takes on the two L.A. sacred cows–teachers and unions–and lives to print again!



Jack Shafer

Nobody but a schoolteacher or a union acolyte could criticize the Los Angeles Times’ terrific package of stories–complete with searchable database–about teacher performance in the Los Angeles Unified School District.
Union leader A.J. Duffy of the United Teachers Los Angeles stupidly called for a boycott of the Times. Boycotts can be sensible things, but threatening to boycott a newspaper is like threatening to throw it into a briar patch. Hell, Duffy might as well have volunteered to sell Times subscriptions, door-to-door, as to threaten a boycott. Doesn’t he understand that the UTLA has no constituency outside its own members and lip service from members of other Los Angeles unions? Even they know the UTLA stands between them and a good education for their children.
Duffy further grouched that the Times was “leading people in a dangerous direction, making it seem like you can judge the quality of a teacher by … a test.” [Ellipsis in the original.] Gee, Mr. Duffy, aren’t students judged by test results?
American Federation of Teachers President Randi Weingarten also knocked the Times for publishing the database that measures the performance of 6,000 elementary-school teachers. Weingarten went on to denounce the database as “incomplete data masked as comprehensive evaluations.” Of course, had the Times analysis flattered teachers, Weingarten would be praising the results of the analysis.




Formula to Grade Teachers’ Skill Gains in Use, and Critics



Sam Dillon

How good is one teacher compared with another?
A growing number of school districts have adopted a system called value-added modeling to answer that question, provoking battles from Washington to Los Angeles — with some saying it is an effective method for increasing teacher accountability, and others arguing that it can give an inaccurate picture of teachers’ work.
The system calculates the value teachers add to their students’ achievement, based on changes in test scores from year to year and how the students perform compared with others in their grade.
People who analyze the data, making a few statistical assumptions, can produce a list ranking teachers from best to worst.
Use of value-added modeling is exploding nationwide. Hundreds of school systems, including those in Chicago, New York and Washington, are already using it to measure the performance of schools or teachers. Many more are expected to join them, partly because the Obama administration has prodded states and districts to develop more effective teacher-evaluation systems than traditional classroom observation by administrators.




High schools should dare to measure success differently



Jay Matthews:

On my blog, washingtonpost.com/class-struggle, I gush over my many genius ideas, worthy of the Nobel Prize for education writing if there were one. Here is a sample from last month:
“Why not take the Collegiate Learning Assessment, a new essay exam that measures analysis and critical thinking, and apply it to high schools? Some colleges give it to all of their freshmen, and then again to that class when they are seniors, and see how much value their professors at that college have added. We could do the same for high schools, with maybe a somewhat less strenuous version.”
Readers usually ignore these eruptions of ego. But after I posted that idea, a young man named Chris Jackson e-mailed me that his organization had thought of it four years ago and had it up and running. Very cheeky, I thought, but also intriguing. I never thought anyone would try such a daring concept. If your high school’s seniors didn’t score much better than your freshmen, what would you do? What schools would have the courage to put themselves to that test or, even worse, quantify the level of their failure, as the program does?




How teacher pay should work



Tom Vander Ark:

Kim Marshall’s December 16 EdWeek commentary attempts to “demolish the argument for individual merit pay.” He makes good points that suggest that individual bonuses based solely on value-added test scores are not a good idea. He suggests, instead, team-based bonuses and more pay for master teachers.
There’s an alternative in between that most big organizations use, and it works like this:

  • In collaboration with peers and a manager, a Personal Performance Plan sets out objectives for the year. For a teacher, these objectives may include several objective assessments, but would also include team contributions and a personal growth plan.
  • A pool for merit increases is set based on the financial health of the organization and cost of living (let’s assume an annual target of 2.5%)
  • Quarterly conversations about performance are summarized in a year-end document.
  • Merit increases would range from 0% for teachers who accomplished few objectives to 5% for teachers who exceeded expectations.
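A back-of-the-envelope sketch makes the scheme concrete. In the hypothetical example below (the names, salaries, ratings, and the rating-to-raise mapping are all invented for illustration), each teacher's 0%-5% raise is set by a performance rating, and the salary-weighted average is checked against the 2.5% pool target:

```python
# Hypothetical sketch of the merit-increase scheme above: raises run
# from 0% to 5% by performance rating, and the salary-weighted average
# should land near the 2.5% pool target. All figures are invented.

RAISE_BY_RATING = {1: 0.0, 2: 0.015, 3: 0.025, 4: 0.04, 5: 0.05}

def merit_increases(teachers):
    """Each teacher's raise in dollars, keyed by name."""
    return {t["name"]: t["salary"] * RAISE_BY_RATING[t["rating"]]
            for t in teachers}

def pool_rate(teachers):
    """Salary-weighted average raise, to compare with the pool target."""
    total_raises = sum(merit_increases(teachers).values())
    total_salary = sum(t["salary"] for t in teachers)
    return total_raises / total_salary

staff = [
    {"name": "A", "salary": 50000, "rating": 5},  # exceeded expectations
    {"name": "B", "salary": 50000, "rating": 3},  # met expectations
    {"name": "C", "salary": 50000, "rating": 1},  # accomplished few objectives
]

print(merit_increases(staff))      # raises of $2,500, $1,250, and $0
print(round(pool_rate(staff), 4))  # 0.025, right at the 2.5% target
```

With this kind of mapping, a district can tune the rating-to-raise table so the expected payout matches whatever pool its budget supports.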




Teachers to Face Individual, School Evaluations of Student Success



Bill Turque:

District teachers will be evaluated on their individual effectiveness and their school’s overall success in improving student performance under an assessment system to be unveiled this fall, Schools Chancellor Michelle A. Rhee said yesterday.
In her most detailed public comments to date on planned changes to the evaluation system, Rhee said at a D.C. Council hearing that the approach would combine standardized test scores where practical, intensive classroom observation and “value added” measurements of students’ growth during the year.
Teachers would also be allowed to set buildingwide goals for achievement that would be used in evaluating their performance.
Rhee said the Professional Performance Evaluation Program, which the District has used in recent years, is inadequate and does not reflect a teacher’s worth or how much he or she has helped students grow. She said the federal No Child Left Behind law, as written, is too narrowly focused on test results and not student progress from year to year.




Not Everyone Wants to Move Toward Rating Educators by Student Progress



Jay Matthews:

For a while, the fight over how to improve public schools seemed to be quieting down. During the presidential campaign, Republican and Democratic education advisers happily finished each other’s sentences on such issues as expanding charter schools, recruiting better teachers and, in particular, rating schools by how much students improve.
Moving to the growth model for school assessment, by measuring each student’s progress, seems to be the favorite education reform of the incoming Obama administration. Up till now, we have measured schools by comparing the average student score one year with the average for the previous year’s students. It was like rating pumpkin farmers by comparing this year’s crop with last year’s rather than by how much growth they managed to coax out of each pumpkin.
The growth model appeals to parents because it focuses on each child. It gives researchers a clearer picture of what affects student achievement and what does not. Officials throughout the Washington area have joined the growth model (sometimes called “value-added”) fan club. The next step would be to use the same data to see which teachers add the most value to their students each year.




Board Talks Will Focus on a New Blueprint



Susan Troller
The Capital Times
September 25, 2007

Football coach Barry Switzer’s famous quote, “Some people are born on third base and go through life thinking they hit a triple,” could easily apply to schools and school districts that take credit for students who enter school with every advantage and continue as high achievers all along.
But how do you fairly judge the job that teachers, schools and districts with many children who have significant obstacles — obstacles like poverty, low parental expectations, illness and disability or lack of English proficiency — are doing? Likewise, how do you make certain that your top students are adding growth every year as they go through school, rather than just coasting toward some average or proficient standard?

(more…)




Continue Elementary Strings – 550 Low-Income Children Deserve the Opportunity to Proudly Play Their Instruments



On Wednesday, May 31st, the MMSD School Board will consider amendments to the 2006-2007 school budget proposed by the Superintendent. In his proposal, the Superintendent proposed cutting Grade 4 strings this year and Grade 5 strings at the end of next year. One amendment to be discussed on Wednesday would have Grade 4 strings 1x per week (45 minutes) and Grade 5 strings 2x per week (45 minutes each class).
Students who will be affected the most are our low-income children. There is no other place in Dane County that can teach so many low-income children. This year about 550 low-income students took elementary strings. Fewer opportunities at this age will lead to fewer low-income/minority students in our middle and high school orchestras and band – this is a direction we do not want to move in as our student body becomes more diverse.
Like it or not, people moving into the area with children check out what schools offer – our suburban school districts have elementary string programs that are growing in many towns.
I’ve advocated for a community committee for fine arts education to develop a long-term plan for this academic area. I hope this comes to pass, but first I hope the School Board favorably considers this amendment and follows Lawrie Kobza’s idea – hold off spending on “things” because people cannot be added back in as easily as things can be added back into the budget.
I’ve written a letter to the school board that follows:

(more…)




When Ability Grouping Makes Good Sense



By James J. Gallagher
I am posting this article from 1992 given the recent debate on one size fits all classrooms. Professor Gallagher makes the point that the argument that homogeneous grouping hurts no one is clearly false: research consistently shows that high ability students do better when they are in classes with similarly able peers.
The recent educational literature has been filled with discussions of the effects of ability grouping, tracking, etc., and new virtues have been found in the concept of heterogeneous grouping of students. The homogeneous grouping of slow-learning children does not appear to be profitable, but the homogeneous grouping of bright students is a very different matter, and often ignored in these discussions. (See “Tracking Found To Hurt Prospects of Low Achievers,” Education Week, Sept. 16, 1992.)
The goal of heterogeneous grouping appears to be a social one, not an academic one.(emphasis added) The desirability of that goal needs to be argued on its own merits, which I believe to be considerable. The argument is clouded, however, by the insistence of the proponents that nothing is lost in academic performance by such grouping. This position is clearly false, in my judgment, as it applies to bright students. Apart from the meta-analyses which indicate substantial gains for gifted students grouped for ability, there is a small matter of common sense.

(more…)




Examining Student Scores for Opportunities for Academic Improvement



Jay Mathews, Washington Post staff writer, wrote an article in the December 14, 2004 Washington Post (Mining Scores for Nuances in Improvement) about using value-added assessments, which “…use test scores to compare each child’s progress to predictions based on past performance…” Not everyone is pleased with value-added assessments. “Value-added assessment has also become a political irritant because some school boards and superintendents want to pay teachers based on how much value they are adding, as measured by individual student test scores, for students in their classes. In Ohio and most other states, the system is being used only to diagnose student needs, leaving the question of teacher pay for later.” Value-added assessment, which can be done by principals or teachers, is one approach that attempts to bring analysis of student data closer to the school and teacher.
Mining Scores for Nuances in Improvement




Madison Superintendent Declines $2M in Federal Funds Without Consulting the Board



On Friday, October 15, Madison School Board members received an e-mail from Superintendent Art Rainwater announcing that the district will withdraw from a federal program known as Reading First.
In subsequent interviews with local newspapers, Rainwater estimated that the decision means forgoing approximately $2M in funds for materials to help students in the primary grades learn to read. The Cap Times
Wisconsin State Journal
Whenever the district qualifies for such federal grants, the Board votes to increase the budget to reflect the new revenues. To the best of my knowledge, the superintendent has not discussed this decision with the Performance & Achievement Committee. He has certainly not included the full Board in the decision to withdraw from Reading First.
The memo follows (click on the link below to view it or click here to view a 200K PDF):

(more…)




The “No Child” Law’s Biggest Victims



An Answer That May Surprise
Margaret DeLacy’s recent article in Education Week
Since education is high on the national agenda, here’s a pop quiz that every American should take.
Question: What group of students makes the lowest achievement gains in school?
Answer: The brightest students.
In a pioneering study of the effects of teachers and schools on student learning, William Sanders and his staff at the Tennessee Value-Added Assessment System put it this way: “Student achievement level was the second most important predictor of student learning. The higher the achievement level, the less growth a student was likely to have.”
Mr. Sanders found this problem in schools throughout the state, and with different levels of poverty and of minority enrollments. He speculated that the problem was due to a “lack of opportunity for high-scoring students to proceed at their own pace, lack of challenging materials, lack of accelerated course offerings, and concentration of instruction on the average or below-average student.”
While less effective teachers produced gains for lower-achieving students, Mr. Sanders found, only the top one-fifth of teachers were effective with high-achieving students. These problems have been confirmed in other states. There is overwhelming evidence that gifted students simply do not succeed on their own.

(more…)