If they are, then ... what is up with this?

The Beaverton, Oregon school district (BSD) has the following posted on its website in regard to CMP2:

http://www.beaverton.k12.or.us/pdf/ins/ins_mptmr_CMP_research1.pdf

This is a link to a two-page report that says:

Conclusion

Students using Pearson’s CMP2 curriculum showed a significant increase in their average ITBS scores from pretest to posttest. This analysis simply shows that treatment students showed significant growth in problem solving, estimation, computation, and data interpretation. Further analyses of curriculum implementation and any possible differences across school sites will be included in the final report. A comparison of treatment and control ITBS scores will also be included, along with analyses on the remaining outcome measures. The incorporation of other statistical and qualitative analyses will provide a more inclusive representation of any changes in student attitudes, mathematical conceptual understanding, and/or mathematical achievement across both treatment and control groups.

Now contrast that conclusion with page 74 of the full report:

http://www.pearsoned.com/RESRPTS_FOR_POSTING/MATH_RESEARCH_STUDIES/cmp2_efficacy_study_2007-08_final_report.pdf

to which the BSD website apparently prefers not to provide a link.

Key Findings:

Comparable Achievement Results:

The first area of discussion involves the lack of distinction between the treatment and control groups on the two primary outcomes measures—the ITBS and the BAM. These data suggest that mathematics skills were fairly equivalent across the treatment and control group, and that

**CMP2 did not offer any distinct advantage for students when compared to using the existing curriculum in their schools.** The one finding that was significant, the rate of increase on the BAM for treatment students in comparison to control students, was not a large enough effect to withstand a more conservative statistical test.

-----------------------------

This study was paid for by the CMP2 publisher.

This was done by the Claremont Graduate University.

Are publisher-funded researchers unbiased?

You tell me.

The end of the conclusion says:

However, we are optimistic that if there are true differences for students in CMP2 classrooms compared to those using other curricula, we will be much more likely to observe these differences after two years.

**We expect that as teachers become more familiar and more skilled at using the CMP2 curriculum, their students will similarly show the effects of this improved skill.**

-----------------

The study fails to mention which K-5 materials these sixth graders used previously.

The control groups used three different sets of materials depending on where students were enrolled. These are not mentioned by title.

Perhaps the headline should be CMP2 unable to beat Brand X, or Brand Y, or Brand Z.

## 9 comments:

This is a moral dilemma? If the researchers don't say something positive, then they won't get funded. On the other hand, the public gets misled into adopting a hack's textbook.

Exactly ... let Seattle take a big EDM bow. The school directors just want to vote without thinking ... only reading what the publishers feed them.

What makes you think Beaverton SD wants to make objective decisions? It is my experience as a former customer of the Beaverton SD that the decision drives the process. When I presented them with data from the publisher showing which California Algebra standards were not covered and asked for data showing how they would cover those topics, I was met with only verbal assurances that they would be covered. No curriculum maps or hard data. They want reform math, and they won't let a little thing like data and objective analysis stand in the way. Of course, this is all my opinion.

CMP2 is CMP 1 with math facts studded in it like nuts in a fruitcake. Some of these facts use symbols and concepts (like the square root sign) found nowhere else in the texts. It is obvious that these have been stuck in to satisfy negative comments about CMP 1. They won't teach anything to anyone who doesn't already know the subject. I would guess that if the kids in the study showed any increases, it was due to sympathetic teachers who pulled out the math facts and put them in a context an ordinary kid could understand.

Christi - Has anyone else followed up on the school board member who was also a math teacher in the district? This was my experience in another school district. Only there it was the former math department head, turned Core-Plus textbook consultant, pulling the budget purse strings with his principal. His cousin was the director of the MSP pushing for reform - formerly from the AAAS and the Coalition of Schools. Reform is massively inbred.

The only board member I know of who was a former teacher has a wife who currently teaches math; the board member himself did not. His wife is the math coach for her school and was on the math cadre for the district.

We uncovered that the TOSA leading the previous adoption founded a non-profit that did reform-based staff development. She recommended the non-profit to the district for training and encouraged the district to join an NSF grant with the non-profit and a university for reform-based staff development. All this before the adoption was made.

Yes, reform is massively inbred. And the same disastrous results breed like crazy.

As so often happens, we see NO intelligence used.

Christi has nailed it:

The (pre-made) decision drives the process.

Until relevant data is intelligently applied, there will be little if any improvement.

It will not be looking good ... until a lot of school board members either step up or are removed.

Just curious - who are the math coaches from the Dana Center leading Beaverton's reform? Are they from Oregon?

Anonymous,

I left Beaverton almost two years ago, so I don't have much information on the current adoption process. It wouldn't surprise me to hear of more tangled associations. I am not sure BSD learned anything from the issues we pointed out during the last adoption.
