Monday, June 9, 2008

Draft #2 of IMR --- with feedback

Executive Summary

The purpose of this report is to document the recommendations of the Mathematics Instructional Materials Review (IMR) Advisory Group and other key stakeholders, including the State Board of Education Math Panel, and to outline the expected process for reviewing mathematics instructional materials.

The major task at hand is to select texts that will allow Washington students the opportunity to become internationally competitive in mathematics.
The work of these key stakeholder groups is crucial to the success of the instructional materials review project. OSPI has committed to an inclusive process that actively solicits information and advice from many stakeholder groups. Why would many stakeholder groups be knowledgeable about math textbooks and other instructional materials? It is essential that the review methodology and process measure the appropriate factors and take into account a broad range of instructional materials-related criteria that contribute to effective teaching and learning. Ultimately, OSPI will recommend three core/comprehensive mathematics texts at the elementary, middle and high school levels. It is imperative that the process, evaluation and final recommendations support the success of all students in the Washington K-12 system.

The process has clearly once again become the goal (so common with OSPI: process trumps content again). The selection of the best math materials for Washington students is at best a secondary priority.


Expected Outcomes

The following expectations guided the development of the criteria.

• Examine criteria and processes used by other states and districts within Washington to review mathematics textbooks. Successful? How is this success determined? Are we looking at improved measurable outcomes? If so, how were these outcomes measured?

• Identify 5-7 general categories for use in the OSPI Instructional Materials Review that are well defined and based upon accepted research, such as the National Mathematics Advisory Panel Foundations of Success Report.
What does the term "accepted research" mean?

• Identify specific, measurable, efficient and valuable criteria within each of the categories to help us determine which three texts to recommend at the elementary, middle and high school levels.
How will "efficient" and "valuable" be determined?

• Identify why the criteria are important to measure, and develop ideas on how to measure those criteria.
What criteria are of value and how is this determined?

• Recommend relative weights for selected categories. Recommend threshold categories that curricula must pass before being considered further.
How is the weighting determined?

• It is vital that the selection criteria be unambiguous and measurable, and that relative weightings be defined for the categories. The evaluation criteria should be straightforward and, where possible, mapped to approved standards (e.g., WA Revised Math Standards, NMAP). The instructional materials review process should include math educators, curriculum specialists, mathematicians, parents and industry representatives.
If such a collection of individuals should comprise the IMR review process, why was there no such diversity present on the IMR criteria panel, which was selected privately and unilaterally by OSPI without even public awareness of this team’s existence? This is a huge problem if the public is to believe that there is open government.


Figure 1. Content Alignment is the key consideration in the Washington Mathematics Instructional Materials Review.
This is really vague. Aligned with what? The three primary source documents? These do not even align with each other. Does alignment need to be at the specific grade level?
Stakeholders recommend that the review process be structured in two parts.
What stakeholders? This entire process has been almost exclusively IMR criteria team thus far.
• Part 1 is Content/Standards Alignment. All curricula must meet a certain (to be determined) threshold (When will this be determined? By what person or group? Based on what alignment?) in order to be considered further as one of the three recommended core/comprehensive texts.
• Part 2 will consider other criteria (determined and weighted by what group?) in five categories. Note that the second stage categories have different weights, as shown in Figure 2.

Research Overview

Research
One of the goals of the Mathematics Instructional Materials Review Project is to design a review process and methodology that is grounded in current research and utilizes practices gleaned from other states and districts that have recently completed successful mathematics instructional materials review projects. (successful as judged by what tool or data?)
Stakeholders used (used, as in past tense; who were these stakeholders that have already used them?)
the following research and publications to finalize the category and criteria recommendations.
• Washington Revised Mathematics Standards (4/08)
• National Mathematics Advisory Panel Foundations for Success
• NCTM Curriculum Focal Points

The following additional research and publications were used as secondary sources to inform the process.

• Math Educators Summary of Effective Programs
This is a meta-analysis and of little value.

• Park City Mathematics Standards Study Group Report
Gives specific recommendations based on what works internationally.

• Framework for 21st Century Learning
• How People Learn: Brain, Mind, Experience and School
• How Students Learn: Mathematics in the Classroom
The above three appear to be largely pedagogical – any data on success?

• NCTM Principles and Standards for School Mathematics
• Choosing a Standards-Based Mathematics Curriculum – Chapter 6: Developing and Applying Selection Criteria
• Choosing a Standards-Based Mathematics Curriculum – Appendix: Sample Selection Criteria
The above three played a pivotal role in the USA’s poor performance of the last decade. The Appendix: Sample Selection Criteria were particularly harmful.

I have great concerns about the entire direction of this IMR process thus far. There is a lot of focus on process with inadequate reflection upon the eventual product.
Thus far this bears a great resemblance to the processes that have produced more than a decade of substandard materials. These materials in both Washington State and nationally became more and more aligned with NCTM Standards and the results became more and more unsatisfactory on International tests.

Given that a goal is to produce an internationally competitive mathematics education in Washington State, why is such weight placed on using so many instruments that produced the current failing system? Why put particular emphasis on the desire for a Standards-Based Curriculum? Most of the documents referenced in the Appendix: Sample Selection Criteria produced particularly ineffective programs. It seems likely that the same failures will occur for us if these are used. In particular, the Evaluation Criteria from the U.S. Department of Education’s Expert Panel on Mathematics and Science gave rise to the 1999 Exemplary and Promising Math Programs, which have yet to produce evidence of positive academic results. The 1999 materials as selected were all NCTM Standards-Based and are likely at least partially responsible for the USA’s decline in mathematics internationally. As the US has slipped internationally, it is very apparent that we are the only nation advocating for the positions and practices promoted by Standards-Based thinking. The appendix also makes reference to the San Diego City Schools K-8 instrument, which produced another extremely poor text series. When many schools in California switched curricula to meet the new California Standards adopted in 1998, San Diego City Schools did not. In the Hook-Bishop-Hook study that compared various districts in California, it was very clear that San Diego and LAUSD, which stayed with their earlier Standards-Aligned texts, performed poorly.

Standards-Aligned is NOT synonymous with likely to achieve positive academic results.
It appears that this IMR criteria document, draft #2, reflects a mistaken belief that Standards-Based equates with effective in bringing about internationally competitive results.



In addition to the research review, participants examined mathematics instructional materials review processes and tools from five states (OR, NC, CT, IN and CA) and two Washington school districts (Edmonds, Vancouver). They identified good ideas to adopt from these sample processes and tools.
How did an IMR criteria panel with so little math content knowledge (so little that draft #1 did not include the NCTM Focal Points) determine what constituted a good idea?
A summary of a few of the state comparisons is shown below.

CHART SKIPPED

Proposed Categories
Part 1: Content/Standards Alignment
Part I is a review of the alignment of the core/comprehensive instructional materials to the revised Washington Mathematics standards. Materials that meet a to-be-determined threshold of alignment with state standards could be considered for inclusion in the list of three recommended mathematics curricula.
The Content/Standards Alignment part of the review process would determine to what degree the mathematical concepts, skills and processes are in alignment with the revised state mathematics standards. Reviewers would look for evidence that each Washington State standard (core processes, content and additional key information) is met at the expected grade level.
One of the goals of the Content/Standards Alignment process will be to identify the necessary supplementation for existing materials to meet state standards.
When did this become part of the task required in SB6534?
Part 2: Other Factors
Part 2 of the review process would examine the following categories:
Category Description

Program Organization and Design:
Overall program organization and design. Includes scope and sequence and appropriate use of technology. Content is presented in strands, with definitive beginnings and endings, and not in spirals. The material is logically organized, and includes text-based tools like tables of contents and indexes.

Student Learning:
Tasks lead to the development of core content and process understanding. They present opportunities for students to think about their thinking, develop both skills and understanding, and apply multiple strategies to solve real world problems. Opportunities exist for students to build computational fluency, number sense and operations.

Instructional Planning and Professional Support:
Support for teachers that is embedded in the instructional materials to assist them in teaching the content and standards. Instructional materials provide suggestions for teachers in initiating and orchestrating mathematical discourse. Includes key information about content knowledge to help teachers understand the underlying mathematics. Materials help demonstrate typical student misconceptions and provide ideas for addressing them.

Assessment:
Tools for teachers and students to formally and informally evaluate learning and guide instruction.

Equity and Access:
Support for ELL, unbiased materials, support for gifted and talented students, support for students with disabilities, differentiated instruction (DI is not a best practice. Why is it here?), diversity of role models (I cannot find this in the three primary source documents. This is about building useful mathematical structures in students’ brains; I do not believe that this is in the three primary source documents.), parent involvement, intervention strategies, quality website (again, really vague: what is a quality website and how is quality determined?), community involvement ideas.

Figure 2. Proposed category weights for Part 2 of the Mathematics Instructional Materials Review. Note that Content Alignment is not shown in this chart. Content Alignment is a threshold category, meaning that curricula must meet a to-be-determined percentage of agreement before the material can be considered for possible inclusion in the three recommended core/comprehensive texts.

Proposed Measurement Criteria
For Part 1 (Content/Standards Alignment), there will be a 3-point scale (0-2, corresponding to No, Partial, Yes) for each performance expectation. The criteria are the Washington Revised Mathematics Standards (4/08).

Part 2 will use a consistent measurement scale for each item; a 4-point Likert scale (strongly disagree, disagree, agree, strongly agree). For each of the Part 2 categories listed above, stakeholders identified proposed measurement criteria.
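To make these two measurement scales concrete, below is a minimal sketch (in Python) of how the ratings might be rolled up into a recommendation. The 80% alignment threshold and the Part 2 category weights used here are placeholders for illustration only; neither value is specified in this draft.

# Illustrative sketch only: the threshold and category weights below are
# placeholders, not values taken from the draft IMR document.

PART2_WEIGHTS = {  # hypothetical weights for the five Part 2 categories (sum to 1.0)
    "Program Organization and Design": 0.25,
    "Student Learning": 0.30,
    "Instructional Planning and Professional Support": 0.20,
    "Assessment": 0.15,
    "Equity and Access": 0.10,
}
ALIGNMENT_THRESHOLD = 0.80  # hypothetical Part 1 cut score

def part1_alignment_score(ratings):
    """Fraction of possible alignment points, given per-expectation ratings
    of 0 (No), 1 (Partial), or 2 (Yes)."""
    return sum(ratings) / (2 * len(ratings))

def passes_part1(ratings, threshold=ALIGNMENT_THRESHOLD):
    """A text moves on to Part 2 only if it meets the alignment threshold."""
    return part1_alignment_score(ratings) >= threshold

def part2_composite(category_means, weights=PART2_WEIGHTS):
    """Weighted average of Part 2 category means, each on the 1-4 Likert scale
    (1 = strongly disagree ... 4 = strongly agree)."""
    return sum(weights[c] * category_means[c] for c in weights)

# Hypothetical ratings for one submitted program:
part1_ratings = [2, 2, 1, 2, 2, 2, 1, 2]
part2_means = {
    "Program Organization and Design": 3.2,
    "Student Learning": 3.5,
    "Instructional Planning and Professional Support": 2.9,
    "Assessment": 3.0,
    "Equity and Access": 3.4,
}
if passes_part1(part1_ratings):
    print(f"Part 2 composite: {part2_composite(part2_means):.2f} / 4.0")
else:
    print("Does not meet the Part 1 alignment threshold")

Run as written, this hypothetical program clears the placeholder threshold (14 of 16 alignment points, or 87.5%) and earns a Part 2 composite of 3.22 out of 4.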
Part 1: Content/Standards Alignment
Part 1 will ensure that Washington state math standards designated for the specific course and/or grade level/band are addressed. It will ensure that the mathematics content within the program is rigorous and accurate, with few errors of fact or interpretation. A sample rating form is shown below.

Figure 3. Sample rating form for Content/Standards Alignment Review.
Note to reviewers: I wonder if we should have the Content/Standards Alignment be both a threshold category AND have a weighting like the Part 2 review categories. For example, what if Content/Standards Alignment were weighted at 70% and the other categories distributed across the remaining 30% in rough proportion to the weights proposed in Figure 2?
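As a quick illustration of the hybrid approach floated in this note, the sketch below assumes the 70%/30% split mentioned above, rescales the Part 2 Likert composite to a 0-1 range, and blends it with the Part 1 alignment fraction; the input scores are the hypothetical ones from the earlier sketch.

# Sketch of the hybrid scoring floated in the note above. The 70/30 split is the
# example from the note; the input scores are hypothetical.

def hybrid_score(alignment_fraction, part2_mean):
    """alignment_fraction: Part 1 score as a fraction of possible points (0-1).
    part2_mean: Part 2 weighted composite on the 1-4 Likert scale."""
    part2_fraction = (part2_mean - 1) / 3  # rescale 1-4 Likert to 0-1
    return 0.70 * alignment_fraction + 0.30 * part2_fraction

print(hybrid_score(0.875, 3.22))  # roughly 0.83 for the hypothetical program above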

Part 2: Other Factors
Many of these factors need to be cited within the source documents as to where these factors are found.
Program Organization and Design
1. The scope and sequence of materials matches state standards
2. The content has a coherent and well-developed sequence
3. Program includes a balance of skill-building, conceptual understanding, and application
4. Builds from and extends concepts previously developed
5. Intentional review & application of previously taught skills and concepts
6. Some but not all tasks are of an open nature with multiple solutions (open nature, multiple solutions – where is this found in the source documents?), others with a correct and verifiable answer
7. The materials help promote classroom discourse
8. The program is organized into units, modules or other structure so that students have sufficient time to develop in-depth major mathematical ideas
9. The instructional materials provide for the use of technology that reflects 21st century ideals for a future-ready student and school
10. Support materials provided, such as electronic learning resources or manipulatives, are an integral part of the instructional program and are clearly aligned with the content
11. Objectives are written from the student’s, not the teacher’s, perspective
12. Instructional materials include mathematically accurate and complete indexes and tables of contents to locate specific topics or lessons
13. The materials have pictures that match the text in close proximity, with few unrelated images
14. Materials are concise and balance contextual learning with brevity
15. Mathematics concepts are developed for conceptual understanding, computational fluency and problem solving ability, including how to address non-routine problems
16. The program contains a balance of skill building, conceptual understanding and application
17. The materials include formulas and teach standard algorithms

Note to reviewers: I think there are too many criteria in this category and recommend eliminating those that are either difficult to measure, are redundant with other criteria, or have less value in relation to the remaining criteria. Recommend a target maximum of 10 criteria in this category. 1 & 2 seem redundant; 3 & 4 seem redundant; 3 & 16 seem redundant. 15 and 17 might be accounted for in the Content Alignment/Standards Review. 11 might be best placed in Student Learning. I am concerned about ambiguity and inter-rater reliability on several of these elements.

Student Learning
1. Tasks lead to the development of core content and process understanding
2. Tasks build upon prior knowledge
3. Tasks lead to problem solving for abstract, real-world and non-routine problems
4. Tasks require students to think about their own thinking
5. The program provides opportunities to develop students’ computational fluency using brain power rather than technology
6. Tasks occasionally use technology to deal with messier numbers or help the students see the math with graphical displays
7. The program promotes understanding and fluency in number sense and operations
8. The program leads students to mastery of rigorous multiple-step story problems
9. The materials build students’ understanding of standard mathematics terminology/vocabulary

Instructional Planning and Professional Support
1. The instructional materials provide suggestions to teachers so that in tasks and lessons teachers can help students to:
a. Access prior learning as a foundation for further math learning
b. Learn to conjecture, reason, generalize and solve problems
c. Connect mathematics ideas and applications to other math topics, other disciplines and real world context
d. Develop a responsibility for learning and self confidence
2. Instructional materials provide supplementary mathematical content knowledge to support teachers
3. Background information is included so that the concept is explicit in the teacher guide

4. Instructional materials help teachers anticipate common student misconceptions
5. The instructional materials identify typical student misconceptions
6. The materials can be used by a wide range of teachers with different teaching styles
7. The materials support a balanced methodology including direct instruction, example-based instruction and discovery
8. Math concepts are addressed in a context-rich setting (giving examples in context, for instance)
9. Teacher’s guides are clear and concise with easy to understand instructions

Note to reviewers: Multi-part items are difficult to deal with in scale design, and are subject to variable interpretation. Consider simplifying #1. Numbers 2 & 3 seem redundant. Numbers 4 & 5 seem redundant. Numbers 6 & 7 seem redundant.
Assessment
1. The program provides regular assessments to guide student learning
2. There are opportunities for student self-assessment of learning
3. Assessments reflect content and process goals and objectives
4. The program includes assessments with multiple purposes (formative, summative and diagnostic)
5. Assessments include multiple choice, short answer and extended response.
6. Rubrics or scoring guidelines accurately reflect learning objectives
7. Rubrics or scoring guidelines identify possible student responses both correct & incorrect
8. Accurate answer keys are provided
Equity and Access
1. The program provides methods and materials for differentiating instruction (students with disabilities, gifted/talented, ELL, disadvantaged)
2. Materials support intervention strategies
3. Materials, including assessments, are unbiased and relevant to diverse cultures
4. Materials are available in a variety of languages
5. The program includes easily accessible materials which help families to become active participants in their students’ education (e.g. “How You Can Help at Home” letters with explanations, key ideas & vocabulary for each unit, free or inexpensive activities which can be done at home, ideas for community involvement)
6. The program includes guidance and examples to allow students with little home support to be self-sufficient

Next Steps
These steps remain:
1) Solicit feedback from reviewers on Draft #2
2) Clearly define the process for how OSPI will select three programs to recommend. Will it be based solely on the defined criteria, or will there be some level of review/decision making by a panel?
3) Review each measurement criterion with the following framework in mind:
a) Is the criterion measurable using professional judgment and/or available evidence? (measurability)
b) Is the criterion clear enough to avoid multiple interpretations? (specificity, variability)
c) How will this help inform our recommendation process? (value)
d) Is the criterion in the most logical category? (organization)
e) Does each criterion cover just one concept? (atomicity)
4) Remove redundant criteria
5) Test the draft review tool

Bibliography
(To be completed in 3rd draft. It will contain bibliographic references to primary and secondary sources in the Research section.)

Blue comments by Dan Dempsey

8 comments:

  1. If you want to help students learn math, it is important to know how students think. I am a math professor at Rider U. See the new book on amazon.com: "Teaching and Helping Students Think and Do Better".

  2. This controversy is not about students, it's about poorly written textbooks and careless research on curriculum that is then pushed into schools without regard to what the community wants.

    It is a one-sided debate and furthermore people are being prevented from earning high school diplomas.

    You are Dr. Aramoff? And from the threads linking back to your homepage, my guess is that you support the Math Forum and are probably a friend of both Michael Goldenburg and Phil Daro.

    Tell them they should go open a Core Plus textbook and see if they can decode for themselves what the authors are attempting to teach students. Are their books really supposed to model how students think?

    Look inside an alternative program where I live in Washington and you will be dismayed by what you see.

    It is a pit seething with neglect, and all these children have to learn math with is a Core Plus textbook. No one has ever graduated from this school or passed math with it. A third of our children drop out of normal school. We have teenagers living in the streets and some of the highest teenage pregnancy rates in Washington.

    If your kind want to help, convince the NCTM, NSF, and MAA they should adopt more than a vision; if they had any courage or honesty they would adopt Singapore Standards - a vision with curriculum to go with it. Americans are waiting for an apology, not more empty rhetoric.

    You teach gifted children; what does that have to do with the books that other children use? You have the luxury of choosing what you want to teach and who you teach. I'm sending children to college whose families have never had that opportunity. I'm helping homeless children escape poverty. You and your friends have some nerve and should be ashamed of yourselves.

  3. Actually, since we're back on track with Mike - here's his explanation for why the negative of a negative is positive. I didn't know his training was literature - gee, he must have gotten into math when he signed on with Diane Briars in New York, and look, now he's on the NSF grant committee.

    As a non-mathematician mathematics educator (my original training was in literature; I 'defected' to math ed in the early '90s), my approach to such questions is fairly informal. One way I like to think about signed number operations is on the number line. I also try to look at the
    underlying connections between arithmetic operations (addition repeated = multiplication; subtraction repeated = division; addition and subtraction are inverse operations, hence, so are multiplication and division). Once kids can see how addition and subtraction of signed integers work on the number line (or by modeling with different colored tokens, which works nicely for demonstrating why subtracting a negative is the same as adding a positive), you can show multiplication on the number line as follows: start at 0...


    Here's a quote from Eric Hart (CPMP) - regarding NSF funding cutbacks back in 2006

    Article: "At a May 3 hearing of the House Science Committee, members of both parties said they saw recent cuts to the math and science education portion of the NSF's budgets as a particularly troubling sign."

    Troubling because:
    Article: "curtails innovation and increases the potential for
    political interference in school research" ... instead of giving "a
    stronger role for the independent federal agency [NSF]."

    The Dept of Education tends to be politically motivated, while the NSF tends to be independent and scientifically motivated. Also, check out the research. There is a lot of research indicating the success of NSF-funded programs.


    This is what inspires us - the type of bs that goes on between professors.

  4. This is a blog organized by parents in Columbia, MO - home of the ShowMe Center.

    http://maththatworks.blogspot.com/2008/04/investigations-fall-short-of-nctm-focal.html

    http://www.showmenews.com/2007/Mar/20070318Feat002.asp

    In September 2006, the NCTM printed Focal Points, which outlined important math topics that should be covered by eighth grade. The curriculum document also highlighted the importance of learning computation.

    "The spirit of the Focal Points is almost diametrically opposed to the spirit of Investigations," said Harvard mathematician Wilfried Schmid, a member of the National Mathematics Advisory Panel and vocal opponent of reformed math.

    "Virtually any other curriculum would be an improvement over Investigations," Schmid said in an e-mail to the Tribune.

  5. Wow!! This is yielding a goldmine of information.

    I want to give you a short answer to your concerns:
    Stick with traditional Math.

    Here is a site that I developed for the University of Missouri. The URL is http://mathonline.missouri.edu. Students who score above 75% on a specific test (Algebra, Geometry, Trigonometry, Calculus readiness) would be judged as proficient in that subject. I do not think that the majority of the students taking Integrated Math will score well on these tests. Get your children to practice often on this site before they take the ACT or SAT.
    ~Professor Elias Saab, University of Missouri

    http://www.claytonmathmatters.com/professorFeedback.html

    The quality of secondary math education in the United States has long been a standard joke among mathematicians around the world. It is very painful to see how the most powerful country trails behind European, Asian and third-world countries in the level of mathematical skills of high school graduates. This concern surfaced in the most recent State of the Union address, so it is clearly getting national attention.
    In my opinion, the major reason is that, unlike in many countries, algebra is not included in the middle school curriculum in the United States. Algebra was invented a few thousand years ago as the universal language of mathematics, allowing one to avoid lengthy word explanations of mathematical procedures and making mathematical studies logical and connected. It is essential that students start learning algebra as early as the 5th grade, which brings mathematical formulas to them naturally later in their lives. It is still possible for the best students to start algebra in the 8th grade and be successful, but in general American students continue looking at mathematics as a foreign language. I think the switch to starting algebra in the 5th grade must be made as soon as possible, by adopting a system used in one of the countries like Russia, France or Germany. American students are as smart as anybody, and they and their teachers will quickly adjust to this system.
    The so-called Connected Mathematics Project, Integrated Math and other recent innovations are even worse than the "traditional" American system. They further water down the curriculum and leave the students largely unprepared for college mathematics. Instead of bringing our children back to the level of "ancient Greeks playing with stones on the beach", American math education must quickly switch to the most advanced methods of mathematical learning.
    Sincerely,
    Alexander Koldobsky
    Leonard M. Blumenthal Distinguished Professor of Mathematics
    University of Missouri-Columbia

    Thank you, I concur 100%. Successful road-building societies introduce algebra at an earlier age. The US integrated math movement is reversing that trend and endangering the future security of our nation. All their research amounted to costly propaganda.

  6. Everyday I learn something new - Phil Daro is an English Major

    Phil Daro is a polymath - an English major, he became a research methods and evaluation expert at Berkeley. He moved through the California State Department of Education to lead that State's professional development in Mathematics as Director of the California (and American) Mathematics Projects. He has long been interested in the processes of educational change, and in the role of assessment in forwarding it. He helped devise the Balanced Assessment Project, and is one of its directors.

    His advice is greatly in demand nationally. Among recent responsibilities, he was co-chair of the Task Force advising on the revision of the California Mathematics Framework.

    He is now Executive Director of the New Standards Project, which provides the most wide-ranging set of performance assessment resources at Grades 4, 8 and 10 that are currently available across the US.

    Phil Daro is a co-director of the MARS, with particular responsibility for strategy and systems issues.

    So Michael and Phil share something in common.

  7. And who does the independent evaluation studies for MARS?

    Inverness Associates (Mark St. John)

    Evaluation Evidence: Overview
    This section is a guide to the independent external evidence on the quality of MARS work, and the products and processes it designs and develops. The linked pages offer more detail, in layers down to raw data, so that you can probe as far as you may wish. The structure and content of most of these evaluation pages is under the control of Inverness Research Associates (inverness-research.org) (IR), the independent evaluation consultants for MARS' work. If asked, they would be happy to respond to further enquiries, as will many of the referees from MARS client systems, whose email addresses are given with their reports.

    Who does the on-site evaluations for the School Improvement Plans in Washington State?
    Inverness Research Associates


    Stokes, L., St. John, M., Helms, J., and Maxon, D. (2004, July). Investing in a Teaching Leadership Infrastructure for Washington Education: A Summative Assessment of Washington Initiative for National Board Teacher Certification. Inverness, CA: Inverness Research Associates.

  8. Washington State Science Standards:
    An Independent Review
    Final Report
    Submitted
    May 7, 2008
    PROJECT TEAM
    David Heil
    Rodger W. Bybee
    Harold A. Pratt
    Kasey McCracken

    Guess who was on the review panel?
    Mark St. John (IR)
