Seattle Schools Math Adoption Committee Work
I attended the Math Adoption Committee meeting a week ago. I stayed about 2 hours (out of a daylong meeting). I didn't have high hopes at the beginning as the meeting seemed disorganized and slow to start. The meeting was monitored by staffers Anna Box and Shauna Heath.
They first went over the discussion from the Board-requested Work Session on the math adoption process that had occurred the day before. (More on that in a minute.)
The next meeting of the Math Adoption Committee is tentatively set for April 25th (I did not hear a reason they were skipping March). As soon as they announce their final selections, they want public comment. There will be viewing of materials at five schools (one in each region) and one library. These are Arbor Heights, South Shore K-8, Coe, Northgate and, for the Central region, Douglass-Truth Library.
It was noted by staff that they have about $1.5M for this math adoption.
One Committee member said so much time had passed since reading each curriculum that she herself would have to go back and review the final 3-4.
What followed was impressive to me. The Committee went through a long process of discussion over how to rank and score the final choices. Two members had worked out a process, and then the entire committee walked through it with comments. I was very grateful for their willingness to take this time and their desire to make sure they could document their choices on each vote. They very clearly want the public to know how the narrowing down happened.
My one final comment on the meeting is that I hope the district will take their work and use it as a template for future curriculum adoptions (or other types of activities for other committees/taskforces). While I appreciated their hard work, I couldn't help thinking that this step alone took nearly an hour and a half. I hope no one else feels compelled to reinvent the wheel the next time this kind of voting takes place in SPS.
Below is a report by two people, one who is close to the community and one who is a community member. These are their thoughts and reporting, but the report is quite detailed and may offer some clarity about what is happening on the committee.
Start of report
An emergency board work session was called for Feb 6, 2014, requested by Director Sharon Peaslee to discuss board concerns with the math adoption process. Board directors had submitted a list of specific questions for district staff, and that list formed the meeting agenda, although the majority of discussion focused on several specific questions, listed below.
On February 7, 2014, the K-5 Math Materials Adoption Committee (MAC) convened a regular all-day meeting. Summary notes for the MAC meeting follow the summary notes for the emergency Board session. Notes for both meetings are organized by theme or question, rather than by order of discussion.
Feb 6, 2014 Emergency Work Session
Present: Directors Peaslee, McLaren, Carr, Patu, Peters; Charles Wright, Deputy Superintendent; Michael Tolley, Assistant Superintendent of Teaching and Learning; Shauna Heath, Executive Director of Curriculum, Assessment and Instruction; Anna Box, Mathematics Program Manager; Adam Dysart, Mathematics Curriculum Specialist; Eric Caldwell, Manager, Instructional Technology; Shawn Sipe, Instructional Materials Specialist
Question: Why were four programs brought forward?
Directors’ concerns: All prior documents and status reports provided to the board had set the expectation for 3 or 2 programs to be brought forward for final screening, but now there are four finalists. Why was this changed, and who made this change?
Discussion: Programs #3 and #4 were scored as a statistical tie by the Math Adoption Committee (MAC). A tiebreaker had been proposed by a MAC subcommittee but not voted on by the full MAC, so it could not be used as a formal part of the process. Directors emphasized that not identifying a formal process to bring forward 3 or 2 programs was a major oversight. District staff were largely non-specific in addressing this concern. Directors were dissatisfied with the staff's explanation of the process, the incorporation of community input, and the MAC's role in the choice to select four finalists. They asked that the MAC explicitly be given the opportunity to choose 2 or 3 programs at their subsequent meeting.
Follow-up: At next day's MAC meeting, the facilitator asked “Are you o.k. with process we did to get to decision of four, and/or would you like a new process to talk about going to two or three?” The comments indicated broad support for keeping four. The facilitator then called for a vote on this question, and the MAC voted nearly unanimously to keep the four.
Question: Why was community input not incorporated into the process for Round 1?
Directors’ concerns: The initial program screening results and rationale included tabulated scores from 122 in-person and 140 online community reviews. District documents claimed that community input was used in deciding which programs moved forward. It was mentioned that the district was sued over the last math adoption due to community input oversights, and nobody wanted to repeat that.
Discussion: This was the most extensively contested issue of the meeting. Multiple directors reaffirmed their commitment to community engagement and incorporating the community voice into the program selection process. Based on the results provided and discussion at the meeting, it was evident that community input was not explicitly used in determining the finalists. Rather, it was stated that the “community input supports the promotion of the top three programs in this list” [of the four programs brought forward]. The proposed (but not approved) MAC tiebreaker proposal would apply community ratings to resolve any scoring ties from the MAC results. Had this proposal been voted on and approved by the MAC, it would have resulted in the use of community feedback to narrow the MAC top four programs to three. Director Peaslee was very concerned that this work [to develop a tiebreaker proposal] took place and was not approved, essentially negating the community input received. District staff reaffirmed that extensive feedback was collected and tabulated in stage 1 of this process, and this feedback was shared with the MAC as part of the rationale for program selection. Director Peters stated that having the community input taken into consideration but not quantified is going to be a very hard sell to the community. This issue was never fully reconciled during the meeting, but the message was clear to the district staff that future work by the MAC needed to incorporate community input in a predetermined and transparent manner, including in-person and online submissions.
Follow-up: At next day's meeting, the committee developed the general details of a community input plan for the 2nd round, and which addressed the content of the input form, when/how the input would be communicated to the MAC, how the MAC would use the input, and how that MAC would document the use of the input. The committee approved the general details through a vote. The committee also approved a process for finalizing the details of the community input plan.
Question: Why was JUMP Math dismissed on the grounds of being incomplete?
Directors’ Concerns: JUMP Math was highly regarded in the community review, but was not included in the top 4 programs. The community will ask why JUMP was even reviewed if it was not complete.
Discussion: The eight programs evaluated by the MAC were submitted by publishers based on a predefined set of submission criteria. Materials submitted were not reviewed by district staff for completeness, only by the committee based on their approved first round screening criteria, which included alignment to the Common Core State Standards (CCSS). Since JUMP Math did not have all grades of its CCSS edition completed at the time of the district’s review process, it potentially scored lower in multiple categories.
Follow-up: JUMP was not discussed at the next day's MAC meeting. All materials other than the top four had already been removed from the JSSC library by the time the MAC assembled there for that meeting.
Question: Why was data tabulated after the deadline, switching the ranking order?
Directors’ Concerns: The MAC screening process produced two separate results and rationale documents (Jan 10 and Jan 15). These documents were very similar, but in the Jan 15th document the 3rd and 4th ranked programs switched positions. Allowing additional inputs following the release of results is not scientific or objective.
Discussion: All MAC members were directed that results had to be submitted by Jan 10th. During initial tabulation, several entries were undecipherable, and these were scanned and e-mailed to the rater for clarification. Additionally, one rater who did not have access to a scanner hand-delivered results on the 11th. The ranking order changed because the last two programs were so close that they flipped easily; this was the reason for selecting four programs to move forward. Director Peters stated for the record a significant concern that one program has benefited from all of these process issues: the elimination of JUMP Math from consideration, the bringing forward of four programs, and the retabulation all benefited My Math.
Question: Why is benchmarking not being provided?
Directors’ Concerns: Policy 2015 stated that adopted materials must be based on best practices and research including benchmarking from similar districts. Will the committee be considering performance data on finalist programs?
Discussion: District staff have considered possible data sources, and invited the MAC to look for benchmark data, but no truly relevant data is available at this time, since the Common Core State Standards are so new and the assessments are not in place yet. Director Peters reaffirmed that she didn't believe the purpose of this adoption is for students to pass a Common Core test; if our kids learn math well, they can pass any test.
Follow-up: The first round screen does not include benchmark data. At the next day's MAC meeting, staff did not share with the MAC that the Board had asked about benchmark data, and no member or staff made any proposal to make benchmark data a factor in the 2nd round selection process.
Question: Why is cost data not being provided?
Directors’ Concerns: One of the formal components of the recommendation from the adoption committee is an assessment of the initial and ongoing cost of the adoption. Will the committee be considering this? Director McLaren referenced some program cost spreadsheets for JUMP Math which were part of the materials display at JSCEE and asked if the other publishers submitted cost estimates.
Discussion: District staff indicated that pricing is typically not shared with committee members in the first round. A subcommittee on the MAC has identified critical components for each program, and procurement is requesting pricing for these items. This data will be shared with the MAC and they will take it into consideration as part of their final program selection.
Follow-up: There seemed to be some uncertainty at the next day's MAC meeting as to whether policy allowed cost data to be used in the selection process, and at what stage in that process. The 2nd round as discussed includes two stages: the scoring stage, where programs are initially ranked by MAC members, and the deliberation stage, where programs are successively eliminated. Since cost does not already appear as a criterion on the screen (new criteria cannot be added to the screen), it appears the MAC is barred from using cost data in the scoring stage. The MAC approved a process for the final stage of the 2nd round that allows members to modify their ranking of curricula during deliberation, if they include the justification. The MAC took no vote on how or when to use cost data, but there was some discussion that it might be considered during this deliberation stage. District staff indicated that the MAC is responsible for coming up with an “essentials” list, wherein the MAC will specify which published elements of the curriculum are “essential.” The MAC seems to have left open the possibility that, after they pick their final candidate, they will discover that the cost of what they deem essential for that finalist curriculum exceeds the adoption budget by an inordinate amount. Incidentally, the MAC was promised that cost data would be provided before the meeting closed, but this did not happen.
Question: Are evaluation criteria being changed after the initial review of textbooks?
Directors’ Concerns: The policy dictates that selection criteria are to be determined before any programs are evaluated. Now that programs have been evaluated and selected for final review, what are the directives to the MAC regarding these criteria? There has been some mention of “non-negotiables,” which are not part of the existing first round criteria. Opening up the criteria for change after the programs have been evaluated heightens the risk of manipulating the process or altering the rankings, which is why the policy is written as it is.
Discussion: District staff reiterated multiple times that the MAC isn't changing any criteria; it is adding detail to existing, approved criteria so that the MAC can take a deeper look in the 2nd round. It was confirmed that changing the criteria at this point without IMC approval would be a policy violation. Per staff, the committee is not seeking to change criteria in stage 2, but rather to add detail and provide background and context. A key element of CCSS is focus, and some of the criteria relate to focus; in the 2nd screen, staff are looking to add concrete figures, such as requiring that 80% of content focus on a given subset of topics. In staff's framing, this is adding detail, not changing criteria: if the MAC used exactly the same selection tool in the second round, they could not get a deeper look at the materials. It was clarified that the numerical guidance for this process comes from a document published by Achieve The Core to help districts check alignment. Directors questioned whether this had been discussed and approved by the full MAC, and were informed that criteria review would take place the following day, Feb 7, at the next MAC meeting.
Follow-up: At the next day's MAC meeting, the prohibition against changing the screen was duly noted. The MAC voted overwhelmingly to keep “Student Friendly” as its own criterion, even though it was not a criterion in the 1st round screen. Apart from “Student Friendly,” proposed edits that appeared on the draft screen as new criteria were moved under approved criteria, so that they would not appear to be new criteria. A proposal to make some criteria non-negotiable was discussed at length but voted down. The MAC did not vote on any of the other proposed edits to the screen, but did agree on a process for finalizing the screen through email, hoping to finalize it in time to submit to the IMC for approval.
Question: Will a program be rated lower if it introduces a topic at earlier grades than called for in CCSS?
Directors’ Concerns: In some cases, the CCSS in mathematics are actually lower than our current standards, so if we are faithful to CCSS we may be lowering our existing standards. Is there room for curricula which are more rigorous, introducing topics earlier than called for in CCSS? Do we automatically disqualify materials that go beyond CCSS?
Discussion: There was some ambiguity in this discussion as presented by district staff, identified by both directors and Charles Wright. The policy states that applicable standards are a “minimum level of rigor.” Shauna Heath stated that the state requires us to implement CCSS, but through professional development we hope to teach beyond the standards. Adam Dysart implied that programs containing content not exactly aligned with CCSS would be scored down. The MAC selection criteria include alignment, assessment, and coherence. Alignment and assessment are grade-specific, while coherence is less tied to grade level; coherence is more about building the mathematics sequentially. Director Peaslee concluded she would hate to see fidelity to CCSS weighted higher than accessibility of materials to disadvantaged students. This is a major concern shared by several board directors: tying the selection criteria very heavily to CCSS, without sufficiently taking other considerations into account, puts us at risk of adopting a curriculum that aligns to CCSS but doesn't address the problems we are struggling with now.
Follow-up: At the next day’s meeting, there was no mention by staff or facilitator of this concern. There does not appear to be any element of the screen and proposed edits that would necessitate lower scoring of a curriculum that introduces some topics earlier than called for in Common Core.
Question: How is accessibility being incorporated into the selection process?
Directors’ Concerns: Accessibility criteria in board policy are critically important. Has accessibility by ELL and SpEd students been communicated to the MAC explicitly? Directors Carr and Peaslee in particular made it clear that they will not vote for a candidate program that is text-heavy. Director Patu emphasized the need to realize a lot of kids of color and ELL kids learn differently, and with the math adoption that we have right now, a lot more are failing.
Discussion: The concern about language-heavy programs being inaccessible to many of Seattle’s struggling students was brought up repeatedly. The Directors present were emphatic that any program adopted must be accessible; the Curriculum and Instruction Committee made this point very clearly prior to this adoption, even writing it into the policy. Eric Caldwell described the district’s cultural responsiveness criteria document. Director Peaslee stated the documents were vague on this point: many ELL kids are trying to learn and get support on the math at home, and those documents don’t address that. The point was that ELL students and students weak in reading and writing should still be able to access the materials, and that their families, who may also be ELL, should be able to see what math is being taught without having to read multiple paragraphs of anecdotal text before reaching the math content. We need texts that make it easy for kids of any educational or cultural background to learn the math they are required to learn. Shauna Heath committed to making sure this is clarified to the committee, and Director Carr recommended that the clarification be made before the next meeting (the following day). Director Carr added that this process needs to satisfy the requirements and expectations of the board, so whatever steps need to be taken, (a) to ensure the integrity of the process and (b) to address the concerns heard from the board, are imperative. Director Carr stated that there is a range of viable math programs, from text-heavy to text-light; if the committee recommendation comes in closer to text-heavy, it is going to run into trouble when it comes time for the Board to vote.
Follow-up: At the next day’s meeting, staff’s representation of the Board’s accessibility concern went little beyond telling the MAC that accessibility was very important to the Board. The most explicit mention came from Adam Dysart, who stated, “What I heard very clearly from the Board last night is this issue about students being able to access the mathematics. They expect various students to be able to access the math. This is important to them.” There was no mention of the “text heavy vs text light” concern. Arguably, staff and facilitator did not get across to the MAC the message that the Board considered it imperative that the MAC address the Board’s various concerns. There was, however, considerable attentiveness to integrity of process during the meeting.
Summation of Comments:
All attendees expressed their appreciation for the work underway by the MAC and the dedication of the individual members. Shauna Heath emphasized that process challenges and transparency questions may cause some uncomfortable discussions, but they ultimately raise the whole adoption and all the participants to a higher standard, which is a positive outcome.
Meeting concluded at ~8:30 PM
Math Adoption Committee (MAC) meeting on February 7, 2014, 8:30 am – 3:45 pm at JSSC.
Text appearing within brackets is added editorially for clarification. The information reported here is presented in process order, rather than in the order in which the topics were discussed at the meeting. The facilitator is an independent consultant named Barbara Grant.
Overview of processes agreed on.
1. Members must finalize their community input form and their 2nd round screening tool. The screen provides the criteria and weightings that members will use to develop scores for curricula.
2. Scoring stage of 2nd round: Members will work individually (or possibly in grade-band teams) to develop scores for each candidate curriculum. Members will be provided with all of the qualitative community input, so that they can allow it to influence their scoring to the extent they choose.
3. Elimination stage of 2nd round: Shortly before the meeting starts (probably not more than a few days in advance), members will be provided with the tabulation of the quantitative community input (the tabulation will produce an aggregate community ranking of curricula). All members will meet to share their views on curricula, community input, and possibly cost data. Through deliberation, non-binding straw votes, and three binding votes, the committee will successively eliminate three candidates. The last remaining candidate will be the finalist.
4. The recommendation forwarded to the Board will include documentation of changes to scores (or ranks) during the elimination round, and documentation that each member had access to the community input data and took it into consideration.
Questions from the Board
The facilitator invited MAC members to comment on (1) whether they had adequate opportunity to think about and consider community feedback, and (2) whether they had enough opportunity to vote on the tiebreaker. The suite of comments in response to these two questions did not provide a clear answer, and the facilitator did not call for any vote on them. The facilitator then asked for comments on a third question, (3) “Are you o.k. with process we did to get to decision of four, and/or would you like a new process to talk about going to two or three?” The comments indicated broad support for keeping four. The facilitator called for a vote on this question, and the MAC voted nearly unanimously to keep the four.
Questions about public display of materials: when, how many sites, what sites, extended hours; will on-line input be allowed?
Policy requires a minimum of five locations at which the materials will be used and that input be allowed both on paper and on-line. The materials will be on display for about three weeks in the spring, around early April. Staff have picked five elementary schools and one library, aiming for appropriate geographic coverage of the school district. Staff noted that some MAC members have been advocating for more public library sites [in order to improve public access to the materials geographically and on evenings and weekends], and that some MAC members had already made enquiries and gotten informal commitments from a couple of public libraries for material display. Staff asked the MAC to let staff handle arrangements and decisions for display, using the rationale of tradition and relationships rather than policy to justify that position. The exception is the question of which languages need to be displayed, which staff said is a MAC decision (this question still needs an answer).
What will be on the community input form?
Staff communicated to the MAC that, while policy requires the MAC to take community input, the MAC gets to decide how to do that, but needs to have a defensible procedure.
A subcommittee brought forth two proposals. The full committee discussed these at length, then agreed by vote on the general content and structure of the community input form. The form allows for both qualitative and quantitative input:
1. One person, one form (a single unified form covering all four curricula).
2. Identify role (parent, teacher, student, other).
3. Identify region/school of residence/association.
4. Listing of MAC screening criteria, with online links for more info.
5. Yes/No box for each curriculum, with space for the respondent to explain each yes/no response. Respondents are informed that their yes/no responses will not be tabulated if they omit an answer for one or more of the four candidates.
6. Section for extended free comments.
The committee agreed (by vote) on a process for finalizing the community input form. Finalization will take place via email and SurveyMonkey, and voting will repeat until everyone votes six or lower on the eight-point scale (where 1 = full support and 7 or 8 = unacceptable).
Though not specifically voted on, the notes indicate a general understanding that the quantitative input (i.e., the answers to the yes/no check boxes for each curriculum) would be tabulated by staff and turned into a community ranking of the four curricula.
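For readers who want to see what such a tabulation could look like, here is a minimal sketch in Python. It illustrates only the two rules described above (incomplete forms are not tabulated; yes counts produce a ranking); the curriculum names, form structure, and function name are invented, and the district's actual tabulation method was not specified.

```python
# Minimal sketch of the tabulation rules described above.
# Curriculum names, form structure, and example responses are
# invented; the district's actual procedure was not published.

CANDIDATES = ["Curriculum A", "Curriculum B", "Curriculum C", "Curriculum D"]

def tabulate(forms):
    """Count yes votes per candidate and rank them, highest first.

    Each form is a dict mapping candidate name -> True (yes) / False (no).
    Per the MAC's rule, a form counts only if it answers yes/no
    for all four candidates.
    """
    counts = {c: 0 for c in CANDIDATES}
    for form in forms:
        if any(c not in form for c in CANDIDATES):
            continue  # omitted an answer: not tabulated
        for c in CANDIDATES:
            counts[c] += form[c]  # True adds 1, False adds 0
    # Ties in the resulting ranking would need a rule of their own.
    return sorted(counts.items(), key=lambda kv: kv[1], reverse=True)

# Example: two complete forms and one incomplete form (ignored).
forms = [
    {"Curriculum A": True, "Curriculum B": False,
     "Curriculum C": True, "Curriculum D": True},
    {"Curriculum A": True, "Curriculum B": True,
     "Curriculum C": False, "Curriculum D": False},
    {"Curriculum A": True, "Curriculum B": True},  # incomplete
]
print(tabulate(forms))
# -> [('Curriculum A', 2), ('Curriculum B', 1),
#     ('Curriculum C', 1), ('Curriculum D', 1)]
```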
Consideration of proposals to edit the screen, add non-negotiables, and change the criteria weights.
A subcommittee presented proposed edits to the first round screen. At the start of this portion of the meeting, staff made clear that the screening criteria established in the first round cannot change for the second round, but that refinement and clarification are allowed. Some of the edits looked like new criteria, so these were repositioned as sub-items under the 1st round criteria.
Proposal for “non-negotiables”: The subcommittee advocated that certain statements in the screen be labeled “non-negotiable,” by which they meant that a curriculum that failed to satisfy a criterion deemed non-negotiable would be eliminated from consideration. The committee voted on the proposal to have non-negotiables in the screen; the proposal was rejected.
The committee deliberated the proposed edits, but did not finalize the screen. The MAC agreed (by vote) on a process for finalizing the screen in time to submit it to the Instructional Materials Committee for approval at the IMC’s Feb 11 meeting. [Revisions to the screen must be approved by the IMC.]
It was proposed to keep “student friendly” as its own scorable section [thus elevating this to the level of a criterion]. This proposal was voted on and strongly supported by the MAC.
Staff indicated that the MAC can change the weighting in the 2nd round. The committee discussed changing the weights. One person proposed to allow each member to choose their own weighting, but the argument against this idea appeared to be persuasive to the committee.
MAC discussed a proposal to divide up the screening into grade-level teams, who would schedule time to meet to do their work. A date for grade-level team meetings was discussed. According to the observer’s notes, no decision was made on the proposal.
When will MAC receive the qualitative community input results?
MAC members voted on two alternatives: (a) data provided on a rolling basis (as it comes in during the display period) to MAC members, who will be working on their scoring; (b) data provided in bulk after scoring is done, before the elimination meeting starts. A hand vote showed that the MAC favored option (a) by a strong majority.
How will the qualitative community input data be used by the MAC?
Each MAC member will use the community input data to inform/influence his/her personal scoring of the curricula against the approved criteria. [The agreed process seems to allow members to bring up community input for discussion during the elimination round, if they choose to.]
How will the usage of qualitative community input data be documented?
Each member will check a box on their scoring sheet that says “I reviewed and used community input in the course of making my decision,” or “I had access to and reviewed the community data, and it influenced how I scored,” or something similar. This suggestion was not voted on, but was rather stated by the facilitator as what would be done.
When will MAC receive the tabulation of quantitative community input and the community ranking of curriculum?
The MAC is supposed to get the tabulation of quantitative community input after the display and comment period has ended, a few days before the start of the elimination round.
What will be the procedure for determining the finalist curriculum?
The elimination round is a meeting of the full committee for the purpose of carrying out the following agreed-on process for identifying the finalist curriculum. Each member must finish their initial scoring before coming to the elimination meeting. The scores are converted to ranks (each person’s top-scoring curriculum becomes their Rank 1 curriculum). The elimination process includes non-time-limited deliberation where members openly share their views and discuss cost data and community input. Each person can adjust their scoring or ranking during the deliberation time, as long as they have a criteria-based rationale; the rationale for any score/rank change after the elimination meeting starts must be recorded by each screener. [It’s not clear to the note-taker how the MAC can use cost and community input data to change their scores or rankings, since cost and community input don’t appear on the screen as criteria.] A straw poll (to tabulate provisional ranks) will be conducted anytime a member requests it.
A binding vote eliminates the curriculum that has the fewest members giving it a rank of one. At any point a member can ask for a binding vote; the MAC will then vote on whether to take a binding vote (14 yes votes are required to trigger one). Three conclusive binding votes will result in identification of the finalist curriculum.
This procedure was voted on using the 1-8 consensus vote (where 1 = strongly support, 8 = strongly oppose). Hand vote result: 1 = 14; 2 = 8; 5-8 = 0 [the counts for 3 and 4 were not clearly recorded in the notes].
What about a “None of the Above” option in a binding vote? (That is, what if a member finds none of the curricula satisfactory, and so has nothing to vote for in a binding vote?)
This was discussed fairly extensively, but no decision was made on how to handle this case.
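To make the elimination mechanics concrete, here is a minimal sketch in Python of the rank-one counting rule described above. It models only the binding-vote eliminations, not the deliberation, straw polls, the 14-vote trigger, or a “None of the Above” option; the candidate names and member rankings are invented, and ties for fewest rank-1 votes are broken arbitrarily here, since the notes do not record a tiebreak rule.

```python
# Minimal sketch of the binding-vote elimination rule described above.
# Member rankings are invented; this is not the MAC's actual tool.

def eliminate_to_finalist(rankings):
    """rankings: one list per member, ordering the candidates best-first.

    Each binding vote removes the remaining candidate with the fewest
    first-place (rank 1) counts; three eliminations leave the finalist.
    Ties for fewest rank-1 votes are broken arbitrarily here.
    """
    remaining = set(rankings[0])
    while len(remaining) > 1:
        # Count how many members rank each remaining candidate first.
        firsts = {c: 0 for c in remaining}
        for member in rankings:
            top = next(c for c in member if c in remaining)
            firsts[top] += 1
        # Binding vote: drop the candidate with the fewest rank-1 votes.
        remaining.remove(min(firsts, key=firsts.get))
    return remaining.pop()

# Example with four candidates and five members.
rankings = [
    ["A", "B", "C", "D"],
    ["B", "A", "D", "C"],
    ["A", "C", "B", "D"],
    ["D", "A", "B", "C"],
    ["B", "D", "A", "C"],
]
print(eliminate_to_finalist(rankings))  # -> A (C, then D, then B eliminated)
```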
How will the quantitative community input data be used by the MAC?
The MAC discussed allowing members to use this data as a factor for adjusting their ranks in the elimination round.
They first went over the discussion from the Board-asked-for Work Session that had occurred the day before about the math adoption process. (More on that in a minute).
The next meeting of the Math Adoption Committee is tentatively set for April 25th (I did not hear a reason they were skipping March). As soon as they announce their final selections, they want public comment. There will be viewing of materials at five schools (one in each region) and one library. These are Arbor Heights, South Shore K-8, Coe, Northgate and, for the Central region, Douglass-Truth Library.
It was noted by staff that they have about $1.5M for this math adoption.
One Committee member said they were getting so far away from the curriculum (in terms of time spent on reading each one) that she herself would have to go back and review the final 3-4.
What followed was impressive to me. The Committee went through a long process of discussion over how to rank and score the final choices. Two members had worked through a process and then the entire committee walked it thru with comments. I was very grateful for their willingness to take this time and desire to make sure that they could document their choices on each vote. They very clearly want the public to know how the narrowing down happened.
My one final comment on the meeting is that I hope that the district will take their work and use it as a template for future curriculum adoptions (or other types of activities for other committees/taskforces). While I appreciated their hard work, I thought, this is taking nearly an hour and a half just for this. I hope no one else feels compelled to reinvent the wheel the next time this kind of voting takes place in SPS.
Below is a report by two people; one who is close to the community and one who is a community member. These are their thoughts and reporting but it is quite detailed and may offer some clarity to what is happening on the committee.
Start of report
An emergency board work session was called for Feb 6, 2014, requested by Director Sharon Peaslee to discuss board concerns with the math adoption process. Board directors had submitted a list of specific questions for district staff, and that list formed the meeting agenda, although the majority of discussion focused on several specific questions, listed below.
On February 7, 2014, the K-5 Math Materials Adoption Committee (MAC) convened a regular all-day meeting. Summary notes for the MAC meeting follow the summary notes for the emergency Board session. Notes for both meeting are organized by theme or questions, rather than by order of discussion.
Feb 6, 2014 Emergency Work Session
Present: Directors Peaslee, McLaren, Carr, Patu, Peters; Charles Wright, Deputy Superintendent; Michael Tolley, Assistant Superintendent of Teaching and Learning; Shauna Heath, Executive Director of Curriculum, Assessment and Instruction; Anna Box, Mathematics Program Manager; Adam Dysart, Mathematics Curriculum Specialist; Eric Caldwell, Manager, Instructional Technology; Shawn Sipe, Instructional Materials Specialist
Question: Why were four programs brought forward?
Directors’ concerns: All prior documents and status reports provided to the board had set the expectation for 3 or 2 programs to be brought forward for final screening, but now there are four finalists. Why was this changed, and who made this change?
Discussion: Programs #3 and #4 were scored as a statistical tie by the Math Adoption Committee (MAC). A tiebreaker had been proposed by a MAC subcommittee, but not voted on by the full MAC so it could not be utilized as a formal part of the process. Directors emphasized that not identifying a formal process to bring forward 3 or 2 programs was a major oversight. District staff was largely non-specific in addressing this concern. Directors were dissatisfied with staff explanation of the process, the incorporation of community input, and the MAC role in the choice to select four finalists. They asked that the MAC explicitly be given the opportunity to choose 2 or 3 programs in their subsequent meeting.
Follow-up: At next day's MAC meeting, the facilitator asked “Are you o.k. with process we did to get to decision of four, and/or would you like a new process to talk about going to two or three?” The comments indicated broad support for keeping four. The facilitator then called for a vote on this question, and the MAC voted nearly unanimously to keep the four.
Question: Why was community input not incorporated into the process for Round1?
Directors’ concerns: The initial program screening results and rationale included tabulated scores from 122 in-person and 140 online community reviews. District documents claimed that community input was used in deciding which programs moved forward. It was mentioned that the district was sued over the last math adoption due to community input oversights, and nobody wanted to repeat that.
Discussion: This was the most extensively contested issue of the meeting. Multiple directors reaffirmed their commitment to community engagement and incorporating the community voice into the program selection process. Based on the results provided and discussion at the meeting, it was evident that community input was not explicitly used in determining the finalists. Rather, it was stated that the “community input supports the promotion of the top three programs in this list” [of the four programs brought forward]. The proposed (but not approved) MAC tiebreaker proposal would apply community ratings to resolve any scoring ties from the MAC results. Had this proposal been voted on and approved by the MAC, it would have resulted in the use of community feedback to narrow the MAC top four programs to three. Director Peaslee was very concerned that this work [to develop a tiebreaker proposal] took place and was not approved, essentially negating the community input received. District staff reaffirmed that extensive feedback was collected and tabulated in stage 1 of this process, and this feedback was shared with the MAC as part of the rationale for program selection. Director Peters stated that having the community input taken into consideration but not quantified is going to be a very hard sell to the community. This issue was never fully reconciled during the meeting, but the message was clear to the district staff that future work by the MAC needed to incorporate community input in a predetermined and transparent manner, including in-person and online submissions.
Follow-up: At next day's meeting, the committee developed the general details of a community input plan for the 2nd round, and which addressed the content of the input form, when/how the input would be communicated to the MAC, how the MAC would use the input, and how that MAC would document the use of the input. The committee approved the general details through a vote. The committee also approved a process for finalizing the details of the community input plan.
Question: Why was JUMP Math dismissed on the grounds of being incomplete?
Directors’ Concerns: JUMP Math was highly regarded in the community review, but was not included in the top 4 programs. Community will ask why JUMP was even reviewed if it was not complete.
Discussion: The eight programs evaluated by the MAC were submitted by publishers based on a predefined set of submission criteria. Materials submitted were not reviewed by district staff for completeness, only by the committee based on their approved first round screening criteria, which included alignment to Common Core State Standards. Since JUMP Math did not have all grades of the CCSS edition completed at the time of SSD’s review process, it potentially scored lower in multiple categories.
Follow-up: JUMP was not discussed at next day's MAC meeting. All materials other than the top four had already been removed from the JSSC library by the time the MAC had assembled in that library for their meeting yesterday.
Question: Why was data tabulated after the deadline, switching the ranking order?
Directors’ Concerns: The results of the MAC screening process resulted in two separate results and rationale documents (Jan 10 and Jan 15). These documents were very similar, but the Jan 15th document resulted in the 3rd and 4th ranked programs switching positions. Allowing additional inputs following the release of results is not scientific or objective.
Discussion: All MAC members were directed that results had to be submitted by Jan 10th. During initial tabulation, several entries were undecipherable and these were scanned and e-mailed to the rater for clarification. Additionally, there was one rater who did not have access to a scanner and hand-delivered on the 11th. The change in ranking order was because the last two programs were so close that they flipped easily. This was the reason for selecting four programs to move forward. Director Peters clarified a significant concern for the record that there is one program which has benefited from all these process issues. The elimination of Jump Math from consideration, bringing forth four programs, and retabulation all benefited My Math.
Question: Why is benchmarking not being provided?
Directors’ Concerns: Policy 2015 stated that adopted materials must be based on best practices and research including benchmarking from similar districts. Will the committee be considering performance data on finalist programs?
Discussion: District staff have considered possible data sources, and invited MAC to look for it, but no really relevant benchmark data is available at this time, since the Common Core State Standards are so new, and the assessments are not in place yet. Director Peters reaffirmed she didn’t believe the purpose of this adoption is so that students can pass cc test, if our kids learn math well they can pass any test.
Follow-up: The first round screen does not include benchmark data. At next day's MAC meeting, staff did not share with MAC that Board had asked about benchmark data, and no member or staff made any proposals to make benchmark data a factor in the 2nd round selection process.
Question: Why is cost data not being provided?
Directors’ Concerns: One of the formal components of the recommendation from the adoption committee is an assessment of the initial and ongoing cost of the adoption. Will the committee be considering this? Director McLaren referenced some program cost spreadsheets for JUMP Math which were part of the materials display at JSCEE and asked if the other publishers submitted cost estimates.
Discussion: District staff indicated that pricing is typically not shared with committee members in the first round. A subcommittee on the MAC has identified critical components for each program, and procurement is requesting pricing for these items. This data will be shared with the MAC and they will take it into consideration as part of their final program selection.
Follow-up: There seemed to be some uncertainty at next day's MAC meeting as to whether policy allowed the cost data could be used in the selection process, and at what stage in that process. The 2nd round as discussed included two stages: the scoring stage where programs are initially ranked by MAC members, and the deliberation stage where programs were successively eliminated. Since cost does not already appear as a criterion on the screen (new criteria cannot be added to the screen) it appears the MAC is barred from using cost data in the scoring stage. The MAC approved a process for the final stage of the 2nd round that allows members to modify their ranking of curricula during deliberation, if they include the justification. The MAC took no vote on how/when to use cost data, but there was some discussion that it might be considered during this deliberation stage. District staff indicated that the MAC is responsible for coming up with an “essentials”list, wherein the MAC will specify what published elements of the curriculum are“essential.” The MAC seems to have left themselves open to the possibility that they will discover after they pick their final candidate that the cost of what they deem essential for that finalist curriculum exceeds the adoption budget by an inordinate amount. Incidentally, the MAC was promised cost data would be provided before the meeting closed, but this did not happen.
Question: Are evaluation criteria are being changed after the initial review of textbooks?
Directors’ Concerns: The policy dictates that selection criteria are to be determined prior to any programs are evaluated. Now that programs have been evaluated and selected for final review, what are the directives to the MAC regarding these criteria? There has been some mention of “non-negotiables”, which is not part of the existing first round criteria. Opening up the criteria for changing after the programs have been evaluated heightens the risk of manipulating the process or altering the rankings, which is why the policy is written as it is.
Discussion: District staff reiterated the answer multiple times that the MAC isn’t changing any criteria; they are putting more detail onto existing, approved criteria, so that MAC can take a deeper look at 2nd round. It was confirmed that changing the criteria at this point without IMC approval would be a policy violation. The committee isn’t seeking in this stage 2 to change, but rather to add detail and provide background and context. A key element of CCSS is focus. Some of the criteria relate to focus. In 2nd screen looking to add some actual figures, like 80% of content should focus on these subset of topics. So we are not changing criteria but adding detail. If the MAC used exactly same selection tool in the second round, they couldn’t get a deeper look at the materials. It was clarified that the numerical guidance for this process was coming from a document from Achieve The Core, which provides guidance document to help districts check alignment. Directors questioned whether this was discussed and approved by the full MAC, and were informed that criteria review would take place the following day, Feb 7, at the next MAC meeting.
Follow-up: At next day's MAC meeting, the prohibition against changing the screen was duly noted. MAC voted overwhelming to keep “Student Friendly” as its own criterion, even though it was not a criterion in the 1st round screen. Apart from “Student Friendly,” proposed edits that appeared on the draft screen as new criteria were moved under approved criteria, so that they would not appear to be new criteria. A proposal to some criteria non-negotiable was discussed at length, but then was voted down. MAC did not vote on any of the other proposed edits to the screen, but did agreed on a process for finalizing the screen through email. They hoped to be able to get the screen finalized in time to submit to IMC for approval.
Question: Will a program be rated lower if it introduces a topic at earlier grades than called for in CCSS?
Directors’ Concerns: In some cases, the CCSS in mathematics are actually lower than our current standards, so if we are faithful to CCSS we may be lowering our existing standards. Is there room for curricula which are more rigorous, introducing topics earlier than called for in CCSS? Do we automatically disqualify materials that go beyond CCSS?
Discussion: There was some ambiguity in this discussion as presented by district staff, identified by both directors and Charles Wright. The policy states that applicable standards are a “minimum level of rigor”. Shauna Heath stated that the state requires us to implement CCSS, but through professional development we hope to teach beyond the standards. Adam Dysart implied that programs containing content not exactly aligned with CCSS would be scored down. The MAC selection criteria include alignment, assessment, and coherence. Alignment and assessment are grade-specific, while coherence is less critical about grade level. Coherence is more about building the mathematics sequentially. Director Peaslee concluded she would hate to see fidelity to CCSS weighted higher than accessibility of materials by disadvantaged students. This is a really major concern shared by several board directors. Tying the selection criteria very heavily to CCSS without sufficiently taking into account other considerations puts us at risk for adopting a curriculum that aligns to CCSS but doesn’t address the problems we are struggling with now.
Follow-up: At next day’s meeting, there was no mention by staff or facilitator of this concern. There does not appear to be any element of the screen and proposed edits that would necessitate lower scoring of a curriculum that introduces some topics earlier than called for in common core.
Question: How is accessibility being incorporated into the selection process?
Directors’ Concerns: Accessibility criteria in board policy are critically important. Has accessibility by ELL and SpEd students been communicated to the MAC explicitly? Directors Carr and Peaslee in particular made it clear that they will not vote for a candidate program that is text-heavy. Director Patu emphasized the need to realize a lot of kids of color and ELL kids learn differently, and with the math adoption that we have right now, a lot more are failing.
Discussion: The concern about language-heavy programs being inaccessible to many of Seattle’s struggling students was brought up repeatedly. The Directors present were emphatic that any program adopted must be accessible. The Curriculum and Instruction Committee made this point very clearly prior to this adoption, even writing it into the policy. Eric Caldwell described the district’s cultural responsiveness criteria document. Director Peaslee stated the docs were vague on this point, stating that many ELL kids are trying to learn and get support on the math at home, and those documents don’t address that. That ELL students and students weak in reading and writing can still access the materials and their families who may also be ELL can see what math is being taught, without having to read multiple paragraphs of anecdotal text that eventually leads to math content. We need texts that make it easy for kids of any educational or cultural background to learn the math they are required to learn. Shauna Heath committed to make sure that this is clarified to the committee, and Director Carr recommended that clarification be made before the next meeting (following day), and that this process needs to satisfy the requirements and expectation of the board, so whatever steps need to be taken, a) to ensure integrity of process, b) to address the concerns you hear from board, are imperative. Director Carr stated that there is a range of viable math programs from text heavy to text light. If the committee recommendation comes in closer to text heavy vs text light, you are going to run into trouble when it comes time to [the Board] vote.
Follow-up: At the next day’s meeting, staff’s representation of the Board’s accessibility concern went little beyond the MAC being told that accessibility was very important to the Board. The most explicit mention came from Adam Dysert, who stated, “What I heard very clearly from the Board last night is this issue about students be able to access the mathematics. They expect various students to be able to access the math. This is important to them.” There was no mention of the “text heavy vs text light” concern. Arguable, staff and facilitator did not get across to MAC the message that the Board considered it imperative that the MAC address the Board’s various concerns. There was considerable attentiveness, however, to integrity of process during the meeting.
Summation of Comments:
All attendees expressed their appreciation for the work underway by the MAC, and the dedication of the individual members. Shawna Heath emphasized that process challenges and transparency questions may cause some uncomfortable discussions, but ultimately raise the whole adoption and all the participants to a higher standard, and that’s ultimately a positive outcome.
Meeting concluded at ~8:30 PM
Math Adoption Committee (MAC) meeting on February 7, 2013, 8:30 am – 3:45 pm at JSSC.
Text appearing within brackets is added editorially for clarification. The information reported here is presented in process order, rather than in the order in which the topics were discussed at the meeting. The facilitator is an independent consultant named Barbara Grant.
Overview of processes agreed on.
1. Members must finalize their community input form and their 2nd round screening tool. The screen provides the criteria and weightings that members will use to develop scores for curricula. 2. Scoring stage of 2nd round: Members will work individually (or possibly in grade band teams) to develop scores for each candidate curriculum. Members will be provided with all of the qualitative community input, so that they can allow it to influence their scoring to the extent they chose. 3. Elimination stage of 2nd round: Shortly before the meeting starts (probably not more than a few days in advance) members will be provided with the tabulation of the quantitative community input (the tabulation will produce an aggregate community ranking of curricula). All members will meet to share their views on curricula, community input, and possibly cost data. Through deliberation, non-binding straw votes, and three binding votes, the committee will successively eliminate three candidates. The last remaining candidate will be the finalist. 4. The recommendation forwarded to the Board will include documentation of changes to scores (or ranks) during the elimination round, and documentation that each member had access to the community input data and took it into consideration.
Questions from the Board
The facilitator invited MAC members to comment on (1) whether they had adequate opportunity to think about and consider community feedback, and (2) whether they had enough opportunity to vote on the tie breaker. The suite of comments in response to these two questions do not provide a clear answer. Facilitator did not call for any vote on the first two questions. Facilitator asked for comments on third question (3) “Are you o.k. with process we did to get to decision of four, and/or would you like a new process to talk about going to two or three?” The comments indicated broad support for keeping four. The facilitator called for a vote on this question, and the MAC voted nearly unanimously to keep the four.
Questions about public display of materials: when, how many sites, what sites, extended hours; will on-line input be allowed?
Policy requires a minimum of five locations at which the materials will be used and that input be allowed both on paper and on-line. The materials will be on display for about three weeks in the spring, around early April. Staff has picked five elementary schools and one library, aiming for appropriate geographic coverage of the school district. Staff noted that some MAC members have been advocating for more public library sites [in order to improve public access to the materials geographically and on evenings and weekends], and that some MAC members had already made enquiries and had gotten informal commitments from a couple of public libraries for material display. Staff asked MAC to let staff handle arrangements and decisions for display, except for the question of what languages needed to be displayed, which they said was a MAC decision. Staff used the rationale of tradition and relationships rather than policy to justify its position. Staff wants MAC to decide what languages should be displayed (this question still needs an answer).
What will be on the community input form?
Staff communicated to the MAC that, while policy requires the MAC to take community input, the MAC gets to decide how to do that, but needs to have a defensible procedure.
A subcommittee brought forth two proposals. Full committee discussed at these at length, then agreed by vote on the general content and structure of the community input form. The community input form allows for both qualitative input and quantitative input:
1. One person – one form (single unified form for all four curricula) 2. Identify role (parent, teacher, student, other). 3. Identify region/school of residence/association 4. Listing of MAC screening criteria, with links online for more info. 4. Yes/No box for each curriculum, with space for respondent to explain yes/no response in each case. Inform respondent that yes/no responses will not tabulated if they omit answer for one or more of the four candidates. 5. Section for extended free comments
The committee agreed on (by vote) a process for finalizing the community input form. The finalization process will take place via email and survey monkey, and will repeat until everyone votes six or lower on the eight-point scale (where 1=full support, 7 or 8 means unacceptable.).
Though not specifically voted on, the notes indicate a general understanding that the quantitative input (i.e. answers to yes/no check boxes for curriculum) would be tabulated by staff, and turned into a community ranking of the four curricula.
Consideration of proposals to edit the screen, add non-negotiables, and change the criteria weights.
A subcommittee was presented proposed edits to the first round screen. As the start of this portion of the meeting, staff made clear that the screening criteria established in the first round cannot change for the second round, but refinement and clarification is allowed. Some of the edits looked like new criteria, so these were repositioned so as to be sub-items under the 1st round criteria.
Proposal for “non-negotiables:” The subcommittee advocated that certain statements in the screen be labeled as “non-negotiable,” by which they meant that a curriculum that failed to satisfy a criteria deemed non-negotiable would be eliminated from consideration. The committee voted on the proposal to have non-negotiables in the screen. The proposal was rejected.
The committee deliberated the proposed edits but did not finalize the screen. The MAC agreed (by vote) on a process for finalizing the screen in time to submit it to the Instructional Materials Committee for approval at the IMC's Feb 11 meeting. [Revisions to the screen must be approved by the IMC.]
It was proposed to keep "student friendly" as its own scorable section [thus elevating it to the level of a criterion]. This proposal was voted on and strongly supported by the MAC.
Staff indicated that the MAC can change the weighting in the second round. The committee discussed changing the weights. One person proposed allowing each member to choose their own weighting, but the argument against this idea appeared to persuade the committee.
The MAC discussed a proposal to divide up the screening among grade-level teams, which would schedule their own time to meet and do the work. A date for grade-level team meetings was discussed. According to the observer's notes, no decision was made on the proposal.
When will MAC receive the qualitative community input results?
MAC members voted on two alternatives: (a) data provided on a rolling basis (as it comes in during the display period) to MAC members, who will be working on their scoring; (b) data provided in bulk after scoring is done, before the elimination meeting starts. A hand vote showed that the MAC favored option (a) by a strong majority.
How will the qualitative community input data be used by the MAC?
Each MAC member will use the community input data to inform/influence his/her personal scoring of the curricula against the approved criteria. [The agreed process seems to allow members to bring up community input for discussion during the elimination round, if they choose to.]
How will the usage of qualitative community input data be documented?
Each member will check a box on their scoring sheet that says something like "I reviewed and used community input in the course of making my decision" or "I had access to and reviewed the community data, and it influenced how I scored." This suggestion was not voted on, but was rather stated by the facilitator as what would be done.
When will the MAC receive the tabulation of quantitative community input and the community ranking of the curricula?
The MAC is supposed to get the tabulation of quantitative community input after the display and comment period has ended, a few days before the start of the elimination round.
What will be the procedure for determining the finalist curriculum?
The elimination round is a meeting of the full committee for the purpose of carrying out the following agreed-on process for identifying the finalist curriculum. Each member has to finish their initial scoring before they come to the elimination meeting. The scores are converted to ranks (each person's top-scoring curriculum becomes their Rank 1 curriculum). The elimination process includes non-time-limited deliberation in which members openly share their views and discuss cost data and community input. Each person can adjust their scoring or ranking during the deliberation time, as long as they have a criteria-based rationale; the rationale for any score/rank change after the elimination meeting starts must be recorded by each screener. [It's not clear to the note-taker how the MAC can use cost and community input data to change their scores or rankings, since cost and community input don't appear on the screen as criteria.] A straw poll (to tabulate provisional ranks) will be conducted any time a member requests it.
A binding vote results in the elimination of the curriculum that has the fewest members giving it a rank of one. At any point a member can ask for a binding vote; the MAC will then vote on whether to take a binding vote (14 yes votes are required to trigger one). Three conclusive binding votes will result in identification of the finalist curriculum, as the sketch below illustrates.
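As a reading aid, here is a rough Python sketch of the elimination mechanics described above. It makes assumptions the notes don't settle: that a member's first choice after an elimination is their highest-ranked surviving curriculum, and that a tie for fewest rank-1 votes breaks arbitrarily. It also leaves out the deliberation, re-scoring, and the 14-vote trigger for each binding vote; the member rankings and curriculum names are hypothetical.

```python
# Rough sketch of three binding-vote eliminations reducing four candidates
# to one finalist. Re-ranking after an elimination (next surviving choice
# moves up) and arbitrary tie-breaking are assumptions, not MAC decisions.

from collections import Counter

def first_choices(rankings: list[list[str]], alive: set[str]) -> Counter:
    """Count each member's top-ranked curriculum among those still alive."""
    counts = Counter({name: 0 for name in alive})
    for ranking in rankings:
        top = next(name for name in ranking if name in alive)
        counts[top] += 1
    return counts

def run_elimination(rankings: list[list[str]]) -> str:
    alive = set(rankings[0])
    while len(alive) > 1:  # three binding votes for four candidates
        counts = first_choices(rankings, alive)
        eliminated = min(counts, key=counts.get)  # fewest rank-1 votes
        alive.discard(eliminated)
        print(f"Eliminated {eliminated} ({counts[eliminated]} rank-1 votes)")
    return alive.pop()

# Toy example with 5 members and placeholder curricula A-D; a real run
# would use each of the 22 members' score-derived rankings.
rankings = [["A", "B", "C", "D"], ["B", "A", "C", "D"], ["A", "C", "B", "D"],
            ["C", "B", "A", "D"], ["B", "C", "A", "D"]]
print("Finalist:", run_elimination(rankings))
```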
This procedure was voted on using the 1-8 consensus vote (where 1 = strongly support, 8 = strongly oppose). Hand vote result: 1 = 14; 2 = 8; 3 through 8 = 0.
What about a "None of the Above" option in a binding vote? (That is, what if a member finds none of the curricula satisfactory and so has nothing to vote for in a binding vote?) This was discussed fairly extensively, but no decision was made on how to handle this case.
How will the quantitative community input data be used by the MAC?
The MAC discussed allowing members to use this data as a factor for adjusting their ranks in the elimination round.
Comments
Thanks for the thorough update.
I am disappointed that they bailed on JUMP Math, which was recommended by Cliff Mass, a UW science professor. I hope they don't use Common Core requirements to drive the decision. We need a fundamentally sound math curriculum; it is well overdue.
S parent
1. The board is really keeping an eye on this and encouraging a transparent adoption process.
2. Directors Carr and Peaslee stated that they will not approve a text-heavy adoption. That is a huge problem with the textbooks we have now.
3. There is resistance on the board to making exact correspondence to Common Core standards an overriding "non-negotiable" criterion. It's much more important to have a good textbook; if it's missing some Common Core item (or if it covers it in an earlier or even a later grade), it's not hard to get supplemental materials to adjust for that.
4. The board is insisting that community input be a significant component of the evaluation of the materials.
But there are things that are disturbing:
1. I spent considerable time reviewing materials and filling out reports. It looks like my submissions were not considered in a meaningful way. I hope that the board's stand will change this situation in the second round.
2. JUMP Math, which I thought was a very good series, was dismissed, apparently because the current version didn't correspond that well to the Common Core standards. I wish that the first-round screening had been more in accordance with the board's desires, as I mentioned in #3 of my "encouraging" list above.
3. The materials need to be made available in more than one public library. At a minimum there must be two – one in the north end and one in the south.
4. Benchmarking data hasn't been included. It had better be included in the second round! We have lots of examples in this district and neighboring districts showing that materials more aligned to explicit instruction produce better outcomes on state tests than materials aligned to inquiry or discovery methods. To avoid benchmarking because testing hasn't yet been done against the Common Core tests is ridiculous! That discards what is probably the best evaluation tool we have!
5. I'm alarmed that "numerical guidance" for the second round was coming from a document from "Achieve the Core." I had never heard of that organization before, but a Google search shows that one of the directors is Phil Daro, an inquiry-math supporter who unfortunately heads up implementation of the Common Core math standards. We have him on tape from a few years ago telling Bellevue math teachers how to parry questions from parents who do not like excessive use of calculators in their kids' math classes. Who decided to use this document? How can we be assured that it's not slanted?
Thanks very much to the individuals who wrote the documents in this posting.
A final comment: this stuff is really important. I'm so tired of having 9th, 10th, 11th, and sometimes 12th graders in my classes where a significant percentage don't know basic arithmetic (such as times tables, cancelling when multiplying fractions, etc.). This has to stop! Good textbooks are only part of the solution, of course, but let's fix this part of the problem now that we have the opportunity to do it!
I share your concerns. Thank you for articulating them.
I attended both meetings covered in this report.
I am quite disturbed that staff failed to communicate the board's concerns to the MAC on Friday.
I understand that if the Board gets email from the community sharing concerns such as you write here, it has a stronger position for keeping the pressure on staff to do the right thing.
I think that if these meetings had a professional transcriptionist, and the transcript was published immediately after each meeting, staff would behave much better.
This would also be very useful when committees are trying to remember their decisions. The official minutes are so sparse that they are pretty useless.
community observer