News from West Seattle
Note that Cleveland will be at the Madison PTSA panel described below. This comes before any Open Houses, so if you are interested in any of these schools, it might be worth attending.

Start the New Year off right by attending the next Madison PTSA meeting, to be held on Wednesday, January 13, 2010. We will be holding a panel for parents, guardians and students called “High School Choices for Your Tween.” At this PTSA meeting, from 7:00 pm – 8:30 pm, come and hear from various high schools in the area about their schools and the programs they have to offer. High schools attending thus far include West Seattle, Cleveland, and Chief Sealth. There will also be time at the end for Q & A! The PTSA is looking forward to seeing you and your child at this meeting!
We’ll also have time to discuss the early dismissals and their impact on the 2009-2010 school year, as well as next year’s schedule and early dismissals for 2010-2011! If you have questions or concerns, feel free to contact Kim Early at ann8726@aol.com or 206-328-9335.
Also kudos to Madison for at least having a discussion about early dismissals (I'm assuming these are school-based, not district-based).
From the West Seattle PTSA newsletter (this is just an excerpt):
By 2015, Seattle Schools estimates enrollment at WSHS will drop to 756. As of October 2009, WSHS’s enrollment was 1,137 and Madison Middle School had 906 students. The Student Assignment Plan estimates that by 2015, WSHS’s enrollment will be 34% lower than it is today, and Madison’s enrollment will be 27% lower. If accurate, both schools will have the largest unfilled capacity in the District for the grade levels they serve.
The loss of enrollment at a school equates to a smaller operating budget, fewer staff, fewer class offerings, and curriculum upheaval - to name a few. Parents are justifiably concerned about a precipitous loss of students and staff at West Seattle and Madison over the next several years.
Meanwhile by 2015, Denny is projected to be at 100% capacity and Sealth at 90% capacity. With the new maps, six elementary schools feed into Denny/Sealth, whereas only four feed into Madison/WSHS.
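A quick arithmetic check on those projections (assuming the 34% drop applies to the October 2009 count of 1,137 students):

$$1{,}137 \times (1 - 0.34) \approx 750$$

which is roughly in line with the 756 figure in the headline; a drop to 756 would work out to about a 33.5% decrease.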
That's a big change for one area. They were seeking a change to the SAP based on this projection, but I don't believe it came through. Anyone?
They also note that MAP testing is occurring again in January. Has your child taken another MAP test yet?
Comments
Is it just the five-year result of trying to balance enrollment in 2010-2011?
A couple of things will happen so this imbalance doesn't actually occur.
1) Families in the Denny and Sealth attendance area will choose Madison and West Seattle.
2) There will soon be a turnover of District leadership and the new Board and superintendent will not know about the commitments made with the Denny/Sealth co-location. They will not feel constrained by them, and they will break them. In five years the District will re-draw the attendance area boundaries and the planned imbalance will go away.
"There will soon be a turnover of District leadership and the new Board and superintendent"
Glad to hear Seattle will have a new Superintendent soon.
I think my daughter actually enjoys it, although it does cause her some anxiety. In the fall, she didn't know what the minus sign with the dots above and below it meant, but she asked us to teach it (division) to her over Thanksgiving break, and she was hoping to get division problems this week so she could show off what she's learned. Of course, if her scores go up relative to the Fall, it might not reflect what she's learned in school, but rather, what we've taught her, based on her individual readiness and personal interests.
Just something else to consider given that the district wants to tie teacher performance to scores. My child hates Everyday Math, and we battle and endure tears over every homework assignment. Yet she scores well on the MAP, not because of EDM but despite EDM.
Actually, I can't provide any reasonable explanation. The first round of testing showed my Kindergartener was well above grade level, and at conference the teacher told us the school couldn't really support our child at that level.
So what value does this additional testing provide? Any gain at this point is unrelated to the curriculum, the teacher or the school. They aren't going to use the information in any way, and I doubt very much they will share it with us.
It seems to be very much a pointless bureaucratic exercise for our kid.
My daughter is doing MAP testing at Thurgood Marshall this week.
Bird, the MAP is supposed to inform instruction, and you have every right to see the results (request them under FERPA if necessary) and to ask what's being done for your child.
Helen Schinske
Solvay Girl does make a good point that the MAP testing can't separate what happens at school from what happens at home or through tutoring. It might be an interesting survey for the district to find out how many parents actually help their kids with homework or hire tutors.
"The first round of testing showed my Kindergartener was well above grade level, and at conference the teacher told us the school couldn't really support our child at that level."
To say to you that they can't support your child is, to me, shocking.
Bird, honestly you have two choices here. One is to go to your principal and ask what the ALO is at your school. Every school is supposed to have something to support higher learners. If there is something, gently ask why your teacher doesn't know about it. If your principal gives you nothing, I would seriously consider testing your child in the Fall for APP/Spectrum. I can only speak from my own experience, but a bright child who is not doing challenging work is a bored child. And he/she might start tuning school out or even acting out in class.
(If it were me and the principal had no support to offer, I would also let Dr. Enfield know but you may be a more cautious person than me.)
Anecdotally, the reading comp sections seem to unfairly penalize those who think outside the box. I know a few students whose MAP scores were far different from their in-person reading comprehension assessments.
I don't think the math sections suffer nearly as much from the problem. I'm going to present the issue to the Board when I can get a speaking slot.
Eric
The previous situation was that a teacher who, let's say, took a fourth-grade student from first-grade level to third-grade level reading in one year (which would be a fine achievement) wouldn't get credit for anything at all, because the kid still wouldn't be "at grade level." In contrast, a teacher who had a fourth-grade student who coasted at sixth-grade level all year and learned nothing new would get credit for an "above-level" student. The new system isn't perfect, but it has the potential to be fairer than the old (always assuming the MAP is reliable, which we don't have a lot of data on yet).
Helen Schinske
It might be surprising, but, at this time, I'm reasonably happy with my kid's experience at school. I would, of course, be happier if my child were appropriately supported in the core curriculum, but my Kindergartener is getting something out of school.
My child is learning other things, besides reading and math, and is very focused on the social aspects of school. Such is the nature of Kindergarten. At this point, I have heard no complaints about boredom.
Nevertheless, I think it's a little ridiculous to have my kid tested on rules of capitalization, multiplication and a variety of other things that the school has no intention of teaching at this point.
I don't object entirely to some MAP testing, but this mid-year exam, in this case, seems particularly useless.
But in the aggregate, it should be possible to use, say, 30 MAP scores in a classroom to see if overall progress was made. If a teacher had, overall, maybe two or three years of little progress, while the teacher next door had years of great gains, this might indicate some sort of teaching difficulty. THIS is where we might see MAP used to evaluate teachers.
Unless one teacher was consistently getting more students who were not getting outside help, which is doubtful. But it would be problematic to compare a teacher in a school with a high level of free/reduced lunch, for instance, with a teacher in a school with a low level, as the teacher in the low-F/RL school will probably show better "growth" than the teacher in the high-F/RL school.
This points to the MAP scores being used by building admins, rather than at a district level, unless the district is accounting for mitigating factors.
It's a few hours playing on the computer. If your child isn't being challenged in his or her class anyway, what's the harm of missing a few hours of that class?
We had to explain this to our child several times because she felt upset with herself for not knowing what the division sign was or what the word "syllable" meant. She thought she was supposed to know these things, and given her temperament and perfectionist tendencies, it was upsetting to her not to know the answers. And as my earlier example showed, she wanted to learn these things.
But because EDM provides no flexibility at all, she's at school manipulating blocks and drawing dots on dominoes when she really wants to know the times tables! Yes, it's frustrating. But we are finding ways to teach her things at home that she's interested in, and the MAP made us aware of what she's capable of.
We also found the reading lexile score from MAP to be very useful. You can find appropriate reading materials based on the lexile score, and even find books above the score that might challenge your reader and help them grow. I'm very curious to see how the lexile score has changed from the fall (if at all).
And finally, all elementary schools are supposed to have formal ALO programs in place by next year. There may be a committee at your school working on this, and perhaps you could join to help shape your program. I was highly skeptical that differentiated learning can happen in large classes (still am), particularly with a curriculum like EDM that seems to force everyone to stay together. But I am optimistic that we will have better opportunities for advanced learners next year at our school.
If some teachers do test prep and some don't, or if one teacher DOESN'T prep in, say, September, then DOES prep in January, then DOESN'T in spring, the tests will not be accurate views of gains, but will include some effect of test prep. As Charlie pointed out in another thread, over time this would balance out, but in the short term...
The district should issue strict instructions NOT to prep students in any way, except routine explanations of process.
There's also the motivation factor: not all students will take tests seriously all the time. High scores will (generally) indicate knowledge, but low scores will NOT reliably indicate a lack of knowledge unless it's known the student did their best. So other correlations are in order for low scores, but not so much for high scores.
The district could then bring pressure on that school to change the policy.
That sounds great, but I'm not sure in practice it would work. It all depends on how advanced your advanced students are. For a kid who's an outlier among outliers (maybe the only kid at that level in their grade), there simply isn't enough data to attribute the level of growth to the individual teacher or school.
The small sample size would be a problem for any population group, and, for the most advanced students, you may also have to consider the extra factor that kids who show up to Kindergarten working three or four years ahead of grade level have already shown themselves to be quite adept at learning outside of school.
I realize that this sort of data has its own problems and imprecision, but it seems to me it does fill gaps that aren't captured by standardized tests.
I would think it would also be somewhat less prone to manipulation. Test scores can rise if teachers cheat by coaching to the exact questions on the test. They can rise if the administration and bureaucrats change the test to one set to a lower standard.
One would hope that parents' opinions about what their children have learned and should be learning are not quite as malleable -- though, obviously, they can be influenced to some degree.
My complaint about that particular question is that there is considerable gray in the answers. I shouldn't really have said that any of the four answers are acceptable. So that readers don't have to follow the link, here's the text of the question:
Miss Hill's Class was painting a big picture to put on the walls in the hall. Paintings were all spread out on the floor. Suddenly, there was a loud noise. It was a fire drill! What probably happened next?
1. The children went right outside and some paintings were stepped on.
2. The other classes waited to leave until the paintings were moved.
3. The fire drill was called off.
4. Miss Hill's class finished the paintings before they went outside.
I'll grant that 4 isn't really reasonable. It's also fairly clear to an adult that #1 is the correct answer, particularly if you know that test writers are trying to trick you. However, if second graders are asked what the best answer is, I can see a definite minority choosing #2 or #3 as a better outcome for the class. Both of those are plausible scenarios. After all, it's not good if paintings are stepped on.
I'm not an educator or a test designer, but I do feel that reading comprehension questions have to test actual reading skills and therefore have to contain most or all of the information you need to answer them. Should a second grader who has been home schooled have his score reduced because he's never been in a fire drill?
Given this issue and the problem of MAP scores not at all matching in-person evaluations (at least in lower grades and with some students), I see the MAP as being more a test of test-taking skills.
Several of their friends tested several grades lower in reading when in fact they are years ahead. What are the teachers to do when they know the real level of the students?
IPP - ERB - MAP - IDEA - APP - Best Practices - 5-year plan - "community input"...
Btw, I apologize for going off topic, but what was behind changing Horizon & IPP to APP, ALOs & Spectrum?
Curious because, while my daughter didn't qualify for the earlier programs despite a tested IQ of 160, and our local school (West Woodland) told us at the time we should look elsewhere, from conversations with current parents involved in APP it sounds as if the identification process and the current curriculum are better suited to students' needs?
Because they are 5-year-olds who can't necessarily read or write, they were given headphones and audio prompts as they sat at the computers.
When told to 'put their mouse on the letter A,' nearly all of them lifted up their mouse and placed it directly on the computer screen!
Our tax dollars at work, folks! (or is it Gates Foundation money...?)
So Spectrum and APP have been around a long time. (I know someone who would have a history of APP or you could look up the APP group and they likely have it.)
ALOs came in a couple of superintendents back (Raj?) as a way for schools to serve high-performing kids in the school they choose to be in, as well as to offer more rigor on a subject-by-subject basis. ALOs were supposed to be certified by district staff. Now, I have no idea how they get developed or what kind of standards there are. There might be information at the Advanced Learning website.
I do believe that Dr. Goodloe-Johnson would like to get rid of Spectrum.