Friday, January 18, 2008

How to Measure Student Performance

The Express-Times reports (read it here) on a new plan to require graduation tests and notes that districts are not pleased. A year-long review process would be required, and the testing wouldn't begin until the graduating class of 2014.

The article notes, "Under regulations approved Thursday by the Pennsylvania Board of Education, students could be required to take new subject-specific tests drafted by the state in order to graduate."

The position of the Board is expressed here:
How districts assess their students does not appear to be in line with how the state expects students to perform, said Jim Buckheit, executive director of the state board of education.

About 125,000 high school students graduate each year, but 57,000 students in 2006 were not proficient in reading and math on the Pennsylvania System of School Assessment (PSSA) test, he said.

One take from the Districts was presented by NASD's Dr. Lesky:

The standardized tests, however, will not account for differences in performance measures or subject areas, such as local history, across such a diverse state, said Nazareth Area School District Superintendent Victor Lesky. Teachers would be forced to adapt their classroom instruction to prepare students for a test being written by state officials, Lesky said.

If anything, students should not be required to take the PSSA test if the school districts are using the state standardized tests as performance measures, Lesky said. The PSSA test is used to measure accountability with federal standards.

The article also closes with this, "'To me, we're over-testing and it's not going to give the state or the community an indication of how well students are performing,' Lesky said."

And this, I believe, raises the question: how do we determine student performance? What do you think?

18 comments:

Unknown said...

I believe that standardized testing is NOT the way to measure performance. Before everyone jumps on that comment, I will ask how many of you ever blew a test because maybe you were sick or were just having a bad day? How many of you were just bad at taking any kind of standardized test?

I have always contended that the way to measure a school is to look at individual graduating classes. How many students started in the freshman year, how many left (dropped out) prior to senior year (discounting those that moved away), what was the distribution across GPA, how many went on to post-secondary education, and what levels/types of universities are they being accepted to?

I want my kids to be prepared to go off to a quality university, and standardized testing in no way moves us in that direction and has absolutely no bearing on any of the above.

Unknown said...

Why not give all students the SAT Reasoning Test, with math, critical reading, and writing scores providing the minimum basis for graduation?

Not without its own issues, and cost, but possibly two birds with one stone in my opinion.

@anonymous: Last I checked, and forgive me it's been a while, students were able to take the test more than once to ward against bad days, or being sick. I believe there are also safeguards built in for disabilities, etc.

Unknown said...

stilfx,

Yes, the SAT would kill two birds with one stone, and makes sense, which is why it will most likely never happen.

As for the PSSAs, it is one and done. If you blow it, you have to wait until next year to recover. The SATs, you can take as many times as you want.

What I was implying is that a single test is not always the measure of someone's abilities. And if a senior failed the test the first time around (possibly a straight-A student), that kid would have the added pressure of having to retake the test in order to graduate, even though they could have had top grades throughout high school.

All this additional test will add is more time spent by students practicing for yet another standardized test.

RossRN said...

Anonymous:

Good points all, and I like the concept in theory: judge by how many graduate, how many go on to what type of school, and then how many graduate from their respective type of school (because getting in isn't the indicator of accomplishment, but graduating would indicate the 'resume' wasn't inflated).

This however becomes a huge data-collection project (ask any class officer how hard it is to even track down every class member after five years let alone ten).

While many don't like testing for a variety of reasons, giving the same test to all students across districts throughout the state remains to me the only unbiased method that is easily attainable.

It is the one way that we can compare schools, because grades are very subjective.

The problem with testing isn't the test, but the schools' response to it (I know you've heard this from me before).

If you have a solid curriculum and you have good teachers who are given the resources they need to teach and the time to prepare to do so, the students will be prepared for the test.

If you manipulate your curriculum and drain the time your teachers have to prepare lessons and work with students, the students will not be prepared and the results will show it.

And again, if tests don't show student performance, what does? How is student performance objectively determined, particularly between schools, if we don't use testing?

Unknown said...

Ross,

Good points, but I will ask you a couple of questions.

If NASD had a high dropout rate, but high test scores, would you still consider it a good district?

If only a small minority of the graduates went on to post-secondary education, yet the district had high test scores, would you consider it a good district?

I will agree that testing gives us an apples-to-apples comparison, and by the above examples, a district would appear to be a good district.

However, when you look below the surface, it may not really be, because the analysis kind of ends there.

Yes, it is hard to track alumni; however, a school does indeed know where the vast majority of its graduating seniors are going. Whether or not they complete that follow-on education is another story, but the fact that a certain percentage was good enough to be accepted to four-year schools (or not), to me, defines whether a school is good or not.

RossRN said...

Thanks - regarding the questions, I think if a school pushed academics on all students so hard that the choice to drop out was routinely made, but others who were able to succeed did extremely well, I'd say the school failed.

On the flip side, if students were all brought to levels of proficiency or above on state test standards and had grades indicative of reasonably good performance (C or better), then I would consider the school to be meeting its obligation.

To be a good school, they would additionally need to score better than most others in the geographic area.

Whether or not students chose to go on to college or a trade isn't indicative to me of a school's performance.

Let's face it, if you pay money your kid can go to school and with the average home value in our district, most people can afford to send their kids to college.

I believe in one of Brad's reports the head of the community college noted that Nazareth has the most students attending the school, and that the most often selected class is bowling.

Being accepted into a school is typically done based on SAT, grades, references, and possibly an essay or two.

If a student is not a good test taker (as is often argued against the SAT and PSSA) but has excellent grades and good references, you could expect they'd still get in.

But what if the grades are over-inflated, and it turns out the kid really isn't a bad test taker after all, but only appears to be one because he/she has always gotten good grades yet scores low on standardized tests?

Our MS had 60% of a single grade on honor or high honor roll. Not to knock the achievement, but maybe we aren't being tough enough on our grading, and hence the disconnect between grades and test scores.

Finishing college would indicate that the individual not only was able to get in, but graduated from another institution. If you had the percentage who entered and the percentage who graduated, you would get a gauge of whether most or all students who attended college were prepared, whereas acceptance alone doesn't accomplish this.

I don't have an answer to the performance question and in the absence of one it seems the test is the best alternative.

Unknown said...

In my opinion, giving all students the SATs resolves nearly all of the mentioned issues.

The state pays for the first, and the student pays for all subsequent retests. Sounds far cheaper than "a year-long review process (that) would be required and the testing wouldn't begin until the graduating class of 2014." I'd imagine state-wide SAT testing could begin sooner than that.

What can "new subject-specific tests drafted by the state" do that the SATs cannot?

The SATs provide a worthy baseline for acceptance to higher education; why can't they be used as an indicator for high school graduation?

And would the cost/benefit ratio even be worth it, since we're simply trying to understand whether or not our students are proficient enough to graduate with basic reading, writing, and math skills?

The SATs have been in constant refinement since their inception in 1926.

Not to be all conspiracy theorist, but it almost sounds like the state is creating a whole lot of work for... someone. I suppose we shall see.

Unknown said...

Almost forgot,

..and as a side benefit to using the SATs, neither the students, nor the teachers would need to prepare for yet another test.

Unknown said...

stilfx,

You may be on to something with the SATs. Of course, you may be on to something else with your "conspiracy theory". I don't put anything above Ed "Santa Claus" Rendell in creating something to take care of a political ally.

But I digress. This idea would be brilliant, especially if it eliminated the PSSA tests. Teachers could teach, students could learn (something other than how to fill in little circles), and we would know a lot more about our graduating seniors.

Ross, you actually made my case. The largest population at NCC comes from Nazareth. Do you not see a problem with that? Why are they there and not at state schools or name-brand universities? You could cite socioeconomic reasons, but then why isn't NCC inundated with Easton or Bethlehem students?

Short story is, we can pat ourselves on the back for our PSSA scores, but when such a large population is NOT going on to four year universities, there is a problem there.

anonymous said...

I agree with previous posts regarding the usefulness of the SAT as a graduation test. However, until the SAT is modified to include science and social studies, I cannot accept its overall value.

Anonymous said...

In response to the questions:

The largest population at NCC comes from Nazareth. Do you not see a problem with that? Why are they there and not at state schools or name brand universities?

I most absolutely see this as a problem. NASD continues to talk about the quality of education that it provides, and yet... the students flock to NCC. Okay, maybe this is a good choice for some students, but to see the high number of NASD graduates ending up at NCC is scary.

I know firsthand that many of the educators and administrators feel that this is okay. They are not alarmed by this fact.

Unknown said...

"the majority of NCC students come from Nazareth"

Your "glass half empty" or "we are producing less educated children" theory may be flawed.

In my view, having more students attempting college from a far less densely populated area tells me we are doing something right — we are out-producing Easton and Bethlehem in higher education ready students.

You need to understand the scope of the equation before you come to a positive or negative assumption. There are many factors we simply do not have values for. All we know is, "the majority of NCC students come from Nazareth".

Short of any other variables, in my mind, that's a huge positive for Nazareth.

PS: I find NCC as a first choice in college anything but scary; for most students (and I do mean most) it is a downright great idea.

anon said...

I am in agreement with stilfx. I don't think you can make any judgments about the meaning of more students from Nazareth going to NCC than students from surrounding schools. I think you need more information. Do we have the individual numbers that show how many students attend college, both 2- and 4-year programs? Is the total number higher overall than at other schools? Is it lower?

From personal experience (I presently have two sons in 4-year college programs), the friends they have who chose to go to NCC are ones who probably would not have been successful at a 4-year institution due to grades, immaturity, or a lack of ambition. For some it was just a matter of economics. It is very difficult for most of the middle class to get financial aid, and not everyone is willing to be saddled with thousands of dollars' worth of loans upon graduation. NCC was a great choice for them, and several are transferring to 4-year schools to complete their degrees. I think looking at NCC as a negative for continuing a post-high-school education is wrong.

anon said...

I did a little research since I was curious. This data comes from the Standard & Poor's SchoolMatters website for 2005.

NASD
Seniors planning on attending college 83.3%
2 year institutions 25.7%
4 year institutions 57.3%

BASD
Seniors planning on attending college 79.4%
2 year institutions 31.3%
4 year institutions 45.0%

EASD
Seniors planning on attending college 77.2%
2 year institutions 34.4%
4 year institutions 41.4%

State
Seniors planning on attending college 69.7%
2 year institutions 16.8%
4 year institutions 48.8%

RossRN said...

To clarify, the majority of students at community college aren't necessarily from Nazareth, instead it was stated that Nazareth sent more students than any other school.

Also, the context of the discussion was related to HS performance.

One person suggested acceptance to post-secondary education was a measure of HS performance.

I disagreed for two primary reasons. One, community college or any two-year school would inflate the number that went on to post-secondary school.

And two, you'd need to know how many graduated, not how many started.

This seems to have veered a bit off course, but I'd still welcome a good way to measure performance on HS level.

Unknown said...

Anon 5:03,

Thank you for the statistics. As a side note, if you look at Notre Dame HS,

93% of the 2006 class are attending post-secondary education, with 76% going to 4-year schools and 17% going to 2-year schools.

Moravian Academy basically has 100% of its students going to 4-year schools (other than internships and study abroad, per their web site), with 85% going to private universities.

In both cases, they are listed as "started", not "planning on going". I know we can't say whether they graduate or not, but the important thing is that a larger percentage are starting.

Also notice that both of these are private schools and actually accountable to the parents that pay tuition.

Our public schools are failing us, and more testing to rationalize that isn't going to help.

Standardized tests measure reading and math (PSSA, SAT, etc.), and I suspect the ultimate state test will be more of the same, or maybe even add in the critical-analysis piece that the SATs now have.

Where is science? Where is technology? Where is social studies/world cultures?

Additionally, I love the part when there is a large percentage of students performing above grade level. This is wrong on a number of levels.

Either our teachers are teaching the test, or they are teaching material meant for the next grade level, or, as I suspect, the standards are just plain low. I would expect the majority of students to be "at grade level", not above.

I know there are those of you that put full faith in these tests, but I for one think they are giving us a false sense of accomplishment.

When our public schools' numbers for students going on to post-secondary education match those of the private institutions, then I will agree that standardized tests matter.

RossRN said...

I don't think you can compare percents of a private school going to college vs. a public school.

If you paid all your school taxes, plus you ponied up another $20,000 a year (I'm guessing; I don't know what Moravian Academy's tuition is), I'd imagine you'd expect your child to go on to post-secondary education.

Plus, private schools can be selective and public schools cannot.

Teaching the ones who have nothing but support and opportunity vs. teaching everyone up to a certain age is very different.

This is the reason the administration and school boards in general are against vouchers and charter schools. If you suck the 'good' kids out what is going to be left for the public schools?

I personally (and again this is purely an opinion) don't think a parochial or private school in our area can match the resources the public schools have.

If we were in a city environment, it would be different, but there is very little Nazareth lacks in the way of resources.

This is of course why I'd like to see above average performance - if only we could figure out how to determine performance;-)

Unknown said...

Ross,

You make a good point and brought out the fact that public schools are not held to the same accountability as a private school, and why they are so against vouchers. If the public was given a choice to spend their money at different schools, you would see a mass exodus from the "public schools" to private ones, and you would see an explosion of new private institutions.

These schools are truly accountable for providing a good education because they are in a fight for dollars from the public (not to say NASD doesn't provide a good education).

Yes, standardized tests are supposed to provide us an apples-to-apples comparison between districts, but that assumes that ALL districts are relying only on the kids being tested on what they learned, and not altering the curriculum and teaching to the test, which we all know some schools are doing. The result is an apples-to-apples-like comparison, not a true one.

Teachers should be free to teach our kids; accountability should come from whether our kids are prepared for life, not for taking some test that goes no further than basic English and math.