Tuesday, October 31, 2006

Teachers' Pay and Test Scores

A reader brought to my attention an article that appeared in Sunday's Morning Call (read it here). I too had read the article, but the reader was curious what everyone's thoughts were on the subject of Teachers' Pay and Test Scores, given Nazareth's current contract talks.

In short, one Director in Quakertown questioned the district's PSSA test scores in relation to teachers' pay:
School Director Manuel Alfonso was pleased overall with scores unveiled at Thursday's school board meeting but said there still was a disconnect between the district's teacher salaries and its scores.

''We have among the highest salaries in Pennsylvania, and our student test scores do not reflect that investment,'' Alfonso said.
Given the teachers' refusal of the district's offer of about 4% per year for five years, what do you think the expectation of student performance should be once the new contract is approved? What measurement should be used? How should results be announced to the community? Should there be no change at all (i.e., no correlation)?

Past NewsOverCoffee articles on PSSA.
Past NewsOverCoffee articles on TeachersStrike.

9 comments:

Anonymous said...

The Quakertown article was an interesting read. Is it possible that people are starting to realize that throwing $$ at schools isn’t necessarily the solution?

On a different note, I was at the bus stop this morning when the teachers' contract situation came up. One parent continued to "sing the praises" of the Nazareth School District and how the "scores" are the "best in the Valley." Well, I wanted to believe that, so I took the 2006 PSSA scores and did a little work in Excel.

For my analysis, I used all the districts in Northampton Co (Bangor, Bethlehem, Easton, Nazareth, Northampton, Pen Argyl, Saucon Valley and Wilson) and Lehigh Co (Allentown, Catasauqua, East Penn, No. Lehigh, NW Lehigh, Parkland, Salisbury, So. Lehigh and Whitehall). I also threw Quakertown in there because of the article above. The total number of districts in my analysis is 18.

I sorted by grade (they take the tests in grades 3, 4, 5, 6, 7, 8 & 11) and by the percentage scoring advanced + proficient for both math and reading. The scores for each grade are the combined scores from all schools in that district; for example, the 3rd through 5th grades for Nazareth would encompass Bushkill, Lower Nazareth and Shafer Elementary. My findings (I've put a rough sketch of the ranking method at the end of this comment):

3rd grade – Nazareth: 11th Math; 10th Reading

4th grade – Nazareth: 9th Math; 8th Reading

5th grade – Nazareth: 8th Math; 6th Reading

6th grade – Nazareth: 8th Math; 7th Reading

7th grade – Nazareth: 5th Math; 2nd Reading

8th grade – Nazareth: 10th Math; 9th Reading

11th grade – Nazareth: 9th Math; 3rd Reading

Do the numbers surprise anyone? I honestly thought better of the Nazareth schools. We were only 2nd in one category and 3rd in another. The rest of the rankings fell right in the middle of 18 districts.

Furthermore, when you break down the districts in the top 3 of each grade for both reading and math, some very familiar names rise to the top (counts are firsts/seconds/thirds across the seven grades):

So. Lehigh – MATH 1/2/0; READING 2/1/1
Saucon Valley – MATH 1/2/2; READING 1/1/1
Quakertown – MATH 2/1/1; READING 1/1/2
Parkland – MATH 1/1/2; READING 2/2/0

So what’s this say? Maybe Nazareth’s reputation is just that. Reputation only.
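
For anyone who wants to reproduce this kind of ranking outside of Excel, here is a minimal Python sketch along the same lines. The district names are from the list above, but the percentages in the snippet are placeholders, not actual 2006 PSSA results; you would plug in the published advanced + proficient figure for each district, grade and subject.

```python
# Rank districts by combined percent advanced + proficient for one grade and subject.
# NOTE: the percentages below are placeholders for illustration only --
# substitute the published 2006 PSSA advanced+proficient figures.

grade7_reading = {
    "Nazareth": 0.0,        # placeholder
    "Saucon Valley": 0.0,   # placeholder
    "Parkland": 0.0,        # placeholder
    # ...remaining districts from the 18 used above...
}

def rank_districts(scores):
    """Return (rank, district, percent) tuples, with the highest percentage ranked 1."""
    ordered = sorted(scores.items(), key=lambda item: item[1], reverse=True)
    return [(i + 1, name, pct) for i, (name, pct) in enumerate(ordered)]

for rank, name, pct in rank_districts(grade7_reading):
    print(f"{rank:2d}. {name:15s} {pct:5.1f}% advanced+proficient")
```

Repeat this for each grade and subject and you get the same rankings table as the Excel work above.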

Anonymous said...

I am not surprised by your findings comparing Nazareth's PSSA scores with others'. I agree this school district is not living up to the reputation that it has. I feel that in recent years the quality of education in the Nazareth Area School District has decreased.

I moved here because of the quality of education. However, I am finding myself disappointed every year. I find it hard to swallow when I hear of teachers stating that they should be receiving higher pay…prove to taxpayers that you are worth the higher pay. These teachers work one of the shortest days in the state and are not showing test scores that make them top-level teachers.

The irony of this is that every year more of our children's day is carved out for PSSA prep. The schools are teaching to the PSSA tests and they are still not achieving high levels of success. Guess what: the PSSAs do not count for your children; they are only a grade for the schools. I am tired of schools putting pressure on students in regard to the PSSA tests. When all is said and done, there is not a college/university that cares about their PSSA scores. Remember, it is their GPA, SAT and ACT scores that matter. Why don't the teachers focus on those three factors? I will put pressure on my kids to succeed on the PSSAs once the school district is willing to pay for any college they attend.

I hope that more parents start paying attention to the education that their children in Nazareth are receiving.

Anonymous said...

Wow! Thanks for the detailed analysis. It is hard to argue with pure facts. I too moved here because of the district's reputation and (at the time) its small class size.

With these numbers, one has to wonder if they are a direct result of the ever-increasing class size, or if there are other factors involved as well.

RossRN said...

Regarding class size, at least at the elementary level classes are fairly small, most at or around 20, with the highest at 25 (one class in one building).

The latest enrollments I saw were from a board agenda you can read here in PDF.

For each elementary school it lists, by grade, the number of classes and the number of students per class. What you can't tell from the two pages of enrollments is the number of students per class at the MS or HS level.

I also don't have a historical reference to compare these class sizes against, to see whether the actual classroom size (as opposed to grade-level class size) has grown.

Anonymous said...

I have not been very impressed with the scores and all the emphasis placed on these tests, and the data work is consuming the teachers' time to teach. Again, I am tired of the pressure and the teaching to the PSSA test. Science is very much lacking, and I do not even know if by 6th grade these kids know the states and capitals.

Anonymous said...

I will agree that schools have become preoccupied with 'teaching to the test'. However, as of right now, it is the only thing we have to assess and compare our schools.

That said, is it not surprising that Nazareth ranks near the middle? The 'other' highly regarded school districts are consistently scoring in the top 3 but Nazareth is not.

I have read in the papers and the forums where all of these parents get up and say how great the schools are and how they chose to move here because of the schools. I wonder what the reputation of the NASD is built on. Is it truly the schools or teachers, or is it the fact that your child will not be sitting next to another child named Jose?

RossRN said...

Thanks to everyone for participating, and a few comments on a few different points:

+ I think the use of the PSSA is valuable because all students are required to take it, compared to the SAT or ACT, which are optional. In this way you can make a comparison from one school to another (albeit within one state).

+ I agree it would be very valuable to measure scores over time for the same grade to assess improvement.

+ I wonder to what extent teaching to the test, versus providing a quality education, would improve scores. It seems to me that if the education is above average it will translate to higher test scores whether the test is taught to or not.

+ I think there is a good reputation for a variety of reasons, but again, it needs to be supported with a quantifiable metric, and that seems to be missing right now.

Nazareth is perceived as being a safe environment for kids. Further, it is sought after because academically it can offer more than a small school, but is not so big that the children become a number and are forgotten.

+ I think another aspect to consider is that the reputation was built over time. During the past five to seven years a lot of teachers have retired, as is evidenced by the count of teachers per step and scale.

A large number remain at the top, but there are small numbers in the middle, and then a large number at the first five steps.

+ You also have to consider the significant turnover in administration, particularly in the principal position at all levels. A lack of consistency often requires focusing teachers' time on issues other than teaching their class.

There are a lot of different factors, and I don't know if one or another is "the" reason, but what I think is more important is setting a course moving forward: defining a metric, using the measurement (maybe going backward for a benchmark), and then seeing our progress as we move forward.
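
To make that last point a little more concrete, here is a minimal sketch (in Python, with placeholder numbers) of what a year-over-year benchmark for one grade and subject could look like; the specific metric and the figures are my assumptions, not anything the district has adopted.

```python
# Track one grade's PSSA advanced+proficient percentage year over year and
# report the change against a benchmark year. All numbers are placeholders --
# fill in the district's published results.

history = {
    2004: 0.0,  # placeholder: e.g. grade 5 math, % advanced+proficient
    2005: 0.0,  # placeholder
    2006: 0.0,  # placeholder
}

BENCHMARK_YEAR = 2004

def progress_vs_benchmark(history, benchmark_year):
    """Yield (year, score, change from the benchmark) for years after the benchmark."""
    base = history[benchmark_year]
    for year in sorted(history):
        if year > benchmark_year:
            yield year, history[year], history[year] - base

for year, score, delta in progress_vs_benchmark(history, BENCHMARK_YEAR):
    direction = "up" if delta >= 0 else "down"
    print(f"{year}: {score:.1f}% ({direction} {abs(delta):.1f} points vs. {BENCHMARK_YEAR})")
```
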

Thanks again to everyone!

Anonymous said...

The metric we really need to follow is one based on the number of students that start in a class, the number that graduate, and the number that drop out (adjusted for students coming into the district and/or moving away). Then look at the number that move on to post-secondary education (a 2- or 4-year college/university). This will give us a true metric of how successful our schools are.
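
A rough sketch of how such a cohort metric might be computed follows; the way move-ins and move-outs are netted out is my assumption (there are other reasonable ways to handle transfers), and the numbers in the example are made up.

```python
# Cohort graduation metric: of the students who started with a class, adjusted
# for transfers, how many graduated, and how many of the graduates went on to
# a 2- or 4-year college? The adjustment below (add move-ins, subtract
# move-outs) is one assumption among several reasonable ways to handle it.

def cohort_rates(started, moved_in, moved_out, graduated, continued_postsecondary):
    adjusted_cohort = started + moved_in - moved_out
    graduation_rate = graduated / adjusted_cohort
    continuation_rate = continued_postsecondary / graduated if graduated else 0.0
    return graduation_rate, continuation_rate

# Example with made-up numbers:
grad_rate, cont_rate = cohort_rates(
    started=300, moved_in=25, moved_out=20,
    graduated=290, continued_postsecondary=210,
)
print(f"Adjusted graduation rate: {grad_rate:.1%}")
print(f"Post-secondary continuation rate: {cont_rate:.1%}")
```
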

The PSSA is in reality a measure of the teachers in the schools and is effectively their report card. If the scores are low, are the students "flunked", or is the penalty paid by the school administration and teachers?

A senior class could take the PSSAs and score in the highest percentiles across the board. That same class might also have a 10% dropout/failure rate. Would you consider that school a success or a failure?

Based on the PSSAs, it is a success, but having any number of students fall out of the picture (even one) is a failure in my book, regardless of how good the scores are.

I might be off my rocker, but this makes more sense than banking on the PSSA.

RossRN said...

Playing the role of the devil's advocate, the problem with basing success on a graduation rate is the one you find in systems where the easiest thing to do is promote a kid to the next grade to get him/her through the system.

The result is a district with graduates who can't read.

I'm not saying that would happen here, but the benefit of the PSSA is that all students in all districts take the test, and as a result every district can be assessed on equal terms with another.

A way around that may be to group schools by PSSA score range and compare the graduation rate within those groups, so you have some semblance of academic equality along with graduation rate, as opposed to either one in isolation from the other.
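
As a rough sketch of that idea, here is what the grouping could look like in Python; the band boundaries, the district names and all the numbers are placeholders and assumptions of mine, not published figures.

```python
# Group districts into bands by PSSA advanced+proficient percentage, then
# compare graduation rates within each band. Band cutoffs and all numbers
# here are placeholders, not published figures.

districts = [
    # (name, % advanced+proficient, graduation rate) -- hypothetical examples
    ("District A", 82.0, 0.95),
    ("District B", 78.5, 0.91),
    ("District C", 65.0, 0.97),
    ("District D", 61.0, 0.88),
]

def band(pssa_pct):
    """Assign a 10-point score band; the cutoffs are an arbitrary choice."""
    low = int(pssa_pct // 10) * 10
    return f"{low}-{low + 9}%"

groups = {}
for name, pssa, grad in districts:
    groups.setdefault(band(pssa), []).append((name, grad))

for score_band in sorted(groups):
    members = groups[score_band]
    avg_grad = sum(g for _, g in members) / len(members)
    names = ", ".join(n for n, _ in members)
    print(f"PSSA band {score_band}: avg graduation {avg_grad:.1%} ({names})")
```

Within each band you would then be comparing graduation rates among academically similar districts, rather than judging either number on its own.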