By Jon Alfuth
In my sophomore year of high school, my AP European History teacher gave us a test on ancient Rome in week two of the course. The problem? We hadn’t learned anything about Roman history! To this day I still don’t understand her decision, but the experience left me with a strong distaste for being required to complete an academic assessment without adequate preparation.
This experience has informed my practices as a teacher. I always do my best, as I believe every teacher should, to prepare my students for any assessment that I ask them to complete. With this in mind, I was very upset when I learned the state department of education was requiring our school to give a practice constructed response assessment as part of its push toward PARCC Common Core assessments this coming year.
Constructed response assessments contrast with the traditional way we have tested, both in Tennessee and across the country, in that they are completely open-ended and require students to explain their thinking using calculations, visuals and, in geometry, proofs. For years we have raised students on a steady diet of bubble tests. While I had been pushing my students to explain their work, I feared those years of standardized testing would likely leave my kids feeling angry and frustrated. As we collectively move toward either the PARCC or Smarter Balanced consortiums, I expect many teachers have similar fears.
After I cooled down, however, I realized that this was an incredible opportunity to have a conversation with my kids about the best way to assess student knowledge. Educators and policy makers continually search for better ways to assess student learning, but rarely do we actually ask our kids for their input on what they think constitutes a good test!
With this in mind, I put the following question to my students immediately after they completed their first mock EOC:
Assuming you knew everything on the test, do you think that this is a better test of student abilities than other math EOCs you’ve taken?
Given my concerns, I expected mostly negative feedback with a few positives here and there. Instead I was blown away when 63 percent of my kids self-reported that they felt that this test was a better measure of student abilities than other tests they’d taken. Here are some direct quotes from students that responded in the affirmative (names are withheld to protect student identity):
- Yes, it’s harder to get lucky and still get the correct answer because you have to show your work plus get the correct answer.
- Yes, it’s preparing us for when we have to do things on our own and things won’t be given to us.
- Yes it would ensure that the teacher is doing his job and if the student learns. If you know everything there shouldn’t be a problem with taking it.
- Yes because it forces you to know the information. You can’t guess and pass.
However, I did get a healthy dose of no’s. Here’s some of the feedback that summarizes the main themes in the negatives:
- No, because we’re used to doing multiple choice.
- No, questions are confusing and weren’t explained.
- No because on the EOC it makes it easier and straight forward.
- No because you shouldn’t care about why you can get the answer, only whether or not you can get the answer.
At a time when educators and policy makers alike continually search for the best way to assess student outcomes, I have two key takeaways from this feedback. First, more rigorous testing doesn't necessarily lead to demoralized and frustrated students. My students' responses fly in the face of many anti-Common Core commentators who claim that taking the constructed response assessment will damage students' self-confidence. This feedback demonstrates that students not only know a good test and a bad test when they see one, but that they are willing and able to take on these more challenging forms of assessment.
Second, just as with any test, we need to ensure we prepare our kids to be creative and thoughtful in their answers. Students who said they felt it was a worse test than traditional multiple choice tests typically reported that they felt confused by the questions or weren't used to non-multiple choice exams. Many also reported that they weren't used to having to explain their answers, which led to frustration.
This feedback should not be viewed as a deal breaker for this type of testing. Instead, it's merely a roadblock that can be overcome with, what else? Strong preparation through strong teaching! If teachers require our students to explain their logic and reasoning each and every day, they will be prepared for these types of assessments. To suggest that our kids won't be able to handle these types of tests is an insult to great teachers and hardworking students everywhere.
Testing isn't the enemy; bad testing is. And as my students indicated, they recognize a rigorous assessment that demands a high level of critical thought when they see one. We must prepare them for these more rigorous exams through strong teaching that pushes them to higher levels of thinking. In the words of one of my students, "If you know everything there shouldn't be a problem with taking it."
Editorials do not necessarily express the views of The Educator's Room.