Tag Archives: science

AP Annual Conference: Formative Assessment in the AP Science Classroom

**Formative Assessment in the AP Science Classroom**

*Ryan Fedewa, Stevenson High School*

*I attended this session to learn different approaches for formative assessments that could be applied to all AP Science courses, not just physics. While Standards Based Grading (SBG) wasn’t explicitly mentioned, some aspects of the presenter’s formative assessments would align. I would imagine a tool like [BlueHarvest](http://main.blueharvestfeedback.com) would also work well.*

* 5 Formative Assessment characteristics
  * the provision of effective feedback to students
  * the active involvement of students in their own learning
  * the adjustment of teaching to take into account the results of the assessment
  * the recognition of the profound influence assessment has on the motivation and self-esteem of students, both of which are critical influences on learning
  * the need for students to be able to assess themselves and understand how to improve
* “We’re going to let our students know where they’re at, and let them know how they can improve from there.”
* Example Tools
  * [SurveyMonkey.com](http://www.surveymonkey.com)
  * [PollEverywhere.com](http://www.polleverywhere.com)
  * [Mastery Manager](http://www.masterymanager.com)
    * administered a weekly 5-10 question multiple-choice quiz as the source of the data
    * also included unit exams and the final exam (but will exclude the final exam next year)
    * plans to incorporate practice AP exams and practice ACT exams
    * tagged each question with a science skill as well as a science content area
    * College Board Science Practices work well as tags
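The tagging-and-aggregation idea above can be sketched in a few lines. This is a hypothetical illustration (the question ids, tag names, and `summary_by_tag` helper are invented), not Mastery Manager's actual data model:

```python
# Sketch: tag each quiz question with a science skill and a content area,
# then summarize a student's results per tag. All names are hypothetical.
from collections import defaultdict

# Each question carries a skill tag and a content-area tag.
questions = {
    1: {"skill": "analyzing data", "content": "kinematics"},
    2: {"skill": "using mathematics", "content": "kinematics"},
    3: {"skill": "analyzing data", "content": "forces"},
}
# Results map question ids to whether the student answered correctly.
results = {1: True, 2: False, 3: True}

def summary_by_tag(questions, results, tag_kind):
    """Fraction correct for each value of the given tag ('skill' or 'content')."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for qid, tags in questions.items():
        tag = tags[tag_kind]
        total[tag] += 1
        correct[tag] += results[qid]  # True counts as 1, False as 0
    return {tag: correct[tag] / total[tag] for tag in total}

print(summary_by_tag(questions, results, "skill"))
# {'analyzing data': 1.0, 'using mathematics': 0.0}
```

The same records then support both views the presenter described: progress by science practice and progress by content area, from one set of tagged questions.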

Three Realizations about SBAR (Start of Year 1 vs. Year 2)

Now six weeks into the school year, I’m reflecting on how standards-based assessment and reporting (SBAR) is impacting my students and colleagues this year compared to last. There are a number of significant changes. Last year, my colleague and I were two of only a handful of teachers who were implementing SBAR into their classes. Last year, I only integrated SBAR into my honors physics class and not my regular physics class. Last year, I used SnapGrades to report learning progress to students and parents. Last year, I jumped aboard the SBAR Express with both feet. Last year, I was a neophyte. ***Last year was the best year ever.***

The most important realization is that ***standards-based assessment and reporting is a philosophical change*** made by teachers, students, parents, and administrators. It is not simply a function mapping a traditional grading scale to another set of numbers and symbols. If any participant (teacher, student, parent, or administrator) fails to realize this, the benefits of SBAR will not be realized. Even worse, the SBAR movement will suffer as misguided or half-hearted efforts labeled “SBAR” fail to improve learning. If the teacher doesn’t make this philosophical jump, there is no hope that students or parents will. An administrator recently shared with me that the term Standards-Based Grading was a bit of a misnomer since grading is only a small part of what SBG encompasses. I shared the Standards-Based Assessment and Reporting term (which you’ll notice I’m using exclusively in this post) as a more apt alternative. Last year, my colleague and I did not set out to implement SBG or SBAR or any other acronym. Rather, we set out to change students’ perspectives on their learning and the role of grades in our class. SBAR was simply a tool that helped us achieve these goals. As more and more teachers and teams integrate SBAR practices into their classes, I’m very worried that they see SBAR as the end goal as opposed to the means to much more important goals.

The second key realization is that ***clearly presenting the rationale behind SBAR to my students is critical***. Last year, I made a very conscious and deliberate effort to explain SBAR, its purpose, and my rationale for integrating it into our class. My colleague and I received feedback that our students had a very clear understanding of SBAR in our class and our rationale for integrating it. I expect that I haven’t made enough of an effort this year to communicate the rationale. While I may be more familiar and comfortable with SBAR, many of my students are not. Until this year, I didn’t fully appreciate that the manner in which grades are reported to students and parents affects my ability to change students’ attitudes about learning and grades. Last year we reported learning progress with [SnapGrades](http://snapgrades.net/). The “report card” had no percentages and no letter grades. Just a list of standards, each marked to show whether the student had demonstrated mastery:

*(Screenshot: SnapGrades report card showing a list of standards with mastery indicated, with no percentages or letter grades)*

This year, SnapGrades is not an option and we’re using our aging and soon-to-be-replaced online grade book. When students or parents look online, they don’t see any description of standards or clear indication of mastery. They see misleading percentages and letter grades:

*(Screenshot: online grade book report showing percentages and letter grades)*

How can students focus on developing their understanding when they are confronted with “0% (F)” and a “C” in bold, red type? This year, I’m fielding more questions from students and parents about improving their “grade” as opposed to their understanding. I have taken some steps to mitigate the negative impact of our online grade book and will be doing more shortly. More importantly, now that we’ve been together for six weeks, it’s time to discuss the rationale for SBAR again in each class.

The third realization is that ***taking small steps to integrate SBAR is actually harder and less effective than jumping aboard with both feet***. In my regular physics class, my team agreed to a more conservative approach. We are not measuring student understanding in terms of “mastery” and “developing mastery.” Instead, we are using a 1-4 grading scale. The challenge with a 1-4 scale is that students and parents (and some teachers) see it as points or A, B, C, and D. I know that many students see a “2” and think, “that’s a C,” rather than, “there’s a major concept here that I don’t yet understand.” I’ve had multiple conversations with students who ask why, if they only missed one part on an assessment, they have a “2.” They are thinking in terms of the percentage of questions answered correctly, not recognizing that they failed to demonstrate a major concept that is essential to understanding. In order to help students break away from their grade-centric mentality, I have to create as large a disconnect as possible between the symbols used to provide feedback and grades. Since I don’t see the 1-4 grading scale going away in the future (and I actually fear it becoming required), I need to work extra hard in class to tie my feedback to their learning and not to their grade.

Despite the challenges that I’m facing, I want to be clear that I’m pleased and hopeful about where we’re heading this year. The best indication that I’m on the right track is that ***I can’t imagine going back to teaching my regular physics class like I did last year***.

This reflection has helped me realize how much work I have to do this year if I want it to be as successful as last year. If you are new to SBAR, hopefully my perspective of two years of introducing SBAR to my classes will help make your efforts more productive. If you have any suggestions, please do leave a comment!

Letting Students Teach

I’m really making an effort this year to have a much greater percentage of class time spent with students learning together in small groups as they solve physics problems rather than me solving problems on the board. I’ll still model how to solve certain types of problems to demonstrate problem-solving best practices, but I’ve observed much more effective learning when students are working through problems with a small group of peers rather than copying what I’m writing. However, what I don’t want to happen is for one student in a group to understand how to solve the problem and simply tell everyone else in the group the solution such that they just copy what she writes.

I realized that this was an opportunity for some coaching. I requested that, while groups worked on solutions to the problems, they refrain from simply telling each other the answers. Since we were working on drawing graphs of motion (position vs. time and velocity vs. time) from descriptions, I asked that the students confident in their answers instead describe the motion graphed by the other students. When a student hears a description of the motion that doesn’t match their intended description, how to correct the graph may become clear. It wasn’t too much of a stretch to have students facilitate their group’s discussion in this manner, since students are slowly becoming familiar with Socratic questioning during whiteboarding and are already used to the fact that I respond to almost every question with one or more questions of my own.

As I walked around the room, I witnessed a dozen teachers effectively giving individual attention and support to a dozen students.

No one asked me a question.

Polar Bears around an Ice Hole

I started some of my classes today with the “Polar Bears around an Ice Hole” riddle:

> The game is in the name of the game – polar bears around an ice hole – invented in the days of Genghis Khan.
>
> A clue for you to keep you true – like petals around a rose, you can count each bear’s nose.
>
> How many polar bears do you see?

You then roll a bunch of dice. (I created six 5″ dice from styrofoam and black pom poms.) A physics teacher from another school uses this as his introduction activity on the first day of school and shared the activity a couple of years ago. I had planned on using this activity as an extended analogy to introduce specific aspects of the class culture:

* You may feel frustrated as you try to figure physics out. That’s okay.
* Physics is hard to understand until you know the “rules of the game.”
* But, once you discover the rules, physics often seems easy and you may be surprised that others don’t understand.
* However, remember that you didn’t always understand.
* When you discover the rules and understand without someone just telling you the “answer”, you are excited.
* The journey to understanding is very important. So, no one is going to tell you the answer, but we’re all here to support each other on our journeys.
* Being told the “answer” at most gives you one answer that you didn’t know. Learning to think critically and arrive at the answer with support develops a skill that you will use to find many answers.

As the activity progressed, I realized that this activity also served as an excellent example of scientific inquiry. As we continued to try and solve the riddle, I introduced several important ideas:

* make careful observations
* gather lots of data (many rolls of the dice)
* look for patterns, compare and contrast, look for extremes
* simplify the problem being investigated (roll fewer dice)
* constrain the variables (set dice to specific values)
* propose a hypothesis, test it, modify it based on results, repeat

After discussing the activity, I grabbed my notebook and nonchalantly asked who had solved the riddle within the first five minutes. I announced that they would receive As for today. I then asked who had solved the riddle within ten minutes and announced that they would receive Bs. Next, I asked who had solved the riddle within fifteen minutes and announced that they would receive Cs. Everyone else would receive Fs. This provided a great hook to transition to our discussion about standards-based grading.

Reflections on A Framework for Science Education

I just finished reading the National Research Council’s preliminary public draft of [A Framework for Science Education](http://www7.nationalacademies.org/bose/Standards_Framework_Preliminary_Public_Draft.pdf). Since there has been some confusion, I’ll mention that this document is a framework for science and engineering education and not a collection of standards. Standards and curricula will likely be developed in the context of this framework.

As an engineer, I found it refreshing that the framework focuses on both science and engineering and the connections and similarities between them. Given the amount of time I spent as an engineer reading and writing technical documents (and the time I just spent reading this document), I was pleased that one of the practices was reading and analyzing technical documents. As someone interested in the history of science and engineering, the framework confirmed my experience that sharing the historical perspective increases students’ interest in science and engineering.

I don’t know if there was an explicit effort by the framework’s authors to incorporate the principles of the Modeling Methodology, but, regardless, the framework’s practices are closely aligned with it. Both model building and questioning are practices enumerated in the framework. I hope to better incorporate both of these aspects of modeling into my classroom this year.

In the prototype learning progressions, some specific concepts are enumerated. I was surprised by some of the concepts included. The emphasis on waves as a core idea was intriguing since, in my limited experience, sound and electromagnetic waves are not always part of a typical physics curriculum. For example, the prototype learning progressions included the concepts of modulation of electromagnetic waves and diffraction.

Overall, the framework’s architecture of core ideas, cross-cutting elements, and practices and its philosophy of depth versus breadth reinforces the direction that I believe my team is heading in physics. Of course, we’ll have to see how this framework influences the standards and curriculum developed within it.