
Three Realizations about SBAR (Start of Year 1 vs. Year 2)

Now six weeks into the school year, I’m reflecting on how standards-based assessment and reporting (SBAR) is affecting my students and colleagues this year compared to last. There are a number of significant changes. Last year, my colleague and I were two of only a handful of teachers implementing SBAR in their classes. Last year, I integrated SBAR only into my honors physics class, not my regular physics class. Last year, I used SnapGrades to report learning progress to students and parents. Last year, I jumped aboard the SBAR Express with both feet. Last year, I was a neophyte. ***Last year was the best year ever.***

The most important realization is that ***standards-based assessment and reporting is a philosophical change*** made by teachers, students, parents, and administrators. It is not simply a function mapping a traditional grading scale onto another set of numbers and symbols. If any participant (teacher, student, parent, or administrator) fails to realize this, the benefits of SBAR will not be realized. Even worse, the SBAR movement will suffer as misguided or half-hearted efforts labeled “SBAR” fail to improve learning. If the teacher doesn’t make this philosophical jump, there is no hope that students or parents will. An administrator recently shared with me that the term Standards-Based Grading is a bit of a misnomer since grading is only a small part of what SBG encompasses. I offered Standards-Based Assessment and Reporting (which you’ll notice I’m using exclusively in this post) as a more apt alternative. Last year, my colleague and I did not set out to implement SBG or SBAR or any other acronym. Rather, we set out to change students’ perspectives on their learning and the role of grades in our class. SBAR was simply a tool that helped us achieve these goals. As more and more teachers and teams integrate SBAR practices into their classes, I worry that they see SBAR as the end goal rather than a means to much more important ones.

The second key realization is that ***clearly presenting the rationale behind SBAR to my students is critical***. Last year, I made a very conscious and deliberate effort to explain SBAR, its purpose, and my rationale for integrating it into our class. My colleague and I received feedback that our students had a very clear understanding of SBAR in our class and of our rationale for integrating it. I suspect that I haven’t made enough of an effort this year to communicate that rationale. While I may be more familiar and comfortable with SBAR, many of my students are not. Until this year, I didn’t fully appreciate that the manner in which grades are reported to students and parents affects my ability to change students’ attitudes about learning and grades. Last year we reported learning progress with [SnapGrades](http://snapgrades.net/). The “report card” had no percentages and no letter grades, just a list of standards and an indication of which ones the student had mastered:

[Screenshot: SnapGrades report card listing standards with mastery indicators, no percentages or letter grades]

This year, SnapGrades is not an option, and we’re using our aging and soon-to-be-replaced online grade book. When students and parents look online, they don’t see any description of standards or a clear indication of mastery. They see misleading percentages and letter grades:

[Screenshot: online grade book showing percentages and letter grades, including “0% (F)” and a bold red “C”]

How can students focus on developing their understanding when they are confronted with “0% (F)” and a “C” in bold, red type? This year, I’m fielding more questions from students and parents about improving their “grade” rather than their understanding. I have taken some steps to mitigate the negative impact of our online grade book and will be doing more shortly. More importantly, now that we’ve been together for six weeks, it’s time to discuss the rationale for SBAR again in each class.

The third realization is that ***taking small steps to integrate SBAR is actually harder and less effective than jumping aboard with both feet***. In my regular physics class, my team agreed to a more conservative approach. We are not measuring student understanding in terms of “mastery” and “developing mastery.” Instead, we are using a 1-4 grading scale. The challenge with a 1-4 scale is that students and parents (and some teachers) see it as points or as A, B, C, and D. I know that many students see a “2” and think, “that’s a C,” rather than, “there’s a major concept here that I don’t yet understand.” I’ve had multiple conversations with students who ask why they have a “2” when they only missed one part of an assessment. They are thinking in terms of the percentage of questions answered correctly, not that they failed to demonstrate a major concept that is essential to understanding. In order to help students break away from their grade-centric mentality, I have to create as large a disconnect as possible between the symbols used to provide feedback and grades. Since I don’t see the 1-4 grading scale going away in the future (and actually fear it becoming required), I need to work extra hard in class to tie my feedback to their learning and not to their grade.

Despite the challenges that I’m facing, I want to be clear that I’m pleased and hopeful about where we’re heading this year. The best indication that I’m on the right track is that ***I can’t imagine going back to teaching my regular physics class like I did last year***.

This reflection has helped me realize how much work I have to do this year if I want it to be as successful as last year. If you are new to SBAR, hopefully my perspective from two years of introducing SBAR in my classes will help make your efforts more productive. If you have any suggestions, please do leave a comment!

Targets Calendars

One goal that my team has for this year is to help students become more responsible for managing their own learning. One way we do this is to encourage them to track the development of their understanding on targets calendars. Targets calendars (i.e., standards calendars) enumerate the targets (standards) for the current unit and associate targets with specific days, activities, and homework assignments. The targets calendars for my General Physics class and Enriched (Honors) Physics class are a bit different due to the different structure of each course.

In General Physics, there are weekly quizzes and each target is assessed for three consecutive weeks (the 1st, 2nd, 3rd columns). The best two of three scores (on a 1-4 scale) comprise the overall score (the Overall column).
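If it helps to see the bookkeeping concretely, here is a rough sketch in Python of how the best-two-of-three score could be computed. Averaging the two highest weekly scores is one reasonable choice, not necessarily the only way to combine them:

```python
def overall_score(week1, week2, week3):
    """Combine three weekly quiz scores (each on the 1-4 scale) by keeping
    the best two; averaging them is one reasonable combination rule."""
    best_two = sorted([week1, week2, week3], reverse=True)[:2]
    return sum(best_two) / 2

# Example: scores of 2, 3, and 4 across the three weeks -> 3.5 overall
print(overall_score(2, 3, 4))
```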

In Enriched Physics, there is only one in-class assessment for a target (the A1 column). We encourage students to perform their own self-assessment in preparation for this assessment (the SA column). If a student doesn’t demonstrate mastery of the target, they have a second opportunity to do so outside of class (the A2 column). However, they are first encouraged to do additional practice and seek assistance before this second attempt. Again, we encourage them to self-assess before the second attempt (the A2P column).

Our targets need refinement, but we are improving them each year. Hopefully, if you’re interested, you can adapt the structure to your classes. Leave a comment if you have links to your own organizers that help your students manage their learning.

generalTargetsCalendar.pdf

enrichedTargetsCalendar.pdf

General Physics Syllabus

I’ve been intending to share my syllabi for my classes and finally made the time to do so for my General (regular) Physics class:

syllabus.pdf

If you’re trying to implement standards-based grading (SBG) in your classroom, you may find the approach taken by my team interesting. The structure that we created is based on how my colleague and I organized our Enriched (honors) Physics class last year when we first implemented SBG.

When communicating our SBG methodology to students, parents, and other teachers, I’ve found the categorization of activities into the two buckets of learning activities and summative assessments to be very effective. It makes very clear the difference between learning and demonstrating understanding.

One more note: the conversion of the 1-4 grading scale to percentages is done only to work with the severely limited grading software that we have to use. I’m looking forward to a new software system next year that can support SBG. Hopefully, it works as well as SnapGrades, which I used last year.
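For what it’s worth, the conversion is nothing more than a lookup table. The percentages in this Python sketch are placeholders to show the shape of the mapping, not the actual cut-offs we use:

```python
# Placeholder mapping from the 1-4 scale to the percentages our grade book
# requires; the specific percentages are illustrative, not our real cut-offs.
SCALE_TO_PERCENT = {4: 100, 3: 85, 2: 70, 1: 55}

def gradebook_percent(score):
    """Translate a 1-4 standards score into a grade-book percentage."""
    return SCALE_TO_PERCENT[score]

print(gradebook_percent(3))  # 85 with the placeholder mapping above
```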

(If the idea of homework as a learning activity and summative assessment nauseates you, I [share your feeling](https://pedagoguepadawan.net/11/igradehomework/) and am trying to make it better.)

Polar Bears around an Ice Hole

I started some of my classes today with the “Polar Bears around an Ice Hole” riddle:

> The game is in the name of the game – polar bears around an ice hole – invented in the days of Genghis Khan.
>
> A clue for you to keep you true – like petals around a rose, you can count each bear’s nose.
>
> How many polar bears do you see?

You then roll a bunch of dice. (I created six 5″ dice from styrofoam and black pom poms.) A physics teacher from another school uses this as his introduction activity on the first day of school and shared the activity a couple of years ago. I had planned on using this activity as an extended analogy to introduce specific aspects of the class culture:

* You may feel frustrated as you try to figure physics out. That’s okay.
* Physics is hard to understand until you know the “rules of the game.”
* But, once you discover the rules, physics often seems easy and you may be surprised that others don’t understand.
* However, remember that you didn’t always understand.
* When you discover the rules and understand without someone just telling you the “answer”, you are excited.
* The journey to understanding is very important. So, no one is going to tell you the answer, but we’re all here to support each other on our journeys.
* Being told the “answer” at most gives you one answer that you didn’t know. Learning to think critically and arrive at the answer with support develops a skill that you will use to find many answers.

As the activity progressed, I realized that this activity also served as an excellent example of scientific inquiry. As we continued to try and solve the riddle, I introduced several important ideas:

* make careful observations
* gather lots of data (many rolls of the dice)
* look for patterns, compare and contrast, look for extremes
* simplify the problem being investigated (roll fewer dice)
* constrain the variables (set dice to specific values)
* propose a hypothesis, test it, modify it based on results, repeat

After discussing the activity, I grabbed my notebook and nonchalantly asked who solved the riddle within the first five minutes. I then announced that they would receive As for today. I then asked who solved the riddle in ten minutes and announced that they would receive Bs. Next, who solved the riddle in fifteen minutes and announced that they would receive Cs. Everyone else would receive Fs. This provided a great hook to transition to our discussion about standards-based grading.

I Grade Homework

Last year, a colleague and I jumped into the standards-based grading pool with both feet with our honors-level physics class. We appreciated that homework was for practice and should not be graded. I was very excited about this departure from the traditional model of checking homework every day and keeping track of completed homework and absences, which wasted valuable class time.

At first, students attempted their homework as assigned. However, before too long, it was apparent that a vast majority of students were not attempting the homework problems before we were to whiteboard them in class. After one particularly ineffective whiteboard session, due to a vast majority of students being unprepared, I attempted to use that experience to illustrate the importance of using the homework problems as practice.

Why did this happen? Did we fail to explain our standards-based grading philosophy? No, I think students appreciated the importance of the learning activities. Students were engaged in learning activities such as labs even though they were not graded. Were the homework problems unnecessary busywork? No, this class moves at a fast pace and, for a vast majority of students, practice outside of class is essential. Students weren’t attempting just the problems they felt they needed to practice; they weren’t attempting any problems.

After this ineffective whiteboard session, a few students with whom I had stronger relationships made a point to talk to me about why they hadn’t attempted their homework. All of them said that they appreciated that they needed to practice these problems. All of them said that they knew that they wouldn’t be able to effectively whiteboard the problems without having at least attempted the homework. All of them knew that eventually they would need to practice in order to do well on the summative assessments. However, all of them also explained that not doing their homework was a conscious decision. They explained that they get home late due to soccer/marching band/play practice. They explained that they have more homework assigned than they can possibly complete in a night (another issue to address). If they don’t complete their math/social studies/other science homework, they lose points, their grade is impacted, and their GPA is affected. They believe the only logical choice is not to do their physics homework.

**When other classes assign points to homework, overloaded students that are grade-centric won’t do homework that isn’t assigned points.**

What did we do? We started grading the homework the next semester. We reconciled this change by framing homework as both a learning activity and summative assessment. We continued to whiteboard homework problems (learning activity), but, by the end of the unit, students were required to submit their homework solutions via WebAssign (summative assessment). We used WebAssign since we were able to randomize the numerical values in otherwise identical problems. This allowed students to collaborate but not copy final values.
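To illustrate the randomization idea, here is a small Python sketch of the concept: seed a random number generator with something student-specific so each student sees the same problem with different numbers. This is purely an illustration; it is not how WebAssign works internally.

```python
import random

def randomized_problem(student_id):
    """Produce a per-student variant of the same problem.

    Seeding with the student ID keeps a student's numbers stable across
    views while differing between students. Illustration only; WebAssign's
    actual mechanism may differ.
    """
    rng = random.Random(student_id)
    v0 = round(rng.uniform(2.0, 8.0), 1)   # constant speed in m/s
    t = round(rng.uniform(1.0, 5.0), 1)    # elapsed time in s
    return f"A cart moving at a constant {v0} m/s coasts for {t} s. How far does it travel?"

print(randomized_problem("student-42"))
```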

We haven’t satisfactorily solved the problem of homework. Our current approach is simply the best idea we have at the moment. Over time, this issue may be mitigated as more and more classes in our high school adopt standards-based grading and fewer and fewer teachers grade homework.

If you’ve encountered this problem and are taking another approach, please share! We can always make a change next semester!

I Like Reading Lab Reports

When I first started teaching, I loathed grading lab reports. I had a seemingly never-ending pile of papers almost a foot tall sitting on the front, right-hand corner of my desk glaring at me with that look of “we’re not going anywhere, you know.”

Eventually, I would grab a chunk of labs and my green pen and start reading. I would read each lab and deduct points for errors and omissions. Sure, I would write comments and feedback as well, but I found it challenging to hold both a point-deduction-centric perspective and a feedback-centric perspective in my head at the same time. Even with a rubric, I spent a lot of time debating with myself: “Is this vague statement *close enough* to receive full credit?” “This paragraph shows some understanding, but I can’t tell what this student really understands; -1 or -2?” After grading a lab, I would flip back through each page counting how many points I had deducted. Not being a very number-centric person, I would at times ask myself, “Wait, is this lab out of 20 or 25 points? Let me check.” I would then check to see if I had made a note on the lab that it was submitted late, in which case I would apply a late penalty as well, which wasn’t always simple: “Okay, the 20th was four days ago, but I think we didn’t have class one day; let me check…. Ah, we didn’t; so that only counts as three days late.” At times, I would question whether a specific error or omission should result in a one- or two-point deduction and, under the guise of fairness, I would search through previously graded labs to find one that I vaguely remembered having made the same error or omission. I spent hours and hours grading lab reports.

When I finished grading a lab for the entire class, I would hand it back but worry that, once I did so, students who hadn’t turned in the lab would copy a friend’s graded lab and submit it as their own. When I did hand back the labs, I would watch students’ reactions. If I put a grade on the front of the lab, they would look at the grade and apply, perhaps subconsciously, an algorithm that resulted in a positive, ambivalent, or negative feeling and then file the lab in their folder. *Many wouldn’t even flip through the pages to read my comments.* After noticing this pattern, I started writing the grade on the last page. Many students would then skim their graded labs, but they weren’t reading my feedback; they were scanning for the grade that they knew must be in there somewhere. If I forgot to total the points and write the grade on a couple of labs, those students would certainly ask about their grades. However, the students were so entrenched in the grades game that no one ever asked for feedback if I didn’t write comments.

**This sucked. Grading labs was my least favorite part of teaching. There had to be a better way.**

Last year, a colleague and I integrated our fledgling standards-based grading philosophy into our honors physics classes. We categorized most of the labs that were previously graded as learning activities, which we defined as “activities that don’t directly affect your grade; they are essential in that they are your opportunity to explore, discover, take risks, make mistakes, ask questions, help each other, practice, and get feedback before having to demonstrate mastery.” We explained this to our students and started our first lab activity. The next day, everyone turned in their labs.

I went home that night and I didn’t grade their labs. I *read* them. As I read them, I wrote comments, asked questions, made minor corrections. I never thought about points. I didn’t calculate a score. It was wonderful.

The next day in class, I handed back the labs. This new standards-based grading methodology was unfamiliar to the students, and many hadn’t internalized the role of these learning activities. I observed some students scanning their labs for a grade. “Mr. Schmit, what is my grade on this?” “It’s a learning activity; no grade, just feedback.” As they began to understand that no matter how hard they looked, they wouldn’t find an 18/20 anywhere in their lab, I saw students actually reading my feedback. Some students even asked questions about what I had written.

Lab reports got better. As students embraced the standards-based grading philosophy, they started taking risks because they weren’t worried about losing points. Vague statements became less common. Students began to write what they actually thought instead of what they thought was sufficiently generic to earn credit. Some students even started writing questions in their lab reports to ask for clarification. Many times, after a productive class discussion or whiteboarding session, I didn’t feel that I needed to collect the lab reports and provide additional feedback. The students had already provided all of it to each other.

I had a number of goals, hopes, and dreams when I started standards-based grading last year. Liberating students from grading such that they could focus on their learning was one. Liberating myself from grading such that I actually enjoyed reading lab reports wasn’t one of them, but it was a very pleasant surprise.

How Many Standards?

When my colleague and I started our standards-based grading journey in the Fall of 2009, we began with a list of objectives defined years earlier by a now-retired teacher. Since our goal was to make minimal changes to the curriculum and focus on changing the methodology for the class, we decided to use these objectives as the starting point for our standards (which we refer to as “targets”).

What I quickly learned is that I needed to know exactly how I would provide multiple learning activities and multiple summative assessments for each and every standard. Our first unit had 26 standards! While several were lab-specific, that was way too many! We immediately appreciated the importance of defining fewer and more general standards.

How many standards are right for a unit? How many for a semester? I think the answer is different for every class, but after a year of experience, I’ve found that seven or eight standards per unit, of which one or two may be lab-specific, works well for our honors-level physics class and students.

I just finished revising the standards for the upcoming Fall semester for this class. I ended up with about sixty standards for the semester. This is a fast-paced class and that is reflected in the number of standards. In comparison, my regular-level physics class will have a little more than half as many standards this Fall.

Am I completely satisfied with the number and granularity of the standards for the Fall semester? It’s definitely a step in the right direction, but, no, I’m not completely satisfied. I think I did the best I could balancing the tradeoff between a manageable number of standards from an assessment perspective and sufficiently specific standards such that students are clear on what they need to understand.

I’m not positive how I’m going to improve this aspect of the methodology, but I think the eventual solution is to move to a two-tier system. The top tier would consist of fewer, higher-level standards that are assessed and reported while being manageable. The second tier would contain many more specific sub-standards (“targets”) that students can readily understand.

Please feel free to leave a comment and share your approach for defining standards.