Measurement Uncertainty Activity

In previous years, my students have always struggled to really understand measurement uncertainty. Given my background in the computer-based measurement and automation industry, I was always troubled that I didn’t do a better job helping them understand. So, this year, I developed a set of six activities to provide a hands-on way to practice applying the definitions as well as a context to discuss the complexities of measurement uncertainty. Each group investigated one of the activities and then whiteboarded and presented their results to the rest of the class. Each activity had the group determine the measurement uncertainty of a measuring device and calculate the maximum percent uncertainty of their measurements. However, each activity also had a deeper purpose that led to good class discussions during whiteboarding.

1. Measure the dimensions of a block with a ruler. Deeper purpose: calculate the percent uncertainty of the volume of the block.

2. Measure the width and length of the lab table with a modified meter stick (cm precision). Deeper purpose: how does having to make multiple measurements to measure the length affect the measurement uncertainty?

3. Measure the period of a pendulum with the wall clock. Deeper purpose: how does the percent uncertainty change if 2, 5, 10, or 20 oscillations of the pendulum are measured instead?

4. Measure the temperature of ice water and hot water with a digital temperature probe. Deeper purpose: is the percent uncertainty of the cold-water measurement actually greater than that of the hot-water measurement? How does measuring the temperature differ from all the other measurements (difference vs. absolute)?

5. Measure the time for a ball to drop from the table to the floor and from the ceiling to the floor with a digital stopwatch. Deeper purpose: Are the measurements as precise as the measurement uncertainty of the digital stopwatch (1/100 of a second)?

6. Measure the speed of the cart on the track using a photogate connected to the computer. Deeper purpose: What does the computer actually measure? What determines the measurement uncertainty? Determining the actual uncertainty of a photogate connected to a laptop running Logger Pro via a LabPro is well beyond the scope of this course (although, in my Advanced Physics course, we figure it out). Still, students realizing that computer-based measurements don’t have infinite precision is an important lesson.
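The arithmetic behind activities 1 and 3 can be sketched in a few lines. This is a minimal illustration using hypothetical measurement values (the block dimensions, pendulum period, and clock uncertainty below are made up, not data from the actual activities), applying the rule that maximum percent uncertainties add for a product and that timing more oscillations shrinks the period's percent uncertainty:

```python
def percent_uncertainty(value, uncertainty):
    """Percent uncertainty of a single measurement."""
    return 100.0 * uncertainty / value

# Activity 1: volume of a block measured with a ruler.
# Hypothetical dimensions; uncertainty taken as half the smallest division.
length, width, height = 10.0, 5.0, 2.0   # cm
ruler_u = 0.05                            # cm

# For a product (V = l * w * h), the maximum percent uncertainties add.
volume_pct = (percent_uncertainty(length, ruler_u)
              + percent_uncertainty(width, ruler_u)
              + percent_uncertainty(height, ruler_u))
print(f"volume percent uncertainty: {volume_pct:.1f}%")  # 0.5 + 1.0 + 2.5 = 4.0%

# Activity 3: timing N oscillations divides the period's percent
# uncertainty by N, since the clock's uncertainty stays fixed.
period, clock_u = 2.0, 0.5                # s (hypothetical)
for n in (1, 2, 5, 10, 20):
    pct = percent_uncertainty(n * period, clock_u)
    print(f"{n:2d} oscillations: {pct:.2f}% uncertainty in the period")
```

The second loop makes the deeper purpose of the pendulum activity concrete: going from 1 to 20 oscillations drops the percent uncertainty by a factor of 20, with no better clock required.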

The class discussions that occurred while whiteboarding were fantastic, and this year’s students have a much greater appreciation of measurement uncertainty than those of previous years.

Polar Bears around an Ice Hole

I started some of my classes today with the “Polar Bears around an Ice Hole” riddle:

> The game is in the name of the game – polar bears around an ice hole – invented in the days of Genghis Khan.
>
> A clue for you to keep you true – like petals around a rose, you can count each bear’s nose.
>
> How many polar bears do you see?

You then roll a bunch of dice. (I created six 5″ dice from Styrofoam and black pom-poms.) A physics teacher from another school uses this as his introduction activity on the first day of school and shared the activity a couple of years ago. I had planned on using this activity as an extended analogy to introduce specific aspects of the class culture:

* You may feel frustrated as you try to figure physics out. That’s okay.
* Physics is hard to understand until you know the “rules of the game.”
* But, once you discover the rules, physics often seems easy and you may be surprised that others don’t understand.
* However, remember that you didn’t always understand.
* When you discover the rules and understand without someone just telling you the “answer”, you are excited.
* The journey to understanding is very important. So, no one is going to tell you the answer, but we’re all here to support each other on our journeys.
* Being told the “answer” at most gives you one answer that you didn’t know. Learning to think critically and arrive at the answer with support develops a skill that you will use to find many answers.

As the activity progressed, I realized that this activity also served as an excellent example of scientific inquiry. As we continued to try and solve the riddle, I introduced several important ideas:

* make careful observations
* gather lots of data (many rolls of the dice)
* look for patterns, compare and contrast, look for extremes
* simplify the problem being investigated (roll fewer dice)
* constrain the variables (set dice to specific values)
* propose a hypothesis, test it, modify it based on results, repeat

After discussing the activity, I grabbed my notebook and nonchalantly asked who solved the riddle within the first five minutes. I then announced that they would receive As for today. I then asked who solved the riddle in ten minutes and announced that they would receive Bs. Next, who solved the riddle in fifteen minutes and announced that they would receive Cs. Everyone else would receive Fs. This provided a great hook to transition to our discussion about standards-based grading.

I Grade Homework

Last year, a colleague and I jumped into the standards-based grading pool with both feet in our honors-level physics class. We appreciated that homework was for practice and should not be graded. I was very excited about this departure from the traditional model of checking homework every day and keeping track of completed homework and absences, which wasted valuable class time.

At first, students attempted their homework as assigned. However, before too long, it was apparent that a vast majority of students were not attempting the homework problems before we were to whiteboard them in class. After one particularly ineffective whiteboard session, due to a vast majority of students being unprepared, I attempted to use that experience to illustrate the importance of using the homework problems as practice.

Why did this happen? Did we fail to explain our standards-based grading philosophy? No, I think students appreciated the importance of the learning activities. Students were engaged in learning activities such as labs even though they were not graded. Were the homework problems unnecessary busywork? No, this class moves at a fast pace and, for a vast majority of students, practice outside of class is essential. Students weren’t attempting just the problems they felt they needed to practice; they weren’t attempting any problems.

After this ineffective whiteboard session, a few students with whom I had stronger relationships made a point to talk to me about why they hadn’t attempted their homework. All of them said that they appreciated that they needed to practice these problems. All of them said that they knew that they wouldn’t be able to effectively whiteboard the problems without having at least attempted the homework. All of them knew that eventually they would need to practice in order to do well on the summative assessments. However, all of them also explained that not doing their homework was a conscious decision. They explained that they get home late due to soccer/marching band/play practice. They explained that they have more homework assigned than they can possibly complete in a night (another issue to address). If they don’t complete their math/social studies/other science homework, they lose points, their grade is impacted, their GPA is affected. They believe the only logical choice is not to do their physics homework.

**When other classes assign points to homework, overloaded students that are grade-centric won’t do homework that isn’t assigned points.**

What did we do? We started grading the homework the next semester. We reconciled this change by framing homework as both a learning activity and summative assessment. We continued to whiteboard homework problems (learning activity), but, by the end of the unit, students were required to submit their homework solutions via WebAssign (summative assessment). We used WebAssign since we were able to randomize the numerical values in otherwise identical problems. This allowed students to collaborate but not copy final values.

We haven’t satisfactorily solved the problem of homework. Our current approach is simply the best idea we have at the moment. Over time, this issue may be mitigated as more and more classes in our high school adopt standards-based grading and fewer and fewer teachers grade homework.

If you’ve encountered this problem and are taking another approach, please share! We can always make a change next semester!

I Like Reading Lab Reports

When I first started teaching, I loathed grading lab reports. I had a seemingly never-ending pile of papers almost a foot tall sitting on the front, right-hand corner of my desk glaring at me with that look of “we’re not going anywhere, you know.”

Eventually, I would grab a chunk of labs and my green pen and start reading. I would read each lab and deduct points for errors and omissions. Sure, I would write comments and feedback as well, but I found it challenging to hold both a point-deduction-centric perspective and a feedback-centric perspective in my head at the same time. Even with a rubric, I spent a lot of time debating with myself, “Is this vague statement *close enough* to receive full credit?” “This paragraph shows some understanding, but I can’t tell what this student really understands; -1 or -2?” After grading a lab, I would flip back through each page counting how many points I had deducted. Not being a very number-centric person, I would at times ask myself, “Wait, is this lab out of 20 or 25 points; let me check.” I would then check to see if I had made a note on the lab that it was submitted late, in which case I would also apply a late penalty, which wasn’t always simple: “Okay, the 20th was four days ago, but I think we didn’t have class one day; let me check…. Ah, we didn’t; so, that only counts as three days late.” At times, I would question whether a specific error or omission should result in a one- or two-point deduction and, under the guise of fairness, I would search through previously graded labs to find one that I vaguely remembered having made the same error or omission. I spent hours and hours grading lab reports.

When I finished grading a lab for the entire class, I would hand it back but worry that, once I did so, students who hadn’t turned in the lab would copy a friend’s graded lab and submit it as their own. When I did hand back the labs, I would watch students’ reactions. If I put a grade on the front of the lab, they would look at the grade and apply, perhaps subconsciously, an algorithm that resulted in a positive, ambivalent, or negative feeling and then file the lab in their folder. *Many wouldn’t even flip through the pages to read my comments.* After noticing this pattern, I started writing the grade on the last page. Many students would then skim their graded labs, but they weren’t reading my feedback; they were scanning for the grade that they knew must be in there somewhere. If I forgot to total the points and write the grade on a couple of labs, those students would certainly ask about their grades. However, the students were so entrenched in the grades game that I never had anyone ask for feedback if I didn’t write comments.

**This sucked. Grading labs was my least favorite part of teaching. There had to be a better way.**

Last year, a colleague and I integrated our fledgling standards-based grading philosophy into our honors physics classes. We categorized most of the labs that were previously graded as learning activities, which we defined as “activities that don’t directly affect your grade; they are essential in that they are your opportunity to explore, discover, take risks, make mistakes, ask questions, help each other, practice, and get feedback before having to demonstrate mastery.” We explained this to our students and started our first lab activity. The next day, everyone turned in their labs.

I went home that night and I didn’t grade their labs. I *read* them. As I read them, I wrote comments, asked questions, made minor corrections. I never thought about points. I didn’t calculate a score. It was wonderful.

The next day in class, I handed back the labs. This new standards-based grading methodology was unfamiliar to the students, and many hadn’t internalized the role of these learning activities. I observed some students scanning their labs for a grade. “Mr. Schmit, what is my grade on this?” “It is a learning activity; no grade, just feedback.” As they began to understand that no matter how hard they looked, they wouldn’t find an 18/20 anywhere in their lab, I saw students actually reading my feedback. Some students even asked questions about what I had written.

Lab reports got better. As students embraced the standards-based grading philosophy, they started taking risks because they weren’t worried about losing points. Statements became less vague. Students began to write what they actually thought instead of what they thought was sufficiently generic to earn credit. Some students even started writing questions in their lab reports to ask for clarification. Many times, after a productive class discussion or whiteboarding session, I didn’t feel that I needed to collect the lab reports and provide additional feedback. The students had already provided all of it to each other.

I had a number of goals, hopes, and dreams when I started standards-based grading last year. Liberating students from grading such that they could focus on their learning was one. Liberating myself from grading such that I actually enjoyed reading lab reports wasn’t one of them, but it was a very pleasant surprise.

Feynman the Teacher

I started reading *Six Easy Pieces* by Richard Feynman today. I absolutely loved his autobiographical collection of stories: *Surely You’re Joking, Mr. Feynman!* and *What Do You Care What Other People Think?*. However, I wanted to read something that would give me more insight into Feynman the Teacher. So, I started reading *Six Easy Pieces* since I don’t have time to read the entire *Lectures on Physics* this summer. I’m just getting started, but I found a couple of great quotes in the introductions. Here’s a note he wrote in 1952:

> First figure out why you want the students to learn the subject and what you want them to know, and the method will result more or less by common sense.

Of course what is common sense for Feynman probably isn’t for the rest of us. Given his reputation as a showman and brilliant lecturer, I find his “solution to the problem of education” particularly insightful:

> I think, however, that there isn’t any solution to this problem of education other than to realize that the best teaching can be done only when there is a direct individual relationship between a student and a good teacher — a situation in which the student discusses the ideas, thinks about the things, and talks about the things. It’s impossible to learn very much by simply sitting in a lecture, or even by simply doing problems that are assigned.

My summer inspiration.

How Many Standards?

When my colleague and I started our standards-based grading journey in the Fall of 2009, we started with a list of objectives defined years previously by a now retired teacher. Since our goal was to make minimal changes to the curriculum and focus on changing the methodology for the class, we decided to use these objectives as the starting point for our standards (which we refer to as “targets”).

What I quickly learned is that I needed to know exactly how I would provide multiple learning activities and multiple summative assessments for each and every standard. Our first unit had 26 standards! While several were lab-specific, that was way too many! We immediately appreciated the importance of defining fewer and more general standards.

How many standards are right for a unit; how many for a semester? I think the answer is different for every class, but after a year of experience, I’ve found that seven or eight standards per unit, of which one or two may be lab-specific, works well for our honors-level physics class and students.

I just finished revising the standards for the upcoming Fall semester for this class. I ended up with about sixty standards for the semester. This is a fast-paced class and that is reflected in the number of standards. In comparison, my regular-level physics class will have a little more than half as many standards this Fall.

Am I completely satisfied with the number and granularity of the standards for the Fall semester? It’s definitely a step in the right direction, but, no, I’m not completely satisfied. I think I did the best I could balancing the tradeoff between a manageable number of standards from an assessment perspective and sufficiently specific standards such that students are clear on what they need to understand.

I’m not positive how I’m going to improve this aspect of the methodology, but I think the eventual solution is to move to a two-tier system. The top tier would consist of fewer, higher-level standards that are assessed and reported while being manageable. The second tier would contain many more specific sub-standards (“targets”) that students can readily understand.

Please feel free to leave a comment and share your approach for defining standards.

Reflections on A Framework for Science Education

I just finished reading the National Research Council’s preliminary public draft of [A Framework for Science Education](http://www7.nationalacademies.org/bose/Standards_Framework_Preliminary_Public_Draft.pdf). Since there has been some confusion, I’ll mention that this document is a framework for science and engineering education and not a collection of standards. Standards and curricula will likely be developed in the context of this framework.

As an engineer, I found it refreshing that the framework focuses on both science and engineering and the connections and similarities between them. Given the amount of time I spent as an engineer reading and writing technical documents (and the time I just spent reading this document), I was pleased that one of the practices was reading and analyzing technical documents. As someone interested in the history of science and engineering, the framework confirmed my experience that sharing the historical perspective increases students’ interest in science and engineering.

I don’t know if there was an explicit effort by the framework’s authors to incorporate the principles of the Modeling Methodology, but, regardless, the framework’s practices are closely aligned with it. Both model building and questioning are practices enumerated in the framework. I hope to better incorporate both of these aspects of modeling into my classroom this year.

In the prototype learning progressions, some specific concepts are enumerated. I was surprised by some of the concepts included. The emphasis on waves as a core idea was intriguing since, in my limited experience, sound and electromagnetic waves are not always part of a typical physics curriculum. For example, the prototype learning progressions included the concepts of modulation of electromagnetic waves and diffraction.

Overall, the framework’s architecture of core ideas, cross-cutting elements, and practices and its philosophy of depth versus breadth reinforces the direction that I believe my team is heading in physics. Of course, we’ll have to see how this framework influences the standards and curriculum developed within it.

Introducing the Pedagogue Padawan

Welcome to Pedagogue Padawan!

To learn more about my past, present, and future, please read the [About](https://pedagoguepadawan.net/about/) page.

To summarize, I’m a high school physics teacher hoping to make time to share my reflections on learning to help others learn. I expect to focus on my interests in assessment, engineering, mastery learning, modeling, physics, standards-based grading, and technology.