Tag Archives: modeling

Honors Physics Changes

Several factors combined into a perfect storm that set the stage to make major changes to our Honors Physics course. One, last year was rough and several aspects of class were disappointing. I’m not going to dwell on those here. Two, we have an extra section of Honors Physics this upcoming year and another physics teacher will join my colleague and me in teaching Honors Physics. She is a really good influence on us! Three, we want to pilot the AP Physics 1 course to prepare for the first official year of AP Physics 1/2 in 2014-2015 and prime a pipeline of students ready for AP Physics 2. As a result, we are changing almost every aspect of this course.

First is the curriculum. We are aligning our curriculum to that of AP Physics 1. This changes the emphasis from content to understanding and skills. As a result, we will finally be able to implement [Modeling Instruction](http://modelinginstruction.org/) in Honors Physics! The shift to Modeling Instruction, which we have been using in General Physics for a few years, will have a tremendous impact on these students. We are also taking some of the most successful aspects of my AP Physics B course and incorporating them into Honors Physics. We will have formative quizzes for each unit and we will have peer instruction to focus on conceptual understanding.

This change in curriculum and pedagogy required us to redefine all of our units and materials. All new standards, in-class packets, quizzes, lab activities, lab practicums, and exams. Fortunately, we didn’t have to create too many materials from scratch. We started with Kelly O’Shea’s [Honors Physics Standards](http://kellyoshea.wordpress.com/2011/08/10/honors-physics-2012-objectives/). We used worksheets from the Modeling Workshop along with portions of Kelly’s packets. We used peer instruction questions I compiled for AP Physics B. We combined quiz and exam questions from a variety of sources. We kept our favorite labs and found or created new ones.

We are also trying to incorporate and emphasize certain themes throughout the course. One is growth mindset. Reading Dr. Carol Dweck’s book [Mindset](http://www.amazon.com/Mindset-The-New-Psychology-Success/dp/0345472322/) and Daniel Coyle’s [The Talent Code](http://www.amazon.com/The-Talent-Code-Greatness-Grown/dp/055380684X/) this summer helped me identify the common behaviors and attitudes that lead some physics students, especially honors physics students, to really struggle in the course. I prepared a mini-lesson (upcoming post) to introduce the concepts of fixed vs. growth mindset and deep practice. Another area of focus will be measurement uncertainty in labs. While we have a good set of [measurement uncertainty activities](https://pedagoguepadawan.net/198/updated-measurement-uncertainty-activities/), we don’t sufficiently reinforce these concepts throughout the year. At the most recent QuarkNet Workshop at Fermilab, we heard and discussed how critical it is for students to understand and appreciate the concept of measurement uncertainty.

A good sign that we are on the right track for this revamped Honors Physics course is that I’m excited and looking forward to this class this year. Without these changes, I don’t think I would be saying that….

Mindstorms

I put *[Mindstorms: Children, Computers, and Powerful Ideas](http://www.amazon.com/Mindstorms-Children-Computers-Powerful-Ideas/dp/0465046746)* by Seymour Papert on my reading list when I started teaching AP Computer Science. Being unfamiliar with how best to teach high school students computer science, I figured I needed all the help I could get and heard that *Mindstorms* was the seminal text on how kids learn computing. If I had better understood what *Mindstorms* was about, I would have read it six years ago when I started teaching.

*Mindstorms* isn’t just about teaching kids computer science. I was surprised at how frequently learning physics was a topic. Papert shared insights on everything from what “learning physics” consists of (hint: it is not plugging numbers into equations) to how to support learners’ conceptual intuitions rather than attack their “misconceptions.” I was reminded of everything from Modeling Instruction to computational thinking using VPython as I read those sections.

I was also surprised at how useful *Mindstorms* was as a guide, and a cautionary tale, of the role that technology should play in education. I would recommend it to every teacher interested in leveraging technology to improve learning, every technology integrator, and every administrator who may otherwise approve a purchase order for an interactive whiteboard. It clearly presents how the focus needs to be on the student, on her learning, and not on the technology. A reminder that technology enables us to do better things, not do things better.

*Mindstorms* was written at the advent of the personal computer revolution. Papert was advocating for a revolution in education. While Logo continues to appear in classrooms (my nine-year-old used Logo some in Math class this year), unfortunately, the ideals of *Mindstorms* haven’t been realized and, with few exceptions, technology hasn’t been used to change the culture of education. It is sad to reflect on this history and the opportunity that has been lost. I feel that now thirty-three years later, we are at the advent of another technological revolution. Instead of a personal computer in every home, we have a personal computer in every pocket. However, how we will choose to leverage this technology in the educational sphere remains to be seen. With the proliferation and prominence of MOOCs, flipping, gamification, and Khan Academy, I worry that we will once again fail to seize this opportunity. There are beacons of hope: hackerspaces, [FIRST Lego League](http://www.usfirst.org/roboticsprograms/fll), and [The Big Ideas School](http://www.shawncornally.com/BIG/). Personally, I’m reinvigorated to revolutionize my small sphere of influence through [FIRST Robotics](http://team3061.org), [Physics Club](https://pedagoguepadawan.net/191/inspiring-younger-students-with-near-space-balloons/), and improving physics instruction.

*(I had a slow start reading this book. If you encounter the same, I would recommend skipping the two forewords and the two introductions. In addition, the paperback that I purchased was visually awful. It looked like a printout of a poor scan. If you can find an older copy, your eyes will thank you.)*

Honors Physics Reflection

I previously shared my [end-of-semester reflection](https://pedagoguepadawan.net/160/mechanics-modeling-instruction-reflection/) for my regular physics class. I wanted to do the same for my honors physics class, which is significantly different from my regular physics class. We do not use Modeling Instruction, and it is a fast-paced, problem-solving-focused class. It is basically an AP Physics B class that covers all topics except for fluid mechanics, thermal physics, atomic physics and quantum effects, and nuclear physics. We actually cover some topics beyond the scope of the AP Physics B curriculum. That said, it does have many progressive elements. We are now in our third year of standards-based assessment and reporting. There are no points as it is a mastery-based system. Many labs are not scored but serve as discovery labs through guided inquiry. We leverage some aspects of Modeling Instruction such as whiteboarding and Socratic dialog.

We move through units at a very fast pace. In the fall semester, we covered Giancoli Chapters 1-7 and 9. While the curriculum is “a mile wide,” it isn’t “an inch deep.” The mastery system requires our students to develop a significant understanding of these topics. That said, multiple representations are noticeably lacking. I’m always surprised when I see that graphical representations for kinematics are an optional section in Giancoli (but not in this class).

Since implementing SBAR, I’ve been pleased with the learning that occurs in honors physics despite its more traditional elements. To check if I’m completely misleading myself, I administer the FCI at the beginning and end of the fall semester. This year’s gain was 0.58 which was just a tad lower than the gain of 0.60 the previous two years.
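For readers unfamiliar with how such gains are computed: I'm assuming these are Hake's normalized gains, the fraction of the possible improvement that the class actually realized between pre-test and post-test. A minimal sketch (the function name and example scores are my own illustration):

```python
# Hake's normalized gain: (post - pre) / (100 - pre), where pre and post
# are class-average test scores expressed as percentages (0-100).
def normalized_gain(pre_avg: float, post_avg: float) -> float:
    if pre_avg >= 100:
        raise ValueError("pre-test average must be below 100%")
    return (post_avg - pre_avg) / (100 - pre_avg)

# Example: a class averaging 40% on the FCI pre-test and 75% on the
# post-test realized 35 of the 60 available percentage points.
g = normalized_gain(40, 75)
print(round(g, 2))  # 0.58
```

A gain near 0.6 is notable: in Hake's terminology anything above 0.3 is considered "medium gain," typical of interactive-engagement courses rather than traditional lecture courses.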

My reflection regarding honors physics this fall has been focused on why the structure of the class seems to be working. Should I be satisfied with the degree to which students are replacing and refining their preconceptions about mechanics? Would I see a deeper level of understanding if I moved to Modeling Instruction? At what cost?

While musing on these questions, I thought back to my own experience in high school and college. As best I can recall, I learned physics in mostly traditional classrooms. How was it that I developed a decent understanding without many misconceptions in these environments?

The conclusion that I have arrived at is that I perform a mini modeling discourse and model building with myself as I listen to a lecture or practice solving problems. I have an ongoing commentary in my head where I’m asking myself questions that connect one idea to the next, finding patterns, building models, testing models, refining models. I never was, and still am not, good at memorizing stuff; so, I had to construct and derive solutions on the fly.

I appreciate that not all of my students in honors physics do this, but I believe that many do. Whenever I hear that students cannot learn from lecture, I wince a bit since I believe that some students can. I think that those that can intrinsically do what many progressive pedagogies do explicitly with the entire class.

I don’t think that the current structure of honors physics is perfect by any means. While we are going to make some minor SBAR-related changes this semester (post coming soon), I don’t anticipate any major changes next year. Instead, I’m going to focus my efforts on preparing for a new AP Physics B class that I will be teaching. Furthermore, before I make any significant changes to honors physics, I want to see the new AP Physics B curriculum. I have a feeling that it will require significant changes to honors physics if not replace the course entirely. That will provide an opportunity to reassess all of these ideas.

If you think I’ve missed something major in my analysis, please don’t hesitate to call it out. Likewise, if you’ve come to a similar conclusion, I’d appreciate the reinforcement.

STEM Talk at NI

Yesterday, I had the honor of presenting my experiences this past summer working on the Fermilab Holometer as well as my perspectives on STEM education at the high school level at National Instruments. Since my contribution to the Holometer project used National Instruments products and my family was vacationing in Austin, Texas, I offered to visit and share my experiences. I was a bit surprised when I was also asked to share my perspectives on STEM education in high school.

My presentation about the Holometer was pretty much the same as the one I gave the [Global Physics Department](http://globalphysicsdept.posterous.com/geoff-schmit-on-research-for-hs-teachers). (I’ve [written several posts](https://pedagoguepadawan.net/holometer) about the Holometer.) I added more technical details on the NI products involved and how the signal analysis was performed to better match the audience.

At first, I didn’t feel qualified to address National Instruments employees, who work for a company that is an amazing supporter of STEM in K-12 through its efforts with FIRST and LEGO. As a result, I started my presentation with disclaimers:

* I do not have a master’s degree in STEM education
* I am not a STEM education expert
* I have not attended conferences and workshops in STEM education
* I have taught at one high school for five years

However, once I sat down and started thinking about what I would share, I realized that I, like most physics teachers, am qualified to at least share my perspective because:

every morning I get up and try to inspire students in science, technology, engineering, and mathematics by leveraging my experience as an engineer, an interviewer, a supervisor, and a teacher.

In my case, I specifically left National Instruments and software development to become a physics teacher to make some small contribution by inspiring students to pursue studies and careers in STEM-related fields.

I structured my presentation around three high-level themes which I elaborated with photos, videos, and stories:

**Inspire Students with Experiences**

I shared that few students are inspired because of something they only read or hear or see; they are inspired by their experience doing it. I shared the experiences of my FIRST Robotics Team, Science Olympiad Team, and Physics Club. Physics Club is an after school, student-driven, low-commitment group that allows all students opportunities to play, inquire, create, share, and explore. I shared our past experiences with [near-space ballooning](https://pedagoguepadawan.net/60/nearspaceballoon/) and the [ping pong ball cannon](https://pedagoguepadawan.net/157/pingpongballcannon/). The second theme is:

**Inspire Younger Students with Older Students**

The main ideas for this theme are that students respond best to other students and that students can lose interest in science during middle school. To address this, Physics Club and the FIRST Robotics Team perform outreach activities where younger students see projects done by the older students and build their own smaller-scale projects with the assistance of older students. The third theme is:

**Inspire the other 98% in the Classroom**

I was somewhat disappointed when I realized that all my efforts with FIRST Robotics, Science Olympiad, and Physics Club only involve 2% of the students at my school. I shared that this is a significant challenge but the most important theme. Many changes to a traditional classroom are required to inspire students about STEM:

* Change Perceptions
* Change Mindset
* Change Pedagogy
* Change Culture

I shared the importance of bringing professionals into the classroom to share their experiences and of helping students appreciate that science is an active process done by real people. Despite significant local press about standards-based assessment and reporting, I shared how critical it is in my classrooms. I talked about Modeling Instruction, guided inquiry, project-based learning, and Project Lead the Way.

At the end, I felt compelled to take advantage of this opportunity to encourage those in attendance to help inspire students about STEM. I charged them to:

* Be Aware
* Promote Reform
* Provide Support

I was honestly surprised at the level of interest in my presentation based on the attendance and the number of positive comments afterward. So, for those of you like me who are career changers, if the opportunity presents itself, share your experiences as a teacher with your former colleagues. We may gain more allies in the challenges that we face every day.

Mechanics Modeling Instruction Reflection

I just finished my second year of Modeling Instruction for mechanics in my regular physics class.

While I attended a mechanics modeling workshop a few years ago, I remember when I first decided to jump into modeling with both feet. I was looking at a problem involving electromagnetic induction that required use of the equation F = BIl. All students had to do was find three numbers, one in units of tesla, one in amps, one in meters, and multiply them together without any understanding of physics. This was reinforced when I saw students in the next question trying to solve for resistance using Ohm’s Law and plugging in a velocity instead of a voltage. Many of my students weren’t understanding physics; they were learning to match variables with units and plug and chug. Our curriculum was much wider than deep and I felt that I had to make a change.

Fortunately, my desire to change the emphasis of the curriculum coincided with a county-wide effort to define a core curriculum for physics. While it wasn’t easy, the team of physics teachers at my school agreed that we had to at least cover the core curriculum as defined by the county effort. This was the opportunity to reduce the breadth of the curriculum, focus on understanding and critical thinking, and use Modeling Instruction for mechanics.

I felt that the first year of Modeling Instruction was a huge improvement in terms of student understanding. This past semester was even better. While just one measure, FCI gains reinforce my beliefs. In 2009, the year before introducing Modeling Instruction, my students’ average FCI gain was 0.33. In 2010, the first year of Modeling Instruction, it was 0.43. This year, the FCI gain was 0.47. While I don’t credit Modeling Instruction as the sole factor that produced these improvements in students’ conceptual understanding, it is probably the most significant. We also started standards-based assessment and reporting in 2010 and, hopefully, I’m improving as a teacher in other ways. For me, the most important confirmation that I was on the right path was that I couldn’t imagine going back to the way that I was teaching before.

The three most important changes that I made this year were: [goalless problems](http://quantumprogress.wordpress.com/2010/11/20/goal-less-problems/), sequencing of units (CVPM, BFPM, CAPM, UBFPM, PMPM), and [revised Modeling Worksheets](http://kellyoshea.wordpress.com/physics-materials/) based on the work of [Kelly O’Shea](http://kellyoshea.wordpress.com/), [Mark Schober](http://science.jburroughs.org/mschober/physics.html), and Matt Greenwolfe.

There is still plenty of room for improvement, however. Pacing was a big issue. We still have to finish mechanics in one semester. As a result of the time spent in other units, I really had to rush energy and momentum. While students could connect many concepts in the momentum unit to previous models, energy was completely different. However, this experience had a silver lining in that it may provide hope for other teachers who want to adopt Modeling Instruction but are concerned that they won’t have time to cover their curriculum. I decided at the beginning of the semester that I would spend the time I felt was needed on each unit to develop the underlying skills of critical thinking, problem solving, and conceptual understanding. When I got near the end of the semester and had to fly through energy, I didn’t introduce it as another modeling unit. Instead, I presented it to the students as another representation of mechanics. I encouraged them to apply their critical thinking and problem solving skills to this different approach. I was pleasantly surprised when they did as well as previous years’ classes on the energy summative exam despite the incredibly short amount of time we spent on the unit. I think this supports the idea that students versed in Modeling Instruction have a strong foundation that allows them to readily understand unfamiliar topics as well as, if not better than, students who covered those topics in a traditional fashion.

Whiteboarding continues to be an area that requires improvement. I made a couple of changes that improved the level of discourse among students. When whiteboarding labs, I either explicitly jigsawed the lab activities or guided groups to explore different areas such that each group had unique information to present to the class. This variety improved engagement and discussion. When whiteboarding problems, we played the mistake game on several occasions. This too increased engagement and discussion. However, I feel that I still have a long way to go to achieve the Socratic dialog that I believe is possible.

Next fall, I will dramatically shorten the first unit which focuses on experimental design and analysis. I will probably still start with the bouncing ball lab but then immediately move onto the constant-velocity buggies. That should allow enough time to explore energy and momentum in a more reasonable time frame.

At least I feel like I’m on the right path.

The Danger of Misapplying Powerful Tools

When I was a software engineer, I frequently used powerful tools such as C++ and techniques such as object-oriented analysis and design to implement software that performed complex operations in an efficient and effective manner. I also spent a lot of time sharing these with others. However, I learned to provide a caveat: if misapplied, these tools and techniques can result in a much more significant problem than would result when applying less powerful ones. That is, if you are not skilled in the deployment of these tools and techniques, the risk is much larger than the benefit.

Other engineers didn’t always appreciate this caveat. So, I would try to communicate with an analogy. You can build a desk with a saw, hammer, screwdriver, and drill. You can build a desk more efficiently using a table saw, drill press, and nail gun. If you make a mistake with the hammer, you may lose a fingernail. If you make a mistake with the table saw, you may lose a finger. If you are not adept at deploying the tools and techniques, maybe you should stick with the hand tools until you are.

In reality, the risk of misapplying these tools and techniques is more significant than the impact on the immediate project. The broader risk is that others who observe the troubled project associate the failure with the tools and techniques instead of the application of those tools and techniques. People get the impression, and share their impression, that “C++ and object-oriented analysis and design is a load of crap. Did you see what happened to project X?” Rarely do people, especially people not skilled with these tools and techniques, have the impression that the problem is the application of the tools and techniques rather than the tools and techniques themselves. This, in fact, is a much more serious risk that threatens future applications of the tools and techniques in a proficient manner due to their now tarnished reputation.

A series of articles and posts recently reminded me of my experience writing software and this analogy. I feel compelled to start with a disclaimer since this post has the potential to come across as arrogant, which is certainly not my intention. I have not performed any longitudinal studies that support my conclusions. My conclusions are based on a few observations and my gut instinct. I tend to trust my gut instinct since it has served me well in the past. So, if you find this post arrogant, before you write me off, see if these ideas resonate with your experience.

**SBAR**

Let’s start with Standards-Based Assessment and Reporting (SBAR) (a.k.a., Standards-Based Grading (SBG)). Last year, my school started [adopting SBAR school-wide](https://pedagoguepadawan.net/23/growingsbarschoolwide/). SBAR is a powerful methodology that requires proficient deployment. It is not easy to adapt SBAR to a classroom in a way that resonates with parents, students, teachers, and administrators. Proper deployment requires a fundamental change in the teacher’s and students’ philosophy of learning. While the effect of a failed deployment on the individual classes is unfortunate, the larger problem is that teachers and parents attribute the problems to SBAR and not its application. It takes much less effort to convince a parent confused about SBAR of its value than it does to convince a parent livid about SBAR due to a poor experience in another class. At my school, one early SBAR adopter stopped referencing SBAR or SBG at all in his class to distance his methodology from the problematic applications. Fortunately, my school has pulled back a bit this year. This is the risk of mandating application of a powerful tool by those not proficient in its deployment. This is not [a unique experience](http://t-cubed-teaching.blogspot.com/2011/10/sbg-goes-up-in-smoke.html).

Two years ago, another teacher and I decided to try to apply SBAR to our Honors Physics class. We mitigated the risk by limiting deployment to six sections of a single class taught just by the two of us. We sent letters to parents, talked to parent groups, and discussed the system with students during class. Only after gaining a year of experience did we attempt to adapt SBAR to our General Physics class, which contained ten sections and was taught by four different teachers. The risk of trying to deploy SBAR on this scale initially was too great given our proficiency.

**Technology**

Someone recently shared [this New York Times article](http://www.nytimes.com/2011/09/04/technology/technology-in-schools-faces-questions-on-value.html?_r=2&pagewanted=all) that questions the value of technology in the classroom. In general, a given piece of technology on its own isn’t inherently effective or ineffective. Whether technology is effective depends as much on its application as on the technology itself. It depends on the teacher and the students and the class. Personally, I’ll stick with my [$2 interactive whiteboards](http://fnoschese.wordpress.com/2010/08/06/the-2-interactive-whiteboard/). This isn’t because SMART Boards are inherently ineffective. It is because they aren’t effective for me and my students given my classroom and my expertise. I expect there are teachers out there who use SMART Boards quite effectively. They are probably sick of hearing how they are a complete waste of money.

I hope to have a class set of iPads at some point this year. My school isn’t going to buy iPads for every student. Instead, we’ll put iPads in the hands of 25 General Physics students in my classroom and see what we can do together. Start small, reflect, adjust, expand.

**Modeling**

I participated in a [Modeling Instruction Physics](http://modeling.asu.edu/) workshop in the summer of 2008. I didn’t dare to really start modeling in my classroom until last fall. Why? I believed that the potential risk to my students due to a misapplication of the modeling methodology was tremendous. I decided that it was better for my students to learn what they could via more traditional instruction than to risk what I foresaw as a potential disaster if I misapplied modeling. Even more importantly, I was concerned that I could put Modeling Instruction at risk of never being adopted if my failed deployment was interpreted as a failure of Modeling Instruction itself. Only after more research, practice of Modeling Instruction techniques, and discussions with others did I feel comfortable deploying Modeling in my class last fall. In an attempt to shield modeling from my potential deployment failures, this is the first year that I’ve associated the label “Modeling Instruction” with my class.

I used to be surprised at how adamantly some Modelers warned teachers not to do Modeling Instruction unless they had taken a workshop. I now believe they are worried about the same potential risk that I am. Modeling Instruction is a collection of powerful tools and techniques. Done well, by a skilled practitioner, Modeling Instruction can be incredibly effective. Applied ineffectively, Modeling Instruction can be a disaster and tarnish its reputation. I think students are better served by traditional instruction than by Modeling Instruction applied ineffectively. Traditional instruction may result in a lost fingernail. Ineffective modeling instruction may result in a lost finger. There, I said it. Disagree in the comments. Just don’t take that quote out of context.

While not directly related to modeling, I believe [this recent article](http://www.palmbeachpost.com/news/schools/science-teachers-at-loxahatchee-middle-school-strike-back-1916851.html?viewAsSinglePage=true) supports my conclusions. The problem isn’t that hands-on labs are ineffective, it is that ineffective deployment of hands-on labs is ineffective.

**Conclusion**

I don’t want my thoughts that I’ve shared here to paralyze you into inaction. Rather, I hope that I’ve encouraged you to make sure that you have sufficient expertise so you can apply your powerful tools and techniques in an effective manner. Your students will benefit and the reputation of these powerful tools and techniques will benefit as well.

How do you do this?

* Attend professional development opportunities (e.g., [Modeling Instruction Workshops](http://modeling.asu.edu/MW_nation.html)) that increase your skill with these powerful tools and techniques.
* Apply these powerful tools and techniques in a limited manner as you gain experience and expertise.
* Participate on Twitter, start a blog, read a bunch of blogs, participate in online discussions (e.g., [Global Physics Department](http://globalphysicsdept.posterous.com/#!/)), and subscribe to email lists to accelerate your knowledge of these powerful tools and techniques.
* Observe [skilled practitioners](http://quantumprogress.wordpress.com/2011/08/25/my-grading-sales-pitch/) of these tools and techniques, [find a coach](http://quantumprogress.wordpress.com/2011/10/06/taking-my-pln-to-the-next-level—virtual-coaching/) to observe you, welcome feedback from everyone.

Next-Time Questions

One of my favorite resources for developing conceptual understanding of physics is Paul Hewitt’s Next-Time Questions. Older ones are [hosted by Arbor Scientific](http://www.arborsci.com/Labs/CP_NTQ.aspx), and every month a new one is published in [The Physics Teacher](http://tpt.aapt.org/).

These questions often appear deceptively simple. However, a student’s first impression is often incorrect. I find that these are a great way to discuss and refine preconceptions. These questions are intended to be presented during one class and not discussed until the next. I always have students who are so excited to share their answer that they are practically bouncing in their seats. I have to remind them that these are “next-time” questions and, therefore, we will discuss them the next time we meet. I encourage them to discuss them with their friends over lunch or after school.

Hewitt implores us to use them as he intends:

> Although these are copyrighted, teachers are free to download any or all of them for sharing with their students. But please, DO NOT show the answers to these in the same class period where the question is posed!!! Do not use these as quickie quizzes with short wait times in your lecture. Taking this easy and careless route misses your opportunity for increased student learning to occur. In my experience students have benefited by the discussions, and sometimes arguments, about answers to many of these questions. When they’d ask for early “official” answers, I’d tell them to confer with friends. When friends weren’t helpful, I’d suggest they seek new friends! It is in such discussions that learning takes place.

Here is one that I recently used during the Balanced Force Particle Model unit.

Next-Time Question

The next time my class met, the discussion of this question consumed almost the entire class time. The discussion started with a review that the forces must be balanced since the book is at rest (the special kind of constant velocity where the velocity is zero). We practiced drawing the free-body diagram for the book, which was a good review of the force of friction and the normal force. We were just beginning to explore vector components, and this was a great introduction since the force from the woman’s hand is directed both upward and to the right. We then debated if the force of friction should be directed upward or downward. Students had valid arguments for each. Another student asked if there was a force of friction at all. Eventually, we drew three different free-body diagrams for the cases where there is no friction, where there is friction directed upward, and where there is friction directed downward. A fantastic discussion all centered around a single drawing and simple question.
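The equilibrium conditions behind that debate can be written out explicitly. Assuming a push of magnitude F at angle θ above the horizontal, the wall's normal force must balance the horizontal component, and friction supplies whatever vertical force gravity demands beyond the push's vertical component; the sign of that leftover tells you which way friction points. A sketch (the function name and numbers below are my own illustration, not from the original question):

```python
import math

# Static equilibrium of a book of mass m (kg) pressed against a wall by a
# push of magnitude F_applied (N) at angle theta_deg above the horizontal.
# Returns (normal_force, friction_force): friction > 0 acts upward,
# friction < 0 acts downward, and friction ~ 0 means none is needed.
def wall_book_equilibrium(m, F_applied, theta_deg, g=9.8):
    theta = math.radians(theta_deg)
    normal = F_applied * math.cos(theta)            # wall balances horizontal push
    friction = m * g - F_applied * math.sin(theta)  # whatever gravity leaves over
    return normal, friction

# A 1.0 kg book pushed with 15 N at 30 degrees: the vertical component of
# the push is 7.5 N, the weight is 9.8 N, so friction acts upward at 2.3 N.
N, f = wall_book_equilibrium(1.0, 15.0, 30.0)
```

Raising the angle (or the push) until the vertical component equals the weight reproduces the no-friction case the students identified, and pushing harder still flips friction downward.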

Some time ago, I reviewed every next-time question, downloaded those that aligned with concepts we cover, and copied them into unit folders so I would remember to use them when the time was appropriate. Now, I just review each month’s next-time question in The Physics Teacher and file it appropriately.

Give one a try in class. I think you and your students will love it.

CV Buggy Lab

Last week, I participated in a great discussion on Twitter about the various ways Modelers perform the Constant-Velocity Buggy Lab in their classrooms. The CV Buggy Lab is the paradigm lab for constant-velocity and, as a result, Modeling classrooms are filled with toy cars in the fall. I’m not sure why, but it seems that the red cars are always configured to go “fast” and the blue cars configured to go “slow”.1

CV buggies

We’ve always done a CV buggy lab, even before I started modeling, but this year we did something different. To provide some context, before we do the CV buggy lab, students have already completed a mini-modeling cycle involving the bouncing ball and explored non-linear relationships with the sliding box of mass and rubber bands. We have also briefly discussed the concept of position in terms of specifying the location of something relative to a commonly defined point. For example, “my chair is 5 floor tiles from the south wall and 10 floor tiles from the west wall.” Another teacher and I were discussing that, since students were rocking these labs, our typical buggy lab that involves only one car might not be as engaging or beneficial. She decided to have students work with both cars from the start. I thought this was a great idea and decided that I also wanted each group to analyze a different scenario, which would make the post-lab whiteboard discussion more interesting.

As a class, we go through the usual process of making observations, determining what we can measure, and, eventually, coming up with the purpose for the lab:

To graphically and mathematically model the relationship between position and time for two buggies traveling at different speeds.

At this point, I had to constrain the lab more than I usually would by specifying the starting position and direction for each car. I assigned each lab group a scenario (this allowed some degree of differentiation in terms of difficulty):

1. red positive direction, blue negative direction; red at 0 m, blue at 2 m
2. red positive direction, blue negative direction; red at -1 m, blue at 1 m
3. red negative direction, blue positive direction; red at 2 m, blue at 0 m
4. red positive direction, blue positive direction; red at 0 m, blue at 0.5 m
5. red positive direction, blue positive direction; red at -1 m, blue at -0.5 m
6. red negative direction, blue negative direction; red at 2 m, blue at 1.5 m

Their homework was to draw a picture of their scenario and brainstorm on how they would design the experiment.
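Each scenario is an instance of the constant-velocity model, x(t) = x₀ + vt, applied to both buggies. A short sketch of scenario 1 (the 0.5 m/s and 0.25 m/s speeds are invented for illustration; students measure their own) predicts when and where the two buggies meet:

```python
def position(x0, v, t):
    """Constant-velocity model: x(t) = x0 + v * t."""
    return x0 + v * t

def meeting_time(x0_a, v_a, x0_b, v_b):
    """Time at which two constant-velocity buggies share a position
    (None if their velocities are equal and they never meet)."""
    if v_a == v_b:
        return None
    return (x0_b - x0_a) / (v_a - v_b)

# Scenario 1: red starts at 0 m moving in the positive direction,
# blue starts at 2 m moving in the negative direction (speeds made up).
t = meeting_time(0.0, 0.5, 2.0, -0.25)
print(t, position(0.0, 0.5, t))  # the buggies collide about 1.3 m down the track
```

The intersection of the two lines on the position-vs.-time graph is exactly this meeting point, which is what makes the shared set of axes in the whiteboard discussion so productive.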

The next day, groups designed their experiment. I didn’t provide any additional restrictions. I only verified that their pictures matched the scenarios that I had specified. Some groups decided that their independent variable would be time; others, position; others, distance. One group decided to gather data from both cars at the same time! Another group taped a marker to the back of each car, which traced the cars’ paths on butcher paper and allowed them to make more accurate measurements of the actual distance traveled.

When groups started graphing their data, I requested that they plot time on the horizontal axis. Some objected and remarked that if time was their dependent variable it should be plotted on the vertical axis. I explained that I wanted all the groups to be able to share their results which would be easier if we used a common set of axes. I reassured them that the graph police would not come and get them for plotting their dependent variable on the horizontal axis. (Anyone know why this is the convention?)

Some expected and unexpected issues emerged as students began to graph their data. As expected, those groups who chose to measure distance instead of position soon realized that their graph wasn’t going to convey everything they wanted. They went back and, using their picture, calculated positions corresponding to each distance. We use LoggerPro for graphing, and those groups who made time their independent variable simply added a new column for the position of the second buggy. LoggerPro makes it super simple to graph multiple sets of values on the vertical axis (click on the vertical axis label and choose More…). However, those groups that made position their independent variable had more trouble since LoggerPro only allows one column to be plotted on the horizontal axis. These groups required more assistance and, in the end, I discovered that it was best to create two data sets and name the time columns identically for each. LoggerPro would then plot this “common” time column on the horizontal axis and the two position columns on the vertical axis. Not super simple, but doable.

2 data sets in LoggerPro

Each group drew their picture, graph, and equations on a whiteboard. We did a “circle whiteboard” discussion rather than having each group formally present their results. At first, the discussion focused on how the graph described the motion of the buggies. As students became more comfortable with those ideas, the discussion shifted to comparing and contrasting the different whiteboards. This was the best whiteboard discussion for the CV Buggy Lab that I have ever had. At the end of class, I confidently shared that their whiteboards captured everything that we would learn about constant velocity. We just needed more time to digest, appreciate, and refine what they had already created.

I’ll definitely do this again next year, but I hope to find a way to not assign each group a scenario and yet still end up with a variety of initial positions, directions, and relative motion. Perhaps, if I ask each group to design their own scenario, I can subtly encourage small changes to ensure the variety still exists. Plus, students usually create scenarios that I never would consider!

1 There are many ways to make the blue buggy slow. I have used wooden dowels wrapped in aluminum foil and wooden dowels with thumbtacks and wire. Others have shared that they use dead batteries, electrical tape, and aluminum foil. This year, I tried something completely different. I found these wires with magnetic ends while cleaning last spring (I have no idea who sells them). While in previous years it seemed that in every class someone’s blue buggy had an intermittent connection, I had no problems at all this year.

making a slow car

The Preconception Eliciting Tennis Ball

After investigating the motion of a falling object, I ask my students to draw position vs. time, velocity vs. time, and acceleration vs. time graphs of a ball that is thrown upward and then caught at the same height. As I walk around the room, most students have the position vs. time graph correct but struggle with the velocity vs. time and the acceleration vs. time graphs. For those students who struggle, the most common sketch of the velocity vs. time graph is a ‘V’ rather than a straight line with a negative slope. They then struggle to reconcile an acceleration vs. time graph with this V-shaped velocity vs. time graph.

I then model how I reason through these types of conceptual problems. I hold the tennis ball in my hand and ask, “Immediately after I release the ball, in which direction is it moving?” (They confidently say “up.”) I ask, “Immediately after I release the ball, is it moving fast or slow?” (They confidently say “fast.”) I then encourage them to plot that point on their velocity vs. time graph. I then ask while climbing on top of a lab stool, “As the ball travels upwards, how does its velocity change?” (They confidently say “it slows.”) While holding the ball near the ceiling, I ask, “When the ball is at its peak, what is its velocity?” (They confidently say “zero!”)

I now expose their preconception by immediately asking, “What is its acceleration?” (The answers are split between “9.8 m/s/s” and “zero!” depending on the class.) I keep the ball near the ceiling and ask one of the students who enthusiastically answered “zero!”, “If its acceleration is zero and its velocity is zero, what would happen to the ball?” After some thought, the student realizes that the ball wouldn’t fall. I then release the ball and it sticks to the ceiling.

This demonstration, due to its humor and unexpected outcome, appears to be sufficiently memorable that students can replace their preconception about the acceleration of an object at its peak. After some laughs, a reference to all the balls that are not suspended in midair over the tennis courts, and an [xkcd comic](http://xkcd.com/942/), I continue demonstrating how I reason through the creation of velocity vs. time graphs. I ask the final part, “When the ball is about to be caught, in which direction is it moving?” and “Is it moving fast or slow?” I encourage them to plot this final point and then they have replaced the V-shaped graph with the proper velocity vs. time graph. The slope of their corrected velocity vs. time graph confirms that the acceleration of the ball must remain constant. The tennis ball spends the rest of the class period stuck to the blackboard.
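The same reasoning can be written as kinematics (up positive, ball released at y = 0): the velocity graph is a single straight line, v = v₀ − gt, not a ‘V’, and the acceleration stays −9.8 m/s² for the whole flight, including at the peak. A minimal sketch:

```python
def ball_motion(v0, t, g=9.8):
    """Position, velocity, and acceleration of a ball thrown straight up
    with initial speed v0 (up is positive, released from y = 0)."""
    y = v0 * t - 0.5 * g * t ** 2
    v = v0 - g * t           # straight line with slope -g, not a 'V'
    a = -g                   # constant the whole flight, even at the peak
    return y, v, a

# At the peak (t = v0 / g) the velocity is zero but the acceleration is not.
y, v, a = ball_motion(9.8, 1.0)
print(y, v, a)  # 4.9 0.0 -9.8
```

If the acceleration really were zero at the peak, `v` would stay zero from then on and the ball would hang in midair, which is exactly what the magnet makes it appear to do.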

We have a group of Physics teachers who meet at an area school monthly and share ideas. I learned this demo from a great Physics teacher at one of these meetings. He has practiced enough that he can throw the tennis ball and have it stick. He showed us how to modify a tennis ball:

tennis ball demo materials
*Materials: Neodymium magnets, tennis ball, utility knife, hot glue gun.*

magnet glued inside tennis ball
*Slice the tennis ball, squirt in a bunch of hot glue, and stick in the magnet.*

tennis ball sealed
*Seal the slit in the tennis ball and let harden.*

tennis ball experiencing no acceleration
*Stick the tennis ball on the ceiling!*

Teaching Energy

For the last couple of years, I’ve approached teaching energy from a conservation of energy perspective, deemphasized work, and focused on energy storage modes and transfer mechanisms. I think this has been very helpful for students, at least compared to starting with work and the work-energy theorem like I used to do. They understand the analogy as I pour water from the gravitational potential energy beaker into the kinetic energy beaker as the cart rolls down the ramp. Students seem to more readily appreciate the idea that energy is always conserved, and, if a system doesn’t have as much energy as it used to have, we simply need to find where it was transferred. It’s like a mystery.

This year, I’m trying to leverage as much of the [modeling methodology](http://modeling.asu.edu/) as I possibly can, which includes energy pie charts and bar charts. As usual, I started conceptually and avoided numbers. We drew energy pie charts for various scenarios. Here’s an example from the Modeling curriculum:

energybarchart.png

Students readily understood and easily created these visual models and seemed to appreciate that they could actually handle real-world aspects like friction. If an object was sliding across the floor, we would include the floor in our system so that the total energy in our system, and, therefore the size of the pie chart, would remain constant as energy is transferred from kinetic energy storage mode to the internal energy storage mode. No problems here.

We then moved to energy bar charts but continued to postpone introducing numbers in Joules calculated from equations. Students had little trouble with this visual representation. For the object sliding across the floor scenario, most groups continued to include the “surface” as part of their system such that the total energy in the system remained constant and no energy flowed out of their system. For a scenario where someone pushes a box up a ramp, some groups wanted to include the person in their system, but after a discussion of the complex energy transfers that occur within the human body, they decided to keep people out of the system and include energy flowing into the system.

We started having problems when we started calculating specific energies. Students continued to want to account for energy being transferred to the internal energy storage mode. So, for example, when asked to calculate “the average force exerted by a ball on a glove,” they would get stuck trying to calculate how much of the kinetic energy of the ball is transferred to the internal energy of the ball and how much is transferred out of the system by working. I felt like an idiot when my response was, “well, since we don’t have a model that can help us calculate how much energy is transferred to the internal energy of the ball and how much energy is transferred outside of the system, we’ll have to assume that all of the energy is transferred outside of the system.” The students looked at me with that expression of, “you have got to be kidding me; if that is the case, why have we been including internal energy all this time?”
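Under that assumption, the ball-in-glove problem reduces to F_avg · d = ½mv²: all of the ball’s kinetic energy leaves the system by working over the stopping distance. A sketch with invented numbers (roughly a baseball at 40 m/s stopped over 10 cm):

```python
def average_force(m, v, d):
    """Average force needed to stop a ball of mass m (kg) moving at
    speed v (m/s) over distance d (m), assuming all of its kinetic
    energy is transferred out of the system by working:
    F_avg * d = 1/2 * m * v**2."""
    return 0.5 * m * v ** 2 / d

print(average_force(0.145, 40.0, 0.10))  # about 1160 N
```

The simplification is exactly the one I apologized for: the model lumps the energy warming the ball, the glove, and the air into a single transfer out of the system, because we have no way to split it up.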

Basically, we stopped including internal energy in our quantitative energy bar charts and always had energy be transferred out of the system. With the aid of this visual model, students would consistently solve relatively complicated roller coaster problems without making the typical mistakes. I could honestly tell my classes, “those of you who drew the energy bar charts solved this problem correctly, and those of you who didn’t bother, didn’t.” Despite this clear improvement over previous years, not having a clear rationale for why we handled internal energy differently in the quantitative bar charts compared to the conceptual visual models was disappointing. I’m sure the students were confused by this.
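For those roller coaster problems, the quantitative bar chart translates into a single conservation equation, ½mv₁² + mgh₁ = ½mv₂² + mgh₂ + E_out, with any dissipated energy always flowing out of the system. A sketch under that assumption (the function name and numbers are mine):

```python
def coaster_speed(m, v1, h1, h2, E_out=0.0, g=9.8):
    """Speed at point 2 from the energy bar chart:
    1/2 m v1^2 + m g h1 = 1/2 m v2^2 + m g h2 + E_out,
    where E_out (J) is energy transferred out of the system by working."""
    ke2 = 0.5 * m * v1 ** 2 + m * g * (h1 - h2) - E_out
    if ke2 < 0:
        raise ValueError("not enough energy to reach point 2")
    return (2 * ke2 / m) ** 0.5

# 500 kg car released from rest at 20 m, reaching the bottom of the hill
# with 10 kJ transferred out along the way (numbers invented).
print(coaster_speed(500.0, 0.0, 20.0, 0.0, E_out=10000.0))
```

Setting `E_out = 0` recovers the frictionless case, so the same sketch covers both versions of the bar chart.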

Suggestions for next year?