I typed up solutions to the 2016 AP Computer Science free response questions and shared them with my students; I thought that others may be interested as well. The zip file includes a BlueJ project file and test code to verify the solutions. As I tell my students, there is no guarantee that I wrote perfect solutions, and there are multiple ways to answer these questions.
AAPTWM16: Blueprints for Accessible and Affordable High-Altitude Ballooning
Mark Rowzee and I spoke at the American Association of Physics Teachers (AAPT) 2016 Winter meeting as part of Session EI: Quadcopters, Drones and High Altitude Balloons. Our talk was “Blueprints for Accessible and Affordable High-Altitude Ballooning.”
**Abstract**: From the moment you release your first high-altitude balloon, you are stricken with an unsettling combination of joy and terror, so we’ll provide you with the blueprints for success. It is relatively easy to launch a high-altitude balloon; it requires much more planning, resources, and luck to get it back. We will share our experiences designing, launching, and recovering high-altitude balloons over the past six years. We will share the science that can be done with a variety of student age groups (elementary, junior high, and high school). We will share the materials necessary for a successful launch and recovery for a variety of budgets. We will share the safety precautions that are required. Finally, we have photos, videos, resources, and stories that we hope will inspire you to conduct your own launch.
* [slides from talk (including all extra slides)](https://drive.google.com/file/d/0B5RGLWvvovYUdzhRVEM5M0Y2dmc/view?usp=sharing)
* [useful and interesting links for space ballooning](https://docs.google.com/document/d/1-FDhuwiNHHLUaVpsAPedmm2R5nku2K9l28nGm23Fglw/edit?usp=sharing)
* [packing and pre-launch checklists](https://docs.google.com/document/d/1anxhRdpZZd2gB84jDEqIa94520NwJHduz6FMz9WtEgI/edit?usp=sharing)
* [Physics Club Near-Space Balloon Flight](https://vimeo.com/42751486)
AP Physics 2 Reflection
On the eve of the first day of school, I felt that I had better capture my thoughts on AP Physics 2 from last year. My perspective may be different from others’ (at least different from the vocal minority(?) on the AP Teacher Community).
I started last year eagerly anticipating the new AP Physics 2 course. For the past seven years, I had taught some type of second-year physics course. For most of that time, I taught what we called Advanced Physics, a one-semester course after which some of my students would take the AP Physics B exam. For a couple of years, I taught an official, year-long AP Physics B course. I felt that the AP Physics B course had too much content to cover well, even as a second-year course. This was compounded by the mismatch between the groups of students who enrolled in the course: about a third of the students had previously taken our General Physics course, and two-thirds, Honors Physics. The Honors Physics students had studied additional units not part of the General Physics course. As a result, for some “review” units in AP Physics B, the pace was much too fast for those from General Physics and much too slow for those from Honors Physics.
The new AP Physics 2 course contained less content. In addition, the emphasis shifted towards deeper conceptual understanding of physics rather than numeric or algebraic problem solving. As a result of these changes, I felt that I could at last integrate much more of [Modeling Instruction](http://modelinginstruction.org) into a second-year physics course. I wasn’t too concerned about the shift towards deeper conceptual understanding since I had been moving my course in that direction for the past couple of years based on student performance on the AP Physics B exam. My students had done extremely well on the free response portion of the AP Physics B exam; therefore, I had adjusted class to focus more on conceptual understanding since the greatest area for growth was on the multiple choice portion of the exam. During the summer of 2014, I attended an AP Summer Institute to learn more about the new course. As a result of all of this, I started last year much more excited than anxious.
Reflecting back on AP Physics 2 last year, it was my favorite year teaching a second-year physics course. That said, while many aspects of the course worked well, there are definite areas for me to improve this year.
What Worked
----
Peer instruction was very effective at developing students’ conceptual understanding. Of all the various types of activities done in class, students ranked peer instruction as the most helpful (over 75% of students agreed with the statement “Participating in peer instruction of conceptual questions helped me understand the material” on the end-of-year survey). The manner in which I conduct peer instruction is strongly influenced by the research of Stephanie Chasteen, who writes at [sciencegeekgirl](http://blog.sciencegeekgirl.com). The questions I use are a combination of Paul Hewitt’s [Next-Time Questions](http://www.arborsci.com/next-time-questions), the end-of-chapter conceptual questions in Knight’s College Physics text, and those in clicker question banks from CU Boulder and OSU.
The number and variety of lab activities also worked well. Some labs were informal stations, some were typical Modeling Instruction paradigm labs, and some were lab practicums. With less content, we had time for more, and deeper, labs. Some of the labs and the skills involved went beyond what is required by the AP Physics 2 curriculum, but some of these were the students’ favorites. We will continue to explore [computational modeling](http://180.pedagoguepadawan.net/805/day-19-fluid-projectile-motion-lab-practicum-with-computational-models/), build [more advanced circuits](http://180.pedagoguepadawan.net/997/day-66-rc-circuits/) on breadboards, and [explore particle physics](http://180.pedagoguepadawan.net/1247/day-130-particle-physics-masterclass-at-fermilab/).
What Didn’t Work
----
[Building my standards, and grading, on the Enduring Understandings](https://pedagoguepadawan.net/368/standards-for-ap-physics-2/) defined for each Big Idea did not work well. While my goal was for students to see the connections between the various content areas and appreciate the Big Ideas, students shared that organizing the standards and grades in this manner didn’t help accomplish this. It did, however, result in a lot of extra work for me. After the fall semester, I mostly abandoned this approach. Below, I’ll explain my approach for this year.
Whiteboarding homework problems did not work well. My approach was for six groups of students to prepare and present whiteboards based on assigned homework problems. This didn’t work well because too few students had done the homework problems in advance of whiteboarding. As a result, most of the group would watch those who had done the problems prepare the whiteboards and didn’t really understand the solution. This issue was compounded when whiteboards were presented. Too few students had struggled with the problem in advance to result in a good discussion. This wasn’t the case every time, but much too often.
What I’m Trying This Year
----
My attempts to prepare students for the free response portion of the AP Physics 2 exam fell somewhere between working and not working. I overestimated students’ ability to write clear, concise, and correct free responses. As a result, I didn’t dedicate sufficient time to practicing this skill. What did work well was [using Socrative to share student responses and peer critique these responses](http://180.pedagoguepadawan.net/1286/day-138-using-socrative-for-short-response-questions/). We will do this much more this year.
While my attempts to reinforce the Big Ideas by structuring standards and scores around Enduring Understandings didn’t work, emphasizing the AP Science Practices did work well. Inspired by [Chris Ludwig’s work with portfolios](http://see.ludwig.lajuntaschools.org/?p=1197) and our discussion at NSTA earlier this year, my students will create a lab notebook and portfolio on their own Google Site. The notebook will capture all the labs and the portfolio will be a curated collection of labs that demonstrate their performance of the various AP Science Practices. I hope to share the details of this soon.
To improve the value of whiteboarding, I’m making several changes. Instead of six groups preparing and presenting six problems, groups will prepare and present only two problems. Each problem will be prepared by three groups. The problem won’t be assigned as homework. Rather, we will spend more class time as each group works together to solve the problem. A randomly selected member of each group will be responsible for presenting the whiteboard, and the class will focus on comparing and contrasting solutions between the various groups in addition to the solution itself.
Scores
----
The average AP Physics 2 scores were about a point lower than the previous year’s AP Physics B scores (3.344 vs. 4.484). However, as I considered the standards and expectations for AP Physics 2 compared to AP Physics B and carefully considered each of my students, their scores were what I expected, except for a few.
Summary
----
I’m thrilled with the new AP Physics 2 class and excited about teaching this course for the second time. All that I miss from AP Physics B is the huge collection of exam questions from which I could build my own assessments. My one wish is that the College Board release additional questions, as questions in the style of the new exam are very difficult to create. I hope that the changes I have planned for this year help students develop an even stronger and deeper understanding of physics and proficiency in science practices than last year’s students did. If you are interested in more detail about my approach last year, my [180 blog](http://180.pedagoguepadawan.net/) focused solely on AP Physics 2.
Selling Our Colleagues Short with SAMR
I’ve been seeing more and more references to SAMR. Maybe it’s because ISTE is starting, or maybe it’s because my district is promoting it as a tool for teachers embarking on our Digital Learning Initiative (1:1 devices), or maybe it’s just the [Baader-Meinhof phenomenon](https://en.wikipedia.org/wiki/List_of_cognitive_biases#Frequency_illusion). Regardless, I can only tolerate so many SAMR infographics before I’m pushed over the edge, and I have to say something.
**Due to its overemphasis on technology, SAMR is the least helpful model to promote with teachers if you want to provide a resource to positively impact student learning.**
Depending on the teacher, it confuses, at best, and misleads, at worst. I’m not alone in this sentiment. Several of my colleagues, both local and online, have expressed similar feelings. Most eloquent are [a couple](https://learningandphysics.wordpress.com/2014/10/22/i-am-not-satisfied/) [of posts](https://learningandphysics.wordpress.com/2014/12/30/dont-keep-it-simple-stupid/) by Casey Rutherford. My favorite quote:
> On the note of lesson design, I am not satisfied with simplifying the complexities of teaching to where it falls on the SAMR scale. Teaching is nuanced, fluid, and has a ton of moving parts, and we’d be better off embracing that than cheapening it with a stamp of ‘modification.’
I’ll illustrate the problem with a couple of lessons.
Lesson #1
----
> Embracing the “Flipped Classroom,” the teacher records and shares a demonstration of the projectile motion lab and an overview of the lab procedure. Students can watch the video for homework before coming to class. In addition to the video, the lab procedure is published to Canvas, and each student can follow the procedure on their device while performing the lab. The video (and published procedure) instructs students on how to set up the apparatus and make the necessary measurements of the path of the ball bearing that is launched after rolling down the ramp. The data is captured in a Google Doc and plotted using Plotly, allowing group members to share data with each other. Based on their plot, students determine where on the floor to place the target, and then the teacher observes their attempt. Points are awarded based on how close the ball bearing lands to the target. Each student creates a video that captures their attempt in which they report their percent error and list potential sources of error. The videos are posted on their public blogs. Students, and the public, can comment on these videos posted on the students’ blogs, but no one does.
Let’s take a look at this lesson. Hmmm. Flipped classroom, LMS, Google Docs, Plotly, video creation, student blogs. Woohoo! We are swimming in the deep end of the SAMR pool! Modification? Redefinition? Doesn’t matter, we’re above the bar!
There’s just one problem. This lesson sucks. I know, it’s based on one of my actual lessons. My class did it. They weren’t engaged; they didn’t have ownership; they didn’t have choice; they didn’t exercise their creativity. They asked me a ton of questions about the details of the procedure. From a pedagogical perspective, it is flawed. It is a traditional cookbook science lab jazzed up with a bunch of tech.
Don’t worry, I made improvements the next year. I focused on research-based pedagogy, and I integrated technology where it supported the pedagogy and content.
Lesson #2
----
> Students are presented with a challenge: given a fixed vertical elevation of the projectile launcher (i.e., “cannon”), determine the launch angle and time of launch to hit the toy buggy at a specific location as it “flees.” Students work in small groups to justify the selection of the kind of data needed, design a plan for collecting data, and collect data. They choose the tools with which to collect the data. Some groups use video cameras; others, motion detectors; others, photogates; others, meter sticks; others, phones. They create a computational model using a 3D physics programming language since a traditional mathematical solution is beyond most of their current capabilities (one group solves the problem algebraically using clever trig substitutions, which is fine). Using the computational model, they solve for the launch angle and time of launch. Their attempt based on their calculation is recorded with a high-speed video camera and [shared publicly](http://180.pedagoguepadawan.net/107/107/) to celebrate their success. Students reflect on the lab practicum with a specific focus on measurement uncertainty and capture their reflections in their electronic portfolios, which they will export into an open format (HTML) and take with them to university. During the whole-class post-lab discussion, each group shares what they consider the most significant aspects of their approach, as each group had a unique approach to the lab. Groups compare and contrast these techniques, arriving at a set of best practices for minimizing measurement uncertainty.
Students were motivated and engaged. They were creative and collaborative. They asked each other many questions. They surprised me with their solutions. They focused on deeper science practices beyond the content of projectile motion. Some groups incorporated technology to help them meet the challenge. Some groups hardly used any technology at all.
Some may rebut my assertion and claim that I’m oversimplifying SAMR, that there is more to it than what I’m presenting, and that I’m missing the student-centered vs. teacher-centered aspect. Maybe there is, but you wouldn’t know it from most of the resources out there. [SAMR Coffee](https://www.google.com/search?q=samr+coffee&tbm=isch&tbo=u&source=univ&sa=X&ei=e5WQVYbAGtXaoATpr7TIBg&ved=0CB4QsAQ)? [SAMR Apps?](https://www.google.com/search?q=samr+apps&tbm=isch&tbo=u&source=univ&sa=X&ei=k5WQVYiNNo_ooASyi7CgCQ&ved=0CB4QsAQ) Really?
Some may argue that teachers new to tech need a simple model to reference. Yes, SAMR is simple. But, why promote it when there are better and more inclusive models available? Do we really think [TPACK](http://www.matt-koehler.com/tpack/what-is-tpack/) is too complex for our colleagues? Are we selling them that short?
I’m not.
Formative Assessment Tools for Peer Instruction and Peer Critique of Written Responses
This past year, as my AP Physics 2 classes piloted Chromebooks, we used a couple of formative assessment tools frequently in class. For Peer Instruction, we used [InfuseLearning](http://www.infuselearning.com). InfuseLearning’s stand-out feature was its support for draw-response questions. Having students sketch graphs and draw diagrams is very valuable as a formative assessment in physics class. Throughout the year, as I shared InfuseLearning with other teachers participating in the pilot, the draw-response feature was the most popular with everyone, from elementary through high school.
The second formative assessment activity was focused on preparation for the new paragraph-length responses on the AP Physics 2 exam. To practice these types of responses, students responded to a prompt using [Socrative](http://www.socrative.com). Socrative allows me to share all the responses with students, and students can vote for the best one. We can then, as a class, discuss the elements of the best responses.
Unfortunately, InfuseLearning [closed their doors in April](http://www.infuselearning.com/?page_id=35). In preparation for sharing resources with teachers this summer, before we deploy 1:1 Chromebooks for all high school students this fall, I surveyed the currently available tools with a specific focus on Peer Instruction that supports drawing and on peer critique of written responses.
I evaluated the following features.
* **Cost**: Is there a free version? What are the limitations of the free version? Can teachers upgrade to a paid version on an as-needed basis?
* **Account Creation**: How easy is it for students to create accounts? Can they log in with their Google accounts?
* **Prepared Questions**: Does the tool support preparing questions in advance?
* **Spontaneous Questions**: Does the tool support creating a question on-the-fly without preparation ahead of time?
* **Supported Question Types**: What types of questions does the tool support?
* **Multiple Choice Questions**: Since Peer Instruction often uses multiple choice questions, how flexible are these questions? Can the answer choices be customized (e.g., A-D or 1-4)? Can the number of answer choices be customized?
* **Draw Response Questions**: Are draw response questions supported by the tool? How rich are the drawing tools?
* **Sharing Student Responses with Students**: Does the tool support sharing sample student responses with all students?
* **Capturing Student Responses**: Does the tool support capturing student responses for later analysis? What can and cannot be captured?
* **Reporting**: Does the tool support reporting of sessions? What is captured and reported?
[Socrative](http://socrative.com/)
----
* **Cost**: free
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: yes
* **Supported Question Types**: multiple choice, true/false, short answer
* **Multiple Choice Questions**: limited options (exactly 5, A-E)
* **Draw Response Questions**: no
* **Sharing Student Responses with Students**: sharing short answers allows students to vote on the best peer answer
* **Capturing Student Responses**: yes
* **Reporting**: for prepared questions and short answer only (not spontaneous multiple choice or true/false)
[The Answer Pad](http://theanswerpad.com)
----
* **Cost**: free and paid; free is limited (limited templates, question types, creation of own images, capture student responses)
* **Account Creation**: students have to create accounts (doesn’t support Google accounts) if you want to track student responses
* **Prepared Questions**: yes, but not draw response
* **Spontaneous Questions**: yes
* **Supported Question Types**: multiple choice, true/false, yes/no, up/down, fill-in, Likert scale, drawing
* **Multiple Choice Questions**: limited options (exactly 4, A-D)
* **Draw Response Questions**: yes, decent drawing tools
* **Sharing Student Responses with Students**: no
* **Capturing Student Responses**: limited in free version
* **Reporting**: only for prepared questions
[Formative](http://goformative.com/)
----
* **Cost**: free
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: no (maybe have some standard templates?)
* **Supported Question Types**: multiple choice, show your work (draw response), short answer, true/false
* **Multiple Choice Questions**: flexible response choices
* **Draw Response Questions**: yes, but limited (no colors)
* **Sharing Student Responses with Students**: no
* **Capturing Student Responses**: automatic
* **Reporting**: yes
[Pear Deck](http://www.peardeck.com/)
----
* **Cost**: free and paid; free is limited (draw response in prepared decks, capturing, and reporting are paid features)
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: kind of (can ask a quick question in the context of an existing deck)
* **Supported Question Types**: agree/disagree, draw on grid, draw on blank, yes/no, true/false, multiple choice, long text answer, short text answer, numeric answer
* **Multiple Choice Questions**: flexible response choices
* **Draw Response Questions**: yes (quick question only for free version)
* **Sharing Student Responses with Students**: no
* **Capturing Student Responses**: paid only
* **Reporting**: paid only
[NearPod](http://nearpod.com/)
----
* **Cost**: free and paid (free has limited storage space and reporting export options)
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: no (maybe have some standard templates?)
* **Supported Question Types**: open-ended question, poll, quiz, draw it
* **Multiple Choice Questions**: flexible response choices
* **Draw Response Questions**: yes, decent drawing tools
* **Sharing Student Responses with Students**: yes
* **Capturing Student Responses**: yes
* **Reporting**: yes (PDF only in free version)
Conclusions
----
At our summer professional learning sessions, we will be featuring Socrative. It is easy to use and applies to a wide variety of disciplines. The significant drawback of Socrative is the lack of draw-response questions. For those teachers who need that feature, I’m recommending they use NearPod. I used NearPod a couple of years ago when piloting classroom iPads. At that time, NearPod was an iPad-only app. I was thrilled to discover that it now supports all major platforms.
For my physics classroom, I’m going to use NearPod for Peer Instruction because draw-response questions are so important. While I’d rather be able to create spontaneous questions, I’m also interested in capturing student answers to provide me more insight into their learning, which necessitates creating a set of questions ahead of time. I will create a few slides in each deck that can serve as general-purpose placeholders for spontaneous questions.
I’ll still use Socrative for peer critique of written responses. The ability to share student responses with students and have students vote for the best response is very effective at developing their writing. These two classroom activities, Peer Instruction and peer critique of written responses, are done independently; so, using two different tools should not be inconvenient.
If I’ve missed a tool that would work well for either of these classroom activities, please leave a comment to let me know!
Reflection on This Year’s Learning Experiences
My school district has a new system for how teachers’ professional development earns them credit on the salary schedule. In addition to the traditional approaches of taking graduate courses or completing additional higher-education degrees, several other opportunities are now options. Last school year, I [wrote a proposal](https://pedagoguepadawan.net/313/recognizing-true-professional-development/) for a “Discovery, sharing, execution, and enhancement of research-based and field-tested best practices for physics education.” Over the summer, I documented all that I did and received credit as part of the new program for these activities. While it took an unexpected amount of effort to navigate the new bureaucracy, those wrinkles can be ironed out as everyone gains more experience with the new system. I believe the concept of this new model is sound; it’s a lot easier to adjust the workflow and bureaucracy than to adjust the fundamental concept. Below is the summary reflection that I submitted. Thanks to all of you who influence my professional development!
As I reflect on these learning experiences over the past year, a theme of balance emerges. The strong impact of these experiences was balanced between my learning, the district, student learning, and my peers. The medium through which ideas were exchanged (face-to-face, virtual real-time, online) was balanced, leveraging the strengths of each. My focus on learning from others and sharing my expertise was balanced. The level of commitment of various professional learning communities was balanced: a small group of high school physics teachers had a very high level of commitment in my Physics Learning Community, while the informality and transient nature of Twitter enabled many to share their insights with minimal initial commitment.
These learning experiences were punctuated by reflections. By capturing and sharing these reflections, I benefit from both the immediate act of reflection and the future ability to reference that reflection; others benefit from the sharing of my reflections from which they may draw their own insights. I continue to be pleased at the regularity with which I reference my writings.
Throughout these learning experiences, I was reminded how curious, collaborative, and open many educators are about their profession. The diversity of their backgrounds and current roles provides varied experiences and fresh perspectives. I was also reminded that everyone is at a different point in their professional growth. While some methodologies are entrenched in my practice (e.g., standards-based assessment and reporting), other educators are just starting to struggle with these transitions. It is easy to forget the path one took as one grows; capturing this path helps me to remember. In addition, the 180 blog provided a forum for me to share the smaller ideas and tips that I would normally not bother to share in a standard blog post. I was pleased and sometimes surprised that many educators found these 180 posts informative. Furthermore, historically, my blog stagnates when school is in session. The 180 blog is rigidly structured into smaller chunks such that I at least share something, which is a net gain.
With so many incredible educators willing to share their expertise and such a plethora of methodologies to explore, I must be balanced in choosing where to focus. I focus on what I think is most important for students, on what I am most passionate about, and on what I find most interesting; and I pass along everything. Someone else may pick up what I have set aside, and everyone still benefits.
GitHub, Canvas, and Computer Science
There are certain software development techniques or tools that are not strictly part of the AP Computer Science curriculum that I think are valuable for students to learn about and practice in my class. Two years ago, I incorporated pair programming. Last year, I added test-driven development and JUnit. This year, I made [GitHub](http://github.com/) an integral part of the course.
I want students to be aware of and appreciate the value of source control management. GitHub was the obvious choice, as they are [supportive of education](https://education.github.com) and it is most likely the specific tool that students will encounter.
After consulting with a couple of former students more familiar with GitHub than I am, I decided to create a [repository for each unit in the course](https://github.com/nnhsapcs201415). At the start of each unit, students fork that unit’s repository and clone it to their desktop. They perform these operations through the GitHub web site.
Throughout the unit, I encourage students to put all of their code in their forked repository and frequently commit and sync. This provides students with all of the typical advantages of source control: they can more easily work at both school and home, and they can revert to an earlier version of code if a project goes astray.
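Concretely, a minimal command-line sketch of this fork/clone/commit workflow looks something like the following (the repository and user names, unit3-arrays and student-name, are made up for illustration; in class, students perform the fork and clone through the GitHub web site and “sync” through the GitHub desktop client):

git clone https://github.com/student-name/unit3-arrays.git   # clone the student's fork of the unit repository
cd unit3-arrays
# ... work on the labs in BlueJ ...
git add -A                                     # stage all new and changed files
git commit -m "Add summative lab test cases"   # commit frequently with a descriptive message
git push origin master                         # push ("sync") so the work is available at home and at school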
At the end of the unit when students have completed their summative lab, they issue a pull request to submit the lab. They then submit the URL to this pull request via the [Canvas](http://www.instructure.com/k-12/) assignment that I created for the summative lab.
I created a video tutorial that captures the student workflow:
The student workflow works well except when they accidentally generate merge conflicts by not keeping home and school synced.
While exposing students to GitHub is enough of a benefit on its own, this particular workflow is also extremely efficient from the perspective of evaluating their summative labs.
I [still use Canvas’s SpeedGrader](https://pedagoguepadawan.net/216/greatest-benefit-of-canvas/) to keep track of who has submitted the lab and to provide detailed feedback to students. In previous years, I had students submit a zip file of their entire project folder. The link to their pull request is much more efficient. My workflow for evaluating their lab is the following:
1. Click on the link in SpeedGrader to open the pull request page in another tab.
2. Click on the “Check out this branch in GitHub for Mac” icon, which does exactly that (a rough command-line equivalent is sketched after this list).
3. Open the BlueJ project file for the summative lab, read the code, read the docs, run the project, and provide feedback via SpeedGrader.
4. Close the pull request.
5. Run the following script, which switches back to the master branch and removes all traces of the student’s pull request:
git reset --hard      # discard any local changes made while evaluating the lab
git clean -xdf        # delete untracked and ignored files (e.g., compiled .class files)
git checkout master   # return to the master branch
6. After evaluating all of the labs, I list all of the branches that I checked out: git branch --list
7. I then delete each of these branches: git branch -D pr/xx
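As an aside, for anyone working from the command line instead of GitHub for Mac, GitHub exposes each pull request at a read-only ref named pull/&lt;number&gt;/head, so step 2 can be approximated as follows (the pull request number, 42, is made up for illustration; the resulting pr/42 branch name matches the cleanup in step 7):

git fetch origin pull/42/head:pr/42   # create a local branch from the pull request ref
git checkout pr/42                    # switch to it to open and evaluate the student's project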
While the above may seem like a lot of steps, there is very little overhead and it is much more efficient than my previous workflow.
I’m embarrassed to admit that there is another advantage of these per-unit GitHub repositories that I didn’t utilize until this past unit. While making notes to myself about where we had stopped in one class period in which I was modeling how to write a certain algorithm, it struck me that I could create a branch for each class period. I now create a branch for each class period and, when I demonstrate how to write some code, I commit and sync that branch with some helpful notes to myself at the end of each class period. The next day, I switch to the corresponding class’s branch, read my notes, and we start right where we stopped.
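A minimal sketch of this branch-per-class-period habit at the command line (the branch name and commit message are made up for illustration):

git checkout -b period3        # once per section: create that class period's branch
# ... live-code during class, then at the end of the period ...
git add -A
git commit -m "Period 3: stopped before writing the recursive case"
git push origin period3        # sync so the code and notes are available tomorrow
# the next day, before that section arrives:
git checkout period3
git pull origin period3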
If you have any suggestions on how I can improve my students’ workflow or my workflow, please share. If you have been thinking of incorporating source control management into your computer science class, I encourage you to take the plunge. Your students will learn a very valuable skill!
Fluids Paradigm Lab
I taught a one-semester Advanced Physics class that culminated in the AP Physics B exam during my first five years of teaching. For the past two years, I taught an official AP Physics B course. Both of these courses were packed with content. Despite being a proponent of [Modeling Instruction](http://modelinginstruction.org) and incorporating it into other courses, I never felt I could make it fit in these courses.
This year, I’m teaching the new AP Physics 2 course. The focus on inquiry, deep understanding of physics, and science practices (and less content) aligns wonderfully with Modeling Instruction.
We just started the first major unit, fluids. I guided my students through a paradigm lab to model pressure vs. depth in a fluid. We started by watching [this video](https://www.youtube.com/watch?v=fqWL5FsQXRI) of a can being crushed as it descends in a lake. I was worried students would find the demonstrated phenomenon too simple, but that definitely wasn’t the case. Like any paradigm lab, we started by making observations:
* the can gets crushed
* the can gets crushed more as it gets deeper
* the top of the can appears to be sealed
* the can must be empty (student commented that if full, it wouldn’t be crushed)
Students then enumerated variables that may be related to the crushing of the can:
* water pressure
* volume of water above the can
* strength of can
* air pressure inside of can
* gravitational field strength (student said “gravity” and I went on a tangent about fields…)
* temperature of water
* atmospheric pressure
* type (density) of fluid
* water depth
* speed of descent
* dimensions, surface area, shape of can
* motion of water
Students readily agreed that it was the water pressure that crushed the can and it is the dependent variable. In hindsight, I could have better focused the discussion by directing students to focus on the water pressure rather than the can itself. They had a lot of good ideas about what properties of the can would affect it being crushed, which I didn’t expect. I had to admit that I didn’t have any cans and we would have to focus on the fluid instead…. I was amazed that no one in my first class proposed that the depth of the fluid would play a role. Everyone in that class phrased it as the volume of the fluid in the container above the can was a variable to measure. This was fascinating to me and led to a surprising result for the students as the experiment was conducted. I think this illustrates the power of the modeling cycle and guided inquiry labs.
We next determined which of the above variables we could control (independent variables) and measure in the lab given the resources available at the moment:
* volume of water above the can
* type (density) of fluid
* water depth
* speed of descent
The materials we planned on using were Vernier LabQuest 2 interfaces, pressure sensors with glass tube attachments, three different sized beakers (for the volume variable), graduated cylinders, fluids (water, canola oil, saturated salt water).
We then defined the purpose of our experiment:
To graphically and mathematically model the relationship between (TGAMMTRB) pressure and the volume of fluid above, the depth below the surface of the fluid, the descent rate, and the type of fluid (density).
We divided these various experiments among the lab groups, and groups started designing their particular experiment.
At the start of class the next day, groups shared their results. I was particularly impressed with the groups investigating pressure vs. volume of fluid above a point. While they measured a relationship between pressure and volume, their experimental design was sufficiently robust that they also noticed that the same volume above the measurement point resulted in different pressures in different beakers! That is, the pressure with 400 mL of water above the sensor in the 600 mL beaker is different than in the 1000 mL beaker and different again from that in the 2000 mL beaker. After further investigation they concluded that the relationship was based on depth, not volume.
The groups investigating pressure vs. depth in fluid were confident that the pressure at a point depended on the depth below the surface of the fluid, and they had sufficient data that they were also confident that there was a linear relationship between pressure and depth.
The groups that investigated pressure vs. fluid density at constant depth/volume had inconclusive results. The pressure they measured varied by less than 1% between the three types of fluids. This provided an opportunity to discuss how experimental technique can affect the uncertainty of the measurement. We discussed that, with the new understanding of the relationship between pressure and depth, these groups could gather several measurements at various depths in each of the three fluids and compare the slopes of the resulting graphs to see if density has an effect. While we were discussing measurement uncertainty, we also discussed how the depth is defined not by the position of the bottom of the glass tube, but by the water level within the glass tube. I learned of this important experimental technique in the article “[Pressure Beneath the Surface of a Fluid: Measuring the Correct Depth](http://scitation.aip.org/content/aapt/journal/tpt/51/5/10.1119/1.4801356)” in The Physics Teacher. While the groups investigating the effect of fluid density on pressure applied this new experimental technique, the rest of the groups repeated gathering pressure vs. depth data while carefully examining the fluid level in the glass tube.
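For reference, the standard hydrostatic model the groups are converging on (not something we stated explicitly during the lab discussion) is

\[
P = P_{0} + \rho g h \qquad \Rightarrow \qquad \frac{\Delta P}{\Delta h} = \rho g
\]

so each pressure vs. depth graph should be linear with a slope of ρg (roughly 9.8 kPa per meter of depth for water), which is why comparing slopes across the three fluids isolates the effect of density.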
After a second day of measurements, students confirmed the linear relationship between pressure and depth. In addition, with the improved experimental design, students confirmed a relationship between pressure and fluid density. The results were not as accurate as I had expected. We identified a couple of additional errors that may have contributed. One, a couple of groups lost the seal between the glass tube and the plastic tube connected to the pressure sensor when the glass tube was in the fluid. This results in the fluid filling the glass tube and future measurements are incorrect if the glass tube is reconnected without removing it from the fluid.
I asked my TA to minimize the known sources of measurement uncertainty, perform the experiment, and determine how accurately pressure vs. depth could be measured. The slope of his pressure vs. depth graph was within 3.16% of the expected value. This is quite a reasonable result. If we used a taller graduated cylinder, I expect the error could be reduced further.
I’ll definitely do this paradigm lab again next year!
Student Evolution of Descriptions of Learning and Grades
I found this post accidentally saved as a draft from last December! The year referenced is the 2013-2014 school year. I should check this year’s student information survey and see if these patterns persist (although I don’t have Honors Physics students this year). I still want to share this; so, here it is….
At the start of every year, all of my students complete a survey which helps me get to know them better and faster. This year, I noticed a bit of a difference between the responses of my Honors Physics class and my AP Physics B class to a couple of questions. Most of the AP Physics B students took Honors Physics last year and experienced a year of standards-based assessment and reporting indoctrination. One question was “A grade of ‘A’ means …”. I captured the two classes’ responses in a Wordle cloud. My Honors Physics class:
My AP Physics class:
I was pleased that both groups mentioned understanding. I found it interesting that “mastered” was more prominent with the second-year physics students. The Honors Physics students mentioned their parents, but no one in AP Physics did. Overall, the AP Physics students had more varied descriptions.
I found the differences between the responses to the question “Learning is …” more insightful. My Honors Physics class:
My AP Physics B class:
My conclusion? My Honors Physics students don’t yet understand what learning is; they could barely describe it. My AP Physics students had much richer descriptions that featured “knowledge”, “understanding”, “fun”, “awesome”, “new”, and “life.”
These word clouds illustrate the growth that students achieve in my colleague’s and my physics courses. This growth doesn’t show up on an AP exam, the ACT, or any other standardized test, but it is important.
Summer Reading
A good summer of reading. I didn’t read quite as much as I had hoped, but more than I feared I would. My focus this summer was influenced by my work on my district’s Science Curriculum Team, which is incorporating the Next Generation Science Standards into our science curriculum. As part of my contribution to this team, I want to promote the development of a continuous narrative that students will find engaging throughout primary and secondary school. I’ll write more about this later, but I believe the history of science plays a crucial role in this endeavor.
**[Quantum Man](http://www.amazon.com/Quantum-Man-Richard-Feynmans-Discoveries/dp/0393340651) by Lawrence Krauss**
I find Lawrence Krauss’ writing and speaking engaging. This biography of Richard Feynman focuses more on his pursuit of understanding through science than on his infamous antics.
**[Creating Innovators: The Making of Young People Who Will Change the World](http://www.amazon.com/Creating-Innovators-Making-People-Change/dp/1451611498) by Tony Wagner**
A committee I was on started reading this book last year. It was good and the case studies were interesting. I think it could have been condensed quite a bit without losing too much.
**[The Edge of Physics: A Journey to Earth’s Extremes to Unlock the Secrets of the Universe](http://www.amazon.com/Edge-Physics-Journey-Extremes-Universe/dp/0547394527) by Anil Ananthaswamy**
This book is amazing. Ananthaswamy travels around the world to explore the most interesting experiments in the field of cosmology. Reading how these scientists actually go about their experiments and the challenges they face due to their environment is fascinating. These are the types of stories that need to be shared with students.
**[Trinity: A Graphic History of the First Atomic Bomb](http://www.amazon.com/Trinity-Graphic-History-First-Atomic/dp/0809093553) by Jonathan Fetter-Vorm**
An excellent graphic novel that captures the start of the Atomic Age. This book is a fantastic resource for students researching the development of the atomic bomb.
**[The Ten Most Beautiful Experiments](http://www.amazon.com/Ten-Most-Beautiful-Experiments/dp/140003423X) by George Johnson**
This was my least favorite book of the summer. I just didn’t find the stories as engaging as others that capture the history of science.
**[A Short History of Nearly Everything](http://www.amazon.com/Short-History-Nearly-Everything/dp/076790818X) by Bill Bryson**
I read this book years ago, but read it again this summer in order to make annotations that can be incorporated in this narrative of science in which I’m interested. Bryson is an incredibly engaging writer and truly captures the wonder of how little we understand about the world (and beyond) in which we live.
I’m in the midst of two other books and hope to continue to make progress as the school year starts.