
Reflection on This Year’s Learning Experiences

My school district has a new system for how teachers’ professional development earns them credit on the salary schedule. In addition to the traditional approaches of taking graduate courses or completing additional higher-education degrees, several other opportunities are now available. Last school year, I wrote a proposal for a “Discovery, sharing, execution, and enhancement of research-based and field-tested best practices for physics education.” Over the summer, I documented all that I did and received credit under the new program for these activities. While it took an unexpected amount of effort to navigate the new bureaucracy, those wrinkles can be ironed out as everyone gains more experience with the new system. I believe the concept of this new model is sound. It’s a lot easier to adjust the workflow and bureaucracy than to adjust the fundamental concept. Below is the summary reflection that I submitted. Thanks to all of you who influence my professional development!

As I reflect on these learning experiences over the past year, a theme of balance emerges. The strong impact of these experiences was balanced among my learning, the district, student learning, and my peers. The medium through which ideas were exchanged (face-to-face, virtual real-time, online) was balanced, leveraging the strengths of each. My focus on learning from others and sharing my expertise was balanced. The level of commitment of various professional learning communities was balanced. A small group of high school physics teachers had a very high level of commitment in my Physics Learning Community, while the informality and transient nature of Twitter enabled many to share their insights with minimal initial commitment.

These learning experiences were punctuated by reflections. By capturing and sharing these reflections, I benefit from both the immediate act of reflection and the future ability to reference that reflection; others benefit from the sharing of my reflections from which they may draw their own insights. I continue to be pleased at the regularity with which I reference my writings.

Throughout these learning experiences I was reminded how curious, collaborative, and open many educators are about their profession. The diversity of their backgrounds and current roles provides varied experiences and fresh perspectives. I was also reminded that everyone is at a different point in their professional growth. While some methodologies are entrenched in my practice (e.g., standards-based assessment and reporting), other educators are just starting to struggle with these transitions. It is easy to forget the path one took while growing; capturing this path helps me to remember. In addition, the 180 blog provided a forum for me to share the smaller ideas and tips that I would normally not bother to share in a standard blog post. I was pleased and sometimes surprised that many educators found these 180 posts informative. Furthermore, my blog historically stagnates when school is in session. The 180 blog is rigidly structured into smaller chunks such that I at least share something, which is a net gain.

With so many incredible educators willing to share their expertise and such a plethora of methodologies to explore, I must be balanced in choosing where to focus. I focus on what I think is most important for students, on what I am most passionate about, and on what I find most interesting, and I pass along everything. Someone else may pick up what I have set aside, and everyone still benefits.

GitHub, Canvas, and Computer Science

There are certain software development techniques or tools that are not strictly part of the AP Computer Science curriculum that I think are valuable for students to learn about and practice in my class. Two years ago, I incorporated pair programming. Last year, I added test-driven development and JUnit. This year, I made GitHub an integral part of the course.

I want students to be aware of and appreciate the value of source control management. GitHub was the obvious choice, as GitHub is supportive of education and is most likely the specific tool that students will encounter.

After consulting with a couple of former students more familiar with GitHub than I, I decided to create a repository for each unit in the course. At the start of each unit, students fork that unit’s repository and clone it to their desktop. They perform these operations through the GitHub web site.

Throughout the unit, I encourage students to put all of their code in their forked repository and frequently commit and sync. This provides students with all of the typical advantages of source control: they can more easily work at both school and home, and they can revert to an earlier version of code if a project goes astray.
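For students who prefer the command line to the GitHub web site and desktop app, the same fork-commit-sync cycle looks roughly like this; the repository name and commit message below are made up for illustration:

    
    # Clone your fork of the unit repository (URL is hypothetical).
    git clone https://github.com/student/apcs-unit3.git
    cd apcs-unit3
    
    # Throughout the unit: commit early and often, and push ("sync")
    # before switching between school and home.
    git add .
    git commit -m "Add JUnit tests for Deck.shuffle()"
    git push origin master
    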

At the end of the unit when students have completed their summative lab, they issue a pull request to submit the lab. They then submit the URL to this pull request via the Canvas assignment that I created for the summative lab.

I created a video tutorial that captures the student workflow:

The student workflow works well except when they accidentally generate merge conflicts by not keeping home and school synced.

While exposing students to GitHub is benefit enough on its own, this particular workflow is also extremely efficient from my perspective when evaluating their summative labs.

I still use Canvas’s SpeedGrader to keep track of who has submitted the lab and to provide detailed feedback to students. In previous years, I had students submit a zip file of their entire project folder. The link to their pull request is much more efficient. My workflow for evaluating their lab is the following:

  1. Click on the link in SpeedGrader to open the pull request page in another tab.
  2. Click on the “Check out this branch in GitHub for Mac” icon which does exactly that.
  3. Open the BlueJ project file for the summative lab, read the code, read the docs, run the project, and provide feedback via SpeedGrader.
  4. Close the pull request.
  5. Run the following script, which switches back to the master branch and removes all traces of the student’s pull request:

    
    git reset --hard      # discard any changes made while evaluating the lab
    git clean -xdf        # delete untracked and ignored files (e.g., compiled classes)
    git checkout master   # switch back to the master branch
    

  6. After evaluating all of the labs, I list all of the branches that I checked out: git branch --list

  7. I then delete each of these branches: git branch -D pr/xx (see the command-line sketch after this list for a one-liner that combines steps 6 and 7)
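For anyone not using the GitHub for Mac app, the same loop can also be driven from the command line. This is only a sketch; the pull-request number 123 is hypothetical:

    
    # Check out pull request #123 from the unit repository as a local branch
    # (the number is hypothetical; GitHub exposes each pull request at refs/pull/<number>/head).
    git fetch origin pull/123/head:pr/123
    git checkout pr/123
    
    # After evaluating all of the labs: return to master and delete every pr/* branch at once
    # (this combines steps 6 and 7 above).
    git checkout master
    git branch --list "pr/*" | xargs git branch -D
    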

While the above may seem like a lot of steps, there is very little overhead and it is much more efficient than my previous workflow.

I’m embarrassed to admit that there is another advantage of these GitHub repositories for each unit that I didn’t utilize until this past unit. While making notes to myself about where we had stopped in one class period where I was modeling how to write a certain algorithm, it struck me that I can create branches for each class period. I now create a branch for each class period and, when I demonstrate how to write some code, I commit and sync that branch with some helpful notes to myself at the end of each class period. The next day, I switch to the corresponding class’s branch, read my notes, and we start right where we stopped.
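In git terms, this bookkeeping amounts to one branch per class period plus a commit at the end of each class. A minimal sketch, with hypothetical branch names and notes:

    
    # One branch per class period (the branch names are hypothetical).
    git checkout -b period-2        # first time only; afterwards just: git checkout period-2
    
    # At the end of the period, commit with notes to myself and sync.
    git add .
    git commit -m "Period 2: stopped partway through selection sort; inner loop still to write"
    git push origin period-2
    
    # The next day, before that class period:
    git checkout period-2
    git log -1                      # read the notes from yesterday's commit
    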

If you have any suggestions on how I can improve my students’ workflow or my workflow, please share. If you have been thinking of incorporating source control management into your computer science class, I encourage you to take the plunge. Your students will learn a very valuable skill!

Fluids Paradigm Lab

I taught a one-semester Advanced Physics class that culminated in the AP Physics B exam during my first five years of teaching. For the past two years, I taught an official AP Physics B course. Both of these courses were packed with content. Despite being a proponent of Modeling Instruction and incorporating it into other courses, I never felt I could make it fit in these courses.

This year, I’m teaching the new AP Physics 2 course. The focus on inquiry, deep understanding of physics, and science practices (and less content) aligns wonderfully with Modeling Instruction.

We just started the first major unit, fluids. I guided my students through a paradigm lab to model pressure vs. depth in a fluid. We started by watching this video of a can being crushed as it descends in a lake. I was worried students would find the demonstrated phenomenon too simple, but that definitely wasn’t the case. Like any paradigm lab, we started by making observations:

  • the can gets crushed
  • the can gets crushed more as it gets deeper
  • the top of the can appears to be sealed
  • the can must be empty (student commented that if full, it wouldn’t be crushed)

Students then enumerated variables that may be related to the crushing of the can:

  • water pressure
  • volume of water above the can
  • strength of can
  • air pressure inside of can
  • gravitational field strength (student said “gravity” and I went on a tangent about fields…)
  • temperature of water
  • atmospheric pressure
  • type (density) of fluid
  • water depth
  • speed of descent
  • dimensions, surface area, shape of can
  • motion of water

Students readily agreed that it was the water pressure that crushed the can and that pressure was the dependent variable. In hindsight, I could have better focused the discussion by directing students to focus on the water pressure rather than the can itself. They had a lot of good ideas about what properties of the can would affect it being crushed, which I didn’t expect. I had to admit that I didn’t have any cans and we would have to focus on the fluid instead…. I was amazed that no one in my first class proposed that the depth of the fluid would play a role. Everyone in that class phrased the variable to measure as the volume of the fluid in the container above the can. This was fascinating to me and led to a surprising result for the students as the experiment was conducted. I think this illustrates the power of the modeling cycle and guided inquiry labs.

We next determined which of the above variables we could control (independent variables) and measure in the lab given the resources available at the moment:

  • volume of water above the can
  • type (density) of fluid
  • water depth
  • speed of descent

The materials we planned on using were Vernier LabQuest 2 interfaces, pressure sensors with glass tube attachments, three different sized beakers (for the volume variable), graduated cylinders, fluids (water, canola oil, saturated salt water).

We then defined the purpose of our experiment:

To graphically and mathematically model the relationship between (TGAMMTRB) pressure, volume of fluid above, depth below surface of fluid, descent rate, and type of fluid (density).

We divided these various experiments among the lab groups, and groups started designing their particular experiment.

At the start of class the next day, groups shared their results. I was particularly impressed with the groups investigating pressure vs. volume of fluid above a point. While they measured a relationship between pressure and volume, their experimental design was sufficiently robust that they also noticed that the same volume above the measurement point resulted in different pressures in different beakers! That is, the pressure with 400 mL of water above the sensor in the 600 mL beaker is different than in the 1000 mL beaker and different again from that in the 2000 mL beaker. After further investigation they concluded that the relationship was based on depth, not volume.

The groups investigating pressure vs. depth in fluid were confident that the pressure at a point depended on the depth below the surface of the fluid, and they had sufficient data that they were also confident that there was a linear relationship between pressure and depth.

The groups that investigated pressure vs. fluid density at constant depth/volume had inconclusive results. The pressure they measured varied by less than 1% between the three types of fluids. This provided an opportunity to discuss how the experimental technique can affect the uncertainty of the measurement. We discussed that, with the new understanding of the relationship between pressure and depth, these groups could gather several measurements at various depths in each of the three fluids and compare the slopes of the resulting graphs to see if density has an effect. While we were discussing measurement uncertainty, we also discussed how the depth is defined not by the position of the bottom of the glass tube, but by the water level within the glass tube. I learned of this important experimental technique in the article “Pressure Beneath the Surface of a Fluid: Measuring the Correct Depth” in The Physics Teacher. While the groups investigating the effect of fluid density on pressure applied their new experimental technique, the rest of the groups repeated gathering pressure vs. depth data while carefully examining the fluid level in the glass tube.

After a second day of measurements, students confirmed the linear relationship between pressure and depth. In addition, with the improved experimental design, students confirmed a relationship between pressure and fluid density. The results were not as accurate as I had expected, and we identified additional sources of error that may have contributed. For example, a couple of groups lost the seal between the glass tube and the plastic tube connected to the pressure sensor while the glass tube was in the fluid. When this happens, the fluid fills the glass tube, and future measurements are incorrect if the glass tube is reconnected without removing it from the fluid.
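For reference, the model that the pressure vs. depth data supports is the standard hydrostatic relationship:

    
    P = P0 + ρgh
    

where P0 is the pressure at the surface, ρ is the fluid density, and h is the depth below the surface. The slope of a pressure vs. depth graph should therefore be ρg: roughly 9.8 kPa per meter for water, a bit larger for the saturated salt water, and a bit smaller for the canola oil.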

I asked my TA to minimize the known sources of measurement uncertainty, perform the experiment, and determine how accurately pressure vs. depth could be measured. The slope of his pressure vs. depth graph was within 3.16% of the expected value. This is quite a reasonable result. If we used a taller graduated cylinder, I expect the error could be reduced further.

I’ll definitely do this paradigm lab again next year!

Student Evolution of Descriptions of Learning and Grades

I found this post accidentally saved as a draft from last December! The year referenced is the 2013-2014 school year. I should check this year’s student information survey and see if these patterns persist (although I don’t have Honors Physics students this year). I still want to share this; so, here it is….

At the start of every year, all of my students complete a survey which helps me get to know them better and faster. This year, I noticed a bit of a difference between the responses of my Honors Physics class and my AP Physics B class to a couple of questions. Most of the AP Physics B students took Honors Physics last year and experienced a year of standards-based assessment and reporting indoctrination. One question was “A grade of ‘A’ means …”. I captured the two classes’ responses in a Wordle word cloud. My Honors Physics class:

[Word cloud: Honors Physics responses to “A grade of ‘A’ means …”]

My AP Physics class:

[Word cloud: AP Physics B responses to “A grade of ‘A’ means …”]

I was pleased that both groups mentioned “understanding.” I found it interesting that “mastered” was more prominent with the second-year physics students. The Honors Physics students mentioned their parents, but no one in AP Physics did. Overall, the AP Physics students had more varied descriptions.

I found the differences between the responses to the question “Learning is …” more insightful. My Honors Physics class:

[Word cloud: Honors Physics responses to “Learning is …”]

My AP Physics B class:

[Word cloud: AP Physics B responses to “Learning is …”]

My conclusion? My Honors Physics students don’t yet understand what learning is; they could barely describe it. My AP Physics students had much richer descriptions that featured “knowledge”, “understanding”, “fun”, “awesome”, “new”, and “life.”

These word clouds illustrate the growth that students achieve in my colleague’s and my physics course. This growth doesn’t show up on an AP exam, the ACT, or any other standardized test, but it is important.

Summer Reading

A good summer of reading. I didn’t read quite as much as I had hoped, but more than I feared I would. My focus this summer was influenced by my work on my district’s Science Curriculum Team, which is incorporating the Next Generation Science Standards into our science curriculum. As part of my contribution to this team, I want to promote the development of a continuous narrative that students will find engaging throughout primary and secondary school. I’ll write more about this later, but I believe the history of science plays a crucial role in this endeavor.

Quantum Man by Lawrence Krauss

I find Lawrence Krauss’ writing and speaking engaging. This biography of Richard Feynman focuses more on his pursuit of understanding through science than on his infamous antics.

Creating Innovators: The Making of Young People Who Will Change the World by Tony Wagner

A committee I was on started reading this book last year. It was good and the case studies were interesting. I think it could have been condensed quite a bit without losing too much.

The Edge of Physics: A Journey to Earth’s Extremes to Unlock the Secrets of the Universe by Anil Ananthaswamy

This book is amazing. Ananthaswamy travels around the world to explore the most interesting experiments in the field of cosmology. Reading how these scientists actually go about their experiments and the challenges they face due to their environment is fascinating. These are the types of stories that need to be shared with students.

Trinity: A Graphic History of the First Atomic Bomb by Jonathan Fetter-Vorm

An excellent graphic novel that captures the start of the Atomic Age. This book is a fantastic resource for students researching the development of the atomic bomb.

The Ten Most Beautiful Experiments by George Johnson

This was my least favorite book of the summer. I just didn’t find the stories as engaging as others that capture the history of science.

A Short History of Nearly Everything by Bill Bryson

I read this book years ago, but read it again this summer in order to make annotations that can be incorporated in this narrative of science in which I’m interested. Bryson is an incredibly engaging writer and truly captures the wonder of how little we understand about the world (and beyond) in which we live.

I’m in the midst of two other books and hope to continue to make progress as the school year starts.

Chromebook Toolchain for AP Physics

This fall, my AP Physics 2 classes will be using Chromebooks as part of my school district’s 1:1 pilot. Chromebooks were new to me; so, it took some time this summer to find the apps to support the workflow I want for this class. While I’m sure the toolchain will change throughout the semester, and there will be surprises (both pleasant and otherwise), here is the starting toolchain:

  • Canvas. Everything starts and ends with this learning-management system.

We will do a lot of lab activities. The workflow depends on the amount of data acquired and the level of graphical analysis required. The start of the workflow is the same:

  • LabQuest 2. Vernier’s LabQuest 2 can create its own ad-hoc network or connect to the school’s wireless network. The LabQuest 2 hosts its own web page as part of their Connected Science System. Students can then access the device, the data, and graphs via Chrome. Data and graphs can be exported to the Chromebook via the web page.

The next tool depends upon the lab. For some labs, the data and graphs produced on the LabQuest 2 are sufficient. Students will import these into their Google document and create whatever is required for their lab report. If additional analysis is required and the data sets are relatively small:

  • Desmos. Graphs can be shared via a link and an image can be embedded in the Google document.

If data sets are large or more sophisticated analysis is required:

  • Plot.ly. Plot.ly seemed to explode onto the education scene this summer, or maybe I was just paying more attention. Data exported from the LabQuest 2 can easily be imported into Plot.ly. Like Desmos, graphs can be shared via a link and an image can be embedded in the Google document. Plot.ly can also embed its graphs in an iframe, but I couldn’t find a way to embed that in a Google document as opposed to a web page. Fran Poodry from Vernier made a great screencast demonstrating the integration of the LabQuest 2 and Plot.ly.

Regardless of the analysis performed, in the end, students create their lab report in Google docs and submit it via Canvas.

Another important aspect of class is the exploration and modification of computational models. In the past, we’ve used VPython. I had to find an alternative that would be compatible with Chromebooks:

  • GlowScript. GlowScript is the up-and-coming platform for computational models, with the advantage that it runs in a browser that supports WebGL. I’m not a huge fan of JavaScript syntax for novice programmers; so, we will be using CoffeeScript instead. I didn’t write as many starting models over the summer as I had hoped, but I did at least verify that complicated models can be ported.

Peer instruction is one of the most effective and popular classroom activities that we do. In the past, I’ve used handheld clickers. This year, we will use the Chromebooks:

  • InfuseLearning. There are a number of web apps in this space, but I selected InfuseLearning because it allows the creation of spontaneous questions and supports a variety of answer methods, including drawing and sort-in-order. Pear Deck looks promising, but I don’t want to be forced to create my set of questions ahead of time.

For notes in class, I’ll leave it up to students to use whatever tool works best for them (including paper and pencil). I’ll suggest they at least take a look at:

  • Evernote. I love Evernote and use it all the time for all sorts of stuff.

I do provide students with PDFs of my slides. I can envision that students may want to annotate these PDFs or other handouts. Surprisingly, this was the hardest tool to find:

  • Crocodoc. The free personal version allows students to upload a PDF, annotate it, and export their annotated version. Another tool I explored is Notable PDF, which requires paid licenses to be useful; we may try it if we find Crocodoc lacking.

A couple of other tools look interesting, but I’m not sure if they fit into the toolchain for my class:

  • Doctopus. I think Canvas assignments and SpeedGrader cover everything that I personally would do with this app.

  • 81Dash. Private back-channeling app.

I’m sure I will learn of new tools throughout the semester and I’ll make adjustments to the toolchain. If you are using Chromebooks, please share your favorite apps below in the comments!

AP Physics 2 Syllabus, Units, Labs, and Pacing

I previously shared how I will be using the AP Physics 2 Big Ideas and Enduring Understandings as the standards for my flavor of standards-based assessment and reporting for AP Physics 2. Since then, I’ve been working on outlining my sequence of units, pacing, and labs. This allowed me to finish the syllabus to submit for the College Board Audit. I based my syllabus heavily on Dolores Gende’s syllabus. My syllabus is 1252560v1, in case anyone finds it helpful in preparing theirs.

The syllabus that I share with students and parents provides all of the specifics on the structure of the course.

My sequence of units and pacing is based on a fall semester of 15 weeks and 2 days (plus finals) and a spring semester of 13 weeks to April 22nd (at which point we start reviewing for the exam). We will be using College Physics, 3rd Edition, by Knight, Jones, and Field. My pacing reflects our first year physics courses which cover more of electrostatics and circuits than the minimum required by AP Physics 1.

Please share any feedback or questions that you have!

Fall Semester

Unit 1: Relativity and Computational Modeling

  • time: 1 week
  • Knight: Chapter 27
  • computational model:
    • frames of reference

Unit 2: Fluid Mechanics

  • time: 3 weeks
  • Knight: Chapter 13 (sections 13.1-13.6)
  • computational models:
    • buoyancy
    • Torricelli projectile
  • labs:
    • pressure beneath the surface
    • hydrometer
    • Archimedes
    • Bernoulli/Venturi
    • water projectile

Unit 3: Thermodynamics

  • time: 4 weeks
  • Knight: Chapters 10.5; 12.1-12.4, 12.8; 11.1, 11.3-11.8
  • computational models:
    • kinetic theory
    • heat transfer between liquids of different temperatures (thermal equilibrium)
    • entropy
  • labs:
    • heat engine
    • heat transfer
    • temperature and kinetic theory
    • entropy activity

Unit 4: Electrostatics

  • time: 4 weeks
  • Knight: Chapters 20, 21
  • computational model:
    • electric field/potential maps (3D?)
  • labs:
    • Millikan Movies
    • electric potential mapping
    • dielectric constant and parallel plate capacitor lab
    • simulations (field hockey, fields and potentials)

Unit 5: Electric Circuits

  • time: 2.6 weeks
  • Knight: Chapters 22, 23 (23.1-23.7)
  • labs:
    • conductivity/resistivity lab
    • Experimenting with constant current and voltage sources
    • RC circuits

Capstone

  • time: 4 days

Spring Semester

Unit 6: Magnetostatics and Electromagnetism

  • time: 4 weeks
  • Knight: Chapters 24, 25
  • computational models:
    • charged particle in an external magnetic field
  • labs:
    • magnetism activities
    • mass of the electron
    • measurement of a magnetic field
    • electromagnetic induction activities
    • Faraday’s Law
    • electric motors
    • determine number of loops in solenoid
    • Lenz’s Law Demonstration Using an Ultrasound Position Sensor

Unit 7: Geometric and Physical Optics

  • time: 4 weeks
  • Knight: Chapters 17, 18
  • labs:
    • reflection activities
    • mirrors lab
    • refraction activities
    • refraction/total internal reflection lab
    • lenses activity
    • lenses lab
    • diffraction and interference
    • thin film interference lab
    • interferometer thermal expansion
    • holograms
    • Determining the Thickness and Refractive Index of a Mirror

Unit 8: Quantum, Atomic, Nuclear Physics

  • time: 4 weeks
  • Knight: Chapters 28, 29, 30
  • computational model:
    • half life
  • labs:
    • hydrogen spectrum
    • photoelectric effect
    • half life
    • stochastic nature of radiation
    • LED lab for Planck’s constant

Review

  • time: (4 days for final exam) + 6.5 days for analysis and review
  • April 23-24, 27-28: final exam

Unit 9: Particle Physics and Cosmology

  • time: 2 weeks