Author Archives: geoff

GitHub, Canvas, and Computer Science

There are certain software development techniques or tools that are not strictly part of the AP Computer Science curriculum that I think are valuable for students to learn about and practice in my class. Two years ago, I incorporated pair programming. Last year, I added test-driven development and JUnit. This year, I made GitHub an integral part of the course.

I want students to be aware of and appreciate the value of source control management. GitHub was the obvious choice: it is supportive of education and is most likely the specific tool that students will encounter.

After consulting with a couple of former students more familiar with GitHub than I, I decided to create a repository for each unit in the course. At the start of each unit, students fork that unit’s repository and clone it to their desktop. They perform these operations through the GitHub web site.

Throughout the unit, I encourage students to put all of their code in their forked repository and frequently commit and sync. This provides students with all of the typical advantages of source control: they can more easily work at both school and home, and they can revert to an earlier version of code if a project goes astray.

At the end of the unit when students have completed their summative lab, they issue a pull request to submit the lab. They then submit the URL to this pull request via the Canvas assignment that I created for the summative lab.

I created a video tutorial that captures the student workflow:

The student workflow works well except when students accidentally generate merge conflicts by not keeping their home and school copies synced.

While exposing students to GitHub would be benefit enough, this particular workflow also makes evaluating their summative labs extremely efficient.

I still use Canvas’s SpeedGrader to keep track of who has submitted the lab and to provide detailed feedback to students. In previous years, I had students submit a zip file of their entire project folder. The link to their pull request is much more efficient. My workflow for evaluating their lab is the following:

  1. Click on the link in SpeedGrader to open the pull request page in another tab.
  2. Click on the “Check out this branch in GitHub for Mac” icon which does exactly that.
  3. Open the BlueJ project file for the summative lab, read the code, read the docs, run the project, and provide feedback via SpeedGrader.
  4. Close the pull request.
  5. Run the following script which switches back to the master branch and removes all traces of the student’s pull request:

    git reset --hard     # discard changes made while running the student's project
    git clean -xdf       # remove untracked files (compiled classes, etc.)
    git checkout master  # return to the master branch

  6. After evaluating all of the labs, I list all of the branches that I checked out: git branch --list

  7. I then delete each of these branches: git branch -D pr/xx

While the above may seem like a lot of steps, there is very little overhead and it is much more efficient than my previous workflow.

I’m embarrassed to admit that there is another advantage of these GitHub repositories for each unit that I didn’t utilize until this past unit. While making notes to myself about where one class had stopped as I modeled how to write a certain algorithm, it struck me that I could create a branch for each class period. I now create a branch for each class period and, when I demonstrate how to write some code, I commit and sync that branch, with some helpful notes to myself, at the end of the period. The next day, I switch to the corresponding class’s branch, read my notes, and we start right where we stopped.

If you have any suggestions on how I can improve my students’ workflow or my workflow, please share. If you have been thinking of incorporating source control management into your computer science class, I encourage you to take the plunge. Your students will learn a very valuable skill!

Fluids Paradigm Lab

I taught a one-semester Advanced Physics class that culminated in the AP Physics B exam my first five years of teaching. For the past two years, I taught an official AP Physics B course. Both of these courses were packed with content. Despite being a proponent of Modeling Instruction and incorporating it into other courses, I never felt I could make it fit in these courses.

This year, I’m teaching the new AP Physics 2 course. The focus on inquiry, deep understanding of physics, and science practices (and less content) aligns wonderfully with Modeling Instruction.

We just started the first major unit, fluids. I guided my students through a paradigm lab to model the pressure vs. depth in a fluid. We started by watching this video of a can being crushed as it descends in a lake. I was worried students would find the demonstrated phenomenon too simple, but that definitely wasn’t the case. Like any paradigm lab, we started by making observations:

  • the can gets crushed
  • the can gets crushed more as it gets deeper
  • the top of the can appears to be sealed
  • the can must be empty (student commented that if full, it wouldn’t be crushed)

Students then enumerated variables that may be related to the crushing of the can:

  • water pressure
  • volume of water above the can
  • strength of can
  • air pressure inside of can
  • gravitational field strength (student said “gravity” and I went on a tangent about fields…)
  • temperature of water
  • atmospheric pressure
  • type (density) of fluid
  • water depth
  • speed of descent
  • dimensions, surface area, shape of can
  • motion of water

Students readily agreed that it was the water pressure that crushed the can and that pressure was the dependent variable. In hindsight, I could have better focused the discussion by directing students to focus on the water pressure rather than the can itself. They had a lot of good ideas about what properties of the can would affect it being crushed, which I didn’t expect. I had to admit that I didn’t have any cans and we would have to focus on the fluid instead…. I was amazed that no one in my first class proposed that the depth of the fluid would play a role. Everyone in that class said that the volume of the fluid in the container above the can was a variable to measure. This was fascinating to me and led to a surprising result for the students as the experiment was conducted. I think this illustrates the power of the modeling cycle and guided inquiry labs.

We next determined which of the above variables we could control (independent variables) and measure in the lab given the resources available at the moment:

  • volume of water above the can
  • type (density) of fluid
  • water depth
  • speed of descent

The materials we planned on using were Vernier LabQuest 2 interfaces, pressure sensors with glass tube attachments, three different sized beakers (for the volume variable), graduated cylinders, fluids (water, canola oil, saturated salt water).

We then defined the purpose of our experiment:

To graphically and mathematically model the relationship between (TGAMMTRB) pressure, volume of fluid above, depth below surface of fluid, descent rate, and type of fluid (density).

We divided these various experiments among the lab groups, and groups started designing their particular experiment.

At the start of class the next day, groups shared their results. I was particularly impressed with the groups investigating pressure vs. volume of fluid above a point. While they measured a relationship between pressure and volume, their experimental design was sufficiently robust that they also noticed that the same volume above the measurement point resulted in different pressures in different beakers! That is, the pressure with 400 mL of water above the sensor in the 600 mL beaker is different than in the 1000 mL beaker and different again from that in the 2000 mL beaker. After further investigation they concluded that the relationship was based on depth, not volume.

The groups investigating pressure vs. depth in fluid were confident that the pressure at a point depended on the depth below the surface of the fluid, and they had sufficient data that they were also confident that there was a linear relationship between pressure and depth.

The groups that investigated pressure vs. fluid density at constant depth/volume had inconclusive results. The pressure they measured varied by less than 1% between the three types of fluids. This provided an opportunity to discuss how the experimental technique can affect the uncertainty of the measurement. We discussed that with the new understanding of the relationship between pressure and depth, these groups could gather several measurements at various depths in each of the three fluids and compare the slopes of the resulting graphs to see if density has an effect. While we were discussing measurement uncertainty, we also discussed how the depth is defined not by the position of the bottom of the glass tube, but the water level within the glass tube. I learned of this important experimental technique in the article “Pressure Beneath the Surface of a Fluid: Measuring the Correct Depth” in The Physics Teacher. While the groups investigating the effect of fluid density on pressure applied their new experimental technique, the rest of the groups repeated gathering pressure vs. depth data while carefully examining the fluid level in the glass tube.

After a second day of measurements, students confirmed the linear relationship between pressure and depth. In addition, with the improved experimental design, students confirmed a relationship between pressure and fluid density. The results were not as accurate as I had expected. We identified a couple of additional errors that may have contributed. One: a couple of groups lost the seal between the glass tube and the plastic tube connected to the pressure sensor while the glass tube was in the fluid. This lets fluid fill the glass tube, and subsequent measurements are incorrect if the glass tube is reconnected without removing it from the fluid.

I asked my TA to minimize the known sources of measurement uncertainty, perform the experiment, and determine how accurately pressure vs. depth could be measured. The slope of his pressure vs. depth graph was within 3.16% of the expected value. This is quite a reasonable result. If we used a taller graduated cylinder, I expect the error could be reduced further.
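For reference, the linear model the students converged on matches the standard hydrostatic relationship, so the slope of a pressure vs. depth graph should be the product of fluid density and gravitational field strength:

```latex
P = P_0 + \rho g h
```

where $P_0$ is the pressure at the fluid’s surface, $\rho$ is the fluid density, $g$ is the gravitational field strength, and $h$ is the depth below the surface. For water, the expected slope is $\rho g \approx (1000\ \mathrm{kg/m^3})(9.8\ \mathrm{N/kg}) \approx 9800\ \mathrm{Pa/m}$, which is the value the 3.16% error above is measured against.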

I’ll definitely do this paradigm lab again next year!

Student Evolution of Descriptions of Learning and Grades

I found this post accidentally saved as a draft from last December! The year referenced is the 2013-2014 school year. I should check this year’s student information survey and see if these patterns persist (although I don’t have Honors Physics students this year). I still want to share this; so, here it is….

At the start of every year, all of my students complete a survey which helps me get to know them better and faster. This year, I noticed a bit of a difference between the responses of my Honors Physics class and my AP Physics B class to a couple of questions. Most of the AP Physics B students took Honors Physics last year and experienced a year of standards-based assessment and reporting indoctrination. One question was “A grade of ‘A’ means …”. I captured the two classes’ responses in a Wordle cloud. My Honors Physics class:

Honors A

My AP Physics class:


I was pleased that both groups mentioned understanding. I found it interesting that mastered was more prominent with the 2nd year physics students. The Honors Physics students mentioned their parents but no one in AP Physics did. Overall, the AP Physics students had more varied descriptions.

I found the differences between the responses to the question “Learning is …” more insightful. My Honors Physics class:

Honors learning

My AP Physics B class:

AP learning

My conclusion? My Honors Physics students don’t yet understand what learning is; they could barely describe it. My AP Physics students had much richer descriptions that featured “knowledge”, “understanding”, “fun”, “awesome”, “new”, and “life.”

These word clouds illustrate the growth that students achieve in my colleague’s and my physics courses. This growth doesn’t show up on an AP exam, the ACT, or any other standardized test, but it is important.

Summer Reading

A good summer of reading. I didn’t read quite as much as I had hoped, but more than I feared I would. My focus this summer was influenced by my work on my district’s Science Curriculum Team, which is incorporating the Next Generation Science Standards into our science curriculum. As part of my contribution to this team, I want to promote the development of a continuous narrative that students will find engaging throughout primary and secondary school. I’ll write more about this later, but I believe the history of science plays a crucial role in this endeavor.

Quantum Man by Lawrence Krauss

I find Lawrence Krauss’ writing and speaking engaging. This biography of Richard Feynman focuses more on his pursuit of understanding through science than on his infamous antics.

Creating Innovators: The Making of Young People Who Will Change the World by Tony Wagner

A committee I was on started reading this book last year. It was good and the case studies were interesting. I think it could have been condensed quite a bit without losing too much.

The Edge of Physics: A Journey to Earth’s Extremes to Unlock the Secrets of the Universe by Anil Ananthaswamy

This book is amazing. Ananthaswamy travels around the world to explore the most interesting experiments in the field of cosmology. Reading how these scientists actually go about their experiments and the challenges they face due to their environment is fascinating. These are the types of stories that need to be shared with students.

Trinity: A Graphic History of the First Atomic Bomb by Jonathan Fetter-Vorm

An excellent graphic novel that captures the start of the Atomic Age. This book is a fantastic resource for students researching the development of the atomic bomb.

The Ten Most Beautiful Experiments by George Johnson

This was my least favorite book of the summer. I just didn’t find the stories as engaging as others that capture the history of science.

A Short History of Nearly Everything by Bill Bryson

I read this book years ago, but read it again this summer in order to make annotations that can be incorporated in this narrative of science in which I’m interested. Bryson is an incredibly engaging writer and truly captures the wonder of how little we understand about the world (and beyond) in which we live.

I’m in the midst of two other books and hope to continue to make progress as the school year starts.

Chromebook Toolchain for AP Physics

This fall, my AP Physics 2 classes will be using Chromebooks as part of my school district’s 1:1 pilot. Chromebooks were new to me; so, it took some time this summer to find the apps to support the workflow I want for this class. While I’m sure the toolchain will change throughout the semester, and there will be surprises (both pleasant and otherwise), here is the starting toolchain:

  • Canvas. Everything starts and ends with this learning-management system.

We will do a lot of lab activities. The workflow depends on the amount of data acquired and the level of graphical analysis required. The start of the workflow is the same:

  • LabQuest 2. Vernier’s LabQuest 2 can create its own ad-hoc network or connect to the school’s wireless network. The LabQuest 2 hosts its own web page as part of Vernier’s Connected Science System. Students can then access the device, the data, and graphs via Chrome. Data and graphs can be exported to the Chromebook via the web page.

The next tool depends upon the lab. For some labs, the data and graphs produced on the LabQuest 2 are sufficient. Students will import these into their Google document and create whatever is required for their lab report. If additional analysis is required and the data sets are relatively small:

  • Desmos. Graphs can be shared via a link, and an image can be embedded in the Google document.

If data sets are large or more sophisticated analysis is required:

  • Plotly. Plotly seemed to explode onto the education scene this summer, or maybe I was just paying more attention. Data exported from the LabQuest 2 can easily be imported into Plotly. Like Desmos, graphs can be shared via a link and an image can be embedded in the Google document. Plotly can also embed its graphs in an iframe, but I couldn’t find a way to embed that in a Google document as opposed to a web page. Fran Poodry from Vernier made a great screencast demonstrating the integration of the LabQuest 2 and Plotly.

Regardless of the analysis performed, in the end, students create their lab report in Google docs and submit it via Canvas.

Another important aspect of class is the exploration and modification of computational models. In the past, we’ve used VPython. I had to find an alternative that would be compatible with Chromebooks:

  • Glowscript. Glowscript is the up-and-coming platform for computational models with the advantage that it runs in a browser that supports WebGL. I’m not a huge fan of JavaScript syntax for novice programmers; so, we will be using CoffeeScript instead. I didn’t write as many starting models over the summer as I had hoped, but I did at least verify that complicated models can be ported.

Peer instruction is one of the most effective and popular classroom activities that we do. In the past, I’ve used handheld clickers. This year, we will use the Chromebooks:

  • InfuseLearning. There are a number of web apps in this space, but I selected InfuseLearning because it allows the creation of spontaneous questions and supports a variety of answer methods, including drawing and sort-in-order. Pear Deck looks promising, but I don’t want to be forced to create my set of questions ahead of time.

For notes in class, I’ll leave it up to students to use whatever tool works best for them (including paper and pencil). I’ll suggest they at least take a look at:

  • Evernote. I love Evernote and use it all the time for all sorts of stuff.

I do provide students with PDFs of my slides. I can envision that students may want to annotate these PDFs or other handouts. Surprisingly, this was the hardest tool to find:

  • Crocodoc. The free personal version allows students to upload a PDF, annotate it, and export their annotated version. The other tool I explored, Notable PDF, requires paid licenses to be useful; we may try it if we find Crocodoc lacking.

A couple of other tools look interesting, but I’m not sure if they fit into the toolchain for my class:

  • Doctopus. I think Canvas assignments and SpeedGrader cover everything that I personally would do with this app.

  • 81Dash. Private back-channeling app.

I’m sure I will learn of new tools throughout the semester and I’ll make adjustments to the toolchain. If you are using Chromebooks, please share your favorite apps below in the comments!

AP Physics 2 Syllabus, Units, Labs, and Pacing

I previously shared how I will be using the AP Physics 2 Big Ideas and Enduring Understandings as the standards for my flavor of standards-based assessment and reporting for AP Physics 2. Since then, I’ve been working on outlining my sequence of units, pacing, and labs. This allowed me to finish the syllabus to submit for the College Board Audit. I based my syllabus heavily on Dolores Gende’s syllabus. My syllabus is 1252560v1, in case anyone finds it helpful in preparing theirs.

The syllabus that I share with students and parents provides all of the specifics on the structure of the course.

My sequence of units and pacing is based on a fall semester of 15 weeks and 2 days (plus finals) and a spring semester of 13 weeks to April 22nd (at which point we start reviewing for the exam). We will be using College Physics, 3rd Edition, by Knight, Jones, and Field. My pacing reflects our first-year physics courses, which cover more of electrostatics and circuits than the minimum required by AP Physics 1.

Please share any feedback or questions that you have!

Fall Semester

Unit 1: Relativity and Computational Modeling

  • time: 1 week
  • Knight: Chapter 27
  • computational model:
    • frames of reference

Unit 2: Fluid Mechanics

  • time: 3 weeks
  • Knight: Chapter 13 (sections 13.1-13.6)
  • computational models:
    • buoyancy
    • Torricelli projectile
  • labs:
    • pressure beneath the surface
    • hydrometer
    • Archimedes
    • Bernoulli/Venturi
    • water projectile

Unit 3: Thermodynamics

  • time: 4 weeks
  • Knight: Chapters 10.5; 12.1-12.4, 12.8; 11.1, 11.3-11.8
  • computational model:
    • kinetic theory
    • heat transfer between liquids of different temperatures (thermal equilibrium)
    • entropy
  • labs:
    • heat engine
    • heat transfer
    • temperature and kinetic theory
    • entropy activity

Unit 4: Electrostatics

  • time: 4 weeks
  • Knight: Chapters 20, 21
  • computational model:
    • electric field/potential maps (3D?)
  • labs:
    • Millikan Movies
    • electric potential mapping
    • dielectric constant and parallel plate capacitor lab
    • simulations (field hockey, fields and potentials)

Unit 5: Electric Circuits

  • time: 2.6 weeks
  • Knight: Chapters 22, 23 (23.1-23.7)
  • labs:
    • conductivity/resistivity lab
    • Experimenting with constant current and voltage sources
    • RC circuits


  • time: 4 days

Spring Semester

Unit 6: Magnetostatics and Electromagnetism

  • time: 4 weeks
  • Knight: Chapters 24, 25
  • computational models:
    • charged particle in an external magnetic field
  • labs:
    • magnetism activities
    • mass of the electron
    • measurement of a magnetic field
    • electromagnetic induction activities
    • Faraday’s Law
    • electric motors
    • determine number of loops in solenoid
    • Lenz’s Law Demonstration Using an Ultrasound Position Sensor

Unit 7: Geometric and Physical Optics

  • time: 4 weeks
  • Knight: Chapters 17, 18
  • labs:
    • reflection activities
    • mirrors lab
    • refraction activities
    • refraction/total internal reflection lab
    • lenses activity
    • lenses lab
    • diffraction and interference
    • thin film interference lab
    • interferometer thermal expansion
    • holograms
    • Determining the Thickness and Refractive Index of a Mirror

Unit 8: Quantum, Atomic, Nuclear Physics

  • time: 4 weeks
  • Knight: Chapters 28, 29, 30
  • computational model:
    • half life
  • labs:
    • hydrogen spectrum
    • photoelectric effect
    • half life
    • stochastic nature of radiation
    • LED lab for Planck’s constant


  • time: (4 days for final exam) + 6.5 days for analysis and review
  • April 23-24, 27-28: final exam

Unit 9: Particle Physics and Cosmology

  • time: 2 weeks

AP Computer Science Summative Labs

I have selected the summative labs for next year’s AP Computer Science course. Students complete a huge number of practice programming activities as a whole class, through pair programming, or on their own. The purpose of these practice activities is to focus on learning and to receive and provide feedback. At the end of each unit, students complete a summative lab. I kept the most successful labs from last year pretty much unchanged, revised some, and incorporated some new ones. An important element of each summative lab is the extensions. None of the extensions are required, but most students try to implement some of them or create their own extension. I have students with diverse programming experience and knowledge, and the extensions provide the opportunity for each student to be challenged.

A new element of next year’s class, which affects all the labs, is the use of GitHub. I want students to be exposed to and gain familiarity with some of the best practices of software engineering. Last year, students leveraged JUnit to write unit tests. This year, they will use GitHub for source control management. I created a repository for each unit which contains some practice projects and at least an empty project for the summative lab (maybe more, depending on the lab). Students will fork these repositories into their own accounts and commit changes as they work on the practice programming activities and summative lab throughout the unit. When they complete the summative lab, they will create a pull request. In response, I will offer feedback and assess their summative lab. At the end of the course, students will have a collection of repositories in their own GitHub account that captures all of their coding.

Here are the summative labs with links to the associated unit’s GitHub repository:

Turtle Lab

This is a new summative lab. I’m de-emphasizing (but not eliminating) GridWorld. The first lab used to involve placing GridWorld actors in a world. Instead, students will do something similar with turtles. This lab leverages the Media Computation classes from Georgia Tech. This lab nicely frames the first unit as students create a turtle world program on their very first day of class.


The goal of this lab is for you to experience what your first assignment as a software developer at a software company may be like. You will be working with a large body of code (Turtle and TurtleWorld) with which you are unfamiliar. You don’t understand all of the intricacies of the existing code base and you aren’t yet familiar with many of the Java language features that you will use. However, through experimentation and browsing the documentation, you will be able to complete this assignment and most likely go beyond the requirements!


  • Sync the TurtleLab directory from GitHub.
  • Draw a pattern using at least two Turtle objects.
  • Invoke multiple methods on the turtle objects to change their attributes.
  • Follow our Java style guidelines.


  • Make a more sophisticated pattern.
  • Make a pattern that involves multiple colors.
  • Change the picture for the turtle to something else.
  • Add more awesome.
  • Something else? Be creative and share!


  • Submit a pull request in GitHub and submit a link to the request with this assignment.

Cityscape Lab

This lab was new last year and was created by my counterpart at our sister high school. I’ve left it as is other than incorporating the design class diagram into the lab description. (I drew the DCD on the whiteboard last year in response to student questions.)


The goal of this lab is for you to demonstrate that you can define classes, create objects, and display graphics via a Java application.


  • Create a Java graphical application (with a Viewer class, Component class) that displays a cityscape.
  • Design at least three classes for elements of the cityscape (e.g., building, window, sun, moon, car, tree)
    • demonstrate good class design (encapsulation)
    • provide configurability through constructors and instance variables (size, color, etc.)
  • Create multiple instances of a class with different properties and display in the cityscape.
  • Document the class using JavaDoc comments as demonstrated in our code template.
  • Peer review another student’s lab and provide comments in the rubric hosted in Canvas.
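To make the design expectations concrete, here is a minimal sketch of what one cityscape element class might look like; the class and property names are my own illustration, not part of the lab requirements:

```java
import java.awt.Color;
import java.awt.Graphics2D;

// Illustrative cityscape element: encapsulated state, configurable
// through the constructor, drawn from its own instance variables.
public class Building {
    private final int x;      // left edge of the building
    private final int width;
    private final int height;
    private final Color color;

    public Building(int x, int width, int height, Color color) {
        this.x = x;
        this.width = width;
        this.height = height;
        this.color = color;
    }

    // The component class passes in its graphics context and the ground line.
    public void draw(Graphics2D g2, int groundY) {
        g2.setColor(color);
        g2.fillRect(x, groundY - height, width, height);
    }

    public int getHeight() {
        return height;
    }
}
```

Because each instance carries its own position, size, and color, the component class can create several Building objects with different constructor arguments and draw them all, which is exactly the “multiple instances with different properties” requirement.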

Here is a design class diagram to illustrate the relationship among the classes in this project:



  • Define additional classes for additional elements of the cityscape.
  • Animate your cityscape (e.g., day and night, lights in windows, moving cars)
  • Randomize your cityscape (a different cityscape every time)
  • Add more awesome.


  • Submit a pull request in GitHub and submit a link to the request with this assignment.

Game of Life Lab

This was the first lab that I created on my own, and I have used it for the past two years. It is a perennial favorite of the students. Last year, I tried to incorporate unit testing into this lab and it was a bit rocky. This year, I’m providing more scaffolding to help students implement their unit tests.


The goal of this lab is to apply your understanding of decision and loop structures to implement a complex algorithm within the context of an unfamiliar and significant software framework (GridWorld). In addition, you will implement a unit test for your program using the JUnit framework and create documentation for your program using JavaDoc.


Write a program that plays Conway’s Game of Life. Conway’s Game of Life is a cellular automaton. From Wikipedia:

The universe of the Game of Life is an infinite two-dimensional orthogonal grid of square cells, each of which is in one of two possible states, alive or dead. Every cell interacts with its eight neighbours, which are the cells that are horizontally, vertically, or diagonally adjacent. At each step in time, the following transitions occur:

  • Any live cell with fewer than two live neighbours dies, as if caused by underpopulation.
  • Any live cell with two or three live neighbours lives on to the next generation.
  • Any live cell with more than three live neighbours dies, as if by overcrowding.
  • Any dead cell with exactly three live neighbours becomes a live cell, as if by reproduction.
  • The initial pattern constitutes the seed of the system. The first generation is created by applying the above rules simultaneously to every cell in the seed—births and deaths occur simultaneously, and the discrete moment at which this happens is sometimes called a tick (in other words, each generation is a pure function of the preceding one). The rules continue to be applied repeatedly to create further generations.
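The rules above translate almost directly into a nested loop that writes each new generation into a fresh grid, so that all cells update simultaneously. This is only a sketch under my own illustrative names, not the required design for the lab, and it uses a plain 2D array with a dead border rather than GridWorld:

```java
// Sketch of one Game of Life generation on a fixed-size grid.
// Cells outside the grid are treated as dead; names are illustrative.
public class GameOfLifeSketch {

    // Count live neighbors of cell (row, col), treating out-of-bounds as dead.
    static int liveNeighbors(boolean[][] grid, int row, int col) {
        int count = 0;
        for (int dr = -1; dr <= 1; dr++) {
            for (int dc = -1; dc <= 1; dc++) {
                if (dr == 0 && dc == 0) continue;
                int r = row + dr;
                int c = col + dc;
                if (r >= 0 && r < grid.length && c >= 0 && c < grid[0].length
                        && grid[r][c]) {
                    count++;
                }
            }
        }
        return count;
    }

    // Apply the four rules simultaneously by writing into a new grid.
    static boolean[][] nextGeneration(boolean[][] grid) {
        boolean[][] next = new boolean[grid.length][grid[0].length];
        for (int r = 0; r < grid.length; r++) {
            for (int c = 0; c < grid[0].length; c++) {
                int n = liveNeighbors(grid, r, c);
                next[r][c] = grid[r][c] ? (n == 2 || n == 3) : (n == 3);
            }
        }
        return next;
    }

    public static void main(String[] args) {
        // A "blinker" seed: three live cells in a row flip between
        // horizontal and vertical orientations each generation.
        boolean[][] grid = new boolean[5][5];
        grid[2][1] = grid[2][2] = grid[2][3] = true;
        boolean[][] next = nextGeneration(grid);
        System.out.println(next[1][2] && next[2][2] && next[3][2]); // prints true
    }
}
```

Writing into a separate next grid is what makes each generation a pure function of the preceding one; updating cells in place would let early births and deaths corrupt the neighbor counts.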

Nonfunctional Requirements:

  • the program must be implemented in Java and utilize the GridWorld platform
  • I recommend not using GridWorld’s execution engine to produce subsequent generations; it will be easier to implement the unit test if you directly produce and display subsequent generations.

Artifacts to Produce:

  • Requirements Document: Many functional and nonfunctional requirements need to be defined. You must define additional requirements that are reasonable and document them in a requirements document. I must review your requirements document before you start the design document or test plan. You may change the requirements document throughout development.
  • Design Document: You must do some design activity before starting implementation. This may consist of a flow chart, pseudocode, or other design artifact. I must review your design document before you start implementing code. You may change your design document throughout development.
  • Test Plan: You must create a test plan with specific test cases (at least two) before starting implementation. I must review your test plan before you start implementing your test class. You may change your test plan throughout development. The provided code in GitHub is an example of implementing an initial test case.
  • GameOfLifeTest class: JUnit-compatible test class that implements the test plan.
  • GameOfLife class: You must produce a working class that meets the requirements and is verified and validated by your test plan.
  • Reflection Document: This is a significant and challenging lab. Please reflect on this experience and share your feedback with me. What did you like or dislike and why? What was surprising or unexpected? What did you learn? What questions do you still have? What advice would you offer next year’s students?


  • Add custom icons for alive or dead cells.
  • Implement more sophisticated seed patterns.
  • Add more awesome.


  • Ensure the following artifacts have been committed to GitHub in addition to your code:
    • requirements document
    • design document (pseudocode, flow charts, etc.)
    • test plan (with specific test cases)
    • HTML documentation generated by JavaDoc
    • reflection document
  • Submit a pull request in GitHub and submit a link to the request with this assignment.


The implementation will be evaluated according to the Computer Science Grading Rubric in Canvas. The other artifacts will be evaluated independently.

Signals in Noise Lab

This is a new lab. It is inspired by the post “Detecting Signals and Noise” on the DataGenetics blog. I was looking for a new lab for arrays and array lists. I also wanted a lab that more directly represented an authentic application of computer science. This technique to detect a signal in the presence of noise is conceptually very similar to that used in particle physics experiments (e.g., dark matter candidates hitting multiple CCD pixels or tracks reconstructed within a particle collider). I wrote the code for the case of a stationary target as an example. Students will tackle the more challenging case of a moving target. I’m really looking forward to seeing how students engage with this lab.


The goal of this lab is to apply your understanding of 2D arrays to implement a complex algorithm. In addition, you will implement a unit test for your program using the JUnit framework and create documentation for your program using JavaDoc.

One common application of computing is signal analysis. In real-world applications, the data that is processed is a combination of something of interest (the signal) and garbage that obscures the signal (noise). Computational techniques for signal analysis are prevalent in a wide variety of scientific and financial applications. This lab provides a simplified context in which to explore signal analysis. Conceptually, the techniques you will use in this lab are similar to those used in particle physics experiments.


  • Write a Monster Early Warning program that finds a moving monster based on the data from your noisy radar system. This is described in the DataGenetics blog post, “Detecting Signals in Noise.” The sample code in GitHub implements the case of a stationary monster. You may leverage this code to implement the more complicated case of a moving monster.
  • Design your program such that the initial position and velocity (dx and dy) of the monster can be specified interactively (e.g., from a user) or as parameters (e.g., for your unit test).
  • Implement a unit test for your program that tests multiple cases.
  • Document your code with standard JavaDoc comments.
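For intuition, the stationary case can be sketched as an accumulation of hit counts over many scans: the monster’s cell registers every time, while noise cells register only occasionally. This is a hypothetical sketch, not the sample code in GitHub; the grid size, noise rate, and method names are all assumptions.

```java
import java.util.Random;

// Hypothetical sketch of the stationary-target case: accumulate hit counts
// per cell across many noisy radar scans; the cell with the highest count
// is (with high probability) the monster.
public class StationaryDetector {
    static final int SIZE = 20; // radar grid is SIZE x SIZE

    // Accumulates hit counts over the given scans and returns {row, col}
    // of the cell detected most often.
    static int[] locate(boolean[][][] scans) {
        int[][] counts = new int[SIZE][SIZE];
        for (boolean[][] scan : scans) {
            for (int r = 0; r < SIZE; r++) {
                for (int c = 0; c < SIZE; c++) {
                    if (scan[r][c]) counts[r][c]++;
                }
            }
        }
        int[] best = {0, 0};
        for (int r = 0; r < SIZE; r++) {
            for (int c = 0; c < SIZE; c++) {
                if (counts[r][c] > counts[best[0]][best[1]]) best = new int[] {r, c};
            }
        }
        return best;
    }

    // Generates scans in which the monster's cell always registers and every
    // other cell registers with probability noiseRate.
    static boolean[][][] simulate(int numScans, int monsterRow, int monsterCol,
                                  double noiseRate, Random rng) {
        boolean[][][] scans = new boolean[numScans][SIZE][SIZE];
        for (boolean[][] scan : scans) {
            for (int r = 0; r < SIZE; r++) {
                for (int c = 0; c < SIZE; c++) {
                    scan[r][c] = rng.nextDouble() < noiseRate;
                }
            }
            scan[monsterRow][monsterCol] = true;
        }
        return scans;
    }

    public static void main(String[] args) {
        boolean[][][] scans = simulate(100, 7, 13, 0.1, new Random(42));
        int[] found = locate(scans);
        System.out.println(found[0] + "," + found[1]); // expect 7,13
    }
}
```

The moving-monster case builds on the same idea: shift each scan by a candidate velocity (dx, dy) times its time step before accumulating, and the candidate whose accumulated peak is highest identifies both the velocity and the starting position.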


  • Add support for detecting multiple monsters.
  • Develop a different detection algorithm.
  • Add support for straight-line constant acceleration motion.
  • Add more awesome.


  • Submit a pull request in GitHub and submit a link to the request with this assignment.

Collage Lab

This lab is from the Media Computation materials from Georgia Tech. Students create incredibly creative collages. This is the first lab of the spring semester and is a good review of the fall semester after winter break. This lab is very similar to the new AP Picture Lab. I’m going to have students complete most of the activities in the AP Picture Lab as practice programming activities.


The goal of this lab is for you to creatively apply the filters and transformations that you have developed throughout this unit.


  • collage contains at least 4 copies of the image
  • the 4 copies include the original image and at least 3 modifications of the original image
  • modifications must include one or more filters (changing colors) and one or more transformations (scaling, cropping, mirroring)
  • the collage must run on its own (don’t invoke FileChooser.pickAFile())
    • instead copy the source image file into the same folder as your Java source files and just specify the file name to the Picture constructor
    • For example: Picture sourcePic = new Picture("beach.jpg");
  • the collage must be saved as an image file; the path needs to be absolute and specific to where your bookClasses folder is stored. I recommend copying and pasting the path from an Explorer window into your code and then escaping the backslashes. For example:
    • finalPic.write("H:\\2014\\AP Computer Science\\Media Computation\\bookClasses\\MyCollage.jpg");


  • Add more copies.
  • Add more filters
    • sepia:
      • convert to grayscale
      • if red < 60 then reduce all three components to 90% of their original value
      • else if red < 190 then reduce just blue to 80% of its original value
      • else reduce just blue to 90% of its original value
    • posterize
      • set all color component values in a range to one value (the midpoint of the range)
      • try with four ranges for each color (0-63, 64-127, 128-191, 192-255)
    • pixelate
  • Add more transformations (e.g., rotation)
  • Add more awesome.
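The sepia and posterize recipes above can be sketched as pure functions on color components (0–255). This is a hypothetical illustration; in the Media Computation classes you would apply the same logic to each Pixel of a Picture. Averaging the three components is assumed as the grayscale conversion, and mapping each range to its low end plus half the range width is one reasonable reading of “midpoint.”

```java
// Hypothetical sketch of the sepia and posterize filters described above,
// written as pure functions on red/green/blue components (0-255).
public class FilterSketch {
    // Returns {r, g, b} after applying the sepia recipe to one pixel.
    static int[] sepia(int r, int g, int b) {
        int gray = (r + g + b) / 3;   // convert to grayscale first (simple average)
        r = g = b = gray;
        if (r < 60) {                 // darkest pixels: dim all three channels
            r = (int) (r * 0.9);
            g = (int) (g * 0.9);
            b = (int) (b * 0.9);
        } else if (r < 190) {         // midtones: pull blue back strongly
            b = (int) (b * 0.8);
        } else {                      // highlights: pull blue back slightly
            b = (int) (b * 0.9);
        }
        return new int[] {r, g, b};
    }

    // Posterizes one component using four equal ranges
    // (0-63, 64-127, 128-191, 192-255), mapping each range to its midpoint.
    static int posterize(int value) {
        int rangeSize = 64;
        return (value / rangeSize) * rangeSize + (rangeSize - 1) / 2;
    }

    public static void main(String[] args) {
        int[] p = sepia(100, 150, 200);  // gray = 150, a midtone, so blue * 0.8
        System.out.println(p[0] + "," + p[1] + "," + p[2]); // prints 150,150,120
        System.out.println(posterize(70)); // 70 is in 64-127, prints 95
    }
}
```

In the Media Computation classes the same per-pixel logic would go inside a loop over picture.getPixels(), reading with getRed()/getGreen()/getBlue() and writing with the corresponding setters.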


  • Commit the generated collage image file to GitHub.
  • Submit a pull request in GitHub and submit a link to the request with this assignment.

Elevens Lab

This is one of the new AP Computer Science labs. I wanted to try a new object-oriented design lab and this looks like it may be a good fit.


The goal of this lab is for you to apply object-oriented design principles to implement the card game Elevens.


  • Complete Activities 1-9 of the AP Elevens Lab.
  • Peer review another student’s lab and provide comments in Canvas.


  • Complete Activities 10-11 of the AP Elevens Lab.
  • Add more awesome.


  • Submit a pull request in GitHub and submit a link to the request with this assignment.

Fractal Tree Lab

The original version of this lab was inspired by George Peck from Lynbrook High School. The end product of this lab has remained the same over the past two years, but last year it was more effective since I provided much less scaffolding. That change was a result of feedback from students who let me know that the original version of the lab was too easy. I’m keeping it unchanged for next year.


The goal of this lab is to create a recursive algorithm to draw a tree.

Imagine you were describing how to draw a tree. You might say:

  1. Draw a vertical line
  2. At the top of the line, draw two smaller lines (“branches”) in a v shape
  3. At the ends of each of those two branches, draw two even smaller branches
  4. Keep repeating the process, drawing smaller and smaller branches until the branches are too small to draw

This process of repeating the same design at a continually decreasing scale is an example of a fractal. Using fractals to draw trees can produce some interesting and beautiful patterns. In this assignment we will use a recursive branching function to create a fractal tree.
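The branching process above can be sketched as a recursive function. This hypothetical, GUI-free version counts segments instead of drawing them, with assumed values for the three member variables the lab calls for (shrink factor, minimum length, and branch angle).

```java
// Hypothetical sketch of the recursive branching function: each branch spawns
// two smaller branches until the length falls below a cutoff. Instead of
// drawing, this version counts how many segments the tree would contain.
public class TreeSketch {
    static final double SHRINK = 0.7;                // how much smaller each branch is
    static final double MIN_LENGTH = 5;              // how small the branches get
    static final double ANGLE = Math.toRadians(30);  // angle between branch and parent

    // Recursively "draws" a branch from (x, y) with the given length and
    // direction (radians), returning the total number of segments drawn.
    static int drawBranch(double x, double y, double length, double direction) {
        if (length < MIN_LENGTH) return 0;  // branch too small to draw
        double endX = x + length * Math.cos(direction);
        double endY = y - length * Math.sin(direction); // screen y grows downward
        // In the real lab, a line from (x, y) to (endX, endY) is drawn here.
        return 1
            + drawBranch(endX, endY, length * SHRINK, direction + ANGLE)
            + drawBranch(endX, endY, length * SHRINK, direction - ANGLE);
    }

    public static void main(String[] args) {
        // Trunk pointing straight up, length 100: 9 levels of branching.
        System.out.println(drawBranch(200, 400, 100, Math.PI / 2)); // prints 511
    }
}
```

In an actual TreeComponent, the recursive calls would draw each segment (e.g., with Graphics2D) before recursing, and the three constants would become member variables so the extension controls can change them.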


  • Use the Koch Snowflake program as a starting point for a TreeViewer class and a TreeComponent class.
  • The trunk of the tree needs four values: the X and Y of the starting point and the X and Y of the end point. We will also have three member variables that will control:
    • how much smaller the branches are
    • how small the branches will get
    • the angle between the branches.


  • Add controls (e.g., scrollbars) to allow the user to change member variables that affect the construction of the tree.
  • Some fantastic trees can be produced by modifying the algorithm to add asymmetry, adjust angles, adjust thickness, or adjust color.
  • Add more awesome.


  • Submit a pull request in GitHub and submit a link to the request with this assignment.

A Different Kind of Computer Science Literacy: Reading, Writing, Debate

I want my AP Computer Science course to be more than a programming class. I want my students to read, think, write, and debate issues at the intersection of technology, society, and ethics. I suppose this aligns with the AP Computer Science course goal to “understand the ethical and social implications of computer use” and with the “Computing in Context” topics (“An awareness of the ethical and social implications of computing systems is necessary for the study of computer science. These topics need not be covered in detail, but should be considered throughout the course.”). I’ve never seen a question related to these topics on the AP Exam, and I wonder how much these topics are addressed in classes throughout the world. Regardless, students should explore these topics. In addition, practicing reading, writing, analyzing, and debating these topics incorporates various Common Core literacy standards.

The first standard for my course is: “Standard 1 – Analyze, evaluate, and debate examples combining technology, society, and ethics.” Last year, students completed five discussion assignments associated with this standard; three the first semester and two the second. In preparation for next year, I revised these assignments. I kept the ones that resulted in the most engagement and discussion relatively unchanged. I combined a couple into a richer collection of articles in hopes of generating a stronger discussion. I created a couple of new assignments to expose students to topics I felt were important. Next year, I plan to have students complete three discussion assignments in the fall semester and three in the spring.

In hopes that other computer science educators will find these assignments useful in their classrooms, I’ve copied the six from Canvas into this blog post.

What is Computer Science? Who should learn it? When should they start?

In the past couple of years, there have been efforts to promote computer science to the public, and to primary and secondary students in particular. Last December, Code.org hosted the Hour of Code. On January 1, 2012, Codecademy declared that 2012 would be “Code Year.” Audrey Watters wrote an article about “Code Year,” the press it generated, and whether it is a good idea. Controversy erupted when Jeff Atwood, co-founder of Stack Overflow, published his response: “Please Don’t Learn to Code.” In the Association for Computing Machinery journal, Esther Shein wrote an excellent article on the subject. In February, National Public Radio did a piece on these efforts. What complicates all of these efforts is the general lack of agreement about what computer science is. Jonah Kagan raises this point in his article “Computer science isn’t a science and it isn’t about computers.”


  • Read all five articles (Waters, Atwood, Shein, NPR, Kagan) and any others related to the topic that you find interesting.
  • Post a response to these articles. Your response must:
    • Provide a brief overview of each of the five articles.
    • Answer multiple prompts (potential, not exclusive, prompts are enumerated below) in detail.
    • Cite specifics from each of the five articles to support your response and demonstrate that you carefully read and analyzed each article.
    • Connect the articles to your personal experience (past, present, or future).
  • Comment in a civil and respectful manner on others’ responses or comments. (You must post before you can see others’ responses or comments.)

Some potential (but not exclusive) prompts:

  • Do you strongly agree with one side or the other in the everyone should learn to code debate? Why?
  • Does your decision to take AP Computer Science support or refute any of these claims?
  • What is the most important thing to learn in AP Computer Science? Java? Writing programs? What?
  • How is Kagan’s description of computer science consistent or inconsistent with your understanding of computer science?
  • How does Kagan’s essay support (or fail to support) your expectations for this computer science course?
  • Do you agree that society in general and even technically sophisticated people are unsure exactly what computer science is? Does this matter?
  • Is this class a math class, a business class, a science class, an engineering class, or something else? Why?

Net Neutrality

Vi Hart’s video Net Neutrality in the US: Now What? is the best explanation of what Net Neutrality is and the history of the issue. In Forbes, Joshua Steimle wrote a piece about his opposition to Net Neutrality: Am I The Only Techie Against Net Neutrality?


  • Watch Vi Hart’s video, read Steimle’s article, and watch or read at least two other videos or articles linked in the description of the video.
  • Post a response to these videos and articles. Your response must:
    • Provide a brief overview of each of the four videos/articles (clearly state which other two videos or articles you watched or read).
    • Answer multiple prompts (potential, not exclusive, prompts are enumerated below) in detail.
    • Cite specifics from each of the four videos/articles to support your response and demonstrate that you carefully watched or read and analyzed each one.
    • Connect the articles to your personal experience (past, present, or future).
  • Comment in a civil and respectful manner on others’ responses or comments. (You must post before you can see others’ responses or comments.)

Some potential (but not exclusive) prompts:

  • Do you support Net Neutrality? Why or why not? Defend your position with evidence.
  • Will the FCC ruling on this topic actually affect you in any significant way?
  • What, if anything, do you plan to do after analyzing this issue as a result of this assignment?

Diversity in Computer Science

As a group, computer science students are among the least diverse. What needs to be done to increase the participation of women and underrepresented minorities? In this interview, Dr. Maria Klawe, president of Harvey Mudd College in California, addresses this question on PBS NewsHour. In her article, Want More Women in Tech? Fix Misperceptions of Computer Science, Shuchi Grover focuses on the image problem of computer science. The last two sections of Tasneem Raja’s recent article Is Coding the New Literacy? summarize the challenges and various efforts underway. Philip Guo writes about his Silent Technical Privilege as an Asian male.


  • Read or watch all four pieces (Klawe, Grover, Raja, Guo) and any others related to the topic that you find interesting.
  • Post a response to these articles. Your response must:
    • Provide a brief overview of each of the four articles.
    • Answer multiple prompts (potential, not exclusive, prompts are enumerated below) in detail.
    • Cite specifics from each of the four articles to support your response and demonstrate that you carefully read and analyzed each article.
    • Connect the articles to your personal experience (past, present, or future).
  • Comment in a civil and respectful manner on others’ responses or comments. (You must post before you can see others’ responses or comments.)

Some potential (but not exclusive) prompts:

  • Is it feasible to address the gender and underrepresented minority gap in computer science in high school?
  • Do you agree that the lack of diversity is due to a misperception of computer science?
  • What can high school students do to address this gender gap? What about teachers (computer science and others)? What about the school as a whole? What about parents? What about society?
  • Do you have a personal experience that either encouraged or discouraged you to take this or previous computer science courses?


Stuxnet

“Three years after it was discovered, Stuxnet, the first publicly disclosed cyberweapon, continues to baffle military strategists, computer security experts, political decision-makers, and the general public. A comfortable narrative has formed around the weapon: how it attacked the Iranian nuclear facility at Natanz, how it was designed to be undiscoverable, how it escaped from Natanz against its creators’ wishes. Major elements of that story are either incorrect or incomplete.”


  • Read this article about Stuxnet and the cyberattack on Iran’s centrifuges. Also browse this detailed report for more information.
  • Post a response to these articles. Your response must:
    • Provide a brief overview of the two variants of the virus. Compare and contrast them.
    • Answer multiple prompts (potential, not exclusive, prompts are enumerated below) in detail.
    • Cite specifics from the article and report to support your response and demonstrate that you carefully read and analyzed the article and report.
    • Connect the articles to your personal experience (past, present, or future).
  • Comment in a civil and respectful manner on others’ responses or comments. (You must post before you can see others’ responses or comments.)

Some potential (but not exclusive) prompts:

  • What security vulnerabilities were exposed?
  • Did the attack work as designed?
  • What does this attack on Iran’s centrifuges mean for the security of critical technologies within the United States?
  • What types of defenses would be effective? What types of defenses are not?

Data, Privacy, and the Future of Ed-Tech


  • Read the following excerpt. Choose two or more of the questions presented to explore further, either by following the links or doing additional research.
  • Post a response to these articles. Your response must:
    • Provide a brief overview of the articles related to the two questions that you chose to explore.
    • Cite specifics from each of the articles related to the two questions that you chose to explore to support your response and demonstrate that you carefully read and analyzed each article.
    • Connect the articles to your personal experience (past, present, or future). For example:
      • How do the issues of data and privacy affect you personally?
      • What personal behaviors are influenced by your understanding of data and privacy?
      • Will you change your behavior after researching this topic?
  • Comment in a civil and respectful manner on others’ responses or comments. (You must post before you can see others’ responses or comments.)

Data, Privacy, and the Future of Ed-Tech

Facebook CEO Mark Zuckerberg famously declared privacy “dead” back in 2010. This year, incidentally, he bought the four houses adjacent to his after hearing that a developer had plans to market a neighboring property as being “next door to Mark Zuckerberg.”

Nevertheless, you hear it a lot in technology circles – “privacy is dead” – often uttered by those with a stake in our handing over increasing amounts of personal data without question.

To see privacy as something that will inevitably “die,” to view it as a monolithic notion, is quite ahistorical. To do so ignores the varied cultural and social expectations we have about privacy today. It ignores how power relations have always shaped who has rights and access to autonomy, self-determination, solitude. It ignores the ongoing resistance (by teens, for example, by activists, and by librarians) to surveillance.

Nonetheless, as the adoption of ed-tech continues (and with it, the increasing amount of data created – intentionally or unintentionally, as content or as “exhaust”), there are incredibly important discussions to be had about data and privacy:

  • What role will predictive modeling and predictive policing have in education? Who will be marked as “deviant”? Why? Against whom will data discriminate?
  • What role does privacy play – or phrased differently: what role does a respite from surveillance play – in a child’s development?
  • How can we foster agency and experimentation in a world of algorithms?
  • What assumptions go into our algorithms and models? Who builds them? Are they transparent? (After all, data is not objective.)
  • What can we really learn from big data in education? Bill Gates says big data will “save American schools.” Really? Save from what? For whom? Or is all this data hype just bullshit?
  • Who owns education data?
  • What happens to our democracy if we give up our privacy and surrender our data to tech companies and to the federal government? What role will education play in resisting or acquiescing to these institutions’ demands?

The above is a portion of the article Top Ed-Tech Trends of 2013: Data vs Privacy by Audrey Watters, who blogs at Hack Education.

Aaron Swartz

From Wikipedia: Aaron Swartz (November 8, 1986 – January 11, 2013) was an American computer programmer, writer, political organizer, and Internet hacktivist.

Swartz was involved in the development of the web feed format RSS, the organization Creative Commons, the website framework web.py, and the social news site Reddit, in which he became a partner after its merger with his company, Infogami.

Swartz’s work also focused on sociology, civic awareness, and activism. He helped launch the Progressive Change Campaign Committee in 2009 to learn more about effective online activism. In 2010 he became a research fellow at Harvard University’s Safra Research Lab on Institutional Corruption, directed by Lawrence Lessig. He founded the online group Demand Progress, known for its campaign against the Stop Online Piracy Act.

On January 6, 2011, Swartz was arrested by MIT police on state breaking-and-entering charges, after systematically downloading academic journal articles from JSTOR. Federal prosecutors later charged him with two counts of wire fraud and 11 violations of the Computer Fraud and Abuse Act, carrying a cumulative maximum penalty of $1 million in fines, 35 years in prison, asset forfeiture, restitution, and supervised release.

Swartz declined a plea bargain under which he would serve six months in federal prison. Two days after the prosecution rejected a counter-offer by Swartz, he was found dead in his Brooklyn, New York apartment, where he had hanged himself.

In June 2013, Swartz was posthumously inducted into the Internet Hall of Fame.

Boston Magazine published an article titled, “Losing Aaron, Bob Swartz on MIT’s Role in His Son’s Death“.

In January 2013, Aaron’s Law was authored in Congress. The Electronic Frontier Foundation has been leading efforts to reform CFAA and pass Aaron’s Law and similar initiatives.

Lawrence Lessig used the occasion of his Chair Lecture at Harvard to speak about Aaron Swartz: “Aaron’s Laws: Law and Justice in a Digital Age“.


  • Read the articles from Boston Magazine, WIRED, and EFF. (Watch Lessig’s video, if interested; it is excellent.)
  • Post a response to these articles. Your response must:
    • Provide a brief overview of Aaron Swartz, CFAA, and Aaron’s Law.
    • Answer multiple prompts (potential, not exclusive, prompts are enumerated below) in detail.
    • Cite specifics from each of the three articles to support your response and demonstrate that you carefully read and analyzed each article.
    • Connect the articles to your personal experience (past, present, or future).
  • Comment in a civil and respectful manner on others’ responses or comments. (You must post before you can see others’ responses or comments.)

Some potential (but not exclusive) prompts:

  • Is it accurate to compare Aaron’s actions to acts of civil disobedience?
  • Did Aaron’s downloading of the JSTOR archive cause harm?
  • What role and responsibility does MIT have in this case?
  • What reforms, if any, are needed? Does Aaron’s Law go far enough?
  • What can you do? What should you do?

2013-2014 in Numbers

The 2013-2014 school year by the numbers:

  • 88 students in the fall; 86 students in the spring
  • 71 recommendation letters for 36 different students
  • 30 standards in AP Physics B; 80, in Honors Physics; 20 in AP Computer Science
  • 596 tweets
  • 16 blog posts
  • 186 180 posts
  • 9288 school e-mails received; 4820 sent
  • 23 partial or full days missed; none due to illness
  • 0.55 FCI gain (n=18)
  • 4.484 average AP Physics B score; 4.407 average AP Computer Science score

Standards for AP Physics 2

I floated this idea on Twitter a couple of weeks ago and have decided to give it a try. Historically, I’ve grouped my assessment standards into unit-centric categories. In an attempt to emphasize the big ideas and science practices more strongly, I’m going to group standards by the Big Ideas defined by the College Board for AP Physics 2. My assessment standards are the Enduring Understandings defined for each Big Idea. The Essential Knowledge items and Learning Objectives are too fine-grained for my style of standards-based assessment and reporting, especially for an AP class where I want students to focus on the combination of multiple concepts.

There will be multiple assessments (labs and exam questions) for each standard. A given assessment will focus on a subset of learning objectives for that standard. As a result, there will be multiple scores for each standard in the grade book. I hope this will give students more insight into their strengths and areas for improvement as they progress throughout the course. I’ll still have reassessments.

The weights for each Big Idea category will not be the same, but I’m going to do more planning before assigning them. I also need to see how these standards are split between the fall and spring semesters.

If you think I’m courting disaster with this plan, please let me know. If you adopt a similar approach for your AP Physics class, please remember I’ve never tried this before!

  • 1: Objects and systems have properties such as mass and charge. Systems may have internal structure.
    • 1.A: The internal structure of a system determines many properties of the system.
    • 1.B: Electric charge is a property of an object or system that affects its interactions with other objects or systems containing charge.
    • 1.C: Objects and systems have properties of inertial mass and gravitational mass that are experimentally verified to be the same and that satisfy conservation principles.
    • 1.D: Classical mechanics cannot describe all properties of objects.
    • 1.E: Materials have many macroscopic properties that result from the arrangement and interactions of the atoms and molecules that make up the material.
  • 2: Fields existing in space can be used to explain interactions.
    • 2.A: A field associates a value of some physical quantity with every point in space. Field models are useful for describing interactions that occur at a distance (long-range forces) as well as a variety of other physical phenomena.
    • 2.C: An electric field is caused by an object with electric charge.
    • 2.D: A magnetic field is caused by a magnet or a moving electrically charged object. Magnetic fields observed in nature always seem to be produced either by moving charged objects or by magnetic dipoles or combinations of dipoles and never by single poles.
    • 2.E: Physicists often construct a map of isolines connecting points of equal value for some quantity related to a field and use these maps to help visualize the field.
  • 3: The interactions of an object with other objects can be described by forces.
    • 3.A: All forces share certain common characteristics when considered by observers in inertial reference frames.
    • 3.B: Classically, the acceleration of an object interacting with other objects can be predicted by using Newton’s Second Law.
    • 3.C: At the macroscopic level, forces can be categorized as either long-range (action-at-a-distance) forces or contact forces.
    • 3.G: Certain types of forces are considered fundamental.
  • 4: Interactions between systems can result in changes in those systems.
    • 4.C: Interactions with other objects or systems can change the total energy of a system.
    • 4.E: The electric and magnetic properties of a system can change in response to the presence of, or changes in, other objects or systems.
  • 5: Changes that occur as a result of interactions are constrained by conservation laws.
    • 5.B: The energy of a system is conserved.
    • 5.C: The electric charge of a system is conserved.
    • 5.D: The linear momentum of a system is conserved.
    • 5.F: Classically, the mass of a system is conserved.
  • 6: Waves can transfer energy and momentum from one location to another without the permanent transfer of mass and serve as a mathematical model for the description of other phenomena.
    • 6.A: A wave is a traveling disturbance that transfers energy and momentum.
    • 6.B: A periodic wave is one that repeats as a function of both time and position and can be described by its amplitude, frequency, wavelength, speed, and energy.
    • 6.C: Only waves exhibit interference and diffraction.
    • 6.E: The direction of propagation of a wave such as light may be changed when the wave encounters an interface between two media.
    • 6.F: Electromagnetic radiation can be modeled as waves or as fundamental particles.
    • 6.G: All matter can be modeled as waves or as particles.
  • 7: The mathematics of probability can be used to describe the behavior of complex systems and to interpret the behavior of quantum mechanical systems.
    • 7.A: The properties of an ideal gas can be explained in terms of a small number of macroscopic variables including temperature and pressure.
    • 7.B: The tendency of isolated systems to move toward states with higher disorder is described by probability.
    • 7.C: At the quantum scale, matter is described by a wave function, which leads to a probabilistic description of the microscopic world.