Category Archives: technology

Selling Our Colleagues Short with SAMR

I’ve been seeing more and more references to SAMR. Maybe it’s because ISTE is starting, or maybe it’s because my district is promoting it as a tool for teachers embarking on our Digital Learning Initiative (1:1 devices), or maybe it’s just the Baader-Meinhof phenomenon. Regardless, I can only tolerate so many SAMR infographics before I’m pushed over the edge, and I have to say something.

Due to its overemphasis on technology, SAMR is the least helpful model to promote with teachers if you want to provide a resource to positively impact student learning.

Depending on the teacher, it confuses at best and misleads at worst. I’m not alone in this sentiment. Several of my colleagues, both local and online, have expressed similar feelings. Most eloquent are a couple of posts by Casey Rutherford. My favorite quote:

On the note of lesson design, I am not satisfied with simplifying the complexities of teaching to where it falls on the SAMR scale. Teaching is nuanced, fluid, and has a ton of moving parts, and we’d be better off embracing that than cheapening it with a stamp of ‘modification.’

I’ll illustrate the problem with a couple of lessons.

Lesson #1

Embracing the “Flipped Classroom,” the teacher records and shares a demonstration of the projectile motion lab and an overview of the lab procedure. Students can watch the video for homework before coming to class. In addition to the video, the lab procedure is published to Canvas, and each student can follow the procedure on their device while performing the lab. The video (and published procedure) instructs students on how to set up the apparatus and make the necessary measurements of the path of the ball bearing that is launched after rolling down the ramp. The data is captured in a Google Doc and plotted using Plotly, allowing group members to share data with each other. Based on their plot, students determine where on the floor to place the target, and then the teacher observes their attempt. Points are awarded based on how close the ball bearing lands to the target. Each student creates a video that captures their attempt in which they report their percent error and list potential sources of error. The videos are posted on the students’ public blogs. Students, and the public, can comment on these videos, but no one does.

Let’s take a look at this lesson. Hmmm. Flipped classroom, LMS, Google Docs, Plotly, video creation, student blogs. Woohoo! We are swimming in the deep end of the SAMR pool! Modification? Redefinition? Doesn’t matter, we’re above the bar!

There’s just one problem. This lesson sucks. I can say that because it’s based on one of my actual lessons. My class did it. They weren’t engaged; they didn’t have ownership; they didn’t have choice; they didn’t exercise their creativity. They asked me a ton of questions about the details of the procedure. From a pedagogical perspective, it is flawed. It is a traditional cookbook science lab jazzed up with a bunch of tech.

Don’t worry, I made improvements the next year. I focused on research-based pedagogy, and I integrated technology where it supported the pedagogy and content.

Lesson #2

Students are presented with a challenge: Given a fixed vertical elevation of the projectile launcher (i.e., “cannon”), determine the launch angle and time of launch to hit the toy buggy at a specific location as it “flees.” Students work in small groups to justify the selection of the kind of data needed, design a plan for collecting data, and collect data. They choose the tools with which to collect the data. Some groups use video cameras; others, motion detectors; others, photogates; others, meter sticks; others, phones. They create a computational model using a 3D physics programming language, since a traditional mathematical solution is beyond most of their current capabilities (one group solves the problem algebraically using clever trig substitutions, which is fine). Using the computational model, they solve for the launch angle and time of launch. Their attempt based on their calculation is recorded with a high-speed video camera and shared publicly to celebrate their success. Students reflect on the lab practicum with a specific focus on measurement uncertainty and capture their reflections in their electronic portfolios, which they will export into an open format (HTML) and take with them to university. During the whole-class post-lab discussion, each group shares what they consider the most significant aspects of their approach, as each group had a unique approach to the lab. Groups compare and contrast these techniques, arriving at a set of best practices for minimizing measurement uncertainty.
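As an aside, it’s worth spelling out why a purely algebraic solution is out of reach for most of these students. A minimal sketch of the intercept conditions, assuming a measured muzzle speed v, launch height h, target distance d, and a buggy whose measured speed fixes its arrival time t_d at the target (these symbols are my own labels, not part of the original practicum):

    % the flight time \Delta t and launch angle \theta must satisfy both conditions:
    v\cos\theta\,\Delta t = d                                  % horizontal range reaches the target
    h + v\sin\theta\,\Delta t - \tfrac{1}{2}g\,\Delta t^2 = 0  % projectile lands on the floor
    t_0 = t_d - \Delta t                                       % launch so ball and buggy arrive together

Eliminating \Delta t leaves \theta buried inside both a tangent and a squared secant, hence the “clever trig substitutions”; stepping a computational model forward in small time increments sidesteps that algebra entirely.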

Students were motivated and engaged. They were creative and collaborative. They asked each other many questions. They surprised me with their solutions. They focused on deeper science practices beyond the content of projectile motion. Some groups incorporated technology to help them meet the challenge. Some groups hardly used any technology at all.

Some may rebut my assertion and claim that I’m oversimplifying SAMR and that there is more to it than what I’m presenting; that I’m missing the student-centered vs. teacher-centered aspect. Maybe there is, but you wouldn’t know it from most of the resources out there. SAMR Coffee? SAMR Apps? Really?

Some may argue that teachers new to tech need a simple model to reference. Yes, SAMR is simple. But, why promote it when there are better and more inclusive models available? Do we really think TPACK is too complex for our colleagues? Are we selling them that short?

I’m not.

Formative Assessment Tools for Peer Instruction and Peer Critique of Written Responses

This past year, as my AP Physics 2 classes piloted Chromebooks, we used a couple of formative assessment tools frequently in class. For Peer Instruction, we used InfuseLearning. InfuseLearning’s stand-out feature was its support for draw-response questions. Having students sketch graphs and draw diagrams is very valuable as a formative assessment in physics class. Throughout the year, as I shared InfuseLearning with other teachers participating in the pilot, the draw-response feature was the most popular with everyone, from elementary through high school.

The second formative assessment activity was focused on preparation for the new paragraph-length responses on the AP Physics 2 exam. To practice these types of responses, students responded to a prompt using Socrative. Socrative allows me to share all the responses with students, and students can vote for the best one. We can then, as a class, discuss the elements of the best responses.

Unfortunately, InfuseLearning closed their doors in April. In preparation for sharing resources with teachers this summer before we deploy 1:1 Chromebooks for all high school students this fall, I surveyed the currently available tools, focusing specifically on Peer Instruction with support for drawing and on peer critique of written responses.

I evaluated the following features.

  • Cost: Is there a free version? What are the limitations of the free version? Can teachers upgrade to a paid version on an as-needed basis?
  • Account Creation: How easy is it for students to create accounts? Can they login with their Google account?
  • Prepared Questions: Does the tool support preparing questions in advance?
  • Spontaneous Questions: Does the tool support creating a question on-the-fly without preparation ahead of time?
  • Supported Question Types: What types of questions does the tool support?
  • Multiple Choice Questions: Since Peer Instruction often uses multiple choice questions, how flexible are these questions? Can the answer choices be customized (e.g., A-D or 1-4)? Can the number of answer choices be customized?
  • Draw Response Questions: Are draw response questions supported by the tool? How rich are the drawing tools?
  • Sharing Student Responses with Students: Does the tool support sharing sample student responses with all students?
  • Capturing Student Responses: Does the tool support capturing student responses for later analysis? What can and cannot be captured?
  • Reporting: Does the tool support reporting of sessions? What is captured and reported?

Socrative

  • Cost: free
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: yes
  • Supported Question Types: multiple choice, true/false, short answer
  • Multiple Choice Questions: limited options (exactly 5, A-E)
  • Draw Response Questions: no
  • Sharing Student Responses with Students: sharing short-answer responses allows students to vote on the best peer answer
  • Capturing Student Responses: yes
  • Reporting: for prepared questions and short answer only (not spontaneous multiple choice or true/false)

The Answer Pad

  • Cost: free and paid; free is limited (limited templates, question types, creation of own images, capture student responses)
  • Account Creation: students have to create accounts (doesn’t support Google accounts) if you want to track student responses
  • Prepared Questions: yes, but not draw response
  • Spontaneous Questions: yes
  • Supported Question Types: multiple choice, true/false, yes/no, up/down, fill-in, Likert scale, drawing
  • Multiple Choice Questions: limited options (exactly 4, A-D)
  • Draw Response Questions: yes, decent drawing tools
  • Sharing Student Responses with Students: no
  • Capturing Student Responses: limited in free version
  • Reporting: only for prepared questions

Formative

  • Cost: free
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: no (though one could prepare some generic template questions in advance)
  • Supported Question Types: multiple choice, show your work (draw response), short answer, true/false
  • Multiple Choice Questions: flexible response choices
  • Draw Response Questions: yes, but limited (no colors)
  • Sharing Student Responses with Students: no
  • Capturing Student Responses: automatic
  • Reporting: yes

Pear Deck

  • Cost: free and paid; free is limited (draw response in prepared decks, capturing, and reporting are paid features)
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: kind of (can ask a quick question in the context of an existing deck)
  • Supported Question Types: agree/disagree, draw on grid, draw on blank, yes/no, true/false, multiple choice, long text answer, short text answer, numeric answer
  • Multiple Choice Questions: flexible response choices
  • Draw Response Questions: yes (quick question only for free version)
  • Sharing Student Responses with Students: no
  • Capturing Student Responses: paid only
  • Reporting: paid only

NearPod

  • Cost: free and paid (free has limited storage space and reporting export options)
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: no (though one could prepare some generic template questions in advance)
  • Supported Question Types: open-ended question, poll, quiz, draw it
  • Multiple Choice Questions: flexible response choices
  • Draw Response Questions: yes, decent drawing tools
  • Sharing Student Responses with Students: yes
  • Capturing Student Responses: yes
  • Reporting: yes (PDF only in free version)


At our summer professional learning sessions, we will be featuring Socrative. It is easy to use and applies to a wide variety of disciplines. The significant drawback of Socrative is the lack of draw-response questions. For those teachers who need that feature, I’m recommending NearPod. I used NearPod a couple of years ago when piloting classroom iPads. At that time, NearPod was an iPad-only app. I was thrilled to discover that it now supports all major platforms.

For my physics classroom, I’m going to use NearPod for Peer Instruction because draw-response questions are so important. While I’d rather be able to create spontaneous questions, I’m also interested in capturing student answers to gain more insight into their learning, which necessitates creating a set of questions ahead of time. I will create a few slides in each deck that can serve as general-purpose placeholders for spontaneous questions.

I’ll still use Socrative for peer critique of written responses. The ability to share student responses with students and have them vote for the best response is very effective at developing their writing. These two classroom activities, Peer Instruction and peer critique of written responses, are done independently; so, using two different tools should not be inconvenient.

If I’ve missed a tool that would work well for either of these classroom activities, please leave a comment to let me know!

GitHub, Canvas, and Computer Science

There are certain software development techniques or tools that are not strictly part of the AP Computer Science curriculum that I think are valuable for students to learn about and practice in my class. Two years ago, I incorporated pair programming. Last year, I added test-driven development and JUnit. This year, I made GitHub an integral part of the course.

I want students to be aware of and appreciate the value of source control management. GitHub was the obvious choice, as it is supportive of education and is most likely the specific tool that students will encounter.

After consulting with a couple of former students more familiar with GitHub than I am, I decided to create a repository for each unit in the course. At the start of each unit, students fork that unit’s repository and clone it to their desktop. They perform these operations through the GitHub web site.

Throughout the unit, I encourage students to put all of their code in their forked repository and frequently commit and sync. This provides students with all of the typical advantages of source control: they can more easily work at both school and home, and they can revert to an earlier version of code if a project goes astray.
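For reference, here is a minimal sketch of that day-to-day cycle from the command line (students can accomplish the same thing with the GitHub desktop client); the repository name is a hypothetical example:

    git clone https://github.com/a-student/apcs-unit4.git   # one time: clone the fork locally
    cd apcs-unit4
    # ...work on labs and exercises in BlueJ...
    git add .                                               # stage new and modified files
    git commit -m "Complete ArrayList practice exercises"   # commit locally
    git push origin master                                  # "sync" the work back to the fork on GitHub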

At the end of the unit when students have completed their summative lab, they issue a pull request to submit the lab. They then submit the URL to this pull request via the Canvas assignment that I created for the summative lab.

I created a video tutorial that captures the student workflow:

The student workflow works well except when they accidentally generate merge conflicts by not keeping home and school synced.

While exposing students to GitHub is benefit enough on its own, this particular workflow also makes evaluating their summative labs extremely efficient for me.

I still use Canvas’s SpeedGrader to keep track of who has submitted the lab and to provide detailed feedback to students. In previous years, I had students submit a zip file of their entire project folder. The link to their pull request is much more efficient. My workflow for evaluating their lab is the following:

  1. Click on the link in SpeedGrader to open the pull request page in another tab.
  2. Click on the “Check out this branch in GitHub for Mac” icon which does exactly that.
  3. Open the BlueJ project file for the summative lab, read the code, read the docs, run the project, and provide feedback via SpeedGrader.
  4. Close the pull request.
  5. Run the following script, which switches back to the master branch and removes all traces of the student’s pull request:

    git reset --hard      # discard changes made to tracked files while evaluating
    git clean -xdf        # remove untracked and ignored files (e.g., build artifacts)
    git checkout master   # return to the master branch

  6. After evaluating all of the labs, I list all of the branches that I checked out: git branch --list

  7. I then delete each of these branches: git branch -D pr/xx
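Steps 6 and 7 can also be collapsed into a single command. A sketch, assuming the pr/xx naming used by GitHub for Mac and that master is currently checked out:

    git branch --list "pr/*" | xargs git branch -D   # delete every checked-out pull-request branch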

While the above may seem like a lot of steps, there is very little overhead and it is much more efficient than my previous workflow.

I’m embarrassed to admit that there is another advantage of these per-unit GitHub repositories that I didn’t utilize until this past unit. While making notes to myself about where we had stopped in one class period in which I was modeling how to write a certain algorithm, it struck me that I could create a branch for each class period. I now do exactly that: when I demonstrate how to write some code, I commit and sync that class’s branch, with some helpful notes to myself, at the end of the period. The next day, I switch to the corresponding class’s branch, read my notes, and we start right where we stopped.
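A minimal sketch of that routine, with hypothetical branch names:

    git checkout -b period2      # once per unit: branch for 2nd-period live coding
    # ...model writing the algorithm during class...
    git commit -am "Stopped after the outer loop; next: the swap logic"
    git push origin period2      # sync, with the commit message as a note to self
    # at the start of the next 2nd-period class:
    git checkout period2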

If you have any suggestions on how I can improve my students’ workflow or my workflow, please share. If you have been thinking of incorporating source control management into your computer science class, I encourage you to take the plunge. Your students will learn a very valuable skill!

Chromebook Toolchain for AP Physics

This fall, my AP Physics 2 classes will be using Chromebooks as part of my school district’s 1:1 pilot. Chromebooks were new to me; so, it took some time this summer to find the apps to support the workflow I want for this class. While I’m sure the toolchain will change throughout the semester, and there will be surprises (both pleasant and otherwise), here is the starting toolchain:

  • Canvas. Everything starts and ends with this learning-management system.

We will do a lot of lab activities. The workflow depends on the amount of data acquired and the level of graphical analysis required. The start of the workflow is the same:

  • LabQuest 2. Vernier’s LabQuest 2 can create its own ad-hoc network or connect to the school’s wireless network. The LabQuest 2 hosts its own web page as part of their Connected Science System. Students can then access the device, the data, and graphs via Chrome. Data and graphs can be exported to the Chromebook via the web page.

The next tool depends upon the lab. For some labs, the data and graphs produced on the LabQuest 2 are sufficient. Students will import these into their Google Document and create whatever is required for their lab report. If additional analysis is required and the data sets are relatively small:

  • Desmos. Graphs can be shared via a link, and an image can be embedded in the Google document.

If data sets are large or more sophisticated analysis is required:

  • Plotly. Plotly seemed to explode onto the education scene this summer, or maybe I was just paying more attention. Data exported from the LabQuest 2 can easily be imported into Plotly. Like Desmos, graphs can be shared via a link and an image can be embedded in the Google document. Plotly can also embed its graphs in an iframe, but I couldn’t find a way to embed that in a Google document as opposed to a web page. Fran Poodry from Vernier made a great screencast demonstrating the integration of the LabQuest 2 and Plotly.

Regardless of the analysis performed, in the end, students create their lab report in Google docs and submit it via Canvas.

Another important aspect of class is the exploration and modification of computational models. In the past, we’ve used VPython. I had to find an alternative that would be compatible with Chromebooks:

  • GlowScript. GlowScript is the up-and-coming platform for computational models, with the advantage that it runs in a browser that supports WebGL. I’m not a huge fan of JavaScript syntax for novice programmers; so, we will be using CoffeeScript instead. I didn’t write as many starting models over the summer as I had hoped, but I did at least verify that complicated models can be ported.

Peer instruction is one of the most effective and popular classroom activities that we do. In the past, I’ve used handheld clickers. This year, we will use the Chromebooks:

  • InfuseLearning. There are a number of web apps in this space, but I selected InfuseLearning because it allows the creation of spontaneous questions and supports a variety of answer methods, including drawing and sort-in-order. Pear Deck looks promising, but I don’t want to be forced to create my set of questions ahead of time.

For notes in class, I’ll leave it up to students to use whatever tool works best for them (including paper and pencil). I’ll suggest they at least take a look at:

  • Evernote. I love Evernote and use it all the time for all sorts of stuff.

I do provide students with PDFs of my slides. I can envision that students may want to annotate these PDFs or other handouts. Surprisingly, this was the hardest tool to find:

  • Crocodoc. The free personal version allows students to upload a PDF, annotate it, and export their annotated version. The other tool I explored, Notable PDF, requires paid licenses to be useful; we may try it if we find Crocodoc lacking.

A couple of other tools look interesting, but I’m not sure if they fit into the toolchain for my class:

  • Doctopus. I think Canvas assignments and SpeedGrader cover everything that I personally would do with this app.

  • 81Dash. Private back-channeling app.

I’m sure I will learn of new tools throughout the semester and I’ll make adjustments to the toolchain. If you are using Chromebooks, please share your favorite apps below in the comments!

iPad Resources for the Science Classroom

A colleague of mine will be the department chair at a 1:1 iPad school next year. While we don’t have a 1:1 program (yet), I have piloted iPads in my classroom. I wanted to share the apps that worked well in a science classroom and general deployment tips.

To start, there are some general apps for any classroom:

  • iWork (Pages, Keynote, Numbers)
  • iLife (iPhoto, iMovie, GarageBand)
  • iBooks
  • iTunes U
  • Dropbox (or another cloud-based file system)
  • Canvas (you are using Canvas, right?)

Labs are a critical part of any classroom. I’m a huge fan of Vernier’s LabQuest 2 devices which play particularly well with their Graphical Analysis app. A lot of great lab work can be done via video analysis through Vernier’s Video Physics app. I didn’t use an app for lab notebooks in my classroom, but I recently visited 4th and 5th grade classrooms where students were working through a STEM unit and were creating fantastic lab notebooks with data tables, graphs, videos, and written reflections using the Creative Book Builder app.

There are several other apps which I have found very useful:

  • For collaborative drawing and problem solving, I haven’t found an app that is better than a $2 whiteboard. For individual note taking and drawing, Notability is my favorite app.
  • For additional analysis, the Desmos app is a fantastic graphing application. The best calculator app is PCalc.
  • For formative assessment and peer instruction, I had a lot of success with Nearpod.
  • For projects and screencasts, Explain Everything is fantastic.
  • It isn’t released yet, but I’m looking forward to Computable which combines IPython and SciPy on the iPad.

These final two aren’t apps for the iPad, but they enhance its utility.

  • An iPad easily (and cheaply) replaces a document camera. I use the first version of Justand, but Justand V2 looks even better.
  • To share whatever is on the teacher’s or any student’s iPad by projecting it so the entire class can see it, I run Reflector on my laptop which is connected to the projector.

You may have the best collection of apps on your iPads, but if you don’t have a strategy for device deployment and management, you’re in trouble. Mobile device management (MDM) is pretty much required these days, and iOS 7 plays well with it. Fraser Speirs and Bradley Chambers have a lot of experience deploying and managing iPads. Their podcast Out of School has a series of episodes focusing on deployment.

We Don’t Need a Technology Integration Team

Last year, I was a member of my high school’s Technology Field Test Team, a group of teachers, Technology Integration Specialists, and administrators who were piloting various technology initiatives (e.g., one-to-one iPads, BYOD, iPad carts, etc.). This year, that team is morphing into a team focused on technology integration building-wide rather than additional pilots. Along with the two Technology Integration Specialists and another teacher, I will be leading this team. Over the summer, I was asked to think about the vision, the scope, and The Why of this team.

After some thought, I realized:

We don’t need a Technology Integration Team


We need a Teaching Best Practices Team

The very idea of a technology integration team puts the emphasis on the wrong syllable. We need a team that can help our teachers adopt pedagogically-sound best practices for teaching. Often, those best practices may involve the integration of technology. Sometimes, they won’t. Regardless, the technology isn’t the first step; and, furthermore, if the technology doesn’t support pedagogically-sound best practices, we need to make sure our colleagues are aware of that.

To be clear, I’m not against technology in the classrooms. I try all sorts of stuff and see what works for me and my students. I feel much better when a particular use of technology is supported by educational research. So, while I don’t send students home at night to watch Khan Academy videos, because that doesn’t help students learn (and may actually reinforce their misconceptions and make them over-confident), I do use iPads as a key tool in peer instruction and follow a process supported by educational research.

Here’s a helpful matrix on the spectrum of technology integration. By focusing primarily on the technology, I think teachers can get stuck on the left side of this spectrum. They use technology in a substitutive manner in which they are doing the same things in a somewhat better way. If we focus first on doing better things, we can explore more transformative uses of technology.

I have a theory that these transformative uses of technology occur in quantum steps. Having a great Learning Management System (LMS) like Instructure’s Canvas enables students to create online portfolios of capstone projects that are easily shared within and outside of the classroom. Having access to seven laptops and Vernier’s LabPro and LoggerPro makes possible a whole collection of physics labs. Having access to 15 laptops and Tracker allows pairs of students to learn about physics through video analysis. Having access to 30 iPads and NearPod allows the discussion and debate of rich questions during peer instruction. Having a one-to-one program with a uniform device and set of apps enables students to … well, I’m not sure, since I haven’t experienced that, but I expect it will be another quantum step.

I don’t know who coined the phrase or if the context was even related to technology, but I think this sums up my philosophy of technology in education:

Doing Better Things over Doing Things Better

When I wrote the above quote, I was reminded of Agile Software Development, which was a major focus of mine in my previous career. Personally, I find a great deal of similarities between my educational technology philosophy and my software development philosophy. In fact, upon revisiting the Manifesto for Agile Software Development, I found it surprisingly relevant to the world of education and technology when viewed from that perspective. Here it is:

We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:

Individuals and interactions over processes and tools

Working software over comprehensive documentation

Customer collaboration over contract negotiation

Responding to change over following a plan

That is, while there is value in the items on the right, we value the items on the left more.

From one perspective, I think these principles could apply to the relationship between teachers and students in a classroom. From another perspective, I think they could apply to the relationship between teachers and our technology integration team.

I think there is a lot of wisdom in Stephanie Chasteen’s post about a talk at the AAPT Summer Meeting by Chandra Turpen in which she promotes the idea that “we should focus on providing powerful experiences with educational innovation that allow faculty to see success for themselves.” This perspective combined with developing a growth mindset in our faculty could be a powerful combination.

So, maybe I’ve finished my summer homework. The Why of our team is to better help students learn by helping teachers adopt best practices. Our scope is advocacy and support for pedagogically-sound teaching best practices that may or may not require technology integration. Perhaps our vision could be captured by rephrasing the Agile Manifesto in the context of the relationship between teachers and our team.

I’m sure I’m not the only one to have thought about this. I’d love to hear your ideas and experiences and share those with this new team.

Sharing Resources with Students via Evernote

I love reading about the latest developments in physics and technology. When I began teaching, I started collecting bookmarks for articles that I found online that were related to various topics we would study in class. I also started collecting bookmarks to resources for myself. At the start of each unit, I created a page organizing all of these links to articles, simulations, videos, and projects for students. This page serves as an extension to the class. Many of the topics go beyond the curriculum and are fascinating extensions to the unit of study. I would encourage students to browse this page when they were procrastinating: “If you are procrastinating instead of doing your homework, you might as well browse physics articles.”

I tried to optimize this process as much as possible. I stored the bookmarks in Yojimbo and somewhat automated the process of creating the page for each unit. However, there was still too much effort to keep each unit’s page current. I also wanted to share each unit’s page with a wider audience. Finally, while I collected a large number of links to resources for teachers, I didn’t have a completely automatic way to share them with anyone else.

Based on recommendations from several people, one of my projects this summer was to investigate Evernote. I was pleasantly surprised at how efficient a workflow I could develop.

My first step was to enumerate a superset of units and create an Evernote notebook for each unit. Actually, I created two Evernote notebooks for each unit: one for students and one for teachers:

Evernote Notebooks

I imported my existing bookmarks into Evernote, which took a while but doesn’t need to be repeated. Evernote makes it easy to share a notebook publicly. However, I wanted to present the links within a notebook in an organized fashion. So, I created an index note for each notebook of student links. This was really easy to do by filtering the notes in the notebook by various tags (articles, simulations, videos, make):

Filtering by Tags

I then selected all of the notes with the specified tag, right-clicked, and copied links to these notes:

Copying Links to Notes

Finally, I pasted these links into the index note under the appropriate heading:

Index Note

I didn’t bother with this level of organization for the notebooks containing teacher-centric links.

I was very pleased to see that it would be easy to keep track of new links that haven’t been added to the index note. Since notes are sorted by when they were last updated, new links sort above the index note; when I start each unit, it is easy to see which links I need to add to the index:

Newer Notes

When I start each unit next year, I’ll update the index note and post a link to the shared notebook under the current module on Canvas. In addition, I now hope that a wider audience will benefit from these extensions of typical physics units. Evernote has a good web interface for exploring these shared notebooks:

Evernote Web Interface

Shared notebooks for the superset of units across my classes are enumerated on my web site. Feel free to share them with your students, and I hope you leverage Evernote to share your own collection of links with your students and other teachers!