
Selling Our Colleagues Short with SAMR

I’ve been seeing more and more references to SAMR. Maybe it’s because ISTE is starting, or maybe it’s because my district is promoting it as a tool for teachers embarking on our Digital Learning Initiative (1:1 devices), or maybe it’s just the [Baader-Meinhof phenomenon](https://en.wikipedia.org/wiki/List_of_cognitive_biases#Frequency_illusion). Regardless, I can only tolerate so many SAMR infographics before I’m pushed over the edge and have to say something.

**Due to its overemphasis on technology, SAMR is the least helpful model to promote with teachers if you want to provide a resource to positively impact student learning.**

Depending on the teacher, it confuses, at best, and misleads, at worst. I’m not alone in this sentiment. Several of my colleagues, both local and online, have expressed similar feelings. Most eloquent are [a couple](https://learningandphysics.wordpress.com/2014/10/22/i-am-not-satisfied/) [of posts](https://learningandphysics.wordpress.com/2014/12/30/dont-keep-it-simple-stupid/) by Casey Rutherford. My favorite quote:

> On the note of lesson design, I am not satisfied with simplifying the complexities of teaching to where it falls on the SAMR scale. Teaching is nuanced, fluid, and has a ton of moving parts, and we’d be better off embracing that than cheapening it with a stamp of ‘modification.’

I’ll illustrate the problem with a couple of lessons.

Lesson #1
---

> Embracing the “Flipped Classroom,” the teacher records and shares a demonstration of the projectile motion lab and an overview of the lab procedure. Students can watch the video for homework before coming to class. In addition to the video, the lab procedure is published to Canvas, and each student can follow the procedure on their device while performing the lab. The video (and published procedure) instructs students on how to set up the apparatus and make the necessary measurements of the path of the ball bearing that is launched after rolling down the ramp. The data is captured in a Google Doc and plotted using Plotly, allowing group members to share data with each other. Based on their plot, students determine where on the floor to place the target, and then the teacher observes their attempt. Points are awarded based on how close the ball bearing lands to the target. Each student creates a video that captures their attempt in which they report their percent error and list potential sources of error. The videos are posted on their public blogs. Students, and the public, can comment on these videos, but no one does.
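If you’re curious what that plotting step looked like, here’s a rough sketch using Plotly’s Python API. (My class actually used the Plotly web app, and the numbers below are made up for illustration, not real measurements.)

```python
# Hypothetical sketch of the plotting step with Plotly's Python API;
# the measurements are illustrative placeholders, not real data.
import plotly.graph_objects as go

release_heights = [0.10, 0.20, 0.30, 0.40]    # release height on the ramp (m)
landing_distances = [0.42, 0.61, 0.74, 0.86]  # horizontal landing distance (m)

fig = go.Figure(go.Scatter(x=release_heights, y=landing_distances,
                           mode="markers", name="trials"))
fig.update_layout(xaxis_title="release height (m)",
                  yaxis_title="landing distance (m)")
fig.show()
```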

Let’s take a look at this lesson. Hmmm. Flipped classroom, LMS, Google Docs, Plotly, video creation, student blogs. Woohoo! We are swimming in the deep end of the SAMR pool! Modification? Redefinition? Doesn’t matter, we’re above the bar!

There’s just one problem. This lesson sucks. I know because it’s based on one of my actual lessons. My class did it. They weren’t engaged; they didn’t have ownership; they didn’t have choice; they didn’t exercise their creativity. They asked me a ton of questions about the details of the procedure. From a pedagogical perspective, it is flawed. It is a traditional cookbook science lab jazzed up with a bunch of tech.

Don’t worry, I made improvements the next year. I focused on research-based pedagogy, and I integrated technology where it supported the pedagogy and content.

Lesson #2
---

> Students are presented with a challenge: given a fixed vertical elevation of the projectile launcher (i.e., “cannon”), determine the launch angle and time of launch to hit the toy buggy at a specific location as it “flees.” Students work in small groups to justify the selection of the kind of data needed, design a plan for collecting data, and collect the data. They choose the tools with which to collect it. Some groups use video cameras; others, motion detectors; others, photogates; others, meter sticks; others, phones. They create a computational model using a 3D physics programming language, since a traditional mathematical solution is beyond most of their current capabilities (one group solves the problem algebraically using clever trig substitutions, which is fine). Using the computational model, they solve for the launch angle and time of launch. Their attempt, based on their calculation, is recorded with a high-speed video camera and [shared publicly](http://180.pedagoguepadawan.net/107/107/) to celebrate their success. Students reflect on the lab practicum, with a specific focus on measurement uncertainty, and capture their reflections in their electronic portfolios, which they will export into an open format (HTML) and take with them to university. During the whole-class post-lab discussion, each group shares what they consider the most significant aspects of their approach, as each group had a unique approach to the lab. Groups compare and contrast these techniques, arriving at a set of best practices for minimizing measurement uncertainty.

Students were motivated and engaged. They were creative and collaborative. They asked each other many questions. They surprised me with their solutions. They focused on deeper science practices beyond the content of projectile motion. Some groups incorporated technology to help them meet the challenge. Some groups hardly used any technology at all.
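To give a flavor of that computational model, here’s a minimal sketch of the stepping logic in plain Python. My students worked in a 3D physics environment (such as VPython), and every parameter below is made up for illustration; treat this as a sketch of the technique, not their actual code.

```python
import math

# Hypothetical parameters for the buggy-hunting challenge.
g = 9.8          # gravitational acceleration (m/s^2)
v0 = 5.0         # fixed launch speed of the projectile (m/s)
h = 1.0          # fixed vertical elevation of the launcher (m)
x_target = 2.0   # location where the buggy should be hit (m)
buggy_x0 = 0.5   # buggy's starting position (m)
buggy_v = 0.25   # buggy's speed as it "flees" (m/s)
dt = 0.001       # time step for the numerical model (s)

def landing(angle_deg):
    """Step the projectile forward in time until it reaches the floor;
    return its horizontal landing position and time of flight."""
    theta = math.radians(angle_deg)
    x, y = 0.0, h
    vx, vy = v0 * math.cos(theta), v0 * math.sin(theta)
    t = 0.0
    while y > 0:
        x += vx * dt
        vy -= g * dt
        y += vy * dt
        t += dt
    return x, t

# Pick the integer launch angle whose landing point is closest to the target...
angle = min(range(1, 90), key=lambda a: abs(landing(a)[0] - x_target))
x_land, t_flight = landing(angle)

# ...then time the launch so the buggy reaches the target as the ball lands:
#   buggy_x0 + buggy_v * (t_launch + t_flight) = x_target
t_launch = (x_target - buggy_x0) / buggy_v - t_flight
print(f"launch angle ~ {angle} deg, fire {t_launch:.2f} s after the buggy starts")
```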

Some may rebut my assertion and claim that I’m oversimplifying SAMR, that there is more to it than what I’m presenting, that I’m missing the student-centered vs. teacher-centered aspect. Maybe there is, but you wouldn’t know it from most of the resources out there. [SAMR Coffee](https://www.google.com/search?q=samr+coffee&tbm=isch&tbo=u&source=univ&sa=X&ei=e5WQVYbAGtXaoATpr7TIBg&ved=0CB4QsAQ)? [SAMR Apps?](https://www.google.com/search?q=samr+apps&tbm=isch&tbo=u&source=univ&sa=X&ei=k5WQVYiNNo_ooASyi7CgCQ&ved=0CB4QsAQ) Really?

Some may argue that teachers new to tech need a simple model to reference. Yes, SAMR is simple. But why promote it when there are better and more inclusive models available? Do we really think [TPACK](http://www.matt-koehler.com/tpack/what-is-tpack/) is too complex for our colleagues? Are we selling them that short?

I’m not.

Formative Assessment Tools for Peer Instruction and Peer Critique of Written Responses

This past year, as my AP Physics 2 classes piloted Chromebooks, we used a couple of formative assessment tools frequently in class. For Peer Instruction, we used [InfuseLearning](http://www.infuselearning.com). InfuseLearning’s stand-out feature was its support for draw-response questions. Having students sketch graphs and draw diagrams is very valuable as a formative assessment in physics class. Throughout the year, as I shared InfuseLearning with other teachers participating in the pilot, the draw-response feature was the most popular with everyone, from elementary through high school.

The second formative assessment activity focused on preparing for the new paragraph-length responses on the AP Physics 2 exam. To practice these types of responses, students responded to a prompt using [Socrative](http://www.socrative.com). Socrative allows me to share all of the responses with students, and students can vote for the best one. We can then, as a class, discuss the elements of the best responses.

Unfortunately, InfuseLearning [closed their doors in April](http://www.infuselearning.com/?page_id=35). In preparation for sharing resources with teachers this summer, before we deploy 1:1 Chromebooks to all high school students this fall, I surveyed the currently available tools, focusing specifically on support for Peer Instruction with drawing and for peer critique of written responses.

I evaluated the following features.

* **Cost**: Is there a free version? What are the limitations of the free version? Can teachers upgrade to a paid version on an as-needed basis?
* **Account Creation**: How easy is it for students to create accounts? Can they log in with their Google accounts?
* **Prepared Questions**: Does the tool support preparing questions in advance?
* **Spontaneous Questions**: Does the tool support creating a question on-the-fly without preparation ahead of time?
* **Supported Question Types**: What types of questions does the tool support?
* **Multiple Choice Questions**: Since Peer Instruction often uses multiple choice questions, how flexible are these questions? Can the answer choices be customized (e.g., A-D or 1-4)? Can the number of answer choices be customized?
* **Draw Response Questions**: Are draw response questions supported by the tool? How rich are the drawing tools?
* **Sharing Student Responses with Students**: Does the tool support sharing sample student responses with all students?
* **Capturing Student Responses**: Does the tool support capturing student responses for later analysis? What can and cannot be captured?
* **Reporting**: Does the tool support reporting of sessions? What is captured and reported?

[Socrative](http://socrative.com/)
---

* **Cost**: free
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: yes
* **Supported Question Types**: multiple choice, true/false, short answer
* **Multiple Choice Questions**: limited options (exactly 5, A-E)
* **Draw Response Questions**: no
* **Sharing Student Responses with Students**: yes, for short answer; students can vote on the best peer answer
* **Capturing Student Responses**: yes
* **Reporting**: for prepared questions and short answer only (not spontaneous multiple choice or true/false)

[The Answer Pad](http://theanswerpad.com)
---

* **Cost**: free and paid; the free version is limited (templates, question types, creating your own images, and capturing student responses are all restricted)
* **Account Creation**: students have to create accounts (doesn’t support Google accounts) if you want to track student responses
* **Prepared Questions**: yes, but not draw response
* **Spontaneous Questions**: yes
* **Supported Question Types**: multiple choice, true/false, yes/no, up/down, fill-in, Likert scale, drawing
* **Multiple Choice Questions**: limited options (exactly 4, A-D)
* **Draw Response Questions**: yes, decent drawing tools
* **Sharing Student Responses with Students**: no
* **Capturing Student Responses**: limited in free version
* **Reporting**: only for prepared questions

[Formative](http://goformative.com/)
---

* **Cost**: free
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: no (a possible workaround is to prepare some generic template questions in advance)
* **Supported Question Types**: multiple choice, show your work (draw response), short answer, true/false
* **Multiple Choice Questions**: flexible response choices
* **Draw Response Questions**: yes, but limited (no colors)
* **Sharing Student Responses with Students**: no
* **Capturing Student Responses**: automatic
* **Reporting**: yes

[Pear Deck](http://www.peardeck.com/)
---

* **Cost**: free and paid; free is limited (draw response in prepared decks, capturing, and reporting are paid features)
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: kind of (can ask a quick question in the context of an existing deck)
* **Supported Question Types**: agree/disagree, draw on grid, draw on blank, yes/no, true/false, multiple choice, long text answer, short text answer, numeric answer
* **Multiple Choice Questions**: flexible response choices
* **Draw Response Questions**: yes (quick question only for free version)
* **Sharing Student Responses with Students**: no
* **Capturing Student Responses**: paid only
* **Reporting**: paid only

[NearPod](http://nearpod.com/)
---

* **Cost**: free and paid (free has limited storage space and reporting export options)
* **Account Creation**: integrated with Google accounts
* **Prepared Questions**: yes
* **Spontaneous Questions**: no (a possible workaround is to prepare some generic template questions in advance)
* **Supported Question Types**: open-ended question, poll, quiz, draw it
* **Multiple Choice Questions**: flexible response choices
* **Draw Response Questions**: yes, decent drawing tools
* **Sharing Student Responses with Students**: yes
* **Capturing Student Responses**: yes
* **Reporting**: yes (PDF only in free version)

Conclusions
---

At our summer professional learning sessions, we will be featuring Socrative. It is easy to use and applies to a wide variety of disciplines. The significant drawback of Socrative is its lack of draw-response questions. For teachers who need that feature, I’m recommending NearPod. I used NearPod a couple of years ago when piloting classroom iPads; at that time, it was an iPad-only app. I was thrilled to discover that it now supports all major platforms.

For my physics classroom, I’m going to use NearPod for Peer Instruction because draw-response questions are so important. While I’d rather be able to create spontaneous questions, I’m also interested in capturing student answers to gain more insight into their learning, which necessitates creating a set of questions ahead of time. I will create a few slides in each deck that can serve as general-purpose placeholders for spontaneous questions.

I’ll still use Socrative for peer critique of written responses. The ability to share student responses with the class and have students vote for the best response is very effective at developing their writing. These two classroom activities, Peer Instruction and peer critique of written responses, are done independently, so using two different tools should not be inconvenient.

If I’ve missed a tool that would work well for either of these classroom activities, please leave a comment to let me know!