Twitter Mapping Lab

Background

For the past four years, we’ve done a Game of Life lab as part of the Decisions and Loops unit in AP Computer Science. I love that lab for many reasons. However, after four years, and since we don’t use GridWorld anywhere else in the curriculum, I decided it was time for a change. I’ve wanted to incorporate data analytics and visualization into a lab for some time. After some research, and with ideas from a couple of labs developed by others, I’m excited to try a new lab this year: Twitter Mapping.

Introduction

From what I provide students:

Your application will allow the user to search Twitter for a particular word or phrase in tweets located in each of the 50 US states and then display on a map of the US the degree of positive or negative sentiment associated with that search term in each state. For example, if the search term is coding, the following map may be displayed.

Twitter Data Visualization

This lab has several goals beyond the immediate concepts of decisions, loops, and unit tests in this unit:

  • Exposure to data analytics. In this lab you will search a large data set (tweets on Twitter) and analyze that data to derive a new understanding (the sentiment of tweets containing a given keyword in each of the 50 US states).

  • Experience using an API (Application Programming Interface) within the context of an unfamiliar software application. In this lab, you will use the Twitter4J API to access Twitter and write code within the partially implemented Twitter Mapping Lab application (a brief sketch of a Twitter4J search follows this list).

  • Exposure to data visualization. In this lab you will visually represent the average sentiment in a map of the 50 US states.
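To make the Twitter4J goal above concrete, here is a minimal, illustrative sketch of querying tweets near a single point and averaging a crude word-count sentiment score. This is not the starter code: the coordinates, word lists, and scoreSentiment helper are placeholders I made up, and it assumes Twitter4J 4.x with credentials supplied via twitter4j.properties. The actual lab would loop over all 50 states and feed the averages to the map display.

    import java.util.Arrays;
    import java.util.List;

    import twitter4j.GeoLocation;
    import twitter4j.Query;
    import twitter4j.QueryResult;
    import twitter4j.Status;
    import twitter4j.Twitter;
    import twitter4j.TwitterException;
    import twitter4j.TwitterFactory;

    public class StateSentimentSketch {

        // Tiny placeholder word lists; the real lab presumably supplies richer sentiment data.
        private static final List<String> POSITIVE = Arrays.asList("love", "great", "fun");
        private static final List<String> NEGATIVE = Arrays.asList("hate", "boring", "awful");

        public static void main(String[] args) throws TwitterException {
            Twitter twitter = TwitterFactory.getSingleton(); // credentials from twitter4j.properties

            // Hypothetical center point and radius for one state; the real lab would
            // iterate over coordinates for all 50 states.
            Query query = new Query("coding");
            query.setGeoCode(new GeoLocation(40.0, -89.0), 200, Query.MILES);
            query.setCount(100);

            QueryResult result = twitter.search(query);
            double total = 0;
            int count = 0;
            for (Status status : result.getTweets()) {
                total += scoreSentiment(status.getText());
                count++;
            }
            double average = (count == 0) ? 0 : total / count;
            System.out.println("Average sentiment for 'coding': " + average);
        }

        // Crude word-count sentiment: +1 per positive word, -1 per negative word.
        private static int scoreSentiment(String text) {
            int score = 0;
            for (String word : text.toLowerCase().split("\\W+")) {
                if (POSITIVE.contains(word)) score++;
                if (NEGATIVE.contains(word)) score--;
            }
            return score;
        }
    }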

Details

Feel free to read the entire lab document and check out the GitHub repository with the starter code. If you would like a sample solution, please contact me.

Credits

Electronic Lab Portfolios Aligned to AP Physics Science Practices

[Updated 15/7/2016, 10:54 PM: added links to two student lab portfolios.]

As I mentioned briefly in my reflection of the 2014-2015 school year, this past year, students created electronic lab portfolios for AP Physics 2. In summary:

  • many students demonstrated deeper metacognition than I have ever observed
  • several students struggled and their portfolios were incomplete
  • providing feedback and scoring consumed a huge amount of my time
  • structural changes made in the spring semester helped considerably

Structure

I was inspired to have students create electronic lab portfolios based on Chris Ludwig’s work and his presentation and our discussion at NSTA last year.

Before the start of the school year, using siteMaestro, I created a Google Site for each student based on a template that I created. I made both myself and the student owners of the site and kept the site otherwise private. The template consisted of two key portions: a Lab Notebook, which provides a chronological accounting of all labs; and a Lab Portfolio, which is the best representation of the student’s performance. I shared a document with the students that explained the purpose of, and distinction between, the Lab Notebook and Lab Portfolio.

The lab portfolios were structured around the seven AP Physics Science Practices. I wanted students to evaluate and choose their best work demonstrating their performance of each Science Practice. I also wanted the most critical and significant labs to be included; so, some labs were required to be in the lab portfolio. In the fall semester, I required each student to publish at least two examples demonstrating each of the seven Science Practices.

I wanted students to think more deeply about the labs than they had in the past, and I didn’t want the lab portfolio to be just a collection of labs. So, in addition to the lab report needed to demonstrate a given Science Practice, students also had to write a paragraph reflecting on why that lab was an excellent demonstration of their performance of the specific Science Practice.

The lab portfolio comprised 40% of the coursework grade for each semester. For the fall semester, the lab portfolio was scored at the end of the semester. I provided a few formal checkpoints throughout the fall semester where students would submit their portfolio (just a link to their site) and I would provide feedback on their labs and paragraphs.

Fall Semester

Many students wrote excellent paragraphs demonstrating a deeper understanding of the Science Practices than anything I had previously read. Other students really struggled to distinguish between writing a lab report and writing a paragraph that provided evidence that they had performed a given Science Practice. I did create an example of both a lab report and a lab portfolio reflection paragraph based on the Constant Velocity Buggy Paradigm Lab, an experiment shared by all first-year physics students. However, several students needed much more support to write these reflection paragraphs.

In general, those students who submitted their site for feedback had excellent portfolios by the end of the semester; those who didn’t underestimated the effort required and ended up with incomplete or poor-quality portfolios.

What I liked:

  • The metacognition and understanding of Science Practices demonstrated by many students.
  • Students deciding in which labs they most strongly performed each Science Practice.

What I Didn’t Like:

  • Several students struggled to distinguish a lab report from a paragraph providing evidence of performing a Science Practice.
  • Several students didn’t have enough support to complete a project of this magnitude and ended up with incomplete lab portfolios.
  • Providing feedback and scoring all of the lab portfolios over winter break consumed a huge amount of time.

Spring Semester

The spring semester presented different challenges and constraints:

  • We focus more on preparing for the AP exam and less on lab reports.
  • I don’t have the luxury of a two-week break to score lab portfolios at the end of the semester.

Based on these constraints and our experience during the fall semester, I made some changes for the spring semester. I selected seven required labs in the spring semester, one for each Science Practice. Each lab and reflection paragraph was due a few days after performing the lab, not at the end of the semester.

This had some advantages:

  • the portfolio was scored throughout the semester
  • students had more structure, which helped them stay current

and disadvantages:

  • no student choice in selection of labs to include in portfolio
  • no opportunity to revise a lab or reflection paragraph (the feedback could help them in labs later in the semester)

With these changes and students’ experience from the fall semester, the lab portfolios in the spring semester were largely successful. I think it is important to emphasize that both the changes and the students’ experience contributed to this success. I do not believe that the structure for the spring semester would lead to a more successful fall semester. The feedback I received from students at the end of the year was much more favorable concerning the structure in the spring semester than the structure in the fall semester.

Next Fall

I had the wonderful experience of being coached this year by Tony Borash. Tony provided guidance in many areas, one of which was making these adjustments for the spring semester and, more importantly, planning for next year. Together we were able to come up with a structure that will hopefully combine the strengths of the structure in the fall semester with the structure in the spring semester. My goals for these changes are to:

  • provide more structure for students
  • provide student choice
  • incorporate peer feedback

Here’s the plan for next fall:

  1. I choose the first lab. Students complete and submit the lab and the reflection paragraph. I provide feedback. Students make revisions and re-submit the lab and reflection paragraph. We review the best examples as a class.
  2. I choose the second lab. Students complete the lab and the reflection paragraph. Students provide peer feedback to each other. Students make revisions and submit the lab and reflection paragraph.
  3. Students choose the next lab to include in the portfolio. Students complete the lab and the reflection paragraph. Students provide peer feedback to each other. Students make revisions and submit the lab and reflection paragraph.
  4. Students choose some of the remaining labs, and I choose some of the remaining labs. Students complete the labs and reflection paragraphs. Students specify a subset of Science Practices on which they want formal feedback from me and on which they want feedback from their peers. Students make revisions and re-submit.

This past year, students included a link to their lab report in their lab portfolio and shared the lab report (as a Google Doc) with me. Next year, I will have students embed their lab report into the Google site. This will facilitate peer feedback and enable everyone to use comments within the Google site to provide feedback. I may still have students share the actual doc with me, as well as include a link, so I can provide more detailed suggestions directly within the document.

Student Examples

Conclusion

I’m pleased that my students and I are heading down this path and believe my students will gain a much deeper understanding of Science Practices as a result. While I shared this with my colleagues this past year, I also cautioned them that I didn’t have it figured out, and it wasn’t a smooth ride. I think electronic lab portfolios are an excellent way to assess student performance, and I hope that they will be used in other science courses in the future as they are a natural fit to the NGSS Science and Engineering Practices. I hope that after this next year, I will have something that will provide my colleagues with a stronger framework to adapt to their classes.

New Software Engineering Course

[Updated 6/7/2016, 8:08 PM: Added additional security and web app concepts based on feedback.]

After a few years of effort, I’m thrilled to announce that my school district will be offering a new course in the fall of 2017: Software Engineering! As I will explain below, I’m also excited to ask for your ideas.

Course Description

Software Engineering is a weighted, two-semester course for students who have completed AP Computer Science. The course starts with a core set of software engineering topics (e.g., software engineering process; data structures; technology, society, and ethics) followed by a series of software topics selected by students and aligned to their products (e.g., multithreading, networking, security, web apps, embedded systems, mobile apps). After gaining the necessary background knowledge, small groups of students iterate through software engineering development cycles to produce a software product. Prerequisites: successful completion of AP Computer Science and teacher recommendation.

Curriculum

We have started to work on the curriculum this summer. The plan is for each student in the class to be provided with their own Raspberry Pi. This sidesteps any issues with students not having the necessary permissions on the computers in the lab and enables students to work on their products outside of class. The software engineering process unit will introduce Agile methodologies in the context of a class project, which will also introduce the students to their Raspberry Pi as they develop an audio server. The data structures unit will be substantial yet fairly straightforward to design. Concepts will include linked lists, queues, stacks, sets, maps, binary trees, hashes, and hash maps. The technology, society, and ethics unit will be similar to what we do in AP Computer Science but condensed to enable groups of students to focus on a specific topic in more depth. All students will complete these three units.
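To give a flavor of the data structures unit, here is a small illustrative sketch (not from the actual curriculum) of a generic stack backed by a linked list, which touches two of the concepts listed above.

    import java.util.NoSuchElementException;

    // A generic stack built from singly linked nodes.
    public class LinkedStack<E> {

        // Each node holds one element and a reference to the node beneath it.
        private static class Node<E> {
            E item;
            Node<E> next;
            Node(E item, Node<E> next) { this.item = item; this.next = next; }
        }

        private Node<E> top;
        private int size;

        public void push(E item) {
            top = new Node<>(item, top);   // new node points at the old top
            size++;
        }

        public E pop() {
            if (top == null) throw new NoSuchElementException("stack is empty");
            E item = top.item;
            top = top.next;
            size--;
            return item;
        }

        public boolean isEmpty() { return top == null; }

        public int size() { return size; }
    }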

The remainder of the first semester will be different for each student group. Each group will decide on a product that they want to develop in the second semester. Based on their product, they will complete various modules (probably two) to gain the necessary background knowledge. Due to my background, I’m comfortable designing the networking and multithreading units. However, for the other units, I’m very interested in your ideas as I fear my expertise is either lacking (security, web apps) or outdated (embedded systems, mobile apps). Below, I’ll share some rough ideas for each unit.

Security

Potential topics: cryptography (history, SSL, encryption), authentication (users, servers, certificate authorities), authorization, session management, firewalling stored user data, attack vectors (hijacking sessions, SQL injections, sanitizing user input)

Questions: Cybersecurity is super popular, but so many of the available resources seem targeted at a much more introductory level than what would be challenging for a post-AP class. What topics would be appropriate for these students? What resources are available? What can they explore with their Raspberry Pi?
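One way to make the attack-vector topics concrete for post-AP students is to contrast injection-prone and parameterized database queries. Below is a short JDBC sketch of that idea; the table and column names are invented for illustration, and this is only one possible exercise, not part of the planned curriculum.

    import java.sql.Connection;
    import java.sql.PreparedStatement;
    import java.sql.ResultSet;
    import java.sql.SQLException;

    public class LoginQueries {

        // Vulnerable: concatenating user input means an entry like "' OR '1'='1" changes the query itself.
        public static ResultSet findUserUnsafe(Connection conn, String username) throws SQLException {
            String sql = "SELECT id FROM users WHERE username = '" + username + "'";
            return conn.createStatement().executeQuery(sql);
        }

        // Safer: a parameterized query treats the input strictly as data, not as SQL.
        public static ResultSet findUserSafe(Connection conn, String username) throws SQLException {
            PreparedStatement ps = conn.prepareStatement("SELECT id FROM users WHERE username = ?");
            ps.setString(1, username);
            return ps.executeQuery();
        }
    }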

Web Apps

Potential topics: JavaScript, jQuery, MongoDB, Meteor, Bootstrap, PHP, Laravel, open APIs, Chrome extensions and apps

Questions: The last web app I developed was in Perl. What are the best technologies for students to be learning now? I’m excited to learn, and I want to make sure this unit is relevant. What resources are available?

Embedded Systems

Potential topics: bit-wise operations, I/O, interrupts, buffering, ADC, SPI

Questions: How close to the metal can you get on a Raspberry Pi? Which embedded concepts are most prevalent in today’s products? What resources are available?
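Since the embedded topics above lean heavily on bit-wise operations, here is a tiny, self-contained Java illustration of the masking, clearing, toggling, and shifting idioms such a unit would cover; the "status register" and its bit meanings are invented for the example.

    public class BitwiseDemo {
        public static void main(String[] args) {
            int status = 0b0000_0000;        // pretend 8-bit status register

            status |= 0b0000_0100;           // set bit 2 (e.g., "data ready")
            status &= ~0b0000_0001;          // clear bit 0
            status ^= 0b1000_0000;           // toggle bit 7

            boolean dataReady = (status & 0b0000_0100) != 0;   // test bit 2
            int highNibble = (status >> 4) & 0x0F;             // extract the upper 4 bits

            System.out.printf("status=%s dataReady=%b highNibble=%d%n",
                    Integer.toBinaryString(status), dataReady, highNibble);
        }
    }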

Mobile Apps

Potential topics: iOS and Android apps

Questions: While Apple and Google seem to have pretty good resources, are there better resources available? Anything else I should consider for this unit?

Group Product

I’m really looking forward to students working in groups for an extended period of time to develop an authentic product. In the context of an entire semester, groups will have time to complete multiple iterations. Ideally, I would love to partner each group with a stakeholder in the community that would benefit from a software product. I have some leads, and if you have any others, please let me know!

Please comment or strike up a conversation on Twitter if you have any ideas.

2015 Summer Reading List

As I was preparing my stack of books for this summer, I realized that I never published the list of books I read in the summer of 2015. It wasn’t a great summer of reading, but there are several good books that I want to share.

what if? Serious Scientific Answers to Absurd Hypothetical Questions by Randall Munroe

This should be required reading for science teachers. It is an inspiring example of how to present science in incredibly engaging, although absurd, contexts. I love xkcd and had the pleasure of hearing Munroe talk on his book tour for Thing Explainer which I hope to finish this summer. I include xkcd comics in my slide notes and link to the what if? blog as extension readings in each unit. Sometimes when an AP Physics 2 student is stuck thinking about capstone ideas, I encourage them to create their own “what if?” capstone.

Nine Algorithms That Changed the Future by John MacCormick

This was a very accessible book. It would be a good summer read for incoming AP Computer Science students. If I ever have to assign summer work, I would consider assigning this book. I may draw from it for the new Software Engineering course that I’m designing this summer.

The Code Book – The Science of Secrecy from Ancient Egypt to Quantum Cryptography by Simon Singh

This book was fantastic – computer science, history of science, spy craft. I’ve gifted this book at least three times in the past year to family and students, and I’ve recommended it to several others. Singh includes enough mathematics to make it interesting but complete understanding is not required to appreciate the book.

The Housekeeper and the Professor by Yōko Ogawa

A beautiful book. While there is some math, it should be read for the incredible relationship between the professor and his housekeeper. I recommended this book be added to my school’s library.

All the Light We Cannot See by Anthony Doerr

My mom gave me this book and encouraged me to enjoy reading something not related to teaching, physics, or computer science. I’m glad I did. You all are probably familiar with this book; it did win the Pulitzer Prize after all.

Anathem by Neal Stephenson

I am a huge fan of Stephenson. While Snow Crash and The Diamond Age are monumental works because they defined the cyberpunk genre, and the Baroque Cycle is my favorite collection of historical fiction, Anathem is Stephenson’s greatest work. I actually listened to this entire book on CDs (28 of them while driving all over!). Afterward, I bought the book to share with my son and re-read various sections.

[Update 29/6/2016, 11:50 AM] I forgot a book!

A More Beautiful Question: The Power of Inquiry to Spark Breakthrough Ideas by Warren Berger.

I agree with the premise behind this book: questioning is powerful. For me, many of the examples cited were familiar, and, therefore, I found the book not as groundbreaking as I had hoped. However, I know of others who read it and found it quite insightful. If you are a parent, teacher, or business person, I would recommend checking it out from your local library.

My AP Computer Science 2016 Solutions

I shared these with my students and thought that others may be interested as well. I typed up solutions to the 2016 AP Computer Science free response questions. The zip file includes a BlueJ project file and test code to verify solutions. As I tell my students, there’s no guarantee that I wrote perfect solutions, and there are multiple ways to answer these questions.

AAPTWM16: Blueprints for Accessible and Affordable High-Altitude Ballooning

Mark Rowzee and I spoke at the American Association of Physics Teachers (AAPT) 2016 Winter meeting as part of Session EI: Quadcopters, Drones and High Altitude Balloons. Our talk was “Blueprints for Accessible and Affordable High-Altitude Ballooning.”

Abstract: We’ll provide you with the blueprints for success, since the moment you release your first high-altitude balloon, you are stricken with an unsettling combination of joy and terror. It is relatively easy to launch a high-altitude balloon; it requires much more planning, resources, and luck to get it back. We will share our experiences designing, launching, and recovering high-altitude balloons over the past six years. We will share the science that can be done with a variety of student age groups (elementary, junior high, and high school). We will share the materials necessary for a successful launch and recovery for a variety of budgets. We will share the safety precautions that are required. Finally, we have photos, videos, resources, and stories that we hope will inspire you to conduct your own launch.

AP Physics 2 Reflection

On the eve of the first day of school, I felt that I had better capture my thoughts on AP Physics 2 last year. My perspective may be different from others’ (at least different from the vocal minority(?) on the AP Teacher Community).

I started last year eagerly anticipating the new AP Physics 2 course. For the past seven years, I had taught some type of second-year physics course. For most of that time, I taught what we called Advanced Physics, a one-semester course after which some of my students would take the AP Physics B exam. For a couple of years, I taught an official, year-long AP Physics B course. I felt that the AP Physics B course had too much content to cover well, even as a second-year course. This was compounded by the mismatch between the groups of students who enrolled in the course. About a third of the students had previously taken our General Physics course, and two-thirds, Honors Physics. The Honors Physics students had studied additional units not part of the General Physics course. As a result, for some “review” units in AP Physics B, the pace was much too fast for those from General Physics and much too slow for those from Honors Physics.

The new AP Physics 2 course contained less content. In addition, the emphasis shifted towards deeper conceptual understanding of physics rather than numeric or algebraic problem solving. As a result of these changes, I felt that I could at last integrate much more of Modeling Instruction into a second-year physics course. I wasn’t too concerned about the shift towards deeper conceptual understanding since I had been moving my course in that direction for the past couple of years based on student performance on the AP Physics B exam. My students had done extremely well on the free response portion of the AP Physics B exam; therefore, I had adjusted class to focus more on conceptual understanding since the greatest area for growth was on the multiple choice portion of the exam. During the summer of 2014, I attended an AP Summer Institute to learn more about the new course. As a result of all of this, I started last year much more excited than anxious.

Reflecting back on AP Physics 2 last year, it was my favorite year teaching a second-year physics course. That said, while many aspects of the course worked well, there are definite areas for me to improve this year.

What Worked

Peer instruction was very effective at developing students’ conceptual understanding. Of all the various types of activities done in class, students ranked peer instruction as the most helpful (over 75% of students agreed with the statement “Participating in peer instruction of conceptual questions helped me understand the material.” on the end-of-year survey). The manner in which I conduct peer instruction is strongly influenced by the research of Stephanie Chasteen, who writes at sciencegeekgirl. The questions I use are a combination of Paul Hewitt’s Next-Time Questions, the end-of-chapter conceptual questions in Knight’s College Physics text, and those in clicker question banks from CU Boulder and OSU.

The number and variety of lab activities also worked well. Some labs were informal stations; some, typical Modeling Instruction paradigm labs; some, lab practicums. With less content, we had time for more, and deeper, labs. Some of the labs and skills involved went beyond what the AP Physics 2 curriculum requires, but some of these were the students’ favorites. We will continue to explore computational modeling, build more advanced circuits on breadboards, and explore particle physics.

What Didn’t Work

Building my standards, and grading, on the Enduring Understandings defined for each Big Idea did not work well. While my goal was for students to see the connections between the various content areas and appreciate the Big Ideas, students shared that organizing the standards and grades in this manner didn’t help accomplish this. It did result in a lot of extra work for me. After the fall semester, I mostly abandoned this approach. Below, I’ll explain my approach for this year.

Whiteboarding homework problems did not work well. My approach was for six groups of students to prepare and present whiteboards based on assigned homework problems. This didn’t work well because too few students had done the homework problems in advance of whiteboarding. As a result, most of the group would watch those who had done the problems prepare the whiteboards and didn’t really understand the solution. This issue was compounded when whiteboards were presented. Too few students had struggled with the problem in advance to result in a good discussion. This wasn’t the case every time, but much too often.

What I’m Trying This Year

My attempts to prepare students for the free response portion of the AP Physics 2 exam fell somewhere between working and not working. I overestimated students’ ability to write clear, concise, and correct free responses. As a result, I didn’t dedicate sufficient time to practicing this skill. What did work well was using Socrative to share student responses and peer critique these responses. We will do this much more this year.

While my attempts to reinforce the Big Ideas by structuring standards and scores around Enduring Understandings didn’t work, emphasizing the AP Science Practices did work well. Inspired by Chris Ludwig’s work with portfolios and our discussion at NSTA earlier this year, my students will create a lab notebook and portfolio on their own Google Site. The notebook will capture all the labs and the portfolio will be a curated collection of labs that demonstrate their performance of the various AP Science Practices. I hope to share the details of this soon.

To improve the value of whiteboarding, I’m making several changes. Instead of six groups preparing and presenting six problems, groups will prepare and present only two problems. Each problem will be prepared by three groups. The problem won’t be assigned as homework. Rather, we will spend more class time as each group works together to solve the problem. A randomly selected member of each group will be responsible for presenting the whiteboard, and the class will focus on comparing and contrasting solutions between the various groups in addition to the solution itself.

Scores

The average AP Physics 2 scores were about a point lower than the previous year’s AP Physics B scores (3.344 vs. 4.484). However, as I considered the standards and expectations for AP Physics 2 compared to AP Physics B and carefully considered each of my students, their scores were what I expected, except for a few.

Summary

I’m thrilled with the new AP Physics 2 class and excited about teaching this course for the second time. All that I miss from AP Physics B is its huge collection of exam questions from which I could build my own assessments. My one wish is that the College Board release additional questions, as questions in the style of the new exam are very difficult to create. I hope that the changes I have planned for this year help students develop an even stronger and deeper understanding of physics and proficiency in science practices than last year’s students did. If you are interested in more detail about my approach last year, my 180 blog focused solely on AP Physics 2.

Selling Our Colleagues Short with SAMR

I’ve been seeing more and more references to SAMR. Maybe it’s because ISTE is starting, or maybe it’s because my district is promoting it as a tool for teachers embarking on our Digital Learning Initiative (1:1 devices), or maybe it’s just the Baader-Meinhof phenomenon. Regardless, I can only tolerate so many SAMR infographics before I’m pushed over the edge and have to say something.

Due to its overemphasis on technology, SAMR is the least helpful model to promote with teachers if you want to provide a resource to positively impact student learning.

Depending on the teacher, it confuses, at best, and misleads, at worst. I’m not alone in this sentiment. Several of my colleagues, both local and online, have expressed similar feelings. Most eloquent are a couple of posts by Casey Rutherford. My favorite quote:

On the note of lesson design, I am not satisfied with simplifying the complexities of teaching to where it falls on the SAMR scale. Teaching is nuanced, fluid, and has a ton of moving parts, and we’d be better off embracing that than cheapening it with a stamp of ‘modification.’

I’ll illustrate the problem with a couple of lessons.

Lesson #1

Embracing the “Flipped Classroom,” the teacher records and shares a demonstration of the projectile motion lab and an overview of the lab procedure. Students can watch the video for homework before coming to class. In addition to the video, the lab procedure is published to Canvas, and each student can follow the procedure on their device while performing the lab. The video (and published procedure) instructs students on how to set up the apparatus and make the necessary measurements of the path of the ball bearing that is launched after rolling down the ramp. The data is captured in a Google Doc and plotted using Plotly, allowing group members to share data with each other. Based on their plot, students determine where on the floor to place the target, and then the teacher observes their attempt. Points are awarded based on how close the ball bearing lands to the target. Each student creates a video that captures their attempt, in which they report their percent error and list potential sources of error. The videos are posted on their public blogs. Students, and the public, can comment on these videos, but no one does.

Let’s take a look at this lesson. Hmmm. Flipped classroom, LMS, Google Docs, Plotly, video creation, student blogs. Woohoo! We are swimming in the deep end of the SAMR pool! Modification? Redefinition? Doesn’t matter, we’re above the bar!

There’s just one problem. This lesson sucks. I know, it’s based on one of my actual lessons. My class did it. They weren’t engaged; they didn’t have ownership; they didn’t have choice; they didn’t exercise their creativity. They asked me a ton of questions about the details of the procedure. From a pedagogical perspective, it is flawed. It is a traditional cookbook science lab jazzed up with a bunch of tech.

Don’t worry, I made improvements the next year. I focused on research-based pedagogy, and I integrated technology where it supported the pedagogy and content.

Lesson #2

Students are presented with a challenge: given a fixed vertical elevation of the projectile launcher (i.e., “cannon”), determine the launch angle and time of launch to hit the toy buggy at a specific location as it “flees.” Students work in small groups to justify the selection of the kind of data needed, design a plan for collecting data, and collect data. They choose the tools with which to collect the data. Some groups use video cameras; others, motion detectors; others, photo gates; others, meter sticks; others, phones. They create a computational model using a 3D physics programming language, since a traditional mathematical solution is beyond most of their current capabilities (one group solves the problem algebraically using clever trig substitutions, which is fine). Using the computational model, they solve for the launch angle and time of launch. Their attempt based on their calculation is recorded with a high-speed video camera and shared publicly to celebrate their success. Students reflect on the lab practicum with a specific focus on measurement uncertainty and capture their reflections in their electronic portfolios, which they will export into an open format (HTML) and take with them to university. During the whole-class post-lab discussion, each group shares what they consider the most significant aspects of their approach, since each group approached the lab differently. Groups compare and contrast these techniques, arriving at a set of best practices for minimizing measurement uncertainty.
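To illustrate what a computational model for this challenge might look like, here is a rough, hypothetical Java sketch (my students actually used a 3D physics programming language, and every measured value below is invented): it sweeps launch angles with a simple Euler integration and then picks a launch delay so the buggy reaches the landing spot just as the ball does.

    public class BuggyInterceptModel {
        public static void main(String[] args) {
            // Invented stand-ins for measured values.
            double h = 1.0;        // launcher height above the floor (m)
            double v0 = 5.0;       // measured launch speed (m/s)
            double targetX = 2.5;  // horizontal distance from launcher to the required impact point (m)
            double buggyX0 = 0.5;  // buggy's starting distance from the launcher (m)
            double buggyV = 0.3;   // measured buggy speed (m/s)
            double g = 9.8;        // gravitational acceleration (m/s^2)
            double dt = 0.0001;    // time step (s)

            for (double angleDeg = 5; angleDeg <= 85; angleDeg += 0.01) {
                double theta = Math.toRadians(angleDeg);
                double x = 0, y = h, vx = v0 * Math.cos(theta), vy = v0 * Math.sin(theta), t = 0;
                while (y > 0) {            // simple Euler steps until the ball reaches the floor
                    x += vx * dt;
                    vy -= g * dt;
                    y += vy * dt;
                    t += dt;
                }
                if (Math.abs(x - targetX) < 0.005) {   // this angle lands close enough to the target
                    // Delay the launch so the buggy arrives at the target as the ball lands.
                    double launchDelay = (targetX - buggyX0) / buggyV - t;
                    System.out.printf("launch at %.2f degrees, %.2f s after the buggy starts%n",
                            angleDeg, launchDelay);
                    break;
                }
            }
        }
    }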

Students were motivated and engaged. They were creative and collaborative. They asked each other many questions. They surprised me with their solutions. They focused on deeper science practices beyond the content of projectile motion. Some groups incorporated technology to help them meet the challenge. Some groups hardly used any technology at all.

Some may push back on my assertion and claim that I’m oversimplifying SAMR, that there is more to it than what I’m presenting, and that I’m missing the student-centered vs. teacher-centered aspect. Maybe there is, but you wouldn’t know it from most of the resources out there. SAMR Coffee? SAMR Apps? Really?

Some may argue that teachers new to tech need a simple model to reference. Yes, SAMR is simple. But, why promote it when there are better and more inclusive models available? Do we really think TPACK is too complex for our colleagues? Are we selling them that short?

I’m not.

Formative Assessment Tools for Peer Instruction and Peer Critique of Written Responses

This past year, as my AP Physics 2 classes piloted Chromebooks, we used a couple of formative assessment tools frequently in class. For Peer Instruction, we used InfuseLearning. InfuseLearning’s stand-out feature was its support for draw-response questions. Having students sketch graphs and draw diagrams is very valuable as a formative assessment in physics class. Throughout the year, as I shared InfuseLearning with other teachers participating in the pilot, the draw-response feature was the most popular with everyone, from elementary through high school.

The second formative assessment activity was focused on preparation for the new paragraph-length responses on the AP Physics 2 exam. To practice these types of responses, students responded to a prompt using Socrative. Socrative allows me to share all the responses with students, and students can vote for the best one. We can then, as a class, discuss the elements of the best responses.

Unfortunately, InfuseLearning closed their doors in April. In preparation for sharing resources with teachers this summer before we deploy 1:1 Chromebooks for all high school students this fall, I surveyed the current tools available with a focus specifically on Peer Instruction that supports drawing and peer-critique of written responses.

I evaluated the following features:

  • Cost: Is there a free version? What are the limitations of the free version? Can teachers upgrade to a paid version on an as-needed basis?
  • Account Creation: How easy is it for students to create accounts? Can they login with their Google account?
  • Prepared Questions: Does the tool support preparing questions in advance?
  • Spontaneous Questions: Does the tool support creating a question on-the-fly without preparation ahead of time?
  • Supported Question Types: What types of questions does the tool support?
  • Multiple Choice Questions: Since Peer Instruction often uses multiple choice questions, how flexible are these questions? Can the answer choices be customized (e.g., A-D or 1-4)? Can the number of answer choices be customized?
  • Draw Response Questions: Are draw response questions supported by the tool? How rich are the drawing tools?
  • Sharing Student Responses with Students: Does the tool support sharing sample student responses with all students?
  • Capturing Student Responses: Does the tool support capturing student responses for later analysis? What can and cannot be captured?
  • Reporting: Does the tool support reporting of sessions? What is captured and reported?

Socrative

  • Cost: free
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: yes
  • Supported Question Types: multiple choice, true/false, short answer
  • Multiple Choice Questions: limited options (exactly 5, A-E)
  • Draw Response Questions: no
  • Sharing Student Responses with Students: sharing short answers allows students to vote on the best peer answer
  • Capturing Student Responses: yes
  • Reporting: for prepared questions and short answer only (not spontaneous multiple choice or true/false)

The Answer Pad

  • Cost: free and paid; the free version is limited (templates, question types, creating your own images, capturing student responses)
  • Account Creation: students have to create accounts (doesn’t support Google accounts) if you want to track student responses
  • Prepared Questions: yes, but not draw response
  • Spontaneous Questions: yes
  • Supported Question Types: multiple choice, true/false, yes/no, up/down, fill-in, Likert scale, drawing
  • Multiple Choice Questions: limited options (exactly 4, A-D)
  • Draw Response Questions: yes, decent drawing tools
  • Sharing Student Responses with Students: no
  • Capturing Student Responses: limited in free version
  • Reporting: only for prepared questions

Formative

  • Cost: free
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: no (maybe have some standard templates?)
  • Supported Question Types: multiple choice, show your work (draw response), short answer, true/false
  • Multiple Choice Questions: flexible response choices
  • Draw Response Questions: yes, but limited (no colors)
  • Sharing Student Responses with Students: no
  • Capturing Student Responses: automatic
  • Reporting: yes

Pear Deck

  • Cost: free and paid; free is limited (draw response in prepared decks, capturing, and reporting are paid features)
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: kind of (can ask a quick question in the context of an existing deck)
  • Supported Question Types: agree/disagree, draw on grid, draw on blank, yes/no, true/false, multiple choice, long text answer, short text answer, numeric answer
  • Multiple Choice Questions: flexible response choices
  • Draw Response Questions: yes (quick question only for free version)
  • Sharing Student Responses with Students: no
  • Capturing Student Responses: paid only
  • Reporting: paid only

NearPod

  • Cost: free and paid (free has limited storage space and reporting export options)
  • Account Creation: integrated with Google accounts
  • Prepared Questions: yes
  • Spontaneous Questions: no (maybe have some standard templates?)
  • Supported Question Types: open-ended question, poll, quiz, draw it
  • Multiple Choice Questions: flexible response choices
  • Draw Response Questions: yes, decent drawing tools
  • Sharing Student Responses with Students: yes
  • Capturing Student Responses: yes
  • Reporting: yes (PDF only in free version)

Conclusions

At our summer professional learning sessions, we will be featuring Socrative. It is easy to use and applies to a wide variety of disciplines. The significant drawback of Socrative is the lack of draw-response questions. For those teachers who need that feature, I’m recommending they use NearPod. I used NearPod a couple of years ago when piloting classroom iPads. At that time, NearPod was an iPad-only app. I was thrilled to discover that it now supports all major platforms.

For my physics classroom, I’m going to use NearPod for Peer Instruction because draw-response questions are so important. While I’d rather be able to create spontaneous questions, I’m also interested in capturing student answers to provide me more insight into their learning, which necessitates creating a set of questions ahead of time. I will create a few slides in each deck that can serve as general-purpose placeholders for spontaneous questions.

I’ll still use Socrative for peer-critique of written responses. The ability to share student responses with students and have students vote for the best response is very effective at developing their writing. These two classroom activities, Peer Instruction and peer-critique of written responses, are done independently, so using two different tools should not be inconvenient.

If I’ve missed a tool that would work well for either of these classroom activities, please leave a comment to let me know!

Reflection on This Year’s Learning Experiences

My school district has a new system for how teachers’ professional development earns them credit on the salary schedule. In addition to the traditional approaches of taking graduate courses or completing additional higher-education degrees, several other opportunities are now options. Last school year, I wrote a proposal for “Discovery, sharing, execution, and enhancement of research-based and field-tested best practices for physics education.” Over the summer, I documented all that I did and received credit under the new program for these activities. While it took an unexpected amount of effort to navigate the new bureaucracy, those wrinkles can be ironed out as everyone gains more experience with the new system. I believe the concept of this new model is sound. It’s a lot easier to adjust the workflow and bureaucracy than to adjust the fundamental concept. Below is the summary reflection that I submitted. Thanks to all of you who influence my professional development!

As I reflect on these learning experiences over the past year, a theme of balance emerges. The strong impact of these experiences was balanced among my learning, the district, student learning, and my peers. The medium through which ideas were exchanged (face-to-face, virtual real-time, online) was balanced, leveraging the strengths of each. My focus on learning from others and sharing my expertise was balanced. The level of commitment of various professional learning communities was balanced: a small group of high school physics teachers had a very high level of commitment in my Physics Learning Community, while the informality and transient nature of Twitter enabled many to share their insights with minimal initial commitment.

These learning experiences were punctuated by reflections. By capturing and sharing these reflections, I benefit from both the immediate act of reflection and the future ability to reference that reflection; others benefit from the sharing of my reflections from which they may draw their own insights. I continue to be pleased at the regularity with which I reference my writings.

Throughout these learning experiences I was reminded how curious, collaborative, and open many educators are about their profession. The diversity of their backgrounds and current roles provides varied experiences and fresh perspectives. I was also reminded that everyone is at a different point in their professional growth. While some methodologies are entrenched in my practice (e.g., standards-based assessment and reporting), other educators are just starting to work through these transitions. It is easy to forget the path one took while growing; capturing this path helps me remember. In addition, the 180 blog provided a forum for me to share the smaller ideas and tips that I would normally not bother to share in a standard blog post. I was pleased and sometimes surprised that many educators found these 180 posts informative. Furthermore, historically, my blog stagnates when school is in session. The 180 blog is rigidly structured into smaller chunks such that I at least share something, which is a net gain.

With so many incredible educators willing to share their expertise and such a plethora of methodologies to explore, I must be balanced in choosing where to focus. I focus on what I think is most important for students, what I am most passionate about, and what I find most interesting, and pass along the rest. Someone else may pick up what I have set aside, and everyone still benefits.