
Software Engineering Book Clubs and Panel Discussion

One of the units during the fall semester in my Software Engineering class focuses on technology, society, and ethics. The big idea is that “Students will research, analyze, discuss, and present contemporary issues at the intersection of technology, society, and ethics”. The guiding question is “How does technology affect change from the critical context of privacy, social justice, economics, education, politics, culture, security, or warfare?”. The rationale for this unit is that technology is having a dramatic impact on every aspect of today’s society. The scope of the impact ranges from the personal to international relations. Today’s students will be tomorrow’s digital citizens who will be designing, applying, and using technological products that will affect change. We achieve these standards through book clubs in which small groups of students (3-5) read, analyze, and discuss a text. Students participate in a panel discussion as the summative assessment for this unit.

The past couple of years, students read one of the following books:

  • Algorithms to Live By: The Computer Science of Human Decisions by Brian Christian
  • Blown to Bits: Your Life, Liberty, and Happiness After the Digital Explosion by Hal Abelson, Ken Ledeen, and Harry Lewis
  • The Hacked World Order: How Nations Fight, Trade, Maneuver, and Manipulate in the Digital Age by Adam Segal
  • The Monsters of Education Technology by Audrey Watters
  • The Second Machine Age: Work, Progress, and Prosperity in a Time of Brilliant Technologies by Erik Brynjolfsson and Andrew McAfee
  • Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy by Cathy O’Neil

I present each of the books to the class and then have them force-rank their choices. I then form the groups to reflect student preferences while balancing the group sizes. This year, I’ll use my genetic algorithm.
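For anyone curious what that genetic algorithm might look like, here is a minimal sketch of the idea (not my actual implementation; the class name, population parameters, and penalty weight below are illustrative assumptions). An individual encodes an assignment of students to books; its cost is the sum of each student’s rank for their assigned book plus a penalty for any group outside the 3-5 student range, and the population evolves through tournament selection, uniform crossover, and mutation.

```java
import java.util.*;

// Hypothetical sketch only: a genetic algorithm that assigns students to books
// based on force-ranked preferences while penalizing unbalanced group sizes.
public class BookGroupGA {
    static final int STUDENTS = 20, BOOKS = 6, POP = 200, GENERATIONS = 500;
    static final Random RNG = new Random(42);
    // prefs[s][b] = rank student s gave book b (1 = first choice ... BOOKS = last choice)
    static int[][] prefs = new int[STUDENTS][BOOKS];

    public static void main(String[] args) {
        // Fake preference data for the sketch: each student gets a random ranking.
        for (int s = 0; s < STUDENTS; s++) {
            List<Integer> ranks = new ArrayList<>();
            for (int r = 1; r <= BOOKS; r++) ranks.add(r);
            Collections.shuffle(ranks, RNG);
            for (int b = 0; b < BOOKS; b++) prefs[s][b] = ranks.get(b);
        }

        // Each individual assigns every student (index) to a book (value).
        int[][] pop = new int[POP][STUDENTS];
        for (int[] ind : pop)
            for (int s = 0; s < STUDENTS; s++) ind[s] = RNG.nextInt(BOOKS);

        for (int g = 0; g < GENERATIONS; g++) {
            int[][] next = new int[POP][];
            for (int i = 0; i < POP; i++) {
                int[] child = crossover(tournament(pop), tournament(pop));
                if (RNG.nextDouble() < 0.1)                     // mutation: move one student
                    child[RNG.nextInt(STUDENTS)] = RNG.nextInt(BOOKS);
                next[i] = child;
            }
            pop = next;
        }

        int[] best = Arrays.stream(pop).min(Comparator.comparingInt(BookGroupGA::cost)).get();
        System.out.println("Best assignment cost: " + cost(best));
        System.out.println(Arrays.toString(best));
    }

    // Lower is better: sum of preference ranks plus a penalty for any group
    // that is used but falls outside the 3-5 student range.
    static int cost(int[] ind) {
        int total = 0;
        int[] size = new int[BOOKS];
        for (int s = 0; s < STUDENTS; s++) {
            total += prefs[s][ind[s]];
            size[ind[s]]++;
        }
        for (int n : size)
            if (n > 0 && (n < 3 || n > 5)) total += 50;   // arbitrary penalty weight
        return total;
    }

    // Tournament selection: pick two random individuals, keep the better one.
    static int[] tournament(int[][] pop) {
        int[] a = pop[RNG.nextInt(POP)], b = pop[RNG.nextInt(POP)];
        return cost(a) <= cost(b) ? a : b;
    }

    // Uniform crossover: each student's assignment comes from either parent.
    static int[] crossover(int[] p1, int[] p2) {
        int[] child = new int[STUDENTS];
        for (int s = 0; s < STUDENTS; s++) child[s] = RNG.nextBoolean() ? p1[s] : p2[s];
        return child;
    }
}
```

Swapping real preference data in for the random rankings and tuning the penalty weight is all it takes to trade off honoring first choices against keeping the groups balanced.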

The book groups meet weekly for six weeks in the middle of our data structures unit, which serves as a refreshing change of pace since the data structures unit is fairly intense. In the past, we’ve met in the Learning Commons, which is a great space for this activity.

Inspired by a Cult of Pedagogy article featuring Marisa Thompson, I structure the weekly discussion around Thoughts, lingering Questions, and Epiphanies (TQEs). Each week the book club group collaboratively completes a document (week #1 template) to capture their discussion. This provides some accountability and serves as a great resource for the panel discussion.

The sample prompts change from week to week:

  • Week 1
    • author’s claims
    • personal or group response to the author’s claims
    • evidence (examples) that support the author’s claims
    • aha! (striking) moments while reading or discussing
  • Week 2
    • Has anyone’s personal response or beliefs changed while reading the first third of the book? If yes, how have they changed? Why did they change? If no, why not?
    • How do your personal responses or beliefs align with those of the author?
  • Week 3
    • Notes on research on the author. What is their background? What is their education and profession? For whom do they work? Do they have a personal connection to the topic? For which, if any, publications do they write?
    • What are the potential biases of the author?
    • How does awareness of these potential biases affect your perspective of the author’s claims?
  • Week 4
    • Through which of the following critical contexts does the technology highlighted in your text most affect change? (privacy, social justice, economics, education, politics, culture, security, or warfare)
    • How does the technology highlighted in your text affect change from the critical context selected above?
  • Week 5
    • What important question related to your book has not yet been asked? That is, what is the question whose answer you are most excited to share? Why is this question so important?
  • Week 6
    • Students are on their own at this point…

After these six weeks of reading, analyzing, and discussing their books, students demonstrate their understanding through their participation in a panel discussion. Originally, I planned on having students write individual essays. One of our Learning Support Coaches recommended alternative assessments and suggested a panel discussion. While I was nervous about managing a panel discussion, it worked incredibly well as a summative assessment and is so much more engaging than a bunch of essays.

The structure of the panel discussion varies somewhat based on the number of students in the class. Overall, there are four panel discussions over two days of class. I set up tables for the panel and the observers in the Learning Commons and invite teachers and administrators to watch the panel discussion. Each panel is composed of a student from each of the book groups; at times, there may need to be two students from the same book group. This ensures a variety of perspectives, since students are expected to address the prompt from the perspective of their book and its author and not their personal opinion. This is important in that I want them to demonstrate that they can take one of their author’s claims, support it with evidence, and connect it to a novel prompt with their reasoning. In addition, students are expected to respond to their peers on the panel, as this should be a discussion and not just a round robin of responses to a question. Students are allowed to have notes but are cautioned that use of these notes shouldn’t distract from the panel discussion. I collect their notes at the end of the panel as supplemental evidence of their understanding. Each student is assessed based on this rubric.

I prepare several prompts for the panel discussion, but one is almost always sufficient for a 15-20 minute panel. Here are the prompts that I’ve used in the past:

  • According to the author’s claims presented in your text, through the lens of education [could also be any of the other critical contexts], how do the topics presented in your text affect change?
  • According to the author’s claims presented in your text, how do the concepts presented in your text affect our perception of self-identity, control of one’s destiny, and self-value?
  • The topics presented in your texts are technologically advanced. According to the author’s claims presented in your text, how do they affect everyday people?
  • Many of the topics raised in your text are fairly depressing. According to the author’s claims presented in your text, what did you find in your text that was hopeful for the future?
  • According to the author’s claims presented in your text, to what extent would your author agree that technology is headed in a positive direction?

Here are a couple of my extra prompts in case a panel gets stalled:

  • The authors of your texts raised several concerns. How did they advise individuals to respond to these issues? What is their call to action?
  • According to the author’s claims presented in your text, to what extent would your author agree that the ethical questions raised in this text are subjective?

I’ve been really impressed with students on the panels the past three years. As I mentioned previously, a single prompt almost always lasts 15-20 minutes and is all that is needed for a panel. Students exceeded my expectations. They really took care of each other; if someone was struggling to make a connection, they would offer a bridge and give them an opportunity to speak. Clearly our Communication Arts teachers have helped these students develop these impressive skills.

For the upcoming school year and the fourth year of my Software Engineering class, I will leave most of this unit unchanged. The one change that I will make is to have students read one of two books and narrow the focus to the critical context of social justice. The district Learning Services department has generously agreed to purchase additional copies such that every student will read either Weapons of Math Destruction or Race After Technology: Abolitionist Tools for the New Jim Code by Ruha Benjamin. While I worry a bit that the more focused lens will limit the panel discussions, both of these books are so rich that I expect the panel discussions will be just as good as, if not better than, those in the past.

Project-Based Learning in New Software Engineering Course

A couple years ago, I wrote about a new course that would be offered in the 2017-2018 school year, Software Engineering. Last school year, the course was a great success, although there are many refinements I’m preparing for this upcoming school year. The second semester of the course is described as “small groups of students develop a software product as they iterate through software engineering development cycles to produce a software product”. Other teachers expressed the most interest in how to facilitate teams of students managing a semester-long project and how to assess them throughout the semester. Until last semester, I had never done this as a teacher. Fortunately, I remembered that I did this every day in my previous career as a software engineer.

One of my roles in my previous career was that of a methodologist with a specific focus on Agile Methodologies. After a review of the current state of Agile Methodologies, I decided that Scrum would be a good fit for my students. None of these students have any experience with Agile Software Development, and few have worked as part of a team on a substantial project. Scrum provided sufficient structure while maintaining the lightness appropriate for small teams. The activities in each sprint provide the opportunities for everything that is needed in project-based learning: planning, review, and reflection. I attempted to fulfill the role of Scrum Master. Each team had a product owner who was external to the class.

If you are interested in a succinct overview of Scrum, I highly recommend the book Scrum: A Breathtakingly Brief and Agile Introduction by Chris Sims and Hillary Louise Johnson. (If that leaves you wanting more, their book The Elements of Scrum is also excellent.)

I scheduled the teams into four staggered, four-week sprints. While all teams started and ended their projects with the start and end of the semester, the first and last sprints for each team varied in length so that I could facilitate the activities with each team during class.

The key activity in Scrum is the Daily Scrum or stand-up meeting. Teams would hold their Daily Scrum twice a week. Having a four-week sprint seems long, and holding a Daily Scrum twice a week seems infrequent, but these teams aren’t developing software as a full-time job; they are in class for 50 minutes a day. At the beginning of the semester, I would facilitate every Daily Scrum meeting in my role as Scrum Master. Soon students became adept and comfortable, and teams held their Daily Scrums while I observed from afar.

We focused on four activities in each sprint: Sprint Planning, Story Time, Sprint Review, and Retrospective. For each of these, the team would assign a student responsible for facilitating the activity, capturing the work, and sharing it with the team. With a team of four students and four sprints, each student would facilitate each of the four activities once over the course of the semester (see the sketch below). The facilitator was assessed based on their effectiveness in facilitating the activity, their demonstrated understanding of it, and the quality of the submitted document.
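As a concrete picture of that rotation (a hypothetical sketch, not a tool we used), shifting the facilitator assignment by one position each sprint gives every student each activity exactly once over the four sprints:

```java
// Hypothetical sketch: rotating facilitation of the four sprint activities
// among four team members so everyone facilitates each activity exactly once.
public class FacilitationRotation {
    public static void main(String[] args) {
        String[] members = {"Student A", "Student B", "Student C", "Student D"};
        String[] activities = {"Sprint Planning", "Story Time", "Sprint Review", "Retrospective"};
        for (int sprint = 0; sprint < 4; sprint++) {
            System.out.println("Sprint " + (sprint + 1) + ":");
            for (int a = 0; a < activities.length; a++) {
                // Shift by one each sprint: a simple Latin-square rotation.
                String facilitator = members[(a + sprint) % members.length];
                System.out.println("  " + activities[a] + " -> " + facilitator);
            }
        }
    }
}
```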

Repeated feedback is essential for both project-based learning and software development. Teams would receive feedback from their product owner during each sprint review (and more often as needed). Teams would receive feedback from me and each other during each retrospective.

In addition to these standard Scrum activities, I added an additional activity at the end of each sprint, a personal retrospective. Each student would evaluate themselves based on their role as a Scrum Team Member, reflect on the personal goal they set in the previous sprint, and set a new personal goal for the next sprint. Based on student feedback, for the upcoming school year, I will also have students evaluate their teammates.

At the end of the semester, we hold a Demo Event attended by all the students, their product owners, and other stakeholders. Each team presents their product in a modified version of the “3-in-5” presentation format. After the Demo Event, each student completes a final retrospective reflecting on the entire semester.

This structure worked well for my students and their projects last semester, and we will use the same structure next year with some refinements.

One final note – Scrum is used for all kinds of projects, not just software projects. After my experience last semester, I would encourage teachers to consider Scrum as a tool to facilitate any project-based learning of a significant duration.

Questions? Comments? Please comment here or reach out to me on Twitter.

Electronic Lab Portfolios Aligned to AP Physics Science Practices

*[Updated 15/7/2016, 10:54 PM: added links to two student lab portfolios.]*

As I mentioned briefly in [my reflection](https://pedagoguepadawan.net/435/ap-physics-2-reflection/) of the 2014-2015 school year, this past year, students created electronic lab portfolios for AP Physics 2. In summary:

* many students demonstrated deeper metacognition than I have ever observed
* several students struggled and their portfolios were incomplete
* providing feedback and scoring consumed a huge amount of my time
* structural changes made in the spring semester helped considerably

Structure
—-

I was inspired to have students create electronic lab portfolios based on [Chris Ludwig’s work](http://see.ludwig.lajuntaschools.org/?p=1197) and his presentation and our discussion at NSTA last year.

Before the start of the school year, using [siteMaestro](https://sites.google.com/a/newvisions.org/scripts_resources/add-ons/sitemaestro), I created a Google Site for each student based on [a template](https://sites.google.com/a/naperville203.org/nnhsapp2portfolio/) that I created. I made myself and the student both owners of the site and kept the site otherwise private. The template consisted of two key portions: a Lab Notebook, which provides a chronological accounting of all labs; and a Lab Portfolio, which is the best representation of the student’s performance. I [shared a document](https://docs.google.com/document/d/19aUFLSk93LIJUuKWwJh1IHHNDFUDt8U4QMtcfMImI1k/edit) with the students that explained the purpose of and distinction between the Lab Notebook and Lab Portfolio.

The lab portfolios were structured around the [seven AP Physics Science Practices.](https://docs.google.com/document/d/1bcIO-B8RT73DM99zMC7R53SstuWF-MjrPz3OhAUdWG0/edit) I wanted students to evaluate and choose their best work that demonstrated their performance of each Science Practice. I also wanted the most critical and significant labs to be included; so, [some labs were required](https://docs.google.com/document/d/1bcIO-B8RT73DM99zMC7R53SstuWF-MjrPz3OhAUdWG0/edit#bookmark=id.u7kv4o1dcpux) to be in the lab portfolio. In the fall semester, I required that each student publish at least two examples of their demonstration of each of the seven Science Practices.

I wanted students to think more deeply about the labs than they had in the past, and I didn’t want the lab portfolio to be just a collection of labs. So, in addition to the lab report needed to demonstrate a given Science Practice, students also had to write a paragraph in which they reflected on why that lab was an excellent demonstration of their performance of the specific Science Practice.

The lab portfolio comprised 40% of the coursework grade for each semester. For the fall semester, the lab portfolio was scored at the end of the semester. I provided a few formal checkpoints throughout the fall semester where students would submit their portfolio (just a link to their site) and I would provide feedback on their labs and paragraphs.

Fall Semester
—-

Many students wrote excellent paragraphs demonstrating a deeper understanding of Science Practices than anything I had previously read. Other students really struggled to distinguish between writing a lab report and writing a paragraph that provided evidence that they had performed a given Science Practice. I did [create an example](https://sites.google.com/a/naperville203.org/nnhsapp2portfolio/lab-portfolio/science-practice-4) of both a lab report and lab portfolio reflection paragraph based on the shared experiment in first-year physics of the Constant Velocity Buggy Paradigm Lab. However, several students needed much more support to write these reflection paragraphs.

In general, those students who submitted their site for feedback had excellent portfolios by the end of the semester; those who didn’t underestimated the effort required and ended up with incomplete or poor-quality portfolios.

What I liked:

* The metacognition and understanding of Science Practices demonstrated by many students.
* Students deciding in which labs they most strongly performed each Science Practice.

What I Didn’t Like:

* Several students struggled to distinguish a lab report from a paragraph providing evidence of performing a Science Practice.
* Several students didn’t have enough support to complete a project of this magnitude and ended up with incomplete lab portfolios.
* Providing feedback and scoring all of the lab portfolios over winter break consumed a huge amount of time.

Spring Semester
—-

The spring semester has some different challenges and constraints:

* We focus more on preparing for the AP exam and less on lab reports.
* I don’t have the luxury of a two-week break to score lab portfolios at the end of the semester.

Based on these constraints and our experience during the fall semester, I made some changes for the spring semester. I selected seven required labs in the spring semester, one for each Science Practice. Each lab and reflection paragraph was due a few days after performing the lab, not at the end of the semester.

This had some advantages:

* the portfolio was scored throughout the semester
* students had more structure, which helped them stay current

and disadvantages:

* no student choice in selection of labs to include in portfolio
* no opportunity to revise a lab or reflection paragraph (although the feedback could help them in labs later in the semester)

With these changes *and* students’ experience from the fall semester, the lab portfolios in the spring semester were largely successful. I think it is important to emphasize that both the changes *and* the students’ experience contributed to this success. I do not believe that the structure for the spring semester would lead to a more successful fall semester. The feedback I received from students at the end of the year was much more favorable concerning the structure in the spring semester than the structure in the fall semester.

Next Fall
—-

I had the wonderful experience of being coached this year by [Tony Borash](https://about.me/tborash). Tony provided guidance in many areas, one of which was making these adjustments for the spring semester and, more importantly, planning for next year. Together we were able to come up with a structure that will hopefully combine the strengths of the structure in the fall semester with the structure in the spring semester. My goals for these changes are to:

* provide more structure for students
* provide student choice
* incorporate peer feedback

Here’s the plan for next fall:

1. I choose the first lab. Students complete and submit the lab and the reflection paragraph. I provide feedback. Students make revisions and re-submit the lab and reflection paragraph. We review the best examples as a class.
2. I choose the second lab. Students complete the lab and the reflection paragraph. Students provide peer feedback to each other. Students make revisions and submit the lab and reflection paragraph.
3. Students choose the next lab to include in the portfolio. Students complete the lab and the reflection paragraph. Students provide peer feedback to each other. Students make revisions and submit the lab and reflection paragraph.
4. Students choose some of the remaining labs, and I choose some of the remaining labs. Students complete the labs and reflection paragraphs. Students specify a subset of Science Practices on which they want formal feedback from me and on which they want feedback from their peers. Students make revisions and re-submit.

This past year, students included a link to their lab report in their lab portfolio and shared the lab report (as a Google Doc) with me. Next year, I will have students embed their lab reports into the Google Site. This will facilitate peer feedback and enable everyone to provide feedback using comments within the Google Site. I may still have students share the actual doc with me, as well as include a link, so I can provide more detailed suggestions directly within the document.

Student Examples
—-

* [Nicole’s AP Physics 2 Lab Portfolio](https://sites.google.com/a/naperville203.org/nicoles-ap-physics-2-lab-portfolio-1-1/)
* [Vincent’s AP Physics 2 Lab Portfolio](https://sites.google.com/a/naperville203.org/vincents-ap-physics-2-lab-portfolio/)

Conclusion
—-

I’m pleased that my students and I are heading down this path and believe my students will gain a much deeper understanding of Science Practices as a result. While I shared this with my colleagues this past year, I also cautioned them that I didn’t have it figured out, and it wasn’t a smooth ride. I think electronic lab portfolios are an excellent way to assess student performance, and I hope that they will be used in other science courses in the future as they are a natural fit to the NGSS Science and Engineering Practices. I hope that after this next year, I will have something that will provide my colleagues with a stronger framework to adapt to their classes.

My AP Computer Science 2016 Solutions

I typed up solutions to the 2016 AP Computer Science free response questions and shared them with my students; I thought that others may be interested as well. The zip file includes a BlueJ project file and test code to verify solutions. As I tell my students, there’s no guarantee that I wrote perfect solutions, and there are multiple ways to answer these questions.

AP Physics B Assessments

As I’ve [mentioned](https://pedagoguepadawan.net/193/preparing-for-new-ap-physics-b-course/), I’m spending some time this summer preparing for the AP Physics B course that we will be teaching for the first time this fall. I recently finished creating the assessments for this course.

With one exception (fluids multiple choice), all questions are from previous AP Physics B exams. Thanks to the handy indexes available from [Secure PGP](https://secure-pgp.wikispaces.com/), it was relatively easy to review relevant questions and problems and choose those I wanted.

While compiling the assessments, I refined the granularity of the units a bit.

Fall Semester

* Special Relativity
* Kinematics
* Statics and Dynamics
* Fluid Mechanics
* Work, Energy, Power
* Thermodynamics
* Linear Momentum
* Oscillations, Gravity, Waves
* Capstone Project

Spring Semester

* Electrostatics
* Electric Circuits
* Magnetic Fields and Electromagnetism
* Geometric Optics
* Physical Optics
* Particle Physics
* Atomic Physics and Quantum Effects
* Nuclear Physics
* Cosmology

For each unit, I compiled a quiz that contains representative free response problems to be used as a formative assessment. I then created an end-of-unit exam consisting of multiple choice and free response questions. The exam is intended to be completed in a 50-minute class period or less. To support the flavor of standards-based grading that I’m using in this class, I also created a reassessment consisting of multiple choice and free response questions. Scoring rubrics for all free response questions have also been compiled for each assessment.

I [uploaded](https://secure-pgp.wikispaces.com/PDF+and+Word+Problem+banks) the assessments as an archive for each semester to Secure PGP. I included the original Pages documents as well as versions exported as PDFs and Word files. I hope that some of you find these helpful. Please let me know if you find any mistakes.

Critical Thinking Assessment

Those of us teaching physics have made a lot of changes this year. One major change is a focus on depth of understanding and critical thinking, which results in fewer topics covered. While I have qualitative evidence through formative assessments that students this year have developed stronger critical thinking and long-chains-of-reasoning skills, I’ve been bothered that I don’t have a summative assessment to measure this. Ideally, I would like a Force Concept Inventory pre-test/post-test equivalent for critical thinking. I’ve bookmarked the [College and Work Readiness Assessment (CWRA)](http://www.cae.org/content/pro_collegework.htm), but that isn’t an assessment that I can administer to my own class. If you know of another, please let me know!

Due to our crazy calendar and snow days, seniors graduated two weeks ago and I’ve had relatively few students in my regular physics classes since then. We’ve been investigating color, polarization, mirrors, and lenses. Since these students had already completed their final with the seniors, I decided to use the scheduled final exam time this week to try a critical thinking assessment. I wanted them to read a passage that describes a physics phenomenon with which they were unfamiliar, make several observations of a somewhat related physics phenomenon that they had never seen, and propose and defend an explanation for this observed phenomenon based on their prior knowledge. They read about diffraction, observed various wavelengths of light passing through various double slits, and tried to formulate an explanation. We had previously learned about interference of waves (slinkies and beats), but not in the context of light. This is quite a series of inferences and connections for students to make during a final exam; so, I prepared a set of guiding questions to help them make the connections. When a student said they were stuck or were off-track, I gave them one of the five guiding questions. Some students needed all five; one, amazingly, didn’t need any.

Here is my reading passage, observation procedure, and guiding questions:

Download (PDF, 231KB)

The students did quite well connecting what they read about diffraction to what they observed to what they already knew about interference. Here are some of my favorite student comments.

The black parts are shadows, I think we see them because the light is being destructed?

In order to have constructive interference, one ray must travel one wavelength further than the first.

The blue filter causes the lines to be closer than that of the red filer because blue has a shorter wavelength and when it travels to the plate, it forms more concentric circles. Thus, there are more intersection of circles and more lines formed.

and my favorite (written without any guiding questions):

… It’s like light beats.

I also had some really creative explanations of the interference pattern. Students mentioned internal reflection in our eyes as well as lens effects due to the slits.
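For reference (this wasn’t part of the student handout), the standard double-slit relations behind the correct explanations are d sin θ = mλ for the bright fringes (the path difference is a whole number of wavelengths) and, for small angles, a fringe spacing of Δy ≈ λL/d, where d is the slit separation and L is the distance to the screen. A shorter wavelength (blue) therefore produces more closely spaced fringes than a longer wavelength (red), which is exactly what the students observed.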

Students commented that this was unlike any final exam they had previously taken. In fact, several students in one class didn’t want to leave until they were satisfied they had a complete explanation. It certainly seemed more worthwhile than giving students a list of equations and a set of problems with numbers for them to plug in on their calculators. I think there is a kernel of a good idea here, but I need to develop it more. In my largest class, it was hard to manage since I had to interact with each student during the assessment, read their explanations, and give them the appropriate guiding questions. Sometimes this required me to ask my own clarifying questions, and the ensuing discussion could be overheard by other students. If you have tried anything like this, please share your experience!