
Contextual Feedback Using GitHub Pull Requests

After reading @dondi’s workflow for using pull requests to provide feedback to students, I wanted to try it this semester. I wasn’t exactly sure what steps were involved, but I found a workflow that worked for me and wanted to share it. I decided that a screencast would illustrate the steps more easily than typing them all out.

In general, the key is to edit one of the student’s files (the edit is simply to provide an opportunity to comment in the pull request) so a branch and pull request can be created. At this point, comments can be left where each change was made.
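
For anyone who prefers reading to watching, here is a rough sketch of the commands involved; the repository, branch, and file names are hypothetical, and the screencast covers the details:

    # Clone the student's repository and create a feedback branch
    git clone https://github.com/our-class/student-assignment.git
    cd student-assignment
    git checkout -b feedback

    # Make a trivial edit to one of the student's files so that a
    # pull request can be opened (the edit itself doesn't matter)
    echo "Feedback branch" >> README.md
    git add README.md
    git commit -m "Open feedback pull request"

    # Push the branch, then open a pull request on github.com
    # comparing feedback against the student's default branch
    git push -u origin feedback

Once the pull request exists, comments can be attached to individual lines of the diff, which is what makes the feedback contextual.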

In the past, I’ve provided feedback as comments via Canvas’ SpeedGrader. This new approach is much better in that the code on which I’m providing feedback sits adjacent to the specific comment. Once students see that the assignment is marked complete in Canvas, they check the “feedback” pull request to review my comments. If they have questions, or answers to questions that I’ve asked, they can continue the conversation in the pull request. While this isn’t a traditional use for pull requests, it works well, and it’s good for students to gain experience participating in pull request conversations.

Please comment if you have any suggestions to improve this workflow or if you have any questions!

Polar Bears Dice Activity in AP Computer Science

A few years ago, the computers weren’t ready for AP Computer Science on the first day of school. So, on the first day of class, I resurrected an opening-day activity I had used in Physics: the Polar Bears around an Ice Hole Puzzle. There are a lot of similarities between my physics and computer science classes; so, I thought the activity would be a good fit.

I was surprised to find that the activity was not only a good fit, but even more successful in the context of computer science than it had been in physics. I’ve done it every year since, even when the computers are working on the first day.

Our discussion after everyone solved the puzzle focused on how this activity was an analogy for how our class would solve computer science challenges throughout the year:

  • You may feel frustrated as you try to figure out computer science concepts and challenges. That’s okay.
  • Computer science concepts and challenges are hard to understand until you know the “rules of the game.”
  • But, once you discover the rules, computer science concepts and challenges often seem easy, and you may be surprised that others don’t understand.
  • However, remember that you didn’t always understand.
  • When you discover the rules and understand without someone just telling you the “answer”, you are excited.
  • The journey to understanding is very important. So, no one is going to tell you the answer, but we’re all here to support each other on our journeys.
  • Being told the “answer” at most gives you one answer that you didn’t know.
  • Learning to think critically and arrive at the answer with support develops a skill that you will use to find many answers.
  • Students who solve the challenge should offer suggestions that reframe the game and help others solve the puzzle without directly telling them the answer.

The reason this activity worked even better in the context of computer science is that it also serves as an excellent example of best practices for algorithm design and debugging. As we continued to try to solve the puzzle, students introduced several important ideas (see the sketch after this list):

  • if you just keep guessing, you may never solve the puzzle
  • reduce the scope of the problem (roll three dice instead of six)
  • test special cases (make all dice show five)
  • change a variable and see the effect (change one die from a five to a one)
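
These strategies translate directly into code. As a flavor of what that experimentation might look like, here is a minimal Python sketch; the roll helper is my own invention, and it deliberately does not encode the puzzle’s secret rule:

    import random

    def roll(n=6):
        """Roll n six-sided dice and return the faces shown."""
        return [random.randint(1, 6) for _ in range(n)]

    # Reduce the scope of the problem: roll three dice instead of six.
    print(roll(3))

    # Test a special case: make all dice show five.
    print([5] * 6)

    # Change a variable and see the effect: one die from a five to a one.
    print([1] + [5] * 5)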

I still end the activity by “assigning” grades based on how quickly a student solved the puzzle as described in the original post.

It was a great first day, and we didn’t spend any time on the computers!

Electronic Lab Portfolios Aligned to AP Physics Science Practices

[Updated 15/7/2016, 10:54 PM: added links to two student lab portfolios.]

As I mentioned briefly in my reflection of the 2014-2015 school year, this past year, students created electronic lab portfolios for AP Physics 2. In summary:

  • many students demonstrated deeper metacognition than I have ever observed
  • several students struggled and their portfolios were incomplete
  • providing feedback and scoring consumed a huge amount of my time
  • structural changes made in the spring semester helped considerably

Structure

I was inspired to have students create electronic lab portfolios based on Chris Ludwig’s work and his presentation and our discussion at NSTA last year.

Before the start of the school year, using siteMaestro, I created a Google Site for each student based on a template that I created. I made the student and myself owners of the site and kept the site otherwise private. The template consisted of two key portions: a Lab Notebook, which provides a chronological accounting of all labs; and a Lab Portfolio, which is the best representation of the student’s performance. I shared a document with the students that explained the purpose of, and distinction between, the Lab Notebook and the Lab Portfolio.

The lab portfolios were structured around the seven AP Physics Science Practices. I wanted students to evaluate and choose their best work demonstrating their performance of each Science Practice. I also wanted the most critical and significant labs to be included; so, some labs were required to be in the lab portfolio. In the fall semester, I required that each student publish at least two examples demonstrating each of the seven Science Practices.

I wanted students to think more deeply about the labs than they had in the past, and I didn’t want the lab portfolio to be just a collection of labs. So, in addition to the lab report needed to demonstrate a given Science Practice, students also had to write a paragraph in which they reflected on why that lab was an excellent demonstration of their performance of the specific Science Practice.

The lab portfolio comprised 40% of the coursework grade for each semester. For the fall semester, the lab portfolio was scored at the end of the semester. I provided a few formal checkpoints throughout the fall semester where students would submit their portfolio (just a link to their site) and I would provide feedback on their labs and paragraphs.

Fall Semester

Many students wrote excellent paragraphs demonstrating a deeper understanding of Science Practices than anything I had previously read. Other students really struggled to distinguish between writing a lab report and writing a paragraph that provided evidence that they had performed a given Science Practice. I did create an example of both a lab report and a lab portfolio reflection paragraph based on the Constant Velocity Buggy Paradigm Lab, an experiment shared across our first-year physics courses. However, several students needed much more support to write these reflection paragraphs.

In general, those students who submitted their site for feedback had excellent portfolios by the end of the semester; those who didn’t underestimated the effort required and ended up with incomplete or poor-quality portfolios.

What I liked:

  • The metacognition and understanding of Science Practices demonstrated by many students.
  • Students deciding in which labs they most strongly performed each Science Practice.

What I Didn’t Like:

  • Several students struggled to distinguish a lab report from a paragraph providing evidence of performing a Science Practice.
  • Several students didn’t have enough support to complete a project of this magnitude and ended up with incomplete lab portfolios.
  • Providing feedback and scoring all of the lab portfolios over winter break consumed a huge amount of time.

Spring Semester

The spring semester had some different challenges and constraints:

  • We focus more on preparing for the AP exam and less on lab reports.
  • I don’t have the luxury of a two-week break to score lab portfolios at the end of the semester.

Based on these constraints and our experience during the fall semester, I made some changes for the spring semester. I selected seven required labs in the spring semester, one for each Science Practice. Each lab and reflection paragraph was due a few days after performing the lab, not at the end of the semester.

This had some advantages:

  • the portfolio was scored throughout the semester
  • students had more structure, which helped them stay current

and disadvantages:

  • no student choice in selection of labs to include in portfolio
  • no opportunity to revise a lab or reflection paragraph (although the feedback could help them in labs later in the semester)

With these changes and students’ experience from the fall semester, the lab portfolios in the spring semester were largely successful. I think it is important to emphasize that both the changes and the students’ experience contributed to this success. I do not believe that the structure for the spring semester would lead to a more successful fall semester. The feedback I received from students at the end of the year was much more favorable concerning the structure in the spring semester than the structure in the fall semester.

Next Fall

I had the wonderful experience of being coached this year by Tony Borash. Tony provided guidance in many areas, one of which was making these adjustments for the spring semester and, more importantly, planning for next year. Together we were able to come up with a structure that will hopefully combine the strengths of the structure in the fall semester with the structure in the spring semester. My goals for these changes are to:

  • provide more structure for students
  • provide student choice
  • incorporate peer feedback

Here’s the plan for next fall:

  1. I choose the first lab. Students complete and submit the lab and the reflection paragraph. I provide feedback. Students make revisions and re-submit the lab and reflection paragraph. We review the best examples as a class.
  2. I choose the second lab. Students complete the lab and the reflection paragraph. Students provide peer feedback to each other. Students make revisions and submit the lab and reflection paragraph.
  3. Students choose the next lab to include in the portfolio. Students complete the lab and the reflection paragraph. Students provide peer feedback to each other. Students make revisions and submit the lab and reflection paragraph.
  4. Students choose some of the remaining labs, and I choose some of the remaining labs. Students complete the labs and reflection paragraphs. Students specify a subset of Science Practices on which they want formal feedback from me and on which they want feedback from their peers. Students make revisions and re-submit.

This past year, students included a link to their lab report in their lab portfolio and shared the lab report (as a Google Doc) with me. Next year, I will have students embed their lab report into the Google site. This will facilitate peer feedback and enable everyone to use comments within the Google site to provide feedback. I may still have students share the actual doc with me, as well as include a link, so I can provide more detailed suggestions directly within the document.

Student Examples

Conclusion

I’m pleased that my students and I are heading down this path and believe my students will gain a much deeper understanding of Science Practices as a result. While I shared this with my colleagues this past year, I also cautioned them that I didn’t have it figured out, and it wasn’t a smooth ride. I think electronic lab portfolios are an excellent way to assess student performance, and I hope that they will be used in other science courses in the future as they are a natural fit to the NGSS Science and Engineering Practices. I hope that after this next year, I will have something that will provide my colleagues with a stronger framework to adapt to their classes.

AP Physics 2 Reflection

On the eve of the first day of school, I felt that I had better capture my thoughts on last year’s AP Physics 2. My perspective may be different from others’ (at least different from the vocal minority(?) on the AP Teacher Community).

I started last year eagerly anticipating the new AP Physics 2 course. For the past seven years, I had taught some type of second-year physics course. For most of that time, I taught what we called Advanced Physics, a one-semester course after which some of my students would take the AP Physics B exam. For a couple of years, I taught an official, year-long AP Physics B course. I felt that the AP Physics B course had too much content to cover well, even as a second-year course. This was compounded by the mismatch between the groups of students that enrolled in the course. About a third of the students had previously taken our General Physics course, and two-thirds, Honors Physics. The Honors Physics students had studied additional units not part of the General Physics course. As a result, for some “review” units in AP Physics B, the pace was much too fast for those from General Physics and much too slow for those from Honors Physics.

The new AP Physics 2 course contained less content. In addition, the emphasis shifted towards deeper conceptual understanding of physics rather than numeric or algebraic problem solving. As a result of these changes, I felt that I could at last integrate much more of Modeling Instruction into a second-year physics course. I wasn’t too concerned about the shift towards deeper conceptual understanding since I had been moving my course in that direction for the past couple of years based on student performance on the AP Physics B exam. My students had done extremely well on the free response portion of the AP Physics B exam; therefore, I had adjusted class to focus more on conceptual understanding since the greatest area for growth was on the multiple choice portion of the exam. During the summer of 2014, I attended an AP Summer Institute to learn more about the new course. As a result of all of this, I started last year much more excited than anxious.

Reflecting on AP Physics 2 last year, it was my favorite year of teaching a second-year physics course. That said, while many aspects of the course worked well, there are definite areas for me to improve this year.

What Worked

Peer instruction was very effective at developing students’ conceptual understanding. Of all the various types of activities done in class, students ranked peer instruction as the most helpful (over 75% of students agreed with the statement “Participating in peer instruction of conceptual questions helped me understand the material.” on the end-of-year survey). The manner in which I conduct peer instruction is strongly influenced by the research of Stephanie Chasteen, who writes at sciencegeekgirl. The questions I use are a combination of Paul Hewitt’s Next-Time Questions, the end-of-chapter conceptual questions in Knight’s College Physics text, and those in clicker question banks from CU Boulder and OSU.

The number and variety of lab activities also worked well. Some labs were informal stations, some typical Modeling Instruction paradigm labs, some lab practicums. With less content, we had time for more, and deeper, labs. Some of the labs and the skills involved went beyond what the AP Physics 2 curriculum requires, but some of these were the students’ favorites. We will continue to explore computational modeling, build more advanced circuits on breadboards, and explore particle physics.

What Didn’t Work

Building my standards, and grading, on the Enduring Understandings defined for each Big Idea did not work well. While my goal was for students to see the connections between the various content areas and appreciate the Big Ideas, students shared that organizing the standards and grades in this manner didn’t help accomplish this. It did, however, result in a lot of extra work for me. After the fall semester, I mostly abandoned this approach. Below, I’ll explain my approach for this year.

Whiteboarding homework problems did not work well. My approach was for six groups of students to prepare and present whiteboards based on assigned homework problems. This didn’t work well because too few students had done the homework problems in advance of whiteboarding. As a result, most of each group would watch those who had done the problems prepare the whiteboards and didn’t really understand the solution. This issue was compounded when whiteboards were presented: too few students had struggled with the problem in advance for a good discussion to result. This wasn’t the case every time, but it was much too often.

What I’m Trying This Year

My attempts to prepare students for the free response portion of the AP Physics 2 exam fell somewhere between working and not working. I overestimated students’ ability to write clear, concise, and correct free responses and, as a result, didn’t dedicate sufficient time to practicing this skill. What did work well was using Socrative to share student responses and have students peer-critique them. We will do this much more this year.

While my attempts to reinforce the Big Ideas by structuring standards and scores around Enduring Understandings didn’t work, emphasizing the AP Science Practices did work well. Inspired by Chris Ludwig’s work with portfolios and our discussion at NSTA earlier this year, my students will create a lab notebook and portfolio on their own Google Site. The notebook will capture all the labs and the portfolio will be a curated collection of labs that demonstrate their performance of the various AP Science Practices. I hope to share the details of this soon.

To improve the value of whiteboarding, I’m making several changes. Instead of six groups preparing and presenting six problems, groups will prepare and present only two problems. Each problem will be prepared by three groups. The problem won’t be assigned as homework. Rather, we will spend more class time as each group works together to solve the problem. A randomly selected member of each group will be responsible for presenting the whiteboard, and the class will focus on comparing and contrasting solutions between the various groups in addition to the solution itself.

Scores

The average AP Physics 2 scores were about a point lower than the previous year’s AP Physics B scores (3.344 vs. 4.484). However, as I considered the standards and expectations for AP Physics 2 compared to AP Physics B and carefully considered each of my students, their scores were what I expected, except for a few.

Summary

I’m thrilled with the new AP Physics 2 class and excited about teaching this course for the second time. All that I miss from AP Physics B is the huge collection of exam questions from which I could build my own assessments. My one wish is that the College Board release additional questions, as questions in the style of the new exam are very difficult to create. I hope that the changes I have planned for this year help students develop an even stronger and deeper understanding of physics and proficiency in science practices than last year’s students did. If you are interested in more detail about my approach last year, my 180 blog focused solely on AP Physics 2.

Fluids Paradigm Lab

For my first five years of teaching, I taught a one-semester Advanced Physics class that culminated in the AP Physics B exam. For the past two years, I taught an official AP Physics B course. Both of these courses were packed with content. Despite being a proponent of Modeling Instruction and incorporating it into other courses, I never felt I could make it fit in these courses.

This year, I’m teaching the new AP Physics 2 course. The focus on inquiry, deep understanding of physics, and science practices (and less content) aligns wonderfully with Modeling Instruction.

We just started the first major unit, fluids. I guided my students through a paradigm lab to model pressure vs. depth in a fluid. We started by watching this video of a can being crushed as it descends in a lake. I was worried students would find the demonstrated phenomenon too simple, but that definitely wasn’t the case. Like any paradigm lab, we started by making observations:

  • the can gets crushed
  • the can gets crushed more as it gets deeper
  • the top of the can appears to be sealed
  • the can must be empty (a student commented that, if it were full, it wouldn’t be crushed)

Students then enumerated variables that may be related to the crushing of the can:

  • water pressure
  • volume of water above the can
  • strength of can
  • air pressure inside of can
  • gravitational field strength (student said “gravity” and I went on a tangent about fields…)
  • temperature of water
  • atmospheric pressure
  • type (density) of fluid
  • water depth
  • speed of descent
  • dimensions, surface area, shape of can
  • motion of water

Students readily agreed that it was the water pressure that crushed the can and that pressure was the dependent variable. In hindsight, I could have better focused the discussion by directing students to consider the water pressure rather than the can itself. They had a lot of good ideas, which I didn’t expect, about what properties of the can would affect its being crushed. I had to admit that I didn’t have any cans and we would have to focus on the fluid instead… I was amazed that no one in my first class proposed that the depth of the fluid would play a role. Everyone in that class phrased it as the volume of the fluid in the container above the can being a variable to measure. This was fascinating to me and led to a surprising result for the students as the experiment was conducted. I think this illustrates the power of the modeling cycle and guided inquiry labs.

We next determined which of the above variables we could control (independent variables) and measure in the lab given the resources available at the moment:

  • volume of water above the can
  • type (density) of fluid
  • water depth
  • speed of descent

The materials we planned on using were Vernier LabQuest 2 interfaces, pressure sensors with glass tube attachments, three different-sized beakers (for the volume variable), graduated cylinders, and fluids (water, canola oil, saturated salt water).

We then defined the purpose of our experiment:

To graphically and mathematically model the relationship between (TGAMMTRB) pressure, volume of fluid above, depth below surface of fluid, descent rate, and type of fluid (density).

We divided these various experiments among the lab groups, and groups started designing their particular experiment.

At the start of class the next day, groups shared their results. I was particularly impressed with the groups investigating pressure vs. volume of fluid above a point. While they measured a relationship between pressure and volume, their experimental design was sufficiently robust that they also noticed that the same volume above the measurement point resulted in different pressures in different beakers! That is, the pressure with 400 mL of water above the sensor in the 600 mL beaker is different than in the 1000 mL beaker and different again from that in the 2000 mL beaker. After further investigation they concluded that the relationship was based on depth, not volume.

The groups investigating pressure vs. depth in fluid were confident that the pressure at a point depended on the depth below the surface of the fluid, and they had sufficient data that they were also confident that there was a linear relationship between pressure and depth.

The groups that investigated pressure vs. fluid density at constant depth/volume had inconclusive results. The pressure they measured varied by less than 1% between the three types of fluids. This provided an opportunity to discuss how experimental technique can affect the uncertainty of a measurement. We discussed that, with the new understanding of the relationship between pressure and depth, these groups could gather several measurements at various depths in each of the three fluids and compare the slopes of the resulting graphs to see if density has an effect. While we were discussing measurement uncertainty, we also discussed how the depth is defined not by the position of the bottom of the glass tube, but by the water level within the glass tube. I learned of this important experimental technique in the article “Pressure Beneath the Surface of a Fluid: Measuring the Correct Depth” in The Physics Teacher. While the groups investigating the effect of fluid density on pressure applied their new experimental technique, the rest of the groups repeated gathering pressure vs. depth data while carefully examining the fluid level in the glass tube.

After a second day of measurements, students confirmed the linear relationship between pressure and depth. In addition, with the improved experimental design, students confirmed a relationship between pressure and fluid density. The results were not as accurate as I had expected. We identified a couple of additional sources of error that may have contributed. For one, a couple of groups lost the seal between the glass tube and the plastic tube connected to the pressure sensor while the glass tube was in the fluid. This results in fluid filling the glass tube, and subsequent measurements are incorrect if the glass tube is reconnected without removing it from the fluid.

I asked my TA to minimize the known sources of measurement uncertainty, perform the experiment, and determine how accurately pressure vs. depth could be measured. The slope of his pressure vs. depth graph was within 3.16% of the expected value. This is quite a reasonable result. If we used a taller graduated cylinder, I expect the error could be reduced further.
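
As a rough illustration of this analysis, here is a minimal Python sketch that fits pressure vs. depth data and computes the error relative to the expected slope. For a static fluid, P = P0 + ρgh, so the slope of a pressure vs. depth graph should be ρg. The data values below are invented for illustration; they are not my TA’s measurements:

    import numpy as np

    # Invented pressure readings (kPa) at several depths (m) in water
    depth = np.array([0.05, 0.10, 0.15, 0.20, 0.25])
    pressure = np.array([101.8, 102.3, 102.8, 103.2, 103.7])

    # Linear fit; the slope should approximate rho * g
    slope, intercept = np.polyfit(depth, pressure, 1)  # kPa/m

    expected = 1000 * 9.8 / 1000  # rho * g for water, in kPa/m
    percent_error = abs(slope - expected) / expected * 100
    print(f"slope = {slope:.2f} kPa/m, error = {percent_error:.1f}%")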

I’ll definitely do this paradigm lab again next year!

Chromebook Toolchain for AP Physics

This fall, my AP Physics 2 classes will be using Chromebooks as part of my school district’s 1:1 pilot. Chromebooks were new to me; so, it took some time this summer to find the apps to support the workflow I want for this class. While I’m sure the toolchain will change throughout the semester, and there will be surprises (both pleasant and otherwise), here is the starting toolchain:

  • Canvas. Everything starts and ends with this learning-management system.

We will do a lot of lab activities. The workflow depends on the amount of data acquired and the level of graphical analysis required. The start of the workflow is the same:

  • LabQuest 2. Vernier’s LabQuest 2 can create its own ad-hoc network or connect to the school’s wireless network. The LabQuest 2 hosts its own web page as part of Vernier’s Connected Science System. Students can then access the device, the data, and graphs via Chrome. Data and graphs can be exported to the Chromebook via the web page.

The next tool depends upon the lab. For some labs, the data and graphs produced on the LabQuest 2 are sufficient. Students will import these into their Google Document and create whatever is required for their lab report. If additional analysis is required and the data sets are relatively small:

  • Desmos. Graphs can be shared via a link, and an image can be embedded in the Google document.

If data sets are large or more sophisticated analysis is required:

  • Plot.ly. Plot.ly seemed to explode onto the education scene this summer, or maybe I was just paying more attention. Data exported from the LabQuest 2 can easily be imported into Plot.ly. Like Desmos, graphs can be shared via a link and an image can be embedded in the Google document. Plot.ly can also embed its graphs in an iframe, but I couldn’t find a way to embed that in a Google document as opposed to a web page. Fran Poodry from Vernier made a great screencast demonstrating the integration of the LabQuest 2 and Plot.ly. A rough sketch of the import-and-graph workflow is below.
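
Here is a minimal sketch of that workflow using Plotly’s Python library; the file and column names are hypothetical, and the library’s API has evolved since this post was written:

    import pandas as pd
    import plotly.graph_objects as go

    # Load the CSV exported from the LabQuest 2 (hypothetical names)
    data = pd.read_csv("pressure_vs_depth.csv")

    # Build a scatter plot of the measurements
    fig = go.Figure(go.Scatter(x=data["Depth (m)"],
                               y=data["Pressure (kPa)"],
                               mode="markers", name="measured"))
    fig.update_layout(title="Pressure vs. Depth",
                      xaxis_title="Depth (m)",
                      yaxis_title="Pressure (kPa)")

    # Write a shareable page; sharing via a Plot.ly link works similarly
    fig.write_html("pressure_vs_depth.html")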

Regardless of the analysis performed, in the end, students create their lab report in Google docs and submit it via Canvas.

Another important aspect of class is the exploration and modification of computational models. In the past, we’ve used VPython. I had to find an alternative that would be compatible with Chromebooks:

  • GlowScript. GlowScript is the up-and-coming platform for computational models, with the advantage that it runs in any browser that supports WebGL. I’m not a huge fan of JavaScript syntax for novice programmers; so, we will be using CoffeeScript instead. I didn’t write as many starting models over the summer as I had hoped, but I did at least verify that complicated models can be ported. A sketch of the kind of model we port is below.
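
To give a sense of what these models look like, here is a minimal projectile sketch in the VPython syntax we used previously; a GlowScript CoffeeScript version follows the same logic, and the numbers are arbitrary:

    from vpython import sphere, vector, color, rate

    # A ball launched with constant gravitational acceleration
    ball = sphere(pos=vector(0, 0, 0), radius=0.1,
                  color=color.red, make_trail=True)
    velocity = vector(3, 4, 0)  # m/s
    g = vector(0, -9.8, 0)      # m/s^2
    dt = 0.001                  # s

    # Euler integration until the ball returns to launch height
    while ball.pos.y >= 0:
        rate(1000)                           # pace the animation
        velocity = velocity + g * dt         # update velocity
        ball.pos = ball.pos + velocity * dt  # update position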

Peer instruction is one of the most effective and popular classroom activities that we do. In the past, I’ve used handheld clickers. This year, we will use the Chromebooks:

  • InfuseLearning. There are a number of web apps in this space, but I selected InfuseLearning because it allows the creation of spontaneous questions and supports a variety of answer methods, including drawing and sort-in-order. Pear Deck looks promising, but I don’t want to be forced to create my set of questions ahead of time.

For notes in class, I’ll leave it up to students to use whatever tool works best for them (including paper and pencil). I’ll suggest they at least take a look at:

  • Evernote. I love Evernote and use it all the time for all sorts of stuff.

I do provide students with PDFs of my slides. I can envision that students may want to annotate these PDFs or other handouts. Surprisingly, this was the hardest tool to find:

  • Crocodoc. The free personal version allows students to upload a PDF, annotate it, and export their annotated version. Another tool I explored is Notable PDF, which requires paid licenses to be useful; we may try it if we find Crocodoc lacking.

A couple of other tools look interesting, but I’m not sure whether they fit into the toolchain for my class:

  • Doctopus. I think Canvas assignments and SpeedGrader cover everything that I personally would do with this app.

  • 81Dash. Private back-channeling app.

I’m sure I will learn of new tools throughout the semester and I’ll make adjustments to the toolchain. If you are using Chromebooks, please share your favorite apps below in the comments!

AP Physics 2 Syllabus, Units, Labs, and Pacing

I previously shared how I will be using the AP Physics 2 Big Ideas and Enduring Understandings as the standards for my flavor of standards-based assessment and reporting for AP Physics 2. Since then, I’ve been working on outlining my sequence of units, pacing, and labs. This allowed me to finish the syllabus to submit for the College Board Audit. I based my syllabus heavily on Dolores Gende’s syllabus. My syllabus is 1252560v1, in case anyone finds it helpful in preparing theirs.

The syllabus that I share with students and parents provides all of the specifics on the structure of the course.

My sequence of units and pacing is based on a fall semester of 15 weeks and 2 days (plus finals) and a spring semester of 13 weeks to April 22nd (at which point we start reviewing for the exam). We will be using College Physics, 3rd Edition, by Knight, Jones, and Field. My pacing reflects our first-year physics courses, which cover more of electrostatics and circuits than the minimum required by AP Physics 1.

Please share any feedback or questions that you have!

Fall Semester

Unit 1: Relativity and Computational Modeling

  • time: 1 week
  • Knight: Chapter 27
  • computational model:
    • frames of reference

Unit 2: Fluid Mechanics

  • time: 3 weeks
  • Knight: Chapter 13 (sections 13.1-13.6)
  • computational models:
    • buoyancy
    • Torricelli projectile
  • labs:
    • pressure beneath the surface
    • hydrometer
    • Archimedes
    • Bernoulli/Venturi
    • water projectile

Unit 3: Thermodynamics

  • time: 4 weeks
  • Knight: Chapters 10.5; 12.1-12.4, 12.8; 11.1, 11.3-11.8
  • computational models:
    • kinetic theory
    • heat transfer between liquids of different temperatures (thermal equilibrium)
    • entropy
  • labs:
    • heat engine
    • heat transfer
    • temperature and kinetic theory
    • entropy activity

Unit 4: Electrostatics

  • time: 4 weeks
  • Knight: Chapters 20, 21
  • computational model:
    • electric field/potential maps (3D?)
  • labs:
    • Millikan Movies
    • electric potential mapping
    • dielectric constant and parallel plate capacitor lab
    • simulations (field hockey, fields and potentials)

Unit 5: Electric Circuits

  • time: 2.6 weeks
  • Knight: Chapters 22, 23 (23.1-23.7)
  • labs:
    • conductivity/resistivity lab
    • experimenting with constant current and voltage sources
    • RC circuits

Capstone

  • time: 4 days

Spring Semester

Unit 6: Magnetostatics and Electromagnetism

  • time: 4 weeks
  • Knight: Chapters 24, 25
  • computational model:
    • charged particle in an external magnetic field
  • labs:
    • magnetism activities
    • mass of the electron
    • measurement of a magnetic field
    • electromagnetic induction activities
    • Faraday’s Law
    • electric motors
    • determine number of loops in solenoid
    • Lenz’s Law Demonstration Using an Ultrasound Position Sensor

Unit 7: Geometric and Physical Optics

  • time: 4 weeks
  • Knight: Chapters 17, 18
  • labs:
    • reflection activities
    • mirrors lab
    • refraction activities
    • refraction/total internal reflection lab
    • lenses activity
    • lenses lab
    • diffraction and interference
    • thin film interference lab
    • interferometer thermal expansion
    • holograms
    • Determining the Thickness and Refractive Index of a Mirror

Unit 8: Quantum, Atomic, Nuclear Physics

  • time: 4 weeks
  • Knight: Chapters 28, 29, 30
  • computational model:
    • half life
  • labs:
    • hydrogen spectrum
    • photoelectric effect
    • half life
    • stochastic nature of radiation
    • LED lab for Planck’s constant

Review

  • time: (4 days for final exam) + 6.5 days for analysis and review
  • April 23-24, 27-28: final exam

Unit 9: Particle Physics and Cosmology

  • time: 2 weeks