
Standards for AP Physics 2

I floated this idea on Twitter a couple of weeks ago and have decided to give it a try. Historically, I’ve grouped my assessment standards into unit-centric categories. In an attempt to emphasize the big ideas and science practices more strongly, I’m going to group standards by the Big Ideas defined by the College Board for AP Physics 2. My assessment standards are the Enduring Understandings defined for each Big Idea. The Essential Knowledge items and Learning Objectives are too fine-grained for my style of standards-based assessment and reporting, especially for an AP class where I want students to focus on combining multiple concepts.

There will be multiple assessments (labs and exam questions) for each standard. A given assessment will focus on a subset of learning objectives for that standard. As a result, there will be multiple scores for each standard in the grade book. I hope this will give students more insight into their strengths and areas for improvement as they progress throughout the course. I’ll still have reassessments.

The weights for each Big Idea category will not be the same, but I’m going to do more planning before assigning them. I also need to see how these standards are split between the fall and spring semesters.

If you think I’m courting disaster with this plan, please let me know. If you adopt a similar approach for your AP Physics class, please remember I’ve never tried this before!

  • 1: Objects and systems have properties such as mass and charge. Systems may have internal structure.
    • 1.A: The internal structure of a system determines many properties of the system.
    • 1.B: Electric charge is a property of an object or system that affects its interactions with other objects or systems containing charge.
    • 1.C: Objects and systems have properties of inertial mass and gravitational mass that are experimentally verified to be the same and that satisfy conservation principles.
    • 1.D: Classical mechanics cannot describe all properties of objects.
    • 1.E: Materials have many macroscopic properties that result from the arrangement and interactions of the atoms and molecules that make up the material.
  • 2: Fields existing in space can be used to explain interactions.
    • 2.A: A field associates a value of some physical quantity with every point in space. Field models are useful for describing interactions that occur at a distance (long-range forces) as well as a variety of other physical phenomena.
    • 2.C: An electric field is caused by an object with electric charge.
    • 2.D: A magnetic field is caused by a magnet or a moving electrically charged object. Magnetic fields observed in nature always seem to be produced either by moving charged objects or by magnetic dipoles or combinations of dipoles and never by single poles.
    • 2.E: Physicists often construct a map of isolines connecting points of equal value for some quantity related to a field and use these maps to help visualize the field.
  • 3: The interactions of an object with other objects can be described by forces.
    • 3.A: All forces share certain common characteristics when considered by observers in inertial reference frames.
    • 3.B: Classically, the acceleration of an object interacting with other objects can be predicted by using Newton’s Second Law.
    • 3.C: At the macroscopic level, forces can be categorized as either long-range (action-at-a-distance) forces or contact forces.
    • 3.G: Certain types of forces are considered fundamental.
  • 4: Interactions between systems can result in changes in those systems.
    • 4.C: Interactions with other objects or systems can change the total energy of a system.
    • 4.E: The electric and magnetic properties of a system can change in response to the presence of, or changes in, other objects or systems.
  • 5: Changes that occur as a result of interactions are constrained by conservation laws.
    • 5.B: The energy of a system is conserved.
    • 5.C: The electric charge of a system is conserved.
    • 5.D: The linear momentum of a system is conserved.
    • 5.F: Classically, the mass of a system is conserved.
  • 6: Waves can transfer energy and momentum from one location to another without the permanent transfer of mass and serve as a mathematical model for the description of other phenomena.
    • 6.A: A wave is a traveling disturbance that transfers energy and momentum.
    • 6.B: A periodic wave is one that repeats as a function of both time and position and can be described by its amplitude, frequency, wavelength, speed, and energy.
    • 6.C: Only waves exhibit interference and diffraction.
    • 6.E: The direction of propagation of a wave such as light may be changed when the wave encounters an interface between two media.
    • 6.F: Electromagnetic radiation can be modeled as waves or as fundamental particles.
    • 6.G: All matter can be modeled as waves or as particles.
  • 7: The mathematics of probability can be used to describe the behavior of complex systems and to interpret the behavior of quantum mechanical systems.
    • 7.A: The properties of an ideal gas can be explained in terms of a small number of macroscopic variables including temperature and pressure.
    • 7.B: The tendency of isolated systems to move toward states with higher disorder is described by probability.
    • 7.C: At the quantum scale, matter is described by a wave function, which leads to a probabilistic description of the microscopic world.

AP Computer Science End-of-Year Survey Results

I recently reviewed the end-of-year feedback from my AP Computer Science students. This year we moved to a new textbook. Last summer, I focused on selecting new practice activities from the textbook and improving the summative labs that students complete at the end of each unit. I decided to invest most of my time in the development of the summative labs rather than the practice activities. My focus (and lack of focus) is evident in the feedback. In the following charts, a “1” represents “strongly agree” and a “5” represents “strongly disagree.”

I see the practice activities as the aspect of the class most in need of improvement. While the feedback was largely positive, it was not as positive as I would like. I believe the feedback on pair programming reflects how I introduced, structured, and facilitated pair programming rather than a shortcoming of the methodology itself.

[Chart: feedback on practice activities and pair programming]

The feedback on the summative labs was much more positive, which is good because I put forth a lot of effort to improve them! I plan to retire the ActorBox lab, which was an early introduction to GridWorld; I may do a turtle lab instead. I also need to re-evaluate the Word Search lab; its lack of popularity may be due somewhat to timing rather than the lab itself. I may look for a different lab for arrays and ArrayList, and I would love to create something with more social relevance. The DrawingEditor lab was fairly well liked but was too much of a challenge for too many students. I may consider replacing it with the new AP Elevens lab.

[Chart: feedback on summative labs]

[Chart: feedback on individual summative labs]

The next chart is a shout-out to Canvas’s Speed Grader. I sang its praises in an earlier post.

[Chart: feedback on Canvas’s Speed Grader]

I was surprised by how many of my students were planning to major or minor in a computer-related field. I would expect about three-quarters of them to major in a STEM-related field, but not necessarily one related to computing.

[Chart: students’ intended majors and minors]

I had a very simple standards-based assessment and reporting system for this class. Summative assessments were scored on a 1–5 scale, and each unit consisted of one exam and one lab. I almost never had a conversation with students about scores or grades; instead, we had lots of conversations about computer science.

[Chart: feedback on the standards-based assessment and reporting system]

My focus for this summer is to improve the practice activities by selecting fewer of them and choosing those that students will find more relevant. In addition, I want the practice activities to strike a balance between instructor-led examples, individual development, and pair programming; I specifically want to improve my facilitation of pair programming. I also plan on developing my own slide decks instead of using those included with the textbook. Finally, we will be using GitHub next year, and I want to move the summative labs into GitHub to provide the necessary scaffolding for students. Looking forward to next year!

AP Physics B Assessments

As I’ve mentioned, I’m spending some time this summer preparing for the AP Physics B course that we will be teaching for the first time this fall. I recently finished creating the assessments for this course.

With one exception (fluids multiple choice), all questions are from previous AP Physics B exams. Thanks to the handy indexes available from Secure PGP, it was relatively easy to review relevant questions and problems and choose those I wanted.

While compiling the assessments, I refined the granularity of the units a bit.

Fall Semester

  • Special Relativity
  • Kinematics
  • Statics and Dynamics
  • Fluid Mechanics
  • Work, Energy, Power
  • Thermodynamics
  • Linear Momentum
  • Oscillations, Gravity, Waves
  • Capstone Project

Spring Semester

  • Electrostatics
  • Electric Circuits
  • Magnetic Fields and Electromagnetism
  • Geometric Optics
  • Physical Optics
  • Particle Physics
  • Atomic Physics and Quantum Effects
  • Nuclear Physics
  • Cosmology

For each unit, I compiled a quiz that contains representative free response problems to be used as a formative assessment. I then created an end-of-unit exam consisting of multiple choice and free response questions. The exam is intended to be completed in a 50-minute class period or less. To support the flavor of standards-based grading that I’m using in this class, I also created a reassessment consisting of multiple choice and free response questions. Scoring rubrics for all free response questions have also been compiled for each assessment.

I uploaded the assessments as an archive for each semester to Secure PGP. I included the original Pages documents as well as versions exported as PDFs and Word files. I hope that some of you find these helpful. Please let me know if you find any mistakes.

Preparing for New AP Physics B Course

I will spend a lot of time this summer preparing for a new AP Physics B course. For most of the past five years, I’ve taught an Advanced Physics course which was a third semester of physics after Honors Physics that covered fluid dynamics, thermodynamics, and modern physics topics. This class wasn’t officially an AP Physics B class, but many students took the AP exam and were well prepared.

However, this new course replaces Advanced Physics, will be a two-semester course, and is open to students who have completed either Physics or Honors Physics. So, the students will have covered different topics and approached physics from different perspectives. For example, the Honors Physics class covers a superset of topics, while the Physics class emphasizes the development and understanding of models. Given this diversity, and the fact that this is now an official AP course, I’m taking the opportunity to develop new class materials and try a few new approaches.

Topic Sequence

We will briefly review or cover all AP Physics B topics in this course. Topics that are review will be used as opportunities to perform more sophisticated labs and explore new representations such as computational models. In addition, there are certain topics that I believe should be part of a college physics class and that are of great interest to students but are not part of the AP Physics B curriculum. We will cover those as well.

Fall Semester

  • Special Relativity
  • Kinematics
  • Statics and Dynamics
  • Fluid Mechanics
  • Work, Energy, Power
  • Thermodynamics
  • Linear Momentum
  • Oscillations and Gravity
  • Waves
  • Capstone Project

Spring Semester

  • Electrostatics
  • Electric Circuits
  • Magnetic Fields
  • Electromagnetism
  • Geometric Optics
  • Physical Optics
  • Particle Physics
  • Atomic Physics and Quantum Effects
  • Nuclear Physics
  • Cosmology

Components of Each Unit

I’m going to try a few new ideas in most units. Some of these are driven by methodologies that I have wanted to try for a while (e.g., computational modeling and peer instruction). Others are driven by new technologies available to my students (e.g., Canvas and iPads).

Topic Summary

I’m currently writing an AP Physics B review guide as an iBook. I wanted a review guide tailored to my students’ experiences and the structure of the class. The review guide is organized by topic but focuses on the models applicable to each topic. In addition to a description of the relevant models, the graphical, mathematical, and diagrammatic representations of those models are included as appropriate. I want students to explore an additional representation of the models to reinforce their understanding, and I have been very impressed with John Burk’s use of computational modeling. So, computational models developed using physutil and VPython are also included. I hope to include the iBook (also as a PDF) as well as related videos and code snippets in an iTunes U course. I’ve been impressed with iBooks Author so far and have exported the first chapter as a PDF.

Download (PDF, 5.87MB)
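The computational models mentioned above all follow the same simple pattern: set initial conditions, then step the motion forward in small time increments. Here is a minimal sketch of that pattern for projectile motion, written in plain Python rather than VPython/physutil so it runs without any dependencies; the launch values are illustrative, not taken from the review guide.

```python
import math

# A minimal computational model of projectile motion: update velocity from
# acceleration, then position from velocity, in small time steps.
dt = 0.01   # time step (s)
g = -9.8    # gravitational acceleration (m/s^2)

# illustrative initial conditions: launch at 20 m/s, 45 degrees above horizontal
v = [20 * math.cos(math.radians(45)), 20 * math.sin(math.radians(45))]
pos = [0.0, 0.0]
t = 0.0

# step the model forward until the projectile returns to the ground
while pos[1] >= 0:
    v[1] += g * dt        # velocity update (only gravity acts)
    pos[0] += v[0] * dt   # position update from current velocity
    pos[1] += v[1] * dt
    t += dt

print(f"range = {pos[0]:.1f} m after t = {t:.2f} s")
```

In VPython the same loop would also update a sphere’s position each step to animate the motion; the numerical result lands close to the analytic range of about 40.8 m for these launch values.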

Labs and Lab Notebooks

Since all students have already had a year of physics, I’m looking forward to doing some more sophisticated labs. Students will be creating electronic lab notebooks as portfolios in our new learning management system, Canvas. In addition, since we will have a class set of iPads available, we will be evaluating Vernier’s new LabQuest 2 and the Connected Science System.

Quizzes and Peer Instruction

I have been wanting to explore peer instruction using clickers, and I think the more conceptual questions would be a great fit and would prepare students for the multiple-choice portion of the AP exam. I found some wonderful existing clicker questions from OSU and CU Boulder. I’m compiling quizzes from existing AP free-response questions and will use the scoring rubrics to provide formative feedback to prepare students for the free-response portion of the AP exam.

Exams

Secure Pretty Good Physics (Secure PGP) is a great resource for AP Physics teachers. Other teachers have indexed questions by topic which makes creating new exams much easier. I’m compiling an exam and a reassessment exam for each unit based on existing AP multiple choice and free response questions. I plan to post these, along with the quizzes, to Secure PGP when I’m done.

Standards-Based Assessment and Reporting

I’m using a slightly modified version of the SBAR structure that we’ve been using in Honors Physics. The biggest change is that assessments will be scored on a five-point scale, like the AP exam itself. This is a small change for those students familiar with Physics’ four-point scale, but a more significant change for those students familiar with Honors Physics’ mastery system. Another significant change is the granularity of standards. Due to the integrated nature of the AP exam, standards will be very broad, usually one standard for each unit. All of the details of the SBAR structure are enumerated in the class syllabus.

Download (PDF, 62KB)

I hope some of you who are also teaching AP Physics B find something here of use. I know that the work that other teachers have done is incredibly helpful as I prepare for this new course. I plan to share pretty much everything I compile either here or on Secure PGP; so, please stay tuned or ask if I forget to post something.

SBAR and Mastery Student Survey

I previously wrote about my challenges with my Honors Physics class this year. I received several comments from other teachers which influenced the survey I administered to students this past Friday. While my colleague and I still need to analyze all the data, reflect on the semester, and decide what changes we will make for next year, I thought I’d share some of the more interesting feedback that we received.

Homework

The biggest change that we made this spring semester is that we no longer provided credit for homework. This was motivated by our experience that students weren’t developing good problem-solving techniques and that the homework was really just for practice. This change was the one most frequently commented on by students. On the survey, only a third of students agreed with the statement that they complete their homework before the unit exam, but 90% said that they would if it directly affected their grade. In addition, half of students agreed with the statement that they complete the homework only because it is required before doing 2nd tries (our reassessment opportunity, offered before and after school once a week for two weeks following the summative exam). However, 77% of students agreed with the statement that they write out complete solutions as opposed to just answers. So, while the students who do the homework are developing better problem-solving techniques, many students aren’t doing the homework at all.

The reasons for this are captured best by the feedback provided by a couple of students:

I liked the way the system worked first semester much better. Even though webassign was normally a stressful night before rush, I always felt significantly more prepared after completing it and knowing that it was going to influence my grade if I didn’t do it was just enough motivation to complete it each time. This semester, now that there is no required prep before the exam, I find myself preparing less, which I know is ultimately my own responsibility. I know many students are probably in the same situation though, being motivated by hw completion grades. I think requiring prep before first assessments would also help to lower the percent of people who need second tries since they are more prepared for the first try.

I personally think that you should require the graded webassign before the tests, much like first semester. For me, I typically read the book and do a few practice problems before the exam, but I often don’t have enough time to study adequately (my biggest problem this year). I just don’t have enough time to do everything (school work in general) so I prioritize, and graded assignments take priority. Spending time working on assignments is less time for sleeping at night, so very often the most studying I do is a reading of the entire section during lunch. If webassign was graded, I would do it.

These comments illustrate that students are aware that they need the practice, but feel unable to do it without the threat of losing points. These comments are indicative of a much broader issue: is the problem that we don’t motivate students to do homework by awarding credit, or that other classes do? If no one awarded points for homework, then students could decide what to do based on what would help them most.

Preparing for Initial Assessment

The results of the survey and the comments shared by students demonstrate that the homework change is one of the factors contributing to the lack of preparation for the initial assessment. In addition, a couple of other questions on the survey reinforced my concern.

Only about half of students expect to master the standards on the unit exam, and half feel lucky when they do master a standard. 88% of students say they prepare sufficiently for the second assessment, compared to only 63% who say the same for the original assessment.

Todd Zimmerman and Kelly O’Shea both suggested that additional formative assessments were needed, so I added questions to the survey to solicit feedback on this idea. While only 43% of students want more homework before the initial summative assessment, 68% of students want quizzes with feedback before the unit exam. I think the survey results support Josh Gates’ idea of providing initial formative assessments for feedback, not grades. While there is little to no difference between these assessments and homework, simply presenting them in a different way produces very different reactions from students. So, Todd, Josh, and Kelly’s ideas definitely resonated with students. The challenge will be for my colleague and me to find time to create, administer, and provide feedback on these formative assessments.

Standards Based Assessment and Reporting and Mastery Learning

My colleague and I started applying SBAR and mastery learning three years ago for a variety of reasons. The primary one was to help students focus on learning rather than grades and, as a result, better retain the concepts and have less stress.

75% of students agreed with the statement that having the standards enumerated helps them prepare for the exam, and 62% agreed with the statement that the mastery system helps them focus more on learning and understanding and less on points and grades. That said, 53% of students say they still focus on their grade in Honors Physics as much as they do in traditional classes. In addition, while 75% of students said they take 2nd tries to improve understanding, 94% said they take them to improve their grade. In general, Honors Physics students are extremely grade conscious. However, in terms of stress, 91% of students find exams less stressful knowing that 2nd tries are available.

Another concern that I had was that students weren’t doing their best work but, instead, were putting forth the effort to just barely master the standard. Students disagreed with this assertion. 82% of students said they do the best they can when completing an assignment and only 15% say they do the minimum to achieve mastery. Perhaps this concern of mine was due more to end-of-year malaise than reality.

Most students appreciate the benefits of standards-based assessment and reporting and mastery learning. To provide some context, the initial summative assessment usually consists of a multiple choice portion for some standards and a problem-solving portion for others. On the multiple-choice portion, students can usually miss one or two questions and still demonstrate mastery. On the free response portion, students can make non-critical errors and still demonstrate mastery. Here are some of their comments:

I like the Honors Physics standards system, but sometimes the standards do hurt my grade (if I miss two questions out of 7, it is a 0% instead of a 72%). I really appreciate the second tries for standards because of this. I also feel that the standards system forces students to retain what they learn and helps them be more prepared for the final.

The second tries are extremely beneficial to students because it allows students to relearn a target, which I believe reinforces and strengthens the learning from a topic. I remember second try targets much better than other targets.

I feel as if the class is well oriented in that it helps people to focus more on the concepts rather than having to worry about their grade. It really helps a lot more in the long run.

I honestly loved physics both semesters, second especially, and I actually found that it was required for me to understand the material by doing the work, and I enjoy that. Not all classes need you to understand; a lot just ask you to memorize stuff. Also, I heard that this is the type of learning you need for college, so this will probably help.

I feel like, while the mastery system is sometimes problematic, it really helps to ease the stress that I have while taking each exam. Knowing that by making one silly mistake, I can still get 100% in a category allows me to stop stressing about every single little thing during tests, which also helps me to focus. It is inconvenient when I don’t master a standard, but I feel like the 0% that results from it furthers my determination to clarify that standard and thus improve my grade during the 2nd tries.

I think that however much you put into the class is what you get. Like all other classes, the advantage is in taking the initiative, regardless of the grading system. I feel the standards system is a keeper though, because as you said it eliminates the hesitance for each individual point and allows the student to focus on the big picture. VURY NICE.

With grades being administered on a pretty consistent schedule, I find that I care less about my grade for this class. The use of standards makes me feel accountable for mastering a topic. That is to say, I don’t feel good if I master a target out of luck on an exam. I think other courses’ grading systems could take after the honors physics model! Physics was a pleasure!

However, some students definitely do not like the mastery learning system. While they don’t object to the principles of SBAR, the binary nature of the mastery learning system drives them crazy.

The Mastery system is extremely stressful, because it is possible to miss 4 questions across two standards and fail an exam, while it is also possible to miss 4 questions across 4 standards and receive a 100%. This system would be great if it was not tied to the letter grade system, but because it is, it’s extremely detrimental to report cards. (Which do matter, regardless of understanding or not.)

I may use this quote in my introduction of mastery learning next year. I don’t expect every student to agree with my philosophy that the grade for the course should reflect true understanding of a concept regardless of the effort exerted to almost understand it. However, I want every student to understand my perspective.
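For readers unfamiliar with binary mastery scoring, the arithmetic the quoted student describes can be sketched in a few lines. The `mastery_scores` helper and its one-miss-allowed threshold are a hypothetical illustration, not our actual rubric (which, as noted above, usually allows one or two misses on a multiple-choice standard).

```python
def mastery_scores(misses_per_standard, allowed_misses=1):
    """All-or-nothing scoring: a standard earns 100% if the number of
    missed questions is within the allowed threshold, otherwise 0%."""
    return [100 if m <= allowed_misses else 0 for m in misses_per_standard]

# 4 misses concentrated in 2 of 4 standards: both of those standards score 0%
print(mastery_scores([2, 2, 0, 0]))   # [0, 0, 100, 100]

# the same 4 misses spread across 4 standards: every standard still scores 100%
print(mastery_scores([1, 1, 1, 1]))   # [100, 100, 100, 100]
```

This is exactly the discontinuity the student objects to: identical raw performance can produce very different grades depending on how the misses are distributed across standards.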

Additional Challenges

Unfortunately, cheating is always a concern for my colleague and me. We asked several questions on the survey about cheating, and I was surprised that 59% of students agreed with the statement that other students cheat on 2nd tries, and that only half of students disagreed with the statement that they feel pressure from their peers to tell them what is on the exam or 2nd tries.

As in the case with the additional formative assessments, the challenge will be for my colleague and me to find the time to address this issue. However, we have to ask ourselves how to spend our very limited time. Do we spend it to thwart those students who are trying to cheat, or do we spend it to help those students who are trying to learn?

What’s Next?

I had discussions with several students who suggested that completing homework should not be graded but should be required before the initial summative assessment in order to earn the opportunity for a second summative assessment. This is the policy of another science class and students like that it motivates them to complete the homework before the initial assessment. Personally, this doesn’t ring true for my philosophy.

My colleague has what may be a great solution. He wants to focus on lab notebooks next year. He is proposing that students are permitted to use their lab notebooks, which may contain observations from lab activities and homework problems, on the initial summative assessment and perhaps not on the secondary summative assessments. This may provide sufficient motivation for students to complete the practice that they need before the initial assessment without requiring all students to complete the same amount of practice or penalizing those who don’t complete the practice.

I hope to meet with my Assistant Principal and hear the motivation behind the policy to not allow reassessments unless a student earns less than an 80% on an exam and to cap that reassessment at 80%. I hope to side step this policy entirely, but I am curious as to the motivation behind it. I also want to share these results and our ideas for next year.

We will have a new LMS next year and perhaps that will provide a mechanism to offer more formative assessments and feedback before the summative assessment.

This summer, my colleague and I will sit on one of our decks and figure out what to change for next year. I’m now more confident that we can continue to pursue our goals without taking steps backward.

Help! SBAR Challenges!

My colleague and I have been using standards-based assessment and reporting (SBAR) (a.k.a. standards-based grading (SBG)) and a mastery-learning methodology for the past three years. We have been very pleased with the results and have continued to evaluate and improve our methodology each year. However, this semester, it appears that the students aren’t drinking the Kool-Aid. I need some advice.

We are planning to administer an anonymous survey as well as have a class discussion on the topic of SBAR, learning, and assessment. I also plan to talk privately with various students. However, I wanted to solicit advice from the larger SBAR community first, since I expect that will shape the survey and the direction of the conversation.

The problems this semester are that students are doing worse on initial assessments and that the quality of their work has deteriorated. To be fair, this doesn’t apply to every single student, but it does appear to be an overall trend. While I’m not sure of the reasons behind this change, and I hope that the survey, class discussion, and individual discussions provide some clarity, I have a hypothesis: there has been a change in attitude this semester compared to the previous five.

In past semesters, many students would have the following attitude towards class: learn the material as well as possible throughout the unit, do some of the homework, and try to master every standard on the initial exam. If they didn’t master a few standards, they would complete extra practice and take advantage of a reassessment outside of class. These reassessments were best to be avoided, however, since they required extra work and time outside of class.

This semester, I believe many students have the following attitude toward class: try to learn the material based only on in-class activities, do nothing outside of class, and attempt the initial exam. They expect to have to reassess every standard, so any that they happen to master on the initial exam are considered a fortunate bonus. They then complete the required extra practice and take advantage of the reassessment outside of class. In addition, since they only have to master a standard, they do the bare minimum amount or quality of work to meet that expectation.

I still believe our methodology is philosophically sound. However, I fear that this deferred effort approach will result in less understanding and less retention. In addition, this bare minimum approach leads to sloppy and careless work and poor habits. This is not okay regardless of how sound the philosophy is.

I’m considering a couple of changes. One, abandon the mastery system for a 1–5 scale like the one I use with my AP-level class. While this allows for more differentiation in terms of quality of student work and depth of understanding, it feels like a huge step backward in terms of trying to de-emphasize grades. Two, adopt a cap for reassessments. My school is pushing an 80% cap for reassessments: only students who score less than 80% are eligible to reassess, and the maximum score on a reassessment is 80%. While this may motivate students to develop their understanding and practice before the initial assessment, it seems to contradict the very foundation of SBAR.

Perhaps even more important than understanding the change in attitude is figuring out what precipitated it. What is different this semester compared to the previous five? How am I, my students, or my school different?

It is all somewhat depressing since I felt that we really had created something special that was meeting the goals I set for my students. Instead, this semester, I feel that our fragile ecosystem has been shattered and I’m not sure we can recover.

No More Credit for Homework

As I previously shared, I am not making many changes in Honors Physics this semester. However, we are making two significant changes related to homework. Despite my strong belief in the standards-based assessment and reporting philosophy, I have always provided some credit for completing homework. I’ve previously shared my attempt to justify this policy.

To minimize the overhead of checking homework and to discourage blatant copying, we use WebAssign for homework. It worked well and certainly didn’t require much effort once I had created the problem sets. However, at the end of this semester, a huge problem hit my colleague and me like a brick wall:

You get what you reward.

We rewarded a student submitting the correct answer for 80% of the homework problems in WebAssign and that is exactly what we got.

The behavior that we were unintentionally rewarding began to become clear when I would help students outside of class. The dialog would go like this:

S: “Mr. Schmit, I have a question about a homework problem. Can you help me?”

Me: “Of course! Let me see your notebook and what you have so far.”

S: “It is problem number 38. I’ll show you in the text.”

Me: “Okay, but let me see what you have written down so far.”

S: blank look

Me: “Let me see your sketch, diagram, list of givens, equation with variables, substitution of values with units, …”

S: blank look

S: “I just solve the problem on WebAssign.”

Me: blank look

S: “I just type the numbers into my calculator and enter the final answer in WebAssign.”

While I don’t have this conversation with every student, it is not at all uncommon. I suppose I shouldn’t be surprised; the students are exhibiting the exact behavior that I’m rewarding.

So, this semester, no credit for homework. None. I will still create homework assignments on WebAssign since students do like to check their answers or to ask for another version of the same problem for practice. This change will at least stop rewarding the behaviors we don’t want.

While I hope that students’ experiences during the fall semester will be sufficient to encourage them to adopt robust and organized problem-solving methods, I realize that won’t be true for everyone. So, the second change we are making is that, before reassessing, a student must show me clear, detailed, and robust solutions to the homework problems related to that standard.

Yes, I realize that many of you have been doing exactly this from day one. I’m a bit slow to catch on as it took me two and a half years. Better late than never.

As a humorous endnote, one student solved a circular-motion, car-on-banked-curve problem on the semester final exam without showing any work at all. He wrote a note about how he did the whole thing on his calculator and didn’t expect any credit. He also noted how it would be quite ironic if he got the answer wrong. He didn’t.