Category Archives: standards-based grading

Greatest Benefit of Canvas

Last spring, I was part of our district’s pilot for an LMS. I became a fan of Canvas and was very pleased when we selected it as our district’s LMS.

I absolutely love Canvas’ ease of use. I use all the typical features like announcements, discussions, and file storage. Less common features, like modules, help my students find everything they need for each unit, and pages allow me to easily share enrichment materials.

However, looking back at this first full semester with Canvas, I was surprised by which feature had the greatest impact on student learning. It wasn’t any of the above. It was SpeedGrader. Specifically, the ease with which SpeedGrader enables me to provide rich feedback to students on their assignments. Sure, I provided feedback before Canvas by writing comments on lab reports, but it was time-consuming (I write much slower than I type) and not always legible (my handwriting is poor). I always had more feedback to provide than I took the time to write. SpeedGrader has changed all of this.

Here’s my workflow for AP Physics B. Students create an ePortfolio in Canvas that contains all of the labs for which they perform analysis and are assessed. I create an assignment in Canvas for each lab, and they submit a link to their ePortfolio. (The ePortfolio part isn’t critical; you could create assignments and have students submit their work in any number of ways.) In SpeedGrader, I can view their ePortfolio in one pane while typing feedback in another. This feedback is what has had the greatest impact on student learning.

SpeedGrader

I don’t score labs by subtracting a bunch of points; I read them. For my AP class, students earn a score of 1-5 which is reported in the online grade book but doesn’t show up anywhere in Canvas. In Canvas, I just mark the assignment as complete or incomplete; the focus is on learning, not grades. What students do get is my feedback, which often starts a discussion about their lab. My feedback usually consists of the questions I would ask of them in person: questions that help them make connections between different ideas, clarify a misunderstanding, or illustrate an inconsistency in their analysis. In addition, I can easily point out sections that are incomplete. Many students have their notifications configured so that they receive an email when I submit feedback, and some respond almost immediately.

The integrated discussions in SpeedGrader are a perfect example of the role that technology should play in education: enhancing a sound educational practice (rich feedback and discussion) by making it more efficient and easier for all involved.

Four of my five classes submit all of their assignments in Canvas. Guess what that fifth class will start doing this semester?

SBAR and Mastery Student Survey

I previously wrote about my challenges with my Honors Physics class this year. I received several comments from other teachers which influenced the survey I administered to students this past Friday. While my colleague and I still need to analyze all the data, reflect on the semester, and decide what changes we will make for next year, I thought I’d share some of the more interesting feedback that we received.

Homework

The biggest change that we made this spring semester is that we no longer provided credit for homework. This was motivated by our experience that students weren’t developing good problem-solving techniques and that the homework was really just for practice. This change was the one most frequently commented on by students. On the survey, only a third of students agreed with the statement that they complete their homework before the unit exam, but 90% said that they would if it directly affected their grade. In addition, half of students agreed with the statement that they complete the homework only because it is required before doing 2nd tries (our reassessment opportunity, offered before and after school once a week for two weeks following the summative exam). However, 77% of students agreed with the statement that they write out complete solutions as opposed to just answers. So, while students are developing better problem-solving techniques, they aren’t doing their homework.

The reasons for this are captured best by the feedback provided by a couple of students:

I liked the way the system worked first semester much better. Even though webassign was normally a stressful night before rush, I always felt significantly more prepared after completing it and knowing that it was going to influence my grade if I didn’t do it was just enough motivation to complete it each time. This semester, now that there is no required prep before the exam, I find myself preparing less, which I know is ultimately my own responsibility. I know many students are probably in the same situation though, being motivated by hw completion grades. I think requiring prep before first assessments would also help to lower the percent of people who need second tries since they are more prepared for the first try.

I personally think that you should require the graded webassign before the tests, much like first semester. For me, I typically read the book and do a few practice problems before the exam, but I often don’t have enough time to study adequately (my biggest problem this year). I just don’t have enough time to do everything (school work in general) so I prioritize, and graded assignments take priority. Spending time working on assignments is less time for sleeping at night, so very often the most studying I do is a reading of the entire section during lunch. If webassign was graded, I would do it.

These comments illustrate that students are aware that they need the practice but feel unable to do it without the threat of losing points. These comments are indicative of a much broader issue. Is the problem that we don’t motivate students to do homework by awarding credit, or that other classes do? If no one awarded points for homework, then students could decide what to do based on what would help them most.

Preparing for Initial Assessment

The results of the survey and the comments shared by students demonstrate that the homework change is one of the factors contributing to the lack of preparation for the initial assessment. In addition, a couple of other questions on the survey reinforced my concern.

Only about half of students expect to master the standards on the unit exam, and half feel lucky when they do master a standard. 88% of students say they prepare sufficiently for the second assessment, compared to 63% who say the same for the original assessment.

Todd Zimmerman and Kelly O’Shea both suggested that additional formative assessments were needed. I added questions to the survey to solicit feedback on this idea. While only 43% of students want more homework before the initial summative assessment, 68% of students want quizzes with feedback before the unit exam. I think the survey results support Josh Gates’ idea of providing initial formative assessments for feedback, not grades. While there is little to no difference between these assessments and homework, simply presenting them in a different way produces very different reactions from students. So, Todd, Josh, and Kelly’s ideas definitely resonated with students. The challenge will be for my colleague and me to find time to create, administer, and provide feedback on these formative assessments.

Standards Based Assessment and Reporting and Mastery Learning

My colleague and I started applying SBAR and mastery learning three years ago for a variety of reasons. The primary one was to help students focus on learning rather than grades and, as a result, better retain the concepts and have less stress.

75% of students agreed with the statement that having the standards enumerated helps them prepare for the exam, and 62% agreed with the statement that the mastery system helps them focus more on learning and understanding and less on points and grades. That said, 53% of students say they still focus on their grade in Honors Physics as much as they do in traditional classes. In addition, while 75% of students said they take 2nd tries to improve understanding, 94% said they take them to improve their grade. In general, Honors Physics students are extremely grade conscious. However, in terms of stress, 91% of students find exams less stressful knowing 2nd tries are available.

Another concern that I had was that students weren’t doing their best work but, instead, were putting forth the effort to just barely master the standard. Students disagreed with this assertion. 82% of students said they do the best they can when completing an assignment and only 15% say they do the minimum to achieve mastery. Perhaps this concern of mine was due more to end-of-year malaise than reality.

Most students appreciate the benefits of standards-based assessment and reporting and mastery learning. To provide some context, the initial summative assessment usually consists of a multiple choice portion for some standards and a problem-solving portion for others. On the multiple-choice portion, students can usually miss one or two questions and still demonstrate mastery. On the free response portion, students can make non-critical errors and still demonstrate mastery. Here are some of their comments:

I like the Honors Physics standards system, but sometimes the standards do hurt my grade (if I miss two questions out of 7, it is a 0% instead of a 72%). I really appreciate the second tries for standards because of this. I also feel that the standards system forces students to retain what they learn and helps them be more prepared for the final.

The second tries are extremely beneficial to students because it allows students to relearn a target, which I believe reinforces and strengthens the learning from a topic. I remember second try targets much better than other targets.

I feel as if the class is well oriented in that it helps people to focus more on the concepts rather than having to worry about their grade. It really helps a lot more in the long run.

I honestly loved physics both semesters, second especially, and i actually found that it was required for me to understand the material by doing the work, and i enjoy that. Not all classes need you to understand, a lot just ask for you to memorize stuff. Also, i heard that this is the type of learning you need for college so this will probably help.

I feel like, while the mastery system is sometimes problematic, it really helps to ease the stress that I have while taking each exam. Knowing that by making one silly mistake, I can still get 100% in a category allows me to stop stressing about every single little thing during tests, which also helps me to focus. It is inconvenient when I don’t master a standard, but I feel like the 0% that results from it furthers my determination to clarify that standard and thus improve my grade during the 2nd tries.

I think that however much you put into the class is what you get. Like all other classes, the advantage is in taking the initiative, regardless of the grading system. I feel the standards system is a keeper though, because as you said it eliminates the hesitance for each individual point and allows the student to focus on the big picture. VURY NICE.

With grades being administered on a pretty consistent schedule, I find that I care less about my grade for this class. The use of standards makes me feel accountable for mastering a topic. That is to say, I don’t feel good if I master a target out of luck on an exam. I think other courses’ grading systems could take the honors physics model! Physics was a pleasure!

However, some students definitely do not like the mastery learning system. While they don’t object to the principles of SBAR, the binary nature of the mastery learning system drives them crazy.

The Mastery system is extremely stressful, because it is possible to miss 4 questions across two standards and fail an exam, while it is also possible to miss 4 questions across 4 standards and receive a 100%. This system would be great if it was not tied to the letter grade system, but because it is, it’s extremely detrimental to report cards. (Which do matter, regardless of understanding or not.)

I may use this quote in my introduction of mastery learning next year. I don’t expect every student to agree with my philosophy that the grade for the course should reflect true understanding of a concept regardless of the effort exerted to almost understand it. However, I want every student to understand my perspective.
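The all-or-nothing arithmetic that frustrates this student can be sketched in a few lines. This is only an illustration: the per-standard miss threshold and question counts below are hypothetical, since the post says only that students "can usually miss one or two questions and still demonstrate mastery."

```python
# Sketch of a binary (all-or-nothing) mastery scheme. The threshold of
# one allowed miss per standard is an assumption for illustration.

def standard_mastered(correct, total, max_misses=1):
    """A standard is mastered only if the student misses at most max_misses questions."""
    return (total - correct) <= max_misses

def exam_score(results, max_misses=1):
    """Percent of standards mastered; each (correct, total) pair is one standard."""
    mastered = sum(standard_mastered(c, t, max_misses) for c, t in results)
    return 100 * mastered / len(results)

# Four misses concentrated in two standards: both of those standards fail.
print(exam_score([(3, 5), (3, 5), (5, 5), (5, 5)]))  # 50.0

# The same four misses spread across four standards: every standard is mastered.
print(exam_score([(4, 5), (4, 5), (4, 5), (4, 5)]))  # 100.0
```

This makes the student's point concrete: the same number of total misses can yield either a failing or a perfect score depending entirely on how the misses are distributed across standards.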

Additional Challenges

Unfortunately, cheating is always a concern for my colleague and me. We asked several questions on the survey about cheating, and I was surprised that 59% of students agreed with the statement that other students cheat on 2nd tries and that only half of students disagreed with the statement that they feel pressure from their peers to tell them what is on the exam or 2nd tries.

As with the additional formative assessments, the challenge will be for my colleague and me to find the time to address this issue. However, we have to ask ourselves how to spend our very limited time. Do we spend it to thwart those students who are trying to cheat, or do we spend it to help those students who are trying to learn?

What’s Next?

I had discussions with several students who suggested that homework completion should not be graded but should be required before the initial summative assessment in order to earn the opportunity for a second summative assessment. This is the policy of another science class, and students like that it motivates them to complete the homework before the initial assessment. Personally, this doesn’t ring true with my philosophy.

My colleague has what may be a great solution. He wants to focus on lab notebooks next year. He is proposing that students be permitted to use their lab notebooks, which may contain observations from lab activities and homework problems, on the initial summative assessment, but perhaps not on the secondary summative assessments. This may provide sufficient motivation for students to complete the practice that they need before the initial assessment without requiring all students to complete the same amount of practice or penalizing those who don’t complete it.

I hope to meet with my Assistant Principal and hear the motivation behind the policy that does not allow reassessments unless a student earns less than an 80% on an exam and that caps the reassessment at 80%. I hope to sidestep this policy entirely, but I am curious about the motivation behind it. I also want to share these results and our ideas for next year.

We will have a new LMS next year and perhaps that will provide a mechanism to offer more formative assessments and feedback before the summative assessment.

This summer, my colleague and I will sit on one of our decks and figure out what to change for next year. I’m now more confident that we can continue to pursue our goals without taking steps backward.

Help! SBAR Challenges!

My colleague and I have been using standards-based assessment and reporting (SBAR) (a.k.a. standards based grading (SBG)) and a mastery learning methodology for the past three years. We have been very pleased with the results and have continued to evaluate and improve our methodology each year. However, this semester, it appears that the students aren’t drinking the kool-aid. I need some advice.

We are planning on administering an anonymous survey as well as having a class discussion on the topic of SBAR, learning, and assessment. I also plan to talk privately with various students. However, I wanted to solicit advice from the larger SBAR community first since I expect it will shape the survey and the direction of the conversation.

The problems this semester are that students are doing worse on initial assessments and the quality of work has deteriorated. To be fair, this doesn’t apply to every single student, but it does appear to be an overall trend. While I’m not sure of the reasons behind this change and I hope that the survey, class discussion, and individual discussions provide some clarity, I have a hypothesis. I believe that there has been a change in attitude this semester compared to the previous five.

In past semesters, many students would have the following attitude toward class: learn the material as well as possible throughout the unit, do some of the homework, and try to master every standard on the initial exam. If they didn’t master a few standards, they would complete extra practice and take advantage of a reassessment outside of class. These reassessments were best avoided, however, since they required extra work and time outside of class.

This semester, I believe many students have the following attitude toward class: try to learn the material based only on in-class activities, do nothing outside of class, and attempt the initial exam. They expect to have to reassess every standard, so any that they happen to master on the initial exam are considered a fortunate bonus. They then prepare the required extra practice and take advantage of the reassessment outside of class. In addition, since they only have to master a standard, they do the bare minimum amount or quality of work to meet that expectation.

I still believe our methodology is philosophically sound. However, I fear that this deferred effort approach will result in less understanding and less retention. In addition, this bare minimum approach leads to sloppy and careless work and poor habits. This is not okay regardless of how sound the philosophy is.

I’m considering a couple of changes. One, abandon the mastery system for a 1-5 scale like the one I use with my AP-level class. While this allows for more differentiation in terms of quality of student work and depth of understanding, it feels like a huge step backward in terms of trying to de-emphasize grades. Two, adopt a cap for reassessments. My school is pushing an 80% cap for reassessments: only students who score less than 80% are eligible to reassess, and the maximum score on a reassessment is an 80%. While this may motivate students to develop their understanding and practice before the initial assessment, it seems to contradict the very foundation of SBAR.
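For concreteness, here is a minimal sketch of the capped-reassessment policy my school is pushing. The assumption that the higher of the two scores is kept is mine; the policy as described only specifies the eligibility threshold and the cap.

```python
# Hypothetical sketch of the school's proposed reassessment policy:
# only scores below the cap are eligible for reassessment, and the
# reassessment score itself can never exceed the cap.

CAP = 80

def eligible_to_reassess(initial_score, cap=CAP):
    """A student may only reassess if the initial score is below the cap."""
    return initial_score < cap

def final_score(initial_score, retake_score=None, cap=CAP):
    """Combine the two attempts, assuming the better score is kept."""
    if retake_score is None or not eligible_to_reassess(initial_score, cap):
        return initial_score
    # Keep the better score, but never above the cap.
    return max(initial_score, min(retake_score, cap))

print(final_score(75, 95))  # eligible, but the retake is capped at 80
print(final_score(85, 95))  # ineligible to reassess; stays at 85
print(final_score(60))      # no retake taken; stays at 60
```

Note how the cap turns a 95% demonstration of mastery on a retake into an 80%, which is exactly why it seems to contradict the foundation of SBAR: the final grade no longer reflects the understanding the student ultimately demonstrated.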

Perhaps even more important than understanding the change in attitude is understanding what precipitated it. What is different this semester compared to the previous five? How am I, my students, or my school different?

It is all somewhat depressing since I felt that we really had created something special that was meeting the goals I set for my students. Instead, this semester, I feel that our fragile ecosystem has been shattered and I’m not sure we can recover.

No More Credit for Homework

As I previously shared, I am not making many changes in Honors Physics this semester. However, we are making two significant changes related to homework. Despite my strong belief in the standards-based assessment and reporting philosophy, I have always provided some credit for completing homework. I’ve previously shared my attempt to justify this policy.

To minimize the overhead of checking homework and discourage blatant copying, we use WebAssign for homework. It worked well and certainly didn’t require much effort once I had created the problem sets. However, at the end of this semester, a huge problem hit my colleague and me like a brick wall:

You get what you reward.

We rewarded a student submitting the correct answer for 80% of the homework problems in WebAssign and that is exactly what we got.

The behavior that we were unintentionally rewarding began to become clear when I would help students outside of class. The dialog would go like this:

S: “Mr. Schmit, I have a question about a homework problem. Can you help me?”

Me: “Of course! Let me see your notebook and what you have so far.”

S: “It is problem number 38. I’ll show you in the text.”

Me: “Okay, but let me see what you have written down so far.”

S: blank look

Me: “Let me see your sketch, diagram, list of givens, equation with variables, substitution of values with units, …”

S: blank look

S: “I just solve the problem on WebAssign.”

Me: blank look

S: “I just type the numbers into my calculator and enter the final answer in WebAssign.”

While I don’t have this conversation with every student, it is not at all uncommon. I suppose I shouldn’t be surprised; the students are exhibiting the exact behavior that I’m rewarding.

So, this semester, no credit for homework. None. I will still create homework assignments on WebAssign since students do like to check their answers or to ask for another version of the same problem for practice. This change will at least stop rewarding the behaviors we don’t want.

While I hope that students’ experiences during the fall semester will be enough to encourage them to adopt robust and organized problem-solving methods, I realize they won’t be for everyone. So, the second change that we are making is that before reassessment a student must show me clear, detailed, and robust solutions to the homework problems related to that standard.

Yes, I realize that many of you have been doing exactly this from day one. I’m a bit slow to catch on as it took me two and a half years. Better late than never.

As a humorous endnote, one student solved a circular-motion, car-on-banked-curve problem on the semester final exam without showing any work at all. He wrote a note about how he did the whole thing on his calculator and didn’t expect any credit. He also noted how it would be quite ironic if he got the answer wrong. He didn’t.

ISEC 2011: Standards-Based Grading for High School Physics

This post is to capture the resources discussed in the Standards-Based Grading for High School Physics presentation (P154) at the Illinois Science Education Conference. Mark Rowzee and I presented our experience in adapting standards-based assessment and reporting to two different physics courses over the past three years. Our abstract is:

We will share our experience in implementing standards-based grading (a.k.a. standards based assessment and reporting) in our regular and honors physics classes over the past two years. This methodology has helped students focus on learning and understanding and not collecting points for a grade. It has helped us focus on defining meaningful standards and providing helpful feedback.

ISEC 2011 Standards-Based Grading for High School Physics

Links to Resources:

General Physics Standards

This is a follow-up post to the Honors Physics Standards post that enumerates the standards that we have defined for our General Physics class. As I mentioned previously, this year, our entire school is replacing the traditional report card with a standards-based report card. The standards reflected on this report card, which we call report-card standards, represent an aggregation of several of the more-specific standards and are common across both high schools in our district. For General Physics, we have defined the following report-card standards for the whole year.

Report-Card Standards

  • science as a process
  • understand the basic concepts of kinematics
  • understand, explain, discuss, and apply Newton’s Laws
  • understand the basic concepts of energy and energy conservation
  • understand the basic concepts of momentum and its conservation
  • explain, discuss, and calculate the properties of electrostatics
  • explain, discuss, and calculate the properties of electric circuits
  • understand, explain, and discuss the properties of magnetism
  • describe wave type, properties, and interactions
  • understand the relationships among science, technology, and society in historical and contemporary contexts

Below are the more-specific standards that we use for General Physics during the fall semester. These standards are influenced by objectives defined by a group of physics teachers working together at the county level as well as Modeling Instruction.

Fall Semester Standards

STT 1. I can build a qualitative model, identify and classify variables, and make tentative qualitative predictions about the relationship between variables.

STT 2. I can select appropriate measuring devices, consider accuracy of measuring devices, maximize range of data, and calculate error propagation for an experiment.

STT 3. I can develop linear relationships and relate mathematical and graphical expressions.

STT Lab 1. I can create and populate data tables for an experiment.

STT Lab 2. I can measure phenomena in the laboratory with minimum error.

STT Lab 3. I can create graphs from data measured in an experiment.

STT Lab 4. I can analyze graphs of data measured in an experiment.

STT Lab 5. I can analyze uncertainty in an experiment.

STT Lab 6. I can write a complete formal experiment report according to the specified format.

CVPM 1. I can distinguish between scalar and vector quantities.

CVPM 2. I can describe and analyze constant-velocity motion based on graphs, numeric data, words, and diagrams.

BFPM 1. I can draw a free body diagram and add vectors graphically to find net force.

BFPM 2. I can apply the Law of Inertia (Newton’s 1st Law) to various situations in the real world.

BFPM 3. I can identify action-reaction force pairs (Newton’s 3rd Law) and the fact that they act on two separate bodies.

CAPM 1. I can describe and analyze uniform-acceleration motion based on graphs, numeric data, words, and diagrams.

CAPM 2. I can apply the various kinematics equations in one dimension.

UBFPM 1. I can draw a free body diagram and use the concept of net force to solve problems using Newton’s 2nd Law.

UBFPM 2. I can identify how different factors affect the force of friction and can differentiate between static and kinetic friction.

UBFPM 3. I can solve problems using the coefficient of friction.

UBFPM Lab 1. I can determine the relationship between force, mass, and acceleration using experimental data.

PMPM 1. I can justify that if the only force acting on an object is gravity, it will have the same constant downward acceleration regardless of mass, velocity or position.

PMPM 2. I can apply the various kinematics equations in two dimensions while recognizing the independence of horizontal and vertical variables.

PMPM Lab 1. I can model the path of a projectile based on experimental data and use this model to hit the predicted location.

PMPM Lab 2. I can compare predicted values based on a model against experimental results.

COEM 1. I can identify that energy is transferred and solve problems using conservation of mechanical energy (kinetic energy and gravitational potential energy).

COEM 2. I can identify work as a change in energy and calculate it based on force and displacement.

COEM 3. I can analyze the rate of energy change of a system in terms of power.

COEM Lab 1. I can perform an experiment to compare the loss of gravitational potential energy and the gain of kinetic energy of an object moving down an incline in order to calculate the energy transferred between the system and the environment.

COMM 1. I can identify momentum of an object as the product of mass and velocity and relate the change in momentum (Impulse) to the force acting on it over a period of time.

COMM 2. I can analyze the momentum of a system of objects in one dimension and distinguish between elastic and inelastic collisions.

COMM 3. I can solve problems using conservation of momentum where the net external force is zero.

Spring Semester Standards

ES 1. I can identify the charge on each sub-atomic particle and describe the behavior that each has on each other and how these particles move in a conductor.

ES 2. I can apply the principle of conservation of charge (charge is neither created nor destroyed just transferred from one object to another) to predict the movement of charges in insulators and conductors.

ES 3. I can predict attraction and repulsion between charged and neutral objects and predict how charges will redistribute based on charging by contact and induction.

ES 4. I can apply Coulomb’s Law to two charged particles.

ES 5. I can describe an electric field and identify the electric field diagrams for a one or two charge system and identify the direction of the force experienced by a charge in an electric field.

ES Lab 01. I can predict the charge on a neutral object knowing the process by which it was charged.

ES Lab 02. I can demonstrate how to put a charge on a conductor using the processes of conduction and induction.

CIR 1. I can recognize and analyze series and parallel circuits.

CIR 2. I can apply conservation of energy within a circuit (Loop Rule) and conservation of charge within a circuit (Junction Rule).

CIR 3. I can calculate equivalent resistance and apply Ohm’s Law.

CIR 4. I can calculate the power used by an electronic device.

CIR Lab 1: I can measure voltage and current with an appropriate meter.

CIR Lab 2: I can draw a circuit diagram and build it correctly based on a description.

CIR Lab 3: I can draw a circuit diagram for a circuit based on bulb brightness and observations of the circuit.

EM 1. I can recognize and explain what causes magnetic fields.

EM 2. I can identify the direction of magnetic fields.

EM 3. I can distinguish between magnetic fields and electric fields.

EM Lab 1. I can understand the relationship between magnetic and electric fields.

EM Lab 2. I can recognize that an object must be charged and moving in a magnetic field in order to experience a magnetic force.

WA 1. I can identify the following features of a wave: amplitude, wavelength, frequency, crest, trough, node, antinode, and period.

WA 2. I can identify, compare, and contrast the two types of waves and how they transfer energy.

WA 3. I can apply the principle of superposition to explain constructive and destructive interference of waves.

WA 4. I can conceptually and mathematically describe reflection and refraction of waves.

WA 5. I can conceptually and mathematically demonstrate the relationship between velocity, frequency, and wavelength for a wave, and how the wave medium affects these variables.


The Danger of Misapplying Powerful Tools

When I was a software engineer, I frequently used powerful tools such as C++ and techniques such as object-oriented analysis and design to implement software that performed complex operations in an efficient and effective manner. I also spent a lot of time sharing these with others. However, I learned to provide a caveat: if misapplied, these tools and techniques can result in a much more significant problem than would result when applying less powerful ones. That is, if you are not skilled in the deployment of these tools and techniques, the risk is much larger than the benefit.

Other engineers didn’t always appreciate this caveat. So, I would try to communicate with an analogy. You can build a desk with a saw, hammer, screwdriver, and drill. You can build a desk more efficiently using a table saw, drill press, and nail gun. If you make a mistake with the hammer, you may lose a fingernail. If you make a mistake with the table saw, you may lose a finger. If you are not adept at deploying the tools and techniques, maybe you should stick with the hand tools until you are.

In reality, the risk of misapplying these tools and techniques is more significant than the impact on the immediate project. The broader risk is that others who observe the troubled project associate the failure with the tools and techniques instead of with their application. People get the impression, and share it, that “C++ and object-oriented analysis and design are a load of crap. Did you see what happened to project X?” Rarely do people, especially people not skilled with these tools and techniques, conclude that the problem was the application of the tools and techniques rather than the tools and techniques themselves. This is, in fact, a much more serious risk: it threatens future, proficient applications of the tools and techniques because of their now-tarnished reputation.

A series of articles and posts recently reminded me of my experience writing software and this analogy. I feel compelled to start with a disclaimer since this post has the potential to come across as arrogant, which is certainly not my intention. I have not performed any longitudinal studies that support my conclusions. My conclusions are based on a few observations and my gut instinct. I tend to trust my gut instinct since it has served me well in the past. So, if you find this post arrogant, before you write me off, see if these ideas resonate with your experience.

SBAR

Let’s start with Standards-Based Reporting and Assessment (SBAR) (a.k.a., Standards-Based Grading (SBG)). Last year, my school started adopting SBAR school-wide. SBAR is a powerful methodology that requires proficient deployment. It is not easy to adapt SBAR to a classroom in a way that resonates with parents, students, teachers, and administrators. Proper deployment requires a fundamental change in the teacher’s and students’ philosophy of learning. While the effect of a failed deployment on the individual classes is unfortunate, the larger problem is that teachers and parents attribute the problems to SBAR and not its application. It takes much less effort to convince a parent confused about SBAR of its value than it does to convince a parent livid about SBAR due to a poor experience in another class. At my school, one early SBAR adopter stopped referencing SBAR or SBG at all in his class to distance his methodology from the problematic applications. Fortunately, my school has pulled back a bit this year. This is the risk of mandating application of a powerful tool by those not proficient in its deployment. This is not a unique experience.

Two years ago, another teacher and I decided to apply SBAR to our Honors Physics class. We mitigated the risk by limiting deployment to six sections of a single class taught just by the two of us. We sent letters to parents, talked to parent groups, and discussed the system with students during class. Only after gaining a year of experience did we attempt to adapt SBAR to our General Physics class, which contained ten sections and was taught by four different teachers. The risk of trying to deploy SBAR on that scale initially was too great given our proficiency.

Technology

Someone recently shared this New York Times article that questions the value of technology in the classroom. In general, a given piece of technology isn’t inherently effective or ineffective on its own. Whether technology is effective depends as much on its application as on the technology itself. It depends on the teacher and the students and the class. Personally, I’ll stick with my $2 interactive whiteboards. This isn’t because SMART Boards are inherently ineffective. It is because they aren’t effective for me and my students given my classroom and my expertise. I expect there are teachers out there who use SMART Boards quite effectively. They are probably sick of hearing how they are a complete waste of money.

I hope to have a class set of iPads at some point this year. My school isn’t going to buy iPads for every student. Instead, we’ll put iPads in the hands of 25 General Physics students in my classroom and see what we can do together. Start small, reflect, adjust, expand.

Modeling

I participated in a Modeling Instruction Physics workshop in the summer of 2008. I didn’t dare to really start modeling in my classroom until last fall. Why? I believed that the potential risk to my students from a misapplication of the modeling methodology was tremendous. I decided that it was better for my students to learn what they could via more traditional instruction than to face what I foresaw as a potential disaster if I botched the deployment of modeling. Even more importantly, I was concerned that I could put Modeling Instruction at risk of never being adopted if my failed deployment was interpreted as a failure of Modeling Instruction itself. Only after more research, practice of Modeling Instruction techniques, and discussions with others did I feel comfortable deploying Modeling in my class last fall. In an attempt to shield modeling from my potential deployment failures, this is the first year that I’ve associated the label “Modeling Instruction” with my class.

I used to be surprised at how adamantly some Modelers warned teachers not to do Modeling Instruction unless they had taken a workshop. I now believe they are worried about the same potential risk that I am. Modeling Instruction is a collection of powerful tools and techniques. Done well, by a skilled practitioner, Modeling Instruction can be incredibly effective. Applied ineffectively, Modeling Instruction can be a disaster and tarnish its reputation. I think students are better served by traditional instruction than by Modeling Instruction applied ineffectively. Traditional instruction may result in a lost fingernail. Ineffective Modeling Instruction may result in a lost finger. There, I said it. Disagree in the comments. Just don’t take that quote out of context.

While not directly related to modeling, I believe this recent article supports my conclusions. The problem isn’t that hands-on labs are ineffective; it is that ineffective deployment of hands-on labs is ineffective.

Conclusion

I don’t want the thoughts I’ve shared here to paralyze you into inaction. Rather, I hope that I’ve encouraged you to make sure that you have sufficient expertise to apply your powerful tools and techniques effectively. Your students will benefit, and the reputation of these powerful tools and techniques will benefit as well.

How do you do this?

  • Attend professional development opportunities (e.g., Modeling Instruction Workshops) that increase your skill with these powerful tools and techniques.
  • Apply these powerful tools and techniques in a limited manner as you gain experience and expertise.
  • Participate on Twitter, start a blog, read a bunch of blogs, participate in online discussions (e.g., Global Physics Department), and subscribe to email lists to accelerate your knowledge of these powerful tools and techniques.
  • Observe skilled practitioners of these tools and techniques, find a coach to observe you, welcome feedback from everyone.