Sunday, October 10, 2010

Accuracy versus Precision - Some Teacher Conceptions

When it comes to lab, measurement is the road to quantitative data. And when it comes to measurement, there is a wealth of issues related to the reliability of the data our students collect: accuracy, precision, reproducibility, uncertainty, margin of error, calibration, percent error and significant digits. In the United States, with its alphabetical approach to high school science instruction (biology first, then chemistry, then physics), these issues are most commonly addressed in the first few weeks of chemistry class - the first quantitative science class that students encounter. It seems to me that much less attention is given to the topic in physics - the second quantitative science class.


The Question in Question
Question #21 on our most recent Honors Chemistry test has stolen a good deal of my mind space during this past week. If I had to guess which question on the test would provide fuel for a blog post, I certainly would not have picked this one. In fact, I would have bet that a question on a chemistry test would never prompt a post in a blog about physics lab work. But for certain, question #21 is a lab question, driving to the heart of the concepts associated with reliability of measurement. Here is question #21 from the test:
Three different measuring tools were used to measure the volume of the same sample of water. The resulting measurements are shown:
Tool A: 120 mL
Tool B: 123 mL
Tool C: 123.4 mL
What conclusion can one make about the three measuring tools?
a. The three tools have varying degrees of precision.
b. Tool A is not as accurate as tool B and tool C.
c. Tool C is the most accurate tool of the three.
d. None of the tools are accurate.

Before you read on in search of my answer, give the question some thought and make a commitment to either an answer or to the verdict that the question is a terrible question. As you think through the question, give attention to your thought process. What are your internal conceptions of accuracy and precision? What images do these terms conjure up in your mind?


Teacher Conceptions of Accuracy and Precision
What captivated me about this question was that so few of us seemed to agree on the answer - or on whether there is any answer at all. I first recognized the lack of agreement about the terms accuracy and precision when I checked my test key against a colleague's. We disagreed about the answer to question #21. Intrigued (and concerned), I asked two of my respected colleagues what their answer would be. Once more, there was disagreement. The count was 2:2, and my intrigue over the question and over the conceptions that we hold regarding accuracy and precision grew. And so I began an informal survey of several colleagues (approximately 13 others) in our science department. I presented them with question #21 and asked them how they would answer it. I found that my science teaching colleagues fell into three categories.

The first category is Category C - those who answered C with very little reservation. As they thought out loud about the question, they commented that the answer is definitely not A. Their conception of precision was that an instrument is precise if the measurements it takes are reproducible. Since only one measurement was made with each tool, they saw no way to evaluate the precision of the three instruments; multiple measurements with the same tool would be required. For Category C teachers, precision had to do with reproducibility, and without multiple measurements from the same instrument there is no way to judge an instrument's precision. Yet because Tool C was able to make measurements with a higher number of significant digits, it was the tool that could make the most accurate measurement.

The second category is Category N - those who responded by saying there is no answer to the question. A couple of teachers within this category responded loudly with comments like "That's a terrible question!" and "Where did you get this question?" Like Category C teachers, Category N teachers were unable to make a decision regarding the precision of the tools. Once more, their conception of precision had to do with reproducibility, and since only one measurement was made with each tool, the tools' precision could not be compared. But Category N teachers had an additional problem with evaluating the accuracy of the tools, since "the real volume" or "the true reading" or "the actual amount" was not stated in the question. For Category N teachers, their conception of accuracy had to do with the proximity of a measurement to the true, real, or actual value. When I probed a bit about what was meant by true, real, or actual, I eventually was able to elicit the phrase accepted value from these teachers. Because of the lack of information required to evaluate both the precision and the accuracy of the tools, these teachers regarded the question as a bad question that had no answer.

The third category is Category A - those who answered that A was the correct response. Like Category N teachers, Category A teachers believed that knowledge of the accepted volume of the sample was needed in order to judge the accuracy of a tool. As such, these teachers ruled out choices B, C and D as possible answers. But these teachers quickly gravitated towards answer A. Their conception of precision had nothing to do with reproducibility; it had to do with the exactness of the tool. The number of divisions present on the tool was the critical feature that marked a tool as being precise. The more divisions present on the tool, the more precise the measurement. For Category A teachers, a tool was precise if it was able to make a measurement to a greater number of significant digits.

Three of the Category A teachers had difficulty comparing the precision of tool A and tool B. For these teachers, it seemed that both tool A and tool B could make measurements of the volume to three significant digits. They admitted that their understanding of significant digits was stale and that they were unsure of the rule regarding the significance of trailing zeroes in a measurement. Once the rule was clarified (trailing zeroes are considered significant only if a decimal point is present in the number), these teachers decisively chose A as the answer. All three teachers were physics teachers, and significant digits are given significantly less attention (if any at all) by physics teachers in our department.
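
A quick worked application of that rule to the three readings in question #21 may help: the trailing zero in 120 mL is not followed by a decimal point, so Tool A's reading carries only two significant digits; 123 mL carries three; and 123.4 mL carries four. Read through the lens of the exactness conception of precision, each tool is more precise than the one before it - which is exactly why, once the rule was recalled, answer A seemed so compelling to these teachers.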


The Dart Board Analogy
The most interesting aspect of my dialogue with the 16 science teachers was the frequency with which they referenced a particular analogy as they approached the question. In our post-survey discussions, every one of these teachers discussed the dart board or bull's-eye analogy. The dart board or bull's-eye analogy is commonly found in chemistry textbooks when discussing the distinction between accuracy and precision. It seems to be the common modus operandi of both textbook authors and classroom teachers in explaining how accuracy and precision are different.


Of the seven chemistry teachers that I surveyed, all but one explicitly referenced the dart board analogy. And each one that did was either a Category C or a Category N teacher. Ingrained in their minds was the conception of precision as being equivalent to reproducibility. Only one chemistry teacher was a Category A teacher; when reading the question, this teacher did not conjure up a picture of a dart board, but rather conjured up images of volumetric measuring tools with varying numbers of divisions.


Six of the eight physics teachers that I surveyed were Category A teachers. These six physics teachers did not equate precision with reproducibility. In our post-survey discussion, each of these Category A teachers mentioned the dart board analogy but did not feel that the analogy was significant to the question. One of the physics teachers was a Category C teacher; he quickly referenced the dart board analogy as he reasoned through to his answer. In fact, he commented that the thing he remembers most about chemistry class was the dart board analogy. The other non-Category A physics teacher explicitly defined accuracy (proximity to a target value) and precision (reproducibility) in a manner consistent with the dart board analogy. This teacher was a Category N teacher.


Is the Dart Board Analogy On Target?
My intrigue over this question was intensified by the realization of the power of an analogy on teachers' thought processes. The analogy dominated the thinking of nearly every chemistry teacher as they approached this question. As one colleague put it, "Every chemistry textbook defines precision as reproducibility using the dart board analogy."

So what is this dart board analogy? And why is it so popular? Perhaps the most concise and representative presentation of the dart board or bull's-eye analogy can be found at http://celebrating200years.noaa.gov/magazine/tct/tct_side1.html. If you are unfamiliar with the analogy, take some time to review it. As you read through it, ask yourself if you agree with it. Is the dart board analogy on target when it comes to the presentation of the concepts of accuracy and precision? Then come back next week to the Lab Blab and Other Gab blog as I take aim at the dart board analogy.






This article is contributed by Tom Henderson. Tom currently teaches Honors ChemPhys (Physics portion) and Honors Chemistry at Glenbrook South High School in Glenview, IL, where he has taught since 1989. Tom invites readers to return next week as he continues to gab about accuracy and precision. Tom plans to take aim at the dart board analogy and hopes to provide some accurate and precise discussion about topics of measurement.

Saturday, September 25, 2010

The Back of the Room

At this time of the year (the first few weeks of school), I'm thinking a lot about the back of the room. Like most science teachers, I make it one of my main objectives during the first few weeks to set the stage for the remainder of the course with regard to the role of lab. Part of the stage-setting process is that I get a little preachy about the importance of the laboratory and its uniqueness within the curriculum.

On the first day of class, I tell my students that "Science class is different than other classes because the room is bigger." I pause long enough for them to show those looks of confusion and perplexity; then I repeat the statement. After repeating the statement, it is clear from their faces that they need and want an explanation. So I explain that the back wall of our classroom is not located behind the last row of seats like it is in their other classes. Unlike math class, history class and English class, the back wall of our science classroom is located a good thirty feet behind the last row of desks. Lab tables and lab equipment fill the extra 30 feet of classroom space. The activity that happens in that space is what makes science class different than other classes.

Then I explain that the room is bigger in science class because the subject of science is different than other subjects. Compared to their other courses, science is unique. It is this extra 30 feet in the back of the room that makes it unique. Borrowing a line from a colleague, I explain that the answers to our questions are found in the back of the room and not in the textbook. Every trip to the back of the room involves an effort to answer a question. The question must reign supreme in the mind of every student as they cross the threshold between where science is talked about and where science is done.

Herein lies the challenge: forming questions that lead students along a path of inquiry and result in a learning experience in the doing of science. For certain, not all questions are created equal. So what makes a good question? Here are some of my quick ponderings on the topic. I think that good questions share some of the following traits:
  1. Good questions are testable questions; the answers can be found within the lab environment. 
  2. Good questions are interesting questions; they engage students.
  3. Good questions are questions that are clear enough to guide (and replace) the procedure.
  4. Good questions are questions whose answers cannot be found in the textbook.
  5. Good questions emerge from students' own curiosities. 
This past week, we started what is perhaps my favorite back of the room experience. I call the lab Improving Your Image. While the lab is definitely challenging, it is also one of my students' favorite back of the room experiences. When I watch students do this lab, I feel like science is happening. The question and the purpose that are presented to students are:
Question: What is the mathematical relationship between the number of images formed by a combination of two plane mirrors and the angle between the mirrors? 
Purpose: To determine the mathematical equation that relates the number of images to the angle between two plane mirrors. 
As is my usual custom, I discussed the question and the purpose during the pre-lab session; students wrote the Title and Purpose into their lab notebooks as I quickly made last-minute preparations. To explore the question, I had purchased several inexpensive 1-foot square mirrors from a department store and taped sets of two mirrors together at their edges using duct tape. I showed students the equipment and demonstrated how the angle can be adjusted. I also showed students a protractor that was available at each lab station. I then asked the class, "What will the procedure involve?" The question and purpose are clear enough to provide the answer, and invariably a student is glad to volunteer the procedure. I then asked the class, "What data will you collect?" Once more, the question and purpose are clear enough to provide the answer. Several hands lifted as students eagerly participated in the pre-lab. In effect, the question that is posed at the beginning of the lab guides and even replaces the procedure. For the remainder of the lab, the question (and not a step-by-step procedure) remains at the forefront of the students' minds.

Curiosity was piqued during the pre-lab session as I asked the class, "Have any of you ever done this procedure in your bathroom?" This question got some odd looks, but the odd looks quickly subsided as several students responded with audible "Oh yeahs." I asked one responder to describe how she had done this procedure in her bathroom. She explained that her bathroom has a mirror on the door of the medicine cabinet which opens up towards a second mirror on the wall. She can open and close the medicine cabinet door and adjust the angle that it makes with the wall mirror. If she places her face between the mirrors, she begins to see a varying number of images as the angle varies. The other responders grinned in agreement. Excitement built as more students recognized that they had done the same thing. I dismissed the students to the back of the room to begin investigating the question.

Students began adjusting the angle between the mirrors and counting the number of images that they could see. As I entered the back of the room, I heard a chorus of wow, cool, and ewww. Interest heightened as students began to manipulate the mirror angle and observe the multitude of images. As the angle grew narrower, the number of images increased. The photo below illustrates the wowness of the lab. Seeing the multiple images of a single object as you scan the 360-degree panorama is a "this rocks" experience for students.

Septuplets


Students adjust the mirror angle and count the number of images. They repeat the count for a variety of angles, collecting sufficient data to allow them to answer the question - to determine the mathematical equation relating the number of images to the mirror angle. In years when I wish the challenge to be easier, I suggest angles of 180, 120, 90, 60, 45, 40, 30 and 20 degrees. In other years, I allow them to choose whatever angles they wish. I almost always follow up the activity on the following class day with a Java simulation that models the formation and location of images for varying angles. (View applet exercise.)

This past week, I entered the lab a few minutes after startup. As I approached one lab table, I overheard three students talking like I've never heard them talk before. They were fully engaged in the question. A sense of enthusiasm could be heard in their voices as they adjusted the mirrors, counted the images, and pondered the question. I tried to keep my presence unknown as I eavesdropped on their conversation. They were discussing how the mirrors divided up the space surrounding the apex into sections and how an image was present in each section (except for the section the object was in). They were quite animated in their discussion; they used their arms to form angles and began to point out the image locations. One student began sketching in her lab notebook to illustrate the point she was trying to communicate. As their hypothesis developed, they changed the angle and recounted the images in an effort to test it. I knew they were doing science. And I knew they were close to the answer when I heard them talk about dividing 360 degrees by the angle. I quickly left for fear that I'd be invited to help answer their question. They were making great progress and enjoying each minute of it; it was time to scram.

A few minutes later I approached another lab table where one of my students was content to work by himself on the problem. Needing a sounding board, he stopped me and remarked, "Mr. H, I've been thinking about this pretty hard ever since you showed us the question." I love ponderers. And I love to hear the phrase I've been thinking. He talked through his hypothesis about what the equation was and then paused for confirmation from me. Looking at the data table in his lab notebook, I asked him, "What does the data say?" With a grin as wide as the classroom, he said, "The data and the equation fit perfectly." Case closed!
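
For readers who want to check their own bathroom-mirror counts against the pattern, here is a minimal sketch in Python. It assumes the standard plane-mirror result - the number of images equals 360 divided by the mirror angle, minus one, when the angle divides evenly into 360 degrees - which squares with the students' talk of dividing 360 degrees by the angle and with the seven images ("Septuplets") that the formula predicts at 45 degrees. It is offered as a checking aid, not as my students' write-up.

    # Predicted image counts for two plane mirrors at angle theta (degrees),
    # assuming the standard result N = 360/theta - 1, which holds when
    # theta divides evenly into 360.
    angles = [180, 120, 90, 60, 45, 40, 30, 20]   # the suggested angle set
    for theta in angles:
        images = 360 // theta - 1
        print(theta, "degrees ->", images, "images")

Running the sketch reproduces the qualitative trend the students observed: as the angle grows narrower, the number of images increases.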

I've been reflecting this past week on why I like this lab so much. I believe the answer is that this lab, perhaps more than any of my other labs, demonstrates the power of a good question. If labs are to be done with purpose (and not with procedure), then the question must reign supreme in every student's mind as they enter the back of the room. This lab demonstrates nearly every trait of a good question. It is testable and answerable. It clearly engages students. It is clear enough to guide the procedure. It does not lead to a verification experience in which an answer found in a textbook is verified in the back of the room. Few textbooks (if any) ever discuss the topic; this answer must be found in the back of the room.

For me, two of the greatest challenges to making the back of the room experience scientifically authentic have to do with good questions. There are certainly other challenges, but I immediately think of the challenge of ...
  • ... forming questions that are strong enough to guide students along a path of scientific inquiry towards an answer.
  • ... cultivating an attitude among and developing the ability of students to form questions that are testable within a laboratory environment.
Near the top of this post, I mentioned five traits of good questions. The one trait that is not exhibited by this Improving Your Image lab is the last one: the question does not emerge from the students' own curiosity. The question certainly conjures up curiosity, yet I was the one who presented it. This year I will be making efforts to improve students' abilities to form good questions - the kind that can be answered in the lab environment. Once students gain confidence that answers can be found in the back of the room, then it becomes their turn to ask the question. Once students begin asking the question, inquiry will be raised to its highest level. (See previous post on Using Levels of Inquiry in the Classroom.)

Cultivating this attitude of curiosity and developing this student ability to ask questions will be on my radar screen for the rest of the year. Stop back next month as I report on my first effort at improving student ability to ask a good question.


Sunday, September 5, 2010

Emotions, Pride and Lab Journals

For the past few summers, I spent the week after school ended on a week-long bicycle trip. Although it was stressful to practically “race” out the door at the end of the school year, the trip benefitted me by serving as a mental re-set button. After a few days on the bike, my mind seemed to accept that I was no longer in school-mode. This June, I biked with my uncle, an established researcher in molecular biology and an accomplished biker as well. The first day was the hardest. Think hills—lots of hills—and pouring rain. We arrived at the end of the day to find that our luggage had had a similar experience—it was drenched from the downpours, too. My uncle pulled dripping wet clothes, papers, electronics, and the like from his bag. He was most upset that his journal got soaked. I responded with horror, too, asking: “Did you lose experiments?” The most interesting thing to me was his response. He replied that it had been a blank journal, and that he had intended to come up with some experiments on the trip. I realized that if all that emotion is connected to a BLANK lab journal, there must be a whole lot of emotion connected with a full one! It made me reflect on a few specific students this year.

On the first day of the school year, I gave each student a blank notebook that I purchased out-of-pocket for them. I introduced the course briefly and then sent students to the back of the room to try their hands at their first lab. Alayna, however, came straight to me. She wanted to know if she could switch the color of her notebook. You see, she had already color-coded her classes, and physics simply had to be the color green. Although it had only been in her possession for five minutes, Alayna was already thinking of the journal as hers. Naturally, we made the switch to green. Alayna made the journal hers in other ways. I’m linking here to an image of the first page of the table of contents Alayna chose to create at the beginning of her journal to give you a sense of the effort she put into making the journal of value to herself. Alayna took pride in her journal throughout the year. As one of the only seniors in a predominantly junior course, Alayna left school about a month early to complete a senior project. Alayna was a deeply involved student, so I’m sure leaving high school was poignant for her. She promised to stop by before she graduated, and she told me that she’d bring me a present when she did. When I saw Alayna in June for the promised visit, she gave me my present. She presented me with her lab journal so that I could keep it and find ways to use it to help future students.

Kara is synonymous with cheerfulness to me. There was, however, one incident that caused Kara great distress. I allow my students to use their journals during tests and quizzes. As a co-worker said, students’ journals are “like Google for physics class.” The use of journals on tests and quizzes adds to students’ desires to make a useful journal, and it encourages me as a teacher to think about how to move past fact-based, recall-only questions on tests. There is one exception to my policy of allowing journals during tests. Students who are absent on test day are not allowed to use their journal on the make-up test. I provide an equation sheet to use with the make-up test instead. Kara missed a test day because she was sick, and she had to make up the test without the use of her lab journal. Although generally more chipper than a Disney hero, Kara was always sullen upon recalling this particular test. The test wasn’t out of the norm in terms of her test averages. In fact, it didn’t even impact her grade (I checked to see if there was any difference in her quarter grade had she been simply excused from the test—there wasn’t). Regardless, taking a test without her lab journal ranked as a truly depressing memory for her.

For both Kara and Alayna, the journals were a large part of their physics experience. Both students spent a lot of time making the journal theirs and of personal value, and as a result both students had a lot of pride and emotions attached to their journals. Although traditional assessments do have their place, journal work often provides something different for students. Here are just a few reasons why I believe students often take more pride in lab journals than tests and quizzes:
  • More focus on inquiry. As a teacher, having the kids keep a lab journal forces me to consider whether my labs are the most appropriate for journaling. A lab where kids confirm things they already know or simply enter numbers into a bunch of blank boxes often makes me stop and think: “how could I add more inquiry to this lab?”
  • Novelty. I was told during a workshop this summer that the average student takes over 1600 tests and quizzes between grades one and twelve. Why should my test be anything special to them? A journal, however, is different—I guarantee you no student has written over 1600 journals!
  • Personal choice and creativity. I was reading the book Drive by Daniel Pink and was reminded how important autonomy and choice are as motivating factors. Journals are a tool that, when used well, allows students to determine how to best accomplish their lab goals. This choice fosters creativity, investment, and motivation.
  • Authenticity. Real scientists, like my uncle, use lab journals. I didn’t ask my uncle the last time he bubbled in answers to a multiple choice test, but I’m guessing it was a long time ago.
  • Evidence of progression. A test or a quiz is a snapshot at one instant in time. Often, the only data that is recorded in a teacher’s gradebook is a percentage. Here’s a challenge for any teacher. Look at your gradebook from last year. Let’s say you find a student who got an 80% on a quiz. Can you tell me anything else about that student’s comprehension? Did that student understand 80% of the material perfectly and know nothing about the other 20%, or did the student understand all of the material somewhat but just not completely? There is not enough data to determine the answer from just the percentage score. Look back at a student’s lab journal and you will have a much better picture of the student’s progress and understanding.
I am not suggesting that lab journals are the answer to everything, or that I have mastered their use. Rather, I hope you will reflect along with me about some of the things lab journals do that traditional assessments do not, and vice versa. I invite you to share your insights in the comments section of this post. It will be like a mini “online journal.” Speaking of, that’s my topic for next time.


Until then, 
Debbie 


____________________________________



This month's article is contributed by Debbie Berlin. Debbie is a graduate of Northwestern University in Evanston, Illinois. She has been a high school physics teacher for 12 years. Debbie currently teaches Regular Physics and Honors Physics at Glenbrook South High School in Glenview, IL, where she has taught since 2004.  

Saturday, August 21, 2010

June, July and August

Summer time is a wonderful time of the year. Where I live, summer time means sunshine, heat and humidity. It also means a chance to complete a project or two that was ignored during the busy school year.  This summer's project was the pond project.

The pond project was precipitated (no pun intended) by the loss of two trees during the previous fall. One tree was a 60-foot tall, aging ash that swayed in the wind with such an amplitude that we expected it to fall at any time. The other tree was a 30-foot tall American elm - the last of the original 13 that were present in our yard when we bought the home. The other 12 elms were overcome by Dutch elm disease; we expected the same fate for this elm. Removing these two trees from our landscape left large holes, both in the ground and in the visual field between our home and our neighbors. With the landscape cleared of the two trees, we were confronted with the obvious need to impart some freshness into a yard which had for far too long been dominated by these two old, decrepit-looking beasts.

Part of the solution to the need to impart freshness into our landscape was the undertaking of the pond project. The plan was to build a small pond (6'x9') and waterfall in our back yard. After researching the topic, I became convinced that it was an easy enough project to qualify as a do-it-yourself project. For certain, it would be both a stretch and a challenge, but nonetheless a do-able task. Like many of these do-it-yourself projects, the pond project would help keep my spirit young (and make my body older) while I devoted myself to learning a new set of skills and a new way of thinking. Having never built a pond before, the project was clearly a venture outside of my safety zone. While intimidating at times, venturing outside of your personal safety zone can be a really good thing.

You have likely heard the saying: There are three good reasons to go into teaching: June, July, and August. I love teaching. And fortunately I have more than three reasons to have entered into this honorable profession. But I, like my students, do agree that June, July and August are wonderful months.  These three months offer us time to relax, to regroup, to spend more quality time with family, to travel, and to pursue interests that were limited during the previous nine months. There are no papers to grade, no lessons to prepare, and no tardy forms to fill out. There is no equipment to fix, to clean or to put away. June, July and August offers teachers (and students) a time to get away - to get away from the day-to-day grind of school.

But June, July and August is also a time to retool, to rethink and to recharge. It's a time to reflect on what went well and what can be improved. It's a time to improve. It's a time to build. It's a time to prepare, to get ready, and to get started. As a colleague of mine often says, "June, July and August makes it possible to hit the ground running once that first day of school comes."

June, July and August gives us an opportunity to inspect the landscape and to make changes. It's a time to consider ridding the landscape of those aging beasts that have been around for far too long and to inject a little freshness into our practices. In our profession of teaching, we all need a pond project. Once a year, we need to take the time to consider a change. We need a chance to learn a new set of skills and a new way of thinking about our instruction, our assessment, our curriculum, and our other practices. We need a chance to venture outside of our safety zone.

June, July and August offers teachers an invitation to try something new and fresh. It provides us with the gift of time to ponder the possibility of adopting a challenge. While intimidating at times, changes in our practices and our way of thinking can be a good thing. The words of my department chair resonate in my mind:
"There are some teachers who have taught their first year 20 times; there are other teachers who have 20 years of experience."
Sometimes the landscape of our teaching practices needs some change, some freshness, and some improvement. It gets old, decrepit and at times diseased. If we're not careful, some deadwood may come crashing down during a storm. June, July and August is a wonderful time.

I hope you have had a good summer. And as the new school year approaches, I hope you are considering your landscape. I hope that this year adds one more year to your teaching experience. And I hope that this school year affords you a chance to do some traveling ... outside of your safety zone.

Have a great school year!


________________________________________



This article is contributed by Tom Henderson. Tom is the author of The Physics Classroom website.  He is a graduate of the University of Illinois in Champaign-Urbana, Illinois. He has been a high school physics teacher since 1989. Tom currently teaches Honors ChemPhys (Physics portion) and Honors Chemistry at Glenbrook South High School in Glenview, IL, where he has taught since 1989. 




Feel free to stop back by the Lab Blab and Other Gab blog as we explore a variety of topics this coming school year. Topics will range from the use of lab journals in the science classroom to ways of improving our approaches to scientific inquiry. And along the way, we may also investigate ways of improving the scientific literacy of our students.

Wednesday, June 16, 2010

It's (All) About Data

Lab: it's all about data. From beginning to end, the focus is in one manner or another on data. Scientists begin with a question that they hope to answer. And from then on, the focus is on data. Our physics students should participate in the same types of data-centered activities as scientists do. Here is a sampling of the data-centered activities which characterize most labs.
  • Deciding on what data to collect. 
  • Deciding how to gather the data. 
  • Collecting the data. 
  • Deciding how to record data. 
  • Recording the data. 
  • Determining what the data mean. 
  • Determining if the data mean anything at all. 
  • Evaluating the trustworthiness of the data. 
  • Comparing the data with the data of other experimenters. 
  • Comparing data with previous experiments. 
  • Graphing the data. 
  • Analyzing the data. 
  • Presenting the data in various forms.  
  • Deciding what forms would be best for presenting the data. 
  • Drawing conclusions based on the data. 
  • Collecting better data. 
  • Deciding what new data to collect. 
  • Deciding on better methods of collecting the data. 
  • Using the data as evidence. 
  • Referring to the data in support of conclusions.

Clearly labs are data-centered activities - activities in which a wealth of decisions about data must be made in order to pave a logical trail from the question to the answer.

It is at this point that science is distinctly different than the other disciplines which our students study. In many of the other disciplines, opinions prevail. In science, data prevails (or at least should prevail). As we all know, there is not a lot of room in science for opinion. Scientists follow the data toward logical conclusions and make every effort to build models which both assimilate the data and explain the data. In science classes, our students should be doing the same types of activities. They should be given an abundance of opportunities to collect, analyze, evaluate, and draw conclusions from data. When it comes to labs, it's all about data.

An article titled Using Levels of Inquiry in the Science Classroom, written by Jeff Rylander, was posted previously on this blog. In his article, Jeff provided a framework for thinking about scientific inquiry lab activities.  The framework centered around the division of a lab task into the formulation of a testable question, the development of a method or procedure for answering the question, and the formulation of a solution or answer to the question.  In the framework presented by Jeff, higher levels of inquiry are characterized by activities in which students have a greater degree of control in the various stages of the lab task.  (Jeff's ideas regarding levels of inquiry originated from an article published in The Science Teacher by Michael E. Fay and Stacey Lowery Bretz. The article is available online at NSTA's Science Store.)

When I think about the centrality of data in a physics lab, I think about Jeff's article. Particularly, I think about the locus of control in the various data-oriented decisions which must be made during the course of a lab.  As an instructor, I need to be thinking about what data-related decisions students will be left to make. I need to think about whether I make the decision about what data they collect or whether they decide on what data must be collected.  I need to be thinking about whether I decide on the methods by which students will collect the data or if I will leave that decision to them.  I need to be thinking about whether I decide on how students will organize and present the data or if I will leave them to decide on this matter.  Consistent with Jeff's framework for levels of inquiry, the more the locus of control is shifted from the instructor to the student, the higher the level of inquiry which the lab will assume. And one way to think about scientific inquiry is to think about how many of the data-oriented decisions are being made by the instructor and how many are being made by the student.

As I write this article, I am one day into summer vacation.  School ended yesterday. The summer months allow time to relax and to rest (and to catch up on the honey-do lists), but also time to reflect, rethink and retool. For me, much of my summer reflections will be focused on how I can improve the scientific inquiry skills of my students.  And for starters, I will be thinking about the types of revisions which I can make to the lab program in order to foster improved inquiry skills.  I will be looking at the labs which my students do through the lens of data.  I will be pondering each lab which my students do and asking:  When it comes to data-oriented decisions, where is the locus of control for this particular lab activity? The more that the control lies upon the student side of the equation, the higher the level of scientific inquiry.


Tuesday, June 1, 2010

In Praise of Projects

In a previous blog post, Debbie Berlin wrote about the end-of-semester projects used in her Regular Physics classes at Glenbrook South High School. Inspired in part by the design and the success of that project, I decided to implement a similar project in one of my physics courses. The course is called ChemPhys - a two-year course in which students take chemistry and physics on alternating days, thus completing a full year of chemistry and a full year of physics over the course of the two years. The guidelines, expectations and scoring scheme for the project which I have implemented with my classes are described online at the course site (http://gbschemphys.com/chemphys/lab/project/project.html). In brief, I would describe the project as an open-ended experimental design project in which students attempt to identify and answer a testable question which is of interest to them. There are few constraints in terms of topic, as long as physics is a naturally intrinsic part of the topic. As I write, we are currently in the middle of the project, having completed a couple of days of planning and research and three days of experimental investigations. I am offering in this post a sort of mid-term reflection on the progress, joys and frustrations which I have witnessed and experienced.

The most striking observation which I have made thus far is the level of excitement which students have displayed. The project comes at the end of a two-year course and in May - when the northern weather begins to turn bright and sunny and when students have begun counting down the fewer than 20 school days remaining. This is not exactly a prime time for displays of academic enthusiasm. Yet there have been several times when I have had to cheerfully pinch myself after being placed in a state of shock by an observation or comment.

The first day of lab research was originally scheduled to begin on Thursday, May 6. For a variety of reasons, I had decided to postpone the start of research until Monday, May 10. One of the students in my second section of the day entered the classroom looking very dejected and down. I asked her what was wrong. She explained how upset she was that the project had been postponed; she had heard from a friend in a previous section that it would not start until the following week. She commented on how much she had been looking forward to this day and how disappointing the news was. I was rather shocked that she was taking what seemed to me a very inconsequential decision as a very disappointing turn of events. I began to wonder: When was the last time that the postponement of an in-class activity caused such obvious disappointment on the part of one of my students? I could not think of such an instance in my 20-plus years of teaching. Clearly, this student possessed a sense of excitement and anticipation over the start of this research project.

On the third day of our lab research, I entered the room four minutes before class and was instantly struck by what I observed. Several students had arrived before me and were in the back of the room setting up. They were fetching their equipment from the various storage locations and preparing to start their research. As other students entered the room, they set their bags down and headed to the back of the room to do the same. There was no need for prompting or prodding. They knew what to do and didn't wait for an invitation to do it. With a minute remaining before the start of class, one student entered the room, looked around, looked at me, and asked with a sense of amazement, "Did the bell ring already?" Given the activity that she observed, she assumed that class must have already started and that she had arrived tardy. I assured her that she was on time, in fact one minute early, but that there was no need to wait to begin. Again, I began to wonder: When was the last time that my students entered the room and spontaneously began to do science without any prodding or prompting? Once more, I could not think of such an instance in my 20-plus years of teaching. Clearly, this research project increased the level of engagement of my students in the task of doing science.

A research project of this variety is an assessment. It is a means of assessing student capability and performance in a manner in which a traditional test cannot. My tests excel at determining what students know and understand about the concepts and principles of science. They are able to test students' ability to analyze complex physical situations and to solve physics word problems. They are capable of determining the proficiency of students at analyzing diagrams and graphs. But they are unable to test students' ability to conduct an experiment from beginning to end. One role which this research project plays is that it assesses students' ability to do science: to design an experiment, to determine what data to collect and how to collect it, to analyze the data and draw conclusions, and to determine an effective manner of presenting the data and results.

As an assessment, a research project of this variety teaches me much that I might not otherwise be aware of. For instance, it is clear to me that my students have great difficulty identifying a testable question. During the brainstorming stage of the project, many of my students were able to identify great questions; yet very few of the questions that students initially proposed were testable. Their questions reflected topics of interest and innate curiosity. But their questions also reflected tremendous confusion regarding what constitutes a testable question. Purpose statements such as "to determine what an unsafe speed is in a bobsled race" reveal great curiosity but not much understanding of what can and cannot be done via a high school physics lab experience. I quickly recognized that the development of a testable question was indeed a challenge for students. I couldn't help but think of a previous article in this blog by Jeff Rylander on Using Levels of Inquiry in the Classroom. In that article, Jeff discussed a hierarchy of four levels of student inquiry which were based upon the work of Fay and Bretz. My research project was challenging students at the highest level - the level in which the question, the procedure and the solution are all constructed by the student.

This research project was doing what any good assessment should do - providing the teacher an opportunity to assess student ability and know-how with the intent of affecting curriculum and instruction. Given the difficulties which I have observed, I am already making plans to incorporate more level 4 inquiry challenges into the course. My goal will be to improve students' ability to construct the question, the procedure and the solution.

My final observation pertains to the dramatic change in my role as teacher. This research project has truly put me in my rightful place as the guide on the side. My role is no longer the sage on the stage; rather, I am an advisor serving a role similar to that of a college professor guiding the research of a graduate student. I suggest things to think about, offer alternative procedures, and direct students to experts in a field or to exceptional resources. When the groups encounter difficulties, I become part of their brainstorming team; together we ponder ideas for circumventing the roadblock. I serve as a sounding board for those students who are thinking through what to study, how to study it, how to interpret data, etc.

In addition to being an advisor, I am also a lab equipment manager. With 16 different projects happening at once, I must make sure that every lab group has what they need when they need it. This means lots of planning and preparation. There's no chance for spur-of-the-moment, last-second planning and preparation. Trips to the local hardware store are common. These must be planned in advance of the start of the school day. I certainly can't run out to the hardware store in the middle of class.


I have been pondering the topic of engagement over this past semester. Exactly what is it that gets students engaged and invested in labwork? What is it that causes students to be excited about the back of the room? What gets students turned on about doing science? I must be honest that I don't know the answer. But one thing that I do know is that an open-ended challenge of this nature seems to maximize student interest and involvement. When students are responsible for the design and development of the entire investigation regarding a topic of their own choosing, the investment level increases. Students become true scientists, engaged in the tasks of scientific inquiry from the development of the question to the answering of the question. Meanwhile, I am able to observe their work and make judgments about what students can and cannot do as a result of their participation in my course.



This month's article is contributed by Tom Henderson. Tom is the author of The Physics Classroom website.  He is a graduate of the University of Illinois in Champaign-Urbana, Illinois. He has been a high school physics teacher since 1989. Tom currently teaches Honors ChemPhys (Physics portion) and Honors Chemistry at Glenbrook South High School in Glenview, IL, where he has taught since 1989.

Tom invites those teachers who are interested in learning more about his Scientific Investigations website to visit his course pages at http://gbschemphys.com/chemphys/lab/project/project.html.

Saturday, May 8, 2010

Other Gab: Why Do Demos?

Last year was the first year in 10 years that my teaching assignment involved teaching Chemistry. As a cross-over to Chemistry, there was much to re-learn, much to develop, much to get accustomed to. As they say, "You can't do it all," and I most certainly didn't. What was most lacking in my teaching of chemistry was a repertoire of effective demonstrations. When I received the same assignment of teaching two sections of chemistry during this school year, I made a pledge to myself to do a demonstration each day (at least each day on which there wasn't a lab experience). My thought was that my students should be doing chemistry and seeing chemistry on a daily basis. Every day should include chemistry; not just talk about chemistry or calculations about chemistry, but actual chemistry. Chemical reactions should happen. Chemical properties should come alive. After all, it's a science class and science should be happening.


There's no doubt about it! Students love chemistry demonstrations. And physics demonstrations. And any science demonstration. And I love them too! Who couldn't love a science demonstration? Science museums stay in business not because there are a bunch of people inside showing PowerPoint slides; and not because there is an opportunity to sit around tables solving stoichiometry or projectile problems; and not because there are booths where you can sit down and balance chemical equations or draw free-body diagrams. Rather, science museums stay in business because there are interesting things to look at, to watch, and to interact with. And when these things happen, people learn. And the learning that does happen is more closely tied to the content which is being learned; it is not abstracted, detached, or remote.


Students love demonstrations.  They enjoy them.  But do they learn from them? Now that's a tough question.  And an important one.  After all, my science class should be about more than just having fun.  It should be more than entertainment.  If my professional goals centered around providing fun and entertainment for others, then I should have sought to obtain a job at a museum.  Or at a zoo. Or in a circus. But my professional aspirations are centered around educating high schoolers and that naturally landed me a job in a high school teaching science.


In addition to a couple of lab experiences, there were three noteworthy demonstrations this week in my chemistry class. On Monday, we were talking about solubility and saturation of solutions. To demonstrate a supersaturated solution, a hunk (new measurement unit) of sodium acetate was placed in a beaker of 10 mL of water. It was heated and heated and eventually dissolved; there's a definite chemistry lesson in this. Then the solution of dissolved NaAc (as it is affectionately known) was poured into a clean buret; it cooled over the next 30 minutes as we discussed variables affecting solubility, solubility curves, unsaturated vs. saturated vs. supersaturated solutions, etc. Near the end of class we returned to the back of the room. The solution dripped from the buret onto a watch glass with a single crystal of NaAc. Students watched in amazement as the dissolved NaAc immediately crystallized, forming a tall column of undissolved solid. (View picture.) Entertaining? Definitely! Enjoyable? Clearly. Educational? Potentially. For certain, the entire cycle of dissolving the NaAc at high temperatures, allowing for supersaturation through cooling, and ultimately the crystallization of the NaAc as it dripped from the buret was the bridge which connected the content of the lesson to the real world of chemistry. Thanks to the demo, the content was no longer abstracted, detached, or remote; rather, it was alive and happening before their very eyes. Big bang. Big buck.


As a St. Patrick's Day demonstration this past Wednesday, some boric acid was dissolved in methanol, squirted on a lab table in the shape of a shamrock, and lit. For 20-30 seconds, a green flame in the shape of a shamrock emerged from the lab table. The green of the flame is characteristic of boron's emission spectrum. Students responded immediately: "Cool!" "Wow!" "OMG!" "Do it again!" After three more repetitions and about three more minutes, students were back in their seats cranking out molarity calculations. Little bang, little buck. This is a demo that's mostly entertainment. Definitely fun. (And unfortunately it did leave me with a lot of leftover boric acid and methanol.)


On a third day this week in chemistry class, I complemented a lesson on molarity and dilution with a demonstration in which two solutions were made by dissolving known amounts of copper chloride in water in 200 mL volumetric flasks. Lab techniques were demonstrated and students calculated the concentrations. One solution was five times the concentration of the other. We noted the color of the two solutions; that was a separate lesson in itself. Then I took out 40 mL of the more concentrated solution and placed it in a third 200 mL volumetric flask; we calculated the moles of copper chloride present in that 40 mL. Then I added water to this 40 mL to fill the flask to the 200 mL mark. As I added the water, I asked students how much copper chloride I was presently adding; they all agreed - none. When I finished diluting the solution, I asked students to calculate the concentration. We all agreed that the new concentration was the same as the concentration of the more dilute solution; we noted the color. I discussed the concept of dilution, dilution calculations and a dilution factor. I thought this was big bang. Attachment of content to the real world. The dilution concept from the textbook coming alive. A chance to discuss good lab technique. A chance to demonstrate the types of questions which a chemist asks. A chance to see chemistry happen. Entertaining? Not really. Enjoyable? I enjoyed it; my students would rather be watching green flames or seeing a column of NaAc grow tall. Educational? Potentially.
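
A quick worked check of the arithmetic behind that demo, using the standard dilution relationship M1V1 = M2V2 (a sketch with symbols rather than the class's actual copper chloride numbers, which aren't given here): taking 40 mL of a stock solution of concentration C and adding water up to the 200 mL mark gives a new concentration of C × (40 mL / 200 mL) = C/5. That factor-of-five dilution is precisely why the diluted sample ends up matching the other flask, whose concentration was one-fifth that of the stock.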


Why do demos? is the question I am pondering. Why skip demos? is a question which is easy to answer. They take time to prepare. (Planning, practicing and preparing the dilution demo took me close to 50 minutes. Preparing and practicing the NaAc demo took me a half hour.) They take time to clean up, and sometimes they are a pain to clean up. (The boric acid is mildly toxic; methanol causes blindness if ingested; and my St. Patrick's Day demo left me with an extra 200 mL of toxic mess to dispose of.) They cost money (equipment, materials, chemicals). They take class time to perform. Sometimes they don't work as intended and I get embarrassed (but a good humbling isn't always a bad thing). Sometimes it's difficult to think of a demo appropriate to the current topic. (The combustion of the methanol with the boric acid doesn't exactly have a solution theme.) And sometimes we believe that the class time spent showing students demonstrations could be put to better use by doing more educational activities, which usually means doing more drill-and-practice type work. (For certain, in the time it took me to mix two solutions and then dilute one to the concentration of the other, I could have done at least twice as many molarity and dilution problems.) So if Why skip demos? is a question which quickly evokes some valid answers (or at least some appealing answers), then why do demos at all?


As I was cleaning up my mess after class on Thursday, my demo-a-day commitment came under personal scrutiny. Was there any value to what I was doing? Was I getting any bang for my buck? Why was I exhausting myself with demos? I walked back to the office pondering these questions. When I arrived in my office, I observed Mrs. S at her desk. Mrs. S is a veteran in the trade of doing demonstrations. Mrs. S is a chemistry teacher in our department who has a reputation among her colleagues and her students for loving demonstrations. Mrs. S is definitely a demo-a-day teacher. And to her credit, she does her demonstrations with flare (no pun intended). As there should be, there is a sense of entertainment, even theatre, when Mrs. S does chemical demonstrations. This is not to say that there is no learning going on; it is simply to say that the learning occurs in an environment which captivates her lower-level chemistry students. Her students are hooked! What teacher wouldn't want that?


Hoping for some encouragement, I posed the Why do demos? question to Mrs. S. Without hesitation, she gave me what seemed to be 30 answers to the question. Here is her take on the question:
  • Demonstrations provide a sort of "visual cement" for a science course; they provide visual reinforcement of the content material.
  • Students will most likely remember the material which is demonstrated. When they reminisce about your class, they won't be thinking about worksheets, tests, or PowerPoint slides; they will be remembering your demonstrations.
  • Demonstrations bring the textbook material to life and provide relevant application of the content.
  • Demonstrations address the need to appeal to the varying learning modalities of students.  For many students, seeing is believing and seeing is learning.
  • Demonstrations provide an avenue for critical thinking, as they often naturally lead to the question "Why does this happen?"
  • Demonstrations pique student curiosity; students become more invested when their curiosity is piqued.
  • Demonstrations provide an interesting diversion amidst an otherwise drab lesson plan, providing students with an interesting chunk to chew.
  • Committing oneself to doing demonstrations encourages teachers to grow professionally as they learn new ways to present and reinforce content material.
  • Demonstrations are FUN - for both teachers and students.  They create an atmosphere of exciting inquiry within the classroom.


Shortly after finishing her list, Mrs. S bolted out of the office towards the prep room to prepare her upcoming rendition of a chemical demonstration. Out of the corner of my eye, I observed her tie-dye lab coat flash by me. I stopped her and asked if she would be interested in my leftover methanol-boric acid solution. As a veteran of the trade, she knew exactly what it was for. She gladly accepted my offer. And I was quite relieved to have pawned it off on her. Whew!


The rest of my lunch period and prep period were busy as I prepared for my afternoon physics sections. I had little time to continue my ponderings until 5 minutes into my first afternoon physics class, when the fire alarms sounded and the whole school was evacuated. This gave me a few more minutes of pondering the topic of Why do demos? As is usually the case, the fire alarm was a false call and we returned to our classes to finish the school day.


On my 30-minute trek home from school, my car ride thoughts returned to Why do demos? I began to think about all those former students who, when reminiscing about their experience in my course, would inevitably make a comment pertaining to demonstrations. I've never heard such a student make comments like "I remember that one PowerPoint presentation on inertia" or "Have you designed any more cool worksheets for your students to do" or "Do you still use that one PowerPoint presentation on ..." No! Never! This is not what my former students remember. The record is clear; they make comments like "I still remember that one demonstration when you made the aluminum rod make a sound by stroking it with your hand" or "Have you come up with any new demonstrations to show your students" or "Do you still do that demonstration when you shoot the falling monkey with your projectile launcher?" Students remember my demonstrations and your demonstrations for a reason. Demonstrations are meaningful; they stick in their heads. Demonstrations are visual; they can't forget them. Demonstrations are engaging; they hook kids' attention. Demonstrations provide the connection between the concepts we are talking about and the material world which those concepts seek to describe.


Demonstrations are probably the closest thing in our profession to edutainment. They are an engaging and (at times) entertaining means of educating our students. And they are an educational means of entertaining our students. When a professional teacher transparently embeds a demonstration within the lesson content, students become engaged in the lesson. Student investment in the lesson rises, and when the demonstration is presented using effective pedagogical strategies, higher rates of learning inevitably result. A lesson immersed in showing students the operations of the material world is a lesson students will remember. Now what teacher wouldn't want that?


I got an email from Mrs. S the next morning: "By the way, one more reason to do demonstrations: total school evacuation." Now I am really glad I pawned the methanol-boric acid solution off on Mrs. S.  Better her than me.

Saturday, March 20, 2010

How to Do a PLCU

[Editor's Note: The past several posts have focused in one way or another upon a common challenge confronting physics teachers.  The challenge involves getting students involved, engaged and invested in minds-on activity during hands-on laboratory activities. This post represents one teacher's efforts to salt the oats.]



The desire to make laboratory experiences meaningful to students caused me, about a decade ago, to rethink some of the methods by which those experiences were evaluated. Students' desire to merely complete a set of directions without incorporating the relationships they were investigating was so abhorrent, as was the copying that ensued, that I decided to make each discovery experience one of responsibility; the students were going to learn something and were expected to apply it.

The method by which this was done became known as the “Post-Lab Check Up” (plcu). Its intention was to make each student responsible for observing and collecting data and for discovering and applying relationships. Their observations and results were organized in some fashion (sheet, notebook…), which was then used while responding to the questions related to the investigation. The goal is not for the student to have a complete understanding of the concepts at hand, but to use previously-acquired skills in new situations and make some sense of them.

This may appear to be similar to a “lab quiz” format, but the intentions are different. The plcu is intended to keep the student engaged with the activities of his or her group, experiencing mutual discovery. Since I started doing this a little over 10 years ago, very infrequently does any student refrain from involvement - a surprising change from the days when one student would dominate while the others copied. The copying that occurs now is almost always followed by “what does this mean?” or “how did you find that?” - a far cry from the blatant plagiarism of the past.

Although improvement in attitude in the laboratory is valuable in itself, the Post-Lab Check Up serves a second purpose. It is well known that investigation followed by rehearsal is much more effective in making sense of the concepts at hand than investigation alone. Therefore I try to have the “check up” occur during the same class period. When this is not possible, it occurs at the beginning of the next period. While the delay may appear to be valuable since students could spend the intervening hours reviewing what they discovered, it is a rare event when a student spends any meaningful time reviewing unless there is an evaluative piece to complete such as data analysis, graphing…. In those situations, having the check up the next day is required.

The purpose of the post-lab check up is to provide a low-key, non-intimidating experience for students and to encourage them to exert a sincere effort during lab investigations. I find this is accomplished by combining a completion grade with the plcu grade. As a result, a low plcu score - 4 out of 10, for example - will result in a 70% lab score as long as the student has been involved in the process. In the past, this was a typical grade for labs in which the student participated and appeared to complete the lab but the data were poorly collected or evaluated. However, I do not use this for investigations that last several class periods and require a more formal write-up.
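One way to read that example - my own illustration of the weighting, not something spelled out above - is that the completion credit is worth the same 10 points as the check-up, and an involved student earns full completion credit:
$$\text{lab score} = \frac{\text{completion} + \text{plcu}}{10 + 10} = \frac{10 + 4}{20} = 70\%.$$
Other weightings could produce the same number; the point is simply that participation cushions a weak check-up score.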



This week's article is contributed by Dave Smith. Dave is a graduate of Wheaton College in Wheaton, Illinois. He has been a high school physics teacher for 24 years. Dave is currently teaching Regular Physics and Honors Physics at Glenbrook South High School in Glenview, IL, where he has served since 1997.


Dave is sharing three examples of his many Post-Lab Check Ups. The three examples are available for downloading and inspection.

If you are a science teacher and have something to share, consider contacting the Lab Blab blog coordinator.  Your ideas could soon appear here on the Lab Blab and Other Gab pages.

Sunday, March 14, 2010

Confessions of a Coattail Curber

I can't believe I said it.  They can't believe I said it.  Once more, I let careless words slip out of my mouth.  Like a dagger in the heart, they hit two of my favorite students with such force that their mouths dropped at a rate of 9-10 g's. How could I have been so insensitive? It took only five seconds; but in those five seconds I dashed Riley's and Ellen's hopes and dreams of another good lab. Seeing a group of six students crowded around a single table with only one set of equipment, I placed a molecular model kit on a separate table and said, "Riley and Ellen, let's have you come over here and work together."

Riley and Ellen's mouths opened wide. It was as though I had broken the news that their best friend had died. Their faces immediately saddened, their postures changed and they even but-Mr-H-ed me. I've been teaching long enough to know that one! "But Mr. H, ..." This but-Mr-H was noticeably different. There was a distinct trace of lowering intensity from the m-sound to the ch-sound, accompanied by a little quivering along the way. This was a clear sign that I had crossed the line and they were VERY upset. Time seemed to stand still as they looked at me, looked at each other, looked at Lonnie, picked up their lab notebooks and slowly walked to the other lab table to work alone on their lab. It was at that point that I realized the dark truth about myself. I am a coattail curber.

There is six months' worth of evidence showing that Riley and Ellen invariably gravitate to Lonnie's lab table. All overt efforts on my part to vary the groups and to provide different partners and different experiences for students in labs are ultimately foiled by this sort of gravitation between Lonnie and Riley and Ellen. The nature of this attraction seems to be related to Lonnie's ability to understand the lab environment and to Riley and Ellen's need to be partnered with a student who understands the purpose and is able to plot out a procedure, manipulate the equipment, collect the data, perform the calculations and reason towards a conclusion. As far as Riley and Ellen are concerned, their association with Lonnie is a match made in heaven. As for Lonnie, Riley and Ellen are two of the nicest students you could ever meet. A little bit of nice will go a long way towards providing a lot of helpin'.

It could be said that Riley and Ellen ride the coattails of Lonnie.  Their success in lab is largely due to the coattail effect. Coattail effect. A phrase borrowed from politics which Wikipedia describes as "a generic phrase for anyone that hangs onto another person as they forge ahead, without effort from the hanger-on." For certain, Riley and Ellen are hanging onto Lonnie as he forges ahead. The Free Dictionary describes the coattail effect this way: "to use your connection with someone successful to achieve success yourself." For certain, Riley and Ellen are successful in lab insofar as Lonnie is successful in lab.  But the most vivid description of a coattailer comes from The Urban Dictionary: "To sponge, mooch, free load, skate by, or do absolutely nothing but watch while somebody else does all the work and still somehow try to take at least partial credit for something you had no hand in."  When six students crowd around a single set of material at a lab table, there are going to be several students who do nothing but watch while somebody else does all the work.  And you can bet these students are going to take at least partial credit for something they had no hand in.  In suggesting that Riley and Ellen separate from Lonnie and the other three members at the lab table, I was preventing the coattail effect.  I was being a coattail curber.

Now I assure you that I meant no harm on that fateful Wednesday morning. My intentions were entirely innocent.  In fact, I actually intended to do good towards Riley and Ellen and the other members at the table.  It was not my intent to curb anything.  I simply wanted to promote increased engagement in the lab activity. My logic was simple: as the ratio of the quantity of hands to the quantity of lab sets is decreased, the level of engagement increases and the amount of learning increases. My formula for success was that the smaller the group size, the greater the engagement and the more profitable the lab experience.  So I was simply migrating from table to table in an effort to reduce the group sizes to two students.  But in doing so, I was curbing the ability of many students to achieve success on this lab by means of the coattail effect.  I was a coattail curber.

When I was in high school, I played basketball on the school team. During nearly every practice there was a moment when the coach blew the whistle and shouted "Free Throws." We all knew what to do. The coach did not need to say anything else. We all got a ball, paired up and went to one of the eight baskets around the gym to practice our free throws. The formula for optimizing this experience was simple: two players, one set of equipment. The smaller the group size at every basket, the more beneficial the activity. If your free throw partner was ill or injured, you didn't triple up with two other teammates; rather, you considered yourself fortunate to have a basket to yourself at which you would get more free throw practice. In my four years of playing high school basketball, I never witnessed six players congregating at a single basket, each waiting their turn to shoot free throws while there were empty baskets around the gym. This was just not a sensible way to occupy the time. And never once did the entire team crowd around the best free throw shooter's basket and watch him shoot free throws for 10 minutes, considering each success of his as their own. That would be ludicrous.

In making an effort on that Wednesday morning to reduce the size of lab groups, I was exercising free throw practice logic. Two students, one set of equipment, an optimized experience. Just two weeks earlier, I had done a Young's Experiment lab in one of my physics classes. I had one laser, one slide with a double slit, one screen and 25 students. That's 25 students using a single set of equipment. I enjoyed watching 8-10 students cooperate (and at times, argue) as they attempted to collect data to determine the wavelength of light. But what I didn't enjoy was watching the inactivity of a dozen or more students as they sat unengaged on the side of the room as spectators. At first, they were amused by the feud over which of the little red dots on the screen to use for measuring y. And many of them were quite entertained as several students argued about how much sag to allow in the measuring tape as they measured the distance from the screen to the double slit. But soon they zoned out and turned into Delilahs. Engagement turned into spectatorship and the coattailers quickly lined the sidelines. So when I made the effort to break up the groups of four and six into smaller groups of two students, I was simply attempting to optimize the experience for my students. I was using free throw practice logic.
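(For context on what the spectators were watching, here is the standard double-slit relation - textbook physics, not anything specific to this particular lab's handout. With y the distance on the screen from the central red dot to the m-th bright dot, d the slit separation and L the slit-to-screen distance, the wavelength is approximately
$$\lambda \approx \frac{y\,d}{m\,L}.$$
Which red dot to use simply fixes the order m, and sag in the measuring tape matters because it inflates the measured L.)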

After some further resistance, Riley and Ellen crowded around their own set of molecular models and worked on the lab together as a twosome. I periodically circulated through the lab over the course of the next 30 minutes. Each time I passed by Riley and Ellen's station, I observed 100% engagement. I also observed science talk, lots of thinking, growing confidence and great progress. As the period ended, Riley and Ellen returned to the front of the room with smiles on their faces and a sense of pride in their hearts. Riley commented, "Mr. H, aren't you proud of us? Ellen and I finished the lab on our own. Aren't you proud of us?" I winked, smiled and affirmed, "I knew you could do it. I AM proud of you."

As Riley and Ellen left the room that day, I thought to myself:  I'm proud to be a coattail curber. If I want my students engaged in doing science, I will have to provide environments which are conducive to engagement. And one aspect of such an engaging environment is group size.  While there may be occasions for which larger groups offer more benefit than smaller groups, I've observed that engagement generally increases when free throw practice logic is applied to group size. Especially for those very passive students who generally "sponge, mooch, free load, skate by, or do absolutely nothing but watch while somebody else does all the work", minimizing group size goes a long way towards increasing their engagement.

So the first very practical means of salting the oats involves curbing the coattail effect by limiting the group size. Next week we explore one teacher's strategy for encouraging engagement and investment in lab activities - a strategy which has shown positive gains even in situations in which the group size is larger than desired.


This week's article is written by Tom Henderson.

