
My use of portfolios for assessment and grading is not going well. Actually, that’s not entirely true. I’ve implemented portfolios quite well, from a technological standpoint. Google Sites may not be pretty, but I’ve managed to tweak them into an assessment portfolio system over the last several years, and I’ve accumulated many examples of portfolios filled with excellent student work. But a separate Google Site portfolio in addition to a personal blog for each student is starting to feel like just another website to manage, and the self-analysis that I thought portfolios would bring has not materialized, at least not for most students. I even went so far this past school year as to discontinue using the portfolios in my “regular” biology sections, although I continued to use them in my college-level courses like anatomy and college biology.

Before I deconstruct the failings of my current system, let me review what I hoped to achieve with building a standards-based system that uses blogs and portfolios to share, assess, and measure student learning:

Goals of my standards-based portfolio system:

  1. Students produce artifacts of learning that are seen by more than just the teacher (me) and add to the body of learning resources available online to other students.
  2. Students have the chance to use multiple technological (and non-tech) tools to demonstrate mastery of a topic or skill.
  3. Students know what kinds of topics and skills are required to master the course content and achieve a high grade for their particular course.
  4. Student blogs such as Google Blogger and WordPress serve as a chronological storage/timeline of student learning artifacts and travel with the student even after they finish a particular class.
  5. Google Site portfolios serve a major self-assessment function in that students review their blog posts and select their best work by topic and skill standard for inclusion in the portfolio.
  6. A portfolio site provides an easy way to assess the entire body of a student’s work and makes allowances for the different lengths of time it takes individual students to demonstrate mastery of a particular topic or skill.
  7. Student portfolios serve an accountability function for both student and teacher, since the portfolio is easily shared with other stakeholders such as parents, the local community college that issues our concurrent college credit, and perhaps even the state’s Department of Education.

Let’s see how these goals have panned out:

Students produce artifacts of learning that are seen by more than just the teacher (me) and add to the body of learning resources available online to other students.

This is happening, or at least the possibility of it happening exists because everything is posted online. Certainly parent conferences are strengthened greatly by being able to easily get student work into parents’ hands. As for a wider audience, however, most students do just enough to get by, and truly exceptional learning artifacts that explain a topic well enough to get lots of views are rare. We’ve had a few notable exceptions and a blog exchange or two, but largely the audience for student work seems to be mostly just me.

Students have the chance to use multiple technological (and non-tech) tools to demonstrate mastery of a topic or skill.

This is also happening, given the online nature of blogs. However, students don’t use very many tools. Google Docs/Slides are all over the place and we take a lot of pictures and video of labs and such, but I don’t see a lot of creative photo editing or captioning, and video post-production is minimal. We’ve gotten Snapchat involved in some instances, but that’s about it. The non-tech side of things usually just involves taking pics of a study guide or drawing or perhaps the occasional model of a cell or muscle fiber. It’s fair to say that students don’t generally seek out new creative tools that they are not already familiar with.

Students know what kinds of topics and skills are required to master the course content and achieve a high grade for their particular course.


I like to think that this is true, since the portfolio comes to students pre-populated with a list of content and skill standards with a description of each. There are only 7 major standards, and the portfolio more or less puts them right in students’ faces, including the major subject area topics. As for figuring out how to achieve a high grade in the class, that’s far less obvious and much more experiential, as each student and I have a dialogue about the quality of their work in the portfolio. There is a ton of flexibility in using portfolios, which is awesome from a philosophical standpoint, but explaining that flexibility to students in terms of concrete requirements for certain grade levels (A, B, C, etc.) is difficult. It is especially fun at the beginning of the school year when the body of work in the portfolio is tiny and grades usually are simply pass-fail or rarely go higher than a B. Students who consider themselves “A” students often freak out and ask what they can do to improve, when in reality they only have an artifact or two per standard to show off and I’m not ready to make a measurement based on so little data.

Student blogs such as Google Blogger and WordPress serve as a chronological storage/timeline of student learning artifacts and travel with the student even after they finish a particular class.

This is working well. Blogs are the meat and potatoes of the system, and students around the school have come to expect to “do blogs in Ludwig’s class” even if they are not initially sure what that means. Blogs are relatively easy to set up and maintain, although I’ve seen some students struggle with remembering their logins. The time stamps are useful in parent conferences, especially where allegations of cheating have arisen. It’s very easy to see who published content first if someone later borrows bits and pieces for themselves.

Google Site portfolios serve a major self-assessment function in that students review their blog posts and select their best work by topic and skill standard for inclusion in the portfolio.

This is not working well in most cases. Although some students have rocked the portfolio as a tool for self-analysis of their work, many students struggle with how to characterize and sort their work based on the standards that I’ve posted. After a semester or sometimes even after 3rd Quarter I’ll still have some students who need to be told exactly where to put links to their different work samples. A large majority of students take a link to a piece of work and put it on lots of portfolio pages even if the work doesn’t demonstrate the standards on those pages. Blog posts without graphs will end up under Data Visualization and simple content-area worksheets will find their way to Plan and Carry Out Scientific Investigations. The Self-Analysis page of the portfolio invariably generates comments like “I need to not procrastinate” which is definitely true but is also a lower bar than describing exactly which content you don’t understand. I have the sense that the portfolio is an afterthought for most students who don’t work on it until final grades are due and so the reflection that goes into it suffers from the speed at which it is thrown together.

A portfolio site provides an easy way to assess the entire body of a student’s work and makes allowances for the different lengths of time it takes individual students to demonstrate mastery of a particular topic or skill.

This is working well. It is very easy to look at a student’s portfolio page such as Data Collection and see every graph and data table that they’ve ever done for the class rather than combing through the chronological record of blog posts trying to identify which posts have graphs in them. From a purely quantitative standpoint, it’s obvious on a given portfolio page how many times the student has addressed a particular standard, assuming their self-assessment of each post isn’t too far off. Portfolios have made it easier to assess and to grade by standard. In the biology courses where I did not use portfolios this past year, I found it much harder to quantify some of the performance standards based solely on the blog posts.
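That quick quantitative check is simple enough to sketch in code. This is an illustration only: the standard names and artifacts below are made-up examples, not data from any real student portfolio.

```python
from collections import Counter

# Made-up portfolio entries: (artifact title, standard the student filed it under).
portfolio = [
    ("Enzyme lab graphs",        "Data Collection"),
    ("Osmosis data table",       "Data Collection"),
    ("Cell respiration prezi",   "Content Knowledge"),
    ("Muscle fiber model video", "Content Knowledge"),
    ("Reaction time experiment", "Plan and Carry Out Scientific Investigations"),
]

# Tally how many artifacts sit under each standard -- the same at-a-glance
# count a portfolio page gives you, assuming the student filed work honestly.
coverage = Counter(standard for _, standard in portfolio)
for standard, n in coverage.items():
    print(f"{standard}: {n} artifact(s)")
```

A standard with a count of zero (or one thin entry) jumps out immediately, which is exactly the signal that gets buried in a chronological blog feed.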

Student portfolios serve an accountability function for both student and teacher, since the portfolio is easily shared with other stakeholders such as parents, the local community college that issues our concurrent college credit, and perhaps even the state’s Department of Education.


Yes, portfolios can deliver a lot of information about how I run my classes, but who is looking at them? Probably only the student and I are viewing any given portfolio, and sometimes not even they are interested in what the portfolio can show. In many cases the portfolio is simply “one more thing” for a student to do that “duplicates” what goes on with the blog. It’s this kind of feedback from students, that the portfolio was just “one more site to manage,” that led me to scale back which courses use portfolios. If no one is looking, why bust our butts to create and maintain a nearly duplicate site of student work samples?

Future directions:

The “harsh reality” of the state of assessment portfolios isn’t too dreadful, but I have a sense that portfolio-based assessment could be going a lot better than it is in my hands. The goal of assessment of individual skill and content standards still remains, but the medium in which the information is collected needs some tweaking.

Google Sites are relatively clunky but do the job of collecting work samples as long as you have a laptop in front of you. But as students go increasingly mobile, I’m thinking of trying out SeeSaw as a replacement for the portfolio, and perhaps the blogs as well. It looks like SeeSaw will let students collect a variety of work samples into their SeeSaw portfolio using mobile and laptop devices and, for a fee, it will let me tag and evaluate student work by standard.

Simplifying down to two major platforms (Schoology and SeeSaw) from three (Schoology, Blog, and Portfolio Site) is a step in the right direction, although I’m concerned about students losing access to their work at the end of a school year if they don’t control their own personal blog. SeeSaw looks to be primarily aimed at a younger generation of kids than my high school bunch and is appropriately more teacher-centered, but a lot of the fundamentals are there: collection of learning artifacts, assessment by topic and performance standard, and publication to parents and others for accountability purposes and sharing of created resources. I’ll be curious to see whether the lack of student control of their own individual sites is a real problem in SeeSaw or if it actually creates better structure and accountability for my students and myself.



I’ve waited to post this set of thoughts until I was back at school, mostly because I was up in the mountains for the last bit of the summer. Yes, I had wifi, but, well, there were other things to do that were better than blogging.

But I’m back to writing, and in terms of changes for this school year that you might be interested in, there are two big ones:

1. I’m replacing my GoogleApp spreadsheets for tracking student progress with Shawn and Vic’s BlueHarvest. By the end of last year I was using the spreadsheets almost exclusively for comments back and forth with students so having to create and manage one for each student this year seemed silly when BH is built to do that. I’m through the setup phase with BH and have managed to get login info to nearly all of my 110 students. There were times that I wished that I knew how to pull all my student info (names with emails) off of Infinite Campus into a nice, importable spreadsheet, but in reality there have been so many schedule changes that I would have had to add/delete a bunch of kids anyway. The big downside of not adding emails for each kid was that I ended up having to get them their passwords via Edmodo, which took a bit of typing today to accomplish (harrywookiewookie is still my favorite).

2. SBG for AP Biology! Remember the experiment last year with my student-led Phunsics class? I’m going to apply some of the philosophy of that class to my teaching (or co-teaching, maybe mentoring?) of AP Biology this year. What’s that you say? There’s an audit process for approval of my curriculum documents? Oh dang. Guess we had better start writing them together.

Naw, it’ll be fun. The College Board was nice enough to follow my model of emphasizing skill standards for AP Biology students as well as providing a short list of content standards. Ok, maybe the list is a bit longer than I think is necessary, but I’ve grouped them into 13 content standards, not too far from my usual 10-or-so content area standards per course. With 7 skill standards and the 13 content areas (see the course description or the prezi linked on my AP Biology page), we’ve got a basic framework from which to set up the course. Once the students get over the rush of holding their newly-acquired iPads, we’ll get down to work on prioritizing our goals for the course and agree on a basic plan for the year.

The big difference between this AP Biology course and the phunsics course, besides the obvious content-area shift, will be in assessment. AP Biology will follow the pattern I’ve established for my other classes, namely activities->blogging->portfolio building. I think the new structure of the AP Biology course in the College Board’s documents lends itself pretty well to a standards-based portfolio that students can fill with evidence of each standard. I’ll post links to some apbio student portfolios once they are sufficiently underway.


Scott McLeod recently asked this question in his post Reconciling Convergence and Divergence:

How do you reconcile…

principles of standards-based grading; “begin with the end in mind and work backwards;” understanding by design; and other more convergent learning ideas

with

project-, problem-, challenge-, and/or inquiry-based learning; creativity; innovation; collaboration; and our need for more divergent thinkers?

My answer: I don’t reconcile the two, nor am I sure that I should. I do both. Separately.

As frequent readers of this blog will know, I’ve been experimenting with standards-based assessment and grading for a couple of years now and am to the point that I feel reasonably expert in structuring my classroom around standards. I typically start off each course in the fall by discussing the specific standards that students will meet during the year and explaining how they might go about proving that they’ve met those standards. We then proceed to work together as a class to do a variety of activities and labs designed to help students meet the standards that I have laid out. This works well in my biology, anatomy, and chemistry classes, all of which are concurrent college credit and so are matched to my state’s community college system guidelines for each particular course. Very, very convergent stuff. All students focus their learning on mostly the same set of ideas, even going so far as to complete electronic portfolios based on a common template that I provide for them. This system works nicely and the portfolios that students are producing are excellent, with lots of evidence that they’ve learned particular skill and content standards.

But what about physics? This year I had the opportunity to take over the job of physics teacher because: a) no one else wanted to teach it, and b) I had a lot of proto-engineers begging me to teach anything besides biology or anatomy.  This class turned out to be radically different from anything else that I’ve ever taught. It was radically different because I didn’t go into the class with a defined set of standards. The class was not concurrent college credit so I didn’t have to concern myself with matching a college syllabus. The state of Colorado does have physical science standards for students, but they had mostly fulfilled those in their freshman and sophomore level courses, and the kids taking physics were Juniors and Seniors.

With nothing to prove to anyone about whether I had correctly learnified my students, I was free to structure the class as I saw fit. I decided to let the students run it. On the first day of school I explained that they would be designing the class, not me. We spent the next few days brainstorming what sorts of things normally go on in a physics class, which topics they ought to leave physics knowing about, and how to do assessments of said goals. In other words, the students and I were still in a very standards-based frame of mind.

But then we diverged. Big time. Our brainstorming sessions had revealed a lot of different student interests: What about building that hovercraft you were telling us about and just how much power does a shop vac produce? Can we build some sort of catapult?  How about a potato gun? By the third week of school, we had all carried out a couple of the standard labs on measuring motion using video analysis and motion sensors but that was the last time we did anything as a whole group. The rest of the year was project based. Completely student designed and initiated to the point they started calling the class “phunsics.” My lesson plan book for the class was a mess. Usually it just said “Projects” until after class when I could actually fill in what students worked on that day, and when I did fill it in, I often had to summarize four or five different projects for the same class period. And so it went all year, sometimes in great bursts of activity, sometimes in lulls of senioritis and apathy, but always there were one or two major projects underway and several on the back burners.

To try to explain the course to future generations of phunsics students (and anyone else curious about what the class looked like), students created several videos about their experience.  A playlist of some of their videos is worth watching for some different perspectives on the class. Also, here’s my tribute video for the Phunsics team.


How then do we decide which type of course is better for learning, the convergent “let’s meet the standards” kind of class or the divergent “follow your interests” kind of class?  That all depends on how you measure learning, I suppose. On the one hand, students in anatomy, biology, and chemistry have portfolios of the work they accomplished during the year and anyone curious enough could see exactly what sorts of standards they had met. On the other hand, the phunsics students exhibited self-direction, organizational skills, coping with failure, teamwork, and creativity. Our current set of standardized assessments would completely overlook the achievements of these students, should we choose to assess them that way.

Would I teach the anatomy, biology, and chemistry courses the same way that I did physics this year? I’m not so sure I would. Some subjects lend themselves to true inquiry and self-direction better than others. Disciplines like physics and engineering will always have an advantage over subjects like biology and anatomy where real inquiry involves very specialized equipment and a ton of background knowledge that students may not yet possess. Likewise inquiry in chemistry has to be bounded both by safety considerations and the background knowledge of students. Don’t get me wrong, I work in as many open-ended and inquiry labs as possible in these disciplines but these labs or “problems” are still often defined by the teacher and not the learner. Probably I still suck at PBL and just need to get better at it, but for now any sort of PBL short of giving full control to students seems kind of artificial to me.

In conclusion, I’m going to try to offer the physics course as often as I can, which at this point is every other year in rotation with AP Biology. I think a student-designed course like that is vital to help students understand what real scientific inquiry is like, with teams working together to solve problems and meet design challenges they meet along the way. And, at least for now, I’ll keep the anatomy, biology, and chemistry courses as standards-based courses, but attempt to move them in a direction of more student control about how and when they meet the particular standards.


I was recently invited by my colleague Kelly Jo Smith to participate in a podcast discussion about standards-based grades. There has been enough of a stir about SBG at our school lately that she thought it would be good to get several teachers who are trying it out to go “on tape” discussing our experiences of SBG.

I was joined by Justin Miller (art teacher), Eva Rodriguez (Spanish teacher), and Kelly Jo (language arts and drama teacher) for a great conversation that I managed to record and push out as a podcast. It really was a very affirming conversation and I think we all came away feeling a little less alone in our struggles with reforming our grading systems to reflect learning rather than completion.

I’m not a huge fan of podcasts due to my short attention span, but I think this sort of extended conversation about a topic is exactly why podcasting was invented. If you are at all interested in how teachers in several disciplines are using standards-based grades then you might want to give it a listen.




As I’ve written elsewhere, my focus this year has shifted from tinkering with educational technology to tinkering with, well, most everything else about my classroom. The main focus has been changing how I grade students. When I started teaching I used the typical points-based grading structure where 10 or 100 point assignments are given and students rack up points towards a total. From there, as I got sick of the points game (can you say cheating?) and tried to limit it, I moved to a more streamlined system in which students still earned points, just fewer of them. This year has seen the implementation of a standards-based assessment and reporting system, variously called standards-based grading or, occasionally, skills-based grading.

The main focus of this system is to provide feedback to students, parents, and myself about how students are performing on specific, predefined learning targets. This puts the focus on learning specific skills and content, not simple completion of tasks for points. So how was this accomplished? In short, I had to restructure my gradebook to reflect each major skill or content area in which I wanted students to be able to demonstrate proficiency. This meant that I first had to define the standards that I would assess students on. This task was not too terrible, since I teach a variety of concurrent credit (sometimes called dual credit) classes and had great guidelines from the college-level classes to pull from.

Next, I had to decide what tool to use to do the actual reporting of student progress on the standards. Our school’s online grading program was certainly not up to the task, so I designed a gradesheet in GoogleDocs instead. This let me set up conditional formatting of spreadsheet cells to use different colors to highlight areas of strengths and weaknesses. Once an appropriate gradesheet was created for each course I taught, it was a straightforward task to clone the gradesheet for each student and share it with them via their GoogleApps account. Using GoogleDocs also gave me the option to share students’ gradesheets with their parents as the need arose, since the school’s online gradebook really doesn’t show the detailed feedback that the gradesheet does.
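The conditional formatting boils down to a score-to-color mapping. Here’s a minimal sketch of that logic; the 0-4 scale is the one I use, but the particular colors and labels below are illustrative assumptions, not the exact formatting rules from my gradesheet.

```python
# Assumed mapping from a 0-4 proficiency score to a highlight color.
# The colors and level names here are illustrative, not my actual rules.
SCALE_COLORS = {
    0: "red",          # no evidence of the standard yet
    1: "orange",       # beginning
    2: "yellow",       # developing
    3: "light green",  # proficient
    4: "green",        # advanced
}

def cell_color(score):
    """Return the highlight color for a 0-4 proficiency score."""
    if score not in SCALE_COLORS:
        raise ValueError(f"score must be 0-4, got {score!r}")
    return SCALE_COLORS[score]

# One column per standard: a quick scan of the colors shows strengths
# (greens) and gaps (reds) without reading a single number.
row = [3, 4, 1, 0]
print([cell_color(s) for s in row])  # prints ['light green', 'green', 'orange', 'red']
```

The point of pushing this into conditional formatting rather than raw numbers is exactly the at-a-glance effect: strengths and weaknesses become visible as blocks of color.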

Then, of course, came the real test of the system: actually using it with students. This meant having to explain standards-based grading to them during the first few days of class. Let’s just say there were lots of blank stares. Talking about standards-based grading with students was probably a lot like talking about dancing: you might get the general idea and like the theory, but until you do it for yourself, there is no real sense of how it works.

But students did figure it out and, by the end of the first semester, they had a pretty good feel for what their gradesheets were all about and were beginning to use them to guide their learning. I began to hear the language of the classroom change a bit to where students began talking about which standards they still needed to meet instead of asking how many points an assignment was worth. Some students even asked me to give other students access to their gradesheets so that they could discuss them together and figure out what steps to take next.

It wasn’t all rosy, of course. Some students were so used to a points system that the idea that one unmet standard could lower their grade was really foreign to them. Even some of the higher-performing students, used to building up a surplus of points, had to think a bit differently. But most students caught on, and many seemed to really enjoy the flexibility of the system.

Here is a quick rundown of the things that impressed me about a standards-based grading system:

Guiding instruction

If you really want to know what your students are learning, try laying it out visually in your gradebook. For me, the big aha moment came after several weeks of school when I realized that there was no evidence of one of the major standards, Experimental Design (Std 4), in anyone’s gradesheet. Why not? I hadn’t provided them any opportunities to meet that standard yet. After that, I tried to plan activities that would help students design their own labs. I struggled with that standard all year, actually. Looking ahead to next year, I know for sure that one of the areas that I need to work on is to get students more involved in performing real scientific investigations.

Informing students of specific strengths and weaknesses

After several weeks of trying standards-based grades, it became obvious in the gradesheet what I knew from my experience of teaching over several years: each student brings a different set of skills with them to my classes. Some students were rocking the technology savvy standard (Std. 6) with their prezis, videos, and animations while others were brilliant writers who were at high levels for their communication standard (Std. 7). Each student gradesheet was unique, but having the gradesheet as a reference made conversations with students about their grade much more meaningful than simply saying “you have to work harder” or “just turn stuff in.” We could see exactly which content or skills each student needed to work on.

Allowing for mistakes and experimentation

One of the great things about standards-based grading using a 4-tier scale is that students don’t dig themselves into holes like they can in some points systems. Using cumulative points, the kid who forgets to turn in an assignment loses points and their grade suffers (sometimes drastically), unless you later give “extra credit,” which is usually unrelated to any real learning. Instead, standards-based grades separate out the areas of difficulty into discrete chunks which can be addressed individually without necessarily dragging down the entire grade. My students were allowed multiple chances to meet each skill or content area standard, a fact that they really appreciated. This meant that students could botch a quiz or try some web tool that didn’t work, but they could try again with a different assessment to try to show an increase in their ability or understanding. For example, here’s part of a gradesheet from a student who fixed some misunderstandings:

In at least three of the standards (columns), there is evidence that the student performed better on a second try at each standard.
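The reassessment rule behind this can be sketched in a few lines. One common SBG convention is to let the most recent attempt stand, so a botched quiz never locks in a low grade; this sketch assumes that convention and uses made-up scores, so it isn’t a literal description of my gradesheet formulas.

```python
# Made-up chronological 0-4 scores per standard; each list is one standard's
# column of attempts, oldest first.
attempts = {
    "Std 4: Experimental Design": [1, 2, 3],
    "Std 6: Technology Savvy":    [4],
    "Std 7: Communication":       [2, 2, 4],
}

def reported_score(scores):
    """Most recent attempt wins (an assumed rule, not an averaging scheme),
    so early mistakes can't drag down a standard the student later masters."""
    return scores[-1]

summary = {std: reported_score(scores) for std, scores in attempts.items()}
for std, score in summary.items():
    print(std, "->", score)
```

Contrast that with a cumulative points system, where those early 1s and 2s would be permanently averaged into the grade no matter what the student went on to demonstrate.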


Showing student progress and achievement over time

This is perhaps my favorite part of using standards-based grades and the individual gradesheets. Each gradesheet starts out the year as a blank slate, but as we work together through the year, encountering new challenges, students begin to see a color-coded record of their achievements, sort of a trophy case, perhaps, of all that they have done throughout the year. Yes, numbers are involved (0-4) just like a regular gradebook, but there is something about color that draws the eye and paints a picture of what has been achieved in a way a numerical score cannot. For example, here are the content knowledge (Std. 1) gradesheets for a few of my students:

Evidence of content knowledge 1

Content area knowledge learned by Student A

Content area knowledge learned by Student B

Content area knowledge learned by Student C

You can see at a glance that Student A had some strengths and weaknesses throughout the year, Student B showed excellent understanding in all that they did, and that Student C struggled to produce evidence of learning for a number of content areas.

I’m going to let my students have the final word in this discussion of standards-based grading. I asked students in my biology classes to produce a short “advice” video that I could share with next year’s students to make the transition to SBG easier on them (and me). Here and here are a couple of the videos that best explain what students think about my grading system. I love the quote at the end of Cherlyn’s and Tenchita’s advice video: “It’s different, but you’ll get used to it. It’s better than anybody else’s.”
