Wednesday, July 24, 2013

LaTeX Code for Standards

If you're interested in displaying coarser standards with core skills and proficiency indicators (or whatever you want to call them) in nice LaTeX boxes, here's some code for you!

There are two main templates that I work with - one with a subtitle for the standard and one without. In my experience, the vertical centering of the rotated text doesn't quite work, so I just play with the size of the parbox until it looks right, but everything else renders pretty easily.

Edit: Thanks to Aaron Titus for pointing out two necessary inclusions:
\usepackage{multirow}
\usepackage{graphicx}
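
For context, a minimal preamble sketch (a sketch, not my exact file) with those packages would look like this; paste either tabular block below into the document body:

\documentclass{article}
\usepackage{multirow}  % for \multirow
\usepackage{graphicx}  % for \rotatebox
% \usepackage[margin=2cm]{geometry}  % the 13 cm column may push past the default margins
\begin{document}
% tabular block goes here
\end{document}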

No subtitle:


{\footnotesize \begin{tabular}{| p{.7 cm} | p{1.7 cm} | p{13 cm} | }
\hline
\multirow{8}{*}
 {\rotatebox[origin=c]{90}{\parbox{22 mm}{{\large{\bf Friction }}}}}  
&Core Skills & Identify situations in which friction forces are present and understand the microscopic model of friction forces\\ \cline{3-3}
& & Differentiate between static and kinetic friction \\ \cline{2-3}
& \multirow{2}{*}{\parbox{1.7cm}{Proficiency Indicators}} & Determine the direction of the friction force \\ \cline{3-3}
& & Use an appropriate expression for the magnitude of the friction force \\ \cline{3-3}
& & Understand the relationships among normal force, friction force and friction coefficients \\ \cline{3-3}
& & Solve problems using friction \\ \cline{2-3}
& \multirow{1}{*}{\parbox{1.7cm}{Adv. Ind.}} & Solve complex friction problems, including banked curves (not at design speed) \\ \cline{2-3}
\hline
\end{tabular} }
\vspace{2 mm}

Subtitle:
{\footnotesize \begin{tabular}{| p{.15 cm}  p{.15 cm} | p{1.7 cm} | p{13 cm} | }
\hline
\multirow{8}{*}
{\rotatebox[origin=c]{90}{\parbox{32 mm}{{\large{\bf UFPM }}}}}  
&\multirow{8}{*}
{\rotatebox[origin=c]{90}{{\parbox{50 mm}{\scriptsize \centering Unbalanced Force Particle Model}}}} &Core Skills & Recognize when the forces on an object or system are not balanced from observation, graphs, equations, or descriptions of the motion  \\ \cline{4-4}
& & & Identify the presence and directions of normal, tension, and weight forces  \\ \cline{4-4}
& & & Draw a force diagram (FBD) accurately showing directions and types of forces acting on an object or system  \\ \cline{4-4}
& & & Write net force equations describing an object or system; they should indicate that the forces are not balanced in the appropriate dimension(s)  \\ \cline{3-4}
& & \multirow{2}{*}{\parbox{1.7cm}{Proficiency Indicators}} & Draw FBD correctly indicating that forces are not balanced; recognize same \\ \cline{4-4}
& & & Choose and consistently apply workable direction(s) of positive \\ \cline{4-4}
& & & Correctly apply Newton's 3rd law \\ \cline{4-4}
& & & Choose appropriate axes for force analysis \\ \cline{4-4}
& & & Solve problems using net force equations and/or FBD \\ \cline{3-4} 
 \hline
\end{tabular} }
\vspace{2 mm}

Wednesday, July 17, 2013

LaTeX Python Scripts

Here are the supporting materials from my final talk at AAPT's Summer Meeting 2013 in Portland, on LaTeX and Python in assessment creation.

There are two Python scripts and some LaTeX examples in the package. The scripts are:

  • Assessment Maker Lite: takes problems from the problem database (well, really a folder of individual problem LaTeX files) and assembles a properly numbered and pretty assessment PDF
  • Problemdb Lite: manages the problem database. It counts the problems, identifies the tags on each, and renders a PDF either from search criteria ("Vectors Drag" finds problems carrying both tags, while "Vectors OR Drag" finds vector problems and drag problems) or from the whole set; see the sketch after this list for the search logic. Vertical space is stripped so the set compresses into fewer pages, though that can occasionally make the layout look awkward.
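
To give a feel for the tag search, the matching logic is roughly like this (a hypothetical sketch, not the actual Problemdb Lite code - the real names and details differ):

def matches(query, problem_tags):
    # "Vectors Drag" requires both tags; "Vectors OR Drag" accepts either.
    tags = set(t.lower() for t in problem_tags)
    if " OR " in query:
        wanted = [t.strip().lower() for t in query.split(" OR ")]
        return any(t in tags for t in wanted)
    wanted = [t.lower() for t in query.split()]
    return all(t in tags for t in wanted)

# matches("Vectors Drag", ["Vectors", "Drag", "2D"]) -> True
# matches("Vectors OR Drag", ["Drag"]) -> True
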
There are some auxiliary documents, including a LaTeX example document and an Excel sheet to store assessment names and problem numbers for future reference (and easier entry into the Assessment Maker). 

You'll have to adjust some paths if you move things around, and you'll definitely need to point the scripts at your LaTeX engine and PDF viewer. There are comments in the code showing where to do that.
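
For example, the path-setting and compile step look something like this (a sketch with hypothetical paths and names, not the scripts' exact code):

import subprocess

# Hypothetical paths - edit these for your machine
PROBLEM_DIR = "/path/to/problemdb"   # folder of individual problem .tex files
LATEX_CMD = "/usr/texbin/pdflatex"   # your LaTeX engine
PDF_VIEWER = "open"                  # your PDF viewer command

def compile_and_view(tex_file):
    # run the engine on the assembled .tex, then open the resulting PDF
    subprocess.call([LATEX_CMD, tex_file])
    subprocess.call([PDF_VIEWER, tex_file.replace(".tex", ".pdf")])
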

Any Python install should be fine (I'm using Python 2.7, if you're curious). You'll also need a LaTeX install; the only custom style package required is floatflt (available at CTAN and by Googling).

These can be expanded a great deal: adding a GUI (I use Tkinter), polling Google Docs (using GoogleCL) to create customized reassessments based on student signups, etc. Those additions require more dependencies and directory-setting, so I didn't include them in the Lite versions.

I'm not a software engineer, so there's surely improvement to be made, features to be added, and errors to be checked (it'll just crash out if there's a compile issue, for example), but it should be fine for workaday use - it is for me! :)

Tuesday, July 16, 2013

Stick Figure Maker 1.0

Here's a VPython script that lets you click and drag to create stick figures in any position, so that you can take a screenshot and use them in test questions, etc. It's quick and dirty, but it should work for you if you have VPython installed.
  • Drag any of the red handles to reposition - the head's not independently movable yet
  • Position the floor, if you need to
  • Click the toggle switch to hide the handles
  • Screenshot
  • Done!
There's a short YouTube video of the process in action here.

Some examples: Walking, Running, Ninja Attack!
I borrowed the kernel of the drag-and-drop code from Sherwood/Chabay. Let me know if you find it useful!
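
That kernel is the standard VPython mouse-event pattern; stripped down to a single draggable handle (a sketch, not the full script), it looks roughly like this:

from visual import *

handle = sphere(pos=vector(0, 0, 0), radius=0.1, color=color.red)
drag = None

while True:
    rate(100)
    if scene.mouse.events:
        m = scene.mouse.getevent()
        if m.press and m.pick is handle:
            drag = handle            # grab the handle under the cursor
        elif m.drop:
            drag = None              # let go on mouse-up
    if drag:
        drag.pos = scene.mouse.pos   # follow the mouse while dragging
        # (the real script also redraws the limb segments attached to each handle)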

The source:

Monday, July 15, 2013

SBG Resources

I'm posting some resources from today's SBG panel. Thanks to everyone who came! Here are some of the resources that we discussed:

Sunday, July 14, 2013

SBG: Philosophy and Logistics

Teaching is always the best way to learn. Yesterday, Andy (SuperFly) Rundquist and I gave a workshop on standards-based grading at the AAPT summer national meeting in Portland. While preparing the workshop, I did several things. Watching all of Portlandia was valuable, but preparing the agenda and having conversations in the workshop were terrific.

Many of these teachers were on the cusp of their first year of SBG implementation and itching for advice. One of the things that folks are always interested in is the logistics - grade scales, calculating overall grades, mechanisms for reassessment, etc. The more important part, though, is the philosophy; everything else is just implementation. If you're not sure what your purposes for assessment are, how those relate to feedback, and what the students do next, then the potential benefits will be lost. This is an interesting contrast to traditional grading - because everyone's familiar with traditional grading, you don't actually need to consider why you're doing it, what you want out of it, exactly what the expectations are for the students, etc. That doesn't mean that any particular teacher doesn't consider those things, but think about it - do you really need a consistent philosophy of assessment to conduct your class and assessments that way? You really don't!

There are as many implementations of SBG as there are teachers, but there are, as I see it, two non-negotiable tenets of SBG:
  1. To the best of your understanding, student understanding and the grade should be interchangeable, equivalent, homeomorphic, or whatever you want to call it.
  2. Understanding changes over time (therefore... so should grades, by application of #1)
A corollary of #1 is that the grades have to be tied to the content rather than to the assignments. That's where the standards-based part comes in. How you accomplish these - with fine-grained or coarse standards, student-initiated or teacher-initiated reassessments, binary or other rubrics, etc. - is highly dependent on your students (age, preparation, numbers), your school (time, culture, schedule), and you (personality, experience, class rapport). As long as you keep #1 and #2 in mind, communicate them to the students often, and everyone's clear that the students and teacher are a team working towards understanding, the implementation details aren't the main thing.

Basically, that's how you survive your first year of SBG. With that under your belt, start worrying about reducing paperwork, streamlining reassessment sign-up and administration, making reporting better, and all of that.

One big discussion that we had involved the inclusion of time as a variable in the grading. This is the default in traditional grading - there's not much reward for learning material after the test, so two students who both learn all of the material and score the same on the final could, because of the speed at which they learned it, get vastly different overall grades. The philosophy of SBG definitely tries to remove time as a variable, and I think that raising grades to reflect improved understanding and lowering grades to reflect eroded understanding sends a very clear message that enduring understanding is all that I care about, not how fast you learn or whether you can cram.

There was a little bit of pushback on that, and a good discussion followed. One idea thrown out was about employers - don't they want to know which person can learn the skill in a day instead of three? Probably they do, but I don't think that the traditional system actually communicates that clearly. Before I get to that, I don't think that it's appropriate to make that sort of thing an issue for novices; we're talking about kids in their first or second class here, not grad students ready to go into the workforce as physicists. 

The more important thing for me is the collateral damage done by trying to include speed of acquisition by not allowing grades to change over time. The argument seems like it's about differentiating between these two kids:
...but in reality, it ends up being these two kids:
Without the incentive structure, support, and learning from assessments (when they're not final, they're easier to really take as indicators for improvement, rather than scarlet "D"s to be ashamed of and thrown away), a good number of those students simply fall off the back of the pack, left with a shaky foundation that they can't build upon. What I'm saying is that, with SBG, the "blue line" kid from the second graph can become the "blue line" kid from the first graph!

I've seen some criticism from others that SBG can allow higher grades than traditional grading. When applied correctly, that's a feature, not a bug.

My grade distribution is about the same as it was before, but each grade represents a higher level of understanding than before. When there's a mechanism to improve, the bar can be higher!

Interestingly, we heard from some local teachers that the state of Oregon is implementing SBG across the board in the near future. That seemed to have driven some of the high attendance at our workshop. I did a little searching but couldn't find the details of the plan, so corrections/additions/clarifications are welcome in the comments. I'm certainly all for SBG, but it's really dangerous to make big top-down changes to these things. There has to be a lot of teacher education and buy-in to make it work, or it'll be another passing ed fad, which would be a tragedy. Philosophy first, logistics second.

OK, tonight a little sushi to talk over the panel discussion on SBG that Aaron Titus, Stephen Collins, SuperFly, and I are leading tomorrow (8-10 in Broadway I/II), and then a day checking out Eugenia Etkina's new text at noon (Pavilion West) and the Demo/Lab (Galleria II) and Modeling Instruction (Ballroom II/III) sessions from 4-6.