I get a lot of emails asking about coxswain evaluations. Coaches want to know what they should say, how long they should be, and whether they’re even necessary (why would you ask me, of all people, this question), and coxswains want to know how to interpret everything, what they should take seriously, and how much of what the rowers wrote is based on their level of like/dislike for the person they’re writing about. Additionally, if you have a coxswain who is new to the team (like us – we have one freshman and two upperclassmen) it can be hard for them to know what to take away from the evaluations, since they likely won’t have coxed many of the rowers if you do these early in the season, and the feedback can be scarce and occasionally harsh.
Coxswain evaluations are important and coaches should make it a priority to do them at least two or three times during the year. The problem, though, is what I said above – coaches don’t know how to make them and coxswains don’t know how to interpret them, which renders the time you put into them all but wasted. Coaches also have to realize that part of doing evals is spending 20-30ish minutes going over them with your coxswains, explaining some of the more ambiguous comments, giving them specific things to work on based on the feedback, etc. Done right, yes, it does amount to a few hours of work but at the end of the day it’s a few hours well spent and your coxswains will be that much better for it. And, to be honest, it’s quite literally the least you can do for them in terms of helping them get better.
Sometime in October-ish we did our first round of evaluations for our three coxswains. I was excited but a little apprehensive at the same time because every coxswain evaluation I’ve seen before this has been borderline awful and/or useless. Thankfully the one they’ve been using is actually pretty good and manages to cover all the bases pretty well. (I’ll go into detail a bit more down below.)
Once the rowers had filled them out (this took maybe 10-15 minutes total) I collected them and asked the other coaches if they were going to go over them with the coxswains. They said “nah, we usually just give them the sheets to read on their own”, to which I responded with this exact expression (I’m completely serious). Now, let’s think about this for a second. If you were given 20+ evaluations containing a lot of comments but no real indication of which of the three coxswains the feedback was directed towards, how much would you get out of reading them? Probably not a lot. So … here’s what I did to make it easier for the coxswains to actually use the feedback they were being given.
To preface this, I’ve made templates of my “system” for you to use with your coxswains if you’d like. Everything is explained down below and can be found in this Google Doc.
First things first – the evaluation itself. MIT’s used this one for a while so I can’t take credit for making it, but I do like it so at the very least I’m endorsing it. It’s simple and to the point but open-ended enough for the rowers to elaborate if they have any specific comments (which, obviously, the goal is for them to do that with each section).
Once you have your evaluations and they’re filled out the next thing you’ve gotta do is figure out what to do with all that information. The first thing that I did was take all the numerical ratings and average them into one number so that instead of having 20+ ratings for each of the nine sub-sections, they’d only have one number each. (The sheet for this is under “Overall Evaluation” in the second tab at the bottom.) This allows them to get a better idea of where they fall on the 1-5 scale. It’s just like what your teachers do with your grades – instead of giving you a million individual grades at the end of the semester they just give you one that you can then compare to the pre-defined scale in order to determine how you did.
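If you’d rather script the averaging than crunch the numbers by hand or in a spreadsheet, here’s a minimal sketch of the idea. The sub-section names and scores below are made up for illustration – they aren’t from the actual eval:

```python
# Sketch: collapse each coxswain's 1-5 ratings into one average per
# sub-section. Section names and scores are hypothetical examples.

def average_ratings(ratings):
    """ratings: dict mapping a sub-section name to the list of 1-5
    scores all the rowers gave for it. Returns one rounded average
    per sub-section, skipping any section nobody rated."""
    return {section: round(sum(scores) / len(scores), 2)
            for section, scores in ratings.items() if scores}

# Hypothetical scores from five rowers for one coxswain
averages = average_ratings({
    "Steering": [4, 5, 3, 4, 4],
    "Calls":    [3, 3, 4, 2, 3],
})
print(averages)  # {'Steering': 4.0, 'Calls': 3.0}
```

Spreadsheets work just as well for this, of course – the point is simply to end up with one number per sub-section that the coxswain can compare against the 1-5 scale.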
I tend to spend a lot of time on this section because averaging 20 numbers for nine sections times three people is rather time-consuming. Luckily the day that I crunched all the winter numbers last week was when everyone was either biking or out on a long run, so this ended up being a good way for me to pass the time until they got back. I have a pretty good system that works well for me so it only took me about an hour, maybe a little less, to get everything averaged out.
The next part is the most time-consuming. I’ve done this twice now and each time I’ve spent about 2.5 – 3hrs total putting these spreadsheets together (so about 45-60min per coxswain). How long it takes you will depend on how many coxswains you have, how many comments your rowers have left/how detailed they are, how diligent you are about dividing them up amongst the coxswains they apply to, and whether or not you boil down the comments to two to four bullet points of specific things to work on (hint: you should).
Each coxswain has their own sheet for each season that we’ve conducted the evals. We just did our second set last week so as you can see, each coxswain has two sheets so far for the year. Each individual sheet is broken down into four main sections, just like the evaluation itself. There’s a “pros”, “cons”, and “general comments” section where I’ve taken all the comments the rowers have left and divided them up to fit into one of those three categories. Most of the time the rowers will specify if their comments are directed towards a particular coxswain but if they don’t then I just consider it a general comment that’s directed towards everyone and I’ll include it on each person’s sheet.
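The sorting described above can also be sketched in code if you keep the comments in a simple list as you transcribe them. The coxswain names, categories, and comment text here are hypothetical, just to show the “tagged comments go to one sheet, untagged comments go to everyone” rule:

```python
# Sketch: sort rower comments into per-coxswain sheets with
# pros / cons / general buckets. A comment tagged with a coxswain's
# name goes on that sheet only; untagged comments go on every sheet.

def build_sheets(coxswains, comments):
    """comments: list of (coxswain_or_None, category, text) tuples,
    where category is 'pros', 'cons', or 'general'."""
    sheets = {c: {"pros": [], "cons": [], "general": []} for c in coxswains}
    for who, category, text in comments:
        targets = [who] if who else coxswains  # untagged -> everyone
        for name in targets:
            sheets[name][category].append(text)
    return sheets

# Hypothetical example with two coxswains and two comments
sheets = build_sheets(
    ["A", "B"],
    [("A", "pros", "steers a great course"),      # tagged for A
     (None, "cons", "calls could be sharper")],   # untagged
)
print(sheets["B"]["cons"])  # ['calls could be sharper']
```

Again, a spreadsheet with one tab per coxswain does the same job – this is just the logic behind how the comments get divided up.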
As you can see, some of the comments are a bit repetitive but I think it’s important to write them all down regardless so that the coxswains can see what the rowers are noticing and how they feel about certain aspects of their coxing. If one person says “steers a great course” it’s not nearly as much of a confidence boost as four people saying it is. Same goes for the negative comments – they might not take “doesn’t steer competitive courses” that seriously when it comes from one person but if six of their teammates say it then it holds a bit more weight.
The “things to work on” section should be two (minimum) to four (maximum) bullet points based on all the pro/con/general comments. These really don’t take much effort to come up with either. As you read through the comments you should easily be able to get a sense of the areas the rowers think they need to improve on or would like to see them work on.
After putting all that together, you can go over it with your coxswains. When I sat down with ours I printed out their individual sheets so they could read the comments for themselves as we went over them, and essentially just read through everything, pointed out anything that I thought was worth discussing and/or elaborating on, and got their thoughts on how they felt about the comments (did they agree/disagree with anything, have questions, etc.). We did this individually the first time but when we go over the most recent ones I think I’m going to do it as a group, just because there are only three of them and not as many individual nuances to discuss this time around.
The takeaway here is that coxswain evaluations should be a regular thing that you do at least twice per season (for comparison’s sake) and in order to maximize their effectiveness you, the coach, need to spend a few hours organizing them so that you can go directly to each coxswain and say “Here’s what your teammates said, here’s what we’d like to see you work on based on the feedback they’ve provided, let’s discuss…”. Don’t just give them a pile of papers and expect them to sort through all that themselves because they won’t do it (and I don’t blame them). Hell, you can outsource all your evals to me and I’ll organize them for you if it means you’ll actually do evals for your coxswains (…totally serious, by the way).
After the first round of evals that we did all three of us (the coaches) noticed some major improvements in our coxswains so if you want proof that spending the time doing these and providing them with real information actually pays off, just look at the fall vs. winter averages in the first picture. I was a little skeptical initially because I didn’t think there was going to be much of a difference (mainly because I didn’t think the rowers would notice anything, not that I didn’t think our coxswains had improved) but I was really excited to see actual numerical data that backed up what we were seeing on the water.
Anyways, I hope all of this is helpful and encourages everyone to make coxswain evaluations a regular part of your seasonal plans. Coxswains, if your team hasn’t done evaluations before you should pose the idea to your coach(es) and show them the first tab of the Google Doc. If you have done evaluations but want to discuss some of the comments or get some additional feedback/insight, feel free to get in touch.