Wednesday, April 22, 2015

Does coaching work? We spent $1,300 to find out.

In January, we launched the DotaCoach Progress Experiment to measure whether coaching actually makes students better players. We believed it did (strongly enough to start this very business), but we didn’t have proof. So we gave two months of free professional coaching to ten students and measured their progress. All told, we gave away 180 free lessons at a cost of $1,300 USD. Now that the results are in, we’d like to share them with you. Over the coming weeks, we'll be releasing a series of posts showing quantitative and qualitative data about our students and their progress.

Today we’re kicking things off with the most commonly used measure of skill: MMR. Although MMR is not a perfect measurement of skill, it's reasonably sound mathematically, it's the best single metric we have, and it's proven to be quite accurate given enough games played. With that out of the way, let's take a look at the MMR graph.


This graph shows how the MMR of students and controls changed over three and a half months leading up to the experiment. It then resets everyone to zero (at the “Coaching Begins” vertical line) and shows the same comparison over the course of the experiment. Prior to the experiment, both groups fluctuate but have no significant change in their MMR. Once the experiment starts, the students receiving lessons show clear improvements compared to the controls.

Students and controls 
Of our original 10 students, 3 dropped out within the first week or two due to real-life commitments. This left us with the 7 students you see in the graph above. To provide a basis for comparison, we also gathered data on 6 players who had applied to be a part of the experiment, but were not selected. These players are referred to as controls, or non-students. As you may have noticed, the controls actually dipped below zero slightly during the course of the experiment; we expect this was just a random anomaly, rather than them being heartbroken from not being chosen for the experiment.

MMR calculation 
We originally had students sign up for Dota 2 Toplist to track MMR throughout the experiment, but the site has not worked properly since mid-February, so we had to improvise. We pulled every student's and control's match data for the last six months from the Dota 2 Match History API and estimated their MMR by assigning +25 for each ranked win and -25 for each ranked loss. This proved to be a sufficiently accurate predictor for players in the experiment’s range of 1,500-4,500 MMR. As a cross-check, we compared the estimates against MMRs we had recorded on specific dates; they agreed within 2% for every student.
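The estimation method described above is simple enough to sketch in a few lines. Here's a rough illustration of the idea: walk a player's ranked matches in chronological order, adding +25 for a win and -25 for a loss. The field names below are illustrative placeholders, not the actual Dota 2 Match History API schema.

```python
# Sketch of the MMR estimate described above. Match records are assumed
# to be dicts with illustrative field names, not the real API schema.

def estimate_mmr(matches, starting_mmr=0):
    """Return a list of (match_id, cumulative_mmr) points for graphing."""
    mmr = starting_mmr
    points = []
    for match in sorted(matches, key=lambda m: m["start_time"]):
        if not match.get("ranked", False):  # only ranked games affect MMR
            continue
        mmr += 25 if match["won"] else -25
        points.append((match["match_id"], mmr))
    return points

# Example with made-up match records:
history = [
    {"match_id": 1, "start_time": 100, "ranked": True,  "won": True},
    {"match_id": 2, "start_time": 200, "ranked": False, "won": True},
    {"match_id": 3, "start_time": 300, "ranked": True,  "won": False},
    {"match_id": 4, "start_time": 400, "ranked": True,  "won": True},
]
print(estimate_mmr(history, starting_mmr=3000))
# [(1, 3025), (3, 3000), (4, 3025)]
```

The only real subtlety is ordering the matches by start time before accumulating; everything else is bookkeeping.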

Note that the Dota 2 Match History API does not distinguish between solo and party ranked play, so this graph includes both. This is hard to adjust for: some players don’t play party at all, and we don’t have party data for the controls. For our students, roughly 75% of the change appears to come from solo games, which conservatively works out to an average solo MMR improvement of about 350.

Lesson counts
One of the graph’s options allows you to see the number of lessons each student took. These range from 10 hours (Student #3) to 37 hours (Student #5) over the course of the two months. 

As you can see, coaching clearly improves players. Judging by the MMR percentiles Valve published, students in our experiment leapfrogged about 10-20% of the Dota playerbase in just two months.

You may be thinking, “I’ve had my MMR swing by a few hundred at times, so what?” Great question! What makes these results significant is that the improvement happened consistently across several students. Normal MMR swings go up and down but average out around zero over time. During the experiment, however, every student improved, and the average improvement was well above zero. It is very unlikely that every student happened to hit a big MMR upswing at the same time; the students are actually getting better.

To help show this, we added standard error bars to the graph. They give a quick, intuitive sense of the significance of these results, and show that the improvements are not merely coincidence or a temporary swing. (If anyone is interested, we can run other statistical tests in future posts, or share anonymized data with interested parties.)

This data suggests that coaching is an effective way for players to improve their skills. An average gain of 350 MMR in two months is significant. For players looking to get better, this experiment provides compelling evidence that coaching works. Whether it's from a professional like those at DotaCoach, or from a friend who has offered their time and advice, we hope you'll give coaching a try.

One student summed up what he took away from the experiment and these results: “get a mentor whether paid or unpaid, thank him, appreciate him, and next time u want to spend $10 on a hat - maybe invest in your gameplay and spend an hour with a coach.”

We’d love to hear what data and analysis you’re interested in for future posts. Let us know by commenting here, posting on Facebook, tweeting at us, or sending us an email, and we’ll try to answer your questions in future posts.