Top off-season questions continue: Analytics versus the eye test
As the summer drones on - at least in the hockey world - very little has happened in the Rangers sphere. The one piece of news that impacted the team was Vladimir Tarasenko signing a one-year deal with Ottawa. A pipe dream existed that he would return to Broadway, but at least until the trade deadline, that possibility is now gone.
To pass the time, we have been covering the off-season questions I posted a few weeks ago. Several of them have already been addressed, either in prior blogs or in recent separate posts. Today, we tackle another of the open questions, one on a topic that is extremely polarizing.
16) To what extent do you think analytics should be used in roster/line/pair decisions? (Mark Weissman)
To help with the analytics conversation, below is a column by Harman Dayal in The Athletic that I thought presented the topic very well.
Dayal lays out how both analytics and the eye test need to be used, with one providing the what and the other the why. Being a slave, my term not his, to one and not the other only tells part of the story. The same goes for dismissing analytics as just numbers, or dismissing the eye test as old school and out of step with the new reality.
Understanding what each component brings to the table is critical. But it's also important to understand and recognize the limitations of each method. Ignoring one only presents part of the story. A melding of the two, as long as the individual has an open mind, is the better synthesis of new school and old school.
Too much money is being poured into analytics to just be dismissive of them. On the flip side, tossing out years of scouting because it's not cutting edge enough is short sighted. Each method has its flaws and doesn't paint the full picture on its own. That's why I see and recognize the value of analytics: the eye can't and doesn't capture everything. Analytics have blind spots too, but the new school is good at unearthing players who may be over- or undervalued, players the eye test misses.
The majority of the hockey world now agrees that data has some value as a tool that can offer objective insights into player performance. The old days of intense analytics versus eye test debates are mostly gone because people understand that it's not an either-or proposition; you should be using both the eye test and data.
Analytics are useful for telling you what results are happening when a player is on the ice (e.g. the team dominates opponents in terms of possession and scoring chances when Player X is on the ice, or the club gets caved in and bleeds a lot defensively when Player Y is on the ice). It provides objective data and helps account for biases. The eye test — and other contextual and qualitative factors — tell you why those results happened and what it actually reveals about the player’s ability, skill set and value.
Analytical models have often been right when they’ve suggested this player is overrated or that player is underrated. I truly hope people don’t take this article as an opportunity to bash analytics or the work of a smart, talented colleague like (Dom) Luszczyszyn because the hockey public’s overall knowledge and conversations around players are sharper and more sophisticated due to these tools.
The point is that the numbers alone sometimes point you in the wrong direction.
Anytime we evaluate a player, we should accept their statistical profile as valuable information but then dive deeper into the possible ways it might not be telling the entire story. That’d be much better than drawing rash, immediate conclusions when a player’s analytical profile card goes viral on Twitter.
You have to make up your own mind. But I thought the column, while focused on 10 players where the analytics were wrong, was a good overview of the benefits and detriments of relying on just one method. What say you?