How do NHL analytics measure player and team value? Explaining the key advanced stats

By Shayna Goldman and Charlie O'Connor
Jun 22, 2022

Sports analysis is always evolving, and data plays a key role in that. It details aspects of the game that our eyes can’t always catch in real time, provides concrete information to support what we see (or what we miss at the time), and helps counteract our biases.

Data is a tool — one to be used in tandem with the eye test. But having this information is only so useful without an understanding of what is being measured, when to use it, and how to apply it in analysis.


That’s why we’re here to help. A few years ago, Charlie created two primers on hockey analytics that are still incredibly helpful today (Part 1 and Part 2). But we’re looking to build on that with a new edition.

Then, early next week we’re going to answer your questions in an “analytics mailbag” similar to the excellent work Lindsey Adler and Eno Sarris did for baseball.

So, let’s dive right in on some key concepts and models, what they mean, and how to use them.


A quick refresher on key terms

Corsi: At its core, Corsi is just plus/minus for shot attempts that occur — blocked shots, missed shots, and shots on goal (including goals). Goals are relatively infrequent events, and a team can often win the goal battle in a game without possessing the puck more than an opponent or outshooting them. Corsi, by tabulating every shot attempt (and not just the ones that lead to goals), does a much better job of evaluating which team controlled play in a given game.

Few use raw Corsi metrics anymore. For starters, it’s a far superior stat when presented in percentage form (Corsi For Percentage). And second, Corsi assumes that all shot attempts are of equal quality. Obviously, that’s not the case, so to account for differences in shot quality, other metrics must be used.
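
As a minimal illustration (the event counts below are invented), here's how Corsi For and Corsi For Percentage fall out of raw shot-attempt counts:

```python
# Toy example: Corsi For and Corsi For Percentage (CF%) from raw shot-attempt
# counts. The event counts here are invented purely for illustration.

shots_on_goal_for = 28      # includes goals
missed_shots_for = 12
blocked_shots_for = 9       # our attempts that the opponent blocked

shots_on_goal_against = 22
missed_shots_against = 10
blocked_shots_against = 14

corsi_for = shots_on_goal_for + missed_shots_for + blocked_shots_for
corsi_against = shots_on_goal_against + missed_shots_against + blocked_shots_against

cf_pct = 100 * corsi_for / (corsi_for + corsi_against)
print(f"CF: {corsi_for}, CA: {corsi_against}, CF%: {cf_pct:.1f}")  # CF: 49, CA: 46, CF%: 51.6
# Fenwick is the same idea with the blocked-shot terms dropped from both sides.
```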

Fenwick: Similar to Corsi, but stripping blocked shots out of the equation, leaving only unblocked shots (missed shots and shots on goal). Few cite Fenwick much anymore, but it’s still worth knowing, in large part because it forms the heart of the widely-cited Expected Goals metric.

Expected goals: Not all shots are created equal, and expected goals help account for that, which is why the metric can be used as a proxy for shot quality. Essentially, expected goal models assign a value to shot attempts based on the likelihood of that shot resulting in a goal. The basic factors in most models include shot type, location, distance, and angle. Game state is also a key factor. But each model differs, and it’s important to understand what is or isn’t featured in each when using it in analysis.
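
To make the idea concrete, here's a deliberately simplified sketch of how a model might map a shot's characteristics to a goal probability. The coefficients and the logistic form are invented for illustration — the actual public models are fit on years of NHL play-by-play data and use many more features.

```python
import math

def toy_xg(distance_ft: float, angle_deg: float, is_rebound: bool = False) -> float:
    """Assign an invented goal probability to an unblocked shot attempt."""
    # Made-up coefficients: value falls off with distance and angle, rises on rebounds.
    z = -0.3 - 0.05 * distance_ft - 0.02 * angle_deg + 1.0 * is_rebound
    return 1 / (1 + math.exp(-z))  # logistic link keeps the value between 0 and 1

print(round(toy_xg(distance_ft=10, angle_deg=15), 3))  # close-in chance: ~0.25
print(round(toy_xg(distance_ft=55, angle_deg=30), 3))  # point shot: ~0.03
```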


In the public sphere, all models are based on data from the NHL, and since blocked shots are recorded at the location of the block rather than the shot, expected goal models assign a value to every unblocked shot attempt. Without public data for pre-shot movement, or even an actual record of rebound shots, each builder has to devise their own method of inferring those events so that rebounds and second-chance efforts, along with rush chances, can be weighed.

Four “core” models in the public sphere are those of Evolving-Hockey, HockeyViz, MoneyPuck, and Natural Stat Trick, and each one has differences in its methodology that explain why the results may vary slightly. How shooting talent is handled is one of those differences: all of these resources offer expected goal data, but MoneyPuck offers an adjustment for shooting talent, while HockeyViz allows it to be shown after the expected goal value is calculated.

Models developed by private companies, on the other hand, may feature pre-shot movement, including whether a shot was generated off the rush or the cycle, or the passing that preceded the shot.

Like Corsi and Fenwick, expected goals can be presented in a few ways. It can be shown individually, to represent the number of goals a player is expected to have based on the quality of their shots — whether as a raw number or rate. It’s also used as an on-ice metric, whether as a rate for or against, a percentage, or differential. Plus, it can be applied to goaltenders as well.

Score adjusted: Attentive hockey fans likely notice that when a team is trailing in a game by a significant number of goals, they tend to outshoot the opposition. Why does this often happen? Multiple reasons: trailing teams take more risks, leading teams play it a bit safer, and trailing clubs might simply have more urgency. As a result, sometimes the raw Corsi battle in a 6-1 game can look relatively even, despite the fact that when the game was close, the team that won clearly dominated play. So how does a conscientious analyst go about accounting for this?


By score-adjusting metrics like Corsi, Fenwick or xG, of course. Essentially, score adjustment takes into account the average shot and xG results in each score state, and controls for it. So if, for example, teams with three-goal leads on average collect just 43 percent of the shots and expected goals, 43 percent would become the new baseline expectation for that score state, and the stats would be adjusted accordingly. It’s not throwing out data entirely, which is never a good idea. It’s simply putting it into its proper context, making for stronger data all around.
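
Here's a minimal sketch of those mechanics. The league-average shares by score state below are illustrative, not the actual published adjustment factors used by any of the public sites:

```python
# Illustrative league-average share of shot attempts taken by the LEADING team
# in each score state (invented numbers; the real factors are estimated from
# league-wide data and differ by metric and site).
leading_team_share = {0: 0.50, 1: 0.48, 2: 0.455, 3: 0.43}

def score_adjusted(raw_attempts: float, goal_lead: int) -> float:
    """Rescale a team's raw attempts so every score state has a 50/50 baseline.

    goal_lead is from the team's perspective: positive if leading, negative if trailing.
    """
    lead = min(abs(goal_lead), 3)
    expected_share = leading_team_share[lead]
    if goal_lead < 0:                      # trailing teams get the complementary share
        expected_share = 1 - expected_share
    # A team that merely matches the league average for its score state ends up
    # looking like a 50 percent team; over- or under-performing the baseline shows up.
    return raw_attempts / (2 * expected_share)

print(round(score_adjusted(30, goal_lead=-3), 1))  # trailing team's pile of attempts gets discounted
print(round(score_adjusted(20, goal_lead=3), 1))   # leading team's attempts get boosted
```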

“Per 60” Metrics: Raw totals lack context, and per-60 stats account for that by adjusting for ice time. It can make more sense to use raw totals over small samples, like single games or a stretch of a few. But over extended periods of time, expressing the metric per 60 minutes of play helps level the playing field. It can be applied to most metrics, and the formula is simply (statistic / ice time) * 60.
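
A quick sketch of that formula, with invented totals, shows why the rate matters:

```python
def per_60(stat_total: float, ice_time_minutes: float) -> float:
    """Convert a raw total into a rate per 60 minutes of ice time."""
    return stat_total / ice_time_minutes * 60

# Two players with the same raw point total but very different workloads (invented numbers):
print(round(per_60(50, 1400), 2))  # heavy-usage player: ~2.14 points per 60
print(round(per_60(50, 900), 2))   # lighter usage, same production: ~3.33 points per 60
```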

Johnny Gaudreau led the league in the regular season with 4.53 points per 60 in all situations — that’s how many points he scored, on average, for every 60 minutes he was on the ice. Impressive results in a short stretch — like Andrew Copp’s 3.97 points per 60 post-deadline with the Rangers, which would rank 12th among all skaters in 2021-22 — can inflate a player’s scoring rate. The context to consider there is that it came in just 16 games, so the takeaway should be that Copp got off to an encouraging start on the scoresheet with his new club, not that he suddenly belongs among the league’s elite scorers.

This applies to on-ice metrics as well. Seeing that Patrice Bergeron was on the ice for 599 shot attempts against at five-on-five doesn’t account for how many games he did or didn’t play, or his playing time within those tilts. Weighing his ice time shows that the Bruins conceded a rate of 39.78 shot attempts against per 60 while he was deployed, which is the best in the league among skaters with at least 100 minutes played. Add in the context of his usage and workload, and it makes it all the more impressive now that all skaters are on the same scale.

For Percentage: Differentials from raw totals may be used on a single-game basis. Otherwise, most numbers — Corsi, Fenwick, expected goals, and actual scoring — are converted into per-60 rates or percentages. Generally, anything above 50 percent is considered positive because the team (or player) is generating more than it allows; 50 percent means breaking even.

Let’s stick with Bergeron here. With the Selke-winning center on the ice, the Bruins created 50.88 expected goals while allowing 22.43 back. In total, 73.31 expected goals were generated while Bergeron was on the ice, and 69.40 percent of them were created by Boston. The ice was incredibly tilted in the Bruins’ favor while their elite defensive forward was deployed, and that held up to lead the league at five-on-five.
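
Using the same Bergeron figures, the percentage calculation looks like this:

```python
# On-ice expected goals for percentage (xGF%), using the figures cited above.
xg_for, xg_against = 50.88, 22.43

xgf_pct = 100 * xg_for / (xg_for + xg_against)
print(f"xGF%: {xgf_pct:.2f}")  # roughly 69.40 percent of the expected goals were Boston's
```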

This can be used at the team level as well. In the regular season, across all 82 games, the Coyotes mustered only 41 percent of the expected goals share at five-on-five to finish last in the league. Not only were they out-shot and out-chanced this year, but Arizona’s results were just as bad, with a 41 percent share of the actual goals.


Microstats: The definition of microstats is a bit nebulous, but they’re really just stats that help to describe and quantify play-driving ability and that are not currently tracked by the NHL. Luckily, in the public sphere, we have Corey Sznajder, who runs the All Three Zones project and tracks events like offensive zone entries, defensive zone exits, shot assists, and zone entry defense.

These metrics, for the most part, are more descriptive than predictive. They explain what happened, and allow hockey fans to better understand the individual skill sets of players and teams. But they don’t necessarily showcase whether a player or team is effective overall — simply whether they are effective (or ineffective) in one particular aspect of the game. Still, the descriptive value is very real, and they can help in theorizing why a particular player or team has success, by providing additional information regarding the way in which they play.

Player value models in the public sphere

Goals Above Replacement

At its core, Goals Above Replacement at the NHL level is a value metric. It’s trying to quantify how much a single player contributes to his team’s success in terms of goals and, in turn, wins. That’s what every fan and general manager does when trying to determine a rough ranking of the best players in hockey, and which players would be most intelligent to acquire.

GAR just standardizes that process and gives each player a single number that measures his value over a given length of time.

Goals Above Replacement (or Wins Above Replacement) statistical models got their popular start in baseball. But it’s the basketball models that serve as the best apples-to-apples comparison to hockey GAR models, such as Evolving-Hockey’s popular one in the public sphere (which will be the primary focus here — check out their three-part explainer if you’re looking for a more technical breakdown of it). That’s because, like hockey, basketball includes two different kinds of events:

  • Events that a player directly produces (goals, assists, turnovers/takeaways, fouls/penalties)
  • Events that occur while a player is actively playing that they don’t necessarily personally create, but help to facilitate by their presence on the court (or ice)

So the goal becomes to credit a player for both what he directly does, and also what happens on his watch that he helps to occur due to actions that are a bit more subtle (and also not tracked by the league), like off-puck positioning or breakout passes that don’t lead directly to goals, which directly impact the play of his teammates. How does any one person know the proper weight to give to each action, each category? The truth is, even the most experienced hockey expert probably can’t, particularly for over 600 players at once.

This is where statistical models like GAR come in. Using regression techniques and machine learning, a model like GAR can determine which events lead most to positive on-ice outcomes — such as goals and wins. Then, the model weighs each event accordingly, standardizing it across all players at a given position. The single number the model then spits out? A player’s Goals Above Replacement value, meaning how many extra goals (in terms of goal differential) said player contributed to his club above replacement level (for skaters, replacement level is the average performance of all forwards below 13th on a team’s depth chart and defensemen below seventh). The higher the number, the “better” the player.

Evolving-Hockey’s GAR model breaks out individual contributions into distinct categories — six for skaters and four for goalies — and the final GAR number is a combination of all of them. Luckily, they’re pretty intuitive for even casual hockey fans.

  • Even-strength offense (not applicable for goalies)
  • Even-strength defense
  • Power-play offense (not applicable for goalies)
  • Short-handed defense
  • Penalties taken
  • Penalties drawn

So, what events (the inputs) comprise each category? It’s about what you would think. Even-strength offense accounts for a player’s impact on their teammates’ goal production rates and their shot creation rates, the goals they themselves scored, the assists they produced, among numerous other metrics. Even-strength defense is built on their impact on their teammates’ chance suppression rates (as measured by their expected goal model), but also more traditional events like blocks and hits. The puzzle pieces that put together WAR metrics really aren’t that complicated — they’re mostly statistics and concepts that every hockey fan would agree are important. The complicated part is simply in how they’re all combined into one, catch-all metric.
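
Conceptually, the final number is just the sum of those component values (the component estimates themselves come from the regression and machine-learning work described above). A sketch with invented component values:

```python
# Invented component values for a hypothetical skater; Evolving-Hockey estimates
# each of these with its own models, then combines them into a single GAR figure.
components = {
    "even_strength_offense": 8.4,
    "even_strength_defense": 3.1,
    "power_play_offense": 4.2,
    "short_handed_defense": 0.6,
    "penalties_taken": -1.3,   # taking penalties costs value
    "penalties_drawn": 1.8,    # drawing penalties adds value
}

gar = sum(components.values())
print(f"Goals Above Replacement: {gar:.1f}")  # 16.8 in this made-up example
```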


The biggest strength of Goals Above Replacement is simple — it provides an objective, fair framework with which to rank players. Now, that doesn’t mean a GAR model is necessarily perfect. It’s going to be subject to the biases of the components that comprise it, and the decisions — both tactical and mathematical — that construct it. But the power of GAR comes in its comprehensiveness and consistency. A human just isn’t going to be able to rate every single player objectively, via the exact same parameters, solely via the eye test. GAR, via advanced mathematics, modeling, and — this is key — an understanding of hockey and what leads to success by those who constructed the model, actually can.

Now, as noted, that doesn’t mean GAR is perfect. For starters, too much is probably made, for example, of the difference between a player with, say, a 20.1 GAR in a given season and a 19.7 GAR. Is the 20.1 GAR player “better” by the model? Well, not really. We’re talking about tenths of a goal here — or even a goal or two — so there’s reasonable room for debate regarding value. Sure, if one player is at 20 and another 10, the former almost certainly had the better year, but acting like one player is obviously better than another due to a small gap is foolish.

There’s also the fact that models like GAR are influenced by ice time and minutes played, which can complicate things a bit. Goals Above Replacement is an accumulated stat, so the more minutes played, the more GAR a player can theoretically generate. In other words, Player A could have a higher raw GAR number, but Player B might have actually generated more GAR per minute — he played less but was more efficient in those minutes.

For example, Connor McDavid generated 27.9 GAR in 1,765 minutes in 2021-22. Matthew Tkachuk accumulated 23.8 GAR. So case closed, McDavid was better, right? Well, maybe. But it’s worth noting that McDavid played roughly 300 more minutes than Tkachuk, and he actually generated less GAR per 60 minutes (0.950) in his ice time than Tkachuk did in his (0.973).
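
That rate comparison is simple to reproduce from the numbers above:

```python
def gar_per_60(gar: float, minutes: float) -> float:
    return gar / minutes * 60

# McDavid's 2021-22 totals as cited above; with these rounded inputs the rate
# comes out around 0.95, just below Tkachuk's published 0.973.
print(round(gar_per_60(27.9, 1765), 3))
```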

Does that mean that Tkachuk was better? Not necessarily. It just depends on the question one is looking to answer. Raw value is still raw value — McDavid contributed more to his team in 2021-22 than Tkachuk did to his, even if it took him more minutes to do it. Plus, it opens the door for other interesting discussions. Should Tkachuk be playing more minutes, or would his performance in those minutes dip due to overuse, negating the benefit? By the same token, would McDavid be more efficient in the minutes that he played if he was used less, and would that increased efficiency be enough to make up the difference in raw value added?

This is why even a model as advanced as Evolving-Hockey’s GAR should be used as the starting point of discussion, not the endpoint. There’s so much more to analyze than just a single number, even if someone has decided to trust in the validity of that number. (Which they should, because especially in the public sphere, it’s extremely valuable.)

Game Score Value Added

To understand Game Score Value Added (GSVA), we first have to step back to game score.


Dom Luszczyszyn developed it back in 2016, based on work done in basketball, to measure single-game productivity. The one-stat-tells-all metric combines numbers all found on an NHL stat sheet, blending more traditional metrics with advanced stats. It’s a linear-weights model, with a weight for each component. For skaters, those weights are as follows, with a quick calculation sketch after the list:

  • Goals: 0.75
  • Primary Assists: 0.7
  • Secondary Assists: 0.55
  • Shots: 0.075
  • Blocks: 0.05
  • Penalty Differential: 0.15
  • Faceoff Differential: 0.01
  • Five-on-Five Corsi Differential: 0.05
  • Five-on-Five Goal Differential: 0.15
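
Applied directly, those weights produce a single-game number like so (the stat line below is invented):

```python
def game_score(goals, primary_assists, secondary_assists, shots, blocks,
               penalty_diff, faceoff_diff, corsi_diff_5v5, goal_diff_5v5):
    """Skater game score as a weighted sum of the components listed above."""
    return (0.75 * goals
            + 0.7 * primary_assists
            + 0.55 * secondary_assists
            + 0.075 * shots
            + 0.05 * blocks
            + 0.15 * penalty_diff
            + 0.01 * faceoff_diff
            + 0.05 * corsi_diff_5v5
            + 0.15 * goal_diff_5v5)

# A goal, a primary assist, five shots, a block, plus-2 in faceoffs,
# plus-8 in 5v5 shot attempts, plus-1 in 5v5 goals:
print(round(game_score(1, 1, 0, 5, 1, 0, 2, 8, 1), 2))  # roughly 2.4 — a strong night
```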

A key difference between game score and other models, like GAR, is that it works at the player level instead of scaling for team performance. It also sets itself apart by using points. The drawback is that, on the surface, it doesn’t factor in key context like quality of teammates and competition, zone starts, playing time, and score of the game. Special teams are technically accounted for, since game score uses points, but they’re handled differently than in GAR, which uses on-ice goals in each situation.

As described in the name, GSVA is based on game score. But instead of just measuring a single game, or even averaging a player’s game score for an entire season, it uses data from a player’s last three seasons. The data is weighted by recency, regressed to the mean individually, and adjusted for age. There’s also an adjustment to account for quality of teammates and competition.

The projections are run through the game score formula and converted into a wins-above-replacement rate. It can be used for individual players or added together to project a team’s collective strength.
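
As a rough illustration of what recency weighting and regression to the mean look like — the actual GSVA weights, regression targets, and age adjustments are Dom's and aren't reproduced here — consider a toy projection like this:

```python
def toy_projection(last_three_seasons, recency_weights=(0.5, 0.3, 0.2),
                   league_mean=0.6, regression_strength=0.25):
    """Blend three seasons of a per-game metric (newest first), then pull toward the mean.

    Every number here is hypothetical; it only shows the shape of the approach.
    """
    blended = sum(w * s for w, s in zip(recency_weights, last_three_seasons))
    return (1 - regression_strength) * blended + regression_strength * league_mean

# A player trending upward projects close to, but not all the way at, his latest season.
print(round(toy_projection([1.10, 0.85, 0.70]), 3))  # ~0.86
```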

Since its creation, there have been tweaks to the formula — like shifting from Corsi to Evolving-Hockey’s expected goal model (changing individual shots to expected goal generation, and switching metrics for five-on-five on-ice differentials). Plus, there’s been work on the model to better account for defense — that continued this year with the addition of short-handed impact.


GSVA is presented in two ways — as a player projection based on those three years of data (or as much NHL experience as a player has if they haven’t reached that benchmark), or as a single-season figure to measure actual level of play. The actual value and the projection can be compared to see whether a player exceeded expectations or fell short of their projected level.

Take Norris Trophy winner Cale Makar. Based on the defenseman’s NHL experience to this point, he projected to be worth 4.9 wins. He had such an outstanding regular season that his actual GSVA was 5.43 — and since he missed a few games this year, on a per-82-game basis he was actually worth 5.8 wins, which far exceeds expectations. Elite offensive play and defensive effort, along with impressive scoring totals, contribute to that league-leading value among defensemen.

Goalie metrics

Goalie metrics in the true NHL public sphere are relatively straightforward. Goals-against average simply measures how many goals a netminder allows, on average, over a 60-minute performance. Save percentage — the best of the “basic” goalie metrics — just divides the number of shots a goalie stops by the total number they faced, and it’s that metric that allows Hockey Reference to include a “quality start” stat on its website, meaning that a goalie posted a save percentage on a given night equal to or higher than the league average.
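
Both are simple ratios. A quick sketch with invented season totals:

```python
# Basic goalie rate stats from season totals (numbers invented for illustration).
shots_faced = 1500
goals_allowed = 120
minutes_played = 3000

save_pct = (shots_faced - goals_allowed) / shots_faced   # saves divided by shots faced
gaa = goals_allowed / minutes_played * 60                # goals allowed per 60 minutes

print(f"SV%: {save_pct:.3f}, GAA: {gaa:.2f}")  # SV%: 0.920, GAA: 2.40
```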

Both save percentage and GAA have value. (OK, save percentage is a lot more valuable than goals-against average.) But neither accounts for the quality of the shots that a netminder is facing. After all, there’s a big difference between a goalie with a 0.900 save percentage who stopped nine of 10 unscreened point shots, and one with a 0.900 who stopped nine of 10 one-timers from the low slot. The latter clearly faced more difficult shots than the former, but save percentage would judge both performances equal. That certainly doesn’t seem fair.

Enter the metric Goals Saved Above Expectation or GSAx.

Just as save percentage uses shots as its primary statistic, GSAx uses expected goals. As noted earlier, expected goals (or xG) assign a numerical value to each unblocked shot based on the location and quality of that shot. A shot from the point, then, wouldn’t be worth one shot — it might be worth 0.03 because it had a 3 percent chance of becoming a goal.

So how does GSAx work? First, it takes all of the expected goals that a netminder has faced in a given period of time — essentially, the number of goals that a netminder “should” have allowed based on the quality of the shots faced. Then, it looks at how many actual goals the netminder permitted and subtracts that total from the xG total.

The result? Let’s do a quick case study, using Vezina Trophy winner Igor Shesterkin.

  • Igor Shesterkin expected goals faced in 2021-22: 143.92
  • Igor Shesterkin’s actual goals allowed in 2021-22: 106
  • 143.92 – 106.00 = +37.92 Goals Saved Above Expectation
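
The same arithmetic, as a tiny sketch:

```python
def gsax(expected_goals_faced: float, goals_allowed: float) -> float:
    """Goals Saved Above Expectation: xG faced minus actual goals allowed."""
    return expected_goals_faced - goals_allowed

print(round(gsax(143.92, 106), 2))  # +37.92 for Shesterkin in 2021-22
```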

With GSAx, a goalie wants to come out in the positive, though it’s certainly possible to post a negative GSAx as well. A netminder would simply have to allow more actual goals than expected goals, which 74 out of the 114 goalies that faced at least 50 shots in 2021-22 did.

In other words, being an NHL goalie: quite difficult.


Like all public metrics, GSAx has its weaknesses, which are almost identical to the weaknesses of the public xG models it relies on. The xG models on sites like Evolving-Hockey can’t account for pre-shot passing movement (which certainly increases shot quality), shot velocity, or screens. Now, there are private models that do claim to account for these factors, but those in the public sphere can only use the data the NHL publicly tracks and shares. Until the NHL provides it and makes it easily accessible, these limitations will remain, and public GSAx metrics will remain largely driven by shot location, with add-ons like rebounds and rush opportunities included as well.

Still, even the mere addition of shot location — as opposed to assuming every shot is the same — makes GSAx superior to raw save percentage and a more than worthwhile metric. And there’s room for it to improve further as the quality of the underlying data does.

Useful special teams metrics

Five-on-five play tends to be emphasized when analyzing underlying numbers since that’s where the majority of the game is played. But there are ways to measure special teams play more deeply, instead of focusing on success percentages at both ends of the ice.

A team’s power-play percentage notes its success rate on the advantage but doesn’t detail anything further than that. That means there isn’t any sort of sustainability check on offensive generation — nothing separates the underlying process from finishing talent and opposing goaltending. The Maple Leafs led the league in the regular season, operating at a 27.3 percent rate.
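
The difference between the surface number and the underlying rates is easy to see with invented totals:

```python
# Power-play percentage versus underlying per-60 rates (all numbers invented).
pp_goals, pp_opportunities = 50, 250
pp_minutes = 380
pp_shot_attempts, pp_expected_goals = 700, 55.0

pp_pct = 100 * pp_goals / pp_opportunities             # the familiar surface number
attempts_per_60 = pp_shot_attempts / pp_minutes * 60   # how much the unit generates
xg_per_60 = pp_expected_goals / pp_minutes * 60        # quality of that generation

print(f"PP%: {pp_pct:.1f}, attempts/60: {attempts_per_60:.1f}, xG/60: {xg_per_60:.2f}")
# A power play can post a mediocre PP% while generating at a strong rate (or vice versa);
# the rates are the sustainability check the percentage alone can't provide.
```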

According to Evolving-Hockey, Toronto finished second in their shot creation (115.59 shot attempts per 60) and expected goal rate (9.42 per 60). That shows the strength below the surface along with their ability to score. To learn more about where the team tends to shoot from, HockeyViz can assist with a heat map highlighting areas of higher shot concentration in orange and brown tones (and areas of little offensive creation in purple). Keep in mind that the expected goal rate does slightly change here because it’s a different model, but according to this resource, that generation is 35 percent stronger than the league average.

Via HockeyViz

On the other hand, there are the Kings, who sat 27th at 16.1 percent on the advantage. This is a team that finished eighth in the league in shot rate and was slightly above average in expected goals. So while there was room to be more overwhelming in offensive creation, it’s clear that finishing talent was really what burned them.


All of this applies to the penalty kill as well. A team’s kill percentage may accurately reflect its defensive strength while short-handed — that’s the case for the league-leading Hurricanes.

The Ducks, on the other hand, have goaltending to thank for landing 10th in the league on the season as a whole, despite falling into the bottom-10 in both shot attempts and expected goals against.

Digging below the surface helps show which teams have more dimension to their short-handed approach, whether it’s on an individual or team level. The Maple Leafs, for example, landed at the top of the league in offensive generation on the penalty kill. Mitch Marner, a key cog in their PK (who leads the team’s forwards in share of short-handed ice time), ranks highly in the league in individual shot and expected goal rates, which helps add to that.

Single-game analysis tools

There are a number of sites that provide live data for every NHL game. Natural Stat Trick, MoneyPuck, and Evolving-Hockey all provide counting and rate stats that extend beyond what the league provides on each game sheet. Plus, these sites feature sortable on-ice metrics for teams, players, and in the case of MoneyPuck and Natural Stat Trick, line combinations and pairings.

Each resource also has some unique information. Evolving-Hockey has Goals Saved Above Expectations (GSAx) readily available, along with expected and delta save percentages. MoneyPuck has a running count of puck freezes for goalies. And Natural Stat Trick allows users to check on-ice metrics for players against certain opponents to learn more about the results of a head-to-head matchup.

These sites also all include a number of visuals to help depict different aspects of game play.

But if you really want to go heavy on visualizations, that’s a specialty of HockeyViz. This site has tools that update in real time during games, and then provides even more information afterwards — from player usage charts that show lineups, ice time, and on-ice goal results for every player, to matchup breakdowns, zone deployment, and shift charts. Along with team-wide graphics, there are detailed visuals for every single player postgame as well.

All via HockeyViz

Some of the most popular game-analysis tools include shot maps, which can be found in numerous sources. Here are some of the essentials:

Natural Stat Trick has heat maps for every single game that update live. These can be filtered by situation to show all situations, even strength, five-on-five (raw, or score- and venue-adjusted), as well as special teams. These help show where teams generate (and allow) shots from, and which areas of the ice are more concentrated with attempts.

Via Natural Stat Trick

While HockeyViz is known for season-wide heat maps for players and teams, for single-game usage there are shot maps available. Like Natural Stat Trick, these update live and have a few situational options (even strength, power play, short-handed). On these visuals, every unblocked attempt can be identified by shot type, result, shooter, and expected goal value. Plus, there’s a collective expected goal total for each team based on their shot quality. Along with providing a team-wide look, these can be filtered to show attempts by a player on either side of the matchup, as well as how a team performed with or without a skater. These maps can also be broken out by period.

Via HockeyViz

Evolving-Hockey also offers a live shot map with filters for situation, period of play, whether the data is score-adjusted, and player’s on-ice results. Every unblocked shot attempt is sized to represent shot quality and is also color-coded by result (goal, miss, and shot). This interactive tool allows the user to hover over each shot to learn more — from who the shooter was and what the expected goal value of the attempt was, along with shot type, angle, distance, and strength. A collective value of each team’s shot attempts is also noted.

Via Evolving-Hockey

MoneyPuck also has an interactive shot map that presents data on every shot attempt when hovering over each data point. That information includes the shooter, event time, shot type, angle, distance, and result. Plus, MoneyPuck offers probabilities for each shot reaching the net and becoming a goal.

Via MoneyPuck

Besides shot maps, in-game trends are readily available through a number of these sites as well.

HockeyViz offers a way to measure “shot-pressure” which updates live, and notes when goals are scored (along with game state, goal scorer, and pucks that hit the posts). Natural Stat Trick offers a game flow visual for shot attempts and expected goals. Evolving-Hockey has its own version for cumulative shot attempts and expected goal generation. MoneyPuck also has the latter — both are interactive, allowing users to highlight the timeline for details on shot attempts and goals scored.

While these graphics can help users get the gist of the events of a game, HockeyStatCards provides a way to quickly measure player performance. This resource uses game score to provide a full report from every game, along with a leaderboard to highlight the stars of the night. Leaderboards are also available for the best and worst single-night performances of the year, as well as the top and bottom average game score earned by a skater.

Via HockeyStatCards

(Top photo: Kim Klement / USA Today)
