Chasing Perfection: A Behind-the-Scenes Look at the High-Stakes Game of Creating an NBA Champion

With the deluge of on- and off-court data now available, basketball data analysis is rapidly becoming, as Daryl Morey has suggested at the Sloan Sports Analytics Conference he co-chairs, “only constricted by money, time, and the questions you ask.” The bigger challenge for every NBA team lies in how to value, disseminate, and use the information they can generate.

Information-gathering and -sharing structures for NBA teams differ wildly, but at a baseline, the operations are very complicated and nuanced. The possible data pieces a team can gather include in-game data from SportVU, in-game proprietary data being tracked by staffers, in-game/trend data from Synergy, practice data coming from Catapult or other wearable technology devices, practice data being tracked by staffers (the 76ers, notably, track every shot,
including free throws, taken in practices), medical data, sleep data, salary and cap data, college and international scouting data, NBA pro personnel data, and advance scouting data.

All of that different data can be parsed in an enormous number of ways, and there are different constituents for each kind of data, all of whom consume and subsequently communicate findings to other constituents in very different ways. Sometimes, a report immediately finds its end user. Sometimes, the initial reader is a conduit to the eventual end users. Some people want very complicated, nuanced answers. Some need very simple ones. The possibilities of how and what to do with the data being collected by NBA teams are effectively endless, and the communication network has to be tailored to the wants and capacities of the target users. To make it more complex, team management often hires data staffers whose expertise stretches well beyond its own, making ongoing evaluation of the data team’s work that much more difficult.

As a quick example of how complicated one single decision can be, take an instance of the work of Ben Alamar. Alamar is a former professor of sports management at Menlo College in California who rose to prominence in the sports analytics world thanks to seven years of NBA work, first as the director of analytics with the Seattle SuperSonics/Oklahoma City Thunder and then later as a consultant with the Cleveland Cavaliers. In his 2013 book Sports Analytics: A Guide for Coaches, Managers and Other Decision Makers, Alamar discusses the process he undertook in 2008 to try to help the SuperSonics (who were moving to Oklahoma City that summer) evaluate UCLA guard prospect Russell Westbrook, who was entering the draft after two seasons of college ball.

Westbrook was a tricky draft case because he mostly played shooting guard at UCLA (with the Kings’ Collison playing the point for those Bruins teams), but the Thunder wanted a point guard with their No. 4 overall pick to go with star-in-the-making Kevin Durant, who had just finished his rookie season. The team loved Westbrook’s physical attributes and mental makeup, but wasn’t sure whether he could transition to the NBA level as a primary ballhandler or distributor.

As such, the team needed a way to try to measure Westbrook’s passing acumen in college, and then project that to the NBA level. Standard statistics (like assists) were not going to provide the team with enough information to judge Westbrook’s decision making, so they needed to do some proprietary work to evaluate him more appropriately.

Alamar, through extensive film work, created a metric that looked at UCLA’s shooting percentages when Westbrook passed a teammate the ball versus shots that came unassisted or when other teammates made the pass that led to a shot. Alamar found that Westbrook’s impact on UCLA’s shooting was greater than the team’s point guard, Collison (who would be picked twenty-first overall in the 2009 NBA Draft), and stacked up favorably against both the performance of other prospects in the 2008 draft and a pool of established NBA point guards.
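The passer-impact idea behind Alamar’s metric can be sketched in a few lines. The shot-log structure and field names below are hypothetical (Alamar’s actual charting data is not public); the sketch simply groups team field-goal percentage by who created each shot:

```python
from collections import defaultdict

# Hypothetical charted shot log: each entry records who (if anyone)
# passed to the shooter and whether the shot went in.
shots = [
    {"passer": "Westbrook", "made": True},
    {"passer": "Westbrook", "made": False},
    {"passer": "Collison", "made": True},
    {"passer": None, "made": False},   # unassisted attempt
    {"passer": "Collison", "made": False},
    {"passer": "Westbrook", "made": True},
]

def fg_pct_by_pass_source(shot_log):
    """Team FG% on shots created by each passer ("unassisted" = no pass)."""
    made = defaultdict(int)
    attempts = defaultdict(int)
    for s in shot_log:
        key = s["passer"] or "unassisted"
        attempts[key] += 1
        made[key] += s["made"]  # True counts as 1, False as 0
    return {k: made[k] / attempts[k] for k in attempts}

print(fg_pct_by_pass_source(shots))
```

With enough charted possessions, comparing the “Westbrook” bucket against the “Collison” and “unassisted” buckets is the core of the comparison described above, before any adjustment for shot quality or context.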

Once Alamar had data he was comfortable with, the challenge became communicating this new metric and what it meant to the decision makers in charge of the draft. It was enormously helpful to be able to show that Westbrook’s impact at UCLA was comparable (adjusted for context) to that of elite NBA point guards, as well as a projected elite point guard (University of Memphis’ Derrick Rose, who would be the number one overall pick in 2008). Showing that lesser NBA point guards scored worse on this metric than both the best NBA point guards and Westbrook also helped slot Westbrook higher in the minds of the brass.

Alamar wasn’t even in the team’s draft war room when they selected Westbrook, and Alamar’s analysis wasn’t (by a long shot) the only factor that encouraged the team to select him, but that didn’t stop a team official from emerging from the room after the pick had been made and yelling at Alamar, “You got your guy!” The pick has worked out brilliantly for the Thunder. Westbrook turned into an All-Star by his third season in the league and made second-team All-NBA in the 2014–15 season. He’s not a conventional point guard by any means, but in addition to being one of the most physically dominant and destructive perimeter players in the world, he led the NBA in 2014–15 by assisting on an astounding 47 percent of teammates’ baskets while on the floor.
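The “percentage of teammates’ baskets assisted while on the floor” figure cited here is conventionally estimated from box-score totals rather than counted directly. A minimal sketch using the standard estimate popularized by Basketball-Reference (the input numbers below are illustrative, not Westbrook’s actual 2014–15 line):

```python
def assist_pct(ast, mp, fg, team_mp, team_fg):
    """Estimated percentage of teammate field goals a player assisted
    while he was on the floor (standard box-score AST% estimate)."""
    # Scale team FGs by the player's share of team minutes, then
    # subtract his own makes to approximate teammate FGs while he played.
    teammate_fg_while_on = (mp / (team_mp / 5)) * team_fg - fg
    return 100 * ast / teammate_fg_while_on

# Illustrative season totals only.
print(round(assist_pct(ast=574, mp=2302, fg=627, team_mp=19780, team_fg=3073), 1))
```

The estimate assumes teammate shot attempts are spread evenly across the player’s minutes, which is why it is an approximation rather than a literal count.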

The above described one part of the work that went into the evaluation of one potential draft prospect, so you can see how expansive this can get. An analytics team needs to find the right balance between pushing information to the different end users, pulling information requested by those users, and communicating it in varying ways such that each end user will be as receptive as possible to the conclusions drawn.

“I think analytics is best when it works in concert with everybody,” said Alamar, who now is ESPN’s director of production analytics, having taken over the job from Oliver. “And we ask a lot of questions to find out what people are really interested in. And when people are actually asking us questions, that’s great because that provides us ways to demonstrate the value and know that our audience is going to be listening when we deliver something. What is difficult is when we know the answer that they’re looking for, and the answer that we come up with is different. That can be challenging. But in general, I think a process that works with and in conjunction with all our audiences—as opposed to us just doling out the data—is a better, more effective model.”

Per Alamar, the two main goals of an analytics group are to provide new and/or actionable information, and to save time for decision makers. As good as a data team can be in managing and exploring data to solve team questions, though, the effectiveness of the operation is judged heavily by its ability to communicate the findings to audiences that will range from completely open-minded to those looking for validation of their gut. Alamar provides a simple but fairly common example in today’s NBA.

“Well, when a coach comes to you and says, ‘I think our problem is X, can you show me data that supports that, so I can show it to the player?’ It’s tough, particularly when that coach is new and you’ve worked with him for a couple of weeks, to come back and say, ‘Coach, no, actually you’re wrong,’” Alamar said. “If it’s not delivered well and it’s not delivered carefully, that’s a message that, ‘Oh, this guy doesn’t know what he’s talking about. I’ve been in the league for twenty years, I know that this is the problem. He’s just either dumb or he’s not very good at his job or he couldn’t deliver what I wanted.’ And so that’s a problem. Now, for the analytics person to find a way to support the coach’s answer, you don’t want to do that either because then you’re giving people the wrong information. So in the end, what you have to do is be really careful about how you present the information.”

Some current team analytics staffers echo Alamar’s sentiments. Alex Rucker started leading analytics for the Toronto Raptors in 2009, and his group developed one of the seminal analytics concepts that has made it to the public realm. In 2013, Grantland’s Zach Lowe detailed the group’s “ghost defenders” visual model, which showed, in animated form, where Raptors defenders ideally would be on a play as the actual offensive and defensive players (and the ball) were shown moving around the court. While the model may have been more high-concept art than something highly implementable for human players, there were some interesting takeaways from the computer-generated data, including the suggestion that defending teams should double-team the ball far more often and more aggressively than they currently do.

Like every other analytics staffer contacted, Rucker is not permitted to discuss specifics about his team or any of their players, but he faces many of the same challenges detailed above in terms of trying to find the right balance in the questions his group is trying to solve, and then how to disseminate that information most effectively.

“On the one hand, I think the analytics—the technical guys—they have a push responsibility to go through the data and find out . . . for themselves, to use their basketball knowledge, use the data set to gain some insights to some aspect of the game, and then hopefully push that to whoever is their more immediate consumer, which is usually the front office, the executive management level,” he said.

“But, at the other end, you do have kind of an end user, you do have a consumer, you do have somebody who is wanting something from this analytics thing, if you will. . . . [T]hey’re the ones paying for it, they’re the ones asking for things, and quite honestly, they do ask for specific things. ‘I want to know this.’ So, on our level, can we give you an answer to that, and if so, go about doing it? And at some level, it’s prioritizing between what they want to know and what you think is important to push, and just kind of finding a balance between those two things.”

Dean Oliver adds: “Any place I’ve been, I’ve tried to make sure that the communication, the translation between the words and the numbers—the basketball language—is clear, because you can do the greatest analysis behind the scenes, and if you can’t put it into basketball language, then it’s not going to get implemented right, even if they want to understand.”

But even when you do have an end product that is framed in a way that can be used by the basketball side of the operation, that doesn’t mean that the information will be used. There are tons of limitations—even in the current data-intensive world of basketball—in the output an analytics team can generate on a regular basis. When you then factor in that only a small percentage of that work likely even gets fully considered, let alone implemented, it’s difficult to pinpoint the specific value of analytics operations, even though every NBA team is investing in them to some degree.

As Rucker notes, there’s a difficult balance between maintaining the academic rigor of statistical exercises—which includes understanding and accepting the inherent uncertainty in most calculations
involving such a dynamic sport played by humans with imperfect decision making—and expressing enough confidence and certainty in your conclusions that they will be taken seriously by the basketball part of the operation.

“That’s an extraordinarily difficult balance to strike between being intellectually honest and truthful, which is saying, ‘Hey, this stuff I’m talking about has all sorts of limitations and disclaimers and caveats’—the academic side, if you will,” Rucker said. “As a data scientist, if I’m going to be any good, [I have to be] astutely aware of the limitations in my data and the errors that come with anything I create.

“But you’re dealing with people that don’t function in academia, that don’t live in that world, so you can’t very well offer everything with error terms and uncertainties and risk. I think that you have to—to me, I guess the right balance is to continuously openly acknowledge that there are limitations. This is not gospel, this is not truth, this is hopefully we’re turning a flashlight onto a part of the table that we haven’t looked at before, and hopefully offering some insight, but that might not be, we might not be seeing what we think we’re seeing.”

And even when the information is good, and is presented well, and is well received by the decision makers, that’s far from the only information being considered when a team makes any choice, whether it be on game strategy, pro player evaluation, the college draft, or trades.

“It’s really up to the coach or the general manager to distill all the information, because they’re getting information from all different sides, all different types of information,” Alamar said. “They have the really, really hard job of figuring out what’s the best set of information to use and how to utilize it. And so that’s difficult, because we’re right more than we’re wrong, but we’re wrong sometimes, and if a coach or general manager is really good at what they do, and is honestly weighing all of the information and working through it, they may choose not to do it.

“Not because there’s a breakdown, but because they see other information that is counter to what we see as more compelling to them. The trick is to make sure that they’re honestly weighing all the information. And you get them to a place where they will take what you’re saying seriously and think about it carefully. As long as they’re doing that, then that’s all in basketball I think you can honestly ask for.”
