Shane Lowry was the original guinea pig. On the ninth hole, a par 4 at TPC Toronto, the TV broadcast framed the Irishman in a small box on the lower left of the screen, face-on, with a graphic in the upper left promising a “tee shot prediction” using the colour of the shot tracer.

“Green is good, red is bad,” Steve Sands, the Golf Channel play-by-play voice on the call, said.

An overhead drone shot of the hole took up the rest of the screen, and when Lowry swung, a blue line traced the path of his ball. Approximately 1.2 seconds later, the blue line turned green, and when they cut to the ball at rest, there it was, safely in the fairway. The same process played out for Bob MacIntyre, but when Corey Conners teed off, the line turned briefly yellow at the 1.2-second mark, then shifted immediately to red. The ball landed in a bunker just off the fairway.

The feature known as “drone AR smart tracing”—the AR stands for “augmented reality”—debuted with Lowry’s shot on June 5 at the RBC Canadian Open, and represented years of work by the PGA Tour with six different partners (including NBC and CBS Sports). Tour officials view it as part of a broader initiative to improve its broadcasts, alongside speciality cameras, walk-and-talk interviews and other innovations. But this specific development can be … well … traced to one particular ambition. Jon Freedman, the senior vice president of media broadcasting at the tour, and Alex Turnbull, the SVP of production technology and analytics, kept hearing the same feedback: fans like shot-tracing, and they like drones. That inspired a shared vision.

“We were always saying, we’ve got to figure out how to trace from a drone,” Freedman said.

Working together with CBS, NBC and three outside companies—Bolt6 wrote the software, Virtual Eye designed the graphics and visualised the output, and Kaze Aerial Production provided the drones and the associated flight technology—they launched Live Drone AR in 2024 at the Travelers Championship. That package was wide-ranging, but can be described succinctly as “live aerial video with real-time analytics”—in other words, drones that can fly up to 50 mph cover the action and are coupled with statistics and graphics to help explain the sport itself, from strategy to course design to result. The tour owns the tech—they bought the IP from Bolt6—and it won them an Emmy earlier this year. The networks have free use of Drone AR for select tour events; it will be used for the tour’s three FedEx Cup Playoff events NBC is broadcasting, beginning with this week’s FedEx St. Jude Championship. But the technology can also be licensed for the majors, as CBS did this year for the PGA Championship and NBC did at the men’s and women’s U.S. Open, and will again for the Ryder Cup.

The new smart tracing that debuted in Canada is part of that overall package and represents the fruition of Freedman and Turnbull’s concept of combining drone tech with shot tracing and real-time analytics.

“The colour of the trace has never really told the story in televised golf,” Turnbull said. “It’s traditionally just been some colour the networks or sponsor prefer, and you don’t know, aside from shot shape, if the ball’s going to be in a good or bad place. So we thought, let’s have the colour of the trace be dictated by where the ball ends up.”

None of this, he and Freedman emphasised, would be possible without the new and improved ShotLink. As far as I knew, ShotLink was still operated the same way it had been when I began covering the PGA Tour in 2013, with volunteers shooting each ball with lasers that generated data on shot distance and result. In fact, I was behind the times—the technology had evolved significantly, and in the past two years has been rebuilt entirely. There are now approximately 150 data-capturing cameras on the golf course, working in sync with each other. At least two TrackMan radars can be found on each hole, collecting trajectory data and capturing every metric—ball-in-motion, tee shots, putt distances—with less reliance on humans than ever before. This is all the necessary infrastructure underlying the predictive capacity of the smart tracing tech, and at one point Turnbull referred to it as “the single source of truth” behind their innovations.


An example of the Smart Trace technology used during coverage of the playoff between Ryan Fox and Sam Burns at the RBC Canadian Open. (Image courtesy of the PGA Tour)

So, how does the prediction actually work? The underlying technology is proprietary and largely opaque to the average fan, but it is possible to paint a broader picture. ShotLink data is the foundational element, because along with the data it captures, it’s capable of “predicting” where a shot will go, and Drone AR is able to tap into those predictions to create real-time feedback not just on where the ball will land—that’s the easy part—but, crucially, how it will roll afterward. As Turnbull explained, it requires a “physics engine” that fires 10 predictions per second. While the shot tracer itself is almost instantaneous (this is the blue line that follows the ball), the predictive tracking with its red or green colour kicks in after 1.2 seconds, and from there, the feedback comes as a set of probabilities. The number of factors that have to be considered is staggering—not just ball flight and curve, but also the shape of the land, firmness of turf, wind direction and more. LiDAR mapping, which produces what is essentially a 3D model of the course, helps on that front, and through these mechanisms, the system can simulate with great accuracy how the ball will act after it lands. The system also “learns”: the tour has found that ShotLink’s predictions improve over the course of a round as it collects data on (for instance) ground conditions and rollout. All of this, clearly, is based on cloud computing—there’s no central hub where humans are generating the predictive data.
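To make the described pipeline concrete, here is a purely illustrative toy in Python. None of these names, numbers or formulas come from the tour's proprietary system; the only details taken from the article are the 10-predictions-per-second cadence, the 1.2-second colour delay, and the idea of a probabilistic rollout simulation informed by terrain and turf conditions.

```python
import random

# Toy sketch of a "smart trace" loop. Figures from the article: predictions
# fire at 10 Hz, and the trace stays neutral until the 1.2-second mark.
# Everything else (function names, the rollout formula, the 0.5 threshold)
# is invented for illustration.
PREDICTIONS_PER_SECOND = 10
COLOUR_DELAY_S = 1.2   # trace stays blue until this point in the flight
GOOD_THRESHOLD = 0.5   # >= 50% chance of fairway -> green, else red

def simulate_rollout(landing_x, turf_firmness, slope, rng):
    """Crude rollout model: firmer turf and steeper slope add more roll,
    plus a little random noise standing in for bounce variability."""
    roll = turf_firmness * 10 + slope * 5 + rng.uniform(-2, 2)
    return landing_x + roll

def fairway_probability(landing_x, turf_firmness, slope, fairway, n=100, seed=0):
    """Monte Carlo estimate of P(final resting spot inside fairway bounds)."""
    rng = random.Random(seed)
    lo, hi = fairway
    hits = sum(
        lo <= simulate_rollout(landing_x, turf_firmness, slope, rng) <= hi
        for _ in range(n)
    )
    return hits / n

def trace_colour(elapsed_s, p_fairway):
    """Neutral blue before the 1.2-second mark, then green/red by probability."""
    if elapsed_s < COLOUR_DELAY_S:
        return "blue"
    return "green" if p_fairway >= GOOD_THRESHOLD else "red"
```

In this sketch, a broadcast loop would call `fairway_probability` ten times per second as the radar refines its landing estimate, then pass the latest probability to `trace_colour`; early in the flight the trace reads "blue" regardless, and only after 1.2 seconds does it commit to green or red.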

The advantage of the drone is that it can fly all across the course and deploy this technology for any tee shot. The disadvantage is that there’s only one drone (the networks will typically have a second in the air for “beauty shots”), so at the moment it’s mostly limited to the leaders or featured groups, but Turnbull told me there are plans to make the technology scalable to traditional cameras around the course.

Obviously, there are things that they can’t plan for—the ball hits an acorn in the middle of the fairway and bounds into the left rough—and they’re cautious about deploying the tech on certain holes that are notoriously hard to predict because of slope or other obstacles near the fairway. But even those outliers, or the very rare times when the tracer gets it wrong, end up feeding the algorithm to make it more accurate in the future.

It’s eye-popping technology, and the more you learn, the more impressive it becomes, but there’s a bigger question at play that may be overshadowed. The idea behind any innovation like this is to create a more entertaining television product. So, is it working? In the case of Smart Tracing, the essential tension is whether it’s more fun to see the colours on the tracer and have a highly accurate idea of where the ball will go, or if it’s better to wait to see it land—in other words, does the technology rob the viewer of that moment of uncertainty and anticipation which makes up one of the more dramatic parts of the TV experience? To put it even more simply: Is this a spoiler?

Personally, I had mixed feelings. The basic shot tracer itself is a kind of spoiler, but my opinion, and I think the overwhelming consensus, is that it’s a huge net positive for any golf broadcast. The predictive tracing is a little different—the shot tracer lets us see that the ball might be close to trouble, but the appearance of a red or green tracer after 1.2 seconds undercuts some drama. On the other hand, there’s a part of me that likes knowing and wonders if, rather than robbing the suspense, the tech is just shifting the moment of suspense a bit earlier. But then I argue back—why move the suspense from the analogue moment of a ball hitting the turf to the digital one of a fake line changing colour? Unlike the basic shot tracer, which solves the problem of not knowing where a shot went, I found it harder to explain the need for smart tracing.

I conducted some informal surveys, but the results were equally unclear. On X, replies were negative on the innovation by about a 3:1 margin. A survey of 15 friends big into golf produced the opposite result, by almost exactly the inverse ratio. A Reddit thread turned out roughly even, while a poll of Golf Digest staffers ended up about 2:1 against. None of this is scientific or especially meaningful except as a snapshot—even if the majority of reactions were negative, it’s also true that people with negative opinions are more motivated to speak out—but I thought Golf Digest’s Jamie Kennedy was both eloquent and representative in his opposition.

All of this was a matter of significant concern to Turnbull and Freedman as the technology developed. There was a spirited debate about this exact topic, they said, especially in the early stages, but ultimately, the potential of telling more granular stories—fairway vs. rough is just the beginning, according to Turnbull—won out.

“You’re going to have a few naysayers of, ‘Hey, it’s taking the suspense away,'” said Freedman. “But I was with someone yesterday involved with the production who was saying, ‘This is a game changer. I don’t want to be a network that doesn’t have this.'”

“You don’t want to implement a new technology for the sake of doing something flashy unless it really tells a story,” Turnbull added. “You want it to enhance the drama. And I think the way I’ve looked at it is it’s not so much a prediction. Having the perspective of the drone that shows you a different angle of the course than you’ve ever seen before, and seeing the entirety of a hole in the perspective of a player hitting a shot and then adding this colour context, that’s just real-time feedback. It’s not taking away the drama; it’s becoming part of the story.”

Detractors often point to hypotheticals in other sports—you clearly wouldn’t want a graphic telling you whether a Steph Curry three-pointer would go in the instant the shot left his hand—but Turnbull pointed to televised poker, with cards on the screen and odds percentages that provide needed context on the hand. His point is well-taken; this isn’t exactly like other sports, and the “spoiler” is only a matter of degrees and seconds.

It’s worth noting, though, that a precursor to Drone AR called “Predicta-ball,” which graphically predicted a ball’s landing spot on par 3s, seems to have been discontinued after debuting two years ago, though the underlying technology is still in use in some capacity. It’s not the same as smart tracing, but it serves as proof that there are cases where predictive innovations detract from the viewing experience.

There is no consensus yet on whether smart tracing is additive, though strong opinions abound on both sides. The fundamental question here isn’t different from other debates raging around modern tech. Freedman said there wasn’t specific market research done with fans beforehand on smart tracing—though they do receive post-facto feedback from the Fan Council on topics like these—and without drawing a direct equivalence to more impactful technologies, it’s hard to ignore that the choice to implement came unilaterally from above. Which is their job, clearly, and in this case a course correction is theoretically simple; as with Predicta-ball, they can just stop doing it. Yet even in the relatively low-stakes sphere of tee shots on a golf broadcast, the question of “can we predict the future?” has been answered resoundingly and impressively, while the question of “should we?” hangs in the balance.


Main Image: CBS / PGA Tour