Case Study

Building an experimentation engine to identify, fund, and validate sports tech research opportunities

SportsX was MLSE’s internal R&D program designed to turn early-stage sports tech ideas into decision-ready projects. The core unlock was runway: dedicated capital reserved for research and development. That funding model made future-facing work possible without forcing immediate ROI, giving teams the space to do proper research, prototype with intent, and validate ideas with real evidence.

The program created a repeatable way to source ideas across the organization, evaluate them with a consistent scorecard, and move the strongest concepts into funded in-flight work without relying on ad hoc pilots or subjective decision-making.

Submissions Generated

Ideas submitted across R&D streams

Projects Approved

Projects funded and put into flight within the first year

Partnership Investment ($M)

Role

Co-founder and Director of Product Design

Scope

Program framework, portfolio intake, evaluation model, experiment framing, and delivery on select R&D projects

Team

SportsX Review Committee + cross-functional project pods (Product, Design, Engineering, Data, Ops, Partnerships)

Timeline

2021-2022

Overview

SportsX created a structured pipeline for R&D innovation across MLSE.

Three R&D streams to balance exploration and execution

  • Basic Research: contribute to knowledge where published work is limited

  • Applied Research: solve known problems with practical outcomes

  • Experimental Development: prototype-driven work that can transition into products or processes

Standard governance to move from ideas to funded projects

Comparable evaluation using a consistent scorecard across submissions

Program storytelling through clear project one-pagers and committee-ready artifacts

The Challenge

MLSE had high-potential ideas, partners, and emerging technology opportunities, but early-stage innovation needed a system to scale. Without a consistent process, great concepts were hard to evaluate, hard to fund, and even harder to move into execution with confidence.

Inconsistent early-stage rigor

Ideas entered the pipeline in different formats and with different levels of research. That made it difficult to separate “interesting” from “decision-ready.”

No shared way to prioritize

Teams evaluated opportunities through different lenses. Without a common rubric, prioritization relied too heavily on opinion, urgency, or who was advocating the loudest.

Innovation competed with short-term delivery

Future-facing work struggled to earn time and resources because immediate ROI was easier to justify. The result was underpowered discovery and shallow validation.

My Role

I co-founded SportsX and owned the framework that made the program scalable as a system, not just successful as a set of one-off projects. I was responsible for setting the structure behind how ideas entered the program, how they were evaluated, and how they were communicated in a way that enabled confident funding decisions.

Helped define how projects should be framed for committee review (problem, hypothesis, questions, impact); a sketch of that framing follows this list

Supported the program narrative and artifacts that made decisions easier (one-pagers, scorecard logic, pitch readiness)

Led delivery on select R&D projects, moving concepts from research and prototyping into early validation and execution

Built a standardized selection model so projects could be compared across streams

Worked cross-functionally to align constraints and execution realities (capability, time, operational lift)

Owned pitch narratives for committee approval, presenting projects, and driving alignment through formal review and voting
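
As a rough illustration of that framing, the sketch below captures a committee-ready brief as a small typed structure. The field names, stream labels, and example values are hypothetical assumptions for illustration, not the actual SportsX template.

```typescript
// Hypothetical sketch of a committee-ready project brief.
// Field names and example values are illustrative assumptions,
// not the actual SportsX template.

type Stream = "Basic Research" | "Applied Research" | "Experimental Development";

interface ProjectBrief {
  title: string;
  stream: Stream;
  problem: string;             // the gap or opportunity being addressed
  hypothesis: string;          // what we expect to happen, and why
  researchQuestions: string[]; // what the experiment must answer
  expectedImpact: string;      // the outcome that would justify further funding
  successCriteria: string[];   // evidence needed for a confident go/no-go call
}

// Illustrative example only; the details are not drawn from the real study.
const exampleBrief: ProjectBrief = {
  title: "The Entry Experience Study",
  stream: "Applied Research",
  problem: "The live event experience effectively begins at the gate, not before arrival.",
  hypothesis: "Starting the experience pre-arrival improves how fans rate entry.",
  researchQuestions: ["When does the fan's event experience actually begin?"],
  expectedImpact: "A validated model for pre-arrival engagement.",
  successCriteria: ["Consistent signal from fan research and a live-event pilot."],
};
```

Framing every submission with the same fields is what made committee comparison possible, regardless of which stream a project entered through.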

Strategy: 3 Core Pillars

Evidence-first delivery in real conditions

We prioritized prototypes and pilots that could generate signal quickly, then tested them in real operational contexts. Each project produced measurable learnings, a clear recommendation (scale / iterate / stop), and a documented retrospective so the program’s knowledge compounded over time.

Runway with accountability

SportsX created protected R&D capital so future-facing work could happen without forcing immediate ROI. In exchange, every project had clear decision gates and defined outcomes—so the runway produced confident go/no-go calls, not open-ended exploration.

Comparable decisions through a shared rubric

We standardized how ideas were framed and evaluated. A consistent scorecard and committee vote made projects comparable across teams and streams, reduced subjective debate, and increased the quality of proposals moving into funding.
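
As a minimal sketch of how a shared rubric turns ratings into comparable, explicit decisions, the example below uses a weighted scorecard with go/no-go thresholds. The criteria, weights, and thresholds are assumptions for illustration, not the actual SportsX rubric.

```typescript
// Minimal weighted-scorecard sketch; criteria, weights, and thresholds are
// illustrative assumptions, not the actual SportsX rubric.

interface CriterionScore {
  criterion: string;
  weight: number; // relative importance; weights across the scorecard sum to 1
  rating: number; // committee rating on a shared scale, e.g. 1-5
}

type Decision = "go" | "revise" | "no-go";

// A weighted total lets submissions from different streams sit on one scale.
function weightedScore(scores: CriterionScore[]): number {
  return scores.reduce((total, s) => total + s.weight * s.rating, 0);
}

// Explicit thresholds turn a comparable score into a funding decision.
function decide(scores: CriterionScore[], goAt = 3.5, noGoAt = 2.0): Decision {
  const total = weightedScore(scores);
  if (total >= goAt) return "go";
  if (total <= noGoAt) return "no-go";
  return "revise";
}

// Example: one submission rated against a shared set of criteria.
const submission: CriterionScore[] = [
  { criterion: "Strategic fit", weight: 0.3, rating: 4 },
  { criterion: "Strength of evidence", weight: 0.3, rating: 3 },
  { criterion: "Feasibility within the season", weight: 0.2, rating: 4 },
  { criterion: "Operational lift", weight: 0.2, rating: 3 },
];

console.log(weightedScore(submission).toFixed(2), decide(submission)); // "3.50" "go"
```

The value of the rubric is less in the arithmetic than in the agreement it forces up front about which criteria matter and how much.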

R&D Projects - Season 1

THE ENTRY EXPERIENCE STUDY

A study focused on how we might begin the live event experience before fans even get to the arena.

THE METAVERSE COMMUNITY EXPERIMENT

Creating a linked metaverse and physical world community to elevate the fan experience

THE NFT PHYSICAL WEARABLES EXPERIMENT

An experiment pairing NFTs with physical wearables to elevate the fan experience

UNCONSCIOUS BIAS IN SOCCER TALENT EVALUATION

A study focused on reducing human bias in soccer scouting practices

AR TRAINING EXPERIENCE

Experimental application of Augmented Reality as a tool for basketball training.

PSYCHOLOGICAL RESILIENCE OF NBA PLAYERS & DRAFT PROSPECTS

A study to understand the performance resilience of draft prospects and NBA players

Program Impact - Season 1

In its first season, SportsX turned early-stage innovation from an abstract ambition into an operating system. A shared intake model and evaluation rubric replaced opinion-driven debate with comparable, evidence-based decisions. Teams spent less time arguing for ideas and more time testing them.


Protected R&D runway changed behavior. Projects showed up with clearer hypotheses, stronger research, and explicit decision criteria. Low-signal ideas were stopped earlier, while high-potential work moved into execution with confidence. As a result, time to decision dropped, project acceptance rates increased, and roadmap waste was materially reduced.


Just as importantly, the program aligned stakeholders across product, partnerships, data, and operations around a single moment of truth. Innovation became visible, accountable, and repeatable. This allowed us to set a foundation the organization could scale beyond the first season.
  • Stakeholder Alignment Score

    Increase in stakeholder alignment scores after the first Innovation Forum established a shared review and voting moment.

  • Roadmap Waste

    Reduction in roadmap waste by stopping low-signal ideas earlier, before resourcing and delivery commitments.

  • Project Acceptance Rate

    Increase in project acceptance rate after introducing a standardized rubric and committee-based voting.

  • Time-to-Decision

    Reduction in time-to-decision from intake to go/no-go by shifting to a repeatable review cadence and decision-ready artifacts.

  • Cross-Functional Participation

    Increase in cross-functional participation in early-stage innovation through an org-wide intake and clearer project framing.

  • Decision Readiness

    Increase in “decision-ready” proposals by standardizing one-pagers, scorecards, and success criteria.

Learnings

Runway changes behavior

When funding is protected, teams invest in stronger research, better prototypes, and clearer measurement instead of shortcutting to “something shippable.”

Streams prevent portfolio collapse

Separating Basic / Applied / Experimental kept the program balanced between long-term learning and near-term execution.

Make the decision explicit

The best proposals defined the decision upfront: what we would do if results were strong, weak, or inconclusive.

Moments of visibility create momentum

The annual Innovation Forum made the work real to the broader org and created accountability through shared review and voting.

Let's BUILD SOMETHING MEANINGFUL

Whether you're building something new or scaling what you have, I'd love to hear about your challenge.

Eugene Joe

Strategic Design Leader for AI-Native Teams

© 2026 Eugene Joe. All rights reserved.
