I study how the criteria used to evaluate technological innovations emerge, persist, and change. In many industries, what counts as a successful innovation is defined by criteria that seem objective but are shaped by strategic choices. When those criteria fall short of capturing how innovations actually perform in market deployment, who proposes new evaluation criteria, and why are some firms able to do this while most are not? My dissertation develops a theory of evaluative evolution: how the criteria used to evaluate innovations change, which firms drive that change, and how technological trajectories and competitive dynamics shift once it occurs.

I study these questions in the pharmaceutical industry, where clinical trial endpoints determine which therapies reach patients, and where the wrong criteria can mean approved drugs that don't actually help. My training in law, biostatistics, and strategic management leads me to see evaluation criteria as simultaneously technical design choices and sites of strategic contestation. I have been invited to present my research at major pharmaceutical firms.

I am on the 2026–2027 academic job market.

Research Interests

  • Innovation Strategy
  • Evaluation of Innovations
  • Organizational Learning
  • Science-Based Industries

Methods

  • Causal Inference / Econometrics
  • Natural Language Processing
  • Medical Concept Classification Systems
  • Computational Text Analysis

Dissertation: Evaluative Evolution

The dissertation consists of three interconnected studies. The first asks which firms recognize and propose new criteria when existing ones fail. The second examines how evaluators decide which proposals to endorse, a problem structurally harder than evaluating innovations under fixed criteria, because the very dimensions being proposed are new. The third traces what happens after endorsement: how new criteria redirect technological trajectories, and why the firms most likely to adopt new criteria are not always the ones that benefit most from them.

Because newly endorsed criteria are themselves satisficing — good enough to guide decisions, but never fully capturing real-world performance — evaluative evolution is recursive, and each new criterion sows the seeds of its own eventual revision. These three studies open a broader research program — including how evaluators learn through the evaluative evolution process, the durability of industry convergence around new criteria, the dynamics of successive evaluative cycles, and the extension of evaluative evolution to medical devices, financial regulation, and environmental standards.

Working Papers

Vision or Delusion? How Evaluation Sequence Anchors the Assessment of Novelty

Yunxiang Bai, Subrina Shen, & Melody Chang

In preparation for submission to Strategic Management Journal

Organizations select against novel ventures even when they explicitly seek novelty. The literature diagnoses this as a problem of obscured vision — evaluators fail to see the upside. But evaluators do score both upside and risk. This study argues that the penalty arises not only from how they see each dimension, but also from how they synthesize conflicting dimensions into an overall judgment.

On Giants' Shoulders While Keeping Others Off of Yours: Engagement in Science and Firm Generative Appropriability

Francisco Polidoro & Yunxiang Bai

Presented at SMS Annual Conference, Istanbul, 2024

Engaging in public science creates knowledge that rivals can freely use — so does it ultimately help or hurt the publishing firm? The literature has treated this as a single tradeoff, but tracing four decades of knowledge flows reveals that the answer depends on a temporal distinction the literature has not drawn.

Publications

Mitigating Nonattendance Using Clinic-Resourced Incentives Can Be Mutually Beneficial: A Contingency Management-Inspired Partially Observable Markov Decision Process Model

Yunxiang Bai & Björn P. Berg

Value in Health, 24(8), 1102–1110, 2021

Teaching

I am prepared to teach courses in strategic management, innovation strategy, and research methods.

General Management & Strategy

Instructor of Record

Undergraduate / Master's · UT Austin McCombs · Summer 2024

Instructor rating: 5.00 / 5.00 · Course rating: 4.89 / 5.00

Biostatistical Literacy

Teaching Assistant

University of Minnesota · 2019–2020

Education

Ph.D. in Management, University of Texas at Austin (Expected 2027)

M.S. in Biostatistics, University of Minnesota (2021)

LL.B., Tsinghua University (2018)

Selected Awards

Outstanding Graduate Research Fellowship (2026–2027)

McCombs Dean's Fellowship (2025–2026)

Cooper Fellowship (2025–2026)

Graduate School Continuing Fellowship (2024–2025)

Conference Presentations

Strategic Management Society Annual Conference (2024, 2023)

Academy of Management Annual Meeting (2024)


Office

2110 Speedway
Austin, TX 78712
