Who Is It For?

Build, Bin, Boost is designed for business school teaching, particularly:

Masters in Management

Programmes covering innovation and technology strategy

MBA Electives

R&D management and corporate entrepreneurship

Undergraduate Electives

Innovation management, entrepreneurship, or technology strategy

Executive Education

Programmes for R&D leaders and product managers

Learning Objectives

Through gameplay, students learn to:

  • Construct and actively manage a balanced R&D portfolio within a fixed budget
  • Select and weight evaluation criteria, and recognise how those choices bias project selection
  • Exercise kill discipline in the face of sunk costs, escalation of commitment, and political pressure
  • Weigh the cost of false positives (funding failures) against false negatives (missed winners)

Concepts Explored in the Simulation

The game exposes students to 20 core concepts from innovation management, strategy, and behavioural decision-making.

R&D Portfolio Management

The discipline of selecting, funding, and managing a collection of innovation projects to achieve strategic objectives. Rather than evaluating projects in isolation, portfolio management considers how projects interact, compete for resources, and collectively balance risk and reward.

In the game: Students manage an entire R&D portfolio over two years, making selection, funding, and termination decisions while responding to market events and board expectations.

Portfolio Balance

A well-managed portfolio balances incremental projects (safer, shorter-term returns) with transformational projects (riskier, but potentially breakthrough). Portfolios that are too conservative miss opportunities; those that are too aggressive risk catastrophic failure.

In the game: The board evaluates portfolio composition. Scoring penalises portfolios that are over 80% incremental or over 80% transformational, and rewards a balanced mix.
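
As a rough illustration of that rule, a balance check might look like the sketch below. The function and field names are illustrative only and are not taken from the simulation's source.

function checkPortfolioBalance(projects) {
  // projects: array of objects such as { type: "incremental" } or { type: "transformational" }
  const total = projects.length;
  if (total === 0) return "empty";
  const incrementalShare = projects.filter(p => p.type === "incremental").length / total;
  const transformationalShare = projects.filter(p => p.type === "transformational").length / total;
  if (incrementalShare > 0.8) return "too conservative";    // penalised
  if (transformationalShare > 0.8) return "too aggressive"; // penalised
  return "balanced";                                        // rewarded
}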

Strategic Alignment & Bucket Allocation

Organisations typically define strategic priorities and allocate R&D spending across “buckets” that reflect those priorities. Strategic alignment ensures innovation investments serve the organisation’s direction, not just individual project merit.

In the game: The board sets four strategic priorities. Students receive bonuses for covering all buckets and penalties for over-concentrating in one area.

Technology Readiness Levels (TRLs)

A 1–9 scale originally developed by NASA to assess the maturity of a technology, from basic research (TRL 1) through to production-ready systems (TRL 9). TRLs help decision-makers understand how much development risk remains and how far a project is from market.

In the game: Every project displays its current and target TRL. Lower TRLs indicate higher technical risk but potentially greater novelty. Students learn to factor technology maturity into investment decisions.

Multi-Criteria Decision Analysis

A structured approach to evaluating options against multiple criteria, each carrying a different weight. The choice of criteria and their weights shapes which projects score highest—and which get overlooked. Every criterion has trade-offs: what it catches and what it misses.

In the game: Students select from 15 criteria across technical, strategic, market, and financial categories, then assign weights. End-game analysis reveals how their criteria choices systematically biased their portfolio.
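
A weighted-sum score of this kind can be sketched as follows; the four criteria and weights shown are illustrative placeholders, not the simulation's actual 15-criterion set.

// Each criterion is scored 0-10 and multiplied by its weight; the weights sum to 1.
const weights = { feasibility: 0.3, novelty: 0.2, strategicFit: 0.3, marketSize: 0.2 };

function scoreProject(criterionScores) {
  return Object.entries(weights)
    .reduce((total, [criterion, weight]) => total + weight * (criterionScores[criterion] ?? 0), 0);
}

// A project strong on feasibility but weak on novelty:
scoreProject({ feasibility: 8, novelty: 3, strategicFit: 7, marketSize: 5 }); // = 6.1

Shifting weight from feasibility to novelty would rank a different set of projects at the top, which is exactly the kind of systematic bias the end-game analysis reveals.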

Stage-Gate Review

A project management approach where projects pass through a series of “gates”—formal review points where decision-makers evaluate progress and decide whether to continue, modify, or terminate. Stage-gates impose discipline on the innovation process and create structured moments for portfolio tending.

In the game: Each quarter serves as a gate. Students receive traffic-light status updates and must decide whether to continue, reduce, or kill each project based on the signals.

Active Portfolio Tending

Portfolio management is not a one-time selection exercise. It requires ongoing attention—monitoring projects, reallocating resources, accelerating winners, and cutting losers. The “Build, Bin, Boost” framework captures the three core actions: invest in promising projects, terminate underperformers, and accelerate breakthroughs.

In the game: Students who passively let projects run receive lower scores. Active management decisions—continuing, reducing, boosting, or killing—are tracked and rewarded.

Kill Discipline

One of the hardest skills in innovation management. Organisations routinely keep failing projects alive too long due to sunk costs, political pressure, or optimism bias. Good kill discipline means terminating projects early when the signals warrant it, freeing resources for more promising work.

In the game: Kill decisions are scored based on timing and project type. Killing a failing incremental project early earns points; killing a transformational project too hastily is penalised. The end-game reveals which kills were correct.

Sunk Cost Fallacy

The tendency to continue investing in a project because of previously invested resources (time, money, effort) rather than evaluating it on its future prospects alone. Past spending is irrecoverable and should not influence forward-looking decisions—but in practice, it often does.

In the game: Students inherit legacy projects with millions already spent. The display shows “sunk cost” figures, forcing students to confront whether past spending should influence their decision to continue or kill.

Escalation of Commitment

A pattern where decision-makers increase their investment in a failing course of action, often because admitting failure threatens their reputation or status. Related to sunk cost fallacy but also driven by ego, political pressure, and organisational inertia.

In the game: Several legacy projects come with political baggage—the CTO staked his reputation on one, a VP championed another. Killing them means writing off millions and damaging relationships, creating realistic escalation pressure.

Selection Bias & Blind Review

Evaluators may unconsciously favour or penalise project proposals based on the identity, reputation, or track record of the proposing team rather than the idea itself. Blind review—hiding team information during evaluation—is one mechanism for reducing this “person bias.”

In the game: Students can choose to enable blind review, hiding team details during project evaluation. The end-game analysis reveals whether blinding changed their selection patterns.

Committee Composition & Collective Bias

Who sits on the evaluation panel determines what gets funded. Each committee member brings their own expertise, preferences, and blind spots. A finance-heavy panel may undervalue exploration; a technology-heavy panel may overlook market fit. The composition of the panel is itself a strategic decision.

In the game: Students assemble an evaluation committee from staff members with explicitly defined biases and scoring effects. The end-game reveals how committee composition shaped the portfolio.

Exploitation vs. Exploration

A central tension in innovation strategy. Exploitation means refining and optimising existing capabilities for short-term returns. Exploration means searching for new knowledge and opportunities with uncertain payoffs. Ambidextrous organisations manage to do both simultaneously.

In the game: Portfolios that are all exploitation (incremental projects) receive career penalties. The board expects a mix that includes exploratory, transformational investments alongside safe bets.

Resource Allocation Under Constraints

In the real world, R&D budgets are finite. Every project funded means another project is not. Resource allocation is a zero-sum exercise that forces explicit trade-offs between competing opportunities, making prioritisation essential.

In the game: Students operate within a fixed annual budget. They can request extra funding from the CEO, but doing so raises the performance bar they will be evaluated against.

Selection Errors (Type I & Type II)

Two types of selection error carry different costs. A false positive (Type I) means funding a project that fails—wasting investment. A false negative (Type II) means rejecting a project that would have succeeded—a missed opportunity. Most organisations focus on avoiding false positives, but false negatives are often more costly in the long run.

In the game: The scoring system tracks both error types. End-game counterfactual analysis reveals which rejected projects would have been winners, making the cost of missed opportunities concrete.
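
Once counterfactual outcomes are known, tallying the two error types is simple bookkeeping. The sketch below assumes a hypothetical record of decisions and eventual outcomes; the field names are illustrative.

// decisions: array of { funded: boolean, wouldHaveSucceeded: boolean }
function tallySelectionErrors(decisions) {
  let typeI = 0;  // false positives: funded projects that failed
  let typeII = 0; // false negatives: rejected projects that would have succeeded
  for (const d of decisions) {
    if (d.funded && !d.wouldHaveSucceeded) typeI++;
    if (!d.funded && d.wouldHaveSucceeded) typeII++;
  }
  return { typeI, typeII };
}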

ROI & Financial Discipline

Return on investment is a critical metric for R&D evaluation, but over-reliance on financial projections can kill ambitious long-term bets. Early-stage projects have highly uncertain returns, and strict ROI thresholds systematically bias portfolios toward incremental, predictable projects.

In the game: ROI is one of six board evaluation dimensions. Students who over-weight financial criteria may achieve short-term returns but miss transformational opportunities, resulting in lower career outcomes.

Bootleg / Skunk Works Innovation

Not all innovation comes through formal channels. “Bootleg” or “skunk works” projects are unofficial initiatives developed by teams working outside the sanctioned process, often using spare time or diverted resources. These grassroots efforts can produce breakthroughs that formal processes miss, but they also bypass governance.

In the game: Team members occasionally pitch unauthorised projects they have been developing informally. Students must decide whether to bring these bootleg innovations into the official portfolio or shut them down.

AI-Augmented Decision Making

AI tools can assist R&D decision-making by analysing patterns in historical data to score and rank projects. However, AI systems have systematic biases—they overweight quantifiable metrics, display false precision, and struggle to evaluate genuinely novel ideas that lack historical precedent.

In the game: The optional AI advisor “Devi” provides project scores and recommendations. End-game analysis compares AI accuracy to the student’s own accuracy, revealing when algorithmic advice helps and when it misleads.

Organisational Politics & Stakeholder Management

Innovation decisions do not happen in a vacuum. Executives champion pet projects, teams lobby for their proposals, and budget decisions carry political consequences. Effective R&D leaders must navigate these dynamics while maintaining analytical rigour.

In the game: Legacy projects come with political context—VP-championed initiatives, CTO reputation stakes, orphaned pet projects. Dilemma events arrive from above, below, and across the organisation, forcing students to balance analytical judgment with political reality.

Counterfactual Analysis

Understanding what would have happened is as important as understanding what did happen. Counterfactual analysis examines the outcomes of decisions not taken—projects rejected, projects killed, opportunities passed over. This reveals hidden costs of selection processes and biases that are invisible when looking only at funded projects.

In the game: The end-game reveals the fate of every project—including those the student rejected or killed. Students see which rejected proposals would have succeeded and which funded projects would have failed regardless, enabling a full accounting of their decision quality.

How to Use in Teaching

The simulation works well as:

Pre-Class Preparation

Students play before a session on portfolio management, bringing their results and experiences to the discussion. The downloadable case study PDF provides a ready-made artefact for class.

In-Class Activity

A 60–90 minute guided session where students play and then debrief together. Works well in computer labs or with students on their own laptops.

Assessment Component

Students submit their downloaded case study PDF alongside a reflection report analysing their decisions and outcomes. The simulation generates unique results for each playthrough.

Group Exercise

Teams play together, discussing and debating each decision. This mirrors real-world committee dynamics and adds a collaborative dimension to the learning.

Team Play & Competition

Build, Bin, Boost works as both an individual and team activity. Here are several ways to introduce collaboration and competition:

Team-Based Play

Assign students to teams of 3–5 and have them play the simulation together on a single screen. Teams must discuss and agree on each decision—which projects to fund, what to bin, and what to boost. This mirrors real-world R&D committee dynamics and produces rich debate about risk, strategy, and resource allocation.

Competitive Tournaments

Run the simulation as a class tournament. Each team (or individual) plays independently; the class then compares final scores—revenue generated, portfolio success rates, and board ratings. Rank teams on a leaderboard and discuss why different strategies led to different outcomes. This adds energy and motivation to the learning experience.

Role Assignment

Within teams, assign roles that mirror real organisations: a Chief Technology Officer, a Finance Director, and a Strategy Lead. Each role-holder advocates for decisions from their perspective, creating productive tension and forcing negotiation—just like real portfolio governance.

Cross-Team Debrief

After play, have teams present their strategies and results to the class. Compare approaches: Did cautious teams outperform risk-takers? Did committee-based selectors beat individual decision-makers? The variation in outcomes provides rich material for discussion about portfolio management theory.

Pairing with Case Studies

The simulation is designed to complement traditional case-based teaching. Consider pairing it with:

Before a Case Discussion

Have students play the simulation before studying a real-world R&D portfolio case. Their gameplay experience gives them first-hand intuition about the trade-offs and pressures involved in project selection, making the case discussion more grounded and personal.

After a Case Discussion

Use the simulation to test whether students can apply lessons from the case. After analysing how a company managed its innovation portfolio, students play the simulation and reflect on whether they followed the same principles—or fell into the same traps.

Suggested Case Pairings

The simulation pairs well with cases on R&D portfolio management, stage-gate processes, innovation strategy, and corporate venturing. It is particularly effective alongside cases that explore how large firms manage project selection under uncertainty and organisational politics.

Reflective Comparison

Ask students to write a short reflection comparing their simulation decisions to the case protagonist's choices. What did they do similarly? Where did they diverge? What does this reveal about their own decision-making biases?

Discussion Questions

Suggested questions for classroom debrief:

  • Which projects did you kill, and when? With hindsight, were your kill decisions well timed?
  • How did your choice of evaluation criteria and their weights bias the portfolio you built?
  • Did sunk costs or political pressure lead you to keep any legacy project alive longer than the signals warranted?
  • What did the end-game counterfactual analysis reveal about your false positives and false negatives?
  • If you enabled blind review or the AI advisor, how did they change your decisions?

Technical Requirements

What You Need

  • Modern web browser (Chrome, Firefox, Safari, Edge)
  • Desktop or laptop (recommended)
  • Internet connection to load the page

What You Don't Need

  • No installation required
  • No student accounts or logins
  • No license fees
  • No IT department involvement

Licence & Usage Rights

Build, Bin, Boost is released under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International (CC BY-NC-SA 4.0) licence. This means:

You Are Free To

  • Use — Play and deploy the simulation in your courses at no cost, with no per-student charges or institutional agreements
  • Adapt — Modify the source code, change scenarios, add new projects, adjust parameters, or translate into other languages
  • Share — Redistribute the original or your modified version to colleagues and other educators
  • Host — Run your own copy on your institution’s servers or learning management system

Under These Conditions

  • Attribution — Give appropriate credit to the original author (Ammon Salter, Warwick Business School, University of Warwick) and indicate if changes were made
  • NonCommercial — You may not use the material for commercial purposes (selling access, including in paid training packages, etc.)
  • ShareAlike — If you remix or build upon the simulation, you must distribute your contributions under the same CC BY-NC-SA 4.0 licence

No paperwork required. You do not need to contact us for permission to use the simulation in your teaching. Simply use it, cite it, and share any improvements with the community.

Customise & Develop New Versions

Build, Bin, Boost is designed to be modified and extended. The entire simulation is a single HTML file with no server dependencies, making it straightforward to customise:

Modify the Scenario

Change the company setting, industry context, or strategic priorities to match your teaching focus. The simulation’s project database, event system, and scoring parameters are all configurable within the source code.

Add New Projects

Create new R&D project proposals that reflect specific industries, technologies, or innovation challenges relevant to your curriculum. Each project has attributes for feasibility, novelty, strategic alignment, and team capability.
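
A new project entry might take a shape like the one below. The field names are purely illustrative, so check the actual schema in the source file before editing.

// Hypothetical project entry; not the simulation's actual data structure.
const newProject = {
  name: "Solid-State Battery Cell",  // example proposal
  type: "transformational",          // or "incremental"
  currentTRL: 3,                     // technology readiness today
  targetTRL: 7,                      // maturity aimed for at completion
  feasibility: 5,                    // 0-10
  novelty: 9,                        // 0-10
  strategicAlignment: 8,             // 0-10
  teamCapability: 6                  // 0-10
};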

Adjust Difficulty & Parameters

Tune the budget constraints, number of rounds, scoring weights, or event probabilities to create easier or more challenging versions for different student levels—from undergraduate introductions to executive masterclasses.
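
In practice this usually means changing a handful of constants. The names and values below are illustrative placeholders rather than the actual variables in the source.

// Hypothetical tuning parameters for an easier or harder variant.
const tuning = {
  annualBudgetMillions: 50,  // assumed budget ceiling
  quartersPerGame: 8,        // two years of quarterly gates, as in the default scenario
  eventProbability: 0.3,     // assumed chance of a market event each quarter
  balanceThreshold: 0.8      // share of one project type that triggers a balance penalty
};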

Build Discipline-Specific Versions

Adapt the simulation for pharmaceutical R&D, software product development, clean energy innovation, or any other sector. The modular design supports creating multiple variants from the same codebase.

Open Source & Community

Build, Bin, Boost is an open-source project that welcomes contributions from educators, developers, and researchers worldwide:

Source Code on GitHub

The full source code is available on GitHub. Fork the repository to start building your own version, or submit pull requests to improve the core simulation for everyone.

Community Contributions Welcome

We encourage educators to share their adaptations, additional project scenarios, teaching guides, and assessment rubrics. By contributing back to the project, you help build a richer resource for the entire teaching community.

Report Issues & Suggest Features

Found a bug or have an idea for improvement? Open an issue on GitHub. We actively review feedback and incorporate community suggestions into future releases.

No Technical Barriers

The simulation is built entirely in HTML, CSS, and JavaScript—no frameworks, no build tools, no server required. Any web developer (or technically inclined educator) can read, understand, and modify the code. This deliberate simplicity lowers the barrier to community participation.

Background Reading

Academic research that informed the simulation.

  • Brasil, V.C. and Eggers, J.P. (2019). Product and innovation portfolio management. Oxford Research Encyclopedia of Business and Management.
  • Brasil, V.C., Salerno, M.S., Eggers, J.P. and Gomes, L.A.V. (2021). Boosting radical innovation using ambidextrous portfolio management. Research-Technology Management, 64(5), 39-49.
  • Boudreau, K.J., Guinan, E.C., Lakhani, K.R. and Riedl, C. (2016). Looking across and looking beyond the knowledge frontier. Management Science, 62(10), 2765-2783.
  • Cooper, R.G. and Sommer, A.F. (2023). Dynamic portfolio management for new product development. Research-Technology Management, 66(3), 19-31.
  • Criscuolo, P., Dahlander, L., Grohsjean, T. and Salter, A. (2017). Evaluating novelty: The role of panels in the selection of R&D projects. Academy of Management Journal, 60(2), 433-460.
  • Criscuolo, P., Dahlander, L., Grohsjean, T. and Salter, A. (2021). The sequence effect on the selection of R&D projects. Organization Science, 32(4), 1046-1067.
  • Dahlander, L., Beretta, M., Thomas, A., Kazemi, S., Fenger, M.H. and Frederiksen, L. (2023). Weeding out or picking winners in open innovation? Research Policy, 52(10), 104875.
  • Dahlander, L., Thomas, A., Wallin, M.W. and Ångström, R.C. (2023). Blinded by the person? Experimental evidence from idea evaluation. Strategic Management Journal, 44(10), 2443-2459.
  • Kumar, A. and Operti, E. (2023). Missed chances and unfulfilled hopes. Strategic Management Journal, 44(13), 3067-3097.
  • Masucci, M., Parker, S.C., Brusoni, S. and Camerani, R. (2021). How are corporate ventures evaluated and selected? Technovation, 99, 102126.
  • Mount, M.P., Baer, M. and Lupoli, M.J. (2021). Quantum leaps or baby steps? Strategic Management Journal, 42(8), 1490-1515.
  • Sharapov, D. and Dahlander, L. (2025). Selection regimes and selection errors. Organization Science.
  • Si, H., Kavadias, S. and Loch, C. (2022). Managing innovation portfolios. Production and Operations Management, 31(12), 4572-4588.
  • Wilden, R., Lin, N., Hohberger, J. and Randhawa, K. (2023). Selecting innovation projects. Journal of Management Studies, 60(7), 1720-1751.

How to Cite

If you use this simulation in your teaching or research:

APA Style

Salter, A. (2025). Build, Bin, Boost: The R&D Portfolio Simulation [Computer software]. Warwick Business School, University of Warwick. https://buildbinboost.org

Harvard Style

Salter, A. (2025) Build, Bin, Boost: The R&D Portfolio Simulation. Warwick Business School, University of Warwick. Available at: https://buildbinboost.org (Accessed: [date]).

BibTeX

@software{salter2025buildbinboost,
  author = {Salter, Ammon},
  title = {Build, Bin, Boost: The R&D Portfolio Simulation},
  year = {2025},
  publisher = {Warwick Business School, University of Warwick},
  url = {https://buildbinboost.org}
}