Idea Management Template

Document Control

| Field | Value |
|---|---|
| Template Version | 1.0 |
| Created | 2026-04-30 |
| Author | Technology Researcher |
| Status | Active |
| Review Cycle | Quarterly |

1. Purpose

This template provides a standardised framework for capturing, evaluating, prioritising, and tracking ideas within the organisation. It ensures that all ideas are documented consistently, evaluated against common criteria, and progress through a transparent lifecycle from conception to implementation or archival.

2. Idea Lifecycle Overview

The idea management process follows a Phase-Gate model, dividing each idea into distinct stages separated by decision points (gates). This approach ensures that resources are committed progressively and that weak ideas are filtered out early.

Lifecycle Stages

┌─────────────┐    ┌─────────────┐    ┌─────────────┐    ┌─────────────┐    ┌─────────────┐
│  1. Capture │───▶│  2. Assess  │───▶│ 3. Evaluate │───▶│  4. Develop │───▶│ 5. Implement│
└─────────────┘    └─────────────┘    └─────────────┘    └─────────────┘    └─────────────┘
       │                  │                  │                  │                  │
    Gate 1:            Gate 2:            Gate 3:            Gate 4:            Gate 5:
    Triage             Initial            Deep Dive          Business Case      Launch
    Review             Screening          Analysis           Approval           Decision

Stage Definitions

| Stage | Description | Gate Decision | Owner |
|---|---|---|---|
| 1. Capture | Idea is submitted with initial description | Go/Kill: Does it meet minimum criteria? | Any team member |
| 2. Assess | Initial feasibility and alignment check | Go/Kill: Worth further investigation? | Team Lead / Manager |
| 3. Evaluate | Detailed analysis using scoring frameworks | Go/Kill/Pivot: Strong enough to develop? | Evaluation Panel |
| 4. Develop | Business case, prototype, resource planning | Go/Kill/Hold: Approved for implementation? | Steering Committee |
| 5. Implement | Execution, monitoring, and delivery | Launch/Iterate/Stop | Project Team |

3. Idea Submission Template

3.1 Basic Information

| Field | Description | Example |
|---|---|---|
| Idea ID | Unique identifier (auto-generated) | IDEA-2026-001 |
| Title | Concise, descriptive title | AI-Powered Code Review Assistant |
| Submitter | Name and role of the person submitting | Jane Smith, Senior Developer |
| Date Submitted | Date of initial submission | 2026-04-30 |
| Category | Classification of the idea type | See Section 3.2 |
| Tags | Relevant keywords for searchability | AI, Developer Tools, Automation |
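The auto-generated Idea ID in the table above follows an IDEA-&lt;year&gt;-&lt;sequence&gt; pattern, as the example shows. A minimal sketch of such a generator; the function name and the counting approach are illustrative assumptions, not part of the template:

```python
def next_idea_id(year: int, existing_ids: list[str]) -> str:
    """Generate the next IDEA-<year>-<NNN> identifier.

    The format is inferred from the example IDEA-2026-001; the sequence
    restarts each year and is zero-padded to three digits.
    """
    prefix = f"IDEA-{year}-"
    seq = sum(1 for i in existing_ids if i.startswith(prefix)) + 1
    return f"{prefix}{seq:03d}"

print(next_idea_id(2026, ["IDEA-2026-001", "IDEA-2026-002"]))  # → IDEA-2026-003
```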

3.2 Idea Categories

| Category | Description | Examples |
|---|---|---|
| Product Feature | New functionality for existing or planned products | Real-time collaboration, Dark mode |
| Process Improvement | Enhancement to internal workflows or processes | Automated testing pipeline, CI/CD optimisation |
| Technology Adoption | New technology, framework, or tool to evaluate/adopt | GraphQL, Rust, Edge computing |
| Business Model | New revenue streams, partnerships, or market approaches | Freemium tier, API marketplace |
| Customer Experience | Improvements to user onboarding, support, or engagement | Interactive tutorials, AI chatbot |
| Infrastructure | Platform, security, scalability, or reliability improvements | Multi-region deployment, Zero-trust architecture |

3.3 Idea Description

## Problem Statement
What problem does this idea solve? Who experiences this problem?

## Proposed Solution
Describe the idea and how it addresses the problem.

## Expected Benefits
What value will this create? Quantify where possible.

## Target Audience
Who will benefit from this idea?

## Dependencies
What other systems, teams, or resources does this depend on?

## Initial Effort Estimate
Rough order of magnitude: [Small / Medium / Large / Epic]

## Supporting Evidence
Links, data, research, or examples that support this idea.

4. Evaluation Framework

4.1 Gate 1: Triage Review Checklist

Quick assessment to filter out ideas that do not meet minimum thresholds.

| Criteria | Yes/No | Notes |
|---|---|---|
| Is the idea clearly described? | | |
| Does it align with at least one strategic objective? | | |
| Is there an identifiable problem being solved? | | |
| Is the idea technically feasible in principle? | | |
| Does it comply with organisational policies (security, ethics, legal)? | | |
| Is there a reasonable chance of positive return vs. risk? | | |

Decision: [ ] Go to Assess [ ] Kill [ ] Request Clarification
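The checklist above is an all-or-nothing filter, so it can be expressed directly in code. A minimal sketch; the criterion keys are hypothetical shorthand for the six questions in the checklist:

```python
# Shorthand keys for the six Gate 1 questions (names are illustrative).
TRIAGE_CRITERIA = [
    "clearly_described",
    "strategically_aligned",
    "identifiable_problem",
    "technically_feasible",
    "policy_compliant",
    "positive_return_vs_risk",
]

def triage(answers: dict) -> str:
    """Map a dict of criterion -> bool answers to a Gate 1 decision."""
    if any(c not in answers for c in TRIAGE_CRITERIA):
        return "Request Clarification"  # unanswered criteria block the decision
    # An idea advances to Assess only when every minimum criterion is met.
    return "Go to Assess" if all(answers[c] for c in TRIAGE_CRITERIA) else "Kill"

print(triage({c: True for c in TRIAGE_CRITERIA}))  # → Go to Assess
```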

4.2 Gate 2: Initial Screening Scorecard

Score each criterion on a scale of 0-10.

| Criterion | Weight | Score (0-10) | Weighted Score | Notes |
|---|---|---|---|---|
| Strategic Alignment | 20% | | | How well does this align with company strategy? |
| Customer Value | 20% | | | How much value does this create for users/customers? |
| Technical Feasibility | 15% | | | Can we build this with current/near-future capabilities? |
| Market Opportunity | 15% | | | Size of the opportunity or problem being addressed |
| Competitive Advantage | 10% | | | Does this differentiate us from competitors? |
| Resource Availability | 10% | | | Do we have (or can we get) the resources needed? |
| Time to Value | 10% | | | How quickly can we realise benefits? |
| TOTAL | 100% | | /10 | |

Scoring Guide:
  • 0-3: Poor — Significant concerns
  • 4-6: Fair — Viable but with notable risks
  • 7-8: Good — Strong candidate
  • 9-10: Excellent — Exceptional opportunity
Decision: [ ] Go to Evaluate [ ] Kill [ ] Rework [ ] Hold for later
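The weighted total is the sum of each 0-10 score multiplied by its weight; because the weights sum to 100%, the total also lands on the 0-10 scale. A minimal sketch of that arithmetic:

```python
# Weights from the Gate 2 scorecard (must sum to 1.0).
WEIGHTS = {
    "Strategic Alignment": 0.20,
    "Customer Value": 0.20,
    "Technical Feasibility": 0.15,
    "Market Opportunity": 0.15,
    "Competitive Advantage": 0.10,
    "Resource Availability": 0.10,
    "Time to Value": 0.10,
}

def weighted_total(scores: dict) -> float:
    """Weighted sum of 0-10 scores; the result is also out of 10."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must total 100%"
    return sum(WEIGHTS[k] * scores[k] for k in WEIGHTS)

example = {k: 7 for k in WEIGHTS}          # a uniformly "Good" idea
print(round(weighted_total(example), 1))   # → 7.0
```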

4.3 Gate 3: Deep Dive Evaluation (RICE Framework)

For ideas that pass initial screening, apply the RICE scoring model for prioritisation.

| Component | Description | Formula/Value |
|---|---|---|
| Reach | How many people will this impact per quarter? | Number of users/customers |
| Impact | How much will this impact each person? | 3 = Massive, 2 = High, 1 = Medium, 0.5 = Low, 0.25 = Minimal |
| Confidence | How confident are we in our estimates? | 100% = High, 80% = Medium, 50% = Low |
| Effort | How much work is required (in person-months)? | Number of person-months |

RICE Score = (Reach × Impact × Confidence) / Effort

| Idea | Reach | Impact | Confidence | Effort | RICE Score |
|---|---|---|---|---|---|
| Example A | 1000 | 2 | 80% | 3 | 533 |
| Example B | 500 | 3 | 100% | 2 | 750 |
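The example rows can be reproduced directly from the RICE formula above, with confidence expressed as a fraction:

```python
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """RICE = (Reach × Impact × Confidence) / Effort.

    Confidence is a fraction (0.80 for 80%); effort is in person-months.
    """
    return reach * impact * confidence / effort

# The two example ideas from the table:
print(round(rice_score(1000, 2, 0.80, 3)))  # → 533
print(round(rice_score(500, 3, 1.00, 2)))   # → 750
```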

4.4 Alternative Evaluation Frameworks

Depending on context, apply one of these complementary frameworks:

| Framework | Best Used When | Key Metrics |
|---|---|---|
| ICE (Impact, Confidence, Ease) | Quick prioritisation with limited data | Simple 1-10 scoring |
| Value vs Effort | Visual portfolio mapping | 2-axis matrix |
| Kano Model | Understanding customer satisfaction impact | Must-have, Performance, Delighter |
| Weighted Scoring (MCDA) | Complex decisions with many criteria | Custom weighted criteria |
| WSJF (Weighted Shortest Job First) | Agile/SAFe environments | (Business Value + Time Criticality + Risk Reduction) / Job Duration |
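The WSJF formula in the table can be sketched as follows. In SAFe practice the inputs are relative estimates (often Fibonacci-style); the sample values here are purely illustrative:

```python
def wsjf(business_value: float, time_criticality: float,
         risk_reduction: float, job_duration: float) -> float:
    """WSJF = Cost of Delay / Job Duration.

    Cost of Delay is the sum of business value, time criticality, and
    risk reduction (or opportunity enablement) estimates.
    """
    cost_of_delay = business_value + time_criticality + risk_reduction
    return cost_of_delay / job_duration

# Illustrative relative estimates: (8 + 5 + 3) / 4
print(wsjf(8, 5, 3, 4))  # → 4.0
```

Higher WSJF means more value delivered per unit of duration, so the highest-scoring job is done first.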

5. Idea Tracking Register

5.1 Master Register

| Idea ID | Title | Category | Submitter | Date | Stage | RICE Score | Status | Next Review | Owner |
|---|---|---|---|---|---|---|---|---|---|
| IDEA-2026-001 | AI Code Review | Product Feature | J. Smith | 2026-04-30 | Evaluate | 533 | In Progress | 2026-05-15 | A. Jones |

5.2 Status Definitions

| Status | Description | Action Required |
|---|---|---|
| New | Idea submitted, awaiting triage | Schedule Gate 1 review |
| Under Review | Currently being assessed at a gate | Complete evaluation |
| Approved | Passed gate, moving to next stage | Allocate resources |
| In Development | Being developed or prototyped | Track progress |
| Implemented | Successfully delivered | Monitor outcomes |
| On Hold | Temporarily paused | Define reactivation criteria |
| Rejected | Did not pass gate evaluation | Document rationale |
| Archived | Superseded or no longer relevant | Periodic cleanup |

6. Gate Review Meeting Template

6.1 Meeting Preparation

| Item | Responsibility | Deadline |
|---|---|---|
| Idea documentation complete | Idea Owner | 3 days before meeting |
| Evaluation scorecard filled | Evaluation Panel | 2 days before meeting |
| Supporting materials distributed | Facilitator | 1 day before meeting |

6.2 Meeting Agenda

| Time | Item | Lead |
|---|---|---|
| 5 min | Review idea summary | Idea Owner |
| 10 min | Present evaluation findings | Evaluator |
| 10 min | Discussion and questions | All |
| 5 min | Decision and rationale | Decision Maker |
| 5 min | Next steps and action items | Facilitator |

6.3 Decision Record

| Field | Value |
|---|---|
| Idea ID | |
| Gate Number | |
| Date | |
| Decision | [ ] Go [ ] Kill [ ] Rework [ ] Hold |
| Rationale | |
| Conditions | Any conditions attached to the decision |
| Next Review Date | |
| Decision Maker | |
| Attendees | |

7. Idea Prioritisation Matrix

7.1 Value vs Effort Matrix

Plot ideas on a 2x2 matrix after evaluation:
                    HIGH VALUE

        ┌───────────────┬───────────────┐
        │  QUICK WINS   │     MAJOR     │
        │  (Do First)   │    PROJECTS   │
        │               │     (Plan)    │
LOW     ├───────────────┼───────────────┤    HIGH
EFFORT  │   FILL-INS    │     MONEY     │    EFFORT
        │  (Do if time) │     PITS      │
        │               │    (Avoid)    │
        └───────────────┴───────────────┘

                    LOW VALUE

7.2 Prioritisation Actions

| Quadrant | Action | Criteria |
|---|---|---|
| Quick Wins | Implement immediately | High value, low effort |
| Major Projects | Plan and resource | High value, high effort |
| Fill-Ins | Do when capacity allows | Low value, low effort |
| Money Pits | Reconsider or reject | Low value, high effort |
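The quadrant assignment reduces to two threshold comparisons. A sketch assuming value and effort are scored 0-10 with a midpoint cut-off; in practice the thresholds would be calibrated per portfolio:

```python
def quadrant(value: float, effort: float,
             value_threshold: float = 5, effort_threshold: float = 5) -> str:
    """Place an idea in the Value vs Effort 2x2 matrix.

    Thresholds are illustrative assumptions; scores at or above the
    threshold count as "high".
    """
    high_value = value >= value_threshold
    high_effort = effort >= effort_threshold
    if high_value and not high_effort:
        return "Quick Win"       # do first
    if high_value and high_effort:
        return "Major Project"   # plan and resource
    if not high_value and not high_effort:
        return "Fill-In"         # do if time allows
    return "Money Pit"           # avoid

print(quadrant(value=8, effort=2))  # → Quick Win
```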

8. Integration with Evaluation Frameworks

This template integrates with the following established evaluation frameworks (detailed research available in evaluation-frameworks-research.md):

8.1 Framework Selection Guide

| Situation | Recommended Framework | Rationale |
|---|---|---|
| Initial rapid screening | ICE | Simple, fast, requires minimal data |
| Detailed prioritisation | RICE | Incorporates reach for scale-aware decisions |
| Customer satisfaction focus | Kano Model | Distinguishes must-haves from delighters |
| Multi-stakeholder decisions | Weighted Scoring (MCDA) | Accommodates diverse criteria and weights |
| Strategic portfolio view | Value vs Effort | Visual, intuitive for leadership |
| VC-style evaluation | VC Criteria | Market-focused, investor-aligned |

8.2 Hybrid Approach Recommendation

For ThoughtLabs, a two-stage hybrid approach is recommended:
  1. Stage 1 (Gate 2): Use Value vs Effort for quick visual categorisation
  2. Stage 2 (Gate 3): Apply RICE for quantitative prioritisation of high-value ideas
  3. Supplementary: Use Kano Model when evaluating customer-facing features

9. Roles and Responsibilities

| Role | Responsibilities | Typical Holder |
|---|---|---|
| Idea Submitter | Submit ideas with complete documentation | Any team member |
| Idea Champion | Drive idea through stages, prepare materials | Assigned team member |
| Gate Reviewer | Evaluate ideas against criteria, score | Team leads, subject experts |
| Decision Maker | Make go/kill decisions at gates | Manager, steering committee |
| Facilitator | Run gate review meetings, track actions | Project Manager / PMO |
| Portfolio Manager | Maintain register, report on pipeline | Product Manager / CTO |

10. Metrics and Reporting

10.1 Key Performance Indicators

| Metric | Description | Target |
|---|---|---|
| Ideas Submitted | Total ideas captured per period | Track trend |
| Conversion Rate | % of ideas progressing from submission to implementation | >20% |
| Cycle Time | Average time from submission to decision | 30 days per gate |
| Implementation Rate | % of approved ideas successfully delivered | >80% |
| Value Realised | Measured benefit from implemented ideas | Track per idea |
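Two of these KPIs, Conversion Rate and Cycle Time, fall straight out of the register. An illustrative sketch; the record fields and sample data are hypothetical:

```python
from datetime import date

# Hypothetical register extract: one dict per idea.
ideas = [
    {"submitted": date(2026, 1, 5),  "decided": date(2026, 2, 1),  "implemented": True},
    {"submitted": date(2026, 1, 12), "decided": date(2026, 2, 20), "implemented": False},
    {"submitted": date(2026, 2, 2),  "decided": date(2026, 3, 1),  "implemented": False},
]

# Conversion Rate: share of submitted ideas that reached implementation.
conversion_rate = sum(i["implemented"] for i in ideas) / len(ideas)

# Cycle Time: average days from submission to decision.
avg_cycle_days = sum((i["decided"] - i["submitted"]).days for i in ideas) / len(ideas)

print(f"{conversion_rate:.0%}")       # → 33%
print(round(avg_cycle_days, 1))       # → 31.0
```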

10.2 Reporting Cadence

| Report | Audience | Frequency | Content |
|---|---|---|---|
| Idea Pipeline Summary | Leadership | Monthly | Funnel status, key metrics |
| Gate Review Outcomes | All stakeholders | Per review | Decisions, rationale |
| Idea Portfolio Dashboard | Product/Engineering | Weekly | Active ideas, priorities |
| Quarterly Review | Board/CTO | Quarterly | Trends, strategic alignment |

11. Templates and Forms

11.1 Quick Idea Capture Form (One-Pager)

IDEA CAPTURE FORM
─────────────────
Title:
Submitted by:
Date:
Category: [ ] Product  [ ] Process  [ ] Technology  [ ] Business  [ ] CX  [ ] Infrastructure

Problem: (2-3 sentences)
Solution: (2-3 sentences)
Expected Benefit: (quantified if possible)
Effort Estimate: [ ] Small (<1 week)  [ ] Medium (1-4 weeks)  [ ] Large (1-3 months)  [ ] Epic (>3 months)

Attachments/Links:

11.2 Business Case Template (Gate 4)

BUSINESS CASE: [Idea Title]
────────────────────────────
1. Executive Summary
2. Problem and Opportunity
3. Proposed Solution
4. Market Analysis
5. Technical Approach
6. Resource Requirements
   - People:
   - Budget:
   - Timeline:
7. Financial Analysis
   - Costs:
   - Benefits:
   - ROI:
   - NPV:
8. Risk Assessment
9. Implementation Plan
10. Success Criteria
11. Recommendation

12. Best Practices

12.1 Idea Generation

  • Encourage submissions from all levels of the organisation
  • Run regular ideation sessions (brainstorming, SCAMPER, 5 Whys)
  • Maintain an open idea channel for continuous submission
  • Recognise and reward idea contributors

12.2 Evaluation

  • Apply criteria consistently across all ideas
  • Document rationale for all decisions
  • Avoid anchoring bias — evaluate each idea on its own merits
  • Use data and evidence, not intuition alone

12.3 Process Management

  • Keep gate reviews time-boxed and focused
  • Provide timely feedback to submitters
  • Regularly review and prune the idea pipeline
  • Learn from rejected ideas — capture insights

12.4 Common Pitfalls to Avoid

| Pitfall | Mitigation |
|---|---|
| Analysis paralysis | Time-box evaluation stages |
| HiPPO effect (Highest Paid Person’s Opinion) | Use structured scoring, not opinions |
| Idea graveyard (no feedback) | Always communicate decisions |
| Over-processing | Match rigour to idea complexity |
| Sunk cost fallacy | Be willing to kill ideas at any gate |

13. References and Further Reading

  • ISO 56000 — Innovation Management Standards
  • Cooper, R.G. — Phase-Gate Process for New Product Development
  • Intercom — RICE Prioritisation Framework
  • Ellis, S. — ICE Scoring Model
  • Kano, N. — Kano Model of Customer Satisfaction
  • Tidd, J. & Bessant, J. — Managing Innovation (Wiley)
  • Clark, C.H. — Idea Management: How to Motivate Creativity and Innovation (AMACOM)

Appendix A: Glossary

| Term | Definition |
|---|---|
| Gate | A decision point in the phase-gate process where continuation is decided |
| Phase | A distinct stage of work between gates |
| RICE | Reach, Impact, Confidence, Effort — a prioritisation framework |
| ICE | Impact, Confidence, Ease — a simplified prioritisation framework |
| Kano Model | A framework for categorising features by customer satisfaction impact |
| MCDA | Multi-Criteria Decision Analysis — weighted scoring approach |
| WSJF | Weighted Shortest Job First — SAFe prioritisation method |
| HiPPO | Highest Paid Person’s Opinion — bias to avoid in evaluation |

This template should be reviewed quarterly and updated based on organisational learnings and changes in strategic direction.