Idea Management Template
Document Control
| Field | Value |
|---|---|
| Template Version | 1.0 |
| Created | 2026-04-30 |
| Author | Technology Researcher |
| Status | Active |
| Review Cycle | Quarterly |
1. Purpose
This template provides a standardised framework for capturing, evaluating, prioritising, and tracking ideas within the organisation. It ensures that all ideas are documented consistently, evaluated against common criteria, and progress through a transparent lifecycle from conception to implementation or archival.
2. Idea Lifecycle Overview
The idea management process follows a phase-gate model, dividing each idea's journey into distinct stages separated by decision points (gates). This approach ensures that resources are committed progressively and that weak ideas are filtered out early.
Lifecycle Stages
Stage Definitions
| Stage | Description | Gate Decision | Owner |
|---|---|---|---|
| 1. Capture | Idea is submitted with initial description | Go/Kill: Does it meet minimum criteria? | Any team member |
| 2. Assess | Initial feasibility and alignment check | Go/Kill: Worth further investigation? | Team Lead / Manager |
| 3. Evaluate | Detailed analysis using scoring frameworks | Go/Kill/Pivot: Strong enough to develop? | Evaluation Panel |
| 4. Develop | Business case, prototype, resource planning | Go/Kill/Hold: Approved for implementation? | Steering Committee |
| 5. Implement | Execution, monitoring, and delivery | Launch/Iterate/Stop | Project Team |
3. Idea Submission Template
3.1 Basic Information
| Field | Description | Example |
|---|---|---|
| Idea ID | Unique identifier (auto-generated) | IDEA-2026-001 |
| Title | Concise, descriptive title | AI-Powered Code Review Assistant |
| Submitter | Name and role of the person submitting | Jane Smith, Senior Developer |
| Date Submitted | Date of initial submission | 2026-04-30 |
| Category | Classification of the idea type | See Section 3.2 |
| Tags | Relevant keywords for searchability | AI, Developer Tools, Automation |
3.2 Idea Categories
| Category | Description | Examples |
|---|---|---|
| Product Feature | New functionality for existing or planned products | Real-time collaboration, Dark mode |
| Process Improvement | Enhancement to internal workflows or processes | Automated testing pipeline, CI/CD optimisation |
| Technology Adoption | New technology, framework, or tool to evaluate/adopt | GraphQL, Rust, Edge computing |
| Business Model | New revenue streams, partnerships, or market approaches | Freemium tier, API marketplace |
| Customer Experience | Improvements to user onboarding, support, or engagement | Interactive tutorials, AI chatbot |
| Infrastructure | Platform, security, scalability, or reliability improvements | Multi-region deployment, Zero-trust architecture |
3.3 Idea Description
4. Evaluation Framework
4.1 Gate 1: Triage Review Checklist
Quick assessment to filter out ideas that do not meet minimum thresholds.
| Criteria | Yes/No | Notes |
|---|---|---|
| Is the idea clearly described? | | |
| Does it align with at least one strategic objective? | | |
| Is there an identifiable problem being solved? | | |
| Is the idea technically feasible in principle? | | |
| Does it comply with organisational policies (security, ethics, legal)? | | |
| Is there a reasonable chance of positive return vs. risk? | | |
4.2 Gate 2: Initial Screening Scorecard
Score each criterion on a scale of 0-10.
| Criterion | Weight | Description | Score (0-10) | Weighted Score |
|---|---|---|---|---|
| Strategic Alignment | 20% | How well does this align with company strategy? | | |
| Customer Value | 20% | How much value does this create for users/customers? | | |
| Technical Feasibility | 15% | Can we build this with current/near-future capabilities? | | |
| Market Opportunity | 15% | Size of the opportunity or problem being addressed | | |
| Competitive Advantage | 10% | Does this differentiate us from competitors? | | |
| Resource Availability | 10% | Do we have (or can we get) the resources needed? | | |
| Time to Value | 10% | How quickly can we realise benefits? | | |
| TOTAL | 100% | | | /10 |
Score interpretation:
- 0-3: Poor — Significant concerns
- 4-6: Fair — Viable but with notable risks
- 7-8: Good — Strong candidate
- 9-10: Excellent — Exceptional opportunity
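The weighted total above is the sum of each criterion's score multiplied by its weight, which keeps the result on the same 0-10 scale. A minimal sketch (the dictionary keys mirror the scorecard table; the function name is illustrative, not part of the template):

```python
# Gate 2 weighted scorecard: weights mirror the table above and must sum to 100%.
WEIGHTS = {
    "Strategic Alignment": 0.20,
    "Customer Value": 0.20,
    "Technical Feasibility": 0.15,
    "Market Opportunity": 0.15,
    "Competitive Advantage": 0.10,
    "Resource Availability": 0.10,
    "Time to Value": 0.10,
}

def weighted_total(scores: dict[str, float]) -> float:
    """Return the weighted total on the same 0-10 scale as the raw scores."""
    assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "weights must sum to 100%"
    return sum(WEIGHTS[c] * scores[c] for c in WEIGHTS)
```

For example, an idea scoring 8 on every criterion totals 8.0; uneven scores are pulled toward the heavily weighted criteria.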
4.3 Gate 3: Deep Dive Evaluation (RICE Framework)
For ideas that pass initial screening, apply the RICE scoring model for prioritisation.
| Component | Description | Formula/Value |
|---|---|---|
| Reach | How many people will this impact per quarter? | Number of users/customers |
| Impact | How much will this impact each person? | 3 = Massive, 2 = High, 1 = Medium, 0.5 = Low, 0.25 = Minimal |
| Confidence | How confident are we in our estimates? | 100% = High, 80% = Medium, 50% = Low |
| Effort | How much work is required (in person-months)? | Number of person-months |
Example scores (RICE = Reach × Impact × Confidence ÷ Effort):
| Idea | Reach | Impact | Confidence | Effort | RICE Score |
|---|---|---|---|---|---|
| Example A | 1000 | 2 | 80% | 3 | 533 |
| Example B | 500 | 3 | 100% | 2 | 750 |
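The calculation behind the example rows can be sketched in a few lines (function name is illustrative; confidence is expressed as a fraction, e.g. 0.8 for 80%):

```python
def rice_score(reach: float, impact: float, confidence: float, effort: float) -> float:
    """RICE = (Reach × Impact × Confidence) / Effort.

    reach: users/customers impacted per quarter
    impact: 3 / 2 / 1 / 0.5 / 0.25 per the component table
    confidence: fraction, e.g. 0.8 for 80%
    effort: person-months
    """
    return reach * impact * confidence / effort

# Reproduces the example rows above:
# Example A: 1000 × 2 × 0.8 / 3 ≈ 533
# Example B: 500 × 3 × 1.0 / 2 = 750
```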
4.4 Alternative Evaluation Frameworks
Depending on context, apply one of these complementary frameworks:
| Framework | Best Used When | Key Metrics |
|---|---|---|
| ICE (Impact, Confidence, Ease) | Quick prioritisation with limited data | Simple 1-10 scoring |
| Value vs Effort | Visual portfolio mapping | 2-axis matrix |
| Kano Model | Understanding customer satisfaction impact | Must-have, Performance, Delighter |
| Weighted Scoring (MCDA) | Complex decisions with many criteria | Custom weighted criteria |
| WSJF (Weighted Shortest Job First) | Agile/SAFe environments | (Business Value + Time Criticality + Risk Reduction) / Job Duration |
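The WSJF formula in the last row is simple enough to sketch directly (function name is illustrative; SAFe conventionally scores each component on a modified Fibonacci scale, which is an assumption here, not part of this template):

```python
def wsjf(business_value: float, time_criticality: float,
         risk_reduction: float, job_duration: float) -> float:
    """WSJF = (Business Value + Time Criticality + Risk Reduction) / Job Duration.

    Higher scores are scheduled first; shorter jobs with equal cost of
    delay therefore rank above longer ones.
    """
    return (business_value + time_criticality + risk_reduction) / job_duration
```

For instance, an idea with business value 8, time criticality 5, risk reduction 3, and duration 2 scores (8 + 5 + 3) / 2 = 8.0.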
5. Idea Tracking Register
5.1 Master Register
| Idea ID | Title | Category | Submitter | Date | Stage | RICE Score | Status | Next Review | Owner |
|---|---|---|---|---|---|---|---|---|---|
| IDEA-2026-001 | AI Code Review | Product Feature | J. Smith | 2026-04-30 | Evaluate | 533 | In Progress | 2026-05-15 | A. Jones |
5.2 Status Definitions
| Status | Description | Action Required |
|---|---|---|
| New | Idea submitted, awaiting triage | Schedule Gate 1 review |
| Under Review | Currently being assessed at a gate | Complete evaluation |
| Approved | Passed gate, moving to next stage | Allocate resources |
| In Development | Being developed or prototyped | Track progress |
| Implemented | Successfully delivered | Monitor outcomes |
| On Hold | Temporarily paused | Define reactivation criteria |
| Rejected | Did not pass gate evaluation | Document rationale |
| Archived | Superseded or no longer relevant | Periodic cleanup |
6. Gate Review Meeting Template
6.1 Meeting Preparation
| Item | Responsibility | Deadline |
|---|---|---|
| Idea documentation complete | Idea Owner | 3 days before meeting |
| Evaluation scorecard filled | Evaluation Panel | 2 days before meeting |
| Supporting materials distributed | Facilitator | 1 day before meeting |
6.2 Meeting Agenda
| Time | Item | Lead |
|---|---|---|
| 5 min | Review idea summary | Idea Owner |
| 10 min | Present evaluation findings | Evaluator |
| 10 min | Discussion and questions | All |
| 5 min | Decision and rationale | Decision Maker |
| 5 min | Next steps and action items | Facilitator |
6.3 Decision Record
| Field | Value |
|---|---|
| Idea ID | |
| Gate Number | |
| Date | |
| Decision | [ ] Go [ ] Kill [ ] Rework [ ] Hold |
| Rationale | |
| Conditions | Any conditions attached to the decision |
| Next Review Date | |
| Decision Maker | |
| Attendees | |
7. Idea Prioritisation Matrix
7.1 Value vs Effort Matrix
Plot ideas on a 2x2 value-versus-effort matrix after evaluation.
7.2 Prioritisation Actions
| Quadrant | Action | Criteria |
|---|---|---|
| Quick Wins | Implement immediately | High value, low effort |
| Major Projects | Plan and resource | High value, high effort |
| Fill-Ins | Do when capacity allows | Low value, low effort |
| Money Pits | Reconsider or reject | Low value, high effort |
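Assigning an idea to a quadrant is a simple threshold test. A minimal sketch, assuming value and effort are scored 0-10 with 5.0 as the midpoint (the threshold is an assumption, not specified by this template):

```python
def quadrant(value: float, effort: float, threshold: float = 5.0) -> str:
    """Map a 0-10 value/effort pair to its Value-vs-Effort quadrant.

    The 5.0 midpoint is an illustrative assumption; teams may calibrate
    the threshold to their own scoring distribution.
    """
    if value >= threshold:
        return "Quick Win" if effort < threshold else "Major Project"
    return "Fill-In" if effort < threshold else "Money Pit"
```

So an idea scoring value 8 and effort 2 lands in "Quick Win", while value 3 and effort 8 lands in "Money Pit".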
8. Integration with Evaluation Frameworks
This template integrates with the following established evaluation frameworks (detailed research available in evaluation-frameworks-research.md):
8.1 Framework Selection Guide
| Situation | Recommended Framework | Rationale |
|---|---|---|
| Initial rapid screening | ICE | Simple, fast, requires minimal data |
| Detailed prioritisation | RICE | Incorporates reach for scale-aware decisions |
| Customer satisfaction focus | Kano Model | Distinguishes must-haves from delighters |
| Multi-stakeholder decisions | Weighted Scoring (MCDA) | Accommodates diverse criteria and weights |
| Strategic portfolio view | Value vs Effort | Visual, intuitive for leadership |
| VC-style evaluation | VC Criteria | Market-focused, investor-aligned |
8.2 Hybrid Approach Recommendation
For ThoughtLabs, a two-stage hybrid approach is recommended:
- Stage 1 (Gate 2): Use Value vs Effort for quick visual categorisation
- Stage 2 (Gate 3): Apply RICE for quantitative prioritisation of high-value ideas
- Supplementary: Use Kano Model when evaluating customer-facing features
9. Roles and Responsibilities
| Role | Responsibilities | Typical Holder |
|---|---|---|
| Idea Submitter | Submit ideas with complete documentation | Any team member |
| Idea Champion | Drive idea through stages, prepare materials | Assigned team member |
| Gate Reviewer | Evaluate ideas against criteria, score | Team leads, subject experts |
| Decision Maker | Make go/kill decisions at gates | Manager, steering committee |
| Facilitator | Run gate review meetings, track actions | Project Manager / PMO |
| Portfolio Manager | Maintain register, report on pipeline | Product Manager / CTO |
10. Metrics and Reporting
10.1 Key Performance Indicators
| Metric | Description | Target |
|---|---|---|
| Ideas Submitted | Total ideas captured per period | Track trend |
| Conversion Rate | % of ideas progressing from submission to implementation | >20% |
| Cycle Time | Average time from submission to decision | 30 days per gate |
| Implementation Rate | % of approved ideas successfully delivered | >80% |
| Value Realised | Measured benefit from implemented ideas | Track per idea |
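Two of these KPIs can be computed mechanically from the master register. A hedged sketch, assuming each register entry is a dict with illustrative field names ("status", "submitted", "decided") rather than the template's column headings:

```python
from datetime import date

def conversion_rate(ideas: list[dict]) -> float:
    """Percentage of all submitted ideas that reached Implemented status."""
    if not ideas:
        return 0.0
    implemented = sum(1 for i in ideas if i["status"] == "Implemented")
    return 100 * implemented / len(ideas)

def avg_cycle_days(ideas: list[dict]) -> float:
    """Average days from submission to gate decision, over decided ideas."""
    spans = [(i["decided"] - i["submitted"]).days
             for i in ideas if i.get("decided")]
    return sum(spans) / len(spans) if spans else 0.0
```

For example, a register with one implemented idea decided after 30 days and one rejected idea decided after 10 days yields a 50% conversion rate and a 20-day average cycle time.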
10.2 Reporting Cadence
| Report | Audience | Frequency | Content |
|---|---|---|---|
| Idea Pipeline Summary | Leadership | Monthly | Funnel status, key metrics |
| Gate Review Outcomes | All stakeholders | Per review | Decisions, rationale |
| Idea Portfolio Dashboard | Product/Engineering | Weekly | Active ideas, priorities |
| Quarterly Review | Board/CTO | Quarterly | Trends, strategic alignment |
11. Templates and Forms
11.1 Quick Idea Capture Form (One-Pager)
11.2 Business Case Template (Gate 4)
12. Best Practices
12.1 Idea Generation
- Encourage submissions from all levels of the organisation
- Run regular ideation sessions (brainstorming, SCAMPER, 5 Whys)
- Maintain an open idea channel for continuous submission
- Recognise and reward idea contributors
12.2 Evaluation
- Apply criteria consistently across all ideas
- Document rationale for all decisions
- Avoid anchoring bias — evaluate each idea on its own merits
- Use data and evidence, not intuition alone
12.3 Process Management
- Keep gate reviews time-boxed and focused
- Provide timely feedback to submitters
- Regularly review and prune the idea pipeline
- Learn from rejected ideas — capture insights
12.4 Common Pitfalls to Avoid
| Pitfall | Mitigation |
|---|---|
| Analysis paralysis | Time-box evaluation stages |
| HiPPO effect (Highest Paid Person’s Opinion) | Use structured scoring, not opinions |
| Idea graveyard (no feedback) | Always communicate decisions |
| Over-processing | Match rigour to idea complexity |
| Sunk cost fallacy | Be willing to kill ideas at any gate |
13. References and Further Reading
- ISO 56000 — Innovation Management Standards
- Cooper, R.G. — Phase-Gate Process for New Product Development
- Intercom — RICE Prioritisation Framework
- Ellis, S. — ICE Scoring Model
- Kano, N. — Kano Model of Customer Satisfaction
- Tidd, J. & Bessant, J. — Managing Innovation (Wiley)
- Clark, C.H. — Idea Management: How to Motivate Creativity and Innovation (AMACOM)
Appendix A: Glossary
| Term | Definition |
|---|---|
| Gate | A decision point in the phase-gate process where continuation is decided |
| Phase | A distinct stage of work between gates |
| RICE | Reach, Impact, Confidence, Effort — a prioritisation framework |
| ICE | Impact, Confidence, Ease — a simplified prioritisation framework |
| Kano Model | A framework for categorising features by customer satisfaction impact |
| MCDA | Multi-Criteria Decision Analysis — weighted scoring approach |
| WSJF | Weighted Shortest Job First — SAFe prioritisation method |
| HiPPO | Highest Paid Person’s Opinion — bias to avoid in evaluation |
This template should be reviewed quarterly and updated based on organisational learnings and changes in strategic direction.