
Use this quick reference to benchmark Walnut demos and playlists, diagnose performance gaps, and choose the right optimization lever without digging through every dashboard.


1️⃣ Before You Benchmark (Don't Skip)

  • Start with purpose: Confirm why this asset exists (discovery, conversion, onboarding, expansion) and what "success" should drive (explore, book, trial, adopt).
  • Wait for processing: Sessions finalize after inactivity, Insights refreshes periodically, and CRM/MAP sync may lag.
  • Use a stable window: Compare across the last 14–30 days.
  • Segment before you compare: External vs Internal, Identified vs Anonymous (use Hide Bounced Sessions when diagnosing mid/late-funnel behavior).
  • Check identification coverage: Aim for 70–80% identified sessions (use Uncover for company-level identification and forms/URL params for contact-level).

2️⃣ Core Benchmark KPIs (Use These Everywhere)

| KPI | What it tells you | If it's low… |
| --- | --- | --- |
| Completion Rate | Story quality & relevance | Value arrives too late, the flow is too long, or navigation is unclear |
| Bounce Rate | Hook & gating effectiveness | The opener is weak or the gate appears too early |
| Median Time Spent | Depth of attention (outlier-safe) | Confusion, fatigue, or unclear "what's next" momentum |
| CTA Conversion (FAB or other CTA) | Intent to act | The CTA is too late, too generic, or not aligned to viewer stage |
| Engagement Score | Overall asset health vs top performers | Structural, pacing, or relevance issues |
| Identified vs Anonymous | Attribution readiness | You're missing ROI signals and undercounting high-intent behavior |

CTA precedence: If the asset includes a FAB or another explicit conversion CTA (e.g., Book a Meeting, Start Trial), benchmark conversion rate first and treat completion as a supporting signal.
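To make the KPI definitions above concrete, here is a minimal sketch of how they could be computed from an exported list of session records. The field names (`completed`, `bounced`, `duration_sec`, `cta_clicked`, `identified`) are illustrative assumptions, not Walnut's actual export schema.

```python
from statistics import median

# Hypothetical session export; field names are illustrative, not Walnut's schema.
sessions = [
    {"completed": True,  "bounced": False, "duration_sec": 95,  "cta_clicked": True,  "identified": True},
    {"completed": False, "bounced": True,  "duration_sec": 8,   "cta_clicked": False, "identified": False},
    {"completed": True,  "bounced": False, "duration_sec": 240, "cta_clicked": False, "identified": True},
    {"completed": False, "bounced": False, "duration_sec": 60,  "cta_clicked": False, "identified": True},
]

def rate(flag):
    """Share of sessions where the given boolean field is True."""
    return sum(s[flag] for s in sessions) / len(sessions)

kpis = {
    "completion_rate": rate("completed"),       # story quality & relevance
    "bounce_rate": rate("bounced"),             # hook & gating effectiveness
    "median_time_sec": median(s["duration_sec"] for s in sessions),  # outlier-safe depth
    "cta_conversion": rate("cta_clicked"),      # intent to act
    "identified_share": rate("identified"),     # attribution readiness
}
print(kpis)
```

Median time is used instead of the mean so a single very long (or abandoned-tab) session doesn't distort the depth-of-attention signal.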


3️⃣ Benchmarks Change by Funnel Stage

| Funnel Stage | Primary Benchmark Focus |
| --- | --- |
| Top of Funnel | Bounce ↓, early value/early completion ↑, identification coverage ↑ |
| Mid-Funnel | Completion ↑, median time ↑, CTA conversion (FAB) ↑, engagement score ↑ |
| Bottom of Funnel | Return sessions ↑, CTA conversion (FAB) ↑, late-stage screens/sections viewed |
| Post-Sale | Repeat sessions ↑, completion consistency ↑, playlist completion ↑ |

4️⃣ Which Tool to Use (Fast Diagnosis)

| If you see… | Use this tool | To answer… |
| --- | --- | --- |
| Low completion | Guides Funnel / Screens Funnel | Where does attention drop (step-by-step or screen-by-screen)? |
| High bounce | Insights Summary + First Screen/First Step | Is the opener, promise, or gate creating early exits? |
| High time, low completion | Sessions Table + Session Journey | Are viewers stuck, looping, or getting lost? |
| Low CTA conversion (FAB) | Top Screens + Guides Funnel | Is the CTA too late, unclear, or missing the peak value moment? |
| Playlist drop-off | Playlist Item Insights | Which item breaks momentum (or fails to earn continuation)? |

5️⃣ Guided Demo vs Playlist Benchmarks

| Asset Type | Primary KPIs | Optimization Focus |
| --- | --- | --- |
| Guided Demo | Guide completion (chapters), steps viewed, median time, bounce, CTA conversion (FAB) | Narrative pacing, annotation length, gate placement, CTA timing/copy |
| Playlist | Playlist completion, avg/median time, played %, item-level completion | Sequencing, first-item strength, scope control (3–5 items), "recommended next" momentum |

6️⃣ Common Fixes (Metric → Action)

  • Bounce high: Strengthen the opener (clear promise + value fast); delay gating until after early value (often steps 3–5).
  • Completion low: Move the "aha" earlier; shorten paths; trim or merge redundant guide steps.
  • Time high, completion low: Simplify navigation; clarify "what's next"; reduce modal/text friction.
  • CTA conversion (FAB) low: Move the CTA closer to the peak value moment; use action-driven copy (Book, Start, Unlock, Continue); offer more than one CTA moment (not only at the end).
  • Playlist exits early: Start with the most visual, fastest-to-value item; rename items by outcome; limit to 3–5 items.

7️⃣ Advanced (Optional): ROI Benchmarks

  • Stage conversion (with vs without demo views)
  • Average sales cycle length by demo activity
  • Win / loss by demo engagement
  • Open opportunities with low engagement (re-activation targets)

Rule of thumb: Only use ROI benchmarks once identification coverage and integrations are solid (otherwise attribution will be incomplete).
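As a rough illustration of the first ROI benchmark above (stage conversion with vs without demo views), here is a hedged sketch of the comparison. The opportunity records and field names (`viewed_demo`, `converted`) are hypothetical, not a real CRM export.

```python
# Hypothetical opportunity records; field names are illustrative only.
opps = [
    {"viewed_demo": True,  "converted": True},
    {"viewed_demo": True,  "converted": True},
    {"viewed_demo": True,  "converted": False},
    {"viewed_demo": False, "converted": True},
    {"viewed_demo": False, "converted": False},
    {"viewed_demo": False, "converted": False},
]

def conversion(group):
    """Stage conversion rate within a group of opportunities."""
    return sum(o["converted"] for o in group) / len(group) if group else 0.0

with_demo = [o for o in opps if o["viewed_demo"]]
without_demo = [o for o in opps if not o["viewed_demo"]]

# Lift = conversion rate with demo views minus conversion rate without.
lift = conversion(with_demo) - conversion(without_demo)
print(round(conversion(with_demo), 2), round(conversion(without_demo), 2), round(lift, 2))
```

Per the rule of thumb, this comparison is only meaningful once identification coverage is solid; with many anonymous sessions, the "without demo views" group silently absorbs viewers you failed to identify.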


Want the full framework?
Go deeper on benchmarks, KPI ranges, and optimization examples in
👉 Benchmarking Guided Demos: What "Good" Looks Like in Walnut

Frequently asked questions

Q: What are the core KPIs for benchmarking Walnut demos?

A: The five core KPIs are: completion rate (are viewers finishing?), session duration (are they engaged?), guide step drop-off (where do they lose interest?), CTA click-through rate (are they taking action?), and return viewer rate (are they coming back?).

Q: How do guided demo benchmarks differ from playlist benchmarks?

A: Guided demos are measured on individual completion and CTA conversion. Playlists are measured on playlist-level engagement: how many demos in the playlist a viewer watches, playlist completion rate, and which demo in the sequence drives the most CTA clicks.

Q: What tools in Walnut should I use for benchmarking?

A: Use Walnut Insights for session-level analytics, the Demo Performance dashboard for aggregate trends, and CRM reports (Salesforce or HubSpot) for pipeline attribution. For quick diagnosis, start with the Insights summary view and filter by date range.

Q: How do I know if my demo performance is below average?

A: If your guided demo completion rate is below 50%, session duration is under 1 minute, or CTA click-through is below 5%, performance is likely below average. Check the drop-off point first; it usually reveals whether the issue is content, length, or placement.

