Product Review Script Template
A high-converting product review structure for any category — with a full written example reviewing Notion after six months of real use, and the three patterns that kill trust in review content.
Sample Hook
"I've used [product] for six months. Not a first impressions video — actual daily use. Here's the honest assessment, including the parts the company probably wishes I'd leave out."
Product Review Script Template
Use for any product category: software, apps, digital products, physical goods, courses, and services. Adapt sections to your product type.
Creator Archetypes
Thomas Frank
Frank tests products inside his actual work — he'll show Notion being used to plan his video production schedule, his content calendar, his reading list. The product is never reviewed in isolation. Every feature evaluation is embedded in a real workflow question: does this make the work faster or slower, more organized or more complicated? His reviews are structured around the question "did this change my workflow?" rather than "is this a good product?" The former is a claim he can prove with before/after comparison. The latter is a judgment call. His "before this, I used X" frame is structural: it creates a baseline that makes every comparison meaningful.
Ali Abdaal
Abdaal segments reviews by use case rather than by feature. A Notion review isn't "here are the features" — it's "here's what Notion does for students / for content creators / for solo founders / for teams." Each segment creates a distinct Identity Debt moment: "is this the segment for me?" Viewers self-select into the relevant segment and skip the irrelevant ones, which reduces frustration and increases trust. His explicit comparison to prior tools — "I used to do this in Bear Notes, here's how this compares" — answers the implicit question every product review viewer has: "should I switch from what I have?"
Written Example: Notion After 6 Months
This section is written out in full — no brackets. Use it as a model for how to execute the honest-review structure with real workflow context.
Six months ago, I moved my entire system — content calendar, script drafts, research notes, project tracking, reading list — from four separate apps into Notion. I want to tell you whether that was a good decision.
Quick verdict first: for solo creators who want everything in one place and are willing to spend a few weeks building their system, Notion is excellent. For people who want a tool that works out of the box without setup time, there are better options. I'll be specific about both.
The thing that surprised me most in the first month was how much setup Notion requires before it's useful. Every other tool I moved from had working defaults — Bear Notes worked immediately for notes, Asana worked immediately for tasks. Notion's defaults are deliberately minimal. You are building the system, not adopting one. If you find that appealing, you're the target user. If you find that draining, don't buy in.
Here's what changed in my actual workflow six months later: I now have one place where a video idea exists from first concept through final draft and publication checklist. Before, a video idea lived in Bear as a note, moved to a Google Doc when I started scripting, got tracked in Asana as a project, and disappeared from view between those steps. The connective tissue between those stages was me manually remembering to transfer it. I dropped things. With Notion, I don't.
The database feature is where Notion earns its complexity cost. Once you understand that a database is not a table but a collection of pages with properties — meaning every row is a full document, not just a cell — the whole system clicks. A content calendar where every card is the actual script, with status properties showing exactly where it is in production, is the thing I couldn't replicate in any simpler tool at any price.
What Notion doesn't do well: rich text editing. Writing long-form content in Notion feels worse than writing it in Bear or in a dedicated editor. I still draft long scripts elsewhere and paste them in. The mobile app is also meaningfully worse than the desktop version, which matters if you take notes on your phone.
The people who should use Notion: solo creators and small teams who want their entire workflow in one database-linked system and are willing to spend a weekend building it. The people who should skip it: anyone who wants a working system on day one, anyone whose primary use is long-form writing, anyone who wants a polished mobile experience.
The Template
Hook (0:00–0:30)
For long-term honest reviews:
"I've used [product] for [time period] — not a first impressions video. Here's my honest assessment after actual daily use, including the parts [company] probably wishes I'd leave out."
For sponsored but honest reviews:
"This video is sponsored by [brand]. That said — you know I've turned down sponsorships before, and I'm going to tell you exactly what I think about [product], including what doesn't work."
For comparison reviews:
"I switched from [previous product] to [new product] [time ago]. Here's whether it was worth it — including the things I missed and the things I didn't."
Who This Review Is For (0:30–1:30)
Segment the audience early. Viewers who aren't the target user should know so they can self-select out rather than watching and feeling misled.
"This review is most relevant if you're [specific use case with real context]. If you're [different situation], I'll tell you upfront that [alternative] is probably better for your needs — here's why."
Quick Verdict (1:30–2:00)
State the verdict before the evidence. Viewers who want the detail will stay for the reasoning.
"Quick verdict: [buy it / skip it / wait / use it for X but not Y]. Here's the reasoning."
The Honest Setup (2:00–4:00)
Context that makes the review credible:
- How you've been using it (specific use cases, frequency, duration)
- What you used before and why you switched
- What you were hoping it would solve
"Before I get into specifics: here's my context. I use [product] for [specific workflow]. I came from [previous solution]. The specific problem I was trying to solve was [problem]. That context matters because this review is about whether it solved that problem — not whether it's impressive in general."
Core Feature Evaluation (4:00–7:00)
Test the primary thing the product claims to do.
"The main reason people buy [product] is [core claim]. Here's how I tested it: [specific methodology — not a first-day impression, but a stress test with real conditions]. Here's what I found: [results with specifics, not adjectives]."
Translate features into workflow: "the database feature" means nothing. "Being able to see all 47 videos I've made, filtered by status, with one click to the actual script" means something.
What Surprised Me (7:00–9:00)
Both positive and negative. This section distinguishes real reviews from press releases.
"Here's what I didn't expect. [Positive surprise — something that works better than marketed, or solves a problem you didn't know you had.] And here's the honest negative: [limitation that only shows up with real use, not in a demo]. For most users, [whether this changes the recommendation]."
Who Should and Shouldn't Buy This (9:00–10:30)
Be specific enough that a viewer can self-sort accurately.
"Buy if: [specific use case description with enough detail that the viewer can tell if it applies to them]. Skip if: [equally specific description of who this doesn't serve]. Worth noting: [pricing, alternatives, upcoming version considerations if relevant]."
Final Verdict and CTA (Final 60 Seconds)
"After [time] with [product]: [two-sentence summary]. For [specific buyer], the answer is [recommendation]. For [other buyer], the answer is [different recommendation]."
"If you want to compare this to [main alternative], I've done that in [this video]. Links are in the description."
What Kills This Format
1. Reviewing features instead of outcomes. Nobody cares if a product has a "database" feature. They care whether that feature makes their specific work faster, better organized, or less frustrating. Every feature evaluation should answer: "what does this mean for how I actually use this?" Not "what does this feature do?" but "what does having this feature make possible?" Reviews that stay at the feature level read like spec sheets.
2. "It's not for everyone" as a hedge. Every product is not for everyone. Saying so provides no information. What creates value is naming specifically who it's for and who it isn't — with enough specificity that each viewer can self-sort. "This is for creators who want everything in one system and will spend a weekend building it" is a useful claim. "Whether it's right for you depends on your needs" is not. The hedge finish signals that the creator is afraid of being wrong, and viewers detect it as lack of commitment.
3. Reviewing too early. A two-week review of any tool with real depth is a first-impressions piece, not a review. The distinction matters because tools with learning curves — productivity apps, creative software, complex services — often feel worse at week two than at month three, once the learning investment has paid off. A genuine review requires enough time to have changed your workflow: to have built patterns, hit edge cases, discovered limitations that don't appear in demos, and evaluated the product against what you actually needed it to do.
Quick Reference
- "Is [product] worth it?" and "[Product] review [year]" are your highest-volume title templates — the year is essential for search freshness
- Update old reviews with "UPDATE:" in the title when significant version changes happen — these re-index and signal currency
- Your most-shared section will almost always be the honest negative — don't soften it
Want this written for your specific video?
YouScript uses this structure to write a complete script in your voice — with your hook, your transitions, and your CTA. Takes 15 minutes.
Generate my script free →
First 3 scripts free. No credit card required.