Also called: 404 test
Requires existing audience or product
Relevant metrics: Views, Click-through rate
How: Instead of setting up expensive custom integrations and partnerships, fake it! Build only what is absolutely necessary to advertise your product to real users while faking the rest.
Why: This is a quick and easy way to validate interest in a feature without actually building it: you implement just enough for it to seem real.
This experiment is part of the Validation Patterns printed card deck
A collection of 60 product experiments that will validate your idea in a matter of days, not months. They are regularly used by product builders at companies like Google, Facebook, Dropbox, and Amazon.
Before the experiment
The first thing to do when planning any kind of test or experiment is to figure out what you want to test. To make critical assumptions explicit, fill out an experiment sheet as you prepare your test. We created a sample sheet to get you started. Download the Experiment Sheet.
By building a fake advertisement for a feature or product that doesn’t yet exist, but that links to a “coming soon” page, you can gauge the interest of potential users through the click-through rate (CTR). Two things are tested:
- Whether your product idea sparks the interest of potential customers
- Whether an additional feature is a welcome addition to an existing product
Your goal is to build just enough to predict future reactions based on actual behavior. Data generated from what your users actually do is far more reliable than data from simply asking them whether they are interested.
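The metric behind a fake door test is simple arithmetic: CTR is clicks divided by views. A minimal sketch, with hypothetical numbers (the function name and figures are illustrative, not from the card deck):

```python
def click_through_rate(clicks: int, views: int) -> float:
    """CTR = clicks / views, expressed as a fraction."""
    if views <= 0:
        raise ValueError("views must be positive")
    return clicks / views

# Hypothetical numbers: 1,200 people saw the fake-door button, 90 clicked.
ctr = click_through_rate(clicks=90, views=1200)
print(f"CTR: {ctr:.1%}")  # → CTR: 7.5%
```

Whether 7.5% counts as strong interest depends on your baseline; compare it against the CTR of real, existing features on the same page.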
Testing feature demand in your product
Consider creating a link or button on a website for your intended feature.
If you are testing new ways to present your content, create a button labeled “Switch to map view” or “Switch to tile view”. If users are already satisfied with your existing way of presenting content, chances are they will be less likely to respond to your new option. The same can be true if you choose words that do not communicate well.
Button location and design can be critical
Once you have found evidence that you are on to something, consider running a few extra tests that vary button location and design to increase the reliability of your data. The button’s location and design (color and size) can have more influence than the button text itself, so always consider experimenting with placing buttons in different locations.
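When comparing two placements or designs, a quick way to judge whether the difference in CTR is more than noise is a two-proportion z-test. This is a minimal sketch using only the standard library; the traffic numbers and the "header vs. sidebar" scenario are hypothetical:

```python
from math import sqrt
from statistics import NormalDist

def ctr_z_test(clicks_a: int, views_a: int, clicks_b: int, views_b: int):
    """Two-proportion z-test on the difference between two CTRs.
    Returns (z statistic, two-sided p-value)."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled proportion under the null hypothesis that both CTRs are equal.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical: the same button shown in the header vs. in the sidebar.
z, p = ctr_z_test(clicks_a=90, views_a=1200, clicks_b=60, views_b=1150)
print(f"z = {z:.2f}, p = {p:.3f}")
```

With these made-up numbers the header placement wins at the conventional p < 0.05 threshold; with small samples, keep collecting data before acting on the result.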
Test for copy comprehension and add variations
The words you use in your fake door experiment will greatly influence your result. Consider testing for comprehension first using the Five Second Test before you put your copy on a button. No matter what you do, you will want to test multiple variations of your button copy.
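To test multiple copy variations fairly, each visitor should see the same variant on every page load. One common approach (a sketch, not the card deck's prescribed method; the variant strings and visitor ids are hypothetical) is to hash a stable visitor id:

```python
import hashlib

# Hypothetical copy variants for the same fake-door button.
VARIANTS = ["Switch to map view", "Try the map view", "See it on a map"]

def assign_variant(visitor_id: str, variants=VARIANTS) -> str:
    """Deterministically assign a visitor to a copy variant.
    Hashing the visitor id keeps the assignment stable across page loads,
    while spreading visitors roughly evenly over the variants."""
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    return variants[int(digest, 16) % len(variants)]

print(assign_variant("visitor-42"))
```

Because the assignment is a pure function of the visitor id, you can later recompute which variant any logged click belonged to without storing extra state.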
After the experiment
To make sure you move forward, it is a good idea to systematically record the insights you gained and the actions or decisions that follow. We created a sample Learning Sheet that will help you capture insights as you turn your product ideas into successes. Download the Learning Sheet.
Polyvore outfit sales
When the online store Polyvore tested their “outfit sales” feature, their most uncertain assumptions were whether people were interested in shopping for outfits and whether customers would buy more if they got a bigger discount. They faked the clothing brand, and the product team handled payment and shipping themselves.
Source: Polyvore outfit sales
Tesla build date
When releasing its first car, Tesla deployed a Fake Door experiment to validate demand. To validate willingness to pay before production had even begun, they asked customers to put down a $5,000 deposit to secure a build date. The traditional way would have been to start selling only once the car was out.
Source: Pretotyping @ Work
- UX for lean startups by Laura Klein
- Validating Product Ideas Through Lean User Research by Tomer Sharon
- Fake Doors - How to Test Product Ideas Quickly (presentation at Hustlecon 2013) by Jess Lee
- The Real Startup book - Fake Door Smoke test by Tristan Kromer, et al.