I have no idea what converts anymore.
The more split tests we run… the less able I am to predict what's going to win. I genuinely don't know anymore. And I think that's a good thing. The Dunning-Kruger effect is finally rearing its head, I guess.

Here are some recent tests that LOST but "should have" won based on conventional wisdom/common sense:

Image vs. video on a book-a-call page. Funny enough, this one started as an accident. When uploading the video, we accidentally published the thumbnail instead of the video. Turns out, the thumbnail won. To validate the result, we ran the test twice more and… you guessed it, same thing. Image beat video. Everyone says "video converts better!" Yeah… not always.

Another one: on an upsell page for booking a call, we're noticing the same thing. Having no video is converting better than having an objectively good video that does an amazing job of framing the call. If you'd asked me to bet money on either of these beforehand, I would have absolutely said video was going to win. Without a doubt.

Headlines? My "favorite" of the 4 we test loses plenty of times. Sometimes the most basic, least "copywritten" headline wins by a huge margin. We're testing a headline on an opt-in right now. I think the control is objectively better, but it's losing by 25% to one of the variants.

Takeaway: best practices aren't always best for YOU. When someone tells you something is GUARANTEED to convert better… be very wary, unless it's something trivially obvious (e.g. a working buy button converts better than a broken one).

I can't tell you which test will win… but I can tell you that if you consistently test, your business will win.