Interview with Chris Goward, Founder + CEO of WiderFunnel

Estimated Reading Time: 14 minutes
March 7, 2016

Chris Goward is the founder and CEO of WiderFunnel, an optimization services company. He is also the author of You Should Test That!, a guide to creating dramatic business improvements through optimization. We spoke with Chris about his book, the pitfalls of website redesign, and how to get the most out of optimization and value proposition.

Key Learnings:

1. Chris wrote You Should Test That to encourage the industry to think more about optimization as a strategy and a program. There are still a lot of companies that don’t have a good process in place, but there has been some improvement.

2. WiderFunnel approaches optimization challenges by looking at the unique situation from the customer’s perspective. They use certain frameworks to identify barriers and then attack them with totally different tactics depending on the situation, rather than using one technique for every problem.

3. Understanding the customer should happen separately from the testing phase; there is an exploration phase and a validation phase, and they require completely different mindsets.

4. If you have a strong value proposition that meets the customer’s needs, you have a high potential for getting that sale. The message also needs to be clear and relevant to the customer.

5. Write out three columns to help visualize your value proposition: what the prospects want, what your competitors offer and what you offer. Look at the similarities and differences to help you come up with three or more different plans to test.

6. Redesigning your website will not always fix optimization problems. Test the old website vs. the proposed new website before launching it to make sure it’s more viable than the original version.

7. Companies often buy personalization technology before validating what type of personalization will actually impact their performance, and a lot of companies are actually over-personalizing. In doing so, they reduce their ability to run more important tests that can impact strategy, while dramatically increasing the complexity of their websites.

Interview:

ML: Thanks for speaking with me today, Chris. So, to dive in, you wrote You Should Test That in 2013. What is the state of testing and optimization today vs. where it was just a few years ago when you started writing your book?

CG: Quite a bit has changed in three years. We started WiderFunnel in 2007, and we were some of the first to talk about the concept of strategically doing A/B testing to drive optimization. When I wrote the book, most people were thinking of optimization in terms of trying to improve their websites by looking at so-called best practices and tips and tricks. There was no testing to see if the conversion barriers had improved. I wrote You Should Test That to encourage the industry to think more about optimization as a strategy and a program. There are still a lot of companies that don’t have a good process in place, but there has been some improvement. I do think that optimization has become much more accepted as a default strategy, which is great because companies can now start developing those conversations into better programs with rapid-cycle testing, more insight-driven tests, and UX tweaks or design-focused elements. Looking at optimization helps you understand your clients and customers at a much deeper level.

ML: What was your initial motivation for writing the book?

CG: I initially put off writing the book for two years because I was so busy building the company and supporting clients. But in doing that support role, it became apparent that companies needed to hear our perspective. People were asking me for a consolidated overview of how to approach optimization to get consistent results. So, I wrote down a lot of examples and mapped out the frameworks to help clients get commercial revenue lift consistently day after day, and the book was born.

ML: Interesting. At times, when we do tests with our own clients, they’ll have this “aha” moment when they test something and get unexpected results. Their surprise has been surprising to me, but I suppose their reaction supports the importance of this testing methodology. Can you share some of those “aha” moments that you’ve had with your own clients?

CG: Yes, it is very much a humbling business, and we are also often surprised with the results we get. In fact, a couple of weeks ago, we ran a series of blog posts called the “Aha Moments” series, where we posted five different surprising test results to learn from. One test that sticks out for me is using social proof, both in headlines and on certain landing pages and banners. That has worked very well for some target audiences and very poorly for others. Through this test, we found that A/B testing can be used to understand the persuasive triggers and motivations of customers. We started by using frameworks to understand potential value proposition positioning approaches. Then, we used the testing within a target audience to get hints, which we validated and revalidated. For example, in one case, we wanted to test the importance of social inclusion in the online community vs. the tangible aspects of the service our client was offering vs. the quality of the product. We were easily able to test different motivational drivers via carefully ordered headlines, and we found that the social inclusion piece seemed to be very important for people. Yet, we tried a similar approach for another client and found completely different motivational drivers behind their customer’s actions. So, each target audience really has its own unique profile in terms of persuasive product-brand fit.
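The headline tests Chris describes ultimately come down to comparing conversion rates between variants. As a minimal sketch of the statistics behind such a comparison (the numbers and the two-proportion z-test approach are illustrative assumptions, not WiderFunnel's actual methodology):

```python
from math import sqrt
from statistics import NormalDist

def ab_test_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pool the two samples to estimate the standard error under
    # the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical test: a "social inclusion" headline vs. the control
z, p = ab_test_z(conv_a=120, n_a=2400, conv_b=156, n_b=2400)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With these made-up numbers the lift from 5.0% to 6.5% is significant at the conventional 0.05 level; with smaller samples the same lift would not be, which is why the "validate and revalidate" step matters.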

ML: Do you have a go-to technique that you use when you do optimization work? For example, some people really like testing buttons; others like testing forms or value proposition, etc. Where do you typically start?

CG: That’s a great question, because I think most people approach optimization by prioritizing certain techniques. But we actually take a completely different approach. Since day one in 2007, we’ve approached optimization challenges by looking at the unique situation from the customer’s perspective. We use certain frameworks to identify barriers first and then attack them with totally different tactics depending on the situation, rather than using one technique for every problem.

ML: How would you suggest that organizations use their own data to do this type of analysis? Many organizations have access to Google Analytics or Omniture, but when they begin to ask questions about optimization, they’re rarely looking at their own data set to glean the insights they need to make changes. Do you have any feedback on how our data can help us build the right type of testing process?

CG: It is so critical to understand your available data before starting an optimization program. We’ve actually released an updated version of our optimization process called the Infinity Optimization Process™, which details the kind of data that is important to access and understand before analyzing the customer’s experience. Understanding the customer should happen separately from the testing phase; there is an exploration phase and a validation phase, and they require completely different mindsets. In the exploration phase, you utilize user testing, focus groups and usability studies to understand who your users are. Eventually, you should have about five main data sets to look at to really get into the mind of the prospect and the situation. Then, you look at what techniques you can apply to impact conversion and analyze the experiences in the validation phase. So, the exploration phase is an expansive phase of identifying potential opportunities and the validation phase is a deductive phase of finding out which ones are actually working; both are critical for the optimization program.

ML: In your book, you talk quite a bit about how to remove friction on a website. We’re all being bombarded with hundreds of messages online every day, and there’s such a scarcity of attention. How do you capture and retain attention on a website long enough to get a person to complete a purchase (or perform any other call to action)?

CG: It’s all about meeting the needs of the customer and having a strong value proposition. If you have a strong value proposition that meets the customer’s needs, you have a high potential for getting that sale. The message also needs to be clear and relevant to the customer. The clearer the presentation and imagery is on your site, the fewer distractions there are, and the more likely they are to convert. Your website needs to convey that you can meet all of your customers’ needs with your valuable solution. It helps to have a sense of urgency about it, too.

ML: In your book, you dedicate a whole chapter to value proposition. Do you have a certain method that you’d recommend for testing value proposition?

CG: If you focus your attention on the product, you can hone in on a strong, unique value proposition. It’s important to look at what your prospects want, and compare that to what you offer and what your competition offers. I would even write out three columns to help visualize it: what the prospects want, what your competitors offer and what you offer. Look at the similarities and differences to help you come up with three or more different plans to test.

ML: I like how you talk about website redesign in your book, as well. So many companies think that their problems could be solved by simply implementing a redesign, but that’s not always the answer. What advice do you have for organizations that tend to build new things without testing their viability instead of slowly introducing changes through thorough testing?

CG: What you’re getting at there is really a question of cultural change and organizational will. If a company is traditional and is committed to a certain idea, it’s usually very hard to change that approach or opinion. To steer organizations away from a misguided redesign, what you can do is plant seeds of “FUD”– fear, uncertainty and doubt. At the very minimum, if we can’t convince a company not to go through with a traditional redesign, we at least try to get them to test the old website vs. the new website. The companies who do this are able to avoid major redesign mistakes; I’ve seen certain redesigns go very wrong, destroying years of work. But in the end, it’s really about creating a culture shift to a more data-driven approach, and doing preliminary testing of a redesign is a great way to introduce higher-ups to how valuable data can be.

ML: What are your thoughts on mobile testing vs. web testing? Should they be treated differently?

CG: Mobile should be treated differently from a technical perspective, but they’re very similar from a process and planning perspective. Your audience is essentially the same, but they just happen to be viewing your material in a different context. So, there are different distractions to be aware of depending on the context, and you might need to test different calls to action for mobile vs. desktop. But the process is the same when thinking about the problem from the customer’s perspective; it’s just the variables that are different.

ML: Would you be able to share which elements you’re currently testing on your own company website?

CG: We’ve been testing some of our blog offers, subscriptions and hooks. In the past, we’ve also tested our actual information architecture, calls to action and major layout elements. We’ve rolled out our new template designs in the past three months, so there’s been quite a bit of testing in the past year.

ML: Approximately what percentage of an organization’s digital budget should be spent on testing?

CG: I’m not sure I have a benchmark for that because it depends so much on the scale of their revenue, and some companies can run more tests than others if they have a higher traffic to revenue ratio. We approach it more on a case-by-case basis; we track the program at the ROI level and make sure that we’re delivering at least 100% ROI on the program. It varies so much from client to client because we’re delivering between 100% and 2,000% ROI depending on the traffic and revenue volume.
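The ROI figures above follow the usual formula, ROI = (incremental revenue − program cost) / program cost. A quick illustration with hypothetical numbers (not Chris's actual figures):

```python
def program_roi(incremental_revenue, program_cost):
    """ROI as a percentage: 100% means the program returned
    double its cost (cost recovered, plus an equal amount on top)."""
    return (incremental_revenue - program_cost) / program_cost * 100

# Hypothetical: a $20k/month program that lifts revenue by $60k/month
roi = program_roi(60_000, 20_000)
print(f"{roi:.0f}% ROI")  # 200% ROI
```

On these assumed numbers the program sits comfortably inside the 100%–2,000% range cited, and the same formula makes it easy to see why higher-traffic sites, which can run more tests per dollar of program cost, tend toward the top of that range.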

ML: Looking ahead, how do you think personalization will impact testing processes?

CG: Personalization is a hot topic this year. It’s a tactic that can create more relevant experiences and cause a lot of great impact, but it’s only a hypothesis until it’s tested. Companies often buy personalization technology before validating what type of personalization will actually impact their performance, and a lot of companies are actually over-personalizing. In doing so, they reduce their ability to run more important tests that can impact strategy, while dramatically increasing the complexity of their websites. That said, personalization is still very powerful for a lot of companies, especially in certain niches. We’ve actually written an eight-step process for testing personalization on our blog.

ML: Do you foresee any major changes in the process of optimization and testing?

CG: At its core, optimization should be about building understanding with the customer, so it shouldn’t change much from that perspective. The tactics will change, and the technology will continue to change, and the data that’s available will become more and more valuable and clear. The testing will continue to progress as our data resources grow. For example, a few years ago, we didn’t have access to many on-site survey tools because of how expensive they were, but now they’re dirt cheap or free. I think the process is going to get even more refined and insights-driven for faster results.

ML: There are many different tools specifically built for testing. Do you have any favorites that you would unofficially endorse?

CG: You need a lot of different types of technology for all the different testing phases. For the exploration phase, we use all kinds of different tools to gather insights from our customers. Google Analytics is generally the preferred analytics platform, but there are others that are good, as well. A lot of companies also have their own homegrown survey tools for collecting NPS data, which is very powerful. We’re actually building a tool right now to help companies manage their optimization projects and insights so they can access all of their test archives. A/B testing tools like Optimizely and Adobe Target are still used by a lot of companies. I’m sure there will be new ones coming out, but those are the standouts right now.

ML: Last question for you, Chris: Do you have any new upcoming books?

CG: People keep asking for something new, and I’m not sure. We’re still potentially thinking about publishing something. I don’t know yet if we’ll do the traditional book approach.

ML: You know, I interviewed another author who said he wasn’t ready to write another book yet because everyone keeps complimenting him on the book without actually taking the advice from the book. So, he’s decided not to write again until people start following his advice.

CG: That’s a good answer.

Author

  • Michael Loban is the CMO of InfoTrust, a Cincinnati-based digital analytics consulting and technology company that helps businesses analyze and improve their marketing efforts. He’s also an adjunct professor at both Xavier University and University of Cincinnati on the subjects of digital marketing and analytics. When he's not educating others on the power of data, he's likely running a marathon or traveling. He's been to more countries than you have -- trust us.

Last Updated: May 12, 2023
