
The Low Cost of Failure

Maverick Garner | July 05, 2018

As the years have passed, I've begun to realize just how much time I spend sitting at a desk or behind the wheel of my car. So at the start of this year, I decided to do something about it - it was finally time to hit the gym and get in shape.

A book I was reading, “Mini Habits” by Stephen Guise, supplemented this ambitious decision. One of my favorite passages from the book posited a maxim that motivated me further: the "Stupid Small Step" plan. The idea is ingenious in its simplicity - to create new and useful habits, getting started means setting goals that are, well, stupidly small, and require little to no willpower or significant cost. Aspiring to complete a single push-up on a daily basis is an easily attainable goal to hit, right?

The point of this exercise (workout pun intended) is to iteratively build progress from a simple, accomplishable foundation - once a person does a push-up, they're likely to try another one, and another, and another.

In the weeks since I adopted this effective method of thinking, I find myself exercising more and more (I'm doing more than one push-up a day, for sure). Part of what keeps me motivated is that, at the end of the day, the initial goal is still exceedingly simple: one push-up a day, minimum. So if I find myself tired from a long day at work, or too stressed to push myself physically, I find personal solace in the fact that accomplishing the minimum still means I'm adhering to my daily workout parameters.

It's that dynamic - setting a low cost of failure to gradually increase returns - that influenced the creation of the article in front of you.

The High Cost of Analytics

Let's apply this concept to data analysis. Ultimately, we analyze data to make important business decisions and provide better value for customers - what do customers do, when do they do it, and why? It's a fluid, ongoing quest for answers, buried in what often amounts to miles of historical numbers and figures.

There are significant cost factors to be considered when making decisions based on data points - mainly time, talent, and theories. Let's go through each of these factors and discuss the complications they present.

The Cost of Time

There are already a handful of tools that can ingest wide ranges of data, allowing data scientists to slice it into pieces, analyze it, and answer specific questions. Doing so is time-consuming for several reasons, which we're going to touch on.

In the interest of full disclosure (and simplicity), here's the big catch: the questions asked of data are often complex and specific. Answering nuanced questions requires additional calculations - or subsets of calculations - as well as extra network transactions. All of that takes extra time.

Another time cost to consider is the size of the data set itself - "big data" sets, for example, often carry the burden of incredibly time-consuming data scans across billions, if not trillions, of separate events. Even if the system architecture powering these scans is built to be scalable, we've learned through experience that the query engine itself, regardless of the solution framework, must be optimized to execute quickly and efficiently.
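To put that scan cost in rough perspective, here's a back-of-envelope sketch; the per-core throughput and core count are illustrative assumptions, not benchmarks of any particular engine.

```python
# Back-of-envelope: how long does a naive full scan take?
# Assumptions (illustrative only): one core filters ~100 million simple
# event records per second, and the scan is spread across 64 cores.
EVENTS = 1_000_000_000_000            # one trillion events
EVENTS_PER_SEC_PER_CORE = 100_000_000
CORES = 64

seconds = EVENTS / (EVENTS_PER_SEC_PER_CORE * CORES)
print(f"Full scan: ~{seconds:.0f} seconds (~{seconds / 60:.1f} minutes)")
# ~156 seconds (~2.6 minutes) -- and that's one question, answered once,
# before any follow-up or refinement.
```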

Finally, even if the questions being asked only require simple answers, businesses are still at the mercy of their data scientists to obtain and share the query results. For busy decision makers, this means more waiting.

Our user interface is built to answer questions in seconds, allowing virtually anyone to see immediately how customers and visitors act and interact with your brand in a visually intuitive, easy-to-use manner. No PhD required.

The Cost of Talent

Finding success with any product or service is a matter of visibility. You've got to know the needs of customers, existing and new, to determine which aspect of your product provides value to them and their business. This means you've got to know every aspect of their day-to-day activities, then determine what processes and interactions your offering can improve.

Though that may sound like a simple endeavor, knowing every user action and interaction in real time and throughout time is an incredibly daunting task. New interactions are created every microsecond, which can add up to trillions of actions per day. Without the ability to ask questions about user activity as part of your everyday job, it becomes impossible to know, forensically and specifically, what your customers find valuable - daily and over time - for the vast majority of users.

Sure, an armchair data analyst could write simple scripts to parse the data into useful answers, but not everyone has a data analyst on deck who's ready to execute at a moment's notice. This means businesses must hire expert analytics consultants who can define the complex, defensible results, outcomes, and metrics needed to make data-driven decisions. That leads to higher costs.
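For a sense of what that kind of ad-hoc scripting looks like, here's a minimal sketch of a one-off analysis script; the events.csv export, its columns, and the question it answers are all hypothetical.

```python
# Hypothetical one-off analyst script: count unique users who reached
# checkout yesterday, from a flat CSV export of event data.
import csv
from datetime import date, timedelta

yesterday = (date.today() - timedelta(days=1)).isoformat()
checkout_users = set()

with open("events.csv", newline="") as f:  # hypothetical export file
    # Assumed columns: user_id, event, timestamp (ISO 8601)
    for row in csv.DictReader(f):
        if row["event"] == "checkout" and row["timestamp"].startswith(yesterday):
            checkout_users.add(row["user_id"])

print(f"Unique users who checked out yesterday: {len(checkout_users)}")
```

Even a question this simple assumes someone who knows the schema, can write the script, and happens to be free when the question comes up - and every new question means another script.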

Analytics consultants aren't a dime a dozen - cultivating the skills to complete these tasks requires years of experience in data analytics and advanced mathematics, as well as strong natural analytical tendencies. Paying for this type of talent isn't cheap, and it certainly isn't the best solution.

The Cost of Testing Theories

Let's pretend, for a moment, that your business has the time and the hired talent needed to arrange and execute the queries required to answer every question posed. There's yet another problem: what if the theory being tested is wrong to begin with? This happens more often than you'd probably think, and the only way to find out is to complete the data collection process and review the results. If the initial theory isn't correct, that means starting the process again from scratch and putting the data through its paces once more.

At the end of the day, every employee within an organization - from marketing professionals to data scientists - has a very specific, and often educated, opinion about why their theory is correct. But having an analytics professional test every theory would take an unreasonable amount of time and resources.

Our platform allows virtually any user to ask unlimited sequences of questions throughout their workday, as an integral part of their job and success.

How it Adds Up

The bottom line? The cost of obtaining answers about your customers and visitors adds up fast. Worse, the cost of failure is too high given the potential risk, which ultimately depends on the technical know-how of the analyst that's been hired. Nobody wants to spend big money to test theories that could lead nowhere - that's really the opposite of doing a single push-up a day. It's more like trying to do 1,000 on your first go - ambitious, sure, but ultimately limited by an unreasonable foundation.

This is a story we've heard from clients time and time again, and it's this collective sentiment that truly illustrates the problem Interana was built to solve.

Interana and the Low Cost of Failure

At Interana, we subscribe to a "low cost of failure" maxim - we want every member of every organization to feel comfortable and confident that our flexible platform can and will provide answers, no matter what. No data analysts or contractors required.

We're on a mission to prove this is possible. Using our platform, you can jump right into the deep end of Lake Data and start swimming with no fear of drowning. The more you swim in your data pool, the more efficient (and less exhausting) the process becomes.

Thomas Edison once famously said, "I have not failed. I just found 10,000 ways that did not work." We built our foundation on these wise words. Part of finding the right answer is finding and identifying the wrong ones, fast. Interana allows users to leverage its comparative nature - continuing to test and alter theories, margins, and questions in real time as they go.
