If you know what this ruler is for, you may already know the history. For those of you who don’t, this is a data verification ruler.
One of the things I think about quite often is where we are in data science and data analysis, and how that applies to market research in particular. The tools we have now are, in a word, amazing. But what I think about most is how we got here, and how that journey has shaped the way we approach our work in its current iteration.
Where We Were
Before computers were on everyone’s desk, the academic world had been using them to analyze data, primarily in the social sciences, where vast amounts of data had been collected for decades. Computers allowed these scientists to perform analyses that had been mostly theoretical, given how cumbersome it was to calculate everything by hand. The pioneers of statistical software were closely associated with academic institutions, as these were the testing grounds for new approaches, new research, and gigantic data sets.
My introduction to data analysis came as an undergrad, when we were exposed to a data tabulation program that was, for its time (the early 1980s), remarkable: it allowed us to view our data in an entirely different way. One of the key parts of this early experience was that we still had to see the data, manually enter the data, and verify the data. While cumbersome, that interaction with the data would provide some valuable long-term benefits.
Market Research in the ’80s
In the late ’80s, while I was working at an academic research institution, computer capabilities ramped up. There was a room that held about 8-10 terminals, each operated by an admin who, for 8 hours a day, converted punch cards into massive data files. Years of academic research were finally being digitized, and we could now see it on a screen and run analyses on the data (even if just to watch it spit out descriptive statistics).
Things were starting to move faster. We were still running batch files to read the data (data we might have entered manually, printed out on massive large-format printouts, and then verified with a ruler that matched the variable locations). We would run a batch file after programming how the data would be read: the data layout, the variable definitions, and then, finally, the analysis. If you made a mistake, you wouldn’t know until you went from the 4th floor down to the 1st-floor print room to see your printout and hope it was more than a page long (which would indicate it had run without error). Oh, and if your program crashed, you had no idea where the error was, just that it crashed.
About this time, our lab had what was the first PC at the university, and I was given the responsibility of learning how to use it. First, I needed to learn how to run programs in DOS. You interacted with the computer directly from the command line, which sounds novel but, really, was for a long time one of the best ways to get anything done (Windows, with all its quirks, wasn’t available yet). The big game changer, though, was the introduction of the first PC version of SPSS. Our lab was one of the pilot test sites for the program. It had color (red and blue). You still ran everything in batches, but you could see it running, and you could see your errors.
Growing with Technology and Data Analysis
We were so much more efficient now; it was hard to imagine being more productive, especially compared to what we had just come from. But things were about to get kicked up again with the advent of Windows and Windows-based programs. Once we could run an analysis with a point and a click, and run it again and again, it brought a sense of exploration and depth to the analysis that we couldn’t have imagined previously, when our analytical approach was all pre-planned.
When I moved into the private sector, I quickly realized that the level of analysis we were used to simply wasn’t a consideration for most market research. I would analyze data and be met primarily with blank stares at any mention of significance testing. Keep it simple: just percentages; maybe toss in the occasional t-test, but don’t get carried away. We would literally have to redo entire reports in what was, in my opinion, a dumbing down of the information. The business world wasn’t quite ready for the new era of data analysis and visualization, but it was starting to get there.
Market Research in the ’90s and ’00s
Throughout the ’90s and into the early 2000s, the mindset of the business world began to change. These tools, which were improving each year, were beginning to make it easier for decision-makers to see the value in a deeper dive into the information. For the first time, we would actually get requests to run a regression analysis.
As the 2000s progressed, the rate of change ramped up. We started out doing mostly phone surveys (thanks to the development of CATI systems), but the idea of online market research was now making waves. Companies were starting to compile email addresses and to work with research firms to use them as a way to gather information without the time and costs associated with large phone surveys. And with this development came the idea of integrating the systems to make them as seamless as possible.
Where We’re at Now
Fast forward to now. Our software allows us to gather data in a matter of hours in some cases, and we can bring that data into analytical software that not only runs all of our analyses but also generates data visualizations that can be integrated into reporting platforms (and kept up to date). Our analysis now includes an array of techniques that, again, are integrated into everything we report on and presented in meaningful visuals. One analyst can now do the work that in the past would have required several people, in a fraction of the time (and more accurately).
We went from literally cutting and pasting our graphics from a printout into a report to having those graphics live-linked to the data source within a reporting document. Our time is now spent evaluating and analyzing the information and providing deeper insights into the results. We can use our experience to highlight subtleties in the results that in the past would have been overlooked. Our approach now goes deeper into context and relies less on superficial results.
But now that we have come full circle, let’s get back to where this summary began: interacting with the data. One of the benefits of this journey has been a constant focus on the data and the data analysis, on actually looking at and interacting with the data. It’s important not to lose sight of what you have in front of you and what you are trying to summarize and analyze. Look at it, scroll through it, and get a feel for what’s in the data set. Explore, explore, explore. Run an exploratory analysis, create different groups of respondents, and then subgroups. Combine similar measures, then run those against other measures. At the end of the day, you now have the time to work this way. Most of it will never make it to the final report, but you will know more about the data than anyone, and you will feel confident in what you’re presenting, which will ultimately shape how your clients and customers view you and your role as a researcher/analyst.
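To make that advice a little more concrete, here is a minimal sketch of what that kind of exploration might look like in Python with pandas. The file name, column names, age bands, and the satisfaction index are all hypothetical placeholders chosen for illustration, not a description of any particular study or of our own toolchain.

    import pandas as pd

    # Hypothetical survey export; the file and column names are placeholders.
    df = pd.read_csv("survey_responses.csv")

    # Look at the data before analyzing it: size, types, and basic descriptives.
    print(df.shape)
    print(df.dtypes)
    print(df.describe(include="all"))

    # Create groups of respondents, then subgroups (here, age bands within regions).
    df["age_group"] = pd.cut(df["age"], bins=[17, 34, 54, 120],
                             labels=["18-34", "35-54", "55+"])

    # Combine similar measures into a single index (averaging related rating items).
    satisfaction_items = ["sat_overall", "sat_service", "sat_value"]
    df["satisfaction_index"] = df[satisfaction_items].mean(axis=1)

    # Run the combined measure against other measures: mean satisfaction by
    # age group within each region, with counts so thin cells are obvious.
    summary = (
        df.groupby(["region", "age_group"], observed=True)["satisfaction_index"]
          .agg(["count", "mean"])
          .round(2)
    )
    print(summary)

None of this would go into a client deliverable as-is; the point is simply to touch the data from several angles before deciding what the report should say.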
I hope you have enjoyed this history of data analysis from Cairn Consulting Group.
Let’s Talk Market Research
Looking to leverage your data? Set up a chat with our team, and let us use our extensive experience and knowledge to help you build your next market strategy.