Friday, September 13, 2013

Two year lapse

Just a quick note to confirm that the Inside Market Research blog has gone through a two-year lapse in publication. I wonder, do subscribers even care?

The lapse came down to a few things:

1. Feeling overextended with various work projects.
2. Not being able to publicly disclose much of my Comcast Business research (it's proprietary).
3. The belief that Facebook, Twitter, and LinkedIn are replacing blogs as social tools of communication and dialog.

I'd like to see if this post generates any feedback, and then make a decision about whether to revive this platform, or let it continue to act as a dusty library of content (which continues to garner about 10 or 11 page views per day). Whatever the outcome, I do sincerely appreciate those who have followed my writings here and elsewhere on the Internet.

Saturday, June 04, 2011

Who's reading Inside Market Research?

Last month, we passed the sixth "candy or iron" anniversary of Inside Market Research without much fanfare. While the blog was launched in May 2005, it wasn't until April 2006 that the Google Analytics monitoring widget was installed. I'm a big fan of Google Analytics and other web traffic tools (like StatCounter). These tools allow a publisher to learn more -- oftentimes much more -- about who is visiting a website and how they're using it.

For the 5-year history of the Analytics-enabled blog, it looks like we've had over 37,600 visits, with about 1.25 pages being opened per visit. Visitors spend about 49 seconds on the site, on average. As I've always known, my article that compares churn rates is by far the most popular of my pages, accounting for almost 30% of all the page views. The next-most visited page is an article about sample sizes, garnering about 9% of page views.

However, if you drill down more carefully into the data, I think you can make some other interesting discoveries. For this task, I decided to take only the most recent 24 months of traffic data, so that the findings suggest more current trends and distributions.

For example, can we estimate the market share of various Internet Service Providers based on the U.S.-based traffic to my blog? The traffic statistics would suggest so:
  • Comcast - 27%
  • RoadRunner (Time Warner) - 15%
  • Verizon - 10%
  • SBC Global / BellSouth / PacBell (AT&T) - 10%
  • Cox - 5%
  • Charter - 4%
  • Optimum Online (Cablevision) - 4%
  • Comcast Business Class - 3%
  • Qwest - 3%
  • Verizon Wireless - 1%
  • Cogentco - 1%
  • XO - 1%
  • All others & unknown - 16%
That strikes me as fairly accurate, and it lines up rather closely with other independent measures. The Comcast share is likely somewhat inflated, since I publish from a local Comcast area, and I am also a Comcast employee.
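The share estimates above boil down to counting visits by each provider's network name and normalizing to percentages. A minimal sketch of that calculation, using purely illustrative visit counts (not the blog's actual data, and the network names are my own shorthand, not exact Google Analytics labels):

```python
from collections import Counter

# Hypothetical visit counts by ISP network name, as a tool like
# Google Analytics might report them (illustrative numbers only).
visits = Counter({
    "comcast cable communications": 2700,
    "road runner holdco llc (time warner)": 1500,
    "verizon internet services": 1000,
    "att internet services": 1000,
    "cox communications": 500,
    "all others & unknown": 3300,
})

# Convert raw counts to rounded percentage shares of all visits.
total = sum(visits.values())
shares = {isp: round(100 * n / total) for isp, n in visits.items()}
for isp, pct in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{isp}: {pct}%")
```

The same tally works for any geographic slice of the traffic; the caveat, as noted above, is that the publisher's own network can skew the counts.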

Using the same methodology, we might learn that in Australia, BigPond Broadband and TPG Internet are neck-and-neck for top spot in ISP market share. Or that India's leading ISP is Airtel Broadband, followed by Tata Indicom (VSNL). In the UK, it's British Telecom with about a 30% lead on Virgin Media, which itself has about a 30% lead on BE Internet.

But you can get even nosier about your visitors. For instance, I looked at all of the web domains that had at least 5 unique visits to my blog over the past two years. The domain that seemed to be most interested in my content was the office of Bnei Moran Productions in Israel, spending an average of nearly 10 minutes per visit on my site. Within the United States, the honor of "most interested in my blog" goes to Health Care Service Corporation, clocking in at 7:41 per visit.

Since my blog is about market research, it's interesting to note that some market research and similar consulting firms spend a bit of time reading my commentary. In order of depth of interest (as measured by time per site visit):
I would like to thank these five companies for taking a bit more time to read my thoughts and findings on Inside Market Research!


Wednesday, November 10, 2010

Trusting results of product concept research

Any quantitative study that tests purchase interest in a new or unfamiliar product concept will produce results that are likely to be questioned, even by those who designed the study. At my current employer, the Market Research team has taken considerable pains to normalize the variance that occurs in estimated take-rates from product to product and study to study. We've done this by:

  • Creating standardized wordings for questions capturing interest and purchase intent;
  • Creating standardized scales for questions concerning agreement, likelihood, satisfaction, and switching; and,
  • Retaining archival data results from previous studies to make benchmark comparisons.
Despite this, the fickle nature of telephone and web-based survey audiences remains a hobgoblin of professional consumer researchers. Twenty years ago, response rates on telephone surveys would easily surpass 35% or 40%, whereas now they are fortunate to break 20% (not to mention that 18% or more of the population no longer has a land-line telephone). Ten years ago, response rates on web-based surveys would commonly fall in the 5% to 8% range, whereas now it is not unusual to obtain less than a 2% response rate (not to mention concerns that many panels are stacked with "professional respondents"). Frankly, despite all of our efforts to be consistent with our data and our expectations of that data to guide insights, the changing world makes it ever more difficult to obtain "reliable" measures that uniformly track the general population.

But, we learn certain compensatory tricks and caveats. For example, we know that consumers typically under-report common daily activities (e.g., time spent watching TV will often get reported around 22 to 24 hours per week, but when the actual "people meter" is switched on by Nielsen, it's typically closer to 31 or 32 hours per week). Conversely, consumers will over-report infrequent activities (e.g., we're seeing consumers over-report online long-form video viewing, but ethnographic studies that observe actual waking-to-bedtime behavior suggest that this activity is potentially over-reported by a factor of 5x to 8x).

A recent study regarding a new sports-related product/service has returned a mountain of data, based on a questionnaire that was carefully designed and vetted by some of our company's and the vendor's best personnel.

We knew going into this research that the presentation of such a multifaceted product would likely require a video format to convey all of the features to the respondent. On the other hand, most of our new product concept testing does not enjoy the benefits of a glossy video presentation, so some of the "benchmark" data loses its comparability. People tend to more warmly embrace a concept that's been presented to them in a stimulating, engaging way (such as a video clip) than when they're presented with only words on a page.

So, when the results came back showing rather strong interest in the concept, it didn't take long for us to begin wondering whether it was the slick presentation of the concept (compared to other, more typical presentation formats) that gave it an edge against benchmarks.

In this particular case, we had also asked some "true or false" questions to gauge whether or not the respondents truly understood what the product offered, or whether they had gotten carried away with imagined promises of delivered benefits. We concluded that at least two-thirds of the respondents really had a good grasp of the concept (getting three out of three of the true/false questions correct), and so that helped ease everyone's concerns about potentially inflated take-rates.
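The comprehension screen described above reduces to a simple tally: count the respondents who answered all three true/false items correctly. A minimal sketch, assuming each respondent's graded answers are stored as booleans (the data here is invented for illustration, not the study's actual responses):

```python
# Each inner list records whether a respondent answered each of the
# three true/false comprehension items correctly (illustrative data).
responses = [
    [True, True, True],
    [True, True, True],
    [True, False, True],
    [True, True, True],
    [False, True, True],
    [True, True, True],
]

# A respondent "grasps the concept" only with three of three correct.
grasped = sum(all(r) for r in responses)
share = grasped / len(responses)
print(f"{grasped} of {len(responses)} respondents ({share:.0%}) got all three correct")
```

In practice the same cut can then be used as a banner point, comparing take-rates among those who did and did not pass the screen.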

What do you do in your organization when you encounter research situations such as this?


Sunday, October 10, 2010

The importance of competitive bidding

When businesses seek to conduct impartial research about a subject near and dear to them, I think it's an important practice of good governance to obtain competitive proposals and quotations from at least three reputable vendors.

I don't think it's good practice to allow "the new guy" to wire the contract to his former employer, then when publicly called out about it, to ignore the problem entirely. It would seem that the world's fifth-most popular website doesn't see things my way.

Congratulations, Q2 Consulting LLC. You're surely the pride of Oklahoma now.


Tuesday, March 23, 2010

What is Field and tab?

A wiki formatted definition, as I write it, is:

'''Field and tab''' refers to a limited set of services provided in the [[marketing research]] industry. The name refers to the task of '''''field'''ing'' a questionnaire (that is, interviewing consumers, or whoever constitutes the target market, to collect their responses to an array of questions), then '''''tab'''ulating'' the resulting data into convenient two-dimensional tables (called "cross-tabulations") based on answers to at least two of the questions included in a survey.

{| border="1"
|+ An example of a '''cross-tabulation''' (simple)
! Answer choices !! All Respondents !! Males !! Females
|-
! Voted for Democrat
| 55% || 50% || 60%
|-
! Voted for Republican
| 45% || 50% || 40%
|}

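A cross-tabulation like the one above can be produced from respondent-level data in a few lines of pandas; the records below are hypothetical and chosen only to illustrate the column-percentage layout.

```python
import pandas as pd

# Hypothetical respondent-level records: one row per respondent,
# with sex and reported vote (illustrative data only).
df = pd.DataFrame({
    "sex":  ["Male", "Male", "Male", "Male",
             "Female", "Female", "Female", "Female", "Female"],
    "vote": ["Democrat", "Democrat", "Republican", "Republican",
             "Democrat", "Democrat", "Democrat", "Republican", "Republican"],
})

# Cross-tabulate vote by sex, expressed as column percentages.
xtab = pd.crosstab(df["vote"], df["sex"], normalize="columns") * 100
print(xtab.round(0))
```

Production tabulation packages add banners, weighting, and significance testing on top of this basic operation, but the underlying table is the same.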
A field and tab research vendor will typically not be responsible for drafting a questionnaire or assisting on high-level sampling design discussions. Likewise, after the data has been collected and tabulated, the vendor will typically not interpret the resulting data nor prepare a deck of presentation slides. These responsibilities fall on the client (the research buyer) or on a [[consulting|consultant]] that the client may hire.

Field and tab is always a part of the offering of a [[full service research]] vendor. A full-service firm will sometimes offer only their field and tab component when the budget for a particular study is limited.

* Hague, Paul N., ''Market Research: A Guide to Planning, Methodology & Evaluation'' (3rd edition), Kogan Page, 2002.

[[Category:Research methods]]

Note: The content published above is released to the public by Gregory Kohs, under the terms of the Creative Commons Attribution-Share Alike 3.0 Unported license. All reasonable attempts should be made to attribute the original content to Gregory Kohs.
