Dedoose – providing context to qualitative research

One of the challenges I have found over the years in my career has been getting technical (read: engineering) people to accept and appreciate the value of qualitative research.


In marketing, customer feedback is a critical part of understanding what customers need and value.  Whether the feedback is around product features, advertising, customer satisfaction, messaging, or other important customer engagement areas, qualitative research plays a key role.

Technical people by nature tend to gravitate toward quantifiable data.  This is why surveys with Likert scales (e.g., 1–5 ratings) or numbers-driven research (unit volumes or click data) are frequently favored by more quantitatively minded people.  Part of this is because qualitative data is, by its very nature, open to interpretation and easy to manipulate.  Regardless, while quantitative data provides the measure, qualitative data provides the flavor – it provides the deeper understanding behind the quantitative data.

Because of the challenge of presenting qualitative data to broader audiences, I’ve learned over the years to use several tricks to make this data more palatable – things like boiling down the data to the “Top 10 most common statements” or “This message was consistent among X focus group participants”.  This works in some cases, but it still leaves a lot to be desired.
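To make the “Top 10 most common statements” trick concrete, here is a minimal sketch of the counting step using Python’s standard library.  The codes and excerpts are invented for illustration – this is not Dedoose’s implementation, just the arithmetic behind the summary.

```python
from collections import Counter

# Hypothetical coded responses: each interview excerpt has been tagged
# with one or more codes by the research team (all names are invented).
coded_excerpts = [
    ["price too high"],
    ["hard to set up", "price too high"],
    ["great support"],
    ["price too high", "great support"],
    ["hard to set up"],
    ["price too high"],
]

# Flatten the per-excerpt code lists and count each code's frequency.
counts = Counter(code for excerpt in coded_excerpts for code in excerpt)

# The "Top N most common statements" summary is then one call away.
for code, n in counts.most_common(3):
    print(f"{code}: {n}")
```

The same tally, presented as “4 of 6 excerpts raised pricing,” is often the fastest way to make qualitative findings land with a quantitative audience.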

Enter a fantastic new tool called Dedoose.  Dedoose is an online research tool that can be used individually or by teams of researchers.  It allows for the management of qualitative data or mixed-methods input (both qualitative and quantitative data).

Dedoose provides a structured way to code transcripts from customer interviews, focus group research, email or online responses (think customer service emails as research input!).  Based on the coding, the data can be manipulated and presented in ways that make it understandable and broadly usable.  Because of the workgroup capabilities, people outside the research team can be given access to the data for their own review and manipulation (dangerous, I know – but at least possible!).

One of the (many) nice things about Dedoose is that it’s easy and fun to use.  Our research group had several folks who were not computer- or data-oriented, and they were able to immediately begin using Dedoose and contribute to our project.

Why is this such a powerful approach?  It allows marketing teams to conduct research using different methods and use the Dedoose platform to integrate the results into usable, actionable data quickly and accurately.  Spending time defending qualitative findings is unproductive time for marketing folks.  Tools that provide better integration, analysis, and presentation of qualitative data are invaluable in getting to the answer, and to organizational understanding, more quickly.


I have no direct relationship with Dedoose other than as a subscriber.  I’m using it in some of my academic research, and it’s one of the new tools I’ve come across that I believe provides real value for marketers.


Rumors of the Death of the 4 P’s are Greatly Exaggerated

With all of the excitement around Social Media and Digital Marketing, there have been a lot of articles and commentary lately about the demise of the 4P model.  For those that don’t know or need a refresher, the 4Ps stand for Product, Pricing, Placement, and Promotion.  A good, simple overview of the 4P model is presented on the NetMBA site.

Ogilvy and Mather published an article about the 4Ps being out and the 4Es being in.  While I agree with the value of the 4E model as an extension or addition to the 4Ps (I have this as required reading in the undergraduate digital marketing course that I teach), it does not replace the 4P model.

There have been numerous attempts to EXTEND the 4P model over time.  There have also been arguments about what constitutes the main Ps, with some arguing that People should be added (for customer segmentation).

Some argue that the 4Ps have been replaced by the 7Cs.  An alternative way of thinking about this is that the 4Ps approach the marketing mix from the vendor/producer viewpoint, while the 7Cs approach it from the customer/consumer standpoint.  I think this is another model that goes deeper and is complementary to the 4P model.

As a way of highlighting the continuing importance of the 4P model, I point to the recent issues in the marketplace around Microsoft’s Surface RT tablet launch as a way of showing how critical the 4Ps really are to successful marketing.  Regardless of your personal viewpoint (pro or con) on the Surface RT platform, the product has had a challenging launch and ramp by any objective measure, and the challenges track clearly back to three of the four Ps.

From a Product standpoint, the Surface RT has had strong positive reviews for its industrial design but has also had negative responses to its lack of available apps and inability to run legacy Windows applications.

From a Pricing standpoint, there has been strong pushback from reviewers on the initial pricing for the Surface RT.  While there have been pros and cons about the Surface RT pricing and price positioning (with and without keyboards), the fact that it is raised as a value concern in various reviews shows that pricing and price positioning are an important part of the marketing mix.

Finally, Placement has been a major issue, as the product was initially available only in Microsoft’s stores and online.  There was some commentary early on that this was a point product for Microsoft and would not be available through broad distribution, as a way to minimize competitive conflict with Windows 8 OEMs, but that has proven to be an issue for the product’s acceptance in the marketplace.

From a Promotion standpoint, Microsoft has done a good job of building Awareness.  Using the 4P model it is possible to see clearly the challenges that they have in driving Consideration and, most importantly, Conversion.

While there is always value in extending models and creating complementary models, the claims that the 4Ps are irrelevant are questionable at best.  Like good brands, good models stand the test of time.

Tableau Software – a great tool for Data Visualization and Analytics for Startups

There is a great deal of discussion about the value of analytics and big data management in the technology industry today.  Deloitte’s Center for the Edge has research called ‘The Big Shift’ that has looked at the changing world of global business.  Among the research findings are points suggesting that, with increasing globalization, lower barriers to entry, and the speed of communications, companies that can collect information, assess it, and act on it faster than the competition may have some fundamental advantages in the marketplace.

On the other hand, some believe that information analysis is an important way to optimize a business but is less important in the early-stage invention mode (I don’t agree with this at all).  In his blog post on KillerStartups, Bo Fishback, the CEO of Zaarly and previously the vice president of entrepreneurship for the Ewing Marion Kauffman Foundation, states that user testing and input trump data at the early stage of a business.

My view is that information and data are only useful if they can help provide insights.  Insights into what customers/prospects want (for finding gaps and opportunities), what customers/prospects do (for optimizing), and what patterns are not obvious from raw data (for both finding gaps AND optimizing) are very important and can be the difference between leading and losing.

One of the challenges I’ve run into is that analytics conjures up visions of statistics and complex mathematical equations.  Both math and statistics are important to analytics but, with a well-developed tool, those should be behind the scenes running the tool, not a required input to using it.  A good data analytics tool should allow anyone on a startup team to use it; it shouldn’t require a math major.

In my area of interest and research I come across a number of sites, services, and tools that may be interesting for startups.  One of them is Tableau Software.  Tableau is a data visualization tool that allows users to easily connect to data sources, from simple flat data files (.csv, .xls, .txt) to complex database systems (Hadoop, SQL Server, Oracle, etc.).  Tableau can analyze data while the data stays in the repository, or the data can be imported into Tableau for offline processing.
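For a sense of what Tableau automates, here is the kind of flat-file aggregation it performs the moment you drag a field onto the canvas, done by hand with Python’s standard library.  The file contents and column names are invented for the example; Tableau itself requires no code at all, which is exactly the point.

```python
import csv
import io
from collections import defaultdict

# A tiny in-memory stand-in for a flat .csv export (data is invented).
raw = io.StringIO(
    "region,units\n"
    "West,120\n"
    "East,80\n"
    "West,40\n"
    "South,60\n"
)

# Sum units per region -- the aggregation step that Tableau handles
# automatically behind its graphical management panel.
totals = defaultdict(int)
for row in csv.DictReader(raw):
    totals[row["region"]] += int(row["units"])

for region in sorted(totals):
    print(region, totals[region])
```

Writing even this much by hand, before any charting, is why a drag-and-drop tool that also picks sensible visualizations is such a time-saver for non-programmers.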

The main interface between the user and the data is a graphical management panel.  Based on the data labels selected, different visual representations are automatically presented to the user to select from.  These visual representations are active graphs that allow for deeper understanding of the underlying data.  Tasks like mapping data to geographic representations are quick and easy.

Sample Map

Why is Tableau interesting?  Four main reasons: 1) it has a graphical interface that is ridiculously simple to use (easier than Excel for graph generation!); 2) it comes with a large library of data connectors, and linking multiple databases is handled by Tableau – no complex SQL code to write; 3) it is fast and intuitive – this tool is appropriate for use across the organization, from a fresh marketing staffer to an experienced data analytics expert; and finally, 4) the price – it’s not expensive! (don’t tell them).

I was introduced to this tool during a Ph.D. course at Oklahoma State University, where we are using it for some data mining.  I’ve now used it extensively for other business analysis work, and I’ve found that it has both saved time and provided useful insights by visually representing the data and allowing manipulation of the visualization.

Tableau offers a trial download of the tool.

Organizational Justice and the Application for Managers

Unknown Artist. Source: Google Images

Last term in the Ph.D. program at Oklahoma State University we spent an amazing day with Dr. Deborah Rupp, the William C. Byham Chair of Industrial/Organizational Psychology at Purdue University.  Dr. Rupp is a researcher and expert in the area of Organizational Justice.  If you look at her curriculum vitae online, the body of work she has already produced is amazing.

When we read some of the academic papers in her area of expertise, the title ‘Organizational Justice’ really threw me.  Slightly switching the words to ‘Organizational Fairness’ helps clarify it a little, but it is still not obvious.  What Organizational Justice research addresses is fairness in the workplace, organizational ethics, and high-performance work systems.

What was most interesting about the readings and her area of research is that it provides a framework that any manager could (should?) use in processing and communicating decisions that may have an impact on organizational emotions, morale, and/or commitment levels.  Internally these are decisions such as organization changes, promotions, raises, project assignments, etc.  Externally these are decisions such as partnership choices or business interactions.

Using the framework may provide two benefits: 1) it allows the manager to ensure that the decision is appropriate and fair before it is communicated, and 2) it clarifies for all parties affected the critical contributors to the decision.  In short, it’s a way to ensure that tough, potentially challenging decisions are fairly made and communicated.

So, what is the framework?  It’s made up of three specific components.  For the sake of translating this to business application I have generalized the descriptions – more detail, and the empirical data that supports these three, is available for anyone who wants to churn through the academic articles:

1) Distributive Justice – is the decision appropriate and fair with regard to its outcomes and the distribution of resources?  The outcomes or resources may be tangible (e.g., promotions, pay) as well as intangible (e.g., social recognition, praise).  A manager needs to step back and ask, “Will the organization view this as a fair decision?”

2) Procedural Justice – was the process used to make the decision fair?  For decisions to be accepted as fairly arrived at, it is critical to ensure that the decision followed a process that the organization (internal or external) perceives as ‘fair’.  In many cases this may mean taking additional steps that the manager may think are unnecessary but that may be critical to ensuring a fair process.  For example, soliciting additional feedback on a potential promotion may be seen as unnecessary but may provide further evidence of the appropriateness (or inappropriateness) of the decision.

3) Interactional Justice – was the decision communicated fairly and with respect for all parties?  This component is made up of two related areas: interpersonal fairness and informational fairness.  To be precise, Interactional Justice states that decisions must take into account the interpersonal response and must be communicated in a way that is fair and clear.  For example, communicating a potential promotion to only part of a team would NOT be seen as fair – from both an interpersonal standpoint (those not in consideration will be upset) and an informational standpoint (those not in ‘the know’ will be upset).  Ensuring that decisions are communicated with an intent to be fair and sensitive to the personal impacts on individuals may positively impact the organizational perception of fairness.

This framework provides an excellent tool for thinking through decisions that may impact perceptions of fairness.  It may also provide a checklist to ensure that decisions with difficult potential outcomes are well thought out.

Dr. Rupp’s work is incredibly important for managers and executives.  Many managers are able to make good decisions but may not be good at communicating and implementing them.  How many times are ‘correct’ decisions negatively impacted by poor implementation or communication of the decisions?  Her work provides useful, implementable concepts that should improve decision implementation in organizations.

Rupp, D. E., Baldwin, A., & Bashshur, M. (2006).  Using developmental assessment centers to foster workplace fairness.  The Psychologist-Manager Journal, 9(2), 145–170.

Rupp, D. E. (2010).  An employee-centered model of organizational justice and social responsibility.  Organizational Psychology Review, 1(1), 72–94.

Information Management in High Performance Learning Organizations

In 2009 the research team at the Deloitte Center for the Edge, led by John Hagel III, introduced a concept they titled ‘The Big Shift’.  Their research showed that competition was changing rapidly on a global basis and that the current way of doing business was fundamentally broken.  They showed that return on assets (ROA) for public companies had declined by 75% since 1965 [1].  They stated that The Big Shift “represents the convergence of long-term trends, playing out over decades, that are fundamentally reshaping the business landscape.”  A major trend reshaping the competitive landscape involves two powerful changes in the business world: digital infrastructure and knowledge flows.

The digital infrastructure is an interesting, well-known phenomenon, and I may cover it in a future blog post.  The more fundamental and critical shift is the change in knowledge flows.  Companies that can acquire information, interpret it, and take action faster than the competition may have significant competitive advantages.  In the new paradigm, knowledge is an asset that should be nurtured, managed, and protected just like inventory, distribution networks, or cash.

What does this really mean from a tactical standpoint?  There is interesting research that breaks down information processing in a business context in a very meaningful way and provides a hierarchy of information management that can enhance knowledge creation and management in an organization (Sinkula, 1994) [2].

The research breaks down information processing into four key stages:

Information Acquisition – how and where does the organization get the necessary information;

Information Distribution – sharing information appropriately and in a timely way to facilitate action;

Information Interpretation (and resultant actions) – reviewing the information, drawing appropriate conclusions, and acting on the conclusions;

Organizational Memory – storing the information and the processes around the acquisition, distribution, and interpretation to enable consistent refreshing over time.
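The four stages above can be sketched as a minimal loop in code.  Every function name, message, and data structure here is my own illustration of the hierarchy, not part of the cited research.

```python
# A minimal sketch of the four-stage information-processing loop.
# All names and data are illustrative, not from Sinkula (1994).

memory = []  # Organizational Memory: what was learned, and from what


def acquire():
    # Information Acquisition: pull a raw signal from some source.
    return {"source": "customer survey", "signal": "setup is confusing"}


def distribute(info, teams):
    # Information Distribution: share the same facts with every team
    # so action can be coordinated and timely.
    return {team: info for team in teams}


def interpret(shared):
    # Information Interpretation: draw a conclusion and pick an action.
    return {"conclusion": "onboarding needs work",
            "action": "revise setup guide"}


def remember(info, decision):
    # Organizational Memory: store both the input and the decision so
    # the cycle can be refreshed consistently over time.
    memory.append({"input": info, "decision": decision})


info = acquire()
shared = distribute(info, ["marketing", "product"])
decision = interpret(shared)
remember(info, decision)
print(len(memory), decision["action"])
```

The point of the sketch is the ordering: interpretation without distribution starves most of the organization, and action without memory forces every cycle to start from scratch.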

These four steps are interconnected and absolutely critical in driving time-sensitive decisions that are fact-based, effective, and optimized for success.

If you subscribe to the Deloitte theory that Knowledge Flow is an asset to an organization then becoming world class at the first three stages is critical for competitive success.  The Organizational Memory component is critical for long-term value creation.

Each of these four stages is worthy of its own overview, and over the next few weeks I will be posting more about each of these areas.  I’d be interested in any questions or feedback on the above.

  1. Hagel, J., Seely Brown, J., and Davidson, Shaping Strategy in a World of Constant Disruption, Harvard Business Review, Oct. 2008.
  2. Sinkula, J. M., Market Information Processing and Organizational Learning, Journal of Marketing, Vol. 58, January 1994, pp. 35–45.


Do traditional marketing frameworks still apply?

In the graduate marketing management course that I teach at St. Edward’s University I spend a significant amount of time presenting the traditional marketing frameworks.  The Four Ps (Product, Pricing, Placement, Promotion), SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats), and Michael Porter’s Five Forces analysis (Rivalry among Similar Competitors, Threats from New Entrants, Threats from Substitutes, Supplier Power, and Buyer Power) are some of the frameworks that we talk about and debate.

These are traditional ways of analyzing a business, product, service, or market area.  They have existed for quite a long time – certainly since before the Internet had its incredibly disruptive and seismic impact on traditional marketing delivery methods such as print advertising, radio, PR, and events.

Additionally, I spend a considerable amount of time talking about market communication goals and execution using the traditional framework of Awareness, Consideration, and Conversion.  These are also very established focus areas for marketing execution, set back when ad agencies were defining metrics and goals for the expected performance of different types of marketing communications.

In reviewing these with the students, the question was asked: ‘How relevant are these frameworks given the growth of social media, social marketing, and the changing vehicles/power of traditional communications vehicles such as newspapers, magazines, and events/tradeshows?’

My answer is that these frameworks are more important than ever before!  Frameworks exist to help bring order out of chaos.  The more convoluted and disparate the communications vehicles and methods become, and the more proliferation there is of user-driven communications and marketing (see Yelp for an example), the more the traditional frameworks can be used to help organize marketing efforts and needs.  If you are a restaurateur who seems to be losing ground because your competitors are using LivingSocial, Groupon, Yelp, etc., then using tools such as a SWOT analysis or a Five Forces analysis may help you organize the appropriate market response.

Using the Awareness, Consideration, and Conversion model can help you determine the most appropriate and cost-effective way of marketing the response.  Maybe offering a promotion through LivingSocial IS the right way to get people to try your products (Conversion); maybe getting a local food reviewer to review the product and posting it on social media sites is a good approach (Consideration); or maybe sponsoring a local charity will raise the visibility of your business (Awareness).  Understanding this model can sharpen your market response and focus the dollars you spend.
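The Awareness → Consideration → Conversion model also reduces to simple arithmetic when comparing where to spend.  The numbers below are invented for illustration; the point is only the shape of the calculation.

```python
# Illustrative funnel math for Awareness -> Consideration -> Conversion.
# All figures are hypothetical examples, not real campaign data.
funnel = {"aware": 10_000, "considering": 1_200, "converted": 150}

# Stage-to-stage rates show where the funnel leaks.
consideration_rate = funnel["considering"] / funnel["aware"]   # 0.12
conversion_rate = funnel["converted"] / funnel["considering"]  # 0.125

# If a hypothetical $500 LivingSocial promotion adds 40 conversions,
# its cost per conversion is directly comparable to other tactics.
cost_per_conversion = 500 / 40  # $12.50

print(consideration_rate, conversion_rate, cost_per_conversion)
```

Working out each tactic’s cost per stage it targets (Awareness, Consideration, or Conversion) is what lets the model focus the dollars you spend.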

In this time of social media proliferation, shifting advertising and marketing vehicle power, and constant solicitation of the marketing dollar, it is MORE incumbent on marketing organizations to rely on the traditional frameworks.  Rarely have they had as much relevance as they do now in this state of marketing chaos.

Delivering Communications Excellence

In this global marketplace where parties can be separated by distance, communication is a critical capability for success.  In the past, basic interpretation tools – seeing the other person’s expression or hearing the inflection in their voice – helped clarify misunderstandings or mistakes; today, the dependence on email, texting, and even video chatting has exposed significant issues for those who haven’t mastered communications skills.  Similar to developing new processes, communications skills require good execution before relying on technology to streamline the activity.  Failure to master the rules of great communications runs the risk of technology magnifying the shortcomings in a person’s communications.  While it sounds esoteric, there really is a method for ensuring strong communications skills.  Here are three basic rules for communications excellence:

1) Communicate frequently and regularly.  Consistency gets people comfortable with the tone, inflections, and how information is structured and processed.  Regular communication creates a psychological conversation between the sender and the receiver that makes it easier over time to exchange ideas without having to second-guess the other person’s message.

2) Focus on quality of message.  Learning to communicate clearly in 50 words what takes many people 500 words is an important skill.  Shorter, high-quality messages tend to be received and remembered better than long, wordy communications that ramble.  Shorter, impactful messages are also better received when communicating frequently and regularly.

3) Ensure that the message is dependable.  It only takes one or two material mistakes for credibility to be destroyed.  It’s important not only to be confident and correct in the message but, in many cases, to defend the message with citations.  One of the reasons that news articles and research papers cite sources is to establish credibility.  This focus on delivering dependable messages accomplishes two things: 1) it makes the sender do the appropriate discovery work to ensure that the communication is correct, and 2) it establishes with the receiver that the sender has tried to ensure that the message is correct and gives them a point of reference for its dependability.  For example, conveying a request for a feature in a new product has much more weight if the request is backed up with specific customer feedback.  The more reliable the messages, the more trusted the messenger becomes.

You may wonder what this has to do with mobile computing devices and tablets.  In some ways, it is exactly what mobile phones and tablets excel at – being tools for communications excellence.  Tablets bring multiple communications tools together in one device: email, text, and chat.  Many tablets provide additional tools: video chatting (Skype and iChat), cameras, voice, and video.

These tools, in conjunction with wireless capabilities, provide a great vehicle for communications excellence.  Within a tablet’s messaging tools there is the ability to communicate frequently and regularly.  The cameras, video, and data capture capabilities help focus on quality of message, as well as allowing for the capture of information that supports the dependability of the message.

Tablets enable a new level of communications, whether through form-based applications that capture information and transmit it in a structured way, or as messaging platforms that help automate the basis of communications excellence: frequency, quality, and dependability.