Organizational Justice and the Application for Managers


Last term in the Ph.D. program at Oklahoma State University we spent an amazing day with Dr. Deborah Rupp, the William C. Byham Chair of Industrial/Organizational Psychology at Purdue University.  Dr. Rupp is a researcher and expert in the area of Organizational Justice.  A look at her Curriculum Vitae online reveals the amazing body of work that she has already produced.

When we read some of the academic papers in her area of expertise, the term ‘Organizational Justice’ really threw me.  Switching the words to ‘Organizational Fairness’ clarifies it a little, but the meaning is still not obvious.  Organizational Justice research addresses fairness in the workplace, organizational ethics, and high-performance work systems.

What was most interesting about the readings and her area of research is that they provide a framework that any manager could (should?) use in processing and communicating decisions that may have an impact on organizational emotions, morale, and/or commitment levels.  Internally these are decisions such as organizational changes, promotions, raises, and project assignments.  Externally these are decisions such as partnership choices or business interactions.

Using the framework may provide two benefits: 1) it allows the manager to ensure that the decision is appropriate and fair before it is communicated, and 2) it clarifies for all affected parties the critical contributors to the decision.  In short, it’s a way to ensure that tough, potentially challenging decisions are fairly made and communicated.

So, what is the framework?  It’s made up of three specific components.  For the sake of translating this to business application I have generalized the descriptions – more detail, and the empirical data supporting these three components, is available for anyone who wants to churn through the academic articles:

1) Distributive Justice – Is the decision appropriate and fair with regard to its outcomes and the distribution of resources?  The outcomes or resources may be tangible (e.g., promotions, pay) as well as intangible (e.g., social recognition, praise).  A manager needs to step back and ask, “Will the organization view this as a fair decision?”

2) Procedural Justice – Was the process used to make the decision fair?  For a decision to be accepted as fairly arrived at, it is critical that it followed a process that the organization (internal or external) perceives as ‘fair’.  In many cases this may mean taking additional steps that the manager thinks are unnecessary but that are critical to ensuring a fair process.  For example, soliciting additional feedback on a potential promotion may seem unnecessary but may provide further evidence of the appropriateness (or inappropriateness) of the decision.

3) Interactional Justice – Was the decision communicated fairly and with respect for all parties?  This component is made up of two related areas: interpersonal fairness and informational fairness.  To be precise, Interactional Justice states that decisions must take into account the interpersonal response and must be communicated in a way that is fair and clear.  For example, communicating a potential promotion to only part of a team would NOT be seen as fair – from both an interpersonal standpoint (those not in consideration will be upset) and an informational standpoint (those not ‘in the know’ will be upset).  Ensuring that decisions are communicated with an intent to be fair and sensitive to their personal impacts on individuals may positively affect the organizational perception of fairness.

This framework provides an excellent tool for thinking through decisions that may affect perceptions of fairness.  It can also serve as a checklist to ensure that decisions with difficult potential outcomes are well thought out.

Dr. Rupp’s work is incredibly important for managers and executives.  Many managers are able to make good decisions but may not be good at communicating and implementing them.  How many times are ‘correct’ decisions negatively impacted by poor implementation or communication of the decisions?  Her work provides useful, implementable concepts that should improve decision implementation in organizations.

Rupp, D. E., Baldwin, A., & Bashshur, M. (2006). Using developmental assessment centers to foster workplace fairness. The Psychologist-Manager Journal, 9(2), 145–170.

Rupp, D. E. (2010). An employee-centered model of organizational justice and social responsibility. Organizational Psychology Review, 1(1), 72–94.

Information Management in High Performance Learning Organizations

In 2009 the research team at the Deloitte Center for the Edge, led by John Hagel III, introduced a concept they titled “The Big Shift”.  Their research showed that competition was changing rapidly on a global basis and that the current way of doing business was fundamentally broken.  They showed that return on assets (ROA) for public companies had declined by 75% since 1965 [1].  They stated that The Big Shift “represents the convergence of long-term trends, playing out over decades, that are fundamentally reshaping the business landscape.”  A major trend reshaping the competitive landscape centers on two powerful changes in the business world: digital infrastructure and knowledge flows.

The digital infrastructure is an interesting, well-known phenomenon and I may cover it in a future blog post.  The more fundamental and critical shift is the change in knowledge flows.  Companies that can acquire information, interpret it, and take action faster than the competition may have significant competitive advantages.  In the new paradigm, knowledge is an asset that should be nurtured, managed, and protected just like inventory, distribution networks, or cash.

What does this really mean from a tactical standpoint?  There is interesting research that breaks down information processing in a business context in a very meaningful way and provides a hierarchy of information management that can enhance knowledge creation and management in an organization (Sinkula, 1994) [2].

The research breaks down the information processing to four key stages:

Information Acquisition – how and where does the organization get the necessary information;

Information Distribution – sharing information appropriately and in a timely way to facilitate action;

Information Interpretation (and resultant actions) – reviewing the information, drawing appropriate conclusions, and acting on the conclusions;

Organizational Memory – storing the information and the processes around the acquisition, distribution, and interpretation to enable consistent refreshing over time.

These four steps are interconnected and absolutely critical in driving time-sensitive decisions that are fact-based, effective, and optimized for success.

If you subscribe to the Deloitte theory that Knowledge Flow is an asset to an organization then becoming world class at the first three stages is critical for competitive success.  The Organizational Memory component is critical for long-term value creation.

Each of these four stages are worthy of their own overviews and over the next few weeks I will be posting more about each of these areas.  I’d be interested in any questions or feedback on the above.

  1. Hagel, J., Seely Brown, J., & Davison, L., Shaping Strategy in a World of Constant Disruption, Harvard Business Review, Oct. 2008.
  2. Sinkula, J. M., Market Information Processing and Organizational Learning, Journal of Marketing, Vol. 58, January 1994, pp. 35–45.


Do traditional marketing frameworks still apply?

In the graduate marketing management course that I teach at St. Edward’s University I spend a significant amount of time presenting the traditional marketing frameworks.  The Four P’s (Product, Pricing, Placement, Promotion), SWOT analysis (Strengths, Weaknesses, Opportunities, and Threats), and Michael Porter’s Five Forces analysis (Rivalry among Existing Competitors, Threat of New Entrants, Threat of Substitutes, Supplier Power, and Buyer Power) are some of the frameworks that we talk about and debate.


These are traditional ways of analyzing a business, product, service, or market area.  They have existed for quite a long time – certainly since before the Internet had its incredibly disruptive and seismic impact on traditional market delivery methods such as print advertising, radio, PR, and events.

Additionally, I spend a considerable amount of time talking about market communication goals and execution using the traditional framework of Awareness, Consideration, and Conversion.  These too are long-established focus areas for marketing execution, dating back to when ad agencies were setting metrics and goals for the expected performance of different types of marketing communications.

In reviewing these with the students, the question was asked: ‘How relevant are these frameworks given the growth of social media, social marketing, and the changing power of traditional communications vehicles such as newspapers, magazines, and events/tradeshows?’

My answer is that these frameworks are more important than ever before!  Frameworks exist to help bring order out of chaos.  The more convoluted and disparate the communications vehicles and methods become, and the more user-driven communications and marketing proliferate (see Yelp for an example), the more the traditional frameworks can be used to help organize marketing efforts and needs.  If you are a restaurateur who seems to be losing ground because competitors are using LivingSocial, Groupon, Yelp, etc., then tools such as a SWOT analysis or a Five Forces analysis may help you organize the appropriate market response.

Using the Awareness, Consideration, and Conversion model can help you determine the most appropriate and cost-effective way of marketing the response.  Maybe offering a promotion through LivingSocial IS the right way to get people to try your products (Conversion), maybe getting a local food reviewer to review the product and posting it on social media sites may be a good approach (Consideration), or maybe sponsoring a local charity will raise the visibility of your business (Awareness).  Understanding this model can sharpen your market response and focus the dollars that you spend.

In this time of social media proliferation, shifting advertising and marketing vehicle power, and constant solicitation of the marketing dollar, it is MORE incumbent on marketing organizations to rely on the traditional frameworks.  Rarely have they had as much relevance as they do now in this state of marketing chaos.

Delivering Communications Excellence

In this global marketplace where parties can be separated by distance, communication is a critical capability for success.  In the past, basic interpretation cues such as seeing the other person’s expression or hearing the inflection in their voice would help clarify misunderstandings or mistakes; the dependence on email, texting, and even video chatting has exposed significant issues for those who haven’t mastered communications skills.  Similar to developing new processes, communications skills require good execution before relying on technology to streamline the activity.  Failure to master the rules of great communications runs the risk of technology magnifying the shortcomings in a person’s communications.  While it sounds esoteric, there really is a method for ensuring strong communications skills.  Here are three basic rules for communications excellence:

1) Communicate frequently and regularly.  Consistency gets people comfortable with the tone, inflections, and how information is structured and processed.  Regular communication creates a psychological conversation between the sender and the other person that makes it easier over time to exchange ideas without having to second-guess the other person’s message.

2) Focus on quality of message.  Learning to communicate clearly in 50 words what takes many people 500 words is an important skill.  Shorter, high-quality messages tend to be received and remembered better than long, wordy communications that ramble.  Shorter, impactful messages are also better received when communicating frequently and regularly.

3) Ensure that the message is dependable.  It only takes one or two material mistakes for credibility to be destroyed.  It’s important not only to be confident and correct in the message but, in many cases, to defend the message with citations.  One of the reasons that news articles and research papers cite sources is to establish credibility.  This focus on delivering dependable messages accomplishes two things: 1) it makes the sender do the appropriate discovery work to ensure that the communication is correct, and 2) it establishes with the receiver that the sender has tried to ensure that the message is correct and gives them a point of reference for its dependability.  For example, a request for a new product feature carries much more weight if it is backed up with specific customer feedback.  The more reliable the messages are, the more trusted the messenger becomes.

You may wonder what this has to do with mobile computing devices and tablets.  In some ways, it is exactly what mobile phones and tablets excel at – being tools for communications excellence.  Tablets bring multiple communications tools together in one device: email, text, and chat.  Many tablets provide additional tools: video chatting (Skype and iChat), cameras, voice, and video.

These tools, in conjunction with wireless capabilities, provide a great vehicle for communications excellence.  The tablets’ messaging tools make it possible to communicate frequently and regularly.  The cameras, video, and data capture capabilities help with quality of message as well as allowing for the capture of information that supports the dependability of the message.

Tablets enable a new level of communications, whether through form-based applications that capture information and transmit it in a structured way, or as messaging platforms that help automate the basis of communications excellence: frequency, quality, and dependability.

The App Store – Why is everyone doing an App Store?

Microsoft and Apple are engaged in a fight over a seemingly ludicrous issue: the name ‘App Store’.  It seems superfluous.  EVERYONE is doing an app store.  Intel launched a site called Intel AppUp, which offers software for purchase for Netbook and Atom-based systems, Google has its Market for Android, and Verizon has its Marketplace.  Even we, at Motion, have launched a software store for purchasing and downloading applications appropriate to our customers.  This is actually the start of another major shift in the computing ecosystem: the transition from packaged apps in brick-and-mortar stores to apps and applets available for immediate download and use.  A new measuring stick of a platform’s success seems to be how many apps are available and how easy they are to get and use.

Apple has been fighting the app battle with Google by pointing out, initially, how many apps it has for its platforms.  Now that the Android Market is catching up with (and maybe passing?) Apple, the new argument is about compatibility issues between apps on different Android platforms.  Watch for discussions about app incompatibility on Android.  It’s the new rallying cry in favor of a closed system (iOS) and a highly PRIVATELY regulated market – Apple’s App Store.

App stores have only recently become practical for two major reasons: 1) the bandwidth of wide-area networking, both wired and wireless, is now sufficient to support the downloading of both rich and small apps, and 2) customers desire location-independent acquisition and delivery of applications.  If a customer had to go to an Apple store to buy Pandora, for example, not a lot of users would have Pandora.  Similarly, if a user had to go to Verizon to download a new mail client, there would be little uptake of mail clients.  With the bandwidth available and the huge market size of mobile devices, location-independent app delivery is now a critical component for any platform vendor.

Every piece of the mobile technology value chain is impacted by the success (or failure) of the transition to downloadable applications.  Wireless providers attract more users if they have devices with desirable applications, device vendors sell more units if they have desirable applications, component vendors sell more components if the devices that they provide parts for are on the winning side of the device demand curve, etc.  Apps are a critical piece of the user demand puzzle.

Which raises the ultimate question – what happens to applications like Microsoft Office that are too massive for location-independent delivery?

Comments?

The Mobile Information Shift

It seems like every day there is news about mobile computing and communications that confuses and confounds.  The mobile environment has so many different facets: the communications networks and their voice and data architectures (3G/4G/LTE/WiMax/etc.), device manufacturers creating devices that are hard to distinguish (Android tablets, anyone?), and software offerings that seem to be all over the map (applets, Software as a Service, Cloud-based, etc.).  What does it all mean, and how does anyone make sense of all this noise?

In researching a speech that I gave on the future of mobile computing, I realized that there are three fundamental ‘pillars’ driving the mobile computing and communications world, and they are dependent on each other and inter-related in many, many ways.  These pillars have one added dimension that affects them: the geographic and international differences that exist for each.  For example, wireless network capabilities are at different stages of readiness in different parts of the world.  The number of internet-enabled users in Asia far exceeds the North American internet-enabled user base, and this has a dramatic impact on application targeting.

The three key ‘pillars’ are Wireless Networks (both in-building and wide-area), Application Delivery (SaaS, Cloud, local, etc.), and Mobile Devices (tablets, smartphones, netbooks, ultralight notebooks).  Each of these is directly impacted by geographic differences – either by the state of the technology environment in those geographies or by the sheer number of users and potential users there.

My intent with this blog area is to assess the impact of new announcements through the framework of the three pillars and their geographic ramifications.  I hope that you find the posts interesting.

The shift from Number Two to Number One

With Steve Jobs’s recent medical leave underway there has been a lot of focus on Tim Cook, Apple’s number two person for many years.  While Tim has stepped in several times in the past to run the ship while Steve was out for his medical treatments, this time it appears to have hit home to the reporters and analysts that Mr. Cook may actually need to step into the number one spot on a more permanent basis.  The industry pundits appear concerned.

There is really something to this Number Two to Number One thing.  It IS different.  I don’t believe it’s about vision or strategy, though.  I believe it’s about the span of control and the intense spotlight that the number one person has to contend with.  What few people seem to be aware of is that companies with strong management teams at the top do well because of a strong number one AND a strong number two.  Rarely is there a number three with the same breadth of organizational control (it would just be a top-heavy vertical organization if there were).

A number of number twos have taken over the top spot – Craig Barrett at Intel is one example, and I would add executives like Kevin Rollins at Dell (you can argue about how those turned out, but for several years they were very successful).  Tim Cook will be just another in a line of number twos that end up in the top spot.

I have some empathy for his current situation because I, like him, was a ‘perennial number two’ in the companies that I worked for and even the company that I co-founded.  I would tell people that I liked being number two and they would snicker.  As much of a control freak as I am, people really just didn’t believe that I didn’t want to be the top person and assumed it was false modesty or some game that I was playing to not look too ambitious to the number one person.  It wasn’t a game.  There is a definite role for a strong number two that enhances the game of a strong number one, and it was a role that I cherished.  (By the way, being a strong number two to a weak number one doesn’t go well and doesn’t last long.)  I’ve spent the last two years going through my own transition from COO to CEO – even though it’s on a very different scale – and there are some things that I’ve observed about the transition that may be worth watching here.

There are several advantages to being number two: 1) there tends to be less internal focus on number two, and it’s easier to get the facts and the pulse of what’s going on (not a lot of yes-men to number twos!), 2) externally there is less pressure, as industry analysts, press, etc. tend to focus on number one (anyone know who is number two at Microsoft, GE, or IBM?), and 3) the role of number two tends to be an execution role, so there is a lot of opportunity to roll up the sleeves, dig into meaty problems, and work them to resolution – without people being intimidated by the top guy being involved.  If you look at Tim Cook’s history, those are roles that he appears to have played very, very well.  It doesn’t diminish Steve Jobs’s accomplishments in any way, or Tim Cook’s.  Together they appear to have been good for the company.

Stepping out into the top role is a massive change.  Tim Cook will need to be able to deal with the focus.  It can be intense and unnerving for someone who is not an externally driven person.  He will need to find a number two that complements him.  Having two number twos at the top, or only a number one, also doesn’t work very well (see Compaq post-Canion, or Microsoft now with the loss of the Gates/Ballmer team, as examples).

Tim Cook has been a real success as Apple’s number two.  The role that he will be taking on is much different from what he has been doing.  The analysts are right to watch this closely, but I don’t believe it’s about ‘Vision’.  I believe it’s about the significant personal shift from being Number Two to Number One.

Let me know what you think.

What’s really driving the Nokia deal with Microsoft?

Stephen Elop, the CEO of Nokia, has been under a tremendous amount of pressure since he announced the partnership between Nokia and Microsoft last week.  Given the significant market share of the Symbian operating system, which is managed and maintained by Nokia, the confusion and anger are understandable, as Elop has not made clear the strategic necessity of the deal beyond the money that Microsoft is going to pay for the partnership.

Phone Operating System Market Share Q1'10

The anger inside Nokia is public and intense, with protests from the employees garnering worldwide attention.

Nokia’s main operating system for its phones is the Symbian platform.  For quite a while other phone manufacturers also used the Symbian OS for their phones.  Samsung and Sony Ericsson had good international offerings based on the Symbian OS.  Symbian was one of the first true ‘smartphone’ platforms.  It offered a software development kit (SDK) for developers, had a dedicated browser, and supported the Opera browser.

Unfortunately, two things have happened to change the mobile phone world dramatically in a very short time.  First, the iPhone introduced the concept of easy-to-build, quick-to-deploy applications to the phone space, with Google following very quickly with Android, appealing to the same developer community.  Both Apple and Google have been fighting each other to attract apps to their platforms and have been very successful.

There are really five potential global operating systems available for the new generation of smartphones for both device manufacturers and application developers.  Those operating systems are iOS for iPhone (Apple devices exclusively), Palm WebOS (now part of HP), Android, Symbian, and now, Windows Phone 7.

The other two major Symbian licensees have shifted to other platforms, with Sony Ericsson moving to Android and continuing to invest in that platform, and Samsung supporting both Windows Phone 7 and Android.  This leaves Symbian in third place, and likely WebOS a distant fourth, in terms of attracting developer interest and support.

With Nokia carrying the flag for Symbian on its own, the likelihood of long-term success looks challenging.  Elop made a clear decision that, while a huge risk, marries one of the world’s largest handset vendors with an industry leader’s emerging smartphone platform.  This gives Nokia considerable clout with Microsoft in the development and evolution of the phone platform and gives Microsoft a leg up in quickly building market share against Apple and Android.  It should also allow Nokia to use its formidable software resources to extend and develop solutions based on Windows Phone 7.

In a nutshell, this is about a significant change in the Phone Operating System wars.  It’s no longer about user interface and browser speeds.  It’s about the applications available to the users.  The market is starting to turn toward the platforms with the most applications relevant to users and the most device choices.  Given that, it’s clear why Nokia’s executive team is making this bet.  And it makes you think about the longer-term risk to Apple and HP’s WebOS.

Back online and writing again

OK, so I’ve been a little busy for the past two and a half years.  You’ll notice that my last post was in June of 2008.  That’s because in July of 2008 I was asked by the company’s board of directors to return to the company that I founded in 2001 (Motion Computing).  The gig was supposed to be a short-term (up to six months) effort to jump back into the company to help the CEO and executive team.  The company had been losing some of the momentum of its early years and was facing numerous challenges in 2008.

For those of you who may have followed me before I started the blog and created iTaggit.com, I went back to grad school in the fall of 2005 at the Kellogg School of Management at Northwestern University, and I left Motion in early 2006 to focus on school and do another startup.  I stayed on the board for about a year and then left the company for good.  I graduated from Kellogg in May of 2007.  During my final year at school I had little to no contact with Motion and was focused primarily on working with my co-founders on iTaggit.com.

Well, my six-month gig was more than I anticipated.  In January of 2009 the board moved me into the role of CEO, and so began the journey that continues today.  After two years in the role we have made significant progress, and I have been able to apply much of what I learned at Kellogg and gain a new appreciation for some of the changes and challenges facing executives in the technology business.

In the area where our business is centered, the mobile technology arena, there are four major, related areas of change evolving at a rapid pace.  Networking (broadband, Wireless WAN, and Wireless LAN), application delivery, mobile (and non-mobile but new) computing devices, and international expansion are all driving massive changes in the markets.

This blog will tackle subjects that I’m seeing in the industry – trends, challenges, and business learnings around the technology world.  I’m hopeful that the postings will be of interest and value to readers.