
Electronic Health Records: Where they should be Going but aren’t

January 9, 2015

Over the past few years, largely because of the Affordable Care Act, the adoption of Electronic Health Record (EHR) systems in the USA has grown dramatically. Because of that rapid climb, EHR vendors have been trousering some pretty large amounts of revenue, billions of dollars, in fact. This is not a bad thing per se, but as Congress suddenly realized this year, all that cash didn’t translate into the giant leaps in innovation it predicted. Some of this is the result of a captive market, some is due to psychosocial artifacts of clinicians, and some comes down to the fact that markets aren’t necessarily innovative.

One of the ways in which one can see the lack of innovation, or even basic maturity, is the degree to which clinicians have to type the same data over and over in different electronic forms. Not only do the EHR systems not interoperate very well between vendors, some don’t even interoperate with themselves! So it is a common sight to see a nurse type in records from a sheet of paper, then if they are lucky, copy and paste them into another form. If they are unlucky, they get to retype the same data multiple times in different EHR screens. If they are doubly unlucky, the system is also somewhat fragile, which isn’t unusual, and it aborts the session before the data is saved. In that case, they get to retype it all again when the system comes back to life. Sometimes this happens several times a day – in one case that I encountered, the clinician had to try fourteen times before the system recorded the data!

This is obviously a pretty abominable situation, and to get even the most basic degree of workflow into this is going to take a lot of effort and money. Luckily, the EHR vendors are flush and positively glowing pink with all that Meaningful Use cash in their fists.

The Goal

What I want to see isn’t beyond current technology or in the realm of science fiction, and it isn’t even where we ultimately want to be, but it shows where the thinking needs to head (in my opinion, that is).

What I want to see is the removal of the human from any data capture that doesn’t actually require their expertise.
Not really a big ask, given that we can put intelligence in spectacles and the average smartphone has more brains than it knows what to do with.

So let’s say a patient arrives for a consultation.

When they enter the waiting room, I want them to get a transponder sticker. These are dirt cheap, pretty reliable, and can be scanned without actual contact. At the reception desk, the clerk reads the sticker and associates it with the patient record. Now I can tally who left without being registered (elopement), how long it took (primary wait time), and at which stage of the encounter all the patients are (census).

When the patient is called, they are read leaving the waiting room, and again when they enter the examination room. The nurse or nurse practitioner scans them, and the patient record is already onscreen in the room when the nurse scans their ID on the workstation. Each vital sign collected goes directly into the patient record because the instruments are vaguely intelligent. Blood pressure, pulse-oximetry, weight, height, respirations, temperature, etc. are all directed from the device to the EHR simply by using them on the patient. These are all time-stamped, have the ID of who was using them, the ID of the device, and are shown as machine entries in the patient record.

Verbal notes can already be captured through speech recognition, but let’s say that the nurse actually has to enter this themselves. They don’t have to search for the patient record or the screen, those are already there, and they simply need to verify that the patient record is correct. (Although unless the patient swapped armbands with somebody, we are pretty sure who they are).

When the process has reached a certain point, the EHR can buzz the physician that the patient is close to ready. So no long wait while the nurse has to write things down or type in much, and no need for them to go find the physician.

A similar scenario unfolds when the physician enters: the room, patient, and physician are associated in an entry event because all three have transponder identities. Relevant patient data is already displayed when the physician scans their ID at the workstation to log in, and again, any use of instruments captures data. Listening to the patient’s lungs with an intelligent stethoscope can capture the sounds, timestamp them, and put them into the correct place in the patient’s record. Even more wonderful, if the patient has any electronic records pertinent to the encounter, these can be transmitted from a smartphone Personal Health Record (PHR) app.

The only part the physician plays in capturing data is when expertise is required or when the machines can’t (yet) do it themselves. There is no reason on earth why a scale, blood-pressure cuff, or pulse-oximetry device can’t transfer the data to the EHR itself. Only the most antiquarian of medical offices lack devices that display the data digitally; it’s just that we then typically ask a human to write it down or type it into the EHR manually. That is a bad use of resources, and it opens up opportunities to get it wrong.

With time stamped machine data, the practice can start monitoring movement and wait times, and would be enabled to make adjustments to their workflow to optimize patient flow, and reduce unnecessary steps or waits. Staffing rosters and equipment placement can be evidence based rather than rely on guesswork, and bottlenecks in the processes will be far more visible.
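As a sketch of what that time-stamped machine data makes possible, here is a toy example of deriving a primary wait time and a live census from transponder scan events. The event shape, location names, and patient IDs are entirely my own invention for illustration:

```python
from datetime import datetime

# Hypothetical scan events: (patient_id, location, timestamp), one per
# transponder read at a doorway or desk. All names are illustrative.
events = [
    ("p1", "waiting_room", datetime(2015, 1, 9, 9, 0)),
    ("p2", "waiting_room", datetime(2015, 1, 9, 9, 5)),
    ("p1", "exam_room",    datetime(2015, 1, 9, 9, 25)),
]

def primary_wait_minutes(events, patient_id):
    """Minutes between entering the waiting room and entering an exam room."""
    times = {loc: ts for pid, loc, ts in events if pid == patient_id}
    if "waiting_room" in times and "exam_room" in times:
        delta = times["exam_room"] - times["waiting_room"]
        return int(delta.total_seconds() // 60)
    return None  # still waiting, or left without being seen (elopement)

def census(events):
    """Last known location of each patient, from the most recent scans."""
    current = {}
    for pid, loc, ts in sorted(events, key=lambda e: e[2]):
        current[pid] = loc  # later scans overwrite earlier locations
    return current

print(primary_wait_minutes(events, "p1"))  # 25
print(census(events))  # {'p1': 'exam_room', 'p2': 'waiting_room'}
```

The same event stream, aggregated over weeks, is what would let a practice spot bottlenecks and staff accordingly.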

Conclusion

The basic theory is similar to industrial engineering – don’t ask a human to do something that the machine can do. Free up clinician time, reduce transcription errors, and allow the clinician to focus on where their expertise lies – not in being low-level data capture clerks.

We should be demanding that equipment manufacturers and EHR vendors get their act together, and stop making clinicians do their dirty work.

That’s my story, and I’m sticking to it!

KM in Healthcare – Focus for 2013

March 23, 2013

Since November 2012, I have been expanding my KM efforts in healthcare, and this blog will show that change in emphasis.

In 2011/2012, I was focused on KM in the electronic commodity aftermarket repair industry, and while this really was a very productive time and allowed me to develop some tools and methods, I felt that the healthcare industry was one that was undergoing a revolution, and that there was a significant part for KM to play.

Of course much of the KM applied in the electronics repair arena can be transported to healthcare; for example, the activity-based knowledge audit process (Loxton, 2013) published in the JKMR&P can be seamlessly adapted to the healthcare field.

Since November I have visited hospitals, interviewed a wide range of people in both clinical and administrative parts of hospitals, and I have been wading through a huge pile of information on a variety of technologies and areas in healthcare.

In addition to touring and talking and reading, for good measure I also made use of courses available through the Coursera MOOC platform.
“Health Informatics in the Cloud” by Dr. Mark Braunstein of Georgia Tech has been particularly helpful, but there really is an amazing amount of free, high-quality material online these days.

Healthcare is a very wide field, and I have been focusing firstly on hospitals and hospital systems, and more narrowly on the inpatient flow management part.

Flow Management involves some very interesting aspects of Lean, KM, and modeling, and includes (but is not limited to) Bed Management, ED Management, Utilization Management, Surgical Workflow & Quality Management, and Real-Time Location Management.

Some of the people I have met have included Emergency Department Nurses and physicians, Ward Administrators, Utilization Managers and Reviewers, Housekeeping staff, Admissions Clerks, and my absolute favorite, the Bed Czar.

A Bed Czar is described by the IHI as follows:

The centralized bed authority (or “bed czar”) is a person or location responsible for processing all admissions and transfers. Key responsibilities of the centralized bed authority include: active participation in daily bed meetings; convene AM bed huddles; oversee placement of admitted and transfer patients in beds; visit units to identify available beds with staff assigned to them and assess staff capacity to safely take additional admissions; communicate with units about placements and anticipated needs; and serve as a conduit for all physicians admitting patients. The centralized bed authority is most effective when it is incorporated into an overall system for managing real-time demand and capacity.

My next blog will have some specifics on flow management from a KM perspective, and I hope readers find it useful.

Bibliography

Loxton, M. H. (2013). A simplified integrated critical activity-based knowledge audit template. Knowl Manage Res Prac.

Why KM isn’t going away anytime soon

September 4, 2012

There have been a fair number of people in the blogosphere over the last few years who have trumpeted that KM is “Dead” – some of them mean it in an ironic way or simply as a provocative hook to get eyeballs on their blogs, some think the way we understand KM is changing and that the old ways are “dead”, and some actually believe that KM is a term best ceded to IT and that the next shiny thing beckons – be that complexity, agile, or something else.

The real acid test is whether there is an increase in the number of jobs that are either about KM or require some degree of expertise in it, and whether they mention KM activities or compliance with KM practices as an essential part of the job – but this is something I can’t answer just yet, since getting Monster, Indeed, etc. to pony up data on what KM jobs there have been over the past decades is not easy.

Until then, let’s look at three things that would individually drive a need for Knowledge Management.

  1. Business variance and volatility – i.e. “Turbulence”
  2. Increase in the share of a firm’s market capitalization due to Intangible Assets
  3. Demographics

My position is that the swell produced by each of these three market dynamics would individually create a need for Knowledge Management, but that collectively they make it an imperative – firms that do not get this right are in my view already dead men walking.

Turbulence

One of the markers I look for in firms to tell me whether Knowledge Management is likely to deliver ROI is the degree to which they are subject to variation and volatility. To get a metric on that I measure, inter alia, the following:

  • Change in regulations, laws, technologies, and players in their market space.
  • Product churn and variation
  • Staff turnover, skills variation, and performance variation.
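As a back-of-envelope illustration (my own sketch, not any economist's actual definition), markers like these could be rolled into a single score by expressing each as a normalized 0–1 rate and averaging. The field names and figures below are invented:

```python
# A crude, illustrative turbulence score: each marker is a rate scaled
# to the range 0 (placid) to 1 (churning), and the markers are averaged.
# Equal weighting and the marker names are my own assumptions.
def turbulence_score(markers):
    """Average of normalized volatility markers."""
    return sum(markers.values()) / len(markers)

sector_1990s = {
    "regulatory_change": 0.6,
    "product_churn": 0.7,
    "staff_turnover": 0.5,
}
sector_1950s = {
    "regulatory_change": 0.1,
    "product_churn": 0.1,
    "staff_turnover": 0.2,
}

print(turbulence_score(sector_1990s))  # 0.6
print(turbulence_score(sector_1990s) > turbulence_score(sector_1950s))  # True
```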

Many economists have a similar measure that they simply call “Turbulence”.
Here’s an interesting image that is typical of a measure of turbulence, one that keeps showing up wherever one looks. It is a measure of business change in terms of new startups, mergers & acquisitions, business closures, etc.

This example is focused on healthcare, but as I said, the same shape keeps showing up – very little turbulence in the decades prior to the 70’s, an explosion in the 90’s, and a small amount of calming through the aughts, but no sign of anything like a return to the stable days of the 50’s and 60’s.

(HBR, 2012)

This is the “new normal” of the business world, with turbulence of an order of magnitude higher than what was previously “normal” and a status quo in which turbulence is a constant companion.
The days of a person doing the same job for decades, or a firm staying in the same business, or ownership of the firm staying constant are gone and never to return – and consequently, the ability to acquire knowledge fast, to be able to use it effectively, and to be able to “manage” one’s knowledge assets both tacit and explicit are critical to survival both for individuals and for organizations.

Intangible Assets

As per the image from Savage (1996) there was a time when “wealth” pretty much meant owning land – Being a big landowner meant having status, position, power.
Then it shifted to access to labour and wealth meant being able to acquire, mobilize, and manage a workforce.
Then it was having access to capital to fund business operations.

… and now it means controlling knowledge.

(Savage, 1996)

Over the last century, the proportion of the market value of a publicly traded firm that an auditor could capture with the balance sheet in one hand and a pencil in the other has gone from over 90% at the start of the 1900’s to a low of under 20% in the 2010’s. The balance is made up mostly of “Intangible Capital”, and was often tossed into a bucket marked “goodwill”.

(Ocean-Tomo, 2010)

In fact, if you look at the data from Ocean Tomo, the proportion of the S&P 500’s market value attributable to intangibles has gone from just 17% in 1975 to 80% in 2010.

So picture this if you will – you are an investor, and you have a bag of cash and want to grow it by purchasing a firm that you believe stands ready to take advantage of new needs and to generate a tidy profit for you (or your backers). You send in the bean-counters, and they take stock of the firm, ticking off every line item on the balance sheet as they walk the premises – raw materials, buildings, plant and equipment, finished goods, cash in hand, etc. By the time they have met with your banker (who might also want to see the results) and have walked the floor, they could give you, circa 1910, an account that was close to 90% accurate as to the worth of the firm.

No doubt you would be happy with this statement of affairs and you could make the purchase with not too many sleepless nights.
Barring unforeseen circumstances, all should be well; the small amount that was unaccounted for and lies in the entry marked “goodwill” was merely icing, and if push came to shove you could just sell off the assets and still be in the black.

Fast forward to 2012 and your accountants return to you a balance sheet and inventory that reflect only 20% of the value of the firm, and they report that they think that maybe there is another 80% hidden in the “goodwill” line, but they aren’t sure. It may be 0%, it might be 90%, they just don’t know.
You spend days with your stomach churning, and if you represent investors, you fret over how you will explain this.

At this point investors, bankers, analysts, and increasingly shareholders, are simply not satisfied.
Leaving 80% of the value of a firm to guesswork simply is not acceptable, and they have various plans afoot to force firms to identify the value of their intangibles – ranging from the SHRM attempt to have value metrics for Human Capital, to more complex evaluations of the worth of a firm’s knowledge.

2012 saw some banks put dollar value against patents for the purposes of loan collateral, and who can forget all the patent auctions of 2011-2012, with more no doubt coming.
IC is no longer seen as a bit of icing; it is now the major part of the cake itself – more than 80%, in fact.

Demographic Change

We talk a lot about the “Baby Boomers” and their imminent retirement, but have you ever actually seen it?

Here is what a population pyramid for Germany looks like:

(Source US Census Bureau 2012)

What this means is that there are way fewer people in each age-group following those who are now at the peaks of their career, and the number of people entering the job market won’t be able to fill the spots as the groups above them shift up and the oldest shift out. The bulk of that 80% of value represented by IC lies in the skills, knowledge, and traits of the knowledge workers you employ – and generally the older ones are the most valuable to you. They know how, they know what and when, and most of all, they know why.

If you create a population pyramid for the Knowledge Workers in your firm, you might be in for a nasty shock (especially those of you with a need for highly-skilled practitioners such as engineers, planners, and managers) – you simply might not have enough people to replace the older skilled workers as they shift out of the job market, and you don’t have all that long to figure out what to do. In fact, in some firms it is already too late, they are simply going to go bust as their older and most experienced and qualified people retire.
The best such firms can do is plan for a somewhat orderly shutdown.

Knowledge Management

Let’s agree not to play “definition bingo” or go down the rabbit-hole of the myriad somewhat-overlapping definitions of “Knowledge Management”; suffice it to say that what we are trying to achieve is a clear picture of what the organization needs to know in order to execute its operational activities, to organize, regulate, and control that required knowledge, and to maintain levels of it sufficient to meet operational needs.
So if we were to lay out an ISO9000 diagram of all the operational processes necessary to achieve the organization’s tier-1 goals, and then determine for each activity in the flow what the person would need to know, we would arrive at a list of what knowledge was minimally necessary (and perhaps not even sufficient) to meet EBITDA and other requirements.
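As a toy illustration of that activity-to-knowledge mapping (my own sketch, not the published audit template; all activity and knowledge names are hypothetical):

```python
# Map each operational activity to the knowledge it requires, then take
# the union to get the minimally necessary knowledge list, and compare
# it with what the organization currently holds. Illustration only.
process_map = {
    "triage_patient": {"vital-sign protocols", "acuity scoring"},
    "admit_patient": {"bed-management system", "insurance verification"},
    "discharge_patient": {"discharge criteria", "bed-management system"},
}

def required_knowledge(process_map):
    """Union of knowledge items across all activities."""
    needed = set()
    for items in process_map.values():
        needed |= items
    return needed

def knowledge_gaps(process_map, held):
    """Items the process needs but the organization doesn't yet hold."""
    return required_knowledge(process_map) - held

held = {"vital-sign protocols", "bed-management system"}
print(sorted(knowledge_gaps(process_map, held)))
# ['acuity scoring', 'discharge criteria', 'insurance verification']
```

The gap list is what a KM plan would then have to acquire, maintain, or train for.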

The terrain in which KM practitioners operate is for the most part that of Intangible Capital, as depicted below.

(Adams & Oleksak, 2010)

The role of the person(s) responsible for Knowledge Management in the organization would be to see that the needs for knowledge were identified, to identify and measure the degree to which these were met, and have a plan and processes to make sure that the organization acquired, maintained, and put to work that knowledge in the most cost-effective and timely manner possible.

This is not going to be the IT guy any more than the IT guy is responsible for running the finances of the organization.

Conclusion

There has never been another time during which control over knowledge assets has been more important, and firms that do not have a robust knowledge management practice humming along will experience very high rates of failure as we track forward.
Far from being “dead”, knowledge management is going to be a significant determinant of which firms survive, and which roll over and sink as the combined effect of turbulence, the value of IC, and demographic change swells up around them.

References

Adams, M., & Oleksak, M. (2010). Intangible Capital: Praeger.

HBR (2012). The Volatile U.S. Economy, Industry by Industry Retrieved 09/04/2012, 2012, from http://hbr.org/web/slideshows/the-volatile-us-economy-industry-by-industry/1-slide

Ocean-Tomo (2010). Intangible Asset Market Value Retrieved 09/04/2012, 2012, from http://www.oceantomo.com/media/newsreleases/Intangible-Asset-Market-Value-Study

Savage, C. M. (1996). Fifth generation management : co-creating through virtual enterprising, dynamic teaming, and knowledge networking (Rev. ed.). Boston: Butterworth-Heinemann.

~~~

Matthew Loxton is a Knowledge Management practitioner, and is a peer reviewer for the Journal of Knowledge Management Research & Practice. Matthew holds a Master’s degree in Knowledge Management from the University of Canberra, and provides pro-bono consulting in Knowledge Management and IT Governance to various medical institutions.

It’s Time for a CKO

June 18, 2012

This blog post is a companion piece to the presentation I gave at the June ICKC Practitioner’s Meeting in which I presented slides and discussed some of the history of the launch of Knowledge Management, and why now is a critical time for firms to have a CKO. The title of the presentation was “Time for a CKO?” and was the second presentation of the meeting after “The New Economics” by Peter Bretscher.

The full slide-deck is also available, both as a PDF and as a PowerPoint deck, on SlideShare.

The Big Flop

In the 80’s and 90’s “The Knowledge Age” was the fancy new idea, taken up mainly by gurus like Drucker, but also by several business academics.
Unfortunately the hype quickly overtook any ability to deliver; consulting firms and software companies pounded money out of it, and it quickly turned into a fad.

The idea was good, but the terrain was unprepared and consulting and software simply wasn’t going to deliver an ROI – Management didn’t know how to “do KM”, nobody was quite sure what the objectives were, and there simply were too few actual KM practitioners to even make a dent in it.

The result was an expensive, highly visible, and embarrassing belly-flop.

Back to Basics

So let’s just revisit two of the big moving parts driving KM and for the moment ignore all the practical reasons for KM like faster on-boarding, reducing waste, increasing quality, etc.

Two major changes have been underway historically – where wealth comes from, and the proportion of corporate value that is due to intangibles.

Firstly, wealth has changed in principle source from real-estate during feudal times, to being able to command labour and capital, to a current situation in which knowledge is the primary source of wealth.

Over the last hundred years, the measurement of corporate wealth has shown an increasing shift from property to ability. At the turn of the last century a firm’s wealth was made up primarily of its ownership of tangible assets – real estate, equipment, stock, and cash – but by the arrival of the early Knowledge Era, this had already been shifting.

The current era is marked by a shift in the balance between the contribution to EBITDA and Market Capitalization in favor of Intangible Assets, and this is deemed likely to continue for several decades.

Secondly, Intangible Capital’s share of the market value of firms has changed from a historical norm of <20% in the 1970’s to a current situation in which IC accounts for over 80% of a firm’s value.

(Illustration: the intangible share of S&P 500 market value rising from roughly 20% in the late 70’s to 80% by 2005)

Knowledge was seen by Drucker, Senge, and others as being the only remaining way that firms can stay competitive in the Knowledge Era, and the major source of differentiation amongst competitors. These days every firm has more or less the same access to capital, raw materials, basic labor, and equipment as every other, and competitive advantage is no longer a matter of merely securing access to resources or materials.

The thing that separates Apple or 3M from the lower-order players is not physical assets but knowledge and acumen.

At the same time the people that track market valuation have been noticing an increase in the “Q” value that Tobin derived by comparing the market value of a firm with its physical assets and cash.
What has been increasingly obvious since the mid eighties is that the gap has been rapidly widening and that it seems to be stabilizing at around 80% of a firm’s market cap being attributable to intangible assets.
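Tobin's Q itself is simple arithmetic: market value divided by the value of a firm's tangible assets, with the residual share serving as a rough proxy for intangibles. A minimal sketch with invented figures:

```python
# Tobin's Q compares a firm's market value to the value of its tangible
# assets; the unexplained remainder is a rough proxy for intangibles.
# The figures below are illustrative, not real firm data.
def tobins_q(market_value, tangible_assets):
    return market_value / tangible_assets

def intangible_share(market_value, tangible_assets):
    """Fraction of market value not explained by tangible assets."""
    return max(0.0, 1 - tangible_assets / market_value)

mv, tangibles = 100.0, 20.0  # e.g. $100B market cap, $20B tangibles
print(tobins_q(mv, tangibles))          # 5.0
print(intangible_share(mv, tangibles))  # 0.8
```

A Q well above 1, as in this example, is the pattern the text describes: most of the firm's value lies outside the balance sheet.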

Source Adams & Oleksak (2010)

As per the International Association of IC Practitioners (IAICP), these include:

  1. Relationship Capital such as customer goodwill, reputation, and referrals
  2. Human Capital such as skills, knowhow, and expertise
  3. Structural Capital such as processes, patents, and trade secrets

They also add a fourth component, “Strategic Capital”, which I take as being the overlap of the first three that are individually necessary and collectively sufficient to achieve organizational strategic objectives.

Where Are We Now?

“Intellectual property has become one of the most important resources in the 21st century. It’s now an accepted fact that, just like financial capital or commodities or labor, IP is more than an economic asset – it also forms the basis of a global market”

Manny Schecter, chief patent counsel at IBM. (Forbes 2012)

Nonaka provides a model that distinguishes between knowledge that people can turn into documents (Explicit) and knowledge that either can’t be expressed or is locked away in their heads and practices and maybe even something they were unaware that they knew (Tacit). His model provides ways to move explicit knowledge into tacit knowledge (like studying and practicing the cello), tacit into tacit (like an apprenticeship), tacit into explicit, and explicit into explicit.

In support of this much has matured since his model was devised:

  • Many colleges and universities now offer master’s and doctoral programs for KM, and several institutes such as the Knowledge Management Institute and KM Pro offer certification courses for practitioners.
  • Several KM journals are published, amongst which the Journal of Knowledge Management Research & Practice has a rated impact factor.
  • KM interlocks with several other fields – at one end with TQM, the Lean/Six Sigma movements, Applied Psychology, and Operational Research, and at the other with Finance & Economics through Intellectual Asset Management and Intangible Asset Management

In addition, many large and innovative firms employ KM – from 3M to Xerox, including Deloitte, Dow Jones, Forrester, Fujitsu, Gartner, Google, HP, IBM, Lexis-Nexis, Pratt & Whitney, PWC, Siemens, the World Bank, etc.

The primary areas of activity for KM are in workplace collaboration, management of innovation, the development of occupational communities in which standards of practice are refined, and corporate valuation through increased discovery and accounting of intellectual assets.

So where are we compared to the 80’s and 90’s?

The reasons for the lack of mainstreaming of institutionalized Knowledge Management have all fallen away, but reluctance still lingers because of that memory in the minds of many executives. At the same time, several pressures for institutionalizing Knowledge Management have emerged or intensified.

  • Technological changes and adoption rates continue to climb
  • You cannot simply put 80% of an organization’s worth under “goodwill”
  • The trade in IC has increased sharply over the last decade both for defensive and product development uses.

Simply put, IC is becoming fungible.

… and even being considered as collateral for loans by banks.

Return of the CKO

The CKO is not a new role, but it is one of increasing relevance in an age when knowledge and other intangible assets form such a large proportion of value, and when the retirement rate has reached 10,000 people per day in the US.

The CKO should be the structural keystone that brings IC and knowledge in particular under a single umbrella of scrutiny, management, and governance.
The days in which a firm’s knowledge could be left to the day to day operational dynamics are long gone, and it amounts to corporate suicide to leave knowledge management to chance.

To be sure, everyone “does” Knowledge Management, just like every firm “does finance”, but leaving it to chance implies that it is not likely to be done well, nor done in a fashion that enhances the likelihood of achieving organizational objectives. In much the same way that a CFO does not personally own all the money in the organization but provides governance, guidance, and a framework under which money and physical assets are managed and accounted for, the CKO should do the same for knowledge and IC.

Knowledge Management straddles all operations of an organization, and at its heart asks a simple duo of questions: how does a person know what they are meant to do, and how do they know how to do it?

In this sense KM overlaps on one side with HR/Recruitment in terms of what skills and experience a person needs to have prior to joining the organization in order to execute the assigned activities in their role.
KM also interfaces with Learning and Development in order to make good on knowledge that must be taught, in addition to those “just-in-time” job aids that must be presented to a worker at the moment of execution in the form of knowledge-base articles.

On the valuation side, KM interfaces with Finance to establish value of knowledge artifacts and the abilities of staff.

KM provides both tactical and strategic support for the organizational mission as far as knowledge is concerned – from operational knowledge-bases, to Communities of Practice, to valuation of Intangible Capital such as trade secrets, methods, procedures, copyright, patents, etc.

In addition KM provides the framework and basis upon which those could be bundled or commoditized to make them available for franchising, leasing/licensing, or sale.

Is it For You?

The CKO role, and in fact organized, institutionalized Knowledge Management, is not for everyone; the research consistently shows several factors that indicate whether institutionalized KM and a CKO role would deliver a strong ROI.

The higher a firm rates on these items, the more likely there is to be a positive ROI for institutionalized Knowledge Management.
Here we deal with the three broad areas.

  1. The Business Model
  2. The Organizational Culture and Environment
  3. Volatility & Variability in the business terrain

Variability and Volatility deserve special attention: the more fluid and volatile the market, products, and labor pool, the greater the need to learn quickly, adapt fast, and lower the risks of volatility by keeping on hand knowledge that represents the best and most current available.

To find out for yourself, try two surveys that I have built:

  1. The KM Fit-test Survey
  2. The KMOL-C climate survey

Conclusion

The time to institutionalize Knowledge Management is now – the game has changed and all the old obstacles are either solved or no longer significant hurdles to implementing a formal process to gain control of IC.
There has never been a time in which pure knowledge, in the form of know-how and know-who, has determined so much of the value of a firm, and ongoing survival is going to depend on gaining a high degree of management capability over intangible assets.

~~~

Matthew Loxton is a Knowledge Management professional and holds a Master’s degree in Knowledge Management from the University of Canberra. Mr. Loxton has extensive international experience and is currently available as a Knowledge Management consultant or as a permanent employee at an organization that wishes to put knowledge to work.

Interview Questions – A way to get better performers, or get sued?

January 20, 2011

I was going to post a nice article on a fashion game-show and what it can teach us about business, and my stand-in article was on how to get value out of all that ubiquitous company gossip and rumor.
… but then I got into another long exchange with several recruiters and HR professionals about interview questions, and I decided to talk about that instead.

Besides, I love yapping on about statistics, and I came to realize that this subject is foreign territory for many HR people – according to three HR professionals, statistics and questionnaire design are not typically part of the training for HR staff and recruiters.

Status Quo

Here’s the situation:

Many recruiters have lists of their favorite questions to ask candidates, and there are more blogs and articles online than you can shake a stick at with lists of “best questions”, “favorite questions”, and “most common questions”. Some recruiters have their own lists, some draw from those blogs and articles, and others make up new questions as they go, or even do it on the fly during an interview.
What gets my giddy-goat is that while they all wax lyrical about how wonderful their questions are and how happy they are with the results, almost none volunteer how they determine that their questions do anything whatsoever other than make them happy and take time.

The articles tend to be empty when it comes to explaining the reasoning behind the “top ten/twenty/forty-two” list, and can’t point to results other than (at best) a few hand-picked and probably fictitious anecdotes. Many recruiters also espouse questions to “throw” the candidate, catch them off-guard, or startle them, which is supposedly going to reveal a “true character” or do something else that is simply marvelous but undisclosed.
… and of course there are many examples of those “why is a manhole cover round” sort of question which are spoken of in hushed and reverent terms.

My question is: why ask these questions at all?
I mean, it takes time and effort – presumably one needs to take notes and then compare answers – and time is money.
The answer is that the questions are going to give insight into the applicant’s personality and abilities.

Fair enough, I say.

… but this is hiring and we are presumably trying to get a better performer than those of our competition, and whose performance translates into achievement of corporate objectives – EBITDA, for example.
In which case, I am not so sure that we are trying to discover “true character” as much as simply trying to match applicants to a role in such a way that we are more likely to achieve operational goals – in other words, performance.

What to Do

Firstly, don’t ask illegal questions – they can get your employer into a whole heap of pain.
I say this because over the years I have been asked about my religion, my age, my national origin, and even my political affiliations, and each time I made a mental note to eradicate that practice if I joined the firm.
There is silly, and then there is just plain ridiculously silly – don’t ask questions that expose your employer to legal action for improper discrimination. It costs money, it harms the reputation, and it just isn’t necessary.
Simple rule: if you aren’t sure of the legality of a question, leave it out!

Secondly, get a book on questionnaire design and interviewing (Scheaffer, Mendenhall III et al. 1996; Oppenheim 1998; Van Bennekom 2002; Swanson 2005), and maybe one on qualitative analysis (Ezzy 2002).

Here are some basic points before we get to my recommendations on designing a process for interview questions:

  1. Getting experts for technical or specialized tasks is a good idea, and you shouldn’t stop when you reach staffing – I/O Psychology was founded precisely to address staffing issues.
  2. Don’t fret unnecessarily about sample size – a sample is used to estimate a feature of the population from which it is drawn, and since you aren’t trying to generalize from your sample of applicants to the population at large, sample size matters less here.
  3. There are robust statistical methods for non-parametric situations in which sample sizes are small, such as the Kolmogorov-Smirnov, Wilcoxon, and Kendall tests.
  4. Statistical tests are pretty much always going to be better than gut-feel and guessing since that is precisely what they have been designed to do. They exist because of the many and various biases and errors that come factory-installed in our Neolithic brains.
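To make point 3 concrete, here is a minimal hand-rolled version of the rank-sum idea, using hypothetical question scores for two small groups of hires. In practice a library routine such as scipy.stats.mannwhitneyu would also give you the p-value; this sketch just shows that the statistic itself is nothing exotic.

```python
def mann_whitney_u(a, b):
    """Mann-Whitney (Wilcoxon rank-sum) U statistic for group a:
    the number of pairs (x, y) with x > y, counting ties as half."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

high = [4, 5, 4, 5, 3]   # question scores of known high performers
low = [2, 3, 2, 4, 2]    # question scores of known low performers

u = mann_whitney_u(high, low)
print(u)   # 22.5 out of a maximum of 25 -> the scores separate the groups well
```

A U near half the maximum (here 12.5) would mean the question does not distinguish the two groups at all.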

I often encounter this chestnut – “Past performance is a better predictor of success than chance.”

Yes it is a better predictor than chance, but that doesn’t mean that the question or the specific past behavior selected are better than chance.
While it is true that amongst the myriad past behaviors there are those that would predict specific future behavior, there is no reason to believe that we have selected the right predictors or that what we believe to have been a predictive behavior is going to be so.
In addition, don’t forget that people learn, and learning is exactly the opposite of past predicting future.

Dr. Shepell, the EAP expert, has suggested a regimen of measuring the predictive power of your questions over time. He suggested two-year tenure as a performance measure: correlate the scores from the recruitment questions with whether the person is still employed at the two-year anniversary, to see if the questions had higher predictive power than chance. This is a simple task that can be done with standard features in Excel.
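The same check is a few lines of Python. The data below is hypothetical; the retention flag is 1 if the hire was still employed at the two-year mark, and the Pearson correlation against a 0/1 variable is the point-biserial correlation.

```python
import math

def pearson(x, y):
    """Plain Pearson correlation; against a 0/1 variable this is
    the point-biserial correlation."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

scores = [5, 3, 4, 2, 5, 1, 4, 2]     # interview-question scores at hire
retained = [1, 0, 1, 0, 1, 0, 1, 1]   # 1 = still employed at the 2-yr anniversary

print(round(pearson(scores, retained), 2))   # 0.7 for this toy data -> well above chance
```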

My suggestion is more complex and involves (a) post-diction, checking whether a question would have predicted known performers, and (b) for prediction, using the regular performance review scores. Predictive questions should correlate strongly with performance evaluation scores (unless the appraisals are rubbish).

An additional suggestion is to code the probationary outcome: either produce a dummy Boolean variable to correlate against the questions, or expand the probationary result into a Likert scale with negative values if the person was released and positive values if they were retained. That allows a “no thanks” or “ok, sure” to be distinguished from a “Hell, no!” and a “Hell, YES” evaluation.
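As a sketch of that coding scheme (the labels and numeric values here are my own illustration, not a standard):

```python
# Expanding the probationary outcome from a plain pass/fail Boolean into
# a signed Likert-style scale: negative if released, positive if retained.
OUTCOME_SCALE = {
    "hell, no": -2,    # released, and glad to see them go
    "no thanks": -1,   # released
    "ok, sure": 1,     # retained
    "hell, yes": 2,    # retained, and a clear keeper
}

def code_outcome(label):
    """Return the Likert value plus the dummy Boolean (retained or not)."""
    value = OUTCOME_SCALE[label]
    return value, value > 0

print(code_outcome("hell, yes"))   # (2, True)
```

Either column – the signed value or the Boolean – can then be correlated against the question scores; the signed value simply carries more information.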

Here’s what I am recommending:

  1. Derive interview questions from four sources:
    1. previous critical events in the company’s history
    2. desired operational outcomes or goals
    3. the characteristics of known performers
    4. industry-specific authorities – but make sure you understand the heritage of the questions; like unpackaged drugs, don’t take them from anyone without solid credentials
  2. Test them before use
    1. Examine them for Content Validity and Construct Validity i.e. do they test the things they are meant to and do they do so exhaustively and exclusively – the whole truth and nothing but the truth
    2. Check with simple correlation that currently known high performers answer the questions as you would have expected. If you have a top-gun Software engineer and want to get another, make sure the questions would be answered by them in the way you expected – if they don’t then modify your question or drop it.
    3. Check that the answers by existing staff correlate to their performance reviews – unless you are making a pig’s ear of the regular performance reviews, you should have simple numerical ratings that can be correlated to the answers to your questions. If there isn’t a strong positive relationship between the appraisal scores and your questions, then one or both are a mess.
    4. Take your questions to the company lawyer who knows employment law in your locality. This is not a DIY step, get legal advice before you put the company’s neck on the block.
    5. Take them to the Marketing department and get them to give you a feel for whether you are damaging the brand in any way. You shouldn’t have many questions so they should be able to give you a feel in a few breaths.
  3. Use them in a consistent manner
    1. Don’t ad lib and don’t change the wording or delivery
    2. Explain how long the questioning will take, who will use the answers and for what purpose, and how long they will be kept on record
    3. Keep records – this is company property, not yours to discard or lose
  4. Test them over time
    1. Use Dr. Shepell’s criterion – if the results don’t predict tenure, then something is wrong, probably the question itself.
    2. At each performance review, run correlations again and see how the questions are doing at predicting performance – if the correlation isn’t higher than 0.5 then you might as well be flipping a coin! You should be refining the question battery to give you an overall predictive power of 0.85 or above.
    3. Once you have a few years of data, get a good statistician in to do some fancier tests like Discriminant and Factor analysis.
  5. Be Sneaky Observant
    1. See if you can get people at other companies and particularly your competitors to answer the questions – the objective is to get a competitive edge over other firms in your market space by hiring better people than they are.
    2. Put some of the questions online in forums where SMEs that you typically hire would congregate, and see if they correlate to how senior the respondents appear to be in their area of expertise
    3. Approach known experts in the field to answer some of your questions and check those correlations

… but Why?

So why all this bother? After all, stats is hard, and isn’t this going to take a lot of effort and time?

If you are keeping records of the answers people gave and how you scored those answers (and please tell me you are keeping scrupulous records), and if you have six-monthly or annual performance reviews that include numerical scores for various categories of performance (and please tell me you are doing this and keeping records), then all you need is to spare a few paltry minutes on extracting the values and running a correlation between the scores from the questions and the performance scores.

The IT folks can write a script to do all that automatically if your appraisal system doesn’t already have that functionality.
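A minimal sketch of such a script, with hypothetical question and appraisal data (in a real system these would be pulled from the recruitment and appraisal databases rather than typed in):

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length numeric lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Interview scores per question for six hires, and their latest appraisal scores.
question_scores = {
    "Q1": [4, 2, 5, 3, 4, 1],
    "Q2": [3, 3, 2, 4, 3, 3],
}
appraisals = [4.1, 2.5, 4.8, 3.0, 3.9, 1.8]

for q, scores in question_scores.items():
    r = pearson(scores, appraisals)
    verdict = "keep" if r > 0.5 else "drop or rewrite"
    print(f"{q}: r = {r:+.2f} -> {verdict}")
```

With this toy data, Q1 tracks the appraisals closely and stays on the list, while Q2 predicts no better than a coin flip and gets dropped or rewritten.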

The effort is therefore not all that great since you should be doing most of it anyway.

The benefit is that you …

  1. Don’t waste time and effort asking, coding, and using questions that don’t do anything – if the question is as effective as flipping a coin, leave it off the list
  2. Get a solid basis for a defense if your hiring practices wind up being challenged in court – it is a whole lot easier to defend yourself if you can show statistical tracking of the questions over time than to stand there looking earnest, saying how you really, really believe they are good questions.
  3. Demonstrate in real and tangible terms the value of your profession – you can show in hard numbers how the hiring process leads to competitive advantage and shareholder value. Not a bad thing to be able to show these days!

Conclusion

Building interview and selection questions in a methodical way and tracking their predictive power eliminates many of the inbuilt biases that come with the standard-issue human brain. It creates intellectual capital, moves the questioning process from a smoke-and-mirrors charade onto a solid foundation, and translates into real operational advantage.
The cost of doing it is lower than simply carrying on a status quo based on belief and opinion, and the additional effort involved in running basic statistical correlations is negligible.

There is simply no reason not to do so.

~~~


Matthew Loxton is a Knowledge Management expert and holds a Master’s degree in Knowledge Management from the University of Canberra. Mr. Loxton has extensive international experience and is currently available as a Knowledge Management consultant or as a permanent employee at an organization that wishes to put knowledge assets to work.

Bibliography

Ezzy, D. (2002). Qualitative Analysis: Practice and Innovation. New South Wales: Allen & Unwin.

Oppenheim, A. N. (1998). Questionnaire design, interviewing and attitude measurement, Pinter Pub Ltd.

Scheaffer, R. L., W. Mendenhall III, et al. (1996). Elementary Survey Sampling. USA: IPT.

Swanson, R. A. (2005). Research in organizations: Foundations and methods of inquiry, Berrett-Koehler Publishers.

Van Bennekom, F. C. (2002). Customer surveying: A guidebook for service managers, Customer Service Press.

 

