Far better an approximate answer to the right question … than the exact answer to the wrong question.
— John Tukey, Statistician
If I had an hour to solve a problem and my life depended on the solution, I would spend the first 55 minutes determining the proper question to ask, for once I know the proper question, I could solve the problem in less than five minutes.
— Albert Einstein, Physicist
Much of what we do, at both the individual and organizational levels, is driven by questions. Questions are the lens through which we see what is and is not possible. It has been my experience that teams and organizations, regardless of whether they are working to improve student success or to address workforce demands, sometimes go astray when they seek answers to the wrong questions.
This error generally happens not because of a lack of intelligence, work ethic, or even passion, but because those working on the problem respond too quickly to the high pressure for a solution. The perception that they have to hurry often pushes teams too quickly from the problem space into the solution space, because they are, metaphorically speaking, "building the plane while flying it."
This pressure is keenly felt when attempting to evaluate an initiative. Consequently, the focus generally falls on "What can we measure?" which on the surface seems like exactly the question that should be asked. But I have found that frustrations mount when what to measure becomes the focus of the evaluation before the project team and evaluator together address other important questions, such as "What do we want to know?"
I once worked with a client project team grappling with what should be measured for the evaluation to demonstrate project outcomes. Rather than dwelling on their dilemma about what to measure, I asked a series of questions: "What do you want to learn about your project?" "How does this project change behavior?" "What do your stakeholders want to know?" As team members answered, I pointed out how their responses led to what they really should be measuring, regardless of how difficult it might be to obtain the relevant data.
This experience reminded me that once you focus on what people want to learn from an intervention, it is easier to figure out how to measure outcomes.
My advice is: do not skip over the question of what you want to learn. Sure, that question can be challenging because of a fear that those things cannot be measured. But doing this deeper thinking up front avoids angst at the end about inadequate data or measures that lack meaning, and it often reveals novel ways of measuring outcomes that at first seemed impossible to measure. Yes, the preliminary work takes time, but the benefits are well worth it.
*Portions were originally published in the October 2012 issue of Dayton B2B Magazine.