Our company’s approach to strategic planning is changing, shifting from large public forums to assessments and interviews.
Caution – this article is still a work in progress. However, I believe its purpose is valuable: starting a dialogue on whether open public workshops, board meetings and other ‘group-think’ planning sessions benefit the development of strategic plans. Investigating whether there are effective alternatives is important to ensuring that strategic planning remains not only relevant, but effective at producing results.
Finally, before I get into the details, I want to emphasize that the following is my experience and not a scientific study, despite the parallels I think can be drawn between what I’ve seen and scientific work in areas such as social influence theory.
About Traditional Planning
For many years, I accepted that the model for strategic planning needed to involve one or more meetings of a task force, board, etc., or one or more open public forums or workshops. This model remains the most common way communities, boards of directors and other organizations go about developing any plan.
Such traditional planning processes are responsive to three key fears most of us share:
- Trust. How can a plan developed outside the group or public eye represent the group or public?
- Right vs. Wrong. A community planning process without public input is inconsistent with a core principle: we can’t have public planning without public representation.
- Narrowly-focused. A plan can’t represent the group if it wasn’t developed by the group.
The ‘Group’ Challenge
Over the last several years, I’ve seen the traditional planning process be ineffective because of:
- The decline in the number of people who attend, let alone participate in, public workshops or forums (unless the topic is controversial, a ‘wedge’ issue or a “not-in-my-backyard” reaction). In those cases, participation is motivated more by individual impact than by broader vision or strategy.
- Dynamics between individuals within the group increasingly run counter to the group’s ability to develop forward-thinking strategies through collaboration, cooperation and consensus.
For what it’s worth, my perception is that the influence of technology on our social behaviors and face-to-face interactions appears to be spilling over into these planning processes. Not many years ago I’d occasionally see one or two people affect group dynamics through loud or persistent commentary, but as a facilitator I could counter, or at least account for, it. More and more, though, I’m seeing the same scenario have a more profound effect that is harder to recover the process from. While this perception is a large enough topic for a different article altogether, the best comparison I can offer is how viewpoints and disagreements are now shared on Twitter and Facebook.
Stepping Back & Looking at the Process
Given all this, I’ve spent the last couple of years dissecting a few specific experiences and looking for relevant research and areas of study that help explain the dynamics behind the challenges I’ve seen – an odd combination of self-reflection, evaluation and reading.
I’ve come to believe:
- Part of the challenge may be that common biases that often influence strategic planning are more engaged (not consciously or purposefully engaged, but engaged as a baseline that has been normalized through frequency of use).
- Differences of opinion appear harder to reconcile at a time when consensus solutions are more widely viewed as a loss instead of a win-win.
So what are those biases and what alternative approaches am I using?
Common Biases in Strategic Planning
Some of the most common biases I’ve observed in previous economic development strategic planning engagements are:
Social Influence Theory
Theorizes that a person’s emotions, opinions and/or behaviors are influenced and affected by others. In 1958, Herbert Kelman identified three varieties of social influence:
- Compliance – responding favorably to an explicit or implicit request or suggestion from others; can also appear as keeping a dissenting opinion private due to social pressure
- Identification – being influenced by someone admired
- Internalization – acceptance of a belief or behavior based on a feeling of an internal reward
Cognitive Bias
A systematic deviation from rationality in judgment; the theory argues that individuals create their own “subjective social reality” from their perception of inputs. Related to heuristics[i] (a/k/a cognitive shortcuts or rules used to simplify decisions), cognitive biases can be useful when the timeliness of a decision is more important than its accuracy.
Example: the ‘representativeness heuristic’, or the tendency to judge the frequency or likelihood of an occurrence by the extent to which the event resembles a typical case.
Ambiguity Effect[ii]
Decision making is affected by a lack of information, or “ambiguity”; people tend to select options for which the probability of a favorable outcome is known.
Anchoring Bias
The tendency to rely too heavily, or “anchor”, on one trait or piece of information when making decisions (usually the first piece of information acquired on that subject).
Bandwagon Effect
The tendency to do (or believe) things because many other people do (or believe) the same.
Conservatism Bias[iii]
The tendency to revise one’s belief insufficiently when presented with new evidence.
Authority Bias[iv]
The tendency to attribute greater accuracy to the opinion of an authority figure (unrelated to its content) and to be more influenced by that opinion.
Availability Cascade Bias
A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or “repeat something long enough and it will become true”).
Confirmation Bias[v]
The tendency to search for, interpret, focus on and remember information in a way that confirms one’s preconceptions.
Shared Information Bias[vi]
The tendency for group members to spend more time and energy discussing information that all members are already familiar with (i.e., shared information), and less time and energy discussing information that only some members are aware of.
Many of these biases are stymieing strategic planning more than they used to. More and more open public engagements are failing to attract diverse community or board opinions, and the perception that those who want to complain are the most likely to show up has never felt more accurate.
Use of Interviews & Assessments as an Alternative
My firm and I are now relying more and more on assessments and interviews (even earlier in our engagements) to identify both the individual and group values, themes (components of a shared vision) and inputs (strengths, weaknesses, opportunities and threats) needed to structure draft plans and project group engagement.
Our approach relies on a modified Delphi method (a/k/a Delphi Technique); the modification is that we use interviews to both support and supplement a comprehensive assessment (a/k/a survey), as opposed to Delphi’s typical use of multiple rounds of surveys (assessments).
In our recent experience (purely observational), the group consensus formed through this approach more often resembles a ‘group consciousness’ than a ‘consensus’ – a distinction explained by defining ‘group consciousness’ as the individual awareness of the group’s shared beliefs, thoughts and feelings, and ‘consensus’ as the general agreement of a group. Further, ‘consensus’ typically involves negotiation, elimination or substitution of parts from the whole based on individuals making decisions. By contrast, ‘group consciousness’ involves individuals making choices based on their awareness and understanding of common beliefs, values, thoughts and feelings (e.g. a town’s vision) without that same level of individual decision making.
In a recent project, I used interviews and an assessment, plus a more traditional public workshop (which took place during the assessment period). The result was an 18% participation rate in the assessment, nearly double the 10% participation rate in the public workshop.
In addition to attracting more participation, the assessment also yielded a greater depth of opinions, community insights and ideas for projects and goals. Subsequent use of this approach appears thus far to have produced similar outcomes, though more analysis is required.
For those interested, below is more specific information on the Delphi method (also referred to as the Delphi technique).
A communication technique or method, originally developed as a systematic, interactive forecasting method that relies on a panel of experts. Experts answer two or more rounds of questionnaires. After each round, a facilitator anonymously summarizes the experts’ forecasts and the reasons for their judgments. Experts are then encouraged to revise their earlier answers in light of information from other experts. It is believed that during this process the range of answers will decrease and the group will converge toward the “correct” answer. The process concludes after reaching a predefined terminus (e.g. number of rounds, achievement of consensus, stability of results).
Developed[vii] by Project RAND[viii] in the 1950s for a report on the future technological capabilities of the Army Air Corps, its purpose was to overcome situations where experts were often influenced by cognitive biases. The method relies on a series of questionnaires to narrow the range of focus and arrive at an expert opinion.
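The round-by-round convergence at the heart of the Delphi method can be sketched in a few lines of Python. This is a minimal illustration only, not our firm’s actual process: the panel values, the use of the median as the anonymous round summary, and the `pull` revision rate (how far each expert moves toward the summary) are all assumptions made for demonstration.

```python
import statistics

def delphi_rounds(initial_estimates, rounds=3, pull=0.5):
    """Simulate Delphi convergence on a numeric forecasting question.

    Each round, a facilitator's anonymous summary (here, the median)
    is shared, and every panelist revises partway toward it.
    """
    estimates = list(initial_estimates)
    for _ in range(rounds):
        summary = statistics.median(estimates)
        # Each expert moves a fraction `pull` of the way toward the summary.
        estimates = [e + pull * (summary - e) for e in estimates]
    return estimates

# Hypothetical initial forecasts from a five-person expert panel.
panel = [10, 40, 55, 70, 120]
final = delphi_rounds(panel)

spread_before = max(panel) - min(panel)
spread_after = max(final) - min(final)
# The range of answers narrows each round while the central
# estimate (the median) is preserved, mirroring the "converge
# toward the correct answer" behavior described above.
```

Real Delphi panels revise based on reasoning shared alongside the summary, not a mechanical pull toward the median, so convergence in practice is neither guaranteed nor this smooth; the sketch only shows the intended direction of the process.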
[i] Originally described by cognitive scientist Herbert Simon, heuristics were studied further by psychologists Amos Tversky and Daniel Kahneman in the early 1970s and became the subject of a well-known paper, “Judgment Under Uncertainty: Heuristics and Biases”, published in Science in 1974.
[ii] First described by Daniel Ellsberg in 1961 as part of his dissertation on decision theory, which examined decision making under conditions of uncertainty or ambiguity and how outcomes may not be consistent with well-defined probabilities. This work is now part of what is referred to as the Ellsberg Paradox, and it later influenced other approaches, including Choquet expected utility and info-gap decision theory. Mr. Ellsberg was a U.S. military analyst and a RAND Corporation employee; however, he is best known for his role in releasing the Pentagon Papers.
[iii] Ward Edwards, 1968. Underweighting new information because of an existing belief about the base distribution of information.
[iv] See the Milgram experiment of 1961.
[v] Identified by psychologist Peter Wason in the 1960s.
[vi] Wikipedia; International Economic Development Council
[vii] Attributed to Olaf Helmer, Norman Dalkey and Nicholas Rescher
[viii] a/k/a RAND Corporation