Methodologies and Certifications

Project management, business contingency planning and, I am sure, many other areas are beset by certification frenzy. And each camp is certain that the competing methodologies are inferior. When I first moved to Canada I picked up a textbook on business accounting, having survived (with a decent grade) three terms of ‘management’ accounting in my university days in the US. US accounting practice embraces a number of different depreciation strategies — which the Canadian book refuted, claiming that everything that disagreed with its approach was invalid because it wasn’t ‘natural’. Sputter, gasp… what is ‘natural’ about an agreement with the government to allow capital cost recovery? It’s not gravity…

What I see more and more, particularly in the ‘LinkedIn’ business groups, is more of the same. Increasingly it reminds me of religion… proponents of practice ‘x’ claim that it is the only divinely inspired approach and all others are wrong. There are just too many examples to cite.

I see two sets of problems — methodologies are yardsticks and checklists, but one should not confuse them with reality. The yardstick is a useful measuring tool, but its gradations are not the landscape being measured. Reality does not have it wrong if there is no lump at the ‘x’th gradation… And there are always multiple ways of looking at a problem that lead to solutions. In my experience, stepping off the path is sometimes the first step in finding it — but it takes courage to step into the abyss with no safety net.

To be a tad more nuanced about this… methodologies (and their close relatives, ‘standards’) are conceptual frameworks that provide common sets of terminology for ideas arranged in a particular order. It is helpful, even vital, when discussing a problem, that all participants share a common set of terms. Standard sizes and shapes facilitate interchange… if that is what one is trying to accomplish. And checklists remind us of overlooked facts — but it is up to the practitioner to evaluate which portions of the framework are relevant and which are not. It is not a paint-by-numbers canvas where one is complete only when all the blanks have been colored in.

Consulting is particularly evil in this regard… one can always do more analysis. The challenge is ‘how much is enough’? What is needed to solve the problem for the client, and does it require deviation from the ‘proven methodology’ to be done cost-effectively? I recall a systems engineering text from many years ago about the difficulty of getting sufficient decimal places of accuracy in a particular mathematical model. But one lad was bright enough to ask where the data was coming from… it turns out the process viscosity everyone was fussing about was provided by a foreman who stuck his finger into the process stream, rubbed it against his thumb and guessed the appropriate number. No carefully calibrated viscometer…

Methodologies are like this… one wants to solve a problem and develops an ordered checklist of things to ask. Over time this grows — particularly if the approach is useful. Make no mistake, conceptual frameworks can be very powerful and useful. At some point the framework turns into a named approach… ‘PRINCE2’ comes to mind (it’s a project management methodology — not picking on it per se, just that the name popped into my mind). A support organization grows up around it and administers training and tests, and awards certifications. The certification holders would like everyone to believe that this means they understand project management — not just that they passed a test about a particular methodology. Mostly this works… but the map is not the territory. And the tendency evolves for holders of cert ‘x’ to feel that holders of ‘y’ or even, gasp, ‘none of the above’ are heretics who just don’t understand the area.

Methodologies are infamous for creating an alphabet soup of categorizations that are used to guide their work. The analyst’s couch too easily turns into a bed of Procrustes — non-conforming problems are chopped off or stretched to fit. One must do ‘A’ and ‘B’ and ‘C’ in order to solve the problem (and bill the client)… And if one does ‘A’, then ‘C’ and perhaps ‘F’, this just isn’t right. Rules? Or guidelines? Makes for beautiful boilerplate work that lands with a thud on the client’s desk.

It all boils down to feeling a need to keep a mental separation between the real world before me and the tools I use to approach it. There are too many examples of classic tests that fail because they really check for something else — benzene comes to mind. And sometimes too much structure gets in the way of solving the problem…

[to be continued]

