I am a big believer in, and user of, Balanced Scorecards. However, I rarely agree with articles or presentations regarding their use. Before I forget, I highly recommend "Balanced Scorecard" by Niven and "Performance Dashboards" by Eckerson.
I seem to be a lone wolf on this, but here goes:
a) Most organizations set up monitoring/goals/Balanced Scorecards/Dashboards, then crunch numbers and evaluate performance. What is rarely done is to analyze a proposed system or organizational change for the impact it will have on the elements of the Balanced Scorecard. In plain English: before implementing a mid- or high-level change or program, study its possible effects on all four "wheels" of the Scorecard.
I've seen more profit/operating damage done by skipping this than by any lack of inspection, forms, corrective actions, etc.
My example (a true one) is posted elsewhere in the Cove:
Should the billing process be part of ISO 9001:2008?
The Balanced Scorecard should be a tool for analysis of impact BEFORE implementation of a change!
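The "analyze before implementing" idea above can be sketched as a simple pre-approval screen: require an impact estimate on every scorecard perspective for each proposed change, and flag any expected damage before rollout. This is purely illustrative and not from the original post; the function name, the four classic Kaplan & Norton perspective labels, and the numbers are all invented for the example.

```python
# Hypothetical pre-implementation impact screen for a proposed change.
# The estimated impacts would, in practice, come from the people who
# own each scorecard perspective's metrics.

PERSPECTIVES = ("Financial", "Customer", "Internal Process", "Learning & Growth")

def screen_change(change_name, estimated_impacts):
    """Return the perspectives a proposed change is expected to hurt.

    estimated_impacts maps perspective -> expected effect on its metrics
    (positive = improvement, negative = damage). Any negative entry means
    the change deserves further study BEFORE implementation.
    """
    missing = [p for p in PERSPECTIVES if p not in estimated_impacts]
    if missing:
        # Force an estimate for every perspective; "no data" is not "no impact".
        raise ValueError(f"{change_name}: no impact estimate for {missing}")
    return [p for p in PERSPECTIVES if estimated_impacts[p] < 0]

# Example: outsourcing billing might cut cost but irritate customers.
risks = screen_change("Outsource billing", {
    "Financial": +0.10,
    "Customer": -0.05,
    "Internal Process": +0.02,
    "Learning & Growth": 0.0,
})
# risks now lists the perspectives to investigate before approving.
```

The point of forcing an estimate for every perspective (rather than defaulting missing ones to zero) is exactly the post's argument: the balance comes from looking at all the wheels, not just the ones the change was designed to move.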
b) Balanced Scorecards (BCs) and Dashboards (DBs) are two different things and should be kept separate. BCs are strategic-level instruments, DBs are tactical, and a BC metric should never depend on only one DB input.
Also, DBs are often abused. I'm thinking of a BusinessWeek article from early 2006 on DBs. The article opened with Larry Ellison sitting on the S.S. Oracle, his mega-yacht, monitoring sales in real time. Someone else mentioned tracking employees' desk time. I wrote a letter in response (not published) arguing that DBs are not instruments for "Big Brother" monitoring.
In my private practice I prefer to call BCs "Balanced Ceiling Fans": a) The blades really do have to be balanced. b) The blades really have to have pitch, or bite, to be effective (a tool to improve, not just monitor). c) They don't have to have four blades; I've used up to six.