You must have come across the term "governance" in many places, and would also have noticed that each person interprets it as per his / her own understanding. Actually, there is nothing wrong with that, as some definitions are bound to context, and maybe this one is too; let's investigate.
I personally see it as a three-step process:
1) What needs to be accomplished – Define what you want to monitor, such that it meets the identified goals / objectives.
2) The value system – Define the responsibilities of each observer / stakeholder, in line with the overall value system.
3) How – That is, how would you ensure that what you have identified under #1 actually gets measured, and improved?
In essence, WORK, COMPOSITION, and TASKS are the three attributes needed to effectively steer towards the identified goal / objective.
This is intuitive and obvious; no rocket science is involved here. The reason is that effective governance requires clarity on goals / objectives, stakeholder buy-in and empowerment, and details of the activities needed to achieve the given targets, i.e. a focused approach.
Now let us see what each attribute of governance may look like.
WORK
If you, as a program / project manager, have to ensure the success of a given program / project, which work units would you be monitoring?
I'm sure that as a program / project manager, I would like to monitor:
1) Risks,
2) Financial Health,
3) Resource Utilization and Projections,
4) Resource Compatibility, Availability and Productivity,
5) Quality of “Output”,
6) Issues,
7) Benefits,
8) Stakeholders Interests,
9) Communication,
10) Client Satisfaction, and
11) Operational Performance.
By doing this, I have defined the scope of my governance across various work units / functions. Please remember, I have just defined the "WHAT"; the details in the form of "HOW" are yet to be defined.
Is that enough for this attribute? I do not think so. The reason is that you have identified what needs to be monitored; however, to monitor anything you need a baseline. So, the exercise will only be complete after we define the minimum acceptable level of performance for each work unit, thereby giving the governing body / council a measurement against which to assess performance.
These attributes will help you in identifying the participating entities, and will give you a rough idea of the "COMPOSITION" of the governing body, which is the second attribute of the whole governance process.
It is no coincidence that there is a linkage among all three attributes. There were bound to be linkages; otherwise, how would you glue the attributes of governance together?
TASKS
This is where you define the activities / tasks that one needs to carry out against each identified work unit. These tasks will help you in gathering data points to measure performance against the set targets.
Now let us see what would typically go under each identified work unit / function.
Risk Management
• Definition
− Identification (Categorization, Uncertainty Level)
− Quantification (Severity, Probability, Detectability)
− Prioritization (Highest to Lowest Dollar Amount, and Selection of Few)
− State management and flow
• RMMM
− Mitigation
− Monitoring
− Management
• Tools & usage
The identification element of the definition attribute can be carried out from two perspectives: first, "risk avoidance", and second, "risk identification for overcoming weaknesses / threats".
The second part is addressed via "Strategic" / "Enterprise" risk management, wherein the exercise goes beyond "avoidance" and links risks to strategy. It has wider benefits, as instead of temporarily avoiding a risk, one converts it into an internal strength.
The motivation behind "Strategic Risk Management" is:
1) Improved readiness for a wider range of risks,
2) Optimum resource allocation / utilization,
3) Improved competitive advantage,
4) Reduced volatility / fluctuations / surprises, and
5) Reduced costs of risk transfer.
A word of caution here: first, one needs to understand the context in which this elaborate exercise needs to be done; second, there will be situations (in most cases, in fact) where a mere avoidance exercise would suffice.
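The quantification and prioritization steps listed above can be sketched as a simple scoring exercise. The following is a minimal illustration, not a prescribed method: the 1–5 scales, field names, and sample risks are all my own assumptions.

```python
# Minimal risk-scoring sketch: quantify each risk on a 1-5 scale for
# severity, probability, and detectability (higher = harder to detect),
# then prioritize by exposure = composite score x dollar impact.
# Scales, names, and figures are illustrative assumptions.

def risk_score(severity, probability, detectability):
    """Composite score; higher means more urgent."""
    return severity * probability * detectability

risks = [
    # (name, severity, probability, detectability, dollar_impact)
    ("Key resource attrition", 4, 3, 2, 60_000),
    ("Scope creep",            3, 4, 3, 80_000),
    ("Vendor delay",           5, 2, 4, 30_000),
]

# Sort highest exposure first, then select the top few for active tracking.
prioritized = sorted(
    risks,
    key=lambda r: risk_score(r[1], r[2], r[3]) * r[4],
    reverse=True,
)
for name, s, p, d, dollars in prioritized:
    print(f"{name}: score={risk_score(s, p, d)}, exposure=${risk_score(s, p, d) * dollars:,}")
```

The "highest to lowest dollar amount, and selection of few" step from the list above then amounts to taking the head of `prioritized`.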
Financial Stewardship
These are the typical "earn" vs. "burn" metrics. The exercise of selecting the project / program on the basis of ROI and strategic fit is already over; now is the time to watch "performance" as we progress.
Financial health of a project is a function of "schedule", "effort / cost", and "quality of operations", assuming there are no changes in the "environment". If any one of these does not perform as per plan, there will be an adverse impact on the finances of that project.
The operations part is measured through the performance of internal engineering processes and activities; so, let's first concentrate on "schedule", "effort" and "cost".
As the project / program progresses, you start getting actual data on schedule and effort (and hence the resultant cost). This data should be used to calculate the value earned on completed / in-progress work. This is precisely the objective behind EV (Earned Value) metrics. So, let's see what each would mean to us.
• Earned Value Methodology
− What was budgeted?
− What has been accomplished?
− How much should have been accomplished by now?
− How much has it cost to accomplish the work to date?
− How much will it cost to accomplish the remaining work?
− How much will it cost at completion?
− What performance / efficiency is required on the remaining work to avoid any deviation?
Now, let's convert these English sentences into the corresponding metrics:
− Budget At Completion (BAC)
− Earned Value (EV) / Budgeted Cost of Work Performed (BCWP)
− Planned Value (PV) / Budgeted Cost of Work Scheduled (BCWS)
− Actual Cost of Work Performed (ACWP)
− Estimate To Complete (ETC)
− Estimate At Completion (EAC)
− To-Complete Performance Index (TCPI)
In essence, three critical ratios, namely:
· CPI = BCWP / ACWP; a value >= 1 is desired
· SPI = BCWP / BCWS; a value >= 1 is desired
· TCPI = (BAC − BCWP) / (BAC − ACWP); a value <= 1 is desired
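A quick numeric sketch of these ratios may help. All figures below are invented for illustration; the ETC formula used here is the common "continuing at current cost efficiency" variant, one of several in use.

```python
# Earned-value sketch: derive CPI, SPI, TCPI, ETC, and EAC from the four
# base figures. All numbers are invented for illustration.

BAC  = 100_000   # Budget At Completion
BCWP =  40_000   # Earned Value: budgeted cost of work actually performed
BCWS =  50_000   # Planned Value: budgeted cost of work scheduled to date
ACWP =  45_000   # Actual cost of work performed to date

CPI  = BCWP / ACWP                   # cost efficiency; >= 1 desired
SPI  = BCWP / BCWS                   # schedule efficiency; >= 1 desired
TCPI = (BAC - BCWP) / (BAC - ACWP)   # efficiency needed on remaining work; <= 1 desired
ETC  = (BAC - BCWP) / CPI            # remaining cost, assuming current efficiency holds
EAC  = ACWP + ETC                    # projected total cost at completion

print(f"CPI={CPI:.2f} SPI={SPI:.2f} TCPI={TCPI:.2f} ETC={ETC:,.0f} EAC={EAC:,.0f}")
```

Here the project is behind schedule (SPI = 0.80) and over budget (CPI ≈ 0.89), so TCPI exceeds 1: the team must work at above-planned efficiency on the remaining work to still land on BAC.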
Human Resource Stewardship
"Resource" is a very generic term; its nature changes with context. Typically, it is used for "S/W", "H/W", "Human", and "Capital" resources. However, here I'll be discussing it from the "Human" perspective alone.
Whenever we mention "human" resources, we should immediately think from the following perspectives:
1) Utilization rate,
2) Projections – future requirements vis-à-vis current capacity,
3) Readiness of resources – training and compatibility,
4) Availability / Adequacy of resources,
5) Productivity,
6) Support systems, and
7) Employee satisfaction
a. Suggestions
b. Concerns
c. Rewards and Recognition
d. Organizational Values
If you need some more attributes to measure, add them; however, I have personally found that measurement on these should suffice for the needs of any program / project.
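The first two items in the list — utilization rate and projections — lend themselves to simple arithmetic. The following sketch assumes invented figures and thresholds (the 70–95% utilization band and 85% target are illustrative, not a standard):

```python
# Utilization sketch: assigned hours over available hours per person,
# plus a naive capacity projection. All figures and thresholds are
# illustrative assumptions.

def utilization(assigned_hours, available_hours):
    return assigned_hours / available_hours

# name -> (assigned hours this month, available hours this month)
team = {"Dev A": (152, 160), "Dev B": (120, 160), "QA": (160, 160)}

for name, (assigned, available) in team.items():
    u = utilization(assigned, available)
    flag = "OK" if 0.70 <= u <= 0.95 else "REVIEW"   # illustrative band
    print(f"{name}: {u:.0%} {flag}")

# Projection: if the next quarter needs 600 person-hours per week and each
# head contributes 40 hours at, say, an 85% target utilization:
required_headcount = 600 / (40 * 0.85)
print(f"Required heads: {required_headcount:.1f}")
```

Comparing `required_headcount` against current capacity gives the "future requirements vis-à-vis current capacity" view from item 2.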
Quality of Output
Now this is the tricky part; the metrics will depend upon your definition of quality. Essentially, whatever is important to you and your customer should figure in here. For example, a S/W project's metrics are typically divided into two groups, namely, product and process metrics:
Product
a) Defect Density
b) Field Errors
c) First Time Right
Process
a) Review / Testing Efficiency
b) Review / Testing Effectiveness
c) Schedule / Effort performance
d) Requirements Stability
e) Defect Slippage
f) Effective Throughput, etc.
However, depending upon the context and customer requirements, these may change. For example, for an outsourcing deal, they may either be replaced or supplemented by SLAs. Similarly, they may also be supplemented by non-functional requirements, acceptance criteria, etc.
It is up to you to define the quality goals and the corresponding metrics, with control limits (LCL / UCL) and specification limits (LSL / USL), and monitor them against the targeted goals.
In essence: a common definition for the various elements, and time-bound audits to validate compliance and effect control.
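As a small illustration of monitoring a metric against control limits, here is a defect-density check. The limits below are placeholders; in practice you would derive LCL / UCL from your own process baseline (e.g. mean ± 3 sigma over past releases).

```python
# Quality-metric sketch: defect density (defects per KLOC) checked against
# control limits. Limits and release data are illustrative placeholders;
# real limits come from your process baseline.

def defect_density(defects, kloc):
    return defects / kloc

LCL, UCL = 0.5, 4.0   # illustrative control limits, defects per KLOC

# release -> (defects found, size in KLOC)
releases = {"R1": (12, 5.0), "R2": (30, 6.0), "R3": (2, 4.0)}

for rel, (defects, kloc) in releases.items():
    dd = defect_density(defects, kloc)
    status = "in control" if LCL <= dd <= UCL else "OUT OF CONTROL"
    print(f"{rel}: {dd:.2f} defects/KLOC ({status})")
```

A release falling outside the band (R2 above) is the trigger for the audit / control action mentioned above.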
Issues Management
Since any program / project will come across issues at some point in time, it is necessary to have an issue management process. That is, each issue will have its own life cycle, from articulation / logging through to resolution.
Like quality, this will depend upon program / client needs, as the escalation process in particular will be very specific to the "governance" composition / structure that you define for a given program / project.
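The life cycle from logging to resolution can be sketched as a small state machine. The states and transitions below are illustrative assumptions; in practice you would tailor them to the escalation structure of your own governance model.

```python
# Minimal issue-lifecycle sketch: a state machine from logging to closure.
# States and transitions are illustrative, not a prescribed process.

TRANSITIONS = {
    "logged":      {"triaged"},
    "triaged":     {"in_progress", "escalated"},
    "escalated":   {"in_progress"},
    "in_progress": {"resolved"},
    "resolved":    {"closed", "in_progress"},  # reopen if the fix fails
    "closed":      set(),
}

class Issue:
    def __init__(self, title):
        self.title = title
        self.state = "logged"

    def move(self, new_state):
        if new_state not in TRANSITIONS[self.state]:
            raise ValueError(f"Cannot go from {self.state} to {new_state}")
        self.state = new_state

issue = Issue("Build server outage")
for step in ("triaged", "escalated", "in_progress", "resolved", "closed"):
    issue.move(step)
print(issue.state)  # prints "closed"
```

The point of encoding the transitions explicitly is that an issue cannot silently skip triage or escalation; any out-of-process jump fails loudly.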
Benefits
It is to be noted that each program works with a benefit profile, i.e. a complete description of each benefit expected from the given program. Since a program needs to be measured against planned and actual benefits, one needs to monitor performance against the listed benefits too.
Stakeholder Interests
This is typically a matrix mapping each stakeholder to his / her particular interest(s) in the given program.
To put it in layman's words: whatever is "expected" out of a given program needs to be monitored.
Communication
• Attributes – What needs to be communicated
• Mode – How it needs to be communicated
• Frequency – When it needs to be communicated
• Stakeholders – To whom it needs to be communicated
• Escalation Process – When it should be communicated to the next level in the hierarchy, such that the "point" is addressed.
It goes without saying that, since communications may contain action points and important decisions, one needs to track the items to closure, and store them for future reference.
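The five attributes above can be captured as rows of a communication plan. The field names and sample entries below are illustrative assumptions:

```python
# Communication-plan sketch: one record per communication item, carrying
# the five attributes described above (what, how, when, to whom, and the
# escalation trigger). Field names and sample rows are illustrative.

from dataclasses import dataclass, field

@dataclass
class CommItem:
    attribute: str            # what needs to be communicated
    mode: str                 # how it is communicated
    frequency: str            # when / how often
    stakeholders: list        # to whom
    escalation: str = ""      # when it goes to the next level

plan = [
    CommItem("Status report", "email", "weekly", ["sponsor", "PMO"],
             "two consecutive red statuses"),
    CommItem("Risk review", "meeting", "bi-weekly", ["steering council"],
             "any risk exceeding the exposure threshold"),
]

for item in plan:
    print(f"{item.attribute} -> {', '.join(item.stakeholders)} ({item.frequency})")
```

Keeping the plan as structured data (rather than a one-off document) makes it easy to audit which stakeholder receives what, and to track each item to closure as noted above.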
Client Satisfaction
Internal Indicators
I believe a set of internal indicators (in line with the earlier stated points) can also help us in uncovering the layers of the unknown; and I would rather start with these measures, as any negative deviation in them would mean lower customer satisfaction. These measures act as "early warning" alerts, and it is up to us to acknowledge and work upon them to increase customer satisfaction, or to ignore them and get doomed.
The internal measures can be summarized as follows:
1) The quality and process / project metrics
2) The management / SQA review points
3) Internal audit findings
4) The review records
5) The internal cost / financial metrics
6) Internal employee satisfaction survey
7) Number of escalations and severity
External Indicators
Though one can have a subjective as well as an objective questionnaire to capture the customer's response, a mix of both would be my choice, as it provides the customer an opportunity to "rank" as well as a means to convey his / her feelings / observations, which in turn require further analysis on the vendor's part. I believe the following aspects should provide a suitable base for capturing the satisfaction level:
1) Product / Service Quality – Overall / Phase wise
2) Schedule Adherence (variance / span)
3) Cost Vs. Value Proposition
4) Responsiveness / Empathy
5) Accuracy of provided solution / responses
6) Resource ability / capability
The above factors / attributes can be scored objectively, with rankings from negative to positive. In addition, we can take certain subjective feedback, wherein we try to capture the customer's emotions / feelings / observations, which in turn could provide us a clue as to what went right / wrong. For example:
1) If you were to re-execute the project, what are the things that you would do differently, and how? OR
2) What are the points / factors / actions that you think could have provided much more value to this initiative?
3) What are our two most positive and two most negative points, and how could we have addressed the negative ones?
4) If a prospective client calls you for a reference, what would be your spontaneous feedback on us?
5) Where and how do we need to improve?
6) Would you like to do repeat business with us?
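The objective part of the questionnaire — ranked from negative to positive, as described above — can be rolled up into a single weighted score. The attributes, the -2..+2 scale, the weights, and the sample responses below are all invented for illustration:

```python
# Client-satisfaction sketch: objective questions scored on a -2..+2 scale
# (negative to positive) and combined with weights into one roll-up score.
# Attributes, weights, and responses are illustrative assumptions.

responses = {
    "product_quality":     1,
    "schedule_adherence": -1,
    "cost_vs_value":       0,
    "responsiveness":      2,
    "accuracy":            1,
    "resource_capability": 1,
}

weights = {k: 1.0 for k in responses}
weights["product_quality"] = 2.0   # weight what matters most to this client

score = sum(responses[k] * weights[k] for k in responses)
max_score = sum(2 * w for w in weights.values())   # best possible roll-up
print(f"Satisfaction: {score}/{max_score} ({score / max_score:.0%})")
```

The subjective answers (questions 1–6 above) then explain the "why" behind whichever attributes pulled the score down.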
Operational Performance
For a typical S/W project, the engineering processes should be governed from the following perspectives:
Common Processes / Practices Across:
− SDLC (Req., Dsgn., CUT, QA)
− Requirements engineering and management practices
− Design practices
− Standards, Checklists, Guidelines, Re-usable Artifacts
− Validation and verification points, and practices
− Change Management
− Software Configuration Management (SCM)
− Common standards, structure, naming conventions, artifacts
− Multiple streams development - Generic configuration items convergence process
− Common process for build and release
− Common release location
− Common guidelines – release notes, FTP maintenance
− Definition of role, access, and privileges
− Process for “status accounting”
− Defect Management
− Knowledge Management
COMPOSITION
So, what we have covered so far is the WHAT and HOW of governance. Now, let's cover the "WHO" part.
As mentioned initially, the governance composition is dictated by the program's needs; there is no definite structure for it. The participating entities and their responsibilities / empowerment can easily be derived from the WORK and TASKS that we have covered so far.
As mentioned, it is the governing council which is going to govern; therefore, we need to define the ROLE, and the corresponding RESPONSIBILITIES and EMPOWERMENT, for each role.
Please note that responsibility without the needed empowerment will result in failure of the governance model. Therefore, the definition of "empowerment" is as important as that of responsibility.
Thursday, June 5, 2008