IA Forum Interview: Meegan Scott (part 2 of 3)

IA-Forum: What is your approach to the monitoring and evaluation of strategic plans?

Meegan Scott: Our role in evaluating strategic plans depends on whether we are adjunct internal or external consultants, and on who commissioned the evaluation. I mention adjunct internal because that's a service we provide where we are adjunct to a team and would be involved for just a few days. The evaluation would involve examining the context as well as the basis and logic of the strategy contained in the plan. It also involves comparing expected to actual results, identifying emergent strategies, recommending corrective actions, and developing recommendations for performance improvement.

How it actually plays out at Magate Wildhorse depends on whether we are doing the evaluation prior to execution or during implementation. If it's before execution, we focus on the content of the plan and look for the typical criteria: consistency, balance, consonance, feasibility, advantage, completeness, clarity, and agility. When we look at consistency, we are looking at strategic intent, the framework, and so forth. Depending on the strategy, the framework, and the context, we ask questions such as: Are the strategic intent and strategy framework consistent? To what extent was the value chain considered for improving products, service delivery, or both? Has the organization's strategy identity been clearly articulated, and is it relevant? Are there mission, vision, values, culture and desired-culture statements, and value propositions? Then we look at whether there are provisions and powerful messages for communicating the strategy identity. Does the plan provide for building human resource and leadership capacity in response to internal gaps or the desired future state of the organization?

We also look at what the planned activities and initiatives are and how solid they are: for example, activities and initiatives for retaining or growing the membership or customer base, or for capturing non-users. We also ask: Is there a timeline and process for strategy renewal and updates? What is the frequency of plan updates and version control, and how efficient was the plan development process? You have to learn what you could have done differently, what you have to get the clients to do differently, and what they could have done differently.
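As a rough illustration of that pre-execution review (not Magate Wildhorse's actual tooling), here is a minimal sketch of the checklist as code. The criterion names come from the interview; the 1-to-5 scoring scale and the function are assumptions added for the example:

```python
# Illustrative sketch only: a pre-execution plan review checklist built on
# the criteria named above. The criterion names come from the interview;
# the scoring scheme and everything else are assumptions.

PLAN_REVIEW_CRITERIA = [
    "consistency",    # do strategic intent and strategy framework agree?
    "balance",
    "consonance",     # fit with the external environment
    "feasibility",
    "advantage",
    "completeness",
    "clarity",
    "agility",
]

def review_plan(scores: dict) -> list:
    """Return the criteria scoring below 3 on a 1-to-5 scale (unrated = 0)."""
    return [c for c in PLAN_REVIEW_CRITERIA if scores.get(c, 0) < 3]

# Example: a plan that reads clearly but would be hard to execute.
gaps = review_plan({"clarity": 5, "consistency": 4, "feasibility": 2})
print("Needs attention:", gaps)
```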

Organizational history and assumptions review and sharing are important. You have to examine when and why the organization started, how it has changed, what has driven that change, and what you are looking for in the future. Sometimes this is for organizational learning. We ask clients to answer questions related to their history and the processes they can do away with. We also look for mechanisms for ongoing surveillance, as well as balance in accordance with the strategic framework. We also look at how well a plan addresses risk, assumptions, and contingencies, if it addresses them at all. We also examine timelines and level of effort, performance standards, quality standards, performance indicators, and a mix of lead and SMARTER indicators (Specific, Measurable, Attainable, Realistic, Time-bound, Extending, and Rewarding). We also look at work plans and budgets. During analysis of the processes, we would use methods such as customer satisfaction assessments. We also look at time sheets, schedules, and journals for assessing time and process. We also keep a reflexive journal; for every strategic planning exercise, I start a completely new journal.

Monitoring, which falls late in the progression of the strategy process, relates to the evaluation and control function. Again, the approach we'll take depends on our role in all of this. We tend to use a blended approach where we focus on utilization and empowerment of the client. We are proactive about our emphasis on learning from the monitoring and performance measurement process: for corrective actions to avoid disaster, and for learning and improvement in risk management and mitigation.

Those are the approaches; now let's discuss the steps. If we led the planning, the monitoring readiness assessment would not be an exhaustive process, because we would have done that in the prior organizational assessment. This would be more about confirming that they're ready to take on the challenge and preparing them to use monitoring, tracking, and reporting tools. The next thing we would focus on is identifying the best tools for them. We look at the type, the level, and the quality of the organization's monitoring and reporting experience.

We play the role of evangelist for performance management and measurement. That helps us, as well as the organization, as it makes planning easier. We ensure everyone understands the role of monitoring and evaluation and its importance in demonstrating and ensuring accountability. Getting stakeholder buy-in, and ensuring relevance so we can validate and make a judgment about effectiveness, is important. It is always interesting when a group walks up and says, "We do well because we did this and that." But when you begin to speak with founders, management, and staff, you find that what they were taking for doing well is a different story. Or maybe what they were beating themselves up over was not all that bad.

We also emphasize the importance of monitoring for providing information that may lead to winning more funding, getting messages for marketing, and driving innovation. There's also the possibility of resource mobilization through monitoring, as you may gather evidence that you're an attractive partner for another entity. So you can collaborate and share funds, as well as share in strategy success.

Our plans generally support performance management frameworks and systems, which include key information for monitoring implementation: the inputs, the outputs, the outcomes, the agreed-to impacts, and key performance indicators, both for funders and for the organization's accountability needs. This also helps the organization get a handle on its own results and strengthen its overall strategy. At times we don't have baseline data, especially with new clients, or for a particular program they're just rolling out. If the organization doesn't have baseline data, we might be able to gather data from other sources, such as government department statistics, to establish a baseline. If not, they know that the monitoring and performance data gathered further in the process will help to provide the baseline. We also include cost information and look at the frequency of collection, responsibility, quality measures for indicators, and so on.
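To make that concrete, here is a minimal sketch of how a single indicator entry in such a monitoring plan might be recorded. The field names (level, baseline, frequency, responsibility, data source) follow the interview; the class itself and the example values are hypothetical, not Magate Wildhorse's actual format:

```python
# Illustrative sketch only: one indicator record in a monitoring plan.
# Field names follow the interview; the structure and values are assumed.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Indicator:
    name: str                  # e.g. "Active membership count"
    level: str                 # input / output / outcome / impact
    baseline: Optional[float]  # None when no baseline data exists yet
    target: float
    frequency: str             # how often data is collected
    responsible: str           # who collects and reports it
    source: str                # where the data comes from

    def needs_baseline(self) -> bool:
        # With no baseline, early monitoring cycles will provide one.
        return self.baseline is None

membership = Indicator("Active membership count", "outcome",
                       baseline=None, target=500.0, frequency="monthly",
                       responsible="Program officer",
                       source="membership register")
print(membership.needs_baseline())  # True: early data will set the baseline
```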

We put all of this in the plan document. We include each line item of the plan, allowing for expansion of those items, to ensure that the strategy is broken out properly and supported by indicators and measures that carry the strategic plan into the operations plan. Using that document, we can also adjust it to create a performance management report, and use the same document to create the monthly report. In it, you have to identify cumulative information coming from day-to-day activity processes that is suitable to be captured at a strategic level. We generally end up with something that can be adjusted easily for monthly, quarterly, and annual reports. Each quarterly report displays data accumulated from the preceding quarters, and the final quarter's report is adjusted to form the annual report.
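As a small worked example of that cumulative roll-up (with hypothetical numbers, not client data), here is how monthly figures might accumulate into quarterly report lines, with the final quarter's cumulative line feeding the annual report:

```python
# Illustrative sketch only: rolling monthly actuals up into cumulative
# quarterly report lines. All numbers are hypothetical.

monthly_actuals = {  # month number -> value achieved that month
    1: 40, 2: 35, 3: 50, 4: 45, 5: 55, 6: 60,
    7: 30, 8: 42, 9: 48, 10: 52, 11: 38, 12: 44,
}

def quarterly_cumulative(actuals: dict) -> list:
    """Cumulative year-to-date total at the end of each quarter."""
    totals, running = [], 0
    for month in range(1, 13):
        running += actuals.get(month, 0)
        if month % 3 == 0:  # quarter boundary
            totals.append(running)
    return totals

lines = quarterly_cumulative(monthly_actuals)
print(lines)      # [125, 285, 405, 539]: each quarter carries the prior ones
print(lines[-1])  # 539: the Q4 cumulative line becomes the annual figure
```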

Overall, our monitoring approach involves establishing the importance of monitoring and reporting for organizational growth and accountability. We look at information needs in terms of assessments, agreed-to indicators, measures, and outcomes to monitor. We provide training in the use of performance monitoring and measurement frameworks and tools. We also establish teams and assign roles for performance management and measurement, including roles for the board and for the corporate strategy plan.

We also ensure that the annual report, the AGM, financial reporting, and so on are on the calendar. At this point, we have identified internal champions, and if we are adjunct external or internal players, then we serve as the lead champion, hosting the performance measurement meetings or leading them, with, of course, the leadership of the organization. This allows us to guide the floor and facilitate meetings.

We also ensure there's a performance measurement calendar and visuals that support the PDCA (plan-do-check-act) framework for continuous improvement, and results-based management or other blends. The visuals comprise two separate triangles. One triangle has expected outcomes and impacts at the level of the organization; this can also include the societal level if the work is going to have a societal impact. In the middle are the program, initiative, service, and product indicators related to outcomes, along with output, efficiency, and cost-effectiveness indicators. At the base of the triangle are the internal and individual measures for individual performance, coming from input and output processes. To tie those together, you stretch compliance and quality improvement measures up and down the sides, so everyone can see they have to be included.

The other triangle includes objectives, strategies, and levels of effort, with responsibilities, targets, and a timeline running down it. This is a PDCA visual to remind them to communicate findings, act on them, and improve. We then combine that with the surveillance we've performed through reviewing reports, documents, and other sources. Here we're looking for consistency and truth in what is reported. We'll perform surveys, interviews, sampling, and meetings to check the validity and reliability of information, as well as to identify any problems. Because the team is busy performing management duties, they don't have time to take on those activities. We do that, tie everything together, and then develop a total analysis with recommendations.

Out of that process, we end up identifying the emergent strategies. We analyze problems and deviations from the planned theories and activities, even deviations from how the program logic said things would work, to suggest or draw out corrective measures and follow-ups. We identify gaps that may have been overlooked and new gaps that arise out of the implementation process, and we gain insights for updating the strategic plan.

Sometimes you may set up a business process improvement team or something similar to handle an improvement, make a change, or push through an initiative if you see it's falling behind. It could be a business process improvement, but it could also be something for resourcing a planned activity that's not happening for some reason. We also check whether the annual survey was done and, if it is an organization implementing donor-funded or government-funded projects, whether the external evaluation was done in accordance with its schedule.

Last, to ensure utility, we package the information and communicate the findings so that the potential impacts, the feedback and information for driving improvement, any decisions, and the risk factors are communicated. We facilitate moments for reflection and look at what worked, what went well, what didn't go well, and why. Also, what could we have done better?

IA-Forum:  What about the post-execution evaluation process?

Meegan Scott: The task at hand post-implementation is to make a judgment about the strength of the organization at a milestone review period (whenever we're asked to do the evaluation). It could be a mid-term review of a plan or the end of a planning period. We ask whether the organization is stronger at the end of that milestone or planning period than it was when the plan was created and execution started. Was the strategy executed successfully? This is an attempt to assess the effectiveness of a plan in guiding the organization towards improved performance. We look at that in terms of effectiveness, efficiency, relevance, financial viability, and cost-effectiveness, and for some types of entities, we would go deeper into quality aspects.

For that type of evaluation, we develop along the lines of the plan's own framework, looking at how the plan helped in adjusting to changes in the environment. These include political factors, social factors, competitors, inflation, interest rates, and even ecological legislation. Sometimes we'll find that entities do not know all the governing legislation affecting them, so we normally place a table at the front of a plan listing the governing legislation.

We also look at the plan logic and the premises and predictions. Where did those work according to plan? Were they accurate, to what extent, and what needs to be adjusted? We look at whether or not the plan helped to improve motivation, support the desired culture, and advance the mission of the organization. We also look at its impact on the organization and its history; did it help to create new history? Did the plan deliver what it was meant to do? Did it help to strengthen and improve financial management, partnerships, program management, leadership at different levels, and HR capacity to support both the present and the desired future? Analyzing HR capacity is partly to help management retain tacit knowledge in the organization, rather than simply waiting to hire new staff.

Typically, the evaluation takes the form of a self-assessment. Even if it was requested as an evaluation by a donor, we try to make it a self-assessment so that the client can benefit from owning and growing that culture of performance measurement and improvement. The client also benefits from receiving information for decision-making related to their strategic choices, for strategy updates, and for reformulating the next plan and milestone period, versus doing it only from an accountability perspective. We also do it for performance information, accountability requirements, and to guide resource allocation. Moreover, we look at the infrastructure for delivering the strategy.

If the organization is implementing multiple programs, as would be the case for a government department or an NGO, the approach would be heavily influenced by the terms of reference that come with a call for proposals. If, instead, the organization itself came up with the idea and asked for a proposal, then we are left with greater leverage in designing what we are going to do.

Our approach includes a blend of evaluation approaches. The blend depends on the competence of the organization in conducting needs and performance assessments, on the information needs outlined in, or that we glean from, the call for the evaluation, and on the intended users. It would involve components of the utilization-focused evaluation approach, including consultations to ensure that the information collected will be of benefit and is what the organization and its stakeholders want. We may bring in a theory-based evaluation approach for assessing the program logic, particular problems, effectiveness, and context. We would want to look at the theory of change, how it's holding up, the participants and their attitudes, and how their participation affects the outcomes for them. We could also use a more all-inclusive strategic evaluation approach: a strategic evaluation of the outcomes and the impact on the target population. We'd look at the results and service levels, as well as whatever they are creating and selling or giving away. You examine the outcomes in terms of their relevance and effectiveness, including the product or service, how efficient they were in producing the outputs, how cost-effective it was to deliver the solutions, and the quality of the outputs delivered. You also want to analyze internal management and leadership as they relate to the output processes involved and to developing them.

So you perform an individual-level assessment and review of measures. This is another challenge: the moment you begin to ask for job descriptions and such, expect a large information flow. At the plan level, we review measures, indicators, strategy identity, et cetera. We also lead a management response session for discussing the findings, the recommendations, and the judgment. From that, you'll get feedback on how they feel about the judgment and the findings. This may yield some insight about the context, and maybe some adjustments. You will also draw out of that process actions for improvement, and try to get some calendar and resource commitments towards them.

A review of external literature and internal, organizational literature is part of the process. External literature covers the external environment. For internal literature, we examine their reporting and planning documents, the operations plan, the corporate strategy plan, performance reports, and minutes from board meetings. Other areas for analysis could include surveys, interviews, and consultations. If we have to calculate data and develop estimates, we do social media searches, and sometimes we even have to look at lab research.

The output would typically be an organizational assessment and development report. This would include proposed strategic options and choices to inform the development of the next plan, a strategy update, or the plan update for the next planning period.

Let's turn to the kinds of questions we would ask, though they are not necessarily asked in exactly this form. What were the goals and objectives in the plan? How did the organization perform against the strategic intent stated in the plan and its related goals? We'd also ask how effectively the organization used the plan to manage the delivery of its results; that is, the priorities, the focus areas, the approaches, and accountability. Another question is: how easy or difficult did the plan make the performance management and measurement process? Because if it is just a summary of a plan, not fully documented with measurable indicators, that's trouble for execution.

An important question to ask is: does the plan include an alignment mechanism for cascading and aligning? We look at whether the major initiatives and commitments were delivered on time and on budget. If there were deviations, how widespread were they, and what needs to be changed? Whether they finished ahead of schedule, late, or on time, we want to understand the reasons. We also look at the overall workings of the plan logic, based on the theories of change, the strategy maps and strategy framework, the input-output map, or any combination of them. We also look at whether or not the scope of operations is made clear by the plan, and whether the initiatives are suitable for building capacity and advancing the strategic direction articulated in the plan.

So we'll use multiple sources of qualitative and quantitative evidence. We look at the use of the plan and its process, and at the annual and periodic reviews and strategy updates. Are they following that guide, and are they updating the plan? We ask if a priority trade-off happened and, if so, why. We also look at the effectiveness of the plan in communicating to the board, to management, to partners, to funders, and to staff. Do they understand the plan, or do they find it a burdensome document? Does it address the value chain: how you're going to build product relationships and leverage them in terms of the products, the services, or the supply chain? Moreover, we analyze how effective it is in articulating the strategic identity. So that's how we do the post-implementation evaluation.

Meegan Scott is managing director, founder, and lead strategic management consultant at Magate Wildhorse Ltd. Meegan has almost two decades of experience in strategic management, serving in both local and international development. Her practice terrain lies at the convergence of corporate strategy, strategy execution, social research, program management, change management, marketing, and performance management and measurement.

Prior to establishing Magate Wildhorse, Meegan wore the twin hats of Incubator Marketing Manager and strategic management consultant to client entities of the incubator. She also served as Corporate Strategy Planner to the National Environment and Planning Agency in Jamaica.

Meegan holds a Bachelor of Social Science in International Relations, an MBA (marketing- and finance-focused), the PMP designation, and a Postgraduate Diploma in Business Analysis.
