Preconference Workshops

The SAMEA 6th Biennial Pre-Conference Capacity Building Workshop Series consists of 21 different workshops across a broad range of topics and of varying durations. One- and two-day workshops are available.

The workshop series will precede the 6th Biennial SAMEA Conference with the theme “Purpose-driven Monitoring and Evaluation” and feature workshops by distinguished M&E professionals.

The costs of the workshops are as follows:

Two-day workshops for SAMEA members: R 3 800.00
Two-day workshops for SAMEA non-members: R 4 200.00
One-day workshops for SAMEA members: R 1 950.00
One-day workshops for SAMEA non-members: R 2 200.00

The following is a list of conference workshops in order of occurrence and duration.

Presenter: Sara Vaca
Level: Intermediate-Advanced
Dates: 23 – 24 Oct (2 days)
This two-day workshop on Data Visualization for Monitoring and Evaluation will give participants a wide overview of top-notch applications of data visualization to M&E. After first covering the concept and principles of data visualization (quantitative and, mostly, qualitative), the facilitator will share numerous examples of how to convey the typical information found in M&E reports in a visually attractive way. Participants will be encouraged and walked through producing ideas for visuals, constructively criticizing data visualization examples and awakening their Visual Thinking (they will also be encouraged to share reports and examples that will eventually be used as case studies or real-life examples during the workshop).

Presenter: Dr Jos Vaessen
Level: Intermediate-Advanced
Dates: 23 – 24 Oct (2 days)
Interventions are theories and evaluation is the test. This well-known adage reflects an influential school of thought and practice in evaluation, often called theory-driven or theory-based evaluation. Although it has been around for more than four decades, theory-based evaluation has received new impetus over the last decade and has become part and parcel of the toolkit of program evaluators across the globe. The past decade has also seen a dramatic increase in impact evaluation debates and practices. While theory-based evaluation has often been cast as an alternative to quantitative counterfactual-based impact evaluation, in practice the two can reinforce each other. At the same time, the scope for applying different expressions of theory-based evaluation is much broader than impact evaluation alone. The workshop will address the following main themes:

  1. What is theory-based evaluation and why is it important?
  2. What are useful principles for reconstructing a program theory?
  3. How can we apply theory-based evaluation in practice?

Learning outcome:

  • After this course, participants will have developed an initial (but sound) understanding of the role of theory in evaluation and of how to apply theory-based evaluation in practice.

Learning modalities:

  • Short interactive lectures
  • Group exercise and presentations on the basis of an empirical case

Presenters: Michael A. Harnar & Dr. Juna Z. Snow
Level: Beginner
Dates: 23 – 24 Oct (2 Days)
From observations of programs in situ to describing impact in greater detail, qualitative data serve complex and critical needs in evaluation practice. This workshop briefly contextualizes the epistemological foundation for qualitative data before introducing participants, through hands-on project work, to a variety of qualitative data collection, analysis, and reporting methods. After two busy days of working on individual and group projects, putting various collection, analysis, and data management tools to use, you will leave this workshop more proficient in applying qualitative methods in your evaluation projects.

Presenter: Freer Gordon
Level: Intermediate-Advanced
Dates: 23 Oct
The term Theory of Change has penetrated the monitoring and evaluation environment. As an M&E “buzz word”, it is regularly invoked in calls for assistance with the design and implementation of monitoring and evaluation processes, and in the initial design of programmes, often with little understanding of its use or of its relevance and connection to other programme tools such as the logframe. Despite the widespread use of theory of change in monitoring and evaluation, there is limited conceptual clarity on key concepts, few operational guidelines, and even fewer guidelines on how to make the most of theories of change in relation to managing and evaluating programmes. The workshop speaks directly to these topics.

The primary audience of this workshop is programme managers and programme directors who have some exposure to monitoring and evaluation but limited exposure to the monitoring and evaluation thinking behind the design and use of a TOC within a programme. This course will target intermediate M&E users; however, beginners who have an interest in learning more about TOC will be welcomed and accommodated.

Presenters: Alyna Wyatt & Dhashni Naidoo
Level: Basic-Intermediate
Dates: 24 Oct
While evaluators may appreciate the challenges and benefits of participatory methods, they may not be familiar with appropriate tools to use with different groups of people, or with how to add participatory methods to more traditional evaluation designs. Many evaluators find their toolkit of methods inadequate or ineffective for gathering accurate, reliable, valid and usable data from programmes’ constituents. Attributes such as low literacy or numeracy skills, gender, socio-economic status, cultural identity, or HIV or immigrant status tend to silence some people and leave them out of typical evaluation processes. Further, terms of reference often do not provide sufficient flexibility to allow an evaluation to be entirely empowering or democratic – yet we still want to make sure that all perspectives are included in our evaluative processes.

After a brief conceptual, principles-based grounding, participants will practice a number of methods derived from participatory rural appraisal and the more inclusive participatory learning and action family of methodologies that can engage and empower programme participants in ways that offer alternative methods of communicating and, in many ways, enhance the quality of the information gathered. Adaptable to many contexts, these tools can be used across the spectrum of M&E activities, from needs assessment and programme monitoring and evaluation to interpreting and communicating results with stakeholders.

Presenters: Kirsten Mulcahy & Amreen Choda
Level: Basic-Intermediate
Dates: 23 Oct
This interactive workshop will seek to inform participants about how to design and implement an evaluative monitoring and results measurement (MRM) system. An evaluative MRM system supports implementers to prove and improve programmatic results. This workshop will provide participants with:

  • An understanding of international good practice in implementing MRM systems to support evidence-based decision-making and improved learning outcomes, referencing the Donor Committee for Enterprise Development (DCED) Standard for Results Measurement.
  • Practical experience in setting up a functional MRM system – this will include creating results chains, developing indicators, establishing a measurement plan, and introducing project course correction based on evidence gathered.
  • Discussions around different implementation experiences referencing MRM processes, M&E frameworks and monitoring cultures from past project experience (spanning the youth economic opportunities, financial inclusion, and agricultural sectors across Africa and South Africa specifically) – this will include discussions around informal channels for monitoring, the importance of instilling a ‘monitoring culture’, and common challenges and solutions to implementing MRM systems in complex environments.

Presenter: Tara Polzer Ngwato
Level: Intermediate-Advanced
Dates: 23 Oct
Do you commission, design or implement impact evaluations as part of your job, or do you intend to do so in the future?
Have you ever decided not to commission or conduct an impact evaluation for an intervention because you thought it would be methodologically impossible or too expensive to find a control group?
Are you a researcher with experience of designing surveys who would like to know more about impact evaluations?

This intermediate to advanced one-day workshop is aimed at M&E practitioners and M&E managers/commissioners who already have a basic understanding of impact evaluations. The workshop is based on several practical examples of impact evaluations, including those designed before the intervention and those designed retrospectively, i.e. after the intervention commenced or was completed. The focus is on the practical methodological challenges and options for designing and implementing data collection for comparable case and control groups to assess intervention impact.
Learning outcomes: participants will increase their understanding of:

  • The logic of case and control group design (pre- and post-intervention scenarios)
  • The context factors which impact on the practicality (sample reachability and reliability, cost effectiveness, etc.) of different sample designs and data collection in the (South) African context
  • Tricks and checklists for reliable, practical and affordable comparative sample design in context.

The workshop will combine the discussion of impact evaluation sample design theory with practical case studies. It will include small group exercises to design comparative samples and practical data collection methodologies based on various scenarios. The proposed designs will then be presented and discussed to solidify learning.

Presenter: Dr Jan Schenk
Level: Intermediate
Dates: 23 Oct
This one-day workshop will provide a solid introduction to mobile data collection systems based on Open Data Kit (ODK). ODK is an open-source industry standard for the rapid development and deployment of offline mobile forms for data collection of all sorts (e.g. surveys, audits, stocktaking). For the training we will be using a commercial derivative of ODK called SurveyCTO, but the lessons learnt will be applicable to other ODK-based solutions.
The aim of the workshop is to highlight the advantages of mobile data collection vs paper-based systems in terms of automation, quality control, data management and reporting. It will cover the following topics:

  • Overview of the most common mobile data management tools
  • Introduction to XLSForm, a standard for mobile form authoring
  • Semi-automated quality control using Google Sheets and Zapier
  • Exporting data for analysis (e.g. Stata, SPSS)
  • Automated data visualization and reporting using maps and graphs
  • Examples of mobile data collection in M&E

The course will be interactive and hands-on. Participants will need to bring their laptops to participate fully in the activities. Before the workshop, participants should set up their own trial accounts with SurveyCTO and Zapier and have access to Google Drive. The facilitator will provide tablets for data collection and demonstration purposes.
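
As a taster of the XLSForm standard mentioned above, the sketch below shows one way to author a tiny form programmatically: an XLSForm is simply a spreadsheet with 'survey' and 'choices' (and optionally 'settings') sheets. This example is illustrative only and is not part of the workshop materials; the question names, form title and file name are hypothetical, and Python with pandas is used purely for convenience (the same form could be typed directly into Excel or Google Sheets).

    # Illustrative sketch: building a minimal XLSForm with pandas (hypothetical content).
    import pandas as pd

    # 'survey' sheet: one row per question (type, name, label are standard XLSForm columns).
    survey = pd.DataFrame([
        {"type": "text", "name": "respondent_name", "label": "What is your name?"},
        {"type": "integer", "name": "household_size", "label": "How many people live in this household?"},
        {"type": "select_one yesno", "name": "attended_training", "label": "Did you attend the training?"},
    ])

    # 'choices' sheet: answer options for select questions.
    choices = pd.DataFrame([
        {"list_name": "yesno", "name": "yes", "label": "Yes"},
        {"list_name": "yesno", "name": "no", "label": "No"},
    ])

    # Optional 'settings' sheet: form title and ID.
    settings = pd.DataFrame([{"form_title": "Training follow-up", "form_id": "training_followup"}])

    # Write the three sheets to a single .xlsx file, ready for upload to an ODK-based server.
    with pd.ExcelWriter("training_followup.xlsx") as writer:
        survey.to_excel(writer, sheet_name="survey", index=False)
        choices.to_excel(writer, sheet_name="choices", index=False)
        settings.to_excel(writer, sheet_name="settings", index=False)

Once uploaded to SurveyCTO (or another ODK-based server), such a form can be deployed to offline mobile devices for data collection.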

Presenters: Victor Kiwujja and Ronald Waiswa
Level: Intermediate
Dates: 23 Oct
This workshop has been specially designed to provide you with a thorough understanding of the data management process and a pragmatic, step-by-step process for conducting data analysis using advanced functions of Microsoft Excel. While the programme is centred on the above, it will also impart insights on the rationale for data visualization in impact-oriented M&E systems.

The training will cover two main data analysis and visualization strategies:

  • Data analysis and interpretation using PivotTables and PivotCharts: conducting bivariate analysis, cross-tabulations and multivariable analysis; Pivot graphs and Pivot Slicers; creating PivotChart reports from scratch; and developing strategic decisions and policies from M&E data.
  • Linking and automating data analysis tables and graphs: an introduction to linking sheets; using the COUNT, COUNTIF and COUNTIFS functions; using SUMIF and SUMIFS, and AVERAGEIF and AVERAGEIFS; combining various formulae in Excel; and creating automated visuals.

Delegates will need to have attended or be familiar with the basics of Microsoft Excel.

The facilitators, Mr Ronald Waiswa and Mr Victor Kiwujja, have over 15 years of experience in monitoring, evaluation, research, policy development, and capacity assessment and development, among other areas. They have empowered over 5000 individuals through routine capacity-building sessions focusing on data management, analysis, visualization and statistical modelling using Excel and other statistical software. Recently, Lida Africa held an executive workshop on data analysis and management at the 8th AfrEA International Conference, attended by over 65 participants from 23 countries.
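
For readers unfamiliar with these Excel functions, the sketch below shows the same ideas (conditional counting and summing as with COUNTIFS/SUMIFS, and a cross-tabulation as in a PivotTable) expressed in Python with pandas. It is an illustrative analogue only, not workshop material, and the dataset and column names are hypothetical.

    # Illustrative analogue of COUNTIFS, SUMIFS and a PivotTable, using pandas (hypothetical data).
    import pandas as pd

    df = pd.DataFrame({
        "district":     ["North", "North", "South", "South", "South"],
        "sex":          ["F", "M", "F", "F", "M"],
        "enrolled":     [1, 0, 1, 1, 1],
        "grant_amount": [500, 0, 750, 600, 300],
    })

    # COUNTIFS equivalent: number of enrolled participants per district.
    enrolled_counts = df[df["enrolled"] == 1].groupby("district").size()

    # SUMIFS equivalent: total grant amount for enrolled participants per district.
    grant_totals = df[df["enrolled"] == 1].groupby("district")["grant_amount"].sum()

    # PivotTable equivalent: cross-tabulate district by sex, summing grant amounts.
    pivot = pd.pivot_table(df, values="grant_amount", index="district",
                           columns="sex", aggfunc="sum", fill_value=0)

    print(enrolled_counts, grant_totals, pivot, sep="\n\n")
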
Presenter: Sahar Iqbal Mohy-Ud-Din
Level: Beginner-Intermediate
Dates: 23 Oct
A Systems Thinking Approach to Results-Based Monitoring and Evaluation for Social Development Projects introduces participants to systems thinking for social development, to tackle complex challenges in a fast-changing world. It unlocks a new plane of understanding complex problems and provides a platform for designing solutions using results-based management. RADAF has designed this training method to encourage multi-level analysis through various tools, assessments and activities that link systems thinking to results-based management for the monitoring and evaluation of social development interventions. Participants will learn a set of tools that enable them, amongst others, to: (1) graphically depict their understanding of a complex social system’s behaviour, incorporating a needs assessment; (2) identify and design high-leverage interventions to address the root causes of complex challenges; (3) map relationships of strategic value in the delivery of outcomes in an intervention; and (4) design strategies for intended and unintended consequences through feedback loops.

The key learning outcomes of the training are:

  • Understanding the structure of a system and how this structure defines the behaviour of the system at the design level of an intervention;
  • Conducting systems mapping of relationships
  • Understanding how a system’s feedback loops can inform the theory of change and the logic model of an intervention
  • Conducting monitoring and evaluation within a systems framework
  • Identifying leverage points in the system to effect change in the logframe
  • Developing key indicators, outputs, outcomes for each type of evaluation used in the system

Presenters: Dr Madri Van Rensberg and Fabiola Amariles
Level: Intermediate
Dates: 23 Oct
Gender equality and social equity are central to ensuring the realization of sustainable and equitable development. The Sustainable Development Goals (SDGs) emphasize “no one left behind” and urge a “localized” response in regional and national development goals and strategies. Gender equality and social equity are expected to be among the key strategies and outcomes mainstreamed in global and national development strategies. Gender may be present or absent in evaluations in different ways, and it is possible to distinguish between gender-blind, gender-instrumental, gender-ameliorative and gender-transformative evaluations.
Participants in this workshop will learn how to conduct evaluations that are gender responsive and be able to ensure that routinely collected data is better used as evidence to advocate for transformations in the society that lead to social justice. Through the use of mixed methods in evaluation, participants will be able to assess how far a project/programme has contributed to changing power relations within institutions based on gender and other identities. Issues of power are examined in the field, within the evaluation team, between the evaluation team and the implementing agency and between the implementing agency and the donor.
Evaluators and M&E practitioners, as well as those who make decisions and use evaluation information (e.g. M&E managers, programme managers and commissioners of evaluations), from all sectors including government and civil society, will benefit from this workshop.

Presenters: Sophia Van Rensburg and Susannah Clarke
Level: Beginner
Dates: 23 Oct
Data visualisation is a tool to transform information that is buried in complex reports and statistics into innovative visual products that communicate concepts in a clear and actionable manner. This workshop introduces participants to techniques to improve standard data visualisation (diagrams and charts) and to create innovative visual products that can facilitate the communication of complex stories and ideas. The focus is on developing conceptual skills and critical thinking through a participatory methodology. Participants will have the opportunity to engage in practical learning activities to explore creative data visualisation. This workshop is designed for individuals involved in presenting data, including researchers, M&E practitioners and programme officers. Participants should be competent in Microsoft Office, including Excel. At the end of the workshop, participants will be able to understand the power of creative thinking and strategic data visualisation; maximise the visual impact of standard graphs and diagrams; and create simple infographics.

Presenter: Mike Leslie
Level: Beginner
Dates: 23 Oct
The workshop is aimed at Parliamentarians, legislators, councillors, legislative researchers and their support staff. The one-day short course will deliver a foundational understanding of the processes and outputs associated with public sector evaluations and how they can be utilised in legislatures. Through a series of interactive presentations, a practical exercise and group work, participants will learn how the multiple evaluation-related outputs can be of use to them in fulfilling their oversight role and ensuring a more efficient, effective and accountable public service. Drawing on the respective experiences of the facilitator and the participants, the workshop will explore opportunities to utilise evaluations based on their actual work in legislatures.

Presenter: Sandi Premakanthan
Level: Beginner
Dates: 24 Oct
The objective of this workshop is to develop a simplified, cost-effective Purpose-Driven Measurement, Monitoring and Evaluation (PDMM&E) system. A system that provides continuous performance results evidence for informed management decision-making and resource allocation has a greater probability of institutionalization and sustainability.

The standardized approach to building logic models is a unique methodology and serves as a powerful planning tool. In the long term it will also facilitate the creation of a performance-results-management-based evaluation culture and value system within an organization. A standardized approach to building logic models results in the identification of key-results activities, both enabling and core activities undertaken by the programs and services, that contribute to the achievement of overall purpose-driven outcomes (results and benefits to stakeholders).

The standardized performance results evidence chain logic model and its associated suite of performance measures/metrics/indicators are illustrations of both service delivery and community development theories of action. The theories of action or change, and the standardized approach to the development of logic models, facilitate the identification of a limited number of standard outputs (products and services) associated with the key-results activities, three levels of outcomes or intended impacts (immediate, intermediate and final or ultimate), and a standardized suite of performance indicators and metrics that form the results evidence chain.
This workshop is aimed at, and will benefit, all practitioners: novice and experienced evaluators, academics, program/project managers and staff engaged in planning, developing and implementing purpose-driven, performance-results-based strategic and operational management control systems. Several examples, exercises and case studies will be used to illustrate and support the main learning objective of the workshop: how to develop logic models using a standardized logic model approach and a performance measurement strategy for outcome management.
Workshop outcome: The workshop will build knowledge and enhance participants’ understanding of, and skills in practising, the technique of standardized logic modelling to develop a simplified, cost-effective Purpose-Driven Measurement, Monitoring and Evaluation (PDMM&E) system.

Presenter: Mr. Muserero K. Joseph
Level: Beginner-Intermediate
Dates: 24 Oct
Decentralisation has been embraced by many stakeholders and viewed as a suitable mechanism for addressing welfare and political challenges, in addition to improving the efficiency of public service delivery. However, to realise the objectives of decentralisation and good governance, there must be checks and balances by local communities, who are the ultimate beneficiaries of such reforms. Empowering local communities to hold both technical and political leaders accountable has therefore taken centre stage.

The questions stakeholders ask are: who should create the enabling environment? Who should strengthen the capacities of local communities to engage in monitoring those at the helm of service delivery? And what approaches work best? To answer these questions, several approaches and best practices have been brought on board; however, each has its strengths and weaknesses in different political and socio-economic contexts.

This workshop therefore draws on the Ugandan Government’s country-wide experience of implementing the Baraza programme, which aims to establish a public information-sharing mechanism that provides citizens with a platform to influence Government development programmes by institutionalizing downward accountability. The Baraza tool creates an independent citizens’ monitoring function that enhances Central Government’s responsiveness to citizens’ development demands and public service delivery concerns.

The primary audience of this workshop is implementers of projects and programmes, policy makers, and M&E practitioners in the public and private sectors, especially those involved in service delivery and advocacy for socio-economic rights but with limited or no exposure to community engagement in performance monitoring. The course targets all players in decentralized service delivery, monitoring and evaluation, community advocacy for socio-economic enhancement, multi-stakeholder groups engaged in infrastructure monitoring, and gender mainstreaming for good governance.

Presenters: Jacqueline Moodley & Lauren Stuart
Level: Beginner
Dates: 24 Oct
The purpose of the workshop is to transfer knowledge of the theory of results-based monitoring and evaluation (M&E). It also provides practical guidelines on how to develop M&E systems for social development initiatives funded by Corporate Social Investment (CSI) programmes.
The workshop will entail an overview of M&E, proposed log frameworks for social development and education initiatives, and an in-depth description of data collection and sampling techniques. Practical sessions will demonstrate the challenges experienced when conducting M&E research, as well as the budgets associated with the exercise. This is a one-day workshop that will be presented at the beginner level for evaluators.

Presenter: Aru Rasappan
Level: Basic-Intermediate
Dates: 24 Oct
The IRBM system was first introduced in 1999 and is a variant of the original Results-Based Management system. It has since been introduced to many countries, and the Malaysian government has been one hundred percent IRBM compliant since 2009.

The IRBM system is suitable for the public sector as well as the private sector and NGOs. It has also been adopted in the private banking and financial sector, including Islamic banking and financial standards agencies. It can be applied to improving sustainable development results for an individual organization or for whole-of-government, and it is also very well suited to donor or funding bodies that wish to better manage their development programs and projects.

This workshop is suitable for officials from the public sector. It is especially useful for the following persons:
a. Public sector managers
b. Policy makers
c. Program managers
d. Budget Directors/managers
e. Performance managers
f. Senior management
g. Policy and program managers
h. Performance evaluators
i. Funding agency managers

Presenter: Antonio Hercules
Level: Intermediate
Dates: 24 Oct
This workshop responds to the needs of (i) officials who procure consultants and manage the associated evaluations, and who would like to deepen their understanding of key elements of the evaluation project management exercise, and (ii) consultants who would like to offer greater value to public managers in the execution of evaluations that have been awarded.

Using participatory workshop methods intensively, government officials and consultants will learn how to assess/prepare consultant proposals, and how to assess/prepare key deliverables and related evaluation resources: terms of reference, inception report, theory of change, logframe, literature review, evaluation matrix, data-collection instruments, stakeholder management and, especially, draft reports.

By the end of the workshop, participants will have gained knowledge of: the importance of a context-appropriate evaluation (purpose) from both perspectives; improved awareness of the needs and service offerings of government officials and consultants; a deeper and improved understanding of deliverable management and production; and how to improve the match of needs and delivery in the evaluation space in the public sector in South Africa and on the continent.

Presenter: Jennifer Bisgard
Level: Beginner
Dates: 24 Oct
This interactive workshop will give participants access to information related to branding, pricing, pursuing tenders, and issues related to running a small business. It is open to all levels of evaluation expertise but is recommended for those who are interested in starting their own business. The 40 participants in the AfrEA 2017 version of this workshop indicated that it was enjoyable, relevant and participative. Jennifer Bisgard, founder and Director of Khulisa Management Services (a South African evaluation company founded in 1993), founding SAMEA Chair, and former AfrEA Board member, will facilitate this workshop.

Presenter: Michael A. Harnar
Level: Beginner
Dates: 24 Oct
From observations of programs in situ to describing impact in greater detail, qualitative data serve complex and critical needs in evaluation practice. This workshop briefly contextualizes the epistemological foundation for qualitative data before introducing participants, through hands-on project work, to a variety of qualitative data collection, analysis, and reporting methods. After a busy day of working on individual and group projects, putting various collection, analysis, and data management tools to use, you will leave this workshop more proficient in applying qualitative methods in your evaluation projects.

Presenter: Benita Williams
Level: For VOPE Leaders Only
Dates: 24 Oct
The VOPE pre-conference exchange workshop is open to VOPE leaders and VOPE administrators in the Southern African Region. The VOPE day has two main objectives:

  • Capacity building. The workshop will draw on the significant experience of various VOPE leaders to build the capacity of fledgling and emerging VOPE networks in the region.
  • Strengthening regional VOPE collaboration. VOPE leaders from the region will come together to work on a memorandum of understanding that will spell out how the VOPEs can facilitate exchange, maximise resources, reduce duplication of effort and ultimately promote the supply of and demand for good-quality monitoring and evaluation in the region.

The workshop will include:

  • A discussion on solutions to typical VOPE difficulties
  • An exploration of the VOPE toolkit – a web resource compiled by the IOCE to promote active and well governed VOPEs
  • A discussion on how finances, memberships and international linkages can be used to foster active and sustainable VOPEs
  • A gallery walk of various EvalPartners Resources for VOPEs
  • A discussion about local and international Professionalization Initiatives
  • A work session for crafting a Memorandum of Understanding to strengthen collaboration between VOPEs in the region.

The VOPE day is an opportunity to promote the objectives of EvalAgenda2020, the African Evaluation Association (AfrEA) and the International Organization for Cooperation in Evaluation.

Presenter: Pindai Sithole
Level: Intermediate
Dates: 24 Oct
NVivo is a robust software package (www.qsrinternational.com) that supports qualitative methodology in research, monitoring and evaluation. It is specifically designed to help individuals and organizations organize, analyze and find insights in qualitative data sets such as interviews, focus group discussions, open-ended survey responses, personnel qualitative performance data, personnel exit interview data, articles, social media, video and web content, and more. Most significantly, NVivo is a powerful tool for the visualization of qualitative data and findings in research and evaluation, and it also serves as a qualitative database in itself. In short, NVivo allows one to work more efficiently, thereby saving time and other resources; systematically organizes, stores and retrieves data; uncovers connections in the data in ways that are not possible manually; and supports evidence-based discourse in research, evaluation and policymaking. A full-featured 30-day version will be provided to every participant in this workshop at no extra cost.