Social entrepreneurship/Evaluation Framework

Given the levels and aspects of evaluation alluded to previously, I suspect this could lead to a curriculum around "e/valuation". Here are some ideas for a framework (adding to some of the previous postings) which may lead to a matrix of methods and approaches applicable in various contexts. At some stage, I may cut and paste these notes into separate wiki pages around e/valuation.

Feel free to comment on the discussion page - Kim 14:32, 4 December 2008 (UTC)

Terminology

 * Evaluation: although I am drawn to "valuation", there is no strong consensus on a pervasive Appreciative Inquiry (AI) approach, so I will use the term "evaluation".
 * Programme: the framework is intended to be useful for a wide spectrum of initiatives, programmes, projects, etc. The terms "programme" and "initiative" are used interchangeably to refer to the World Bank Institute and Meraka Institute's initiative on Student Social Entrepreneurship, and to similar endeavours.
 * Project is used to refer to components of such initiatives including student projects and the broader projects of NGOs, CBOs, etc.
 * Social Impact is used as an appropriate term for this initiative centred around social entrepreneurship. The author has a preference for thinking in terms of sustainable development (which requires simultaneous consideration of economic, environmental and social concerns), and at times uses wording to indicate "sustainability thinking" behind social impact.

Aims
Evaluation should be an integral and on-going part of any programme development and may be linked with the approach taken at inception and other methodologies applied during its life cycle.

This particular project had elements of the "Logic Model" and Appreciative Inquiry applied at some points of its development, and the evaluation framework elaborated here reflects this.

The aim is to develop a framework for use at all levels from broad programme evaluation to student project evaluation.

Hopefully, the framework will inspire the development of learning resources for multiple approaches.

Overview
This framework rests on a broad definition (Alkin, 2004) of evaluation as a knowledge production activity, which may occur at any point or continuously throughout a programme, to provide information on progress to interested and affected parties.

Key questions:


 * Who is evaluating whom for what purpose?
 * How will the results of the evaluation be used?

The general purpose of evaluation in this initiative is to inform stakeholders and to improve processes at multiple levels covering evaluation in the small, medium, and large - programme evaluation, student learning, and programme impact:


 * Evaluation in the Small: Programme Evaluation
   * Growth of the network of collaborating institutions and individuals
   * Programme management
     * Monitoring and evaluation activities
     * Being aware of degree of progress
     * Being sensitive and responsive to change
     * Diligent and accurate reporting
     * Inclusiveness, transparency, ethics, ...
     * Leadership development
   * etc.
 * Evaluation in the Medium: Student Learning
   * Are the learning resources helpful to learners?
     * Are they using them?
     * Are there performance improvements? (grades and effectiveness)
     * The quality of the learning resources being produced in this process
     * Student effectiveness in applying the learning
 * Evaluation in the Large: Impact
   * of student projects
   * of the programme
   * of project portfolios.

Evaluation in the Small: Programme Development
Evaluation of the programme is framed in terms of the original vision and goals of the initiative, recognising that these may change during its life cycle. It is recommended that the vision, mission and strategy be revisited on a regular basis, say quarterly, or at least annually.

Depending on the resources available, evaluating the programme could include the following facets:


 * Reassess the need for the programme
   * Is the original vision still relevant?
   * If not, should the programme be closed down, or could the energy be redirected, leveraging work done already?
 * Internal processes:
   * Strategic
     * Planning
     * Resourcing
   * Operational
     * Resources management
       * Human resources
       * Assets
     * Financial management
     * Communications
       * Reporting
       * Publicity
     * Fund raising
   * Outputs
 * Community Building
   * Degree of engagement and collaboration
     * Educational institutions
       * Institutional support
       * Collaboration across institutions
     * Student communities
     * NGO, CBO and non-profit organisations
     * Public sector
     * Business sector
       * CSI, CSR, ...
 * Sustainability
   * Portfolio balance addressing social, economic and environmental issues
   * Sustainability assessment
   * Social
     * Inclusivity
       * Diversity management
     * Transparency
     * Ethics
   * Environmental
     * Impact assessment
     * Strategic environmental assessment
   * Economic
     * Poverty reduction
     * Wealth creation
     * Business viability

In general, evaluation needs to be done with due diligence and be properly resourced. At the same time it is necessary to be as efficient and agile about monitoring and evaluation as possible.

It is important to note that this initiative started with no clear idea of where to go and how to make the greatest impact. It differs from standard development projects which try to plan everything in advance, including the monitoring and evaluation. The methodology, therefore, has been ad hoc and "agile" with regular communication including the following questions:


 * What is working and how can we grow this positive energy?
 * What have we learned?
 * How can we improve the process?

Advantages of this approach include:


 * Flexibility and responsiveness to change, in cognisance of new knowledge and learning
 * Open processes: anyone may join and contribute at any time
 * Freedom to retain, change or drop any aspect of the programme depending on whether it is working
 * Focus on the process in the early stages, directing resources to emerging ideas with demonstrated merit.

Where specific actions were identified, more formal steps were put in place. These have brought us to this point of developing an evaluation framework, which needs to go beyond the activities of the core team and serve future participants as the initiative moves forward into a piloting phase.

We expect that most of the student projects, which are typically of relatively short duration, will need to be equally agile, if not more so, in respect of execution and evaluation. Some of the longer term projects accommodating successions of students may require more formal approaches.

The initiative provides an opportunity to explore and conduct research on methodologies for effective student social entrepreneurship.

Evaluation in the Medium: Student Learning
The framework provides links to resources useful for student evaluation in general, and emphasises factors, approaches and issues of particular relevance in this programme.

An evaluation should check whether the learning resources are of use to the students and their supervisors, and to what extent both parties are contributing in a synergistic manner.

Measuring the quality of the learning resources is closely tied to measuring usage, which in turn affects the quality.

Full student involvement in the evaluation process (including the development of measures, etc.) is vital.

Implicit in the process outlined here is evaluation capacity building among the participants (students, educators, others).

Student learning parallels learning among their supervisors and mentors, and occurs on many levels including


 * learning of facts;
 * skills development;
 * gaining knowledge of tools available to help make sense of the world and to share/grow that knowledge;
 * learning how to interact and work with others;
 * being an independent contributor;
 * self direction while listening to the advice of others;
 * becoming dependable, reliable, and
 * developing a sense of interdependence;
 * self actualisation;
 * becoming a social entrepreneur.

The degree of evaluation effort for all of the above depends on the nature of the projects, the learners and the dynamics of each educational context.

One aspect that will need attention in the early phases is the quality of the learning resources and associated learning environments.

Evaluation in the Large: Social Impact and Sustainability
This section focuses on measuring Social Impact - of projects, portfolios of projects and the programme as a whole - assessing whether these are making a difference in people's lives.

The resources presented are designed for students to use in the evaluation of their projects, for academics and researchers involved in evaluation research and/or research on social impact and sustainability, and for various parties evaluating the impact of the programme.

As the programme unfolds, the resource set will grow to cover a range of methodologies which may be applied. The Methodologies section points to a range of approaches which may be applied, with some guidance on selection and adaptation.

Methodology
As this initiative moves into a pilot phase in 2009, the opportunities for evaluation and research will grow.


 * "...knowledge produced has to be drawn from systematic enquiry; that is, it must be the result of the application of the canons of social science research." (Alkin 2004)

For credibility, data informing an evaluation should be collected systematically in a scientifically rigorous manner, and analysed and interpreted according to accepted principles. In the context of social entrepreneurship, this is likely to be a combination of quantitative and qualitative methods coordinated in innovative ways, requiring a high level of insight on how to do this to derive valid knowledge.

Reference to social science learning resources is strongly recommended, particularly those pertaining to social science research methodology.

The Methodologies section, with research and evaluation in mind, outlines a variety of approaches to evaluation, research and organisational development which may prove useful in deciding what needs to be done, when, and how to communicate the result(s).

General Considerations
The framework was written from a particular perspective while trying to be as general as possible. When using it, be aware that your context may be different requiring special local knowledge and a deep understanding of the approaches suggested. Consult experts (who might be listed on the community pages).

The following considerations have some generality:


 * Involve interested and affected parties in the evaluation process, including
   * negotiating the parameters for the evaluation
   * developing criteria, indicators and other measures
 * Consider the feasibility of any suggested evaluation component of an initiative:
   * Are the necessary resources available (human, monetary, infrastructure, ...)?
   * If the resources are not available for the type of evaluation needed, the most dramatic consequence could be that the project/programme should stop (e.g. if it involves ethical concerns) - be brave enough to do this. On the other hand, it might be possible to compromise with a partial evaluation or a different approach.
   * Either way, a clear decision must be made in the right way (transparency, inclusivity) and communicated sensitively to all interested and affected parties.
   * Is the timing right? (wrt other processes, the people involved and their availability, ...)
 * Contexts and assumptions change over time as they are validated or invalidated, new factors and knowledge emerge, etc.
   * Conduct processes that are sensitive and responsive to change - agile.
 * A positive evaluation at one level may be negative at another.
 * Be aware that interpreting success and failure is complex.
   * Example: assessing the impact of empowering a brick maker to serve the community more effectively (with new knowledge, equipment and capacity) who then leaves for better business prospects elsewhere. The outcome may be positive for the brick maker but not necessarily for the community. How do we evaluate the students? Great idea, good implementation, but then something happened they did not anticipate.
 * In the absence of unit standards for social entrepreneurship, the initiative will need to conduct evaluations while potentially developing these standards (if indeed that is possible for social entrepreneurship).

Aspects of Evaluation
The aim of this section is to provide a quick reference to types of evaluation techniques and activities associated with different contexts. See Alkin (2004) for one approach which tabulates evaluation techniques according to purpose.

Introduction to Programme Evaluation
The key guiding questions for evaluation in general apply to programme evaluation:


 * Who is evaluating whom for what purpose? and
 * How will the results be used?

These questions help orientate and focus the evaluation. Once answered, a host of additional questions arise on how to serve the purpose and meet the requirements of the stakeholders requesting the evaluation:


 * What specific questions need to be answered?
 * What information is required to answer those questions?
 * How can that information be obtained?

The complexity behind those questions will vary according to the nature of the project, the nature of the evaluation itself, and other factors.

In general, you will need to think about


 * the type of information required
   * qualitative
   * quantitative
 * information sources
   * reliability
   * validity
 * data collection methodology
 * data analysis and interpretation
 * communicating the results to the various target audiences.

As for any significant piece of work, it is wise to plan it carefully:


 * When is the final evaluation report required?
 * What needs to happen by when in order to meet that deadline?

This suggests preparing a plan for the evaluation:


 * Clearly define the evaluation goal(s)
 * Design a plan for reaching them
   * Who needs to do what by when?
   * Responsibilities, tasks, resources, time lines, ...
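The backward-planning questions above ("When is the final evaluation report required? What needs to happen by when?") can be sketched as a small calculation. This is a minimal illustration only; the deadline, task names and durations below are hypothetical placeholders, not part of any programme plan.

```python
# A sketch of backward scheduling from a final-report deadline.
# The deadline and task durations are hypothetical placeholders.
from datetime import date, timedelta

deadline = date(2009, 12, 1)  # hypothetical final-report due date

# Tasks in execution order, with estimated durations in days.
tasks = [
    ("agree evaluation questions", 14),
    ("collect data", 30),
    ("analyse and interpret", 21),
    ("draft and review report", 14),
]

# Walk backwards from the deadline to find each task's latest start.
schedule = []
end = deadline
for name, days in reversed(tasks):
    start = end - timedelta(days=days)
    schedule.append((name, start, end))
    end = start
schedule.reverse()

latest_kickoff = schedule[0][1]  # the latest date the evaluation can begin
```

Working backwards from the reporting deadline in this way exposes the latest date by which the evaluation must start, and hence whether the timing is feasible at all.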

The degree of planning needed will depend on the scope of the evaluation and characteristics of the programme and components/aspects being evaluated.

In general, the following are critical success factors for evaluation exercises, be they once-off or long-term and on-going:


 * Management/leadership support
   * Verbal and moral support
   * Incentives and rewards (e.g. financial and/or recognition)
   * Financial and other resources (facilities, dedicated people, etc.)
   * Support demonstrated through follow-up action and decisions made on the basis of the evaluation.
 * A clear vision for the evaluation exercise itself that is aligned with the vision and mission of the programme, and its philosophy around evaluation.
   * For example, evaluation seen as a collaborative, participatory learning activity providing an opportunity for all involved to improve their individual and collective contribution towards the effectiveness of the programme.
   * A mission statement for the programme evaluation could be "to improve the processes of the initiative as a whole to enhance its effectiveness and impact".
 * A strategic plan describing for each component/aspect of the evaluation:
   * Who will do what by when
   * How the results will be communicated
   * Resource requirements
 * Communications strategy within and across components of the evaluation, especially where there are dependencies.

The plan should address all the questions raised above, clearly indicating which have answers, along with additional questions raised during planning: the purpose of the evaluation, the stakeholders, the questions to be answered, the approaches to data collection, analysis and interpretation, time lines, responsibilities, resource requirements and budget.

Share the plan with the relevant participants and stakeholders so that evaluation activities become integrated with the normal running of the programme.

The methodologies section suggests some approaches which may be applied in developing the evaluation strategy and plan.

The rest of this section covers


 * Needs analysis, an activity which should precede the implementation of a programme and be revisited from time to time, or when called for - e.g. when a "stop-go" evaluation is instituted;
 * Possible foci for evaluation of this programme in the short term (2009 - 2010), and
 * Projected foci for the future if all goes well.

Needs Assessment
At the beginning of the programme assess the need for it, and revisit this assessment from time to time to check that the rationale is still valid in the current context. Refer to the relevant documents (concept documents, proposals, contracts, correspondence, etc.) and keep them on hand for reference during the evaluation.

Needs Assessment Howtos

 * Needs Assessment

Context, Vision and Original Goals
Closely tied to needs assessment (or reassessment), is an evaluation of the goals and aspirations of the programme, and progress towards these.

In the early phases of this programme, a vision was defined


 * Enabling communities to enhance their quality of life sustainably while facilitating student learning on social entrepreneurship

Variations were discussed suggesting a mission


 * Establish a "network of universities" focussed on social entrepreneurship
 * Enhance social entrepreneurial practice across these institutions
 * Create an online space for
   * students to interact and share experiences
   * educators to co-develop and share learning resources
   * students to find community based organisations to work with
   * communities to express their needs and aspirations

and intermediate objectives started to emerge from general statements such as the following:


 * The initiative seeks to inspire, facilitate and catalyse the establishment of a network of universities offering social entrepreneurship as a module and possible career path.


 * The hub of the initiative is a wiki portal providing a venue for collaborative development and use of learning resources, and online community building among educators, learners, community representatives, social entrepreneurs and other stakeholders.

At the time of writing, the wiki portal is up and running with some preliminary content, including placeholders for a sub-set of the curriculum - sufficient to set the scene for further development when the institutions and students engage.

A series of workshops has been convened, designed to draw in some participants from the academic, public, private and civil sectors (see Activities).

Ultimately, the collaborating institutions will need to define their own goals, roles and degree of participation, and the extent to which they will be users of the resources only, or both users and co-producers.

When this happens, a new set of evaluation questions will arise, and the types of evaluation activities may diversify.

Aspects of the programme including the wiki and a game framework will be piloted in 2009.

Questions for evaluation may probe:


 * The degree of support offered by the participating institutions and individuals
 * The processes of engagement of the institutions, educators and learners
 * Success factors for effective and productive engagement
 * Awareness raising and preparation in respect of
   * The programme
   * Free/Libre and Open Educational Resources
   * Skills required to use the resources
     * On the wiki
     * Via the gaming framework under development.
 * Communication effectiveness.

Documentation providing background on the vision, mission and goals of this initiative is available at About Social Entrepreneurship.

When the initiative kicks off again in 2009, it will be worthwhile for the core team to revisit the vision, mission and goals of the programme and redesign the strategy in the light of the learning so far.

The methodologies section outlines possible approaches. A tentative suggestion for consideration is to draw up a logic model and plan for the next year and a half.

This will set the scene for evaluation during the pilot phase and into the next phase (towards self-sustainability).

Checklist of items to include:


 * Inputs: status report and resources ready to contribute in 2009.
 * Activities: piloting activities with the gaming system and use of the wiki, developing incentives to participate, communication, training and events.
 * Outputs: reports, learning resources, system enhancements (gaming), number of institutions participating, ....
 * Outcomes: changes in behaviour among educators, students, and others. Reporting. Student outputs. Research outputs.
 * Impact: anticipated impact on communities through portfolios of projects lodged in the institutions, elevated status of social entrepreneurship in the institutions and popularity as a career path, ...

In the logic model and strategy remember to highlight aspects of the process which are relevant for evaluation:


 * changes in context
 * needs assessment
 * base lines - where we are starting from (assets, inputs, etc.)
 * risk assessment - feasibility of the plan/strategy with respect to time lines, resources, budget, ...
 * processes - due diligence, sufficient documentation, assumptions, inclusivity (stakeholders involvement), participative, ...
 * deliverables
 * short, medium and long term outcomes
 * impact.
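As a minimal sketch, the checklist above could be kept in a simple machine-readable form so that a draft logic model can be checked for completeness before the planning workshop. The entries below are illustrative placeholders paraphrased from the checklist, not official programme data.

```python
# A sketch of the logic-model checklist as a simple data structure.
# All entries are illustrative placeholders, not official programme data.

logic_model = {
    "inputs": ["status report", "resources ready to contribute in 2009"],
    "activities": ["pilot the gaming system and wiki", "training and events"],
    "outputs": ["reports", "learning resources", "system enhancements"],
    "outcomes": ["changes in behaviour among educators and students"],
    "impact": ["community benefit via portfolios of projects"],
}

# The five components every logic model should cover.
REQUIRED = ("inputs", "activities", "outputs", "outcomes", "impact")

def missing_components(model):
    """Return the required components that are absent or empty."""
    return [c for c in REQUIRED if not model.get(c)]

gaps = missing_components(logic_model)  # an empty list means all five are covered
```

A check like this does not replace the judgement involved in drawing up the logic model, but it makes gaps visible early.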

Medium Term Evaluation: beyond 2010
If all goes well, with the academic community engaging the NGO and CBO communities, continued interest from the public sector, and growing interest from the private sector, the evaluation focus may shift.

The types of questions to be answered may become for example:


 * What led to the successful uptake, and how can we grow this energy?
 * What needs to change as we scale up?
 * What types of learning resources have been produced?
   * Quality and usage?
   * What factors contributed to this production?
 * How can this collaborative production energy be harnessed and/or ignited in other areas of the curriculum?
 * etc.

Some of these are covered in the next major section (on serving the needs of learners).

The methodologies section suggests approaches to answering some of these questions, and others that may arise (in reality).

Predicting evaluation activities for programme development beyond this best case scenario is difficult and probably not useful. So, for the purposes of this document, we now turn our attention to evaluation from the perspective of the students, and later to evaluating impact in the long term.

Serving the needs of learners
This section briefly covers student evaluation for the benefit of those in the educational institutions concerned with that aspect, and touches on the evaluation of the educational resources produced through this programme.

Student Evaluation
On a pragmatic level, in the current context as a South African initiative starting with a few collaborating institutions, student evaluation is embedded in the academic systems of those institutions. The intention is to augment those systems and provide new pathways and tools for students of social entrepreneurship to reach new heights.

As the communities of user-producers of these resources grow, the hope is that the quantity and quality of the resources will continually and rapidly evolve to meet the changing needs.

Known dependencies include the following, all of which suggest intermediate objectives and actions which may be included in programme evaluation (above):


 * Support from faculty/management of the participating institutions and organisations
   * In principle, with clear statements to that effect and, where applicable, signing of MoUs, etc.
   * In terms of co-investment and allocating resources, and
   * through offering incentives for educators and champions to participate.
 * Awareness and understanding among participants of the opportunity offered by the free and open educational resources movement, and the advantages of commons-based peer production.
 * A critical mass of educators, learners and champions on the topics of interest during the course(s), with the skills required to use, maintain and enhance the learning resources effectively.

Achieving these intermediate objectives will help grow sufficient numbers of learners/educators/champions/etc. to warrant an evaluation and/or produce meaningful research results at this level.

Evaluating the Learning Resources
The first criterion is whether the resources are meeting the needs of the learners.

Over time, the pool of shared learning resources improves with use and refinement.

The task here is to evaluate these processes, some aspects of which are outlined in the Methodologies section.

Social Impact and Sustainability
The participants in this initiative have collectively been exposed to a wide variety of techniques, some of which have influenced this document more than others. Over time, we hope that readers will broaden the perspective and co-develop a more comprehensive collection of resources and guides to their appropriate use.

At this stage, we do not make a clear distinction between "outcomes" and "impact". On the one hand, "outcomes" may be regarded as tangible deliverables (such as reports, policy documents, MoUs, business plans, a new company, a collaboration agreement, etc.), while "impact" might refer to changes in the living conditions and quality of life of people in the scope of the initiative (e.g. through wealth creation, improvement in primary health care and environmental awareness). On the other hand, "outcomes" may also refer to changes in behaviour, relationships, and activities of affected parties.

Methodologies
As mentioned earlier, evaluation is an integral part of programme/organisational development. Most of the associated methodologies recognise this and include evaluation activities among their processes and practices.

This section suggests a collection of approaches which may be applied for programme development in the short and medium term, and for evaluating impact in the long term.

First, a note on complexity, a reminder of the multiple perspectives, and the need for research to support the initiative and evaluation.

Complexity
Developing the evaluation framework requires an understanding of the meaning and nature of social entrepreneurship (its underlying philosophy) and what the initiative seeks to achieve. It requires consideration of the participating groups, interested and affected parties, and an appreciation of the desired outcomes from multiple perspectives.

These factors count in the design of an evaluation strategy and the methods to be applied at different times in a potentially changing socio-political, economic and biophysical environment.

In short, the challenge of driving and evaluating this initiative, and the challenges students are likely to encounter, may be described as "wicked" and inherently complex. The sections which follow outline a number of techniques and approaches which may help the various parties manage the complexity associated with their interventions, focus their activities and design effective evaluation strategies.

Activities such as identifying and prioritising community needs and opportunities, planning of projects and decision making all occur in a particular context, often with incomplete (or even contradictory) data to describe it. For this reason, there is an emphasis on approaches which recognise these types of issues. Agility and well-informed methodological innovation are encouraged where appropriate.

Perspectives
During the life cycle of an initiative, evaluation exercises may be conducted for a variety of reasons with interested and affected parties requiring specific types of information.

Triggers for evaluation include:


 * Routine scheduled "internal" evaluation activities - e.g. to serve predefined reporting requirements,
 * Ad hoc on-demand evaluation activities, when stakeholders request an evaluation on account of concerns or of success beyond expectations,
 * Required evaluations for (e.g.) national, regional or global bodies, and
 * Event-driven evaluation sparked by public interest (for example).

For all of these, it is important to have a clear understanding of the purpose of the evaluation, the type of information required to meet that purpose, who will be using the information and how they intend to use it.

The participating/affected parties and their interests in evaluating the initiative may include the following:


 * Tertiary institutions: the value of participating for the institution (clarification of common goals, accountability, ...)
 * Lecturers: the value of the learning resources, the network and peer production process to improve educational practice
 * Students: the usefulness of the learning resources and network for achieving learning objectives (e.g. credits and success in the field work), knowledge acquisition, personal growth, understanding the advantages of becoming a user-contributor
 * Researchers: assessing opportunities to conduct meaningful and credible research on real world social impact via knowledge networking and peer production
 * Social entrepreneurs: assessing whether participation in this initiative is worthwhile for enhancing their impact
 * CBOs: assessing the value of engaging in terms of meeting community needs (e.g. enhancing local capacity and growing social enterprises)
 * NGOs: understanding their role and impact within a network of operators
 * Corporate Social Responsibility programme managers in the private sector: assessing the potential to make a real difference through CSI
 * Donor agencies and other funders: assessing whether donor funds are being well spent, and progress towards self sustainability or achievement of objectives
 * Government departments and local municipalities: assessing the degree to which the projects and portfolios augment government activities and identifying opportunities for collaboration.
 * The network itself: assessing state of the network, its effectiveness, process improvement, best practices, shared knowledge, communities of interest, communities of practice, etc.
 * Communities: understanding the value of the initiative in terms of sustainable development opportunities
 * The initiators of the network (the core team driving the initiative): operational effectiveness, progress towards achievement of stated goals and objectives, whether or not to continue, and if so, how best to invest - refining the strategy towards self-sustainability of the network.

Evaluation and Research
While evaluation may be regarded as a requirement demanded by stakeholders and others with a specific purpose in mind, research is typically more open ended with the goal of satisfying our curiosity and deepening our understanding of phenomena.

Given the association with academia of this initiative, the importance of research is highlighted as a means of enhancing our understanding and ability to optimise our efforts.

In line with the current structure of this framework, the following research areas are suggested to augment the initiative and associated evaluation activities. The areas are listed where the research opportunities seem greatest - some may have wider applicability.

Programme Development
Research on Organisation Development approaches applied in this initiative. These may include


 * Outcome Mapping
 * Theory of Constraints
 * Appreciative Inquiry
 * etc.

The research could look at:


 * Application of the approach in context
 * Culturally appropriate adaptations and alternatives
 * Action research and variations of action research
 * Development and validation of indicators for evaluation
 * Techniques within these approaches and other methods:
   * facilitation
   * dialogue mapping
   * decision-making techniques, such as multiple-criteria decision making (MCDM) and the analytic hierarchy process (AHP)
   * soft systems methodology (SSM)
   * multi-methodologies
   * etc.
 * Innovative adaptations and blends of approaches.

Learning Resources Effectiveness

 * Research on Free and Open Educational Resources
 * Pedagogy and connected learning.
 * Student and educator readiness for connected learning and specifically wiki course facilitation and learning.
 * Student and educator practices using the resources available.
 * Student and educator evaluation research.
 * Network Science.
 * Peer production.

Some of these are mentioned again in the methodology sub-sections below and expanded within the curriculum.

Social Impact Research
Although the causative links between the activities of an initiative and social impact or sustainable development are difficult to establish, there are some approaches to programme development and evaluation which at least enhance the possibility of making such connections, or which ensure that critical success factors are addressed.

Processes:


 * Outcome mapping (above)
 * Appreciative Inquiry (above)
 * Positive Deviance
 * Social Analysis Systems (SAS2): an approach to collaborative inquiry and social engagement
 * Policy Impact
 * etc.

Research could be conducted to


 * check assumptions and raise issues of interest to the programme development team and evaluators,
 * identify critical success factors to address in programme evaluation,
 * explore the extent to which the measures implicit in these approaches indicate impact, and where/if possible,
 * assess whether there has actually been any demonstrable impact on account of the programme.

Short Term Programme Development: Foundation and Piloting
As mentioned previously, the piloting phase of 2009 provides an opportunity to revisit the vision, goals, mission and aspirations of the programme while establishing a solid foundation for a sustainable network of universities and other parties focused on social entrepreneurship.

The following approaches may be considered for programme development and evaluation.

Outcome Mapping
For general background on the approach see the module on Outcome Mapping (in preparation) and associated case studies, specifically, the SSE Case study which suggests possible outcomes for the programme.

Theory of constraints
Theory of Constraints is a well-established and well-researched approach which has been applied in many contexts beyond its roots in manufacturing. Here we briefly outline a simplified variation, suggesting a process which may be elaborated and evaluated in specific situations. The process is typically carried out with representatives of interested and affected parties in a facilitated workshop, using a whiteboard, sticky notes, etc. to capture collective thinking.


 * 1) Formulate a clear, achievable goal indicating "who will experience what by when?"
 * 2) Recursively ask "Why is this goal not reached right now?" to surface barriers
 * 3) Structure the barriers (e.g. cluster, combine and order them to show dependencies)
 * 4) Reformulate the barriers as sub-goals or intermediate objectives (IO map)
 * 5) For each intermediate objective repeat the process until the IO map is complete.

The process so far results in a tree of intermediate objectives (IO map).

Now, for each node in the map, a set of actions or tasks may be defined and assigned to individuals or sub-teams, etc., with timelines.
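The IO-map process above can be sketched as a simple tree structure. The following Python sketch is illustrative only: the goal, sub-objectives and tasks are invented examples, not part of the programme plan.

```python
from dataclasses import dataclass, field

@dataclass
class Objective:
    """A node in the intermediate-objectives (IO) map."""
    statement: str                                       # "who will experience what by when?"
    sub_objectives: list = field(default_factory=list)   # barriers reformulated as sub-goals
    tasks: list = field(default_factory=list)            # actions assigned once the map is complete

def flatten(objective):
    """Depth-first walk of the IO map, yielding every objective."""
    yield objective
    for sub in objective.sub_objectives:
        yield from flatten(sub)

# Hypothetical example: one goal with two barriers reformulated as sub-goals.
goal = Objective("Ten student projects running by mid-2009")
goal.sub_objectives = [
    Objective("Five partner institutions signed up",
              tasks=["Draft MoU", "Contact faculty champions"]),
    Objective("Core curriculum modules published on the wiki",
              tasks=["Complete Outcome Mapping module"]),
]

for node in flatten(goal):
    print(node.statement)
```

Each sub-objective can itself gain sub-objectives as step 5 recurses, so the same structure serves at any depth of the map.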

Exercise: hack the above process to suit your needs.

Appreciative Inquiry (Preview)
Appreciative Inquiry is listed here to indicate that the approach could be applied pervasively from now on. More detail is covered in the medium term section below (i.e. after the piloting phase), by which time there will be more to "appreciate".


 * Indicators of progress with the wiki.
 * Plan the process with various outputs such as:
   * plan of action for the next phase
   * minutes of planned meetings
   * MoUs with collaborating institutions
   * degree of co-investment
   * specific modules complete with Activities and Resources
   * documents relating to how specific institutions augment existing syllabi, processes and systems with the wiki portal, ...
   * quarterly progress reports
   * reviews
   * etc.

Medium Term Appreciation: Consolidation and Growth of the Network
If all goes well, by 2010, the piloting phase will be complete, the learning resources will be more extensive, the participants will have learned in the process, and a solid foundation will have been established to consolidate and grow the network.

At this point, review, reflect, capture the learning, and use the knowledge gained to plan the next phase. Recall the facets of programme evaluation listed in the Overview, and the key questions that have guided the process to date:


 * What is working and how can we grow this positive energy?
 * What have we learned?
 * How can we improve the process?

The results of this routine evaluation may suggest a range of new approaches and changes to various components of the initiative. In anticipation, the focus of this section is on Appreciative Inquiry, which has come to the fore as an approach well suited to the theme of social entrepreneurship and likely to be most applicable during a phase of consolidation and growth.

Overview
For an overview and introduction to Appreciative Inquiry see:


 * Appreciative Inquiry

Focus
Aspects of the programme to highlight for valuation at this stage, in the spirit of appreciative inquiry, include the most positive experiences of


 * Students of Social Entrepreneurship
 * Teachers/ educators, facilitators, mentors, course administrators, etc.
 * Programme developers (the core team and working groups)
 * Other parties associated with the programme:
   * academic institutions, faculties
   * CBOs and NGOs
   * private sector companies, CSI departments, ...
   * public sector
   * other civil society organisations
   * researchers

Resources

 * Preskill and Catsambas (2006)
 * Reed (2006)

Long Term Sustainability Assessment
For the purposes of this framework, "sustainability" means simultaneous consideration of all three pillars of sustainability: economic, environmental and social.

As a social entrepreneurship initiative, we emphasise the social aspect and encourage practitioners using this framework to be mindful of the others.

Although this section is short, and does little more than provide a brief mention of this holistic perspective, its importance cannot be over-emphasised. See the module on sustainability and social entrepreneurship for general background and Triple Bottom Line Reporting for an evaluation approach.
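To make the holistic perspective concrete, a triple-bottom-line report can be checked for coverage of all three pillars before it is published. This is a minimal sketch only; the indicator names and values are invented for illustration.

```python
# The three pillars of sustainability, as defined in this framework.
PILLARS = ("economic", "environmental", "social")

# Hypothetical report: indicator names and values are invented.
report = {
    "economic":      {"jobs_created": 4, "revenue_growth_pct": 12},
    "environmental": {"waste_diverted_kg": 350},
    "social":        {"trainees_graduated": 18},
}

def missing_pillars(report):
    """Return the pillars for which the report records no indicators."""
    return [p for p in PILLARS if not report.get(p)]

print(missing_pillars(report))  # an empty list means all three pillars are covered
```

A report that addresses only one pillar, however thoroughly, would fail this check, which is the point of insisting on simultaneous consideration of all three.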

Additional Techniques
Additional techniques may be applied in certain circumstances including programme design and development, evaluation and decision making. For example, the purpose of an evaluation might be to decide on which projects in a portfolio should be prioritised for further funding. This could require multiple-criteria decision making (MCDM) techniques. Refer to the readings, resources and appendices below and (in future) the section "Aspects of Evaluation" above.
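As a minimal illustration of the kind of MCDM technique mentioned above, a weighted-sum model scores each project against weighted criteria and ranks the results. The criteria, weights, project names and scores below are all invented for the sketch; real evaluations would derive them with stakeholders.

```python
# Hypothetical criteria weights (must sum to 1) and per-project scores on a 0-10 scale.
weights = {"social_impact": 0.5, "financial_viability": 0.3, "team_capacity": 0.2}

projects = {
    "Community garden": {"social_impact": 8, "financial_viability": 5, "team_capacity": 7},
    "Recycling co-op":  {"social_impact": 6, "financial_viability": 8, "team_capacity": 6},
}

def weighted_score(scores, weights):
    """Aggregate a project's criterion scores into a single weighted sum."""
    return sum(weights[c] * scores[c] for c in weights)

# Rank projects for funding priority, highest score first.
ranked = sorted(projects, key=lambda p: weighted_score(projects[p], weights), reverse=True)
print(ranked)
```

More elaborate methods such as AHP derive the weights themselves from pairwise comparisons, but the aggregation step remains essentially this weighted sum.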

Conclusions
Evaluating this initiative, an inherently complex and multi-faceted learning exercise, is challenging and the process will require a level of agility which can only add to the complexity. This document provides one approach to structuring the problem by considering three primary levels of analysis (small, medium and large) and directing readers to techniques and approaches which may be useful at each level.

At this stage of the project the suggestion is to focus on evaluation in the small: the process of establishing a curriculum and network of institutions/ educators and learners who derive benefit and in turn contribute to the shared resources.

For the most part, one needs to be pragmatic, and not expect to implement a fully comprehensive, water-tight monitoring and evaluation regime for a process which we hope will inspire self-organising communities of user-contributors making positive social impacts.

Instead, assess particular activities and sub-projects for parts which can be evaluated using known techniques in cognisance of their limitations and, where possible, explore innovative approaches which extend and go beyond action research by addressing "evaluation" differently (e.g. "appreciation").

...

Appendices
These appendices are intended to be developed by the educators and learners involved. Their form is unrestricted but should be based on experience. For example: "Customisation of by ", etc. (create links)



Evaluation in General

 * Alkin MC (ed.) 2004. Evaluation Roots: Tracing Theorists' Views and Influences, Sage.
 * Measuring Innovation - Social Edge
 * Measuring What Matters - Social Edge
 * American Evaluation Association
 * Targeting Outcomes of Programmes
 * Program Development and Evaluation - University of Wisconsin Extension, e.g.:
   * the "Planning a Program Evaluation" booklet
   * Quick tips.
 * Impact Assessment of ICT4D Projects - check out the Compendium.
 * Institute of Development Studies, Knowledge Services - Evaluation framework: "From Access to Action: Impact Pathways for the IDS Knowledge Services".
 * Measuring Innovation - Evaluation in the Field of Social Entrepreneurship - FSG's white paper commissioned by the Skoll Foundation for the 2005 Skoll Forum on Social Entrepreneurship at Oxford University, 2005.

Social Science Methodology

 * Social Research

Organisational Development

 * Theory of Constraints - collaborative goal setting
 * Appreciative Inquiry - recognising potential and growing the life energy in a community
 * World Café - collaboration among diverse participants/groups

Decision Making

 * Nominal group technique
 * Delphi Technique
 * Analytic Hierarchy Process

Problem Structuring

 * Soft systems methodology
 * Multimethodology
 * Dialogue Mapping

Sustainability Assessment

 * Triple Bottom Line Reporting

Acknowledgements

 * Participants in the AI List.
 * Some ideas emerged while working on the OER Handbook for Educators.