
LSE podcasts I liked

Being back in Finland and having to fix things in the house has given me an opportunity to listen again to the always very interesting LSE Public Lecture podcasts. The effects and implications of Brexit and the election of President Trump dominate the discussion at the moment, but they are not the only themes. Here are a couple of podcasts that I wanted to share with you:

Fate of the West: the battle to save the world’s most successful political idea

Bill Emmott was the editor-in-chief of the Economist from 1993 to 2006, and is now a writer and consultant on international affairs. He is a regular contributor to the Financial Times, La Stampa and Nikkei Business. His latest book is The Fate of the West: The Battle to Save the World’s Most Successful Political Idea.

In this podcast Emmott explains how Western countries must change in order to recover and thrive. When faced with global instability and economic uncertainty, it is tempting for states to react by closing borders, hoarding wealth and solidifying power. We have seen it at various times in Japan, France and Italy, and now it is infecting all of Europe and America, as the vote for Brexit in the UK has vividly shown. This insularity, together with increased inequality of income and wealth, threatens the future role of the West (i.e., the countries across the world that have embraced democracy and the rule of law) as a font of stability, prosperity and security. Part of the problem is that the principles of liberal democracy upon which the success of the West has been built have been suborned, with special interest groups such as bankers accruing too much power and too great a share of the economic cake. So how is this threat to be countered? States such as Sweden in the 1990s, California at different times or Britain under Thatcher all halted stagnation by clearing away the powers of interest groups and restoring their societies’ ability to evolve. To survive, the West needs to be porous, open and flexible. From reinventing welfare systems to redefining the working age, from reimagining education to embracing automation, Emmott lays out the changes the West must make to revive itself in the moment and avoid a deathly rigid future.

Emmott talks about the evolution of democracies rather than the development of democracies, which I found quite interesting.

 

Farewell to Globalization: Farewell to the Liberal World Order? The Populist Revolt from Brexit to Trump and beyond

Professor Michael Cox is Director of LSE IDEAS and Emeritus Professor of International Relations at LSE.

Until very recently nearly everybody – with a few critical exceptions – insisted that globalization was the only way forward for the world as a whole. Yet globalization is now under challenge: not in the developing countries where billions still live in poverty, but in the rich nations of the West. How has this come about and how serious is the opposition to globalization? One point in Professor Cox’s discussion is that populists, who generally dismiss evidence and facts that contradict their perspective, pose a big challenge to evidence-informed policymaking.

 

A Village, a Country and the Discipline: economic development in Palanpur over seven decades

Professor Lord Stern and Professor Amartya Sen

Professor Nicholas Stern is the Patel Professor of Economics and Government at LSE, Director of the LSE India Observatory and President of the British Academy. Amartya Sen is the recipient of the 1998 Nobel Prize in Economics and is an LSE Honorary Fellow.

Professor Stern and his team have conducted research in the village of Palanpur in India over several decades. What do their findings tell us about the evolution of social and economic systems, inequality and the prospects for India? This is a unique research project that has been able to continue over a very long period and has therefore captured the evolution of the social and economic system in the village of Palanpur over generations. To me it shows why a (very) long term perspective is needed when thinking about development, and why evolution and systems are terms that provide a better perspective on how change happens in a society than development and sectors.


The three components that can help systems thinking in development programmes

As I reread Duncan Green’s blog about an interesting conversation we had over a coffee a few weeks ago in Brixton (How might a systems approach change the way aid supports the knowledge sector in Indonesia?), it occurred to me that I could add something.

‘Systems’, he writes, ‘evolve through the endless churn of variation, selection and amplification. Variation = rate of mutations – new species, new companies; Selection = some are fit for the landscape, others are rubbish and die out; Amplification = the fit ones expand or proliferate.’


The map of Hong Kong’s MTR: elements, interconnections, and purpose

When it comes to development interventions, in order to contribute to a change or changes in a system, the various stakeholders have to ‘see’ where the web of variation, selection and amplification is weakest and focus on those weaknesses.

There is something more, I think. Duncan’s blog reminded me of something I read in a book by Donella Meadows, Thinking in Systems: A Primer, which he also mentions in his How Change Happens.

Meadows writes about the key components that make a system: elements, interconnections, and purpose.

  • The elements of a system are easy to spot. They are visible and tangible. In a knowledge sector, for example, elements include universities, policy research institutes, policy analysis units, ministries, local governments, civil servants, researchers, data scientists, etc.
  • Interconnections are the relationships that hold the elements together. To spot these interconnections requires some research and analysis to understand why elements are linked as they are. Interconnections are often the results of information flows and in a knowledge sector there can be a government department linking with think tanks to procure an analysis about the state of the economy or the quality of education and health services. The government can then use (or not use) this information to inform funding decisions.
  • The purpose is the most difficult component of a system to spot. The purpose of a system can be articulated in written documents such as regulations or bylaws, but it is better deduced from the behaviour of organisations and individuals. For example, a written commitment to use more research-based evidence in policymaking may or may not be followed by concrete requests from ministries to think tanks and policy research organisations for research results and analysis.

To generate and/or contribute to changes in a system, it is important to understand how systems evolve (variation, selection and amplification), but also to have a clear map of the system’s components, going beyond the mapping of elements and of changes to elements (e.g., new research organisations, trained civil servants, etc.), which, as Meadows reminds us, usually have the least impact on the system as a whole but are the easiest to measure and report on.
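
To make Meadows’ three components a bit more concrete, here is a minimal, illustrative sketch (in Python, with hypothetical knowledge-sector names) of what a system map that goes beyond a list of elements might look like, recording elements, interconnections and the stated versus observed purpose:

from dataclasses import dataclass, field

@dataclass
class SystemMap:
    """A minimal, illustrative map of a system following Meadows' three components."""
    elements: set = field(default_factory=set)            # visible, tangible parts
    interconnections: list = field(default_factory=list)  # (source, target, information flow)
    stated_purpose: str = ""                               # what regulations or bylaws say
    observed_purpose: str = ""                             # what actual behaviour suggests

# Hypothetical knowledge-sector example (names are illustrative only)
knowledge_sector = SystemMap(
    elements={"ministry_of_finance", "think_tank_A", "university_B"},
    interconnections=[
        ("ministry_of_finance", "think_tank_A", "commissions analysis on the state of the economy"),
        ("think_tank_A", "ministry_of_finance", "delivers policy briefs"),
    ],
    stated_purpose="use research evidence to inform funding decisions",
    observed_purpose="procure analysis, but rarely use it in funding decisions",
)

print(len(knowledge_sector.elements), "elements,",
      len(knowledge_sector.interconnections), "interconnections")
print("stated and observed purpose differ:",
      knowledge_sector.stated_purpose != knowledge_sector.observed_purpose)

The point of the sketch is simply that the parts that matter most for change (interconnections and purpose) are also the ones that require the most digging to fill in.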


Estimating the return on investment of policy research and engagement

Photo credit: Charles Wiriawan, Flickr

Is it possible to calculate the return on investment of a policy research project? Well, the Indonesian think tank SurveyMETER and the Knowledge Sector Initiative have given it a go for a policy research project on services and infrastructure for the aging population in the Indonesian municipality of Balikpapan. Here is what we did and what we learned along the way.

[Editor’s Note: This post was written by Tanty Nurhayati Djafar, Program Officer, Knowledge Sector Initiative; Wayan Suriastini, Director, SurveyMETER; and Arnaldo Pellini, Senior Research Fellow, Overseas Development Institute, and Learning Lead, Knowledge Sector Initiative.]

First, a few disclaimers

This is the first time we have used this methodology, so it is an experiment and we are learning as we go (Infographic).

It is important to remember that calculating the return on investment is just one piece of the puzzle when it comes to understanding research impact and contribution. The evidence produced should complement other monitoring, evaluation and learning work. In the case of SurveyMETER, we also undertook three episode studies to describe how the research and engagement activities contributed to the policy decisions in three Indonesian municipalities.

It is not possible to calculate the return on investment for all policy research projects. In this case study, the policy change resulted in a budget commitment by the government which provided a monetary sum to calculate against the research investment.

The policy research project and impact

Indonesia’s aging population (over 65 years of age) is increasing rapidly. It is expected to grow from 18 million (or 7.5 percent of the population) in 2010 to 41 million (or 14 percent of the population) by 2030. If we consider Indonesia’s rapid rate of urbanisation (2.6 percent per year), the number of elderly citizens living in urban areas will grow considerably. Some municipalities and local governments have started to think about how to prepare for this change. In 2013, Balikpapan, a municipality in East Kalimantan, began a collaboration with SurveyMETER to produce and discuss research evidence to better understand how well the municipality was addressing senior citizens’ access to services and infrastructure, and to decide which investments would be required to provide senior citizens with a high quality of life.

During 2013 SurveyMETER, in collaboration with the Centre for Ageing Studies at the University of Indonesia, conducted a study in 14 cities across Indonesia to assess their preparedness for this increase in the senior citizen population. As a result, the municipality of Balikpapan committed some US$8.5 million over 2015-2020 to fund better services and infrastructure for elderly citizens.

Estimating the return on investment

To estimate the return on investment of SurveyMETER’s research and policy engagement, we used a five-step methodology developed by the Redstone Strategy Group. Redstone used this methodology in 2013 with three policy research organisations supported by the Think Tank Initiative: Research on Poverty Alleviation (REPOA) in Tanzania, Fundación ARU in Bolivia, and the Institute of Economic Affairs (IEA) in Ghana (see examples from this exercise for further information).

Steps 1 – 4 are relatively straightforward:

  1. Define the policy problem: in this case, it was an aging population and the need for policies and programs to prepare appropriate services and infrastructure within the city.
  2. Define the research contribution to addressing the problem: the research provided a comparative assessment of preparedness across 14 municipalities.
  3. Define the ‘benefit’ of the policy change: based on the available data, we defined the benefit as the budget commitment by the municipality to fund services and infrastructure.
  4. Estimate the cost of research and policy engagement activities: this was calculated by the research finance team.

Step 5 is where it gets tricky. We had to estimate the portion of benefits for which we could argue a contribution.

We took the six conditions for policy change suggested by the Redstone Strategy Group and discussed the conditions before and after the research took place. We then estimated the contribution of the research to any changes. Ideally this process would involve external as well as project stakeholders, but as this was an experimental first go, we kept it within the project team.

Of course, without external stakeholders, internal biases are inevitable. We think we may have underestimated the contribution in favour of other external factors.

Initially we struggled to relate the six conditions to the context of SurveyMETER’s policy engagement in Balikpapan. A paper that really helped us understand the definitions of the conditions and facilitate the discussion was Assessing Advocacy by Barkhorn et al. (2013).

A return-on-investment ratio is a comparison of like quantities: estimating the benefits of policy research and engagement requires good access to data and the availability of financial or monetary values for both the benefits and the costs. It took some time for us to gather the data about the budget commitment of the municipality of Balikpapan. The lesson we learned is that if the decision is made to estimate the return on investment of a policy research and engagement project, it is better to start planning data collection at an early stage of the project.
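
As a purely illustrative sketch of the arithmetic behind step 5 (in Python; the budget commitment is the figure mentioned above, while the contribution share and cost below are hypothetical, not the actual Balikpapan numbers), the estimate boils down to attributing a share of the monetised benefit to the research and engagement and dividing it by their cost:

def estimate_roi(benefit, contribution_share, cost):
    """Return-on-investment ratio: attributed benefit divided by the cost of the research.

    benefit            -- monetised value of the policy change (e.g. a budget commitment)
    contribution_share -- estimated share (0-1) of the benefit attributable to the research
    cost               -- cost of the research and policy engagement activities
    """
    attributed_benefit = benefit * contribution_share
    return attributed_benefit / cost

# Illustrative figures only: the contribution share and cost below are made up.
benefit = 8_500_000        # US$ budget commitment over 2015-2020
contribution_share = 0.10  # suppose 10% of the change is attributed to the research
cost = 150_000             # hypothetical cost of research and engagement

print(f"Estimated return on investment: {estimate_roi(benefit, contribution_share, cost):.1f}x")

In practice, the hardest number to defend is the contribution share, which is why involving external stakeholders in step 5 matters so much.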

As is often the case, the commitment of the institute’s leadership has been a key enabling factor in putting in place a team and providing it with the space and resources to test (and learn from) this and other tools for assessing the policy influence of research and engagement. This enabled the SurveyMETER team to gain additional experience with qualitative research methods and storytelling, which are relatively new to it.

 


First impressions from entering the best school system in the world

In June this year my family and I moved back to Europe after spending more than 10 years in Southeast Asia. We decided for various reasons to move back to Finland. We have family and relatives here, and one of the factors that brought us back was to be closer to them.

A second factor in our decision was the quality of the Finnish schooling system, which is consistently ranked, together with Singapore, at the top of international rankings such as the OECD’s PISA.

Our daughters started school in a local primary school in Dumaguete, in the Philippines, where we lived between 2009 and 2013. When we moved to Jakarta at the end of 2013 they attended an international school for the first time, which they enjoyed.

Their main language of communication is English. They also speak Finnish well and some Italian. When we were researching which Finnish city to move to, we found that at least three cities (Helsinki, Tampere and Turku) have public (i.e. free) schools where the language of instruction for primary and lower secondary is English.

See: http://nordic.businessinsider.com/finland-has-one-of-the-best-education-systems-in-the-world–here-are-4-things-it-does-better-than-the-us-2016-11/

We chose to move to Tampere and registered our daughters at the Finnish International School of Tampere. The school year started on 10 August. A few days before that, the families of the newly enrolled children were invited to a meeting with the principal and some of the teachers.

These are my first impressions from that meeting and from our daughters’ first week of school:

– There are about 700 students in the school. Some come from families where both parents are from overseas. Others are from families where one of the parents is from Finland. The majority, however, are from Finnish families that have decided their children should get their education in English and therefore become very good English speakers.

– Finnish is taught to all foreign students as a second language for several hours a week.

– After the first day of school both my daughters were amazed that the teachers had asked the students to call them by their first name. There is no Mr, Mrs, Miss, or even worse Dr., followed by the family name. Just the first name. Plain and simple.

– There are the usual subjects such as Math, History, Science, etc. The subjects, however, also include woodwork, textiles, and home economics, which is compulsory for all students and covers cooking, baking bread and cakes, making jams, ironing clothes, and using a sewing machine.

– Students are given a personalised learning schedule depending on the extra subjects they have enrolled in. This means that school days do not always start at the same time: school can start at 8am on Monday, at 10 on Tuesday, at 9 on Wednesday, etc.

–  The city of Tampere has an online platform for all schools where parents, students and of course teachers are registered and can communicate with each other about absences, problems, test results, etc. At the same time, the city education office can gather data about enrolment, nationalities, student-teacher ratios, etc.

–  The school day ends at 2 or 3pm. Lunch is provided for free at school.

– At the meeting with the teachers, the mother of a new American student asked at what age kids are expected to go to school alone in Finland. The principal replied that in Finland that would be from the first grade. Our daughters are of course older than that and are learning to use the public buses to go to school. I went with them for the first couple of days and (coming from Jakarta) was really surprised to see very young kids getting on the bus, tapping their bus cards, and travelling alone or with friends on their way to and from school.

–  Every 45-50 minutes there is a break and the students are expected to get out of their classroom to stretch their legs. Once a day they are also expected to get outside the building and into the school yard. Some of the parents asked at what temperature during the winter kids are kept inside. The teachers said that happens only very rarely, when it is exceptionally cold: down to -15 C, with the right clothes, the students are taken outside to get some fresh air every day.

So, these are my very first positive impressions of the best school system in the world. No doubt I will learn more as the year progresses. There must be something that does not work well, but so far I have not been able to see it.


The struggle of bringing research into climate change policy

I read a very interesting article in the Guardian Weekly about the struggle of science research in the United States following the election of Donald Trump as president (The climate change battle dividing Trump’s America). Funding for climate change research is being cut. A climate change sceptic has been appointed as head of the Environmental Protection Agency. A couple of things caught my attention in this article:

  1. Yes, informing policy decisions with scientific knowledge and research is difficult
  2. Science and evidence for policy is under attack from the conservative right but also from a postmodern strand of left-wing discourse which views science as a social construct
  3. Politicians in the US, UK and Europe have become too technocratic, unable to view issues on a personalised, emotive basis
  4. Climate change scientists in the US are thinking of running for Congress, thus becoming politicians in order to bring more evidence and science into policy

These are difficult times for making the case that research, knowledge and science matter for informing and improving policy decisions, particularly in the West. Luckily, there are countries that are investing in evidence and seeking ways to bring more of it into policy, as Louise Ball of the Overseas Development Institute has highlighted in a recent blog: Developing and emerging countries buck the ‘post-truth’ trend.


Doing Development Differently means Doing Monitoring, Evaluation & Learning Differently too

Recently, we attended a two-day workshop on ‘Implementing the New Development Agenda: Doing Development Differently (DDD), Thinking and Working Politically (TWP) and Problem Driven Iterative Adaptation (PDIA).’ The event was co-organised by the Knowledge Sector Initiative, the KOMPAK programme, and the World Bank in Indonesia, and was attended by practitioners, researchers, government and other partners – and it was great to see how the debate is becoming more mainstream and nuanced. With the benefits of adaptive programming firmly accepted (at least by those in attendance), workshop sessions provided space to explore more in-depth questions, such as: how can you monitor, evaluate and learn (MEL) from adaptive programmes?

MEL is particularly important for the quick feedback loops needed to inform adaptive programming. The session discussion was really interesting, and it looks like there may be an emerging DDD MEL community of practice. Broadly speaking the discussion identified two main challenges and drew out experiences that show how these can be overcome:

1. The need for an integrated MEL team that encourages a culture of learning

Doing Development Differently requires programme teams to contribute actively and explicitly to programme learning, for example by providing feedback and reflection in meetings or after-action reviews. This should help to inform programme decision-making. However, programme teams often do not have the necessary skills to do this – or they do not see it as part of their day-to-day work, and don’t make time for it.

For too many development programmes, the monitoring and evaluation (and only sometimes learning!) unit is staffed with M&E experts who specialize in meeting donor accountability reporting requirements. They tend to be separate from the work of the programme teams, and from day-to-day interaction with partners. They develop their own plans for monitoring activities and rarely involve the programme team and partners.  What’s more, the M&E manager is often not part of the programme’s senior management team.

Some solutions we identified:

  • Programme team leaders need to invest in developing different learning capacities within teams.

Programme officers’ job descriptions should include monitoring and learning tasks, as well as soft skills such as facilitation to enable better participation in programme design and activities.

  • The programme’s senior management team should have strategies in place to develop a learning culture within their teams.

Learning cannot be imposed on the staff. It has to emerge in a participatory way. Examples include rewarding writing and publications, creating space for open discussion about what works and what does not, and generating evidence that being involved in monitoring and learning actually helps to improve your day-to-day work with partners.

  • MEL teams and units should be set up only if they are really needed and add value.

If the MEL work done by the programme teams is sufficient, then there is no need to have a dedicated MEL team. Where they are necessary (for instance in larger programmes), greater collaboration between the MEL and programme teams can be fostered by creating space for reflection and sharing that has a clear purpose. For example, the purpose may be to inform programme staff about decisions, to validate evidence of progress, or to reinforce the programme goals and approach. Another way would be to involve MEL staff in programme activities, such as designing prototypes.

  • All this requires a flexible environment that provides a real delegation of authority and decision-making within teams.

2. The need for MEL frameworks that are fit for purpose (to enable learning and adaptation, rather than only accountability and milestone reporting)

MEL frameworks and approaches are often produced too late in the programme cycle, and are often overcomplicated.  There are two reasons for this. First, there is sometimes a misunderstanding about the expectations of the funder. Second, MEL plans are often a key deliverable that triggers a milestone payment to the contractor and therefore need a lot of detail.

Another problem is that adaptive programmes – especially in the field of social change and policy innovation – may implement several pilots at the same time. Monitoring and learning from them results in rich case studies and stories of change. The reporting to the funder is often in an aggregated form, which loses important information about patterns and differences. When the information becomes too generalised, it is less useful to inform the funder’s investment decisions.

Some of the solutions identified were:

  • Programme leadership could take the initiative to design reporting processes that start from what is most useful for the programme, and discuss changes to reporting requirements with the donor.

For example, the case could be made for annual progress and learning reports, with very brief highlight updates on a quarterly or six-monthly basis.

  • MEL systems have to be fit for purpose: do not "over-engineer" the MEL framework and approach.

Start small. Be adaptive and test a few simple tools and questions that help you learn. Work with funders to select tools and processes that work for the programme. And invest in team capacities and capabilities that really help to inform an adaptive approach.

  • MEL teams have to be integrated into the work of the programme.

Much of the above discussion has focused on the monitoring aspects of MEL – ongoing gathering and reporting of data.  Which leads to the question – is there a role for evaluation (discrete studies) to support adaptive programming?  And if so, what might they look like?  But that’s a discussion for another blog (including how we define evaluation as compared to monitoring).

This article was originally published on BetterEvaluation. Read the original article.


On the Indonesian Knowledge Sector: two new working papers

At the Knowledge Sector Initiative we have just published two new working papers on various aspects of the Indonesian knowledge sector and on evidence-to-policy processes, systems and experiences. I really enjoyed working on these papers.

Is Measuring Policy Influence Like Measuring Thin Air? The Experience of SurveyMETER in Producing Three Episode Studies of Research-based Policy Influence

Arnaldo Pellini, Tanty Nurhayati Djafar and Ni Wayan Suriastini

This working paper reflects on the experience of SurveyMETER, a policy research institute based in Yogyakarta, in testing the use of episode studies to describe a policy influence and policy engagement process that SurveyMETER had been leading. Episode studies are one of the tools that can help policy research organisations and think tanks document the uptake of their research and/or the degree to which the evidence they have produced has informed policy processes and policy actors. A key lesson from the experience of SurveyMETER is that the investment of time and resources to produce the episode studies is the result of the intent and commitment of the organisation’s leadership to learn about what works and what does not in their policy engagement and policy influence activities.

Linking Values and Research Evidence for Policy Advocacy: The Journey of the Indonesian Forum for Budget Transparency

Inaya Rakhmani, Arnaldo Pellini and Yenti Nurhidayat


Indonesia is the largest economy in Southeast Asia but ranks only 88th (out of 167 countries) in Transparency International’s 2015 Corruption Perception Index. Anti-corruption reforms in Indonesia have been slow due to generalised expenditure inefficiencies and the misuse of public funds at both national and local levels. The government is trying to address these issues, but these efforts also require the involvement of non-state actors. Civil society organisations have been involved in this policy issue since 1998. SEKNAS FITRA is one of them. This policy research and advocacy organisation has played a prominent role in pushing for more budget transparency in Indonesia. This working paper describes how SEKNAS FITRA has evolved over the years and how, by linking policy research and advocacy, it has become a key non-state actor in this policy area.

 

 


Cycling to work in Jakarta: can we start to hope?

It takes me about 20 minutes to cycle to the office, and on my way I saw not one but two other cyclists heading to work. And guess what? Both were riding a Brompton, as I do.

The first cyclist happened to follow the same route as me. I tailed him all the way to Senopati. He rode a red Brompton, full of accessories: front carrying bag, saddle pouch, retractable mirror fitted on the handlebar. He wore the compulsory helmet and anti-pollution mask.

I saw the second Brompton rider a few minutes later as I was starting to push my bicycle up the ramp of the Transjakarta footbridge. A light green Brompton, inconspicuous, with only basic accessories.

I do not know about you, but I have the impression, since I came back from annual leave last July, that more people in the streets are taking their bicycles to work. Don’t get me wrong: it is not that the streets are now flooded with brightly coloured bicycles. But from one cyclist I could spot every three weeks or so, I now see three or four a week. Even two on the same day, as happened last Wednesday. Is something changing? Is this the beginning of a shift that will bring more people onto their bicycles and fewer cars and motorbikes into the streets? Can we start to hope?

After all, Jakarta is among the top 10 cities with the worst traffic in the world. Traffic jams cost the city economy an estimated $3 billion annually. In 1999, one of the most recent estimates put the health costs associated with air pollution in Jakarta at $220 million.

The city administration is doing a lot to address its traffic problem. It has created the Transjakarta bus network. It is building the first north-south MRT line, with both elevated and underground segments. This is a massive investment that finally took off when Joko Widodo, now President, was elected Governor of the city back in 2012.

These investments are attempts to address the traffic congestion problem and could be complemented by smaller investments that enable more environmentally friendly ways to commute to work or go to the supermarket or the grocery store. These could include bicycle lanes that are separated and protected from cars and motorbikes. Not everyone will be able to commute to work by bicycle. Some of my colleagues, for example, live too far away to do that. But I think those who live close enough would commute by bicycle if the right infrastructure made it safe.

I see more cyclists in the streets lately. Maybe it is just my imagination or chance, but hopefully we can really start to hope that cycling to work becomes more accepted and practised.


Learning about learning in an adaptive programme

I am re-posting here the blog published last week by BetterEvaluation in which Fred Carden and I discuss learning in an adaptive programme. BetterEvaluation has started a conversation to answer questions such as:

  • How relevant are these ideas for our work?
  • How different is learning in an adaptive programme compared to what we already do?
  • What are some challenges in doing evaluation in ways that support adaptive management?  How can they be overcome?
  • Are there good examples of evaluation for adaptive management we can learn from? Or guidance?

If you’d like to be involved in the discussion further with Better Evaluation and help with the development of an Adaptive Management Option page, please register your interest and let them know what examples, advice, resources or questions you’d like to share.

The Australia-Indonesia Partnership for Pro-Poor Policy: the Knowledge Sector Initiative (KSI) is a joint programme between the governments of Indonesia and Australia that seeks to improve the lives of the Indonesian people through better quality evidence-informed policy making. The programme is working with Bappenas (the Indonesian Ministry of National Development Planning) to assist research institutions to improve the quality and relevance of their research; to improve the communication of research results to inform public debate and policy making processes; and to identify and mitigate systemic barriers that limit interaction between knowledge production, intermediation, demand and use. The first phase of the programme started in May 2013 and will end in June 2017.

KSI is built on the hypothesis that greater use of research evidence improves public policy. It does not have ready-made solutions to the problems in the evidence-based policy making space in Indonesia: it is a programme that has to find solutions that work through trial, error and learning. In the following conversation between Fred Carden and myself, we discuss what learning means in an adaptive programme such as KSI.

Arnaldo Pellini I have been thinking lately about the meaning of Learning in a programme like KSI which aims to be problem-driven and adaptive. Recently Duncan Green wrote that ‘adaptive management, seems to be where the big aid agencies [and large programmes?] have found a point of entry into the whole ‘Doing Development Differently’ debate.’ There is a lot of talk around Learning in this attempt to find a different way to address ‘wicked hard problems’.

A couple of weeks ago I stumbled upon a quote by Denis Diderot: ‘There are three principal means of acquiring knowledge… observation of nature, reflection, and experimentation. Observation collects facts; reflection combines them; experimentation verifies the result of that combination’. It made me think about the implementation of KSI and the process of transforming raw data from observations into knowledge and learning to be used by others. I also thought about the many meanings and interpretations that are given to learning by all the individuals and stakeholders who gravitate around the KSI programme. I came to the conclusion that establishing and developing a learning function that satisfies everybody is almost impossible, because the demands, needs and interests are very different. What do you think?

Fred Carden Interesting question. Let’s start from what KSI’s learning approach means. To look at that I will contrast it with the approach that was (appropriately) used in the design phase for KSI (back in 2010-2013), which many assumed would continue to be central to how KSI operated. That did not turn out to be the case. KSI was designed largely through the conduct of diagnostics to get a better handle on the issues and problems at play and to help identify some key starting points and key actors in the knowledge sector. It did that well.

If we go back to our dictionaries, diagnosis is about identifying a problem, be it a disease, or an institutional or legal barrier. As such it is positivist; it is about getting clear on what is happening. Diagnosis is not treatment. Diagnosis is not about doing something to address what has been identified or clarified; it is merely about our best guess on what the problem actually is – not to dismiss or downgrade diagnosis, but simply situate what it does and does not do. Doing a diagnosis is useful because as we learn our diagnosis changes. But we don’t learn by doing diagnosis. A diagnostic approach can identify some of the key problems and opportunities through a number of tests or studies. At some point, an agreement is reached that sufficient diagnosis has been carried out to identify the problem(s).

AP Interesting metaphor. So, the diagnosis is not an end in itself. It is a means to an end, that is to generate and accumulate evidence to better understand and unpack the problems a programme will face. It kick starts the process of thinking of possible solutions. You are also saying that diagnosis is more than an assessment done at the beginning of a programme, it is iterative.

FC Yes. In the case of the knowledge sector, the diagnostics mapped out a response to the diagnosis with the design of a fifteen-year programme in three phases, that is, KSI. Part of the diagnosis was that the problem in the Indonesian knowledge sector would take a long time to treat because of the multiple manifestations of the problem in different sectors and at different levels. The diagnosis did not suggest taking two aspirins and calling again in the morning. It said you would be at this a long time and suggested there would be many twists and turns along the way. It also suggested that diagnostic studies would continue throughout implementation of the programme. The diagnostics and experience also suggested that there was a need for much stronger ownership of the process in local institutions and organizations. So to move forward, KSI moves beyond diagnostics to the “doing” – and that is where the learning happens.

AP I can see a risk here, especially if researchers are involved in a programme, and that is that diagnostics are never enough. If they are iterative, they almost certainly raise new questions that can be studied further. In a sense, the diagnostic and learning that comes from it may never end.

FC I agree, the risk is there, but remember learning and better understanding does not come from diagnostic studies. Learning comes from doing:

Doing is the precursor to learning and learning is the precursor to developing a robust vision for the work to be done going forward. Patrizi & Patton, 2009, IDRC.

We had excellent diagnosis to start with but we knew that a diagnosis could only peel back so many layers of the complex problems of the knowledge sectors; these are many, such as poor quality of research, low capability to demand evidence in government organisations, insufficient incentives for academicians to publish in international journals, etc. Learning comes also from testing things out, learning from what happens, revising and testing again. If we go back to our dictionary, learning is the activity or process of gaining knowledge and skills by study or practice – put another way, modification of a behavioral tendency by experience (as opposed to conditioning). As we learn we move into new territory, so some new diagnostics are needed, but always with an eye on how they help us move towards our objective.

AP Problems or solutions? Where to start and what does it mean for learning? I find these questions interesting on multiple levels: adaptive development, alternative development processes, monitoring and evaluation, knowledge management, etc. I recognize that I am biased and believe that it is important to start from problems first, with an open mind about what the solutions can be. Starting from solutions (i.e., expert solutions) has been one of the main problems that development has had in the last 40+ years, particularly in building state capability. But what you are saying is that it is also possible to start from solutions with an open mind. All of which is also conducive to learning.

FC A learning approach starts from the position that we have some idea of what we think the problem is (diagnosis is problem-driven), and some ideas about what to do about it, some possible solutions. When I say ‘we’, I do not mean one group (such as a programme team) with one view and one set of ideas, but a number of different stakeholders who agree on the general parameters of the problem they want to solve and may want to test different ideas and different approaches. The path forward in a learning approach is to take some of these potential solutions and test them, rather than focus in the first instance on a redefinition of the problem. So I say solution-driven because the whole point is to experiment and find possible solutions. But these need to be based on evidence of what the problem seems to be. Too often research is happy with its diagnosis, but  to me, the issue is how we use that diagnosis for change.

It also means being open to the potential solutions of others and where it makes sense, integrating and adapting others’ ideas and approaches. As we test, we refine the problem and refine the solutions. Through testing and interaction with the problem, elements of it become clearer and different elements take on more importance than others, sometimes because they are more tractable and therefore help create momentum or a vision for the work to be done going forward, sometimes because we did not see them until we started the work. A learning approach is iterative, going back and forth between problem and potential solution. As we test things out we also redefine and refine our understanding of the problem. But the key point is that a learning approach doesn’t stay focused on only the problem, it goes to solutions.

Documenting our learning is the major activity in which KSI is engaged at the moment. The evaluation products such as Stories of Change and their synthesis, the KSI Performance Story, the organizational assessments of policy research institutes, the documentation of practices that have worked elsewhere all contribute to the learning that is informing directions for the next phase of the programme.

AP In a sense it is not so much a dichotomy between problems and solutions, rather the acknowledgement that a learning-based approach is the common denominator between problem analysis and solution testing.

FC A learning-based approach does not negate the need for periodic diagnostics. But the learning really happens when we try something out and we see what works, why and for whom. We can use that learning to adjust, carry out new diagnostics or sometimes continue on the same path. The approach intends to be more partner-driven and is focused on identifying the mechanisms that will strengthen the knowledge sector over the long term. It involves trial and error, which brings us to the point that learning has opportunity costs. We have to recognize these, but I would argue that the resources spent on learning help ensure we waste much less of our precious programme resources. Some of the working papers that KSI has presented are diagnostic in nature, such as the partner review of university barriers led by Yanuar Nugroho. Some working papers present solutions that have been tested and tried out elsewhere and which could generate ideas for possible solutions in the context and with the partners with whom KSI works (e.g., Managing a Government Think Tank: Inside the Black Box; Investing in Evidence: Lessons from the UK Department for Environment, Food and Rural Affairs).

AP To conclude, it is not so much about whether to start from problems or solutions, in terms of acquiring learning that informs plans and actions. Both are good options.

FC Yes, that’s right. We need some thoughtful, locally-driven problem definition, but we also need to make some forays into possible solutions. I would add that what matters is also who speaks of the problems and who is suggesting the possible solutions. The KSI programme tries to give ownership and control of the analysis of problems and the identification of solutions to partners through core grants and other mechanisms.

Complex interventions are, well… complex. There are no easy solutions, and even understanding the problems can be a challenge. As someone once defined complexity: ‘You don’t solve complex problems, but you know you are making progress if you like your new problems better than your old ones.’ The essence of a learning approach is continuous adaptation and evolution of problem and solution.


Here is what we learned about learning about PDIA (Part II)

Written by Arnaldo Pellini, Endah Purnawati, and Siti Ruhanawati

During the last couple of months we have gone through the six modules of the online PDIA course developed by Matt Andrews et al. We want to share here what we have learned about learning about Problem Driven Iterative Adaptation (PDIA). You can find the Part I blog with some of our early reflections here.

We work in a relatively large programme which aims, with its partners, to strengthen the capabilities and systems of evidence-based policy making in Indonesia. It is a programme to which a PDIA approach seems particularly suited (see the Knowledge Sector Initiative).

I (i.e. Arnaldo) did the course a year or so ago and found it super-interesting. Ana and Endah did not take part at the time but really wanted to discover what PDIA is and how it can help their work.

When I took part in the original course, I saved the reading material and assignments into folders: one folder for each module. I also created YouTube playlists storing the video lectures in the right sequence.

The six modules of the course are:

  • Capability for Policy Implementation
  • Techniques of Successful Failure
  • Building the Capability you Need
  • PDIA to Escape
  • Constructing Problems
  • Deconstructing Problems

We decided to meet every second week. I shared the reading material, assignment, and video playlist for the first module, and two weeks later we met to discuss it. For me this was also a good opportunity to go through the material again.

After the second module, Ana and Endah realized that it was quite difficult for them to make time, especially for the readings. Two weeks seemed a good amount of time for one module, but due to the workload in the programme, they found themselves having to rush the reading during the last couple of days before we would meet again. It did not work well.

At Module 3 we decided to try something different.

We watched the module’s videos together. This proved to be the best way to go about learning about PDIA and we continued in this way until Module 6.

Our key takeaways from learning about PDIA are:

  • Watching the videos together, pausing them at a particularly interesting point or rewinding them when something was not clear, gave us the opportunity not only to learn about the principles and key concepts underpinning PDIA, but also to discuss right away what those principles and key concepts mean for our programme and our work. Is PDIA feasible in our programme? What are the challenges? Which areas of our programme deal with wicked hard problems? Which ones with logistical problems? Do we come up with three possible solutions for each problem we identify? And so on.
  • The approach emphasized by PDIA, which pushes us to get to the real roots of problems, can help us assess whether we are dealing with real problems or just solutions presented as problems.
  • When we identify a real and concrete problem, we can be confident that we can think through and find some solutions.
  • PDIA requires acceptance and space to be applied in a programme. This requires programmes to have the flexibility to try out solutions and accept that these may not succeed.

We shared these takeaways with the programme at a brown bag lunch at the office. The questions that colleagues asked us resonate with the discussions taking place in the PDIA and DDD blogosphere about the next steps for PDIA:

  • When testing a solution, how long do you go before you decide it is not working?
  • How do you reconcile annual plans and budgets, approvals from steering committees, and regular reports to the funder with this way of working? Maybe by developing sufficiently general plans, getting them approved, and allowing for modifications within certain parameters?
  • How do you engage government partners in this way of working?
  • … and more

The learning about PDIA continues here in Jakarta.

 

P.S. In case you wonder what the photo at the beginning of the blog has to do with PDIA, the answer is: not much. Being a bicycle commuter in Jakarta, it is my personal campaign for bicycle lanes in the city, the absence of which, now that I think about it, can be seen as a wicked hard problem to solve.