Evaluation of results and adaptation of EU Rural Development Programmes

Bibliographic Details
Published in: Land Use Policy, Vol. 67, pp. 298–314
Main Authors: Andersson, Anna; Höjgård, Sören; Rabinowicz, Ewa
Format: Journal Article
Language: English
Published: Kidlington: Elsevier Ltd, 01-09-2017
Description

Highlights:
• The evaluation process for the EU Rural Development Programmes could potentially generate valuable information for future policy design.
• However, a lack of relevant data severely impedes the application of scientifically well-founded evaluation methods.
• Evaluation results are uncertain and theoretically questionable; few recommendations are made, and those that are given are seldom followed.
• Furthermore, results are only available in their native languages, limiting the possibilities of learning from the experiences of other Member States (MS).
• We find few indications that evaluations affect policy design; rather, they appear to have been used for policy legitimisation.

Summary: The EU Commission highlights evaluations as important for improving common policies. But do evaluations actually contribute? This paper examines whether this has been the case for the EU Rural Development Programmes (RDPs). We investigate (1) to what extent evaluations have influenced the design of national programmes and (2) whether they have affected the Rural Development Regulation on which national programmes are based. Our main finding is that evaluations do not seem to affect future policy to any discernible degree. This holds for both national programmes and the Regulation itself, which seems to have evolved in response to external pressures. Partly, this may be because evaluations tend to give vague or overly general recommendations. Moreover, evaluations seldom apply counterfactual analysis, often because of a lack of data, so their results may be methodologically questioned. Lastly, evaluations, and RDPs, are hard to locate and seldom translated from their native languages, impairing the possibilities of learning from the experiences of others.
ISSN: 0264-8377, 1873-5754
DOI: 10.1016/j.landusepol.2017.05.002