1 December 2013
Within the framework of the TEMPUS STREW project, the Centre for Education Policy prepared and delivered 6 trainings on the role of evidence in decision making in higher education. In total, about 150 individuals from 10 universities and 5 system-level institutions (higher education authorities) from 6 countries participated in the trainings, which took place between November 2012 and April 2013. The participants included university leadership, management, and administrative staff, as well as representatives of higher education authorities, such as ministries and other national-level structures, from Albania, Bosnia and Herzegovina, Macedonia, Montenegro, and Serbia.
The aim of the trainings was to contribute to strengthening capacities for informed decision making in higher education policy and university management, i.e. so-called evidence-informed or evidence-based policy making in higher education, by engaging a broad range of individuals directly or indirectly involved in decision making or evidence provision at various levels, mostly the system, institutional, and departmental or faculty levels. The initial training concept was based on a pre-training survey which showed that potential participants recognised the need to strengthen their knowledge and skills in the domain of evidence-informed policy making.
The overarching topic of the training courses was engaging evidence in the process of decision making at universities by strengthening the practices of collecting and analysing evidence for its further use in higher education decision making. We consider evidence to be anything that comes as a result of “...any systematic process of critical investigation and evaluation, theory building, data collection, analysis and codification related to development policy and practice. It also includes action research, i.e. self-reflection by practitioners orientated towards the enhancement of direct practice.” [1] We therefore saw participants as those identifying or providing evidence for internal university decision making, where evidence includes all the information, data and knowledge acquired through the systematic processes described above.
Each two-day session was conceived as a combination of training and interactive workshop. Participants were led through the content from simpler to more complex material, with the focus on topics identified as relevant through the pre-training questionnaire, starting with more familiar concepts and moving towards less familiar ones, while keeping the discussions and activities anchored in the specific university context at all times.
At the beginning of each training, the participants were introduced to global and European trends in higher education and the ways these affect regional and local contexts. This was followed by a presentation of the most notable challenges faced by higher education institutions and policy makers in the region or in the country in question. Participants were also given elaborate input on policy-related decision making and policy implementation tools, with a particular focus on evidence as a way to inform not only policy making but also policy implementation and evaluation. Most of each two-day training was dedicated to interactive workshops during which participants addressed a specific problem faced by their higher education institution. The problems were pre-determined by the facilitators, yet they were of a rather general nature, and in almost all cases participants had no difficulty recognising them as challenges faced by their own institutions as well.
During these two days it was vital that participants saw the importance of their own contribution to the process of enhancing quality at their institutions. It was not expected that the participants would completely change their practices after these trainings, primarily because the time frame was far too limited for that, and also because this project activity could not change the real context in which the individuals engaged here operate in their day-to-day work. What we did see as feasible, however, was a general enhancement of participants’ understanding of the importance of systematic data-collection practices, good analytical skills, and the capability to link evidence and institutional knowledge with strategic action. The facilitators were dedicated to strengthening participants’ sense of commitment to quality in performing their tasks, as well as their awareness of the relevance of their contribution to higher education development.
A detailed description of the trainings, including the report on the feedback received and recommendations, is provided in the “Report on the Implementation of Trainings in the Framework of the TEMPUS STREW Project” and can be downloaded here.
Participants’ Feedback
With regard to the feedback received after the training, 67% of participants completed the questionnaire, of whom 94% reported being overall extremely satisfied or satisfied with the training; in general, their satisfaction was relatively evenly distributed across the units into which the training was divided. Among the various aspects of the training, trainers’ knowledge of the subject notably received the most points, with 97% of respondents reporting that they were extremely satisfied or satisfied with it. Group activities also seem to have been highly appreciated by respondents. A crucial question in the feedback form concerned the perceived usefulness of the training for participants’ individual work at their respective institutions. Here, 56% reported that the training was useful or extremely useful, while 36% were undecided. At the same time, 77% of respondents said that the expectations they had before the training were met.
Regarding the aspects of the training participants appreciated most, interaction with colleagues and the opportunity to meet peers from other institutions, or even their own, was what they most frequently singled out as the most positive aspect of the training. They also tended to assess the assignments they worked on positively, even though some pointed out that they felt there was little they could do to bring about change. In trainings with few participants from university leadership, management, or policy-making circles, some participants noted this as a shortcoming, along with the remark that the trainings were little adapted to administrative staff and were oriented more towards decision makers. A few participants also felt that the input from facilitators would have been better if more examples from the region and beyond had been provided.
---
Footnotes:
[1] Sutcliffe, S. & Court, J. (2005). Evidence-Based Policymaking: What is it? How does it work? What relevance for developing countries? London: ODI, p. 3.