Course description:
The course aims to broaden academic and practical training in research, innovation, and the planning and management of higher education, focusing on the evaluation of the results and impacts of policies, programs, projects, and institutions. Students will engage with fundamental evaluation concepts (the policy cycle, theory of change, causal attribution, among others), evaluation designs and methods, and strategies for data collection and analysis. The course also covers meta-evaluation and meta-analysis, as well as recent trends in the field.
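As a small illustration of one synthesis technique named above, the sketch below pools hypothetical study-level effect sizes with a fixed-effect, inverse-variance-weighted meta-analysis. The function name and all input numbers are illustrative assumptions, not drawn from any item in the bibliography.

```python
import math

def fixed_effect_meta(effects, std_errors):
    """Pool study-level effect sizes using inverse-variance weights
    (fixed-effect model): each study is weighted by 1 / SE^2."""
    weights = [1.0 / se ** 2 for se in std_errors]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical effect sizes and standard errors from three studies.
effects = [0.30, 0.10, 0.25]
ses = [0.10, 0.05, 0.08]
est, se = fixed_effect_meta(effects, ses)
print(f"pooled effect = {est:.3f}, 95% CI half-width = {1.96 * se:.3f}")
```

Note how the most precise study (smallest standard error) dominates the pooled estimate; random-effects models, also covered in the meta-analysis literature, relax this by adding between-study variance.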
Bibliography:
Alcàntara, A. M.; Woolcock, M. (2014). Integrating Qualitative Methods into Investment Climate Impact Evaluations. Washington, DC: The World Bank.
Alston, J. M. et al. (2000). A Meta-Analysis of Rates of Return to Agricultural R&D: Ex Pede Herculem? Research Report 113. Washington, DC: International Food Policy Research Institute.
Bamberger, M. (2015). Innovations in the use of mixed methods in real-world evaluation. Journal of Development Effectiveness, 7(3), 317–326.
Bauer, M.S.; Kirchner, J. (2020). Implementation science: What is it and why should I care? Psychiatry Research, 283, 112376.
Bornmann, L. (2014). Do altmetrics point to the broader impact of research? An overview of benefits and disadvantages of altmetrics. Journal of Informetrics, 8(4), 895–903.
Cook, B.G.; Odom, S.L. (2013). Evidence-Based Practices and Implementation Science in Special Education. Exceptional Children, 79(3), 135–144.
Curry, S.; Gadd, E.; Wilsdon, J. (2022). Harnessing the Metric Tide: Indicators, Infrastructures & Priorities for UK Responsible Research Assessment. Research on Research Institute Report.
Agreement on Reforming Research Assessment (2022).
Delahais, T.; Toulemonde, J. (2017). Making rigorous causal claims in a real-life context: Has research contributed to sustainable forest management? Evaluation, 23(4), 370–388.
Edler, J. et al. (2012). The practice of evaluation in innovation policy in Europe. Research Evaluation, 21, 167–182.
Fang, L. (2015). Do Clusters Encourage Innovation? A Meta-analysis. Journal of Planning Literature, 30(3), 239–260.
Frey, K. (2000). Políticas públicas: um debate conceitual e reflexões referentes à prática da análise de políticas públicas no Brasil. Planejamento e Políticas Públicas, n. 21.
Gertler, P. J.; Martinez, S.; Premand, P.; Rawlings, L. B.; Vermeersch, C. M. J. (2016). Impact Evaluation in Practice. Washington, DC: The World Bank.
Hicks, D.; Wouters, P.; Waltman, L. et al. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520, 429–431.
Hicks, D.; Melkers, J. (2013). Bibliometrics as a tool for research evaluation. In: Link, A.N.; Vonortas, N.S. (Eds.), Handbook on the Theory and Practice of Program Evaluation, pp. 323–349. Cheltenham: Edward Elgar Publishing.
Horbach, S.P.J.M.; Halffman, W. (2018). The changing forms and expectations of peer review. Research Integrity and Peer Review, 3(8).
Lee, C. J.; Sugimoto, C. R.; Zhang, G.; Cronin, B. (2013). Bias in peer review. Journal of the American Society for Information Science and Technology, 64(1), 2–17.
Leeuw, F.; Vaessen, J. (2009). Impact Evaluations and Development: NONIE Guidance on Impact Evaluation. Working Paper 57490. Washington, DC: The World Bank.
Link, A.N.; Scott, J.T. (2013). The Theory and Practice of Public-Sector R&D Economic Impact Analysis. In: Link, A.N.; Vonortas, N.S. (Ed.) Handbook on the Theory and Practice of Program Evaluation. Cheltenham: Edward Elgar.
Milne, C. (2022). Evaluation capacity building in response to the agricultural research impact agenda: Emerging insights from Ireland, Catalonia (Spain), New Zealand, and Uruguay. Evaluation and Program Planning, 102127.
Morgan Jones, M.; Manville, C.; Chataway, J. (2017). Learning from the UK’s research impact assessment exercise: A case study of a retrospective impact assessment exercise and questions for the future. The Journal of Technology Transfer, 1–25.
Pearl, J.; Mackenzie, D. (2018). The Book of Why – The New Science of Cause and Effect. Basic Books, New York.
Pinto, D.; Bin, A.; Ferré, M.; Turner, J. A.; Rodrigues, G. S.; Costa, M. M.; Pereiro, M. S.; Mechelk, J.; Romemont, A. de; Heaune, K. (2024). Data-Driven R&D&I Management for Societal Impacts: Introduction and Application of AgroRadarEval.
Priem, J.; Taraborelli, D.; Groth, P.; Neylon, C. (2010). Altmetrics: A Manifesto. Retrieved March 24, 2022, from http://altmetrics.org/manifesto/
Sanderson, I. (2002). Evaluation, Policy Learning and Evidence-Based Policy Making. Public Administration, 80(1), 1–22.
Stern, E. et al. (2012). Broadening the Range of Designs and Methods for Impact Evaluations: Report of a Study Commissioned by the Department for International Development. DFID Working Paper 38, April 2012.
Stufflebeam, D. L. (2001). Meta-evaluation imperative. American Journal of Evaluation, 22(2), 183–209.
Thelwall, M. (2021). Measuring societal impacts of research with altmetrics? Common problems and mistakes. Journal of Economic Surveys, 35(5), 1302–1314.
Turner, J. A.; Guesmi, B.; Gil, J. M.; Heanue, K.; Sierra, M.; Percy, H.; Bortagaray, I.; Chams, N.;
Van der Most, F. (2010). Use and non-use of research evaluation: A literature review. CIRCLE
Catalog year: 2026
Credits: 4
Minimum number of students: 3
Language of instruction: Portuguese
Offering type: Regular
Offering location:
Times/Rooms:
Instructors:
Reservations:
No reservations.

| Time | Monday | Tuesday | Wednesday | Thursday | Friday | Saturday |
|---|---|---|---|---|---|---|
| 14:00 | A - | | | | | |
| 15:00 | A - | | | | | |
| 16:00 | A - | | | | | |