User-oriented Evaluation

Selected Activities

a) In collaboration with other themes, build capacity in assessing the value of hazard warnings and advice through targeted workshops, conference sessions and reviews involving risk-reduction and social science researchers, operational staff, service providers and key user groups. Promulgate current capability in both the meteorological and impact communities through publication of a white paper and/or special issues of journals. Identify go-to people for each hazard to assist researchers.


b) Through workshops and inter-comparisons involving social scientists, operational meteorologists and users, and building on a review of current capability, identify appropriate methods and metrics for evaluating hazard forecasts and warnings that reproduce subjective judgement. Evaluate selected methods with users in FDPs (simple categorical scores of the kind sketched below are one possible starting point).
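
By way of illustration only, the following minimal Python sketch computes three standard categorical scores (probability of detection, false alarm ratio, critical success index) from matched warning/event records; the function name and boolean data layout are assumptions for this sketch, not an agreed format.

```python
import numpy as np

def contingency_scores(warned, observed):
    """Categorical scores from a 2x2 warning/event contingency table.

    warned, observed: boolean arrays over matched warning areas/periods
    (illustrative layout, assumed here).
    """
    warned = np.asarray(warned, dtype=bool)
    observed = np.asarray(observed, dtype=bool)
    hits = int(np.sum(warned & observed))
    false_alarms = int(np.sum(warned & ~observed))
    misses = int(np.sum(~warned & observed))
    pod = hits / (hits + misses) if (hits + misses) else float("nan")
    far = false_alarms / (hits + false_alarms) if (hits + false_alarms) else float("nan")
    csi = hits / (hits + misses + false_alarms) if (hits + misses + false_alarms) else float("nan")
    return {"POD": pod, "FAR": far, "CSI": csi}
```

Such scores provide a baseline against which methods that better reproduce subjective judgement can be compared.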


c) Together with the multi-scale forecasting theme, develop and evaluate improved ensemble diagnostics and new approaches to ensemble verification, particularly with relevance to hazard predictions, and evaluate these in case studies and FDPs (one candidate ensemble score is sketched below).
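
One widely used ensemble score that such work could build on is the continuous ranked probability score (CRPS). A minimal sketch for a single scalar forecast, using the standard kernel form of the score (function and variable names are illustrative):

```python
import numpy as np

def crps_ensemble(members, obs):
    """CRPS for one ensemble forecast of a scalar quantity, via the
    kernel form E|X - y| - 0.5 E|X - X'| over the ensemble members."""
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))                              # E|X - y|
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))  # 0.5 E|X - X'|
    return term1 - term2
```

Lower CRPS is better; for a single deterministic forecast it reduces to absolute error, which makes it convenient for comparing ensembles against deterministic hazard guidance.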


d) Develop and apply techniques for evaluating errors in warning timing and duration, and demonstrate them in FDPs (a toy timing-error calculation is sketched below).
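
Assuming warnings have already been matched to observed events (itself a non-trivial step), the basic quantities are simple to define. The following toy sketch, with illustrative names and sign conventions, computes a lead-time error and a duration error for one warning/event pair:

```python
from datetime import datetime

def timing_errors(warn_start, warn_end, event_start, event_end):
    """Lead-time and duration errors for one warning matched to one observed
    event (datetime arguments; the matching step is assumed already done).

    lead > 0: the warning began before the event (desirable).
    duration_error > 0: the warning period was longer than the event.
    """
    lead_h = (event_start - warn_start).total_seconds() / 3600.0
    duration_error_h = ((warn_end - warn_start)
                        - (event_end - event_start)).total_seconds() / 3600.0
    return lead_h, duration_error_h

# e.g. a warning 06-18 UTC for an event observed 09-15 UTC:
lead, dur_err = timing_errors(datetime(2016, 6, 1, 6), datetime(2016, 6, 1, 18),
                              datetime(2016, 6, 1, 9), datetime(2016, 6, 1, 15))
# lead = 3.0 h, dur_err = +6.0 h (the warning over-covered the event)
```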

e) Develop robust approaches for accounting for observation uncertainty in verification and demonstrate them in case studies, RDPs and FDPs (a perturbed-observation sketch follows).
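
One common approach, shown here purely as a sketch, is perturbed-observation resampling: draw plausible observation errors from an assumed distribution, recompute the score for each draw, and report the resulting spread. The Gaussian error model and all names below are assumptions for illustration:

```python
import numpy as np

def score_under_obs_error(fcst, obs, obs_sigma, score_fn, n_draws=1000, seed=0):
    """Sample the distribution of a verification score when observations
    carry assumed Gaussian error with standard deviation obs_sigma."""
    rng = np.random.default_rng(seed)
    fcst = np.asarray(fcst, dtype=float)
    obs = np.asarray(obs, dtype=float)
    return np.array([score_fn(fcst, obs + rng.normal(0.0, obs_sigma, obs.shape))
                     for _ in range(n_draws)])

# Example: spread in RMSE implied by a 0.5-unit observation error.
rmse = lambda f, o: float(np.sqrt(np.mean((f - o) ** 2)))
samples = score_under_obs_error(np.array([1.0, 2.0, 3.0]),
                                np.array([0.8, 2.4, 2.9]), 0.5, rmse)
lo_q, hi_q = np.percentile(samples, [5, 95])  # uncertainty band on the score
```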


f) In collaboration with the Vulnerability & Risk and Communication themes, catalogue post-event case-study evaluations, identifying similarities and differences among them, the sources of hazard information used, how advice was used in decision making, and good practice in evaluation.


g) Contribute to a cross-cutting activity to identify and promulgate good practice in enhancing user trust by assessing and communicating, from a user perspective, forecast successes and failures, their causes, and improvements to forecast capability.


h) Contribute to a cross-cutting international collaboration to collect social media, volunteered and other non-conventional data, to apply and assess data quality metrics, and to use these data in verification (a toy quality check is sketched below).
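
Data quality metrics for such sources range from simple plausibility checks to full statistical quality control. As a deliberately crude toy example, not an endorsed method, a neighbour ("buddy") check flags a report that departs too far from nearby trusted observations; names and the threshold are assumptions:

```python
from statistics import median

def plausibility_flag(report, neighbour_values, tolerance):
    """Flag a crowd-sourced report as suspect if it departs from the median
    of nearby trusted observations by more than the given tolerance."""
    if not neighbour_values:
        return "unchecked"   # no reference data available nearby
    return "pass" if abs(report - median(neighbour_values)) <= tolerance else "suspect"
```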


i) Through workshops and demonstrations, collaborate with experts in other fields, e.g. the World Bank, the International Association of Evaluators, the International Council for Science (ICSU) Council on Evaluation, the Global Framework for Climate Services (GFCS) and the National Science Foundation (NSF) Hazards SEES programme, to raise interest in assessment of the value of weather services and, in particular, in stepwise evaluation of the mitigation of hazard potential, leading to publication of a white paper.


j) Lead a cross-cutting activity to use reviews, workshops and participation in the design and execution of FDPs to develop an understanding of the propagation of error and value through the processing chain, from meteorological observation and forecast to information use and user benefit (a simple end-of-chain value model is sketched below as a reference point).
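
At the user-benefit end of that chain, the classical static cost-loss model is one simple, well-known way to express forecast value in economic terms. The sketch below computes the relative economic value of a warning system for a user with a given cost/loss ratio; it is a stand-in for the full chain, with illustrative names:

```python
def relative_value(hits, false_alarms, misses, n, cost_loss_ratio):
    """Relative economic value of a warning system for a user who pays
    cost C to protect and loss L if caught unprotected (static cost-loss
    model; expenses expressed in units of L, so C/L = cost_loss_ratio)."""
    a = cost_loss_ratio                      # C/L, assumed 0 < a < 1
    s = (hits + misses) / n                  # event base rate, 0 < s < 1
    e_fcst = (hits + false_alarms) * a + misses * 1.0   # act on warnings
    e_clim = n * min(a, s)                   # best of always/never protecting
    e_perf = (hits + misses) * a             # act on a perfect forecast
    return (e_clim - e_fcst) / (e_clim - e_perf)
```

Here V = 1 corresponds to the expense of acting on a perfect forecast and V = 0 to acting on climatology alone; plotting V across cost/loss ratios shows which users benefit from the warning system.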


k) Lead a cross-cutting activity, in collaboration with operational meteorologists and CBS, to develop verification tools that enable operational meteorologists to judge the value of new products and capabilities. Evaluate the benefit in FDPs, testbeds etc. and publish the results. 


l) Lead a cross-cutting activity, in collaboration with operational meteorologists and CBS, to develop real-time verification facilities that support operational meteorologists in assessing the accuracy of the current forecast. Evaluate the benefit in FDPs, testbeds etc. and publish the results (a minimal rolling-verification sketch follows).
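
A real-time facility might, at its simplest, maintain a rolling window of recent forecast/observation pairs and expose up-to-date summary scores. A minimal sketch, with an assumed class name and window length:

```python
from collections import deque
import numpy as np

class RollingVerifier:
    """Keeps a window of recent (forecast, observation) pairs and exposes
    up-to-the-minute summary scores for forecaster situational awareness."""

    def __init__(self, window=48):
        self.pairs = deque(maxlen=window)    # oldest pairs drop off automatically

    def update(self, fcst, obs):
        self.pairs.append((fcst, obs))

    def scores(self):
        if not self.pairs:
            return {"bias": float("nan"), "rmse": float("nan"), "n": 0}
        f, o = (np.array(x, dtype=float) for x in zip(*self.pairs))
        err = f - o
        return {"bias": float(err.mean()),
                "rmse": float(np.sqrt((err ** 2).mean())),
                "n": len(self.pairs)}
```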

