RealWorld Evaluation
Working Under Budget, Time, Data, and Political Constraints
Paperback, English, 2019
By J. Michael Bamberger (Independent Consultant) and Linda S. Mabry (Washington State University, WA)
1,879 kr
Product information
- Publication date: 2019-09-27
- Dimensions: 203 x 254 x 25 mm
- Weight: 1,200 g
- Format: Paperback
- Language: English
- Number of pages: 568
- Edition: 3
- Publisher: SAGE Publications
- ISBN: 9781544318783
Michael Bamberger has been involved in development evaluation for fifty years. Beginning in Latin America, where he worked in urban community development and evaluation for over a decade, he became interested in the coping strategies of low-income communities, how they were affected by development efforts, and how they influenced them. Most evaluation research fails to capture these survival strategies, frequently underestimating the resilience of these communities – particularly women and female-headed households. During 20 years with the World Bank he worked as monitoring and evaluation advisor for the Urban Development Department, evaluation training coordinator with the Economic Development Department, and Senior Sociologist in the Gender and Development Department. Since retiring from the Bank in 2001 he has worked as a development evaluation consultant with more than 10 UN agencies as well as development banks, bilateral development agencies, NGOs, and foundations. Since 2001 he has been on the faculty of the International Program for Development Evaluation Training (IPDET). Recent publications include: (with Jim Rugh and Linda Mabry) RealWorld Evaluation: Working under budget, time, data and political constraints (2012, second edition); (with Marco Segone) How to design and manage equity focused evaluations (2011); Engendering Monitoring and Evaluation (2013); (with Linda Raftree) Emerging opportunities: Monitoring and evaluation in a tech-enabled world (2014); and (with Marco Segone and Shravanti Reddy) How to integrate gender equality and social equity in national evaluation policies and systems (2014).

Linda Mabry is a faculty member at Washington State University specializing in program evaluation, student assessment, and research and evaluation methodology. She currently serves as president of the Oregon Program Evaluation Network and on the editorial board of Studies in Educational Evaluation. She has served in a variety of leadership positions for the American Evaluation Association, including the Board of Directors, chair of the Task Force on Educational Accountability, and chair of the Theories of Evaluation topical interest group. She has also served on the Board of Trustees of the National Center for the Improvement of Educational Assessments and on the Performance Assessment Review Board of New York. She has conducted evaluations for the U.S. Department of Education, the National Science Foundation, the National Endowment for the Arts, the Jacob Javits Foundation, Hewlett-Packard Corporation, Ameritech Corporation, ATT-Comcast Corporation, the New York City Fund for Public Education, the Chicago Arts Partnerships in Education, the Chicago Teachers Academy of Mathematics and Science, and a variety of university, state, and school agencies. She has published in a number of scholarly journals and written several books, including Evaluation and the Postmodern Dilemma (1997) and Portfolios Plus: A Critical Guide to Performance Assessment (1999).
- List of Boxes, Figures, and Tables
- List of Appendices
- Foreword by Jim Rugh
- Preface
- Acknowledgments
- About the Authors

PART I • THE SEVEN STEPS OF THE REALWORLD EVALUATION APPROACH

Chapter 1 • Overview: RealWorld Evaluation and the Contexts in Which It Is Used
  1. Welcome to RealWorld Evaluation
  2. The RealWorld Evaluation Context
  3. The Four Types of Constraints Addressed by the RealWorld Approach
  4. Additional Organizational and Administrative Challenges
  5. The RealWorld Approach to Evaluation Challenges
  6. Who Uses RealWorld Evaluation, for What Purposes, and When?
  Summary
  Further Reading

Chapter 2 • First Clarify the Purpose: Scoping the Evaluation
  1. Stakeholder Expectations of Impact Evaluations
  2. Understanding Information Needs
  3. Developing the Program Theory Model
  4. Identifying the Constraints to Be Addressed by RWE and Determining the Appropriate Evaluation Design
  5. Developing Designs Suitable for RealWorld Evaluation Conditions
  Summary
  Further Reading

Chapter 3 • Not Enough Money: Addressing Budget Constraints
  1. Simplifying the Evaluation Design
  2. Clarifying Client Information Needs
  3. Using Existing Data
  4. Reducing Costs by Reducing Sample Size
  5. Reducing Costs of Data Collection and Analysis
  6. Assessing the Feasibility and Utility of Using New Information Technology (NIT) to Reduce the Costs of Data Collection
  7. Threats to Validity of Budget Constraints
  Summary
  Further Reading

Chapter 4 • Not Enough Time: Addressing Scheduling and Other Time Constraints
  1. Similarities and Differences Between Time and Budget Constraints
  2. Simplifying the Evaluation Design
  3. Clarifying Client Information Needs and Deadlines
  4. Using Existing Documentary Data
  5. Reducing Sample Size
  6. Rapid Data-Collection Methods
  7. Reducing Time Pressure on Outside Consultants
  8. Hiring More Resource People
  9. Building Outcome Indicators Into Project Records
  10. New Information Technology for Data Collection and Analysis
  11. Common Threats to Adequacy and Validity Relating to Time Constraints
  Summary
  Further Reading

Chapter 5 • Critical Information Is Missing or Difficult to Collect: Addressing Data Constraints
  1. Data Issues Facing RealWorld Evaluators
  2. Reconstructing Baseline Data
  3. Special Issues Reconstructing Baseline Data for Project Populations and Comparison Groups
  4. Collecting Data on Sensitive Topics or From Difficult-to-Reach Groups
  5. Common Threats to Adequacy and Validity of an Evaluation Relating to Data Constraints
  Summary
  Further Reading

Chapter 6 • Political Constraints
  1. Values, Ethics, and Politics
  2. Societal Politics and Evaluation
  3. Stakeholder Politics
  4. Professional Politics
  5. Political Issues in the Design Phase
  6. Political Issues in the Conduct of an Evaluation
  7. Political Issues in Evaluation Reporting and Use
  8. Advocacy
  Summary
  Further Reading

Chapter 7 • Strengthening the Evaluation Design and the Validity of the Conclusions
  1. Validity in Evaluation
  2. Factors Affecting Adequacy and Validity
  3. A Framework for Assessing the Validity and Adequacy of QUANT, QUAL, and Mixed-Method Designs
  4. Assessing and Addressing Threats to Validity for Quantitative Impact Evaluations
  5. Assessing Adequacy and Validity for Qualitative Impact Evaluations
  6. Assessing Validity for Mixed-Method (MM) Evaluations
  7. Using the Threats-to-Validity Worksheets
  Summary
  Further Reading

Chapter 8 • Making It Useful: Helping Clients and Other Stakeholders Utilize the Evaluation
  1. What Do We Mean by Influential Evaluations and Useful Evaluations?
  2. The Underutilization of Evaluation Studies
  3. Strategies for Promoting the Utilization of Evaluation Findings and Recommendations
  Summary
  Further Reading

PART II • A REVIEW OF EVALUATION METHODS AND APPROACHES AND THEIR APPLICATION IN REALWORLD EVALUATION: FOR THOSE WHO WOULD LIKE TO DIG DEEPER

Chapter 9 • Standards and Ethics
  1. Standards of Competence
  2. Professional Standards
  3. Ethical Codes of Conduct
  4. Issues
  Summary
  Further Reading

Chapter 10 • Theory-Based Evaluation and Theory of Change
  1. Theory-Based Evaluation (TBE) and Theory of Change (TOC)
  2. Applications of Program Theory in Program Evaluation
  3. Using TOC in Program Evaluation
  4. Designing a Theory of Change Evaluation Framework
  5. Integrating a Theory of Change Into the Program Management, Monitoring, and Evaluation Cycle
  6. Program Theory Evaluation and Causality
  Summary
  Further Reading

Chapter 11 • Evaluation Designs: The RWE Strategy for Selecting the Appropriate Evaluation Design to Respond to the Purpose and Context of Each Evaluation
  1. Different Approaches to the Classification of Evaluation Designs
  2. Assessing Causality: Attribution and Contribution
  3. The RWE Approach to the Selection of the Appropriate Impact Evaluation Design
  4. Tools and Techniques for Strengthening the Basic Evaluation Designs
  5. Selecting the Best Design for RealWorld Evaluation Scenarios
  Summary
  Further Reading

Chapter 12 • Quantitative Evaluation Methods
  1. Quantitative Evaluation Methodologies
  2. Experimental and Quasi-Experimental Designs
  3. Strengths and Weaknesses of Quantitative Evaluation Methodologies
  4. Applications of Quantitative Methodologies in Program Evaluation
  5. Quantitative Methods for Data Collection
  6. The Management of Data Collection for Quantitative Studies
  7. Data Analysis
  Summary
  Further Reading

Chapter 13 • Qualitative Evaluation Methods
  1. Design
  2. Data Collection
  3. Data Analysis
  4. Reporting
  5. Real-World Constraints
  Summary
  Further Reading

Chapter 14 • Mixed-Method Evaluation
  1. The Mixed-Method Approach
  2. Rationale for Mixed-Method Approaches
  3. Approaches to the Use of Mixed Methods
  4. Mixed-Method Strategies
  5. Implementing a Mixed-Method Design
  6. Using Mixed Methods to Tell a More Compelling Story of What a Program Has Achieved
  7. Case Studies Illustrating the Use of Mixed Methods
  Summary
  Further Reading

Chapter 15 • Sampling Strategies for RealWorld Evaluation
  1. The Importance of Sampling for RealWorld Evaluation
  2. Purposive Sampling
  3. Probability (Random) Sampling
  4. Using Power Analysis and Effect Size for Estimating the Appropriate Sample Size for an Impact Evaluation
  5. The Contribution of Meta-Analysis
  6. Sampling Issues for Mixed-Method Evaluations
  7. Sampling Issues for RealWorld Evaluation
  Summary
  Further Reading

Chapter 16 • Evaluating Complex Projects, Programs, and Policies
  1. The Move Toward Complex, Country-Level Development Programming
  2. Defining Complexity in Development Programs and Evaluations
  3. A Framework for the Evaluation of Complex Development Programs
  Summary
  Further Reading

Chapter 17 • Gender Evaluation: Integrating Gender Analysis Into Evaluations
  1. Why a Gender Focus Is Critical
  2. Gender Issues in Evaluations
  3. Designing a Gender Evaluation
  4. Gender Evaluations With Different Scopes
  5. The Tools of Gender Evaluation
  Summary
  Further Reading

Chapter 18 • Evaluation in the Age of Big Data
  1. Introducing Big Data and Data Science
  2. Increasing Application of Big Data in the Development Context
  3. The Tools of Data Science
  4. Potential Applications of Data Science in Development Evaluation
  5. Building Bridges Between Data Science and Evaluation
  Summary
  Further Reading

PART III • MANAGING EVALUATIONS

Chapter 19 • Managing Evaluations
  1. Organizational and Political Issues Affecting the Design, Implementation, and Use of Evaluations
  2. Planning and Managing the Evaluation
  3. Institutionalizing Impact Evaluation Systems at the Country and Sector Levels
  4. Evaluating Capacity Development
  Summary
  Further Reading

Chapter 20 • The Road Ahead
  1. Conclusions
  2. Recommendations

- Glossary of Terms and Acronyms
- References
- Author Index
- Subject Index
"This book moves the study of evaluation from the theoretical to the practical, so that evaluators can improve their work. It deals with most of the real issues that evaluators face, particularly at the international level."