Evaluation Time: A Practical Guide for Evaluation

✍ Scribed by Gail V. Barrington, Beverly F. Triana-Tremain


Publisher
SAGE Publications
Year
2022
Tongue
English
Leaves
609
Category
Library


✦ Synopsis


Evaluation Time: A Practical Guide for Evaluation is an accessible and comprehensive guide to the practice of evaluation. Gail Vallance Barrington and Beverly Triana-Tremain integrate new approaches and classic frameworks with practical tools that readers can use to design evaluation studies and determine whether objectives have been met. The book stresses the role of critical thinking and self-reflection, and demonstrates the importance of context and equity.

✦ Table of Contents


Praise for This Book
Brief Contents
Detailed Contents
Preface
Purpose
Audience
Book Structure
Book Features
Instructor Resources
Acknowledgments
About the Authors
Part I. Fundamentals of Evaluation
1. The Scope of Evaluation
Introduction
What Is Evaluation?
Why Do Evaluation?
To Determine Merit, Worth, and Significance
To Gain an Understanding of How Programs Work and the Difference They Can Make to Stakeholders
To Improve Quality of Life
To Pursue Social Justice and a More Equitable World
Research and Evaluation
Evaluation and Research Are Mutually Exclusive
Evaluation Requires Research but Research Does Not Require Evaluation
Evaluation and Research Share Similarities
A Brief History of Evaluation
The Three Pillars of Evaluation
The Guiding Principles
The Program Evaluation Standards
The Professional Competencies
Education and Training
Professionalization
2. How Evaluators Think
Bev's Philosophy of Mornings
Introduction
Scaffolds for Evaluation Thought
The Ladder of Abstraction
The Ladder of Evaluation Theory
Theory in Action
Ontology: Our Worldview
Realism
Relativism
Paradigms or Frameworks
Positivism/Postpositivism
The Scientific Method
Plan-Do-Study-Act (PDSA) Cycle
Deductive Thinking
Constructivism
Inductive Thinking
Subjectivism
Reflective Thinking
Pragmatism
Abductive Thinking
Natural Science Theory
Systems Theory
The Socio-Ecological Model (SEM)
Chaos Theory
Complexity Theory
Social Science Theory
Knowledge
Attitude
Behavior
Evaluation Theory
The Program Evaluation Project
Epistemology: Our Knowledge
Importance of Coherence
Critical Thinking and Evaluative Thinking
3. Program Logic
The Golden Thread
Introduction
Theory of Change
Creating a Theory of Change
Project Superwomen Theory of Change
The Fiver Children's Foundation Theory of Change
Assessing Quality in Theories of Change
Program Theory
The IF-THEN Statement
Creating a Program Theory: Alberta Methadone Maintenance Guidelines Evaluation
Assessing Quality in a Program Theory
The Logic Model
The Purpose of a Logic Model
How to Read a Logic Model
The Structure of a Logic Model
Inputs
Activities
Outputs
Outcomes
Comparison of Logic Model Structures
Assessing the Quality of Logic Models
Other Logic Model Elements
Supporting Data Statement
Vision or Mission Statements
Goals of the Intervention
Assumptions
Contextual Factors
Reach
Advantages of Logic Models
Limitations of Logic Models
Part II. Evaluation and the Program Life Cycle
4. Pre- and Early Program Evaluation
Grizzly Bears and Policy Change
Introduction
The Literature Review
Generic Literature Review
Systematic Review
Meta-Analysis
Stakeholder Analysis
Stakeholder Mapping
Social Network Analysis
Communicating with Stakeholders
Needs Assessment
Phases of a Needs Assessment
Phase 1. Pre-Assessment (exploration)
Phase 2. Assessment (data gathering)
Phase 3. Post-Assessment (utilization)
Mapping Assets and Needs
Social Indicators and Trend Analysis
Consensus Building
Brainstorming
Nominal Group Technique (NGT)
Delphi Method
Forums
Example of a Needs Assessment
Program Design
Nine Steps to Program Design
Step 1. Analyze Available Data
Step 2. Specify Problem Statement
Step 3. Identify an Intended Population
Step 4. Develop Program Description and Goal Statements
Step 5. Develop an Outcomes Measurement Framework
Step 6. Identify Appropriate Activities
Step 7. Build Effective Partnerships
Step 8. Provide a Realistic Timeline
Step 9. Identify Effective Evaluation Strategies
Evaluability Assessment
Evaluability Assessment Example: Engaging Stakeholders to Improve Data Collection in Evaluability Assessments
5. Mid-Cycle Program Evaluation
A Killer Asteroid Is Headed Our Way
Introduction
Formative Evaluation
Participatory Evaluation
Utilization-Focused Evaluation (U-FE)
Empowerment Evaluation
Developmental Evaluation
Process Evaluation
Coverage
Quality Improvement
Accreditation
Satisfaction and Feedback Loops
Conducting Effective Participatory Focus Group Research with Systematically Marginalized Young People
Descriptive Evaluation
Document Review
Observation
Case Studies
Fidelity
Performance Management
Performance Indicators
Performance Standards
Performance Measurement
Monitoring and Evaluation (M & E)
6. End-of-Cycle Program Evaluation
Traffic Jam at 29,000 Feet
Introduction
Outcome Evaluation
Individual Change
Organizational Change
Community Change
Outcome Mapping and Outcome Harvesting
Impact Evaluation
Population Change
Changes in Health Indicators
Changes in Social Determinants of Health
The Counterfactual Argument
Contribution Analysis
Summative Evaluation
Determining Value and Worth
Sustainability Evaluation
Economic Evaluation
Cost-Effectiveness Analysis (CEA)
Cost-Benefit Analysis (CBA)
Cost-Utility Analysis (CUA)
Partial Economic Evaluations
Fiscal Analysis
Part III. Evaluation Methods
7. Using Quantitative Methods in Evaluation
Data and the Dishwasher
Introduction
Overview of Quantitative Methods
What Is Credible Evidence?
Hierarchies and Tensions in Determining Credible Evidence
Rigor
Cultural Appropriateness
Types of Quantitative Data
Quantitative Designs
Terminology Used in Quantitative Design
Types of Quantitative Designs
Pre-Experimental Designs
Single Case Study
One-Group, Pretest–Posttest Design
Quasi-Experimental Designs
Nonequivalent Control Group Design
Interrupted Time Series
Experimental Designs
Posttest-Only Control Group Design
Pretest/Posttest Control Group Design
Strengths and Limitations of Quantitative Designs
Ensuring Quality in Quantitative Designs
Validity
Internal Validity
Threats to Internal Validity
External Validity
Threats to External Validity
Reliability
Internal Consistency
Test–Retest Reliability
Objectivity
Sensitivity
Analyzing Quantitative Data
Data Formats and Sources
Data Preparation
Hypothesis Testing
Statistical Analysis
Descriptive Statistics
Inferential Statistics
Parametric and Nonparametric Tests
The Dependent Sample t Test (or Paired Sample t Test)
Analysis of Variance
Pearson Correlation
Example of an Evaluation with a Quantitative Design: Factors Precipitating Suicidality Among Homeless Youth
The Survey: The Most Common Quantitative Tool
Is a Survey the Right Tool?
What Practical Considerations Are Needed When Designing a Survey?
How Can I Be Sure My Questions Will Yield the Data I Need?
8. Using Qualitative Methods in Evaluation
Tiny Babies and Qualitative Research
Introduction
Overview of Qualitative Methods
Quantitative Versus Qualitative—What's the Difference?
Ensuring Methodological Rigor in Qualitative Designs
What Is Qualitative Thinking?
Qualitative Designs
Qualitative Traditions
Ethnography
Phenomenology
Social Constructivism
Narrative Inquiry
Critical Theory
Grounded Theory
Common Characteristics of Qualitative Designs
Participants
Design
Data Collection and Analysis
The Self
Common Threats to Quality and How to Counteract Them
During Study Design
During Data Collection
During Data Analysis and Synthesis
Strengths and Limitations of Qualitative Designs
The Evaluator Experience in Old Crow, Yukon
Qualitative Analysis—Overview
Using Bloom's Taxonomy—A Framework for Qualitative Analysis
Levels 1 and 2—Knowledge and Comprehension
Prepare to Code
Level 3—Application
Record Analytic Thoughts and Processes
Level 4—Analysis
Reduce and Chunk the Data
Initial Coding
Categorizing
Theming
Level 5—Recombine and Interpret
Synthesis
Count
Note Patterns
Look at Plausibility
Cluster
Make Metaphors
Split Variables
Move From Particular to General
Factor
Note Relationships Among Variables
Find Intervening Variables
Data Displays
Narrative Description
The Matrix
The Flowchart
The Creative Graphic
Level 6—Making a Judgment
Evaluation
Building a Logical Chain of Evidence
Achieving Coherence
Example of an Evaluation With a Qualitative Design: The Girls Just Wanna Have Fun Evaluation
The Interview—The Most Common Qualitative Tool
Types of Interviews
Checklist for Planning the Interview
Checklist for Conducting the Interview
Trauma-Informed Interviews
9. Using Mixed Methods in Evaluation
Jell-O® and Mixed Methods
Introduction
What Is Mixed Methods?
Advantages and Challenges of Mixed Methods Designs
Advantages of Mixed Methods Designs
Challenges of Mixed Methods Designs
Mixed Methods Thinking
Mixed Methods Designs
The Convergent Design
The Explanatory Sequential Design
The Exploratory Sequential Design
Mixed Methods Integration Strategies—A Rough Guide
Integrated Management Strategies
Integrated Design
Integrated Instrument Development
Integrated Data Analysis
Integrated Data Synthesis
Integrated Reporting
Integrating Through Narrative
Integrating Through Data Transformation
Integrating Through Joint Displays
Evaluation of An Alternate Lawyer Licensing Pathway Using Mixed Methods
Ensuring Quality in Mixed Methods Designs
Example of an Evaluation Using an Exploratory Sequential Mixed Methods Design: Telling It All—A Story of Women's Social Capital
The Focus Group: A Companion Tool in Mixed Methods Studies
Planning and Conducting the Focus Group
Part IV. Communicating About Evaluation
10. The Evaluation Plan
The Franklin Expedition, A Planning Disaster
Introduction
The Preplanning Process: Case Study #1 The Learning Circle Pilot Project
Identify and Engage Stakeholders
Review Program Documents and Conduct a Brief Literature Review
Conduct Key Informant Interviews and On-Site Observation
Design the Program Theory and Logic Model
Ask the Evaluation Questions
Develop Indicators
Finalize Methods Decisions
Prepare the Data Collection Matrix
Creates a Shared Understanding of the Project
Acts as a Concrete Work Plan
Maintains Study Focus
Provides a Coding System for Study Tools
Guides Data Analysis
Provides a Systematic and Recognized Outline for the Final Report
The Evaluation Plan: Case Study #2 MD/PhD Educational Program
Title Page
Program Overview
Program Role
Program Description
Key Stakeholders
Program Context
Evaluation Description
Evaluation Purpose
Type of Evaluation
Evaluation Questions
Intended Users and Use
Evaluation Approach
Evaluation Standards, Ethics, and Values
Evaluation Methods
Data Analysis and Synthesis
Types of Analysis Planned
Methods of Data Synthesis
Role of Stakeholders in Data Interpretation
Reporting
Planned Evaluation Management: Work Samples
Team Roles
Management Strategy
Data Ownership and Security
Study Limitations and Strengths
Schedule of Activities
Budget
Accountability
11. Communication, Reporting, and Use
Emergency on Hallowe'en
Introduction
Communication During the Evaluation
Evaluation Conversations
Interactive Forms of Communication
Managing Conflict
Informal Communications Tools
Data Visualization
Visual Processing
Reporting at the End of the Evaluation
The Final Report
Classic Reporting
Action-Oriented Reporting
Report Impact
How to Craft Conclusions
Present Results in a Simple Format
Select Results That Are Dually Significant
Use Sensitivity Analysis
Test the Validity of the Conclusions
How to Craft Recommendations
Avoid Meek Recommendations
Be Politically Astute
Work Collaboratively With Stakeholders
Supporting Use Once the Evaluation Is Completed
Evaluation Use Within the Organization
Process Use
Organizational Learning
Evaluation Capacity Building
Evaluation Use Beyond the Organization
Knowledge Into Action
Knowledge Creation
Knowledge Dissemination
Knowledge Application
Evaluation Influence
Social Betterment
12. Evaluation Context and the Evaluator's New Stance
The Creation of Turtle Island, an Ojibwe Legend
Introduction
Context Matters
The Evaluator's Power and Responsibility
Zooming In
Program and Participants
Methods
Communication Strategies
Lessons From Zooming In
Zooming Out
The Global and Natural Imperative
Strategies for Evaluators
Use Systems Thinking
Seek Transformative Engagement
Incorporate Sustainability Into Evaluation Practice
Identity and Social Issues
Race
Strategies for Evaluators
Understand Racial Framing
Work Toward Cultural Humility
Look for Strengths, Not Deficits
Culture
Strategies for the Evaluator
Integrate Indigenous Knowledge in Evaluation Practice
Work With Different Epistemologies
Use an Indigenous Evaluation Framework
Manage Indigenous Data With Respect
Ethnicity
Strategies for the Evaluator
Use the Socio-Ecological Model (SEM)
Consider LatCrit Strategies
Use Evaluation for Advocacy
Organizations and Structures
Strategies for the Evaluator
Expand Approach to Learning
Use Equitable Evaluation Strategies
The Evaluator's New Stance
Glossary
References
Index


📜 SIMILAR VOLUMES



Evaluating Value-Added Models for Teacher Accountability
✍ Daniel McCaffrey 📂 Library 📅 2004 🌐 English

Clarifies the primary questions raised by the use of value-added models for measuring teacher effects, reviews the most important recent applications of VAM, and discusses statistical and measurement issues associated with VAM.

Evaluation Practice for Collaborative Growth
✍ Lori L Bakken 📂 Library 📅 2018 🏛 Oxford University Press, USA 🌐 English

Evaluation Practice for Collaborative Growth highlights the approaches, tools, and techniques that are most useful for evaluating educational and social service programs. This book walks the reader through a process of creating answerable evaluation questions, designing evaluation studies…

Best Practices in Faculty Evaluation: A Practical Guide for Academic Leaders
✍ Jeffrey L. Buller 📂 Library 📅 2012 🏛 Jossey-Bass 🌐 English

Jeffrey L. Buller's newest book is designed to help department chairs, deans, and members of evaluation committees by showing them what they need to know and do when participating in faculty reviews and evaluations. The book shows how to apply the information about performance and convey clear messages…

Evaluating Reference Services: A Practical Guide
✍ Jo Bell Whitlatch 📂 Library 📅 2000 🌐 English

Whether at the desk, on the phone, or online, good reference services are often the keys to winning the hearts and minds of library customers. With this handy new guidebook, reference luminary Jo Bell Whitlatch outlines practical methods for evaluating and delivering excellent reference service…

Running Randomized Evaluations: A Practical Guide
✍ Rachel Glennerster, Kudzai Takavarasha 📂 Library 📅 2013 🏛 Princeton University Press 🌐 English

A comprehensive guide to running randomized impact evaluations of social programs in developing countries. This book provides a comprehensive yet accessible guide to running randomized impact evaluations of social programs. Drawing on the experience of researchers at the Abdul Latif Jameel Poverty Action Lab…