Being an Evaluator: Your Practical Guide to Evaluation
✍ Written by Donna R. Podems
- Publisher
- The Guilford Press
- Year
- 2018
- Language
- English
- Pages
- 386
- Category
- Library
No payment or registration required. For personal study only.
✦ Synopsis
✦ Table of Contents
Cover
Half Title Page
Title Page
Copyright
Dedication
A Conversation with My Readers
Acknowledgments
Contents
Part I. Doing Evaluation And Thinking Evaluatively
1. Speaking the Language
The Power of Words
Why Evaluation Terms and Concepts Can Be Perplexing
Why an Evaluator Should Clarify Terminology
The Evaluator’s Role in Clarifying Terminology
Learn the Meaning—Not Just the Label
Evaluation, Evaluative Thinking, and Monitoring: What These Terms (Can) Mean
Intervention, Activity, Project, Program: Doppelgangers or Differences?
Vision, Mission, Goal, Objective: Management Speak Translated into M&E Speak
Wrapping Up
Our Conversation: Between You and Me
2. The Tale of the Researcher and the Evaluator
Why It Is Important to Understand the Differences between Researchers and Evaluators
Researchers and Evaluators: Compatriots or Competitors?
How to Facilitate Useful Conversations
Describing the Evaluative Journey
Potential Users of the Evaluative Journey
Wrapping Up
Our Conversation: Between You and Me
3. Starting the Evaluative Journey
Introducing the Intervention
Introducing the Problem Statement
Clarifying Different Kinds of Problems, and Why It Matters
Introducing Beneficiaries and Six Related Terms
Focusing on the Intended Beneficiaries
Including Beneficiaries in the Process: Placebo or Panacea?
Wrapping Up
Our Conversation: Between You and Me
4. How We Know What We Know, and Why We Think That
The Conundrum
The Academic Discussion
The Practitioner Conversation
The Difference between Facts and Assumptions
Engaging with Facts: Before and After the Fact
How Mixing Facts and Assumptions Can Influence an Intervention
Values, Values Everywhere
Wrapping Up
Our Conversation: Between You and Me
Postscript: Facts and Assumptions in Dickens’s “A Christmas Carol”
5. Data and Credibility: What Inquiring Evaluator Minds Need to Know
Qualitative Inquiry
Quantitative Inquiry
Mixed Methods
Sampling
Practical Guidance in Choosing an Inquiry Method
Credible Data
Credible Evidence
Baseline Study
Wrapping Up
Interlude
6. Linking Problem Statements, Interventions, and Results
The Importance of a Plausible and Evaluable Intervention
What Needs to Be Linked
How to Identify Strong Links and Broken Ones, and Why That Matters
Wrapping Up
Our Conversation: Between You and Me
7. All about Results
Four Critical Reasons to Understand the Logic of Results
Starting the Results Discussion
Logical Thinking in the Results Discussion
An Evaluator’s Role in the Results Discussion
Why Discussing the Logic of Results Is (Often) Demanding and Messy
Strategies for Unpacking the Logic of Results: Mastering the Results Conversation
The Framework: Unpacking Results
Contextual Factors: Messing with the Results Logic
Wrapping Up
Our Conversation: Between You and Me
8. Talking Intervention Theory (and Logic)
Three Reasons to Have an Explicit Theory of Change
Four Examples of How an Evaluator Engages with a Theory of Change
Sorting Out an Intervention’s Theory of Change: A Framework
Common Challenges to Discussing Theory of Change
Program Logic and Theory of Change: How They Are Connected, and Why Both Are Useful
Three Things We Know When Theory of Change and Logic Are Clear
Theory and Logic: Informing When to Monitor and Evaluate What and Where
Wrapping Up
Our Conversation: Between You and Me
9. Assessing and Evaluating Progress
Measurement and Assessment: The Differences
What to Assess
How to Measure Progress
Wrapping Up
Our Conversation: Between You and Me
10. Thinking Through an Evaluable Intervention: Pulling It All Together
The Baby Game
Common Challenges with M&E Frameworks
Wrapping Up
Our Conversation: Between You and Me
Part II. Working as an Evaluator and Exploring Evaluation
11. The Personal Choices of Being an Evaluator
What Type of Evaluator Do You Want to Be?
Being an Evaluator: An Abundance of Roles
An Evaluator’s Role by Evaluation Purpose: Formative, Summative, and Developmental
An Evaluator’s Role by Evaluation Purpose: Four More of Them?
An Evaluator’s Role through Self‑Identification: Sector Specialist or Evaluation Generalist?
An Evaluator’s Role through a Formal Position: Internal or External Evaluator?
An Evaluator’s Role in Some Not‑So‑Common Positions
Exploring Evaluation’s and an Evaluator’s Roles in Society
Who Is Considered an Evaluator, and Who Is Not?
Wrapping Up
Our Conversation: Between You and Me
12. Thinking about Values
What Are Values?
Ethics, Values, and Morals: The Practical Differences
Exploring Your Values
Where to Find Values to Inform the Evaluative Process
The Evaluator’s Role in Identifying and Applying Criteria to Value an Intervention
Evaluation Ethics and Principles
Wrapping Up
Our Conversation: Between You and Me
13. Thinking about Power, Politics, Culture, Language, and Context
Evaluation Choices: A Little Bit Technical
Evaluation Influencers: The Fabulous Five
Evaluation Context: Broadest of Them All
Wrapping Up
Our Conversation: Between You and Me
14. The Scholarly Side of Being an Evaluator
Knowing the Evaluation Literature: It Matters
Ways to Start Engaging with the Literature
Wrapping Up
Our Conversation: Between You and Me
15. Navigating the Maze of Evaluation Choices
When Someone Wants an Evaluation: The Names They Use
Implementation, Outcome, and Impact Labels
Approaches to Evaluation
The Evaluation Design
The Evaluation Report and Other Ways to Communicate
Evaluating the Evaluation
Wrapping Up
Our Conversation: Between You and Me
16. The World of Recommendations (and the Underworld of Critical Feedback)
A Recommendation: What Is It?
Who Are the Recommendations For?
The Evaluator’s Role in Recommendations
Strategies for Developing Meaningful Recommendations
Strategies for Presenting Acceptable Recommendations
A Changing World: Shifting from Recommendations at the End to Learning Throughout
Providing Feedback
Wrapping Up
Our Conversation: Between You and Me
17. The Dirty Laundry Chapter
Eight Challenges in the Design Phase, and How to Mitigate Them
Ten Challenges to Implementation and Reporting, and How to Mitigate Them
Nine Common Questions That Arise during an Evaluative Process
Wrapping Up
Epilogue
References
Author Index
Subject Index
About the Author
📜 SIMILAR VOLUMES
• <em>Evaluating Research in Academic Journals</em> is a supplementary guide for students learning how to evaluate reports of empirical research published in academic journals. It breaks down the process of evaluating a journal article into easy-to-understand steps and emphasizes the practical aspects of evaluating research, not just how to apply a laundry list of technical terms from textbooks.