Practitioner's Guide to Using Research for Evidence-Informed Practice

Rubin, Allen / Bellamy, Jennifer

3rd Edition, June 2022
304 Pages, Softcover
John Wiley & Sons Ltd

ISBN: 978-1-119-85856-0

Further versions

ePub, Mobi, PDF

The latest edition of an essential text to help students and practitioners distinguish between research studies that should and should not influence practice decisions

Now in its third edition, Practitioner's Guide to Using Research for Evidence-Informed Practice delivers an essential and practical guide to integrating research appraisal into evidence-informed practice. The book walks you through the skills, knowledge, and strategies you can use to identify significant strengths and limitations in research.

The ability to appraise the veracity and validity of research will improve your service provision and practice decisions. By teaching you to be a critical consumer of modern research, this book helps you avoid treatments based on fatally flawed research and methodologies.

Practitioner's Guide to Using Research for Evidence-Informed Practice, Third Edition offers:
* An extensive introduction to evidence-informed practice, including explorations of unethical research and discussions of social justice in the context of evidence-informed practice.
* Explanations of how to appraise studies on intervention efficacy, including the criteria for inferring effectiveness and critically examining experiments.
* Discussions of how to critically appraise studies for alternative evidence-informed practice questions, including nonexperimental quantitative studies and qualitative studies.

A comprehensive and authoritative blueprint for critically assessing research studies, interventions, programs, policies, and assessment tools, Practitioner's Guide to Using Research for Evidence-Informed Practice belongs on the bookshelves of students and practitioners of the social sciences.

Preface xi

Acknowledgements xv

About the Authors xvii

About the Companion Website xix

Part 1 Overview of Evidence-Informed Practice

1 Introduction to Evidence-Informed Practice (EIP) 2

1.1 Emergence of EIP 4

1.2 Defining EIP 4

1.3 Types of EIP Questions 5

1.4 EIP Practice Regarding Policy and Social Justice 13

1.5 EIP and Black Lives Matter 13

1.6 Developing an EIP Practice Process Outlook 14

1.7 EIP as a Client-Centered, Compassionate Means, Not an End unto Itself 16

1.8 EIP and Professional Ethics 17

Key Chapter Concepts 18

Review Exercises 19

Additional Readings 19

2 Steps in the EIP Process 21

2.1 Step 1: Question Formulation 22

2.2 Step 2: Evidence Search 22

2.3 Step 3: Critically Appraising Studies and Reviews 29

2.4 Step 4: Selecting and Implementing the Intervention 30

2.5 Step 5: Monitor Client Progress 33

2.6 Feasibility Constraints 33

2.7 But What about the Dodo Bird Verdict? 36

Key Chapter Concepts 38

Review Exercises 39

Additional Readings 39

3 Research Hierarchies: Which Types of Research Are Best for Which Questions? 40

3.1 More than One Type of Hierarchy for More than One Type of EIP Question 41

3.2 Qualitative and Quantitative Studies 42

3.3 Which Types of Research Designs Apply to Which Types of EIP Questions? 43

Key Chapter Concepts 52

Review Exercises 53

Additional Readings 53

Part 2 Critically Appraising Studies for EIP Questions about Intervention Effectiveness

4 Criteria for Inferring Effectiveness: How Do We Know What Works? 56

4.1 Internal Validity 57

4.2 Measurement Issues 62

4.3 Statistical Chance 65

4.4 External Validity 66

4.5 Synopses of Fictitious Research Studies 67

Key Chapter Concepts 71

Review Exercises 72

Exercise for Critically Appraising Published Articles 73

Additional Readings 73

5 Critically Appraising Experiments 74

5.1 Classic Pretest-Posttest Control Group Design 75

5.2 Posttest-Only Control Group Design 76

5.3 Solomon Four-Group Design 77

5.4 Alternative Treatment Designs 78

5.5 Dismantling Designs 79

5.6 Placebo Control Group Designs 80

5.7 Experimental Demand and Experimenter Expectancies 82

5.8 Obtrusive Versus Unobtrusive Observation 83

5.9 Compensatory Equalization and Compensatory Rivalry 83

5.10 Resentful Demoralization 84

5.11 Treatment Diffusion 84

5.12 Treatment Fidelity 85

5.13 Practitioner Equivalence 85

5.14 Differential Attrition 86

5.15 Synopses of Research Studies 88

Key Chapter Concepts 91

Review Exercises 92

Exercise for Critically Appraising Published Articles 92

Additional Readings 93

6 Critically Appraising Quasi-Experiments: Nonequivalent Comparison Groups Designs 94

6.1 Nonequivalent Comparison Groups Designs 95

6.2 Additional Logical Arrangements to Control for Potential Selectivity Biases 97

6.3 Statistical Controls for Potential Selectivity Biases 101

6.4 Creating Matched Comparison Groups Using Propensity Score Matching 105

6.5 Pilot Studies 108

6.6 Synopses of Research Studies 110

Key Chapter Concepts 113

Review Exercises 114

Exercise for Critically Appraising Published Articles 114

Additional Readings 114

7 Critically Appraising Quasi-Experiments: Time-Series Designs and Single-Case Designs 115

7.1 Simple Time-Series Designs 116

7.2 Multiple Time-Series Designs 118

7.3 Single-Case Designs 119

7.4 Synopses of Research Studies 125

Key Chapter Concepts 129

Review Exercises 130

Exercise for Critically Appraising Published Articles 131

Additional Reading 131

8 Critically Appraising Systematic Reviews and Meta-Analyses 132

8.1 Advantages of Systematic Reviews and Meta-Analyses 133

8.2 Risks in Relying Exclusively on Systematic Reviews and Meta-Analyses 135

8.3 Where to Start 135

8.4 What to Look for When Critically Appraising Systematic Reviews 135

8.5 What Distinguishes a Systematic Review from Other Types of Reviews? 142

8.6 What to Look for When Critically Appraising Meta-Analyses 143

8.7 Synopses of Research Studies 152

Key Chapter Concepts 155

Review Exercises 156

Exercise for Critically Appraising Published Articles 157

Additional Readings 157

Part 3 Critically Appraising Studies for Alternative EIP Questions

9 Critically Appraising Nonexperimental Quantitative Studies 160

9.1 Surveys 161

9.2 Cross-Sectional and Longitudinal Studies 169

9.3 Case-Control Studies 171

9.4 Synopses of Research Studies 172

Key Chapter Concepts 178

Review Exercises 179

Exercise for Critically Appraising Published Articles 179

Additional Readings 179

10 Critically Appraising Qualitative Studies 180

10.1 Qualitative Observation 182

10.2 Qualitative Interviewing 183

10.3 Other Qualitative Methodologies 186

10.4 Qualitative Sampling 186

10.5 Grounded Theory 187

10.6 Alternatives to Grounded Theory 188

10.7 Frameworks for Appraising Qualitative Studies 189

10.8 Mixed Model and Mixed Methods Studies 193

10.9 Synopses of Research Studies 193

Key Chapter Concepts 198

Review Exercises 200

Exercise for Critically Appraising Published Articles 201

Additional Readings 201

Part 4 Assessment and Monitoring in Evidence-Informed Practice

11 Critically Appraising, Selecting, and Constructing Assessment Instruments 204

11.1 Reliability 205

11.2 Validity 208

11.3 Feasibility 214

11.4 Sample Characteristics 214

11.5 Locating Assessment Instruments 215

11.6 Constructing Assessment Instruments 216

11.7 Synopses of Research Studies 218

Key Chapter Concepts 220

Review Exercises 221

Exercise for Critically Appraising Published Articles 222

Additional Readings 222

12 Monitoring Client Progress 223

12.1 A Practitioner-Friendly Single-Case Design 224

12.2 Using Within-Group Effect-Size Benchmarks 234

Key Chapter Concepts 235

Review Exercises 236

Additional Readings 236

Part 5 Additional Aspects of Evidence-Informed Practice

13 Appraising and Conducting Data Analyses in EIP 238

13.1 Introduction 238

13.2 Ruling Out Statistical Chance 239

13.3 What Else Do You Need to Know? 244

13.4 The .05 Cutoff Point Is Not Sacred! 245

13.5 What Else Do You Need to Know? 246

13.6 Calculating Within-Group Effect Sizes and Using Benchmarks 247

13.7 Conclusion 248

Key Chapter Concepts 248

Review Exercises 249

Additional Reading 249

14 Critically Appraising Social Justice Research Studies 250

14.1 Introduction 250

14.2 Evidence-Informed Social Action 251

14.3 What Type of Evidence? 252

14.4 Participatory Action Research (PAR) 253

14.5 Illustrations of Other Types of Social Justice Research 254

14.6 Conclusion 254

Key Chapter Concepts 258

Review Exercises 259

Additional Readings 260

Glossary 261

References 269

Index 273

ALLEN RUBIN, PhD, holds the Kantambu Latting College Professorship for Leadership and Change at the University of Houston Graduate College of Social Work. He is the author of several bestselling titles in social work research.

JENNIFER BELLAMY, PhD, is Associate Dean for Research and Faculty Development and Professor at the Graduate School of Social Work at the University of Denver. She teaches research and theory courses at the master's and doctoral levels.
