
Qualitative Research & Evaluation Methods

Integrating theory and practice.

Intermediate/Advanced Qualitative Research | Introduction to Qualitative Research Methods | Program Evaluation

Description

Drawing on more than 40 years of experience conducting applied social science research and program evaluation, author Michael Quinn Patton has crafted the most comprehensive and systematic book on qualitative research and evaluation methods, inquiry frameworks, and analysis options available today. Now offering more balance between applied research and evaluation, this Fourth Edition illuminates all aspects of qualitative inquiry through new examples, stories, and cartoons; more than a hundred new summarizing and synthesizing exhibits; and a wide range of new highlight sections/sidebars that elaborate on important and emergent issues. For the first time, full case studies are included to illustrate extended research and evaluation examples. In addition, each chapter features an extended "rumination," written in a voice and style more emphatic and engaging than traditional textbook style, about a core issue of persistent debate and controversy.


  • Chapter 1. The Nature, Niche, and Value of Qualitative Inquiry
  • Chapter 2. Strategic Themes in Qualitative Inquiry
  • Chapter 3. Variety of Qualitative Inquiry Frameworks: Paradigmatic, Philosophical, and Theoretical Orientations
  • Chapter 4. Practical and Actionable Qualitative Applications
  • Chapter 5. Designing Qualitative Studies
  • Chapter 6. Fieldwork Strategies and Observation Methods
  • Chapter 7. Qualitative Interviewing
  • Chapter 8. Qualitative Analysis and Interpretation
  • Chapter 9. Enhancing the Quality and Credibility of Qualitative Studies

Student Study Site EXCLUSIVE! Access to full-text SAGE journal articles that have been carefully selected for each chapter. Each article supports and expands on the concepts presented in the chapter. This feature also provides questions to focus and guide your interpretation.

NEW TO THIS EDITION:

  • A new organization creatively groups content into individual modules within nine chapters, letting instructors easily customize course content and letting readers quickly navigate and reference specific topics across more than 80 modules.
  • New examples of the contributions of qualitative inquiry to our understanding of patterns in the world have been added, including advances in theory, practice, and both classic and innovative methods.
  • New and major in-depth discussions explore systems thinking, complexity theory, pragmatic framing, and generic qualitative inquiry.
  • More than half of this edition consists of new, extended research and evaluation examples with full case studies as exemplars.
  • The number of qualitative (purposeful) sampling options has been expanded from 16 to 40 to provide readers with the most innovative and comprehensive case selection framework ever assembled and explained.
  • The book offers the most comprehensive, rigorous, and assertive discussion of methods and analysis techniques for drawing causal inferences available today.
  • Techniques for high-quality observational fieldwork and in-depth interviewing are covered, along with emergent approaches and alternative frameworks for inquiry.
  • Principles-focused qualitative evaluation is premiered in this edition.
  • Detailed analysis guidelines and diverse examples include innovative approaches to data visualization.
  • The number of qualitative frameworks for judging quality has been expanded from five to seven.
  • Over a hundred new exhibits, essential for teaching, summarize and synthesize key information.
  • New sidebars throughout the text elaborate on important and emergent issues.
  • Hundreds of new references make the book up to date and comprehensive.
  • New cartoons and graphic comics specially commissioned and created for this edition illustrate key concepts in a unique and memorable way.
  • An Instructor Resource Site and a Student Study Site, filled with helpful supplemental resources including access to carefully selected SAGE journal articles, are available at no additional charge.

KEY FEATURES:

  • Qualitative inquiry’s seven major contributions to understanding the world are presented.
  • Twelve primary strategic themes of qualitative inquiry illuminate its unique niche in research and evaluation studies.
  • Seven distinct, criteria-based frameworks for presenting and judging qualitative findings are provided.
  • Sixteen different theoretical and philosophical approaches to qualitative inquiry are identified, compared, contrasted, and discussed.
  • Variations in observational methods are covered, including historical perspectives, case studies and their layers, and cross-case analysis.
  • Alternative interviewing strategies and approaches are linked to theoretical and methodological traditions, including innovative and emergent methods, such as using social media and data visualization.
  • Key strategies are included to unravel the complexities of and controversies about causal analysis, generalizations, and triangulation.
  • Comprehensive references provide the scholarly foundations for qualitative theory and practice.
"Very thoughtful and thorough coverage of qualitative design and study." Kari O'Grady Ph.D, Loyola University Maryland
  “The content itself, based in years of thinking, reading, doing, conversing, is a huge strength. Reading the chapters is like sitting at the feet of one of the masters.”   Kathleen A. Bolland, The University of Alabama
“I can’t emphasize enough the quality, detail, and depth of the presentation of research design and methods… Students and experienced researchers will appreciate the depth of presentation of potential qualitative paradigms, theoretical orientations and frameworks as well as special methodological applications that are often not covered in other qualitative texts.” Susan S. Manning, University of Denver
“It is refreshing to see a text that engages the multiple philosophical and historical trajectories within a qualitative research tradition while integrating this discussion so well with the practice of research design, fieldwork strategies, and data analysis.” Michael P. O’Malley, Texas State University


Qualitative Research & Evaluation Methods: Integrating Theory and Practice

Student Resources

This site is intended to enhance your use of Qualitative Research & Evaluation Methods, Fourth Edition, by Michael Quinn Patton. Please note that the materials on this site are geared toward maximizing your understanding of the book.


Acknowledgments

We gratefully acknowledge Michael Quinn Patton for writing an excellent text and creating the materials on this site.


Qualitative Evaluation and Research Methods 2nd Edition

Once again setting the standard for the field, the Second Edition of Qualitative Evaluation and Research Methods reflects the tremendous explosion of interest in qualitative methods over the past decade. Thoroughly revised and updated, this new edition includes three new chapters on theoretical foundations of qualitative inquiry, particularly appropriate qualitative applications, and the quality and credibility of qualitative analysis. Patton has also completely updated the literature review and citations to reflect the mass of new research in qualitative methods in the last ten years. It will be of interest to anyone involved in evaluation of any kind.


Editorial Reviews

About the author

Michael Quinn Patton is author of more than a dozen books on evaluation including Qualitative Research & Evaluation Methods, 4th ed. (2015), Blue Marble Evaluation (2020), Principles-Focused Evaluation (2018), Facilitating Evaluation (2018), and Developmental Evaluation (2011). Based in Minnesota, he was on the faculty of the University of Minnesota for 18 years and is a former president of the American Evaluation Association (AEA). Michael is a recipient of the Alva and Gunnar Myrdal Evaluation Practice Award, the Paul F. Lazarsfeld Evaluation Theory Award, and the Research on Evaluation Award, all from AEA. He has also received the Lester F. Ward Distinguished Contribution to Applied and Clinical Sociology Award from the Association for Applied and Clinical Sociology. In 2021 he received the first Transformative Evaluator Award from EvalYouth. He is an active speaker, trainer, and workshop presenter who has conducted applied research and evaluation on a broad range of issues and has worked with organizations and programs at the international, national, state, provincial, and local levels. Michael has three children―a musician, an engineer, and an evaluator―and four grandchildren. When not evaluating, he enjoys exploring the woods and rivers of Minnesota, where he lives.

Product details

  • Publisher ‏ : ‎ SAGE Publications, Inc; 2nd edition (February 1, 1990)
  • Language ‏ : ‎ English
  • Hardcover ‏ : ‎ 536 pages
  • ISBN-10 ‏ : ‎ 0803937792
  • ISBN-13 ‏ : ‎ 978-0803937796
  • Item Weight ‏ : ‎ 1.65 pounds
  • Dimensions ‏ : ‎ 1.25 x 5.75 x 9 inches

About the author

Michael Quinn Patton

Michael Quinn Patton lives in Minnesota where, according to the state's poet laureate, Garrison Keillor, "all the women are strong, all the men are good looking, and all the children are above average." It was this lack of interesting statistical variation in Minnesota that led him to qualitative inquiry despite the strong quantitative orientation of his doctoral studies in sociology at the University of Wisconsin. He serves on the graduate faculty of The Union Institute, a nontraditional, interdisciplinary, nonresidential and individually designed doctoral program.

He was on the faculty of the University of Minnesota for 18 years, including five years as Director of the Minnesota Center for Social Research, where he was awarded the Morse-Amoco Award for innovative teaching. He won the University of Minnesota storytelling competition and has authored several other books which include Utilization-Focused Evaluation, Creative Evaluation, Practical Evaluation, How to Use Qualitative Methods in Evaluation, and Family Sexual Abuse: Frontline Research and Evaluation.

He edited Culture and Evaluation for the journal New Directions for Program Evaluation. His creative nonfiction book, Grand Canyon Celebration: A Father-Son Journey of Discovery, was a finalist for 1999 Minnesota Book of the Year. He is former President of the American Evaluation Association and the only recipient of both the Alva and Gunnar Myrdal Award for Outstanding Contributions to Useful and Practical Evaluation from the Evaluation Research Society and the Paul F. Lazarsfeld Award for Lifelong Contributions to Evaluation Theory from the American Evaluation Association. The Society for Applied Sociology awarded him the 2001 Lester F. Ward Award for Outstanding Contributions to Applied Sociology.



Qualitative Research & Evaluation Methods: Integrating Theory and Practice Hardcover – 6 Jan. 2015




Product details

  • ASIN ‏ : ‎ 1412972124
  • Publisher ‏ : ‎ SAGE Publications, Inc; Fourth edition (6 Jan. 2015)
  • Language ‏ : ‎ English
  • Hardcover ‏ : ‎ 832 pages
  • ISBN-10 ‏ : ‎ 1412972124
  • ISBN-13 ‏ : ‎ 978-1412972123
  • Dimensions ‏ : ‎ 22.23 x 3.81 x 28.58 cm



Qualitative Research & Evaluation Methods: Integrating Theory and Practice, 4th Edition


ISBN: 9781412972123

TITLE: Qualitative Research & Evaluation Methods: Integrating Theory and Practice, 4th Edition

AUTHOR: Michael Quinn Patton

PUBLISHER: SAGE

PUBLISH DATE: 2015

PRICE: $99.00

BINDING: Hardcover

LIBRARY OF CONGRESS CLASSIFICATION: H62


  • Qualitative research & evaluation methods: Integrating theory and practice


The fourth edition of Michael Quinn Patton's Qualitative Research & Evaluation Methods: Integrating Theory and Practice, published by SAGE Publications, analyses and provides clear guidance and advice on using a range of different qualitative methods for evaluation.

  • Module 1. How qualitative inquiry contributes to our understanding of the world
  • Module 2. What makes qualitative data qualitative
  • Module 3. Making methods decisions
  • Module 4. The fruit of qualitative methods: Chapter summary and conclusion
  • Module 5. Strategic design principles for qualitative inquiry
  • Module 6. Strategic principles guiding data collection and fieldwork
  • Module 7. Strategic principles for qualitative analysis and reporting findings
  • Module 8. Integrating the 12 strategic qualitative principles in practice
  • Module 9. Understanding the Paradigms Debate: Quants versus Quals
  • Module 10. Introduction to Qualitative Inquiry Frameworks
  • Module 11. Ethnography and Autoethnography
  • Module 12. Positivism, Postpositivism, Empiricism and Foundationalist Epistemologies
  • Module 13. Grounded Theory and Realism
  • Module 14. Phenomenology and Heuristic Inquiry
  • Module 15. Social Constructionism, Constructivism, Postmodernism, and Narrative Inquiry
  • Module 16. Ethnomethodology, Semiotics, and Symbolic Interaction, Hermeneutics and Ecological Psychology
  • Module 17. Systems Theory and Complexity Theory
  • Module 18. Pragmatism, Generic Qualitative Inquiry, and Utilization-Focused Evaluation
  • Module 19. Patterns and themes across inquiry frameworks: Chapter summary and conclusions
  • Module 20. Practical purposes, concrete questions, and actionable answers: Illuminating and enhancing quality
  • Module 21. Program evaluation applications: Focus on outcomes
  • Module 22. Specialized qualitative evaluation applications
  • Module 23. Evaluating program models and theories of change, and evaluation models especially aligned with qualitative methods
  • Module 24. Interactive and participatory qualitative applications
  • Module 25. Democratic evaluation, indigenous research and evaluation, capacity building, and cultural competence
  • Module 26. Special methodological applications
  • Module 27. A vision of the utility of qualitative methods: Chapter summary and conclusion
  • Module 28. Design thinking: Questions derive from purpose, design answers questions
  • Module 29. Data collection decisions
  • Module 30. Purposeful sampling and case selection: Overview of strategies and options
  • Module 31. Single-significant-case sampling as a design strategy
  • Module 32. Comparison-focused sampling options
  • Module 33. Group characteristics sampling strategies and options
  • Module 34. Concept and theoretical sampling strategies and options
  • Module 35. Instrumental-use multiple-case sampling
  • Module 36. Sequential and emergence-driven sampling strategies and options
  • Module 37. Analytically focused sampling
  • Module 38. Mixed, stratified, and nested purposeful sampling strategies
  • Module 39. Information-rich cases
  • Module 40. Sample size for qualitative designs
  • Module 41. Mixed methods designs
  • Module 42. Qualitative design chapter summary and conclusion: Methods choices and decisions
  • Module 43. The power of direct observation
  • Module 44. Variations in observational methods
  • Module 45. Variations in duration of observations and site visits: From rapid reconnaissance to longitudinal studies over years
  • Module 46. Variations in observational focus and summary of dimensions along which fieldwork varies
  • Module 47. What to observe: Sensitizing concepts
  • Module 48. Integrating what to observe with how to observe
  • Module 49. Unobtrusive observations and indicators, and documents and archival fieldwork
  • Module 50. Observing oneself: Reflexivity and Creativity, and Review of Fieldwork Dimensions
  • Module 51. Doing Fieldwork: The Data Gathering Process
  • Module 52. Stages of fieldwork: Entry into the field
  • Module 53. Routinization of fieldwork: The dynamics of the second stage
  • Module 54. Bringing fieldwork to a close
  • Module 55. The observer and what is observed: Unity, separation, and reactivity
  • Module 56. Chapter summary and conclusion: Guidelines for fieldwork
  • Module 57. The Interview Society: Diversity of applications
  • Module 58. Distinguishing interview approaches and types of interviews
  • Module 59. Question options and skilled question formulation
  • Module 60. Rapport, neutrality, and the interview relationship
  • Module 61. Interviewing groups and cross-cultural interviewing
  • Module 62. Creative modes of qualitative inquiry
  • Module 63. Ethical issues and challenges in qualitative interviewing
  • Module 64. Personal reflections on interviewing, and chapter summary and conclusion
  • Module 65. Setting the Context for Qualitative Analysis: Challenge, Purpose, and Focus
  • Module 66. Thick description and case studies: The bedrock of qualitative analysis
  • Module 67. Qualitative Analysis Approaches: Identifying Patterns and Themes
  • Module 68. The intellectual and operational work of analysis
  • Module 69. Logical and matrix analyses, and synthesizing qualitative studies
  • Module 70. Interpreting findings, determining substantive significance, phenomenological essence, and hermeneutic interpretation
  • Module 71. Causal explanation through qualitative analysis
  • Module 72. New analysis directions: Contribution analysis, participatory analysis, and qualitative counterfactuals
  • Module 73. Writing up and reporting findings, including using visuals
  • Module 74. Special analysis and reporting issues: Mixed methods, focused communications, and principles-focused report exemplar
  • Module 75. Chapter summary and conclusion, plus case study exhibits
  • Module 76. Analytical processes for enhancing credibility: Systematically engaging and questioning the data
  • Module 77. Four triangulation processes for enhancing credibility
  • Part 1, universal criteria, and traditional scientific research versus constructivist criteria
  • Part 2: artistic, participatory, critical change, systems, pragmatic, and mixed criteria
  • Module 80 Credibility of the inquirer
  • Module 81 Generalizations, Extrapolations, Transferability, Principles, and Lessons learned
  • Module 82 Enhancing the credibility and utility of qualitative inquiry by addressing philosophy of science issues

Patton, M. Q. (2015). Qualitative research & evaluation methods: Integrating theory and practice (4th ed.). SAGE Publications.

'Qualitative research & evaluation methods: Integrating theory and practice' is referenced in:

  • Week 46: Rumination #2: Confusing empathy with bias
  • Week 47: Rumination #3: Fools' gold: the widely touted methodological "gold standard" is neither golden nor a standard

Framework/Guide

  • Rainbow Framework: Sample


© 2022 BetterEvaluation. All rights reserved.


Patton, Michael

Michael Quinn Patton Utilization-Focused Evaluation, Saint Paul, MN

Michael Quinn Patton is the author of more than a dozen books on evaluation, including Qualitative Research & Evaluation Methods, 4th ed. (2015), Blue Marble Evaluation (2020), Principles-Focused Evaluation (2018), Facilitating Evaluation (2018), and Developmental Evaluation (2011). Based in Minnesota, he was on the faculty of the University of Minnesota for 18 years and is a former president of the American Evaluation Association (AEA). Michael is a recipient of the Alva and Gunnar Myrdal Evaluation Practice Award, the Paul F. Lazarsfeld Evaluation Theory Award, and the Research on Evaluation Award, all from AEA. He has also received the Lester F. Ward Distinguished Contribution to Applied and Clinical Sociology Award from the Association for Applied and Clinical Sociology. In 2021 he received the first Transformative Evaluator Award from EvalYouth. He is an active speaker, trainer, and workshop presenter who has conducted applied research and evaluation on a broad range of issues and has worked with organizations and programs at the international, national, state, provincial, and local levels. Michael has three children—a musician, an engineer, and an evaluator—and four grandchildren. When not evaluating, he enjoys exploring the woods and rivers of Minnesota, where he lives.

Utilization-Focused Evaluation

Facilitating Evaluation

Qualitative Research & Evaluation Methods

Essentials of Utilization-Focused Evaluation

Family Sexual Abuse

How to Use Qualitative Methods in Evaluation

Practical Evaluation

Field Methods

Qualitative Research & Evaluation Methods 4th Edition Integrating Theory and Practice

Cover image: Qualitative Research & Evaluation Methods 4th edition 9781412972123

  • Author(s) Michael Quinn Patton
  • Publisher SAGE Publications, Inc

Print ISBN 9781412972123, 1412972124

eText ISBN 9781483376059, 1483376052

  • Edition 4th
  • Copyright 2015
  • Available from $65.00 USD (SKU: 9781483376059R90)


Qualitative Research & Evaluation Methods: Integrating Theory and Practice, 4th Edition, is written by Michael Quinn Patton and published by SAGE Publications, Inc. The digital and eTextbook ISBNs are 9781483376059 and 1483376052; the print ISBNs are 9781412972123 and 1412972124. Additional ISBNs for this eTextbook include 9781483314815 and 9781483301457.


  • Open access
  • Published: 02 September 2024

“I am there just to get on with it”: a qualitative study on the labour of the patient and public involvement workforce

  • Stan Papoulias (ORCID: orcid.org/0000-0002-7891-0923)
  • Louca-Mai Brady

Health Research Policy and Systems, volume 22, Article number: 118 (2024)


Background

Workers tasked with specific responsibilities around patient and public involvement (PPI) are now routinely part of the organizational landscape for applied health research in the United Kingdom. Even as the National Institute for Health and Care Research (NIHR) has had a pioneering role in developing a robust PPI infrastructure for publicly funded health research in the United Kingdom, considerable barriers remain to embedding substantive and sustainable public input in the design and delivery of research. Notably, researchers and clinicians report a tension between funders’ orientation towards deliverables and the resources and labour required to embed public involvement in research. These and other tensions require further investigation.

Methods

This was a qualitative study with participatory elements. Using purposive and snowball sampling and attending to regional and institutional diversity, we conducted 21 semi-structured interviews with individuals holding NIHR-funded formal PPI roles across England. Interviews were analysed through reflexive thematic analysis, with coding and framing presented and adjusted through two workshops with study participants.

Results

We generated five overarching themes which signal a growing tension between expectations put on staff in PPI roles and the structural limitations of these roles: (i) the instability of support; (ii) the production of invisible labour; (iii) PPI work as more than a job; (iv) accountability without control; and (v) delivering change without changing.

Conclusions

The NIHR PPI workforce has enabled considerable progress in embedding patient and public input in research activities. However, the role has led not to a resolution of the tension between performance management priorities and the labour of PPI, but rather to its displacement and – potentially – its intensification. We suggest that the expectation to “deliver” PPI hinges on a paradoxical demand to deliver a transformational intervention that is fundamentally divorced from any labour of transformation. We conclude that ongoing efforts to transform health research ecologies so as to better respond to the needs of patients will need to grapple with the force and consequences of this paradoxical demand.


Introduction – the labour of PPI

The inclusion of patients, service users and members of the public in the design, delivery and governance of health research is increasingly embedded in policy internationally, as partnerships with the beneficiaries of health research are seen to increase its relevance, acceptability and implementability. In this context, a growing number of studies have sought to evaluate the impact of public participation on research, including identifying the barriers and facilitators of good practice [ 1 , 2 , 3 , 4 , 5 , 6 , 7 , 8 ]. Some of this inquiry has centred on power, control and agency. Attention has been drawn, for example, to the scarcity of user or community-led research and to the low status of experiential knowledge in the hierarchies of knowledge production guiding evidence-based medicine [ 9 ]. Such hierarchies, authors have argued, constrain the legitimacy that the experiential knowledge of patients can achieve within academic-led research [ 10 ], may block the possibility of equitable partnerships such as those envisioned in co-production [ 11 ] and may function as a pull back against more participatory or emancipatory models of research [ 12 , 13 , 14 ]. In this way, patient and public inclusion in research may become less likely to aim towards inclusion of public and patient-led priorities, acting instead as a kind of “handmaiden” to research, servicing and validating institutionally pre-defined research goals [ 15 , 16 , 17 ].

Research on how public participation-related activities function as a form of labour within a research ecosystem, however, is scarce [ 18 ]. In this paper, we examine the labour of embedding such participation, with the aim of understanding how such labour fits within the regimes of performance management underpinning current research systems. We argue that considering this “fit” is crucial for a broader understanding of the implementation of public participation and therefore its potential impact on research delivery. To this end, we present findings from a UK study of the labour of an emerging professional cadre: “patient and public involvement” leads, managers and co-ordinators (henceforth PPI, the term routinely used for public participation in the United Kingdom). We concentrate specifically on staff working on research partnerships and centres funded by the National Institute for Health and Care Research (NIHR). This focus on the NIHR is motivated by the organization’s status as the centralized research and development arm of the National Health Service (NHS), with an important role in shaping health research systems in the United Kingdom since 2006. NIHR explicitly installed PPI in research as a foundational part of its mission and is currently considered a global leader in the field [ 19 ]. We contend that exploring the labour of this radically under-investigated workforce is crucial for understanding what we see as the shifting tensions – outlined in later sections – that underpin the key policy priority of embedding patients as collaborators in applied health research. 
To contextualize our study, we first consider how the requirement for PPI in research relates to the overall policy rationale underpinning the organizational mission of the NIHR as the NHS’s research arm, then consider existing research on tensions identified in efforts to embed PPI in a health system governed through regimes of performance management and finally articulate the ways in which dedicated PPI workers’ responsibilities have been developed as a way to address these tensions.

The NIHR as a site of “reformed managerialism”

The NIHR was founded in 2006 with the aim of centralizing and rationalizing NHS research and development activities. Its foundation instantiated the then Labour government’s efforts to strengthen and consolidate health research in the UK while also tackling some of the problems associated with the earlier introduction of new public management (NPM) principles in the governance of public services. NPM had been introduced in the UK public sector by Margaret Thatcher’s government, in line with similar trends in much of the Global North [ 20 ]. The aim was to curb what the Conservatives saw as excesses in both public spending and professional autonomy. NPM consisted of management techniques adapted from the private sector: an internal market for services, formalized in the NHS via the 1990 National Health Service and Community Care Act, with local authorities purchasing services from local health providers (NHS Trusts) [ 21 ]; top-down management control; an emphasis on cost-efficiency; a focus on targets and outputs over process; an intensification of metrics for performance management; and a positioning of patients and the public as consumers of health services with a right to choose [ 22 , 23 ]. In the context of the NHS, cost-efficiency meant concentrating on services and on research which would have the greatest positive impact on population health while preventing research waste [ 24 ]. By the mid-1990s, however, considerable criticism had been directed towards this model, including concerns that NPM techniques resulted in silo-like operations and public sector fragmentation, which limited the capacity for collaboration between services essential for effective policy. Importantly, there was also a sense that an excessive managerialism had resulted in a disconnection of public services from public and civic aims, that is, from the values, voices and interests of the public [ 25 , 26 ].

In this context, the emergence of the NIHR can be contextualized through the succeeding Labour government’s much publicized reformed managerialism, announced in their 1997 white paper “The New NHS: Modern, Dependable” [ 27 ]. Here, the reworking of NPM towards “network governance” meant that the silo-like effects of competition and marketization were to be attenuated through a turn to cross-sector partnerships and a renewed attention to quality standards and to patients’ voices [ 28 ]. It has been argued, however, that the new emphasis on partnerships did not undermine the dominance of performance management, while the investment in national standards for quality and safety resulted in an intensified metricization, with the result that this reform may have been more apparent than real, amounting to “NPM with a human face” [ 29 , 30 , 31 ]. Indeed, the NIHR can be seen as an exemplary instantiation of this model: as a centralized commissioner of research for the NHS, the NIHR put in place reporting mechanisms and performance indicators to ensure transparent and cost-efficient use of funds, with outputs and impact measured, managed and ranked [ 24 ]. At the same time, the founding document of the NIHR, Best Research for Best Health, articulates the redirection of such market-oriented principles towards a horizon of public good and patient benefit. The document firmly and explicitly positioned patients and the public as both primary beneficiaries of and important partners in the delivery of health research. People (patients) were to be placed “at the centre of a research system that focuses on quality, transparency and value for money” [ 32 ], a mission implemented through the installation of “structures and mechanisms to facilitate increased involvement of patients and the public in all stages of NHS Research & Development” [ 33 ]. This involvement would be supported by the advisory group INVOLVE, a key part of the new centralized health research system. 
INVOLVE, which had started life in 1996 as Consumers in NHS Research, funded by the Department of Health, testified to the Labour administration’s investment in championing “consumer” involvement in NHS research as a means of increasing research relevance [ 34 ]. The foundation of the NIHR then exemplified the beneficent alignment of NPM with public benefit, represented through the imaginary of a patient-centred NHS, performing accountability to the consumers/taxpayers through embedding PPI in all its activities. In this context, “public involvement” functioned as the lynchpin through which such alignment could be effected.

PPI work and the “logic of deliverables”: a site of tension

Existing research on the challenges of embedding PPI has typically focussed on the experiences of academics tasked with doing so within university research processes. For example, Pollard and Evans, in a 2013 paper, argue that undertaking PPI work in mental health research can be arduous, emotionally taxing and time-consuming, and as such, can be in tension with expectations for cost-efficient and streamlined delivery of research outputs [ 35 ]. Similarly, Papoulias and Callard found that the “logic of deliverables” governing research funding can militate against undertaking PPI or even constitute PPI as “out of sync” with research timelines [ 36 ]. While recent years have seen a deepening operationalization of PPI in the NIHR and beyond, there are indications that this process, rather than removing these tensions, may have recast them in a different form. For example, when PPI is itself set up as a performance-based obligation, researchers, faced with the requirement to satisfy an increasing number of such obligations, may either engage in “surface-level spectacles” to impress the funder while eschewing the long-term commitment necessary for substantive and ongoing PPI, or altogether refuse to undertake PPI, relegating the responsibility to others [ 37 , 38 ]. Such refusals may then contribute to a sharpening of workplace inequalities: insofar as PPI work is seen as “low priority” for more established academic staff, it can be unevenly distributed within research organizations, with precariously employed junior researchers and women typically assigned PPI responsibilities with the assumption that they possess the “soft skills” necessary for these roles [ 39 ].

Notably, the emergence of a dedicated PPI workforce is intended as a remedy for this tension by providing support, expertise and ways of negotiating the challenges associated with undertaking PPI responsibilities. In the NIHR, this workforce is part of a burgeoning infrastructure for public involvement which includes national standards, training programmes, payment guidelines, reporting frameworks and impact assessments [ 40 , 41 , 42 , 43 , 44 , 45 ]. By 2015, an INVOLVE review of PPI activities during the first 10 years of the NIHR attested to “a frenzy of involvement activity…across the system”, including more than 200 staff in PPI-related roles [ 40 ]. As NIHR expectations regarding PPI have become more extensive, responsibilities of PPI workers have proliferated, with INVOLVE organizing surveys and national workshops to identify their skills and support needs [ 41 , 42 ]. In 2019, the NIHR mandated the inclusion of a “designated PPI lead” in all funding applications, listing an extensive and complex roster of responsibilities. These now included delivery and implementation of long-term institutional strategies and objectives, thus testifying to the assimilation of involvement activities within the roster of “performance-based obligations” within research delivery systems [ 43 ]. Notably, however, this formalization of PPI responsibilities is ambiguous: the NIHR website states that the role “should be a budgeted and resourced team member” and that they should have “the relevant skills, experience and authority”, but it does not specify whether this should be a researcher with skills in undertaking PPI or indeed someone hired specifically for their skills in PPI, that is, a member of the PPI workforce. Equally, the specifications, skills and support needs, which have been brought together into a distinct role, have yet to crystallize into a distinct career trajectory.

Case studies and evaluations of PPI practice often reference the skills and expertise required in leading and managing PPI. Chief among them are relational and communication skills: PPI workers have been described as “brokers” who mediate and enable learning between research and lay spaces [ 44 , 45 ]; skilled facilitators enabling inclusive practice [ 46 , 47 , 48 ]; “boundary spanners” navigating the complexities of bridging researchers with public contributors and undertaking community engagement through ongoing relational work [ 49 ]. While enumerating the skillset required for PPI work, some of these studies have identified a broader organizational devaluation of PPI workers: Brady and colleagues write of PPI roles as typically underfunded with poor job security, which undermines the continuity necessary for generating trust in PPI work [ 46 ], while Mathie and colleagues report that many PPI workers describe their work as “invisible”, a term which the authors relate to the sociological work on women’s labour (particularly housework and care labour) which is unpaid and rendered invisible insofar as it is naturalized as “care” [ 50 ]. Research on the neighbouring role of public engagement professionals in UK universities, which has been more extensive than that on PPI roles, can be instructive in fleshing out some of these points: public engagement professionals (PEPs) are tasked with mediating between academics and various publics in the service of a publicly accountable university. In a series of papers on the status of PEPs in university workplaces, Watermeyer and colleagues argue that, since public engagement labour is relegated to non-academic forms of expertise which lack recognition, PEPs’ efforts in boundary spanning do not confer prestige. This lack of prestige can, in effect, function as a “boundary block” obstructing PEPs’ work [ 51 , 52 ]. 
Furthermore, like Mathie and Brady, Watermeyer and colleagues also argue that the relational and facilitative nature of engagement labour constitutes such labour as feminized and devalued, with PEPs also reporting that their work remains invisible to colleagues and institutional audit instruments alike [ 50 , 53 ].

The present study seeks to explore further these suggestions that PPI labour, like that of public engagement professionals, lacks recognition and is constituted as invisible. However, we maintain that there are significant differences between the purpose and moral implications of involvement and engagement activities. PPI constitutes an amplification of the moral underpinnings of engagement policies: while public engagement seeks to showcase the public utility of academic research, public involvement aims to directly contribute to optimizing and personalizing healthcare provision by minimizing research waste, ensuring that treatments and services tap into the needs of patient groups, and delivering the vision of a patient-centred NHS. Therefore, even as PPI work may be peripheral to other auditable research activities, it is nevertheless central to the current rationale for publicly funded research ecosystems: by suturing performance management and efficiency metrics onto a discourse of public benefit, such work constitutes the moral underpinnings of performance management in health research systems. Therefore, an analysis of the labour of the dedicated PPI workforce is crucial for understanding how this suturing of performance management and “public benefit” works over the conjured figures of patients in need of benefit. This issue lies at the heart of our research study.

Methods

Our interview study formed the first phase of a multi-method qualitative inquiry into the working practices of NIHR-funded PPI leads. While PPI lead posts are in evidence in most NIHR-funded research, we decided to focus on NIHR infrastructure funding specifically: these are 5-year grants absorbing a major tranche of NIHR funds (over £600 million annually in 2024). They function as “strategic investments” embodying the principles outlined in Best Research for Best Health: they are awarded to research organizations and NHS Trusts for the purposes of developing and consolidating capacious environments for early stage and applied clinical research, including building a research delivery workforce and embedding a regional infrastructure of partnerships with industry, the third sector and patients and communities [ 55 ]. We believe that understanding the experience of the PPI workforce funded by these grants may give better insights into NIHR’s ecosystem and priorities, since they are specifically set up to support the development of sustainable partnerships and embed the translational pipeline into clinical practice.

The study used purposive sampling with snowball elements. In 2020–2021, we mapped all 72 NIHR infrastructure grants, identified the PPI teams working in each of these using publicly available information (found on the NIHR website and on the websites and PPI pages of every organization awarded infrastructure grants) and sent out invitation emails to all teams. Where applicable, we also sent invitations to mailing lists of PPI-lead national networks connected to these grants. Inclusion criteria were that potential participants should hold oversight roles and/or be tasked with cross-programme/centre responsibilities, meaning that their facilitative and strategy-building roles should cover the entirety of activities funded by one (and sometimes more than one) NIHR infrastructure grant or centre, including advisory roles over most or all research projects associated with the centre or grant, and that they should have worked in this or a comparable environment for at least 2 years.
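The eligibility screening described above amounts to applying two criteria to each mapped candidate. As a purely illustrative sketch (the records, field names and values below are hypothetical, not the study's actual data), the logic can be expressed as a simple filter:

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    # Hypothetical record for a potential participant.
    name: str
    has_oversight_role: bool  # oversight and/or cross-programme/centre responsibilities
    years_in_role: float      # years in this or a comparable environment

def is_eligible(c: Candidate) -> bool:
    """Apply the two inclusion criteria paraphrased from the study design."""
    return c.has_oversight_role and c.years_in_role >= 2

candidates = [
    Candidate("A", True, 3.0),
    Candidate("B", True, 1.0),   # excluded: under 2 years' experience
    Candidate("C", False, 5.0),  # excluded: no oversight role
]
eligible = [c.name for c in candidates if is_eligible(c)]
print(eligible)  # → ['A']
```

In practice, of course, the authors screened candidates by hand from public information rather than programmatically; the sketch only makes the criteria explicit.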

The individuals who showed interest received detailed information sheets. Once they agreed to participate, they were sent a consent form and a convenient interview time was agreed. We conducted 21 semi-structured interviews online, between March and June 2021, each lasting 60–90 min. The interview topic guide was developed in part through a review of organizational documents outlining the role and through a consideration of existing research on the labour of PPI within health research environments. It focussed on how PPI workers fit within the organization and on the relationship between the actual work undertaken and the way this work is represented to both the organization and the funder. Interview questions included how participants understand their role; how they fit in the organization; how their actual work relates to the job description; how their work is understood by both colleagues and public contributors; the relationship between the work they undertake and how this is represented in reports to the funder and in presentations; and what they find challenging about their work. Information about participants’ backgrounds and what brought them to their present role was also gathered. Audio files were checked and transcribed, and the transcripts were fully de-identified. All participants were given the opportunity to check transcripts and withdraw them at any point until December 2021. None withdrew.

We analysed the interviews using reflexive thematic analysis with participatory elements [ 54 , 55 ]. Reflexive thematic analysis emphasizes the interpretative aspects of the analytical process, including the data “collection” process itself, which this approach recognizes as a generative act, where meaning is co-created between interviewer and participant and the discussion may be guided by the participant rather than strictly adhering to the topic guide [ 56 ]. We identified patterns of meaning through sustained and immersive engagement with the data. NVivo 12 was used for coding, while additional notes and memos on the Word documents themselves mitigated the over-fragmentation that might potentially limit NVivo as a tool for qualitative analysis. Once we had developed themes which gave a thorough interpretation of the data, we presented these to participants in two separate workshops to test for credibility and ensure that participants felt ownership of the process [ 57 ].
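The move from coded excerpts to themes can be pictured as grouping codes under candidate themes and checking how widely each theme is supported across transcripts. The sketch below is schematic only, with invented transcript IDs, codes and theme labels; the authors worked interpretatively in NVivo 12, not programmatically:

```python
from collections import defaultdict

# Hypothetical coded excerpts: (transcript_id, code) pairs.
coded_excerpts = [
    ("P01", "short-term contracts"),
    ("P02", "funding from multiple budgets"),
    ("P01", "work unseen by colleagues"),
    ("P03", "work unseen by colleagues"),
]

# Candidate themes: theme label -> set of constituent codes.
themes = {
    "instability of support": {"short-term contracts", "funding from multiple budgets"},
    "invisible labour": {"work unseen by colleagues"},
}

# For each theme, collect the transcripts that support it.
support = defaultdict(set)
for transcript, code in coded_excerpts:
    for theme, codes in themes.items():
        if code in codes:
            support[theme].add(transcript)

for theme in themes:
    print(theme, sorted(support[theme]))
# → instability of support ['P01', 'P02']
# → invisible labour ['P01', 'P03']
```

Reflexive thematic analysis treats themes as interpretative constructs rather than mere code tallies, so such a tabulation could at most support, never replace, the immersive engagement the authors describe.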

As the population from which the sample was taken is quite small, with some teams working across different infrastructure grants, confidentiality and anonymity were important concerns for participants. We therefore decided neither to collect nor to present extensive demographic information, to preserve confidentiality and avoid deductive disclosure [ 58 ]. Out of our 21 participants, 20 were women; there was some diversity in age, ethnicity and heritage, with a significant majority identifying as white (British or other European). Participants had diverse employment histories: many had come from other university or NHS posts, often in communications, programme management or human resources; a significant minority had come from the voluntary sector; and a small minority from the private sector. As there was no accredited qualification in PPI at the time this study was undertaken, participants had all learned their skills on their present or previous jobs. A total of 13 participants were on full-time contracts, although in several cases funding for these posts was finite and fragmented, often coming from different budgets.

In this paper we present five inter-related themes drawing on the conceptual architecture we outlined in the first half of this paper to explore how PPI workers navigate a research ecosystem of interlocking institutional spaces that is governed by “NPM with a human face”, while striving to align patients and the public with the imaginary of the patient-centred NHS that mobilizes the NIHR mission. These five themes are: (i) the instability of support; (ii) the production of invisible labour; (iii) PPI as moral imperative; (iv) accountability without control; and (v) delivering change without changing.

“There to grease the cogs rather than be the cogs”: the instability of “support”

Infrastructure grants act as a hub for large numbers of studies, often in diverse health fields, most of which should, ideally, include PPI activities. Here, dedicated PPI staff typically fulfil a cross-cutting role: they are meant to oversee, provide training and advise on embedding PPI activities across the grant and, in so doing, support researchers in undertaking PPI. On paper, support towards the institution, in the form of delivering training, setting strategy for and evaluating PPI, is associated with more senior roles (designated manager or lead), whereas support towards so-called public contributors is the remit of more junior roles (designated co-ordinator or officer) and can include doing outreach, facilitating, attending to access needs and developing payment and compensation procedures. However, these distinctions rarely applied in practice: participants typically reported that their work did not neatly fit into these categories and that they often had to fulfil both roles regardless of their title. Some were the only person in the team specifically tasked with PPI, and so their “lead” or “manager” designation was more symbolic than actual:

I have no person to manage, although sometimes I do get a little bit of admin support, but I don’t have any line management responsibility. It is really about managing my workload, working with people and managing the volunteers that I work with and administrating those groups and supporting them (P11).

P11’s title was manager but, as they essentially worked alone, shuttling between junior and senior role responsibilities, they justified and made sense of their title by reframing their support work with public contributors as “management”. Furthermore, other participants reported that researchers often misunderstood PPI workers’ cross-cutting role and expected them both to advise on and to deliver PPI activities themselves, even in the context of multiple projects, thus releasing researchers altogether from such responsibility.

As a PPI lead, it is very difficult to define what your role is in different projects….and tasks … So, for example, I would imagine in [some cases] we are seen as the go-to if they have questions. [..] whereas, in [other cases], it is like, “Well, that’s your job because you’re the PPI lead” […] there is not a real understanding that PPI is everyone’s responsibility and that the theme leads are there to facilitate and to grease the cogs rather than be the cogs (P20).

Furthermore, participants reported that the NIHR requirement for a PPI lead in all funding applications might in fact have facilitated this slippage. As already mentioned, the NIHR requirement does not differentiate between someone hired specifically to undertake PPI and a researcher tasked with PPI activities. The presence of a member of staff with a “PPI lead” title thus meant that PPI responsibilities in individual research studies could continue to accrue on that worker:

The people who have been left with the burden of implementing [the NIHR specified PPI lead role] are almost exclusively people like me, though, because now researchers expect me to allow myself to be listed on their project as the PPI lead, and I actually wrote a document about what they can do for the PPI lead that more or less says, “Please don’t list me as your PPI lead. Please put aside funds to buy a PPI lead and I will train them, because there is only one me; I can’t be the PPI lead for everyone” (P10).

This expectation that core members of staff with responsibilities for PPI would also be able to act as PPI leads for numerous research projects suggests that this role lacks firm organizational co-ordinates and boundaries. Here, the presence of a PPI workforce does not, in fact, constitute an appropriate allocation of PPI labour but rather testifies to a continuing institutional misapprehension of the nature of such labour, particularly in terms of its duration, location and value.

Conjuring PPI: the production of invisible labour

Participants consistently emphasized the invisibility of the kinds of labour, both administrative and relational, specific to public involvement as a process, confirming the findings of Mathie and colleagues [ 50 ]. This invisibility took different forms and had different justifications. Some argued that key aspects of their work, which are foundational to involvement, such as the process of relationship building, do not lend themselves to recognition as a performance indicator: “There is absolutely no measure for that because how long is a piece of string” (P11). In addition, relationship building necessitated a considerably greater time investment than was institutionally acceptable, and this was particularly evident when it came to outreach. Participants who did their work in community spaces told stories of uncomprehending line-managers, or annoyed colleagues who wondered where the PPI worker went and what they did all day:

There is very little understanding from colleagues about what I do on a day-to-day basis, and it has led to considerable conflict …. I would arrive at the office and then I would be disappearing quite promptly out into the community, because that is where I belong […] So, it is actually quite easy to become an absent person (P3).

Once again, the NIHR requirement for designated PPI leads in funding applications, intended to raise the visibility of PPI work by formalizing it as costed labour, could instead further consolidate its invisibility:

I am constantly shoved onto bids as 2% of my full-time equivalent and I think I worked out for a year that would be about 39 hours a year. For a researcher, popping the statistician down and all these different people on that bid, “Everyone is 2% and we need the money to run the trial, so 2% is fine”. And if I said to them, “Well, what do you think I would do in those 39 hours?” they wouldn’t have a clue, not a clue (P17).

The 2% of a full-time allocation is accorded to the PPI worker because 2–5% is the time typically costed for leadership roles or for roles with a circumscribed remit (e.g. statisticians). However, this allocation, in making PPI workers’ labour visible either as oversight (what project leads do) or as methodological expertise (what statisticians do), ends up producing the wrong kind of visibility: the 39 hours mentioned here might make sense when the role mainly involves chairing weekly meetings or delivering statistical models but are in no way sufficient for the intense and ongoing labour of trust-building and alignment between institutions and public contributors in PPI.

Indeed, such costings, by eliding the complexity and duration of involvement, may reinforce expectations that PPI can be simply conjured up at will and delivered on demand:

A researcher will say to us, “I would really like you to help me to find some people with lived experience, run a focus group and then I’ll be away”. To them, that is the half-hour meeting to talk about this request, maybe 10 minutes to draft a tweet and an email to a charity that represents people with that condition […] the reality is it is astronomically more than that, because there is all this hidden back and forth. […] [researchers] expect to be able to hand over their protocol and then I will find them patients and those patients will be … representative and I will be able to talk to all of those patients and … write them up a report and …send it all back and they will be able to be like, “Thanks for the PPI”, and be on their merry way (P13).

What P13 communicates in this story is the researcher’s failure to perceive the difference between PPI work and institutional norms for project delivery: the researcher who asks for “some people with lived experience” is not simply underestimating how long this process will take. Rather, involvement work is perceived as homologous to metricized and institutionally recognizable activities (for example, recruitment to trials or producing project reports) for which there already exist standard procedures. Here, the relational complexity and improvised dynamic of involvement is turned into a deliverable (“the PPI”) that can be produced by following an appropriate procedure. When PPI workers are expected to instantly deliver the right contributors to fit the project needs, PPI labour is essentially black-boxed and in its place sits “the PPI”, a kind of magical object seemingly conjured out of nowhere.

Such invisibility, however, may also be purposefully produced by the PPI workers themselves. One participant spoke of this at length, when detailing how they worked behind the scenes to ensure public contributors have input into research documents:

When we get a plain English summary from a researcher, we rewrite them completely. If the advisory group [see] … a really bad plain English summary, they are just going to go, “I don’t understand anything”. I might as well do the translation straight away so that they can actually review something they understand. [Researchers then] think, “Oh, [the public advisory group] are so good at writing” … and I am thinking, “Well, they don’t … write, they review, and they will say to me, ‘Maybe move this up there and that up there, and I don’t understand these’”, … They are great, don’t get me wrong, but they don’t write it. And it is the same with a lot of things. They think that [the group] are the ones that do it when it is actually the team (P7).

Here, the invisibility of the PPI worker’s labour is purposefully wrought to create goodwill and lubricate collaboration. Several participants said that they chose to engage in such purposeful invisibility because they knew that resources were not available to train researchers in plain writing and public contributors in academic writing. PPI workers, in ghost-writing accessible texts, thus effect a shortcut in the institutional labour required to generate alignment between researchers and public contributors. However, this shortcut comes at a price: in effecting it, PPI workers may collude in conjuring “the PPI” – they may themselves make their own work disappear.

“Not a 9 to 5”: PPI work as more than a job

Most participants reported that overtime working was common for themselves and their teammates, whether they were on a fractional or full-time contract. Overall, participants saw undertaking extra work as a necessary consequence of their commitment towards public contributors, a commitment which made it difficult to turn work down:

Everyone loses if you say no: the public contributors aren’t involved in a meaningful way, the project won’t be as good because it doesn’t have meaningful PPI involvement (P20).

While overwork was a common result of this commitment, some participants described such overwork as the feature that distinguished PPI work from what one commonly understands as a “job”, because, in this case, overwork was seen as freely chosen rather than externally imposed:

It is me pushing myself or wanting to get things done because I started it and I think I would get less done if I worked less and that would bother me, but I don’t think it is a pressure necessarily from [line manager] or [the institution] or anyone to be like, “No, do more” (P13).

Participants presented relationship building not only as the most time-consuming but also the most enjoyable aspect of PPI work. Community engagement was a key site for this and once again participants tended to represent this type of work as freely chosen:

I did most of the work in my free time in the end because you have to go into communities and you spend a lot longer there. […] So, all of that kind of thing I was just doing in my spare time and I didn’t really notice at the time because I really enjoyed it (P6).

Thus, time spent in relationship building was constituted as both work and not work. It did not lend itself to metricization via workplace time management and, additionally, was not perceived by participants themselves as labour (“I didn’t really notice it at the time”). At the same time, out-of-hours work was rationalized as necessary for inclusivity, set up to enable collaboration with public contributors insofar as they do not have a contractual relationship with the employer:

That is not a 9–5. That is a weekends and holidays sort of job, because our job is to reduce the barriers to involvement and some of those barriers are hours – 9–5 is a barrier for some people (P17).

If working overtime allows PPI workers to reduce barriers and enable collaboration with those who are not employed by the institution, that same overtime work also serves to conceal the contractual nature of the PPI workers’ own labour, which now becomes absorbed into the moral requirements of PPI.

“Caught in the middle”: accountability without control

Participants repeatedly emphasized that their ability to contribute to research delivery was stymied by their lack of control over specific projects and over broader institutional priority setting:

… as a PPI lead we are not full member of staff, we are not responsible for choosing the research topics. We […] can only guide researchers who come to us and tell us what they are doing … we don’t have any power to define what the public involvement looks like in a research project (P6).

Tasked with creating alignments and partnerships between the publics and institutions, participants argued that they did not have the power to make them “stick” because they are not “really” part of the team. However, even as PPI workers lacked the power to cement partnerships, any failure in the partnership could be ascribed to them, perceived as a failure of the PPI worker by both funder and public contributors:

Often you have to hand over responsibility and the researcher [who] can let the panel down and … I feel like I have let the panel member down because … I am the one who said, “Oh yes, this person wants to talk to you”, and I find that really challenging, getting caught in the middle like that (P21).

This pairing of accountability with lack of control became more pronounced in grant applications or reports to the funder:

It is also quite frustrating in the sense that, just because I advise something, it doesn’t necessarily mean that it gets implemented or even included in the final grant. [even so] whatever the feedback is still reflects on us, not necessarily on the people who were making the wider decisions […] As PPI leads, we are still usually the ones that get the blame (P10).

Several participants testified to this double frustration: having to witness their PPI plans being rewritten to fit the constraints (financial, pragmatic) of the funding application, they then often found themselves held accountable if those plans failed to find favour with the funder. PPI workers then become the site where institutional accountability to both its public partners and to the funder gathers – it is as though, while located outside most decision-making, they nevertheless become the attractors for the institution’s missing accountability, which they experience, in the words of P21, as “being caught in the middle” or, as another participant put it, as “the worry you carry around” (P16).

“There to just get on with it”: delivering change without changing

Participants recognized that effective collaboration between research institutions and various publics requires fundamental institutional changes. Yet they also argued that while PPI workers are not themselves capable of effecting such change, there is nevertheless considerable institutional pressure to deliver on promises made in grant applications and build PPI strategies on this basis:

So, there is that tension about […] pushing this agenda and encouraging people to do more [….] rather than just accepting the status quo. But actually, the reality is that it is very, very hard to get everybody in [grant name] to change what they do and I can’t make that happen, [senior PPI staff] can’t make that happen, nobody can. The whole systemic issue … But you have got, somehow in the strategy and what you say you are going to do, that tension between aspiration and reality (P4).

This tension between aspiration and reality identified here could not be spelled out in reports for fear of reputational damage. In fact, the expectation to have delivered meaningful PPI, now routinely set up in NIHR applications, could itself militate against such change. For example, a frequently voiced concern was that PPI was being progressively under-resourced:

I feel the bar is getting higher and higher and higher and expectations are higher and we have got no extra resource (P16).

However, annual reports, the mechanism through which the doing of PPI is evidenced, made it difficult to be open about any such under-resourcing.

We will allude to [the lack of resources]. So, we will say things like, “We punch above our weight”, but I am not sure that message gets home to the NIHR very clearly. It is not like the annual report is used to say, “Hey, you’re underfunding this systematically, but here’s all the good stuff we do”, because the annual report is, by essence, a process of saying how great you are, isn’t it? (P3).

The inclusion of PPI as a “deliverable” meant that, in a competitive ecosystem, the pressure was on to report that PPI had always already been delivered. As another participant put it, “no one is going to report the bad stuff” (P17). Hence reporting, in setting up PPI as a deliverable, reinforced new zones of invisibility for PPI labour and made it harder to surface any under-resourcing of such labour. Furthermore, such reporting also played down any association between successful PPI and system transformation. Another participant described the resistance they encountered after arguing that the organization should move away from “last-minute” PPI:

I think it is really hard when […] these people are essentially paying your pay cheque, to then try to push back on certain things that I don’t think are truly PPI ….[A]s somebody who I felt my role was really to show best practice, for then [to be] seen as this difficult person for raising issues or pushing back rather than just getting things done, is really hard [….] I get the impression, at least within the [organization] … that I am not there to really point out any of the issues. I am there just to get on with it (P14).

This opposition between pointing out the issues and “getting on with it” is telling. It names a contradiction at the heart of PPI labour: here, the very act of pushing back – in this case asking for a commitment to more meaningful and ongoing PPI – can be perceived as going against the PPI worker’s responsibilities, insofar as it delays and undoes team expectations for getting things done, for delivering PPI. Here, then, we find an exemplary instance of the incommensurability between the temporal demands of research and those of meaningful PPI practice.

How do the five themes we have presented help open out how policies around public participation are put into practice, as well as the contradictions that this practice navigates, in health systems organized by the rhetorical suturing of performance management onto public benefit? We have argued that the development of a dedicated workforce represents an attempt to “repair” the tension experienced by researchers between the administrative, facilitative and emotional work of PPI and the kinds of deliverables that the institution requires them to prioritize. We argue that our findings indicate that insofar as PPI workers’ role then becomes one of “delivering” PPI, this tension is reproduced and at times intensified within their work. This is because, as actors in the health research ecosystem, PPI staff are tethered to the very regimes of performance management which give rise to an institutional misapprehension of the actual labour associated with delivering PPI.

This misapprehension surfaces in the instruments through which the funder costs, measures and generates accountability for PPI – namely, the requirement for a costed PPI lead and the mandatory inclusion of a PPI section in applications and regular reports to the funder. The NIHR requirement for a costed PPI lead, intended to legitimize the undertaking of PPI as an integral part of a research team’s responsibilities, may instead continue to position the PPI worker as a site for the research team’s wholesale outsourcing of responsibility for PPI, since this responsibility, while in tension with other institutional priorities, nevertheless cannot be refused by the team. Furthermore, the use of titles such as lead, manager or co-ordinator not only signals an orderly distinction between junior and senior roles, which often does not apply in practice, but also reframes the extra-institutional work of PPI (the forging of relationships and administrative support with public contributors) through the intra-institutional functions of performance/project management. This reframing elides an important difference between the two: public and patient partners, for the most part, do not have a formal contractual relationship with the institution and are not subject to performance management in the way that contracted researchers and healthcare professionals are. Indeed, framing the relationship between PPI workers and public contributors through the language of “management” fundamentally misrecognizes the kinds of relationalities produced in the interactions between PPI workers and public contributors and elides the externality of PPI to the “logic of deliverables” [ 36 ].

The inclusion of a detailed PPI section in grant applications and annual reports to the funder further consolidates this misapprehension by representing public involvement as if it were already enrolled within organizational normative procedures, and therefore compels those in receipt of funding to evidence such delivery through annual reports [ 37 ]. This demand puts PPI workers under increasing pressure, since their function is to essentially present PPI objectives as not only achievable but already achieved, thus essentially bracketing out the process of organizational transformation which is a necessary prerequisite to establishing enduring partnerships with patients and the public. This bracketing out is at work in the organizational expectation to “just get on with it”, which structures the labour of delivering PPI in NIHR-funded research. Here, the demand to just get on, to do the work one is paid to do, forecloses the possibility of engaging with the structural obstacles that militate against that work being done. To the extent that both role designation and reporting expectations function to conceal the disjuncture that the establishment of public partnerships represents for regimes of performance management, they generate new invisibilities for PPI workers. These invisibilities radically constrain how such labour can be adequately undertaken, recognized and resourced.

In suggesting that much of the labour of staff in public involvement roles is institutionally invisible, and that organizational structures may obstruct or block their efforts, we concur with the arguments made by Watermeyer, Mathie and colleagues about the position of staff in public engagement and public involvement roles, respectively. However, our account diverges from theirs in our interpretation of how and why this labour is experienced as invisible and how that invisibility could be remedied. Mathie and colleagues in particular attribute this invisibility to a lack of parity and an institutional devaluation of what are perceived as “soft skills” – facilitation and relationship building in particular [ 50 ]. They therefore seek to raise PPI work to visibility by emphasizing the complexity of PPI activities and by calling for a ring-fencing of resources and a development of infrastructures capable of sustaining such work. While we concur that the invisibility of PPI labour is connected to its devaluation within research institutions, we also suggest that, in addition, this invisibility is a symptom of a radical misalignment between regimes of performance management and the establishment of sustainable public partnerships. Establishing such partnerships requires, as a number of researchers have demonstrated [ 18 , 59 , 60 ], considerable institutional transformation, yet those tasked with delivering PPI are not only not in a position to effect such transformation, they are also compelled to conceal its absence.

Recognizing and addressing the misalignment between regimes of performance management and the establishment of sustainable public partnerships becomes particularly pressing given the increasing recognition, in many countries, that public participation in health research and intervention development is an important step to effectively identifying and addressing health inequalities [ 19 , 61 , 62 ]. Calls for widening participation, for the inclusion of under-served populations and for co-designing and co-producing health research, which have been gathering force in the last 20 years, have gained renewed urgency in the wake of the coronavirus disease 2019 (COVID-19) pandemic [ 63 , 64 , 65 , 66 , 67 ]. In the United Kingdom, Best Research for Best Health: The Next Chapter, published by the NIHR in 2021 to define the direction and priorities for NHS research for the coming decade, exemplifies this urgency. The document asserts that a radical broadening of the scope of PPI (now renamed “public partnerships”) is essential for combatting health inequalities: it explicitly amplifies the ambitions of its 2006 predecessor by setting up as a key objective “close and equitable partnerships with communities and groups, including those who have previously not had a voice in research” [ 68 ]. Here, as in other comparable policy documents, the emphasis on extending partnerships to so-called underserved communities rests on the assumption that, to some degree at least, PPI has already become the norm for undertaking research. This assumption, we argue, closes down in advance any engagement with the tensions we have been discussing in this paper, and in so doing risks exacerbating them. The document does recognize that for such inclusive partnerships to be established institutions must “work differently, taking research closer to people [..] and building relationships of trust over time” – though, we would suggest, it is far from clear how ready or able institutions really are to take on what working differently might mean.

Our study engages with and emphasizes this need to “work differently” while also arguing that the demands and expectations set up through regimes of performance management and their “logic of deliverables” are not favourable to an opening of a space in which “working differently” could be explored. In health research systems organized through these regimes, “working differently” is constrained by the application of the very templates, instruments and techniques which constitute and manage “business as usual”. Any ongoing effort to transform health research systems so as better to respond to growing health inequalities, our study implies, needs to combat, both materially and procedurally, the ease with which the disjuncture between embedding public partnerships and normative ways of undertaking research comes to disappear.

Limitations

We focus on the labour of the PPI workforce and their negotiation of performance management regimes, which means that we have not discussed relationships between PPI staff and public contributors nor presented examples of good practice. While these are important domains for study if we are to understand the labour of the PPI workforce, they lie outside the scope of this article. Furthermore, our focus on the UK health research system means that our conclusions may have limited generalizability. However, both the consolidation of NPM principles in public sector institutions and the turn to public and patient participation in the design and delivery of health research are shared developments across countries in the Global North in the last 40 years. Therefore, the tensions we discuss are likely to also manifest in health systems outside the United Kingdom, even as they may take somewhat different forms, given differences in how research and grants are costed, and roles structured. Finally, this project has elements of “insider” research since both authors, while working primarily as researchers, have also had experience of embedding PPI in research studies and programmes. Insider research has specific strengths, which include familiarity with the field and a sense of shared identity with participants which may enhance trust, facilitate disclosure and generate rich data. In common with other insider research endeavours, we have sought to reflexively navigate risks of bias and of interpretative blind spots resulting from over-familiarity with the domain under research [ 69 ] by discussing our findings and interpretations with “non-insider” colleagues while writing up this research.

Our qualitative study is one of the first to investigate how the UK PPI workforce is negotiating the current health research landscape. In doing so, we have focused on the UK’s NIHR since this institution embodied the redirection of performance management regimes towards public benefit by means of public participation. If PPI is set up as both the means of enabling this redirection and an outcome of its success, then the PPI workforce, the professional cadre evolving to support PPI, becomes, we argue, the site where the tensions of attempting this alignment are most keenly experienced.

We suggest that, while such alignment would demand a wholesale transformation of organizational norms, the regimes of performance management underpinning research ecologies may also work to foreclose such transformation, thus hollowing out the promise of patient-centred research policies and systems. Recognizing and attending to this foreclosure is urgent, especially given the current policy emphasis in many countries on broadening the scope, ambition and inclusivity of public participation as a means of increasing the reach, relevance and potential positive impact of health research.

Availability of data and materials

The data that support the findings of this study are available on request from the corresponding author.

Shippee ND, Domecq Garces JP, Prutsky Lopez GJ, Wang Z, Elraiyah TA, Nabhan M, et al. Patient and service user engagement in research: a systematic review and synthesized framework. Health Expect. 2015;18(5):1151–66.


Domecq JP, Prutsky G, Elraiyah T, Wang Z, Nabhan M, Shippee N, et al. Patient engagement in research: a systematic review. BMC Health Serv Res. 2014;14:89.


Crocker J, Hughes-Morley A, Petit-Zeman S, Rees S. Assessing the impact of patient and public involvement on recruitment and retention in clinical trials: a systematic review. 3rd International Clinical Trials Methodology Conference. 2015;16(S2):O91.

Staniszewska S, Herron-Marx S, Mockford C. Measuring the impact of patient and public involvement: the need for an evidence base. Int J Qual Health Care. 2008;20(6):373–4.

Brett J, Staniszewska S, Mockford C, Herron-Marx S, Hughes J, Tysall C, et al. Mapping the impact of patient and public involvement on health and social care research: a systematic review. Health Expect. 2014;17(5):637–50.

Staniszewska S, Adebajo A, Barber R, Beresford P, Brady LM, Brett J, et al. Developing the evidence base of patient and public involvement in health and social care research: the case for measuring impact. Int J Consum Stud. 2011;35(6):628–32.

Staley K. ‘Is it worth doing?’ Measuring the impact of patient and public involvement in research. Res Involv Engagem. 2015;1(1):6.


Brady L, Preston J. How do we know what works? Evaluating data on the extent and impact of young people’s involvement in English health research. Res All. 2020;4(2):194–206.

Daly J. Evidence based medicine and the search for a science of clinical care. Oakland: University of California Press; 2005.


Ward PR, Thompson J, Barber R, Armitage CJ, Boote JD, Cooper CL, et al. Critical perspectives on ‘consumer involvement’ in health research epistemological dissonance and the know-do gap. J Sociol. 2010;46(1):63–82.

Rose D, Kalathil J. Power, privilege and knowledge: the untenable promise of co-production in mental “health.” Front Sociol. 2019;4:57.


Beresford P. PPI or user involvement: taking stock from a service user perspective in the twenty first century. Res Involv Engagem. 2020;6:36.

McKevitt C. Experience, knowledge and evidence: a comparison of research relations in health and anthropology. Evid Policy. 2013;9(1):113–30.

Boaz A, Biri D, McKevitt C. Rethinking the relationship between science and society: has there been a shift in attitudes to Patient and Public Involvement and Public Engagement in Science in the United Kingdom? Health Expect. 2016;19(3):592–601.

Green G. Power to the people: to what extent has public involvement in applied health research achieved this? Res Involv Engagem. 2016;2(1):28.

Miller FA, Patton SJ, Dobrow M, Berta W. Public involvement in health research systems: a governance framework. Health Res Policy Syst. 2018;16(1):79.

Madden M, Speed E. Beware zombies and unicorns: toward critical patient and public involvement in health research in a neoliberal context. Front Sociol. 2017;2(7):1–6.

Papoulias S, Callard F. Material and epistemic precarity: it’s time to talk about labour exploitation in mental health research. Soc Sci Med. 2022;306:115102.

Lignou S, Sheehan M, Singh I. ‘A commitment to equality, diversity and inclusion’: a conceptual framework for equality of opportunity in patient and public involvement in research. Res Ethics. 2024;20(2):288–303.

Dorey P. The legacy of Thatcherism—public sector reform. Obs Soc Br. 2015;17:33–60.

National Health Service and Community Care Act. 1990.

Ferlie E, Ashburner L, Fitzgerald L, Pettigrew A. The new public management in action. Oxford: Oxford University Press; 1996.

Lapuente V, Van de Walle S. The effects of new public management on the quality of public services. Governance. 2020;33(3):461–75.

Atkinson P, Sheard S, Walley T. ‘All the stars were aligned’? The origins of England’s National Institute for Health Research. Health Res Policy Syst. 2019;17(1):95.

Weir S, Beetham D. Political power and democratic control in Britain: the democratic audit of the United Kingdom. London: Psychology Press; 1999.

Sullivan HC, Skelcher C. Working across boundaries. 1st ed. Houndmills: Palgrave; 2002.

Department of Health. The new NHS: modern, dependable. London: The Stationery Office; 1997.

Cutler T, Waine B. Managerialism reformed? New labour and public sector management. Soc Policy Adm. 2000;34(3):318–32.

Speed E. Applying soft bureaucracy to rhetorics of choice: UK NHS 1983–2007. In: Clegg SR, Harris M, Hopfl H, editors. Managing modernity: the end of bureaucracy? Oxford: Oxford University Press; 2011.

Dalingwater L. Post-new public management (NPM) and the reconfiguration of health services in England. Obs Soc Br. 2014;1(16):51–64.

Bennett C, McGivern G, Ferlie E, Dopson S, Fitzgerald L. Making wicked problems governable? The case of managed networks in health care. 1st ed. Oxford: Oxford University Press; 2013.

Hanney S, Kuruvilla S, Soper B, Mays N. Who needs what from a national health research system: lessons from reforms to the English Department of Health’s R&D system. Health Res Policy Syst. 2010;8:11.

Evans TW. Best research for best health: a new national health research strategy. Clin Med. 2006;6(5):435–7.

DeNegri S, Evans D, Palm M, Staniszewska S. The history of INVOLVE—a witness seminar. 2024. https://intppinetwork.wixsite.com/ippin/post/history-of-involve. Accessed Apr 17 2024.

Evans D, Pollard KC. Theorising service user involvement from a researcher perspective. In: Staddon P, editor. Mental health service users in research United States. Bristol: Policy Press; 2013. p. 39.

Papoulias SC, Callard F. ‘A limpet on a ship’: spatio-temporal dynamics of patient and public involvement in research. Health Expect. 2021;24(3):810–8.

Komporozos-Athanasiou A, Paylor J, McKevitt C. Governing researchers through public involvement. J Soc Policy. 2022;51(2):268–83.

Paylor J, McKevitt C. The possibilities and limits of “co-producing” research. Front Sociol. 2019;4:23.

Boylan AM, Locock L, Thomson R, Staniszewska S. “About sixty per cent I want to do it”: health researchers’ attitudes to, and experiences of, patient and public involvement (PPI)—a qualitative interview study. Health Expect. 2019. https://doi.org/10.1111/hex.12883 .

DeNegri S. Going the extra mile: improving the nation’s health and wellbeing through public involvement in research. 2015.

Crowe S, Wray P, Lodemore M. NIHR public involvement leads’ meeting November 25 2016. 2017.

NIHR. Taking stock—NIHR public involvement and engagement. 2019. https://www.nihr.ac.uk/documents/taking-stock-nihr-public-involvement-and-engagement/20566 . Accessed Apr 28 2023.

NIHR. Definition and role of the designated PPI (Patient and Public Involvement) lead in a research team. 2020. https://www.nihr.ac.uk/documents/definition-and-role-of-the-designated-ppi-patient-and-public-involvement-lead-in-a-research-team/23441 . Accessed Apr 28 2023.

Li KK, Abelson J, Giacomini M, Contandriopoulos D. Conceptualizing the use of public involvement in health policy decision-making. Soc Sci Med. 2015;138:14–21.

Staley K, Barron D. Learning as an outcome of involvement in research: what are the implications for practice, reporting and evaluation? Res Involv Engagem. 2019;5(1):14.

Brady L, Miller J, McFarlane-Rose E, Noor J, Noor R, Dahlmann-Noor A. “We know that our voices are valued, and that people are actually going to listen”: co-producing an evaluation of a young people’s research advisory group. Res Involv Engagem. 2023;9(1):1–15.

Knowles S, Sharma V, Fortune S, Wadman R, Churchill R, Hetrick S. Adapting a codesign process with young people to prioritize outcomes for a systematic review of interventions to prevent self-harm and suicide. Health Expect. 2022;25(4):1393–404.

Mathie E, Wythe H, Munday D, Millac P, Rhodes G, Roberts N, et al. Reciprocal relationships and the importance of feedback in patient and public involvement: a mixed methods study. Health Expect. 2018;21(5):899–908.

Wilson P, Mathie E, Keenan J, McNeilly E, Goodman C, Howe A, et al. ReseArch with Patient and Public invOlvement: a RealisT evaluation—the RAPPORT study. Health Serv Deliv Res. 2015;3(38):1–176.


Mathie E, Smeeton N, Munday D, Rhodes G, Wythe H, Jones J. The role of patient and public involvement leads in facilitating feedback: “invisible work.” Res Involv Engagem. 2020;6(1):40.

Watermeyer R, Rowe G. Public engagement professionals in a prestige economy: ghosts in the machine. Stud High Educ. 2022;47(7):1297–310.

Watermeyer R, Lewis J. Institutionalizing public engagement through research in UK universities: perceptions, predictions and paradoxes concerning the state of the art. Stud High Educ. 2018;43(9):1612–24.

Collinson JA. ‘Get yourself some nice, neat, matching box files!’ research administrators and occupational identity work. Stud High Educ. 2007;32(3):295–309.

Braun V, Clarke V. Reflecting on reflexive thematic analysis. Qual Res Sport Exerc Health. 2019;11(4):589–97.

Clarke V, Braun V. Thematic analysis: a practical guide. London: SAGE; 2021.

Clarke V, Braun V. Successful qualitative research. London: SAGE; 2013.

Lincoln YS, Guba EG. Naturalistic inquiry. 3rd ed. Beverly Hills: Sage Publications; 1985.

Kaiser K. Protecting respondent confidentiality in qualitative research. Qual Health Res. 2009;19(11):1632–41.

Heney V, Poleykett B. The impossibility of engaged research: complicity and accountability between researchers, ‘publics’ and institutions. Sociol Health Illn. 2022;44(S1):179–94.

MacKinnon KR, Guta A, Voronka J, Pilling M, Williams CC, Strike C, et al. The political economy of peer research: mapping the possibilities and precarities of paying people for lived experience. Br J Soc Work. 2021;51(3):888–906.

Bibbins-Domingo K, Helman A, Dzau VJ. The imperative for diversity and inclusion in clinical trials and health research participation. JAMA. 2022;327(23):2283–4.

Washington V, Franklin JB, Huang ES, Mega JL, Abernethy AP. Diversity, equity, and inclusion in clinical research: a path toward precision health for everyone. Clin Pharmacol Ther. 2023;113(3):575–84.

Graham ID, McCutcheon C, Kothari A. Exploring the frontiers of research co-production: the Integrated Knowledge Translation Research Network concept papers. Health Res Policy Syst. 2019;17(1):88.

Marten R, El-Jardali F, Hafeez A, Hanefeld J, Leung GM, Ghaffar A. Co-producing the covid-19 response in Germany, Hong Kong, Lebanon, and Pakistan. BMJ. 2021;372: n243.

Smith H, Budworth L, Grindey C, Hague I, Hamer N, Kislov R, et al. Co-production practice and future research priorities in United Kingdom-funded applied health research: a scoping review. Health Res Policy Syst. 2022;20(1):36.

World Health Organization. Health inequity and the effects of COVID-19: assessing, responding to and mitigating the socioeconomic impact on health to build a better future. Copenhagen: WHO Regional Office for Europe; 2020.

Dunston R, Lee A, Boud D, Brodie P, Chiarella M. Co-production and health system reform—from re-imagining to re-making. Aust J Public Adm. 2009;68:39–52.

Department of Health and Social Care. Best research for best health: the next chapter. London: Department of Health and Social Care; 2021.

Wilkinson S, Kitzinger C. Representing our own experience: issues in “insider” research. Psychol Women Q. 2013;37:251–5.


Acknowledgements

S.P. presented earlier versions of this paper at the 8th annual conference of the Centre for Public Engagement Kingston University, December 2021; at the Medical Sociology conference of the British Sociological Association, September 2022; and at the annual Health Services Research UK Conference, July 2023. They are grateful to the audiences of these presentations for their helpful comments. Both authors are also grateful to the generous participants and to the NIHR Applied Research Collaboration Public Involvement Community for their sustaining support and encouragement during this time. S.P. also wishes to thank Felicity Callard for her comments, advice and suggestions throughout this process: this paper would not have been completed without her.

Funding

S.P. is supported by the National Institute for Health and Care Research (NIHR) Applied Research Collaboration (ARC) South London at King’s College Hospital NHS Foundation Trust. The views expressed are those of the author and not necessarily those of the NHS, the NIHR or the Department of Health and Social Care.

Author information

Authors and affiliations

Health Service & Population Research, King’s College London, London, United Kingdom

Stan Papoulias

Centre for Public Health and Community Care, University of Hertfordshire, Hatfield, United Kingdom

Louca-Mai Brady


Contributions

S.P. developed the original idea for this article through earlier collaborations with L.M.B. whose long-term experience as a PPI practitioner has been central to both the project and the article. L.M.B. contributed to conceptualization, wrote the first draft of the background and undertook revisions after the first draft including reconceptualization of results. S.P. contributed to conceptualization, undertook data analysis, wrote the first draft of findings and discussion and revised the first draft in its entirety in consultation with L.M.B. Both authors read and approved the final manuscript.

Corresponding author

Correspondence to Stan Papoulias .

Ethics declarations

Ethics approval and consent to participate

The study received a favourable opinion from the Psychiatry, Nursing and Midwifery Research Ethics Panel, King’s College London (ref no.: LRS-20/21-21466).

Consent for publication

Not applicable.

Competing interests

The authors declare no competing interests.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ . The Creative Commons Public Domain Dedication waiver ( http://creativecommons.org/publicdomain/zero/1.0/ ) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Papoulias, S., Brady, LM. “I am there just to get on with it”: a qualitative study on the labour of the patient and public involvement workforce. Health Res Policy Sys 22, 118 (2024). https://doi.org/10.1186/s12961-024-01197-5


Received : 17 July 2023

Accepted : 26 July 2024

Published : 02 September 2024

DOI : https://doi.org/10.1186/s12961-024-01197-5

Keywords

  • Patient and public involvement
  • PPI workforce
  • New public management
  • National Institute for Health and Care Research (NIHR)

Health Research Policy and Systems

ISSN: 1478-4505



Qualitative and Quantitative Research Methodologies (QQRM) - Online Certificate Course

Research is a core area in development and humanitarian programming. Activities routinely undertaken by programs to promote evidence-based planning include assessments, surveys and evaluations, all of which employ qualitative and quantitative research methodologies. Qualitative research aims to generate an in-depth understanding of a specific program activity or event, rather than a surface description of a large sample of a population. Quantitative research, on the other hand, focuses on gathering, analyzing and presenting numerical data and generalizing findings across groups of people to explain a particular phenomenon.
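The contrast between the two approaches can be sketched in a few lines of Python. The interview excerpts, theme labels and survey scores below are invented for illustration only; a real content analysis would involve far more data and analyst judgement:

```python
from collections import Counter
from statistics import mean

# Hypothetical interview excerpts, each tagged with analyst-assigned theme codes
# (qualitative data: rich text, coded for meaning).
coded_excerpts = [
    ("I finally felt heard by the clinic staff", ["trust", "communication"]),
    ("Nobody explained the side effects to me", ["communication", "information gaps"]),
    ("The support group kept me going", ["peer support"]),
    ("I still don't know who to call with questions", ["information gaps"]),
]

# Qualitative angle: which themes recur across the interview data?
theme_counts = Counter(code for _, codes in coded_excerpts for code in codes)

# Quantitative angle: summarize numeric survey responses
# (hypothetical 1-5 satisfaction scores from a larger sample).
survey_scores = [4, 2, 5, 3, 4]
average_satisfaction = mean(survey_scores)

print(theme_counts.most_common(2))  # the most frequently recurring themes
print(average_satisfaction)         # a single generalizable summary statistic
```

The point of the sketch is the difference in what counts as a finding: the qualitative output is a set of recurring themes grounded in participants' own words, while the quantitative output is a numerical summary intended to generalize across respondents.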

Given the great significance of research in development and humanitarian work, IDEAL Public Health and Development Consultancy (IPHDC) has planned a training on Qualitative and Quantitative Research Methodologies (QQRM). The training aims to equip participants with current knowledge, skills and best practices in research to improve the quality of their overall programming.

When is the training?

30th September to 4th October 2024

Who should attend this training?

UN, Government and NGO staff including but not limited to program coordinators, project managers and officers.

What are the key aspects of the training?

  • Introduction to Qualitative and Quantitative Research Methodologies (QQRM);
  • Hypothesis setting;
  • Research study design including sampling methods;
  • Questionnaire development;
  • Interviewing techniques;
  • Observation techniques and tools;
  • Participatory research techniques;
  • Focus group discussion (FGD) techniques and tools;
  • Key informant interview (KII) techniques and tools;
  • Note-taking and coding;
  • Qualitative and quantitative data analysis (using SPSS, NVivo and MS Excel);
  • Presentation of findings.

What is the main training objective?

This course aims to build participants' knowledge of, and competencies in, qualitative and quantitative research methodologies.

What learning approach and language will be used in the training?

The training delivery method includes interactive webinar sessions, PowerPoint presentations, articles, videos, quizzes and assessments.

The entire training will be facilitated in English.

Fee information

How to register

Interested individuals should complete our Course Application Form before Sunday 29th September 2024.

For more information on our courses, you can visit our training page.

