Chapter 11. Interviewing

Introduction

Interviewing people is at the heart of qualitative research. It is not merely a way to collect data but an intrinsically rewarding activity—an interaction between two people that holds the potential for greater understanding and interpersonal development. Unlike many of our daily interactions with others that are fairly shallow and mundane, sitting down with a person for an hour or two and really listening to what they have to say is a profound and deep enterprise, one that can provide not only “data” for you, the interviewer, but also self-understanding and a feeling of being heard for the interviewee. I always approach interviewing with a deep appreciation for the opportunity it gives me to understand how other people experience the world. That said, there is not one kind of interview but many, and some of these are shallower than others. This chapter will provide you with an overview of interview techniques but with a special focus on the in-depth semistructured interview guide approach, which is the approach most widely used in social science research.

An interview can be variously defined as “a conversation with a purpose” (Lune and Berg 2018) and an attempt to understand the world from the point of view of the person being interviewed: “to unfold the meaning of peoples’ experiences, to uncover their lived world prior to scientific explanations” (Kvale 2007). It is a form of active listening in which the interviewer steers the conversation to subjects and topics of interest to their research but also manages to leave enough space for those interviewed to say surprising things. Achieving that balance is a tricky thing, which is why most practitioners believe interviewing is both an art and a science. In my experience as a teacher, there are some students who are “natural” interviewers (often they are introverts), but anyone can learn to conduct interviews, and everyone, even those of us who have been doing this for years, can improve their interviewing skills. This might be a good time to highlight the fact that the interview is a joint product of interviewer and interviewee and that this product is only as good as the rapport established between the two participants. Active listening is the key to establishing this necessary rapport.

Patton (2002) makes the argument that we use interviews because there are certain things that are not observable. In particular, “we cannot observe feelings, thoughts, and intentions. We cannot observe behaviors that took place at some previous point in time. We cannot observe situations that preclude the presence of an observer. We cannot observe how people have organized the world and the meanings they attach to what goes on in the world. We have to ask people questions about those things” (341).

Types of Interviews

There are several distinct types of interviews. Imagine a continuum (figure 11.1). On one side are unstructured conversations—the kind you have with your friends. No one is in control of those conversations, and what you talk about is often random—whatever pops into your head. There is no secret, underlying purpose to your talking—if anything, the purpose is to talk to and engage with each other, and the words you use and the things you talk about are a little beside the point. An unstructured interview is a little like this informal conversation, except that one of the parties to the conversation (you, the researcher) does have an underlying purpose, and that is to understand the other person. You are not friends speaking for no purpose, but it might feel just as unstructured to the “interviewee” in this scenario. That is one side of the continuum. On the other side are fully structured and standardized survey-type questions asked face-to-face. Here it is very clear who is asking the questions and who is answering them. This doesn’t feel like a conversation at all! A lot of people new to interviewing have this (erroneously!) in mind when they think about interviews as data collection. Somewhere in the middle of these two extreme cases is the “semistructured” interview, in which the researcher uses an “interview guide” to gently move the conversation to certain topics and issues. This is the primary form of interviewing for qualitative social scientists and will be what I refer to as interviewing for the rest of this chapter, unless otherwise specified.

Figure 11.1. Types of interviews: unstructured conversations, semistructured interview, structured interview, survey questions.

Informal (unstructured conversations). This is the most “open-ended” approach to interviewing. It is particularly useful in conjunction with observational methods (see chapters 13 and 14). There are no predetermined questions. Each interview will be different. Imagine you are researching the Oregon Country Fair, an annual event in Veneta, Oregon, that includes live music, artisan craft booths, face painting, and a lot of people walking through forest paths. It’s unlikely that you will be able to get a person to sit down with you and talk intensely about a set of questions for an hour and a half. But you might be able to sidle up to several people and engage with them about their experiences at the fair. You might have a general interest in what attracts people to these events, so you could start a conversation by asking strangers why they are here or why they come back every year. That’s it. Then you have a conversation that may lead you anywhere. Maybe one person tells a long story about how their parents brought them here when they were a kid. A second person talks about how this is better than Burning Man. A third person shares their favorite traveling band. And yet another enthuses about the public library in the woods. During your conversations, you also talk about a lot of other things—the weather, the utilikilts for sale, the fact that a favorite food booth has disappeared. It’s all good. You may not be able to record these conversations. Instead, you might jot down notes on the spot and then, when you have the time, write down as much as you can remember about the conversations in long fieldnotes. Later, you will have to sit down with these fieldnotes and try to make sense of all the information (see chapters 18 and 19).

Interview guide (semistructured interview). This is the primary type employed by social science qualitative researchers. The researcher creates an “interview guide” in advance, which she uses in every interview. In theory, every person interviewed is asked the same questions. In practice, every person interviewed is asked about mostly the same topics but not always the same questions, as the whole point of a “guide” is that it guides the direction of the conversation but does not command it. The guide is typically between five and ten questions or question areas, sometimes with suggested follow-ups or prompts. For example, one question might be “What was it like growing up in Eastern Oregon?” with prompts such as “Did you live in a rural area? What kind of high school did you attend?” to help the conversation develop. These interviews generally take place in a quiet place (not a busy walkway during a festival) and are recorded. The recordings are transcribed, and those transcriptions then become the “data” that is analyzed (see chapters 18 and 19). The conventional length of one of these types of interviews is between one hour and two hours, optimally ninety minutes. Less than one hour doesn’t allow for much development of questions and thoughts, and two hours (or more) is a lot of time to ask someone to sit still and answer questions. If you have a lot of ground to cover, and the person is willing, I highly recommend two separate interview sessions, with the second session being slightly shorter than the first (e.g., ninety minutes the first day, sixty minutes the second). There are lots of good reasons for this, but the most compelling one is that this allows you to listen to the first day’s recording and catch anything interesting you might have missed in the moment and so develop follow-up questions that can probe further. This also allows the person being interviewed to have some time to think about the issues raised in the interview and go a little deeper with their answers.

Standardized questionnaire with open responses (structured interview). This is the type of interview a lot of people have in mind when they hear “interview”: a researcher comes to your door with a clipboard and proceeds to ask you a series of questions. These questions are the same for whoever answers the door; they are “standardized.” Both the wording and the exact order are important, as people’s responses may vary depending on how and when a question is asked. These are qualitative only in that the questions allow for “open-ended responses”: people can say whatever they want rather than select from a predetermined menu of responses. For example, a survey I collaborated on included this open-ended response question: “How does class affect one’s career success in sociology?” Some of the answers were simply one word long (e.g., “debt”), and others were long statements with stories and personal anecdotes. It is possible to be surprised by the responses. Although it’s a stretch to call this kind of questioning a conversation, it does allow the person answering the question some degree of freedom in how they answer.

Survey questionnaire with closed responses (not an interview!). Standardized survey questions with specific answer options (i.e., closed responses) are not really interviews at all, and they do not generate qualitative data. For example, if we included five options for the question “How does class affect one’s career success in sociology?”—(1) debt, (2) social networks, (3) alienation, (4) family doesn’t understand, (5) type of grad program—we leave no room for surprises at all. Instead, we would most likely look at patterns around these responses, thinking quantitatively rather than qualitatively (e.g., using regression analysis techniques, we might find that working-class sociologists were twice as likely to bring up alienation). It can sometimes be confusing for new students because the very same survey can include both closed-ended and open-ended questions. The key is to think about how these will be analyzed and to what degree surprises are possible. If your plan is to turn all responses into a number and make predictions about correlations and relationships, you are no longer conducting qualitative research. This is true even if you are conducting this survey face-to-face with a real live human. Closed-response questions are not conversations of any kind, purposeful or not.

In summary, the semistructured interview guide approach is the predominant form of interviewing for social science qualitative researchers because it allows a high degree of freedom of responses from those interviewed (thus allowing for novel discoveries) while still maintaining some connection to a research question area or topic of interest. The rest of the chapter assumes the employment of this form.

Creating an Interview Guide

Your interview guide is the instrument used to bridge your research question(s) and what the people you are interviewing want to tell you. Unlike a standardized questionnaire, the questions actually asked do not need to be exactly what you have written down in your guide. The guide is meant to create space for those you are interviewing to talk about the phenomenon of interest, but sometimes you are not even sure what that phenomenon is until you start asking questions. A priority in creating an interview guide is to ensure it offers space. One of the worst mistakes is to create questions that are so specific that the person answering them will not stray. Relatedly, questions that sound “academic” will shut down a lot of respondents. A good interview guide invites respondents to talk about what is important to them rather than making them feel they are performing for or being evaluated by you.

Good interview questions should not sound like your “research question” at all. For example, let’s say your research question is “How do patriarchal assumptions influence men’s understanding of climate change and responses to climate change?” It would be worse than unhelpful to ask a respondent, “How do your assumptions about the role of men affect your understanding of climate change?” You need to unpack this into manageable nuggets that pull your respondent into the area of interest without leading him anywhere. You could start by asking him what he thinks about climate change in general. Or, even better, whether he has any concerns about heatwaves or increased tornadoes or polar icecaps melting. Once he starts talking about that, you can ask follow-up questions that bring in issues around gendered roles, perhaps asking if he is married (to a woman) and whether his wife shares his thoughts and, if not, how they negotiate that difference. The fact is, you won’t really know the right questions to ask until he starts talking.

There are several distinct types of questions that can be used in your interview guide, either as main questions or as follow-up probes. If you remember that the point is to leave space for the respondent, you will craft a much more effective interview guide! You will also want to think about the place of time in both the questions themselves (past, present, future orientations) and the sequencing of the questions.

Researcher Note

Suggestion: As you read the next three sections (types of questions, temporality, question sequence), have in mind a particular research question, and try to draft questions and sequence them in a way that opens space for a discussion that helps you answer your research question.

Type of Questions

Experience and behavior questions ask about what a respondent does regularly (their behavior) or has done (their experience). These are relatively easy questions for people to answer because they appear more “factual” and less subjective. This makes them good opening questions. For the study on climate change above, you might ask, “Have you ever experienced an unusual weather event? What happened?” Or “You said you work outside? What is a typical summer workday like for you? How do you protect yourself from the heat?”

Opinion and values questions , in contrast, get inside the minds of those you are interviewing. “Do you think climate change is real? Who or what is responsible for it?” are two such questions. Note that you don’t have to literally ask, “What is your opinion of X?” but you can find a way to ask the specific question relevant to the conversation you are having. These questions are a bit trickier to ask because the answers you get may depend in part on how your respondent perceives you and whether they want to please you or not. We’ve talked a fair amount about being reflective. Here is another place where this comes into play. You need to be aware of the effect your presence might have on the answers you are receiving and adjust accordingly. If you are a woman who is perceived as liberal asking a man who identifies as conservative about climate change, there is a lot of subtext that can be going on in the interview. There is no one right way to resolve this, but you must at least be aware of it.

Feeling questions ask respondents to draw on their emotional responses. It’s pretty common for academic researchers to forget that we have bodies and emotions, but people’s understandings of the world often operate at this affective level, sometimes unconsciously or barely consciously. It is a good idea to include questions that leave space for respondents to remember, imagine, or relive emotional responses to particular phenomena. “What was it like when you heard your cousin’s house burned down in that wildfire?” doesn’t explicitly use any emotion words, but it allows your respondent to remember what was probably a pretty emotional day. And if they respond in an emotionally neutral way, that is pretty interesting data too. Note that asking someone “How do you feel about X?” is not always going to evoke an emotional response, as they might simply turn around and respond with “I think that…” It is better to craft a question that actually pushes the respondent into the affective category. This might be a specific follow-up to an experience and behavior question—for example, “You just told me about your daily routine during the summer heat. Do you worry it is going to get worse?” or “Have you ever been afraid it will be too hot to get your work accomplished?”

Knowledge questions ask respondents what they actually know about something factual. We have to be careful when we ask these types of questions so that respondents do not feel like we are evaluating them (which would shut them down), but, for example, it is helpful to know when you are having a conversation about climate change that your respondent does in fact know that unusual weather events have increased and that these have been attributed to climate change! Asking these questions can set the stage for deeper questions and can ensure that the conversation makes the same kind of sense to both participants. For example, a conversation about political polarization can be put back on track once you realize that the respondent doesn’t really have a clear understanding that there are two parties in the US. Instead of asking a series of questions about Republicans and Democrats, you might shift your questions to talk more generally about political disagreements (e.g., “people against abortion”). And sometimes what you do want to know is the level of knowledge about a particular program or event (e.g., “Are you aware you can discharge your student loans through the Public Service Loan Forgiveness program?”).

Sensory questions call on all senses of the respondent to capture deeper responses. These are particularly helpful in sparking memory. “Think back to your childhood in Eastern Oregon. Describe the smells, the sounds…” Or you could use these questions to help a person access the full experience of a setting they customarily inhabit: “When you walk through the doors to your office building, what do you see? Hear? Smell?” As with feeling questions, these questions often supplement experience and behavior questions. They are another way of allowing your respondent to report fully and deeply rather than remain on the surface.

Creative questions employ illustrative examples, suggested scenarios, or simulations to get respondents to think more deeply about an issue, topic, or experience. There are many options here. In The Trouble with Passion, Erin Cech (2021) provides a scenario in which “Joe” is trying to decide whether to stay at his decent but boring computer job or follow his passion by opening a restaurant. She asks respondents, “What should Joe do?” Their answers illuminate the attraction of “passion” in job selection. In my own work, I have used a news story about an upwardly mobile young man who no longer has time to see his mother and sisters to probe respondents’ feelings about the costs of social mobility. Jessi Streib and Betsy Leondar-Wright have used single-page cartoon “scenes” to elicit evaluations of potential racial discrimination, sexual harassment, and classism. Barbara Sutton (2010) has employed lists of words (“strong,” “mother,” “victim”) on notecards she fans out and asks her female respondents to select and discuss.

Background/Demographic Questions

You most definitely will want to know more about the person you are interviewing in terms of conventional demographic information, such as age, race, gender identity, occupation, and educational attainment. These are not questions that normally open up inquiry. [1] For this reason, my practice has been to include a separate “demographic questionnaire” sheet that I ask each respondent to fill out at the conclusion of the interview. Only include those aspects that are relevant to your study. For example, if you are not exploring religion or religious affiliation, do not include questions about a person’s religion on the demographic sheet. See the example provided at the end of this chapter.

Temporality

Any type of question can have a past, present, or future orientation. For example, if you are asking a behavior question about workplace routine, you might ask the respondent to talk about past work, present work, and ideal (future) work. Similarly, if you want to understand how people cope with natural disasters, you might ask your respondent how they felt then during the wildfire and now in retrospect and whether and to what extent they have concerns for future wildfire disasters. It’s a relatively simple suggestion—don’t forget to ask about past, present, and future—but it can have a big impact on the quality of the responses you receive.

Question Sequence

Having a list of good questions or good question areas is not enough to make a good interview guide. You will want to pay attention to the order in which you ask your questions. Even though any one respondent can derail this order (perhaps by jumping to answer a question you haven’t yet asked), a good advance plan is always helpful. When thinking about sequence, remember that your goal is to get your respondent to open up to you and to say things that might surprise you. To establish rapport, it is best to start with nonthreatening questions. Asking about the present is often the safest place to begin, followed by the past (they have to know you a little bit to get there), and lastly, the future (talking about hopes and fears requires the most rapport). To allow for surprises, it is best to move from very general questions to more particular questions only later in the interview. This ensures that respondents have the freedom to bring up the topics that are relevant to them rather than feel like they are constrained to answer you narrowly. For example, refrain from asking about particular emotions until these have come up previously—don’t lead with them. Often, your more particular questions will emerge only during the course of the interview, tailored to what is emerging in conversation.

Once you have a set of questions, read through them aloud and imagine you are being asked the same questions. Does the set of questions have a natural flow? Would you be willing to answer the very first question to a total stranger? Does your sequence establish facts and experiences before moving on to opinions and values? Did you include prefatory statements, where necessary; transitions; and other announcements? These can be as simple as “Hey, we talked a lot about your experiences as a barista while in college.… Now I am turning to something completely different: how you managed friendships in college.” That is an abrupt transition, but it has been softened by your acknowledgment of that.

Probes and Flexibility

Once you have the interview guide, you will also want to leave room for probes and follow-up questions. As in the sample probe included here, you can write out the obvious probes and follow-up questions in advance. You might not need them, as your respondent might anticipate them and include full responses to the original question. Or you might need to tailor them to how your respondent answered the question. Some common probes and follow-up questions include asking for more details (When did that happen? Who else was there?), asking for elaboration (Could you say more about that?), asking for clarification (Does that mean what I think it means or something else? I understand what you mean, but someone else reading the transcript might not), and asking for contrast or comparison (How did this experience compare with last year’s event?). “Probing is a skill that comes from knowing what to look for in the interview, listening carefully to what is being said and what is not said, and being sensitive to the feedback needs of the person being interviewed” (Patton 2002:374). It takes work! And energy. I and many other interviewers I know report feeling emotionally and even physically drained after conducting an interview. You are tasked with active listening and rearranging your interview guide as needed on the fly. If you only ask the questions written down in your interview guide with no deviations, you are doing it wrong. [2]

The Final Question

Every interview guide should include a very open-ended final question that allows for the respondent to say whatever it is they have been dying to tell you but you’ve forgotten to ask. About half the time they are tired too and will tell you they have nothing else to say. But incredibly, some of the most honest and complete responses take place here, at the end of a long interview. You have to realize that the person being interviewed is often discovering things about themselves as they talk to you and that this process of discovery can lead to new insights for them. Making space at the end is therefore crucial. Be sure you convey that you actually do want them to tell you more, so that the offer of “anything else?” is not read as an empty convention where the polite response is no. Here is where you can pull from that active listening and tailor the final question to the particular person. For example, “I’ve asked you a lot of questions about what it was like to live through that wildfire. I’m wondering if there is anything I’ve forgotten to ask, especially because I haven’t had that experience myself” is a much more inviting final question than “Great. Anything you want to add?” It’s also helpful to convey to the person that you have the time to listen to their full answer, even if you are at the end of the allotted time. After all, there are no more questions to ask, so the respondent knows exactly how much time is left. Do them the courtesy of listening to them!

Conducting the Interview

Once you have your interview guide, you are on your way to conducting your first interview. I always practice my interview guide with a friend or family member. I do this even when the questions don’t make perfect sense for them, as it still helps me realize which questions make no sense, are poorly worded (too academic), or don’t follow sequentially. I also practice the routine I will use for interviewing, which goes something like this:

  • Introduce myself and reintroduce the study
  • Provide consent form and ask them to sign and retain/return copy
  • Ask if they have any questions about the study before we begin
  • Ask if I can begin recording
  • Ask questions (from interview guide)
  • Turn off the recording device
  • Ask if they are willing to fill out my demographic questionnaire
  • Collect questionnaire and, without looking at the answers, place in same folder as signed consent form
  • Thank them and depart

A note on remote interviewing: Interviews have traditionally been conducted face-to-face in a private or quiet public setting. You don’t want a lot of background noise, as this will make transcriptions difficult. During the recent global pandemic, many interviewers, myself included, learned the benefits of interviewing remotely. Although face-to-face is still preferable for many reasons, Zoom interviewing is not a bad alternative, and it does allow more interviews across great distances. Zoom also includes automatic transcription, which significantly cuts down on the time it normally takes to convert our conversations into “data” to be analyzed. These automatic transcriptions are not perfect, however, and you will still need to listen to the recording and clarify and clean up the transcription. Nor do automatic transcriptions include notations of body language or change of tone, which you may want to include. When interviewing remotely, you will want to collect the consent form before you meet: ask them to read, sign, and return it as an email attachment. I think it is better to ask for the demographic questionnaire after the interview, but because some respondents may never return it then, it is probably best to ask for this at the same time as the consent form, in advance of the interview.
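
If you do use automatic transcripts, a small script can speed up the mechanical part of the cleanup by converting the raw caption file into a readable draft that you then correct while listening to the recording. The sketch below is only an illustration and is not part of the chapter's original advice: it assumes the transcript was exported in WebVTT caption format with cues written as "Speaker: text" (a common export format from video-meeting software), and the file names are hypothetical.

    # Hypothetical sketch: convert a WebVTT auto-transcript into a plain-text
    # draft with one speaker turn per paragraph, ready for manual correction.
    # File names and the "Speaker: text" cue format are assumptions.
    import re
    from pathlib import Path

    TIMESTAMP = re.compile(r"-->")

    def vtt_to_draft(vtt_path):
        turns = []  # list of [speaker, text] pairs
        for line in Path(vtt_path).read_text(encoding="utf-8").splitlines():
            line = line.strip()
            # Skip the header, blank lines, cue numbers, and timestamp lines.
            if not line or line == "WEBVTT" or line.isdigit() or TIMESTAMP.search(line):
                continue
            speaker, _, text = line.partition(": ")
            if not text:  # caption line without a "Speaker: " prefix
                speaker, text = "UNKNOWN", line
            if turns and turns[-1][0] == speaker:
                turns[-1][1] += " " + text  # merge consecutive cues from one speaker
            else:
                turns.append([speaker, text])
        return "\n\n".join(f"{s}: {t}" for s, t in turns)

    # Example (hypothetical file names):
    # Path("INT001_draft.txt").write_text(vtt_to_draft("INT001.vtt"), encoding="utf-8")

The script only saves the reformatting step; the careful listening, correction, and annotation of tone and body language described above still have to be done by you.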

What should you bring to the interview? I would recommend bringing two copies of the consent form (one for you and one for the respondent), a demographic questionnaire, a manila folder in which to place the signed consent form and filled-out demographic questionnaire, a printed copy of your interview guide (I print with three-inch right margins so I can jot down notes on the page next to relevant questions), a pen, a recording device, and water.

After the interview, you will want to secure the signed consent form in a locked filing cabinet (if in print) or a password-protected folder on your computer. Using Excel or a similar program that allows tables/spreadsheets, create an identifying number for your interview that links to the consent form without using the name of your respondent. For example, let’s say that I conduct interviews with US politicians, and the first person I meet with is George W. Bush. I will assign the transcription the number “INT#001” and add it to the signed consent form. [3] The signed consent form goes into a locked filing cabinet, and I never use the name “George W. Bush” again. I take the information from the demographic sheet, open my Excel spreadsheet, and add the relevant information in separate columns for the row INT#001: White, male, Republican. When I interview Bill Clinton as my second interview, I include a second row: INT#002: White, male, Democrat. And so on. The only link to the actual name of the respondent and this information is the fact that the consent form (unavailable to anyone but me) has stamped on it the interview number.
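
For those who prefer a script to a spreadsheet, the same record-keeping logic can be sketched in a few lines of Python. This is a hypothetical illustration rather than a prescribed tool: the file name, column labels, and demographic fields are assumptions chosen to match the example above, and, as above, the respondent's name is never written anywhere except on the signed consent form.

    # Hypothetical sketch: assign sequential interview IDs and log demographic
    # data keyed only by ID. The file name and fields are illustrative; no
    # respondent names are stored here, only on the signed consent form.
    import csv
    from pathlib import Path

    LOG = Path("interview_log.csv")  # ID-keyed demographic log; contains no names
    FIELDS = ["interview_id", "race", "gender", "party"]

    def next_interview_id():
        """Return the next sequential ID: INT#001, INT#002, and so on."""
        if not LOG.exists():
            return "INT#001"
        with LOG.open(newline="") as f:
            count = sum(1 for _ in csv.DictReader(f))  # rows already logged
        return "INT#{:03d}".format(count + 1)

    def add_interview(demographics):
        """Append one row of demographic data keyed only by the interview ID."""
        new_id = next_interview_id()
        write_header = not LOG.exists()
        with LOG.open("a", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=FIELDS)
            if write_header:
                writer.writeheader()
            writer.writerow({"interview_id": new_id, **demographics})
        return new_id  # stamp this ID on the signed consent form, stored separately

    # Example, following the chapter's illustration:
    print(add_interview({"race": "White", "gender": "male", "party": "Republican"}))

Whether you use Excel or a script, the design principle is the same: the only link between a name and the data is the interview number stamped on the consent form, which is stored separately and securely.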

Many students get very nervous before their first interview. Actually, many of us are always nervous before the interview! But do not worry—this is normal, and it does pass. Chances are, you will be pleasantly surprised at how comfortable it begins to feel. These “purposeful conversations” are often a delight for both participants. This is not to say that things never go wrong. I often have my students practice several “bad scenarios” (e.g., a respondent that you cannot get to open up; a respondent who is too talkative and dominates the conversation, steering it away from the topics you are interested in; emotions that completely take over; or shocking disclosures you are ill-prepared to handle), but most of the time, things go quite well. Be prepared for the unexpected, but know that the reason interviews are so popular as a technique of data collection is that they are usually richly rewarding for both participants.

One thing that I stress to my methods students and remind myself about is that interviews are still conversations between people. If there’s something you might feel uncomfortable asking someone about in a “normal” conversation, you will likely also feel a bit of discomfort asking it in an interview. Maybe more importantly, your respondent may feel uncomfortable. Social research—especially about inequality—can be uncomfortable. And it’s easy to slip into an abstract, intellectualized, or removed perspective as an interviewer. This is one reason trying out interview questions is important. Another is that sometimes the question sounds good in your head but doesn’t work as well out loud in practice. I learned this the hard way when a respondent asked me how I would answer the question I had just posed, and I realized that not only did I not really know how I would answer it, but I also wasn’t quite as sure I knew what I was asking as I had thought.

—Elizabeth M. Lee, Associate Professor of Sociology at Saint Joseph’s University, author of Class and Campus Life , and co-author of Geographies of Campus Inequality

How Many Interviews?

Your research design has included a targeted number of interviews and a recruitment plan (see chapter 5). Follow your plan, but remember that “saturation” is your goal. You interview as many people as you can until you reach a point at which you are no longer surprised by what they tell you. This means not that no one after your first twenty interviews will have surprising, interesting stories to tell you but rather that the picture you are forming about the phenomenon of interest to you from a research perspective has come into focus, and none of the interviews are substantially refocusing that picture. That is when you should stop collecting interviews. Note that to know when you have reached this point, you will need to read your transcripts as you go. More about this in chapters 18 and 19.

Your Final Product: The Ideal Interview Transcript

A good interview transcript will demonstrate a subtly controlled conversation by the skillful interviewer. In general, you want to see replies that are about one paragraph long, not short sentences and not running on for several pages. Although it is sometimes necessary to follow respondents down tangents, it is also often necessary to pull them back to the questions that form the basis of your research study. This is not really a free conversation, although it may feel like that to the person you are interviewing.

Final Tips from an Interview Master

Annette Lareau is arguably one of the masters of the trade. In Listening to People, she provides several guidelines for good interviews and then offers a detailed example of an interview gone wrong and how it could be addressed (please see the “Further Readings” at the end of this chapter). Here is an abbreviated version of her set of guidelines: (1) interview respondents who are experts on the subjects of most interest to you (as a corollary, don’t ask people about things they don’t know); (2) listen carefully and talk as little as possible; (3) keep in mind what you want to know and why you want to know it; (4) be a proactive interviewer (subtly guide the conversation); (5) assure respondents that there aren’t any right or wrong answers; (6) use the respondent’s own words to probe further (this both allows you to accurately identify what you heard and pushes the respondent to explain further); (7) reuse effective probes (don’t reinvent the wheel as you go—if repeating the words back works, do it again and again); (8) focus on learning the subjective meanings that events or experiences have for a respondent; (9) don’t be afraid to ask a question that draws on your own knowledge (unlike trial lawyers who are trained never to ask a question for which they don’t already know the answer, sometimes it’s worth it to ask risky questions based on your hypotheses or just plain hunches); (10) keep thinking while you are listening (so difficult…and important); (11) return to a theme raised by a respondent if you want further information; (12) be mindful of power inequalities (and never ever coerce a respondent to continue the interview if they want out); (13) take control with overly talkative respondents; (14) expect overly succinct responses, and develop strategies for probing further; (15) balance digging deep and moving on; (16) develop a plan to deflect questions (e.g., let them know you are happy to answer any questions at the end of the interview, but you don’t want to take time away from them now); and at the end, (17) check to see whether you have asked all your questions. You don’t always have to ask everyone the same set of questions, but if there is a big area you have forgotten to cover, now is the time to recover (Lareau 2021:93–103).

Sample: Demographic Questionnaire

ASA Taskforce on First-Generation and Working-Class Persons in Sociology – Class Effects on Career Success

Supplementary Demographic Questionnaire

Thank you for your participation in this interview project. We would like to collect a few pieces of key demographic information from you to supplement our analyses. Your answers to these questions will be kept confidential and stored by ID number. All of your responses here are entirely voluntary!

What best captures your race/ethnicity? (please check any/all that apply)

  • White (Non Hispanic/Latina/o/x)
  • Black or African American
  • Hispanic, Latino/a/x, or Spanish origin
  • Asian or Asian American
  • American Indian or Alaska Native
  • Middle Eastern or North African
  • Native Hawaiian or Pacific Islander
  • Other : (Please write in: ________________)

What is your current position?

  • Grad Student
  • Full Professor

Please check any and all of the following that apply to you:

  • I identify as a working-class academic
  • I was the first in my family to graduate from college
  • I grew up poor

What best reflects your gender?

  • Transgender female/Transgender woman
  • Transgender male/Transgender man
  • Gender queer/ Gender nonconforming

Anything else you would like us to know about you?

Example: Interview Guide

In this example, follow-up prompts are italicized. Note the sequence of questions. That second question often elicits an entire life history, answering several later questions in advance.

Introduction Script/Question

Thank you for participating in our survey of ASA members who identify as first-generation or working-class.  As you may have heard, ASA has sponsored a taskforce on first-generation and working-class persons in sociology and we are interested in hearing from those who so identify.  Your participation in this interview will help advance our knowledge in this area.

  • The first thing we would like to ask you is why you have volunteered to be part of this study. What does it mean to you to be first-gen or working class? Why were you willing to be interviewed?
  • How did you decide to become a sociologist?
  • Can you tell me a little bit about where you grew up? (prompts: What did your parent(s) do for a living? What kind of high school did you attend?)
  • Has this identity been salient to your experience? (How? How much?)
  • How welcoming was your grad program? Your first academic employer?
  • Why did you decide to pursue sociology at the graduate level?
  • Did you experience culture shock in college? In graduate school?
  • Has your FGWC status shaped how you’ve thought about where you went to school? debt? etc?
  • Were you mentored? How did this work (not work)?  How might it?
  • What did you consider when deciding where to go to grad school? Where to apply for your first position?
  • What, to you, is a mark of career success? Have you achieved that success?  What has helped or hindered your pursuit of success?
  • Do you think sociology, as a field, cares about prestige?
  • Let’s talk a little bit about intersectionality. How does being first-gen/working class work alongside other identities that are important to you?
  • What do your friends and family think about your career? Have you had any difficulty relating to family members or past friends since becoming highly educated?
  • Do you have any debt from college/grad school? Are you concerned about this?  Could you explain more about how you paid for college/grad school?  (here, include assistance from family, fellowships, scholarships, etc.)
  • (You’ve mentioned issues or obstacles you had because of your background.) What could have helped?  Or, who or what did? Can you think of fortuitous moments in your career?
  • Do you have any regrets about the path you took?
  • Is there anything else you would like to add? Anything that the Taskforce should take note of, that we did not ask you about here?

Further Readings

Britten, Nicky. 1995. “Qualitative Interviews in Medical Research.” BMJ: British Medical Journal 311(6999):251–253. A good basic overview of interviewing, particularly useful for students of public health and medical research generally.

Corbin, Juliet, and Janice M. Morse. 2003. “The Unstructured Interactive Interview: Issues of Reciprocity and Risks When Dealing with Sensitive Topics.” Qualitative Inquiry 9(3):335–354. Weighs the potential benefits and harms of conducting interviews on topics that may cause emotional distress. Argues that the researcher’s skills and code of ethics should ensure that the interviewing process provides more of a benefit to both participant and researcher than a harm to the former.

Gerson, Kathleen, and Sarah Damaske. 2020. The Science and Art of Interviewing . New York: Oxford University Press. A useful guidebook/textbook for both undergraduates and graduate students, written by sociologists.

Kvale, Steinar. 2007. Doing Interviews . London: SAGE. An easy-to-follow guide to conducting and analyzing interviews, written by a psychologist.

Lamont, Michèle, and Ann Swidler. 2014. “Methodological Pluralism and the Possibilities and Limits of Interviewing.” Qualitative Sociology 37(2):153–171. Written as a response to various debates surrounding the relative value of interview-based studies and ethnographic studies defending the particular strengths of interviewing. This is a must-read article for anyone seriously engaging in qualitative research!

Pugh, Allison J. 2013. “What Good Are Interviews for Thinking about Culture? Demystifying Interpretive Analysis.” American Journal of Cultural Sociology 1(1):42–68. Another defense of interviewing written against those who champion ethnographic methods as superior, particularly in the area of studying culture. A classic.

Rapley, Timothy John. 2001. “The ‘Artfulness’ of Open-Ended Interviewing: Some considerations in analyzing interviews.” Qualitative Research 1(3):303–323. Argues for the importance of “local context” of data production (the relationship built between interviewer and interviewee, for example) in properly analyzing interview data.

Weiss, Robert S. 1995. Learning from Strangers: The Art and Method of Qualitative Interview Studies . New York: Simon and Schuster. A classic and well-regarded textbook on interviewing. Because Weiss has extensive experience conducting surveys, he contrasts the qualitative interview with the survey questionnaire well; particularly useful for those trained in the latter.

  • I say “normally” because how people understand their various identities can itself be an expansive topic of inquiry. Here, I am merely talking about collecting otherwise unexamined demographic data, similar to how we ask people to check boxes on surveys. ↵
  • Again, this applies to “semistructured in-depth interviewing.” When conducting standardized questionnaires, you will want to ask each question exactly as written, without deviations! ↵
  • I always include “INT” in the number because I sometimes have other kinds of data with their own numbering: FG#001 would mean the first focus group, for example. I also always include three-digit spaces, as this allows for up to 999 interviews (or, more realistically, allows for me to interview up to one hundred persons without having to reset my numbering system). ↵

A method of data collection in which the researcher asks the participant questions; the answers to these questions are often recorded and transcribed verbatim. There are many different kinds of interviews; see also semistructured interview, structured interview, and unstructured interview.

A document listing key questions and question areas for use during an interview.  It is used most often for semi-structured interviews.  A good interview guide may have no more than ten primary questions for two hours of interviewing, but these ten questions will be supplemented by probes and relevant follow-ups throughout the interview.  Most IRBs require the inclusion of the interview guide in applications for review.  See also interview and  semi-structured interview .

A data-collection method that relies on casual, conversational, and informal interviewing.  Despite its apparent conversational nature, the researcher usually has a set of particular questions or question areas in mind but allows the interview to unfold spontaneously.  This is a common data-collection technique among ethnographers.  Compare to the semi-structured or in-depth interview .

A form of interview that follows a standard guide of questions asked, although the order of the questions may change to match the particular needs of each individual interview subject, and probing “follow-up” questions are often added during the course of the interview.  The semi-structured interview is the primary form of interviewing used by qualitative researchers in the social sciences.  It is sometimes referred to as an “in-depth” interview.  See also interview and  interview guide .

The cluster of data-collection tools and techniques that involve observing interactions between people, the behaviors and practices of individuals (sometimes in contrast to what they say about how they act and behave), and cultures in context. Observational methods are the key tools employed by ethnographers and Grounded Theory researchers.

Follow-up questions used in a semi-structured interview  to elicit further elaboration.  Suggested prompts can be included in the interview guide  to be used/deployed depending on how the initial question was answered or if the topic of the prompt does not emerge spontaneously.

A form of interview that follows a strict set of questions, asked in a particular order, for all interview subjects.  The questions are also the kind that elicits short answers, and the data is more “informative” than probing.  This is often used in mixed-methods studies, accompanying a survey instrument.  Because there is no room for nuance or the exploration of meaning in structured interviews, qualitative researchers tend to employ semi-structured interviews instead.  See also interview.

The point at which you can conclude data collection because every person you are interviewing, the interaction you are observing, or content you are analyzing merely confirms what you have already noted.  Achieving saturation is often used as the justification for the final sample size.

An interview variant in which a person’s life story is elicited in a narrative form.  Turning points and key themes are established by the researcher and used as data points for further analysis.

Introduction to Qualitative Research Methods Copyright © 2023 by Allison Hurst is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License , except where otherwise noted.

Enago Academy

Research Interviews: An effective and insightful way of data collection

Research interviews play a pivotal role in collecting data for various academic, scientific, and professional endeavors. They provide researchers with an opportunity to delve deep into the thoughts, experiences, and perspectives of an individual, thus enabling a comprehensive understanding of complex phenomena. It is important for researchers to design an effective and insightful method of data collection on a particular topic. A research interview is typically a two-person meeting conducted to collect information on a certain topic. It is a qualitative data collection method to gain primary information.

The Significance of Research Interviews in Gathering Primary Data

The role of research interviews in gathering first-hand information is invaluable. Additionally, they allow researchers to interact directly with participants, enabling them to collect unfiltered primary data.

1. Subjective Experience

Research interviews facilitate in-depth exploration of a research topic. By engaging in one-to-one conversation with participants, researchers can delve into the nuances and complexities of their experiences, perspectives, and opinions, allowing a comprehensive understanding of the research subject that may not be possible through other methods. Research interviews also offer the unique advantage of capturing subjective experiences through personal narratives: participants can express their thoughts, feelings, and beliefs, which adds depth to the findings.

2. Personal Insights

Research interviews give participants an opportunity to share their views and opinions on the topic they are being interviewed about. Participants can express their thoughts and experiences, providing rich qualitative data, and these personal narratives add a human element to the research, enhancing understanding of the topic from the participants’ perspectives. Research interviews also offer the opportunity to uncover unanticipated insights or emerging themes: open-ended questions and active listening can help researchers identify new perspectives, ideas, or patterns that were not initially considered, opening new avenues for exploration.

3. Clarification and Validation

Researchers can clarify participants’ responses and validate their understanding during an interview. This ensures accurate data collection and interpretation. Additionally, researchers can probe deeper into participants’ statements and seek clarification on any ambiguity in the information.

4. Contextual Information

Research interviews allow researchers to gather contextual information that offers a comprehensive understanding of the research topic. Additionally, participants can provide insights into the social, cultural, or environmental factors that shape their experiences, behaviors, and beliefs. This contextual information helps researchers place the data in a broader context and facilitates a more nuanced analysis.

5. Non-verbal Cues

In addition to verbal responses, research interviews allow researchers to observe non-verbal cues such as body language, facial expressions, and tone of voice. Additionally, non-verbal cues can convey information, such as emotions, attitudes, or levels of comfort. Furthermore, integrating non-verbal cues with verbal responses provides a more holistic understanding of participants’ experiences and enriches the data collection process.

Research interviews offer several advantages, making them a reliable tool for collecting information. However, choosing the right type of research interview is essential for collecting useful data.

Types of Research Interviews

There are several types of research interviews that researchers can use based on their research goals , the nature of their study, and the data they aim to collect. Here are some common types of research interviews:

1. Structured Interviews

  • Structured interviews are standardized and follow a fixed format.
  • They use a pre-determined set of questions.
  • All participants are asked the same set of questions in the same order.
  • This facilitates standardization and allows easy comparison and quantitative analysis of responses.
  • Structured interviews are therefore used in surveys or studies that aim for a high level of standardization and comparability.

2. Semi-structured Interviews

  • Semi-structured interviews offer a flexible framework that combines pre-determined questions with follow-up questions and open-ended discussion.
  • Researchers have a list of core questions but can adapt the interview depending on the participant’s responses.
  • Consequently, this allows for in-depth exploration while maintaining some level of consistency across interviews.
  • As a result, semi-structured interviews are widely used in qualitative research, where content-rich data is desired.

3. Unstructured Interviews

  • Unstructured interviews provide the greatest flexibility and freedom in the interview process.
  • This type does not have a pre-determined set of questions.
  • Thus, the conversation flows naturally based on the participant’s responses and the researcher’s interests.
  • Moreover, this type of interview allows for open-ended exploration and encourages participants to share their experiences, thoughts, and perspectives freely.
  • Unstructured interviews are useful for exploring new or complex research topics with limited preconceived questions.

4. Group Interviews (Focus Groups)

  • Group interviews involve multiple participants who engage in a facilitated discussion on a specific topic.
  • This format allows the interaction and exchange of ideas among participants, generating a group dynamic.
  • Group interviews are beneficial for capturing diverse perspectives and generating collective insights.
  • They are often used in market research, the social sciences, or studies exploring shared experiences.

5. Narrative Interviews

  • Narrative interviews focus on eliciting participants’ personal stories, views, experiences, and narratives. Researchers aim to look into the individual’s life journey.
  • As a result, this type of interview allows participants to construct and share their own narratives, providing rich qualitative data.
  • Narrative interviews are used in qualitative research, oral history, and studies focusing on individual experiences and identities.

6. Ethnographic Interviews

  • Ethnographic interviews are conducted within the context of ethnographic research, where researchers immerse themselves in a specific social or cultural setting.
  • These interviews aim to understand participants’ experiences, beliefs, and practices within their cultural context, thereby capturing the diversity of different cultural groups.
  • Furthermore, ethnographic interviews involve building rapport, observing the participants’ daily lives, and engaging in conversations that capture the nuances of the culture under study.

It must be noted that these interview types are not mutually exclusive; researchers often employ a combination of approaches to gather the most comprehensive data for their research. The choice of interview type depends on the research objectives and the nature of the research topic.

Steps of Conducting a Research Interview

Research interviews offer several benefits, and thus careful planning and execution of the entire process are important to gather in-depth information from the participants. While conducting an interview, it is essential to know the necessary steps to follow for ensuring success. The steps to conduct a research interview are as follows:

  • Identify the objectives and understand the goals
  • Select an appropriate interview format
  • Organize the necessary materials for the interview
  • Understand the questions to be addressed
  • Analyze the demographics of interviewees
  • Select the interviewees
  • Design the interview questions to gather sufficient information
  • Schedule the interview
  • Explain the purpose of the interview
  • Analyze the interviewee based on his/her responses

Considerations for Research Interviews

Since the flexible nature of research interviews makes them an invaluable tool for data collection, researchers must consider certain factors to make the process effective. They should avoid bias and preconceived notions about the participants. Furthermore, researchers must comply with ethical considerations and respect the cultural differences between them and the participants. They should also tailor their questions carefully to avoid making them offensive or derogatory. Interviewers must respect the privacy of the participants and ensure the confidentiality of their details.

By exercising due diligence with respect to these considerations, researchers can maximize the validity and reliability of the collected data, leading to robust and meaningful research outcomes.

Have you ever conducted a research interview? What was your experience? What factors did you consider when conducting a research interview? Share it with researchers worldwide by submitting your thought piece on Enago Academy’s Open Blogging Platform .

Frequently Asked Questions

  • Identify the objectives of the interview
  • State and explain the purpose of the interview
  • Select an appropriate interview format
  • Organize the necessary materials for the interview
  • Check the demographics of the participants
  • Select the interviewees or participants
  • Prepare the list of questions to gather maximum useful data from the participants
  • Schedule the interview
  • Analyze the participant based on his/her responses

Interviews are important in research because they help gather elaborate first-hand information. They allow conclusions to be drawn from non-verbal cues and personal experiences, and they reduce the ambiguity of data through detailed discussion.

The advantages of research interviews are: • It offers first-hand information • Offers detailed assessment which can result in elaborate conclusions • It is easy to conduct • Provides non-verbal cues The disadvantages of research interviews are: • There is a risk of personal bias • It can be time consuming • The outcomes might be unpredictable

The difference between structured and unstructured interview are: • Structured interviews have well-structured questions in a pre-determined order; while unstructured interviews are flexible and do not have a pre-planned set of questions. • Structured interview is more detailed; while unstructured interviews are exploratory in nature. • Structured interview is easier to replicate as compared to unstructured interview.

Focus groups is a group of multiple participants engaging in a facilitated discussion on a specific topic. This format allows for interaction and exchange of ideas among participants.


Chapter 13: Interviews

Danielle Berkovic

Learning outcomes

Upon completion of this chapter, you should be able to:

  • Understand when to use interviews in qualitative research.
  • Develop interview questions for an interview guide.
  • Understand how to conduct an interview.

What are interviews?

Interviewing is the most commonly used data collection technique in qualitative research. 1 The purpose of an interview is to explore the experiences, understandings, opinions and motivations of research participants. 2 Interviews are conducted one-on-one between the researcher and the participant. Interviews are most appropriate when seeking to understand a participant’s subjective view of an experience and are also considered suitable for the exploration of sensitive topics.

What are the different types of interviews?

There are four main types of interviews:

  • Key stakeholder: A key stakeholder interview aims to explore one issue in detail with a person of interest or importance concerning the research topic. 3 Key stakeholder interviews seek the views of experts on some cultural, political or health aspects of the community, beyond their personal beliefs or actions. An example of a key stakeholder is the Chief Health Officer of Victoria (Australia’s second-most populous state) who oversaw the world’s longest lockdowns in response to the COVID-19 pandemic.
  • Dyad: A dyad interview aims to explore one issue in detail with a dyad (two people). This form of interviewing is used when one participant of the dyad may need some support or is not wholly able to articulate themselves (e.g. people with cognitive impairment, or children). Independence is acknowledged, and the interview is analysed as a unit. 4
  • Narrative: A narrative interview helps individuals tell their stories, and prioritises their own perspectives and experiences using the language that they prefer. 5 This type of interview has been widely used in social research but is gaining prominence in health research to better understand person-centred care, for example, negotiating exercise and food abstinence whilst living with Type 2 diabetes. 6,7
  • Life history: A life history interview allows the researcher to explore a person’s individual and subjective experiences within the historical context of their time. 8 Life history interviews challenge the researcher to understand how people’s current attitudes, behaviours and choices are influenced by previous experiences or trauma. Life history interviews have been conducted with Holocaust survivors 9 and youth who have been forcibly recruited to war. 10

Table 13.4 provides a summary of four studies, each adopting one of these types of interviews.

Interviewing techniques

There are two main interview techniques:

  • Semi-structured: Semi-structured interviewing aims to explore a few issues in moderate detail, to expand the researcher’s knowledge at some level. 11 Semi-structured interviews give the researcher the advantage of remaining reasonably objective while enabling participants to share their perspectives and opinions. The researcher should create an interview guide with targeted open questions to direct the interview. As examples, semi-structured interviews have been used to extend knowledge of why women might gain excess weight during pregnancy, 12 and to update guidelines for statin uptake. 13
  • In-depth: In-depth interviewing aims to explore a person’s subjective experiences and feelings about a particular topic. 14 In-depth interviews are often used to explore emotive (e.g. end-of-life care) 15 and complex (e.g. adolescent pregnancy) topics. 16 The researcher should create an interview guide with selected open questions to ask of the participant, but the participant should guide the direction of the interview more than in a semi-structured setting. In-depth interviews value participants’ lived experiences and are frequently used in phenomenology studies (as described in Chapter 6) .

When to use the different types of interviews

The type of interview a researcher uses should be determined by the study design, the research aims and objectives, and participant demographics. For example, if conducting a descriptive study, semi-structured interviews may be the best method of data collection. As explained in Chapter 5 , descriptive studies seek to describe phenomena, rather than to explain or interpret the data. A semi-structured interview, which seeks to expand upon some level of existing knowledge, will likely best facilitate this.

Similarly, if conducting a phenomenological study, in-depth interviews may be the best method of data collection. As described in Chapter 6 , the key concept of phenomenology is the individual. The emphasis is on the lived experience of that individual and the person’s sense-making of those experiences. Therefore, an in-depth interview is likely best placed to elicit that rich data.

While some interview types are better suited to certain study designs, there are no restrictions on the type of interview that may be used. For example, semi-structured interviews provide an excellent accompaniment to trial participation (see Chapter 11 about mixed methods), and key stakeholder interviews, as part of an action research study, can be used to define priorities, barriers and enablers to implementation.

How do I write my interview questions?

An interview aims to explore the experiences, understandings, opinions and motivations of research participants. The general rule is that the interviewee should speak for about 80 per cent of the interview, with the interviewer asking questions and clarifying responses for the remaining 20 per cent; in a one-hour interview, that is roughly 48 minutes of participant talk and 12 minutes of interviewer talk. This split may differ depending on the interview type; for example, a semi-structured interview involves the researcher asking more questions than an in-depth interview does. Either way, to facilitate free-flowing responses, it is important to use open-ended language that encourages participants to be expansive in their responses. Examples of open-ended terms include questions that start with ‘who’, ‘how’ and ‘where’.

The researcher should avoid closed-ended questions that can be answered with yes or no, and limit conversation. For example, asking a participant ‘Did you have this experience?’ can elicit a simple ‘yes’, whereas asking them to ‘Describe your experience’, will likely encourage a narrative response. Table 13.1 provides examples of terminology to include and avoid in developing interview questions.

Table 13.1. Interview question formats to use and avoid

Use:
  • Tell me about…
  • What happened when…
  • Why is this important?
  • How did you feel when…
  • How do you…
  • What are the…
  • What does…

Avoid:
  • Do you think that…
  • Will you do this…
  • Did you believe that…
  • Were there issues from your perspective…

How long should my interview be?

There is no rule about how long an interview should take. Different types of interviews will likely run for different periods of time, but this also depends on the research question/s and the type of participant. For example, given that a semi-structured interview is seeking to expand on some previous knowledge, the interview may need no longer than 30 minutes, or up to one hour. An in-depth interview seeks to explore a topic in a greater level of detail and therefore, at a minimum, would be expected to last an hour. A dyad interview may be as short as 15 minutes (e.g. if the dyad is a person with dementia and a family member or caregiver) or longer, depending on the pairing.

Designing your interview guide

To decide what questions to ask in an interview guide, the researcher may consult the literature, speak to experts (including people with lived experience) about the research and draw on their current knowledge. The topics and questions should be mapped to the research question/s, and the interview guide should be developed well in advance of commencing data collection. This allows time to pilot-test the interview guide. The pilot interview provides an opportunity to check the language and clarity of questions, the order and flow of the guide, and whether the instructions are clear to participants both before and after the interview. It can be beneficial to pilot-test the interview guide with someone who is not familiar with the research topic, to make sure that the language used is easily understood (and will be by participants, too). The study design should determine the number of questions asked, and the intended duration of the interview should guide the length of the interview guide. The participant type may also shape the guide; for example, clinicians tend to be time-poor, so shorter, focused interviews are optimal. An interview guide is also likely to be shorter for a descriptive study than for a phenomenological or ethnographic study, given the level of detail required. Chapter 5 outlined a descriptive study in which participants who had undergone percutaneous coronary intervention were interviewed. The interview guide consisted of four main questions and subsequent probing questions, linked to the research questions (see Table 13.2). 17

Table 13.2. Interview guide for a descriptive study

Research question: How does the patient feel, physically and psychologically, after their procedure?
Open questions:
  • From your perspective, what would be considered a successful outcome of the procedure?
  • How did you feel after the procedure?
  • How did you feel one week after the procedure and how does that compare with how you feel now?
Probing questions and topics:
  • Did the procedure meet your expectations?
  • How do you define whether the procedure was successful?

Research question: How does the patient function after their procedure?
Open questions:
  • After your procedure, tell me about your ability to do your daily activities?
  • Did you attend cardiac rehabilitation? Can you tell us about your experience of cardiac rehabilitation?
  • What effect has medication had on your recovery?
Probing questions and topics:
  • Prompt for activities including gardening, housework, personal care, work-related and family-related tasks.

Research question: What are the long-term effects of the procedure?
Open questions:
  • What, if any, lifestyle changes have you made since your procedure?

Table 13.3 is an example of a larger and more detailed interview guide, designed for the qualitative component of a mixed-methods study aiming to examine the work and financial effects of living with arthritis as a younger person. The questions are mapped to the World Health Organization’s International Classification of Functioning, Disability, and Health, which measures health and disability at individual and population levels. 18

Table 13.3. Detailed interview guide

Research question: How do young people experience their arthritis diagnosis?
Open questions:
  • Tell me about your experience of being diagnosed with arthritis.
  • How did being diagnosed with arthritis make you feel?
  • Tell me about your experience of arthritis flare-ups: what do they feel like?
  • What impacts arthritis flare-ups or feeling like your arthritis is worse?
  • What circumstances lead to these feelings?
  • Based on your experience, what do you think causes symptoms of arthritis to become worse?
Probing questions:
  • When were you diagnosed with arthritis?
  • What type of arthritis were you diagnosed with?
  • Does anyone else in your family have arthritis? What relation are they to you?

Research question: What are the work impacts of arthritis on younger people?
Open questions:
  • What is your field of work, and how long have you been in this role?
  • How frequently do you work (full-time/part-time/casual)?
Probing questions:
  • How has arthritis affected your work-related demands or career? How so?
  • Has arthritis led you to reconsider your career? How so?
  • Has arthritis affected your usual working hours each week? How so?
  • How have changes to work or career because of your arthritis impacted other areas of life, such as mental health or your family role?

Research question: What are the financial impacts of living with arthritis as a younger person?
Open questions:
  • Has your arthritis led to any financial concerns?
Probing questions (financial concerns pertaining to):
  • Direct costs: rheumatologist, prescribed and non-prescribed medications (as well as supplements), allied health costs (rheumatology, physiotherapy, chiropractic, osteopathy, myotherapy), Pilates, gym/personal trainer fees, and complementary therapies.
  • Indirect costs: workplace absenteeism, productivity, loss of wages, informal care, and the cost of different types of insurance, such as health insurance (joint replacements).
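
Tables 13.2 and 13.3 are, in effect, structured mappings from research questions to open questions and probes. If it suits your workflow, the same mapping can be kept in a machine-readable form so that it is easy to version, pilot-test and print as a prompt sheet. The following is a minimal sketch only: the layout (a list of dictionaries) and the print_guide() helper are illustrative assumptions rather than anything prescribed in this chapter, and the two entries are abbreviated from Table 13.2.

```python
# Minimal sketch of an interview guide kept in a machine-readable form,
# mirroring the structure of Table 13.2 (research question -> open questions
# -> probing questions). The data layout and print_guide() helper are
# illustrative assumptions, not part of the original chapter.

interview_guide = [
    {
        "research_question": "How does the patient feel, physically and "
                             "psychologically, after their procedure?",
        "open_questions": [
            "From your perspective, what would be considered a successful "
            "outcome of the procedure?",
            "How did you feel after the procedure?",
        ],
        "probes": [
            "Did the procedure meet your expectations?",
            "How do you define whether the procedure was successful?",
        ],
    },
    {
        "research_question": "How does the patient function after their procedure?",
        "open_questions": [
            "After your procedure, tell me about your ability to do your daily activities.",
        ],
        "probes": [
            "Prompt for activities such as gardening, housework, personal care, "
            "work-related and family-related tasks.",
        ],
    },
]


def print_guide(guide):
    """Print the guide in interview order so it can double as a field prompt sheet."""
    for section in guide:
        print(f"Research question: {section['research_question']}")
        for q in section["open_questions"]:
            print(f"  Open question: {q}")
        for p in section["probes"]:
            print(f"    Probe: {p}")
        print()


if __name__ == "__main__":
    print_guide(interview_guide)
```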

It is important to create an interview guide, for the following reasons:

  • The researcher should be familiar with their research questions.
  • Using an interview guide will enable the incorporation of feedback from the piloting process.
  • It is difficult to predict how participants will respond to interview questions. They may answer in an anticipated way, or they may provide unanticipated insights that warrant follow-up. An interview guide (a physical or digital copy) enables the researcher to note these answers and follow up with appropriate inquiry.
  • Participants will likely have provided heterogeneous answers to certain questions. The interview guide enables the researcher to note similarities and differences across various interviews, which may be important in data analysis.
  • Even experienced qualitative researchers get nervous before an interview! The interview guide provides a safety net if the researcher forgets their questions or needs to anticipate the next question.

Setting up the interview

In the past, most interviews were conducted in person or by telephone. Emerging technologies promote easier access to research participation (e.g. by people living in rural or remote communities, or for people with mobility limitations). Even in metropolitan settings, many interviews are now conducted electronically (e.g. using videoconferencing platforms). Regardless of your interview setting, it is essential that the interview environment is comfortable for the participant. This process can begin as soon as potential participants express interest in your research. Following are some tips from the literature and our own experiences of leading interviews:

  • Answer questions and set clear expectations . Participating in research is not an everyday task. People do not necessarily know what to expect during a research interview, and this can be daunting. Give people as much information as possible, answer their questions about the research and set clear expectations about what the interview will entail and how long it is expected to last. Let them know that the interview will be recorded for transcription and analysis purposes. Consider sending the interview questions a few days before the interview. This gives people time and space to reflect on their experiences, consider their responses to questions and to provide informed consent for their participation.
  • Consider your setting . If conducting the interview in person, consider the location and room in which the interview will be held. For example, if in a participant’s home, be mindful of their private space. Ask if you should remove your shoes before entering their home. If they offer refreshments (which in our experience many participants do), accept it with gratitude if possible. These considerations apply beyond the participant’s home; if using a room in an office setting, consider privacy and confidentiality, accessibility and potential for disruption. Consider the temperature as well as the furniture in the room, who may be able to overhear conversations and who may walk past. Similarly, if interviewing by phone or online, take time to assess the space, and if in a house or office that is not quiet or private, use headphones as needed.
  • Build rapport. The research topic may be important to participants from a professional perspective, or they may have deep emotional connections to the topic of interest. Regardless of the nature of the interview, it is important to remember that participants are being asked to open up to an interviewer who is likely to be a stranger. Spend some time with participants before the interview, to make sure that they are comfortable. Engage in some general conversation, and ask if they have any questions before you start. Remember that it is not a normal part of someone’s day to participate in research. Make it an enjoyable and/or meaningful experience for them, and it will enhance the data that you collect.
  • Let participants guide you. Oftentimes, the ways in which researchers and participants describe the same phenomena are different. In the interview, reflect the participant’s language. Make sure they feel heard and that they are willing and comfortable to speak openly about their experiences. For example, our research involves talking to older adults about their experience of falls. We noticed early in this research that participants did not use the word ‘fall’ but would rather use terms such as ‘trip’, ‘went over’ and ‘stumbled’. As interviewers we adopted the participant’s language into our questions.
  • Listen consistently and express interest. An interview is more complex than a simple question-and-answer format. The best interview data comes from participants feeling comfortable and confident to share their stories. By the time you are completing the 20th interview, it can be difficult to maintain the same level of concentration as with the first interview. Try to stay engaged: nod along with your participants, maintain eye contact, murmur in agreement and sympathise where warranted.
  • The interviewer is both the data collector and the data collection instrument. The data received is only as good as the questions asked. In qualitative research, the researcher influences how participants answer questions. It is important to remain reflexive and aware of how your language, body language and attitude might influence the interview. Being rested and prepared will enhance the quality of the questions asked and hence the data collected.
  • Avoid excessive use of ‘why’. It can be challenging for participants to recall why they felt a certain way or acted in a particular manner. Try to avoid asking ‘why’ questions too often, and instead adopt some of the open language described earlier in the chapter.

After your interview

When you have completed your interview, thank the participant and let them know they can contact you if they have any questions or follow-up information they would like to provide. If the interview has covered sensitive topics or the participant has become distressed throughout the interview, make sure that appropriate referrals and follow-up are provided (see section 6).

Download the recording from your device and make sure it is saved in a secure location that can only be accessed by people on the approved research team (see Chapters 35 and 36).

It is important to know what to do immediately after each interview is completed. Interviews should be transcribed – that is, reproduced verbatim for data analysis. Transcribing data is an important step in the process of analysis, but it is very time-consuming; transcribing a 60-minute interview can take up to 8 hours. Data analysis is discussed in Section 4.
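
If your team decides to use automated speech-to-text as a first pass before manual correction (an assumption on my part, not something this chapter prescribes), a minimal sketch using the open-source openai-whisper package might look like the following. The file name interview_01.mp3 is a placeholder, and an automated transcript still needs careful manual checking against the recording before it can be treated as verbatim.

```python
# Minimal sketch: first-pass automated transcription with openai-whisper
# (pip install openai-whisper). The audio file name is a placeholder.
import whisper

model = whisper.load_model("base")            # small, CPU-friendly model
result = model.transcribe("interview_01.mp3")

# Save the draft transcript for manual verbatim correction and de-identification.
with open("interview_01_draft.txt", "w", encoding="utf-8") as f:
    f.write(result["text"])
```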

Table 13.4. Examples of the four types of interviews

First author and year: Cuthbertson, 2019
Interview type: Key stakeholder
Interview guide: Appendix A
Study design: Convergent mixed-methods study
Number of participants: 30. Key stakeholders were emergency management or disaster healthcare practitioners, academics specialising in disaster management in the Oceania region, and policy managers.
Aim: ‘To investigate threats to the health and well-being of societies associated with disaster impact in Oceania.’ [abstract]
Country: Australia, Fiji, Indonesia, Aotearoa New Zealand, Timor Leste and Tonga
Length of interview: 45–60 minutes
Sample of interview questions from the interview guide [Appendix A]:
1. What do you believe are the top five disaster risks or threats in the Oceania region today?
2. What disaster risks do you believe are emerging in the Oceania region over the next decade?
3. Why do you think these are risks?
4. What are the drivers of these risks?
5. Do you have any suggestions on how we can improve disaster risk assessment?
6. Are the current disaster risk plans and practices suited to the future disaster risks? If not, why? If not, what do you think needs to be done to improve them?
7. What are the key areas of disaster practice that can enhance future community resilience to disaster risk?
8. What are the barriers or inhibitors to facilitating this practice?
9. What are the solutions or facilitators to enhancing community resilience?
Analysis: Thematic analysis guided by the Hazard and Peril Glossary for describing and categorising disasters applied by the Centre for Research on the Epidemiology of Disasters Emergency Events Database
Main themes [Results, Box 1]:
1. Climate change is observed as a contemporary and emerging disaster risk.
2. Risk is contextual to the different countries, communities and individuals in Oceania.
3. Human development trajectories and their impact, along with perceptions of a changing world, are viewed as drivers of current and emerging risks.
4. Current disaster risk plans and practices are not suited to future disaster risks.
5. Increased education and education of risk and risk assessment at a local level to empower community risk ownership.

First author and year: Bannon, 2021
Interview type: Dyad
Interview guide: eAppendix Supplement
Study design: Qualitative dyadic study
Number of participants: 23 dyads
Aim: ‘To explore the lived experiences of couples managing young-onset dementia using an integrated dyadic coping model.’ [abstract]
Country: United States
Length of interview: 60 minutes
Sample of interview questions from the interview guide [eAppendix Supplement]:
1. We like to start by learning more about what you each first noticed that prompted the evaluations you went through to get to the diagnosis.
  • Can you each tell me about the earliest symptoms you noticed?
2. What are the most noticeable or troubling symptoms that you have experienced since the time of diagnosis?
  • How have your changes in functioning impacted you?
  • Emotionally, how do you feel about your symptoms and the changes in functioning you are experiencing?
3. Are you open with your friends and family about the diagnosis?
  • Have you experienced any stigma related to your diagnosis?
4. What is your understanding of the diagnosis?
  • What is your understanding about how this condition will affect you both in the future? How are you getting information about this diagnosis?
Analysis: Thematic analysis guided by the Dyadic Coping Theoretical Framework
Main themes [abstract]:
1. Stress communication
2. Positive individual dyadic coping
3. Positive conjoint dyadic coping
4. Negative individual dyadic coping
5. Negative conjoint dyadic coping

First author and year: McGranahan, 2020
Interview type: Narrative
Interview guide: Not provided, but the text states that ‘qualitative semi-structured narrative interviews’ were conducted. [methods]
Study design: Narrative interview study
Number of participants: 28
Aim: ‘To explore the experiences and views of people with psychotic experiences who have not received any treatment or other support from mental health services for the past 5 years.’ [abstract]
Country: England
Length of interview: 40–120 minutes
Sample of interview questions: Not provided
Analysis: Inductive thematic analysis outlined by Braun and Clarke
Main themes [abstract]:
1. Perceiving psychosis as positive
2. Making sense of psychotic experiences
3. Finding sources of strength
4. Negative past experiences of mental health services
5. Positive past experiences with individual clinicians

First author and year: Gutierrez-Garcia, 2021
Interview type: Life history
Interview guide: Not provided, but the text states that ‘an open and semi-structured question guide was designed for use.’ [methods]
Study design: Life history and lifeline techniques
Number of participants: 7
Aim: ‘To analyse the use of life histories and lifelines in the study of female genital mutilation in the context of cross-cultural research in participants with different languages.’ [abstract]
Country: Spain
Length of interview: 3 sessions. Session 1: life history interview. Session 2: lifeline activity in which participants used drawings to complement or enhance their interview. Session 3: the researchers and participants worked together to finalise the lifeline. The life history interviews ran for 40–60 minutes; the timing of sessions 2 and 3 is not provided.
Sample of interview questions: Not provided
Analysis: Phenomenological method proposed by Giorgi (sense of the whole):
1. Reading the entire description to obtain a general sense of the discourse
2. The researcher goes back to the beginning and reads the text again, with the aim of distinguishing the meaning units by separating the perspective of the phenomenon of interest
3. The researcher expresses the contents of the units of meaning more clearly by creating categories
4. The researcher synthesises the units and categories of meaning into a consistent statement that takes into account the participant’s experience and language.
Main themes:
1. Important moments and their relationship with female genital mutilation
2. The ritual knife: how sharp or blunt it is at different stages, where and how women are subsequently held as a result
3. Changing relationships with family: how being subject to female genital mutilation changed relationships with mothers
4. Female genital mutilation increases the risk of future childbirth complications, which change relationships with family and healthcare systems
5. Managing experiences with early exposure to physical and sexual violence across the lifespan

Interviews are the most common data collection technique in qualitative research. There are four main types of interviews; the one you choose will depend on your research question, aims and objectives. It is important to formulate open-ended interview questions that are understandable and easy for participants to answer. Key considerations in setting up the interview will enhance the quality of the data obtained and the experience of the interview for the participant and the researcher.

  • Gill P, Stewart K, Treasure E, Chadwick B. Methods of data collection in qualitative research: interviews and focus groups. Br Dent J . 2008;204(6):291-295. doi:10.1038/bdj.2008.192
  • DeJonckheere M, Vaughn LM. Semistructured interviewing in primary care research: a balance of relationship and rigour. Fam Med Community Health . 2019;7(2):e000057. doi:10.1136/fmch-2018-000057
  • Nyanchoka L, Tudur-Smith C, Porcher R, Hren D. Key stakeholders’ perspectives and experiences with defining, identifying and displaying gaps in health research: a qualitative study. BMJ Open . 2020;10(11):e039932. doi:10.1136/bmjopen-2020-039932
  • Morgan DL, Ataie J, Carder P, Hoffman K. Introducing dyadic interviews as a method for collecting qualitative data. Qual Health Res .  2013;23(9):1276-84. doi:10.1177/1049732313501889
  • Picchi S, Bonapitacola C, Borghi E, et al. The narrative interview in therapeutic education. The diabetic patients’ point of view. Acta Biomed . Jul 18 2018;89(6-S):43-50. doi:10.23750/abm.v89i6-S.7488
  • Stuij M, Elling A, Abma T. Negotiating exercise as medicine: Narratives from people with type 2 diabetes. Health (London) . 2021;25(1):86-102. doi:10.1177/1363459319851545
  • Buchmann M, Wermeling M, Lucius-Hoene G, Himmel W. Experiences of food abstinence in patients with type 2 diabetes: a qualitative study. BMJ Open .  2016;6(1):e008907. doi:10.1136/bmjopen-2015-008907
  • Jessee E. The Life History Interview. Handbook of Research Methods in Health Social Sciences . 2018:1-17:Chapter 80-1.
  • Sheftel A, Zembrzycki S. Only Human: A Reflection on the Ethical and Methodological Challenges of Working with “Difficult” Stories. The Oral History Review . 2019;37(2):191-214. doi:10.1093/ohr/ohq050
  • Harnisch H, Montgomery E. “What kept me going”: A qualitative study of avoidant responses to war-related adversity and perpetration of violence by former forcibly recruited children and youth in the Acholi region of northern Uganda. Soc Sci Med .  2017;188:100-108. doi:10.1016/j.socscimed.2017.07.007
  • Ruslin, Mashuri S, Rasak MSA, Alhabsyi M, Alhabsyi F, Syam H. Semi-structured Interview: A Methodological Reflection on the Development of a Qualitative Research Instrument in Educational Studies. IOSR-JRME. 2022;12(1):22-29. doi:10.9790/7388-1201052229
  • Chang T, Llanes M, Gold KJ, Fetters MD. Perspectives about and approaches to weight gain in pregnancy: a qualitative study of physicians and nurse midwives. BMC Pregnancy & Childbirth. 2013;13(47). doi:10.1186/1471-2393-13-47
  • DeJonckheere M, Robinson CH, Evans L, et al. Designing for Clinical Change: Creating an Intervention to Implement New Statin Guidelines in a Primary Care Clinic. JMIR Hum Factors .  2018;5(2):e19. doi:10.2196/humanfactors.9030
  • Knott E, Rao AH, Summers K, Teeger C. Interviews in the social sciences. Nature Reviews Methods Primers. 2022;2(1). doi:10.1038/s43586-022-00150-6
  • Bergenholtz H, Missel M, Timm H. Talking about death and dying in a hospital setting – a qualitative study of the wishes for end-of-life conversations from the perspective of patients and spouses. BMC Palliat Care . 2020;19(1):168. doi:10.1186/s12904-020-00675-1
  • Olorunsaiye CZ, Degge HM, Ubanyi TO, Achema TA, Yaya S. “It’s like being involved in a car crash”: teen pregnancy narratives of adolescents and young adults in Jos, Nigeria. Int Health . 2022;14(6):562-571. doi:10.1093/inthealth/ihab069
  • Ayton DR, Barker AL, Peeters G, et al. Exploring patient-reported outcomes following percutaneous coronary intervention: A qualitative study. Health Expect .  2018;21(2):457-465. doi:10.1111/hex.12636
  • World Health Organization. International Classification of Functioning, Disability and Health (ICF). WHO. https://www.who.int/standards/classifications/international-classification-of-functioning-disability-and-health
  • Cuthbertson J, Rodriguez-Llanes JM, Robertson A, Archer F. Current and Emerging Disaster Risks Perceptions in Oceania: Key Stakeholders Recommendations for Disaster Management and Resilience Building. Int J Environ Res Public Health. 2019;16(3). doi:10.3390/ijerph16030460
  • Bannon SM, Grunberg VA, Reichman M, et al. Thematic Analysis of Dyadic Coping in Couples With Young-Onset Dementia. JAMA Netw Open .  2021;4(4):e216111. doi:10.1001/jamanetworkopen.2021.6111
  • McGranahan R, Jakaite Z, Edwards A, Rennick-Egglestone S, Slade M, Priebe S. Living with Psychosis without Mental Health Services: A Narrative Interview Study. BMJ Open .  2021;11(7):e045661. doi:10.1136/bmjopen-2020-045661
  • Gutiérrez-García AI, Solano-Ruíz C, Siles-González J, Perpiñá-Galvañ J. Life Histories and Lifelines: A Methodological Symbiosis for the Study of Female Genital Mutilation. Int J Qual Methods. 2021;20. doi:10.1177/16094069211040969

Qualitative Research – a practical guide for health and social care researchers and practitioners Copyright © 2023 by Danielle Berkovic is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License , except where otherwise noted.


Research Methods Guide: Interview Research


Goals of Interview Research

  • Interviews help you explain, better understand, and explore research subjects' opinions, behaviors, experiences, phenomena, and preferences.
  • Interview questions are usually open-ended so that in-depth information can be collected.

Mode of Data Collection

There are several types of interviews, including:

  • Face-to-Face
  • Online (e.g. Skype, Google Hangouts, etc.)

FAQ: Conducting Interview Research

What are the important steps involved in interviews?

  • Think about who you will interview
  • Think about what kind of information you want to obtain from interviews
  • Think about why you want to pursue in-depth information around your research topic
  • Introduce yourself and explain the aim of the interview
  • Devise your questions so interviewees can help answer your research question
  • Have a sequence to your questions / topics by grouping them in themes
  • Make sure you can easily move back and forth between questions / topics
  • Make sure your questions are clear and easy to understand
  • Do not ask leading questions
  • Do you want to bring a second interviewer with you?
  • Do you want to bring a notetaker?
  • Do you want to record interviews? If so, do you have time to transcribe interview recordings?
  • Where will you interview people? Where is the setting with the least distraction?
  • How long will each interview take?
  • Do you need to address terms of confidentiality?

Do I have to choose either a survey or interviewing method?

No. In fact, many researchers use a mixed-methods approach: interviews can be a useful follow-up with certain survey respondents, e.g. to further investigate their responses.

Is training an interviewer important?

Yes. Since the interviewer can control the quality of the result, training the interviewer is crucial. If more than one interviewer is involved in your study, it is important that every interviewer understands the interviewing procedure and rehearses the interviewing process before beginning the formal study.



9 Best Examples of Research Instruments in Qualitative Research Explained

Introduction.

Qualitative research is a valuable approach that allows researchers to explore complex phenomena and gain in-depth insights into the experiences and perspectives of individuals. In order to conduct qualitative research effectively, researchers often utilize various research methodologies and instruments. These methodologies and instruments serve as tools to collect and analyze data, enabling researchers to uncover rich and nuanced information.

Qualitative research instruments are tools used to gather non-numerical data, providing researchers with detailed insights into participants' experiences, emotions, and social contexts.

In this article, we will delve into the world of qualitative research instruments, specifically focusing on research instrument examples. We will explore the different types of qualitative research instruments, provide specific examples, and discuss the advantages and limitations of using these instruments in qualitative research. By the end of this article, you will have a comprehensive understanding of the role and significance of research instruments in qualitative research.

Goals of Research Instruments in Qualitative Research

Qualitative research instruments are tools that researchers use to collect and analyze data in qualitative research studies. These instruments help researchers gather rich and detailed information about a particular phenomenon or topic.

One of the main goals of qualitative research is to understand the subjective experiences and perspectives of individuals. To achieve this, researchers need to use instruments that allow for in-depth exploration and interpretation of data. Qualitative research instruments can take various forms, including interviews, questionnaires, observations, and focus groups. Each instrument has its own strengths and limitations, and researchers need to carefully select the most appropriate instrument for their study objectives.

Exploring qualitative research instruments involves understanding the characteristics and features of each instrument, as well as considering the research context and the specific research questions being addressed. Researchers also need to consider the ethical implications of using qualitative research instruments, such as ensuring informed consent and maintaining confidentiality and anonymity of participants.

Examples of Qualitative Research Instruments

Qualitative research instruments are tools used to collect data and gather information in qualitative research studies. These instruments help researchers explore and understand complex social phenomena in depth. There are several types of qualitative research instruments that can be used depending on the research objectives and the nature of the study.

Interviews are a fundamental qualitative research instrument that allows researchers to gather in-depth and personalized information directly from participants through structured, semi-structured, or unstructured formats.

Interviews are one of the most commonly used qualitative research instruments. They involve direct communication between the researcher and the participant, allowing for in-depth exploration of the participant’s experiences, perspectives, and opinions. Interviews can be structured, semi-structured, or unstructured, depending on the level of flexibility in the questioning process. Researchers ask open-ended questions to gather in-depth information and insights, and interviews can be conducted face-to-face, over the phone, or through video conferencing.

Focus Groups

Focus groups are a qualitative research instrument that involves guided group discussions, enabling researchers to collect diverse perspectives and explore group dynamics on a particular topic.

Focus groups are another example of a qualitative research instrument: a group discussion led by a researcher or moderator. Participants in a focus group share their thoughts, ideas, and experiences on a specific topic. This instrument allows for the exploration of group dynamics and the interaction between participants. It also allows researchers to gather multiple perspectives and generate rich qualitative data.

Observations

Observations are a powerful qualitative research instrument that involves systematic and careful observation of participants in their natural settings. This type of qualitative research instrument allows researchers to gather data on behavior, interactions, and social processes. Observations can be participant observations, where the researcher actively participates in the setting, or non-participant observations, where the researcher remains an observer.

Document Analysis

Document analysis is a qualitative research instrument that involves the examination, analysis and interpretation of written or recorded materials such as documents, texts, and audio or video recordings. Researchers analyze documents to gain insights into social, cultural, or historical contexts, as well as to understand the perspectives and meanings embedded in the documents.

Visual Methods

Visual methods, such as photography, video recording, or drawings, can be used as qualitative research instruments. These methods allow participants to express their experiences and perspectives visually, providing rich and nuanced data. Visual methods can be particularly useful in studying topics related to art, culture, or visual communication.

Diaries or Journals

Diaries or journals are qualitative research instruments that allow participants to record their thoughts, experiences, and reflections over time, providing researchers with rich, longitudinal data.

Diaries or journals can be used as qualitative research instruments to collect data on participants’ thoughts, feelings, and experiences over a period of time. Participants record their daily activities, reflections, and emotions, providing valuable insights into their lived experiences.

Surveys

While surveys are commonly associated with quantitative research, they can also be used as qualitative research instruments. Qualitative surveys typically include open-ended questions that allow participants to provide detailed responses. Surveys can be administered online, through interviews, or in written form.

Case Studies

Case studies are in-depth investigations of a particular individual, group, or phenomenon. They involve collecting and analyzing qualitative data from various sources such as interviews, observations, and document analysis. Case studies provide rich and detailed insights into specific contexts or situations.

Ethnography

Ethnography is a qualitative research instrument that involves immersing researchers in a particular social or cultural group to observe and understand their behaviors, beliefs, and practices. Ethnographic research often includes participant observation, interviews, and document analysis.

These are just a few examples of qualitative research instruments. Researchers can choose the most appropriate data collection method or combination of methods based on their research objectives, the nature of the research question, and the available resources.

Advantages of Using Qualitative Research Instruments

Gathering In-Depth and Detailed Information

Qualitative research instruments offer several advantages that make them valuable tools in the research process. Firstly, qualitative research instruments allow researchers to gather in-depth and detailed information. Unlike quantitative research instruments that focus on numerical data, qualitative instruments provide rich and descriptive data about participants’ feelings, opinions, and experiences. This depth of information allows researchers to gain a comprehensive understanding of the research topic.

Flexibility and Adaptability in Qualitative Research

Another advantage of qualitative research instruments is their flexibility. Researchers can adapt their methods and questions during data collection to respond to emerging insights. This flexibility allows for a more dynamic and responsive research process, enabling researchers to explore new avenues and uncover unexpected findings.

Capturing Data in Natural Settings

Qualitative research instruments also offer the advantage of capturing data in natural settings. Unlike controlled laboratory settings often used in quantitative research, qualitative research takes place in real-world contexts. This natural setting allows researchers to observe participants’ behaviors and interactions in their natural environment, providing a more authentic and realistic representation of their experiences.

Promoting Participant Engagement and Collaboration

Furthermore, qualitative research instruments promote participant engagement and collaboration. By using methods such as interviews and focus groups, researchers can actively involve participants in the research process. This engagement fosters a sense of ownership and empowerment among participants, leading to more meaningful and insightful data.

Exploring Complex Issues Through Qualitative Research

Lastly, qualitative research instruments allow for the exploration of complex issues. Qualitative research is particularly useful when studying complex phenomena that cannot be easily quantified or measured. It allows researchers to delve into the underlying meanings, motivations, and social dynamics that shape individuals’ behaviors and experiences.

Limitations of Qualitative Research Instruments

Qualitative research instruments have several limitations that researchers need to consider when conducting their studies. In this section, we will delve into the limitations of qualitative research instruments as compared to quantitative research.

Time-Consuming Nature of Qualitative Research

One of the main drawbacks of qualitative research is that the process is time-consuming. Unlike quantitative research, which can collect data from a large sample size in a relatively short period of time, qualitative research requires in-depth interviews, observations, and analysis, which can take a significant amount of time.

Subjectivity and Potential Bias in Qualitative Research

Another limitation of qualitative research instruments is that the interpretations are subjective. Since qualitative research focuses on understanding the meaning and context of phenomena, the interpretations of the data can vary depending on the researcher’s perspective and biases. This subjectivity can introduce potential bias and affect the reliability and validity of the findings.

Complexity of Data Analysis

Additionally, qualitative research instruments often involve complex data analysis. Unlike quantitative research, which can use statistical methods to analyze data, qualitative research requires researchers to analyze textual or visual data, which can be time-consuming and challenging. The analysis process involves coding, categorizing, and interpreting the data, which requires expertise and careful attention to detail.
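
Software can only support the bookkeeping side of this work; the interpretive decisions remain the researcher’s. As a rough illustration only (the codes, excerpts and participant IDs below are invented placeholders, not data from any study), a few lines of Python can tally how often each manually assigned code occurs and which participants contributed to it:

```python
from collections import Counter

# Minimal sketch of the bookkeeping side of qualitative coding: each transcript
# excerpt has been manually assigned one or more codes (the excerpts and code
# names below are invented placeholders, not data from any study).
coded_excerpts = [
    {"participant": "P01", "excerpt": "I kept it from my colleagues at first.",
     "codes": ["stigma", "workplace"]},
    {"participant": "P02", "excerpt": "The diagnosis explained years of symptoms.",
     "codes": ["sense-making"]},
    {"participant": "P01", "excerpt": "My manager adjusted my hours.",
     "codes": ["workplace", "support"]},
]

# Tally how often each code appears and note which participants contributed to it.
code_counts = Counter(code for item in coded_excerpts for code in item["codes"])
participants_per_code = {}
for item in coded_excerpts:
    for code in item["codes"]:
        participants_per_code.setdefault(code, set()).add(item["participant"])

for code, count in code_counts.most_common():
    print(f"{code}: {count} excerpt(s), participants {sorted(participants_per_code[code])}")
```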

Challenges in Maintaining Anonymity and Privacy

Furthermore, qualitative research instruments may face challenges in maintaining anonymity. In some cases, researchers may need to collect sensitive or personal information from participants, which can raise ethical concerns. Ensuring the privacy and confidentiality of participants’ data can be challenging, and researchers need to take appropriate measures to protect the participants’ identities and maintain their trust.
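
One small, practical measure is to replace direct identifiers with pseudonyms as soon as transcripts are created, keeping the linkage key separately and securely. The sketch below is illustrative only (the names and file layout are invented placeholders), and real de-identification usually also involves removing indirect identifiers such as workplaces and place names.

```python
import csv

# Illustrative only: assign pseudonym codes to participant names and keep the
# linkage key in a separate, access-controlled file. Names are placeholders.
participants = ["Alice Smith", "Bob Jones"]
key = {name: f"P{i + 1:03d}" for i, name in enumerate(participants)}

# Store the re-identification key away from the transcripts (restricted access).
with open("linkage_key.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.writer(f)
    writer.writerow(["name", "pseudonym"])
    writer.writerows(key.items())

def pseudonymise(text: str) -> str:
    """Replace any known participant name in a transcript with its pseudonym."""
    for name, code in key.items():
        text = text.replace(name, code)
    return text

print(pseudonymise("Interviewer: Thanks, Alice Smith, for taking part."))
```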

Limited Generalizability of Qualitative Research Findings

Another limitation of qualitative research instruments is the limited generalizability of the findings. Qualitative research often focuses on a specific context or a small sample size, which may limit the generalizability of the findings to a larger population. While qualitative research provides rich and detailed insights into a particular phenomenon, it may not be representative of the broader population or applicable to other settings.

Difficulty in Replicating Qualitative Research Findings

Lastly, replicating findings in qualitative research can be difficult. Since qualitative research often involves in-depth exploration of a specific phenomenon, replicating the exact conditions and context of the original study can be challenging. This can make it difficult for other researchers to validate or replicate the findings, which is an essential aspect of scientific research.

Despite these limitations, qualitative research instruments offer valuable insights and understanding of complex phenomena. By acknowledging and addressing these limitations, researchers can enhance the rigor and validity of their qualitative research studies.

In conclusion, qualitative research instruments are powerful tools that enable researchers to explore and uncover the complexities of human experiences. By utilizing a range of instruments and considering their advantages and limitations, researchers can enhance the rigor and depth of their qualitative research studies.




What is a Research Instrument?

By DiscoverPhDs, October 9, 2020

The term research instrument refers to any tool that you may use to collect, measure and analyse data relevant to the subject of your research.

Research instruments are often used in the social sciences and health sciences. These tools are also used in educational research relating to patients, staff, teachers and students.

The format of a research instrument may consist of questionnaires, surveys, interviews, checklists or simple tests. The choice of which specific research instrument to use is the researcher’s, and it will be strongly related to the methods used in the specific study.

What Makes a Good Research Instrument?

A good research instrument is one that has been validated and has proven reliability. It should be one that can collect data in a way that’s appropriate to the research question being asked.

The research instrument must be able to assist in answering the research aims , objectives and research questions, as well as prove or disprove the hypothesis of the study.

It should not introduce bias into the way data is collected, and it should be clear how the research instrument is to be used appropriately.

What are the Different Types of Interview Research Instruments?

The general format of an interview is where the interviewer asks the interviewee to answer a set of questions which are normally asked and answered verbally. There are several different types of interview research instruments that may exist.

  • A structured interview may be used, in which a specific set of questions is formally asked of the interviewee and their responses recorded using a systematic and standard methodology.
  • An unstructured interview, on the other hand, may still be based on the same general theme of questions, but here the person asking the questions (the interviewer) may change the order in which the questions are asked and the specific way in which they are asked.
  • A focus interview is one in which the interviewer will adapt their line or content of questioning based on the responses from the interviewee.
  • A focus group interview is one in which a group of volunteers or interviewees are asked questions to understand their opinion or thoughts on a specific subject.
  • A non-directive interview is one in which there are no specific questions agreed upon but instead the format is open-ended and more reactionary in the discussion between interviewer and interviewee.

What are the Different Types of Observation Research Instruments?

An observation research instrument is one in which a researcher makes observations and records of the behaviour of individuals. There are several different types.

Structured observations occur when the study is performed at a predetermined location and time, in which the volunteers or study participants are observed using standardised methods.

Naturalistic observations are focused on volunteers or participants being in more natural environments in which their reactions and behaviour are also more natural or spontaneous.

A participant observation occurs when the person conducting the research actively becomes part of the group of volunteers or participants that he or she is researching.

Final Comments

The types of research instruments you use will depend on the format of the research study being performed: qualitative, quantitative or mixed methods. You may, for example, utilise questionnaires when a study is more qualitative, or use a scoring scale in more quantitative studies.


Research Methodologies: Research Instruments


Types of Research Instruments

A research instrument is a tool you will use to help you collect, measure and analyze the data you use as part of your research.  The choice of research instrument will usually be yours to make as the researcher and will be whichever best suits your methodology. 

There are many different research instruments you can use in collecting data for your research:

  • Interviews (either as a group or one-on-one). You can carry out interviews in many different ways; for example, an interview can be structured, semi-structured, or unstructured. The difference lies in how formal and fixed the set of questions put to the interviewee is. In a group interview, you may choose to ask the interviewees to give you their opinions or perceptions on certain topics.
  • Surveys  (online or in-person). In survey research, you are posing questions in which you ask for a response from the person taking the survey. You may wish to have either free-answer questions such as essay style questions, or you may wish to use closed questions such as multiple choice. You may even wish to make the survey a mixture of both.
  • Focus Groups.  Similar to the group interview above, you may wish to ask a focus group to discuss a particular topic or opinion while you make a note of the answers given.
  • Observations.  This is a good research instrument to use if you are looking into human behaviors. Different ways of researching this include studying the spontaneous behavior of participants in their everyday life, or something more structured. A structured observation is research conducted at a set time and place where researchers observe behavior as planned and agreed upon with participants.

These are the most common ways of carrying out research, but it is really dependent on your needs as a researcher and what approach you think is best to take. It is also possible to combine a number of research instruments if this is necessary and appropriate in answering your research problem.


Research Methodology in Education


Research Tools: Interviews & Questionnaires

Introduction

We will start with a few key operational definitions. ‘Surveying’ is the process by which the researcher collects data through a questionnaire (O’Leary, 2014). A ‘questionnaire’ is the instrument for collecting the primary data (Cohen, 2013). ‘Primary data’, by extension, is data that would not otherwise exist were it not for the research process; it is collected through questionnaires or interviews, both of which we discuss here (O’Leary, 2014). An ‘interview’ is typically a face-to-face conversation between a researcher and a participant involving a transfer of information to the interviewer (Creswell, 2012). We will examine each data collection instrument in turn, starting with the interview.

Interviews are primarily used in qualitative research and occur when researchers ask one or more participants general, open-ended questions and record their answers. Audio recordings are often used to allow for more consistent transcription (Creswell, 2012). The researcher typically transcribes and types the data into a computer file in order to analyze it after interviewing. Interviews are particularly useful for uncovering the story behind a participant’s experiences and for pursuing in-depth information around a topic. They can also be used to follow up with individual respondents after questionnaires, for example to investigate their responses further (McNamara, 1999). In qualitative research specifically, interviews are used to pursue the meanings of central themes in the world of the subjects; the main task in interviewing is to understand the meaning of what the interviewees say (McNamara, 1999). Open-ended questions are usually asked during interviews in the hope of obtaining impartial answers, whereas closed-ended questions may force participants to answer in a particular way (Creswell, 2012; McNamara, 1999). An open-ended question gives participants more options for responding; for example, “How do you balance participation in athletics with your schoolwork?” (Creswell, 2012). A closed-ended question provides a preset response; for example, “Do you exercise?”, where the answers are limited to yes or no (Creswell, 2012).

Must-knows before the interview

The interviewer must be:

  • Knowledgeable – familiar with the topic.
  • Structured – outline the procedure of the interview.
  • Clear – provide simple, easy and short questions which are spoken distinctly and understandably.
  • Gentle – tolerant, sensitive and patient when receiving provocative and unconventional opinions.
  • Steering – controlling the course of the interview to avoid digressions from the topic.
  • Critical – testing the reliability and validity of the information that the interviewee offers.
  • Remembering – retaining the information provided by the interviewee.
  • Interpreting – offering interpretation of what the interviewee says (Kvale, 1996).

Different Types of Interviews

  • One-on-one: Most time consuming, costly approach, but most common in educational research. Completed one participant at a time, and suitable for interview participants who are not hesitant to speak.
  • Focus Group: Typically in groups of four to six.
  • Telephone: Can be easy and fast, but usually only a small number of questions can be asked.
  • E-mail: Easy to complete and allows questions and answers to be well thought out. Ethical issues may need to be addressed, for example whether the researcher has received written permission from individuals before they participate in the interview, and how the privacy of responses will be protected.
  • Open-Ended Questions on Questionnaires (Creswell, 2012). Creswell recommends using only open-ended questions during interviews, since interviews are primarily qualitative.

Structured Versus Unstructured

  • The interviewer might consider adding a summary column at the end or to the side of the interview sheet in order to fill in additional information.
  • Most interviews are a combination of structured and unstructured, allowing flexibility (Bell & Waters, 2014).
  • The interviewer might consider recording the interview or informing the participant that they will be taking notes before starting.
  • One type of unstructured interview is a ‘preliminary interview,’ where the interviewer is seeking areas or topics of significance for the interviewees (Bell & Waters, 2014).
  • Focused interview: a framework is established prior to the interview, and recording and analysis are simplified; flow between topics is uninterrupted or free-flowing (Bell & Waters, 2014).

Sequence of Questions

  • Get the respondents involved in the interview as soon as possible.
  • Before asking about controversial matters (such as feelings and conclusions), first ask about some facts.
  • Intersperse fact-based questions throughout the interview.
  • Ask questions about the present before questions about the past or future.
  • The last questions might allow respondents to provide any extra information they consider to be relevant, as well as their impressions of the interview (McNamara, 1999).
  • Questions must be worded with diligence.
  • Questions should be asked one at a time.
  • Wording should be open-ended. Respondents should have the opportunity to choose their own descriptive vocabulary while answering questions.
  • Questions should be as neutral as possible.
  • Questions should be worded clearly.
  • Be wary of asking “why” questions. This type of question may encourage a participant to answer unnaturally or feel defensive (McNamara, 1999; Creswell, 2012).

Both Creswell and McNamara highlight very similar points about conducting interviews. McNamara’s treatment is less descriptive but simpler and more concise. Another author who comes up consistently in the interviewing literature is Kvale, whose work is much more intensive and broad. All three authors are prominent in the interview research literature.

Conducting the Interview

These are the steps for conducting interviews that are consistent across the research literature (Creswell, 2012; McNamara, 1999):

  • Identify the interviewees.
  • Determine the type of interview you will use.
  • During the interview, audio-record the questions and responses.
  • Take brief notes during the interview.
  • Locate a quiet, suitable place for the interview.
  • Obtain consent from the interviewee to participate in the study.
  • Have a plan, but be flexible.
  • Use probes to obtain additional information.
  • Be courteous and professional when the interview is over.

Advantages of interviews:

  • Interviews provide useful information when participants cannot be directly observed.
  • The interviewer has better control over the types of information they receive, because they can pick their own questions.
  • If worded effectively, questions will encourage unbiased and truthful answers.

Disadvantages of interviews:

  • The interviewee may provide biased information, or the information may be unreliable if only one interviewer is interpreting it; the best research draws on many different points of view.
  • The interview answers may be deceptive because the interviewee tries to respond in a way that will please the interviewer.
  • Equipment may be a problem: it can be costly and require a high level of technical competence to use.
  • Interviews can be time-consuming, and inexperienced interviewers may not be able to keep the questions properly focused.

Questionnaires

Questionnaires have many uses, most notably to discover what the masses are thinking.  These include: market research, political polling, customer service feedback, evaluations, opinion polls, and social science research (O’Leary, 2014).

Formulating a Questionnaire

Starting out.

Bell & Waters (2014) and O’Leary (2014) each offer clear checklists for creating a questionnaire from beginning to end. By comparing the two, we have created a comprehensive list. Bell starts by reminding the researcher to obtain approval prior to administering the questionnaire, then to reflect on what the research question is and whether this is the best method to obtain the intended information (Bell & Waters, 2014). O’Leary (2014) suggests that you operationalize concepts at the beginning and define the measurable variables. Prior to writing your own questions, O’Leary (2014) would have you explore existing possibilities in order to adapt previous instruments rather than ‘reinventing the wheel’. At this point, both authors have you write your questions.

Forming questions

Bell & Waters (2014) draw on Youngman’s (1982) question types:

  • Verbal / Open

Bell & Waters (2014) highlight a plethora of potential difficulties in wording your questions, including ambiguity and imprecision, assumptions, memory, knowledge, double questions, leading questions, presuming questions, hypothetical questions, offensive questions, and questions covering sensitive issues. It is imperative that you check for jargon within your language and return to your hypothesis or objectives often to decide which questions are most pertinent (Bell & Waters, 2014).

Bell & Waters (2014) and O’Leary (2014) seem to disagree on the next step; while O’Leary would focus next on the response category, Bell would have you look further into the wording of the questions. Following O’Leary’s (2014) logic, we decide now whether to use open or closed questions, considering how the category will translate to different data types. Closed response formats include: yes/no, agree/disagree, fill in the blanks, choosing from a list, ordering options, and interval response scales. Any of the three standard scaling methods (Likert, Guttman, and Thurstone) may be used where appropriate (O’Leary, 2014).
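To make the data-type point concrete, here is a minimal Python sketch of how a closed, Likert-style item translates into ordinal numbers that can be tallied, while an open question stays as free text for qualitative coding. The item wording, the 1–5 coding, and the variable names below are illustrative assumptions, not taken from Bell & Waters or O’Leary.

```python
# Illustrative only: a closed Likert-style item yields ordinal codes that can
# be tallied or averaged, while an open question yields free text that must be
# coded qualitatively. The item wording and the 1-5 coding are assumptions.

LIKERT_CODES = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

closed_responses = ["agree", "strongly agree", "neutral", "agree"]
open_responses = [
    "I find the workload manageable most weeks.",
    "Deadlines pile up near the end of term.",
]

# Closed responses translate directly into quantitative data...
scores = [LIKERT_CODES[response] for response in closed_responses]
print(sum(scores) / len(scores))  # mean rating -> 4.0

# ...while open responses stay as text for thematic coding.
for text in open_responses:
    print(text)
```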

Bell & Waters (2014) suggest you check your wording at this point. O’Leary (2014) goes into detail about problems with questions, such as ambiguity, leading, confronting, offensiveness, unwarranted assumptions, double-barrelled questions, and pretentiousness. Questions to avoid, according to O’Leary, include:

  • Poorly worded questions
  • Biased, leading, or loaded questions
  • Recall-dependent questions
  • Offensive questions
  • Questions with assumed knowledge
  • Questions with unwarranted assumptions
  • Questions that invite socially desirable responses

Ordering Questions / Appearance and layout

Both authors emphasize thoughtfulness about the order of questions, considering logic and ease for respondents. O’Leary (2014) goes into further detail regarding issues with organization and length; if the questionnaire is too lengthy, respondents are less likely to complete it. He also suggests researchers avoid asking threatening, awkward, insulting, or difficult questions, especially at the beginning of the questionnaire. Bell & Waters (2014) take a broader view of the aesthetics of the questionnaire: leaving spaces for legibility, limiting the overall number of pages, and considering the impression the document leaves, to highlight a few examples.

Write Instructions

Clear and unambiguous instructions for respondents are emphasized by both authors (O’Leary, 2014; Bell & Waters, 2014). In both accounts this step is followed by a ‘layout’ stage, or rearranging of questions, likely because this is the best time to review once the questions and other writing are complete. O’Leary (2014) warns researchers to use professional and aesthetically pleasing formatting, and to be organized, in order to attract respondents and to lower the probability of making your own mistakes (repeating questions, for example). O’Leary’s (2014) final instruction is to include a cover letter that describes who you are, the aim of the project, assurances of confidentiality, and so on. Bell & Waters (2014), however, offer further steps.

Sample & Pilot Testing

Bell & Waters (2014) go into further detail regarding response rates and ensuring you have a representative or generalizable sample, which we believe is beyond the scope of this article. More pertinent steps are to pilot-test your questionnaire with preliminary respondents (even family and friends) and to follow through to preliminary data analysis in order to ensure your methods are effective, making adjustments accordingly (Bell & Waters, 2014). O’Leary (2014) lists the steps of a typical pilot test:

  •  Have a run-through
  •  Seek feedback
  •  Trial your statistics package
  •  Make modifications
  •  Back to the start?

Distribution

Bell & Waters (2014) briefly consider distribution methods; they emphasize the need to ensure confidentiality, to include a return date, to formulate a plan for ‘bounce backs’ via email, and to record data as soon as it arrives. O’Leary (2014) lists typical methods: face-to-face, snail mail, e-mail, and online. Bell & Waters (2014) highlight the advantage of administering your questionnaire personally, as it enables the researcher to explain the purpose of the study and increases the probability of receiving completed questionnaires in return. The authors go on to emphasize the value of online methods; in particular, they mention SurveyMonkey as the most popular and versatile survey tool available (Bell & Waters, 2014). O’Leary (2014) suggests sending out reminder letters or e-mails in order to increase the response rate and the speed of response.

Bell & Waters (2014) and O’Leary (2014) disagree once again with respect to the analysis. O’Leary (2014) suggests collecting the data as soon as possible, whereas Bell & Waters (2014) suggest the researcher merely glance through the responses prior to coding and recoding, if time allows. Both methods have merit, as the researcher must consider the time they have available, as well as the amount of data they are working with, in order to make a logical decision.

O’Leary (2014) also raises some concerns about using questionnaires as a research tool: they can be time consuming and expensive, and sampling is difficult. O’Leary (2014) asserts that questionnaires are ‘notoriously difficult to get right’ and often do not go as planned.

O’Leary (2014) suggests some obvious strengths for this research method, as administering a questionnaire allows the researcher to generate data specific to their own research and offers insights that might otherwise be unavailable. In listing the additional benefits of questionnaires, O’Leary (2014) suggests that they can:

  •      Reach a large number of respondents
  •      Represent an even larger population
  •      Allow for comparisons
  •      Generate standardized, quantifiable, empirical data
  •      Generate qualitative data through the use of open-ended questions
  •      Be confidential and even anonymous

Considerations for the Method

Cohen et al. (2013, p.394) offer special considerations for administering questionnaires within an educational setting:

  • Gaining access to schools and teachers
  • Gaining permission to conduct the research
  • Resentment by principals
  • People vetting what could be used
  • Finding enough willing participants for your sample
  • Schools suffering from ‘too much research’ by outsiders and insiders
  • Schools/people not wishing to divulge information about themselves
  • Schools not wishing to be identifiable, even with protections guaranteed
  • Local political factors that impinge on the school
  • Teachers’ fear of being identified/traceable, even with protections guaranteed
  • Fear of participation by teachers (lose their contracts)
  • Unwillingness of teachers to be involved because of their workload
  • The principal deciding on whether to involve staff, without consultation with the staff
  • Schools/institutions fears of criticism/loss of face
  • The sensitivity of the research and the issues being investigated

Bell, J., & Waters, S. (2014). Doing your research project: A guide for first-time researchers (6th ed.). Maidenhead, Berkshire: Open University Press.

Cohen, L., Manion, L., & Morrison, K. (2013). Research methods in education (7th ed.). Abingdon, Oxon; New York: Routledge. doi:10.4324/9780203720967

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). Los Angeles: Sage.

Kvale, S. (2008). Doing interviews. Thousand Oaks; London: SAGE Publications.

McNamara, C. (1999). General guidelines for conducting interviews. Authenticity Consulting, LLC. Retrieved from http://www.managementhelp.org/evaluatn/intrview.htm

O’Leary, Z. (2014). The essential guide to doing your research project (2nd ed.). London: SAGE.

Author: ADJP Quad
Published: March 7, 2016
Creative Commons CC-BY Attribution License



Researching the researcher-as-instrument: an exercise in interviewer self-reflexivity

Anne E. Pezalla

Pennsylvania State University, USA

Jonathan Pettigrew

Michelle Miller-Day

Because the researcher is the instrument in semistructured or unstructured qualitative interviews, unique researcher characteristics have the potential to influence the collection of empirical materials. This concept, although widely acknowledged, has garnered little systematic investigation. This article discusses the interviewer characteristics of three different interviewers who are part of a qualitative research team. The researcher/interviewers – and authors of this article – reflect on their own and each other’s interviews and explore the ways in which individual interview practices create unique conversational spaces. The results suggest that certain interviewer characteristics may be more effective than others in eliciting detailed narratives from respondents depending on the perceived sensitivity of the topic, but that variation in interviewer characteristics may benefit rather than detract from the goals of team-based qualitative inquiry. The authors call for the inclusion of enhanced self-reflexivity in interviewer training and development activities and argue against standardization of interviewer practices in qualitative research teams.

Introduction

Inner Silence Writing, Reflecting, Hoping Slipping into Truth Interviewing moments Take me by surprise Like Sunlight ( Janesick, 1998 : 53)

The level of researcher involvement in qualitative interviewing – indeed, the embodiment of the unique researcher as the instrument for qualitative data collection – has been widely acknowledged (e.g. Cassell, 2005 ; Rubin and Rubin, 2005 ; Turato, 2005 ). Because the researcher is the instrument in semistructured or unstructured qualitative interviews, unique researcher attributes have the potential to influence the collection of empirical materials. Although it is common for scholars to advocate for interviewer reflexivity ( Ellis and Berger, 2003 ; Pillow, 2003 ) and acknowledge the researcher as the primary instrument in qualitative interview studies ( Guba and Lincoln 1981 ; Merriam 2002 ), with some notable exceptions (e.g. Pitts and Miller-Day, 2007 ; Watts, 2008 ) few have actually examined the qualitative interview as a collaborative enterprise, as an exchange between two parties, reflecting on the ways in which the interviewer affects the organization of this talk-in-interaction and the processes by which the talk is produced. Given this, the first aim of this study is to provide a reflexive account of how three different interviewers (authors Jonathan, Annie, and Michelle) individually facilitate unique conversational spaces in their qualitative interviews.

Understanding the qualitative interview as social interaction is important for any sole qualitative researcher, but as Miller-Day et al. (2009) pointed out, this may be particularly germane for qualitative research teams (QRT). Herriott and Firestone (1983) argued that when there is more than one interviewer on a QRT, inconsistencies in interview style and approach may affect the quality of the research conversation and ultimately the study findings. Indeed, several published resources on QRTs suggest that interviewers should receive the same standard training with an eye toward producing consistent strategies and credible findings ( Bergman and Coxon, 2005 ; United States Agency for International Development’s Center for Development Information and Evaluation, 1996 ). Unfortunately, current literature addressing QRTs has primarily focused on the relationship dynamics among research team members (e.g. Fernald and Duclos, 2005 ; Rogers-Dillon, 2005 ; Sanders and Cuneo, 2010 ; Treloar and Graham, 2003 ) and on group analytical procedures (e.g. Guest and MacQueen, 2007 ; MacQueen et al., 1999 ; Olesen et al., 1994 ) rather than on the team member roles (e.g. interviewer, analyst) or data collection practices (e.g. strategies for building rapport). As QRTs are becoming more prevalent, especially in funded research ( Barry et al., 1999 ; Ferguson et al., 2009 ), there is a need for more information about how to maximize the use of multiple interviewers and maintain a focus on the unified research goals while respecting the flexibility of the in-depth qualitative interview as talk-in-interaction ( Mallozzi, 2009 ; Miller-Day et al., 2009 ). Toward that end, the second aim of this study is to reflect on and discuss implications of the study findings for qualitative research teams.

Researcher-as-instrument

The phrase researcher-as-instrument refers to the researcher as an active respondent in the research process ( Hammersley and Atkinson, 1995 ). Researchers ‘use their sensory organs to grasp the study objects, mirroring them in their consciousness, where they then are converted into phenomenological representations to be interpreted’ ( Turato, 2005 : 510). It is through the researcher’s facilitative interaction that a conversational space is created – that is, an arena where respondents feel safe to share stories on their experiences and life worlds ( Owens, 2006 ).

Across the years, scholars have considered the nature of the researcher-as-instrument as interpreter of empirical materials and as involved in the construction of ideas ( Janesick, 2001 ; Singer et al., 1983 ). This consideration began to grow after feminist UK scholars such as Oakley (1981) and Graham (1983) criticized quantitative-based research methods that assumed a detached and value-free researcher in the acquisition and interpretation of gathered data, and was further developed by feminist ethnographers such as Stack (1995) , who offered seminal research on ‘dramatizing both writer and subject’ in fieldwork on neighborhoods and communities (p. 1). More recently, scholars have extended their interest in the researcher-as-instrument to consider specific interviewing strategies. Conversation analysis tools have often been used to examine the intricacies of interview conversations, studying the ways in which the ‘how’ of a given interview shapes the ‘what’ that is produced ( Holstein and Gubrium, 1995 ; Pillow, 2003 ).

While qualitative scholars agree that a conversational space must be created, they often disagree as to what that space should look like. Some scholars argue for a Rogerian interviewing space, where empathy, transparency, and unconditional positive regard are felt ( Janesick, 2001 ; Mallozzi, 2009 ; Matteson and Lincoln, 2009 ). Pitts and Miller-Day (2007) documented specific trajectories experienced by qualitative interviewers when establishing rapport with research participants, and the authors argue that a feeling of interpersonal connection was necessary for the qualitative interviewer and interviewee to develop a partnership. These claims are grounded in the feminist or poststructuralist perspective, which holds that ‘the essential self … is not automatically revealed in a neutral environment but can and might need to be benevolently coaxed out into a safe environment, where it can be actualized’ ( Mallozzi, 2009 : 1045).

Others advocate against a feminist approach to interviewing. Tanggaard (2007) , for example, viewed empathy to be a dangerous interviewer quality because it tends to create a superficial form of friendship between interviewer and respondent. Self-disclosure has been similarly critiqued ( Abell et al., 2006 ). These critics hold that self-disclosure may actually distance the interviewer from the respondent when the self-disclosure portrays the interviewer as more knowledgeable than the respondent. These studies question the popular assumption that displays of empathy or acts of self-disclosure are naturally interpreted by the respondent as a means of establishing a conversational space of rapport and mutual understanding.

So where do these opposing viewpoints lead us as researchers? For the three of us who are authoring this article, the answer to that question is an unsatisfactory, ‘we are not sure.’ Working as part of a QRT, we were trained in a systematic manner, provided with clear procedures for carrying out our qualitative interviews, and educated in the ultimate goals of the research project. The interviewees in this team project were a fairly homogenous group – rural 6–7th grade students – and all three of us interviewed youth in both grades, both male and female, gregarious and stoic. Yet, the interviews we conducted all turned out to be very different. What stood out to us was that our individual attributes as researchers seemed to impact the manner in which we conducted our interviews and affected how we accomplished the primary objective of the interviews, which was to elicit detailed narratives from the adolescents. Hence, we set forth to better understand how we, as research instruments, individually facilitated unique conversational spaces in our interviews and determine if there were some researcher attributes or practices that were more effective than others in eliciting detailed narratives from the adolescent respondents. Additionally, we sought to reflect on the emergent findings and offer a discussion of how unique conversational spaces might impact QRTs.

Gathering and analyzing empirical materials

The team-based qualitative research: participants

The empirical materials for the current study came from a larger study designed to understand the social context of substance use for rural adolescents in two Mid-Atlantic states. A total of 113 participants between 12 and 19 years old ( M = 13.68, SD = 1.37) were recruited from schools identified as rural based on one of two main criteria: (a) the school district being located in a ‘rural’ area as determined by the National Center for Education Statistics (NCES, n.d.); and (b) the school’s location in a county considered ‘Appalachian’ according to the Appalachian Regional Commission (ARC). Participating schools served a large population of economically disadvantaged students, identified by family income equal to or less than 180 percent of the United States Department of Agriculture federal poverty guidelines; these guidelines start at an annual salary of $20,036 and increase by $6,919 for each additional household member ( Ohio Department of Education 2010 ).
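As a concrete reading of that eligibility arithmetic, the following minimal Python sketch treats the quoted figures as the income cut-off for a one-person household plus a per-member increment. The function names and this reading of the figures are assumptions for illustration only, not part of the original study.

```python
# Hypothetical illustration of the income-eligibility rule described above,
# assuming the quoted figures are the cut-offs against which family income
# was compared: $20,036 for a one-person household, plus $6,919 for each
# additional household member.

BASE_THRESHOLD = 20_036       # quoted starting figure (one-person household)
PER_MEMBER_INCREMENT = 6_919  # quoted increase per additional member


def income_threshold(household_size: int) -> int:
    """Annual income cut-off for a household of the given size."""
    return BASE_THRESHOLD + PER_MEMBER_INCREMENT * (household_size - 1)


def is_economically_disadvantaged(family_income: float, household_size: int) -> bool:
    """True if the family falls at or below the cut-off for its size."""
    return family_income <= income_threshold(household_size)


print(income_threshold(4))                       # 40793
print(is_economically_disadvantaged(35_000, 4))  # True
```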

Interview team

Eleven interviewers comprised the qualitative research team for this team-based study. All underwent at least four hours of interviewer training, which reviewed interview protocol and procedures, summarized guidelines for ethical research, and included interview practice and feedback. During training, interviewers were given a clear interview schedule. Because the interviews were semistructured, the interviewers were instructed to use the schedule as a guide. They were instructed not to read the questions word-for-word from the interview schedule, but instead to use their own phrasing for asking each question, use additional probes or prompts if necessary, and use a communication style that felt comfortable and natural to them. Interviewers were also instructed to interact with their participants as learners attempting to understand the participants’ experiences and realities from their perspectives ( Baxter and Babbie, 2004 ). All interviewers on the team participated in mock interview sessions and were provided with initial feedback about their interview skill.

The interviews themselves were conducted in private locations within the schools such as guidance counselors’ offices or unused classrooms or conference rooms. In most cases, either the adult school contact or the study liaison brought students to their interview site to ensure that the interviewer did not know the students’ names – only their unique identification number. Researchers assured all students their responses would remain confidential, in accordance with Institutional Review Board standards, and the interviewee was permitted to withdraw his/her data from the study at any time. All interviews were digitally recorded and ranged from 18–91 minutes in length. This length is typical of interviews dealing with sensitive topics such as drug use in a school-based setting ( Alberts et al., 1991 ; Botvin et al., 2000 ).

The present study: Three Voices in the Crowd

Interview sample.

For the purpose of the present study we all agreed that self-reflexivity was necessary to ‘understand ourselves as part of the process of understanding others’ ( Ellis and Berger, 2003 : 486), increase the transparency of our findings, and increase the legitimacy and validity of our research. Therefore, we elected to limit our analysis to only those interviews that the three of us conducted, excluding transcripts from the other eight interviewers in the team-based study. Transcripts of the interviews were provided by a professional transcriptionist who was blind to the purpose of the study. A total of 18 interviews were transcribed (six per interviewer). Further refining the sample, we elected to analyze only interviews that we deemed to be of sufficient quality. Transcript quality was based on two indicators: (a) the level of transcription detail; and (b) the ability of the respondent to speak and understand English. Transcripts that were poorly done (i.e. that failed to include sufficient detail from the interview audio file) or that indicated that the respondent did not understand English were rated as low quality and were not included in final analyses. We took this step to ensure that all transcripts in the study sample were of sufficient quality and provided adequate detail to decipher our interviewer practices. From the 18 originally submitted transcripts, we found 13 to be of sufficient quality, and retained them for analysis.
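The screening just described amounts to a simple two-criterion filter. The sketch below is a hypothetical illustration only; the field names and transcript IDs are invented, and the authors report applying these quality judgments manually, not in code.

```python
# Hypothetical sketch of the transcript screening described above: retain only
# transcripts that (a) contain sufficient transcription detail and (b) come
# from respondents able to speak and understand English. Field names and IDs
# are invented for illustration.

transcripts = [
    {"id": "T01", "sufficient_detail": True,  "respondent_english": True},
    {"id": "T02", "sufficient_detail": False, "respondent_english": True},
    {"id": "T03", "sufficient_detail": True,  "respondent_english": False},
]

retained = [
    t for t in transcripts
    if t["sufficient_detail"] and t["respondent_english"]
]

print([t["id"] for t in retained])  # ['T01'] -- only the first toy transcript passes
```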

Analysis procedures

Following Baptiste’s (2001) advice, the first step in our analysis was to acknowledge our interpretivist orientation and to honestly discuss among ourselves the risks involved with self-reflexively examining our own work. If you think it is difficult to listen to your own voice in an audio-recording, imagine listening to your own voice and simultaneously reading the text illustrating your own interview errors, dysfluencies, and awkward pauses! This first step was perhaps the most difficult, but it resulted in a shared agreement for honest self-reflection and analysis.

The next step involved restricting our analysis to three specifically selected topics from the research interview. The three discussion topics included rural living, identity and future selves, and risky behavior. We identified these topics of discussion because they each represented a different level of emotional risk for the respondents ( Corbin and Morse, 2003 ), based on the assumptions that (a) respondents were all relatively similar in their emotional well-being – specifically, that none were too emotionally fragile to engage in a conversation with us, and (b) discussing topics of illegal or private activities would arouse more powerful emotions in respondents than would topics of legal and mundane activities. Across the entire sample of interviews, conversations on rural living were seen as fairly low-risk topics of discussion. The topic often served as a warm-up for many interview conversations because the topic was easy for respondents to discuss. Conversations on identity and future selves were typically perceived as moderately uncomfortable for respondents. Respondents were asked to talk about their personality characteristics and who they wanted to become in the future. Although some respondents appeared to enjoy the opportunity to talk about themselves, many appeared mildly uncomfortable doing so, perhaps because they were being asked to talk about themselves with someone they did not know. Conversations on risky behavior were often perceived to be more dangerous. Despite being reassured that their stories would remain confidential, respondents were nevertheless being asked to disclose information about potentially illegal activities in which they had taken part. These topics of discussion were not always mutually exclusive (e.g. respondents often talked about risky behavior when they discussed rural living); but, because every interview in the larger study included topics of discussion that were low, moderate, and highly sensitive, we believed that the three chosen topics of discussion represented an appropriate cross-section of the interview.

Dividing interviews into topics of discussion provided a way to organize long transcripts into relatively distinct topical areas. It also allowed us to examine interviewer practices across comparable topics of discussion, and to assess the ways in which particular characteristics facilitated different conversational spaces.

The next step involved identifying and labeling the discussion of each of the three topics within each of the 13 transcripts. As we labeled the related passages in the transcripts, each of us followed the same iterative analytic process, commencing with an analysis of our own individual transcripts and followed by a cross-case analysis of each others’ transcripts. Our individual, within-case analysis proceeded along four main steps: reading through our own transcripts 2–3 times before extracting the separate topics of discussion; then within each topic of discussion across all of our own interviews, we inductively identified, interpreted, and labeled what we each saw as important in the utterances, sequencing, and details of the conversational interaction, assessing the ways in which interviewer practices seemed to facilitate and to inhibit respondent disclosure. For our purposes, we defined an interviewer practice as an action performed repeatedly. These practices were eventually categorized into groups of interviewer characteristics. We conceptually defined an interviewer characteristic as a distinguishing general feature or overall quality of the interviewer. Throughout this process we individually developed and refined our code lists, discussing our emergent codes with one another via weekly meetings and email correspondence. As part of this process, we coded our own transcripts and then shared and discussed our code list with the others. Next, each of us (re) coded a portion of each other’s transcripts and calculated the percentage of raw coding agreement. Disagreements were negotiated until we all reached consensus on a working list of codes. This cross-case analysis did not commence until we had reached a minimum coding agreement of .80. Within the topic of rural living, for example, if two of us each generated five codes to describe one interviewer’s researcher-as-instrument characteristics, consensus was necessary on at least four of those codes before a trustworthy assessment could be made.
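One simple way to operationalize the raw coding agreement check described above is sketched below in Python. The agreement formula (shared codes over the larger code list) and any code labels beyond Annie’s three reported characteristics are assumptions for illustration, not the authors’ exact procedure.

```python
# A minimal sketch of a raw coding agreement check: two coders label the same
# interviewer's characteristics with sets of codes, and cross-case analysis
# proceeds only once the proportion of shared codes reaches 0.80.
# The formula and the codes "probing"/"warm"/"neutral" are illustrative
# assumptions; "affirming", "energetic", "interpretive" come from the text.

def raw_agreement(coder_a: set, coder_b: set) -> float:
    """Proportion of codes the two coders assign in common."""
    if not coder_a and not coder_b:
        return 1.0
    return len(coder_a & coder_b) / max(len(coder_a), len(coder_b))


coder_one = {"affirming", "energetic", "interpretive", "probing", "warm"}
coder_two = {"affirming", "energetic", "interpretive", "probing", "neutral"}

agreement = raw_agreement(coder_one, coder_two)  # 4 of 5 codes shared -> 0.80
print(f"raw agreement = {agreement:.2f}")

if agreement < 0.80:
    print("Negotiate disagreements before starting cross-case analysis.")
```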

During the cross-case analysis we compared and contrasted the coded material within and across the entire sample of transcripts to identify discrepancies and consistencies in our codes. From this process, we reduced the code list to a common set of researcher-as-instrument characteristics and interviewing practices that were present in the utterances, sequencing, and details of the conversational interactions. Throughout this process we explicitly identified evidence (excerpts from the interview transcripts) for any research claim to connect the empirical materials with any findings ( Maxwell, 1996 ). The three of us met periodically to conference, share ideas, and challenge and refine emergent findings. We used Nvivo 8 to manage and analyze the interview data. In the end, we were able to (a) identify and describe individual interviewer practices that served to characterize each of us as individual interviewers, and (b) compare and contrast our individual differences within and across the different topics in the interview conversation. During this comparison we paid special attention to the adolescent’s contribution to the conversation and his or her level of disclosure.

Interviewer characteristics

Annie’s general interviewer characteristics were coded as affirming, energetic , and interpretive. The affirming characteristic was defined as ‘showing support for a respondent’s idea or belief’ and is illustrated in the following excerpt:

Annie: What do you do?
Resp: I help the milkers, I help –
Annie: You know how to milk a cow? That’s so cool, that’s great.
Resp: Yeah, but you have to watch out ’cause they kick sometimes. ’Cause they don’t want you messing with their teats – they kick, it’s, uh …
Annie: Have you been kicked?
Resp: I got kicked in the arm, but I’m scared I’m gonna get kicked in the face one of these days.
Annie: Yeah, that would really hurt, huh? Oh, wow, that’s amazing.

Comments like ‘that’s so cool, that’s great,’ and ‘Oh, wow, that’s amazing’ illustrated the affirmation. Annie’s affirming characteristic could be seen in other transcript passages in phrases such as ‘great,’ ‘awesome,’ ‘amazing,’ and ‘excellent.’ Annie’s interviewer characteristics were also coded as energetic , defined as ‘showing wonder, astonishment, or confusion by something respondent said that was unexpected, or remarkable.’

Annie: So you like dirt bikes. Do you have one of your own?
Resp: Yeah, I have a, it’s a one, it’s a two-fifty. It’s like a, it’s a CRX 250, it’s like …
Annie: Oh, wow! Is it a pretty big bike? Wow, what do you like to do on it?
Resp: I just ride around in the fields and usually chase after deer on it.
Annie: Really!

Annie: Um, is your sister older or younger?
Resp: She’s younger, she’s ten.
Annie: So you kinda look out for her?
Resp: Yeah. She likes to feed the calves.
Annie: Oooooh!! Cute little baby calves. That’s neat. Wow! How unique. That’s really, really cool.

Annie: What’s a – dwarf bunny? What is that?
Resp: Yeah, they’re like little bunnies – they’re about that big.
Annie: Like real bunnies?
Resp: Yeah, they’re about that big –
Annie: Oh, dwarf bunnies. Oooh!

The sheer number of exclamation marks in Annie’s transcripts illustrated her energetic interviewer characteristic, but the words she used (wow, really, oooooh) also illustrated the lively quality of her interview approach.

Lastly, Annie was also characterized as being interpretive , conceptually defined as ‘expressing a personal opinion or interpretation regarding something a respondent said.’ For example:

Resp: And I chugged it and like, I passed out.
Annie: Did you have to go to the hospital?
Resp: Oh no. We were in the middle of the woods and we weren’t saying anything ’cause we all would get busted.
Annie: Oh my gosh, oh, you must have felt terrible.

Annie: Do you think that he drinks beer, or does chew or smokes cigarettes?
Resp: He probably does, but –
Annie: Do you think so? Um, and so when he offered this to you, were you, were you uncomfortable? Like, did you feel kind of weird?

In all of the above passages, Annie’s interpretive nature is evident in instances where she offers her own construal of the respondent’s story (e.g. ‘you must have felt terrible’), or when she creates a hypothetical scenario for the respondent to comment on (‘do you think he drinks beer?’). Such utterances illustrate her tendency to offer an opinion, either in response to a respondent’s story or before a conversation formally began.

Jonathan’s interviewing was characterized by neutrality and naivety. The neutral interviewer characteristic, defined as ‘not engaged on one side of argument or another; neither affirming nor disapproving of respondent’s stories,’ was best illustrated by the lack of extensive commentary Jonathan provided in his interviews. In comparison to Annie’s transcripts, Jonathan’s transcripts were characterized by shorter utterances, fewer opinionated responses, and very few exclamation marks:

Jonathan: Who were you living with in [name of town]?
Resp: My mom. But she, my grandma got custody of me, so.
Jonathan: What, what happened to do that? Like, what, what brought you?
Resp: Well, I got put in [the local in-patient treatment facility] ’cause I said I was gonna kill myself.
Jonathan: Oh, okay.

Jonathan: Okay. What, um, so does your dad mind if you drink then? Like, if he found out that you were going to the bar party and that you had gotten drunk, what would he say?
Resp: He probably wouldn’t do anything because, like, I used to have parties at his house, at my dad’s house. But then he got, then he went to jail, so we stopped [lowers tone, quieter] In case, like, ’cause they were keeping a good eye on him after he got out.
Jonathan: Mm hmm.
Resp: So we stopped having parties there, just so that, like, my dad wouldn’t get in trouble for, like, the underage drinking.
Jonathan: Okay.

It was often difficult to even see evidence of Jonathan’s ‘footprint’ in his transcripts because he maintained a fairly minimal presence in his interviews. As seen from the illustrations above, Jonathan kept many of his responses or comments to single-word phrases: ‘Okay,’ ‘Mm hmm,’ or ‘Yeah.’ When Jonathan did offer more extensive commentary, it was often to acknowledge his lack of understanding about a subject. His transcripts often included passages like ‘I’ve never been here before’ or ‘I don’t know anything about that.’ It was in these instances that Jonathan’s naive interviewer characteristic, defined as showing a lack of knowledge or information about the respondent, was best illustrated:

Jonathan: Is it like illegal? Or is it like the whole town shuts down, they do racing down the streets?
Resp: It’s illegal.
Jonathan: Yes? I don’t know – you got tell me these things. I am learning.

These illustrations of naivety were most likely uttered to give the respondent a sense of mastery over the interview topics of discussion, and to elicit the respondent’s interpretations of the events or topics of discussion.

Michelle’s interviewer characteristics illustrated different qualities than either Jonathan or Annie. Michelle’s qualities as an interviewer were coded as being high in affirmation and self-disclosure. Michelle’s transcripts were filled with encouragement and compliments toward her respondents. The following utterances from Michelle illustrate this characteristic:

My goodness, you are smart for a seventh grader … It sounds like you are very helpful … Yes, that is a skill that you have there, that not a lot of people do have …

These instances of affirmation, defined as ‘showing support for a respondent’s idea or belief,’ were found in almost every topic of discussion. Michelle’s transcripts were also filled with instances of self-disclosure. Michelle often used stories of her adolescent son when she was explaining a topic that she wanted to discuss with the adolescent respondents:

Resp: On Friday nights, tonight I’ll go to my gran’s and we usually have a get-together and just play cards, it’s just a thing we do. I like it. It’s just time to spend with family.
Michelle: Absolutely. Well, that sounds really nice. And I have a 14-year old in eighth grade. And every Sunday night, we do the game night sort of thing and I look forward to it.

The passages above illustrate three distinct interviewer characteristics: one high in affirmations, energy, interpretations ; another characterized by neutrality and naivety ; and another high in affirmations and self-disclosure . Although all three interviewers demonstrated other instrument qualities in their interviews, the few qualities associated with each interviewer above were found in nearly every topic of discussion (e.g. in almost every conversational topic for Annie, there was evidence of her affirming, energetic , and interpretive interviewer characteristics). These qualities seemed to characterize the unique style of the interviewers rather than reflect reactions to specific contexts. These qualities also persisted in our other interviews not included in these analyses.

Topics of discussion

In the following section, we compare our general interviewer characteristics across the three topics of discussion: rural living, identity and future selves, and risky behavior. We also examine the ways in which our respective interviewer characteristics appeared to influence the conversational space of our interviews. Specifically, we assess how the various interviewer characteristics seemed to facilitate or inhibit respondent disclosure.

Low threat topic: Rural living

Rural living was generally a low-risk topic. In her discussion of this topic with one adolescent, Michelle tended to utilize her self-disclosing characteristic:

Michelle: Are there groups or, like, not cliques, I don’t wanna say, but groups in school; kids who are more like you, who are more into the computers, versus the kids who are huntin’ and fishin’, versus the jocks? I know at my son’s school there are.
Resp: There’s not really anybody like that here. Like all of my friends who are like that, they’re in a higher grade than me. But there are some people in my grade where I can relate to in a sense, yeah.
Michelle: Okay, so most kids you can relate to are older but most o’ the kids, your peers and your age, are more into the four wheeling and hunting and fishing and kinda stuff like that? That must feel, well, I don’t know, I’m, I’m projecting now unto my own son because sometimes he feels like, that you know, it’s just ridiculous.
Resp: Yeah.
Michelle: It, eh, ya’ know – and you feel kinda stuck.
Resp: Mmm hmm.
Michelle: Yeah?
Resp: Yeah. I just, like I’ll be sitting there in class and then they’ll start talking about hunting or fishing and I just wanna pull out my hair’ cause I, I don’t know how you can like that stuff. Like it’s just sitting there for a couple of hours doing nothing.
Michelle: Right, right.

From the excerpt above, the respondent’s experience with school crowds did not appear to coincide with Michelle’s understanding of her son’s experience with school crowds. However, Michelle’s self-disclosure seemed to open up the conversational space for the respondent to respond in kind. In the final passage, the respondent offered a different perspective on the nature of crowds in his school.

Conversely, in his conversations with respondents about rural living, Jonathan tended to demonstrate his naive interviewer characteristic:

Jonathan: Is this [name of X town]? Is that where you live now? I don’t even know where I am. Okay, okay. I thought this was [name of Y town] is why, but it’s just the name of the High School.
Resp: Well, this is [name of Y town], but [name of X town] is out near.
Jonathan: Uh, I’m not, I don’t know this area so well …
Resp: And then, like, when you hit, there’s this big huge fire station … and then there’s the [name of X town] Elementary School. And then if you go down there and then you turn and you go up, and then that, like, that whole area in there is [name of X town].
Jonathan: Okay.
Resp: And then you go back and where there’s classified as [name of X town], but it’s actually [name of Z town].
Jonathan: Okay.

In response to Jonathan’s naivety (‘I don’t even know where I am’ and ‘I don’t know this area so well’), the respondent appeared to seize the opportunity to teach Jonathan about the area. The respondent did not simply answer Jonathan’s questions; he provided information about which Jonathan did not ask (e.g. the whereabouts of the fire station, elementary school, and nearby towns).

In contrast, Annie’s conversations about rural living were filled with her energetic interviewer characteristic:

Annie: What do you mean by hang out, like what do you ha-, what do you do when you hang out?
Resp: We go four wheeler riding.
Annie: Oh, four wheeler riding! Cool! Is that dangerous? Is it?
Resp: Yeah, and we go up to our camp we built. Um …
Annie: That you and your friends built?
Resp: Mmm hmm.
Annie: Wow! How did you know how to do all that?
Resp: Um, my brother and a couple of his friends, that we’re really good friends with, helped us. And like, over the summer we camp out like every night. Like, I’m never home in the summertime, ever.
Annie: Wow!
Resp: There are three bedrooms and it’s, has a wood burner and it, yeah.
Annie: That’s like, that sounds like a real house. That’s amazing.
Resp: We built it out of trees. We had our, couple of our friends and our dads help us. We’ve had it for three years and it’s really nice.

After Annie’s lively reply to the respondent’s interest in four wheeler riding (‘Oh, four wheeler riding! Cool!’), the respondent opened up about a different, but related topic: her summer camp house. Moreover, Annie’s energetic comment about the house (‘Wow! How did you know how to do all that?’) seemed to open the conversational space even more, as the respondent explained the ways the house was built, the amenities of the house, and the amount of time she spent in the house during the summer.

Moderate threat topic: Identity and future selves

Conversations about the adolescents’ identity and future selves were considered moderately uncomfortable for adolescents. The interview questions prompted the adolescents to talk about the qualities that described their personal and social identities, along with any hopes and aspirations they had for the future. Although the interview questions were designed to be as unobtrusive as possible, the topic was fairly personal. The interview questions required the adolescent respondent to be introspective with someone with whom they had no personal history:

Jonathan: After you’re all done with school, so you go through and you graduate from a high school. What do you want to do after that?
Resp: Go back to Mexico and visit my family, and um get a job.
Jonathan: Back in Mexico?
Resp: It doesn’t really matter where, but just like get a job.
Jonathan: Yes. What kind of job?
Resp: Probably like a secretary or whatever job they give me, except prostitute.
Jonathan: None a’ that. Is there anything you worry about in that transition of how you’re going to go get a job and what kind of job you’ll get, things like that?
Resp: Not really, because like, you just have to like – I dunno, just like – just like – go on with life and whatever happens, just, take it.

Here again, Jonathan’s neutrality was demonstrated not by what he said, but what he did not say. Despite the fact that the adolescent shared a potentially troubling disclosure, that she would consider any job except prostitution, Jonathan kept his personal reactions to a minimum and provided only a short response (i.e. ‘None a’ that’). After this instance of neutrality, Jonathan moved on to a different topic (i.e. asking the respondent if she had any concerns about getting a job in the future), and the respondent moved on, as well, dutifully answering his questions. She provided no more information on her prostitution comment.

In comparison to Jonathan, Michelle and Annie’s utterances in their conversations on identity and future selves were replete with codes for affirmation:

Resp: I wanna be a pediatrician nurse or something. Like, I love kids to death. Like, I’ve, I learned how to change – I’ve been changing diapers – this is no lie – I’ve been changing diapers since I was like seven years old. ’Cause my mom, step-mom, had a baby before my dad left again, and like I was always changing her diapers and stuff, and like, I babysit constantly.
Annie: Aww, I bet you’re really good with kids.
Resp: Oh, I’m amazing. Like, there’s this one little boy, like he goes to my church, he’s just like four, and I took him to my house one day and like he asked his mom to buy him a toy at the toy store, I cried, she’s like, she’s like, ‘Aww, I can’t sweetie, I don’t have the money’ and he was crying, he and he’s like ‘All my friends have toys. He was like two and he, like he, like he goes over to this daycare and he’s like ‘All my friends have these toys but I don’t have any.’ Like he had no toys at all and like my mom gave them, handed me a hundred dollars and she’s like ‘Go to, go, go buy toys. We gave him a hundred dollars, like we gave him all this money and they went out and bought like a b-, toys and stuff. It was really nice.
Annie: That is, that’s really neat.

Michelle: So the first question that I have here is which of these things that you wrote down are you most proud of?
Resp: Well, being helpful.
Michelle: How are you helpful?
Resp: Well. In school, there are some people that don’t like speak English that well. And I help them by translating.
Michelle: Oh okay. Like you are doing for your teacher in there. You are helping do that. So how long have you been bilingual your whole life? Do both of your parents speak Spanish?
Resp: Well, yes, they are Mexicans. They barely know English.
Michelle: And they barely know English. And when did you come here?
Resp: When I was nine months old.
Michelle: When you were a baby. And before that you lived where?
Resp: In Mexico.
Michelle: Mexico. So you are 13, so that was when you were a year old. Okay, got it. Okay, so you learned here. So you speak English better than they do it, sounds like. Okay and then you translate. What’s that like translating for them?
Resp: Well, for me it’s like sometimes difficult because I never went to school in Mexico and I know more English than Spanish and when I am translating it’s difficult for me. The big words my parents tell me to try to translate it in English.
Michelle: Okay. So you’re doing both ways. You’re doing from English to and from Spanish to English. Both. Does that feel like a lot of responsibility for somebody your age?
Resp: Yeah, especially when I got field trips stuff like that. I need to tell my parents, that my parents or if my parents needed something that comes in the mail, may be bills or something like that.
Michelle: It sounds like you are very helpful. Who do you want to be when you are out of after high school?
Resp: Since I like to help out people a lot, I mean, maybe be a translator and maybe in a hospital or in a school so –
Michelle: Yes, that is a skill that you have there, that not a lot of people do have. So that’s – I’m glad you realized that, in terms of that.

Annie’s affirming characteristic could be seen in her affirmation of her respondent’s compassion for children (‘I bet you’re really good with kids’); for Michelle, the characteristic could be seen in her affirmations of her respondent’s willingness to help her parents, teacher, and classmates with their English or Spanish (‘… it sounds like you are very helpful’). Both Michelle and Annie’s affirmation seemed to foster a conversational space that was conducive for uninhibited self-disclosure. In response to Annie’s affirmation about owning a daycare someday, the respondent opened up to talk about her talents in working with children, and her compassion for the children in her community who were less fortunate than she. In response to Michelle’s affirmations about the responsibilities of translating for so many people, the respondent expounded on the difficulties of such a responsibility, and the tasks she must perform for various people (e.g. helping her classmates on field trips, assisting her parents with bills).

High threat topic: Alcohol, tobacco, and other drug use

Discussions about alcohol, tobacco, and other drug usage (ATOD) were considered highly sensitive topics, as adolescents were often encouraged to disclose information about their own or their peers’ drug use. Although the respondents were continually reassured that the information they provided was confidential, disclosing information about illegal activity to a stranger was likely to feel highly sensitive. When discussing ATOD with adolescents, each interviewer utilized a different interviewer characteristic. Jonathan’s dominant characteristic when discussing this topic was neutrality :

Resp: Her parents’, like, bar. Like, they own this big, huge bar. And then, like, in the back where the kids can go.
Jonathan: Oh, okay.
Resp: And her parents don’t really care if you drink.
Jonathan: Oh, okay.
Resp: Just as long as you do it in the bar. You don’t just go outside, or you don’t tell your parents.
Jonathan: Okay.
Resp: She doesn’t really know that we drink, but we usually crash in the van, in the RV.
Jonathan: Uh huh.
Resp: … or out in the yard. And we only do the RV in the summer or in the spring. And then at my other friend’s house who has the bar, we stay at, we do the, we have parties there all the time.
Jonathan: Mm hmm.
Resp: Just cause her parents don’t care.
Jonathan: Yeah.

Even in the midst of some fairly controversial topics of discussion (e.g. underage binge drinking), Jonathan’s neutral characteristic was consistently demonstrated in his calm, even responses (‘okay,’ ‘uh huh’). These neutral responses seemed to provide an unobtrusive backdrop for the respondent to discuss her experiences. Indeed, Jonathan did not even need to ask the respondent any questions. With minimal prompting, the respondent shared her story.

In comparison to Jonathan, when discussing ATOD, Annie’s approach was coded as interpretive ; she often interjected commentary about the respondents’ stories of risky behavior:

Annie: Do you think that he drinks beer, or does chew or smokes cigarettes?
Resp: He probably does …
Annie: Um, and so when he offered this to you, were you, were you uncomfortable? Like, did you feel kind of weird?
Resp: Mm hmm.
Annie: Um, and, and maybe that boy’s brother – like, that guy’s brother – he might smoke or drink from time to time, but, um, that’s about it?
Resp: Mm hmm.
Annie: It doesn’t seem like too many kids around here do that stuff.
Resp: Not as I know.

Annie’s interpretive characteristic stands in stark contrast to Jonathan’s neutral characteristic. Whereas Jonathan’s responses were short and dispassionate, Annie’s responses were somewhat opinionated. These interpretive comments did not seem to generate a conversational space conducive for the respondent’s continued disclosure. Indeed, the transcript above shows that most of the commentary came from Annie, not the respondent.

In discussions on risky behavior, Michelle’s self-disclosing characteristic was evidenced by her stories of her 14-year-old son, and appeared to serve as a point of identification with respondents:

Resp: My parents get mad because I listen to music a lot and I don’t do anything than watch TV. Just hang out with my friends.
Michelle: Then your parents get mad because that’s all you do. You know but the good thing about me is I’m not your parent and I don’t care. So I just want to know what kids are doing. It’s, you know, I have an eighth grader actually he’s 14. And that’s exactly what he does. And in the winter it stinks, though you are right because what else is there to do? You know it’s the question, um any way, okay. So, do you know my question to you is, and again, this is purely confidential, we don’t know names we don’t want names or anything. Has anybody ever offered you any alcohol or cigarettes or marijuana or any of those? And have you said yes or no to that?
Resp: Yes, they offered me and I’d always told them ‘no’ and what it does.
Michelle: Okay, so tell me … pretend that we’re shooting this video. Okay tell me the who when what where why and how. Right? Where were you, not who, not a name. But was it a friend who was older, younger, male, female? That kind of thing. Tell me the story of at least one of these offers.
Resp: Okay. I was hanging out with my friends, just walking around, and there is this bigger kid that we know and he was joined by these smokers, and they would always, he would always tell me never to smoke and we just saw him … And then he offered us and we said no. This is not good for you and he plays soccer and he is not really good at soccer.

Michelle’s self-disclosure about her son experiencing similar challenges as the respondent was initially met by the respondent with a short response. However, Michelle’s subsequent question, framed as a hypothetical task (‘ pretend that we’re shooting this video ’), seemed to create an opening in the conversational space for the respondent to share a story.

Summary and discussion

In looking closely at the different practices we employed as interviewers, we were able to identify a variety of distinguishing features that seemed to characterize each of us uniquely. If we were characters in a novel or play, Annie’s character name would be energy , Jonathan’s neutrality , and Michelle’s self-disclosure . Across the different conversation topics in the interview, from low to high risk, these interviewer characteristics functioned differently in eliciting detail from adolescent respondents.
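To make that kind of comparison concrete, here is a minimal, purely illustrative sketch (in Python) of the bookkeeping such an analysis might involve: tallying coded utterances by interviewer and by code. The interviewer names match the study, but the data and code labels below are invented and do not reproduce the authors’ actual procedure.

```python
# Illustrative only: a toy tally of coded utterances per interviewer.
# The interviewer names match the study; the coded data are invented.
from collections import Counter

coded_utterances = [
    # (interviewer, topic, code) -- hypothetical codes assigned during analysis
    ("Annie", "rural living", "energy"),
    ("Annie", "identity", "affirmation"),
    ("Jonathan", "ATOD", "neutrality"),
    ("Jonathan", "identity", "neutrality"),
    ("Michelle", "ATOD", "self-disclosure"),
    ("Michelle", "identity", "affirmation"),
]

tally = Counter((interviewer, code) for interviewer, _, code in coded_utterances)
for (interviewer, code), n in sorted(tally.items()):
    print(f"{interviewer:9s} {code:15s} {n}")
```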

When the adolescents and researchers discussed the low-risk topic of rural living, the three interviewer characteristics (i.e. energy, neutrality, or self-disclosure) generated sufficiently detailed responses from the respondents. Variance across interviewers did not seem to have much impact on the quality of the responses obtained from the adolescent participants. This may have been due, in part, to the low-risk nature of the topic. This is a topic many adolescents can talk easily about, have talked about with others, and do not perceive the information they share as particularly threatening.

When the topic was moderately risky, as was the topic of identities and future selves, Jonathan’s neutral approach contrasted with Michelle and Annie’s affirming approach. Although neutrality appeared somewhat effective in facilitating an open conversational space for respondents, the affirming interviewer characteristic seemed to offer a more nurturing environment for conversation. Rich, detailed disclosures from adolescents about their identities occurred more often when the interviewer utilized an affirming approach and set a tone of acceptance for the respondents. Affirmation may be particularly important with adolescents, since adolescence is a notoriously vulnerable time in development.

When discussing a high risk topic such as alcohol and other drug use, Annie’s interpretive approach appeared to be the least effective in providing a satisfying conversational space for respondents. Jonathan’s neutral characteristic and Michelle’s self-disclosing characteristic appeared to elicit detailed information from their respondents, while Annie’s interpretive characteristic only served to inhibit her respondent’s stories. Michelle’s disclosures, while also interpretive, did not appear to limit responses from the adolescents. Couching Michelle’s interpretive language within a personal narrative may have mitigated its presence, although it still presented leading information. Hence, it could be argued that neutrality (displayed in this context by Jonathan) may be most effective when discussing high risk topics, because this neutrality provides the respondents with the most freedom to disclose what they want and how they want.

An important factor to note in this discussion is that of gender. While we did not explicitly study the role of gender in our analyses, our interviewing styles were rooted in traditional gender norms: Jonathan’s minimalist and neutral styles could be characterized as stereotypically masculine, and Annie and Michelle’s effusive and affirming interviewing styles could be characterized as traditionally feminine. These qualities suggest that interviewing styles cannot be disentangled from one’s gender, and that conversational spaces are influenced by more than simply an interviewer’s words. To this end, practices of reflexivity must acknowledge the implications of what an interviewer says and how it is said, as well as the ways in which those utterances are connected to one’s gender.

Although this study provides some intriguing findings, it was limited in a variety of ways. For one, we did not employ detailed conversation analysis procedures on each individual utterance in the interview. And despite the range of conversational segments in the interviews (i.e. introductions, research explanations, establishing rapport, soliciting honesty and openness, a period of questions and answers on six core topics, summarizing the discussion, and closings), for the purposes of this study, we elected to limit our analysis specifically to three topics in the question and answer segment. Nor did we examine other conversational features, such as the role of silence or turn-taking. Conversational features such as those, while certainly worth our attention, were beyond the scope of this exercise.

Lessons learned

Learning about interviewing and doing interviews are different tasks. This lesson was highly relevant for us when conducting this study. Even though we were all trained in interviewing, we still found ourselves making the classic mistakes of a novice researcher: asking long, complicated questions, posing closed yes-or-no questions, and leading respondents ( deMarrais, 2004 ). While humbling, these mistakes forced us to reflect on how to develop our skills and have guided our interviewing work since that time. Indeed, the kind of self-reflexivity involved in conducting an analysis of your own interviews, and then comparing and contrasting them with others, could be beneficial for individual interviewers as they are honing their craft, and for QRTs desiring to identify the unique characteristics of their resident interviewers.

In considering our findings, we agree that researchers are indeed the ‘instruments’ in qualitative interview research. After all, it is through the researcher’s facilitative interaction that a conversational space is created where respondents share rich information about their lives. Yet, we argue that qualitative researchers are differently calibrated instruments.

In QRTs, in particular, the goal is often to calibrate all instruments to one standard of accuracy. However, the results of this study illustrate that variation in interviewer characteristics may be a benefit rather than a detriment to team-based qualitative inquiry. All interviewers in this study were effective in conducting engaging conversations with participants and eliciting information, but we did these things employing different practices, and sometimes to different ends. Each interviewer demonstrated a relatively consistent interviewer style across all of his or her interviews – Jonathan was consistently neutral, Michelle consistently self-disclosive, and Annie consistently energetic. This finding leads us to suggest that QRTs might benefit from learning what ‘natural style’ characterizes a possible interviewer and then staffing their teams with interviewers who have complementary styles. Interviewers may then be assigned interview tasks commensurate with their strengths. For example, our team needed to learn both about rural identity and about alcohol and drug use, so Michelle and Annie could have been assigned to interview respondents about rural identity (a ‘safe’ topic) and future selves (a moderately risky topic), which both fit our energetic style. This approach could have helped to engage participants in the research and establish rapport with them among the research team. Then, Jonathan could be assigned to the task of summarizing the information learned about the less risky topics and bringing that information into a second interview to pursue the high risk topic of drug use, implementing his neutral style for a non-evaluative conversational space. This suggestion is founded on a premise similar to utilizing information from personality inventories (e.g. Myers Briggs) to establish work teams in organizations ( Furlow, 2000 ).

Since many interviews must occur during a single visit, however, interviewer ‘profiling’ may not be realistic for QRTs. Another suggestion would be to audio-record interview trainees in mock interviews, share those recordings among the team, then devote some time for team members to offer commentary on (a) the ways in which their teammates embodied similar or different instruments in their interviews and (b) how those instruments seemed to create different conversational spaces. This process need not involve detailed conversation analysis tools; nor should it be formal or performance-based. Instead, it should be congenial and constructive, driven by efforts to respect interviewer flexibility while maintaining fidelity to the research approach. These recommendations are in line with calls issued by Mallozzi (2009) and Miller-Day et al. (2009) , who argued that consistency efforts be focused on research procedures (e.g. securing consent, managing empirical materials) and not on standardizing interviewer characteristics.

In carrying out these recommendations, more research will be needed to understand the complexities of how and under what conditions interviewer characteristics may impact respondent responses. More research will also be needed on the ways QRT practices may change if reflexivity was incorporated at other stages of the process (e.g. forming research questions and gaining access). Yet this study provides a running start toward that end. Through our exercise, we call for greater interviewer reflexivity and acknowledge that researchers are the primary instruments in qualitative interview studies – but differentially calibrated instruments. We disagree with claims that interviewers in qualitative research teams should receive the same standard training with an eye toward producing consistent interview strategies ( Bergman and Coxon, 2005 ) and argue, instead, that diversity of approaches among members of a research team has the potential to strengthen the team through complementarity.

Acknowledgments

This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.

Biographies

Annie Pezalla is the Academic Skills Director at Walden University. Her research addresses identity development across adolescence and young adulthood.

Jonathan Pettigrew is a research analyst and project coordinator for the Drug Resistance Strategies project at Penn State University. His research examines how interpersonal and family interactions correspond with adolescent health.

Michelle Miller-Day is an Associate Professor of Communication Arts and Sciences at the Pennsylvania State University. She is the Founding Director of the Penn State Qualitative Research Interest Group, an interdisciplinary community of researchers involved in and supporting qualitative inquiry at Penn State University. Her research addresses human communication and health, including areas such as substance use prevention, suicide, and families and mental health. Her community-embedded research has involved numerous creative projects to translate research findings into social change. For the past 20 years she has served as the principal qualitative methodologist for a National Institute on Drug Abuse line of research.

Contributor Information

Anne E Pezalla, Pennsylvania State University, USA.

Jonathan Pettigrew, Pennsylvania State University, USA.

Michelle Miller-Day, Pennsylvania State University, USA.

References

  • Abell J, Locke A, Condor S, Gibson S, Stevenson C. Trying similarity, doing difference: the role of interviewer self-disclosure in interview talk with young people. Qualitative Research. 2006; 6(2): 221–244.
  • Alberts JK, Miller-Rassulo M, Hecht ML. A typology of drug resistance strategies. Journal of Applied Communication Research. 1991; 19: 129–151.
  • Barry CA, Britten N, Barber N, Bradley C, Stevenson F. Using reflexivity to optimize teamwork in qualitative research. Qualitative Health Research. 1999; 9(1): 26–44.
  • Baptiste I. Qualitative data analysis: common phases, strategic differences. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research [On-line Journal]. 2001; 2(3): Art. 22. Available at: http://www.qualitative-research.net/index.php/fqs/article/view/965/2106.
  • Baxter LA, Babbie ER. The Basics of Communication Research. Belmont, CA: Wadsworth; 2004.
  • Bergman MM, Coxon APM. The quality in qualitative methods. Forum Qualitative Sozialforschung/Forum: Qualitative Social Research [On-line Journal]. 2005; 6(2): Art. 34. Available at: http://www.qualitative-research.net/fqs-texte/2-05/05-2-34-e.htm.
  • Botvin GJ, Griffin KW, Diaz T, Scheier LM, Williams C, Epstein JA. Preventing illicit drug use in adolescents: long-term follow-up data from a randomized control trial of a school population. Addictive Behavior. 2000; 5: 769–774.
  • Cassell C. Creating the interviewer: identity work in the management research process. Qualitative Research. 2005; 5(2): 167–179.
  • Corbin J, Morse JM. The unstructured interactive interview: issues of reciprocity and risks when dealing with sensitive topics. Qualitative Inquiry. 2003; 9(3): 335–354.
  • deMarrais K. Qualitative interview studies: learning through experience. In: deMarrais K, Lapan S, editors. Foundations for Research: Methods of Inquiry in Education and the Social Sciences. Mahwah, NJ: Lawrence Erlbaum; 2004. pp. 51–68.
  • Ellis C, Berger L. Their story/my story/our story: including the researcher’s experience in interview research. In: Holstein JA, Gubrium JF, editors. Inside Interviewing: New Lenses, New Concerns. Thousand Oaks, CA: SAGE; 2003. pp. 467–493.
  • Ferguson DL, Tetler S, Baltzer K. Meeting the challenges of multi-site, multi-researcher interpretivist research. 2009. Available at: http://www.dpu.dk/Everest/Publications.
  • Fernald DH, Duclos CW. Enhance your team-based qualitative research. Annals of Family Medicine. 2005; 3: 360–364.
  • Furlow L. Job profiling: building a winning team using behavioral assessment. Journal of Nursing Administration. 2000; 30(3): 107–111.
  • Graham H. Do her answers fit his questions? Women and the survey method. In: Gamarnikow E, Morgan D, Purvis J, Taylorson D, editors. The Public and the Private. London: Heinemann; 1983. pp. 132–147.
  • Guba EG, Lincoln YS. The evaluator as instrument. In: Guba EG, Lincoln YS, editors. Effective Evaluation. San Francisco, CA: Jossey-Bass; 1981. pp. 128–152.
  • Guest G, MacQueen KM, editors. Handbook for Team-based Qualitative Research. Lanham, MD: Alta Mira Press; 2007.
  • Hammersley M, Atkinson P. Ethnography: Principles in Practice. 2nd ed. New York: Routledge; 1995.
  • Herriott RE, Firestone WA. Multisite qualitative policy research: optimizing description and generalizability. Educational Researcher. 1983; 12(2): 14–19.
  • Holstein JA, Gubrium JF. The Active Interview. Newbury Park, CA: Sage; 1995.
  • Janesick VJ. Stretching Exercises for Qualitative Researchers. Thousand Oaks, CA: Sage; 1998.
  • Janesick VJ. Intuition and creativity: a pas de deux for qualitative researchers. Qualitative Inquiry. 2001; 7(5): 531–540.
  • MacQueen KM, McLellan E, Kay K, Milstein B. Codebook development for team-based qualitative analysis. Cultural Anthropology Methods. 1999; 10: 31–36.
  • Mallozzi CA. Voicing the interview: a researcher’s exploration on a platform of empathy. Qualitative Inquiry. 2009; 15(6): 1042–1060.
  • Matteson SM, Lincoln YS. Using multiple interviewers in qualitative research studies: the influence of ethic of care behaviors in research interview settings. Qualitative Inquiry. 2009; 15(8): 659–674.
  • Maxwell JA. Qualitative Research Design: An Interactive Approach. Thousand Oaks, CA: Sage; 1996.
  • Merriam SB. Qualitative Research in Practice: Examples for Discussion and Analysis. San Francisco, CA: Jossey-Bass; 2002.
  • Miller-Day M, Pezalla A, Pettigrew J, Krieger J, Colby M, Hecht ML. The possibilities and pitfalls of team-based qualitative research. Paper presented at the Qualitative Inquiry in the Caribbean International Conference; October 2009; Kingston 5, Jamaica.
  • National Center for Education Statistics. Identification of rural locales. n.d. Available at: http://nces.ed.gov/ccd/rural_locales.asp.
  • Oakley A. Interviewing women: a contradiction in terms? In: Roberts H, editor. Doing Feminist Research. New York: Routledge; 1981. pp. 30–61.
  • Ohio Department of Education. Data for free and reduced price meal eligibility (MR81). 2010. Available at: http://education.ohio.gov/GD/Templates/Pages/ODE/ODEDetail.aspx?page=3&TopicRelationID=828&ContentID=13197&Content=79922.
  • Olesen V, Droes N, Hatton D, Chico N, Schatzman L. Analyzing together: recollections of a team approach. In: Bryman A, Burgess RG, editors. Analyzing Qualitative Data. New York: Routledge; 1994. pp. 111–128.
  • Owens EO. Conversational space and participant shame in interviewing. Qualitative Inquiry. 2006; 12(6): 1160–1179.
  • Pillow WS. Confession, catharsis, or cure? Rethinking the uses of reflexivity as methodological power in qualitative research. International Journal of Qualitative Research in Education. 2003; 16: 175–196.
  • Pitts M, Miller-Day M. Upward turning points and positive rapport development across time in researcher-participant relationships. Qualitative Research. 2007; 7: 177–201.
  • Rogers-Dillon RH. Hierarchical qualitative research teams: refining the methodology. Qualitative Research. 2005; 5(4): 437–454.
  • Rubin HJ, Rubin IS. Qualitative Interviewing: The Art of Hearing Data. Thousand Oaks, CA: Sage; 2005.
  • Sanders CB, Cuneo CJ. Social reliability in qualitative team research. Sociology. 2010; 44(2): 325–343.
  • Singer E, Frankel M, Glassman MB. The effect of interviewer characteristics and expectations on response. Public Opinion Quarterly. 1983; 47: 68–83.
  • Stack CB. Writing ethnography: feminist critical practice. In: Wolf DL, editor. Feminist Dilemmas in Fieldwork. New York: Westview Press; 1995. pp. 1–19.
  • Tanggaard L. The research interview as discourses crossing swords: the researcher and apprentice on crossing roads. Qualitative Inquiry. 2007; 13: 160–176.
  • Treloar C, Graham IE. Multidisciplinary cross-national studies: a commentary on issues of collaboration, methodology, analysis, and publication. Qualitative Health Research. 2003; 13(7): 924–932.
  • Turato ER. Qualitative and quantitative methods in health: definitions, differences and research subjects. Revista de Saude Publica. 2005; 39(3): 507–514.
  • United States Agency for International Development’s Center for Development Information and Evaluation. Conducting Key Informant Interviews (Performance Monitoring and Evaluation TIPS). 1996. Available at: http://pdf.usaid.gov/pdf_docs/PNABS541.pdf.
  • Watts JH. Emotion, empathy and exit: reflections on doing ethnographic qualitative research on sensitive topics. Medical Sociology Online. 2008; 3(2): 3–14.

Survey Research | Definition, Examples & Methods

Published on August 20, 2019 by Shona McCombes . Revised on June 22, 2023.

Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps:

  • Determine who will participate in the survey
  • Decide the type of survey (mail, online, or in-person)
  • Design the survey questions and layout
  • Distribute the survey
  • Analyze the responses
  • Write up the results

Surveys are a flexible method of data collection that can be used in many different types of research .

Table of contents

  • What are surveys used for?
  • Step 1: Define the population and sample
  • Step 2: Decide on the type of survey
  • Step 3: Design the survey questions
  • Step 4: Distribute the survey and collect responses
  • Step 5: Analyze the survey results
  • Step 6: Write up the survey results
  • Frequently asked questions about surveys

What are surveys used for?

Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.

Common uses of survey research include:

  • Social research : investigating the experiences and characteristics of different social groups
  • Market research : finding out what customers think about products, services, and companies
  • Health research : collecting data from patients about symptoms and treatments
  • Politics : measuring public opinion about parties and policies
  • Psychology : researching personality traits, preferences and behaviours

Surveys can be used in both cross-sectional studies , where you collect data just once, and in longitudinal studies , where you survey the same sample several times over an extended period.

Step 1: Define the population and sample

Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.

Populations

The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:

  • The population of Brazil
  • US college students
  • Second-generation immigrants in the Netherlands
  • Customers of a specific company aged 18-24
  • British transgender women over the age of 50

Your survey should aim to produce results that can be generalized to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.

Several common research biases can arise if your survey is not generalizable, particularly sampling bias and selection bias . The presence of these biases has serious repercussions for the validity of your results.

It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every college student in the US. Instead, you will usually survey a sample from the population.

The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.
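Many such calculators are based on Cochran’s formula with a finite population correction. The sketch below (Python) shows the arithmetic; the 95% confidence level (z = 1.96), 5% margin of error, and maximum variability (p = 0.5) are illustrative defaults, not requirements.

```python
# Rough sample-size estimate using Cochran's formula with a finite
# population correction. Parameter choices here are illustrative defaults.
import math

def sample_size(population, confidence_z=1.96, margin_of_error=0.05, p=0.5):
    """Return the number of responses needed for a simple random sample."""
    n0 = (confidence_z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite population
    n = n0 / (1 + (n0 - 1) / population)                             # finite correction
    return math.ceil(n)

print(sample_size(population=10_000))   # 370
print(sample_size(population=500))      # 218
```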

There are many sampling methods that allow you to generalize to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions. Again, beware of various types of sampling bias as you design your sample, particularly self-selection bias , nonresponse bias , undercoverage bias , and survivorship bias .
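One common way to keep a sample representative is proportionate stratified sampling: split the population into strata and sample each stratum in proportion to its size. The sketch below assumes a pandas data frame of potential respondents with an invented "region" column; it is a minimal illustration, not a full sampling plan.

```python
# Proportionate stratified sampling sketch with pandas.
# The 'region' strata and the respondent frame below are invented.
import pandas as pd

frame = pd.DataFrame({
    "respondent_id": range(1, 1001),
    "region": ["north"] * 500 + ["south"] * 300 + ["east"] * 200,
})

target_n = 100
sample = (
    frame.groupby("region", group_keys=False)
         .apply(lambda g: g.sample(frac=target_n / len(frame), random_state=42))
)
print(sample["region"].value_counts())   # roughly 50 north, 30 south, 20 east
```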

Step 2: Decide on the type of survey

There are two main types of survey:

  • A questionnaire , where a list of questions is distributed by mail, online or in person, and respondents fill it out themselves.
  • An interview , where the researcher asks a set of questions by phone or in person and records the responses.

Which type you choose depends on the sample size and location, as well as the focus of the research.

Questionnaires

Sending out a paper survey by mail is a common method of gathering demographic information (for example, in a government census of the population).

  • You can easily access a large sample.
  • You have some control over who is included in the sample (e.g. residents of a specific region).
  • The response rate is often low, and at risk for biases like self-selection bias .

Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .

  • You can quickly access a large sample without constraints on time or location.
  • The data is easy to process and analyze.
  • The anonymity and accessibility of online surveys mean you have less control over who responds, which can lead to biases like self-selection bias .

If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping mall or ask all students to complete a questionnaire at the end of a class.

  • You can screen respondents to make sure only people in the target population are included in the sample.
  • You can collect time- and location-specific data (e.g. the opinions of a store’s weekday customers).
  • The sample size will be smaller, so this method is less suitable for collecting data on broad populations and is at risk for sampling bias .

Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.

  • You have personal contact with respondents, so you know exactly who will be included in the sample in advance.
  • You can clarify questions and ask for follow-up information when necessary.
  • The lack of anonymity may cause respondents to answer less honestly, and there is more risk of researcher bias.

Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyzes the results. But they are more commonly used to collect qualitative data : the interviewees’ full responses are transcribed and analyzed individually to gain a richer understanding of their opinions and feelings.

Step 3: Design the survey questions

Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:

  • The type of questions
  • The content of the questions
  • The phrasing of the questions
  • The ordering and layout of the survey

Open-ended vs closed-ended questions

There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.

Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:

  • A binary answer (e.g. yes/no or agree/disagree )
  • A scale (e.g. a Likert scale with five points ranging from strongly agree to strongly disagree )
  • A list of options with a single answer possible (e.g. age categories)
  • A list of options with multiple answers possible (e.g. leisure interests)

Closed-ended questions are best for quantitative research . They provide you with numerical data that can be statistically analyzed to find patterns, trends, and correlations .
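As a rough illustration of that kind of analysis, the sketch below tabulates two invented closed-ended questions and runs a chi-square test of independence with pandas and SciPy; the column names and answers are made up, and a real analysis would need a much larger sample.

```python
# Tabulating closed-ended survey answers and testing for an association.
# The data and column names are invented for illustration.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.DataFrame({
    "age_group": ["18-24", "25-34", "18-24", "35-44", "25-34", "18-24",
                  "35-44", "25-34", "18-24", "35-44"],
    "prefers_online": ["yes", "no", "yes", "no", "yes", "yes",
                       "no", "no", "yes", "no"],
})

# Frequency table for a single question
print(responses["prefers_online"].value_counts())

# Cross-tabulation of two closed-ended questions
table = pd.crosstab(responses["age_group"], responses["prefers_online"])
print(table)

# Chi-square test of independence (interpret cautiously with small samples)
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p:.3f}")
```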

Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.

Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.

The content of the survey questions

To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.

When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an “other” field.

Phrasing the survey questions

In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic. Avoid jargon or industry-specific terminology.

Survey questions are at risk for biases like social desirability bias , the Hawthorne effect , or demand characteristics . It’s critical to use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no indication that you’d prefer a particular answer or emotion.

Ordering the survey questions

The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.

If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.

If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.

Step 4: Distribute the survey and collect responses

Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.

When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by mail, online, or in person.

Step 5: Analyze the survey results

There are many methods of analyzing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also clean the data by removing incomplete or incorrectly completed responses.

If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organizing them into categories or themes. You can also use more qualitative methods, such as thematic analysis , which is especially suitable for analyzing interviews.
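A minimal sketch of that coding step is shown below. It tags invented open-ended answers with themes from a small keyword codebook and counts how often each theme appears. Real thematic analysis is interpretive and iterative, so this only illustrates the bookkeeping, not the judgment involved.

```python
# A minimal sketch of tagging open-ended answers with codes and counting themes.
# Keyword rules like these are only a starting point; real coding is interpretive.
from collections import Counter

codebook = {
    "cost": ["price", "expensive", "cheap", "afford"],
    "convenience": ["easy", "quick", "time", "convenient"],
    "trust": ["trust", "safe", "secure", "privacy"],
}

answers = [
    "It was quick and easy to order online.",
    "I worry about how safe my payment details are.",
    "Too expensive compared with the local shop.",
]

theme_counts = Counter()
for answer in answers:
    text = answer.lower()
    for theme, keywords in codebook.items():
        if any(word in text for word in keywords):
            theme_counts[theme] += 1

print(theme_counts)   # Counter({'convenience': 1, 'trust': 1, 'cost': 1})
```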

Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.

Step 6: Write up the survey results

Finally, when you have collected and analyzed all the necessary data, you will write it up as part of your thesis, dissertation, or research paper.

In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.

Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyze it. In the results section, you summarize the key results from your analysis.

In the discussion and conclusion , you give your explanations and interpretations of these results, answer your research question, and reflect on the implications and limitations of the research.

Frequently asked questions about surveys

A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.

A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.

To use a Likert scale in a survey , you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.

Individual Likert-type questions are generally considered ordinal data , because the items have clear rank order, but don’t have an even distribution.

Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.

The type of data determines what statistical tests you should use to analyze your data.
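For example, the sketch below (using invented item names and responses) reverse-scores a negatively worded Likert item, sums the items into a scale score, and reports a median for a single ordinal item; it is only an illustration of the ordinal-versus-interval distinction described above.

```python
# Combining Likert-type items into a scale score.
# Item names and responses are invented; item_3 is reverse-worded.
import pandas as pd

likert = pd.DataFrame({
    "item_1": [5, 4, 2, 3],
    "item_2": [4, 4, 1, 3],
    "item_3": [1, 2, 5, 3],   # reverse-worded item
})

# Reverse-score the negatively worded item on a 1-5 scale
likert["item_3"] = 6 - likert["item_3"]

# Summed scale score per respondent (often treated as interval data)
likert["scale_score"] = likert[["item_1", "item_2", "item_3"]].sum(axis=1)

# A single item is ordinal, so report a median rather than a mean for it
print(likert["item_1"].median())
print(likert["scale_score"].describe())
```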

The priorities of a research design can vary depending on the field, but you usually have to specify:

  • Your research questions and/or hypotheses
  • Your overall approach (e.g., qualitative or quantitative )
  • The type of design you’re using (e.g., a survey , experiment , or case study )
  • Your sampling methods or criteria for selecting subjects
  • Your data collection methods (e.g., questionnaires , observations)
  • Your data collection procedures (e.g., operationalization , timing and data management)
  • Your data analysis methods (e.g., statistical tests  or thematic analysis )

McCombes, S. (2023, June 22). Survey Research | Definition, Examples & Methods. Scribbr. Retrieved September 5, 2024, from https://www.scribbr.com/methodology/survey-research/

Research Instrument

Market research is a common practice used by companies to learn about customer behaviour and design suitable marketing campaigns. However, researching the market is not easy. To simplify the process, researchers can make use of research instruments. These are tools for collecting, measuring, and analysing data. Read along to learn what research instruments are used for and how they can be applied.

Research Instrument Meaning

Research instruments are tools used for data collection and analysis. Researchers can use these tools in most fields. In business, they aid marketers in market research and customer behaviour study.

Some examples of research instruments include interviews, questionnaires, online surveys, and checklists.

Choosing the right research instrument is essential as it can reduce data collection time and provide more accurate results for the research purpose.

  • A research instrument is a tool for collecting and analysing data in research.

Data in research is a form of evidence. It justifies how marketers reach a decision and apply a particular strategy to a marketing campaign.

In research, marketers often collect data from various sources to produce and validate research results.

Research Instrument Examples

There are many examples of research instruments. The most common ones are interviews, surveys, observations, and focus groups . Let's break them down one by one.

Research Instrument: Interviews

The interview is a qualitative research method that collects data by asking questions. It includes three main types: structured, unstructured, and semi-structured interviews.

Structured interviews include an ordered list of questions. These questions are often closed-ended and draw a yes, no or a short answer from the respondents. Structured interviews are easy to execute but leave little room for spontaneity.

Unstructured interviews are the opposite of structured interviews. Questions are mostly open-ended and are not arranged in order. The participants can express themselves more freely and elaborate on their answers.

Semi-structured interviews are a blend of structured and unstructured interviews. They are more organised than unstructured interviews, though not as rigid as structured interviews.

Compared to other research instruments, interviews provide more reliable results and allow the interviewers to engage and connect with the participants. However, they require experienced interviewers to draw the best responses from the interviewees.

Tools used in interviews may include:

  • Audio recorder (face-to-face interviews)
  • Camcorder and video conferencing tools (online interviews)

Research Instrument: Surveys

Survey research is another primary data collection method that involves asking a group of people for their opinions on a topic. However, surveys are often given out in paper form or online instead of meeting the respondents face-to-face.

An example is a feedback survey you receive from a company from which you just purchased a product.

The most common form of a survey is a questionnaire. It is a list of questions to collect opinions from a group. These questions can be closed-ended or open-ended, with pre-selected answers or scale ratings. Participants can receive the same or alternate questions.

The main benefit of a survey is that it is a cheap way to collect data from a large group. Most surveys are also anonymous, making people more comfortable sharing honest opinions. However, this approach does not always guarantee a response since people tend to ignore surveys in their email inboxes or in-store.

There are many types of surveys, including paper and online surveys.

Research Instrument: Observations

Observation is another research instrument for marketers to collect data. It involves an observer watching people interacting in a controlled or uncontrolled environment.

An example is watching a group of kids playing and seeing how they interact, which kid is most popular in the group, etc.

Observation is easy to execute and also provides highly accurate results. However, these results might be subject to observer bias (the observer's opinions and prejudices), which lowers their fairness and objectivity. Also, some types of observation are not cheap.

Tools for observations can vary based on the research purpose and business resources.

Simple observations can be carried out without any tool. An example might be an observer "shopping along" with a customer to see how they choose products and which store section catches their eyes.

More complex observations can require special equipment such as eye-tracking and brain-scanning devices. Websites may also use heat maps to see which areas are most clicked by page visitors.
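The aggregation step behind such a heat map is essentially binning click coordinates into a grid. The sketch below uses NumPy with invented pixel coordinates and an assumed 800×600 page size; it only illustrates the counting step, not a full heat-map tool.

```python
# Binning page-click coordinates into a coarse grid -- the aggregation step
# behind a click heat map. The coordinates and page size below are invented.
import numpy as np

clicks_x = np.array([120, 130, 640, 655, 660, 300, 310, 305])   # pixels from left
clicks_y = np.array([ 80,  90, 400, 410, 395, 220, 230, 225])   # pixels from top

grid, x_edges, y_edges = np.histogram2d(
    clicks_x, clicks_y,
    bins=(4, 3),                       # 4 columns x 3 rows of page regions
    range=[[0, 800], [0, 600]],
)

print(grid.T)   # rows = vertical bands of the page, values = click counts
```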

Research Instrument: Focus groups

Focus groups are similar to interviews but include more than one participant. They are also a qualitative research method that aims to understand customers' opinions on a topic.

Focus groups often consist of one moderator and a group of participants. Sometimes, there are two moderators, one directing the conversation and the other observing.

Conducting focus groups is quick, cheap, and efficient. However, the data analysis can be time-consuming. Engaging a large group of people is tricky, and many participants may be shy or unwilling to give their opinions.

If focus groups are conducted online, tools like Zoom or Google Meet are often used.

Research Instrument: Existing data

Unlike the others, existing or secondary data is an instrument for secondary research. Secondary research means using data that another researcher has collected.

Secondary data can save a lot of research time and budget. Sources are also numerous, including internal (within the company) and external (outside the company) sources.

Internal sources include company reports, customer feedback, buyer personas, etc. External sources might include newspapers, magazines, journals, surveys, reports, Internet articles, etc.

Collecting from existing data is pretty simple, though the sources need validating before use.

Research Instrument Design

Research instrument design means creating research instruments to obtain the most quality, reliable, and actionable results. It is an intricate process that requires a lot of time and effort from the researchers.

A few things to keep in mind when designing a research instrument:¹

Validity means how well the instrument measures what it is intended to measure.

Reliability means whether the research method will produce similar results multiple times.

Replicability means whether the study can be repeated by other researchers and produce similar results.

Generalizability means whether the research data can be generalised or applied to the whole population.

Research instrument design best practices

Here are some good practices for creating research instruments:

Define the research objective

Good research always starts with a hypothesis. This is the proposed explanation based on the evidence that the business currently has. Further research will be needed to prove this explanation is true.

Based on the hypothesis, the researchers can determine the research objectives:

What is the research's purpose?

What result does it try to measure?

What questions to ask?

How to know the results are reliable/actionable?

Prepare carefully

"To be prepared is half the victory". Preparation means designing how researchers will carry out the research. This may include creating questions and deciding on what tools to use.

Survey research design might include creating questions that are simple to understand and do not include biased language. The researcher can also use typography, spacing, colours, and images to make the survey attractive.

Create a guideline

The person carrying out the research may not be the same person who designs it. To ensure smooth implementation, an important step is to create a guideline.

For example, when using interviews in research, the researcher can also create a document that provides a focus for the interview. This is simply a document that defines the structure of the interview - what questions to ask and in which order.

Avoid interviewer bias

Interviewer bias can arise when the researcher, observer, or interviewer interacts directly with the participants: it means letting the interviewer's viewpoints and attitudes affect the research outcome. For example, the interviewer may react differently around different interviewees or ask leading questions.

When designing research instruments, researchers should keep this in mind and avoid questions that might lead respondents toward the answers the interviewer favours.

Test and implement

To avoid mistakes, the researcher can first test the instrument on a small sample before applying it to a large group. This is extremely important, especially in large-scale data collection methods like questionnaires. A minor error can make the whole process futile. A good practice is to ask a team member to proofread the survey questions to spot any errors or inaccuracies.

After testing, the next task is to apply it to the target group. The response rate is a crucial KPI to determine the research's reliability. The higher the response rate, the more reliable the results are. However, other factors like the depth of answers are also important.
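The response-rate arithmetic itself is simple; the sketch below computes it for invented numbers and flags a shortfall against an illustrative 60% benchmark (the threshold is not a fixed rule).

```python
# Response rate as a simple KPI for a distributed questionnaire.
# The numbers and the 60% benchmark below are illustrative, not fixed rules.
def response_rate(completed, distributed):
    return completed / distributed

rate = response_rate(completed=432, distributed=1_000)
print(f"Response rate: {rate:.1%}")          # Response rate: 43.2%

if rate < 0.60:
    print("Consider follow-up reminders or a second distribution wave.")
```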

Research Instrument in quantitative research

Quantitative research means collecting and analysing numerical data. This kind of research helps spot patterns and trends to make predictions or generalise results to the whole population. Research instruments in quantitative research include surveys, questionnaires, telephone, and interviews.

The main component of surveys is questionnaires. These are lists of questions to collect data from a large group. In survey research, the questions are primarily closed-ended or include rating scales to collect data in a unified fashion.

The reliability of survey results depends greatly on the sample size. The larger the sample, the more valid the results, though large samples are not cheap to collect.

Surveys involve limited interviewer bias and error. However, the refusal rate can be high, as few people are willing to write down their answers.

Research instrument questionnaires

Questionnaires as a research instrument can be self-administered or with interference from the researcher.

Self-administered questionnaires are ones completed in the absence of the researcher.² The respondent fills out the questionnaire themselves, which gives the term "self-administered". Self-administered surveys allow participants to keep their anonymity and be more comfortable sharing their opinions. When surveys are self-administered, researcher bias can be removed. The main drawback is that the researcher cannot track who fills out the questionnaire or when it will be returned.

Questionnaires with interference from the researcher are primarily found in focus groups, interviews, or observational research . The researcher hands out the questionnaire and remains present to help the respondents fill it out. They can answer questions and clear up any uncertainties the respondent might have. This type of questionnaire has more risk of researcher bias but will give more quality responses and have a higher response rate.

Research Instrument: Telephone

The telephone is another research instrument for quantitative research. It is based on random sampling and also has low interviewer bias. However, phone calls tend to be short (less than 15 minutes), giving interviewers little time to collect in-depth information. Customers can also hang up when they are distracted by something else.

Most interviews are qualitative in nature, but some are quantitative, especially those carried out in a structured manner. An example is structured interviews which include closed-ended questions arranged in a specific order.

Research Instrument - Key takeaways

  • Popular research instruments are interviews, surveys, observations, focus groups, and secondary data.
  • When designing research instruments, the researcher needs to consider the research results' validity, reliability, applicability, and generalizability.
  • Research instruments mostly used in quantitative research are telephone interviews, structured interviews, and surveys.
  • Questionnaires as a research instrument can be self-administered or administered with the researcher's involvement.
References

  • Vision Edge Marketing, How to Design an Effective Survey Instrument, https://visionedgemarketing.com/survey-instrument-effective-market-customer-research/.
  • Form Plus Blog, Self-Administered Survey: Types, Uses + [Questionnaire Examples], https://www.formpl.us/blog/self-administered-survey, 2022.


Frequently Asked Questions about Research Instrument

What instruments are used to collect quantitative data?

Instruments used to collect quantitative data include surveys, telephone interviews, and structured interviews.

What is a questionnaire as a research instrument?

Questionnaires are lists of questions used to gather data from the target group. They are mainly used in surveys to collect quantitative data.

What are research instruments for data collection?

There are many research instruments for data collection. The most popular are interviews, surveys, observations, focus groups, and secondary data. Different research instruments can be used depending on the type and purpose of the research. 

What are research instrument examples?

Some research instrument examples are surveys, interviews, and focus groups. Surveys can be used to collect quantitative data from a large group while interviews and focus groups gather qualitative data from a smaller group of participants.

What is instrument design in research?

Research instrument design means creating research instruments that yield high-quality and reliable research data. Good research instruments must satisfy four qualities: validity, reliability, applicability, and generalizability.


Reliability and validity assessment of instrument to measure sustainability practices at shipping ports in India

Open access | Published: 03 September 2024 | Volume 5, article number 236 (2024)


L. Kishore, Yogesh P. Pai & Parthesh Shanbhag

Sustainability has emerged as one of the most critical factors influencing the competitiveness of maritime shipping ports. This emergence has led to a surge in research publications on port sustainability-related topics. However, despite the increasing awareness and adoption of sustainability practices, documented literature on empirical studies with survey and interview data is very limited. Moreover, validated instruments to objectively assess sustainability through sustainability practices at shipping ports in India are difficult to trace. This study contributes by validating an instrument to objectively evaluate sustainability practices in shipping ports, adopting a four-stage process: item identification based on an extensive literature review, instrument evaluation by subject matter experts, assessment of the instrument with suitable content validation indices, and finally evaluation of the validity and reliability of the hypothesized theoretical model. For content validation, the Content Validity Index, Cohen's Kappa coefficient, and Lawshe's Content Validity Ratio were computed from the assessments of a subject matter expert panel comprising six members drawn from the port industry and from academicians and researchers in the field of shipping ports. The content-validated instrument was administered to 200 respondents comprising officer-category port employees. The measurement model was evaluated and validated using Confirmatory Factor Analysis to assess the extent to which the measured variables represent the theoretical construct of the study and to ascertain the factor structure. The empirically validated instrument met the required guidelines for model fit, reliability, and construct validity and was confirmed as a model for measuring sustainability practices in shipping ports. A Structural Equation Modeling methodology was adopted to explain the variance and the path relationships between the higher-order and lower-order constructs of sustainability. The results indicate that the economic dimensions are the major contributors to the overall sustainability of the port, as they drive investments in the environmental and social dimensions, leading to overall sustainable development. The study's findings will be helpful for researchers, academicians, policymakers, and industry practitioners working towards sustainability practices that contribute to sustainable growth and development in the shipping industry.

Similar content being viewed by others

research instrument interview meaning

Sustainability in Organizations: Bhutan’s Perspective

research instrument interview meaning

Preventing and developmental factors of sustainability in healthcare organisations from the perspective of decision makers: an exploratory factor analysis

research instrument interview meaning

Factors affecting the outcome of corporate sustainability policy: a review paper

Avoid common mistakes on your manuscript.

1 Introduction

Sustainability has increasingly been considered one of the most significant focus areas for society, industry, and regulatory bodies in recent times [ 1 ], although the concern itself is not entirely new [ 2 ]. Sustainable development is generally described as development that "satisfies the needs and wants of the present generation without compromising the future generation's needs and aspirations" [ 3 ]. The challenge is balancing sustainability and economic growth [ 4 ]. This prerequisite has led organizations to look beyond mere economic performance and build socially and environmentally friendly business models by integrating sustainability principles and practices [ 5 , 6 , 7 ], targeting competitive advantages [ 8 , 9 ]. Against this backdrop, Lun et al. [ 3 ] highlighted the importance of shipping ports in a country's long-term sustainable growth and development, as they generate employment along with export–import trade. It is widely accepted that port-led economic growth and development have been the backbone of many developed and developing countries. For instance, the Indian maritime sector, one of the most promising and dynamic emerging markets in the world, has received a boost from one of the government's significant mega-initiatives, "Sagarmala", which is focused on "port-led economic development" [ 10 ]. About 95% of India's overall goods trade by volume moves through shipping ports, contributing about 14% of GDP [ 11 ]. Subdued port performance is therefore reflected in a country's economic development [ 12 ]. Shipping ports are vital nodes that link other modes of transportation in global trade and are considered strategic assets demanding significant attention in maritime and transportation research and practice [ 4 , 13 , 14 ].

Lee et al. [ 15 ] flagged the concern of less attention given to sustainability in the shipping, port, and maritime industries, unlike the aviation and road transport sectors. Many studies [ 14 , 16 , 17 , 18 , 19 ] have emphasized sustainability as one of the most crucial aspects influencing the competitiveness and long-term sustenance of shipping ports; however, it is not fully incorporated into a strategic decision that demands a long-term view when deciding on port development and management. Further, coastal lines are densely populated, leading to higher levels of economic activity and rapid urbanization; however, they face the consequences that come along as the byproduct of development in the form of environmental, economic, and social concerns [ 14 , 20 ]. Some of the substantial problems in the ports discussed in many studies [ 14 , 18 , 19 , 21 , 22 , 23 , 24 , 25 ] include the depletion of the marine ecosystem and biodiversity due to dredging and reclamation works, water pollution due to ballast operations, oil spillage during ship anchoring and cargo operations, wastewater spillage from ships, air pollution due to various pollutants and particulate matters, dust and smoke pollution by a heavy vehicle, climate change effects, increased energy consumption for operations and the cost of energy, uncertainty in future economic returns of investment in the port, employment and diversity of jobs, employee productivity, displacement of local community along with impact on their livelihood, loss of agricultural land, steep increase in cost of living and land revenue rate due to swift urbanization, industrial and special economic zone development, inclusivity of community in developmental projects, safety and security in the port vicinity, social working environment, trade union interference and many more. These challenges, complemented by the growing importance and focus on the sustainable development of shipping ports, have led to a surge in research publications on sustainability-related topics that concentrate on environmental, economic, and social aspects [ 26 , 27 ].

A literature review gives insight into the focus of extant studies related to port sustainability, which is mostly on quantifying and measuring various dimensions of sustainability against benchmarks and developing indexes for multiple dimensions of sustainability. However, qualitative studies need to be conducted to understand the extent to which sustainability measures are adopted and the interaction between the measures, pointing towards empirical data-driven studies [ 27 , 28 ]. Alamoush et al. [ 2 ], in their port sustainability framework development study, found that only 16 percent of the articles published were empirical and based on questionnaire surveys and personal interview data. The majority, 40 percent, were conceptual and theoretical reviews, and the remainder was equally distributed among simulation and case studies. Empirical data-driven research on sustainability-related topics and port performance will be critical to the growing body of knowledge [ 28 ]. Further, the empirical studies on port sustainability [ 2 , 27 , 29 , 30 , 31 , 32 , 33 ] have adopted various indicators of sustainability and criteria for sustainability evaluation based on available scales from studies not directly related to ports, context-adjusted for the port industry. Moreover, despite the increasing number of empirical studies on sustainability in the port sector, and in line with Ayre et al.'s [ 34 ] observation that researchers hardly ever report content validation, no content validation process or validated instrument for objectively measuring sustainability and sustainability practices in shipping ports is traceable in the extant literature, especially for a rapidly developing economy like India. The study of Ashrafi et al. [ 17 ], which reported a pilot study for validation, only assessed the perception of port sustainability in the US and Canada through an online survey to identify the primary factors and challenges in adopting and implementing sustainability strategies. However, that instrument was generic, capturing overall sustainability barriers and influencing factors, and lacked a specific macro-level assessment of the three dimensions of sustainability.

Some studies [ 35 , 36 , 37 ] discuss the importance of content validity in determining whether the measurement items used in a tool satisfactorily represent and address the domain of interest, and how relevant they are to what is being measured. Thus, the need for precise, validated measurement tools for sustainability practices at shipping ports indicates a critical knowledge gap in the existing literature. It should also be noted that most seaport-related studies in the scholarly database concentrated on specific geographical areas in Europe [ 32 , 38 ] rather than on leading and growing economies like India. Another concern is that although sustainability is a widely discussed topic, there is still no single, universally accepted and established definition of sustainability [ 39 , 40 , 41 ]. According to Maletič et al. [ 40 ], even though many have attempted to define and measure sustainability, there is an ongoing debate in the literature [ 42 ] on the existence of multiple ways to measure sustainability practices. Therefore, there is a vital need for clarity and substantial justification on the dimensions and indicators that define the sustainability construct and for standardizing the assessment of sustainability to a greater extent, especially for seaports. Beyond developing policies and schemes for sustainability, implementation is essential for guaranteed sustainable development [ 28 , 43 ], and measuring the extent to which a port has focused on various sustainability practices can be a tool to assess its efforts towards sustainable development. Alamoush et al. [ 2 ] also pointed out their primary observation on the lack of studies linking port sustainability actions with the three sustainability dimensions represented by the three sustainability practices.

Considering this existing crucial gap and challenges discussed above, the novel contribution of this study is a validated instrument for assessing the sustainability practices followed at shipping ports covering the dimensions of sustainability. The measuring instrument can act as a guideline for seaport administrators and stakeholders to evaluate sustainability in shipping ports and develop seaport sustainability policy for sustainable maritime growth and sustainable development, specifically for an empirical and objective evaluation of sustainability practices adopted in shipping ports in India. Given this compelling necessity to have a content and construct validated instrument for sustainability assessment and strategy development, specifically in the context of Indian seaports, this study aims to explore, design, and develop an instrument for objectively assessing sustainability practices in shipping ports through a well-established content validation process. To achieve the aim of the study, the objectives are:

To identify the comprehensive list of dimensions of sustainability practice for shipping ports through an extensive literature survey

To validate the content of the measurement tool using globally accepted content validation indices, viz. the Content Validity Index, Cohen's Kappa coefficient, and the Content Validity Ratio

To ascertain the factor structure of the measurement model using Confirmatory Factor Analysis

To estimate the relationship and contribution of three dimensions of sustainability practices to the higher-order sustainability practices construct

The study is structured into three major sections. The next section explores the theoretical foundations of the sustainability construct and the related conceptual framework that shapes shipping ports' sustainability dimensions. The subsequent section covers the research methodology for achieving the study's objectives: it outlines the steps followed in item identification through literature review, instrument development, instrument assessment based on globally accepted indices, and measurement model evaluation for validity and reliability using confirmatory factor analysis, leading to Partial Least Squares-Structural Equation Modeling (PLS-SEM) for estimating and predicting the relationships among the variables. Finally, the results are critically discussed, with the findings highlighting the implications, leading to a conclusion along with future research directions.

2 Theoretical background

Although certain sustainability practices are compelled by regulatory compliance, organizations are encouraged to adopt and engage voluntarily and proactively in other sustainability practices to meet the needs of the broader society within which they operate [ 44 ]. Extant studies on sustainability [ 7 , 14 , 16 , 17 , 18 , 45 , 46 ] have discussed the need to integrate sustainability efforts into organizational goals, processes, and initiatives and to link them with organizational strategy, without which the efforts are likely to fail. The goal for any firm is to secure a competitive advantage over its competitors, create wealth, capture the highest possible market share, and add value to stakeholders while maintaining a balance between sustainability and economic growth [ 4 ]. Studies [ 16 , 47 ] have argued that sustainability is one of the most crucial aspects influencing the competitiveness of ports. Moreover, to achieve and sustain competitive advantage, organizations have been increasingly grappling with sustainability practices [ 8 ]. Simpson et al. [ 48 ] define practices as "the customary, habitual, or expected procedure or way of doing something." The practices focused on sustainability may differ from industry to industry; the shipping industry concentrates on various practices, with relevant systems in place to support sustainable growth and development. Kang et al. [ 31 ] highlight best practices that embrace sustainability and suggest many practices related to operations, resource optimization, safety and security, finance, risk, infrastructure upgrading, stakeholder management, environmental management systems, and the port's eco-friendly and social work environment.

Discussions in prior studies indicate mixed responses regarding the definition of sustainability. There is no universally acceptable definition [ 39 , 42 , 49 , 50 ], but a more generic definition emphasizes sustainability as the set of business strategies, policies, and associated practices or activities where the requirement of the present is satisfied without impacting the requirements of the future in the best interests of the port and related stakeholders. Different schools of thought have a general opinion that sustainability encompasses the three significant dimensions popularly termed as the triple bottom line (TBL) dimensions of “economic, social, and environmental practices,” which comprehend the broad framework of sustainable development [ 39 , 49 , 50 , 51 ]. Elkington [ 51 , 52 ] introduced the triple-bottom-line approach (TBL) incorporating these interrelated three dimensions—“environmental sustainability, economic sustainability, and social sustainability,” advocating organizations to adopt the TBL approach for long-term success [ 53 , 54 , 55 ], rather than short-term success focusing only on the economic dimension. The TBL aspects are considered the critical dimensions of sustainability [ 56 ]. Environmental dimensions concentrate on policies, initiatives, and practices that promote environmental management. In contrast, economic dimensions focus on policies, initiatives, and practices related to investments, economic benefits, and returns from those investments [ 57 ]. Social sustainability focuses on policies, initiatives, and practices that promote the overall improvement of society at large, including all other stakeholders [ 58 ]. Bansal [ 59 ] asserted that the three pillars, i.e., environmental integrity, economic prosperity, and social equity, should intersect for sustainability. Alamoush et al. [ 2 ] further related these dimensions of TBL to the planet, profit, and people as synonyms for environment, economic, and social sustainability.

In the context of ports-related studies, various environmental, economic, and social dimensions were adopted to assess the sustainability of the ports using different methodologies [ 2 , 27 ]. Oh et al. [ 29 ] adopted the importance-performance analysis technique to evaluate the sustainability of South Korean ports using 27 vital measures of the sustainability of ports adapted from the findings and discussions of previous research and found that those measures are essential from a port sustainability point of view. Their study classified the indicators of port sustainability in the three dimensions of sustainability as opined in the TBL concept. In contrast to this empirical quantitative approach, Vejvar et al. [ 33 ] adopted a case-study-based approach to study the institutional forces that compel organizations to adopt sustainability practices. However, they adopted open-ended questions to probe the sustainability practices adopted in the selected shipping ports. They performed a cross-case analysis to make the study more generalizable and increase the validity of the findings [ 32 ]. A thematic analysis of the sustainability performance of seaports was conducted, followed by semi-structured interviews. Later, a fuzzy analytical hierarchy process was applied to compute the weight for each port sustainability performance indicator. Their study also categorized the indicators into three dimensions of sustainability performance, namely social, environmental, and economic sustainability performance practices. Another multi-dimensional framework of sustainability practices was empirically tested by Maletič et al. [ 40 ], and they defined sustainability exploitation and exploration as two different sustainability practices. According to them, sustainability exploitation practices aim at incremental improvement in organizational processes, and sustainability exploration challenges current practices with innovative concepts in developing competencies and capabilities for sustainability. However, they also acknowledged the suitability of more objective measures, such as the TBL practices for sustainability studies. Sustainability practices aid organizations in developing opportunities while managing the three dimensions of organizational processes—economic, environmental, and social aspects in value creation over the long term [ 51 ]. In that definition given, profitability is the focus of economic sustainability, protection and concern towards the environment is the focus of environmental sustainability [ 60 , 61 ], and social sustainability focuses on sustained relations with all the stakeholders, including suppliers, customers, employees, and the community as well [ 62 ].

Regarding developing an index related to sustainability, Laxe et al. [ 43 ] developed the “Port Sustainability Synthetic Index” covering economic and environmental indicators using a sample of 16 ports in Spain. Molavi et al. [ 25 ] developed a novel framework for the smart port index for achieving sustainability using key performance indicators (KPI) that can assist in strategy development and policy framing. Their study indicated several sub-domains of environmental domains in the smart port index study, along with other domains such as operations, energy, safety, and security. However, their study mentioned environment-related quantitative KPIs and other domains that can be used to evaluate the smart port index. Still, it did not mention economic and social, although the sub-domain can be related to economic and social dimensions. In contrast, Stanković et al. [ 63 ] developed a novel composite index for assessing the sustainability of seaports covering environmental, economic, and social dimensions through its indicators based on the secondary data available in the Eurostat and the Organization for Economic Co-operation and Development database. However, the environmental dimension captured only air pollution particulate matter value as the only indicator. Their study also mentioned the limitations of not covering many indicators, including social inclusion and waste management, due to the unavailability of the database. These limitations of quantitative data available in secondary databases for index inclusion are also challenging. The data collection across ports is not yet standardized, and the diverse type of cargo handled in ports makes the index not universally adaptable. Mori et al. [ 64 ] had the same opinion about avoiding a synthesized composite index due to the chances of offset in evaluation [ 65 ]. Although indices have benefits, standard data availability limitations for computing indexes are another added concern that limits index adoption for benchmarking and assessment, thus making sustainability index adoption with caveats.

Therefore, following the justifications and theoretical foundations discussed above, this study is grounded in sustainability theory organized around the TBL view, which incorporates the three interrelated dimensions of sustainability—"environmental sustainability, economic sustainability, and social sustainability"—and the relevant sustainability practices focused on shipping ports. Based on previous studies on sustainability and sustainability practices, this study considers three sustainability constructs, namely environmental sustainability practices (EnvSP), economic sustainability practices (EcoSP), and social sustainability practices (SocSP). The indicators thus identified are used as measurement scales to empirically measure, through survey instruments, and objectively evaluate the sustainability practices adopted in shipping ports in India.

3 Methodology

The authors adopted the content validation process prescribed by Barbosa et al. [ 35 ]. The process starts with item identification based on an extensive literature review, instrument assessment by subject matter experts, and instrument evaluation with suitable content validation indices. This was followed by assessing the validity and reliability of the hypothesized model to confirm the theory established in the literature using Confirmatory Factor Analysis (CFA) [ 66 ]. CFA is the most widely adopted statistical technique that helps to determine the underlying structure among a set of latent variables and confirm the reliability of how well the scale measures the proposed concept. Hair et al. [ 67 ] elucidated on CFA as a technique to assess the contribution of each scale of item on the latent variable, which later can be incorporated into the estimation of the relationships in the structural model along with accounting for associated measurement error using the variance-based Partial Least Squares-Structural Equation Modelling (PLS-SEM) framework. Explaining the relationship between the exogenous and endogenous variables and predicting the variation in the relationship is the primary focus of PLS-SEM. The five-stage procedure adopted in the study is shown in the flow chart Fig.  1 below.

Figure 1. Content validation process for the study (authors' own).

3.1 Stage #1. Item identification and face validation

In the first stage, an extensive review of relevant articles related to port studies was performed to compile a comprehensive list of items related to the three dimensions of sustainability, viz. environmental, economic, and social sustainability practices, with the help of a relevant keyword search in the Scopus database in the context of shipping ports. Multiple iterations with different combinations of keywords were performed to gauge the diversity of articles traceable in the scholarly database, and the final set of keywords, [("sustainability") OR ("sustainability practices")] AND [("shipping ports") OR ("maritime ports") OR ("container ports")] AND (scale OR items OR measurement OR indicators OR SEM), was adopted for article identification, followed by screening of the articles for item identification and compilation of the list of items related to sustainability practices. The final set of related articles was critically reviewed to identify the relevant items for sustainability practices at shipping ports. Following the recommendation of Boateng et al. [ 68 ], face validation of the instrument was conducted with review and inputs from two senior academicians and experts with theoretical and practical knowledge of sustainability practices.

3.2 Stage #2. Instrument assessment

The second phase of the content validation process involved selecting a subject matter expert panel to perform the instrument assessment. Typically, experts evaluate content validity, and the recommended number of experts ranges from a minimum of three to a maximum of 10 [ 68 , 69 , 70 ]. Following this prescribed requirement, a panel comprising six experts from academic and port industry backgrounds assessed and validated the items. Barbosa and Cansino [ 35 ] note that no unique formula or approach for selecting an expert panel exists; however, they point out the need for a heterogeneous panel to mitigate the risk of bias in the validation. Therefore, the study included subject experts from the port industry and academicians with experience in port-related research studies.

The relevance and essentiality of the items identified through the literature review were assessed and content validated through the instrument assessment by the panel of experts. For content validation and evaluation, the Content Validity Index (CVI), Cohen's Kappa coefficient, and Lawshe's Content Validity Ratio (CVR) were adopted, as they are the most widely used content validation tools for quantifying the opinions of experts [ 69 , 71 , 72 , 73 , 74 ]. The items were assessed for relevance on a 4-point Likert scale, which captured responses as "1 = not relevant, 2 = somewhat relevant, 3 = quite relevant, and 4 = very relevant" for every item in the measurement instrument. The items were also assessed on a 3-point Likert scale to capture the extent of essentiality, with responses recorded as "1 = not essential; 2 = useful, but not essential; and 3 = essential". An additional comments column was provided for each item so that the experts could add feedback and remarks.

3.3 Stage #3. Content validation using CVI, Cohen’s Kappa, and CVR index

Following the recommendation of [ 75 ], the validity of the instrument content was assessed using the CVI, Cohen's Kappa coefficient, and CVR indices. CVI is a straightforward computation of the agreement among the panelists and can be computed at both the individual item level and the overall scale level [ 71 , 72 , 73 , 76 ]. Accordingly, I-CVI is the validity index for each item of the study's constructs, whereas S-CVI is the index for the overall scale, calculated as the average of the I-CVIs. I-CVI is the ratio of the number of panelists rating a measurement item 3 or above to the total number of panelists evaluating its relevance. Along similar lines, S-CVI is computed across the measurement items in the assessment tool from the items rated 3 or above. To complement and strengthen the assessment of relevance through CVI, Barbosa and Cansino [ 35 ] highlight the benefit of Cohen's Kappa coefficient for content validation, as it accounts for the degree of agreement on a measurement item beyond chance and for the probability that agreement is inflated merely by chance. The formula to compute Cohen's Kappa coefficient is as follows:

\[ \kappa = \frac{\mathrm{I\text{-}CVI} - P_c}{1 - P_c} \]

where the total number of experts is denoted as N, A indicates the number of subject matter experts rating the item as relevant (3 or 4), and P_c is the probability of chance agreement, computed as:

\[ P_c = \frac{N!}{A!\,(N - A)!} \times 0.5^{N} \]

According to Lawshe (1975), the CVR index can be computed using the formula:

\[ \mathrm{CVR} = \frac{n_e - N/2}{N/2} \]

where \( n_e \) denotes the number of experts indicating the item as "essential".

The formulas described above were entered in a spreadsheet for the computation of CVR, Cohen’s Kappa coefficient, and CVI based on the rating given by the experts for the items identified. The scale of relevance and necessity of items marked by each expert was recorded and coded into the spreadsheet to facilitate the computation of indices for every item and the entire scale.
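To make the computation concrete, the following is a minimal Python sketch (not the authors' spreadsheet) of how the three indices could be computed for a single item rated by a six-member panel; the ratings shown are hypothetical.

from math import comb

relevance = [4, 3, 4, 3, 4, 2]   # hypothetical 4-point relevance ratings from 6 experts
essential = [3, 3, 3, 3, 3, 2]   # hypothetical 3-point essentiality ratings from 6 experts

N = len(relevance)
A = sum(1 for r in relevance if r >= 3)      # experts rating the item 3 or 4
i_cvi = A / N                                # item-level content validity index
p_c = comb(N, A) * 0.5 ** N                  # probability of chance agreement
kappa = (i_cvi - p_c) / (1 - p_c)            # modified kappa adjusting I-CVI for chance
n_e = sum(1 for e in essential if e == 3)    # experts rating the item "essential"
cvr = (n_e - N / 2) / (N / 2)                # Lawshe's content validity ratio
print(round(i_cvi, 2), round(kappa, 2), round(cvr, 2))

The S-CVI reported in the study would then simply be the average of the I-CVI values across all items in the instrument.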

3.4 Stage #4. Reliability and validity using confirmatory factor analysis

3.4.1 Sampling and data collection

The content-validated questionnaire instrument was administered online through Microsoft Forms, as well as offline, to port employees working at the mid and senior management (officer category) level in various major ports located on both the western and eastern coastal belts of India, in order to test the validity and reliability of the measurement model. The instrument captured the respondents' demographics and their perceptions of how much the port focuses on the three pillars of sustainability practices adopted in their respective ports. Many authors have recommended determining sample size in business management and social science research using G*Power [ 77 , 78 , 79 , 80 ]. As per this calculation, for a medium effect size of 0.15, a 5% significance level, and a power level of 80%, the recommended minimum sample size was 166. Further, Hair et al. [ 67 ] have outlined a guideline minimum sample size of 150 for a model structure with seven or fewer constructs. However, to avert any possible statistical loss, the sample size was set 20 percent above 166, thereby establishing the required sample size for the study as 200. The respondents indicated their level of agreement with each item on a Likert scale of 1 to 5, with 1 indicating "Strongly Disagree" and 5 indicating "Strongly Agree". The data collection activity was carried out between January and December 2023, until the required number of usable responses was received for further analysis.
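As a quick check of the 20 percent buffer arithmetic described above:

\[ n = 166 \times 1.20 = 199.2 \;\Rightarrow\; n \approx 200 \]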

3.4.2 Reliability and validity of the measurement model

Confirmatory factor analysis (CFA) was performed in IBM AMOS, using the sample data collected, to confirm the factor structure of the sustainability practices dimensions. In contrast to measurement error, reliability is the indicator of the "degree to which the observed variable measures the true value and is error-free" [ 66 ]. It is also an "assessment of the degree of consistency between multiple measurements of a variable and the set of variables being measured." Model fit, reliability, and construct validity indices were assessed based on the recommendations of [ 81 , 82 ]. Construct reliability was evaluated using Cronbach's alpha, composite reliability, and Average Variance Extracted (AVE) values. Construct validity was assessed using convergent and discriminant validity measures.
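As an illustration of how these reliability measures relate to CFA output, here is a minimal Python sketch (not the authors' AMOS workflow) that computes composite reliability and AVE from a set of standardized factor loadings, and Cronbach's alpha from an item-score matrix; the loadings and data are hypothetical, and the random scores only illustrate the shape of the computation.

import numpy as np

# hypothetical standardized loadings for one construct (e.g., EnvSP)
loadings = np.array([0.72, 0.68, 0.75, 0.70, 0.66])

# composite reliability: (sum of loadings)^2 / ((sum of loadings)^2 + sum of error variances)
error_var = 1 - loadings ** 2
cr = loadings.sum() ** 2 / (loadings.sum() ** 2 + error_var.sum())

# average variance extracted: mean of the squared loadings
ave = (loadings ** 2).mean()

# Cronbach's alpha from a (respondents x items) score matrix
scores = np.random.default_rng(0).integers(1, 6, size=(200, 5))  # hypothetical 5-point Likert data
k = scores.shape[1]
item_var = scores.var(axis=0, ddof=1).sum()
total_var = scores.sum(axis=1).var(ddof=1)
alpha = k / (k - 1) * (1 - item_var / total_var)

print(round(cr, 3), round(ave, 3), round(alpha, 3))

The cut-offs discussed later in the paper (for example, AVE around 0.50) would be checked against values computed in this way for each of the EnvSP, EcoSP, and SocSP constructs.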

3.5 Stage #5. Structural equation modeling

SEM methodology facilitates the indirect measurement of unobserved latent variables with the measurement of indicator items for the variables in the model structure [ 82 ]. SEM methodology assesses how the latent variables in the model are related to one another and accounts for any errors in the measurement of the observed variables. Therefore, we adopted the PLS-SEM technique to estimate the relationship between the three dimensions of sustainability and their contribution to the overall sustainability construct through different item indicators for each of the dimensions of sustainability. Further, as per the recommendation of Hair et al. [ 83 ], the variance-based PLS-SEM framework is more suitable for our study because the sample size is comparatively less, and the normal distribution assumption is not significant for our study due to the innate nature of the items measuring the dimensions of sustainability. To specify the model parameters and estimate the relationship between the higher and lower-order constructs, we used SmartPLS [ 84 ], the most popular tool for PLS-SEM.

4 Results

In the first stage of the analysis, items were identified based on an extensive literature review and face validation; this was followed by instrument assessment by subject matter experts and instrument evaluation with suitable content validation indices. The validity and reliability of the hypothesized model were then assessed to confirm the theory using CFA, leading to path relationship assessment using the PLS-SEM methodology.

4.1 Item identification and face validation

The literature review compiled a comprehensive initial list of 48 items as indicators of sustainability practices adopted in shipping ports, along with their sources (refer to Appendix 1). The initial list comprised 17 items as indicators of environmental sustainability practices, 19 as indicators of economic sustainability practices, and 12 as indicators of social sustainability practices adopted in shipping ports. The initial draft of the measuring instrument was subjected to face validation. The inputs from the two senior academicians and experts who carried out the face validation were incorporated; these included the elimination of ambiguous terms, the inclusion of indicators missing from the initial list, rephrasing of sentences in the instrument for a better understanding of the study context, and final formatting of the layout [ 68 ]. After incorporating the face-validation corrections, the items were compiled, along with their sources, into the measurement instrument for content validation in the next stage.

4.2 Instrument assessment and content validation using CVI, Kappa, and CVR index

In the second stage, the six selected subject matter experts conducted the content validation of the face-validated instrument. "Content validity is a subjective approach that evaluates the extent to which the content described through a scale measures certain factors of study interest." Content validation evaluates whether the items in the questionnaire instrument are clear, readable, and relevant to the study context [ 85 ]. The relevance and essentiality of the items were assessed and validated through the instrument assessment by the panel of subject matter experts. The assessment tool was administered to the six impaneled subject experts. Table 1 summarizes the profile of the experts who participated in validating the questionnaire items.

CVI and Kappa coefficients were calculated to assess the relevance of the items, and CVR was calculated to determine the items’ essentiality for the study context [ 74 ]. The responses of the experts on the item’s relevancy and essentiality were coded to spreadsheets for computation of CVI, Kappa coefficient, and CVR value as per the respective formulas [ 69 , 71 , 72 , 73 , 74 ]. The results after the computation of CVI, Kappa coefficient, and CVR are consolidated in Appendix 2 .

CVI indicates the proportion of experts who agreed on the tool and the measurement items for a given construct, treating ratings of 1 and 2 as invalid and ratings of 3 and 4 as valid content consistent with the study's conceptual framework [ 74 ]. Adopting the cut-offs suggested in previously established studies [ 69 , 74 ], items with a CVI of at least 0.84 were accepted in the validation. The validation indicated an S-CVI of 0.86, which satisfies the minimum requirement of 0.80 per Shrotryia et al. [ 86 ] for an instrument to be considered content valid. Along with that, Cohen's Kappa coefficient was also computed, with a cut-off of 0.74, to avoid errors due to chance agreement by the expert panel [ 71 , 72 , 73 , 75 , 76 ]. According to Lawshe's benchmark, for a panel size of 6 the prescribed cut-off CVR value is 0.99, which indicates agreement among the panel judges on the item's necessity in the study questionnaire. Based on these inclusion criteria of CVI, Kappa, and CVR, the most essential and relevant items were shortlisted, and the final questionnaire was administered for construct validation. The finalized instrument for measuring the extent of adoption of sustainability practices in shipping ports is shown in Appendix 3.

4.3 Construct validity and reliability using confirmatory factor analysis and PLS-SEM

Empirical studies attempt to validate and justify the research framework developed with the help of primary data collected from respondents through a questionnaire instrument. Since the analysis depends solely on the data collected through the instrument, and the data collected are not accurate measurements of the factors of interest but observations of the respondents' perceptions, the questionnaire should be subjected to validation and reliability checks [ 85 ]. The validation and reliability checking procedures aim to measure and address the measurement error caused by the difference between the actual scores and the measured or observed scores [ 66 ]. Validity exemplifies the extent to which the collected data represent the study's primary purpose, in other words, "measuring what it proposes to measure." The content-validated measurement instrument was administered to port employees of officer designation and above across various major ports in India for data collection. Table 2 shows the demographic profile of the respondents who answered the questions in the administered instrument.

The goodness-of-fit indices were evaluated for the reflective measurement model considering the recommendations of [ 81 , 82 ]. The model fit indices for the hypothesized model were acceptable considering the recommended benchmark values [ 66 , 87 ]. The results (\( \chi^{2}/df \) = 1.6; Goodness-of-Fit Index (GFI), Tucker-Lewis Index (TLI), and Comparative Fit Index (CFI) > 0.9; Standardized Root Mean Square Residual (SRMR) = 0.053; Root Mean Square Error of Approximation (RMSEA) = 0.055) indicated acceptable model fit as per the recommendations. The standardized factor loadings, construct validity, and reliability values are shown in Table 3.
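For readers who want to reproduce this kind of fit assessment without AMOS, the following is a minimal sketch using the open-source Python package semopy; the three-factor model specification, the item names, and the data file name are illustrative assumptions, not the authors' actual setup.

import pandas as pd
import semopy

# lavaan-style specification of a three-factor CFA (item names are hypothetical)
model_desc = """
EnvSP =~ env1 + env2 + env3 + env4
EcoSP =~ eco1 + eco2 + eco3 + eco4
SocSP =~ soc1 + soc2 + soc3 + soc4
"""

data = pd.read_csv("port_survey_responses.csv")  # hypothetical file of Likert-scale responses

model = semopy.Model(model_desc)
model.fit(data)

# calc_stats returns fit indices such as chi-square, CFI, TLI, and RMSEA
print(semopy.calc_stats(model).T)

The standardized loadings from such a model could then feed the composite reliability and AVE calculations sketched earlier.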

Although Hair et al. [ 67 , p. 152–153] suggest a minimum factor loading benchmark of 0.70 for statistical significance in general, loadings of 0.50 or above may be considered practically significant, and a further guideline treats loadings greater than 0.40 as statistically significant for a sample size of 200. Further, following the recommendation of Chin et al. [ 88 ] and considering the practical significance of items with loadings above 0.6, we consider all items with a factor loading above 0.60 acceptable in the model structure. Therefore, all 26 items are retained in the measurement instrument. Construct reliability was assessed using Cronbach's alpha and composite reliability, with values between 0.85 and 0.90. Following the reference guidelines of Hair et al. [ 89 ], these measures indicate good and acceptable internal consistency, thereby establishing the scale's reliability in measuring the constructs.

Construct validity was evaluated using convergent and discriminant validity measures. Except for the EcoSP construct, the constructs (EnvSP and SocSP) had AVE values above the minimum benchmark of 0.50; EcoSP was very close at 0.49, which can be rounded to 0.50, the acceptable benchmark for establishing the convergent validity of the measurement model [ 90 ]. There are also recommendations that a marginal shortfall in AVE is acceptable when Cronbach's alpha and composite reliability are higher than 0.60 [ 89 , 90 ]. These results indicate acceptable reliability of the scale for measuring sustainability practices in ports.

Hair et al. [ 89 ] emphasize two established measures of discriminant validity in a model, viz. the Fornell–Larcker criterion and the Heterotrait-Monotrait (HTMT) ratio. In the Fornell–Larcker approach, the inter-construct correlations that measure the shared variance between latent variables are compared with the square root of the construct's average variance extracted. The square root of the AVE of the construct under consideration is expected to be greater than that construct's highest inter-construct correlation, which signifies the shared variance with other constructs of the model. The square root of the AVE of each construct was compared with the corresponding correlation measures and was found to be greater than the respective correlation values, thereby ascertaining the discriminant validity of the constructs. In the HTMT approach, the estimated correlations are also termed unattenuated correlations, and an unattenuated correlation close to 1 implies an absence of discriminant validity. The benchmark value for the HTMT ratio is 0.90, and any measure above this threshold implies the absence of discriminant validity of the constructs [ 91 , 92 ]. All the HTMT ratio values were less than 0.9, thus satisfying the discriminant validity requirement of the measurement scale.
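Since the HTMT ratio is a function of item correlations, it can be sketched directly; the Python snippet below (a simplified illustration with hypothetical data, not the SmartPLS implementation) computes the HTMT ratio between two constructs as the mean between-construct item correlation divided by the geometric mean of the two mean within-construct item correlations.

import numpy as np

def htmt(items_a: np.ndarray, items_b: np.ndarray) -> float:
    """HTMT ratio between two constructs given (respondents x items) matrices."""
    corr = np.corrcoef(items_a, items_b, rowvar=False)
    k_a, k_b = items_a.shape[1], items_b.shape[1]
    hetero = corr[:k_a, k_a:].mean()                              # mean correlation across constructs
    mono_a = corr[:k_a, :k_a][np.triu_indices(k_a, 1)].mean()     # mean within-construct correlation (A)
    mono_b = corr[k_a:, k_a:][np.triu_indices(k_b, 1)].mean()     # mean within-construct correlation (B)
    return hetero / np.sqrt(mono_a * mono_b)

# hypothetical correlated item scores generated from two related latent factors
rng = np.random.default_rng(1)
f_env = rng.normal(size=(200, 1))
f_eco = 0.4 * f_env + rng.normal(size=(200, 1))
env_items = f_env + 0.6 * rng.normal(size=(200, 4))
eco_items = f_eco + 0.6 * rng.normal(size=(200, 4))

print(round(htmt(env_items, eco_items), 3))   # values below 0.90 suggest discriminant validity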

Variance-inflation-factor (VIF) was checked for the possibility of multi-collinearity issues [ 89 , 91 , 93 , 94 ]. Multi-collinearity was ruled out as all the VIF values were less than three. The above results support the reliability and validity of the sustainability constructs as collective indicators of the three dimensions of sustainability viz economic, environmental, and social sustainability, and confirm the relationship. Further, the bootstrapping procedure was run to test the significance of the path. The standardized path coefficient values, T-statistics, and p-values shown in Table  4 explain the variance of the three dimensions of the sustainability practice construct. The p-values (< 0.05) indicate that all the structural model relationships are statistically significant.
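Collinearity checks like the one described above can be reproduced with standard tooling; the sketch below uses statsmodels' variance_inflation_factor on a hypothetical matrix of construct scores, flagging any variable whose VIF reaches 3 or more.

import numpy as np
import pandas as pd
from statsmodels.stats.outliers_influence import variance_inflation_factor
from statsmodels.tools.tools import add_constant

rng = np.random.default_rng(2)
X = pd.DataFrame(rng.normal(size=(200, 3)), columns=["EnvSP", "EcoSP", "SocSP"])  # hypothetical scores

X_const = add_constant(X)   # include an intercept so the VIFs are not artificially inflated
vifs = {col: variance_inflation_factor(X_const.values, i)
        for i, col in enumerate(X_const.columns) if col != "const"}
print(vifs)                 # values below 3 suggest multi-collinearity is not a concern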

5 Discussion and implications of the study

The authors followed a systematic procedure of compiling a comprehensive list of related items for the three sustainability practices constructs through an extensive literature review, followed by face validation and content validation to assess the relevance and essentiality of the items in the context of shipping ports in India. As noted above, because the analysis depends solely on the data collected through the instrument, and those data reflect the respondents' perceptions rather than exact measurements of the factors of interest, the questionnaire must be subjected to validation and reliability checks [ 85 ]. The content-validated instrument was therefore subjected to empirical evaluation with the collected sample data, using the CFA technique to ascertain the reliability and validity of the model.

Specifically, the results indicate that the subject matter experts prioritized essential and relevant items in the contemporary business environment, giving nearly equal weight and importance to all three dimensions of sustainability practices: environmental, economic, and social. Among the items validated, the expert panel showed the least agreement on the relevance and necessity of the foreign direct investment and funding items, which reflects the fact that shipping ports in India are primarily funded by the government, since the minor and major ports that make up most of the country's ports are controlled and administered by the state and central governments, respectively. The same reason can be attributed to the low relevance of job security in the context of Indian shipping ports. Further, the items related to odor and smoke also received low relevance ratings, as they indicate the low degree of industrial development within Indian shipping ports. Although rated as relevant, cold-ironing power sources for vessels on the berth received a low degree of agreement on necessity. Items on recognizing the requirements of, and supporting, the local community likewise received little agreement on necessity; however, the panelists' remarks highlight that these focus areas are already part of corporate social responsibility, and there is no need to assess them separately.

Content validation evaluates whether the items in the questionnaire instrument are clear, readable, and relevant to the study context [ 85 ]. After face and content validation of the instrument, the finalized list comprised eight items as indicators of EnvSP, ten as indicators of EcoSP, and eight as indicators of SocSP adopted in shipping ports. Thus, the content-validated questionnaire instrument comprised 26 items for measuring the constructs of the study, which is close to the number of items Oh et al. [ 29 ] adopted in their port sustainability study, where 27 vital measures adapted from previous research were used in an importance-performance analysis of South Korean ports and classified into the three TBL dimensions of sustainability. Along similar lines, Narasimha et al. [ 32 ] conducted a thematic analysis of the sustainability performance of seaports followed by semi-structured interviews, and later applied a fuzzy analytical hierarchy process to compute the weight for each port sustainability performance indicator; their study likewise categorized the indicators into the three dimensions of sustainability performance, namely social, environmental, and economic. Therefore, it can be interpreted from the results that these content-validated items are reflective indicators of the sustainability practice constructs and collectively constitute latent variables for empirical studies, confirming that the measurement model reflects construct validity.

More specifically, this study supports the well-established "Triple Bottom Line" (TBL) theory of sustainability coined by Elkington [ 52 ]: the validated sustainability practice-related items in the measuring instrument adequately represent the seaport domain, and the instrument can be used for measuring the constructs in empirical studies. The Sustainable Development Goals (SDGs) of the United Nations Development Programme (UNDP) likewise call for integrated sustainable development that balances the three pillars of sustainability: environmental, economic, and social. Chang and Kuo [ 95 ] advise organizations to pursue both short- and long-term sustainable practices, securing short-term earnings while safeguarding the environment and social integrity simultaneously. Thus, at the strategic level, the TBL practices are the higher-order constructs of sustainability practices, focusing on the long term [ 96 ]. Therefore, the findings of this study contribute to the extant body of knowledge by providing empirical evidence on the practical and statistical relationship between the environmental, economic, and social sustainability-related practices of the sustainability construct in the TBL theory-based framework applied to shipping ports.

Yadav et al. [ 97 ] also emphasized the availability of several environmental management systems (EMS) for achieving environmental sustainability, and recommended introducing methods that promote a green culture, support green behavior, and improve employee commitment to environmental sustainability. The social dimension of sustainability primarily focuses on facilitating equitable opportunities and the well-being of port employees and other stakeholders, including the local community, driven by the policies and practices of the port authority. Alamoush et al. [ 2 ] equated economic sustainability with generating revenue and monetary gains and considered it one of the primary drivers of the other two dimensions—environmental and social sustainability. Further, the results of the PLS-SEM analysis indicate that the most significant contribution towards the overall sustainability of the port comes from the economic dimension. Consistent with the findings of Alamoush et al. [ 2 ], financial investments in the port are the drivers of environmental and social sustainability. Poulsen et al. [ 98 ] showed that air quality improved even as cargo throughput increased, mainly driven by financial investments in air quality control systems in many ports across Europe.

The improvement in air quality around the port vicinity contributes to environmental sustainability. In addition, it also contributes to social sustainability as the community and the port surroundings, including the ecosystems and the natural habitat for birds and animals, experience better living conditions around the port vicinity. This affirms the indirect benefits achieved in environmental and social dimensions by implementing economic sustainability-related strategies and policies. Our findings also emphasize the need for an integrative approach to achieving sustainability of ports, and it can be achieved only when all three dimensions intersect and contribute to complement each other for overall sustainable development.

This study contributes with both novel theoretical and practical implications. Firstly, the study provides a comprehensive list of items about the indicators of sustainability practices in shipping ports, which are available in published scholarly articles and from domain experts working in the port industry. Secondly, as the first of its kind in the seaport sector, the study adopted a scientific content validation approach of indices and procedures to assess the relevance and essentiality of items in the context of shipping ports and contemporary sustainability practices focused on shipping ports. Our study validated an instrument for assessing the sustainability practices in shipping ports, which is a significant step in formulating policies and developing strategies focusing on the sustainable development of ports. The validated instrument can be adapted to determine the extent of adoption of sustainability practices and drive the necessary implementation through policy centered around the sustainability of shipping ports. The instrument can be a guideline for practitioners, policymakers, and researchers focusing on the sustainable development of shipping ports through environmental, economic, and social sustainability practices. Ports authorities can embrace the validated instrument to assess their level of adoption and focus on these sustainability practices, which will aid in developing policies and strategies for the sustainable development of ports. Further, the Global Reporting Initiative (GRI) Standards, developed by the Global Sustainability Standards Board (GSSB) primarily for sustainability reporting, can be referred to along with our validated instrument for sustainability evaluation and reporting in compliance with the GRI standards [ 99 ]. GRI Standards assist organizations in understanding and reporting the extent to which the organization impacts sustainability and contributes to sustainable development, considering the interests of all the stakeholders, including investors, policymakers, capital markets, and civil society, thus making the organization transparent and responsible for sustainability. Sector-specific standards have been developed, of which ports are part of Group 3, which comprises various Transport, infrastructure, and tourism-related sectors. However, it is not readily available for shipping ports but can be developed and customized by the port authorities. To do so, the findings of the validated instrument of our study can be considered as a guide in assessing and preparing the sustainability report as per the applicable GRI standards.

Further, sustainability assessment should not be treated as a one-time activity in the port. Instead, port authorities should have strategies and policies in place to track trends and changes in the extent of adoption of sustainability practices and their impact on sustainable development. Each port should carry this out through the team, department, or personnel responsible for sustainability assessment and policy implementation, and it should be a continuous activity conducted at regular intervals, for example every three or six months, depending on policy and management decisions. Such a longitudinal assessment, which keeps track of the various aspects of sustainability, will help the port evaluate the effectiveness of the sustainability interventions implemented at shipping ports.
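As a minimal sketch of such longitudinal tracking, assuming assessment results are stored as tidy records with a wave label, the snippet below aggregates dimension scores by assessment wave and reports the change between consecutive waves. The column names, wave labels, and scores are illustrative assumptions, not data from the study.

```python
import pandas as pd

# Hypothetical tidy records of periodic sustainability assessments: one row per
# wave and dimension. Column names and wave labels are illustrative only.
records = pd.DataFrame(
    {
        "wave": ["2023-Q1"] * 3 + ["2023-Q2"] * 3,
        "dimension": ["environmental", "economic", "social"] * 2,
        "score": [3.8, 4.1, 3.6, 4.0, 4.2, 3.9],
    }
)

# Mean composite per dimension per wave, and the change since the previous wave.
trend = records.pivot_table(index="wave", columns="dimension", values="score", aggfunc="mean")
print(trend)
print(trend.diff())  # positive values indicate improvement over the previous assessment
```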

6 Limitations and scope for future work

Although the study achieved its objective of contributing a novel, content-validated instrument for assessing sustainability practices in seaports, it has a few limitations that also suggest directions for future work. The literature search was confined to articles published in the Scopus database; future work can extend the search to other scholarly databases and thereby broaden the pool of items relevant to measuring sustainability practices in shipping ports. The study was limited to government-controlled major ports on India's east and west coasts; owing to permission and access-related challenges, data collection did not cover privately managed ports. The items are generalized to a shipping port as a whole, and further research could refine them for the specific type of cargo handled or restrict the scope to individual terminals rather than a port-wide study irrespective of the kind of cargo being handled. The application of digital technology and automation using artificial intelligence and machine learning, along with big data and blockchain technology, could be explored to assess their impact on sustainable port management and development. A different methodological approach could also be adopted, as in Yadav et al. [ 97 ], where a multi-criteria decision-making (MCDM) approach was used to identify enablers of sustainability and the robust best-worst method (RBWM) was used to determine their intensity; their analysis identified economic and environmental enablers as the high-intensity enablers that organizations can focus on. Other stakeholders, such as customers, port users, government agencies linked to port operations, and the local community, were not part of the validation panel; future studies can include these stakeholders so that every perspective is covered in the evaluation. Finally, the items were administered on a 5-point Likert scale and captured only the perceptions of port employees regarding the sustainability practices adopted in the port; a suitable triangulation method and case studies could be used to analyze the qualitative aspects of adopting sustainability practices in ports.

7 Conclusion

The study validated an instrument for assessing sustainability practices in shipping ports, which is a significant step in formulating strategies for the sustainable development of ports. The instrument can serve as a guideline for practitioners, policymakers, and researchers focusing on sustainable port development through environmental, economic, and social sustainability practices. The study prepared a comprehensive list of relevant items identified through a thorough review of articles published in the Scopus database. After face validation, the measurement tool was administered to six subject matter experts, who evaluated the relevance and essentiality of its items for measuring sustainability practices in shipping ports. Content validity was assessed using widely adopted indices: the content validity index (CVI), Cohen's kappa coefficient, and the content validity ratio (CVR). The CVI and Cohen's kappa coefficient assess the relevance of the items, while the CVR determines their essentiality. Further, this study contributes to the extant literature by providing empirical evidence on the relationships among the environmental, economic, and social sustainability practices of the sustainability construct within the TBL theory-based framework applied to shipping ports.

Data availability

The analysis in this study was based on survey data collected through a Likert-scale questionnaire administered in both online and offline modes. Data were collected between December 2022 and December 2023. The instrument included a declaration that participants' privacy would be maintained; therefore, the data cannot be made public. The primary data collected in the study are not publicly accessible but are available from the corresponding author upon reasonable request.

References

Meixell MJ, Luoma P. Stakeholder pressure in sustainable supply chain management: a systematic review. Int J Phys Distrib Logist Manag. 2015;45:69–89. https://doi.org/10.1108/IJPDLM-05-2013-0155 .

Alamoush AS, Ballini F, Ölçer AI. Revisiting port sustainability as a foundation for the implementation of the United Nations Sustainable Development Goals (UN SDGs). J Shipp Trade. 2021;6(1):1–40. https://doi.org/10.1186/S41072-021-00101-6 .

Dyllick T, Hockerts K. Beyond the business case for corporate sustainability. Bus Strateg Environ. 2002;11(2):130–41. https://doi.org/10.1002/bse.323 .

Lun YHV, Lai K, Wong CWY, Cheng TCE. Green shipping management. Cham: Springer International Publishing; 2016. https://doi.org/10.1007/978-3-319-26482-0 .

Porter ME, Van Der Linde C. Green and competitive: ending the stalemate. In: Corporate environmental responsibility. 2017. p. 47–60. https://doi.org/10.1016/0024-6301(95)99997-e .

Russo MV, Fouts PA. A resource-based perspective on corporate environmental performance and profitability. Acad Manag J. 1997;40(3):534–59. https://doi.org/10.2307/257052 .

Roszkowska-Menkes M. Porter and Kramer’s (2006) “shared value.” In: Encyclopedia of sustainable management. Cham: Springer International Publishing; 2021. p. 1–6. https://doi.org/10.1007/978-3-030-02006-4_393-1 .

Zhu Q, Sarkis J. Relationships between operational practices and performance among early adopters of green supply chain management practices in Chinese manufacturing enterprises. J Oper Manag. 2004;22(3):265–89. https://doi.org/10.1016/j.jom.2004.01.005 .

Hong J, Zhang Y, Ding M. Sustainable supply chain management practices, supply chain dynamic capabilities, and enterprise performance. J Clean Prod. 2018;172:3508–19. https://doi.org/10.1016/J.JCLEPRO.2017.06.093 .

Ministry of Ports Shipping and Waterways. Maritime India vision 2030. Sagarmala; 2021.

Pradhan RP, Rathi C, Gupta S. Sagarmala & India’s maritime big push approach: seaports as India’s geo-economic gateways & neighborhood maritime lessons. J Indian Ocean Reg. 2022;18(3):209–29. https://doi.org/10.1080/19480881.2022.2114195 .

Mantry S, Ghatak RR. Comparing and contrasting competitiveness of major Indian and select international ports. Int J Res Finance Mark. 2017;7(5):1–19.

Song DW, Panayides PM. Global supply chain and port/terminal: integration and competitiveness. In: Maritime policy and management. London: Taylor & Francis; 2008. p. 73–87. https://doi.org/10.1080/03088830701848953 .

Yap WY, Lam JSL. 80 million-twenty-foot-equivalent-unit container port? Sustainability issues in port and coastal development. Ocean Coast Manag. 2013;71:13–25. https://doi.org/10.1016/j.ocecoaman.2012.10.011 .

Lee PTW, Kwon OK, Ruan X. Sustainability challenges in maritime transport and logistics industry and its way ahead. Sustainability. 2019;11(5):1331. https://doi.org/10.3390/SU11051331 .

Dragović B, Tzannatos E, Park NK. Simulation modelling in ports and container terminals: literature overview and analysis by research field, application area and tool. Flex Serv Manuf J. 2017;29(1):4–34. https://doi.org/10.1007/s10696-016-9239-5 .

Ashrafi M, Acciaro M, Walker TR, Magnan GM, Adams M. Corporate sustainability in Canadian and US maritime ports. J Clean Prod. 2019;220:386–97. https://doi.org/10.1016/j.jclepro.2019.02.098 .

Peris-Mora E, Orejas JMD, Subirats A, Ibáñez S, Alvarez P. Development of a system of indicators for sustainable port management. Mar Pollut Bull. 2005;50(12):1649–60. https://doi.org/10.1016/j.marpolbul.2005.06.048 .

Ashrafi M, Walker TR, Magnan GM, Adams M, Acciaro M. A review of corporate sustainability drivers in maritime ports: a multi-stakeholder perspective. Marit Policy Manag. 2020;47(8):1027–44. https://doi.org/10.1080/03088839.2020.1736354 .

Stanković JJ, Marjanović I, Papathanasiou J, Drezgić S. Social, economic and environmental sustainability of port regions: MCDM approach in composite index creation. J Mar Sci Eng. 2021;9(1):74. https://doi.org/10.3390/JMSE9010074 .

Dinwoodie J, Tuck S, Knowles H, Benhin J, Sansom M. Sustainable development of maritime operations in ports. Bus Strateg Environ. 2011. https://doi.org/10.1002/bse.718 .

Ports primer: 7.1 environmental impacts | US EPA. https://www.epa.gov/community-port-collaboration/ports-primer-71-environmental-impacts . Accessed Apr 28 2024.

Notteboom T, Pallis A, Rodrigue J-P. Port economics, management and policy. Port Econ Manag Policy. 2021. https://doi.org/10.4324/9780429318184 .

Notteboom T, van der Lugt L, van Saase N, Sel S, Neyens K. The role of seaports in green supply chain management: initiatives, attitudes, and perspectives in Rotterdam, Antwerp, North Sea Port, and Zeebrugge. Sustainability. 2020;12(4):1688. https://doi.org/10.3390/su12041688 .

Molavi A, Lim GJ, Race B. A framework for building a smart port and smart port index. Int J Sustain Transp. 2020;14(9):686–700. https://doi.org/10.1080/15568318.2019.1610919 .

Wu Q, He Q, Duan Y. Explicating dynamic capabilities for corporate sustainability. EuroMed J Bus. 2013;8(3):255–72. https://doi.org/10.1108/EMJB-05-2013-0025 .

Argyriou I, Daras T, Tsoutsos T. Challenging a sustainable port. A case study of Souda port, Chania, Crete. Case Stud Transp Policy. 2022;10(4):2125–37. https://doi.org/10.1016/J.CSTP.2022.09.007 .

Bjerkan KY, Seter H. Reviewing tools and technologies for sustainable ports: does research enable decision making in ports? Transp Res D Transp Environ. 2019;72:243–60. https://doi.org/10.1016/j.trd.2019.05.003 .

Oh H, Lee S-W, Seo Y-J. The evaluation of seaport sustainability: the case of South Korea. Ocean Coast Manag. 2018;161:50–6. https://doi.org/10.1016/j.ocecoaman.2018.04.028 .

Lu CS, Shang KC, Lin CC. Examining sustainability performance at ports: port managers’ perspectives on developing sustainable supply chains. Marit Policy Manag. 2016;43(8):909–27. https://doi.org/10.1080/03088839.2016.1199918 .

Kang D, Kim S. Conceptual model development of sustainability practices: the case of port operations for collaboration and governance. Sustainability. 2017;9(12):2333. https://doi.org/10.3390/su9122333 .

Narasimha PT, Jena PR, Majhi R. Sustainability performance assessment framework for major seaports in India. Int J Sustain Dev Plan. 2022;17(2):693–704. https://doi.org/10.18280/ijsdp.170235 .

Vejvar M, Lai K, Lo CKY, Fürst EWM. Strategic responses to institutional forces pressuring sustainability practice adoption: case-based evidence from inland port operations. Transp Res D Transp Environ. 2018;61:274–88. https://doi.org/10.1016/j.trd.2017.08.014 .

Ayre C, Scally AJ. Critical values for Lawshe’s content validity ratio: revisiting the original methods of calculation. Meas Eval Couns Dev. 2014;47(1):79–86. https://doi.org/10.1177/0748175613513808 .

Barbosa MW, Cansino JM. A water footprint management construct in agri-food supply chains: a content validity analysis. Sustainability. 2022;14(9):4928. https://doi.org/10.3390/su14094928 .

Waltz CF, Strickland OL, Lenz ER. Measurement in nursing and health research. 2016. https://doi.org/10.1891/9780826170620 .

Ibiyemi A, Mohd Adnan Y, Daud MN, Olanrele S, Jogunola A. A content validity study of the test of valuers’ support for capturing sustainability in the valuation process in Nigeria. Pac Rim Prop Res J. 2019;25(3):177–93. https://doi.org/10.1080/14445921.2019.1703700 .

Diniz NV, Cunha DR, de Santana Porte M, Oliveira CBM, de Freitas Fernandes F. A bibliometric analysis of sustainable development goals in the maritime industry and port sector. Reg Stud Mar Sci. 2024;69: 103319. https://doi.org/10.1016/j.rsma.2023.103319 .

Amui LBL, Jabbour CJC, de Sousa Jabbour ABL, Kannan D. Sustainability as a dynamic organizational capability: a systematic review and a future agenda toward a sustainable transition. J Clean Prod. 2017;142:308–22. https://doi.org/10.1016/j.jclepro.2016.07.103 .

Maletič M, Maletič D, Gomišček B. The impact of sustainability exploration and sustainability exploitation practices on the organisational performance: a cross-country comparison. J Clean Prod. 2016. https://doi.org/10.1016/j.jclepro.2016.02.132 .

Berns M, Hopkins MS, Townend A, Khayat Z, Balagopal B, Reeves M. The business of sustainability: what it means to managers now. MIT Sloan Manag Rev. 2009;51(1).

Montiel I, Delgado-Ceballos J. Defining and measuring corporate sustainability. Organ Environ. 2014;27(2):113–39. https://doi.org/10.1177/1086026614526413 .

Laxe FG, Bermúdez FM, Palmero FM, Novo-Corti I. Assessment of port sustainability through synthetic indexes. Application to the Spanish case. Mar Pollut Bull. 2017;119(1):220–5. https://doi.org/10.1016/j.marpolbul.2017.03.064 .

Torugsa NA, O’Donohue W, Hecker R. Proactive CSR: an empirical analysis of the role of its economic, social and environmental dimensions on the association between capabilities and performance. J Bus Ethics. 2013. https://doi.org/10.1007/s10551-012-1405-4 .

Lauring J, Thomsen C. Collective ideals and practices in sustainable development: managing corporate identity. Corp Soc Responsib Environ Manag. 2009;16(1):38–47. https://doi.org/10.1002/csr.181 .

Hallstedt SI, Thompson AW, Lindahl P. Key elements for implementing a strategic sustainability perspective in the product innovation process. J Clean Prod. 2013;51:277–88. https://doi.org/10.1016/J.JCLEPRO.2013.01.043 .

Parola F, Risitano M, Ferretti M, Panetti E. The drivers of port competitiveness: a critical review. Transp Rev. 2017;37(1):116–38. https://doi.org/10.1080/01441647.2016.1231232 .

Simpson J, Weiner E, Durkin P. The Oxford English dictionary today. Trans Philol Soc. 2004;102(3):335–81. https://doi.org/10.1111/j.0079-1636.2004.00140.x .

Ruggerio CA. Sustainability and sustainable development: a review of principles and definitions. Sci Total Environ. 2021;786: 147481. https://doi.org/10.1016/J.SCITOTENV.2021.147481 .

Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):1–8. https://doi.org/10.1186/S13012-017-0637-1/TABLES/3 .

Elkington J. Partnerships from cannibals with forks: the triple bottom line of 21st-century business. Environ Qual Manag. 1998;8(1):37–51. https://doi.org/10.1002/tqem.3310080106 .

Elkington J. Triple bottom line. In: Cannibals with forks. Oxford: Capstone; 1997.

Waddock SA, Graves SB. The corporate social performance-financial performance link. Strateg Manag J. 1997;18(4):303–19. https://doi.org/10.1002/(SICI)1097-0266(199704)18:4%3c303::AID-SMJ869%3e3.0.CO;2-G .

Sharma S, Vredenburg H. Proactive corporate environmental strategy and the development of competitively valuable organizational capabilities. Strateg Manag J. 1998;19(8):729–53. https://doi.org/10.1002/(sici)1097-0266(199808)19:8%3c729::aid-smj967%3e3.3.co;2-w .

Carroll AB, Shabana KM. The business case for corporate social responsibility: a review of concepts, research and practice. Int J Manag Rev. 2010;12(1):85–105. https://doi.org/10.1111/j.1468-2370.2009.00275.x .

Beske P. Dynamic capabilities and sustainable supply chain management. Int J Phys Distrib Logist Manag. 2012;42(4):372–87. https://doi.org/10.1108/09600031211231344 .

Lam JSL, Li KX. Green port marketing for sustainable growth and development. Transp Policy. 2019;84:73–81. https://doi.org/10.1016/j.tranpol.2019.04.011 .

Olakitan Atanda J. Developing a social sustainability assessment framework. Sustain Cities Soc. 2019;44:237–52. https://doi.org/10.1016/j.scs.2018.09.023 .

Bansal P. Evolving sustainably: a longitudinal study of corporate sustainable development. Strateg Manag J. 2005;26(3):197–218. https://doi.org/10.1002/smj.441 .

Carter CR, Liane Easton P. Sustainable supply chain management: evolution and future directions. Int J Phys Distrib Logist Manag. 2011;41(1):46–62. https://doi.org/10.1108/09600031111101420 .

Janic M. Sustainable transport in the European Union: a review of the past research and future ideas. Transp Rev. 2006;26(1):81–104. https://doi.org/10.1080/01441640500178908 .

Steurer R, Langer ME, Konrad A, Martinuzzi A. Corporations, stakeholders and sustainable development I: a theoretical exploration of business-society relations. J Bus Ethics. 2005;61(3):263–81. https://doi.org/10.1007/s10551-005-7054-0 .

Stanković JJ, Marjanović IM, Papathanasiou J, Drezgić SD. Social, economic and environmental sustainability of port regions: MCDM approach in composite index creation. J Mar Sci Eng. 2021. https://doi.org/10.3390/jmse9010074 .

Mori K, Christodoulou A. Review of sustainability indices and indicators: towards a new city sustainability index (CSI). Environ Impact Assess Rev. 2012;32(1):94–106. https://doi.org/10.1016/J.EIAR.2011.06.001 .

Mayer AL. Strengths and weaknesses of common sustainability indices for multidimensional systems. Environ Int. 2007. https://doi.org/10.1016/j.envint.2007.09.004 .

Hair J, Black W, Babin B, Anderson R. Multivariate data analysis: a global perspective. In: Multivariate data analysis: a global perspective, vol. 7. Upper Saddle River: Pearson Education; 2010.

Hair JF, Black WC, Babin BJ, Anderson RE. Multivariate data analysis. Hampshire: Cengage Learning; 2019.

Boateng GO, Neilands TB, Frongillo EA, Melgar-Quiñonez HR, Young SL. Best practices for developing and validating scales for health, social, and behavioral research: a primer. Front Public Health. 2018;6:149. https://doi.org/10.3389/fpubh.2018.00149 .

Elangovan N, Sundaravel E. Method of preparing a document for survey instrument validation by experts. MethodsX. 2021;8: 101326. https://doi.org/10.1016/J.MEX.2021.101326 .

Papadas KK, Avlonitis GJ, Carrigan M. Green marketing orientation: conceptualization, scale development and validation. J Bus Res. 2017;80:236–46. https://doi.org/10.1016/J.JBUSRES.2017.05.024 .

Polit DF, Beck CT. The content validity index: are you sure you know what’s being reported? Critique and recommendations. Res Nurs Health. 2008;31(4):489–97.

Polit DF, Beck CT, Owen SV. Focus on research methods: Is the CVI an acceptable indicator of content validity? Appraisal and recommendations. Res Nurs Health. 2007;30(4):459–67. https://doi.org/10.1002/nur.20199 .

Zamanzadeh V, Ghahramanian A, Rassouli M, Abbaszadeh A, Alavi-Majd H, Nikanfar A-R. Design and implementation content validity study: development of an instrument for measuring patient-centered communication. J Caring Sci. 2015;4(2):165. https://doi.org/10.15171/jcs.2015.017 .

de Souza AC, Alexandre NMC, Guirardello EDB. Propriedades psicométricas na avaliação de instrumentos: avaliação da confiabilidade e da validade [Psychometric properties in the evaluation of instruments: assessment of reliability and validity]. Epidemiologia e Serviços de Saúde. 2017;26(3):649–59. https://doi.org/10.5123/S1679-49742017000300022 .

Rodrigues IB, Adachi JD, Beattie KA, MacDermid JC. Development and validation of a new tool to measure the facilitators, barriers and preferences to exercise in people with osteoporosis. BMC Musculoskelet Disord. 2017;18(1):540. https://doi.org/10.1186/s12891-017-1914-5 .

Bobos P, Pouliopoulou DVS, Harriss A, Sadi J, Rushton A, MacDermid JC. A systematic review and meta-analysis of measurement properties of objective structured clinical examinations used in physical therapy licensure and a structured review of licensure practices in countries with well-developed regulation systems. PLoS ONE. 2021;16(8): e0255696. https://doi.org/10.1371/journal.pone.0255696 .

Hair JF, Hult GTM, Ringle CM, Sarstedt M, Thiele KO. Mirror, mirror on the wall: a comparative evaluation of composite-based structural equation modeling methods. J Acad Mark Sci. 2017;45(5):616–32. https://doi.org/10.1007/s11747-017-0517-x .

Cohen J. A power primer. In: Methodological issues and strategies in clinical research. 4th ed. Washington: American Psychological Association; 2016. p. 279–84. https://doi.org/10.1037/14805-018 .

Roldán JL, Sánchez-Franco MJ. Variance-based structural equation modeling. In: Research methodologies, innovations and philosophies in software systems engineering and information systems. Pennsylvania: IGI Global; 2012. p. 193–221. https://doi.org/10.4018/978-1-4666-0179-6.ch010 .

Faul F, Erdfelder E, Buchner A, Lang A-G. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses. Behav Res Methods. 2009;41(4):1149–60. https://doi.org/10.3758/BRM.41.4.1149 .

Goodboy AK, Kline RB. Statistical and practical concerns with published communication research featuring structural equation modeling. Commun Res Rep. 2017;34(1):68–77. https://doi.org/10.1080/08824096.2016.1214121 .

Crawford JA, Kelder J-A. Do we measure leadership effectively? Articulating and evaluating scale development psychometrics for best practice. Leadersh Q. 2019;30(1):133–44. https://doi.org/10.1016/j.leaqua.2018.07.001 .

Sarstedt M, Hair JF, Cheah JH, Becker JM, Ringle CM. How to specify, estimate, and validate higher-order constructs in PLS-SEM. Australas Mark J. 2019;27(3):197–211. https://doi.org/10.1016/J.AUSMJ.2019.05.003 .

Ringle CM, Wende S, Becker J-M. SmartPLS 4. http://www.smartpls.com .

Malhotra S. Study of features of mobile trading apps: a silver lining of pandemic. J Global Inf Bus Strateg. 2020. https://doi.org/10.5958/2582-6115.2020.00009.0 .

Shrotryia VK, Dhanda U. Content validity of assessment instrument for employee engagement. SAGE Open. 2019;9(1):2158244018821751. https://doi.org/10.1177/2158244018821751 .

Bagozzi RP, Yi Y. On the evaluation of structural equation models. J Acad Mark Sci. 1988;16(1):74–94. https://doi.org/10.1007/BF02723327 .

Chin WW, Gopal A, Salisbury WD. Advancing the theory of adaptive structuration: the development of a scale to measure faithfulness of appropriation. Inf Syst Res. 1997;8(4):342–67. https://doi.org/10.1287/isre.8.4.342 .

Hair JF Jr, Hult GTM, Ringle CM, Sarstedt M. A primer on partial least squares structural equation modeling (PLS-SEM). J Tour Res. 2021;6(2).

Fornell C, Larcker DF. Evaluating structural equation models with unobservable variables and measurement error. J Mark Res. 1981;18(1):39–50. https://doi.org/10.1177/002224378101800104 .

Henseler J, Ringle CM, Sarstedt M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J Acad Mark Sci. 2015;43(1):115–35. https://doi.org/10.1007/s11747-014-0403-8 .

Franke G, Sarstedt M. Heuristics versus statistics in discriminant validity testing: a comparison of four procedures. Internet Res. 2019;29(3):430–47. https://doi.org/10.1108/IntR-12-2017-0515 .

Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Model. 1999;6(1):1–55. https://doi.org/10.1080/10705519909540118 .

Becker JM, Ringle CM, Sarstedt M, Völckner F. How collinearity affects mixture regression results. Mark Lett. 2015;26(4):643–59. https://doi.org/10.1007/s11002-014-9299-9 .

Chang DS, Kuo LCR. The effects of sustainable development on firms’ financial performance—an empirical approach. Sustain Dev. 2008;16(6):365–80. https://doi.org/10.1002/sd.351 .

Ogunbiyi O, Oladapo A, Goulding J. An empirical study of the impact of lean construction techniques on sustainable construction in the UK. Constr Innov. 2014;14(1):88–107. https://doi.org/10.1108/CI-08-2012-0045 .

Yadav G, Kumar A, Luthra S, Garza-Reyes JA, Kumar V, Batista L. A framework to achieve sustainability in manufacturing organisations of developing economies using industry 4.0 technologies’ enablers. Comput Ind. 2020;122: 103280. https://doi.org/10.1016/j.compind.2020.103280 .

Poulsen RT, Ponte S, Sornn-Friese H. Environmental upgrading in global value chains: the potential and limitations of ports in the greening of maritime transport. Geoforum. 2018;89:83–95. https://doi.org/10.1016/J.GEOFORUM.2018.01.011 .

GRI Standards. https://www.globalreporting.org/standards/ . Accessed 09 May 2024.

Lu C-S, Shang K-C, Lin C-C. Identifying crucial sustainability assessment criteria for container seaports. Marit Bus Rev. 2016;1(2):90–106. https://doi.org/10.1108/MABR-05-2016-0009 .

Acknowledgements

We acknowledge the contribution of the expert panel, whose reviews and feedback enabled us to optimize the items in the instrument.

Open access funding provided by Manipal Academy of Higher Education, Manipal. This study did not receive any funding from any institution or agency.

Author information

Authors and affiliations

Department of Commerce, Manipal Academy of Higher Education, Manipal, 576104, India

Department of Humanities and Management, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal, 576104, India

Yogesh P. Pai

T A Pai Management Institute, Yelahanka, Govindapura, Bengaluru, 560064, Karnataka, India

Parthesh Shanbhag

Contributions

All the authors contributed to the manuscript equally. K.L. conceptualized the study and executed the data collection. All the authors jointly performed the data analysis and authored the manuscript. All authors reviewed the manuscript before submission.

Corresponding author

Correspondence to Yogesh P. Pai .

Ethics declarations

Competing interests

The authors declare no competing interests.

Additional information

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Appendix 1. Construct-wise list of items and source

Environmental sustainability practices (source: [ , , , , , ])

 Avoiding the use of unpolluted land in the port area
 Developing and maintaining mangroves, gardens, and landscapes
 Avoiding environmental destruction during dredging
 Considering environmental protection when handling cargo
 Using recyclable or environment-friendly materials in port construction
 Protecting the ecological environment in the port area
 Reduction of noise pollution
 Mitigating light influence on neighboring residents
 Controlling smoke level
 Maintaining air quality
 Reduction of greenhouse gas
 Reduction of carbon emissions
 Preventing odour pollution
 Optimal utilization of renewables and alternate energy sources
 Facilities for wastewater and sewage treatment
 Implementation of dust suppression systems

Economic sustainability practices (source: [ , , , , , ])

 Facilitating economic growth and acting as a supply chain link in local and global trade
 Investments in port infrastructure development
 Establishing port development funding
 Attracting foreign direct investments
 Promotion and development of cruise tourism services
 Employment generation and career growth opportunities
 Ensuring that cargo is handled safely and effectively
 Low damage or loss record for cargo delivery
 Usage of energy-efficient electrical and electronic appliances like LED lamps
 Optimal utilization of infrastructure, land, and space in the port area
 Offering one-stop logistics solutions, including freight forwarding and additional services
 Optimizing the routing of vehicles in and out of port
 Mitigating congestion in the port
 Providing incentives for green shipping practices
 Landlord activities
 Investment in climate change adaptation strategies
 Sustainable supply chain policy
 Investment in innovation strategy
 Transshipment and storage of dangerous goods

Social sustainability practices (source: [ , , , , , ])

 Recognizing the requirements of the neighboring community
 Giving support to community social activities
 Providing training and education for employees regularly
 Providing employees’ welfare benefits and other facilities
 Staff job security even during uncertainties of the business
 Strengthening safety and security management standards and protocols of the port
 Accident prevention in the port area
 Social equality and gender diversity in employment
 Job satisfaction of employees
 Consulting various interest groups such as labor unions and community leaders when making port project decisions
 Strengthening port infrastructure for social contribution
 Engaging in corporate social responsibility practices

Appendix 2. Results of content validity

Item code | Item description | Agree count for CVI | i-CVI | Pc | K | Agree count for CVR | CVR
EnvSP1 | Avoiding the use of unpolluted land in the port area | 4 | 0.67 | 0.23 | 0.56 | 6 | 1.00
EnvSP2 | Developing and maintaining mangroves, gardens, and landscapes | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP3 | Avoiding environmental destruction during dredging | 4 | 0.67 | 0.23 | 0.56 | 5 | 0.67
EnvSP4 | Considering environmental protection when handling cargo | 4 | 0.67 | 0.23 | 0.56 | 6 | 1.00
EnvSP5 | Using recyclable or environment-friendly materials in port construction | 5 | 0.83 | 0.09 | 0.82 | 5 | 0.67
EnvSP6 | Protecting the ecological environment in the port area | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP7 | Reduction of noise pollution | 5 | 0.83 | 0.09 | 0.82 | 5 | 0.67
EnvSP8 | Mitigating light influence on neighboring residents | 4 | 0.67 | 0.23 | 0.56 | 4 | 0.33
EnvSP9 | Controlling smoke level | 3 | 0.5 | 0.31 | 0.27 | 6 | 1.00
EnvSP10 | Maintaining air quality | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP11 | Reduction of greenhouse gas | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP12 | Reduction of carbon emissions | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP13 | Preventing odour pollution | 3 | 0.5 | 0.31 | 0.27 | 4 | 0.33
EnvSP14 | Optimal utilization of renewables and alternate energy sources | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP15 | Facilities for wastewater and sewage treatment | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP16 | Implementation of dust suppression systems | 6 | 1 | 0.02 | 1 | 6 | 1.00
EnvSP17 | Cold-ironing source of power for vessels on the berth | 4 | 0.67 | 0.23 | 0.56 | 4 | 0.33
EcoSP1 | Facilitating economic growth and acting as a supply chain link in local and global trade | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP2 | Investments in port infrastructure development | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP3 | Establishing port development funding | 2 | 0.33 | 0.23 | 0.13 | 4 | 0.33
EcoSP4 | Attracting foreign direct investments | 2 | 0.33 | 0.23 | 0.13 | 4 | 0.33
EcoSP5 | Promotion and development of cruise tourism services | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP6 | Employment generation and career growth opportunities | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP7 | Ensuring that cargo is handled safely and effectively | 3 | 0.5 | 0.31 | 0.27 | 6 | 1.00
EcoSP8 | Low damage or loss record for cargo delivery | 4 | 0.67 | 0.23 | 0.56 | 6 | 1.00
EcoSP9 | Usage of energy-efficient electrical and electronic appliances | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP10 | Optimal utilization of infrastructure, land, and space in the port area | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP11 | Offering one-stop logistics solutions, including freight forwarding and additional services | 6 | 1 | 0.02 | 1 | 5 | 0.67
EcoSP12 | Optimizing the routing of vehicles in and out of port | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP13 | Mitigating congestion in the port | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP14 | Providing incentives for green shipping practices | 6 | 1 | 0.02 | 1 | 5 | 0.67
EcoSP15 | Landlord activities | 6 | 1 | 0.02 | 1 | 5 | 0.67
EcoSP16 | Investment in climate change adaptation strategies | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP17 | Sustainable supply chain policy | 6 | 1 | 0.02 | 1 | 6 | 1.00
EcoSP18 | Investment in innovation strategy | 6 | 1 | 0.02 | 1 | 5 | 0.67
EcoSP19 | Transshipment and storage of dangerous goods | 6 | 1 | 0.02 | 1 | 4 | 0.33
SocSP1 | Recognizing the requirements of the neighboring community | 6 | 1 | 0.02 | 1 | 4 | 0.33
SocSP2 | Giving support to community social activities | 6 | 1 | 0.02 | 1 | 4 | 0.33
SocSP3 | Providing training and education for employees regularly | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP4 | Providing employees’ welfare benefits and other facilities | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP5 | Staff job security even during uncertainties of the business | 3 | 0.5 | 0.31 | 0.27 | 5 | 0.67
SocSP6 | Strengthening safety and security management standards and protocols of the port | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP7 | Accident prevention in the port area | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP8 | Social equality and gender diversity in employment | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP9 | Job satisfaction of employees | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP10 | Consulting various interest groups such as labor unions and community leaders when making port project decisions | 6 | 1 | 0.02 | 1 | 6 | 1.00
SocSP11 | Strengthening port infrastructure for social contribution | 6 | 1 | 0.02 | 1 | 5 | 0.67
SocSP12 | Engaging in corporate social responsibility practices | 6 | 1 | 0.02 | 1 | 6 | 1.00

Note: i-CVI indicates the item-level CVI, Pc the probability of a chance occurrence, and K the kappa statistic.
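For reference, the indices in this table follow directly from the expert agreement counts: i-CVI is the proportion of the six experts rating an item relevant, Pc is the binomial probability of that level of agreement arising by chance, K is the chance-adjusted kappa, and CVR is Lawshe's ratio based on the number of experts rating the item essential. The short Python sketch below reproduces them; the function names are ours and are not taken from the authors' materials.

```python
from math import comb

N_EXPERTS = 6  # size of the expert panel in this study

def i_cvi(n_relevant, n_experts=N_EXPERTS):
    # Item-level content validity index: share of experts rating the item relevant.
    return n_relevant / n_experts

def pc(n_relevant, n_experts=N_EXPERTS):
    # Probability of chance agreement (binomial with p = 0.5).
    return comb(n_experts, n_relevant) * 0.5 ** n_experts

def kappa(n_relevant, n_experts=N_EXPERTS):
    # Chance-adjusted agreement: K = (i-CVI - Pc) / (1 - Pc).
    return (i_cvi(n_relevant, n_experts) - pc(n_relevant, n_experts)) / (1 - pc(n_relevant, n_experts))

def cvr(n_essential, n_experts=N_EXPERTS):
    # Lawshe's content validity ratio: (ne - N/2) / (N/2).
    return (n_essential - n_experts / 2) / (n_experts / 2)

# Example: EnvSP1, where 4 of 6 experts rated the item relevant and 6 of 6 rated it essential.
print(round(i_cvi(4), 2), round(pc(4), 2), round(kappa(4), 2), round(cvr(6), 2))
# prints 0.67 0.23 0.56 1.0, matching the EnvSP1 row above
```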

Appendix 3. Instrument for data collection

4.1 Section A—Demographic profile

Figure A: demographic profile items.

4.2 Section B—Practices related to the port

Please indicate the extent to which you agree with the following statements about your port on a scale of 1–5.

1—strongly disagree, 2—disagree, 3—neutral, 4—agree, 5—strongly agree.

If you are unaware of the port's practices, you may choose "3—neutral."

Environmental sustainability practices adopted in your port focus on (rate each statement 1–5):

 Developing and maintaining mangroves, gardens, and landscapes
 Protecting the ecological environment in the port area
 Maintaining air quality
 Reduction of greenhouse gas
 Reduction of carbon emissions
 Optimal utilization of renewables and alternate energy sources
 Facilities for wastewater and sewage treatment
 Implementation of dust suppression systems

Economic sustainability practices adopted in your port focus on (rate each statement 1–5):

 Facilitating economic growth and acting as a supply chain link in local and global trade
 Investments in port infrastructure development
 Promotion and development of cruise tourism services
 Employment generation and career growth opportunities
 Usage of energy-efficient electrical and electronic appliances like LED lamps
 Optimal utilization of infrastructure, land, and space in the port area
 Optimizing the routing of vehicles in and out of port
 Mitigating congestion in the port
 Investment in climate change adaptation strategies
 Sustainable supply chain policy

Social sustainability practices adopted in your port focus on (rate each statement 1–5):

 Providing training and education for employees regularly
 Providing employees’ welfare benefits and other facilities
 Strengthening port safety management standards and protocols
 Accident prevention in the port area
 Social equality and gender diversity in employment
 Job satisfaction of employees
 Consulting various interest groups such as labor unions and community leaders when making port project decisions
 Engaging in corporate social responsibility practices

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Kishore, L., Pai, Y.P. & Shanbhag, P. Reliability and validity assessment of instrument to measure sustainability practices at shipping ports in India. Discov Sustain 5, 236 (2024). https://doi.org/10.1007/s43621-024-00395-z

Received: 27 January 2024

Accepted: 02 August 2024

Published: 03 September 2024

DOI: https://doi.org/10.1007/s43621-024-00395-z


Keywords

  • Shipping ports
  • Sustainability practices
  • Measurement instrument
  • Content Validity Index
  • Cohen’s Kappa Index
  • Content validity ratio
  • Confirmatory factor analysis
  • Structural equation model
IMAGES

  1. Appendix C: Example Interview Instrument(REVV)

    research instrument interview meaning

  2. Research instrument

    research instrument interview meaning

  3. What is a Research Interview? (Types + Steps of Conducting)

    research instrument interview meaning

  4. (PDF) INTERVIEW: A RESEARCH INSTRUMENT FOR SOCIAL SCIENCE RESEARCHERS

    research instrument interview meaning

  5. What is a Research Interview? (Types + Steps of Conducting)

    research instrument interview meaning

  6. Lesson 4

    research instrument interview meaning

VIDEO

  1. Research Episode 13. RESEARCH INSTRUMENT o QUESTIONNAIRE: Buuin natin!

  2. Research Instrument 1

  3. Research Instrument

  4. Instrumentation and Control Engineering Question and Answer for Job Interview

  5. What is research instrument examples?#youtubeshorts

  6. Interview: Meaning and Types of Interview

COMMENTS

  1. Types of Interviews in Research

    There are several types of interviews, often differentiated by their level of structure. Structured interviews have predetermined questions asked in a predetermined order. Unstructured interviews are more free-flowing. Semi-structured interviews fall in between. Interviews are commonly used in market research, social science, and ethnographic ...

  2. Chapter 11. Interviewing

    Introduction. Interviewing people is at the heart of qualitative research. It is not merely a way to collect data but an intrinsically rewarding activity—an interaction between two people that holds the potential for greater understanding and interpersonal development. Unlike many of our daily interactions with others that are fairly shallow ...

  3. Research Instruments: a Questionnaire and An Interview Guide Used to

    The aim is to present a systematic and detailed explanation of the construction and administration of two research instruments (a questionnaire and an interview guide) used for data collection in ...

  4. (PDF) How to Conduct an Effective Interview; A Guide to Interview

    Vancouver, Canada. Abstract. Interviews are one of the most promising ways of collecting qualitative data throug h establishment of a. communication between r esearcher and the interviewee. Re ...

  5. What is a Research Interview? (Types + Steps of Conducting)

    Here are some common types of research interviews: 1. Structured Interviews. Structured interviews are standardized and follow a fixed format. Therefore, these interviews have a pre-determined set of questions. All the participants are asked the same set of questions in the same order.

  6. Chapter 13: Interviews

    What are interviews? An interviewing method is the most commonly used data collection technique in qualitative research. 1 The purpose of an interview is to explore the experiences, understandings, opinions and motivations of research participants. 2 Interviews are conducted one-on-one with the researcher and the participant. Interviews are most appropriate when seeking to understand a ...

  7. Research Methods Guide: Interview Research

    Develop an interview guide. Introduce yourself and explain the aim of the interview. Devise your questions so interviewees can help answer your research question. Have a sequence to your questions / topics by grouping them in themes. Make sure you can easily move back and forth between questions / topics. Make sure your questions are clear and ...

  8. PDF Interview as a Method for Qualitative Research

    Definitions. The qualitative research interview seeks to describe and the meanings of central themes in the life world of the subjects. The main task in interviewing is to understand the meaning of what the interviewees say. (Kvale,1996) participant's experiences. The interviewer can information around the topic.

  9. Structured Interview

    Structured Interview | Definition, Guide & Examples. Published on January 27, 2022 by Tegan George and Julia Merkus. Revised on June 22, 2023. A structured interview is a data collection method that relies on asking questions in a set order to collect data on a topic. It is one of four types of interviews.. In research, structured interviews are often quantitative in nature.

  10. PDF Structured Methods: Interviews, Questionnaires and Observation

    instruments is an important skill for research-ers. Such survey instruments can be used in many types of research, from case study, to cross-sectional survey, to experiment. A study of this sort can involve anything from a short paper-and-pencil feedback form, to an intensive one-to-one interview asking a large number of

  11. Qualitative research method-interviewing and observation

    Interviewing. This is the most common format of data collection in qualitative research. According to Oakley, qualitative interview is a type of framework in which the practices and standards be not only recorded, but also achieved, challenged and as well as reinforced.[] As no research interview lacks structure[] most of the qualitative research interviews are either semi-structured, lightly ...

  12. Semi-Structured Interview

    A semi-structured interview is a data collection method that relies on asking questions within a predetermined thematic framework. However, the questions are not set in order or in phrasing. In research, semi-structured interviews are often qualitative in nature. They are generally used as an exploratory tool in marketing, social science ...

  13. 9 Best Examples of Research Instruments in Qualitative Research Explained

    Visual Methods. Visual methods, such as photography, video recording, or drawings, can be used as qualitative research instruments. These methods allow participants to express their experiences and perspectives visually, providing rich and nuanced data. Visual methods can be particularly useful in studying topics related to art, culture, or ...

  14. What is a Research Instrument?

    The term research instrument refers to any tool that you may use to collect or obtain data, measure data and analyse data that is relevant to the subject of your research. Research instruments are often used in the fields of social sciences and health sciences. These tools can also be found within education that relates to patients, staff ...

  15. Semi-structured Interview: A Methodological Reflection on the

    This article aims to describe how the semi-structured interview as a research instrument is used in qualitative research. The main focus of this article is to disclose some methodological

  16. PDF Research Instrument Examples

    research instrument can include interviews, tests, surveys, or checklists. The Research Instrument is usually determined by researcher and is tied to the study methodology. This document offers some examples of research instruments and study methods. Choosing a Research Instrument 1. Select a topic

  17. Research Instruments

    Research Methodologies: Research Instruments

  18. RESEARCH TOOLS: INTERVIEWS & QUESTIONNAIRES

    A ' questionnaire ' is the instrument for collecting the primary data (Cohen, 2013). ' Primary data' by extension is data that would not otherwise exist if it were not for the research process and is collected through both questionnaires or interviews, which we discuss here today (O'Leary, 2014). An ' interview ' is typically a ...

  19. Researching the researcher-as-instrument: an exercise in interviewer

    The level of researcher involvement in qualitative interviewing - indeed, the embodiment of the unique researcher as the instrument for qualitative data collection - has been widely acknowledged (e.g. Cassell, 2005; Rubin and Rubin, 2005; Turato, 2005).Because the researcher is the instrument in semistructured or unstructured qualitative interviews, unique researcher attributes have the ...

  20. PDF Selecting and Describing Your Research Instruments

    Advisor Consultation Checklist Use the checklist below to ensure that you consulted with your advisor during the key steps in the process of selecting and describing your research instruments. 1. _____ Read this checklist. 2. _____ Made an appointment for our first meeting to discuss the instrument selection. 3.

  21. Q: What is a research instrument?

    A research instrument is a tool used to obtain, measure, and analyze data from subjects around the research topic. You need to decide the instrument to use based on the type of study you are conducting: quantitative, qualitative, or mixed-method. For instance, for a quantitative study, you may decide to use a questionnaire, and for a ...

  22. Survey Research

    Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps: Determine who will participate in the survey. Decide the type of survey (mail, online, or in-person) Design the survey questions and layout.

  23. Research Instrument: Meaning & Examples

    Research Instrument: Interviews. Interview as a research instrument, Unsplash. The interview is a qualitative research method that collects data by asking questions. It includes three main types: structured, unstructured, and semi-structured interviews. Structured interviews include an ordered list of questions. These questions are often closed ...

  24. Reliability and validity assessment of instrument to measure

    Sustainability has emerged as one of the most critical factors influencing the competitiveness of maritime shipping ports. This emergence has led to a surge in research publications on port sustainability-related topics. However, despite the increasing awareness and adoption of sustainability practices, documented literature on empirical studies with survey and interview data is very limited ...