The life history interviews ran for 40–60 minutes. The timing for sessions 2 and 3 is not provided.
Interviews are the most common data collection technique in qualitative research. There are four main types of interviews; the one you choose will depend on your research question, aims and objectives. It is important to formulate open-ended interview questions that are understandable and easy for participants to answer. Key considerations in setting up the interview will enhance the quality of the data obtained and the experience of the interview for the participant and the researcher.
Qualitative Research – a practical guide for health and social care researchers and practitioners Copyright © 2023 by Danielle Berkovic is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.
Interview as a Method for Qualitative Research
There are several types of interviews, including:
What are the important steps involved in interviews?
Do I have to choose either a survey or interviewing method?
No. In fact, many researchers use a mixed-methods approach: interviews can be useful as a follow-up with certain survey respondents, e.g., to further investigate their responses.
Is training an interviewer important?
Yes. Since the interviewer controls the quality of the result, training the interviewer is crucial. If more than one interviewer is involved in your study, it is important that every interviewer understands the interviewing procedure and rehearses the interviewing process before the formal study begins.
Introduction.
Qualitative research is a valuable approach that allows researchers to explore complex phenomena and gain in-depth insights into the experiences and perspectives of individuals. In order to conduct qualitative research effectively, researchers often utilize various research methodologies and instruments. These methodologies and instruments serve as tools to collect and analyze data, enabling researchers to uncover rich and nuanced information.
In this article, we will delve into the world of qualitative research instruments, specifically focusing on research instrument examples. We will explore the different types of qualitative research instruments, provide specific examples, and discuss the advantages and limitations of using these instruments in qualitative research. By the end of this article, you will have a comprehensive understanding of the role and significance of research instruments in qualitative research.
Qualitative research instruments are tools that researchers use to collect and analyze data in qualitative research studies. These instruments help researchers gather rich and detailed information about a particular phenomenon or topic.
One of the main goals of qualitative research is to understand the subjective experiences and perspectives of individuals. To achieve this, researchers need to use instruments that allow for in-depth exploration and interpretation of data. Qualitative research instruments can take various forms, including interviews, questionnaires, observations, and focus groups. Each instrument has its own strengths and limitations, and researchers need to carefully select the most appropriate instrument for their study objectives.
Exploring qualitative research instruments involves understanding the characteristics and features of each instrument, as well as considering the research context and the specific research questions being addressed. Researchers also need to consider the ethical implications of using qualitative research instruments, such as ensuring informed consent and maintaining confidentiality and anonymity of participants.
Qualitative research instruments are tools used to collect data and gather information in qualitative research studies. These instruments help researchers explore and understand complex social phenomena in depth. There are several types of qualitative research instruments that can be used depending on the research objectives and the nature of the study.
Interviews are one of the most commonly used qualitative research instruments. They involve direct communication between the researcher and the participant, allowing for in-depth exploration of the participant’s experiences, perspectives, and opinions. Interviews can be structured, semi-structured, or unstructured, depending on the level of flexibility in the questioning process. Researchers typically ask open-ended questions to gather in-depth information and insights, and interviews can be conducted face-to-face, over the phone, or through video conferencing.
Focus groups are another example of a qualitative research instrument: a group discussion led by a researcher or moderator. Participants in a focus group share their thoughts, ideas, and experiences on a specific topic. This instrument allows for the exploration of group dynamics and the interaction between participants, and it also allows researchers to gather multiple perspectives and generate rich qualitative data.
Observations are a powerful qualitative research instrument that involves systematic and careful observation of participants in their natural settings. This type of qualitative research instrument allows researchers to gather data on behavior, interactions, and social processes. Observations can be participant observations, where the researcher actively participates in the setting, or non-participant observations, where the researcher remains an observer.
Document analysis is a qualitative research instrument that involves the examination, analysis and interpretation of written or recorded materials such as documents, texts, and audio or video recordings. Researchers analyze documents to gain insights into social, cultural, or historical contexts, as well as to understand the perspectives and meanings embedded in them.
Visual methods, such as photography, video recording, or drawings, can be used as qualitative research instruments. These methods allow participants to express their experiences and perspectives visually, providing rich and nuanced data. Visual methods can be particularly useful in studying topics related to art, culture, or visual communication.
Diaries or journals can be used as qualitative research instruments to collect data on participants’ thoughts, feelings, and experiences over a period of time. Participants record their daily activities, reflections, and emotions, providing valuable insights into their lived experiences.
While surveys are commonly associated with quantitative research, they can also be used as qualitative research instruments. Qualitative surveys typically include open-ended questions that allow participants to provide detailed responses. Surveys can be administered online, through interviews, or in written form.
Case studies are in-depth investigations of a particular individual, group, or phenomenon. They involve collecting and analyzing qualitative data from various sources such as interviews, observations, and document analysis. Case studies provide rich and detailed insights into specific contexts or situations.
Ethnography is a qualitative research instrument that involves immersing researchers in a particular social or cultural group to observe and understand their behaviors, beliefs, and practices. Ethnographic research often includes participant observation, interviews, and document analysis.
These are just a few examples of qualitative research instruments. Researchers can choose the most appropriate data collection method or combination of methods based on their research objectives, the nature of the research question, and the available resources.
Gathering in-depth and detailed information.
Qualitative research instruments offer several advantages that make them valuable tools in the research process. Firstly, qualitative research instruments allow researchers to gather in-depth and detailed information. Unlike quantitative research instruments that focus on numerical data, qualitative instruments provide rich and descriptive data about participants’ feelings, opinions, and experiences. This depth of information allows researchers to gain a comprehensive understanding of the research topic.
Another advantage of qualitative research instruments is their flexibility. Researchers can adapt their methods and questions during data collection to respond to emerging insights. This flexibility allows for a more dynamic and responsive research process, enabling researchers to explore new avenues and uncover unexpected findings.
Qualitative research instruments also offer the advantage of capturing data in natural settings. Unlike controlled laboratory settings often used in quantitative research, qualitative research takes place in real-world contexts. This natural setting allows researchers to observe participants’ behaviors and interactions in their natural environment, providing a more authentic and realistic representation of their experiences.
Furthermore, qualitative research instruments promote participant engagement and collaboration. By using methods such as interviews and focus groups, researchers can actively involve participants in the research process. This engagement fosters a sense of ownership and empowerment among participants, leading to more meaningful and insightful data.
Lastly, qualitative research instruments allow for the exploration of complex issues. Qualitative research is particularly useful when studying complex phenomena that cannot be easily quantified or measured. It allows researchers to delve into the underlying meanings, motivations, and social dynamics that shape individuals’ behaviors and experiences.
Qualitative research instruments have several limitations that researchers need to consider when conducting their studies. In this section, we will delve into the limitations of qualitative research instruments as compared to quantitative research.
One of the main drawbacks of qualitative research is that the process is time-consuming. Unlike quantitative research, which can collect data from a large sample size in a relatively short period of time, qualitative research requires in-depth interviews, observations, and analysis, which can take a significant amount of time.
Another limitation of qualitative research instruments is that the interpretations are subjective. Since qualitative research focuses on understanding the meaning and context of phenomena, the interpretations of the data can vary depending on the researcher’s perspective and biases. This subjectivity can introduce potential bias and affect the reliability and validity of the findings.
Additionally, qualitative research instruments often involve complex data analysis. Unlike quantitative research, which can use statistical methods to analyze data, qualitative research requires researchers to analyze textual or visual data, which can be time-consuming and challenging. The analysis process involves coding, categorizing, and interpreting the data, which requires expertise and careful attention to detail.
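The coding and categorizing step described above can be sketched in a few lines of Python: each transcript excerpt is tagged with one or more researcher-assigned codes, which are then tallied and grouped so they can be re-read in context. The excerpts and code names below are invented for illustration, and dedicated qualitative analysis software does far more, but the sketch shows the basic shape of the work.

```python
from collections import Counter

# Hypothetical excerpts from an interview transcript, each paired with
# the codes a researcher might assign (all names here are illustrative).
excerpts = [
    ("I felt supported by my care team.", ["support"]),
    ("Travel to appointments was exhausting.", ["burden", "access"]),
    ("The nurses always explained things clearly.", ["support", "communication"]),
    ("I couldn't afford the parking fees.", ["burden"]),
]

# Tally how often each code was applied across the transcript.
code_counts = Counter(code for _, codes in excerpts for code in codes)

# Group excerpts under each code so they can be reviewed together
# when building categories and themes.
by_code = {}
for text, codes in excerpts:
    for code in codes:
        by_code.setdefault(code, []).append(text)

print(code_counts.most_common())
print(by_code["support"])
```

Even this toy version makes the interpretive burden visible: the counts are only as meaningful as the judgment calls behind each code assignment.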
Furthermore, qualitative research instruments may face challenges in maintaining anonymity. In some cases, researchers may need to collect sensitive or personal information from participants, which can raise ethical concerns. Ensuring the privacy and confidentiality of participants’ data can be challenging, and researchers need to take appropriate measures to protect the participants’ identities and maintain their trust.
Another limitation of qualitative research instruments is the limited generalizability of the findings. Qualitative research often focuses on a specific context or a small sample size, which may limit the generalizability of the findings to a larger population. While qualitative research provides rich and detailed insights into a particular phenomenon, it may not be representative of the broader population or applicable to other settings.
Lastly, replicating findings in qualitative research can be difficult. Since qualitative research often involves in-depth exploration of a specific phenomenon, replicating the exact conditions and context of the original study can be challenging. This can make it difficult for other researchers to validate or replicate the findings, which is an essential aspect of scientific research.
Despite these limitations, qualitative research instruments offer valuable insights and understanding of complex phenomena. By acknowledging and addressing these limitations, researchers can enhance the rigor and validity of their qualitative research studies.
In conclusion, qualitative research instruments are powerful tools that enable researchers to explore and uncover the complexities of human experiences. By utilizing a range of instruments and considering their advantages and limitations, researchers can enhance the rigor and depth of their qualitative research studies.
The term research instrument refers to any tool that you may use to collect, measure and analyse data relevant to the subject of your research.
Research instruments are often used in the social sciences and health sciences. These tools are also found in education, in research involving patients, staff, teachers and students.
The format of a research instrument may consist of questionnaires, surveys, interviews, checklists or simple tests. The choice of which specific research instrument to use will be made by the researcher, and will be strongly related to the actual methods used in the specific study.
A good research instrument is one that has been validated and has proven reliability. It should be one that can collect data in a way that’s appropriate to the research question being asked.
The research instrument must be able to assist in answering the research aims, objectives and research questions, as well as prove or disprove the hypothesis of the study.
It should not introduce bias into the way data is collected, and it should be clear how the research instrument is to be used appropriately.
The general format of an interview is one in which the interviewer asks the interviewee a set of questions, normally asked and answered verbally. Several different types of interview research instruments exist.
An observation research instrument is one in which a researcher makes observations and records of the behaviour of individuals. There are several different types.
Structured observations occur when the study is performed at a predetermined location and time, in which the volunteers or study participants are observed using standardised methods.
Naturalistic observations are focused on volunteers or participants being in more natural environments in which their reactions and behaviour are also more natural or spontaneous.
A participant observation occurs when the person conducting the research actively becomes part of the group of volunteers or participants that he or she is researching.
The types of research instruments will depend on the format of the research study being performed: qualitative, quantitative or a mixed methodology. You may for example utilise questionnaires when a study is more qualitative or use a scoring scale in more quantitative studies.
A research instrument is a tool you will use to help you collect, measure and analyze the data you use as part of your research. The choice of research instrument will usually be yours to make as the researcher and will be whichever best suits your methodology.
There are many different research instruments you can use in collecting data for your research:
These are the most common ways of carrying out research, but it is really dependent on your needs as a researcher and what approach you think is best to take. It is also possible to combine a number of research instruments if this is necessary and appropriate in answering your research problem.
March 7, 2016
We will start with a few key operational definitions. ‘Surveying’ is the process by which the researcher collects data through a questionnaire (O’Leary, 2014). A ‘questionnaire’ is the instrument for collecting the primary data (Cohen, 2013). ‘Primary data’ by extension is data that would not otherwise exist if it were not for the research process, and is collected through questionnaires or interviews, which we discuss here today (O’Leary, 2014). An ‘interview’ is typically a face-to-face conversation between a researcher and a participant involving a transfer of information to the interviewer (Creswell, 2012). We will investigate each data collection instrument independently, starting with the interview.
Interviews are primarily done in qualitative research and occur when researchers ask one or more participants general, open-ended questions and record their answers. Often audiotapes are utilized to allow for more consistent transcription (Creswell, 2012). The researcher then transcribes and types the data into a computer file in order to analyze it after interviewing. Interviews are particularly useful for uncovering the story behind a participant’s experiences and pursuing in-depth information around a topic. Interviews may also be useful for following up with individual respondents after questionnaires, e.g., to further investigate their responses (McNamara, 1999). In qualitative research specifically, interviews are used to pursue the meanings of central themes in the world of the subjects; the main task in interviewing is to understand the meaning of what the interviewees say (McNamara, 2009). Usually open-ended questions are asked during interviews in hopes of obtaining impartial answers, while closed-ended questions may force participants to answer in a particular way (Creswell, 2012; McNamara, 1999). An open-ended question gives participants more options for responding; for example, “How do you balance participation in athletics with your schoolwork?” (Creswell, 2012). A closed-ended question provides a preset response; for example, “Do you exercise?”, where the answers are limited to yes or no (Creswell, 2012).
The interviewer must be:
Both Creswell and McNamara highlighted very similar points about conducting interviews. McNamara’s literature is less descriptive, but simpler and more concise. Another author who comes up consistently in the interviewing literature is Kvale, whose work is much more intensive and broad. These three authors are all very prominent in the interview research literature.
These are the steps that are consistent in the literature on conducting interviews in research (Creswell, 2012; McNamara, 1999):
Questionnaires have many uses, most notably to discover what the masses are thinking. These include: market research, political polling, customer service feedback, evaluations, opinion polls, and social science research (O’Leary, 2014).
Starting out.
Bell & Waters (2014) and O’Leary (2014) each offer clear checklists for creating a questionnaire from beginning to end. By comparing the two, we have created a comprehensive list. Bell starts by reminding the researcher to obtain approval prior to administering their questionnaire, then to reflect on what the research question is and whether this is the best method to obtain the intended information (Bell & Waters, 2014). O’Leary (2014) suggests that you operationalize concepts in the beginning and define the measurable variables. Prior to writing your own questions, O’Leary (2014) would have you explore existing possibilities in order to adapt previous instruments rather than ‘reinventing the wheel’. At this point, both authors have you write your questions.
Bell & Waters (2014) utilize Youngman’s (1982) question types:
Bell & Waters (2014) highlight a plethora of potential difficulties in wording your questions, including ambiguity and imprecision, assumptions, memory, knowledge, double questions, leading questions, presuming questions, hypothetical questions, offensive questions, and questions covering sensitive issues. It is imperative that you check your language for jargon and return to your hypothesis or objectives often to decide which questions are most pertinent (Bell & Waters, 2014).
Bell & Waters (2014) and O’Leary (2014) seem to disagree on the next step; while O’Leary would focus next on the response category, Bell would have you look further into the wording of the questions. Following O’Leary’s (2014) logic, we decide now whether to use open or closed questions, considering how the category will translate to different data types. Closed response answers include: yes/no, agree/disagree, fill in the blanks, choosing from a list, ordering options, and interval response scales. Any of the three standard scaling methods (Likert, Guttman, and Thurstone) may be used where appropriate (O’Leary, 2014).
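To show how a closed response category translates into analyzable data, here is a minimal sketch of coding Likert-style answers numerically. The item, scale labels, and responses are invented for illustration; real instruments fix the scale direction and handle reverse-scored items, which this sketch omits.

```python
# A hypothetical five-point Likert scale; the labels and the 1-5
# numeric coding are illustrative assumptions, not a standard.
LIKERT = {
    "strongly disagree": 1,
    "disagree": 2,
    "neutral": 3,
    "agree": 4,
    "strongly agree": 5,
}

# Responses to one closed item from several respondents.
responses = ["agree", "strongly agree", "neutral", "agree", "disagree"]

# Translate each verbal response into its numeric code.
scores = [LIKERT[r] for r in responses]
mean_score = sum(scores) / len(scores)

print(scores)       # numeric codes ready for analysis
print(mean_score)   # a simple summary of the item
```

The same mapping idea applies to the other closed formats listed above (yes/no, ordering options, and so on): each response category is assigned a value so the answers can be compared and summarized.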
Bell & Waters (2014) suggest you check your wording at this point. O’Leary (2014) goes into detail to point out problems with questions such as ambiguity, leading, confronting, offensiveness, unwarranted assumptions, double-barrelled questions, or pretentiousness. Questions to avoid according to O’Leary are those that are:
Both authors emphasize thoughtfulness about the order of questions, considering logic and ease for respondents. O’Leary (2014) goes into further detail regarding issues with organization and length; if a questionnaire is too lengthy, respondents are less likely to complete it. O’Leary also suggests researchers avoid asking threatening, awkward, insulting, or difficult questions, especially at the beginning of the questionnaire. Bell & Waters (2014) take a broader view of the aesthetics of the questionnaire: leaving spaces for legibility, limiting the overall number of pages, and considering the impression the document leaves, to highlight a few examples.
Clear and unambiguous instructions for respondents are emphasized by both authors (O’Leary, 2014; Bell & Waters, 2014). This step is followed by a ‘layout’, or rearranging of questions, in both descriptions, likely because this is the best time to review once the questions and other writing are complete. O’Leary (2014) warns researchers to use professional and aesthetically pleasing formatting, and to be organized, in order to attract respondents and to lower the probability of making your own mistakes (repeating questions, for example). O’Leary (2014) offers final instructions to include a cover letter that describes who you are, the aim of the project, assurances of confidentiality, etc. Bell & Waters (2014), however, offer further steps.
Bell & Waters (2014) go into further detail regarding response rates and ensuring you have a representative or generalizable sample, which we believe is beyond the scope of this article. More pertinent steps are to pilot-test your questionnaire with preliminary respondents (even family and friends) and follow through to preliminary data analysis in order to ensure your methods are effective, making adjustments accordingly (Bell & Waters, 2014). O’Leary (2014) lists six steps in a typical pilot test:
Bell & Waters (2014) briefly consider distribution methods; they emphasize the need to ensure confidentiality, to include a return date, to formulate a plan for ‘bounce backs’ via email, and to record data as soon as it arrives. O’Leary (2014) lists the typical methods: face-to-face, snail mail, e-mail, and online. Bell & Waters (2014) highlight the advantage of administering your questionnaire personally, as it enables the researcher to explain the purpose of the study and increases the probability of receiving completed questionnaires in return. The authors go on to emphasize the value of online methods; in particular, they mention Survey Monkey as the most popular and versatile survey tool available (Bell & Waters, 2014). O’Leary (2014) suggests sending out reminder letters or e-mails in order to increase the response rate and the speed of response.
Bell & Waters (2014) and O’Leary (2014) disagree once again with respect to the analysis. O’Leary (2014) suggests collecting the data as soon as possible, whereas Bell & Waters (2014) suggest the researcher merely glance through the responses prior to coding and recoding, if time allows. Both methods have merit, as the researcher must consider the time they have available, as well as the amount of data they are working with, in order to make a logical decision.
O’Leary (2014) raises some concerns about using questionnaires as a research tool: they are time-consuming and expensive, and sampling is difficult. O’Leary (2014) asserts that questionnaires are ‘notoriously difficult to get right’ and often do not go as planned.
O’Leary (2014) suggests some obvious strengths for this research method, as administering a questionnaire allows the researcher to generate data specific to their own research and offers insights that might otherwise be unavailable. In listing the additional benefits of questionnaires, O’Leary (2014) suggests that they can:
Cohen et al. (2013, p.394) offer special considerations for administering questionnaires within an educational setting:
Bell, J., & Waters, S. (2014). Doing your research project: A guide for first-time researchers (6th ed.). Maidenhead, Berkshire: Open University Press.
Cohen, L., Manion, L., & Morrison, K. (2013). Research methods in education (7th ed.). Abingdon, Oxon; New York: Routledge. doi:10.4324/9780203720967.
Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). Los Angeles: Sage.
Kvale, S. (2008). Doing interviews. Thousand Oaks; London: SAGE Publications.
McNamara, C. (1999). General Guidelines for Conducting Interviews, Authenticity Consulting, LLC, Retrieved from: http://www.managementhelp.org/evaluatn/intrview.htm
O’Leary, Z. (2014). The essential guide to doing your research project (2nd ed.). London: SAGE.
Author: ADJP Quad
Anne E. Pezalla
Pennsylvania State University, USA
Michelle Miller-Day
Because the researcher is the instrument in semistructured or unstructured qualitative interviews, unique researcher characteristics have the potential to influence the collection of empirical materials. This concept, although widely acknowledged, has garnered little systematic investigation. This article discusses the interviewer characteristics of three different interviewers who are part of a qualitative research team. The researcher/interviewers – and authors of this article – reflect on their own and each other’s interviews and explore the ways in which individual interview practices create unique conversational spaces. The results suggest that certain interviewer characteristics may be more effective than others in eliciting detailed narratives from respondents depending on the perceived sensitivity of the topic, but that variation in interviewer characteristics may benefit rather than detract from the goals of team-based qualitative inquiry. The authors call for the inclusion of enhanced self-reflexivity in interviewer training and development activities and argue against standardization of interviewer practices in qualitative research teams.
Inner Silence / Writing, Reflecting, Hoping / Slipping into Truth / Interviewing moments / Take me by surprise / Like Sunlight (Janesick, 1998: 53)
The level of researcher involvement in qualitative interviewing – indeed, the embodiment of the unique researcher as the instrument for qualitative data collection – has been widely acknowledged (e.g. Cassell, 2005; Rubin and Rubin, 2005; Turato, 2005). Because the researcher is the instrument in semistructured or unstructured qualitative interviews, unique researcher attributes have the potential to influence the collection of empirical materials. Although it is common for scholars to advocate for interviewer reflexivity (Ellis and Berger, 2003; Pillow, 2003) and acknowledge the researcher as the primary instrument in qualitative interview studies (Guba and Lincoln, 1981; Merriam, 2002), with some notable exceptions (e.g. Pitts and Miller-Day, 2007; Watts, 2008) few have actually examined the qualitative interview as a collaborative enterprise, as an exchange between two parties, reflecting on the ways in which the interviewer affects the organization of this talk-in-interaction and the processes by which the talk is produced. Given this, the first aim of this study is to provide a reflexive account of how three different interviewers (authors Jonathan, Annie, and Michelle) individually facilitate unique conversational spaces in their qualitative interviews.
Understanding the qualitative interview as social interaction is important for any sole qualitative researcher, but as Miller-Day et al. (2009) pointed out, this may be particularly germane for qualitative research teams (QRT). Herriott and Firestone (1983) argued that when there is more than one interviewer on a QRT, inconsistencies in interview style and approach may affect the quality of the research conversation and ultimately the study findings. Indeed, several published resources on QRTs suggest that interviewers should receive the same standard training with an eye toward producing consistent strategies and credible findings (Bergman and Coxon, 2005; United States Agency for International Development’s Center for Development Information and Evaluation, 1996). Unfortunately, current literature addressing QRTs has primarily focused on the relationship dynamics among research team members (e.g. Fernald and Duclos, 2005; Rogers-Dillon, 2005; Sanders and Cuneo, 2010; Treloar and Graham, 2003) and on group analytical procedures (e.g. Guest and MacQueen, 2007; MacQueen et al., 1999; Olesen et al., 1994) rather than on the team member roles (e.g. interviewer, analyst) or data collection practices (e.g. strategies for building rapport). As QRTs are becoming more prevalent, especially in funded research (Barry et al., 1999; Ferguson et al., 2009), there is a need for more information about how to maximize the use of multiple interviewers and maintain a focus on the unified research goals while respecting the flexibility of the in-depth qualitative interview as talk-in-interaction (Mallozzi, 2009; Miller-Day et al., 2009). Toward that end, the second aim of this study is to reflect on and discuss implications of the study findings for qualitative research teams.
The phrase researcher-as-instrument refers to the researcher as an active respondent in the research process ( Hammersley and Atkinson, 1995 ). Researchers ‘use their sensory organs to grasp the study objects, mirroring them in their consciousness, where they then are converted into phenomenological representations to be interpreted’ ( Turato, 2005 : 510). It is through the researcher’s facilitative interaction that a conversational space is created – that is, an arena where respondents feel safe to share stories on their experiences and life worlds ( Owens, 2006 ).
Over the years, scholars have considered the nature of the researcher-as-instrument as interpreter of empirical materials and as involved in the construction of ideas (Janesick, 2001; Singer et al., 1983). This consideration began to grow after feminist UK scholars such as Oakley (1981) and Graham (1983) criticized quantitative-based research methods that assumed a detached and value-free researcher in the acquisition and interpretation of gathered data, and was further developed by feminist ethnographers such as Stack (1995), who offered seminal research on ‘dramatizing both writer and subject’ in fieldwork on neighborhoods and communities (p. 1). More recently, scholars have extended their interest in the researcher-as-instrument to consider specific interviewing strategies. Conversation analysis tools have often been used to examine the intricacies of interview conversations, studying the ways in which the ‘how’ of a given interview shapes the ‘what’ that is produced (Holstein and Gubrium, 1995; Pillow, 2003).
While qualitative scholars agree that a conversational space must be created, they often disagree as to what that space should look like. Some scholars argue for a Rogerian interviewing space, where empathy, transparency, and unconditional positive regard are felt (Janesick, 2001; Mallozzi, 2009; Matteson and Lincoln, 2009). Pitts and Miller-Day (2007) documented specific trajectories experienced by qualitative interviewers when establishing rapport with research participants, and the authors argue that a feeling of interpersonal connection was necessary for the qualitative interviewer and interviewee to develop a partnership. These claims are grounded in the feminist or poststructuralist perspective, which holds that ‘the essential self … is not automatically revealed in a neutral environment but can and might need to be benevolently coaxed out into a safe environment, where it can be actualized’ (Mallozzi, 2009: 1045).
Others advocate against a feminist approach to interviewing. Tanggaard (2007), for example, viewed empathy as a dangerous interviewer quality because it tends to create a superficial form of friendship between interviewer and respondent. Self-disclosure has been similarly critiqued (Abell et al., 2006). These critics hold that self-disclosure may actually distance the interviewer from the respondent when it portrays the interviewer as more knowledgeable than the respondent. These studies question the popular assumption that displays of empathy or acts of self-disclosure are naturally interpreted by the respondent as a means of establishing a conversational space of rapport and mutual understanding.
So where do these opposing viewpoints lead us as researchers? For the three of us who are authoring this article, the answer to that question is an unsatisfactory, ‘we are not sure.’ Working as part of a QRT, we were trained in a systematic manner, provided with clear procedures for carrying out our qualitative interviews, and educated in the ultimate goals of the research project. The interviewees in this team project were a fairly homogeneous group – rural sixth- and seventh-grade students – and all three of us interviewed youth in both grades, both male and female, gregarious and stoic. Yet, the interviews we conducted all turned out to be very different. What stood out to us was that our individual attributes as researchers seemed to shape the manner in which we conducted our interviews and affected how we accomplished the primary objective of the interviews, which was to elicit detailed narratives from the adolescents. Hence, we set forth to better understand how we, as research instruments, individually facilitated unique conversational spaces in our interviews and to determine whether some researcher attributes or practices were more effective than others in eliciting detailed narratives from the adolescent respondents. Additionally, we sought to reflect on the emergent findings and offer a discussion of how unique conversational spaces might impact QRTs.
The team-based qualitative research study: participants.
The empirical materials for the current study came from a larger study designed to understand the social context of substance use for rural adolescents in two Mid-Atlantic states. A total of 113 participants between 12 and 19 years old (M = 13.68, SD = 1.37) were recruited from schools identified as rural based on at least one of two main criteria: (a) the school district was located in a ‘rural’ area as determined by the National Center for Education Statistics (NCES, n.d.); and (b) the school was located in a county considered ‘Appalachian’ according to the Appalachian Regional Commission (ARC). Participating schools served a large population of economically disadvantaged students, identified by family income equal to or less than 180 percent of the United States Department of Agriculture federal poverty guidelines; these guidelines start at an annual income of $20,036 and increase by $6,919 for each additional household member (Ohio Department of Education, 2010).
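To make the eligibility arithmetic above concrete, here is a minimal sketch. The dollar figures come from the text; the function name and structure are our own illustration, not part of the original study:

```python
def eligibility_threshold(household_size: int) -> int:
    """Annual income cutoff for 'economically disadvantaged' status,
    per the figures reported above (illustrative helper only)."""
    BASE = 20_036       # reported threshold for a one-person household
    PER_MEMBER = 6_919  # reported increase per additional household member
    if household_size < 1:
        raise ValueError("household_size must be at least 1")
    return BASE + PER_MEMBER * (household_size - 1)

# A four-person household would be counted as economically disadvantaged
# with a family income at or below this amount:
print(eligibility_threshold(4))  # 20036 + 3 * 6919 = 40793
```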
Eleven interviewers comprised the qualitative research team for this team-based study. All underwent at least four hours of interviewer training, which reviewed the interview protocol and procedures, summarized guidelines for ethical research, and included interview practice and feedback. During training, interviewers were given a clear interview schedule. Because the interviews were semistructured, the interviewers were instructed to use the schedule as a guide. They were instructed not to read the questions word-for-word from the interview schedule, but instead to use their own phrasing for asking each question, use additional probes or prompts if necessary, and use a communication style that felt comfortable and natural to them. Interviewers were also instructed to interact with their participants as learners attempting to understand the participants’ experiences and realities from their perspectives (Baxter and Babbie, 2004). All interviewers on the team participated in mock interview sessions and were provided with initial feedback about their interviewing skills.
The interviews themselves were conducted in private locations within the schools such as guidance counselors’ offices or unused classrooms or conference rooms. In most cases, either the adult school contact or the study liaison brought students to their interview site to ensure that the interviewer did not know the students’ names – only their unique identification number. Researchers assured all students their responses would remain confidential, in accordance with Institutional Review Board standards, and the interviewee was permitted to withdraw his/her data from the study at any time. All interviews were digitally recorded and ranged from 18 to 91 minutes in length. This length is typical of interviews dealing with sensitive topics such as drug use in a school-based setting (Alberts et al., 1991; Botvin et al., 2000).
Interview sample.
For the purpose of the present study we all agreed that self-reflexivity was necessary to ‘understand ourselves as part of the process of understanding others’ ( Ellis and Berger, 2003 : 486), increase the transparency of our findings, and increase the legitimacy and validity of our research. Therefore, we elected to limit our analysis to only those interviews that the three of us conducted, excluding transcripts from the other eight interviewers in the team-based study. Transcripts of the interviews were provided by a professional transcriptionist who was blind to the purpose of the study. A total of 18 interviews were transcribed (six per interviewer). Further refining the sample, we elected to analyze only interviews that we deemed to be of sufficient quality. Transcript quality was based on two indicators: (a) the level of transcription detail; and (b) the ability of the respondent to speak and understand English. Transcripts that were poorly done (i.e. that failed to include sufficient detail from the interview audio file) or that indicated that the respondent did not understand English were rated as low quality and were not included in final analyses. We took this step to ensure that all transcripts in the study sample were of sufficient quality and provided adequate detail to decipher our interviewer practices. From the 18 originally submitted transcripts, we found 13 to be of sufficient quality, and retained them for analysis.
Following Baptiste’s (2001) advice, the first step in our analysis was to acknowledge our interpretivist orientation and to honestly discuss among ourselves the risks involved with self-reflexively examining our own work. If you think it is difficult to listen to your own voice in an audio-recording, imagine listening to your own voice and simultaneously reading the text illustrating your own interview errors, dysfluencies, and awkward pauses! This first step was perhaps the most difficult, but it resulted in a shared agreement for honest self-reflection and analysis.
The next step involved restricting our analysis to three specifically selected topics from the research interview. The three discussion topics included rural living, identity and future selves, and risky behavior. We identified these topics of discussion because they each represented a different level of emotional risk for the respondents (Corbin and Morse, 2003), based on the assumptions that (a) respondents were all relatively similar in their emotional well-being – specifically, that none were too emotionally fragile to engage in a conversation with us, and (b) discussing topics of illegal or private activities would arouse more powerful emotions in respondents than would topics of legal and mundane activities. Across the entire sample of interviews, conversations on rural living were seen as fairly low-risk topics of discussion. The topic often served as a warm-up for many interview conversations because it was easy for respondents to discuss. Conversations on identity and future selves were typically perceived as moderately uncomfortable for respondents. Respondents were asked to talk about their personality characteristics and who they wanted to become in the future. Although some respondents appeared to enjoy the opportunity to talk about themselves, many appeared mildly uncomfortable doing so, perhaps because they were being asked to talk about themselves with someone they did not know. Conversations on risky behavior were often perceived to be more dangerous. Despite being reassured that their stories would remain confidential, respondents were nevertheless being asked to disclose information about potentially illegal activities in which they had taken part. These topics of discussion were not always mutually exclusive (e.g. respondents often talked about risky behavior when they discussed rural living); but, because every interview in the larger study included topics of discussion of low, moderate, and high sensitivity, we believed that the three chosen topics of discussion represented an appropriate cross-section of the interview.
Dividing interviews into topics of discussion provided a way to organize long transcripts into relatively distinct topical areas. It also allowed us to examine interviewer practices across comparable topics of discussion, and to assess the ways in which particular characteristics facilitated different conversational spaces.
The next step involved identifying and labeling the discussion of each of the three topics within each of the 13 transcripts. As we labeled the related passages in the transcripts, each of us followed the same iterative analytic process, commencing with an analysis of our own individual transcripts and followed by a cross-case analysis of each other’s transcripts. Our individual, within-case analysis proceeded in stages: we first read through our own transcripts two to three times before extracting the separate topics of discussion; then, within each topic of discussion across all of our own interviews, we inductively identified, interpreted, and labeled what we each saw as important in the utterances, sequencing, and details of the conversational interaction, assessing the ways in which interviewer practices seemed to facilitate and to inhibit respondent disclosure. For our purposes, we defined an interviewer practice as an action performed repeatedly. These practices were eventually categorized into groups of interviewer characteristics. We conceptually defined an interviewer characteristic as a distinguishing general feature or overall quality of the interviewer. Throughout this process we individually developed and refined our code lists, discussing our emergent codes with one another via weekly meetings and email correspondence. As part of this process, we coded our own transcripts and then shared and discussed our code lists with the others. Next, each of us (re)coded a portion of each other’s transcripts and calculated the percentage of raw coding agreement. Disagreements were negotiated until we all reached consensus on a working list of codes. The cross-case analysis did not commence until we had reached a minimum coding agreement of .80.
Within the topic of rural living, for example, if two of us each generated five codes to describe one interviewer’s researcher-as-instrument characteristics, consensus was necessary on at least four of those codes before a trustworthy assessment could be made.
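The agreement check described above reduces to simple percent agreement between two coders' code lists. The following is a minimal sketch under the assumption that agreement was computed as shared codes over the larger of the two lists; the study reports raw percentage agreement but does not specify an exact formula, and the code names below are invented for illustration:

```python
AGREEMENT_THRESHOLD = 0.80  # minimum raw coding agreement stated in the text

def raw_coding_agreement(codes_a, codes_b):
    """Share of codes two analysts applied in common, relative to the
    larger of the two code lists (illustrative assumption only)."""
    set_a, set_b = set(codes_a), set(codes_b)
    larger = max(len(set_a), len(set_b))
    if larger == 0:
        return 1.0  # two empty code lists trivially agree
    return len(set_a & set_b) / larger

# Worked example from the text: two coders each generate five codes for
# one interviewer, and four of the five match.
coder_1 = ["affirming", "energetic", "interpretive", "neutral", "naive"]
coder_2 = ["affirming", "energetic", "interpretive", "neutral", "self-disclosing"]
score = raw_coding_agreement(coder_1, coder_2)
print(score, score >= AGREEMENT_THRESHOLD)  # 0.8 True
```

On this reading, consensus on four of five codes yields exactly the .80 floor the team required before cross-case analysis could proceed.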
During the cross-case analysis we compared and contrasted the coded material within and across the entire sample of transcripts to identify discrepancies and consistencies in our codes. From this process, we reduced the code list to a common set of researcher-as-instrument characteristics and interviewing practices that were present in the utterances, sequencing, and details of the conversational interactions. Throughout this process we explicitly identified evidence (excerpts from the interview transcripts) for any research claim to connect the empirical materials with any findings (Maxwell, 1996). The three of us met periodically to confer, share ideas, and challenge and refine emergent findings. We used NVivo 8 to manage and analyze the interview data. In the end, we were able to (a) identify and describe individual interviewer practices that served to characterize each of us as individual interviewers, and (b) compare and contrast our individual differences within and across the different topics in the interview conversation. During this comparison we paid special attention to the adolescent’s contribution to the conversation and his or her level of disclosure.
Annie’s general interviewer characteristics were coded as affirming, energetic, and interpretive. The affirming characteristic was defined as ‘showing support for a respondent’s idea or belief’ and is illustrated in the following excerpt:
Annie : What do you do? Resp : I help the milkers, I help – Annie : You know how to milk a cow? That’s so cool, that’s great. Resp : Yeah, but you have to watch out ’cause they kick sometimes. ’Cause they don’t want you messing with their teats – they kick, it’s, uh … Annie : Have you been kicked? Resp : I got kicked in the arm, but I’m scared I’m gonna get kicked in the face one of these days. Annie : Yeah, that would really hurt, huh? Oh, wow, that’s amazing.
Comments like ‘that’s so cool, that’s great,’ and ‘Oh, wow, that’s amazing’ illustrated the affirmation. Annie’s affirming characteristic could be seen in other transcript passages in phrases such as ‘great,’ ‘awesome,’ ‘amazing,’ and ‘excellent.’ Annie’s interviewer characteristics were also coded as energetic, defined as ‘showing wonder, astonishment, or confusion by something respondent said that was unexpected, or remarkable.’
Annie : So you like dirt bikes. Do you have one of your own? Resp : Yeah, I have a, it’s a one, it’s a two-fifty. It’s like a, it’s a CRX 250, it’s like … Annie : Oh, wow! Is it a pretty big bike? Wow, what do you like to do on it? Resp : I just ride around in the fields and usually chase after deer on it. Annie : Really!
Annie : Um, is your sister older or younger? Resp : She’s younger, she’s ten. Annie : So you kinda look out for her? Resp : Yeah. She likes to feed the calves. Annie : Oooooh!! Cute little baby calves. That’s neat. Wow! How unique. That’s really, really cool.
Annie : What’s a – dwarf bunny? What is that? Resp : Yeah, they’re like little bunnies – they’re about that big. Annie : Like real bunnies? Resp : Yeah, they’re about that big – Annie : Oh, dwarf bunnies. Oooh!
The sheer number of exclamation marks in Annie’s transcripts illustrated her energetic interviewer characteristic, but the words she used (wow, really, oooooh) also illustrated the lively quality of her interview approach.
Lastly, Annie was also characterized as being interpretive, conceptually defined as ‘expressing a personal opinion or interpretation regarding something a respondent said.’ For example:
Resp : And I chugged it and like, I passed out. Annie : Did you have to go to the hospital? Resp : Oh no. We were in the middle of the woods and we weren’t saying anything ’cause we all would get busted. Annie : Oh my gosh, oh, you must have felt terrible.
Annie : Do you think that he drinks beer, or does chew or smokes cigarettes? Resp : He probably does, but – Annie : Do you think so? Um, and so when he offered this to you, were you, were you uncomfortable? Like, did you feel kind of weird?
In all of the above passages, Annie’s interpretive nature is evident in instances where she offers her own construal of the respondent’s story (e.g. ‘you must have felt terrible’), or when she creates a hypothetical scenario for the respondent to comment on (‘do you think he drinks beer?’). Such utterances illustrate her tendency to offer an opinion, either in response to a respondent’s story or before a conversation formally began.
Jonathan’s interviewing was characterized by neutrality and naivety. The neutral interviewer characteristic, defined as ‘not engaged on one side of an argument or another; neither affirming nor disapproving of respondent’s stories,’ was best illustrated by the lack of extensive commentary Jonathan provided in his interviews. In comparison to Annie’s transcripts, Jonathan’s transcripts were characterized by shorter utterances, fewer opinionated responses, and very few exclamation marks:
Jonathan : Who were you living with in [name of town]? Resp : My mom. But she, my grandma got custody of me, so. Jonathan : What, what happened to do that? Like, what, what brought you? Resp : Well, I got put in [the local in-patient treatment facility] ’cause I said I was gonna kill myself. Jonathan : Oh, okay.
Jonathan : Okay. What, um, so does your dad mind if you drink then? Like, if he found out that you were going to the bar party and that you had gotten drunk, what would he say? Resp : He probably wouldn’t do anything because, like, I used to have parties at his house, at my dad’s house. But then he got, then he went to jail, so we stopped [lowers tone, quieter] In case, like, ’cause they were keeping a good eye on him after he got out. Jonathan : Mm hmm. Resp : So we stopped having parties there, just so that, like, my dad wouldn’t get in trouble for, like, the underage drinking. Jonathan : Okay.
It was often difficult to even see evidence of Jonathan’s ‘footprint’ in his transcripts because he maintained a fairly minimal presence in his interviews. As seen from the illustrations above, Jonathan kept many of his responses or comments to single-word phrases: ‘Okay,’ or ‘Mm hmm,’ or ‘Yeah.’ When Jonathan did offer more extensive commentary, it was often to acknowledge his lack of understanding about a subject matter. His transcripts often included passages like ‘I’ve never been here before’ or ‘I don’t know anything about that.’ It was in these instances that Jonathan’s naive interviewer characteristic, defined as ‘showing a lack of knowledge or information about the respondent,’ was best illustrated:
Jonathan : Is it like illegal? Or is it like the whole town shuts down, they do racing down the streets? Resp : It’s illegal. Jonathan : Yes? I don’t know – you got tell me these things. I am learning.
These illustrations of naivety were most likely uttered to give the respondent a sense of mastery over the interview topics of discussion, and to elicit the respondent’s interpretations of the events or topics of discussion.
Michelle’s interviewer characteristics illustrated different qualities than either Jonathan or Annie. Michelle’s qualities as an interviewer were coded as being high in affirmation and self-disclosure. Michelle’s transcripts were filled with encouragement and compliments toward her respondents. The following utterances from Michelle illustrate this characteristic:
My goodness, you are smart for a seventh grader … It sounds like you are very helpful … Yes, that is a skill that you have there, that not a lot of people do have …
These instances of affirmation, defined as ‘showing support for a respondent’s idea or belief,’ were found in almost every topic of discussion. Michelle’s transcripts were also filled with instances of self-disclosure. Michelle often used stories of her adolescent son when she was explaining a topic that she wanted to discuss with the adolescent respondents:
Resp : On Friday nights, tonight I’ll go to my gran’s and we usually have a get-together and just play cards, it’s just a thing we do. I like it. It’s just time to spend with family. Michelle : Absolutely. Well, that sounds really nice. And I have a 14-year old in eighth grade. And every Sunday night, we do the game night sort of thing and I look forward to it.
The passages above illustrate three distinct interviewer characteristics: one high in affirmations, energy, and interpretations; another characterized by neutrality and naivety; and another high in affirmations and self-disclosure. Although all three interviewers demonstrated other instrument qualities in their interviews, the few qualities associated with each interviewer above were found in nearly every topic of discussion (e.g. in almost every conversational topic for Annie, there was evidence of her affirming, energetic, and interpretive interviewer characteristics). These qualities seemed to characterize the unique style of the interviewers rather than reflect reactions to specific contexts. These qualities also persisted in our other interviews not included in these analyses.
In the following section, we compare our general interviewer characteristics across the three topics of discussion: rural living, identity and future selves, and risky behavior. We also examine the ways in which our respective interviewer characteristics appeared to influence the conversational space of our interviews. Specifically, we assess how the various interviewer characteristics seemed to facilitate or inhibit respondent disclosure.
Rural living was generally a low-risk topic. In her discussion of this topic with one adolescent, Michelle tended to utilize her self-disclosing characteristic:
Michelle : Are there groups or, like, not cliques, I don’t wanna say, but groups in school; kids who are more like you, who are more into the computers, versus the kids who are huntin’ and fishin’, versus the jocks? I know at my son’s school there are. Resp : There’s not really anybody like that here. Like all of my friends who are like that, they’re in a higher grade than me. But there are some people in my grade where I can relate to in a sense, yeah. Michelle : Okay, so most kids you can relate to are older but most o’ the kids, your peers and your age, are more into the four wheeling and hunting and fishing and kinda stuff like that? That must feel, well, I don’t know, I’m, I’m projecting now unto my own son because sometimes he feels like, that you know, it’s just ridiculous. Resp : Yeah. Michelle : It, eh, ya’ know – and you feel kinda stuck. Resp : Mmm hmm. Michelle : Yeah? Resp : Yeah. I just, like I’ll be sitting there in class and then they’ll start talking about hunting or fishing and I just wanna pull out my hair’ cause I, I don’t know how you can like that stuff. Like it’s just sitting there for a couple of hours doing nothing. Michelle : Right, right.
The excerpt above suggests that the respondent’s experience with school crowds did not coincide with Michelle’s understanding of her son’s experience with school crowds. However, Michelle’s self-disclosure seemed to open up the conversational space for the respondent to respond in kind. In the final passage, the respondent offered a different perspective on the nature of crowds in his school.
Conversely, in his conversations with respondents about rural living, Jonathan tended to demonstrate his naive interviewer characteristic:
Jonathan : Is this [name of X town]? Is that where you live now? I don’t even know where I am. Okay, okay. I thought this was [name of Y town] is why, but it’s just the name of the High School. Resp : Well, this is [name of Y town], but [name of X town] is out near. Jonathan : Uh, I’m not, I don’t know this area so well … Resp : And then, like, when you hit, there’s this big huge fire station … and then there’s the [name of X town] Elementary School. And then if you go down there and then you turn and you go up, and then that, like, that whole area in there is [name of X town]. Jonathan : Okay. Resp : And then you go back and where there’s classified as [name of X town], but it’s actually [name of Z town]. Jonathan : Okay.
In response to Jonathan’s naivety (‘I don’t even know where I am’ and ‘I don’t know this area so well’), the respondent appeared to seize the opportunity to teach Jonathan about the area. The respondent did not simply answer Jonathan’s questions; he provided information about which Jonathan did not ask (e.g. the whereabouts of the fire station, elementary school, and nearby towns).
In contrast, Annie’s conversations about rural living were filled with her energetic interviewer characteristic:
Annie : What do you mean by hang out, like what do you ha-, what do you do when you hang out? Resp : We go four wheeler riding. Annie : Oh, four wheeler riding! Cool! Is that dangerous? Is it? Resp : Yeah, and we go up to our camp we built. Um … Annie : That you and your friends built? Resp : Mmm hmm. Annie : Wow! How did you know how to do all that? Resp : Um, my brother and a couple of his friends, that we’re really good friends with, helped us. And like, over the summer we camp out like every night. Like, I’m never home in the summertime, ever. Annie : Wow! Resp : There are three bedrooms and it’s, has a wood burner and it, yeah. Annie : That’s like, that sounds like a real house. That’s amazing. Resp : We built it out of trees. We had our, couple of our friends and our dads help us. We’ve had it for three years and it’s really nice.
After Annie’s lively reply to the respondent’s interest in four wheeler riding (‘Oh, four wheeler riding! Cool!’), the respondent opened up about a different, but related topic: her summer camp house. Moreover, Annie’s energetic comment about the house (‘Wow! How did you know how to do all that?’) seemed to open the conversational space even more, as the respondent explained the ways the house was built, the amenities of the house, and the amount of time she spent in the house during the summer.
Conversations about the adolescents’ identity and future selves were considered moderately uncomfortable for adolescents. The interview questions prompted the adolescents to talk about the qualities that described their personal and social identities, along with any hopes and aspirations they had for the future. Although the interview questions were designed to be as unobtrusive as possible, the topic was fairly personal. The interview questions required the adolescent respondent to be introspective with someone with whom they had no personal history:
Jonathan : After you’re all done with school, so you go through and you graduate from a high school. What do you want to do after that? Resp : Go back to Mexico and visit my family, and um get a job. Jonathan : Back in Mexico? Resp : It doesn’t really matter where, but just like get a job. Jonathan : Yes. What kind of job? Resp : Probably like a secretary or whatever job they give me, except prostitute. Jonathan : None a’ that. Is there anything you worry about in that transition of how you’re going to go get a job and what kind of job you’ll get, things like that? Resp : Not really, because like, you just have to like – I dunno, just like – just like – go on with life and whatever happens, just, take it.
Here again, Jonathan’s neutrality was demonstrated not by what he said, but what he did not say. Despite the fact that the adolescent shared a potentially troubling disclosure, that she would consider any job except prostitution, Jonathan kept his personal reactions to a minimum and provided only a short response (i.e. ‘None a’ that’). After this instance of neutrality, Jonathan moved on to a different topic (i.e. asking the respondent if she had any concerns about getting a job in the future), and the respondent moved on, as well, dutifully answering his questions. She provided no more information on her prostitution comment.
In comparison to Jonathan, Michelle and Annie’s utterances in their conversations on identity and future selves were replete with codes for affirmation:
Resp : I wanna be a pediatrician nurse or something. Like, I love kids to death. Like, I’ve, I learned how to change – I’ve been changing diapers – this is no lie – I’ve been changing diapers since I was like seven years old. ‘Cause my mom, step-mom, had a baby before my dad left again, and like I was always changing her diapers and stuff, and like, I babysit constantly. Annie : Aww, I bet you’re really good with kids. Resp : Oh, I’m amazing. Like, there’s this one little boy, like he goes to my church, he’s just like four, and I took him to my house one day and like he asked his mom to buy him a toy at the toy store, I cried, she’s like, she’s like, ‘Aww, I can’t sweetie, I don’t have the money’ and he was crying, he and he’s like ‘All my friends have toys. He was like two and he, like he, like he goes over to this daycare and he’s like ‘All my friends have these toys but I don’t have any.’ Like he had no toys at all and like my mom gave them, handed me a hundred dollars and she’s like ‘Go to, go, go buy toys. We gave him a hundred dollars, like we gave him all this money and they went out and bought like a b-, toys and stuff. It was really nice. Annie : That is, that’s really neat.
Michelle : So the first question that I have here is which of these things that you wrote down are you most proud of? Resp : Well, being helpful. Michelle : How are you helpful? Resp : Well. In school, there are some people that don’t like speak English that well. And I help them by translating. Michelle : Oh okay. Like you are doing for your teacher in there. You are helping do that. So how long have you been bilingual your whole life? Do both of your parents speak Spanish? Resp : Well, yes, they are Mexicans. They barely know English. Michelle : And they barely know English. And when did you come here? Resp : When I was nine months old. Michelle : When you were a baby. And before that you lived where? Resp : In Mexico. Michelle : Mexico. So you are 13, so that was when you were a year old. Okay, got it. Okay, so you learned here. So you speak English better than they do it, sounds like. Okay and then you translate. What’s that like translating for them? Resp : Well, for me it’s like sometimes difficult because I never went to school in Mexico and I know more English than Spanish and when I am translating it’s difficult for me. The big words my parents tell me to try to translate it in English. Michelle : Okay. So you’re doing both ways. You’re doing from English to and from Spanish to English. Both. Does that feel like a lot of responsibility for somebody your age? Resp : Yeah, especially when I got field trips stuff like that. I need to tell my parents, that my parents or if my parents needed something that comes in the mail, may be bills or something like that. Michelle : It sounds like you are very helpful. Who do you want to be when you are out of after high school? Resp : Since I like to help out people a lot, I mean, maybe be a translator and maybe in a hospital or in a school so – Michelle : Yes, that is a skill that you have there, that not a lot of people do have. So that’s – I’m glad you realized that, in terms of that.
Annie’s affirming characteristic could be seen in her affirmation of her respondent’s compassion for children (‘I bet you’re really good with kids’); for Michelle, the characteristic could be seen in her affirmations of her respondent’s willingness to help her parents, teacher, and classmates with their English or Spanish (‘… it sounds like you are very helpful’). Both Michelle’s and Annie’s affirmations seemed to foster a conversational space conducive to uninhibited self-disclosure. In response to Annie’s affirmation about owning a daycare someday, the respondent opened up to talk about her talents in working with children, and her compassion for the children in her community who were less fortunate than she. In response to Michelle’s affirmations about the responsibilities of translating for so many people, the respondent expounded on the difficulties of such a responsibility, and the tasks she must perform for various people (e.g. helping her classmates on field trips, assisting her parents with bills).
Discussions about alcohol, tobacco, and other drug usage (ATOD) were considered highly sensitive topics of discussion, as adolescents were often encouraged to disclose information about their own or their peers’ drug use. Although the respondents were continually reassured that the information they provided was confidential, disclosing information about illegal activity to a stranger was likely a highly sensitive activity. When discussing ATOD with adolescents, each interviewer utilized a different interviewer characteristic. Jonathan’s dominant characteristic when discussing this topic was neutrality :
Resp : Her parents’, like, bar. Like, they own this big, huge bar. And then, like, in the back where the kids can go. Jonathan : Oh, okay. Resp : And her parents don’t really care if you drink. Jonathan : Oh, okay. Resp : Just as long as you do it in the bar. You don’t just go outside, or you don’t tell your parents. Jonathan : Okay. Resp : She doesn’t really know that we drink, but we usually crash in the van, in the RV. Jonathan : Uh huh. Resp : … or out in the yard. And we only do the RV in the summer or in the spring. And then at my other friend’s house who has the bar, we stay at, we do the, we have parties there all the time. Jonathan : Mm hmm. Resp : Just cause her parents don’t care. Jonathan : Yeah.
Even in the midst of some fairly controversial topics of discussion (e.g. underage binge drinking), Jonathan’s neutral characteristic was consistently demonstrated in his calm, even responses (‘okay,’ ‘uh huh’). These neutral responses seemed to provide an unobtrusive backdrop for the respondent to discuss her experiences. Indeed, Jonathan did not even need to ask the respondent any questions; with minimal prompting, she shared her story.
In comparison to Jonathan, when discussing ATOD, Annie’s approach was coded as interpretive ; she often interjected commentary about the respondents’ stories of risky behavior:
Annie : Do you think that he drinks beer, or does chew or smokes cigarettes? Resp : He probably does … Annie : Um, and so when he offered this to you, were you, were you uncomfortable? Like, did you feel kind of weird? Resp : Mm hmm. Annie : Um, and, and maybe that boy’s brother – like, that guy’s brother – he might smoke or drink from time to time, but, um, that’s about it? Resp : Mm hmm. Annie : It doesn’t seem like too many kids around here do that stuff. Resp : Not as I know.
Annie’s interpretive characteristic stands in stark contrast to Jonathan’s neutral characteristic. Whereas Jonathan’s responses were short and dispassionate, Annie’s responses were somewhat opinionated. These interpretive comments did not seem to generate a conversational space conducive to the respondent’s continued disclosure. Indeed, the transcript above shows that most of the commentary came from Annie, not the respondent.
In discussions on risky behavior, Michelle’s self-disclosing characteristic was evidenced by her stories of her 14-year-old son, and appeared to serve as a point of identification with respondents:
Resp : My parents get mad because I listen to music a lot and I don’t do anything than watch TV. Just hang out with my friends. Michelle : Then your parents get mad because that’s all you do. You know but the good thing about me is I’m not your parent and I don’t care. So I just want to know what kids are doing. It’s, you know, I have an eighth grader actually he’s 14. And that’s exactly what he does. And in the winter it stinks, though you are right because what else is there to do? You know it’s the question, um any way, okay. So, do you know my question to you is, and again, this is purely confidential, we don’t know names we don’t want names or anything. Has anybody ever offered you any alcohol or cigarettes or marijuana or any of those? And have you said yes or no to that? Resp : Yes, they offered me and I’d always told them ‘no’ and what it does. Michelle : Okay, so tell me … pretend that we’re shooting this video. Okay tell me the who when what where why and how. Right? Where were you, not who, not a name. But was it a friend who was older, younger, male, female? That kind of thing. Tell me the story of at least one of these offers. Resp : Okay. I was hanging out with my friends, just walking around, and there is this bigger kid that we know and he was joined by these smokers, and they would always, he would always tell me never to smoke and we just saw him … And then he offered us and we said no. This is not good for you and he plays soccer and he is not really good at soccer.
Michelle’s self-disclosure about her son experiencing similar challenges as the respondent was initially met by the respondent with a short response. However, Michelle’s subsequent question, framed as a hypothetical task (‘ pretend that we’re shooting this video ’), seemed to create an opening in the conversational space for the respondent to share a story.
In looking closely at the different practices we employed as interviewers, we were able to identify a variety of distinguishing features that seemed to characterize each of us uniquely. If we were characters in a novel or play, Annie’s character name would be energy , Jonathan’s neutrality , and Michelle’s self-disclosure . Across the different conversation topics in the interview, from low to high risk, these interviewer characteristics functioned differently in eliciting detail from adolescent respondents.
When the adolescents and researchers discussed the low-risk topic of rural living, the three interviewer characteristics (i.e. energy, neutrality, or self-disclosure) generated sufficiently detailed responses from the respondents. Variance across interviewers did not seem to have much impact on the quality of the responses obtained from the adolescent participants. This may have been due, in part, to the low-risk nature of the topic. This is a topic many adolescents can talk easily about, have talked about with others, and do not perceive the information they share as particularly threatening.
When the topic was moderately risky, as was the topic of identities and future selves, Jonathan’s neutral approach contrasted with Michelle and Annie’s affirming approach. Although neutrality appeared somewhat effective in facilitating an open conversational space for respondents, the affirming interviewer characteristic seemed to offer a more nurturing environment for conversation. Rich, detailed disclosures from adolescents about their identities occurred more often when the interviewer utilized an affirming approach and set a tone of acceptance for the respondents. Affirmation may be particularly important with adolescents, since adolescence is a notoriously vulnerable time in development.
When discussing a high-risk topic such as alcohol and other drug use, Annie’s interpretive approach appeared to be the least effective in providing a satisfying conversational space for respondents. Jonathan’s neutral characteristic and Michelle’s self-disclosing characteristic appeared to elicit detailed information from their respondents, while Annie’s interpretive characteristic only served to inhibit her respondents’ stories. Michelle’s disclosures, while also interpretive, did not appear to limit responses from the adolescents. Couching Michelle’s interpretive language within a personal narrative may have mitigated its presence, although it still presented leading information. Hence, it could be argued that neutrality (displayed in this context by Jonathan) may be most effective when discussing high-risk topics, because it provides respondents with the most freedom to disclose what they want and how they want.
An important factor to note in this discussion is that of gender. While we did not explicitly study the role of gender in our analyses, our interviewing styles were rooted in traditional gender norms: Jonathan’s minimalist and neutral styles could be characterized as stereotypically masculine, and Annie and Michelle’s effusive and affirming interviewing styles could be characterized as traditionally feminine. These qualities suggest that interviewing styles cannot be disentangled from one’s gender, and that conversational spaces are influenced by more than simply an interviewer’s words. To this end, practices of reflexivity must acknowledge the implications of what an interviewer says and how it is said, as well as the ways in which those utterances are connected to one’s gender.
Although this study provides some intriguing findings, it was limited in a variety of ways. For one, we did not employ detailed conversation analysis procedures on each individual utterance in the interview. And despite the range of conversational segments in the interviews (i.e. introductions, research explanations, establishing rapport, soliciting honesty and openness, a period of questions and answers on six core topics, summarizing the discussion, and closings), for the purposes of this study, we elected to limit our analysis specifically to three topics in the question and answer segment. Nor did we examine other conversational features, such as the role of silence or turn-taking. Conversational features such as those, while certainly worth our attention, were beyond the scope of this exercise.
Learning about interviewing and doing interviews are different tasks. This lesson was highly relevant for us when conducting this study. Even though we were all trained in interviewing, we still found ourselves making the classic mistakes of a novice researcher: asking long, complicated questions, posing closed yes-or-no questions, and leading respondents ( deMarrais, 2004 ). While humbling, these mistakes forced us to reflect on how to develop our skills and have guided our interviewing work since that time. Indeed, the kind of self-reflexivity involved in conducting an analysis of your own interviews, and then comparing and contrasting them with others, could be beneficial both for individual interviewers honing their craft and for QRTs seeking to identify the unique characteristics of their resident interviewers.
In considering our findings, we agree that researchers are indeed the ‘instruments’ in qualitative interview research. After all, it is through the researcher’s facilitative interaction that a conversational space is created where respondents share rich information about their lives. Yet, we argue that qualitative researchers are differently calibrated instruments.
In QRTs, in particular, the goal is often to calibrate all instruments to one standard of accuracy. However, the results of this study illustrate that variation in interviewer characteristics may be a benefit rather than a detriment to team-based qualitative inquiry. All interviewers in this study were effective in conducting engaging conversations with participants and eliciting information, but we did these things employing different practices, and sometimes to different ends. Each interviewer demonstrated a relatively consistent interviewer style across all of his or her interviews – Jonathan was consistently neutral, Michelle consistently self-disclosive, and Annie consistently energetic. This finding leads us to suggest that QRTs might benefit from learning what ‘natural style’ characterizes a possible interviewer and then staffing their teams with interviewers who have complementary styles. Interviewers may then be assigned interview tasks commensurate with their strengths. For example, our team needed to learn both about rural identity and about alcohol and drug use, so Michelle and Annie could have been assigned to interview respondents about rural identity (a ‘safe’ topic) and future selves (a moderately risky topic), both of which fit their energetic styles. This approach could have helped to engage participants in the research and establish rapport with them among the research team. Then, Jonathan could have been assigned the task of summarizing the information learned about the less risky topics and bringing that information into a second interview to pursue the high-risk topic of drug use, implementing his neutral style for a non-evaluative conversational space. This suggestion is founded on a premise similar to utilizing information from personality inventories (e.g. Myers-Briggs) to establish work teams in organizations ( Furlow, 2000 ).
Since many interviews must occur during a single visit, however, interviewer ‘profiling’ may not be realistic for QRTs. Another suggestion would be to audio-record interview trainees in mock interviews, share those recordings among the team, then devote some time for team members to offer commentary on (a) the ways in which their teammates embodied similar or different instruments in their interviews and (b) how those instruments seemed to create different conversational spaces. This process need not involve detailed conversation analysis tools; nor should it be formal or performance-based. Instead, it should be congenial and constructive, driven by efforts to respect interviewer flexibility while maintaining fidelity to the research approach. These recommendations are in line with calls issued by Mallozzi (2009) and Miller-Day et al. (2009) , who argued that consistency efforts be focused on research procedures (e.g. securing consent, managing empirical materials) and not on standardizing interviewer characteristics.
In carrying out these recommendations, more research will be needed to understand the complexities of how and under what conditions interviewer characteristics may impact respondent responses. More research will also be needed on the ways QRT practices may change if reflexivity were incorporated at other stages of the process (e.g. forming research questions and gaining access). Yet this study provides a running start toward that end. Through our exercise, we call for greater interviewer reflexivity and acknowledge that researchers are the primary instruments in qualitative interview studies – but differentially calibrated instruments. We disagree with claims that interviewers in qualitative research teams should receive the same standard training with an eye toward producing consistent interview strategies ( Bergman and Coxon, 2005 ) and argue, instead, that diversity of approaches among members of a research team has the potential to strengthen the team through complementarity.
This research received no specific grant from any funding agency in the public, commercial or not-for-profit sectors.
Annie Pezalla is the Academic Skills Director at Walden University. Her research addresses identity development across adolescence and young adulthood.
Jonathan Pettigrew is a research analyst and project coordinator for the Drug Resistance Strategies project at Penn State University. His research examines how interpersonal and family interactions correspond with adolescent health.
Michelle Miller-Day is an Associate Professor of Communication Arts and Sciences at the Pennsylvania State University. She is the Founding Director of the Penn State Qualitative Research Interest Group, an interdisciplinary community of researchers involved in and supporting qualitative inquiry at Penn State University. Her research addresses human communication and health, including areas such as substance use prevention, suicide, and families and mental health. Her community-embedded research has involved numerous creative projects to translate research findings into social change. For the past 20 years she has served as the principal qualitative methodologist for a National Institute on Drug Abuse line of research.
Anne E Pezalla, Pennsylvania State University, USA.
Jonathan Pettigrew, Pennsylvania State University, USA.
Michelle Miller-Day, Pennsylvania State University, USA.
Published on August 20, 2019 by Shona McCombes . Revised on June 22, 2023.
Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps:
Surveys are a flexible method of data collection that can be used in many different types of research .
What are surveys used for?
Step 1: Define the population and sample
Step 2: Decide on the type of survey
Step 3: Design the survey questions
Step 4: Distribute the survey and collect responses
Step 5: Analyze the survey results
Step 6: Write up the survey results
Frequently asked questions about surveys
Surveys are used as a method of gathering data in many different fields. They are a good choice when you want to find out about the characteristics, preferences, opinions, or beliefs of a group of people.
Common uses of survey research include:
Surveys can be used in both cross-sectional studies , where you collect data just once, and in longitudinal studies , where you survey the same sample several times over an extended period.
Before you start conducting survey research, you should already have a clear research question that defines what you want to find out. Based on this question, you need to determine exactly who you will target to participate in the survey.
The target population is the specific group of people that you want to find out about. This group can be very broad or relatively narrow. For example:
Your survey should aim to produce results that can be generalized to the whole population. That means you need to carefully define exactly who you want to draw conclusions about.
Several common research biases can arise if your survey is not generalizable, particularly sampling bias and selection bias . The presence of these biases has serious repercussions for the validity of your results.
It’s rarely possible to survey the entire population of your research – it would be very difficult to get a response from every person in Brazil or every college student in the US. Instead, you will usually survey a sample from the population.
The sample size depends on how big the population is. You can use an online sample calculator to work out how many responses you need.
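The arithmetic behind such online calculators can be sketched in a few lines. The version below uses Cochran's formula with a finite population correction; the 95% confidence level (z = 1.96), 5% margin of error, and population figures are illustrative assumptions, not values from the text.

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's formula with a finite population correction.

    z=1.96 corresponds to a 95% confidence level; p=0.5 is the most
    conservative assumption about how varied responses will be.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population estimate
    n = n0 / (1 + (n0 - 1) / population)                # adjust for a finite population
    return math.ceil(n)

# A population of 10,000 at 95% confidence and a ±5% margin of error:
print(sample_size(10_000))  # → 370
```

Note how weakly the result depends on population size past a few thousand people: the required sample grows quickly for small populations but plateaus around 385 for very large ones, which is why national polls can get by with surprisingly few respondents.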
There are many sampling methods that allow you to generalize to broad populations. In general, though, the sample should aim to be representative of the population as a whole. The larger and more representative your sample, the more valid your conclusions. Again, beware of various types of sampling bias as you design your sample, particularly self-selection bias , nonresponse bias , undercoverage bias , and survivorship bias .
There are two main types of survey: questionnaires and interviews.
Which type you choose depends on the sample size and location, as well as the focus of the research.
Sending out a paper survey by mail is a common method of gathering demographic information (for example, in a government census of the population).
Online surveys are a popular choice for students doing dissertation research , due to the low cost and flexibility of this method. There are many online tools available for constructing surveys, such as SurveyMonkey and Google Forms .
If your research focuses on a specific location, you can distribute a written questionnaire to be completed by respondents on the spot. For example, you could approach the customers of a shopping mall or ask all students to complete a questionnaire at the end of a class.
Oral interviews are a useful method for smaller sample sizes. They allow you to gather more in-depth information on people’s opinions and preferences. You can conduct interviews by phone or in person.
Like questionnaires, interviews can be used to collect quantitative data: the researcher records each response as a category or rating and statistically analyzes the results. But they are more commonly used to collect qualitative data : the interviewees’ full responses are transcribed and analyzed individually to gain a richer understanding of their opinions and feelings.
Next, you need to decide which questions you will ask and how you will ask them. It’s important to consider:
There are two main forms of survey questions: open-ended and closed-ended. Many surveys use a combination of both.
Closed-ended questions give the respondent a predetermined set of answers to choose from. A closed-ended question can include:
Closed-ended questions are best for quantitative research . They provide you with numerical data that can be statistically analyzed to find patterns, trends, and correlations .
Open-ended questions are best for qualitative research. This type of question has no predetermined answers to choose from. Instead, the respondent answers in their own words.
Open questions are most common in interviews, but you can also use them in questionnaires. They are often useful as follow-up questions to ask for more detailed explanations of responses to the closed questions.
To ensure the validity and reliability of your results, you need to carefully consider each question in the survey. All questions should be narrowly focused with enough context for the respondent to answer accurately. Avoid questions that are not directly relevant to the survey’s purpose.
When constructing closed-ended questions, ensure that the options cover all possibilities. If you include a list of options that isn’t exhaustive, you can add an “other” field.
In terms of language, the survey questions should be as clear and precise as possible. Tailor the questions to your target population, keeping in mind their level of knowledge of the topic. Avoid jargon or industry-specific terminology.
Survey questions are at risk for biases like social desirability bias , the Hawthorne effect , or demand characteristics . It’s critical to use language that respondents will easily understand, and avoid words with vague or ambiguous meanings. Make sure your questions are phrased neutrally, with no indication that you’d prefer a particular answer or emotion.
The questions should be arranged in a logical order. Start with easy, non-sensitive, closed-ended questions that will encourage the respondent to continue.
If the survey covers several different topics or themes, group together related questions. You can divide a questionnaire into sections to help respondents understand what is being asked in each part.
If a question refers back to or depends on the answer to a previous question, they should be placed directly next to one another.
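One lightweight way to implement the dependency described above — a follow-up question that is only shown when the answer it depends on makes it relevant — is simple skip logic. The question texts and field names below are invented for illustration; real survey tools express this with built-in branching rules.

```python
# Minimal skip-logic sketch: each question may declare a dependency on a
# previous answer; dependent questions are shown only when that answer matches.
questions = [
    {"id": "owns_car", "text": "Do you own a car? (yes/no)"},
    {"id": "car_type", "text": "What type of car do you own?",
     "depends_on": ("owns_car", "yes")},  # shown only after a 'yes'
    {"id": "commute", "text": "How do you usually commute?"},
]

def administer(questions, answers):
    """Return the ids of questions that should be shown, given answers so far."""
    shown = []
    for q in questions:
        dep = q.get("depends_on")
        if dep is None or answers.get(dep[0]) == dep[1]:
            shown.append(q["id"])
    return shown

print(administer(questions, {"owns_car": "no"}))   # car_type is skipped
print(administer(questions, {"owns_car": "yes"}))  # car_type is shown
```

Keeping the dependent question directly after its parent, as the text recommends, also makes this logic trivial to follow for both respondents and programmers.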
Before you start, create a clear plan for where, when, how, and with whom you will conduct the survey. Determine in advance how many responses you require and how you will gain access to the sample.
When you are satisfied that you have created a strong research design suitable for answering your research questions, you can conduct the survey through your method of choice – by mail, online, or in person.
There are many methods of analyzing the results of your survey. First you have to process the data, usually with the help of a computer program to sort all the responses. You should also clean the data by removing incomplete or incorrectly completed responses.
If you asked open-ended questions, you will have to code the responses by assigning labels to each response and organizing them into categories or themes. You can also use more qualitative methods, such as thematic analysis , which is especially suitable for analyzing interviews.
Statistical analysis is usually conducted using programs like SPSS or Stata. The same set of survey data can be subject to many analyses.
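The processing steps above — cleaning incomplete responses, coding open-ended answers, and computing a first statistic — can be sketched in plain Python. The responses and theme keywords here are invented examples, and keyword matching is only a crude stand-in for the careful manual coding the text describes.

```python
# Hypothetical raw responses; None marks an unanswered required question.
raw = [
    {"id": 1, "rating": 4, "comment": "The delivery was fast and friendly"},
    {"id": 2, "rating": None, "comment": "Too expensive for what you get"},  # incomplete
    {"id": 3, "rating": 2, "comment": "Shipping took far too long"},
]

# Step 1: clean — drop records with missing required fields.
clean = [r for r in raw if r["rating"] is not None and r["comment"]]

# Step 2: code — assign theme labels to open-ended comments by keyword.
themes = {"delivery": ["delivery", "shipping"], "price": ["expensive", "price"]}

def code_response(comment):
    text = comment.lower()
    return sorted(t for t, kws in themes.items() if any(k in text for k in kws))

for r in clean:
    r["themes"] = code_response(r["comment"])

# Step 3: a first descriptive statistic — the mean rating of clean responses.
mean_rating = sum(r["rating"] for r in clean) / len(clean)
print(mean_rating)         # → 3.0
print(clean[1]["themes"])  # → ['delivery']
```

In practice the cleaning and descriptive work is usually done in a statistics package such as SPSS or Stata, as the text notes; the logic, however, is the same.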
Finally, when you have collected and analyzed all the necessary data, you will write it up as part of your thesis, dissertation , or research paper .
In the methodology section, you describe exactly how you conducted the survey. You should explain the types of questions you used, the sampling method, when and where the survey took place, and the response rate. You can include the full questionnaire as an appendix and refer to it in the text if relevant.
Then introduce the analysis by describing how you prepared the data and the statistical methods you used to analyze it. In the results section, you summarize the key results from your analysis.
In the discussion and conclusion , you give your explanations and interpretations of these results, answer your research question, and reflect on the implications and limitations of the research.
A questionnaire is a data collection tool or instrument, while a survey is an overarching research method that involves collecting and analyzing data from people using questionnaires.
A Likert scale is a rating scale that quantitatively assesses opinions, attitudes, or behaviors. It is made up of 4 or more questions that measure a single attitude or trait when response scores are combined.
To use a Likert scale in a survey , you present participants with Likert-type questions or statements, and a continuum of items, usually with 5 or 7 possible responses, to capture their degree of agreement.
Individual Likert-type questions are generally considered ordinal data , because the items have clear rank order, but don’t have an even distribution.
Overall Likert scale scores are sometimes treated as interval data. These scores are considered to have directionality and even spacing between them.
The type of data determines what statistical tests you should use to analyze your data.
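Combining Likert-type items into an overall scale score, as described above, is a simple computation. The items and responses below are hypothetical; the code also flips reverse-keyed (negatively worded) items before summing, a common step assumed here rather than taken from the text.

```python
POINTS = 5  # a 5-point Likert scale: 1 = strongly disagree … 5 = strongly agree

def scale_score(responses, reverse_keyed=()):
    """Sum item responses into one scale score, flipping reverse-keyed items."""
    total = 0
    for i, r in enumerate(responses):
        if i in reverse_keyed:
            r = POINTS + 1 - r  # e.g. 5 -> 1 and 4 -> 2 on a 5-point scale
        total += r
    return total

# Four items measuring one attitude; item index 2 is negatively worded.
answers = [4, 5, 2, 4]
print(scale_score(answers, reverse_keyed={2}))  # 4 + 5 + (6 - 2) + 4 = 17
```

Treating the resulting totals as interval data for statistical tests is the debatable step mentioned above; the summing itself is uncontroversial.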
The priorities of a research design can vary depending on the field, but you usually have to specify:
McCombes, S. (2023, June 22). Survey Research | Definition, Examples & Methods. Scribbr. Retrieved September 5, 2024, from https://www.scribbr.com/methodology/survey-research/
Market research is a common practice used by companies to learn about customer behaviour and design suitable marketing campaigns. However, researching the market is not easy. To simplify the process, researchers can make use of research instruments. These are tools for collecting, measuring, and analysing data. Read along to learn what research instruments are used for and how they can be applied.
What are research instruments?
Research instruments are tools used for data collection and analysis. Researchers can use these tools in most fields. In business, they aid marketers in market research and customer behaviour study.
Some examples of research instruments include interviews, questionnaires, online surveys, and checklists.
Choosing the right research instrument is essential as it can reduce data collection time and provide more accurate results for the research purpose.
Data in research is a form of evidence. It justifies how marketers reach a decision and apply a particular strategy to a marketing campaign.
In research, marketers often collect data from various sources to produce and validate research results.
There are many examples of research instruments. The most common ones are interviews, surveys, observations, and focus groups . Let's break them down one by one.
The interview is a qualitative research method that collects data by asking questions. It includes three main types: structured, unstructured, and semi-structured interviews.
Structured interviews include an ordered list of questions. These questions are often closed-ended and elicit a yes, a no, or a short answer from the respondents. Structured interviews are easy to execute but leave little room for spontaneity.
Unstructured interviews are the opposite of structured interviews. Questions are mostly open-ended and are not arranged in order. The participants can express themselves more freely and elaborate on their answers.
Semi-structured interviews are a blend of structured and unstructured interviews. They are more organised than unstructured interviews, though not as rigid as structured interviews.
Compared to other research instruments, interviews provide more reliable results and allow the interviewers to engage and connect with the participants. However, they require experienced interviewers to draw the best responses from the interviewees.
Tools used in interviews may include:
Audio recorder (face-to-face interview)
Cam recorder & video conferencing tools (online interview)
Check out our explanation Interview in Research to learn more.
Survey research is another primary data collection method that involves asking a group of people for their opinions on a topic. However, surveys are often given out on paper or online rather than administered face-to-face.
An example is a feedback survey you receive from a company from which you just purchased a product.
The most common form of a survey is a questionnaire: a list of questions used to collect opinions from a group. These questions can be closed-ended or open-ended, with pre-selected answers or scale ratings. Participants can receive the same or alternate questions.
The main benefit of a survey is that it is a cheap way to collect data from a large group. Most surveys are also anonymous, making people more comfortable sharing honest opinions. However, this approach does not always guarantee a response since people tend to ignore surveys in their email inboxes or in-store.
There are many types of surveys, including paper and online surveys.
Check out our explanation of Survey Research to learn more.
Observation is another research instrument for marketers to collect data. It involves an observer watching people interacting in a controlled or uncontrolled environment.
An example is watching a group of kids playing and seeing how they interact, which kid is most popular in the group, etc.
Observation is easy to execute and can provide highly accurate results. However, these results may be subject to observer bias (the observer's opinions and prejudices), which lowers their fairness and objectivity. Also, some types of observation can be expensive.
Tools for observations can vary based on the research purpose and business resources.
Simple observations can be carried out without any tool. An example might be an observer "shopping along" with a customer to see how they choose products and which store section catches their eyes.
More complex observations can require special equipment such as eye-tracking and brain-scanning devices. Websites may also use heat maps to see which areas are most clicked by page visitors.
Check out our explanation of Observational research to learn more.
Focus groups are similar to interviews but include more than one participant. They are also a qualitative research method that aims to understand customers' opinions on a topic.
Focus groups often consist of one moderator and a group of participants. Sometimes, there are two moderators, one directing the conversation and the other observing.
Conducting focus groups is quick, cheap, and efficient. However, the data analysis can be time-consuming. Engaging a large group of people is tricky, and many participants may be shy or unwilling to give their opinions.
If focus groups are conducted online, tools like Zoom or Google Meet are often used.
Check out our explanation of Focus Groups to learn more.
Unlike the others, existing or secondary data is an instrument for secondary research. Secondary research means using data that another researcher has collected.
Secondary data can save a lot of research time and budget. Sources are also numerous, including internal (within the company) and external (outside the company) sources.
Internal sources include company reports, customer feedback, buyer personas, etc. External sources might include newspapers, magazines, journals, surveys, reports, Internet articles, etc.
Collecting existing data is relatively simple, though the sources need validating before use.
Check out our explanation of Secondary Market Research to learn more.
Research instrument design means creating research instruments to obtain the highest-quality, most reliable, and most actionable results. It is an intricate process that requires considerable time and effort from the researchers.
A few things to keep in mind when designing a research instrument 1 :
Validity means how well the instrument measures what it is intended to measure.
Reliability means whether the research method will produce similar results multiple times.
Replicability means whether other researchers can repeat the study and obtain similar results.
Generalizability means whether the research data can be generalised or applied to the whole population.
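Of these four qualities, reliability is the one most often quantified directly; internal-consistency statistics such as Cronbach's alpha are a standard way to do so. Below is a minimal sketch in pure Python (illustrative only; the function and variable names are ours, not from the text):

```python
from statistics import pvariance

def cronbach_alpha(item_scores):
    """Cronbach's alpha for internal consistency.

    item_scores: one list per questionnaire item, each holding that item's
    scores across the same set of respondents.
    """
    k = len(item_scores)
    if k < 2:
        raise ValueError("need at least two items")
    # Compare the sum of per-item variances with the variance of each
    # respondent's total score across all items.
    item_var_sum = sum(pvariance(scores) for scores in item_scores)
    totals = [sum(answers) for answers in zip(*item_scores)]
    return (k / (k - 1)) * (1 - item_var_sum / pvariance(totals))
```

Values close to 1 suggest the items measure the same underlying construct; a value around 0.7 is a commonly cited minimum for acceptable reliability.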
Here are some good practices for creating research instruments:
Good research always starts with a hypothesis. This is a proposed explanation based on the evidence the business currently has; further research is needed to test whether it holds.
Based on the hypothesis, the researchers can determine the research objectives:
What is the research's purpose?
What result does it try to measure?
What questions to ask?
How to know the results are reliable/actionable?
"To be prepared is half the victory". Preparation means designing how researchers will carry out the research. This may include creating questions and deciding on what tools to use.
Survey research design might include creating questions that are simple to understand and do not include biased language. The researcher can also use typography, spacing, colours, and images to make the survey attractive.
The person carrying out the research may not be the same person who designed it. To ensure smooth implementation, an important step is to create a guideline.
For example, when using interviews in research, the researcher can also create a document that provides a focus for the interview. This is simply a document that defines the structure of the interview - what questions to ask and in which order.
Interviewer bias can arise when the researcher/observer/interviewer interacts directly with the participants. It means letting the interviewer's viewpoints and attitudes affect the research outcome; for example, the interviewer reacts differently around different interviewees or asks leading questions.
When designing research instruments, researchers should keep this in mind and leave out questions that might lead respondents toward the answers the interviewer favours.
To avoid mistakes, the researcher can first test the instrument on a small sample before applying it to a large group. This is extremely important, especially in large-scale data collection methods like questionnaires, where a minor error can render the whole process futile. A good practice is to ask a team member to proofread the survey questions to spot any errors or inaccuracies.
After testing, the next task is to apply the instrument to the target group. The response rate is a crucial KPI for judging the research's reliability: the higher the response rate, the more reliable the results. However, other factors, such as the depth of the answers, are also important.
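The response-rate KPI is a simple ratio; a minimal sketch (the function name is illustrative):

```python
def response_rate(responses_received, instruments_distributed):
    """Return the response rate as a percentage."""
    if instruments_distributed <= 0:
        raise ValueError("instruments_distributed must be positive")
    return 100.0 * responses_received / instruments_distributed
```

For example, 150 completed questionnaires out of 200 distributed gives a 75% response rate.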
Quantitative research means collecting and analysing numerical data. This kind of research helps spot patterns and trends to make predictions or generalise results to the whole population. Research instruments in quantitative research include surveys, questionnaires, telephone interviews, and structured interviews.
The main component of a survey is the questionnaire: a list of questions used to collect data from a large group. In survey research, the questions are primarily closed-ended or use rating scales to collect data in a unified fashion.
The reliability of survey results greatly depends on the sample size: the larger the sample, the higher the validity, though larger samples are not cheap to collect.
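The sample-size trade-off can be made concrete with Cochran's formula for estimating a proportion, n = z²·p·(1 − p) / e². This is a standard statistical result rather than something stated in this article; the defaults below assume a 95% confidence level (z ≈ 1.96), maximum variability (p = 0.5), and a 5% margin of error:

```python
import math

def cochran_sample_size(z=1.96, p=0.5, e=0.05):
    """Minimum sample size to estimate a proportion p with margin of
    error e at the confidence level implied by z (1.96 ~ 95%)."""
    return math.ceil(z * z * p * (1 - p) / (e * e))
```

With the defaults this gives 385 respondents; tightening the margin of error to 3% pushes the requirement past 1,000, which illustrates why larger, more valid samples are costly to execute.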
Surveys involve limited interviewer bias and error. However, the refusal rate is high, as few people are willing to write down their answers.
Questionnaires as a research instrument can be self-administered or with interference from the researcher.
Self-administered questionnaires are ones completed in the absence of the researcher. 2 The respondent fills out the questionnaire themselves, hence the term "self-administered". Self-administered surveys allow participants to keep their anonymity and be more comfortable sharing their opinions, and researcher bias is largely removed. The only drawback is that the researcher cannot track who fills out the questionnaire or when they will return it.
Questionnaires with interference from the researcher are primarily found in focus groups, interviews, or observational research. The researcher hands out the questionnaire and remains present to help the respondents fill it in, answering questions and clearing up any uncertainties the respondent might have. This type of questionnaire carries more risk of researcher bias but tends to yield higher-quality responses and a higher response rate.
The telephone is another research instrument for quantitative research. It is based on random sampling and also has low interviewer bias. However, phone calls tend to be short (less than 15 minutes), giving interviewers little time to collect in-depth information. Customers can also hang up when they are distracted by something else.
Most interviews are qualitative in nature, but some are quantitative, especially those carried out in a structured manner. An example is the structured interview, which includes closed-ended questions arranged in a specific order.
What instruments are used to collect quantitative data?
Instruments used to collect quantitative data include surveys, telephone interviews, and structured interviews.
What is a questionnaire as a research instrument?
Questionnaires are lists of questions used to gather data from the target group. They are mainly used in surveys to collect quantitative data.
What are research instruments for data collection?
There are many research instruments for data collection. The most popular are interviews, surveys, observations, focus groups, and secondary data. Different research instruments can be used depending on the type and purpose of the research.
What are research instrument examples?
Some research instrument examples are surveys, interviews, and focus groups. Surveys can be used to collect quantitative data from a large group while interviews and focus groups gather qualitative data from a smaller group of participants.
What is instrument design in research?
Research instrument design means creating research instruments to obtain high-quality and reliable research data. Good research instruments should satisfy four qualities: validity, reliability, replicability, and generalizability.
Sustainability has emerged as one of the most critical factors influencing the competitiveness of maritime shipping ports. This emergence has led to a surge in research publications on port sustainability-related topics. However, despite the increasing awareness and adoption of sustainability practices, documented literature on empirical studies with survey and interview data is very limited. Moreover, validated instruments to objectively assess sustainability through sustainability practices for shipping ports in India are yet to be traced. This study contributes by validating an instrument to objectively evaluate sustainability practices in shipping ports through a four-stage process: item identification based on an extensive literature review, instrument evaluation by subject matter experts, assessment of the instrument with suitable content validation indices, and finally evaluation of the validity and reliability of the hypothesized theoretical model. For content validation, the Content Validity Index, Cohen's Kappa coefficient, and Lawshe's Content Validity Ratio were computed from the assessments of a subject matter expert panel comprising six members from the port industry as well as academicians and researchers in the field of shipping ports. The content-validated instrument was administered to a sample of 200 officer-category port employees. The measurement model was evaluated and validated using Confirmatory Factor Analysis to assess the extent to which the measured variables represent the theoretical construct of the study and to ascertain the factor structure. The empirically validated instrument met the required guidelines for model fit, reliability, and construct validity measures and was confirmed as a model for measuring sustainability practices in shipping ports.
Structural Equation Modeling methodology was adopted to explain the variance and the path relationship between the higher-order and lower-order constructs of sustainability. The results indicate that the economic dimensions are the major contributors to the overall sustainability of the port as they drive investments in environmental and social dimensions, leading to overall sustainable development. The study’s findings will be helpful for researchers, academicians, policymakers, and industry practitioners working towards sustainability practices that contribute to sustainable growth and development in the shipping industry.
Sustainability has increasingly been considered one of society's and industry's most significant focus areas [ 1 ], along with regulatory bodies, in recent times, although the concept is not entirely new [ 2 ]. Sustainable development is generally defined as development that “satisfies the needs and wants of the present generation without compromising the future generation's needs and aspirations” [ 3 ]. The challenge, moreover, is balancing sustainability and economic growth [ 4 ]. This prerequisite has led organizations to look beyond mere economic performance and build social and environmentally friendly business models by integrating sustainability principles and practices [ 5 , 6 , 7 ], targeting competitive advantages [ 8 , 9 ]. Against this backdrop, Lun et al. [ 3 ] highlighted the importance of shipping ports in a country's long-run sustainable growth and development, as they generate employment along with export–import trade. It is widely accepted that port-led economic growth and development have been the backbone of many developed and developing countries. For instance, the Indian maritime sector, one of the most promising, dynamic, emerging markets in the world, has received a facelift with “ Sagarmala ”, one of the government's significant mega-initiatives, which is focused on “port-led economic development” [ 10 ]. About 95% of India's overall goods trade volume moves through shipping ports, contributing about 14% of GDP [ 11 ]. Subdued port performance is reflected in a country's economic development [ 12 ]. Shipping ports are vital nodes that link other modes of transportation in global trade and are considered strategic assets demanding significant attention in maritime and transportation research and practice [ 4 , 13 , 14 ].
Lee et al. [ 15 ] flagged the concern that, unlike the aviation and road transport sectors, the shipping, port, and maritime industries have given less attention to sustainability. Many studies [ 14 , 16 , 17 , 18 , 19 ] have emphasized sustainability as one of the most crucial aspects influencing the competitiveness and long-term sustenance of shipping ports; however, it is not fully incorporated into the strategic, long-term decisions made about port development and management. Further, coastal lines are densely populated, leading to higher levels of economic activity and rapid urbanization; however, they also face the environmental, economic, and social consequences that come as byproducts of development [ 14 , 20 ]. Some of the substantial problems discussed in many port studies [ 14 , 18 , 19 , 21 , 22 , 23 , 24 , 25 ] include depletion of the marine ecosystem and biodiversity due to dredging and reclamation works; water pollution due to ballast operations; oil spillage during ship anchoring and cargo operations; wastewater spillage from ships; air pollution due to various pollutants and particulate matter; dust and smoke pollution from heavy vehicles; climate change effects; increased energy consumption for operations and the cost of energy; uncertainty in the future economic returns of investment in the port; employment and diversity of jobs; employee productivity; displacement of the local community along with the impact on their livelihood; loss of agricultural land; steep increases in the cost of living and land revenue rates due to swift urbanization; industrial and special economic zone development; inclusivity of the community in developmental projects; safety and security in the port vicinity; the social working environment; trade union interference; and many more.
These challenges, complemented by the growing importance and focus on the sustainable development of shipping ports, have led to a surge in research publications on sustainability-related topics that concentrate on environmental, economic, and social aspects [ 26 , 27 ].
A review of the literature shows that extant studies related to port sustainability focus more on quantifying and measuring various dimensions of sustainability against benchmarks and on developing indexes for multiple dimensions of sustainability. However, qualitative studies are needed to understand the extent to which sustainability measures are adopted and how those measures interact, pointing towards empirical data-driven studies [ 27 , 28 ]. Alamoush et al. [ 2 ], in their port sustainability framework development study, found that only 16 percent of the articles published were empirical and based on questionnaire surveys and personal interview data; the majority, 40 percent, were conceptual and theoretical reviews, and the remainder was equally distributed among simulation and case studies. Empirical data-driven research on sustainability-related topics and port performance will be critical to the growing body of knowledge [ 28 ]. Further, the empirical studies on port sustainability [ 2 , 27 , 29 , 30 , 31 , 32 , 33 ] have adopted various indicators of sustainability and criteria for sustainability evaluation based on available scales from studies not directly related to ports, context-adjusted for the port industry. Moreover, despite the increasing number of empirical studies on sustainability in the port sector, and in line with the claim of Ayre et al. [ 34 ] that researchers hardly ever report content validation, no content validation process or validated instrument for objectively measuring sustainability and sustainability practices in shipping ports is traceable in the extant literature, especially for a rapidly developing economy like India. The study of Ashrafi et al. [ 17 ], which reported a pilot study for validation, only assessed the perception of port sustainability in the US and Canada through an online survey to identify the primary factors and challenges in adopting and implementing sustainability strategies. However, the instrument remained generic, capturing overall sustainability barriers and influencing factors, and lacked a specific macro-level assessment of the three dimensions of sustainability.
Several studies [ 35 , 36 , 37 ] discuss the importance of content validity in determining whether the measurement items used in a tool satisfactorily represent and address the domain of interest, along with their relevance when measured. Thus, the need for precise, validated measurement tools for sustainability practices at shipping ports indicates a critical knowledge gap in the existing literature. It should also be noted that most seaport-related studies in the scholarly databases concentrate on specific geographical areas in Europe [ 32 , 38 ] rather than on leading and growing economies like India. Another concern is that although sustainability is a widely discussed topic, there is still no single universally accepted and established definition of sustainability [ 39 , 40 , 41 ]. According to Maletič et al. [ 40 ], even though many have attempted to define and measure sustainability, there is an ongoing debate in the literature [ 42 ] about the multiple ways of measuring sustainability practices. Therefore, there is a vital need for clarity and substantial justification on the dimensions and indicators that define the sustainability construct, and for standardizing the assessment of sustainability to a greater extent, especially for seaports. Beyond developing policies and schemes for sustainability, implementation is essential for guaranteed sustainable development [ 28 , 43 ], and measuring the extent to which a port has focused on various sustainability practices can serve as a tool to assess its efforts towards the sustainable development of the shipping port. Alamoush et al. [ 2 ] also pointed out the lack of studies linking port sustainability actions with the three sustainability dimensions represented by the three sustainability practices.
Considering this existing crucial gap and challenges discussed above, the novel contribution of this study is a validated instrument for assessing the sustainability practices followed at shipping ports covering the dimensions of sustainability. The measuring instrument can act as a guideline for seaport administrators and stakeholders to evaluate sustainability in shipping ports and develop seaport sustainability policy for sustainable maritime growth and sustainable development, specifically for an empirical and objective evaluation of sustainability practices adopted in shipping ports in India. Given this compelling necessity to have a content and construct validated instrument for sustainability assessment and strategy development, specifically in the context of Indian seaports, this study aims to explore, design, and develop an instrument for objectively assessing sustainability practices in shipping ports through a well-established content validation process. To achieve the aim of the study, the objectives are:
To identify the comprehensive list of dimensions of sustainability practice for shipping ports through an extensive literature survey
To validate the content of the measurement tool using globally accepted content validation indices, viz. the Content Validity Index, Cohen's Kappa coefficient, and Content Validity Ratio
To ascertain the factor structure of the measurement model using Confirmatory Factor Analysis
To estimate the relationship and contribution of three dimensions of sustainability practices to the higher-order sustainability practices construct
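The content-validation indices named in the objectives above reduce to simple ratios. As an illustrative sketch (not a reproduction of the study's computations): Lawshe's CVR for an item is (nₑ − N/2) / (N/2), where nₑ of N experts rate the item "essential", and the item-level CVI is the proportion of experts rating the item relevant (typically 3 or 4 on a 4-point scale):

```python
def lawshe_cvr(n_essential, n_experts):
    """Lawshe's Content Validity Ratio for a single item."""
    half = n_experts / 2
    return (n_essential - half) / half

def item_cvi(ratings, relevant=(3, 4)):
    """Item-level Content Validity Index: the share of experts who rate
    the item relevant on a 4-point scale."""
    return sum(r in relevant for r in ratings) / len(ratings)
```

With a six-member expert panel such as the one used in this study, an item that all six deem essential scores CVR = 1.0, while an even split scores 0.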
The remainder of the study is structured into three major sections. The next section explores the theoretical foundations of the sustainability construct and the related conceptual framework that shapes shipping ports' sustainability dimensions. The subsequent section covers the research methodology for achieving the study's objectives: it outlines the steps followed in item identification through literature review, instrument development, instrument assessment based on globally accepted indices, and measurement model evaluation for validity and reliability using confirmatory factor analysis, leading to Partial Least Squares-Structural Equation Modeling (PLS-SEM) methods for estimating and predicting the relationships among the variables. Finally, the results are critically discussed, with the findings of the study highlighting the implications, leading to a conclusion along with future research directions.
Although certain sustainability practices are compelled by regulatory compliance, organizations are encouraged to adopt and engage voluntarily and proactively in other sustainability practices to meet the needs of the broader society within which they operate [ 44 ]. Extant studies on sustainability [ 7 , 14 , 16 , 17 , 18 , 45 , 46 ] have discussed the need to integrate sustainability efforts into organizational goals, processes, and initiatives and to link them with organizational strategy, without which the efforts are likely to fail. The goal for any firm is to secure a competitive advantage over its competitors, create wealth, capture the highest possible market share, and add value to stakeholders while maintaining a balance between sustainability and economic growth [ 4 ]. Studies [ 16 , 47 ] have held sustainability to be one of the most crucial aspects influencing the competitiveness of ports. Moreover, to achieve and sustain competitive advantage, organizations have been increasingly grappling with sustainability practices [ 8 ]. Simpson et al. [ 48 ] define practices as “the customary, habitual, or expected procedure or way of doing something.” The practices focused on sustainability differ from industry to industry; the shipping industry concentrates on various practices, with relevant systems in place to support sustainable growth and development. Kang et al. [ 31 ] highlight best practices that embrace sustainability and suggest many practices related to operations, resource optimization, safety and security, finance, risk, infrastructure upgrading, stakeholder management, environmental management systems, and the port's eco-friendly and social work environment.
Discussions in prior studies indicate mixed responses regarding the definition of sustainability. There is no universally acceptable definition [ 39 , 42 , 49 , 50 ], but a more generic definition emphasizes sustainability as the set of business strategies, policies, and associated practices or activities in which the requirements of the present are satisfied without impacting the requirements of the future, in the best interests of the port and related stakeholders. Different schools of thought generally agree that sustainability encompasses three significant dimensions, popularly termed the triple bottom line (TBL) dimensions of “economic, social, and environmental practices,” which together comprise the broad framework of sustainable development [ 39 , 49 , 50 , 51 ]. Elkington [ 51 , 52 ] introduced the triple-bottom-line (TBL) approach incorporating these three interrelated dimensions of “environmental sustainability, economic sustainability, and social sustainability,” advocating that organizations adopt the TBL approach for long-term success [ 53 , 54 , 55 ] rather than short-term success focused only on the economic dimension. The TBL aspects are considered the critical dimensions of sustainability [ 56 ]. Environmental dimensions concentrate on policies, initiatives, and practices that promote environmental management. In contrast, economic dimensions focus on policies, initiatives, and practices related to investments, economic benefits, and returns from those investments [ 57 ]. Social sustainability focuses on policies, initiatives, and practices that promote the overall improvement of society at large, including all other stakeholders [ 58 ]. Bansal [ 59 ] asserted that the three pillars, i.e., environmental integrity, economic prosperity, and social equity, should intersect for sustainability. Alamoush et al. [ 2 ] further related these TBL dimensions to the planet, profit, and people as synonyms for environmental, economic, and social sustainability.
In the context of ports-related studies, various environmental, economic, and social dimensions were adopted to assess the sustainability of the ports using different methodologies [ 2 , 27 ]. Oh et al. [ 29 ] adopted the importance-performance analysis technique to evaluate the sustainability of South Korean ports using 27 vital measures of the sustainability of ports adapted from the findings and discussions of previous research and found that those measures are essential from a port sustainability point of view. Their study classified the indicators of port sustainability in the three dimensions of sustainability as opined in the TBL concept. In contrast to this empirical quantitative approach, Vejvar et al. [ 33 ] adopted a case-study-based approach to study the institutional forces that compel organizations to adopt sustainability practices. However, they adopted open-ended questions to probe the sustainability practices adopted in the selected shipping ports. They performed a cross-case analysis to make the study more generalizable and increase the validity of the findings [ 32 ]. A thematic analysis of the sustainability performance of seaports was conducted, followed by semi-structured interviews. Later, a fuzzy analytical hierarchy process was applied to compute the weight for each port sustainability performance indicator. Their study also categorized the indicators into three dimensions of sustainability performance, namely social, environmental, and economic sustainability performance practices. Another multi-dimensional framework of sustainability practices was empirically tested by Maletič et al. [ 40 ], and they defined sustainability exploitation and exploration as two different sustainability practices. 
According to them, sustainability exploitation practices aim at incremental improvement in organizational processes, while sustainability exploration challenges current practices with innovative concepts, developing competencies and capabilities for sustainability. However, they also acknowledged the suitability of more objective measures, such as the TBL practices, for sustainability studies. Sustainability practices aid organizations in developing opportunities while managing the three dimensions of organizational processes (economic, environmental, and social aspects) in value creation over the long term [ 51 ]. In that definition, profitability is the focus of economic sustainability, protection of and concern for the environment is the focus of environmental sustainability [ 60 , 61 ], and social sustainability focuses on sustained relations with all stakeholders, including suppliers, customers, employees, and the community [ 62 ].
Regarding index development for sustainability, Laxe et al. [ 43 ] developed the “Port Sustainability Synthetic Index” covering economic and environmental indicators using a sample of 16 ports in Spain. Molavi et al. [ 25 ] developed a novel framework for a smart port index for achieving sustainability using key performance indicators (KPIs) that can assist in strategy development and policy framing. Their study indicated several sub-domains of the environmental domain in the smart port index, along with other domains such as operations, energy, safety, and security; however, it covered environment-related quantitative KPIs without explicitly addressing the economic and social dimensions, although some sub-domains can be related to them. In contrast, Stanković et al. [ 63 ] developed a novel composite index for assessing the sustainability of seaports covering environmental, economic, and social dimensions through indicators based on secondary data available in the Eurostat and the Organization for Economic Co-operation and Development databases. However, their environmental dimension captured air pollution particulate matter as the only indicator, and their study acknowledged the limitation of not covering many indicators, including social inclusion and waste management, due to database unavailability. Such limitations of the quantitative data available in secondary databases make index construction challenging: data collection across ports is not yet standardized, and the diverse types of cargo handled make an index difficult to adapt universally. Mori et al. [ 64 ] held a similar opinion, advising against a synthesized composite index due to the chance of offsets in evaluation [ 65 ].
Although indices have benefits, the limited availability of standardized data for computing them is a further concern that restricts their adoption for benchmarking and assessment; sustainability indices should therefore be adopted with caveats.
Therefore, following the justifications and theoretical foundations discussed above, this study is grounded in sustainability theory as framed by the TBL view, which incorporates the three interrelated dimensions of sustainability (“environmental sustainability, economic sustainability, and social sustainability”) and the relevant sustainability practices of shipping ports. Based on previous studies of sustainability and sustainability practices, this study considers three constructs: environmental sustainability practices (EnvSP), economic sustainability practices (EcoSP), and social sustainability practices (SocSP). The indicators thus identified serve as measurement scales to empirically measure, through survey instruments, and objectively evaluate the sustainability practices adopted in shipping ports in India.
The authors adopted the content validation process prescribed by Barbosa et al. [ 35 ]. The process starts with item identification based on an extensive literature review, followed by instrument assessment by subject matter experts and instrument evaluation using suitable content validation indices. This was followed by assessing the validity and reliability of the hypothesized model to confirm the theory established in the literature using Confirmatory Factor Analysis (CFA) [ 66 ]. CFA is a widely adopted statistical technique that helps determine the underlying structure among a set of latent variables and confirm how reliably the scale measures the proposed concept. Hair et al. [ 67 ] describe CFA as a technique for assessing the contribution of each scale item to its latent variable; these contributions can later be incorporated into the estimation of the relationships in the structural model, while accounting for the associated measurement error, using the variance-based Partial Least Squares Structural Equation Modelling (PLS-SEM) framework. The primary focus of PLS-SEM is explaining the relationships between the exogenous and endogenous variables and predicting the variation in those relationships. The five-stage procedure adopted in the study is shown in the flow chart in Fig. 1 below.
Fig. 1 Content validation process for the study (Author’s own)
In the first stage, an extensive review of relevant articles related to port studies was performed to compile a comprehensive list of items for the three dimensions of sustainability, viz. environmental, economic, and social sustainability practices, with the help of a keyword search of the Scopus database in the context of shipping ports. Multiple iterations with different combinations of keywords were performed to gauge the diversity of articles traceable in the scholarly database, and the final set of keywords, [(“sustainability”) OR (“sustainability practices”)] AND [(“shipping ports”) OR (“maritime ports”) OR (“container ports”)] AND (scale OR items OR measurement OR indicators OR SEM), was adopted for article identification. This was followed by screening the articles for item identification and compiling the list of items related to sustainability practices. The final set of related articles was critically reviewed to identify the relevant items for sustainability practices at shipping ports. Following the recommendation of Boateng et al. [ 68 ], face validation of the instrument was conducted with review and inputs from two senior academicians and experts with theoretical and practical knowledge of sustainability practices.
The second stage of the content validation process involved selecting a subject matter expert panel to perform the instrument assessment. Experts typically evaluate content validity, and the recommended panel size ranges from a minimum of three to a maximum of 10 experts [ 68 , 69 , 70 ]. Following this requirement, a panel comprising six experts from academic and port-industry backgrounds assessed and validated the items. Barbosa and Cansino [ 35 ] note that no unique formula or approach for selecting an expert panel exists, but they point out the need for a heterogeneous panel to mitigate the risk of bias in the validation. The study therefore included subject experts from the port industry and academicians with experience in port-related research.
The panel of experts assessed and content-validated both the relevance and the essentiality of the items identified through the literature review. For content validation and evaluation, the Content Validity Index (CVI), Cohen’s Kappa coefficient, and Lawshe’s Content Validity Ratio (CVR) were adopted, as they are the most widely used content validation tools for quantifying expert opinion [ 69 , 71 , 72 , 73 , 74 ]. The items were assessed for relevance on a 4-point Likert scale, with responses captured as “1 = not relevant, 2 = somewhat relevant, 3 = quite relevant, and 4 = very relevant” for every item in the measurement instrument. The items were also assessed for essentiality on a 3-point Likert scale, with responses captured as “1 = not essential; 2 = useful, but not essential; and 3 = essential”. In addition, a comments column was provided for each item so that the experts could add feedback and remarks.
Following the recommendation of [ 75 ], the validity of the instrument content was assessed using the CVI, Cohen’s Kappa coefficient, and CVR indices. The CVI is a straightforward computation of the agreement among the panelists and can be computed at both the individual item level and the overall scale level [ 71 , 72 , 73 , 76 ]. Accordingly, the I-CVI is the validity index for each item of the study’s constructs, whereas the S-CVI applies to the overall scale and is calculated as the average of the I-CVIs. The I-CVI is the number of panelists rating an item 3 or above on the relevance scale divided by the total number of panelists evaluating relevance; the S-CVI is then obtained by averaging these item-level values across all measurement items in the assessment tool. To complement and strengthen the assessment of relevance through the CVI, Barbosa and Cansino [ 35 ] highlight the benefit of Cohen’s Kappa coefficient for content validation: it considers the degree of agreement on a measurement item beyond chance, correcting for the probability that agreement is inflated merely by chance. The formula to compute Cohen’s Kappa coefficient is as follows:
$$\kappa = \frac{\text{I-CVI} - P_c}{1 - P_c}$$

where \(N\) denotes the total number of subject matter experts, \(A\) denotes the number of experts rating the item as relevant, and \(P_c\), the probability of chance agreement, is computed as:

$$P_c = \binom{N}{A} \times 0.5^{N}$$
According to Lawshe (1975), the CVR index can be computed using the formula:

$$\text{CVR} = \frac{n_e - N/2}{N/2}$$

where \(n_e\) is the number of experts rating the item as “essential” and \(N\) is the total number of experts.
The formulas described above were entered in a spreadsheet for the computation of CVR, Cohen’s Kappa coefficient, and CVI based on the rating given by the experts for the items identified. The scale of relevance and necessity of items marked by each expert was recorded and coded into the spreadsheet to facilitate the computation of indices for every item and the entire scale.
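To illustrate the spreadsheet computation, the three indices can be sketched in Python. This is a minimal sketch: the function names and the six-expert ratings below are hypothetical, not values from the study.

```python
from math import comb

def i_cvi(relevance_ratings, threshold=3):
    """Item-level CVI: proportion of experts rating the item at or above
    `threshold` on the 4-point relevance scale."""
    agree = sum(1 for r in relevance_ratings if r >= threshold)
    return agree / len(relevance_ratings)

def modified_kappa(relevance_ratings, threshold=3):
    """Chance-adjusted agreement: kappa = (I-CVI - Pc) / (1 - Pc),
    with Pc = C(N, A) * 0.5**N for N experts, A of them in agreement."""
    n = len(relevance_ratings)
    a = sum(1 for r in relevance_ratings if r >= threshold)
    pc = comb(n, a) * 0.5 ** n
    return (i_cvi(relevance_ratings, threshold) - pc) / (1 - pc)

def cvr(essentiality_ratings):
    """Lawshe's CVR = (ne - N/2) / (N/2), where ne is the number of
    experts rating 'essential' (3 on the 3-point essentiality scale)."""
    n = len(essentiality_ratings)
    ne = sum(1 for r in essentiality_ratings if r == 3)
    return (ne - n / 2) / (n / 2)

# Hypothetical six-expert ratings for a single item
relevance = [4, 3, 4, 4, 3, 4]   # 4-point relevance scale
essential = [3, 3, 3, 3, 3, 3]   # 3-point essentiality scale

print(i_cvi(relevance))                     # 1.0
print(round(modified_kappa(relevance), 2))  # 1.0
print(cvr(essential))                       # 1.0

# S-CVI: average of the I-CVI values over all items in the scale
items = [[4, 3, 4, 4, 3, 4], [4, 3, 3, 4, 2, 4]]
s_cvi = sum(i_cvi(it) for it in items) / len(items)
print(round(s_cvi, 2))                      # 0.92
```

With unanimous ratings all three indices reach their maximum of 1.0, which matches the intuition behind the cut-offs used later in the validation.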
3.4.1 Sampling and data collection
The content-validated questionnaire instrument was administered online through Microsoft Forms, as well as offline, to mid- and senior-management (Officer Category) employees of various major ports located on both the western and eastern coastal belts of India, in order to collect data for testing the validity and reliability of the measurement model. The instrument captured the respondents’ demographics and their perceptions of how strongly their port focuses on the three pillars of sustainability practices. Many authors recommend determining sample size in business management and social science research using G*Power [ 77 , 78 , 79 , 80 ]. For a medium effect size of 0.15, a 5% significance level, and a statistical power of 80%, the recommended minimum sample size was 166. Further, Hair et al. [ 67 ] suggest a minimum sample size of 150 for a model structure with seven or fewer constructs. However, to avert any possible statistical loss, the sample size was increased by 20 percent over 166, establishing the required sample size for the study at 200. The respondents indicated their level of agreement with each item on a Likert scale of 1 to 5, with 1 indicating “Strongly Disagree” and 5 indicating “Strongly Agree”. Data collection was carried out between January and December 2023, until the required number of usable responses was received for further analysis.
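The oversampling step can be reproduced in a couple of lines; the G*Power minimum of 166 is the value reported in the text, and the 20% buffer is the study's own choice.

```python
import math

# Minimum sample size reported in the text for effect size 0.15,
# 5% significance level, and 80% power (from G*Power).
g_power_minimum = 166

# The study adds a 20% buffer against unusable responses.
target_sample = math.ceil(g_power_minimum * 1.20)
print(target_sample)  # 200
```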
Confirmatory factor analysis (CFA) was performed with IBM AMOS to confirm the factor structure of the sustainability practices dimensions using the collected sample data. In contrast to measurement error, reliability is the indicator of the “degree to which the observed variable measures the true value and is error-free” [ 66 ]. It is also an “assessment of the degree of consistency between multiple measurements of a variable and the set of variables being measured.” Model fit, reliability, and construct validity indices were assessed based on the recommendations of [ 81 , 82 ]. Construct reliabilities were evaluated using Cronbach’s alpha, composite reliability, and Average Variance Extracted (AVE) values. Construct validity was assessed using convergent and discriminant validity measures.
The SEM methodology facilitates the indirect measurement of unobserved latent variables through indicator items for the variables in the model structure [ 82 ]. It assesses how the latent variables in the model relate to one another and accounts for errors in the measurement of the observed variables. We therefore adopted the PLS-SEM technique to estimate the relationships between the three dimensions of sustainability and their contributions to the overall sustainability construct through the item indicators for each dimension. Further, as per the recommendation of Hair et al. [ 83 ], the variance-based PLS-SEM framework is more suitable for our study because the sample size is comparatively small and the normality assumption is not critical, given the innate nature of the items measuring the dimensions of sustainability. To specify the model parameters and estimate the relationships between the higher- and lower-order constructs, we used SmartPLS [ 84 ], the most popular tool for PLS-SEM.
In the first stage of the analysis, items were identified through an extensive literature review and face validation; this was followed by instrument assessment by subject matter experts and instrument evaluation with suitable content validation indices. The validity and reliability of the hypothesized model were then assessed to confirm the theory using CFA, leading to path relationship assessment using the PLS-SEM methodology.
The literature review compiled a comprehensive initial list of 48 items as indicators of sustainability practices adopted in shipping ports, together with their sources (refer to Appendix 1 ). The initial list comprised 17 items as indicators of environmental sustainability practices, 19 as indicators of economic sustainability practices, and 12 as indicators of social sustainability practices adopted in shipping ports. The initial draft of the measuring instrument was subjected to face validation. The inputs from the two senior academicians and experts who carried out the face validation were incorporated; these included eliminating ambiguous terms, including indicators missing from the initial list, rephrasing sentences for a better understanding of the study context, and finalizing the layout [ 68 ]. After incorporating these corrections, the items were compiled, along with their sources, in the measurement instrument for content validation in the next stage.
In the second stage, the six selected subject matter experts conducted the content validation of the face-validated instrument. Content validity is a subjective approach that evaluates the extent to which the content described through a scale measures the factors of study interest; content validation evaluates whether the items in the questionnaire instrument are clear, readable, and relevant to the study context [ 85 ]. The relevance as well as the essentiality of the items were assessed and validated through the instrument assessment by the panel of subject matter experts. The assessment tool for the study instruments was administered to the six impaneled subject experts. Table 1 summarizes the profile of the experts who participated in validating the questionnaire items.
CVI and Kappa coefficients were calculated to assess the relevance of the items, and CVR was calculated to determine the items’ essentiality for the study context [ 74 ]. The responses of the experts on the item’s relevancy and essentiality were coded to spreadsheets for computation of CVI, Kappa coefficient, and CVR value as per the respective formulas [ 69 , 71 , 72 , 73 , 74 ]. The results after the computation of CVI, Kappa coefficient, and CVR are consolidated in Appendix 2 .
The CVI indicates the proportion of experts who agreed on the tool and the measurement items for a given construct, treating ratings of 1 and 2 as invalid and ratings of 3 and 4 as valid and consistent with the study’s conceptual framework [ 74 ]. Adopting the cut-offs suggested in previously established studies [ 69 , 74 ], items with a CVI of at least 0.84 were accepted in the validation. The S-CVI of the validation tool was 0.86, satisfying the minimum requirement of 0.80 per Shrotryia et al. [ 86 ] for an instrument to be considered content valid. Along with that, the Cohen’s Kappa coefficient was computed with a cut-off of 0.74 to avoid errors due to chance agreement by the expert panel [ 71 , 72 , 73 , 75 , 76 ]. According to Lawshe’s benchmark, the cut-off CVR value prescribed for a panel size of six is 0.99, indicating agreement among the panel judges on an item’s necessity in the study questionnaire. Based on these inclusion criteria for CVI, Kappa, and CVR, the most essential and relevant items were shortlisted, and the final questionnaire was administered for construct validation. The finalized instrument for measuring the extent of the adoption of sustainability practices in shipping ports is shown in Appendix 3 .
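The inclusion rule can be sketched as a simple filter. The cut-offs are those reported above; the item labels and index values are illustrative only, not the study's actual Appendix 2 results.

```python
# Hypothetical item-screening sketch applying the reported inclusion
# cut-offs for a six-expert panel: I-CVI >= 0.84, Kappa >= 0.74, CVR >= 0.99.
CUTOFFS = {"i_cvi": 0.84, "kappa": 0.74, "cvr": 0.99}

items = [
    # (item id, I-CVI, Kappa, CVR) -- illustrative values only
    ("EnvSP1", 1.00, 1.00, 1.00),
    ("EcoSP4", 0.83, 0.82, 0.67),  # fails the I-CVI and CVR cut-offs
    ("SocSP2", 1.00, 1.00, 1.00),
]

retained = [
    name
    for name, icvi, kappa, cvr in items
    if icvi >= CUTOFFS["i_cvi"]
    and kappa >= CUTOFFS["kappa"]
    and cvr >= CUTOFFS["cvr"]
]
print(retained)  # ['EnvSP1', 'SocSP2']
```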
Empirical studies attempt to validate and justify the research framework with the help of primary data collected from respondents through a questionnaire instrument. Since the analysis depends solely on the data collected through the instrument, and those data are not accurate measurements of the factors of interest but observations of the respondents’ perceptions, the questionnaire should be subjected to validation and reliability checks [ 85 ]. The validation and reliability-checking procedures aim to measure and address the measurement error caused by the difference between the actual scores and the measured or observed scores [ 66 ]. Validity exemplifies the extent to which the collected data represent the study’s primary purpose, in other words, “measuring what it proposes to measure”. The content-validated measurement instrument was administered to port employees of Officer designation and above across various major ports in India for data collection. Table 2 shows the demographic profiles of the respondents who answered the questions in the administered instrument.
The goodness-of-fit indices were evaluated for the reflective measurement model per the recommendations of [ 81 , 82 ]. The model fit indices for the hypothesized model were acceptable against the recommended benchmark values [ 66 , 87 ]. The results [ \( \chi^{2} /\) df = 1.6; Goodness-of-Fit Index (GFI), Tucker-Lewis Index (TLI), and Comparative Fit Index (CFI) > 0.9; Standardized Root Mean Square Residual (SRMR) = 0.053; and Root Mean Square Error of Approximation (RMSEA) = 0.055] indicated acceptable model fit as per the recommendations. The standardized factor loadings, construct validity, and reliability values are shown below in Table 3 .
Although Hair et al. [ 67 , p. 152–153] suggest a minimum factor loading benchmark of 0.7 for statistical significance in general, loadings of 0.50 or above can be considered practically significant, and a further guideline treats loadings greater than 0.40 as statistically significant for a sample size of 200. Further, following the recommendation of Chin et al. [ 88 ] and considering the practical significance of items with loadings above 0.6, we consider all items with a factor loading above 0.60 acceptable in the model structure. Therefore, all 26 items were retained in the measurement instrument. Construct reliabilities were assessed using Cronbach’s alpha and composite reliability, with values ranging between 0.85 and 0.90. Following the reference guidelines of Hair et al. [ 89 ], these measures indicate good and acceptable internal consistency, establishing the scale’s reliability in measuring the construct.
Construct validity was evaluated using convergent and discriminant validity measures. Two constructs, EnvSP and SocSP, had AVE values above the minimum benchmark of 0.50, whereas EcoSP fell marginally short at 0.49, which approximates the acceptable benchmark for estimating the convergent validity of the measurement model [ 90 ]. Moreover, a marginal shortfall in AVE is considered adequate when Cronbach’s alpha and composite reliability are higher than 0.60 [ 89 , 90 ]. These results indicate the acceptable reliability of the scale for measuring sustainability practices in ports.
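The convergent validity quantities can be recomputed from standardized loadings. The sketch below uses hypothetical loadings, not the study's Table 3 values, together with the standard formulas for composite reliability and AVE.

```python
# Convergent validity sketch: composite reliability (CR) and AVE computed
# from standardized factor loadings of a single construct.
def composite_reliability(loadings):
    """CR = (sum of loadings)^2 / ((sum of loadings)^2 + sum of error
    variances), with error variance 1 - loading^2 per item."""
    s = sum(loadings)
    error = sum(1 - l ** 2 for l in loadings)
    return s ** 2 / (s ** 2 + error)

def average_variance_extracted(loadings):
    """AVE = mean of the squared standardized loadings."""
    return sum(l ** 2 for l in loadings) / len(loadings)

loadings = [0.72, 0.68, 0.75, 0.66, 0.71]  # hypothetical construct loadings

print(round(composite_reliability(loadings), 3))       # 0.831
print(round(average_variance_extracted(loadings), 3))  # 0.497
```

With these illustrative loadings the AVE falls just under 0.50 while the composite reliability stays comfortably above 0.60, mirroring the EcoSP situation described in the text.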
Hair et al. [ 89 ] emphasize two established measures of discriminant validity in a model, viz., the Fornell-Larcker criterion and the heterotrait-monotrait (HTMT) ratio. In the Fornell-Larcker approach, the inter-construct correlations, which measure the shared variance between latent variables, are compared with the square root of the AVE of each construct. The square root of the AVE of the construct under consideration is expected to be greater than that construct’s highest inter-construct correlation, which signifies its shared variance with the other constructs in the model. The square root of the AVE of each construct was compared with the corresponding correlation measures and found to be greater than the respective correlation values, thereby ascertaining the discriminant validity of the constructs. In the HTMT approach, the estimated correlations, also termed unattenuated correlations, are examined; an unattenuated correlation close to 1 implies an absence of discriminant validity. The benchmark value for the HTMT ratio is 0.90, and any measure above this threshold implies the absence of discriminant validity of the constructs [ 91 , 92 ]. All HTMT ratio values were less than 0.9, thus satisfying the discriminant validity requirement of the measurement scale.
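The Fornell-Larcker comparison reduces to a simple per-construct check. In this sketch the EcoSP AVE of 0.49 matches the text, but the other AVEs and all inter-construct correlations are hypothetical.

```python
import math

# Fornell-Larcker sketch: the square root of each construct's AVE must
# exceed its highest correlation with any other construct.
ave = {"EnvSP": 0.58, "EcoSP": 0.49, "SocSP": 0.55}
corr = {  # inter-construct correlations (illustrative values)
    ("EnvSP", "EcoSP"): 0.61,
    ("EnvSP", "SocSP"): 0.57,
    ("EcoSP", "SocSP"): 0.63,
}

def passes_fornell_larcker(construct):
    root_ave = math.sqrt(ave[construct])
    highest_corr = max(r for pair, r in corr.items() if construct in pair)
    return root_ave > highest_corr

print(all(passes_fornell_larcker(c) for c in ave))  # True
```

Even with EcoSP's AVE at 0.49, its square root (0.70) still exceeds its highest correlation (0.63) in this example, so the criterion is satisfied.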
The variance inflation factor (VIF) was checked for possible multicollinearity issues [ 89 , 91 , 93 , 94 ]. Multicollinearity was ruled out, as all the VIF values were less than three. The above results support the reliability and validity of the sustainability constructs as collective indicators of the three dimensions of sustainability, viz. economic, environmental, and social sustainability, and confirm the relationship. Further, the bootstrapping procedure was run to test the significance of the paths. The standardized path coefficients, T-statistics, and p-values shown in Table 4 explain the variance of the three dimensions of the sustainability practice construct. The p-values (< 0.05) indicate that all the structural model relationships are statistically significant.
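As a minimal illustration of the VIF check, with only two predictors the VIF reduces to \( 1/(1 - r^{2}) \), where r is their Pearson correlation. The responses below are hypothetical Likert data, not the study's survey data.

```python
# Minimal two-predictor VIF sketch: VIF = 1 / (1 - r^2).
def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def vif_two_predictor(x, y):
    r = pearson_r(x, y)
    return 1 / (1 - r ** 2)

item_a = [5, 4, 3, 4, 5, 2, 3, 4]  # hypothetical 5-point Likert responses
item_b = [4, 5, 2, 3, 5, 3, 2, 4]

print(round(vif_two_predictor(item_a, item_b), 2))  # 1.92, below 3
```

With more predictors, each VIF comes from regressing one indicator on all the others, but the below-3 decision rule applied in the study is the same.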
The authors followed a systematic procedure of compiling a comprehensive list of items for the three sustainability practices constructs through an extensive literature review, followed by face validation and content validation to assess the relevance and essentiality of the items in the context of shipping ports in India. The content-validated instrument was then subjected to empirical evaluation with the collected sample data, using the CFA technique to ascertain the reliability and validity of the model.
Specifically, the results indicate that the subject matter experts prioritized items that are essential and relevant in the contemporary business environment, giving nearly equal weightage and importance to all three dimensions of sustainability practices: environmental, economic, and social. Among the items validated, the expert panel showed the lowest agreement on the relevance and necessity of the foreign direct investment and funding items, which reflects that shipping ports in India are primarily government funded: the minor and major ports that make up most of the country’s ports are controlled and administered by the state and central governments, respectively. The same reason can be attributed to the low relevance of job security in the context of Indian shipping ports. Further, the items related to odor and smoke also received low relevance ratings, as they reflect the low degree of industrial development in Indian shipping ports. Although rated relevant, cold-ironing power sources for vessels at berth received a low degree of agreement on necessity, as did recognizing the requirements of, and supporting, the community. However, the remarks provided by the panelists highlight that these focus areas are already part of corporate social responsibility, so there is no necessity to assess them separately.
Content validation evaluates whether the items in the questionnaire instrument are clear, readable, and relevant to the study context [ 85 ]. After face and content validation of the instrument, the finalized list comprised eight items as indicators of EnvSPs, ten as indicators of EcoSPs, and eight as indicators of SocSPs adopted in shipping ports. Thus, the content-validated questionnaire instrument comprised 26 items for measuring the constructs of the study, close to the number of items Oh et al. [ 29 ] adopted in their study of port sustainability. Their study applied the importance-performance analysis technique to evaluate the sustainability of South Korean ports using 27 vital measures of port sustainability adapted from the findings and discussions of previous research, and found that those measures are essential from a port sustainability point of view. Their study classified the indicators of port sustainability into the three dimensions of sustainability, as opined in the TBL concept. Along similar lines, Narasimha et al. [ 32 ] conducted a thematic analysis of the sustainability performance of seaports, followed by semi-structured interviews, and later applied a fuzzy analytic hierarchy process to compute the weight of each port sustainability performance indicator. Their study categorized the indicators into the three dimensions of sustainability performance, namely social, environmental, and economic. Therefore, it can be interpreted from the results that these content-validated items are reflective indicators of the sustainability practice constructs and collectively constitute the latent variables for empirical studies, confirming that the measurement model reflects construct validity.
Specifically, this study supports the well-established “Triple Bottom Line” (TBL) theory of sustainability coined by Elkington [ 52 ]: the validated sustainability practice-related items in the measuring instrument adequately represent the seaport domain, and the instrument can be used to measure the constructs in empirical studies. The Sustainable Development Goals (SDGs) of the United Nations Development Programme (UNDP) likewise call for integrated sustainable development that balances the three pillars of sustainability: environmental, economic, and social. Chang and Kuo [ 95 ] advise organizations to pursue both short- and long-term sustainable practices, securing short-term earnings while simultaneously safeguarding the environment and social integrity. Thus, at the strategic level, the TBL practices are the higher-order constructs of sustainability practices, focusing on the long term [ 96 ]. Therefore, the findings of this study contribute to the extant body of literature by providing empirical evidence on the practical and statistical relationship between the environmental, economic, and social sustainability-related practices of the sustainability construct in the TBL theory-based framework applied to shipping ports.
Yadav et al. [ 97 ] also emphasized the availability of several environmental management systems (EMS) for achieving environmental sustainability, and recommended introducing methods that promote a green culture, support green behavior, and improve employee commitment to environmental sustainability. The social dimension of sustainability primarily focuses on facilitating and providing equitable opportunities and the well-being of port employees and other stakeholders, including the local community, driven by the policies and practices of the port authority. Alamoush et al. [ 2 ] equated economic sustainability with generating revenue and monetary gains and considered it one of the primary drivers of the other two dimensions, environmental and social sustainability. Further, the results from the PLS-SEM analysis indicate that the most significant contribution to the overall sustainability of the port comes from the economic dimension. Consistent with the findings of Alamoush et al. [ 2 ], financial investments in the port are the drivers of environmental and social sustainability. Poulsen et al. [ 98 ] showed that air quality improved even as cargo throughput increased, driven mainly by financial investments in air quality control systems in many ports across Europe.
The improvement in air quality around the port vicinity contributes to environmental sustainability. It also contributes to social sustainability, as the community and the port surroundings, including the ecosystems and the natural habitats of birds and animals, experience better living conditions. This affirms the indirect benefits achieved in the environmental and social dimensions by implementing economic sustainability-related strategies and policies. Our findings also emphasize the need for an integrative approach to port sustainability: it can be achieved only when all three dimensions intersect and complement each other for overall sustainable development.
This study offers both novel theoretical and practical implications. Firstly, the study provides a comprehensive list of indicators of sustainability practices in shipping ports, drawn from published scholarly articles and from domain experts working in the port industry. Secondly, as the first of its kind in the seaport sector, the study adopted a scientific content validation approach, with established indices and procedures, to assess the relevance and essentiality of items in the context of shipping ports and contemporary sustainability practices. Our study validated an instrument for assessing sustainability practices in shipping ports, a significant step toward formulating policies and developing strategies for the sustainable development of ports. The validated instrument can be adapted to determine the extent of adoption of sustainability practices and to drive implementation through policy centered on the sustainability of shipping ports. The instrument can serve as a guideline for practitioners, policymakers, and researchers focusing on the sustainable development of shipping ports through environmental, economic, and social sustainability practices. Port authorities can embrace the validated instrument to assess their level of adoption of, and focus on, these sustainability practices, which will aid in developing policies and strategies for the sustainable development of ports. Further, the Global Reporting Initiative (GRI) Standards, developed by the Global Sustainability Standards Board (GSSB) primarily for sustainability reporting, can be consulted alongside our validated instrument for sustainability evaluation and reporting in compliance with the GRI standards [ 99 ].
The GRI Standards assist organizations in understanding and reporting the extent to which they impact sustainability and contribute to sustainable development, considering the interests of all stakeholders, including investors, policymakers, capital markets, and civil society, thus making the organization transparent and accountable for sustainability. Sector-specific standards have been developed; ports fall under Group 3, which comprises various transport, infrastructure, and tourism-related sectors. A standard is not yet readily available for shipping ports, but one can be developed and customized by port authorities, and the validated instrument from our study can serve as a guide in assessing and preparing a sustainability report in line with the applicable GRI standards.
Further, sustainability assessment should not be treated as a one-time activity in the port. Instead, port authorities should have strategies and policies to track trends and changes in the extent of adoption of sustainability practices and their impact on sustainable development. Each port should carry this out through the team, department, or personnel responsible for sustainability assessment and policy implementation, and it should be a continuous activity at regular intervals, perhaps every three or six months, depending on policy and management decisions. Such longitudinal assessment, which keeps track of the various aspects of sustainability, will help the port evaluate the effectiveness of the sustainability interventions implemented at shipping ports.
Although the study achieved its objective of contributing a novel content-validated instrument for assessing sustainability practices in seaports, it has a few limitations, which also point to scope for advancing the work in the future. The keyword-based literature search was confined to articles published in the Scopus database; future work can expand the search to other scholarly databases and increase the relevance of the items for measuring shipping ports’ sustainability practices. The study was limited to government-controlled major ports on India’s east and west coasts; due to permission- and access-related challenges, the data collection did not cover privately managed ports. The items of the study are generalized for a shipping port, and further research can refine them for the specific type of cargo handled, or confine them to the terminal level, rather than a generalizable study irrespective of cargo type. The applications of digital technology and automation using Artificial Intelligence and Machine Learning, along with big data and blockchain technology, could be explored to assess their impact on sustainable port management and development. A different methodological approach could also be adopted, as in the study of Yadav et al. [ 97 ], where a “multi-criteria decision making (MCDM)” approach was used to identify the enablers of sustainability and to determine their intensity using the “Robust Best-Worst Method” (RBWM); their analysis identified economic and environmental enablers as the high-intensity enablers of sustainability that organizations can focus on. Finally, other stakeholders, such as customers, port users, government agencies linked with port operations, and the local community, were not part of the panel for the validation process.
In future studies, these other stakeholders can also be considered in the panel so that every aspect is covered in the evaluation. The items were based on a 5-point Likert scale in this study to capture only the perception of port employees on the sustainability practices adopted in the port. A suitable triangulation method and case studies can also be used to analyze the qualitative aspects of adopting sustainability practices in the port.
The study validated an instrument for assessing sustainability practices in shipping ports, a significant step toward formulating strategies for the sustainable development of ports. The instrument can serve as a guideline for practitioners, policymakers, and researchers focusing on sustainable port development through environmental, economic, and social sustainability practices. The study compiled a comprehensive list of relevant items identified through a thorough review of articles published in the Scopus database. After face validation, the measurement tool was administered to six subject matter experts, who evaluated the relevance and essentiality of each item for measuring sustainability practices in shipping ports. Content validity was assessed using the most widely adopted indices: the content validity index (CVI), Cohen's kappa coefficient, and the content validity ratio (CVR). The CVI and Cohen's kappa coefficient assess the relevance of the items, while the CVR determines their essentiality. Further, the study contributes to the extant literature by providing empirical evidence on the relationship between the environmental, economic, and social sustainability practices of the sustainability construct in the TBL theory-based framework applied to shipping ports.
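The indices reported in the appendix follow the standard formulas for the item-level CVI, the modified kappa (which adjusts the i-CVI for chance agreement under a 0.5 binomial assumption), and Lawshe's CVR. A minimal sketch of the computation for a single item is below; the function name and argument names are illustrative, not from the study.

```python
from math import comb

def content_validity(n_experts, n_agree_relevant, n_agree_essential):
    """Compute i-CVI, chance agreement (Pc), modified kappa (K),
    and Lawshe's CVR for one item rated by a panel of experts."""
    # Item-level CVI: proportion of experts rating the item relevant
    i_cvi = n_agree_relevant / n_experts
    # Probability of chance agreement: binomial with p = 0.5
    pc = comb(n_experts, n_agree_relevant) * 0.5 ** n_experts
    # Modified kappa: i-CVI corrected for chance agreement
    kappa = (i_cvi - pc) / (1 - pc)
    # Lawshe's content validity ratio from the "essential" ratings
    cvr = (n_agree_essential - n_experts / 2) / (n_experts / 2)
    return round(i_cvi, 2), round(pc, 2), round(kappa, 2), round(cvr, 2)

# Example: an item rated relevant by 4 of 6 experts and essential by all 6
# reproduces the appendix row for EnvSP1
print(content_validity(6, 4, 6))  # (0.67, 0.23, 0.56, 1.0)
```

With a six-expert panel, these formulas reproduce the i-CVI, Pc, K, and CVR values listed in the appendix table.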
The analysis was based on survey data collected through a Likert-scale questionnaire administered in both online and offline modes. Data were collected between December 2022 and December 2023. The instrument included a declaration that participants' privacy would be maintained; the data therefore cannot be made public. The primary data are not publicly accessible but are available from the corresponding author upon reasonable request.
Meixell MJ, Luoma P. Stakeholder pressure in sustainable supply chain management: a systematic review. Int J Phys Distrib Logist Manag. 2015;45:69–89. https://doi.org/10.1108/IJPDLM-05-2013-0155 .
Alamoush AS, Ballini F, Ölçer AI. Revisiting port sustainability as a foundation for the implementation of the United Nations Sustainable Development Goals (UN SDGs). J Shipp Trade. 2021;6(1):1–40. https://doi.org/10.1186/S41072-021-00101-6 .
Dyllick T, Hockerts K. Beyond the business case for corporate sustainability. Bus Strateg Environ. 2002;11(2):130–41. https://doi.org/10.1002/bse.323 .
Lun YHV, Lai K, Wong CWY, Cheng TCE. Green shipping management. Cham: Springer International Publishing; 2016. https://doi.org/10.1007/978-3-319-26482-0 .
Porter ME, Van Der Linde C. Green and competitive: ending the stalemate. In: Corporate environmental responsibility. 2017. p. 47–60. https://doi.org/10.1016/0024-6301(95)99997-e .
Russo MV, Fouts PA. A resource-based perspective on corporate environmental performance and profitability. Acad Manag J. 1997;40(3):534–59. https://doi.org/10.2307/257052 .
Roszkowska-Menkes M. Porter and Kramer’s (2006) “shared value.” In: Encyclopedia of sustainable management. Cham: Springer International Publishing; 2021. p. 1–6. https://doi.org/10.1007/978-3-030-02006-4_393-1 .
Zhu Q, Sarkis J. Relationships between operational practices and performance among early adopters of green supply chain management practices in Chinese manufacturing enterprises. J Oper Manag. 2004;22(3):265–89. https://doi.org/10.1016/j.jom.2004.01.005 .
Hong J, Zhang Y, Ding M. Sustainable supply chain management practices, supply chain dynamic capabilities, and enterprise performance. J Clean Prod. 2018;172:3508–19. https://doi.org/10.1016/J.JCLEPRO.2017.06.093 .
Ministry of Ports Shipping and Waterways. Maritime India vision 2030. Sagarmala; 2021.
Pradhan RP, Rathi C, Gupta S. Sagarmala & India’s maritime big push approach: seaports as India’s geo-economic gateways & neighborhood maritime lessons. J Indian Ocean Reg. 2022;18(3):209–29. https://doi.org/10.1080/19480881.2022.2114195 .
Mantry S, Ghatak RR. Comparing and contrasting competitiveness of major Indian and select international ports. Int J Res Finance Mark. 2017;7(5):1–19.
Song DW, Panayides PM. Global supply chain and port/terminal: integration and competitiveness. In: Maritime policy and management. London: Taylor & Francis; 2008. p. 73–87. https://doi.org/10.1080/03088830701848953 .
Yap WY, Lam JSL. 80 million-twenty-foot-equivalent-unit container port? Sustainability issues in port and coastal development. Ocean Coast Manag. 2013;71:13–25. https://doi.org/10.1016/j.ocecoaman.2012.10.011 .
Lee PTW, Kwon OK, Ruan X. Sustainability challenges in maritime transport and logistics industry and its way ahead. Sustainability. 2019;11(5):1331. https://doi.org/10.3390/SU11051331 .
Dragović B, Tzannatos E, Park NK. Simulation modelling in ports and container terminals: literature overview and analysis by research field, application area and tool. Flex Serv Manuf J. 2017;29(1):4–34. https://doi.org/10.1007/s10696-016-9239-5 .
Ashrafi M, Acciaro M, Walker TR, Magnan GM, Adams M. Corporate sustainability in Canadian and US maritime ports. J Clean Prod. 2019;220:386–97. https://doi.org/10.1016/j.jclepro.2019.02.098 .
Peris-Mora E, Orejas JMD, Subirats A, Ibáñez S, Alvarez P. Development of a system of indicators for sustainable port management. Mar Pollut Bull. 2005;50(12):1649–60. https://doi.org/10.1016/j.marpolbul.2005.06.048 .
Ashrafi M, Walker TR, Magnan GM, Adams M, Acciaro M. A review of corporate sustainability drivers in maritime ports: a multi-stakeholder perspective. Marit Policy Manag. 2020;47(8):1027–44. https://doi.org/10.1080/03088839.2020.1736354 .
Stanković JJ, Marjanović I, Papathanasiou J, Drezgić S. Social, economic and environmental sustainability of port regions: MCDM approach in composite index creation. J Mar Sci Eng. 2021;9(1):74. https://doi.org/10.3390/JMSE9010074 .
Dinwoodie J, Tuck S, Knowles H, Benhin J, Sansom M. Sustainable development of maritime operations in ports. Bus Strateg Environ. 2011. https://doi.org/10.1002/bse.718 .
Ports primer: 7.1 environmental impacts | US EPA. https://www.epa.gov/community-port-collaboration/ports-primer-71-environmental-impacts . Accessed Apr 28 2024.
Notteboom T, Pallis A, Rodrigue J-P. Port economics, management and policy. Port Econ Manag Policy. 2021. https://doi.org/10.4324/9780429318184 .
Notteboom T, van der Lugt L, van Saase N, Sel S, Neyens K. The role of seaports in green supply chain management: initiatives, attitudes, and perspectives in Rotterdam, Antwerp, North Sea Port, and Zeebrugge. Sustainability. 2020;12(4):1688. https://doi.org/10.3390/su12041688 .
Molavi A, Lim GJ, Race B. A framework for building a smart port and smart port index. Int J Sustain Transp. 2020;14(9):686–700. https://doi.org/10.1080/15568318.2019.1610919 .
Wu Q, He Q, Duan Y. Explicating dynamic capabilities for corporate sustainability. EuroMed J Bus. 2013;8(3):255–72. https://doi.org/10.1108/EMJB-05-2013-0025 .
Argyriou I, Daras T, Tsoutsos T. Challenging a sustainable port. A case study of Souda port, Chania, Crete. Case Stud Transp Policy. 2022;10(4):2125–37. https://doi.org/10.1016/J.CSTP.2022.09.007 .
Bjerkan KY, Seter H. Reviewing tools and technologies for sustainable ports: does research enable decision making in ports? Transp Res D Transp Environ. 2019;72:243–60. https://doi.org/10.1016/j.trd.2019.05.003 .
Oh H, Lee S-W, Seo Y-J. The evaluation of seaport sustainability: the case of South Korea. Ocean Coast Manag. 2018;161:50–6. https://doi.org/10.1016/j.ocecoaman.2018.04.028 .
Lu CS, Shang KC, Lin CC. Examining sustainability performance at ports: port managers’ perspectives on developing sustainable supply chains. Marit Policy Manag. 2016;43(8):909–27. https://doi.org/10.1080/03088839.2016.1199918 .
Kang D, Kim S. Conceptual model development of sustainability practices: the case of port operations for collaboration and governance. Sustainability. 2017;9(12):2333. https://doi.org/10.3390/su9122333 .
Narasimha PT, Jena PR, Majhi R. Sustainability performance assessment framework for major seaports in India. Int J Sustain Dev Plan. 2022;17(2):693–704. https://doi.org/10.18280/ijsdp.170235 .
Vejvar M, Lai K, Lo CKY, Fürst EWM. Strategic responses to institutional forces pressuring sustainability practice adoption: case-based evidence from inland port operations. Transp Res D Transp Environ. 2018;61:274–88. https://doi.org/10.1016/j.trd.2017.08.014 .
Ayre C, Scally AJ. Critical values for Lawshe’s content validity ratio: revisiting the original methods of calculation. Meas Eval Couns Dev. 2014;47(1):79–86. https://doi.org/10.1177/0748175613513808 .
Barbosa MW, Cansino JM. A water footprint management construct in agri-food supply chains: a content validity analysis. Sustainability. 2022;14(9):4928. https://doi.org/10.3390/su14094928 .
Waltz CF, Strickland OL, Lenz ER. Measurement in nursing and health research. 2016. https://doi.org/10.1891/9780826170620 .
Ibiyemi A, Mohd Adnan Y, Daud MN, Olanrele S, Jogunola A. A content validity study of the test of valuers’ support for capturing sustainability in the valuation process in Nigeria. Pac Rim Prop Res J. 2019;25(3):177–93. https://doi.org/10.1080/14445921.2019.1703700 .
Diniz NV, Cunha DR, de Santana Porte M, Oliveira CBM, de Freitas Fernandes F. A bibliometric analysis of sustainable development goals in the maritime industry and port sector. Reg Stud Mar Sci. 2024;69: 103319. https://doi.org/10.1016/j.rsma.2023.103319 .
Amui LBL, Jabbour CJC, de Sousa Jabbour ABL, Kannan D. Sustainability as a dynamic organizational capability: a systematic review and a future agenda toward a sustainable transition. J Clean Prod. 2017;142:308–22. https://doi.org/10.1016/j.jclepro.2016.07.103 .
Maletič M, Maletič D, Gomišček B. The impact of sustainability exploration and sustainability exploitation practices on the organisational performance: a cross-country comparison. J Clean Prod. 2016. https://doi.org/10.1016/j.jclepro.2016.02.132 .
Berns M, Hopkins MS, Townend A, Khayat Z, Balagopal B, Reeves M. The business of sustainability: what it means to managers now. MIT Sloan Manag Rev. 2009;51(1).
Montiel I, Delgado-Ceballos J. Defining and measuring corporate sustainability. Organ Environ. 2014;27(2):113–39. https://doi.org/10.1177/1086026614526413 .
Laxe FG, Bermúdez FM, Palmero FM, Novo-Corti I. Assessment of port sustainability through synthetic indexes. Application to the Spanish case. Mar Pollut Bull. 2017;119(1):220–5. https://doi.org/10.1016/j.marpolbul.2017.03.064 .
Torugsa NA, O’Donohue W, Hecker R. Proactive CSR: an empirical analysis of the role of its economic, social and environmental dimensions on the association between capabilities and performance. J Bus Ethics. 2013. https://doi.org/10.1007/s10551-012-1405-4 .
Lauring J, Thomsen C. Collective ideals and practices in sustainable development: managing corporate identity. Corp Soc Responsib Environ Manag. 2009;16(1):38–47. https://doi.org/10.1002/csr.181 .
Hallstedt SI, Thompson AW, Lindahl P. Key elements for implementing a strategic sustainability perspective in the product innovation process. J Clean Prod. 2013;51:277–88. https://doi.org/10.1016/J.JCLEPRO.2013.01.043 .
Parola F, Risitano M, Ferretti M, Panetti E. The drivers of port competitiveness: a critical review. Transp Rev. 2017;37(1):116–38. https://doi.org/10.1080/01441647.2016.1231232 .
Simpson J, Weiner E, Durkin P. The Oxford English dictionary today. Trans Philol Soc. 2004;102(3):335–81. https://doi.org/10.1111/j.0079-1636.2004.00140.x .
Ruggerio CA. Sustainability and sustainable development: a review of principles and definitions. Sci Total Environ. 2021;786: 147481. https://doi.org/10.1016/J.SCITOTENV.2021.147481 .
Moore JE, Mascarenhas A, Bain J, Straus SE. Developing a comprehensive definition of sustainability. Implement Sci. 2017;12(1):1–8. https://doi.org/10.1186/S13012-017-0637-1/TABLES/3 .
Elkington J. Partnerships from cannibals with forks: the triple bottom line of 21st-century business. Environ Qual Manag. 1998;8(1):37–51. https://doi.org/10.1002/tqem.3310080106 .
Elkington J. Triple bottom line. In: Cannibals with forks. Oxford: Capstone; 1997.
Waddock SA, Graves SB. The corporate social performance-financial performance link. Strateg Manag J. 1997;18(4):303–19. https://doi.org/10.1002/(SICI)1097-0266(199704)18:4%3c303::AID-SMJ869%3e3.0.CO;2-G .
Sharma S, Vredenburg H. Proactive corporate environmental strategy and the development of competitively valuable organizational capabilities. Strateg Manag J. 1998;19(8):729–53. https://doi.org/10.1002/(sici)1097-0266(199808)19:8%3c729::aid-smj967%3e3.3.co;2-w .
Carroll AB, Shabana KM. The business case for corporate social responsibility: a review of concepts, research and practice. Int J Manag Rev. 2010;12(1):85–105. https://doi.org/10.1111/j.1468-2370.2009.00275.x .
Beske P. Dynamic capabilities and sustainable supply chain management. Int J Phys Distrib Logist Manag. 2012;42(4):372–87. https://doi.org/10.1108/09600031211231344 .
Lam JSL, Li KX. Green port marketing for sustainable growth and development. Transp Policy. 2019;84:73–81. https://doi.org/10.1016/j.tranpol.2019.04.011 .
Olakitan Atanda J. Developing a social sustainability assessment framework. Sustain Cities Soc. 2019;44:237–52. https://doi.org/10.1016/j.scs.2018.09.023 .
Bansal P. Evolving sustainably: a longitudinal study of corporate sustainable development. Strateg Manag J. 2005;26(3):197–218. https://doi.org/10.1002/smj.441 .
Carter CR, Liane Easton P. Sustainable supply chain management: evolution and future directions. Int J Phys Distrib Logist Manag. 2011;41(1):46–62. https://doi.org/10.1108/09600031111101420 .
Janic M. Sustainable transport in the European Union: a review of the past research and future ideas. Transp Rev. 2006;26(1):81–104. https://doi.org/10.1080/01441640500178908 .
Steurer R, Langer ME, Konrad A, Martinuzzi A. Corporations, stakeholders and sustainable development I: a theoretical exploration of business-society relations. J Bus Ethics. 2005;61(3):263–81. https://doi.org/10.1007/s10551-005-7054-0 .
Stanković JJ, Marjanović IM, Papathanasiou J, Drezgić SD. Social, economic and environmental sustainability of port regions: MCDM approach in composite index creation. J Mar Sci Eng. 2021. https://doi.org/10.3390/jmse9010074 .
Mori K, Christodoulou A. Review of sustainability indices and indicators: towards a new city sustainability index (CSI). Environ Impact Assess Rev. 2012;32(1):94–106. https://doi.org/10.1016/J.EIAR.2011.06.001 .
Mayer AL. Strengths and weaknesses of common sustainability indices for multidimensional systems. Environ Int. 2007. https://doi.org/10.1016/j.envint.2007.09.004 .
Hair J, Black W, Babin B, Anderson R. Multivariate data analysis: a global perspective. In: Multivariate data analysis: a global perspective, vol. 7. Upper Saddle River: Pearson Education; 2010.
Hair JF, Black WC, Babin BJ, Anderson RE. Multivariate data analysis. Hampshire: Cengage Learning; 2019.
Boateng GO, Neilands TB, Frongillo EA, Melgar-Quiñonez HR, Young SL. Best practices for developing and validating scales for health, social, and behavioral research: a primer. Front Public Health. 2018;6:149. https://doi.org/10.3389/fpubh.2018.00149 .
Elangovan N, Sundaravel E. Method of preparing a document for survey instrument validation by experts. MethodsX. 2021;8: 101326. https://doi.org/10.1016/J.MEX.2021.101326 .
Papadas KK, Avlonitis GJ, Carrigan M. Green marketing orientation: conceptualization, scale development and validation. J Bus Res. 2017;80:236–46. https://doi.org/10.1016/J.JBUSRES.2017.05.024 .
Polit DF, Beck CT. The content validity index: are you sure you know what’s being reported? Critique and recommendations. Res Nurs Health. 2008;31(4):489–97.
Polit DF, Beck CT, Owen SV. Focus on research methods: Is the CVI an acceptable indicator of content validity? Appraisal and recommendations. Res Nurs Health. 2007;30(4):459–67. https://doi.org/10.1002/nur.20199 .
Zamanzadeh V, Ghahramanian A, Rassouli M, Abbaszadeh A, Alavi-Majd H, Nikanfar A-R. Design and implementation content validity study: development of an instrument for measuring patient-centered communication. J Caring Sci. 2015;4(2):165. https://doi.org/10.15171/jcs.2015.017 .
de Souza AC, Alexandre NMC, Guirardello EDB, de Souza AC, Alexandre NMC, Guirardello EDB. Propriedades psicométricas na avaliação de instrumentos: avaliação da confiabilidade e da validade. Epidemiologia e Serviços de Saúde. 2017;26(3):649–59. https://doi.org/10.5123/S1679-49742017000300022 .
Rodrigues IB, Adachi JD, Beattie KA, MacDermid JC. Development and validation of a new tool to measure the facilitators, barriers and preferences to exercise in people with osteoporosis. BMC Musculoskelet Disord. 2017;18(1):540. https://doi.org/10.1186/s12891-017-1914-5 .
Bobos P, Pouliopoulou DVS, Harriss A, Sadi J, Rushton A, MacDermid JC. A systematic review and meta-analysis of measurement properties of objective structured clinical examinations used in physical therapy licensure and a structured review of licensure practices in countries with well-developed regulation systems. PLoS ONE. 2021;16(8): e0255696. https://doi.org/10.1371/journal.pone.0255696 .
Hair JF, Hult GTM, Ringle CM, Sarstedt M, Thiele KO. Mirror, mirror on the wall: a comparative evaluation of composite-based structural equation modeling methods. J Acad Mark Sci. 2017;45(5):616–32. https://doi.org/10.1007/s11747-017-0517-x .
Cohen J. A power primer. In: Methodological issues and strategies in clinical research. 4th ed. Washington: American Psychological Association; 2016. p. 279–84. https://doi.org/10.1037/14805-018 .
Roldán JL, Sánchez-Franco MJ. Variance-based structural equation modeling. In: Research methodologies, innovations and philosophies in software systems engineering and information systems. Pennsylvania: IGI Global; 2012. p. 193–221. https://doi.org/10.4018/978-1-4666-0179-6.ch010 .
Faul F, Erdfelder E, Buchner A, Lang A-G. Statistical power analyses using G*Power 3.1: tests for correlation and regression analyses. Behav Res Methods. 2009;41(4):1149–60. https://doi.org/10.3758/BRM.41.4.1149 .
Goodboy AK, Kline RB. Statistical and practical concerns with published communication research featuring structural equation modeling. Commun Res Rep. 2017;34(1):68–77. https://doi.org/10.1080/08824096.2016.1214121 .
Crawford JA, Kelder J-A. Do we measure leadership effectively? Articulating and evaluating scale development psychometrics for best practice. Leadersh Q. 2019;30(1):133–44. https://doi.org/10.1016/j.leaqua.2018.07.001 .
Sarstedt M, Hair JF, Cheah JH, Becker JM, Ringle CM. How to specify, estimate, and validate higher-order constructs in PLS-SEM. Australas Mark J. 2019;27(3):197–211. https://doi.org/10.1016/J.AUSMJ.2019.05.003 .
Ringle CM, Wende S, Becker J-M. SmartPLS 4. http://www.smartpls.com .
Malhotra S. Study of features of mobile trading apps: a silver lining of pandemic. J Global Inf Bus Strateg. 2020. https://doi.org/10.5958/2582-6115.2020.00009.0 .
Shrotryia VK, Dhanda U. Content validity of assessment instrument for employee engagement. SAGE Open. 2019;9(1):2158244018821751. https://doi.org/10.1177/2158244018821751 .
Bagozzi RP, Yi Y. On the evaluation of structural equation models. J Acad Mark Sci. 1988;16(1):74–94. https://doi.org/10.1007/BF02723327 .
Chin WW, Gopal A, Salisbury WD. Advancing the theory of adaptive structuration: the development of a scale to measure faithfulness of appropriation. Inf Syst Res. 1997;8(4):342–67. https://doi.org/10.1287/isre.8.4.342 .
Hair JF, Hult GTM, Ringle CM, Sarstedt M. A primer on partial least squares structural equation modeling (PLS-SEM). J Tour Res. 2021;6(2).
Fornell C, Larcker DF. Evaluating structural equation models with unobservable variables and measurement error. J Mark Res. 1981;18(1):39–50. https://doi.org/10.1177/002224378101800104 .
Henseler J, Ringle CM, Sarstedt M. A new criterion for assessing discriminant validity in variance-based structural equation modeling. J Acad Mark Sci. 2015;43(1):115–35. https://doi.org/10.1007/s11747-014-0403-8 .
Franke G, Sarstedt M. Heuristics versus statistics in discriminant validity testing: a comparison of four procedures. Internet Res. 2019;29(3):430–47. https://doi.org/10.1108/IntR-12-2017-0515 .
Hu LT, Bentler PM. Cutoff criteria for fit indexes in covariance structure analysis: conventional criteria versus new alternatives. Struct Equ Model. 1999;6(1):1–55. https://doi.org/10.1080/10705519909540118 .
Becker JM, Ringle CM, Sarstedt M, Völckner F. How collinearity affects mixture regression results. Mark Lett. 2015;26(4):643–59. https://doi.org/10.1007/s11002-014-9299-9 .
Chang DS, Kuo LCR. The effects of sustainable development on firms’ financial performance—an empirical approach. Sustain Dev. 2008;16(6):365–80. https://doi.org/10.1002/sd.351 .
Ogunbiyi O, Oladapo A, Goulding J. An empirical study of the impact of lean construction techniques on sustainable construction in the UK. Constr Innov. 2014;14(1):88–107. https://doi.org/10.1108/CI-08-2012-0045 .
Yadav G, Kumar A, Luthra S, Garza-Reyes JA, Kumar V, Batista L. A framework to achieve sustainability in manufacturing organisations of developing economies using industry 4.0 technologies’ enablers. Comput Ind. 2020;122: 103280. https://doi.org/10.1016/j.compind.2020.103280 .
Poulsen RT, Ponte S, Sornn-Friese H. Environmental upgrading in global value chains: the potential and limitations of ports in the greening of maritime transport. Geoforum. 2018;89:83–95. https://doi.org/10.1016/J.GEOFORUM.2018.01.011 .
GRI -Standards. https://www.globalreporting.org/standards/ . Accessed 09 May 2024.
Lu C-S, Shang K-C, Lin C-C. Identifying crucial sustainability assessment criteria for container seaports. Marit Bus Rev. 2016;1(2):90–106. https://doi.org/10.1108/MABR-05-2016-0009 .
We acknowledge the contribution of the expert panel for their reviews and feedback that enabled us to optimize the items in the instrument.
Open access funding provided by Manipal Academy of Higher Education, Manipal. This study did not receive funding from any institution or agency.
Authors and affiliations.
Department of Commerce, Manipal Academy of Higher Education, Manipal, 576104, India
Department of Humanities and Management, Manipal Institute of Technology, Manipal Academy of Higher Education, Manipal, 576104, India
Yogesh P. Pai
T A Pai Management Institute, Yelahanka, Govindapura, Bengaluru, 560064, Karnataka, India
Parthesh Shanbhag
All the authors contributed to the manuscript equally. K.L. conceptualized the study and executed data collection. All the authors jointly performed data analysis and authored the manuscript. All authors reviewed the manuscript before submission.
Correspondence to Yogesh P. Pai .
Competing interests.
The authors declare no competing interests.
Publisher's note.
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Item | Source |
---|---|
Environmental sustainability practices | |
Avoiding the use of unpolluted land in the port area | [ , , , , , ] |
Developing and maintaining mangroves, gardens, and landscapes | |
Avoiding environmental destruction during dredging | |
Considering environmental protection when handling cargo | |
Using recyclable or environment-friendly materials in port construction | |
Protecting the ecological environment in the port area | |
Reduction of noise pollution | |
Mitigating light influence on neighboring residents | |
Controlling smoke level | |
Maintaining air quality | |
Reduction of greenhouse gas | |
Reduction of carbon emissions | |
Preventing odour pollution | |
Optimal utilization of renewables and alternate energy sources | |
Facilities for wastewater and sewage treatment | |
Implementation of dust suppression systems | |
Economic sustainability practices | |
Facilitating economic growth and acting as a supply chain link in local and global trade | [ , , , , , ] |
Investments in port infrastructure development | |
Establishing port development funding | |
Attracting foreign direct investments | |
Promotion and development of cruise tourism services | |
Employment generation and career growth opportunities | |
Ensuring that cargo is handled safely and effectively | |
Low damage or loss record for cargo delivery | |
Usage of energy-efficient electrical and electronic appliances like LED lamps | |
Optimal utilization of infrastructure, land, and space in the port area | |
Offering one-stop logistics solutions, including freight forwarding and additional services | |
Optimizing the routing of vehicles in and out of port | |
Mitigating congestion in the port | |
Providing incentives for green shipping practices | |
Landlord activities | |
Investment in climate change adaptation strategies | |
Sustainable supply chain policy | |
Investment in innovation strategy | |
Transshipment and storage of dangerous goods | |
Social sustainability practices | |
Recognizing the requirements of the neighboring community | [ , , , , , ] |
Giving support to community social activities | |
Providing training and education for employees regularly | |
Providing employees’ welfare benefits and other facilities | |
Staff job security even during uncertainties of the business | |
Strengthening safety and security management standards and protocols of the port | |
Accident prevention in the port area | |
Social equality and gender diversity in employment | |
Job satisfaction of employees | |
Consulting various interest groups such as labor unions and community leaders when making port project decisions | |
Strengthening port infrastructure for social contribution | |
Engaging in corporate social responsibility practices |
Item code | Item description | Agreement count for CVI | i-CVI | Pc | K | Agreement count for CVR | CVR |
---|---|---|---|---|---|---|---|
EnvSP1 | Avoiding the use of unpolluted land in the port area | 4 | 0.67 | 0.23 | 0.56 | 6 | 1.00 |
EnvSP2 | Developing and maintaining mangroves, gardens, and landscapes | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP3 | Avoiding environmental destruction during dredging | 4 | 0.67 | 0.23 | 0.56 | 5 | 0.67 |
EnvSP4 | Considering environmental protection when handling cargo | 4 | 0.67 | 0.23 | 0.56 | 6 | 1.00 |
EnvSP5 | Using recyclable or environment-friendly materials in port construction | 5 | 0.83 | 0.09 | 0.82 | 5 | 0.67 |
EnvSP6 | Protecting the ecological environment in the port area | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP7 | Reduction of noise pollution | 5 | 0.83 | 0.09 | 0.82 | 5 | 0.67 |
EnvSP8 | Mitigating light influence on neighboring residents | 4 | 0.67 | 0.23 | 0.56 | 4 | 0.33 |
EnvSP9 | Controlling smoke level | 3 | 0.5 | 0.31 | 0.27 | 6 | 1.00 |
EnvSP10 | Maintaining air quality | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP11 | Reduction of greenhouse gas | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP12 | Reduction of carbon emissions | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP13 | Preventing odour pollution | 3 | 0.5 | 0.31 | 0.27 | 4 | 0.33 |
EnvSP14 | Optimal utilization of renewables and alternate energy sources | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP15 | Facilities for wastewater and sewage treatment | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP16 | Implementation of dust suppression systems | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EnvSP17 | Cold-ironing source of power for vessels on the berth | 4 | 0.67 | 0.23 | 0.56 | 4 | 0.33 |
EcoSP1 | Facilitating economic growth and acting as a supply chain link in local and global trade | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP2 | Investments in port infrastructure development | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP3 | Establishing port development funding | 2 | 0.33 | 0.23 | 0.13 | 4 | 0.33 |
EcoSP4 | Attracting foreign direct investments | 2 | 0.33 | 0.23 | 0.13 | 4 | 0.33 |
EcoSP5 | Promotion and development of cruise tourism services | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP6 | Employment generation and career growth opportunities | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP7 | Ensuring that cargo is handled safely and effectively | 3 | 0.5 | 0.31 | 0.27 | 6 | 1.00 |
EcoSP8 | Low damage or loss record for cargo delivery | 4 | 0.67 | 0.23 | 0.56 | 6 | 1.00 |
EcoSP9 | Usage of energy-efficient electrical and electronic appliances | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP10 | Optimal utilization of infrastructure, land, and space in the port area | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP11 | Offering one-stop logistics solutions, including freight forwarding and additional services | 6 | 1 | 0.02 | 1 | 5 | 0.67 |
EcoSP12 | Optimizing the routing of vehicles in and out of port | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP13 | Mitigating congestion in the port | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP14 | Providing incentives for green shipping practices | 6 | 1 | 0.02 | 1 | 5 | 0.67 |
EcoSP15 | Landlord activities | 6 | 1 | 0.02 | 1 | 5 | 0.67 |
EcoSP16 | Investment in climate change adaptation strategies | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP17 | Sustainable supply chain policy | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
EcoSP18 | Investment in innovation strategy | 6 | 1 | 0.02 | 1 | 5 | 0.67 |
EcoSP19 | Transshipment and storage of dangerous goods | 6 | 1 | 0.02 | 1 | 4 | 0.33 |
SocSP1 | Recognizing the requirements of the neighboring community | 6 | 1 | 0.02 | 1 | 4 | 0.33 |
SocSP2 | Giving support to community social activities | 6 | 1 | 0.02 | 1 | 4 | 0.33 |
SocSP3 | Providing training and education for employees regularly | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP4 | Providing employees’ welfare benefits and other facilities | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP5 | Staff job security even during uncertainties of the business | 3 | 0.5 | 0.31 | 0.27 | 5 | 0.67 |
SocSP6 | Strengthening safety and security management standards and protocols of the port | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP7 | Accident prevention in the port area | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP8 | Social equality and gender diversity in employment | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP9 | Job satisfaction of employees | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP10 | Consulting various interest groups such as labor unions and community leaders when making port project decisions | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
SocSP11 | Strengthening port infrastructure for social contribution | 6 | 1 | 0.02 | 1 | 5 | 0.67 |
SocSP12 | Engaging in corporate social responsibility practices | 6 | 1 | 0.02 | 1 | 6 | 1.00 |
4.1 Section A: Demographic profile.
Please indicate the extent to which you agree with the statements related to your port on a scale of 1–5.
1—strongly disagree, 2—disagree, 3—neutral, 4—agree, 5—strongly agree.
If you are unaware of the port’s practices, you may choose “3—Neutral.”
Environmental sustainability practices adopted in your port focus on | 1 | 2 | 3 | 4 | 5 |
---|---|---|---|---|---|
Developing and maintaining mangroves, gardens, and landscapes | |||||
Protecting the ecological environment in the port area | |||||
Maintaining air quality | |||||
Reduction of greenhouse gas | |||||
Reduction of carbon emissions | |||||
Optimal utilization of renewables and alternate energy sources | |||||
Facilities for wastewater and sewage treatment | |||||
Implementation of dust suppression systems |
| Economic sustainability practices adopted in your port focus on | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Facilitating economic growth and acting as a supply chain link in local and global trade | | | | | |
| Investments in port infrastructure development | | | | | |
| Promotion and development of cruise tourism services | | | | | |
| Employment generation and career growth opportunities | | | | | |
| Usage of energy-efficient electrical and electronic appliances like LED lamps | | | | | |
| Optimal utilization of infrastructure, land, and space in the port area | | | | | |
| Optimizing the routing of vehicles in and out of the port | | | | | |
| Mitigating congestion in the port | | | | | |
| Investment in climate change adaptation strategies | | | | | |
| Sustainable supply chain policy | | | | | |
| Social sustainability practices adopted in your port focus on | 1 | 2 | 3 | 4 | 5 |
|---|---|---|---|---|---|
| Providing training and education for employees regularly | | | | | |
| Providing employees’ welfare benefits and other facilities | | | | | |
| Strengthening port safety management standards and protocols | | | | | |
| Accident prevention in the port area | | | | | |
| Social equality and gender diversity in employment | | | | | |
| Job satisfaction of employees | | | | | |
| Consulting various interest groups such as labor unions and community leaders when making port project decisions | | | | | |
| Engaging in corporate social responsibility practices | | | | | |
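Once responses to the three Likert blocks above are collected, the internal consistency of each scale is typically checked with Cronbach's alpha, the reliability statistic this kind of instrument assessment relies on. A minimal sketch using hypothetical scores (not the study's data):

```python
import statistics

def cronbach_alpha(item_scores):
    """Cronbach's alpha for one scale.
    item_scores: one list of Likert scores (1-5) per item,
    with respondents appearing in the same order in every list."""
    k = len(item_scores)
    # Each respondent's total across all items of the scale.
    person_totals = [sum(scores) for scores in zip(*item_scores)]
    sum_item_var = sum(statistics.variance(s) for s in item_scores)
    total_var = statistics.variance(person_totals)
    return (k / (k - 1)) * (1 - sum_item_var / total_var)

# Hypothetical responses from five people to three items of one scale.
scale = [
    [4, 5, 3, 4, 2],
    [4, 4, 3, 5, 2],
    [5, 5, 2, 4, 3],
]
print(round(cronbach_alpha(scale), 2))  # prints 0.89
```

Values of alpha at or above roughly 0.7 are conventionally read as acceptable internal consistency; each of the environmental, economic and social blocks would be assessed separately.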
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .
Kishore, L., Pai, Y.P. & Shanbhag, P. Reliability and validity assessment of instrument to measure sustainability practices at shipping ports in India. Discov Sustain 5 , 236 (2024). https://doi.org/10.1007/s43621-024-00395-z
Received: 27 January 2024
Accepted: 02 August 2024
Published: 03 September 2024
DOI: https://doi.org/10.1007/s43621-024-00395-z
There are several types of interviews, often differentiated by their level of structure. Structured interviews have predetermined questions asked in a predetermined order. Unstructured interviews are more free-flowing. Semi-structured interviews fall in between. Interviews are commonly used in market research, social science, and ethnographic ...
Introduction. Interviewing people is at the heart of qualitative research. It is not merely a way to collect data but an intrinsically rewarding activity—an interaction between two people that holds the potential for greater understanding and interpersonal development. Unlike many of our daily interactions with others that are fairly shallow ...
The aim is to present a systematic and detailed explanation of the construction and administration of two research instruments (a questionnaire and an interview guide) used for data collection in ...
Vancouver, Canada. Abstract. Interviews are one of the most promising ways of collecting qualitative data through establishment of a communication between researcher and the interviewee. ...
Here are some common types of research interviews: 1. Structured Interviews. Structured interviews are standardized and follow a fixed format. Therefore, these interviews have a pre-determined set of questions. All the participants are asked the same set of questions in the same order.
What are interviews? An interviewing method is the most commonly used data collection technique in qualitative research. 1 The purpose of an interview is to explore the experiences, understandings, opinions and motivations of research participants. 2 Interviews are conducted one-on-one with the researcher and the participant. Interviews are most appropriate when seeking to understand a ...
Develop an interview guide. Introduce yourself and explain the aim of the interview. Devise your questions so interviewees can help answer your research question. Have a sequence to your questions / topics by grouping them in themes. Make sure you can easily move back and forth between questions / topics. Make sure your questions are clear and ...
Definitions. The qualitative research interview seeks to describe the meanings of central themes in the life world of the subjects. The main task in interviewing is to understand the meaning of what the interviewees say (Kvale, 1996).
Structured Interview | Definition, Guide & Examples. Published on January 27, 2022 by Tegan George and Julia Merkus. Revised on June 22, 2023. A structured interview is a data collection method that relies on asking questions in a set order to collect data on a topic. It is one of four types of interviews. In research, structured interviews are often quantitative in nature.
instruments is an important skill for researchers. Such survey instruments can be used in many types of research, from case study, to cross-sectional survey, to experiment. A study of this sort can involve anything from a short paper-and-pencil feedback form, to an intensive one-to-one interview asking a large number of
Interviewing. This is the most common format of data collection in qualitative research. According to Oakley, the qualitative interview is a type of framework in which practices and standards are not only recorded, but also achieved, challenged and reinforced. As no research interview lacks structure, most qualitative research interviews are either semi-structured, lightly ...
A semi-structured interview is a data collection method that relies on asking questions within a predetermined thematic framework. However, the questions are not set in order or in phrasing. In research, semi-structured interviews are often qualitative in nature. They are generally used as an exploratory tool in marketing, social science ...
Visual Methods. Visual methods, such as photography, video recording, or drawings, can be used as qualitative research instruments. These methods allow participants to express their experiences and perspectives visually, providing rich and nuanced data. Visual methods can be particularly useful in studying topics related to art, culture, or ...
The term research instrument refers to any tool that you may use to collect or obtain data, measure data and analyse data that is relevant to the subject of your research. Research instruments are often used in the fields of social sciences and health sciences. These tools can also be found within education that relates to patients, staff ...
This article aims to describe how the semi-structured interview as a research instrument is used in qualitative research. The main focus of this article is to disclose some methodological
research instrument can include interviews, tests, surveys, or checklists. The research instrument is usually determined by the researcher and is tied to the study methodology. This document offers some examples of research instruments and study methods. Choosing a Research Instrument: 1. Select a topic
Research Methodologies: Research Instruments
A 'questionnaire' is the instrument for collecting the primary data (Cohen, 2013). 'Primary data' by extension is data that would not otherwise exist if it were not for the research process, and is collected through both questionnaires and interviews, which we discuss here today (O'Leary, 2014). An 'interview' is typically a ...
The level of researcher involvement in qualitative interviewing - indeed, the embodiment of the unique researcher as the instrument for qualitative data collection - has been widely acknowledged (e.g. Cassell, 2005; Rubin and Rubin, 2005; Turato, 2005). Because the researcher is the instrument in semi-structured or unstructured qualitative interviews, unique researcher attributes have the ...
Advisor Consultation Checklist Use the checklist below to ensure that you consulted with your advisor during the key steps in the process of selecting and describing your research instruments. 1. _____ Read this checklist. 2. _____ Made an appointment for our first meeting to discuss the instrument selection. 3.
A research instrument is a tool used to obtain, measure, and analyze data from subjects around the research topic. You need to decide the instrument to use based on the type of study you are conducting: quantitative, qualitative, or mixed-method. For instance, for a quantitative study, you may decide to use a questionnaire, and for a ...
Survey research means collecting information about a group of people by asking them questions and analyzing the results. To conduct an effective survey, follow these six steps: Determine who will participate in the survey. Decide the type of survey (mail, online, or in-person) Design the survey questions and layout.
Research Instrument: Interviews. The interview is a qualitative research method that collects data by asking questions. It includes three main types: structured, unstructured, and semi-structured interviews. Structured interviews include an ordered list of questions. These questions are often closed ...
Sustainability has emerged as one of the most critical factors influencing the competitiveness of maritime shipping ports. This emergence has led to a surge in research publications on port sustainability-related topics. However, despite the increasing awareness and adoption of sustainability practices, documented literature on empirical studies with survey and interview data is very limited ...