CHAPTER ONE


QUALITATIVE DATA ANALYSIS

Jennifer B. Averill

Qualitative data analysis aims to make sense of the abundant, varied, mostly nonnumeric forms of information that accrue during an investigation. As qualitative researchers, we reflect not only on each piece of data by itself but also on all the data as an integrated, blended, composite package. Increasingly, qualitative researchers are participants in interdisciplinary, mixed-methods research teams for which analytic and interpretive processes are necessarily complementary, distinct, clearly articulated, and critical to the larger investigation. We search for insight, meaning, understanding, and larger patterns of knowledge, intent, and action in what we generate as data. Approaching this task in a responsive, inductive, transparent, yet systematic way demands our best balance of good science, appropriate rigor and quality, and openness to unanticipated findings. Many qualitative studies now include multiple sources of data for analysis, including narrative or textual information and visual materials (e.g., photographs, videos, creative works and art, and theatric or performative components). Thorne (2008) described the analytic process as moving “from pieces to patterns” (p. 142) through the activities of organizing, reading and reviewing mindfully, coding, reflection, thematic derivation, and finding meaning.

WHAT ARE THE COMPONENTS OF QUALITATIVE DATA ANALYSIS?

Regardless of the kind of qualitative design one uses or data one generates, the overarching approach incorporates the following phases in whatever way has been planned or negotiated with participants and stakeholders: data generation; data display; data reduction; data analysis and interpretation (meaning-making/conclusion-drawing); assuring the integrity, transparency, and accuracy of all activities and findings; and dissemination. Stakeholders in qualitative work include the research team, the academic partners, and the specific community or group partners (e.g., community groups, citizen groups, families, students, providers, planners, tribes, important others, representatives of organizations). For qualitative studies, because the researcher is the instrument of reflection, analysis, and interpretation, the phases are not strictly linear. In fact, data generation and analysis usually proceed concurrently, and if a need arises to clarify or revisit something with participants, the phases may overlap or repeat before running their course. A recent study of mine demonstrates these components in action and helps to clarify their substance and importance. Visually, this array of activities can be represented as a funnel, shown in Figure 1.1.


Figure 1.1. Visualizing the process of qualitative data analysis.

HOW WAS THE QUALITATIVE DATA ANALYSIS FOR THIS STUDY CONDUCTED?

Overall Purpose and Background of the Study

The purpose of my long-time rural health research in the Southwestern United States, including the study represented here, is threefold: to describe, understand, and critically analyze the meaning of health and key health disparities, barriers, problems, and priorities for rural older adults from their perspective and in their own words; to facilitate contextually and culturally congruent solutions to their identified needs, invoking the voices, assets, and actions of community members; and to disseminate and situate findings within the contexts of communities, public health discourse, and the nursing discipline. The study I am using to describe the activities of qualitative data analysis is Health Care Perceptions and Issues for Rural Elders (National Institutes of Health Grant 1R15NR008217-01A2). The major aim of that study was to analyze, for community-dwelling adults aged 60 years and older, definitive indicators of health care disparities, such as affordability of prescriptions, access to basic and specialty care services, and the quality of interactions between these adults and their providers. Situated near the border of New Mexico and Mexico, this critical ethnography had a sample of 64 participants, covered three rural counties, unfolded in the context of community-based participatory research (CBPR) as interpreted by Wallerstein and Duran (2008), and posed considerable challenges in accurate data analysis.

Actual Practice of Data Analysis

Sources of data included ethnographic interviews (taped and transcribed), field notes, my critical–reflective journal, archival review notes (relevant historical, news-related, eligibility, and care-related documents, excluding personal records), and photography. Stake (2010) noted that using multiple sources of data helps qualitative researchers to answer research questions more completely and deepens the meaning of the findings. Procedures used for the phases of data analysis included the following steps and components:

    1.  Electronic capture, software management: All data were readied for analysis by electronic capture: transcripts, field notes, archival notes, and journal reflections were recorded in Microsoft Word initially. From Word, all narrative data were transferred into ATLAS.ti (2011; version 6) for more precise, line-by-line scrutiny, processing, and organization through the analysis. For small qualitative studies, there is no need for a sophisticated and expensive software package to assist with the organization and processing of data. Because it is the researcher, not the software, who does the work of making decisions, discerning what is and is not important, and choosing the best ways to manage and interpret the abundant data, any word-processing package can suffice for a small sample or modestly sized study. However, in an ongoing program of research, it is helpful and easier to track data, trends in findings, activities of project collaborators, and layers of developing evidence by using specialized software, especially if the work is preserved and followed in a cloud-based computing shared space (Griffith, 2013). Novice researchers and graduate students still developing their research focus and software preference can benefit from less expensive, student versions of the larger software packages, such as ATLAS.ti or NVivo (2014; version 10), as well as free, open-source packages, such as the Centers for Disease Control and Prevention’s EZ-Text for PCs (2000; version 4), or alternative free packages for Macs, available in any good online search for open-source software. Photographs taken during my study were saved in a separate file for later analysis and contribution to overall understanding of findings; however, the larger software packages can assimilate multiple kinds of visual data into the data capture, so those materials can be retrieved and placed in context as needed during the analysis and interpretation.

    2.  Detailed reading of all individual transcripts, field notes, archival notes, and reflective notes, followed by open or first-level coding: Coding is a process of early sense making of all the data; a component of data reduction, it may be thought of as a process of annotating and disentangling a mass of data (Flick, 2009) or, as Madison (2012) described it, “the process of grouping together … categories that you have accumulated in the field” (p. 43). I see codes as analogous to individual atoms in a molecule or specific concepts in a model or theory: they are the smallest distinct units of meaning that one begins to find by synthesizing the raw data into distinct ideas or conceptual units. For instance, in my study, some of the initial codes were “too far from the doctors,” “choosing between food and medications,” and “hard to get around.” There are two outcomes that one hopes for in this initial coding:

          (a)  The first is documenting distinct codes as one reads the data, usually as handwritten or electronic thumbnail notes, such as my examples, placed at the margins of the text or entered in whatever form the software requires. A code may be written in the investigator’s words or the words of a participant, whichever captures the essence of the segment that generated it. The coded segment may cover a line or two or even several lines of content in a text, depending on what is said; one can use brackets or some other way of noting how many lines are involved in each distinct code of the individual interviews, sets of notes, or archival entries.

          (b)  The second outcome is the cleaning of the data. In my study, as I read through each document, I marked, then eliminated or refiled elsewhere, segments of text that were not useful or relevant to the study questions or the project in general. Examples of this were comments by the participant and me when a loud thunderstorm interrupted our interview and we had to move inside, close windows, and so forth. I documented it in notes but then removed that passage from the record because it was not directly relevant to the questions or the study. Another time, several friends of the participant came by while we were talking, briefly interrupted the discussion to say hi, and then walked away from us. All of that was recorded in the transcript but removed because it was not pertinent. In another instance, I was asked by a family to go back over the history of my research, explaining how I got to the present moment, how this study evolved from earlier ones, and so forth. In this case, I stored the segment in a new file in case I wanted to use it some other time; but it was not important to the immediate study, so it was removed from the large body of data to be coded and analyzed.

                After coding each document, researchers may notice that there are commonly coded segments of text throughout the set of documents, or they may observe that similarly coded segments across the individual documents could be collapsed and combined into a commonly coded label. For instance, a segment coded as “not enough money to pay for meds” in one interview might be similar to the segment “too many expenses to buy my meds” coded in another interview. The researcher may decide to recode both of these as “not enough money for meds” (a sketch of this merging step follows this list). In this way, initial coding ends with a number of distinct codes distributed among the various documents being coded. Redundancy is avoided so that there are no remaining cases of two very closely worded but different codes.

    3.  Second-level or sequential coding: Second-level coding consists of extracting commonly coded segments from all individual documents and placing them in new documents holding all instances of commonly coded items, creating composite collections of distinct conceptual categories. This moves the coding and synthesis from the individual document level to the level of group data for a second level of refinement or coding across the new composite documents. Metaphorically, this moves the process down into the narrowing segment of the funnel (see Figure 1.1) in qualitative data analysis. The outcome of this process is a new set of distinct codes created by a synthesis and integration of previous coding across all available data. In an ethnographic study such as mine, the second-level codes represented the final array of distinct conceptual ideas common to all data generated. At this point, the researcher develops a codebook, or a listing of distinct codes, each with its own definition or description. The codebook is preserved as a supplementary or appendix-level document for reference and auditability as one moves on to thematic derivation, interpretation, and dissemination. In my study, examples of final codes across all data included “inadequate resources for managing health,” “fragmented services,” and “cultural tensions.” The first sketch following this list illustrates this movement from individually coded documents to composite categories and a codebook.

    4.  Inclusion of visual data as complementary information: As researchers, we use language and words as our collateral. We conduct studies, write about them, and talk about them using words to convey our meaning. Yet, in qualitative work, there is an opportunity to enhance, enrich, illustrate, or demonstrate some things or ideas that cannot be adequately expressed with words. Depending on the audience for our work, visual data can sometimes convey meaning, insight, impact, or significance much more quickly and effectively than words can alone. For instance, original creative works, handmade foods, tools, photographs, or videos can reveal a great deal about the ones who generate these things—things that might be missed or never asked about in traditional research methods. Gubrium and Harper (2013) noted, “Emergent digital and visual methodologies, such as digital storytelling and participatory digital archiving, are changing the ways that social scientists conduct research and are opening up new possibilities for participatory approaches that appeal to diverse audiences and reposition participants as co-producers of knowledge and potentially as co-researchers” (p. 13). In Sullivan’s (2010) words, “What is common is the attention to systematic inquiry, yet in a way that privileges the role imagination and intellect plays in constructing knowledge that is not only new but has the capacity to transform human understanding” (p. xix).

              In my study, when reporting on the challenges an older adult had living alone in the mountains, two photographs of her yard, showing a very steep embankment just outside her front door, demonstrated how difficult it would be for her to meet a transport van to take her to the doctor. The viewers’ eyes immediately recognized the impact of her environment on her capacity to travel anywhere. In an earlier study of mine, when visiting migrant farmers’ labor camps in the evening, I was struck by the poverty, yet one woman made and offered fresh, warm tortillas to the nurses when we visited, showing that regardless of their resources, the residents had the capacity and desire to share something of themselves.

    5.  Thematic analysis and interpretation: This is the phase of synthesis and integration of the recurrent patterns and linkages between and among codes, emergent across all of the data, into distinct themes or propositional statements. Codes common across all data are now linked propositionally in some tentatively meaningful way, pending new evidence. Atoms are meaningfully linked by chemical bonds that create, in this union, a separate and important, larger substance, such as water (hydrogen and oxygen). Individual concepts are meaningfully linked by propositional statements in a theory or model, such as the concepts of health, health promotion, stress, and prevention being linked conceptually in a model for health promotion. In my study, codes were linked propositionally by suggesting data-based, potentially testable relationships between or among them. Linguistically, themes are larger units of meaning than codes and may be stated as longer phrases or even declarative sentences. Two themes from my study were: (a) older adults experience inadequate access to both primary and specialty care in rural areas, and (b) resources are scarce for frail older adults trying to remain at home as they age. Themes are what qualitative researchers call their findings or results, and they often form the basis for future research or interventions.

    6.  Matrix analysis as a complementary strategy: Matrix analysis in qualitative analysis is simply the cross-matching of an x-axis (one set of categorical elements) and a y-axis (a second set of categorical elements) for the purposes of data reduction, display, synthesis, analysis, and interpretation. Using bullet points of succinct information, the cells of a matrix are informative and comparative. Viewers can see an immediate visual comparison of information across research questions, categories of participants, or other designated classifications of data. A matrix can be descriptive, process oriented, or outcome focused, depending on researcher preferences. For one of my studies (Averill, 2002), I used matrix analysis to depict overall findings across participant categories, specific settings for data generation, demographic variations, and researcher reflections for each finding. I continue to use this tool for synthesis and CBPR interactions. A minimal sketch of such a matrix appears in the second example below.
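
To ground steps 2, 3, and 5, the following minimal Python sketch shows one way the underlying bookkeeping can be represented: coded segments, the merging of closely worded codes, composite collections per code, a codebook, and a theme linking codes. The documents, line ranges, and definitions are hypothetical illustrations modeled on the examples above, not the study’s actual data, and the sketch merely stands in for the organizing work that packages such as ATLAS.ti or NVivo perform.

    from collections import defaultdict

    # First-level (open) coding: each coded segment records its source
    # document, the lines it covers, and the code assigned to it.
    # (Hypothetical segments, modeled on the chapter's examples.)
    segments = [
        {"doc": "interview_01", "lines": (12, 15),
         "code": "not enough money to pay for meds"},
        {"doc": "interview_02", "lines": (40, 44),
         "code": "too many expenses to buy my meds"},
        {"doc": "interview_03", "lines": (60, 61),
         "code": "too far from the doctors"},
    ]

    # Collapse two closely worded codes into one common label.
    merge_map = {
        "not enough money to pay for meds": "not enough money for meds",
        "too many expenses to buy my meds": "not enough money for meds",
    }
    for seg in segments:
        seg["code"] = merge_map.get(seg["code"], seg["code"])

    # Second-level coding: extract commonly coded segments from the
    # individual documents into one composite collection per code.
    composites = defaultdict(list)
    for seg in segments:
        composites[seg["code"]].append(seg)

    # The codebook pairs each distinct code with a working definition
    # (placeholder wording, not the study's actual definitions).
    codebook = {
        "not enough money for meds": "Cost blocks access to medications.",
        "too far from the doctors": "Distance blocks access to providers.",
    }

    # Thematic derivation then links codes propositionally; a theme is
    # a larger unit of meaning stated as a phrase or full sentence.
    themes = {
        "older adults experience inadequate access to care":
            ["too far from the doctors", "not enough money for meds"],
    }

    # Summarize how widely each composite code is distributed.
    for code, segs in composites.items():
        docs = {s["doc"] for s in segs}
        print(f"{code}: {len(segs)} segment(s) in {len(docs)} document(s)")

For a small study, the same structures fit comfortably in a word processor or spreadsheet; the specialized packages earn their keep as the volume of documents, codes, and collaborators grows.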
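
In the same spirit, the matrix in step 6 can be represented as a simple cross-tabulation of findings (y-axis) against participant categories (x-axis), with succinct bullet points in each cell. The cell contents below are invented placeholders, not findings from the study.

    # Matrix analysis: each cell pairs one finding with one participant
    # category and holds short, comparative bullet points.
    findings = [
        "inadequate access to care",
        "scarce resources for aging at home",
    ]
    groups = ["older adults", "family caregivers", "providers"]

    matrix = {
        ("inadequate access to care", "older adults"):
            ["no specialists nearby", "long waits for appointments"],
        ("inadequate access to care", "providers"):
            ["clinics understaffed"],
        ("scarce resources for aging at home", "family caregivers"):
            ["no respite services", "unreliable transport vans"],
    }

    # Print a row-by-column display for an immediate visual comparison.
    for finding in findings:
        print(finding)
        for group in groups:
            points = matrix.get((finding, group), [])
            print(f"  {group}: " + ("; ".join(points) if points else "-"))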

Integrating Strategies for Methodological Rigor

I agree with Morse and colleagues (2002) that without some kind of ongoing methodological rigor and verification of the work—both the process and the findings—all research (including qualitative) may be undependable or useless. Altheide and Johnson (2011) stated, “It is necessary to give an accounting of how we know things, what we regard and treat as empirical materials—the experiences—from which we produce our second (or third) accounts of ‘what was happening’” (p. 591). Cohen and Crabtree (2008) suggested that all research should attend to the following criteria in general: ethical conduct, choosing important research that advances knowledge, good writing, appropriate and rigorous methods, managing researcher bias, establishment of reliability (verification), and validity (credibility). Yet, they argued that it is not easy to agree on a single, always applicable set of criteria by which to ensure quality, representativeness, and methodological rigor in all qualitative studies, because “qualitative research is grounded in a range of theoretical frameworks and uses a variety of methodological approaches to guide data collection and analysis” (Cohen & Crabtree, 2008, p. 336). Their statement suggests that consensus on qualitative rigor is not yet a reality, and I concur with that assessment.

The recognized and classic depiction of methodological rigor in qualitative analysis came from the work of Lincoln and Guba (1985). Their suggested criteria of credibility (internal validity), transferability (external validity), dependability (reliability), and confirmability (objectivity) are still cited today as the bedrock of qualitative verification. They suggested operationalizing the criteria by practicing “prolonged engagement (with participants), persistent observation, triangulation (of sources, methods, investigators and theories)” (p. 301), peer debriefing, negative case analysis, some form of member checking with participants, and maintaining a transparent audit trail of all research activities. I do not disagree with these criteria, but I think the Cohen and Crabtree (2008) perspective should not be ignored in the discourse about qualitative rigor.
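
The audit trail, in particular, lends itself to a lightweight mechanical aid. The sketch below assumes an append-only JSON-lines file as one possible format, not a prescribed standard: each entry timestamps a single analytic decision so the path from raw data to findings remains transparent and reviewable.

    import json
    from datetime import datetime, timezone
    from pathlib import Path

    # Append-only log of analytic decisions (hypothetical format).
    AUDIT_LOG = Path("audit_trail.jsonl")

    def log_decision(activity: str, rationale: str) -> None:
        """Append one timestamped analytic decision to the audit trail."""
        entry = {
            "timestamp": datetime.now(timezone.utc).isoformat(),
            "activity": activity,
            "rationale": rationale,
        }
        with AUDIT_LOG.open("a", encoding="utf-8") as f:
            f.write(json.dumps(entry) + "\n")

    # Example: documenting the merging of two closely worded codes.
    log_decision(
        activity="merged two codes into 'not enough money for meds'",
        rationale="both segments described the same financial constraint",
    )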

Lincoln (2002) enriched the dialogue on methodological quality by suggesting a new set of criteria that incorporates the dimensions of social justice, caring, and community perspective. Her criteria resonate with traditionally marginalized groups or people mistrustful of research in general. They also reflect a broader commitment to relational dynamics with study participants and the social value of disciplined inquiry. Specifically, she encouraged qualitative researchers to address the criteria of positionality or standpoint judgments, communities as arbiters of quality work, attention to voice for all participants, critical subjectivity (critical reflexivity), reciprocity, sacredness (honoring the ecological alongside the human dimensions), and sharing the privileges of publication and recognition with participants. I have found her criteria valuable in studies that incorporate CBPR and the democratization of research as an overall approach, such as the study I shared in this chapter. For that study, I applied the criteria of transparency, partnership, precision, evidence, and compassion (Averill, 2012), which support both good science and a more socially engaged approach to inquiry.

Dissemination of Findings

After a transparent, detailed analysis of the data, with attention to the integrity of all phases of work, it is time to close the loop and share the findings. Dissemination is vital not only for a CBPR study like the one I shared here but for any good research. It is the scholar’s responsibility to own his or her work, articulate his or her voice in the discourse on the topic, and allow peers and others to review what has been done and how it was achieved. I propose that two levels of dissemination exist for all of us. One is the obvious step of publishing and presenting the research to colleagues, professional peers, students, and funding organizations. However, a second and no less important venue is to share the work with the people directly affected by it (e.g., community or organizational stakeholders, gatekeepers, advisors, members of the local media, and possibly policy makers). In all studies for my research program, I have asked key community partners and advisors how they would like me to use the findings for their benefit and how I should share the findings. Their responses have included such steps as presenting in-services or executive summaries of the work (in plain language, not scientific jargon) at community or organizational meetings, with ample opportunities for the attendees and interested public listeners to ask questions; contributing to the county’s website of health-related information and activities; and finally, presenting to students at the local community college or university. In the spirit of the criteria mentioned previously and the commitment to a more relational discourse with the public about the work we do as investigators, I think it is also valuable to invite at least a few of the key stakeholders or partners to help shape the manuscripts and presentations, and possibly even to be mentioned as contributors or coauthors, depending on the extent of their roles.

CONCLUSION

As a long-time critical ethnographer and CBPR investigator in rural health disparities, I share several assumptions that informed this work: (a) All people are entitled to know what research is, why it is done, and for whose benefit; the implication of this is that they may be better informed so they may decide for themselves whether or to what extent they want to participate. (b) All people hold knowledge that benefits not only themselves and their communities and organizations but also the work of science, health care, and reducing inequities. (c) It is possible to conduct rigorous research while simultaneously respecting, honoring, and benefiting the residents in all kinds of communities. (d) Well-done qualitative research is a complement to additional types and kinds of inquiry; it allows a personal perspective, voice, and experiential presence to be a part of meaningful inquiry to describe, explain, predict, enlighten, measure, and/or improve life and health for all people. (e) Like all forms of systematic inquiry, qualitative research is a work in progress, sensitive to the changes, contexts, challenges, priorities, and other factors that comprise the human condition, in all types of settings. It is from this perspective that I offer these pages to all people.

REFERENCES

Altheide, D. L., & Johnson, J. M. (2011). Reflections on interpretive adequacy in qualitative research. In N. K. Denzin & Y. S. Lincoln (Eds.), The Sage handbook of qualitative research (4th ed., pp. 581–594). Los Angeles, CA: Sage.

ATLAS.ti Scientific Software Development GmbH. (2011). ATLAS.ti (Version 6) [Computer software]. Retrieved from http://www.atlasti.com/index.html

Averill, J. B. (2002). Matrix analysis as a complementary analytic strategy in qualitative inquiry. Qualitative Health Research, 12, 855–866.

Averill, J. B. (2012). Priorities for action in a rural older adults study. Family & Community Health, 35, 358–372.

Centers for Disease Control and Prevention. (2000). CDC EZ-Text (Version 4) [Computer software]. Retrieved from http://www.cdc-eztext.com/

Cohen, D. J., & Crabtree, B. F. (2008). Evaluative criteria for qualitative research in health care: Controversies and recommendations. Annals of Family Medicine, 6, 331–339.

Flick, U. (2009). An introduction to qualitative research (4th ed.). Los Angeles, CA: Sage.

Griffith, E. (2013, March). What is cloud computing? PC Magazine. Retrieved from http://www.pcmag.com/article2/0,2817,2372163,00.asp

Gubrium, A., & Harper, K. (2013). Participatory visual and digital methods. Walnut Creek, CA: Left Coast Press.

Lincoln, Y. S. (2002). Emerging criteria for quality in qualitative and interpretive research. In N. K. Denzin & Y. S. Lincoln (Eds.), The qualitative inquiry reader (pp. 327–345). Thousand Oaks, CA: Sage.

Lincoln, Y. S., & Guba, E. G. (1985). Naturalistic inquiry. Newbury Park, CA: Sage.

Madison, D. S. (2012). Critical ethnography: Method, ethics, and performance (2nd ed.). Los Angeles, CA: Sage.

Morse, J. M., Barrett, M., Mayan, M., Olson, K., & Spiers, J. (2002). Verification strategies for establishing reliability and validity in qualitative research. International Journal of Qualitative Methods, 1(2), Article 2. Retrieved from http://www.ualberta.ca/~ijqm

QSR International. (2014). NVivo (Version 10) [Computer software]. Retrieved from http://www.qsrinternational.com/products_nvivo.aspx?utm_source=NVivo+10+for+Mac

Stake, R. E. (2010). Qualitative research. New York, NY: Guilford Press.

Sullivan, G. (2010). Art practice as research (2nd ed.). Los Angeles, CA: Sage.

Thorne, S. (2008). Interpretive description. Walnut Creek, CA: Left Coast Press.

Wallerstein, N., & Duran, B. (2008). The theoretical, historical, and practice roots of CBPR. In M. Minkler & N. Wallerstein (Eds.), Community-based participatory research for health: From process to outcomes (2nd ed., pp. 25–46). San Francisco, CA: Jossey-Bass.