"If there were only one truth, you couldn't paint a hundred canvases on the same theme." (Pablo Picasso, 1966)

Introduction

As one of today's most ...
The trustworthiness of the study was discussed in terms of the criteria suggested by Lincoln and Guba (1985): credibility, dependability, transferability, and confirmability. Credibility was established mainly through member checking and peer debriefing. Member checking was used in four ways at various stages of data collection and analysis: (1) at the pilot stage, the interviewer discussed the interview questions with participants at the end of each interview; (2) during the formal interviews, the interviewer fed ideas back to participants to refine, rephrase, and interpret; (3) in an informal post-interview session, each participant was given the chance to discuss the findings; and (4) an additional session was conducted with a sample of five participants who were willing to provide feedback on the transcripts of their own interviews and to evaluate the research findings. Peer debriefing was used in the study to "confirm interpretations and coding decisions including the development of categories" (Foster, 2004, p. 231). No further details about who conducted the debriefing, or how it was conducted, were reported in the paper.
The transferability of the present study was ensured by "rich description and reporting of the research process" (Foster, 2004, p. 230). Future researchers can make transferability judgments based on the detailed description provided by Foster. The issues of dependability and confirmability were addressed through the author's "research notes, which recorded decisions, queries, working out, and the development results" (Foster, 2004, p. 230). By referring to these materials, Foster could audit his own inferences and interpretations, and other interested researchers could review the research findings.
The content analysis findings were reported by describing each component in the model of information seeking behaviors in interdisciplinary contexts that emerged from this study. Diagrams and tables were used to facilitate the description. A few quotations from participants were provided to reinforce the author’s abstraction of three processes of interdisciplinary information seeking: opening, orientation, and consolidation. Finally, Foster discussed the implications of the new model for the exploration of information behaviors in general.
Conclusion

Qualitative content analysis is a valuable alternative to more traditional quantitative content analysis when the researcher is working within an interpretive paradigm.
The goal is to identify important themes or categories within a body of content, and to provide a rich description of the social reality created by those themes/categories as they are lived out in a particular setting. Through careful data preparation, coding, and interpretation, the results of qualitative content analysis can support the development of new theories and models, the validation of existing theories, and the provision of thick descriptions of particular settings or phenomena.
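As a minimal illustration of the bookkeeping side of this process (not a technique used in any of the studies discussed here), the step of tallying how often coded categories appear across unitized text segments can be sketched in Python. The codebook and interview segments below are entirely hypothetical, and the keyword matching stands in for what would, in real qualitative content analysis, be human coders applying codes that emerged from the data:

```python
from collections import Counter

# Hypothetical codebook: category -> indicator terms.
# In practice, codes emerge inductively from the data and are
# applied by human coders; keyword matching here only illustrates
# organizing and tallying segments that have been coded.
codebook = {
    "opening": ["browsing", "networking"],
    "orientation": ["problem definition", "review"],
    "consolidation": ["sifting", "verifying"],
}

# Hypothetical interview excerpts, already unitized for coding.
segments = [
    "I started by browsing journals outside my field.",
    "Networking with colleagues opened up new sources.",
    "Then I spent time on problem definition and review.",
    "Finally I was sifting and verifying what I had found.",
]

def code_segment(text, codebook):
    """Return the categories whose indicator terms appear in a segment."""
    return [cat for cat, terms in codebook.items()
            if any(term in text.lower() for term in terms)]

# Tally category frequencies across all segments.
tally = Counter(cat for seg in segments for cat in code_segment(seg, codebook))
for category, count in tally.items():
    print(category, count)
```

The frequency table produced this way would then feed back into interpretation, for example by prompting the analyst to revisit categories that occur rarely or co-occur unexpectedly.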
Cited Works

Allen, B., & Reser, D. (1990). Content analysis in library and information science research. Library & Information Science Research, 12(3), 251-260.
Berg, B.L. (2001). Qualitative Research Methods for the Social Sciences. Boston: Allyn and Bacon.
Bradley, J. (1993). Methodological issues and practices in qualitative research. Library Quarterly, 63(4), 431-449.
De Wever, B., Schellens, T., Valcke, M., & Van Keer, H. (2006). Content analysis schemes to analyze transcripts of online asynchronous discussion groups: A review. Computers & Education, 46, 6-28.
Denzin, N.K. (1989). Interpretive Interactionism. Newbury Park, CA: Sage.
Foster, A. (2004). A nonlinear model of information-seeking behavior. Journal of the American Society for Information Science & Technology, 55(3), 228-237.
Glaser, B.G., & Strauss, A.L. (1967). The Discovery of Grounded Theory: Strategies for Qualitative Research. New York: Aldine.
Hsieh, H.-F., & Shannon, S.E. (2005). Three approaches to qualitative content analysis. Qualitative Health Research, 15(9), 1277-1288.
Lincoln, Y.S., & Guba, E.G. (1985). Naturalistic Inquiry. Beverly Hills, CA: Sage Publications.
Mayring, P. (2000). Qualitative content analysis. Forum: Qualitative Social Research, 1(2). Retrieved July 28, 2008, from http://www.qualitative-research.net/fqs-texte/2-00/2mayring-e.pdf.
Miles, M., & Huberman, A.M. (1994). Qualitative Data Analysis. Thousand Oaks, CA: Sage Publications.
Minichiello, V., Aroni, R., Timewell, E., & Alexander, L. (1990). In-Depth Interviewing: Researching People. Hong Kong: Longman Cheshire.
Neuendorf, K.A. (2002). The Content Analysis Guidebook. Thousand Oaks, CA: Sage Publications.
Patton, M.Q. (2002). Qualitative Research and Evaluation Methods. Thousand Oaks, CA: Sage.
Picasso, P. (1966). Quoted in H. Parmelin, "Truth," in Picasso Says (trans. 1969). London: Allen & Unwin.
Schamber, L. (2000). Time-line interviews and inductive content analysis: Their effectiveness for exploring cognitive behaviors. Journal of the American Society for Information Science, 51(8), 734-744.
Schamber, L. (1991). Users’ Criteria for Evaluation in Multimedia Information Seeking and Use Situations. Ph.D. dissertation, Syracuse University.
Schilling, J. (2006). On the pragmatics of qualitative assessment: Designing the process for content analysis. European Journal of Psychological Assessment, 22(1), 28-37.
Smith, H.W. (1975). Strategies of Social Research: The Methodological Imagination. Englewood Cliffs, NJ: Prentice-Hall.
Tesch, R. (1990). Qualitative Research: Analysis Types & Software Tools. Bristol, PA: Falmer Press.
Weber, R.P. (1990). Basic Content Analysis. Newbury Park, CA: Sage Publications.