Supporting Qualitative Analysis with Large Language Models: Combining Codebook with GPT-3 for Deductive Coding
Conference paper, Year: 2023

Supporting Qualitative Analysis with Large Language Models: Combining Codebook with GPT-3 for Deductive Coding

Ziang Xiao
Xingdi Yuan
Q. Vera Liao
Rania Abdelghani
Pierre-Yves Oudeyer

Abstract

Qualitative analysis of textual content unpacks rich and valuable information by assigning labels to the data. However, this process is often labor-intensive, particularly when working with large datasets. While recent AI-based tools demonstrate utility, researchers may not have readily available AI resources and expertise, and those task-specific models offer limited generalizability. In this study, we explored the use of large language models (LLMs) to support deductive coding, a major category of qualitative analysis in which researchers use a pre-determined codebook to label the data with a fixed set of codes. Instead of training task-specific models, a pre-trained LLM can be used directly for various tasks without fine-tuning, through prompt learning. Using a curiosity-driven question coding task as a case study, we found that, by combining GPT-3 with expert-drafted codebooks, our proposed approach achieved fair to substantial agreement with expert-coded results. We lay out challenges and opportunities in using LLMs to support qualitative coding and beyond.
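To make the described approach concrete, below is a minimal sketch of codebook-guided deductive coding with an LLM. The codebook entries, prompt wording, and model name are hypothetical illustrations, not the authors' actual materials, and the OpenAI chat API is used here as a stand-in for the GPT-3 completion endpoint discussed in the paper.

```python
# Illustrative sketch: prompt an LLM with an expert-drafted codebook to assign
# one code per item (deductive coding). Codebook and prompt are hypothetical.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical expert-drafted codebook: code name -> definition.
CODEBOOK = {
    "divergent": "The question explores possibilities beyond the given text.",
    "convergent": "The question asks for a specific fact stated in the text.",
    "off-topic": "The question is unrelated to the given text.",
}

def build_prompt(question: str) -> str:
    """Embed the full codebook and the item to be coded in a single prompt."""
    codebook_text = "\n".join(
        f"- {code}: {definition}" for code, definition in CODEBOOK.items()
    )
    return (
        "You are coding curiosity-driven questions with this codebook:\n"
        f"{codebook_text}\n\n"
        f"Question: {question}\n"
        "Answer with exactly one code name from the codebook."
    )

def code_question(question: str) -> str:
    """Ask the model for a single code label for one question."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # stand-in; the paper used GPT-3
        messages=[{"role": "user", "content": build_prompt(question)}],
        temperature=0,  # deterministic labeling for reproducibility
    )
    return response.choices[0].message.content.strip()

if __name__ == "__main__":
    print(code_question("What else could the character have done?"))
```

In practice, the LLM-assigned codes would then be compared against expert-coded labels with an agreement measure such as Cohen's kappa, which is how "fair to substantial agreement" is typically quantified.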

Dates and versions

hal-04369097 , version 1 (02-01-2024)

Cite

Ziang Xiao, Xingdi Yuan, Q. Vera Liao, Rania Abdelghani, Pierre-Yves Oudeyer. Supporting Qualitative Analysis with Large Language Models: Combining Codebook with GPT-3 for Deductive Coding. IUI 2023 - 28th International Conference on Intelligent User Interfaces, Mar 2023, Sydney, Australia. pp.75-78, ⟨10.1145/3581754.3584136⟩. ⟨hal-04369097⟩