Computing reliability and inter-coder agreement of a category system for the study of the online forum in e-learning

Authors

  • Juan Jesús Torres Gordillo
  • Víctor Hugo Perera Rodríguez
Keywords: inter-rater reliability, Fleiss’ Kappa, coding scheme, online discussion board, e-learning

Abstract

We present detailed results on the inter-rater reliability of a coding scheme for higher education online discussion boards, as part of a broader study of asynchronous communication in e-learning. We used Fleiss’ Kappa coefficient (κ) for three raters, obtaining a value of κ = 0.77. According to the interpretation tables proposed by several authors, this value indicates a high, or good, strength of agreement. The high reliability of this coding scheme means it can be applied by any researcher at any time, and it supports results that explain the role of communication and teaching-learning processes in e-learning.
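For readers who want to reproduce this kind of calculation, the following is a minimal sketch of Fleiss’ Kappa in Python. The rating matrix and the four categories are invented for illustration and are not the authors’ data; the computation follows the standard Fleiss (1971) formulation for a fixed number of raters per item.

```python
import numpy as np


def fleiss_kappa(counts: np.ndarray) -> float:
    """Fleiss' Kappa for a subjects-by-categories count matrix.

    counts[i, j] = number of raters who assigned subject (message) i
    to category j. Every row must sum to the same number of raters.
    """
    counts = np.asarray(counts, dtype=float)
    n_subjects, _ = counts.shape
    n_raters = counts[0].sum()

    # Proportion of all assignments falling in each category.
    p_j = counts.sum(axis=0) / (n_subjects * n_raters)

    # Per-subject agreement: rater pairs that agree, out of all pairs.
    p_i = (np.square(counts).sum(axis=1) - n_raters) / (n_raters * (n_raters - 1))

    p_bar = p_i.mean()          # mean observed agreement
    p_e = np.square(p_j).sum()  # agreement expected by chance

    return (p_bar - p_e) / (1 - p_e)


# Illustrative example: 5 forum messages, 3 raters, 4 hypothetical categories.
# Each row counts how many of the 3 raters chose each category for that message.
ratings = np.array([
    [3, 0, 0, 0],
    [0, 2, 1, 0],
    [0, 3, 0, 0],
    [1, 0, 2, 0],
    [0, 0, 0, 3],
])
print(round(fleiss_kappa(ratings), 2))
```

The same count-matrix layout is accepted by statsmodels’ fleiss_kappa (statsmodels.stats.inter_rater), which can serve as a cross-check where that library is available.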

Published
11-02-2010
How to Cite
Torres Gordillo, J. J., & Perera Rodríguez, V. H. (2010). Cálculo de la fiabilidad y concordancia entre codificadores de un sistema de categorías para el estudio del foro online en e-learning. Journal of Educational Research, 27(1), 89–103. Retrieved from https://revistas.um.es/rie/article/view/94291
Section
Articles