MEDEIROS, D. C.; http://lattes.cnpq.br/5473318071754607; MEDEIROS, Danielle Chaves de.
Abstract:
Selecting the criteria for analyzing information during the evaluation of student participation in discussion forums of distance courses is a major challenge. Many variables must be considered, in addition to the subjectivity inherent in the instructor's analysis, which is subject to human error.
Instructors generally do not have all the necessary resources at their disposal, so methodologies or tools that can assist them in this process are needed. To meet this demand, and after studying the main qualitative and quantitative indicators used by distance education teachers, we developed a framework for analyzing student participation in forums. The aim of this framework is to support the decision-making process by providing a more effective mechanism for measuring the quantity and quality of interactions, one capable of adjusting itself to the traditional methodology adopted by each teacher. The framework was validated through questionnaires that surveyed the opinions of teachers active in distance learning, and through case studies assessing the accuracy of instances of the framework in calculating students' participation grades. The study involved the development of an Expert System for treating and processing the data, using similarity functions to perform, semi-automatically, the assessment of the content of the students' messages. This made it possible to compare the calculated participation grades with the grades assigned by the teacher using the traditional approach. The results showed that, in three of the five classes observed, it was not possible to verify the existence of statistically significant differences between the performance of the two approaches. A study of accuracy and correlation showed that, in all cases analyzed, there is a strong relationship between the data, with an average error below 3%, demonstrating the applicability of the proposed framework to the assessment of student participation in forums.
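To make the idea of scoring message content with a similarity function concrete, the sketch below compares a student's forum message against a reference text using cosine similarity over simple bag-of-words term counts. The abstract does not specify which similarity function the Expert System uses, so both the choice of cosine similarity and the grade-scaling comment are illustrative assumptions, not the dissertation's actual method.

```python
# Hypothetical sketch (assumed technique): cosine similarity between
# term-frequency vectors of a student message and a reference answer.
import math
import re
from collections import Counter

def bag_of_words(text: str) -> Counter:
    """Lowercase, tokenize on word characters, and count term frequencies."""
    return Counter(re.findall(r"\w+", text.lower()))

def cosine_similarity(a: str, b: str) -> float:
    """Cosine of the angle between the two term-frequency vectors, in [0, 1]."""
    va, vb = bag_of_words(a), bag_of_words(b)
    dot = sum(va[t] * vb[t] for t in va.keys() & vb.keys())
    norm = (math.sqrt(sum(c * c for c in va.values()))
            * math.sqrt(sum(c * c for c in vb.values())))
    return dot / norm if norm else 0.0

reference = "Distance learning requires active participation in forums"
message = "Active participation in forums is essential in distance learning"
score = cosine_similarity(message, reference)
# A participation grade could then be scaled from the score,
# e.g. grade = round(10 * score, 1) -- again, an illustrative assumption.
```

In practice, such a score would be only one input among the qualitative and quantitative indicators the framework combines, and the semi-automatic design keeps the teacher in the loop to review borderline cases.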