Abstract
The subject of this paper is a robust user satisfaction evaluation of an educational distance-learning platform using multicriteria satisfaction analysis (MUSA), an innovative and consistent decision-making algorithm that yields analytical satisfaction charts and improvement action charts. The educational platform evaluated is Moodle. The MUSA criteria used for the present analysis are: (1) technical dimension, (2) possibilities of teachers, (3) possibilities of participants, (4) pedagogical dimension, and (5) automated functions. The originality of this research lies in the fact that the MUSA criteria weights are calculated both for the total number of participants in the present study and for smaller sample subgroups representing different levels of satisfaction (an above-average grade represents overall satisfied users and a below-average grade represents overall dissatisfied users), age, gender, and identity (teachers or university students). The selected cluster sampling leads to differentiated criteria weights and action diagrams in the MUSA algorithm. The selected methodology is a crucial step towards optimizing the existing user satisfaction algorithm and leads to more robust and valid results. The modified method is therefore called the cluster sampling MUSA algorithm (CSMUSA); it leads to an enhanced decision-making procedure, which is considered fundamental for the constant improvement of any educational platform and software, and could be implemented by software companies during the design process.
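The cluster-sampling step described above can be pictured as follows: responses are partitioned into subgroups (overall satisfaction level, identity, etc.) and the MUSA weight estimation is re-run on each subgroup. The sketch below is only illustrative; the Response fields, the subgroup names, and the musa_weights estimator passed in are hypothetical placeholders introduced here, not the authors' implementation, and actual MUSA weight estimation is based on an ordinal-regression linear program rather than the simple callback shown.

from dataclasses import dataclass
from statistics import mean

# The five satisfaction criteria named in the abstract.
CRITERIA = ["technical dimension", "possibilities of teachers",
            "possibilities of participants", "pedagogical dimension",
            "automated functions"]

@dataclass
class Response:
    overall: int      # global satisfaction grade on an ordinal scale
    criteria: dict    # criterion name -> ordinal satisfaction grade
    age: int
    gender: str
    identity: str     # "teacher" or "student"

def split_by_overall(responses):
    """Cluster respondents into satisfied / dissatisfied subgroups relative to
    the average overall grade, as described in the abstract. Grades equal to
    the average are grouped with the satisfied users here (an assumption; the
    abstract does not specify tie handling)."""
    avg = mean(r.overall for r in responses)
    return {"satisfied": [r for r in responses if r.overall >= avg],
            "dissatisfied": [r for r in responses if r.overall < avg]}

def weights_per_subgroup(responses, musa_weights):
    """Estimate criteria weights for the whole sample and for each subgroup by
    re-running the supplied MUSA weight estimator on every cluster."""
    groups = {"all": responses, **split_by_overall(responses)}
    groups["teachers"] = [r for r in responses if r.identity == "teacher"]
    groups["students"] = [r for r in responses if r.identity == "student"]
    return {name: musa_weights(group) for name, group in groups.items() if group}

Used this way, the full-sample weights and the per-subgroup weights can then be compared to produce the differentiated action diagrams that the abstract attributes to CSMUSA.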
License
This is an open access article distributed under the Creative Commons Attribution License which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
Article Type: Research Article
EURASIA J Math Sci Tech Ed, 2023, Volume 19, Issue 9, Article No: em2320
https://doi.org/10.29333/ejmste/13472
Publication date: 01 Sep 2023
Online publication date: 10 Jul 2023