Post-Study System Usability Questionnaire

The Post-Study System Usability Questionnaire (PSSUQ) is an evaluation tool used to assess perceived usability, and it does not require a license for its use. As the PSSUQ is a common and widely used usability tool, it has been translated into many languages, mainly European ones.

Keywords: Post-Study System Usability Questionnaire (PSSUQ); usability; educational technology

1. Introduction

In a rapidly evolving world, the integration of technology has revolutionized many aspects of everyday life, including education. Both educators and learners face the challenge of successfully embracing technological means to unlock interactive learning experiences and personalized instruction. In this context, the term educational technology emerged, along with the subsequent need for its interpretation. The Association for Educational Communications and Technology (2008) defines educational technology as “the study and ethical practice of facilitating learning and improving performance by creating, using and managing appropriate technological processes and resources” [1]: the systematic application of teaching methods, instructional techniques, multimedia, tools, and technology to continuously enhance learning, teaching, and academic outcomes. Huang [2], emphasizing the various contexts in which learning may take place, states that educational technology encompasses the utilization of tools, technologies, processes, resources, and methods aimed at enhancing learning encounters across diverse settings, including formal, informal, non-formal, lifelong, on-demand, workplace, and just-in-time learning. The field has progressed from the early adoption of teaching tools to a rapid expansion in recent times, now incorporating a wide range of devices and approaches such as mobile technologies, virtual and augmented realities, simulations, immersive environments, collaborative learning, social networking, cloud computing, flipped classrooms, and various other innovations. A simple, concise, recent definition of educational technology refers to the application of technology in diverse educational environments, aiming to optimize learning and enhance educational achievements [3].
Educational technology is widely employed in every aspect of the learning process, since it appears more appealing to students and teachers than traditional means. It is encountered at all educational stages, from preschool to university, in many types and forms. Specifically, technology use in education has proved beneficial for students’ motivation and engagement [4][5][6][7], self-confidence [4], increased student understanding [4], increased instructional differentiation [4][5], increased exposure to more current content material [4][6], and accessible, located learning [5].
However, the use of technological tools is not a panacea for achieving every learning objective. Even though it is often cited as a means of maximizing learning outcomes while reducing costs, further research is needed to estimate its true cost-effectiveness in each context [8]. The relationship between technology use and improved learning outcomes is mediated and influenced by many factors, such as users’ learning styles [9][10], the frequency of the technological system’s use [11], instructional methods [12][13], information and communication technologies (ICT) competency, and the technological system’s usability.
In 1998, the International Organization for Standardization established a widely accepted definition of usability. According to this definition, usability is the extent to which a product or system can be used effectively, efficiently, and satisfactorily by its users to achieve specific objectives in a particular environment [14]. Bevan et al. [15] emphasize usability as a result of interaction rather than an intrinsic characteristic of the product itself. Researchers initially focused primarily on the first two, objective, dimensions, but the need to evaluate the third, subjective, dimension subsequently arose [16].
At this point, it is of paramount importance to highlight the distinction between usability and perceived usability. Even though they are strongly related concepts, they have some key differences. Perceived usability indicates how easy to use a technology is perceived to be by its intended users; it is a subjective measure that can be influenced by many factors, such as user expectations, ICT competency, past experience, personal characteristics, and preferences. By contrast, usability refers to the tangible, actual degree of convenience and user-friendliness exhibited by a technological system. Thus, perceived usability is more related to the user’s perception and experience, while usability is more related to the technical aspects of the technology.
Perceived usability (satisfaction) appears to play an important role in students’ learning gain [17][18][19][20][21]. A user-friendly interface enables students and teachers to interact directly and effectively with a technological system without spending cognitive resources on learning the system itself, allowing them to focus on the learning content.
Literature findings highlight a disconnect between human–computer interaction research and technology-enhanced learning, reflected in the lack of usability frameworks [18]. Researchers reviewed the current body of scholarly work on the use of educational technology and how it is assessed in terms of perceived usability. Benchmarks for educational technology have already been created using the System Usability Scale (SUS) [22], the predominant instrument employed to gauge perceived usability. By conducting a systematic review using the PSSUQ/CSUQ, researchers can compare the results and findings of multiple studies that have used the PSSUQ and CSUQ in educational technology. This comparison allows a deeper understanding of the factors affecting usability and user experience across different technological interventions. Notably, SUS and PSSUQ/CSUQ scores have proved to be highly correlated in recent studies [11][23].
Consequently, the significance of usability in educational technology is underscored, and all parties involved are aware of its importance for technology-enhanced learning. In addition, access is now provided to a concentrated framework that includes many variables, such as educational stage, type of participant, study date, age of participants, and subject being learned. A comprehensive review of the existing literature using the PSSUQ and CSUQ ensures that decisions related to the implementation and improvement of educational technology are based on evidence and empirical data rather than anecdotal or subjective assessments. In this way, the development and improvement of technological tools and systems used for educational purposes can now rely on solid research data.

2. Post-Study System Usability Questionnaire

The Post-Study System Usability Questionnaire (PSSUQ) is an evaluation tool used to assess perceived usability, and it does not require a license for its use. It consists of 19 items (short version: 16 items) and utilizes a 7-point Likert scale on which lower ratings signify greater perceived usability (satisfaction). In numerous studies, factor analysis has consistently identified three distinct factors (subscales), known as System Usefulness, Information Quality, and Interface Quality. The instrument is highly reliable, with a Cronbach’s alpha ranging from 0.83 to 0.96 [24]. The Computer System Usability Questionnaire (CSUQ) was developed afterwards with identical items and similar wording, using the present tense instead of the PSSUQ’s past tense. The instrument is sensitive to many variables (type of participant, age of participant, type of technology assessed, and users’ years of experience and breadth of exposure to computer systems) and remains sensitive even with small sample sizes. The PSSUQ/CSUQ covers all aspects of usability: effectiveness, efficiency, satisfaction, and learnability [25]. They have been described as tools that are used in various applications and provide good validation for evaluating the usability of educational technology systems [26].
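For illustration, the minimal Python sketch below shows how PSSUQ responses can be scored into the three subscales and an overall mean. The item-to-subscale mapping shown (System Usefulness: items 1–8; Information Quality: items 9–15; Interface Quality: items 16–18, as commonly used for the 19-item version) and the choice to skip unanswered items are assumptions made for this example, not part of the entry.

```python
from statistics import mean

# Assumed item-to-subscale mapping for the 19-item PSSUQ
# (System Usefulness: items 1-8, Information Quality: 9-15, Interface Quality: 16-18).
SUBSCALES = {
    "System Usefulness": range(1, 9),
    "Information Quality": range(9, 16),
    "Interface Quality": range(16, 19),
}

def score_pssuq(responses):
    """Average the 7-point ratings per subscale; lower scores signify better
    perceived usability. Unanswered items are skipped (an assumption here)."""
    scores = {
        name: mean(responses[i] for i in items if i in responses)
        for name, items in SUBSCALES.items()
    }
    scores["Overall"] = mean(responses.values())
    return scores

# Example: a respondent who rates every item 2 except items 9 and 16, rated 4.
answers = {i: 2 for i in range(1, 20)}
answers[9] = answers[16] = 4
print(score_pssuq(answers))
```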
There are several tools, techniques, and methodologies for evaluating usability. The rationale for using the PSSUQ (Post-Study System Usability Questionnaire) and CSUQ (Computer System Usability Questionnaire) scales lies in their effectiveness and extensive use as reliable tools for evaluating usability. Specifically, researchers employing these scales can measure users’ satisfaction with the interaction with a technological system and can precisely identify the areas that need improvement in terms of usability.
Standardized usability questionnaires are widely accepted. However, they add value only when an interpretive framework is provided. Regarding the PSSUQ/CSUQ scales, Sauro and Lewis [27] aggregated 21 studies using the PSSUQ and provided benchmarks for the three subscales and the overall score: system usefulness 2.80, information quality 3.02, interface quality 2.49, and overall score 2.82. Researchers and Human–Computer Interaction (HCI) experts can interpret a system’s perceived usability by examining its PSSUQ/CSUQ scores in light of these benchmarks. Since PSSUQ/CSUQ scores are highly correlated with SUS scores, they can also be interpreted against the originally published SUS norms after conversion to a 0–100-point scale [28].
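The sketch below illustrates, under stated assumptions, how these benchmarks might be applied to a set of subscale means. The 0–100 conversion shown is a simple linear rescaling of the 1–7 scale added purely for illustration; it is not the regression-based correspondence with SUS norms described in [28].

```python
# Aggregated PSSUQ benchmarks reported by Sauro and Lewis [27] (1-7 scale, lower = better).
BENCHMARKS = {
    "System Usefulness": 2.80,
    "Information Quality": 3.02,
    "Interface Quality": 2.49,
    "Overall": 2.82,
}

def compare_to_benchmarks(scores):
    """Flag each subscale mean as better or worse than the aggregated benchmark."""
    return {
        name: ("better than benchmark" if scores[name] < target else "at or worse than benchmark")
        for name, target in BENCHMARKS.items()
    }

def rescale_0_100(score):
    """Illustrative linear rescaling of a 1-7 PSSUQ mean to 0-100 (higher = better);
    an assumption for this sketch, not the regression-based SUS correspondence in [28]."""
    return (7 - score) / 6 * 100

print(compare_to_benchmarks({"System Usefulness": 2.0, "Information Quality": 2.3,
                             "Interface Quality": 2.7, "Overall": 2.2}))
print(round(rescale_0_100(2.82), 1))  # the overall benchmark, rescaled
```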
Tullis and Stetson [29] found that 12 participants produced results equivalent to those of a larger sample in 90% of cases. It has also been noted [30] that all CSUQ statements are positively worded; this may simplify the answering procedure, but it can also introduce a form of response bias.
As the PSSUQ is a common and widely used usability tool, it has been translated into many languages, mainly European ones [31]. There are versions in Greek [32], Arabic [31], Portuguese [33], Turkish [34], French [35], and Spanish [36].
Regarding demographics, gender does not seem to significantly affect PSSUQ scores in a dataset of 21 studies on dictation systems [24]. Alhadreti [11] examined the possible correlation between CSUQ ratings of the Blackboard platform and participants’ age in a dataset of 187 scholars affiliated with Umm Al-Qura University in Saudi Arabia and found no statistically significant correlation between CSUQ score and age (p = 0.820, ns). Sonderegger et al. [37] employed a 2 × 2 between-subjects design in a quasi-experiment using the PSSUQ, with two age groups (old and young); in a sample of 60 subjects, no significant difference was found between the two age groups (F < 1). In conclusion, the PSSUQ/CSUQ tools appear to be easily generalizable and to provide stable reliability across different implementations [38].
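As a hedged illustration of the kind of analysis reported above, the sketch below computes a Pearson correlation between participant age and overall CSUQ score. The data are made up and the use of scipy.stats.pearsonr is an assumption for this example; neither comes from the cited studies.

```python
from scipy.stats import pearsonr

# Illustrative (made-up) data: participant ages and overall CSUQ means.
ages = [24, 31, 45, 52, 38, 29, 61, 47]
csuq = [2.1, 3.4, 2.8, 3.0, 2.5, 3.9, 2.7, 3.2]

r, p = pearsonr(ages, csuq)
print(f"r = {r:.3f}, p = {p:.3f}")
# A p-value above 0.05 would mirror the "no significant correlation" finding reported above.
```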

References

  1. Januszewski, A.; Molenda, M. (Eds.) Educational Technology: A Definition with Commentary; Routledge: New York, NY, USA, 2008; Available online: http://www.aect.org/publications/EducationalTechnology/ (accessed on 2 August 2023).
  2. Huang, R. Educational Technology a Primer for the 21st Century; Springer Nature Singapore Pte Ltd.: Singapore, 2019.
  3. Chugh, R.; Turnbull, D.; Cowling, M.A.; Vanderburg, R.; Vanderburg, M.A. Implementing educational technology in Higher Education Institutions: A review of technologies, stakeholder perceptions, frameworks and metrics. Educ. Inf. Technol. 2023, 1–27.
  4. Carver, L.B. Teacher Perception of Barriers and Benefits in K-12 Technology Usage. Turk. Online J. Educ. Technol.-TOJET 2016, 15, 110–116.
  5. Criollo-C, S.; Guerrero-Arias, A.; Jaramillo-Alcázar, Á.; Luján-Mora, S. Mobile Learning Technologies for Education: Benefits and Pending Issues. Appl. Sci. 2021, 11, 4111.
  6. Mathew, I.R.; Ebelelloanya, J. Open and distance learning: Benefits and challenges of technology usage for online teaching and learning in Africa. In Proceedings of the Pan-Commonwealth Forum. Botswana. Commonwealth of Learning and Open University of Malaysia, Kuala Lumpur, Malaysia, 15–30 November 2016.
  7. Nikolopoulou, K. Secondary education teachers’ perceptions of mobile phone and tablet use in classrooms: Benefits, constraints and concerns. J. Comput. Educ. 2020, 7, 257–275.
  8. Luschei, T.F. Assessing the Costs and Benefits of Educational Technology. In Handbook of Research on Educational Communications and Technology; Spector, J.M., Merrill, M.D., Elen, J., Bishop, M.J., Eds.; Springer Science+Business Media: New York, NY, USA, 2014; pp. 239–248.
  9. Bajaj, R.; Sharma, V. Smart Education with artificial intelligence based determination of learning styles. Procedia Comput. Sci. 2018, 132, 834–842.
  10. Ha, N.T.T. Effects of learning style on students achievement. Linguist. Cult. Rev. 2021, 5, 329–339.
  11. Alhadreti, O. Assessing Academics’ Perceptions of Blackboard Usability Using SUS and CSUQ: A Case Study during the COVID-19 Pandemic. Int. J. Hum.-Comput. Interact. 2021, 37, 1003–1015.
  12. Nicolaou, C.; Matsiola, M.; Kalliris, G. Technology-enhanced learning and teaching methodologies through audiovisual media. Educ. Sci. 2019, 9, 196.
  13. Wetzel, K.; Buss, R.; Foulger, T.S.; Lindsey, L. Infusing Educational Technology in Teaching Methods Courses: Successes and Dilemmas. J. Digit. Learn. Teach. Educ. 2014, 30, 89–103.
  14. ISO 9241-11; Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs): Part 11: Guidance on Usability. International Organization for Standardization: Geneva, Switzerland, 1998.
  15. Bevan, N.; Carter, J.; Harker, S. ISO 9241-11 revised: What have we learnt about usability since 1998? In Proceedings of the International Conference on Human-Computer Interaction, Bamberg, Germany, 14–18 September 2015; pp. 143–151.
  16. Lewis, J.R. The System Usability Scale: Past, Present, and Future. Int. J. Hum.-Comput. Interact. 2018, 34, 577–590.
  17. Alghabban, W.G.; Hendley, R. Perceived Level of Usability as an Evaluation Metric in Adaptive E-learning. SN Comput. Sci. 2022, 3, 238.
  18. Law, E.L.-C.; Heintz, M. Augmented reality applications for K-12 education: A systematic review from the usability and user experience perspective. Int. J. Child-Comput. Interact. 2021, 30, 100321.
  19. Meiselwitz, G.; Sadera, W.A. Investigating the connection between usability and learning outcomes in online learning environments. J. Online Learn. Teach. 2008, 4, 234–242.
  20. Orfanou, K.; Tselios, N.; Katsanos, C. Perceived usability evaluation of learning management systems: Empirical evaluation of the System Usability Scale. Int. Rev. Res. Open Distrib. Learn. 2015, 16, 227–246.
  21. Vlachogianni, P.; Tselios, N. The relationship between perceived usability, personality traits and learning gain in an e-learning context. Int. J. Inf. Learn. Technol. 2022, 39, 70–81.
  22. Vlachogianni, P.; Tselios, N. Perceived usability evaluation of educational technology using the System Usability Scale (SUS): A systematic review. J. Res. Technol. Educ. 2021, 54, 392–409.
  23. Berkman, M.I.; Karahoca, D. Re-Assessing the Usability Metric for User Experience (UMUX) Scale. J. Usability Stud. 2016, 11, 89–109.
  24. Lewis, J. Psychometric Evaluation of the PSSUQ Using Data from Five Years of Usability Studies. Int. J. Hum.-Comput. Interact. 2002, 14, 463–488.
  25. Hodrien, A.; Fernando, T. A Review of Post-Study and Post-Task Subjective Questionnaires to Guide Assessment of System Usability. J. Usability Stud. 2021, 16, 203–232.
  26. Schnall, R.; Cho, H.; Liu, J. Health Information Technology Usability Evaluation Scale (Health-ITUES) for Usability Assessment of Mobile Health Technology: Validation Study. JMIR mHealth uHealth 2018, 6, e4.
  27. Sauro, J.; Lewis, J.R. Quantifying the User Experience: Practical Statistics for User Research. Morgan Kaufmann; Elsevier: Amsterdam, The Netherlands, 2016.
  28. Sauro, J. 10 Things to Know About the Post Study System Usability Questionnaire. 2019. Available online: https://measuringu.com/pssuq/ (accessed on 2 August 2023).
  29. Tullis, T.S.; Stetson, J.N. A comparison of questionnaires for assessing website usability. In Usability Professional Association Conference; 2004; Volume 1, Available online: https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.396.3677&rep=rep1&type=pdf (accessed on 2 August 2023).
  30. García-Peñalvo, F.J.; Vázquez-Ingelmo, A.; García-Holgado, A. Study of the usability of the WYRED Ecosystem using heuristic evaluation. In Learning and Collaboration Technologies. Designing Learning Experiences: 6th International Conference, LCT 2019, Held as Part of the 21st HCI International Conference, HCII 2019, Orlando, FL, USA, 26–31 July 2019, Proceedings, Part I 21; Springer International Publishing: Cham, Switzerland, 2019; pp. 50–63.
  31. Al-Tahat, K.S. Arabic Translation, Cultural Adaptation and Psychometric Validation of the Post-Study System Usability Questionnaire (PSSUQ). Int. J. Hum.-Comput. Interact. 2021, 37, 1815–1822.
  32. Katsanos, C.; Tselios, N.; Liapis, A. PSSUQ-GR: A First Step Towards Standardization of the Post-Study System Usability Questionnaire in Greek. In Proceedings of the CHI Greece 2021: 1st International Conference of the ACM Greek SIGCHI Chapter, Athens, Greece, 25–27 November 2021.
  33. Rosa, A.F.; Martins, A.I.; Costa, V.; Queiros, A.; Silva, A.; Rocha, N.P. European Portuguese validation of the Post-Study System Usability Questionnaire (PSSUQ). In Proceedings of the 2015 10th Iberian Conference on Information Systems and Technologies (CISTI), Aveiro, Portugal, 17–20 June 2015; pp. 1–5.
  34. Erdinç, O.; Lewis, J.R. Psychometric Evaluation of the T-CSUQ: The Turkish Version of the Computer System Usability Questionnaire. Int. J. Hum.-Comput. Interact. 2013, 29, 319–326.
  35. Gronier, G.; Johannsen, L. Proposition d’une adaptation française et premières validations de l’échelle d’utilisabilité Computer System Usability Questionnaire (F-CSUQ) Proposal for a French adaptation and first validations of the Computer System Usability Questionnaire (F-CSUQ). In Proceedings of the 33rd Conference on l’Interaction Humain-Machine, Namur, Belgium, 5–8 April 2022; pp. 1–11.
  36. Aguilar, M.I.H.; González, A.D.l.G.; Miranda, M.P.S.; Villegas, A.A.G. Adaptación al español del Cuestionario de Usabilidad de Sistemas Informáticos CSUQ/Spanish language adaptation of the Computer Systems Usability Questionnaire CSUQ. RECI Rev. Iberoam. De Las Cienc. Comput. E Informática 2015, 4, 84–99.
  37. Sonderegger, A.; Schmutz, S.; Sauer, J. The influence of age in usability testing. Appl. Ergon. 2016, 52, 291–300.
  38. Lewis, J.R. Measuring perceived usability: The CSUQ, SUS, and UMUX. Int. J. Hum.-Comput. Interact. 2018, 34, 1148–1156.