Heuristic Evaluation Plan
706.021 Mensch-Maschine-Kommunikation 3VU SS 2017
Group G3-02
Florian Leski
Franz Mandl
Maximilian Weber
Paul Ganster
Evaluation of the Web Site
https://www.tugraz.at/
HE Plan of 23rd March 2017
1 Introduction
The web site we are going to evaluate provides information about Graz University of Technology (TU Graz). Visitors are informed about the range of studies, facilities, and institutes.
We are evaluating this web site because we participate in the course Human-Computer Interaction. We also want to get a good grade.
2 Evaluation Methodology
A heuristic evaluation is a fast, cheap, and efficient way to check the usability of a user interface. To achieve this, a group of evaluators inspects the UI individually and notes all negative and positive findings.
Heuristic Evaluation (originally proposed by Nielsen and Molich, 1990) is a discount method for quick, cheap, and easy evaluation of a user interface [Danino2001]. First, the evaluators have to consider which target group uses the UI and describe their needs and tasks. The evaluators try to think and act like a typical user; this is necessary to find the problems real users could face. According to [Nie1995], the "(...) differences between heuristic evaluation sessions and traditional user testing are the willingness of the observer to answer questions from the evaluators during the session and the extent to which the evaluators can be provided with hints on using the interface".
The evaluators also have to test on the different platforms and devices the main target group may use. All problems the evaluators find have to be documented for analysis purposes. During documentation, screenshots and screen recordings are made for better visualisation and comprehensibility of each problem. For this evaluation, the "Andrews General Usability Heuristics", shown in Appendix A, will be used. These are based on and slightly adapted from Nielsen's revised set of ten usability heuristics [Nie1994]. These ten heuristics contain rules for how a UI should look and respond to a user. Each evaluator then puts his notes into a list and grades them according to the severity of the problem. After the individual evaluations, the individual lists are merged into one combined list, which is sorted by problem severity. From this list of positive and negative findings, the heuristic evaluation report is created. This detailed report is given to the customer the heuristic evaluation was made for. A minimal sketch of the merging and sorting step is given below.
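The following is a minimal sketch of how the individual findings lists could be merged and sorted by severity. It is purely illustrative: the actual merge will be done in the skeleton spreadsheet helist.xls (see Appendix C), and the field names and severity scale used here are assumptions, not part of this plan.

```python
# Illustrative sketch only: the actual merge will be done in the skeleton
# spreadsheet helist.xls (Appendix C). The Finding fields and the severity
# scale (0 = no problem ... 4 = catastrophe) used here are assumptions.
from dataclasses import dataclass
from typing import List

@dataclass
class Finding:
    evaluator: str    # evaluator initials, e.g. "PG"
    heuristic: str    # heuristic the finding relates to
    severity: int     # assumed scale: 0 (no problem) .. 4 (catastrophe)
    description: str  # short description of the positive or negative finding

def merge_findings(*individual_lists: List[Finding]) -> List[Finding]:
    """Merge the evaluators' individual lists and sort by severity, worst first."""
    merged = [finding for notes in individual_lists for finding in notes]
    return sorted(merged, key=lambda f: f.severity, reverse=True)

# Example usage with two made-up individual findings:
pg = [Finding("PG", "Consistency", 3, "Menu labels differ between pages.")]
fl = [Finding("FL", "Feedback", 1, "Search gives no progress indication.")]
for f in merge_findings(pg, fl):
    print(f"[severity {f.severity}] {f.evaluator}: {f.description}")
```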
3 User Profiles
The typical user who generates the most traffic would be a prospective student. Since the lifeblood of a university is the people studying there, the university wants to attract as many of them as possible. In addition, the site also wants to provide information about the quality of the university to employers of graduates. Another possible user group would be jobseekers, who want to look up available job offers and, most likely, the pay. Students and graduates could also visit the site from time to time.
The prospective student is used to computers and knows his way around, but also expects a modern design; he is used to the responsive web and does not know the internet from before it was as polished as it is now. The employer is most likely a businessman who uses computers constantly, since he looked up the university of his prospective employee. He does not want to spend much time on the site and wants information as fast as possible. Jobseekers will probably invest a lot of time in the site to gather as much information as possible about available job offers and the required qualifications. A student is already familiar with the site and uses it from time to time; he is very familiar with computers and uses them regularly. The graduate wants to connect with his fellow graduates. He does not use computers very much, but as long as the interface is not too complex to handle, he or she is able to use it.
Our typical user, the prospective student, wants information about available studies and how to enrol in courses. The curriculum of the study he is interested in is also very important to him. The employer wants to see the quality of the university in comparison to other national or international universities. Jobseekers want to look up available job offers and the expected salary. Students want information about deadlines and the curriculum. Graduates want to use the alumni network to connect with fellow graduates.
4 Extent of the Evaluation
The scope of the evaluation is the whole website. The site consists of the main page and the main menu, which is divided into TU Graz, Studying and Teaching, Research, Faculties and Institutes, and Information for. We assess every menu item with regard to its usability. Each menu item provides specific information for the visitors of the website. As part of the evaluation, we try to think like typical and potential users and use the website as they would. Every menu item is more or less important for specific user groups.
5 Evaluators and Evaluation Environments
The evaluators will use the evaluation environments shown in Table 1.
Evaluator | Paul Ganster (PG) | Florian Leski (FL) | Franz Mandl (FM) | Maximilian Weber (MW) |
---|---|---|---|---|
Age | 21 | 21 | 21 | 21 |
Sex | male | male | male | male |
Device | XMG P505 | Apple MacBook Pro | Samsung Galaxy S6 G920F 32GB | Samsung Galaxy S5 SM-G900F |
Operating System | Ubuntu 16.04 | macOS Sierra Version 10.12.3 | Android 6.0.1 | Android 6.0.1 |
Web Browser | Chrome Version 57.0.2987.110 | Safari Version 10.0.3 | Samsung Internet 4.0.10-53 (native Samsung browser) | Chrome Version 56.0.2924.87 |
Ad Blocker | uBlock Origin v1.11.4 | none | Adblock Plus for Samsung Browser | none |
Internet Connection | 75 Mbit/s, UPC Fiber Power Pack Medium | 30 Mbit/s, A1 | 75 Mbit/s, UPC Fiber Power Pack Medium | 30 Mbit/s, A1 |
Screen Size | 15″ | 15.4″ | 5.1″ | 5.1″ |
Screen Resolution | 1920x1080 | 2880x1800 | 1440x2560 | 1080x1920 |
Browser Resolution | 1920x735 | 2880x1800 | 1344x2560 | 1080x1920 |
Date of Evaluation | YYYY-MM-DD | YYYY-MM-DD | YYYY-MM-DD | YYYY-MM-DD |
Time of Evaluation | HH:MM:SS | HH:MM:SS | HH:MM:SS | HH:MM:SS |
MW will use Samsung's native screenshot function on his Android phone and the app AZ Screen Recorder for screen recordings. FM will use the same tools for screenshots and screen recordings as MW. PG will use recordMyDesktop for video capture and the native Ubuntu screenshot feature for screenshots. FL will use QuickTime Player on his MacBook.
References
- [Nie1994] Jakob Nielsen; Enhancing the Explanatory Power of Usability Heuristics; Proc. Conference on Human Factors in Computing Systems (CHI '94), ACM, Boston, Massachusetts, USA, Apr 1994, pages 152–158. doi:10.1145/191666.191729
- [Nie1995] Jakob Nielsen; How to Conduct a Heuristic Evaluation; visited 2017-03-23. https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
- [Danino2001] Nicky Danino; Heuristic Evaluation - A Step By Step Guide; visited 2017-03-27. https://www.sitepoint.com/heuristic-evaluation-guide/
A Heuristics
The evaluators will use the Andrews General Usability Heuristics 2013, found in the file heuristics.pdf.
B Skeleton Log File
The evaluators will use the following (plain text) log files to collect notes during their individual evaluations:
Name | Log File |
---|---|
Florian Leski | log-fl.txt |
Franz Mandl | log-fm.txt |
Maximilian Weber | log-mw.txt |
Paul Ganster | log-pg.txt |
C Skeleton Spreadsheet
The evaluation manager will use the skeleton spreadsheet helist.xls to merge the individual lists.