Evaluation of e-Readers
A Preliminary Analysis
Ujala Rampaul u5051329
May 2014
Supervisor: Prof. Tom Gedeon
Human Computer Interaction Project: COMP6474
E-Reader evaluation Ujala Rampaul Page 2 of 66
Acknowledgements
I would like to thank Professor Tom Gedeon for his invaluable support, guidance, knowledge sharing and assistance.
Dr. Weifa Liang (Project Coordinator) for his guidance and organising the community of practice meetings.
Dr. Duncan Stevenson for sharing his knowledge and advice on grounded theory.
Leana Copeland and Dr. Sabrina Caldwell for their assistance with the experiments.
Dr. Dingyun Zhu for his helpful comments on analysis of variance.
To all the participants who took part in the evaluations, thank you.
Abstract
Evaluations of new consumer goods, particularly electronics, are often done from the perspective of analysing the
latest models, comparing their advantages and disadvantages with respect to price. This style of evaluation is
often performed by one of a few product experts on a wide range of features that may not be applicable to each
user.
This project uses a scenario-based approach to evaluate a number of e-readers. The scenario mimics a user who
is interested in a new product or technology and has a limited budget. The objective is to evaluate from a
purchasing perspective the quality and usability of e-readers available within that budget range.
The e-readers evaluated were operated in multiple ways, which suggests that interface design should cater
for users with different levels of experience with technology. The results of a large user study with over 70
participants show that the popular brands do not necessarily produce the best e-readers.
Contents
Acknowledgements................................................................................................................................................................................2
Abstract.......................................................................................................................................................................................................3
1 Introduction....................................................................................................................................................................................7
1.1 Motivation ............................................................................................................................................ 8
2 Literature survey ......................................................................................................................................................10
3 Methodology................................................................................................................................................................................16
3.1 Experiment 1 ..................................................................................................................................... 16
3.1.1 E-reader .......................................................................................................................................... 16
3.1.2 Scenario .......................................................................................................................................... 17
3.1.3 Measurement of tasks ..................................................................................................................... 17
3.2 Experiment 2 ..................................................................................................................................... 18
3.2.1 E-readers ......................................................................................................................................... 18
3.2.2 Questionnaire .................................................................................................................................. 19
4 Results............................................................................................................................................................................................20
4.1 Experiment 1 ..................................................................................................................................... 20
4.1.1 Turning on the e-reader ................................................................................................................... 20
4.1.2 Finding the first document .............................................................................................................. 20
4.1.3 Opening the document .................................................................................................................... 21
4.1.4 Increasing the font size ................................................................................................................... 21
4.1.5 Navigate to specific section ............................................................................................................ 22
4.1.6 Navigate to a second document ...................................................................................................... 22
4.1.7 Readability on the screen ................................................................................................................ 23
4.1.8 E-Reader preference ....................................................................................................................... 23
4.1.9 Pairwise comparison on e-readers with significant differences ...................................................... 24
4.2 Experiment 2 ..................................................................................................................................... 25
4.2.1 Turn on the e-reader ........................................................................................................................ 25
4.2.2 Find the first document ................................................................................................................... 25
4.2.3 Open the document ......................................................................................................................... 25
4.2.4 Increase the font size....................................................................................................................... 26
4.2.5 Navigate to specific section ............................................................................................................ 26
4.2.6 Navigate to second document ......................................................................................................... 26
4.2.7 Readability on the screen ................................................................................................................ 27
4.2.8 e-reader preference ......................................................................................................................... 27
4.2.9 Pairwise comparison for Q6 ........................................................................................................... 27
4.2.10 Pairwise comparison on Q3 – Q6 ............................................................................................... 28
4.3 Discussion ........................................................................................................................................... 29
5 Conclusions..................................................................................................................................................................................33
5.1 Future work ....................................................................................................................................... 33
6 Bibliography................................................................................................................................................................................34
7 Appendices...................................................................................................................................................................................35
7.1 Appendix 1: Technical specification on e-readers .......................................................................... 35
7.2 Appendix 2: Scenario ........................................................................................................................ 38
7.3 Appendix 3: Questionnaire ............................................................................................................... 39
7.4 Appendix 4: Questionnaire for Experiment 2 ................................................................................. 41
7.5 Appendix 5: ANOVA calculations ................................................................................................... 43
7.5.1 Experiment 1 ................................................................................................................................... 43
7.5.2 Experiment 2 ................................................................................................................................... 49
7.6 Appendix 6: Participants’ comments for Experiment 1 ................................................................... 54
7.7 Contents of second document (Dates 2013) ..................................................................................... 66
List of Figures
Figure 1. Growth of ownership of e-readers and tablets in the United States ......................................................... 8
Figure 2. Map of common reading devices between this study and previous study ............................................. 14
Figure 3. Map of tasks between this study and previous study ............................................................................. 15
List of Tables
Table 1. Cost of second-hand e-readers .................................................................................................................. 9
Table 2. Matrix of the e-reader randomisation pair for device A. ........................................................................ 17
Table 3. Matrix of touch screen e-reader pairs for device B ................................................................................. 18
Table 4. Mean Ranking and Standard Deviation (SD) for turning the e-readers on ............................................. 20
Table 5. Mean Ranking and Standard Deviation (SD) for finding the document ................................................. 21
Table 6. Mean Ranking and Standard Deviation (SD) on opening the document: significant ............................. 21
Table 7. Mean Ranking and Standard Deviation (SD) for increasing font size: significant ................................. 22
Table 8. Mean Ranking and Standard Deviation (SD) for navigation to section 3: significant ............................ 22
Table 9. Mean Ranking and Standard Deviation (SD) for navigation to second document ................................. 23
Table 10. Mean Ranking and Standard Deviation (SD) for readability ................................................................ 23
Table 11. Mean Ranking and Standard Deviation (SD) for e-reader preference: significant ............................... 23
Table 12. Qualitative, quantitative and overall ranking ........................................................................................ 24
Table 13. Mean Ranking and Standard Deviation (SD) for turning the e-reader on. ........................................... 25
Table 14. Mean Ranking and Standard Deviation (SD) for finding a document .................................................. 25
Table 15. Mean Ranking and Standard Deviation (SD) for opening the document ............................................. 26
Table 16. Mean Ranking and Standard Deviation (SD) for increasing the font size ............................................ 26
Table 17. Mean Ranking and Standard Deviation (SD) for navigating to section 3 ............................................. 26
Table 18. Mean Ranking and Standard Deviation (SD) for navigating to second document: significant ............ 27
Table 19. Mean Ranking and Standard Deviation for readability ........................................................................ 27
Table 20. Mean Ranking and Standard Deviation on user preference .................................................................. 27
Table 21. Qualitative, quantitative and overall ranking for Q6 ............................................................................ 28
Table 22. Qualitative, quantitative and overall ranking for Q3-Q6. ..................................................................... 28
1 Introduction
In 2007, Amazon launched the Kindle in the United States and it sold out in five and a half hours (Patel, 2007).
Prior to the Kindle launch there was little demand for e-readers; earlier releases, such as the Rocket eBook
in 1998, the Sony Librié in 2004 and the Sony Reader in 2006, were unsuccessful because they were very
expensive, had too many technological limitations and lacked available content.
The Kindle was different because it was more affordable, consumers could purchase a wide variety of e-books
via the wireless data connection and it was comparatively easy to use.
Apple ignited the market again in 2010 with the introduction of the iPad, raising the profile of e-readers and
tablets.
Dedicated reading devices (e-readers) differ from the iPad and other tablets. E-readers with e-ink screens
simulate the experience of reading a paper book whereas tablets are multifunctional devices. Tablets allow
users to browse the web, read and write email messages, take and view photographs, view movies, play games
and perform many of the same functions as with a computer.
The advantages of an e-reader are that it is easier to read, even in direct sunlight; it consumes little battery
power; it is lighter in weight; it permits undisturbed reading of an e-book; and it causes less eye fatigue than
devices with backlit LCD displays.
In the United States, the cost of e-readers has continued to fall, making them more accessible to purchase. Data
from the Pew Research Centre over a three-year period, from May 2010 to September 2013, indicate that e-reader
ownership grew by over 20%; see Figure 1 for the growth of e-reader and tablet ownership in the United
States.
Overall, 50% of Americans now have a dedicated handheld device – either a tablet computer like an iPad, or
an e-reader such as a Kindle or Nook for reading e-content (Zickuhr & Rainie, 2014).
Figure 1. Growth of ownership of e-readers and tablets in the United States
Source: Pew Research Centre (Rainie & Smith, 2013)
The success of e-readers in the United States has influenced the Australian consumer market for e-readers,
which continues to grow despite the popularity of tablets. Although the price of e-readers has fallen over the
years, they are still expensive. For example, the price range of e-readers in 2013 as indicated by Choice, a
consumer advisory group, varied from AUD$100 to $469 for the latest e-reader then on the market.
1.1 Motivation
The objective of this project was to evaluate from a purchasing perspective the quality of e-readers that could be
purchased within a limited budget.
Shoppers of e-readers and e-reading devices are often influenced by advertising, reviews conducted by computer
and consumer sites such as PC Authority and Choice, as well as opinions of friends and families. The
commercial reviews compare the performance, features, file formats and specifications of the latest e-reader
models against the market price to identify the best value for money.
To obtain e-readers for this study, a budget based on the cost of a new paper book ($25) was established and
multiplied to set a budget range of approximately $50-75 (excluding shipping costs) for the purchase of an
e-reader. Establishing a budget reduced the range of e-readers that could be obtained to either earlier e-reader
models with higher specifications or later models with relatively low specifications.
We argue that, by focusing on the quality of the devices available within this budget range, the differences
in e-reader model and specifications were not limiting factors. The experiment required that a realistic range
of high-quality e-readers could be obtained from eBay and other auction sites for an effective comparison.
The following nine e-readers were selected and mostly purchased from within Australia (except for the Sony and
the jetBook), at a lower price range than originally estimated: $35-$70. The selection includes e-readers
having buttons only, a touch screen only, or a combination of both.
E-reader                Cost (AUD$)
Barnes & Noble Nook     69.90
Kobo                    60.00
Amazon Kindle 4G        57.85
Sony PRS-600            44.88
iRiver Story HD         52.00
Elonex 621EB            41.00
Ectaco jetBook          61.28
Pandigital Novel        36.85
EZReader EZ601          55.00
Table 1. Cost of second-hand e-readers
Each device was labeled with an alphabetical letter from A to I, based on their arrival sequence.
2 Literature survey
The rise of e-books and e-readers is part of a larger story about the shift from printed to digital material. Early
research focused on academic contributions to education, where students used e-readers in classrooms
to read textbooks.
Numerous approaches have been taken to evaluating both e-books and e-book readers, in projects such as
Superbook and Electronic Books On-screen Interface (EBONI) (Gibson & Gibb, 2011).
Other studies included university and public libraries investigating strategies to support the borrowing of e-
books and e-readers. “In particular, many academic libraries [have] since begun pilot projects using a variety of
different reader devices to investigate the possibilities for simplifying and innovating the content and related
services they offer to users in light of the technologies that are available. Countless libraries have experimented
with offering lending programs for their devices. Many libraries have undertaken these initiatives on their own
whereas others have partnered with either Sony electronics or Amazon to design and carry out their projects”
(Behler & Lush, 2010).
To establish design guidelines for e-readers, Pearson et al. (2010) conducted a study to determine the usability
concerns with the user interfaces of e-readers using human-computer interaction (HCI) principles. The
experiment was carried out using the Kindle 2, Sony PRS-600 and Sony PRS-300. These three e-readers had a
similar screen type (e-Ink technology), screen size (six inches) and resolution (600 x 800). The Sony PRS-600
combined touch and buttons, whereas the Kindle 2 and Sony PRS-300 were button-only devices.
Based on the guidelines and principles for ergonomics, consistency, completeness, page numbering,
bookmarks, annotation and magnification, the Kindle was found to be better than the Sony PRS-300 and PRS-600. The
positioning of the buttons on the Kindle made it easier to turn pages for both left- and right-handed users. The
Kindle had a full QWERTY keyboard, which supported web browsing, and used location identifiers relative to the
file instead of page numbers. Full annotation and zooming were only available on the Sony
PRS-600. User feedback on the three e-readers was drawn from Amazon’s online customer reviews and
weighted on a Likert scale rating. This could have accounted for the higher ratings for the Kindle, as it is
produced by Amazon.
The study by Gibson and Gibb (2011) evaluated four second-generation e-reading devices, namely the Sony
PRS-505, Bookeen Cybook 3g, iRex iLiad and Asus Eee PC 105 HA, with 33 participants. Their evaluation
measured the weight, quality of display, size and robustness of the devices based on a five-point Likert scale rating.
For overall impressions and functionality, the Asus Eee netbook was the best, followed by the Sony PRS-505, the iLiad
and lastly the Cybook. The Asus Eee netbook was the preferred choice because participants were familiar with its
design and layout compared to the dedicated e-readers (Gibson & Gibb, 2011).
Both the Sony PRS-505 and Bookeen Cybook were similar in screen size (six inches) and resolution (600 x 800
pixels), whereas the iRex iLiad had a larger screen (eight inches) and a higher resolution (768 x 1024 pixels).
All three devices had screens with e-Ink technology. The Asus Eee PC 105 HA, a netbook, had an LCD screen, a
much larger screen size (ten inches) and a higher resolution (600 x 1024 pixels), and weighed seven times
more than the Bookeen Cybook, the lightest (175 g) device in the trial. This meant the refresh rate for turning a
page was faster on the netbook than on the e-readers using e-Ink technology. This was an unequal
comparison between three dedicated e-readers and a netbook.
Usability was favourable towards the e-readers for lightness and portability, readability and ease of use.
Participants commented, “the screen was not wearing on the eyes” (iLiad) and that it was a “straightforward
operation” (Sony), and that “the non-glare screen made the text as easy to read as ink” (iLiad) (Gibson & Gibb,
2011).
Based on reported sales and market share in the United States, Richardson & Mahmood (2012) studied five
leading e-readers, namely the Kindle 3G, iPad 1G, Nook, Kobo N647 and Sony PRS-950. Their objective was to
identify the advantages and disadvantages of e-readers and “to compare and contrast the most popular (i.e.
bestselling) devices against a comprehensive, if not exhaustive, set of technical specifications as well as
qualitative judgment of users” (Richardson & Mahmood, 2012).
The results from Richardson and Mahmood’s study showed that the Kindle was the most popular device, though
some participants commented on its poor navigation. Some 47% of the participants in the study
owned a Kindle, probably biasing the popularity of the Kindle. No evidence was presented for the
technical comparison, nor did the authors elaborate on what it consisted of. The comparison between the iPad and
the e-readers was arguably unfair, as the tablet is a multi-function device, whereas e-readers are designed for
reading and purchasing e-books.
The NextMedia eReading Project conducted a qualitative and quantitative study of reading behaviours and
patterns on e-readers, including the effects of age group. The objective of the main project was to encourage
Finnish consumers to adopt e-reading and to grow an e-reading community. The eReading Project consisted of
multiple smaller projects that focused on different aspects of the study, such as the usability of e-reading devices,
types of content, and emotional and cognitive reactions.
Heikkilä’s (2011) usability study of e-reading devices was based on a scenario in which participants
recorded their experiences of opening and using a newly purchased e-reading device. A number of pages from
a paper book served as the benchmark for the evaluations, with the same content available on the e-reading devices.
Over a week, seventeen participants recorded their reading times, the places they had read and their experiences
of using the e-readers. The time-based assessment tasks included opening the device, finding a book, finding a
specific spot in the book and changing the font size.
Heikkilä used the following e-readers in his study: the Kindle (both Kindle 2 and Kindle 3), Sony PRS-600,
iPad 1G, BeBook Neo, Bookeen Cybook Opus, Elonex eBook and Samsung Galaxy.
The outcome of the usability study was the creation of a conceptual model of an e-reader labeled Acola. The
Acola combined a touch-sensitive, gesture-aware pad with an e-ink display. The pad would handle
page turning and skimming: swiping to the left would turn the page forward, and swiping to the right would move
one page backwards. Swiping fast twice in succession would turn two pages, three times three pages, and so on.
Swiping with two fingers simultaneously would move between chapters. A menu and an OK button could be
situated in the upper and lower borders of the pad (Heikkilä, 2011).
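Heikkilä’s proposed gesture scheme amounts to a small mapping from swipe direction, swipe count and finger count to page or chapter offsets. The following toy sketch illustrates that mapping; the function name and representation are hypothetical and not from the study:

```python
def acola_page_offset(direction, swipes=1, fingers=1):
    """Toy model of the Acola gesture scheme: a left swipe turns the
    page forward, a right swipe moves one page back; n fast swipes in
    succession turn n pages; a two-finger swipe moves by chapter."""
    sign = 1 if direction == "left" else -1
    unit = "chapter" if fingers == 2 else "page"
    return sign * swipes, unit

assert acola_page_offset("left") == (1, "page")           # one page forward
assert acola_page_offset("right", swipes=3) == (-3, "page")  # three pages back
assert acola_page_offset("left", fingers=2) == (1, "chapter")  # next chapter
```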
There is some similarity between Heikkilä’s (2011) study and my report in the use of a scenario for the
evaluation of the e-reading devices. It is interesting that his study evaluated two tablets and four e-readers to
create his ideal e-reader.
Siegenthaler, et al. (2012) conducted a study with the following reading devices: the Sony PRS-505, the Sony
PRS-600 and an Apple iPad, to determine the effects of touch screen technology.
Twelve participants tested the three devices sequentially within a session, based on a set of time-related tasks:
opening a book, opening a specific page within the book, highlighting a sentence, finding a highlighted sentence,
deleting a highlighted sentence, changing the orientation of the layout and increasing the font size. The
participants rated the devices on navigation, design, handiness and handling using a Likert scale.
The results showed “that e-reading devices with touch screens correlate with better navigation rating”, and that
touch screen technology also has advantages in terms of higher intuitiveness and flexibility for adaptations of the
navigation (e.g., due to firmware updates) compared to devices with static buttons (Siegenthaler, et al., 2012).
The evaluation of handiness and handling was ambiguous; for example, the related questions were “How handy
do you rate the reading device?” and “How easy was it for you to handle the reading device?” There was no
clarification of what these terms meant, and the evaluation questions were vague.
We earlier identified the difference between an e-reader, which is solely for reading e-books, and a tablet, which
is a multifunction device, making this evaluation an unequal comparison.
Siegenthaler, et al. (2012) argued, “E-ink technology has low power consumption, thereby increasing battery
life and allowing for a more lightweight device. Another advantage is that e-ink devices can be used outside
without glare being a big issue. However, e-ink screens have some disadvantages, most of them are black and
white and the pages do not refresh as quickly as devices with an LCD screen”.
The disadvantages of tablets are that they cost considerably more than e-readers, their LCD screens are
susceptible to glare, they do not provide as comfortable a reading experience, they are heavier, and their batteries
do not last as long.
E-ink technology is designed to emulate printed books. Independent studies of reading behaviour measured
by eye tracking found no difference between a paper book and a dedicated e-reader (Nielsen, 2009;
Siegenthaler, et al., 2010).
Similarly, analysis of reading behaviour between a dedicated e-reader and a tablet found no significant
difference as measured by eye speed (Nielsen, 2010; Siegenthaler, et al., 2012). However, the iPad “actually
scores slightly higher in user satisfaction” (Nielsen, 2010).
Figure 2 maps the reading devices that are common to this study and to the other studies identified. Most studies
share one common e-reader with this study: for example, Richardson & Mahmood (2012) share the Nook,
Siegenthaler, et al. (2012) the Sony PRS-600, Siegenthaler, et al. (2010) the jetBook, and Pearson et al. (2010)
the Sony PRS-600. The exception is Heikkilä (2011), with two common e-readers, namely the Sony PRS-600
and the Elonex.
Figure 3 maps the tasks that are common to this study and to the other studies identified. All studies share at
least one common task, with Siegenthaler et al. (2010) and (2012) evaluating two similar tasks, namely opening
a document (open a book) and increasing the font size. The other exception is Heikkilä (2011), where four
similar tasks were evaluated: turning the power on (open device), navigating to the first document (find a
particular book), increasing the font size (change font size) and navigating to a specific section within the book
(find a specific place within the book). All studies commented on readability.
No previous study has evaluated, from a purchasing perspective, the quality and usability of e-readers
purchased within a budget of up to $70. This study conducts the evaluation of the e-readers based on a scenario
with two experiments:
1. Evaluation of nine e-readers, covering button-only, touch-screen-only and combination devices, to
determine which device is the best
2. Evaluation of e-readers with touch screens
Figure 2. Map of common reading devices between this study and previous study
Figure 3. Map of tasks between this study and previous study
3 Methodology
The Human Research Ethics Committee at the Australian National University (ANU) approved this study.
The objective of this project was to evaluate from a purchasing perspective the quality and usability of e-readers
purchased within a limited budget.
Participants from the ANU College of Engineering and Computer Science were recruited for the experiments.
The participants evaluated the devices in pairs, so that one could act as the scribe and observer while the other
operated the device. On completing the evaluation of the first device, the participants swapped roles to evaluate
the next device. This pairing encouraged discussion of their observations and interactions with the e-readers.
3.1 Experiment 1
In experiment 1, 72 senior students (3rd year HCI course participants, Honours, Masters and PhD) took part in
the evaluations.
3.1.1 E-reader
Nine e-readers, having buttons only, a touch screen only or a combination of both, were used in the
evaluations. Each device was allocated an alphabetical letter based on its arrival sequence for easy
identification in the randomisation matrix, namely:
A. Barnes & Noble Nook
B. Kobo Touch
C. Amazon Kindle 4G
D. Sony PRS-600
E. iRiver Story HD
F. Elonex 621EB
G. Ectaco jetBook
H. Pandigital Novel
I. EZReader EZ601
Please see Appendix 1 for the full technical specifications of the above e-readers.
The randomisation was similar to a Latin square: two sets of the e-readers were labeled from A to I and
arranged so that no row contained the same letter twice. The pairs were balanced so that each letter appeared
first and second the same number of times. There were 36 pairs of e-readers for the 72 participants to
evaluate, which meant each device was evaluated four times as a first device and four times as a second
device. Table 2 shows an example of device ‘A’ being evaluated eight times with the other devices.
BA BH HE IF
GE CA BI GC
DC IE DA BG
DI EC CH EA
AF HD CF HF
CB AG GD CI
GF DB AH ED
FD IH EB AI
IG FE HG FB
Table 2. Matrix of the e-reader randomisation pair for device A.
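The balancing property described above, with each device appearing four times first and four times second across its eight pairings, can be reproduced with a simple circular orientation rule over all 36 unordered pairs. This is an illustrative sketch, not the randomisation procedure actually used in the study:

```python
from collections import Counter
from itertools import combinations

DEVICES = "ABCDEFGHI"  # nine e-readers labeled by arrival order

def balanced_pairs(devices):
    """All unordered pairs, oriented so that each device appears first
    exactly (n-1)//2 times and second (n-1)//2 times (n odd)."""
    n = len(devices)
    pairs = []
    for i, j in combinations(range(n), 2):
        # Circular distance from i to j; offsets 1..(n-1)//2 put i first,
        # the remaining offsets put j first, which balances the positions.
        d = (j - i) % n
        pairs.append((devices[i], devices[j]) if d <= (n - 1) // 2
                     else (devices[j], devices[i]))
    return pairs

pairs = balanced_pairs(DEVICES)
assert len(pairs) == 36  # 9 choose 2 pairings
firsts = Counter(p[0] for p in pairs)
seconds = Counter(p[1] for p in pairs)
assert all(firsts[d] == 4 and seconds[d] == 4 for d in DEVICES)
```

Shuffling the resulting list and assigning one ordered pair per participant couple preserves the balance while randomising the order of presentation.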
The evaluation commenced when each participant pair was given the first e-reader from a randomised pair,
along with a questionnaire. The participants received an overview of the study and its objectives. After
performing the tasks on the first e-reader, the tasks were repeated on the second e-reader.
The questionnaire included a scenario, the evaluation tasks rated on a Likert-like numerical scale, open-ended
questions to gain insight into improving the devices, and space for comments after each device evaluated. It also
gathered optional personal details, including name, gender, age, year of study, familiarity with e-readers and
follow-up contact details for purposes of clarification or removal from the experiment, in adherence with the
Human Research Ethics Committee guidelines of the Australian National University. Lastly, the questionnaire
requested subjective opinions and reasons on whether one of the two devices was preferred.
3.1.2 Scenario
The evaluations were designed around the same scenario to provide context to the experiment. The scenario was
based on a user finding a gift received a couple of months ago. Included in the box was a note to register the
device prior to use. The participant had to register the device, without the assistance of the user manual, by
completing a series of tasks.
For both experiments, the participants had to assume the batteries in the e-readers were fully charged, as
charging was not a requirement of the scenario.
Please see Appendix 2 for full details of the scenario.
3.1.3 Measurement of tasks
The tasks were measured on a Likert-like summative scale, rated from the most positive to the least
positive:
5 = Very good
4 = Good
3 = Ok
2 = Bad
1 = Very bad
The tasks were:
1. Turn the device on
2. Navigate to a document (called "Somedevice XY User Guide")
3. Open the document
4. Increase the font size
5. Navigate to a specific section to find the 'model number'
6. Navigate to a second document (called "Dates-2013") using the model number
7. Assess readability on the screen
Please see Appendix 3 for the full questionnaire.
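Each task rating was later summarised as a mean and standard deviation per device, as reported in the results tables in section 4. As a small sketch, using hypothetical ratings (the actual data are in the appendices):

```python
from statistics import mean, stdev

# Hypothetical Likert ratings (1-5) given by eight participants for one
# task on one device; each device/task cell in the results tables is
# summarised this way (mean and sample standard deviation).
ratings = [5, 4, 4, 3, 5, 4, 4, 4]
print(round(mean(ratings), 1), round(stdev(ratings), 2))
```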
3.2 Experiment 2
The results of the first experiment suggested that a comparison be conducted between the touch-screen-operated
devices (see the results in section 4.1 for details). The objective of the second experiment was to identify
preferences among e-readers with touch screens, excluding button-only e-readers.
In experiment 2, 64 mostly junior (first year) students took part in the evaluations. In a few instances, senior
students filled in as partners when a second junior student was not available.
3.2.1 E-readers
Three touch screen e-readers from experiment 1, and two additional e-readers were used in experiment 2
evaluations, namely:
B. Kobo Touch
D. Sony PRS-600
H. Pandigital Novel
J. Nook Touch
K. Kindle Touch
Experiment 2 applied a randomisation similar to that used in experiment 1. Table 3 shows an example for device
'B' of the randomisation of the touch screen e-reader pairs. The pairs in italics indicate data still to be collected
for experiment 2.
BD KH KD BH HD DJ JB KJ2
JH BD2 JK2 KH2 HD2 DK JD BK2
DB HJ JB2 KJ HB2 JD2 BK KD2
KB2 DB2 HJ2 BH2 HK2 BJ KB DH2
JH2 DH HB HK BJ2 DK2 DJ2 JK
Table 3. Matrix of touch screen e-reader pairs for device B
3.2.2 Questionnaire
The same scenario and tasks, rated on the Likert-like numerical scale from experiment 1, were used in the
evaluation of experiment 2.
Explicit instructions were given not to use the recent history, the 'continue reading' feature, or the 'date' search
on the e-readers.
Additional subjective open-ended questions asked about likes and dislikes of the e-readers; previous use or
ownership of an e-reader (and its model); and whether participants were regular users of a smartphone, tablet or
laptop. These questions were included to gain a better understanding of each user's experience with mobile
devices.
Please see Appendix 4 for the full updated questionnaire as used in experiment 2.
4 Results
Statistical analysis was performed using F-statistics based on a repeated-measures ANOVA with e-reader as the
within-subjects factor. A critical value of p < 0.05 was used for statistical significance in all analyses. Each of the
following sections reports the ANOVA results for one question, based on the usability tasks measured on a
Likert-like numerical scale from 1 (very bad) to 5 (very good).
4.1 Experiment 1
4.1.1 Turning on the e-reader
There were no significant differences found in the responses to the question on powering on the e-readers. Table
4 shows the results and ranks for the e-readers. The jetBook, Pandigital and Kobo are ranked at the top; four
e-readers (the Nook, iRiver, Elonex and EZReader) are ranked equal fourth in the middle; and the Sony and
Kindle are ranked at the bottom.
e-Reader Mean SD Rank
jetBook 4.1 0.83 1
Pandigital 4.0 0.76 2
Kobo 3.9 1.17 3
Nook 3.8 1.28 = 4
iRiver 3.8 1.28 = 4
Elonex 3.8 0.71 = 4
EZReader 3.8 1.28 = 4
Sony 3.4 1.06 8
Kindle 3.1 0.83 9
Table 4. Mean Ranking and Standard Deviation (SD) for turning the e-readers on
4.1.2 Finding the first document
There were no significant differences found in response to the question on finding the first document. Table 5
shows the results and ranks for the e-readers. The Sony and Kobo ranked at the top for finding a document; the
iRiver, Pandigital, jetBook and Nook ranked in the middle; and the Elonex, EZReader and Kindle ranked at the
bottom.
e-Reader Mean SD Rank
Sony 4.0 0.53 1
Kobo 3.5 1.41 2
iRiver 3.3 1.16 = 3
Pandigital 3.3 1.49 = 3
jetBook 3.0 1.31 5
Nook 2.9 1.46 6
Elonex 2.5 0.76 7
EZReader 2.4 1.60 8
Kindle 2.3 0.89 9
Table 5. Mean Ranking and Standard Deviation (SD) for finding the document
4.1.3 Opening the document
An analysis of variance showed that the responses to the question on opening the document were significantly
different between the e-readers, F(8,63) = 3.21, p = 0.004. Table 6 shows the results and rankings for opening
the document. The Kobo, Kindle and iRiver were ranked at the top; the Sony and Elonex were ranked equal
fourth; the jetBook and Pandigital ranked equal sixth; and the EZReader and Nook ranked at the bottom.
e-Reader Mean SD Rank
Kobo 4.6 0.52 1
Kindle 4.5 0.93 = 2
iRiver 4.5 0.76 = 2
Sony 4.4 0.74 = 4
Elonex 4.4 0.74 = 4
jetBook 4.3 1.39 = 6
Pandigital 4.3 0.89 = 6
EZReader 3.5 1.20 8
Nook 2.8 1.16 9
Table 6. Mean Ranking and Standard Deviation (SD) on opening the document: significant
4.1.4 Increasing the font size
An analysis of variance showed that increasing the font size was significantly different between the e-readers,
F(8,63) = 2.25, p = 0.03. Table 7 shows the results and rankings on the e-readers for increasing the font size.
The Pandigital, jetBook and iRiver were ranked in the top three positions; the Kindle and Elonex were ranked
equal fourth; the Nook was ranked sixth and the Sony seventh; and the Kobo and EZReader were ranked at the
bottom as equal eighth.
e-Reader Mean SD Rank
Pandigital 4.6 0.52 1
jetBook 4.1 1.13 2
iRiver 4.0 1.07 3
Kindle 3.9 1.46 = 4
Elonex 3.9 1.13 = 4
Nook 3.4 1.06 6
Sony 3.3 0.71 7
Kobo 2.8 1.75 8
EZReader 2.8 1.49 8
Table 7. Mean Ranking and Standard Deviation (SD) for increasing font size: significant
4.1.5 Navigate to specific section
An analysis of variance showed that navigating to a specific section within the document was significantly
different between the devices, F(8,63) = 3.43, p = 0.002. Table 8 shows the results and rankings on the e-readers
for this task. The jetBook, Pandigital and Kindle ranked at the top; the Sony, iRiver and Kobo ranked in the
middle; and the Elonex, Nook and EZReader ranked in the bottom positions.
e-Reader Mean SD Rank
jetBook 4.0 0.93 1
Pandigital 3.6 1.19 2
Kindle 3.5 1.07 3
Sony 3.4 1.41 4
iRiver 3.3 1.28 5
Kobo 2.8 1.39 6
Elonex 2.6 1.30 7
Nook 1.9 1.46 8
EZReader 1.6 0.92 9
Table 8. Mean Ranking and Standard Deviation (SD) for navigation to section 3: significant
4.1.6 Navigate to a second document
There were no significant differences found in navigating to a second document with the e-readers. Table 9
shows the results and ranks for this task. The iRiver, jetBook and Sony ranked in the top three positions; the
Pandigital, Elonex and Kobo ranked in the middle; and the Nook, Kindle and EZReader ranked in the bottom
three positions.
e-Reader Mean SD Rank
iRiver 4.0 0.93 1
jetBook 3.9 1.81 2
Sony 3.8 1.04 3
Pandigital 3.6 0.92 4
Elonex 3.4 1.69 5
Kobo 3.1 1.64 6
Nook 2.9 1.46 7
Kindle 2.6 1.41 8
EZReader 2.5 1.31 9
Table 9. Mean Ranking and Standard Deviation (SD) for navigation to second document
4.1.7 Readability on the screen
There were no significant differences found in responses assessing readability. Table 10 shows the results
and ranks for readability. The iRiver was ranked highest, followed by the Kobo, Kindle and jetBook equal
second and the Pandigital fifth. The Nook and EZReader were ranked equal sixth, with the Elonex and
Sony at the bottom.
e-Reader Mean SD Rank
iRiver 4.5 0.76 1
Kobo 4.0 0.53 = 2
Kindle 4.0 0.93 = 2
jetBook 4.0 0.93 = 2
Pandigital 3.9 0.35 5
Nook 3.8 0.71 = 6
EZReader 3.8 1.28 = 6
Elonex 3.5 1.20 8
Sony 3.3 1.28 9
Table 10. Mean Ranking and Standard Deviation (SD) for readability
4.1.8 E-Reader preference
For the question, "Did you like one of the two devices better?", an analysis of variance showed that the
responses were significantly different between the e-readers, F(8,54) = 2.33, p = 0.03. Table 11 shows the results
and rankings of the participants' preferences when comparing e-readers. The Kindle and iRiver ranked equal
first, followed by the Sony, jetBook and Pandigital in equal third position. The Elonex was ranked sixth, the
Kobo seventh, and both the Nook and EZReader ranked at the bottom.
e-Reader Mean SD Rank
Kindle 0.6 0.52 = 1
iRiver 0.6 0.52 = 1
Sony 0.5 0.53 = 3
jetBook 0.5 0.53 = 3
Pandigital 0.5 0.53 = 3
Elonex 0.4 0.52 6
Kobo 0.3 0.46 7
Nook 0.0 0.00 = 8
EZReader 0.0 0.00 = 8
Table 11. Mean Ranking and Standard Deviation (SD) for e-reader preference: significant
4.1.9 Pairwise comparison on e-readers with significant differences
Pairwise comparison is commonly used to estimate preference values of finite alternatives with respect to a
given criterion. This is part of the model structure of the analytical hierarchy process, a widely used multi-
criteria decision-making methodology. The main difficulty is to reconcile the inevitable inconsistency of the
pairwise comparison matrix elicited from the decision makers in real-world applications (Choo & William,
2004).
The elimination of designs or candidates can change the tabulated rankings of those designs or candidates that
remain under consideration. The determination of which design is “best” or which candidate is “preferred most”
may well be sensitive to the set of designs considered (Dym, et al., 2002).
The overall ranking we used was based on the qualitative and quantitative mean rankings of the questions that
indicated a significant difference between e-readers, as shown in 4.1.3 (opening a document), 4.1.4 (increasing
the font size), 4.1.5 (navigating to a specific section within the document) and 4.1.8 (e-reader preference). Table
12 shows the overall ranking of e-readers where a significant difference was noted.
e-Reader Qualitative ranking Quantitative ranking Overall ranking
Kindle =1 3 =1
Pandigital =3 1 =1
iRiver =1 4 =3
jetBook =3 2 =3
Sony =3 5 5
Elonex 6 6 6
Kobo 7 7 7
Nook =8 8 8
EZReader =8 9 9
Table 12. Qualitative, quantitative and overall ranking
The qualitative rankings were based on the number of 'yes' responses to the question on whether one of the two
devices was better. The quantitative rankings were based on the Likert-like numerical scale ratings for the
individual tasks where a statistically significant difference was found in the responses.
The qualitative and quantitative rankings are similar, within one rank for the majority of e-readers. This suggests
that our rankings can be relied on, as the two forms of data are largely consistent. The exceptions are the Kindle
(1st in the qualitative ranking and 3rd in the quantitative ranking); the Pandigital (3rd in the qualitative ranking
and 1st in the quantitative ranking); the iRiver (1st in the qualitative ranking and 4th in the quantitative ranking);
and the Sony (3rd in the qualitative ranking and 5th in the quantitative ranking). These differences are discussed
in section 4.3.
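The overall ordering in Table 12 can be reproduced by averaging each device's qualitative and quantitative ranks. This is a reconstruction; the report does not state the combination rule explicitly.

```python
# (Qualitative, quantitative) ranks as reported in Table 12.
ranks = {
    "Kindle": (1, 3), "Pandigital": (3, 1), "iRiver": (1, 4),
    "jetBook": (3, 2), "Sony": (3, 5), "Elonex": (6, 6),
    "Kobo": (7, 7), "Nook": (8, 8), "EZReader": (8, 9),
}
# Sort devices by the average of their two ranks; ties (e.g. the Kindle
# and Pandigital, both averaging 2.0) correspond to the "equal" overall
# ranks shown in the table.
overall = sorted(ranks, key=lambda d: sum(ranks[d]) / 2)
```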
4.2 Experiment 2
The data collection for experiment 2 is not complete, so this analysis is preliminary only.
4.2.1 Turn on the e-reader
Similar to experiment 1, there were no significant differences found in the responses to the question on
powering on the e-readers. Refer to Table 13 for the mean ranking and standard deviation (SD) for switching
the e-readers on. The Kobo ranked first, followed by the Kindle Touch (2nd) and the Sony (3rd), with the
Pandigital and Nook Touch ranked equal fourth.
e-Reader Mean SD Rank
Kobo 4.3 0.9 1
Kindle Touch 4.2 0.7 2
Sony 4.1 0.9 3
Pandigital 3.5 1.0 = 4
Nook Touch 3.5 1.0 = 4
Table 13. Mean Ranking and Standard Deviation (SD) for turning the e-reader on.
4.2.2 Find the first document
There were no significant differences found in response to the question on finding the first document. Refer to
Table 14 on the Mean Rankings and the Standard Deviation for finding a document. Sony ranked the highest
(1st) followed by Kobo (2nd), Kindle Touch (3rd), Nook Touch (4th) and the Pandigital (5th).
e-reader Mean SD Rank
Sony 4.0 0.6 1
Kobo 3.8 0.8 2
Kindle Touch 3.6 1.1 3
Nook Touch 3.5 1.4 4
Pandigital 3.0 1.4 5
Table 14. Mean Ranking and Standard Deviation (SD) for finding a document
4.2.3 Open the document
There were no significant differences found in response to the question on opening the document. Refer to Table
15 on the Mean Rating and Standard Deviation on opening the document. The Nook Touch ranked the highest
(1st) followed by Kobo (2nd), Sony (3rd), Pandigital (4th) and Kindle Touch (5th).
e-Reader Mean SD Rank
Nook Touch 4.9 0.4 1
Kobo 4.6 0.5 2
Sony 4.5 0.9 3
Pandigital 4.5 0.5 4
Kindle Touch 4.4 0.7 5
Table 15. Mean Ranking and Standard Deviation (SD) for opening the document
4.2.4 Increase the font size
There were no significant differences found in response to the question on increasing the font size. Refer to
Table 16 on the results of the mean ranking and standard deviation for increasing the font size. The Sony was
ranked in the top (1st), followed by Kindle Touch (2nd), Pandigital (3rd), Nook Touch (4th) and Kobo (5th).
e-Reader Mean SD Rank
Sony 4.0 0.7 1
Kindle Touch 3.7 1.3 2
Pandigital 3.5 1.0 3
Nook Touch 3.5 1.3 4
Kobo 3.4 1.3 5
Table 16. Mean Ranking and Standard Deviation (SD) for increasing the font size
4.2.5 Navigate to specific section
There were no significant differences found in response to the question on navigating to a specific section within
the document. Refer to Table 17 for the mean ranking and standard deviation for this task. The Nook Touch
ranked the highest (1st), followed by the Sony (2nd), Kindle Touch (3rd), Pandigital (4th) and Kobo (5th).
e-Reader Mean SD RANK
Nook Touch 4.0 1.2 1
Sony 3.4 1.3 2
Kindle Touch 3.1 1.3 3
Pandigital 3.1 1.4 4
Kobo 2.9 1.1 5
Table 17. Mean Ranking and Standard Deviation (SD) for navigating to section 3
4.2.6 Navigate to second document
An analysis of variance showed that navigating to a second document was significantly different between the e-
readers, F(4,59) = 2.77, p = 0.04. Refer to Table 18 for the mean ranking and standard deviation for navigating
to the second document. The Nook Touch and the Sony ranked in the top two positions, the Kobo and the
Kindle Touch ranked in the middle (third and fourth), and the Pandigital ranked at the bottom in fifth position.
e-Reader Mean SD RANK
Nook Touch 4.2 1.1 1
Sony 3.8 1.2 2
Kobo 3.6 1.3 3
Kindle Touch 3.4 1.3 4
Pandigital 2.6 1.5 5
Table 18. Mean Ranking and Standard Deviation (SD) for navigating to second document: significant
4.2.7 Readability on the screen
There were no significant differences found in responses assessing readability. Refer to Table 19 for the mean
ranking and standard deviation for readability. The Nook Touch ranked the highest (1st), followed by the Kobo
(2nd), Sony (3rd), Pandigital (4th) and Kindle Touch (5th).
e-Reader Mean SD Rank
Nook Touch 4.6 0.5 1
Kobo 4.4 0.8 2
Sony 4.3 0.7 3
Pandigital 4.3 0.5 4
Kindle Touch 4.1 0.9 5
Table 19. Mean Ranking and Standard Deviation for readability
4.2.8 e-reader preference
No significant differences were found in response to the question on which device was better. Refer to Table
20 for the mean ranking and standard deviation for user preference. The Nook Touch ranked the highest (1st),
followed by the Kindle Touch (2nd), Sony (3rd), Kobo (4th) and Pandigital (5th).
e-Reader Mean SD Rank
Nook Touch 0.6 0.5 1
Kindle Touch 0.5 0.5 2
Sony 0.4 0.5 3
Kobo 0.3 0.5 4
Pandigital 0.2 0.4 5
Table 20. Mean Ranking and Standard Deviation on user preference
4.2.9 Pairwise comparison for Q6
The overall ranking is based on the qualitative and quantitative mean rankings of the e-readers for the question
with a significant difference, as shown in 4.2.6 (navigate to second document). Refer to Table 21 for the
pairwise comparison for Q6.
There is no change in the qualitative, quantitative and overall rankings for the Nook Touch (1st) and Pandigital
(5th). The Sony (3rd in the qualitative ranking and 2nd in the quantitative ranking) and the Kobo (4th in the
qualitative ranking and 3rd in the quantitative ranking) are each within one rank. The exception is the Kindle
Touch, which ranked 2nd in the qualitative ranking and 4th in the quantitative ranking.
e-reader Qualitative rank Quantitative rank Overall rank
Nook Touch 1 1 1
Sony 3 2 2
Kindle Touch 2 4 3
Kobo 4 3 4
Pandigital 5 5 5
Table 21. Qualitative, quantitative and overall ranking for Q6
4.2.10 Pairwise comparison on Q3 – Q6
We have used comparisons on the questions which were significant in the previous experiment, as we assume
that when the full results are collected, questions 3 to 5 will again be significant; question 6 has already been
shown to be significant in experiment 2.
The overall ranking is based on the qualitative and quantitative mean rankings of the e-readers with a significant
difference from experiment 1 (Q3-Q5) and experiment 2 (Q6). Refer to Table 22 for the pairwise comparison for
Q3-Q6.
There is no difference in the qualitative, quantitative and overall rankings for the Nook Touch (1st) and
Pandigital (5th). The Kindle Touch and the Kobo are within one rank of each other. The exception is the Sony,
which ranked 4th in the qualitative rankings and 2nd in the quantitative rankings.
e-reader Qualitative rank Quantitative rank Overall rank
Nook Touch 1 1 1
Kindle Touch 2 3 2
Sony 4 2 3
Kobo 3 4 4
Pandigital 5 5 5
Table 22. Qualitative, quantitative and overall ranking for Q3-Q6.
4.3 Discussion
Interaction with e-readers should be simple, effective and a pleasant experience, as the device emulates a paper
book. The power ON/OFF buttons on the Sony and the Kindle were not easily noticeable. On the Sony, three similar
buttons of the same shape and size were located at the top edge of the device and the label was not distinct due
to the reflective surface and the angle at which the device was commonly held. A comment on the Sony was
“doesn’t stand out from the other buttons on the same area”. The power switch on the Kindle was positioned on
the lower edge of the device and the button was not labeled. Some of the comments on the Kindle power button
were “button hard to find at the bottom - unusual position”; “hard to find switch button”; and “hard to locate the
button, button un-labeled”. The ANOVA results showed no significant differences in the e-readers in response
to the question for turning the e-readers on.
Some devices, such as the Sony, Elonex, jetBook, iRiver, Pandigital and EZReader, stored recently read e-books
in a 'currently reading' or recent history folder. This extremely useful feature allows the reader to go straight to
their e-book without being distracted by cumbersome navigation and searching. In the experiments there was
only one of each e-reader, and when an e-reader became available for the next pair to evaluate there was not
sufficient time to refresh the reading history folder, as doing so generally requires a full factory reset. It was a
compromise to leave the documents within the folder, especially as alternate methods for finding the document
were supported.
The recent history was not an obvious feature to all participants. While users were instructed that if the e-reader
‘accidentally’ started in the target document, then they should back out of it and navigate to it, it is clear from
the comments that sometimes they ‘cheated’.
Typing a search query on the Kindle was cumbersome and required too much effort to complete the full query,
as it involved navigating the keyboard via the 5-way controller. Prevention of keyboard slips and incorrect
spelling was not supported, which frustrated users. An autocomplete feature would be a suitable design solution
to reduce the complexity of typing and error management.
Buttons are designed to afford pressing, triggering a visible change on screen. The search button on the EZReader
was unresponsive and disconcerting, as users were unsure how to continue. If a button is displayed it must be
associated with an action, otherwise it leads to user frustration and wastes valuable real estate on the interface.
Although the participants encountered some difficulty finding the document on some of the e-readers, the
ANOVA results show there was no significant difference between the e-readers.
Opening a document was found to be intuitive on some e-readers and challenging on others. This is also
reflected in the ANOVA results, which showed a significant difference between the e-readers in supporting this
task. For example, tapping on the document title opened the document on the Kobo, Sony and Pandigital. The
iRiver had two "enter" buttons, one situated above the keyboard and the other within the keyboard, for easy
selection. The center buttons on the multi-controllers of the Kindle, Elonex and jetBook behaved as the "enter"
key.
Comments on the EZReader for opening a document were "hard to find", "tried the arrows keys first" and "a bit
slow". The "enter" button on the Nook touch panel was not identifiable because it did not meet users' concept
of a button and was not labeled. A comment on the Nook for opening the document was "open button is not
obvious, task a bit to find. Fact screen is not touch is not obvious". Users could not tell whether the "enter" key
was an actionable button or a decorative element.
The ANOVA results for increasing the font size showed a significant difference between the e-readers in
accomplishing this task. The Pandigital supported alternate methods to change the font size: for example, within
the document opened in the previous task, zoom icons (both plus and minus) were available in the bottom right
corner of the screen, and selecting the menu button displayed a toolbar at the top of the screen with additional
buttons such as "My Library", "Dictionary", "TOC", "Bookmarks", "Go To", "Font size" and "Next".
A font size adjustment button was visible on the multi-controller of the jetBook, on the keyboard of the iRiver,
and as a stepper on the Elonex. The option to change the font size on the Kindle was accessible via the "menu"
button. Comments on the Nook on changing the font size were "menu a bit tricky to scroll through" and "not
intuitive to find". The Kobo's touch screen was divided into zones, with each area of the screen reserved for a
function: for example, tapping in the middle or top right of the screen turned a page forward and tapping on the
top left of the screen turned the page backwards. A tap at the bottom of the screen was needed to unfold the font
size icon, and a comment was "hidden toolbar, obvious icon".
On the EZReader a "font" option was displayed via the "option" button, but this was ambiguous as only the font
type could be changed, not the font size. Comments on changing the font size on the EZReader were "no
idea where to navigate"; "had to select font first, that wasn't any good so we kept looking through other menus";
and "can't find the setting". Each e-reader had its own method of accomplishing the task of changing the font
size, which accounts for the difference identified in the analysis of variance.
On the Kobo the 'table of contents' option was available via the concealed toolbar, similar to accessing the font
size option. Comments on the Elonex were "first tried to use 'explore' button and press keyword 'model
number', no response came out. Easy to find when using 'content' button". The 'content' button was accessible
via the "menu" button.
There were no visual cues for a search or table of contents feature on the Nook, and users found it confusing.
Comments on navigation in the Nook were "couldn't figure out how to scroll the screen", "not found" and "kept
pressing the wrong button (left button sent pages forward)".
Similar to the Nook, there were no visual cues in the EZReader for reaching the specific section within the
document. Comments on the EZReader for this task were "cannot find it", "we cannot find it", "needed 15%
hint for go to page", "it's impossible to find that paragraph" and "can't find the document". These variations in
navigating to a specific section within a document were also identified in the analysis of variance results.
Navigating to and opening a second document reinforced the learning experience from the previous tasks of
finding and opening a document, and no significant differences were found in the ANOVA results.
There was no significant difference for readability. Almost all the e-readers had e-Ink screens; the Pandigital had
a SiPix screen and the jetBook an LCD screen, but these did not affect readability. SiPix technology uses white
particles suspended in black ink, while e-Ink technology has both black and white particles suspended in a clear
fluid. A comment on the Sony was "screen is dull and not enough backlight for contrast", but it was not
significantly different from the other e-readers.
A significant difference in the analysis of variance results was found when participants were asked which
e-reader they preferred from the pair evaluated. This was then used to conduct a comparison to determine which
e-readers were significantly better overall, based on counting the number of times each device was considered
better than its paired device.
There were four e-readers whose qualitative rankings were more than one step different from their quantitative
rankings. The Kindle, Sony and iRiver were all higher by two to three steps, and the Pandigital was two steps
lower. This could be due to the Pandigital being an unknown brand compared to popular brands such as the
Kindle and Sony, though this does not explain the iRiver, which showed the largest difference between the
qualitative (1st) and quantitative (4th) rankings. It is possible that brand recognition is the primary explanation
for the Kindle and Sony. For the iRiver, it appears that the overall interface is visually appealing to users, which
would account for the high qualitative ranking, but the functional aspects and the aesthetics of the interface are
not in complete harmony with users' expectations.
Comments on the iRiver and EZReader were that the keyboards were redundant, the buttons were too small,
and overall not very useful. A full list of participants' comments can be viewed in Appendix 6. Here is a
summary of the comments:
1) The e-readers were perceived as either intuitive or non-intuitive
2) Some e-readers were found to be user friendly and easy to use
3) Participants expected faster and more responsive feedback
4) E-readers with unlabeled buttons and icon-only controls were confusing
5) The lack of contrasting colour between buttons and the interface was deceptive
6) A preference for a device with a touch screen interface
Grounded theory is the systematic generation of theory from systematic research (Corbin & Strauss, 2008).
A grounded theory approach was considered to interpret the qualitative data, but the comments were found to
be too short and insufficient to code and form concepts from, for a meaningful analysis.
In this study, three e-readers used touch technology: the Pandigital, the Sony and the Kobo. The Pandigital and
the Sony also have button-operated interfaces, while the Kobo is touch only. The qualitative rankings for these
devices were Pandigital (3rd) and Sony (3rd), and the quantitative rankings were Pandigital (1st) and Sony (5th).
The Kobo is consistent, with both its qualitative and quantitative ranking at 7th. It is clear from the tasks
highlighted in 4.1.9 (pairwise comparison) that touch interfaces were not consistently better: the three devices
with touch interfaces are distributed among the top, middle and bottom positions. Further, the only full touch
e-reader (the Kobo) is among the bottom three.
It is worth noting that most new devices available on the market in 2014 are touch devices, and over 60% of the
participants commented that they would either prefer a device with a touch screen interface or would improve
the device by adding one.
In experiment 2, the preliminary analysis of variance showed a significant difference between the e-readers only
for navigating to and opening a second document. This could be attributed to the instruction not to use the
'date' search; that is, mentioning 'search' could have predisposed the participants to use the search feature to
find the documents. The second document could be found only by a title search. Content search was not
possible because the second document was very short, with very little content to search on. See Appendix 7.7
for the list of 'Dates 2013' in the second document.
The search and table of contents features have not been investigated in previous studies, most likely because
they are not features associated with paper books. However, on an electronic device a search feature is critical to
finding information quickly. The search feature should be investigated in future work.
The overall ranking is based on the qualitative and quantitative mean rankings of the e-readers with a significant
difference from experiment 1 (Q3-Q5) and experiment 2 (Q6), to understand the difference between the sets.
Preliminary results from experiment 2 show that touch devices are preferred over the Pandigital, which had
ranked equal 1st in experiment 1. There are a number of possible explanations. In the first experiment there was
only one full touch e-reader; the other e-readers were either button-only devices or a combination of buttons and
touch. Perhaps the Pandigital ranked at the top in the first experiment because it was compared mostly to button
devices, whereas in the second experiment it was compared to touch devices. That is, the Pandigital's button
interface may be better than the button interfaces of most e-readers in our study, and it may even have gained
extra 'points' for having touch. In the second experiment, however, the Pandigital's touch interface was perhaps
used more and did not compare well to the other touch e-readers.
A second explanation could be a selection difference in the participants. Perhaps the senior students evaluated the
devices more objectively and were not swayed by preconceptions. In the second experiment, a few students
commented on brand recognition, for example "I don't recognise the manufacturer so it can't be any good"
(about the Pandigital). Comments after the experiment also suggested that the Kobo interface is similar to a
smartphone, with the home button at the bottom centre of the interface, and that this was "a very clever design".
5 Conclusions
In experiment 1, the ANOVA results showed significant differences between the e-readers for the tasks of
opening the document, increasing the font size, and navigating to a specific section within the document, as
well as for user preference between the two devices in each evaluated pair.
In experiment 2, the only significant difference between the e-readers identified by the ANOVA was for the
task of navigating to and opening a second document.
In this study the tasks were rated on a Likert-like numerical scale, from which the mean, standard deviation
and analysis of variance (ANOVA) were calculated for each e-reader. For the questions with significant
ANOVA results, a further comparison of the qualitative and quantitative rankings was used to determine which
e-readers were better overall.
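As a minimal sketch of this pipeline (using hypothetical ratings, not the study's data), the Likert labels from the questionnaire can be mapped to a 1-5 scale and summarised per device:

```python
from statistics import mean, stdev

# Map the questionnaire's Likert labels to the 1-5 scale used in the analysis.
SCALE = {"V. bad": 1, "Bad": 2, "OK": 3, "Good": 4, "V. good": 5}

# Hypothetical responses for one task on two devices (not the study's data).
responses = {
    "Kindle": ["Good", "V. good", "OK", "Good"],
    "Kobo": ["OK", "Bad", "Good", "OK"],
}

# Convert labels to scores, then summarise each device.
scores = {d: [SCALE[r] for r in rs] for d, rs in responses.items()}
for device, xs in sorted(scores.items()):
    print(device, round(mean(xs), 2), round(stdev(xs), 2))
```

The per-question ANOVA over such scores is what Appendix 5 tabulates.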
In the overall ranking for experiment 1, the Kindle, Pandigital, iRiver and jetBook took the top four
positions. The Kindle, iRiver and jetBook are button-operated e-readers, whereas the Pandigital is a combination
of buttons and touch. E-readers with buttons are intuitive to use: users know when they have pressed a physical
button, whereas the response to tapping a button on a touch screen is not always immediate, so the user may
continue to tap the screen.
In the overall ranking for experiment 2, the touch devices ranked higher than the Pandigital. The touch e-readers
were released later than the Pandigital, which may account for their better performance and speed of response;
alternatively, the Pandigital's touch interface may not be as good as its button interface. Almost all the junior
participants were familiar with using a touch-screen device, and 95% indicated they were regular users of a smartphone.
From the evaluations, it is clear that a good-quality device can be obtained within a limited budget, and that
the latest device on the market is not necessarily the one that best meets a user's requirements.
5.1 Future work
This study completed an analysis of nine e-readers and conducted a preliminary analysis of five touch e-
readers. The next stage of this project is to complete the evaluation for experiment 2, which requires more
data collection and further analysis. Directions for further study could include:
Updating the scenario to evaluate tasks for continuous reading
Evaluating the e-readers by concealing the brand names
Evaluating the usability of an e-reader to read one-handed whilst drinking coffee, holding an umbrella,
standing on a bus or train and so on
Evaluating the search and table of contents features in e-readers
6 Bibliography
Behler, A. & Lush, B., 2010. Are You Ready for E-readers? The Reference Librarian, 52(1-2), pp. 75-87.
Choo, E. U. & Wedley, W. C., 2004. A common framework for deriving preference values from pairwise
comparison matrices. Computers & Operations Research, 31(6), pp. 893-908.
Corbin, J. & Strauss, A., 2008. Basics of Qualitative Research. 3rd ed. s.l.:Sage.
Dym, C. L., Wood, W. H. & Scott, M. J., 2002. Rank Ordering Engineering Designs: Pairwise Comparison
Charts and Borda Counts. Research in Engineering Design, Volume 13, pp. 236-242.
Gibson, C. & Gibb, F., 2011. An evaluation of second generation ebook readers. 29(3), pp. 303-319.
Heikkilä, H., 2011. eReading User Experience: eBook devices, reading software & contents. eReading Media
Use, Experience & Adoption, 10 January, Volume 1.
Nielsen, J., 2009. Kindle 2 Usability Review. [Online]
Available at: http://www.nngroup.com/articles/kindle-2-usability-review/
[Accessed 10 March 2014].
Nielsen, J., 2010. iPad and Kindle Reading Speeds. [Online]
Available at: http://www.nngroup.com/articles/ipad-and-kindle-reading-speeds/
[Accessed 10 March 2014].
Patel, N., 2007. Kindle sells out in 5.5 hours. [Online]
Available at: http://www.engadget.com/2007/11/21/kindle-sells-out-in-two-days/
[Accessed 5 April 2014].
Pearson, J., Buchanan, G. & Thimbleby, H., 2010. HCI Design Principles for eReaders. Swansea and London,
ACM, New York, USA, pp. 15-24.
Rainie, L. & Smith, A., 2013. Tablet and E-reader Ownership Update. [Online]
Available at: http://www.pewinternet.org/2013/10/18/tablet-and-e-reader-ownership-update/
[Accessed 8 April 2014].
Richardson, J. & Mahmood, K., 2012. eBook readers: user satisfaction and usability issues. Emerald insight,
30(1), pp. 170-185.
Saaty, T. L., 1990. How to make a decision: The Analytic Hierarchy Process. European Journal of Operational
Research, Volume 48, pp. 9-26.
Siegenthaler, E. et al., 2012. The effects of touch screen technology on the usability of e-reading devices.
Journal of Usability Studies, May, 7(3), pp. 94-104.
Siegenthaler, E., Wurtz, P. & Groner, R., 2010. Improving the usability of e-book readers. Journal of Usability
Studies, November, 6(1), pp. 25-38.
Siegenthaler, E., Wyss, M., Schmid, L. & Wurtz, P., 2012. LCD vs E-ink: An analysis of Reading Behaviour.
Journal of Eye Movement Research, pp. 1-7.
Zickuhr, K. & Rainie, L., 2014. E-Reading Rises as Device Ownership Jumps. [Online]
Available at: http://www.pewinternet.org/2014/01/16/e-reading-rises-as-device-ownership-jumps/
[Accessed 5 April 2014].
7 Appendices
7.1 Appendix 1: Technical specification on e-readers
Name        ID   Cost    Screen size   Screen type   Resolution (pixels)   Buttons                 Touch screen   Weight (g)
Nook        A    69.90   6 inches      e-Ink         600 x 800             Yes, for page turning   Yes            324
Kobo        B    60.00   6 inches      e-Ink Pearl   600 x 800             No                      Yes            185
Kindle      C    57.85   6 inches      e-Ink Pearl   600 x 800             Yes                     No             170
Sony        D    41.79   6 inches      e-Ink         600 x 800             Yes, for navigation     Yes            284
iRiver      E    52.00   6 inches      e-Ink         600 x 800             Yes                     No             202
Elonex      F    41.00   6 inches      e-Ink Pearl   600 x 800             Yes                     No             186
jetBook     G    61.28   5 inches      e-Ink Pearl   600 x 800             Yes                     No             215
Pandigital  H    36.85   6 inches      siPiX         600 x 800             Yes                     Yes            264
EZReader    I    55.00   6 inches      e-Ink         480 x 800             Yes                     No             236
7.2 Appendix 2: Scenario
Tutorial and experiment on e-readers
Professor Tom Gedeon, RSCS, ANU
Experiment:
In this exercise you will get some experience in a simple evaluation of a user interface of a device. We have a
number of different e-readers, each with its own user interface. We will form groups of 2 students and give
each group one of these e-readers together with a task to be performed, and then swap over. After the task the
group will discuss their observations about the user interface of their particular e-reader.
A second purpose of this exercise is to be a pilot study in how we might evaluate e-readers. We will collect
your notes and these will help me design a more formal study.
Experiment Setting:
You have been given an e-reader as an unexpected gift, and you haven't opened the box until now; you have had
it since sometime last year. You find a slip of paper which says "Important: check registration deadline - device
not usable if not registered by this date" and which tells you to open the "Somedevice XY User Guide" document,
find section 3, paragraph 2, and find the model number, and then look for that number in the "Dates-2013"
document to check how long you have to register the e-reader. You have no manual for the device, so you just
have to try and figure it out as you go along.
Task:
Appoint one of your group as a scribe.
Please find out your registration deadline and make observations about the device, with the scribe recording
interesting or important observations, whether positive, neutral or negative, on the form provided.
Select from a Likert scale (V. good / Good / OK / Bad / V. bad) for all of the 7 points below.
Write in some comments into the form on the following pages.
Form sections on following page:
1. Turning it on
2. Finding the right document (called “Somedevice XY User Guide”)
[Some hints which you may need on some devices: the author is Unknown; and some devices have multiple
places to store documents/books – it’s definitely on there somewhere, I’ve checked!]
3. Opening the document
4. Increasing the font size (please assume it’s too small to read as is)
5. Navigating to section 3 (about 15% into the book), and finding the end of the 2nd (short) paragraph to find
the ‘model number’
6. Going to the other document (“Dates-2013” and use the model number from step 5).
7. Readability of the screen
i. How would you improve this device?
ii. Any other comments:
7.3 Appendix 3: Questionnaire
0. Device letter:
1. Turning it on
V. good
Good
OK
Bad
V. bad
2. Finding the right document ( “Somedevice XY User Guide”)
V. good
Good
OK
Bad
V. bad
3. Opening the document
V. good
Good
OK
Bad
V. bad
4. Increasing the font size
V. good
Good
OK
Bad
V. bad
5. Navigating to section 3, and finding the 2nd paragraph, model number is:
V. good
Good
OK
Bad
V. bad
6. Going to “Dates-2013” write in the date (year) here:
V. good
Good
OK
Bad
V. bad
7. Readability of the screen
V. good
Good
OK
Bad
V. bad
i. How would you improve this device?
ii. Any other comments:
Scribe (1st device) Scribe (2nd device)
Name
Gender
Age
Familiarity with
eReaders
Year of study
Can we contact you to
follow-up this
experiment?
Last question:
Did you like one of the two devices better?
If so, why?
Thank you!
7.4 Appendix 4: Questionnaire for Experiment 2
The scenario was the same as the first experiment.
Device letter:
1. Turning it on
V. good Good OK Bad V. bad
2. Finding the right document ( “Somedevice XY User Guide”) V. good Good OK Bad V. bad
3. Opening the document V. good Good OK Bad V. bad
4. Increasing the font size V. good Good OK Bad V. bad
5. Navigate to section 3, and find the 2nd paragraph, model number is: V. good Good OK Bad V. bad
6. Navigate to “Dates-2013” write in the date (year) here: V. good Good OK Bad V. bad
7. Readability of the screen V. good Good OK Bad V. bad
i. How would you improve this device?
ii. What did you like about this device?
iii. What did you dislike about this device?
iv. Any other comments:
I need both your names on this form.
Person 1 (usually scribe 1st
device)
Person 2 (usually for 2nd device)
Name
Gender
Age
Do you own, or have you used, an e-reader before?
If so, what device was it?
Which of these do you own or use
regularly?
Smartphone
Tablet
Laptop
Year of study
Can we contact you to follow-up
this experiment?
Last question: Did you like one of the two devices better?
If so, why?
Thank you!
7.5 Appendix 5: ANOVA calculations
7.5.1 Experiment 1
1) Turning the e-reader on
Anova: Single Factor Q1
SUMMARY
Groups Count Sum Average Variance
A 8 30 3.75 1.64
B 8 31 3.88 0.98
C 8 25 3.13 0.70
D 8 27 3.38 1.13
E 8 30 3.75 1.64
F 8 30 3.75 0.50
G 8 33 4.13 0.70
H 8 32 4.00 0.57
I 8 30 3.75 1.64
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 5.944444 8 0.743056 0.704 0.687 2.089
Within Groups 66.5 63 1.055556
Total 72.44444 71
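Since only the group summaries (count, sum, variance) are tabulated here, the F statistics can be cross-checked directly from them without the raw scores. A sketch using the Q1 figures above (the tabulated variances are rounded, so the result agrees to about two decimal places):

```python
def anova_from_summaries(groups):
    """One-way ANOVA from per-group (count, sum, variance) summaries.

    Returns (F, df_between, df_within)."""
    k = len(groups)
    n_total = sum(n for n, _, _ in groups.values())
    grand_mean = sum(s for _, s, _ in groups.values()) / n_total
    # Between-groups sum of squares from the group means...
    ss_between = sum(n * (s / n - grand_mean) ** 2 for n, s, _ in groups.values())
    # ...and within-groups sum of squares from the sample variances.
    ss_within = sum((n - 1) * v for n, _, v in groups.values())
    df_b, df_w = k - 1, n_total - k
    return (ss_between / df_b) / (ss_within / df_w), df_b, df_w

# Q1 ("turning the e-reader on"), experiment 1: device -> (count, sum, variance)
q1 = {"A": (8, 30, 1.64), "B": (8, 31, 0.98), "C": (8, 25, 0.70),
      "D": (8, 27, 1.13), "E": (8, 30, 1.64), "F": (8, 30, 0.50),
      "G": (8, 33, 0.70), "H": (8, 32, 0.57), "I": (8, 30, 1.64)}
f_stat, df_b, df_w = anova_from_summaries(q1)
print(round(f_stat, 3), df_b, df_w)  # close to the table: F = 0.704, df = 8 and 63
```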
2) Finding a document
Anova: Single Factor Q2
SUMMARY
Groups Count Sum Average Variance
A 8 23 2.9 2.1
B 8 28 3.5 2.0
C 8 18 2.3 0.8
D 8 32 4.0 0.3
E 8 26 3.3 1.4
F 8 20 2.5 0.6
G 8 24 3.0 1.7
H 8 26 3.3 2.2
I 8 19 2.4 2.6
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 20.75 8 2.59375 1.72 0.11 2.09
Within Groups 95.25 63 1.511905
Total 116 71
3) Opening a document
Anova: Single Factor Q3
SUMMARY
Groups Count Sum Average Variance
A 8 22 2.75 1.36
B 8 37 4.63 0.27
C 8 36 4.50 0.86
D 8 35 4.38 0.55
E 8 36 4.50 0.57
F 8 35 4.38 0.55
G 8 34 4.25 1.93
H 8 34 4.25 0.79
I 8 28 3.50 1.43
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 23.75 8 2.96875 3.218 0.004 2.089
Within Groups 58.125 63 0.922619
Total 81.875 71
4) Increasing the font size
Anova: Single Factor Q4
SUMMARY
Groups Count Sum Average Variance
A 8 27 3.4 1.13
B 8 22 2.8 3.07
C 8 31 3.9 2.13
D 8 26 3.3 0.50
E 8 32 4.0 1.14
F 8 31 3.9 1.27
G 8 33 4.1 1.27
H 8 37 4.6 0.27
I 8 22 2.8 2.21
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 26 8 3.25 2.25 0.03 2.09
Within Groups 90.875 63 1.44246
Total 116.875 71
5) Navigating to a specific section within the document
Anova: Single Factor Q5
SUMMARY
Groups Count Sum Average Variance
A 8 15 1.88 2.13
B 8 22 2.75 1.93
C 8 28 3.50 1.14
D 8 27 3.38 1.98
E 8 26 3.25 1.64
F 8 21 2.63 1.70
G 8 32 4.00 0.86
H 8 29 3.63 1.41
I 8 13 1.63 0.84
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 41.5 8 5.1875 3.43 0.002 2.089
Within Groups 95.375 63 1.513889
Total 136.875 71
6) Navigating to a second document
Anova: Single Factor Q6
SUMMARY
Groups Count Sum Average Variance
A 8 23 2.88 2.13
B 8 25 3.13 2.70
C 8 21 2.63 1.98
D 8 30 3.75 1.07
E 8 32 4.00 0.86
F 8 27 3.38 2.84
G 8 31 3.88 3.27
H 8 29 3.63 0.84
I 8 20 2.50 1.71
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 19.52778 8 2.440972 1.26 0.28 2.09
Within Groups 121.75 63 1.93254
Total 141.2778 71
7) Readability
Anova: Single Factor Q7
SUMMARY
Groups Count Sum Average Variance
A 8 30 3.8 0.50
B 8 32 4.0 0.29
C 8 32 4.0 0.86
D 8 26 3.3 1.64
E 8 36 4.5 0.57
F 8 28 3.5 1.43
G 8 32 4.0 0.86
H 8 31 3.9 0.13
I 8 30 3.8 1.64
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 7.944444 8 0.993056 1.13 0.36 2.09
Within Groups 55.375 63 0.878968
Total 63.31944 71
8) Number of yes responses
Anova: Single Factor No. of Yes responses
SUMMARY
Groups Count Sum Average Variance
A 8 0 0.0 0.00
B 8 2 0.3 0.21
C 8 5 0.6 0.27
D 8 4 0.5 0.29
E 8 5 0.6 0.27
F 8 3 0.4 0.27
G 8 4 0.5 0.29
H 8 4 0.5 0.29
I 8 0 0.0 0.00
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 3.75 8 0.46875 2.25 0.035 2.089
Within Groups 13.125 63 0.208333
Total 16.875 71
9) Overall Ranking Calculations
For Q3-Q5 with significant differences
No. Yes Rank Total AVE Rank Overall Rank
A Nook 0.00 8 2.7 8 8.0 8
B Kobo 0.25 7 3.3 7 7.0 7
C Kindle 0.63 1 4.0 3 2.0 1
D Sony 0.50 3 3.7 5 4.0 5
E iRiver 0.63 1 3.9 4 2.5 3
F Elonex 0.38 6 3.6 6 6.0 6
G Jetbook 0.50 3 4.0 2 2.5 3
H Pandigital 0.50 3 4.0 1 2.0 1
I EZReader 0.00 8 2.6 9 8.5 9
[Chart: overall ranking positions (1 = best, 9 = worst) of the nine e-readers for Q3-Q5]
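The overall rank column above can be reproduced mechanically: average each device's two ranks (the number-of-yes rank and the mean-score rank), then rank the averages, with ties sharing the better position. A sketch using the ranks from the Q3-Q5 table:

```python
# Ranks taken from the Q3-Q5 table above: device -> (No.-Yes rank, mean rank).
ranks = {
    "Nook": (8, 8), "Kobo": (7, 7), "Kindle": (1, 3),
    "Sony": (3, 5), "iRiver": (1, 4), "Elonex": (6, 6),
    "Jetbook": (3, 2), "Pandigital": (3, 1), "EZReader": (8, 9),
}

# Average the two ranks per device, then assign competition-style overall
# ranks: sort the averages and take each device's position, so tied
# averages share the better position, as in the table.
avg = {d: (a + b) / 2 for d, (a, b) in ranks.items()}
order = sorted(avg.values())
overall = {d: order.index(v) + 1 for d, v in avg.items()}

print(overall["Kindle"], overall["Pandigital"], overall["EZReader"])  # 1 1 9
```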
Finally, after judgments have been made on the impact of all the elements and priorities have been computed for
the hierarchy as a whole, sometimes, and with care, the less important elements can be dropped from further
consideration because of their relatively small impact on the overall objective. The priorities can then be
recomputed throughout, either with or without changing the remaining judgments (Saaty, 1990).
Q2-Q4,7 ONLY
No. Yes Rank Total AVE Rank Overall Rank
A Nook 0.00 8 3.0 8 8.0 8
B Kobo 0.25 7 3.5 6 6.5 7
C Kindle 0.63 1 3.3 7 4.0 5
D Sony 0.50 3 3.8 2 2.5 2
E iRiver 0.63 1 3.9 1 1.0 1
F Elonex 0.38 6 3.5 5 5.5 6
G Jetbook 0.50 3 3.8 3 3.0 3
H Pandigital 0.50 3 3.8 4 3.5 4
I EZReader 0.00 8 2.8 9 8.5 9
[Chart: overall ranking positions (1 = best, 9 = worst) of the nine e-readers for Q2-Q4 and Q7]
It is unclear whether a comparison for Q2-Q4 and Q7 is meaningful, because Q2 and Q7 were not significant on the
ANOVA. Following Saaty's approach, if we decide which questions seem important we get a different result.
The most important would be finding a document, opening a document, increasing the font size and readability,
as compared with navigating to a specific section within the document and navigating to a second document. More
analysis would be required.
7.5.2 Experiment 2
1) Turning the e-reader on
Anova: Single Factor Q1
SUMMARY
Groups Count Sum Average Variance
B 13 56 4.31 0.90
D 12 49 4.08 0.81
H 11 42 3.82 0.96
J 14 49 3.50 1.04
K 14 59 4.21 0.49
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 5.804971 4 1.451243 1.741 0.153 2.528
Within Groups 49.1794 59 0.833549
Total 54.98438 63
2) Finding a document
Anova: Single Factor Q2
SUMMARY
Groups Count Sum Average Variance
B 13 50 3.8 0.64
D 12 48 4.0 0.36
H 11 33 3.0 2.00
J 14 52 3.7 0.99
K 14 51 3.6 1.17
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 6.673764 4 1.668441 1.65 0.17 2.53
Within Groups 59.76374 59 1.012945
Total 66.4375 63
3) Opening a document
Anova: Single Factor Q3
SUMMARY
Groups Count Sum Average Variance
B 13 60 4.6 0.26
D 12 54 4.5 0.82
H 11 49 4.5 0.27
J 14 68 4.9 0.13
K 14 61 4.4 0.55
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 2.017233 4 0.504308 1.253717 0.298407 2.527907
Within Groups 23.73277 59 0.40225
Total 25.75 63
4) Increasing the font size
Anova: Single Factor Q4
SUMMARY
Groups Count Sum Average Variance
B 13 44 3.4 1.76
D 12 48 4.0 0.55
H 11 39 3.5 1.07
J 14 49 3.5 1.81
K 14 52 3.7 1.60
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 2.838661 4 0.709665 0.510 0.729 2.528
Within Groups 82.16134 59 1.392565
Total 85 63
5) Navigating to a specific section within the document
Anova: Single Factor Q5
SUMMARY
Groups Count Sum Average Variance
B 13 38 2.9 1.24
D 12 44 3.7 0.61
H 11 34 3.1 1.89
J 14 56 4.0 1.38
K 14 44 3.1 1.67
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 10.78688 4 2.69672 1.98 0.11 2.53
Within Groups 80.21312 59 1.359544
Total 91 63
6) Navigating to a second document
Anova: Single Factor Q6
SUMMARY
Groups Count Sum Average Variance
B - Kobo 13 47 3.62 1.59
D- Sony 12 45 3.75 1.48
H- Pandigital 11 29 2.64 1.65
J- Nook Touch 14 59 4.21 1.10
K- Kindle Touch 14 47 3.36 1.63
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 16.41557 4 4.103892 2.77 0.04 2.53
Within Groups 87.44381 59 1.482098
Total 103.8594 63
7) Readability
Anova: Single Factor Q7
SUMMARY
Groups Count Sum Average Variance
B 13 57 4.4 0.59
D 12 52 4.3 0.42
H 11 47 4.3 0.22
J 14 65 4.6 0.25
K 14 58 4.1 0.75
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 1.880396 4 0.470099 1.033 0.398 2.528
Within Groups 26.85398 59 0.455152
Total 28.73438 63
8) Number of yes responses
Anova: Single Factor No. Of Yes responses
SUMMARY
Groups Count Sum Average Variance
B 16 5 0.31 0.23
D 16 6 0.38 0.25
H 16 3 0.19 0.16
J 16 9 0.56 0.26
K 16 8 0.50 0.27
ANOVA
Source of Variation SS df MS F P-value F crit
Between Groups 1.425 4 0.35625 1.521 0.205 2.494
Within Groups 17.5625 75 0.234167
Total 18.9875 79
9) Overall Ranking Calculations
Q6 only
No.Yes Rank Total Av Rank Overall Rank
Kobo 0.31 4 3.6 3 3.5 4
Sony 0.38 3 3.8 2 2.5 2
Pandigital 0.19 5 2.6 5 5 5
Nook Touch 0.63 1 4.2 1 1 1
Kindle Touch 0.50 2 3.4 4 3 3
Q3-Q5
No.Yes Rank Total Av Rank Overall Rank
Kobo 0.31 4 3.64 5 5 4
Sony 0.38 3 4.06 2 3 2
Pandigital 0.19 5 3.70 4 5 4
Nook Touch 0.63 1 4.12 1 1 1
Kindle Touch 0.50 2 3.74 3 3 2
Q3-Q6
No.Yes Rank Total Av Rank Overall Rank
Kobo 0.31 4 3.63 4 4 4
Sony 0.38 3 3.98 2 3 2
Pandigital 0.19 5 3.43 5 5 5
Nook 0.63 1 4.14 1 1 1
Kindle 0.50 2 3.64 3 3 2
FOR ALL QUESTIONS
No.Yes Rank Total Av Rank Overall Rank
Kobo 0.31 4 3.87 3 4 4
Sony 0.38 3 4.05 2 3 2
Pandigital 0.19 5 3.55 5 5 5
Nook 0.63 1 4.06 1 1 1
Kindle 0.50 2 3.80 4 3 3
7.6 Appendix 6: Participants comments for experiment 1
Q1. Turn on the device
e-reader pair ID ID Rating Comment
IF I
F
ED E
HF
AH
BI
HE
EB
IE I 4 looked like On button screen
CH C 3 Button hard to find at the bottom - unusual position
DI
HD
FB
EA E 3 slow initialisation
AF a 5 on screen instructions - easy
F 4 button in standard location but need to be pressed for a while
IH
CI C 3 hard to find switch button
IA
IG I 2 I thought it was touch screen because it displayed an "On" icon on screen
G 5 click a button and its ON
GF G 4 I had to look around for the power button
F 5 The button is onto of the device
FD
GD
CF
EC E 4 Button should be on the front
AG
FE F 4 Easy to find 'power on' button but during the 1st press and second press, not sure if the device is on or not. Need to press button twice.
E 5 Turn on button is easy to find because it is the only button that looks like a "turn-on" button.
DC
DA D 3 doesn't stand out from other buttons on the same area
A 2 too long to start
DB
CB C 2 Hard to locate the button. Button un-labeled
B 5 intuitive and prevents accidental switching
BA
CA
GC G 5 obvious ON/ OFF symbol
C 5 obvious power button
GE
HG
BG
Q2. Find the first document
e-reader pair ID ID Rating Comment
IF
I 1 hard to find
F
ED E
HF
AH
BI
HE
EB
IE I 4 we cheated, recent doc
CH C 1 very hard
DI
HD
FB
AF
A 4 "my library" pretty intuitive
F 2 Took a bit to work out how to control screen. Document was in "unknown" author folder- had to guess this. Would be easy to not find it. (overlook)
IH
CI C 2 Hard to find search (just randomly enter the interface). Hard to delete character, no hotkey
AI
IG
I 1 There is a search button but I cannot use it when I click on it. It's easy to navigate through folder but I can't find the file
G 2 user had to search each folder to find the document , there's no search action
GF G 3 later I understood it was in A-Z order, and the hint helped a lot
F 3 It took me time to go to the end of the document list
FD
GD
CF
EC
e 2 Found in recent documents - took some time
C 3 search should work with partial characters - required full string "some device"
AG G 1 feel confused cause there is an icon, which is used to change the font and seems like usually search icon
FE F 2 1. Hard to find the correct button to start finding. The button is black and the surface of the e-reader is black too. In addition, the button does not have a label.
2. no search function
3. the screen is slow and always flashing, misleading the user that she hasn't pressed the button
E 4 1. We have to know the rule to move the cursor. 2. Need time to get used to the move button.
DC
DA D 4
A 1 keyboard is horrible, not intuitive
DB
CB
C 3 Controls are obvious but it is hard to navigate through 8 pages of documents. Search looks through documents instead of titles.
B 2 Touch buttons have low responsiveness but keyboard superior. Browse interface placed ahead of the search
BA
CA
GC G 4 hint helped
C 3 took a second to figure that the page turn buttons were necessary
GE
HG
BG
Q3. Open the document
e-reader pair ID ID RANK Comment
IF I 1 Hard to find
F
ED E
HF
AH
BI
HE
EB
IE I 4 tried > key first
CH C 5 easy
DI
HD
FB
AF A 3 "open" button is not obvious. task a bit to find. Fact screen is not touch is not obvious
F 5 move to it and click it
IH
CI C 5 fast
I 3 a little bit slow
AI
IG I 5 straight forward
G 0 Opening the document was easy. Just click 'ok' button to open it
GF G & F 5 easy
FD
GD
CF
EC
AG
FE F 5 easy to open
E 4 1. There is clearly a label on the button to enter the file.
2. When keeping press the 'down' button, it would not go to the next page but it will jump to the top. We have to use the tiny 'flip page' button to go next page.
DC
DA
D 4 easy
A 1 couldn't figure out how to open document needed assistance from tutor
DB B 1 could be easy found by research
CB C 5 easy & quick
B 5 easy & quick
BA
CA
GC G 5 straight forward
GE
HG
BG
Q4. Increase font size
e-reader pair ID ID RANK Comment
IF
I 1 no idea to navigate
F
ED
E 5 ugly the screen is thinking
D 3 called zoom instead of font size
HF
AH
BI
HE
EB
IE I 2 had to select font first, that wasn't any good so we kept looking through other menus
CH c 5 easy
DI
HD
FB F 1 couldn’t find the model no.
AF A 4 Menu a bit tricky to scroll through
F 5 Handy button on side
IH
CI
AI
IG
I 1 can't find the setting
G 4 user just had to click "F" button, however user don't know how to make it return to normal size
GF G 5 take few minutes but it is easy to increase the font size
F 5 using + and - is very user friendly
FD
GD
CF
EC
AG
FE F 2 1. long response time after pressing correct button
2. easy to find the accurate button
3. After pressing the 'increase size' button 6 times the font returns back to minimal size
E 5 1. It is easy to find the button to increase the font size. Have clearly label 'AA' - easy to understand the meaning of
2. After changing the size, the highlight is not correct. "AA' is not in the right position.
DC
DA
D 3
A 2 Touch screen small for fingers. Not intuitive to find
DB
CB C 5 first level menu is as intuitive as it gets
B 4 hidden toolbar, obvious icon
BA
CA
GC G 5 magnifying glass icon was helpful
C 5 easy, clearly described in menu
GE
HG H 2 We did not try to find an easy way to go to section 3. So maybe it’s not that bad
BG
Q5. Navigate to section 3 within document
e-reader pair ID ID RANK Comment
IF I 1 cannot find it
F
ED E 3 It looks like touch but not. It was sluggish and took time to figure it out - up & down buttons
HF
AH A 1 not found
BI I 1 we cannot find it
HE
EB
IE I 3 Needed 15% hint for go to page
CH C 2 quite hard
DI
HD
FB F 1 couldn’t find the model no.
B 2 long res time
AF A 3 Scrolling was easy when I tested but file opened to correct page! This should be fixed for future tests. Side buttons could be a little more ??? and have a little less lag time. Scrolling slow but content links fix this. Home scrolling through links was also slow.
F 4 scrolling bit slow
IH H 0 could not find it
CI I 0 it's impossible to find that paragraph
AI
IG I 1 can't find the document
G 4 user just have to click on the up/down button
GF G 3 It takes time to find the option and it is slow
F 1 I couldn't find it
FD
GD
CF
EC E 1 selecting content from menu counter intuitive
AG
FE F 4 First tried to use ' explore' button and press keyword 'model number', no response came out. Easy to find when using 'content' button
E 3 Sometimes the highlighted square is disappeared. Need to re-open the doc. The button to move highlighted square and the button to flip pages are different. It is confusing.
DC
DA D 1 cannot find 2nd para, not clearly indicated
A 1 couldn't figure out how to scroll the screen
DB
CB C 4 intuitive if you are familiar with those devices
BA A 2 kept pressing the wrong button (left button sent pages forward)
CA
GC G 4 first icon guessed was correct
C 5 TOC was easy to find
GE
HG
BG
Q6. Navigate to second document
e-reader pair ID ID RANK Comment
IF I 1 cannot find it
ED E took much time to find the search function as we misunderstood, we thought it was in the same doc
HF
AH
BI
HE
EB
IE
CH
DI
HD
FB F 1 can't find date -2013
B 1 speed is very low& cannot find the menu now to open it
AF A 5 once I know the navigation - easy
F 4 scrolling bit slow
IH H 0 could not find it
CI
AI
IG I 1 can't find file
G 5 user just had to click on the go back button
GF G & F 5 easy to find the document
FD
GD G 1 Device froze
CF
EC E 3 similar to Q3 - however instructions on front page not very clear
C 3 Keyboard slow. Hit done, yet had to hit 'enter' again to search for DOC - repetitive
AG
FE F 1 cannot find it
E 5 It is easy to find. However it is because someone had used it recently, so it is in the "recent''. Even not use the 'recent' hotkey , its easy to find the doc because it is sorted by title alphabetically
DA D 1 Scrolling not suitable for big fingers. Did not realise 'date 2013' was a book. Scroll bar too close to the edge of the hardware lip (raised)
A 2 buttons hard to press
DB
CB C 1 Complicated search function. Didn't match -
B 5 used library search
BA
CA
GC G 4 Didn't take too much time
C 5 easy to find second book (since we had done it before)
GE
HG
BG
Q7. Assess readability and comments on improving the device
e-reader pair ID ID Rank Comment Comment on improving the device Other comments
BH B touch sensitivity
H weight, more responsive
IF F screen is too small, cannot control lighting of screen, no touch screen function
ED E Make it less buttons, improve navigation
D say font size instead of zoom
zoom button opens a menu rather than actually zooming
HF H needs to be more responsive to touch
some confusion with buttons
F Search function - sort names by title rather than author
navigation buttons weren't immediately obvious
AH A page design of the buttons in opposite direction - see illustration
Really hard to use. Poor reaction with operation. Poor arrangement of screen
H Add scrollbar to allow fingers slide the screen
accelerate the processing speed
BI B make it more colourful slightly slow / show time
I Improve search functionality as had to use 2 fingers to type cap letters.
HE H better usability, touch screen, response /sensitive
E Improve the system operating speed. Add number buttons
the touch screen could be used for this device
EB E Improve main navigation (appears like a roll). Duplication of arrow?
on/off response
B navigation is always visible as buttons. Power button on front - very laggy unresponsive
IE I faster why must the screen black whenever the screen changes
E Faster loading. Scroll button can be enter as well
CH c touch screen fewer clicks to do stuff, improve sorting
improve Q7 clarity - separate document
H better touch screen sensitivity, colour, bigger
book opens on last opened page i.e. sections already
DI D either have buttons or touchscreen
scrolling / swiping difficult
I Touch screen, menu options, get specific location
very slow page changing
HD H less buttons, simpler menu
D Better zooming (font size). Touch screen improvements
FB F touchable & adjustable
B Power button should be placed in a different place, speed, how to open the menu?
EA E Add suggestions/ hints for keys to use. Make screen touch.
Navigation bar is hard to use. Make the content dynamic as have to sequentially access content.
A Touch screen does not work well. Counterintuitive buttons & navigation
Not user friendly
AF A Make the screen less like a touch screen. Reduce command lag. Add hardware 'up / down' buttons in unobstructed location
F Place manuals in folder at the top of the library marked manuals
Jump n page. Texture is nice.
IH I touch screen Not user friendly
H increase the reaction speed
should have help tips, e.g. how to turn ON the device
CI C Add 'delete' button, touch screen
'back' button is a little tricky
I Use a better CPU; it's running too slow
AI A increase the response speed
I improve flexibility and efficiency of use
IG I The buttons should respond faster when clicked. The search button should work when clicked
it's good the text is clear
G speed up feedback, provide a search function
each letter and number should have its own button
GF G 5 easy to read off the screen
make it touch screen; it takes time to load
F 5 easy to read make it touch screen and colour display
it is a bit faster than the previous device but needs to be faster still
FD F make it touchable
D The arrow icon beside the power is confusing. Make it point to the left.
GD G improve the response time the device is too slow
D searching takes way too long; could be improved in that aspect
When a search fails, the device jumps to the home screen. We think this design is confusing
CF C Colour will improve usability. Add Directory like "Home", "back", XX for users to know where they are.
How do I zoom in?
F functionality to search document
EC E Content selection in TOC. Have touch screen - included instruction? Button to move manual from "recent document"
C On screen QWERTY slow to use with cursor movement. Also, touch screen would be better.
AG A Change proportions of buttons. Make them map the context
Very bad UI design. Difficult to figure out how to use.
G there are a lot of places that need improvement
FE F 2 Always flashing, slow. The hotkeys at the sides are well designed
stop flashing; clearer labels on the buttons
E 5 it's acceptable. Integrate the 'flip' button and the 'more' button into one button to avoid confusion.
Cannot turn the screen over? At main page
DC D don't use e-paper (better, faster display)
Only uses USB for power. No light for night-time reading
C better way of entering text and search should be optimized
Buttons are a bit redundant on both sides and it takes a while to realise how to use them. A touch screen could have been better, as display response is slow.
DA D 1 screen is dull; not enough backlight for contrast
Brighter whiter background for contrast. Device should be bigger. Make screen flush with hardware for smoothness
Controls should be more visible, standout. It's too hard to press the buttons. Device is heavy. Use of colour to differentiate buttons & actions
A 3 make device touch screen; controls are too small and hard to press. Response is slow
DB D should remove less-used buttons
add more 'screen touch based features'
B function list should be remapped, based on frequency of use or some other criterion
CB C 3 No backlight; grey display
good search; basic e-reader
B 3 No backlight; grey display
keyboard first / improve screen refresh time
BA B faster response time; home button too small
A make touchscreen, take away second set of scroll buttons on side, make opening documents more intuitive
CA A support touch screen; increase the frames per second (fps)
C to increase the processor speed
the buttons on the sides are confusing
GC G backlight, lighter, touchscreen
C no particular ideas
GE G Buttons need to have more meaningful icons. Loading time. Can't find the home back button
screen size needs to be larger
E The keyboard was not used. It was difficult to input numbers, the main buttons could be bigger on the interface
HG
H 3 Very good with list. Very bad with thumbnails
My library thumbnails is far too small. Either make it better or offer an intuitive zoom, or have large thumbnails
G it looks quite good to use
BG
B Buttons on screen are confusing. Hard to search within documents
Good that it is Touch
G Hard to see what the device has selected. Font size button looks like search button - confusing
Q8. User preference from the comparison
Comparison pair ID | Q | Comments on which device is better
BH Pandigital better than the Kobo. It responds faster and is more reliable.
Pandigital has better readability than Kobo
IF Elonex better than iRiver on navigation and ease of use
ED Sony better than iRiver on navigation and is a touch screen. iRiver is not intuitive. Sony uses the word 'Zoom' instead of font size, which opens a menu rather than actually zooming.
7 iRiver looks like a touch screen but is not.
HF Pandigital better than Elonex because it has touch screen, faster response time & search facility
AH Neither was any good; operations were hard to perform, but Pandigital better than Nook.
Neither, even though the Pandigital is better than the Nook; it still needs to improve its usability
BI Kobo better than EZ reader but needs a faster response rate
HE iRiver is better than Pandigital. The screen is more reliable and responds quicker. The navigation button is flexible.
EB iRiver is better than Kobo. iRiver is more responsive, intuitive. Even though not as 'minimal' interface easier to work out what each button does.
IE iRiver much easier and faster than EZ Reader
CH Pandigital is more intuitive & has touch, better than kindle
DI Sony has better navigation and is more user-friendly than the EZReader
HD Neither; both devices, Pandigital & Sony, are not user friendly. Confusing affordance of 'Zoom' instead of font size
FB Neither; Elonex and Kobo not intuitive. Kobo is slow to respond.
EA iRiver is better, more user friendly than Nook
AF Elonex is easier to use, no touch screen, intuitive button controls, compared to Nook
IH Pandigital has a touch screen and user guide, compared to the EZReader
CI Kindle faster than EZReader on navigation. Add touch interface to kindle
6 Couldn't locate paragraph in EZReader
AI Mixed. One preferred Nook because it had touch screen and is better understood and easier. The other preferred EZReader as found the keyboard was easier to use & understand
IG Jetbook rated better than EZReader because user was able to find the required documents
functions are minimum but can still find documents
buttons actually provide functions when the user clicks on them
GF Elonex more user friendly than Jetbook. Easy to find documents and it presents the documents nicely.
FD Sony better than Elonex because it is touchable and convenient for the user to search the documents.
GD Sony better than Jetbook. Touch screen, user-friendly interface, more fitting into modern conceptual model of e-readers.
CF Neither. Disliked both; the UI was terrible
EC Kindle was better than iRiver. Navigating to Document easier on Kindle. iRiver required 2 separate 4 way nav. buttons to select a section from the TOC. Search function on Kindle easier to find Docs. Keyboard too complex for an iRiver suitable for web browsing.
AG Neither; both devices, Nook & jetBook, are not user friendly / non-intuitive
FE iRiver better than Elonex because easy to use and beautiful.
DC Neither; both devices, Sony & Kindle, are not user friendly / non-intuitive. Preference for touch, faster response, and a light for night reading
DA Neither; both Nook and Sony are non-intuitive. Controls on the Nook are small and hard to press.
DB Neither; both Sony & Kobo are non-intuitive. Preference for a touch screen like the iPad
CB Kindle better than Kobo. Better feedback; no touchscreen, less confusing navigation
BA Kobo better than Nook. I like the function of changing the font size with 2 fingers and other functions like the iPad has. Kobo is much more intuitive: simpler interface, touch screen, and we made a lot fewer mistakes
CA Kindle because the Nook is even worse, the operation is confusing.
GC Kindle better than Jetbook because bit lighter, clearer design, bit faster
GE jetBook better than iRiver, as the iRiver's keyboard is not user friendly.
HG Jetbook better than Pandigital because of intuitive navigation. Like the black sliding button more
BG Jetbook better than Kobo because better controls, navigation. Kobo is better to read and the touch screen is good.
7.7 Contents of second document (Dates 2013)
The entire contents of the Dates file:
A1234: 1984
B3942: 2001
C7654: 2013
D3423: 1901
E7777: 1632
F9875: 1492
G1254: 1939
H3110: 1918
I4444: 1995
J7631: 2014
K0078: 2030