Critical Evaluation and Comparison of Two Internet Public Health Information Resources

ABSTRACT

OBJECTIVE: To determine which of two Websites, HealthInsite and eMedicine Consumer Health, is the better Internet public
health information resource
DESIGN: Pilot study of 10 Websites to select 2 finalists; objective comparison of the two final sites and their breast cancer
information content, using the Minervation and Net Scoring benchmarking tools, and manual and online readability tests
DATA SOURCES: Key features from all the Websites
MAIN OUTCOME MEASURES: Accessibility, Usability, Reliability and Readability of the sites
RESULTS: All figures are for HealthInsite vs. eMedicine. With the Minervation tool, Accessibility was 88.9% vs. 54%; Usability
83.3% vs. 72.2%; Reliability 85.2% vs. 51.8%; the overall score was 86.1% vs. 60.4%. With Net Scoring the corresponding scores
were 50% each (Accessibility); 74.4% vs. 70.6% (Usability); 65.1% vs. 52.7% (Reliability); 68.7% vs. 60.5% (Overall).
Readability scores were 43.1 vs. 47 (FRE) (p=0.99); 11.6 vs. 10.7 (FKGL) (p=0.98); 9.7 vs. 8.2 (Fog). With the online readability
tool, the scores were 61.8 vs. 61.3 (FRE); 8.9 vs. 8.7 (FKGL); 12.2 vs. 11.8 (Fog).
CONCLUSION: As a patient/public health information resource, HealthInsite was better overall. Both HealthInsite and eMedicine
failed to meet UK government requirements. Quality benchmarking tools and readability tests/formulae are not perfect and lack
conformity amongst themselves. The task of benchmarking and measuring readability is rigorous and time-consuming.
Automating both processes through a comprehensive tool may aid the human experts in performing their task more efficiently.

Key words: Quality benchmarking tools; Readability tests

(The following document’s FRE=28.6 and FKGL=12)
INTRODUCTION
Ninety-five million Americans use the Internet for health information.1 There were >100,000 medical Websites in 1999, and the
number is increasing phenomenally.2 This gives rise to some cogent questions begging for urgent answers. How much of the information is
useful, genuine or usable to the public? What impact does it have on them?3,4 Which benchmarking tools should be used to assess the
authenticity/reliability/validity of online information? How can the benchmarking process be improved?

This essay considers these inter-related issues. We have critically compared and contrasted two public/patient Internet health
information resources from two regions, from a public/patient's and a specialist's perspective. We selected breast cancer because
it is the most common cancer in women, kills 400,000 annually, and can strike early;5,6 it is the biggest cause of cancer deaths in
Australian women, and the second biggest cause in the US and Britain;6,7 it is one of the most common health-related search topics
among Internet users;8 and finally, the author of this essay manages the Breast Clinic in the Seychelles Ministry of Health.

MATERIALS AND METHODS
Downloading/Installing HONcode Toolbar
The HONcode9,10 Accreditation Search and Verification Toolbars software was downloaded and installed in our browser's (IE
Version 6.0.2800.1106) Explorer Bar, through a series of HONcode 1.2 Setup wizard dialogue boxes. (Figures-1,2)



                                                                                                 Figure-1: HON and HONcode logos



                                                          Figure-2: Screenshot of HONcode 1.2 Setup wizard box


We installed the automatic HONcode accreditation status indicator on the Toolbar (View menu→Toolbar option). We did not
install the HONcode search box because it was slowing down the opening of our browser. Right-clicking on some highlighted text
and selecting ‘HONcode search’ indicated the site's accreditation status.11

Piloting 10 sites
Figure-3: C-H-i-Q logo

Next, a pilot study was conducted on ten Websites (selected from Internet in Health and Healthcare12 and from an Internet
survey) to finalise two for evaluation/comparison. Patient/public-oriented resources and some professional ones were included
(Appendix-Box-A). The Centre for Health Information Quality (C-H-i-Q)13 (Figure-3) checklist was applied to each site. The
parameters were scored on a scale from 0 (not present) to 5 (fully present). We determined the Web Impact Factor (WIF)8,14 from
the results returned by AltaVista (http://www.altavista.com/; accessed 18 June 2005), by entering ‘link:URL -host:URL’ in the
search box, after selecting the ‘search Worldwide’ option. Some additional points, including HONcode status, were also included,
with a score of 0 (not present) or 1 (present). (Appendix-Box-1, Appendix-Table-1)
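
The WIF arithmetic can be illustrated with a minimal Python sketch. The ‘link:URL -host:URL’ query string is the one quoted above; the companion page-count query and the in-links/pages ratio follow the cited WIF literature and are assumptions here, since the pilot recorded the raw AltaVista counts by hand.

```python
# Minimal sketch (Python) of the Web Impact Factor arithmetic used in the pilot.
# The 'link:URL -host:URL' query is taken from the Methods text; treating WIF as
# external in-links divided by site page count follows the cited WIF literature
# and is an assumption here (the essay itself quotes raw link counts).

def altavista_queries(url: str) -> dict:
    """Build the two AltaVista query strings for a site (counts were read off manually)."""
    return {
        "external_inlinks": f"link:{url} -host:{url}",  # pages linking in, excluding self-links
        "site_pages": f"host:{url}",                    # pages hosted on the site (assumed query)
    }

def web_impact_factor(external_inlinks: int, site_pages: int) -> float:
    """WIF = external in-links / pages on the site (hypothetical normalisation)."""
    return external_inlinks / site_pages if site_pages else 0.0

if __name__ == "__main__":
    print(altavista_queries("www.healthinsite.gov.au"))
    # Example with a made-up page count; the essay quotes 231,000 in-links for NHS Direct.
    print(round(web_impact_factor(231_000, 10_000), 2))
```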

HealthInsite / eMedicine Analysis and Comparison
We applied two quality benchmarking tools to the two finalists, HealthInsite (Australian) and eMedicine (American), to compare
two resources from different regions. The two benchmarking tools were:

1. A tool from Minervation (LIDA Instrument version 1.2) (Figure-4): This tool assesses a site on three Levels: Accessibility,
Usability and Reliability, which are further subdivided into sub-levels and sub-sub-levels (Appendix-Box-2).15




                                                                                                                 Figure-4: Minervation homepage


For Accessibility (Level-1) we went to www.minervation.com/validation (accessed 12 June 2005) and entered the respective
URLs (HealthInsite, eMedicine) in the designated box. The validation tool generated answers to the first 4 questions. For all the
remaining questions we viewed the sites normally and entered the appropriate scores. Each question was scored on a scale of 0-3,
where 0=Never; 1=Sometimes; 2=Mostly; 3=Always. The supplemental questions in Reliability (Level-3) were not considered
since they required contacting the site producers.15
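
As a minimal sketch of this scoring arithmetic, the 0-3 answers can be summed per level and expressed against the level maxima used in Table-1 (63, 54 and 27 points). The individual question scores below are placeholders chosen only to reproduce the HealthInsite Usability subtotals, not the actual ratings.

```python
# Minimal sketch (Python) of how LIDA question scores (0-3 each) roll up into the
# subtotals and percentages reported in Table-1. The level maxima come from the
# tables; the per-question scores below are placeholders, not the study's ratings.

LIDA_SCALE = {0: "Never", 1: "Sometimes", 2: "Mostly", 3: "Always"}

def level_subtotal(question_scores, level_maximum):
    """Sum the 0-3 question scores and express them as a percentage of the level maximum."""
    subtotal = sum(question_scores)
    return subtotal, round(100 * subtotal / level_maximum, 1)

# Placeholder example: HealthInsite Usability (maximum = 54 points, 18 questions).
usability_scores = [3, 2, 3, 2, 3, 2,   # Clarity (6 questions, subtotal 15)
                    3, 3, 2,            # Consistency (3 questions, subtotal 8)
                    3, 3, 2, 3, 2,      # Functionality (5 questions, subtotal 13)
                    2, 3, 2, 2]         # Engagibility (4 questions, subtotal 9)
print(level_subtotal(usability_scores, 54))   # (45, 83.3), as in Table-1
```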

2. Net Scoring, a French quality benchmarking tool, was applied next (Figure-5). This has 49 criteria grouped into 8 categories:
Credibility, Content, Hyperlinks, Design, Interactivity, Quantitative aspects, Ethics, and Accessibility. Each criterion is classified
as essential (0-9), important (0-6), or minor (0-3); (maximum = 312 points).16 All categories were used for evaluation except
Quantitative, and one important criterion under Hyperlinks, which were not applicable to us. Therefore our assessment was on a
maximum of 294 (312 minus 18) points.
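
The exclusion arithmetic can be sketched as follows. The weight ranges and the 312-point maximum are from the Net Scoring description; the 12-point value assigned to the excluded Quantitative category is an inference from the 312 minus 294 = 18-point difference less the 6-point important Hyperlinks criterion.

```python
# Minimal sketch (Python) of the Net Scoring arithmetic described above. Weight
# ranges (essential 0-9, important 0-6, minor 0-3), the 312-point maximum and the
# exclusions are from the text; the 12-point Quantitative maximum is inferred.

CRITERION_MAXIMUM = {"essential": 9, "important": 6, "minor": 3}

FULL_MAXIMUM = 312                                        # all 49 criteria
EXCLUDED_POINTS = 12 + CRITERION_MAXIMUM["important"]     # Quantitative category + 1 Hyperlinks criterion
ADJUSTED_MAXIMUM = FULL_MAXIMUM - EXCLUDED_POINTS         # = 294, as used in Table-2

def percentage(score: int, maximum: int = ADJUSTED_MAXIMUM) -> float:
    return round(100 * score / maximum, 1)

print(ADJUSTED_MAXIMUM)                    # 294
print(percentage(202), percentage(178))    # 68.7, 60.5 -- the grand totals in Table-2
```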




  Figure-5: Central Health/Net Scoring logos



Readability scoring
The breast cancer content of each site was compared by means of readability indices. For consistency, specific breast cancer topics
were selected.8 (Table-3)

The readability tests were Flesch Reading Ease (FRE),17 Flesch-Kincaid Grade Level (FKGL),17 Gunning's Fog Index,18 and an
online readability tool that automatically generated Kincaid, ARI (Automated Readability Index), Coleman-Liau, Flesch, Fog,
Bjornsson's Lix and McLaughlin's SMOG scores.19 (Appendix-Boxes-3,4,5)

Microsoft® Word has an in-built facility to give the FRE and FKGL scores. The ‘Tools’ menu in MSWord 2003 was configured as
outlined in Appendix-Box-5a, Figure-6.17




                                                                Figure-6: Screenshot of Tools menu Options dialogue box



FRE and FKGL scores: Text from the documents was copied to the clipboard and pasted into Microsoft® (Redmond, WA) Word 2003.
Each document was meticulously ‘processed’ as per Pfizer guidelines (viz. headings/titles, page navigation, bullets,
references/URLs removed; hyphenated words/proper nouns included; footnotes excluded); only the main text body was used.17,20
When the spellchecker in MSWord 2003 finished checking, it displayed statistics about the document, including the FRE/FKGL
scores.17 The mean, standard deviation, variance and probability associated with Student's t test were computed in MSExcel 2003.
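
For reference, a minimal Python sketch of the standard Flesch formulae underlying the MSWord scores, and of the unpaired two-sample t test with unequal variance run in Excel (equivalent to Welch's test), is given below; the word/sentence/syllable arguments are placeholders, and the FRE lists are the per-topic values later reported in Table-3.

```python
# Minimal sketch (Python): the standard Flesch formulae and the Excel-style
# two-tailed, unpaired, unequal-variance t test (Welch). Counts are placeholders;
# the two FRE lists are the per-topic scores reported in Table-3.

from scipy import stats

def flesch_reading_ease(words: int, sentences: int, syllables: int) -> float:
    return 206.835 - 1.015 * (words / sentences) - 84.6 * (syllables / words)

def flesch_kincaid_grade(words: int, sentences: int, syllables: int) -> float:
    return 0.39 * (words / sentences) + 11.8 * (syllables / words) - 15.59

healthinsite_fre = [55, 55.6, 49.7, 38.1, 40.8]
emedicine_fre = [56.2, 48.2, 51.9, 41.1, 41.5]

# Welch's t test, two-tailed, as in the embedded worksheets (p ~= 0.99 for FRE).
t_stat, p_value = stats.ttest_ind(healthinsite_fre, emedicine_fre, equal_var=False)
print(round(p_value, 2))
```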
Fog Index: In the absence of software,21,22 we calculated the Fog Index ‘manually’, as outlined in Appendix-Box-5.18 We counted all
the words with >3 syllables according to Pfizer guidelines (saying the word aloud with a finger under the chin; each chin drop
counted as a syllable).20
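
A minimal Python sketch of this manual calculation, reproducing the figures later shown in Boxes-3,4:

```python
# Minimal sketch (Python) of the manual Fog Index calculation described above,
# using the counts later reported in Boxes-3,4 (long words counted per the
# Pfizer chin-drop guideline).

def gunning_fog(words: int, sentences: int, long_words: int) -> float:
    """Fog = 0.4 * (average sentence length + percentage of long words)."""
    average_sentence_length = words / sentences
    percent_long_words = 100 * long_words / words
    return 0.4 * (average_sentence_length + percent_long_words)

print(round(gunning_fog(5198, 237, 126), 2))   # ~9.74 (Box-3, HealthInsite)
print(round(gunning_fog(3319, 196, 124), 2))   # ~8.27 (Box-4 reports 8.26 from rounded intermediates)
```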
Online readability tool: For further readability check, the documents were uploaded onto an automated online readability tool
(http://www.readability.info/uploadfile.shtml; accessed 12 June 2005). The instrument converted the document into plain text and
generated the scores.19

RESULTS
HONcode toolbar installation
The HONcode Status and Search icons were installed on the Explorer Bar and the accreditation status indicator in the Toolbar.
The latter automatically displayed the HONcode accreditation status of a Website (Figures-7a,b).




Figure-7a: HONcode Status/Search icons installed, showing accreditation status

Figure-7b: Accreditation status

Pilot study results
The top scorers were HealthInsite (54), NHSDirect and eMedicine (50 each) (Box-2, Appendix-Table-1). Only MedlinePlus,
healthfinder®, HealthInsite and eMedicine were HONcode-accredited. NHSDirect and NeLH carried the NHS seal. MedlinePlus
breast cancer page was not HONcode-accredited.

  Box-2: Scores of Websites in pilot study

  -HealthInsite (Australia) –                54
  -NHS Direct Online (UK) –                  50
  -eMedicine Consumer Health (USA) –         50
  -MedlinePlus (USA) –                       48
  -Healthfinder® (USA) –                     42
  -NIHSeniorHealth (USA) –                   33
  -NeLH/NLH (UK) –                           32
  -DIPEx (USA) –                             28
  -Cochrane Library –                        26
  -HealthConnect (Australia) –               2

HealthConnect, under re-development, had no breast cancer search results. Cochrane Library and NeLH/NLH had insufficient
public material. AltaVista search results for NeLH/NLH were 43,300/255 respectively. The DIPEx breast cancer search returned only
subjective interview transcripts rather than objective information. NIHSeniorHealth had features typically suited to the elderly.
MedlinePlus and healthfinder® were comparable, but breast cancer information was more systematically arranged in the former.
The three top scorers in the pilot, NHSDirect, HealthInsite and eMedicine, had almost comparable features. NHSDirect had the
highest number of results (231,000) from the AltaVista search. (Figures-8-16)




Figure-8: HealthConnect breast cancer search result
Figure-9: NeLH breast cancer search page, patient information (NHS seal)
Figure-10: NIHSeniorHealth breast cancer search page
Figure-11: DIPEx breast cancer search page
Figures-12,13: MedlinePlus homepage; breast cancer information
Figure-14: NHS Direct Online homepage (NHS seal); highest link popularity
Figure-15: HealthInsite homepage
Figure-16: eMedicine Consumer Health homepage


Benchmarking results
With the Minervation tool, HealthInsite secured 86.1% against eMedicine's 60.4% (Table-1, Appendix-Tables-2,3).

Table-1: Results with Minervation tool



                                                                                HealthInsite      eMedicine
Level-1 (Accessibility) (Maximum=63 points)
First four automated tests (Maximum=57 points)                                  50                28
Browser test (Maximum=3 points)                                                  3                 3
Registration (Maximum=3 points)                                                  3                 3
                  Subtotal (% of 63)                                            56 (88.9%)        34 (54%)
Level-2 (Usability) (Maximum=54 points)
Clarity (6 questions; maximum=18 points)                                        15                12
Consistency (3 questions; maximum=9 points)                                      8                 9
Functionality (5 questions; maximum=15 points)                                  13                13
Engagibility (4 questions; maximum=12 points)                                    9                 5
                  Subtotal (% of 54)                                            45 (83.3%)        39 (72.2%)
Level-3 (Reliability) (Maximum=27 points)
Currency (3 questions; maximum=9 points)                                         9                 3
Conflicts of interest (3 questions; maximum=9 points)                            9                 6
Content production (3 questions; maximum=9 points)                               5                 5
                  Subtotal (% of 27)                                            23 (85.2%)        14 (51.8%)
               Grand total (% of 144)                                          124 (86.1%)        87 (60.4%)

With Net Scoring, HealthInsite scored marginally better (68.7%) than eMedicine (60.5%) (Table-2; Appendix-Tables-4,5,6).

Table-2: Results with Net Scoring

                                                                                HealthInsite      eMedicine
Content category (Maximum=87 points)                                             64 (74.7%)       46 (52.9%)
Credibility category (Maximum=99 points)                                         55 (55.5%)       52 (52.5%)
Hyperlinks (Maximum=45 points) (minus 6 points; see text)                        31 (79.5%)       29 (74.4%)
Design (Maximum=21 points)                                                       15 (71.4%)       16 (76.2%)
Accessibility (Maximum=12 points)                                                 6 (50%)          6 (50%)
Interactivity (Maximum=18 points)                                                13 (72.2%)       11 (61.1%)
Ethics (Maximum=18 points)                                                       18 (100%)        18 (100%)
Grand total (% of 294)                                                          202 (68.7%)      178 (60.5%)

Figure-17 graphically represents the total scores from the two sites by the two benchmarking tools.

Figure-17: 2x2 Comparison [Two sites on the basis of two benchmarks]


Readability results
The results of MSWord (FRE, FKGL) and manual technique (Fog) are summarized in Boxes-3,4; Table-3; Figures-18,19;
embedded MSExcel2003 worksheets-1,2.

  Box-3: HealthInsite Fog Index                                                          Box-4: eMedicine Fog Index

  Words/Sentences = 5198 / 237 = 21.93                                                   Words/Sentences = 3319 / 196 = 16.93
  [Words>3 syllables / Words] x 100                                                      [Words>3 syllables / Words] x 100
  = [126 / 5198] x 100 = 2.42                                                            = [124 / 3319] x 100 = 3.73
  [21.93 + 2.42] x 0.4 = 9.74                                                            [16.93 + 3.73] x 0.4 = 8.26


Table-3: Readability Statistics

                                                            HealthInsite               eMedicine
Text                                                     FRE     FKGL    Fog       FRE     FKGL    Fog
Breast cancer overview / facts and figures                55      9.9              56.2     8.8
Breast cancer causes / risk factors                      55.6     8.9              48.2     11
Tests for breast cancer / mammography                    49.7    10.4              51.9     9.7
Treatment options for breast cancer                      38.1    12                41.1    11.7
Support / follow up for women with breast cancer         40.8    11.7              41.5    11.6
Combined text                                            43.1    11.6    9.74      47.0    10.7    8.26
Mean μ = (Σ A-E / 5)                                     47.84   10.58             47.78   10.56
Std Deviation (SD) σ                                      8.05    1.28              6.56    1.26
Variance                                                 51.87    1.32             34.42    1.28

Figure-18a: HealthInsite Readability Indices screenshot                      Figure-18b: eMedicine Readability Indices screenshot




Sheet-1: Comparison of FRE scores (embedded MSExcel worksheet). HealthInsite vs. eMedicine: mean 47.84 vs. 47.78; SD 8.05 vs.
6.56; variance 51.87 vs. 34.42; probability associated with a Student's t test (two-tailed, unpaired two-sample, unequal variance)
= 0.99. No statistically significant difference.

Sheet-2: Comparison of FKGL scores (embedded MSExcel worksheet). HealthInsite vs. eMedicine: mean 10.58 vs. 10.56; SD 1.28 vs.
1.27; variance 1.32 vs. 1.28; probability associated with a Student's t test (two-tailed, unpaired two-sample, unequal variance)
= 0.98. No statistically significant difference.

The results from the online readability tool are summarized in Table-4 and Figure-19.

Table-4: Online readability results

Test/Formula             HealthInsite                eMedicine
Kincaid                  8.9                         8.7
ARI                      10.3                        9.7
Coleman-Liau             13.1                        12.9
Flesch Index             61.8                        61.3
Fog Index                12.2                        11.8
Lix                      41.1 (School year 7)        40.3 (School year 6)


SMOG-Grading             11.3                        11.1

Figure-19: Readability of the two sites through the automated tool

The mean readability values (MSWord-derived) for HealthInsite and eMedicine were similar (FRE 47.84 vs. 47.78 (p=0.99); FKGL
10.58 vs. 10.56 (p=0.98), respectively). The automated test generated higher FRE and lower FKGL scores than MSWord. Conversely,
it returned a higher Fog index than the manual method. The mean scores and the scores for combined text (MSWord-derived) were
similar for eMedicine but not so for HealthInsite. We found a high negative correlation (–0.96) between FRE and FKGL, measured
with the MSExcel CORREL function (Worksheet-3, Chart-1).23




Sheet-3 and Chart-1: FRE-FKGL correlation (embedded MSExcel worksheet; almost perfect negative correlation)
FRE:  55, 55.6, 49.7, 38.1, 40.8, 56.2, 48.2, 51.9, 41.1, 41.5
FKGL: 9.9, 8.9, 10.4, 12, 11.7, 8.8, 11, 9.7, 11.7, 11.6
Correlation: -0.964999025
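
A minimal Python sketch reproducing the CORREL result from the ten paired scores above (Pearson correlation, which is what Excel's CORREL computes):

```python
# Minimal sketch (Python) reproducing the FRE-FKGL correlation computed with
# Excel's CORREL function, using the ten paired scores listed in Sheet-3.

from statistics import correlation   # Pearson's r; Python 3.10+

fre  = [55, 55.6, 49.7, 38.1, 40.8, 56.2, 48.2, 51.9, 41.1, 41.5]
fkgl = [9.9, 8.9, 10.4, 12, 11.7, 8.8, 11, 9.7, 11.7, 11.6]

print(round(correlation(fre, fkgl), 3))   # ~ -0.965, a near-perfect negative correlation
```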


DISCUSSION OF METHODS AND RESULTS
HONcode represents a pledge to abide by 8 ethical principles.10 It is not a quality benchmarking instrument/seal. Unethical
developers may cut-and-paste it onto their sites.13 Moreover, sites displaying a HONcode seal may not comply with the code.8
They may violate the HONcode after the accreditation was awarded by HON, and before their next infrequent check. Though we
installed the toolbar plugin,11 these caveats should be kept in mind; HON-toolbar per se does not detect violations in a HON-
accredited site.

We utilized the Web Impact Factor (link/‘peer-review’ popularity) in our pilot study. This is a better indicator of popularity than click
popularity (frequency of site visitation), which may be manipulated.8,14 By AltaVista search, eMedicine had a lower WIF than
healthfinder® and NHSDirect, and HealthInsite's was lower still (Appendix-Table-1). Thus, the popularity of a site does not
necessarily correlate with quality.8,14

HealthInsite / eMedicine – Critical/Analytical Review/Comparison (Figures-15,16,20a,b,21,22)

Benchmarking tools: Score ranges like 0-3, 0-6, etc. are pseudo-objective, giving a false sense of mathematical precision.
Moreover, a low score on one important criterion may be compensated by high scores on two unimportant criteria, giving the same
overall score. There is a lack of conformity between different tools.15,16 There is also the problem of inter-rater reliability (kappa
value).24,25 Finally, there are rater-training and rating-the-rater issues to be considered.14 But in the absence of other means of site
assessment, scoring systems represent the only available fallback. They force our attention towards the important points about a
health information site. They are considered acceptable if at least some issues (such as inter-rater agreement, kappa value >0.6) are
dealt with at the outset.24
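
As an illustration of the inter-rater agreement issue, a minimal Python sketch of Cohen's kappa for two raters applying the same checklist is given below; the ratings are invented for the example and are not data from this study.

```python
# Minimal sketch (Python) of Cohen's kappa for two raters applying the same
# benchmarking checklist; the checklist ratings below are made up for illustration.
# kappa = (observed agreement - chance agreement) / (1 - chance agreement).

from collections import Counter

def cohens_kappa(rater_a, rater_b):
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (observed - chance) / (1 - chance)

rater_a = [3, 2, 3, 1, 0, 2, 3, 2, 1, 3]   # hypothetical 0-3 scores on 10 criteria
rater_b = [3, 2, 2, 1, 0, 2, 3, 1, 1, 3]
print(round(cohens_kappa(rater_a, rater_b), 2))   # ~0.72 here; >0.6 counts as acceptable agreement
```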

Accessibility: This determines whether the Website meets the W3C-WAI and Bobby standards, whether it is ‘future-proof’ and whether
users can access the information.15,26,27 Automated tools/measurements of accessibility are not perfect. They should be used with
caution, and their results interpreted with discretion.28 eMedicine failed in the automated tests (Page setup, Access restrictions,
Outdated code, Dublin Core tags). With 87% overall, HealthInsite still did not meet the UK government legal standards.15 By Net
Scoring, both sites scored 50% in Accessibility, but we cannot rely on this figure because Net Scoring attached rather low importance
to this category.

Usability: This determines whether users can find the required information; the ease of use of a service/component. Good usability
increases usage and ‘stickability’. Low usability results in a decreased perception of usefulness.12,15,29 With the Minervation tool,
HealthInsite scored somewhat better than eMedicine. The latter lost out on clarity and site interactivity. Usability under the
Minervation tool corresponds to the Hyperlinks-Design-Interactivity combination under Net Scoring. There was no significant
difference between the two sites with this tool [Table-2, Appendix-Table-6]. HealthInsite relied entirely on external sites (>70) to
provide information, reachable through a series of mouse-clicks. This rendered usability assessment somewhat difficult. This was
not so with eMedicine, which provided its own material. Both had good print quality, though HealthInsite's multiple partners
resulted in variable fonts and sizes. The somewhat cluttered and confusing appearance of the eMedicine homepage (Figure-16b)
rendered it inferior in site design. Neither site provided author contact details, and only HealthInsite enabled consumer participation.

Reliability: This determines whether the site provides relevant, unbiased information, or unreliable and potentially harmful
information. In a systematic review of Web health information quality, problems were found in 70%.15 Under the Minervation tool,
eMedicine failed. The main reasons were failure to specify currency on all pages and to declare conflicts of interest. Although
HealthInsite provided its content through external resources, most of its material was well categorized and sensibly linked together
as a coherent whole.30 This category roughly corresponds to the Content-Credibility components of Net Scoring, which attaches a
lot of importance to them. With Net Scoring, the composite score difference between the two sites was smaller [Table-2,
Appendix-Tables-4,5]. Both Websites performed poorly in noting omissions, displaying information categories, the name/title of
the author, the source of financing, conflicts of interest, and the webmastering process. Both sites had good editorial review and
language quality. Only HealthInsite provided an alternative language facility. (Figure-20a)




                                                                            Figure-20a: HealthInsite –Language options



A site can only check the quality of its immediately linked pages. It is virtually impossible to verify all the subsequent pages that
partner sites link to; therefore this cannot be considered a quality requirement. HealthInsite provided a disclaimer to this effect.
HealthInsite did not specify influence/bias and had no metadata, while eMedicine did not mention hierarchy of evidence or original
source, and had no help page or scientific review. But its site had an ICRA label v02.31 Net Scoring considers an evolving
technology like metadata an essential criterion.13 Net Scoring, but not the Minervation tool, had an ethics category. It was implicit in
HealthInsite and explicit in eMedicine through a ‘Code of ethics’ link. (Figure-22)




Figure-20b: HealthInsite – Re-direction to partner sites (user freedom)

Figure-21: HealthInsite salient points

Figure-22: eMedicine salient points (advertisement)


Privacy policies: Both sites had similar policies with regard to the type of information collected, how it was used, under what
circumstances and to whom it could be disclosed, and the use of clickstream data/cookies. HealthInsite adhered to the Australian
Guidelines for Federal and ACT Government World Wide Websites, and complied with the Information Privacy Principles (Glossary
1-3,10,11; Privacy Act). It explained e-mail privacy, site security and user anonymity. A contact officer's e-mail address was
provided for privacy-related queries.32 eMedicine gathered information to understand user demographics. It obtained additional
information about site traffic/use via Webtrends™ software. It occasionally shared some information in aggregate form (Figures-
23a,b).33 With a P3P-enabled browser, built into MS-IE6, it would be possible to view sites' privacy policies and match them against
the user's preferences.12,14




Figure-23a: HealthInsite Privacy Statement

Figure-23b: eMedicine Privacy


Advertising: HealthInsite did not accept advertisements,34 but eMedicine did. Advertisement-containing pages take longer to
load; the chances of ad-ware/pop-ups/virus attacks increase; pages may be confusing to the uninitiated user; annoying ads may distract
the reader and affect site usability; there may be content bias/consumerism (more commercial than factually oriented);35
advertisers may not strictly adhere to ethical principles; and the privacy/cookie policies of advertisers may be at variance with those of
the main site.33 However, discreetly placed ads may be a good thing, and sponsored resources may have more user-customized
information. The impact of Web advertising needs further research.35

Regional cultural/linguistic differences: The USA has a substantial Hispanic population, yet eMedicine did not have an ‘Espanol’
option. It has been claimed that American articles are more difficult to read than British-affiliated ones;22 others have challenged
this.36 Our findings did not corroborate the original claim. If the Web is to be truly “accessible to everyone” (Berners-Lee),26 and if
we are to sincerely try to reduce racial/ethnic Internet access disparities (a la the Digital Divide), then apart from alternate language
options, readability levels must be appropriate for socio-ethnic minorities.37

Readability: There was no significant difference between the two sites. Readability can be tested using test subjects, readability
experts or readability formulae.37 We selected the last approach for expediency. The beginning, middle and end portions of the
text must be selected for testing.20,37 The reliability of the results depends on proper ‘cleaning’ of the documents. Variable results
from the same document, and discrepancies between results from different tools, arise from improper sampling/cleaning of the
documents.17,20

MSWord and the online tool gave opposing results. This also emphasizes the variability between different tools/formulae.19
Readability formulae measure the structure/composition of text rather than meaning/context; word length rather than the words
themselves.18,23 They do not distinguish between written discourse and meaningless sentences.37 Shorter sentences and words with
fewer syllables might improve readability scores without improving readability.20 Readability formulae do not measure language
familiarity, clarity, new concepts, format/design, cultural sensitivity/relevance, credibility/believability or comprehensibility.18,20
They do not address communication/interactivity, the reader's interest, experience, knowledge or motivation, time to read, or the
unique characteristics of the Internet.37

For some European languages within an English document, MSWord displays statistics but not readability scores.38 Applying
FRE to German documents does not deliver good results.19 The problem of testing Spanish documents is applicable to the USA. Apart
from establishing a panel of pre-tested experts, we need software like the Lexiles Framework® that measures readability in English
and Spanish.37 Given all these constraints, readability scores computed using formulae should be interpreted with caution. But
they are quick, easy, and better than nothing at all.23

Recommendations to HealthInsite/eMedicine
The following site-specific recommendations are based on the deficiencies noted in the sites. Appendix-Box-6 gives some generic
principles by Nielsen and Constantine.39

HealthInsite
        Accessibility:
                 -Implement Table Summaries (let visually-impaired users know what is in a table)15
                 -Implement HTTP-Equivalent Content-Type in header (W3C requirement)12,15
        Usability:
                 -Integrate non-textual media (visuals) in Website23
                 -Consistent style, avoid excessive fonts and size-variations12
        Reliability:
                 -Specify influence, bias16

eMedicine
       Accessibility:
                   -Implement metadata-Dublin core title tags (compatibility with NHS directives)15
                   -Eliminate outdated codes-HTML elements that would not be used in future versions; specifically body and
                   colour font tags15
                   -Use stylesheets (efficient/consistent design practices)15
                   -Implement Image Alt Tags and Table Summaries (let visually-impaired users know what is in an image and
                   table, respectively)15
                   -Implement DTD (makes site XML-compatible/future-proof)15
                   -Implement HTML Language Definition (Bobby recommendation)15,40
          Usability:
                   -Clear statement of who this Website is for16
                   -Render Website user-interactive (user personalisation)12,16
                   -Make page more neat and trim12,16,39
                   -Provide Help page16,39
                   -Alternate language options16
                   -Advertising links discreetly placed; separate from general content35
                   -Reduce page download time12
          Reliability:
                   -Content-specific feedback mechanism; provide forums/chat (to submit content-specific comments)16
                   -Currency-update content at appropriate intervals; mention last update/review15,16,41,42
                   -State hierarchy/levels of evidence (for medical decisions)14,16
                   -Specify scientific review process16
                   -Provide information on how to evaluate online health information16
Both
          Usability:
                   -Link persistence; verify functioning hyper-links; no broken links12,16,29,43
          Reliability:
                   -Implement MedCIRCLE labeling -follow-up of MedCERTAIN, implements HIDDEL vocabulary (latter is
                   based on MedPICs and is a further development of PICS)14,29,44
                   -Mention source of information (let users verify from the original source)15,16
                   -Author’s name/title, contact information on each document12,22,35
                   -Note any omissions16
                   -Clearly display information categories (factual data, abstracts, full-text documents)16
                   -Proper/formal ‘What’s new’ page16
                   -Specify conflicts of interest (financing source, author independence etc)12,15,16
                   -Mention Webmastering process16
          Readability:
                   -Scale readability level down to 6th Grade level37,45
                   -Have ‘readability seal’ along lines of HONcode seal (inform readers of reading ease/difficulty level)23

Lessons Learned from Study
Our study had several limitations. Our methodology and results have not been validated in independent studies. Only two sites and
limited content were compared over a short period.22,35 Due to the dynamic nature of the Web, some of our findings may change
over time.35 Our study did not evaluate the advertised resources in eMedicine, which may have had more customized
information.35 Accuracy was assessed only by the author; more objective measures for evaluation must be established.8,35

But we learned several lessons. Both our test sites, when run through the gauntlet of a quasi-mathematical objective scoring
system, did not meet the UK government legal standards.16 ‘Popular’ resources are not good enough, and/or quality benchmarking
tools employ criteria that are too difficult to fulfill. Quality benchmarking of online health information resources is a strenuous
task. This is compounded by the fact that rating/scoring systems/tools and readability tools are not perfect, with considerable lack
of conformity between them.15,16,19 Usability is a subjective assessment while reliability/content is more objective.35 Rating tools
are more useful for researchers/informaticians rather than for patients and clinicians.35 People are relying more on the Internet for
health information.1 Our study may provide a basis for clinicians to guide patients seeking relevant/reliable Web health
information.35

Medical knowledge should be treated as a single blob/pool of knowledge, uniformly accessible to professionals and the public.
This viewpoint has its supporters and dissenters.23 Yet the Internet is rife with different information sets for professionals and the
public (Internet in Health and Healthcare; slides 5-14/15-21).12 Our pilot study highlighted the essential differences between
patient/consumer and professional medical/health information resources (NeLH/Cochrane, for example). Our final study
methodology/results may be generalized to the former but not to the latter type of resource. For the latter we require different tools:
Oxford CEBM,13 the ANAES method (level of evidence for therapy),14 DISCERN guidelines (for treatment choices), the Medical
Matrix star ranking system, AMA Guidelines, HSWG Criteria,13 the CONSORT statement (for randomized trials), the QUORUM
statement (for systematic reviews), and the CHERRIES statement (for Internet E-surveys).46

Public health information resources are supposed to be gateways to public education. This involves providing reliable/accurate
information, and informing the public how to assess the quality of that information. HealthInsite had taken cognizance of these points.
It also entails keeping in mind the literacy/readability levels of the average population. The average public readability level is usually
lower than the school grade level completed.37 The estimated reading age of the UK general population is 9 years.23,47 About
47% of the US population demonstrate low literacy levels.48 The OECD considers level 3 the minimum requirement for modern life; a
considerable proportion of the population is below that.49 Most of the evaluated documents required lengthy scrolling, carried
small fonts, and ranked ‘Difficult’/‘Fairly difficult’ (Figure-24). Similar findings were noted by others.22,23,47,50-52 They should be
scaled down to a level appropriate to the target audience, i.e. ‘Standard English’/‘Fairly easy’.48,50 Improving readability will
enhance their public consumption.22,23,47,50 We had to employ different tools to evaluate Websites and readability. Ideally, quality
benchmarking checklists should also include parameters for testing readability.23




  Figure-24: FRE vs. Comparable literature (Breese et al, JAMA 2005)

Figure 25 outlines the complex inter-relationships between quality, accuracy, trust and popularity of Websites, elucidated from
various studies.8,53,54 But there is no uniformity between quality indicators,15,16 and current quality criteria cannot identify
potentially harmful online health information.24 We found HealthInsite better than eMedicine, though both were HONcode-
accredited and had comparable accuracy. Further studies are required to establish the true inter-relationships.


Figure-25: Complex inter-relationships (quality/accuracy/trust/popularity) – far from perfect
  Elements in the figure: HONcode logo; organisation domain; copyright display; accuracy; quality; trust
  Author/medical credentials ≠ Accuracy
  Lack of currency ≠ Inaccuracy
  Presence of advertisements ≠ Inaccuracy
  Quality ≠ Popularity
  Type of Website → Popularity
  Ref: Meric F et al 2002; Fallis et al 2002; Lampe et al 2003

Making Evaluation/Comparison Better
Generalisability: Ideally we should evaluate ~200 Websites,8 a variety of subjects (diabetes, hypertension, asthma, Alzheimer’s,
lung/colon cancer, etc), and include more topics under each disease. To generalize our findings we need broader studies.22,35

Accuracy assessment: In our study the author's personal knowledge of breast cancer was utilised to assess accuracy. But this may
not be possible for all topics/illnesses. We would need to use a panel of experts for each topic8 and/or develop an instrument from
authoritative sources (viz. Harrison OnLine) for each topic, to assess the accuracy of Web content.53 Likewise, readability tests should
ideally be supplemented by feedback from a panel of readability experts.23

Objectively measuring Web content: ‘Concise’, ‘scannable’ or ‘objective’ Web content increases readability by 58%, 47% and
27% respectively; all three attributes together increase readability by 124%.55 We should objectively measure Web content quality using
Nielsen's five usability metrics (Appendix-Box-7).55

Refining readability scoring: There are many readability tests/formulae (Appendix-Box-8).18-21,37 Ideally, a combination of several
tests37 that incorporates the best parameters from all of them should be utilized. Using different tests/formulae serves to cross-
check the readability scores of the text pieces under study, and also to validate one tool against another. We have tried to
achieve both of these to a limited extent in our study.
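
As a minimal sketch of such cross-checking, several grade-level formulae can be computed from the same basic counts and reported side by side, so that large disagreements stand out. The constants are the commonly published ones for ARI, Coleman-Liau and SMOG; the counts below are placeholders, not figures from this study.

```python
# Minimal sketch (Python) of cross-checking a text against several grade-level
# formulae at once, in the spirit of Table-4. Constants are the commonly published
# ones; the counts are placeholders, not figures from the study.

import math

def ari(chars, words, sentences):
    return 4.71 * (chars / words) + 0.5 * (words / sentences) - 21.43

def coleman_liau(chars, words, sentences):
    letters_per_100 = 100 * chars / words
    sentences_per_100 = 100 * sentences / words
    return 0.0588 * letters_per_100 - 0.296 * sentences_per_100 - 15.8

def smog(polysyllables, sentences):
    return 1.0430 * math.sqrt(polysyllables * 30 / sentences) + 3.1291

def cross_check(chars, words, sentences, polysyllables):
    """Return all three grade estimates so that large disagreements stand out."""
    return {
        "ARI": round(ari(chars, words, sentences), 1),
        "Coleman-Liau": round(coleman_liau(chars, words, sentences), 1),
        "SMOG": round(smog(polysyllables, sentences), 1),
    }

print(cross_check(chars=16000, words=3300, sentences=200, polysyllables=250))
```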

Comprehensive quality benchmarking system: A comprehensive quality benchmarking tool may be developed by pooling the best
criteria from all systems currently available. Even better would be an intelligent software wizard, which automatically qualifies a
Website according to pre-programmed criteria.13 It is emphasized that tools/wizards can never fully replace humans in quality
benchmarking tasks; they can only help them work more efficiently and ensure they follow the required evaluation protocol.13


Such a system would require defining a standard (core) statement and developing criteria, perhaps based on the AMOUR principle,
which specifies the required quality level to satisfy the standard.13 The ideal scenario would be intelligent software that performs
automatic site and readability scoring, using best-of-breed criteria for both measurements.13
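
A minimal Python sketch of the kind of rule-based wizard envisaged, assuming pooled, weighted criteria each paired with an automated check over a page's HTML; the criteria, weights and checks are illustrative only and do not represent an agreed standard such as one based on the AMOUR principle.

```python
# Minimal sketch (Python) of a rule-based benchmarking "wizard": pooled criteria,
# each with a weight and an automated check run against a page's HTML. Criteria,
# weights and checks are illustrative only; a real tool would encode an agreed
# standard and still need a human reviewer for the final judgement.

import re
from dataclasses import dataclass
from typing import Callable

@dataclass
class Criterion:
    name: str
    weight: int                      # e.g. essential=9, important=6, minor=3
    check: Callable[[str], bool]     # automated test applied to the raw HTML

CRITERIA = [
    Criterion("Declares a last-update date", 9,
              lambda html: bool(re.search(r"last\s+(updated|reviewed)", html, re.I))),
    Criterion("Carries Dublin Core metadata", 6,
              lambda html: 'name="dc.' in html.lower()),
    Criterion("Offers a help page", 3,
              lambda html: "help" in html.lower()),
]

def benchmark(html: str) -> float:
    """Return the weighted percentage of automated criteria the page satisfies."""
    earned = sum(c.weight for c in CRITERIA if c.check(html))
    return round(100 * earned / sum(c.weight for c in CRITERIA), 1)

sample_html = '<html><head><meta name="DC.title" content="x"></head>' \
              '<body>Last updated June 2005</body></html>'
print(benchmark(sample_html))   # 83.3 for this made-up page (9 + 6 of 18 points)
```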

Summary
Quality control of Internet health information rests on four stanchions: consumer education, self-regulation, third-party evaluation,
and sanction enforcement.56 The basic ingredients for usefulness are Usability, Relevance, Integrability and Quality.29 Nielsen's
‘usability heuristics’ should cover all aspects of usability.40 Education of the online user is important. Silberg's 4 core criteria/JAMA
benchmark (Authorship, Attribution, Currency, Disclosure) are the bare minimum to be looked for in a Website.8,14,42 Additional
points are Accessibility, Privacy and Transparency (EU Guidelines).41 Website popularity is not an essential quality requirement.14

REFERENCES

1. Health Information online: an American survey from the Pew Internet Project. May 2005.                       URL:
http://www.pewinternet.org/pdfs/PIP_Healthtopics_May05.pdf (Accessed 24 June 2005).

2. Eysenbach G, Ryoung Sa E, Diepgen TL. Shopping around the Internet today and tomorrow: towards the millennium of
cybermedicine. BMJ November 1999;319:1294.       URL: http://bmj.bmjjournals.com/cgi/content/full/319/7220/1294
(Accessed 1 June 2005).

3. Coiera EW. Will the Internet replace your doctor? Digital doctors. 1999.                 URL: http://abc.net.au/future/health.htm
(Accessed 1 June 2005)

4. Coiera E. Information epidemics, economics, and immunity on the Internet. BMJ November 1998; 317:1469-1470.

5. Reaney P. Kylie's case shows breast cancer can strike early. Reuters website. May 2005. URL:
http://www.reuters.co.uk/newsArticle.jhtml?type=healthNews&storyID=8517644&section=news&src=rss/uk/healthNews
(Accessed 1 June 2005).

6. BREAST CANCER FACTS AND FIGURES. myDr homepage. March 2001.                                    URL:
http://www.mydr.com.au/default.asp?article=2942 (Accessed 1 June 2005).

7. Kylie's cancer surgery a success. Yahoo UK news. May 2005.                   URL: http://uk.news.yahoo.com/050521/325/fjhlq.html
(Accessed 1 June 2005).

8. Meric F, Bernstam EV, Mirza NQ, Hunt KK, Ames FC, Ross MI, Kuerer HM, Pollock RE, Musen MA and Singletary Eva S.
Breast cancer on the world wide web: cross sectional survey of quality of information and popularity of websites. BMJ
2002;324;577-81.       URL: http://bmj.com/cgi/content/full/324/7337/577 (Accessed 1 June 2005)

9. HONcode. Health On the Net Foundation website.                  URL: http://www.hon.ch/HONcode/ (Accessed 1 June 2005).

10. HON Code of Conduct (HONcode) for medical and health Web sites. Health On the Net Foundation website.                           URL:
http://www.hon.ch/HONcode/Conduct.html (Accessed 1 June 2005).

11. HONcode Toolbar.             URL: http://www.hon.ch/HONcode/Plugin/Plugins.html (Accessed 1 June 2005)

12. Boulos MNK. Internet in Health and Healthcare.     URL: http://www.e-
courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt (Accessed 1 June 2005).

13. Boulos MNK, Roudsari AV, Gordon C, Gray JAM. The Use of Quality Benchmarking in Assessing Web Resources for the
Dermatology Virtual Branch Library of the National electronic Library for Health (NeLH). J Med Internet Res 2001;3(1):e5
     URL: http://www.jmir.org/2001/1/e5/ (Accessed 1 June 2005)

14. Boulos MNK. On quality benchmarking of online medical/health-related information resources. University of Bath School for
Health. March 2004.    URL: http://staff.bath.ac.uk/mpsmnkb/MNKB_Quality.PDF (Accessed 1 June 2005).

15. The LIDA Instrument version 1.2 – Minervation validation instrument for health care web sites. © 2005 Minervation Ltd.
     URL: http://www.minervation.com/mod_lida/minervalidation.pdf (Accessed 12 June 2005).

16. Net Scoring ®: criteria to assess the quality of Health Internet information. Last updated 2001.                  URL: http://www.chu-
rouen.fr/netscoring/netscoringeng.html (Accessed 1 June 2005).




17. Boulos MNK. Activity: Readability of online public/patient health information services. Royal College of Surgeons of
Edinburgh message board site. 2004.     URL: http://www.e-
courses.rcsed.ac.uk/mb3/msgs/dispmessage.asp?MID=MID2004417225527533 (Accessed 1 June 2005).

18. Everything you ever wanted to know about readability tests but were afraid to ask. In: Klare, A Second Look at the validity of
Readability Formulas. Journal of Reading Behaviour 1976; 8:129-52.        URL:
http://www.gopdg.com/plainlanguage/readability.html (Accessed 1 June 2005).

19. Readability.Info. © 2004 by Dave Taylor & Intuitive Systems.                  URL: http://www.readability.info/uploadfile.shtml
(Accessed 12 June 2005).

20. Doak LG, Doak CC, eds. Pfizer Principles for Clear Health Communication, 2nd Ed. New York: Pfizer Inc., 2004.  URL:
http://www.pfizerhealthliteracy.com/pdfs/Pfizers_Principles_for_Clear_Health_Communication.pdf (Accessed 4 June 2005)

21. Readability Calculations. Micro Power & Light Co. Dallas, TX.  URL:
http://www.micropowerandlight.com/rdformulas.html (Accessed 5 June 2005)

22. Weeks WB, Wallace AE. Readability of British and American medical prose at the start of the 21st century. BMJ December
2002;325:1451-2.    URL: http://bmj.bmjjournals.com/cgi/content/full/325/7378/1451 (Accessed 1 June 2005)

23. Boulos MNK. British Internet-Derived Patient Information on Diabetes Mellitus: Is It Readable? DIABETES
TECHNOLOGY & THERAPEUTICS 2005; 7(3). © Mary Ann Liebert, Inc.                 URL: http://www.e-
courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID200557131244705 (Accessed 1 June 2005).

24. Walji M, Sagaram S, Sagaram D, Meric-Bernstam F, Johnson C, Mirza NQ, Bernstam EV. Efficacy of Quality Criteria to
Identify Potentially Harmful Information: A Cross-sectional Survey of Complementary and Alternative Medicine Web Sites. J
Med Internet Res 2004;6(2):e21.       URL: http://www.jmir.org/2004/2/e21/ (Accessed 20 June 2005).

25. Downs SH, Black N. The feasibility of creating a checklist for the assessment of the methodological quality both of
randomised and non-randomised studies of health care interventions. J Epidemiol Community Health. 1998 Jun;52(6):377-84.
     URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Abstract&list_uids=9764259
(Accessed 24 June 2005).

26. Web Accessibility Initiative (WAI). W3C® Web Accessibility Initiative website. Last revised June 2005.                        URL:
http://www.w3.org/WAI/ (Accessed 20 June 2005)

27. Kitemarks. Judge: web sites for health. Last updated September 2004. URL:
http://www.judgehealth.org.uk/how_judge_kitemarks.htm (Accessed 20 June 2005).

28. Inaccessible website demo. Disability Rights Commission. 2005.                  URL: http://www.drc.org.uk/newsroom/demo.asp
 (Accessed 20 June 2005).

29. Boulos MNK. Optimising the Utility of the NeLH VBL for Musculoskeletal Diseases—Technical Considerations.                            URL:
http://healthcybermap.semanticweb.org/publications/nelh27Nov02.ppt (Accessed 1 June 2005)

30. Boulos MNK. What classes as a website. Royal College of Surgeons of Edinburgh message board. 2003. URL:
http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispmessage.asp?MID=MID200312222458705 (Accessed 20 June 2005).

31. ICRA (Internet Content Rating Association). ©1999-2003 Internet Content Rating Association®.                           URL:
http://www.icra.org/about/ (Accessed 1 June 2005).

32. HealthInsite Privacy Statement. HealthInsite Website. Last Updated Oct 2004. URL:
http://www.healthinsite.gov.au/content/internal/page.cfm?ObjID=00063FB9-061E-1D2D-81CF83032BFA006D (Accessed 22
June 2005).

33. Privacy. eMedicine Health Website.             URL: http://www.emedicinehealth.com/common/privacy.asp (Accessed 22 June
2005).

34. HealthInsite Disclaimer. Updated March 2005.        URL:
http://www.healthinsite.gov.au/content/internal/page.cfm?ObjID=0006CF21-0624-1D2D-81CF83032BFA006D (Accessed 22
June 2005).



35. Bedell SE, Agrawal A, Petersen LE. A systematic critique of diabetes on the world wide web for patients and their physicians. International Journal of Medical Informatics 2004. URL: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID2004811113029705 (Accessed 1 June 2005).

36. Albert T. Letter to Editor: Transatlantic writing differences are probably exaggerated. BMJ March 2003;326:711.                URL:
http://bmj.bmjjournals.com/cgi/content/full/326/7391/711 (Accessed 1 June 2005).

37. Chapter 4: Readability Assessment of Health Information on the Internet. URL: http://www.rand.org/publications/documents/interneteval/interneteval.pdf/chap4.pdf (Accessed 23 June 2005)

38. Microsoft® Office Word 2003 (11.5604.5606). Part of Microsoft Office Professional Edition 2003. Copyright © 1983-2003
Microsoft Corporation.

39. Appendix A-3: Heuristic Guidelines for Expert Critique of a Web Site. Evaluation Design/Planning and Methodology for the NIH Web Site - Phase 1. URL: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm#usability (Accessed 1 June 2005).

40. Bobby. © 2003-2004 Watchfire Corporation.                 URL: http://bobby.watchfire.com/bobby/html/en/index.jsp (Accessed 20
June 2005).

41. Quality Criteria for Health Related Websites. Europa European Union Website. Last updated March 2005.         URL:
http://europa.eu.int/information_society/eeurope/ehealth/quality/draft_guidelines/index_en.htm (Accessed 25 June 2005).

42. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor--Let the reader and viewer beware. JAMA 1997 Apr 16;277(15):1244-5.

43. Managing Web Resources for Persistent Access. National Library of Australia. March 2001.                         URL:
http://www.nla.gov.au/guidelines/persistence.html (Accessed 26 June 2005).

44. MedCIRCLE: The Collaboration for Internet Rating, Certification, Labeling and Evaluation of Health Information. Last updated 17 Dec 2002. URL: http://www.medcircle.org/ (Accessed 20 June 2005)

45. Ask Me 3 (Pfizer Inc.): Advancing Clear Health Communication to Positively Impact Health Outcomes (Professional
Presentation Tool Kit). Internet document 2003.     URL: http://www.askme3.org/PFCHC/professional_presentation.ppt
(Accessed 24 June 2005).

46. Eysenbach G. Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES).
J Med Internet Res. 2004 Sep 29;6(3):e34.      URL:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=15471760&dopt=Abstract (Accessed 20
June 2005).

47. Chestnutt IG. Internet-derived patient information on common oral pathologies: is it readable? Prim Dent Care. 2004
Apr;11(2):51-4.       URL:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=15119094 (Accessed 23
June 2005).

48. Clear & Simple: Developing Effective Print Materials for Low-Literate Readers. National Cancer Institute Website. Updated
27 Feb 2003.     URL: http://www.cancer.gov/aboutnci/oc/clear-and-simple/ (Accessed 24 June 2005)

49. Office for National Statistics, UK: Adult Literacy Survey: Literacy Level of Adults by Gender and Age. Internet document
1996.      URL: http://www.statistics.gov.uk/StatBase/Expodata/Spreadsheets/D5047.xls (Accessed 25 June 2005)

50. Breese P, Burman W. Readability of Notice of Privacy Forms Used by Major Health Care Institutions. JAMA (Reprinted) 2005 Apr;293(13):1593-4. URL: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID2005513141638705 (Accessed 1 June 2005).

51. Jaffery JB, Becker BN. Evaluation of eHealth web sites for patients with chronic kidney disease. Am J Kidney Dis. 2004
Jul;44(1):71-6.     URL:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=15211440 (Accessed 24
June 2005)

52. Kirksey O, Harper K, Thompson S, Pringle M. Assessment of Selected Patient Educational Materials of Various Chain Pharmacies. J Health Commun. 2004;9(2):91-93. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=15204820 (Accessed 24 June 2005)

53. Fallis D, Fricke M. Indicators of accuracy of consumer health information on the Internet: a study of indicators relating to
information for managing fever in children in the home. J Am Med Inform Assoc. 2002 Jan-Feb;9(1):73-9.            URL:
http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=11751805 (Accessed 1
June 2005).

54. Lampe K, Doupi P, van den Hoven MJ. Internet health resources: from quality to trust. Methods Inf Med. 2003;42(2):134-42.
     URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=12743649&dopt=Abstract
(Accessed 24 June 2005).

55. Morkes J, Nielsen J. Concise, SCANNABLE, and Objective: How to Write for the Web. October 1997.                        URL:
http://www.useit.com/papers/webwriting/writing.html (Accessed 4 June 2004).

56. Eysenbach G. Consumer health informatics. BMJ June 2000;320:1713-1716.                         URL:
http://bmj.bmjjournals.com/cgi/content/full/320/7251/1713 (Accessed 1 June 2005).

List of abbreviations

AMA: American Medical Association
AMOUR: Achievable, Measurable, Observable, Understandable, Reasonable
CEBM: Centre for Evidence Based Medicine
CHERRIES: Checklist for Reporting Results of Internet E-Surveys
DTD: Document Type Definition
FKGL: Flesch-Kincaid Grade Level
FRE: Flesch Reading Ease
HIDDEL: Health Information Disclosure, Description and Evaluation Language
HONcode: Health On the Net Foundation code of conduct
HSWG: Health Summit Working Group
HTML: Hypertext Markup Language
IE: Internet Explorer
MedCERTAIN: MedPICS Certification and Rating of Trustworthy and Assessed Health Information on the Net
MedCIRCLE: Collaboration for Internet Rating, Certification, Labeling and Evaluation
MS-IE: Microsoft Internet Explorer
NeLH: National electronic Library for Health (now, National Library for Health)
P3P: Platform for Privacy Preferences Project
PICS: Platform for Internet Content Selection
SMOG: Simple Measure of Gobbledegook
W3C: World Wide Web Consortium
WAI: Web Accessibility Initiative
OECD: Organisation for Economic Co-operation and Development
WIF: Web Impact Factor

APPENDICES
    Appendix-Box-A: Websites included in pilot study

    1.  MedlinePlus: http://medlineplus.gov/ or http://www.medlineplus.gov (Accessed 1 June 2005)
    2.  healthfinder®: http://www.healthfinder.gov/ (Accessed 1 June 2005)
    3.  HealthInsite: http://www.healthinsite.gov.au/ (Accessed 1 June 2005)
    4.  HealthConnect: http://www.healthconnect.gov.au (Accessed 1 June 2005)
    5.  NHS Direct Online: http://www.nhsdirect.nhs.uk (Accessed 1 June 2005)
    6.  NeLH (National electronic Library for Health); now called NLH (National Library for Health): http://www.nelh.nhs.uk/ or
        http://www.nlh.nhs.uk (Accessed 1 June 2005)
    7. Cochrane Library: Through NeLH; this also goes through Wiley Interscience interface http://www.nelh.nhs.uk/cochrane.asp;
        through Wiley Interscience interface http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME or
        http://www.mrw.interscience.wiley.com/cochrane/ (Accessed 1 June 2005)
    8. DIPEx (Database of Individual Patient Experiences): http://www.dipex.org (Accessed 1 June 2005)
    9. NIHSeniorHealth: http://nihseniorhealth.gov/ (Accessed 1 June 2005)
    10. eMedicine Health: http://www.emedicinehealth.com/ (Accessed 1 June 2005)




Appendix-Box-1: Centre for Health Information Quality (C-H-i-Q) checklist13

                      1.    Accessibility: Information is in appropriate format for target audience
                      2.    Accuracy: Information is based on best available evidence
                      3.    Appropriateness: Information communicates relevant messages
                      4.    Availability: Information is available to wide audience
                      5.    Currency: Information is up-to-date
                      6.    Legibility: Written information is clearly presented
                      7.    Originality: Information not already produced for the same audience in the same format
                      8.    Patient involvement: Information is specifically designed to meet needs of patient
                      9.    Reliability: Information addresses all essential issues
                      10.   Readability: Words / sentences are kept short; jargon minimized


                                          Appendix-Table-1: Scores of Websites in Pilot Study

(HealthConnect was under re-development at the time of the pilot study; its individual C-H-i-Q criteria could not be scored and are marked "n/a".)

Features | MedlinePlus | healthfinder® | HealthInsite | HealthConnect | NHS Direct Online | NeLH/NLH | Cochrane Library | DIPEx | NIHSeniorHealth | eMedicine
Accessibility | 5 | 4 | 5 | n/a | 4 | 1 | 0 | 2 | 2 | 4
Accuracy | 4 | 3 | 5 | n/a | 4 | 5 | 5 | 2 | 2 | 4
Appropriateness | 4 | 4 | 5 | n/a | 5 | 3 | 2 | 1 | 2 | 5
Availability | 5 | 4 | 5 | n/a | 5 | 2 | 1 | 3 | 2 | 4
Currency | 3 | 4 | 4 | n/a | 4 | 4 | 4 | 3 | 3 | 3
Legibility | 5 | 2 | 5 | n/a | 5 | 2 | 2 | 2 | 3 | 5
Originality | 5 | 2 | 5 | n/a | 5 | 5 | 4 | 4 | 5 | 5
Patient involvement | 5 | 5 | 5 | n/a | 5 | 2 | 2 | 5 | 5 | 5
Reliability | 3 | 3 | 4 | n/a | 4 | 3 | 3 | 1 | 2 | 4
Readability | 3 | 3 | 4 | n/a | 3 | 1 | 1 | 2 | 3 | 4
C-H-i-Q subtotal | 42 | 34 | 47 | 0 | 44 | 28 | 24 | 25 | 29 | 43
Web Impact Factor (AltaVista results, banded: 0-99=1; 100-999=2; 1000-9999=3; 10000-99999=4; 100000+=5) | 53,100 (4) | 376,000 (5) | 36,000 (4) | 142 (2) | 231,000 (5) | 43,300 / 255 (4/2) | 72 (1) | 1,010 (3) | 1,800 (3) | 56,900 (4)
Homepage HONcode-accredited | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1
Language option(s) | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0
Breast cancer page HONcode-accredited | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1
Additional features | 0 | 0 | 0 | 0 | 1 (Language options in audio clips) | 0 | 0 | 0 | 1 (Text size, contrast, speech) | 1 (ICRA label)
Miscellaneous subtotal | 6 | 8 | 7 | 2 | 6 | 4 | 2 | 3 | 4 | 7
Total Score | 48 | 42 | 54 | 2 | 50 | 32 | 26 | 28 | 33 | 50
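
The Web Impact Factor banding used in the table is a simple range lookup on the AltaVista result count. The short Python sketch below merely restates that banding for clarity; the function name and structure are illustrative, not part of any of the benchmarking tools used.

def wif_band(altavista_results: int) -> int:
    # Map an AltaVista result count to the 1-5 Web Impact Factor band used in Appendix-Table-1
    if altavista_results <= 99:
        return 1
    if altavista_results <= 999:
        return 2
    if altavista_results <= 9999:
        return 3
    if altavista_results <= 99999:
        return 4
    return 5

# Example: HealthInsite returned 36,000 results, which falls in band 4
assert wif_band(36000) == 4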




Appendix-Box-2: Minervation tool parameters15

                  Level 1 (Accessibility) (Maximum 63 points)

                  1.   Page setup             | Automated test (maximum 57 points including all 4 automated tests)
                  2.   Access restrictions    | -do-
                  3.   Outdated code          | -do-
                  4.   Dublin core title tags | -do-
                  5.   Browser test (Maximum 3 points)
                  6.   Registration (Maximum 3 points)

                  Level 2 (Usability) (Maximum 54 points)

                  1.   Clarity (6 questions; maximum 18 points)
                  2.   Consistency (3 questions; maximum 9 points)
                  3.   Functionality (5 questions; maximum 15 points)
                  4.   Engagibility (4 questions; maximum 12 points)

                  Level 3 (Reliability) (Maximum 27 points + 24 supplemental points)

                  1.   Currency (3 questions; maximum 9 points)
                  2.   Conflicts of interest (3 questions; maximum 9 points)
                  3.   Content production (3 questions; maximum 9 points)
                  4.   Content production procedure – supplemental (5 questions; maximum 15 points)
                  5.   Output of content - supplemental (3 questions; maximum 9 points)
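
In Appendix-Table-2 (below), each subtotal is expressed as a percentage of its level maximum, and the grand total as a percentage of 144 points (63 + 54 + 27; the 24 supplemental Level 3 points were not scored). For example, HealthInsite's Level 1 subtotal of 56 out of 63 works out to 88.9%, and its grand total of 124 out of 144 to 86.1%.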


Appendix-Box-3: Flesch Reading Ease (FRE) score

This readability score is normally used to assess adult materials.21 It bases its rating on the average number of syllables per word (ASW) and
words per sentence (ASL, i.e. Average Sentence Length). It rates text on a scale of 0 to 100; the higher the score, the easier it is to understand the
document. The score for ‘plain English’ is 65. Flesch scores of <30 indicate extremely difficult reading, as in a legal contract.22
Formula for FRE score
FRE = 206.835 – (1.015 x ASL) – (84.6 x ASW); ASL = Average sentence length (number of words / number of sentences); ASW = Average
number of syllables per word (number of syllables / number of words)
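Worked illustration (figures chosen for illustration only, not drawn from either Website): a text averaging 15 words per sentence (ASL = 15) and 1.5 syllables per word (ASW = 1.5) gives FRE = 206.835 - (1.015 x 15) - (84.6 x 1.5) = 206.835 - 15.225 - 126.9 ≈ 64.7, i.e. approximately the ‘plain English’ level of 65.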


Appendix-Box-4: Flesch-Kincaid Grade Level (FKGL) score

This is most reliable when used with upper elementary and secondary materials.21 It also bases its rating on ASW and ASL. It rates text
on a U.S. grade-school level (a rough measure of how many years of schooling it would take someone to understand the content, with a top score
of 12). A score of 5.0 means that a fifth grader (about 10 years old) can understand the document. For most standard documents, we should aim for a
score of approximately 5.0.
Formula for FKGL score
FKGL = (.39 x ASL) + (11.8 x ASW) – 15.59
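Worked illustration, using the same illustrative figures as for the FRE (ASL = 15, ASW = 1.5): FKGL = (0.39 x 15) + (11.8 x 1.5) - 15.59 = 5.85 + 17.7 - 15.59 ≈ 7.96, i.e. roughly an eighth-grade reading level.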

Appendix-Box-5: Gunning's Fog Index

It is widely used in the health care and general insurance industries for general business publications.21 Fog scores of >16 indicate
extremely difficult reading, as in a legal contract.22
Calculating Fog Index18
(A) Total number of words is divided by total number of sentences to give the average number of words per sentence
(B) Number of words of three or more syllables is divided by total number of words, and multiplied by 100, to give the percentage of difficult words
(C) The sum of (A) and (B) is multiplied by 0.4; the result is the Fog Index, expressed in years of education

Others19
Fog Index = 0.4*(wds/sent+100*((wds >= 3 syll)/wds))
ARI = 4.71*chars/wds+0.5*wds/sentences-21.43
Coleman-Liau = 5.89*chars/wds-0.3*sentences/(100*wds)-15.8
Lix = wds/sent+100*(wds >= 6 char)/wds
SMOG-Grading = square root of (((wds >= 3 syll)/sent)*30) + 3
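
For reference, the formulae in Appendix-Boxes 3-5 and the additional indices above can all be computed directly from word, sentence, syllable and character counts. The Python sketch below simply restates those published formulae; it assumes the counts are supplied by the user, since syllable and ‘hard word’ counting is precisely where the various tools differ in practice.

import math

def readability(words, sentences, syllables, chars, hard_words, long_words):
    # hard_words: words of 3 or more syllables; long_words: words of 6 or more characters
    asl = words / sentences          # average sentence length
    asw = syllables / words          # average syllables per word
    return {
        "FRE":          206.835 - (1.015 * asl) - (84.6 * asw),
        "FKGL":         (0.39 * asl) + (11.8 * asw) - 15.59,
        "Fog":          0.4 * (asl + 100 * hard_words / words),
        "ARI":          4.71 * chars / words + 0.5 * asl - 21.43,
        "Coleman-Liau": 5.89 * chars / words - 0.3 * sentences / (100 * words) - 15.8,
        "Lix":          asl + 100 * long_words / words,
        "SMOG":         math.sqrt((hard_words / sentences) * 30) + 3,
    }

# Example with made-up counts (not taken from either Website):
# print(readability(words=300, sentences=20, syllables=450, chars=1500, hard_words=30, long_words=60))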


                            Appendix-Box-5a: Configuring MSWord to display Readability Scores

                            Tools menu →Options→ Spelling & Grammar tab
                            Check grammar with spelling check box selected
                            Show readability statistics check box selected; clicked OK

                                Appendix-Table-2: Comparison on the basis of Minervation online tool




Parameter                                                   HealthInsite                      eMedicine
Level 1 (Accessibility) (Maximum=63points)
First four automated tests (Maximum=57points)                                     50                                  28
-Browser test (Maximum=3points)                                                   3                                   3
-Registration (Maximum=3points)                                                   3                                   3
                           Subtotal (% of 63)                                               56 (88.9%)                        34 (54%)
Level 2 (Usability) (Maximum=54points)
Clarity (6 questions; maximum=18points)
-Is there a clear statement of who this web site is for?                          3                                   0
-Is the level of detail appropriate to their level of knowledge?                  3                                   2
-Is the layout of the main block of information clear and readable?               2                                   2
-Is the navigation clear and well structured?                                     3                                   3
-Can you always tell your current location in the site?                           2                                   3
-Is the colour scheme appropriate and engaging?                                   2                                   2
Consistency (3 questions; maximum=9points)
-Is the same page layout used throughout the site?                               2                                    3
-Do navigational links have a consistent function?                               3                                    3
-Is the site structure (categories or organisation of pages) applied             3                                    3
consistently?
Functionality (5 questions; maximum=15points)
-Does the site provide an effective search facility?                             3                                    3
-Does the site provide effective browsing facilities?                            3                                    3
-Does the design minimise the cognitive overhead of using the site?              2                                    2
-Does the site support the normal browser navigational tools?                    2                                    2
-Can you use the site without third party plug-ins?                              3                                    3
Engagibility (4 questions; maximum=12points)
-Can the user make an effective judgment of whether the site applies             3                                    2
to them?
-Is the web site interactive?                                                    3                                    0
-Can the user personalise their experience of using the site?                    3                                    1
-Does the web site integrate non-textual media?                                  0                                    2
                           Subtotal (% of 54)                                               45 (83.3%)                       39 (72.2%)
Level 3 (Reliability) (Maximum=27points)
Currency (3 questions; maximum=9points)
-Does the site respond to recent events?                                         3 (‘News’ link)                      2 (eNewsletter would be sent)
-Can users submit comments on specific content?                                  3 (‘Consumer participation’ link)    0
-Is site content updated at an appropriate interval?                             3 (Mentioned)                        1
Conflicts of interest (3 questions; maximum=9points)
-Is it clear who runs the site?                                                  3 (Australian government)            3 (private company)
-Is it clear who pays for the site?                                              3 (-do-)                             0 (cannot tell)
-Is there a declaration of the objectives of the people who run the              3                                    3
site?
Content production (3 questions; maximum=9points)
-Does the site report a clear content production method?                          3 (‘About HealthInsite’ link)       3 (‘About us’ link)
-Is this a robust method?                                                         2                                   2
-Can the information be checked from original sources?                            0 (Can’t tell)                      0 (Can’t tell)
                          Subtotal (% of 27)                                               23 (85.2%)                        14 (51.8%)
                        Grand total (% of 144)                                            124 (86.1%)                        87 (60.4%)

Appendix-Table-3: Automated Accessibility results of HealthInsite and eMedicine (Minervation)

HealthInsite

1.1 Page Setup                                            80 %
1.1.1 Document Type Definition                              3
1.1.2 HTTP-Equiv Content-Type (in header)                   0
1.1.3 HTML Language Definition                              3
1.1.4 Page Title                                            3
1.1.5 Meta Tag Keywords                                     3




1.2 Access Restrictions                                   66 %
1.2.1 Image Alt Tags                                        3
1.2.2 Specified Image Widths                                2
1.2.3 Table Summaries                                       0
1.2.4 Frames                                                3

[Minervation summary panel for http://www.healthinsite.gov.au: overall score 87% (Medium); 1.1 Page Setup pass rate ~80% (Medium); 1.2 Access Restrictions ~66% (Medium); 1.3 Outdated Code ~100% (High); 1.4 Dublin Core Tags ~100% (High)]
1.3 Outdated Code                                        100 %
1.3.1 Body Tags - Body Background Colour                   3
1.3.2 Body Tags - Body Topmargin                           3
1.3.3 Body Tags - Body Margin Height                       3
1.3.4 Table Tags - Table Background Colour                 3
1.3.5 Table Tags - Table Column (td) Height                3
1.3.6 Table Tags - Table Row (tr) Height                   3
1.3.7 Font Tags - Font Color                               3
1.3.8 Font Tags - Font Size                                3
1.3.9 Align (non style sheet)                              3

1.4 Dublin Core Tags                                   100 %
1.4.1 Dublin Core Title Tag                               3
Accessibility:                                      87 % (50 / 57)
TOTAL RATING                                        87 % (50 / 57)

eMedicine
1.1 Page Setup                                           60 %
1.1.1 Document Type Definition                             0
1.1.2 HTTP-Equiv Content-Type (in header)                  3
1.1.3 HTML Language Definition                             0
1.1.4 Page Title                                           3
1.1.5 Meta Tag Keywords                                    3

[Minervation summary panel for http://www.emedicinehealth.com/: overall score 49% (Low); 1.1 Page Setup pass rate ~60% (Medium); 1.2 Access Restrictions ~50% (Low); 1.3 Outdated Code ~48% (Low); 1.4 Dublin Core Tags ~0% (Low)]

1.2 Access Restrictions                                  50 %
1.2.1 Image Alt Tags                                       1
1.2.2 Specified Image Widths                               2
1.2.3 Table Summaries                                      0
1.2.4 Frames                                               3

1.3 Outdated Code                                        48 %
1.3.1 Body Tags - Body Background Colour                   0
1.3.2 Body Tags - Body Topmargin                           0
1.3.3 Body Tags - Body Margin Height                       0
1.3.4 Table Tags - Table Background Colour                 2
1.3.5 Table Tags - Table Column (td) Height                3
1.3.6 Table Tags - Table Row (tr) Height                   3
1.3.7 Font Tags - Font Color                               0
1.3.8 Font Tags - Font Size                                3
1.3.9 Align (non style sheet)                              2

1.4 Dublin Core Tags                                    0%
1.4.1 Dublin Core Title Tag                               0
Accessibility:                                      49 % (28 / 57)
TOTAL RATING                                        49 % (28 / 57)

                                  Appendix-Table-4: Comparison of Content category (Net Scoring)


Content (information) quality (Content category) (Maximum=87points)
                                                   HealthInsite (Australia)                            eMedicine Consumer Health (USA)
Accuracy (essential criterion)             9                                                         9
Hierarchy of evidence (important           6 (‘Reviews of Evidence for                               0 (Not specified in any page)
criterion)                                 Treatments’)
Original Source Stated (essential          9 (Most pages mentioned it)                               0 (Not specified in any page)
criterion)
Disclaimer (important criterion)           6 (Disclaimer provided)                                   6 (Disclaimer provided)
Logic organization (navigability)          7 (Pages redirected to partner websites,                  9
(essential criterion)                      with notification)
Quality of the internal search engine      6                                                         6
(important criterion)
General index (important criterion)        6                                                         6
What’s new page (important criterion)      4 (‘News’ and ‘HealthInsite Newsletter’)                  3 (‘eMedicine Spotlight’)
Help page (minor criterion)                3                                                         0
Map of the site (minor criterion)          3                                                         3
Omissions noted (essential criterion)      0 (None)                                                  0
Fast load of the site and its different    6                                                         4 (Ads reduced speed of loading)
pages (important criterion)
Clear display of available information     0 (None)                                                  0
categories (factual data, abstracts, full-
text documents, catalogue, databases)
(important criterion)
          SUBTOTAL (%of 87)                                65 (74.7%)                                                  46 (52.9%)

                                Appendix-Table-5: Comparison of Credibility category (Net Scoring)

Completeness / currency / usefulness of information (Credibility category) (Maximum=99points)
                                                   HealthInsite (Australia)            eMedicine Consumer Health (USA)
Name, logo and references of the           9 (All pages, including partner sites had 9 (All pages)
institution on each document of the site   them)
(essential criterion)
Name and title of author on each           0 (None mentioned)                        0 (None mentioned)
document of the site (essential criterion)
Context: source of financing,              0 (None mentioned)                        0 (None mentioned)
independence of the author(s) (essential
criterion)
Conflict of interest (important criterion) 0 (None mentioned)                        0 (None mentioned)
Influence, bias (important criterion)      0 (None mentioned)                        3 (Mentioned partly in Disclaimer)
Updating: currency information of the      9                                         6 (some pages mentioned it)
site (essential criterion) including:
- date of creation                         Yes                                       Yes
- date of last update / last version       Yes                                       No
Relevance/utility (essential criterion)    9 (For public information)                8 (For public + healthcare professionals)
Editorial review process (essential        9 (Mentioned)                             9 (Mentioned)
criterion)
Webmastering process (important            1 (Mentioned in one partner site)         0 (Not mentioned anywhere)
criterion)
Scientific review process (important       6 (‘Reviews of Evidence for               0 (Not mentioned)
criterion)                                 Treatments’)
Target/purpose of the web site; access to  6 (Free access to all pages)              4 (Site had a ‘Registration’ link; general
the site (free or not, reserved or not)                                              public info could be freely accessed;
(important criterion)                                                                sponsored links present)
Quality of the language and/or translation 6 (good language; other language options 3 (good language; no other language
(important criterion)                      provided)                                 options)
Use of metadata (essential criterion)      0                                         10 (ICRA label v02)
           SUBTOTAL (%of 99)                               55 (55.5%)                                52 (52.5%)

                     Appendix-Table-6: User interface / Ease of finding information / Usability (Net Scoring)

                                              HealthInsite (Australia)            eMedicine Consumer Health (USA)
Hyperlinks category (Maximum=45points) (Minus 6points for NA parameter; so maximum=39points)
Selection (essential criterion)       9                                         9
Architecture (important criterion)    6                                         4 (Hyperlinks were a bit cluttered)


Content (essential criterion)                     9                                                  9
Web Impact Factor: Back-links                     4 (36,000 results from AltaVista)                  4 (56,900 results from AltaVista)
(important criterion)
Regular verification that hyper-links are         0 (not mentioned, though no broken links           0 (not mentioned, though no broken links
functioning, i.e., no broken links                were encountered)                                  were encountered)
(important criterion)
In case of modification of the site               NA (Not applicable)                                NA
structure, link between old and new
HTML documents (important criterion)
Distinction between internal and external         3 (Specified)                                      3 (There were no separate hyperlinks)
hyper-links (minor criterion)
          SUBTOTAL (%of 39)                                         31 (79.5%)                                         29 (74.4%)
Design category (Maximum=21points)
Design of the site (essential criterion)          9 (Neat and trim, user-friendly)                   7 (Somewhat cluttered, likely to be
                                                                                                     confusing to some)
Readability of the text (important criterion)     3 (See Readability scores)                                              4 (See Readability scores)
Quality of the print (important criterion)        3 [Combination of Tahoma (font 9, 9.5, 10, 11.5), Verdana               5 [Only Times New Roman (font 12) for headings
                                                  (font 7, 9.5, 12), Arial (font 10)]                                     and Verdana (font 7.5) for text]
SUBTOTAL (%of 21)                                       15 (71.4%)                                                     16 (76.2%)
Accessibility category (Maximum=12points)
Accessibility from the main search     6 (Dual mode of access – from search                          6 (Same arguments apply)
engines and catalogues (important      box and from A-Z site map; latter gave
criterion)                             more logical arrangement of topics.
                                       Search engine gave results according to
                                       relevance ranking)
Intuitive address of a site (important 0 (Not present)                                               0
criterion)
         SUBTOTAL (% of 12)                              6 (50%)                                                           6 (50%)
Interactivity category (Maximum=18points)
Feedback mechanism: Email of author on 5 (‘Feedback’/‘Contact us’ links in main                      5 (‘Contact us’ links in all site pages; no
every document (essential criterion)   pages and some partner site pages; no                         author or contact info)
                                       author or contact info)
Forums, chat (minor criterion)         2 (‘Consumer participation’ link)                             0 (None)
Traceability, cookies etc (important   6 (Cookies etc specified)                                     6 (same points)
criterion)
         SUBTOTAL (% of 18)                             13 (72.2%)                                                     11 (61.1%)
Ethics category (Maximum=18points)
Liability of the reader (essential     9 (‘Disclaimer’ link)                                         9 (‘Disclaimer’ link)
criterion)
Medical privacy (essential criterion)   9 (‘Privacy’ link)                                           9 (‘Privacy’ link)
         SUBTOTAL (% of 18)                             18 (100%)                                                    18 (100%)

    Appendix-Box-6: User interface design principles by Nielsen (1994)39
    1. Visibility of system status: The system should keep users informed about what is going on, through appropriate and timely feedback.
    2. System and real world match: Follow real-world conventions, making information appear in a natural and logical order.
    3. User freedom: Users need a clearly marked ‘emergency exit’ from mistakes. Support undo and redo.
    4. Consistency and standards: Follow platform conventions to avoid confusion among users
    5. Error prevention: Careful design prevents a problem from occurring
    6. Recognition rather than recall: Make objects, actions, and options visible. Instructions for use of the system should be
        visible.
    7. Flexibility and efficiency: Allow users to tailor frequent actions.
    8. Aesthetic and minimalist design: Dialogues should not contain information that is irrelevant or rarely needed.
    9. Help users recognize and recover from errors: Error messages should express the problem in plain language and suggest a solution.
    10. Help and documentation: Any such information should be easy to search, focused on the user’s task, list concrete steps to be
        carried out and not be too large.
    Usability principles by Constantine (1994)39
    A. Structure Principle: Organize the user interface purposefully, putting related things together and separating unrelated things.
    B. Simplicity Principle: Make common tasks simple to do, communicate simply in user’s own language, provide good shortcuts.
    C. Visibility Principle: Keep all options and materials for a given task visible.
    D. Feedback Principle: Keep users informed of actions/interpretations, changes of state/condition, and errors/exceptions.
    E. Tolerance Principle: Be flexible and tolerant, reducing the cost of mistakes and misuse by allowing undoing and redoing
        while preventing errors.
    F. Reuse Principle: Reduce the need for users to rethink and remember by reusing internal and external components and
        behaviors.




Appendix-Box-7: Usability metrics for objectively measuring Web content (Morkes and Nielsen)55

1.   Task time: Number of seconds to find answers for tasks
2.   Task errors: Percentage score based on the number of incorrect answers
3.   Memory: Recognition and recall
          a. Recognition memory: A percentage score based on the number of correct answers minus the number of incorrect answers to
               questions
          b. Recall memory: A percentage score based on the number of pages correctly recalled minus the number incorrectly recalled
4.   Sitemap time:
          a. Time to recall site structure: The number of seconds to draw a sitemap
          b. Sitemap accuracy: A percentage score based on the number of pages and connections between pages correctly identified, minus
               the number of pages and connections incorrectly identified
5.   Subjective satisfaction: Subjective satisfaction index is the mean score of four indices – Quality, Ease of use, Likeability, User effect
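
As an illustration of how these measures could be tallied for a single test participant, the short Python sketch below restates the scoring rules listed above. All function names, parameters and the assumption that percentages are taken over the total number of questions or pages are ours and purely illustrative; Morkes and Nielsen's own instruments are not reproduced here.

def recognition_memory(correct, incorrect, total_questions):
    # Percentage score: correct minus incorrect answers, taken over the number of questions (assumed base)
    return 100.0 * (correct - incorrect) / total_questions

def recall_memory(pages_correct, pages_incorrect, pages_total):
    # Percentage score: pages correctly recalled minus pages incorrectly recalled
    return 100.0 * (pages_correct - pages_incorrect) / pages_total

def subjective_satisfaction(quality, ease_of_use, likeability, user_effect):
    # Subjective satisfaction index: mean of the four indices
    return (quality + ease_of_use + likeability + user_effect) / 4.0

# Example: 8 correct and 2 incorrect answers out of 10 questions gives a 60% recognition-memory score
# print(recognition_memory(8, 2, 10))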


Appendix-Box-8: Various Readability tools, formulae and software18-21,37

•    Dale-Chall: Original vocabulary-based formula used to assess upper elementary through secondary materials
•    Fry Graph: Used over a wide grade range of materials, from elementary through college and beyond
•    Powers-Sumner-Kearl: For assessing primary through early elementary level materials
•    FORCAST: Focuses on functional literacy. Used to assess non-running narrative, e.g. questionnaires, forms, tests etc
•    Spache: Original vocabulary-based formula widely used in assessing primary through fourth grade materials
•    McLaughlin's SMOG (Simple Measure of Gobbledegook): Unlike any of the other formulas, SMOG predicts the grade level required for
     100% comprehension
•    Cloze procedure: The "cloze" procedure (from the word ‘closure’) for testing writing is often treated as a readability test because a formula
     exists for translating the data from "cloze tests" into numerical results
•    Lexile Framework®: A software tool that measures readability in both English and Spanish
•    ARI: The Automated Readability Index is typically higher than Kincaid and Coleman-Liau, but lower than Flesch
•    Coleman-Liau Formula usually gives a lower grade than Kincaid, ARI and Flesch when applied to technical documents.
•    Lix: A very simple formula developed by Bjornsson of Sweden; it also employs a mapping table




 
Call Girls Varanasi Just Call 8250077686 Top Class Call Girl Service Available
Call Girls Varanasi Just Call 8250077686 Top Class Call Girl Service AvailableCall Girls Varanasi Just Call 8250077686 Top Class Call Girl Service Available
Call Girls Varanasi Just Call 8250077686 Top Class Call Girl Service Available
 
VIP Hyderabad Call Girls Bahadurpally 7877925207 ₹5000 To 25K With AC Room 💚😋
VIP Hyderabad Call Girls Bahadurpally 7877925207 ₹5000 To 25K With AC Room 💚😋VIP Hyderabad Call Girls Bahadurpally 7877925207 ₹5000 To 25K With AC Room 💚😋
VIP Hyderabad Call Girls Bahadurpally 7877925207 ₹5000 To 25K With AC Room 💚😋
 
Call Girls Service Jaipur {9521753030 } ❤️VVIP BHAWNA Call Girl in Jaipur Raj...
Call Girls Service Jaipur {9521753030 } ❤️VVIP BHAWNA Call Girl in Jaipur Raj...Call Girls Service Jaipur {9521753030 } ❤️VVIP BHAWNA Call Girl in Jaipur Raj...
Call Girls Service Jaipur {9521753030 } ❤️VVIP BHAWNA Call Girl in Jaipur Raj...
 
Models Call Girls In Hyderabad 9630942363 Hyderabad Call Girl & Hyderabad Esc...
Models Call Girls In Hyderabad 9630942363 Hyderabad Call Girl & Hyderabad Esc...Models Call Girls In Hyderabad 9630942363 Hyderabad Call Girl & Hyderabad Esc...
Models Call Girls In Hyderabad 9630942363 Hyderabad Call Girl & Hyderabad Esc...
 
Call Girls in Lucknow Just Call 👉👉7877925207 Top Class Call Girl Service Avai...
Call Girls in Lucknow Just Call 👉👉7877925207 Top Class Call Girl Service Avai...Call Girls in Lucknow Just Call 👉👉7877925207 Top Class Call Girl Service Avai...
Call Girls in Lucknow Just Call 👉👉7877925207 Top Class Call Girl Service Avai...
 

U-5_Health Info Resource Comparison

  • 1. Critical Evaluation and Comparison of Two Internet Public Health Information Resources ABSTRACT OBJECTIVE: To determine which of two Websites, HealthInsite and eMedicine Consumer Health is the better Internet public health information resource DESIGN: Pilot study of 10 Websites to select 2 finalists; objective comparison of the two final sites and their breast cancer information content, using Minervation and Net Scoring benchmarking tools, and a manual and online readability tests DATA SOURCES: Key features from all the Websites MAIN OUTCOME MEASURES: Accessibility, Usability, Reliability and Readability of the sites RESULTS: All figures are for HealthInsite vs. eMedicine. With Minervation tool, Accessibility was 88.9% vs. 54%; Usability 83.3% vs. 72.2%; Reliability 85.2% vs. 51.8%; Overall score was 86.1% vs. 60.4% . With Net Scoring the corresponding scores were 50% each (Accesibility); 74.4% vs. 70.6% (Usability); 65.1% vs. 52.7% (Reliability); 68.7% vs. 60.5% (Overall). Readability scores were 43.1 vs. 47 (FRE) (p=0.99); 11.6 vs. 10.7 (FKGL) (p=0.98); 9.7 vs. 8.2 (Fog). With online readability tool, the scores were 61.8 vs. 61.3 (FRE); 8.9 vs. 8.7 (FKGL); 12.2 vs. 11.8 (Fog) CONCLUSION: As a patient/public health information resource, HealthInsite was better overall. Both HealthInsite and eMedicine failed to meet UK government requirements. Quality benchmarking tools and readability tests/formulae are not perfect and lack conformity amongst themselves. The task of benchmarking and measuring readability is rigorous and time-consuming. Automating both processes through a comprehensive tool may aid the human experts in performing their task more efficiently. Key words: Quality benchmarking tools; Readability tests (The following document’s FRE=28.6 and FKGL=12)
  • 2. INTRODUCTION Ninety-five million Americans use Internet for health information.1 There were >100,000 medical Websites in 1999, and increasing phenomenally.2 These give rise to some cogent questions begging for urgent answers. How much of the information is useful, genuine or usable to the public? What impact does it have on them?3,4 What benchmarking tools to use to assess the authenticity/reliability/validity of online information? How to improve the benchmarking process? This essay considers these inter-related issues. We have critically compared/contrasted two public/patient Internet health information resources from two regions, from a public/patient’s and a specialist’s perspective. We selected breast cancer because it is the most common cancer in women, kills 400,000 annually, and can strike early;5,6 it is the biggest cause of cancer deaths in Australian women, and second biggest cause in US and Britain;6,7 it is one of the most common health-related search topics among Internet users;8 and finally, the author of this essay manages the Breast Clinic in the Seychelles Ministry of Health. MATERIALS AND METHODS Downloading/Installing HONcode Toolbar The HONcode9,10 Accreditation Search and Verification Toolbars software was downloaded and installed in our browser (IE Version-6.0.2800.1106) Explorer Bar, through a series of HONcode 1.2Setup wizard dialogue boxes. (Figures-1,2) Figure-1: HON and HONcode logos Figure-2: Screenshot of HONcode 1.2 Setup wizard box We installed the automatic HONcode accreditation status indicator on the Toolbar (View menu→Toolbar option). We did not install HONcode search box because it was slowing down the opening of our browser. Right-clicking on some highlighted text and selecting ‘HONcode search’ indicated the site’s accreditation status.11 Piloting 10 sites Figure-3: C-H-i-Q logo Next, a pilot study was conducted on ten Websites (selected from Internet in Health and Healthcare12 and from Internet survey) to finalise two for evaluation/comparison. Patient/public-oriented resources and some professional ones were included (Appendix-Box-A). The Centre for Health Information Quality (C-H-i-Q)13 (Figure-3) checklist was applied to each site. The parameters were scored on a scale from 0 (not present) to 5 (fully present). We determined the Web Impact Factor (WIF)8,14 from the results returned by AltaVista (http://www.altavista.com/; accessed 18 June 2005), by entering ‘link:URL - host:URL’ in the search box, after selecting ‘search Worldwide’ option. Some additional points, including HONcode status were also included, with a score of 0 (not present) or 1 (present).(Appendix-Box-1, Appendix-Table-1) HealthInsite / eMedicine Analysis and Comparison We applied two quality benchmarking tools to the two finalists, HealthInsite (Australian) and eMedicine (American), to compare two resources from different regions. The two benchmarking tools were: 1. A tool from Minervation (LIDA Instrument version1.2) (Figure-4): This tool assesses a site on three Levels; Accessibility, Usability and Reliability, which are further subdivided into sub-levels and sub-sub-levels (Appendix-Box-2).15 Figure-4: Minervation homepage For Accessibility (Level-1) we went to www.minervation.com/validation (Accessed 12 June 2005) and entered the respective URLs (HealthInsite, eMedicine) in the designated box. The validation tool generated answers to first 4questions. For all the remaining questions we viewed the sites normally and entered the appropriate scores. 
Each question was scored on a scale of 0-3,
  • 3. where 0=Never; 1=Sometimes; 2=Mostly; 3=Always. The supplemental questions in Reliability (Level-3) were not considered since they required contacting the site producers.15 2. Net Scoring, a French quality benchmarking tool, was applied next (Figure-5). This has 49 criteria grouped into 8 categories; Credibility, Content, Hyperlinks, Design, Interactivity, Quantitative aspects, Ethics, and Accessibility. Each criterion is classified as essential (0-9), important (0-6), or minor (0-3); (Maximum=312points).16 All categories were used for evaluation except Quantitative, and one important criterion under Hyperlinks, which were not applicable to us. Therefore our assessment was on a maximum of 294(312minus18) points. Figure-5: Central Health/Net Scoring logos Readability scoring Breast cancer contents of each site were compared by means of readability indices. For consistency, specific breast cancer topics were selected8.(Table-3) The Readability tests were Flesch Reading Ease (FRE)17, Flesch-Kincaid Grade Level (FKGL)17, Gunning's Fog Index18, and an online readability tool that automatically generated Kincaid, ARI (Automated Readability Index), Coleman-Liau, Flesch, Fog, Bjornsson’s Lix and McLaughlin's SMOG scores19(Appendix-Boxes-3,4,5) Microsoft® Word has in-built facility to give the FRE and FKGL scores. The ‘Tools’ menu in the MSWord 2003 was configured as outlined in Appendix-Box-5a, Figure-6.17 Figure-6: Screenshot of Tools menu Options dialogue box FRE and FKGL scores: Text from the documents was copied in clipboard and pasted in Microsoft® (Redmond, WA) Word2003. Each document was meticulously ‘processed’ as per Pfizer guidelines (viz. headings/titles, page navigation, bullets, references/URLs removed; hyphenated words/proper nouns included; footnotes excluded); only the main text body was used.17,20 On running spellchecker in MSWord2003, after it finished checking, it displayed statistics about the document and FRE/FKGL scores.17 Mean, Standard Deviation, Variance and Probability associated with Student’s t test were computed in MSExcel2003. Fog Index: In the absence of software,21,22 we calculated Fog Index ‘manually’, as outlined in Appendix-Box-5.18 We counted all the words with >3 syllables according to Pfizer guidelines (saying the word aloud with a finger under chin; each chin drop counted as a syllable).20 Online readability tool: For further readability check, the documents were uploaded onto an automated online readability tool (http://www.readability.info/uploadfile.shtml; accessed 12 June 2005). The instrument converted the document into plain text and generated the scores.19 RESULTS HONcode toolbar installation The HONcode Status and Search icons were installed on the Explorer Bar and the accreditation status indicator in the Toolbar. The latter automatically displayed the HONcode accreditation status of a Website (Figures-7a,b). : Figure-7a HONcode Status Search icons installed and accreditation status HONcode Search icon Accreditation status HONcode Status icon MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 3
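The readability measures described above (FRE, FKGL and Gunning's Fog) are fully specified in Appendix-Boxes-3, 4 and 5. As a minimal sketch of how those calculations could be reproduced programmatically, the script below applies the published formulas; the vowel-group syllable counter is only a rough stand-in for the manual Pfizer 'chin-drop' method, and the sample sentence is purely illustrative.

```python
import re

def count_syllables(word: str) -> int:
    """Approximate syllable count by counting vowel groups (a proxy for the manual chin-drop method)."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def readability(text: str) -> dict:
    """Compute FRE, FKGL and Fog from plain text, using the formulas in Appendix-Boxes-3,4,5."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = [count_syllables(w) for w in words]
    asl = len(words) / len(sentences)                                   # average sentence length
    asw = sum(syllables) / len(words)                                   # average syllables per word
    pct_hard = 100 * sum(1 for s in syllables if s >= 3) / len(words)   # % of 'difficult' (>=3-syllable) words
    return {
        "FRE": 206.835 - 1.015 * asl - 84.6 * asw,
        "FKGL": 0.39 * asl + 11.8 * asw - 15.59,
        "Fog": 0.4 * (asl + pct_hard),
    }

if __name__ == "__main__":
    sample = ("Breast cancer is the most common cancer in women. "
              "Early detection through mammography improves survival.")
    print(readability(sample))
```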
  • 4. Figure-7b: Accreditation status
Pilot study results
The top scorers were HealthInsite (54), and NHSDirect and eMedicine (50 each) (Box-2, Appendix-Table-1). Only MedlinePlus, healthfinder®, HealthInsite and eMedicine were HONcode-accredited. NHSDirect and NeLH carried the NHS seal. The MedlinePlus breast cancer page was not HONcode-accredited.
Box-2: Scores of Websites in pilot study
- HealthInsite (Australia) – 54
- NHS Direct Online (UK) – 50
- eMedicine Consumer Health (USA) – 50
- MedlinePlus (USA) – 48
- healthfinder® (USA) – 42
- NIHSeniorHealth (USA) – 33
- NeLH/NLH (UK) – 32
- DIPEx (USA) – 28
- Cochrane Library – 26
- HealthConnect (Australia) – 2
HealthConnect, under re-development, returned no breast cancer search results. The Cochrane Library and NeLH/NLH had insufficient public material. AltaVista search results for NeLH/NLH were 43,300 and 255 respectively. The DIPEx breast cancer search returned only subjective interview transcripts rather than objective information. NIHSeniorHealth had features typically suited to the elderly. MedlinePlus and healthfinder® were comparable, but breast cancer information was more systematically arranged in the former. The three top scorers in the pilot, NHSDirect, HealthInsite and eMedicine, had almost comparable features. NHSDirect had the highest number of results (231,000) from the AltaVista search. (Figures-8-16)
Figure-8: HealthConnect breast cancer search result
Figure-9: NeLH breast cancer search page, patient information (NHS Seal)
Figure-10: NIHSeniorHealth breast cancer search page
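The pilot totals in Box-2 and Appendix-Table-1 combine the ten C-H-i-Q criteria (each scored 0-5), a banded Web Impact Factor derived from the AltaVista 'link:URL -host:URL' result count, and the miscellaneous 0/1 points. A minimal sketch of that aggregation is shown below; the helper names are hypothetical, but the banding and the HealthInsite figures come from Appendix-Table-1.

```python
def wif_band(altavista_results: int) -> int:
    """Convert an AltaVista result count into the banded WIF score used in Appendix-Table-1."""
    bands = [(100_000, 5), (10_000, 4), (1_000, 3), (100, 2), (0, 1)]
    for threshold, score in bands:
        if altavista_results >= threshold:
            return score
    return 0

def pilot_score(chiq_scores: dict, altavista_results: int, misc_points: int) -> int:
    """Total pilot score = C-H-i-Q subtotal (ten 0-5 criteria) + WIF band + miscellaneous 0/1 points."""
    assert all(0 <= v <= 5 for v in chiq_scores.values())
    return sum(chiq_scores.values()) + wif_band(altavista_results) + misc_points

# Illustrative check against HealthInsite's published pilot figures:
# C-H-i-Q subtotal 47, 36,000 AltaVista results -> band 4, plus 3 miscellaneous points -> total 54.
healthinsite_chiq = {"Accessibility": 5, "Accuracy": 5, "Appropriateness": 5, "Availability": 5,
                     "Currency": 4, "Legibility": 5, "Originality": 5, "Patient involvement": 5,
                     "Reliability": 4, "Readability": 4}
print(pilot_score(healthinsite_chiq, altavista_results=36_000, misc_points=3))  # -> 54
```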
  • 5. Figure-11: DIPEx breast cancer search page
Figures-12, 13: MedlinePlus homepage; breast cancer information
Figure-14: NHS Direct Online homepage (NHS seal; highest link popularity)
Figure-15: HealthInsite homepage
Figure-16: eMedicine Consumer Health homepage
Benchmarking results
With the Minervation tool, HealthInsite secured 86.1% against eMedicine's 60.4% (Table-1, Appendix-Tables-2,3).
Table-1: Results with Minervation tool
  • 6. (All scores shown as HealthInsite vs. eMedicine)
Level-1 (Accessibility) (Maximum=63 points):
  First four automated tests (Maximum=57 points): 50 vs. 28
  Browser test (Maximum=3 points): 3 vs. 3
  Registration (Maximum=3 points): 3 vs. 3
  Subtotal (% of 63): 56 (88.9%) vs. 34 (54%)
Level-2 (Usability) (Maximum=54 points):
  Clarity (6 questions; maximum=18 points): 15 vs. 12
  Consistency (3 questions; maximum=9 points): 8 vs. 9
  Functionality (5 questions; maximum=15 points): 13 vs. 13
  Engagibility (4 questions; maximum=12 points): 9 vs. 5
  Subtotal (% of 54): 45 (83.3%) vs. 39 (72.2%)
Level-3 (Reliability) (Maximum=27 points):
  Currency (3 questions; maximum=9 points): 9 vs. 3
  Conflicts of interest (3 questions; maximum=9 points): 9 vs. 6
  Content production (3 questions; maximum=9 points): 5 vs. 5
  Subtotal (% of 27): 23 (85.2%) vs. 14 (51.8%)
Grand total (% of 144): 124 (86.1%) vs. 87 (60.4%)
With Net Scoring, HealthInsite scored marginally better (68.7%) than eMedicine (60.5%) (Table-2; Appendix-Tables-4,5,6).
Table-2: Results with Net Scoring (HealthInsite vs. eMedicine)
  Content category (Maximum=87 points): 64 (74.7%) vs. 46 (52.9%)
  Credibility category (Maximum=99 points): 55 (55.5%) vs. 52 (52.5%)
  Hyperlinks (Maximum=45 points) (minus 6 points, see text): 31 (79.5%) vs. 29 (74.4%)
  Design (Maximum=21 points): 15 (71.4%) vs. 16 (76.2%)
  Accessibility (Maximum=12 points): 6 (50%) vs. 6 (50%)
  Interactivity (Maximum=18 points): 13 (72.2%) vs. 11 (61.1%)
  Ethics (Maximum=18 points): 18 (100%) vs. 18 (100%)
  Grand total (% of 294): 202 (68.7%) vs. 178 (60.5%)
Figure-17 graphically represents the total scores of the two sites by the two benchmarking tools.
Figure-17: 2x2 comparison [Two sites on the basis of two benchmarking tools]
Readability results
The results of MSWord (FRE, FKGL) and the manual technique (Fog) are summarized in Boxes-3,4; Table-3; Figures-18,19; and embedded MSExcel2003 worksheets-1,2.
Box-3: HealthInsite Fog Index
  Words/Sentences = 5198 / 237 = 21.93
  [Words>3 syllables / Words] x 100 = [126 / 5198] x 100 = 2.42
  [21.93 + 2.42] x 0.4 = 9.74
Box-4: eMedicine Fog Index
  Words/Sentences = 3319 / 196 = 16.93
  [Words>3 syllables / Words] x 100 = [124 / 3319] x 100 = 3.73
  [16.93 + 3.73] x 0.4 = 8.26
Table-3: Readability Statistics (HealthInsite vs. eMedicine)
  Breast cancer overview / facts and figures: FRE 55, FKGL 9.9 vs. FRE 56.2, FKGL 8.8
  Breast cancer causes / risk factors: FRE 55.6, FKGL 8.9 vs. FRE 48.2, FKGL 11
  Tests for breast cancer / mammography: FRE 49.7, FKGL 10.4 vs. FRE 51.9, FKGL 9.7
  Treatment options for breast cancer: FRE 38.1, FKGL 12 vs. FRE 41.1, FKGL 11.7
  • 7. Support / follow up for women with breast cancer: FRE 40.8, FKGL 11.7 vs. FRE 41.5, FKGL 11.6
  Combined text: FRE 43.1, FKGL 11.6, Fog 9.74 vs. FRE 47.0, FKGL 10.7, Fog 8.26
  Mean μ=(∑A-E / 5): 47.84, 10.58 vs. 47.78, 10.56
  Std Deviation (SD) σ: 8.05, 1.28 vs. 6.56, 1.26
  Variance: 51.87, 1.32 vs. 34.42, 1.28
Figure-18a: HealthInsite Readability Indices screenshot
Figure-18b: eMedicine Readability Indices screenshot
Sheet-1: Comparison of FRE scores (HealthInsite vs. eMedicine). Mean 47.84 vs. 47.78; SD 8.05 vs. 6.56; Variance 51.87 vs. 34.42; probability associated with a Student's t test (2-tailed distribution, unpaired 2-sample with unequal variance) = 0.99. No statistical difference.
Sheet-2: Comparison of FKGL scores (HealthInsite vs. eMedicine). Mean 10.58 vs. 10.56; SD 1.28 vs. 1.26; Variance 1.32 vs. 1.28; probability associated with a Student's t test (2-tailed distribution, unpaired 2-sample with unequal variance) = 0.98. No statistical difference.
The results from the online readability tool are summarized in Table-4, Figure-19.
Table-4: Online readability results (HealthInsite vs. eMedicine)
  Kincaid: 8.9 vs. 8.7
  ARI: 10.3 vs. 9.7
  Coleman-Liau: 13.1 vs. 12.9
  Flesch Index: 61.8 vs. 61.3
  Fog Index: 12.2 vs. 11.8
  Lix: 41.1 (School year 7) vs. 40.3 (School year 6)
  SMOG-Grading: 11.3 vs. 11.1
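The worksheet statistics (the Welch-type unpaired t tests in Sheets-1 and -2, and the FRE-FKGL correlation in Sheet-3) can also be reproduced outside MSExcel. The following is a small sketch using the topic-wise values from Table-3, assuming numpy and scipy are available:

```python
import numpy as np
from scipy import stats

# Topic-wise scores from Table-3 (five breast cancer topics per site).
fre_healthinsite  = [55, 55.6, 49.7, 38.1, 40.8]
fre_emedicine     = [56.2, 48.2, 51.9, 41.1, 41.5]
fkgl_healthinsite = [9.9, 8.9, 10.4, 12, 11.7]
fkgl_emedicine    = [8.8, 11, 9.7, 11.7, 11.6]

# Two-tailed, unpaired, unequal-variance (Welch) t tests, as in the embedded worksheets.
t_fre, p_fre = stats.ttest_ind(fre_healthinsite, fre_emedicine, equal_var=False)
t_fkgl, p_fkgl = stats.ttest_ind(fkgl_healthinsite, fkgl_emedicine, equal_var=False)
print(f"FRE:  p = {p_fre:.2f}")    # ~0.99 -> no statistical difference
print(f"FKGL: p = {p_fkgl:.2f}")   # ~0.98 -> no statistical difference

# Pearson correlation between FRE and FKGL across all ten text pieces (cf. MSExcel CORREL).
fre_all = fre_healthinsite + fre_emedicine
fkgl_all = fkgl_healthinsite + fkgl_emedicine
print(f"FRE-FKGL correlation = {np.corrcoef(fre_all, fkgl_all)[0, 1]:.2f}")  # ~ -0.96
```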
  • 8. Figure-19: Readability of the two sites through the automated tool
The mean readability values (MSWord-derived) for HealthInsite and eMedicine were similar (FRE 47.84 vs. 47.78 (p=0.99); FKGL 10.58 vs. 10.56 (p=0.98), respectively). The automated test generated higher FRE and lower FKGL than MSWord. Conversely, it returned a higher Fog index than the manual method. The mean scores and the scores for combined text (MSWord-derived) were similar for eMedicine but not so for HealthInsite. We found a high negative correlation (–0.96) between FRE and FKGL, measured with the MSExcel CORREL function (Worksheet-3, Chart-1).23
Sheet-3 and Chart-1: FRE-FKGL correlation across the ten text pieces = –0.96; almost perfect negative correlation.
DISCUSSION OF METHODS AND RESULTS
HONcode represents a pledge to abide by 8 ethical principles.10 It is not a quality benchmarking instrument/seal. Unethical developers may cut-and-paste it onto their sites.13 Moreover, sites displaying a HONcode seal may not comply with the code.8 They may violate the HONcode after the accreditation was awarded by HON, and before their next infrequent check. Though we installed the toolbar plugin,11 these caveats should be kept in mind; the HON toolbar per se does not detect violations in a HON-accredited site.
We utilized the Web Impact Factor (link/'peer-review' popularity) in our pilot study. This is a better indicator of popularity than click popularity (frequency of site visitation), which may be manipulated.8,14 By AltaVista search, eMedicine had a lower WIF than healthfinder® and NHSDirect, and HealthInsite's was even lower (Appendix-Table-1). Thus, popularity of a site does not necessarily correlate with quality.8,14
HealthInsite / eMedicine – Critical/Analytical Review/Comparison (Figures-15,16,20a,b,21,22)
Benchmarking tools: Score ranges like 0-3, 0-6, etc. are pseudo-objective, giving a false sense of mathematical precision. Moreover, low scores in one important criterion may be compensated by high scores in two unimportant criteria, giving the same overall score. There is a lack of conformity between different tools.15,16 There is also the problem of inter-rater reliability (kappa-value).24,25 Finally, there are rater-training and rating-the-rater issues to be considered.14 But in the absence of other means of site assessment, scoring systems represent the only available fallback. They force our attention towards the important points about a health information site. They are considered acceptable if at least some issues (like inter-rater agreement kappa-value >0.6) are dealt with at the outset.24
Accessibility: This determines if the Website meets the W3C-WAI and Bobby standards, if it is 'future-proof' and if users can access the information.15,26,27 Automated tools/measurements of accessibility are not perfect. They should be used with caution, and their results interpreted with discretion.28 eMedicine failed in the automated tests (Page setup, Access restrictions, Outdated code, Dublin core tags).
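Checks of the kind just listed (Dublin Core tags, outdated HTML code, missing alt text, and so on) can be approximated with very simple pattern matching. The sketch below is only illustrative and is not the Minervation/LIDA validation tool; the check names and patterns are assumptions for demonstration.

```python
import re

def accessibility_spot_checks(html: str) -> dict:
    """Crude pattern-based spot checks echoing the kinds of automated accessibility tests
    described above. Not the actual Minervation/LIDA automated tests."""
    return {
        "has_dublin_core_title": bool(re.search(r'<meta[^>]+name=["\']DC\.title', html, re.I)),
        "has_doctype_dtd": html.lstrip().lower().startswith("<!doctype"),
        "uses_outdated_font_tags": bool(re.search(r"<font\b", html, re.I)),
        "images_missing_alt": bool(re.search(r"<img(?![^>]*\balt=)[^>]*>", html, re.I)),
        "declares_html_lang": bool(re.search(r"<html[^>]+lang=", html, re.I)),
    }

# Hypothetical page fragment used only to exercise the checks.
sample_page = ('<!DOCTYPE html><html lang="en"><head><title>Breast cancer</title></head>'
               '<body><font color="red">Old markup</font><img src="logo.gif"></body></html>')
for check, result in accessibility_spot_checks(sample_page).items():
    print(f"{check}: {result}")
```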
With 87% overall, HealthInsite still did not meet the UK government legal standards.15 By Net Scoring,
  • 9. both sites scored 50% in Accessibility, but we cannot rely on this figure because Net Scoring attached rather low importance to this category. Usability: This determines if users can find the required information; the ease of use of a service/component. Good usability increases usage, and ‘stickability’. Low usability results in decreased perception of usefulness.12,15,29 With Minervation tool, HealthInsite scored somewhat better than eMedicine. The latter lost out on clarity and site interactivity. Usability under Minervation tool corresponds to Hyperlinks-Design-Interactivity combination under Net Scoring. There was no significant difference between the two sites with this tool [Table-2, Appendix-Table-6]. HealthInsite relied entirely on external sites (>70) to provide information, reachable through a series of mouse-clicks. This rendered usability assessment somewhat difficult. This was not so with eMedicine, which provided its own material. Both had good print quality, though HealthInsite’s multiple partners resulted in variable fonts and sizes. Somewhat cluttered and confusing appearance of eMedicine homepage (Figure-16b) rendered it inferior in site design. Both sites had no author contact, and only HealthInsite enabled consumer participation. Reliability: This determines if the site provides relevant, unbiased, or unreliable and potentially harmful information. In a systematic review of Web health information quality, problem was found in 70%.15 Under Minervation tool, eMedicine failed. Main reason was failure to specify currency in all pages and conflicts of interest. Despite providing its content through external resources, most of HealthInsite’s material was well categorized and sensibly linked together as a coherent whole.30 This category roughly corresponds with Content-Credibility components of Net Scoring, which attaches a lot of importance to them. With Net Scoring, the composite score difference between the two sites was less [Table-2, Appendix-Tables-4,5]. Both Websites performed poorly in noting omissions, displaying information categories, name/title of author, source of financing, conflicts of interest, and webmastering process. Both sites had good editorial review and language quality. Only HealthInsite had provided alternative language facility. (Figure-20a) Figure-20a: HealthInsite –Language options A site can only check the quality of its immediately linked pages. It is virtually impossible to verify all subsequent pages that the partner sites link to; therefore it cannot be considered a quality requirement. HealthInsite had provided a disclaimer to this effect. HealthInsite did not specify influence/bias and had no metadata, while eMedicine did not mention hierarchy of evidence, original source, and had no help page and scientific review. But its site had ICRA-label v0231. Net Scoring considers an evolving technology like metadata an essential criterion.13 Net Scoring, but not Minervation tool, had ethics category. It was implicit in HealthInsite and explicit in eMedicine through a ‘Code of ethics’ link. (Figure-22) User freedom Figure-20b: HealthInsite –Re-direction to partner sites MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 9
  • 10. Figure-21: HealthInsite salient points Advertisement Figure-22: eMedicine salient points Privacy policies: Both sites had similar policies with regard to type of information collected, how it was used, under what circumstances and to whom it could be disclosed, and use of clickstream data/cookies. HealthInsite adhered to Australian Guidelines for Federal and ACT Government World Wide Websites, and complied with Information Privacy Principles (Glossary 1-3,10,11; Privacy Act). It explained about E-mail privacy, site security and user anonymity. Contact officer’s E-mail was provided for privacy-related queries.32 eMedicine gathered information to understand user demographics. It obtained additional information about site traffic/use via Webtrends™ software. It occasionally shared some information in aggregate form (Figures- 23a,b).33 With a P3P-enabled browser, in-built in MS-IE6, it could be possible to view sites’ privacy policies and match them with user’s preferences.12,14 Figure-23a: HealthInsite Privacy Statement MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 10
  • 11. Figure-23b: eMedicine Privacy Advertising: HealthInsite did not accept advertisements,34 but eMedicine did. Advertisement-containing pages take longer to load, chances of ad-ware/pop-ups/virus attacks increase, pages may be confusing to the uninitiated user, annoying ads may distract the reader and affect site usability, there may be content bias/consumerism (more commercial than factually-oriented),35 advertisers may not strictly adhere to ethical principles, and privacy/cookie policies of advertisers may be at variance with that of main site.33 However, discretely placed ads may be a good thing, and sponsored resources may have more user-customized information. Impact of Web advertising needs further research.35 Regional cultural/linguistic differences: USA has a substantial Hispanic population, yet eMedicine did not have ‘Espanol’ option. It has been claimed that American articles are more difficult to read than British affiliated ones;22 others have challenged it.36 Our findings did not corroborate the original claim. If the Web is to be truly “accessible to everyone”(Berners-Lee)26, and if we are to sincerely try to reduce racial/ethnic Internet access disparities (a la Digital Divide), then apart from alternate language options, readability levels must be appropriate for the socio-ethnic minorities.37 Readability: There was no significant difference between the two sites. Readability can be tested by using test subjects, readability experts or readability formulae.37 We selected the last approach because of expediency. The beginning, middle, end portions of text must be selected for testing.20,37 Reliability of the results depends on proper ‘cleaning’ of the documents. Variable results from the same document and discrepancies between results from different tools arise from improper sampling/cleaning of the documents.17,20 MSWord and online tool gave opposing results. This also emphasizes the variability between different tools/formulae.19 Readability formulae measure structure/composition of text rather than meaning/context; word-length rather than the words.18,23 They do not distinguish between written discourse and meaningless sentences.37 Shorter sentences and words with fewer syllables might improve readability scores without improving readability.20 Readability formulae do not measure language familiarity, clarity, new concepts, format/design, cultural sensitivity/relevance, credibility/believability and comprehensibility.18,20 They do not address communication/interactivity, reader’s interest, experience, knowledge or motivation, time to read, and the unique characteristics of Internet.37 For some European languages within an English document, MSWord displays statistics but not readability scores.38 Applying FRE to German documents does not deliver good results.19 The problem of testing Spanish documents is applicable to USA. Apart from establishing a panel of pre-tested experts, we need software like Lexiles Framework® that measures readability in English and Spanish.37 Given all these constraints, readability scores computed using formulae should be interpreted with caution. But they are quick, easy, and better than nothing at all.23 Recommendations to HealthInsite/eMedicine The following site-specific recommendations are based on the deficiencies noted in the sites. 
Appendix-Box-6 gives some generic principles by Nielsen and Constantine.39 HealthInsite Accessibility: -Implement Table Summaries (let visually-impaired users know what is in a table)15 -Implement HTTP-Equivalent Content-Type in header (W3C requirement)12,15 Usability: -Integrate non-textual media (visuals) in Website23 -Consistent style, avoid excessive fonts and size-variations12 Reliability: -Specify influence, bias16 eMedicine Accessibility: MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 11
  • 12. -Implement metadata-Dublin core title tags (compatibility with NHS directives)15 -Eliminate outdated codes-HTML elements that would not be used in future versions; specifically body and colour font tags15 -Use stylesheets (efficient/consistent design practices)15 -Implement Image Alt Tags and Table Summaries (let visually-impaired users know what is in an image and table, respectively)15 -Implement DTD (makes site XML-compatible/future-proof)15 -Implement HTML Language Definition (Bobby recommendation)15,40 Usability: -Clear statement of who this Website is for16 -Render Website user-interactive (user personalisation)12,16 -Make page more neat and trim12,16,39 -Provide Help page16,39 -Alternate language options16 -Advertising links discretely-placed; separate from general content35 -Reduce page download time12 Reliability: -Content-specific feedback mechanism; provide forums/chat (to submit content-specific comments)16 -Currency-update content at appropriate intervals; mention last update/review15,16,41,42 -State hierarchy/levels of evidence (for medical decisions)14,16 -Specify scientific review process16 -Provide information on how to evaluate online health information16 Both Usability: -Link persistence; verify functioning hyper-links; no broken links12,16,29,43 Reliability: -Implement MedCIRCLE labeling -follow-up of MedCERTAIN, implements HIDDEL vocabulary (latter is based on MedPICs and is a further development of PICS)14,29,44 -Mention source of information (let users verify from the original source)15,16 -Author’s name/title, contact information on each document12,22,35 -Note any omissions16 -Clearly display information categories (factual data, abstracts, full-text documents)16 -Proper/formal ‘What’s new’ page16 -Specify conflicts of interest (financing source, author independence etc)12,15,16 -Mention Webmastering process16 Readability: -Scale readability level down to 6th Grade level37,45 -Have ‘readability seal’ along lines of HONcode seal (inform readers of reading ease/difficulty level)23 Lessons Learned from Study Our study had several limitations. Our methodology and results have not been validated in independent studies. Only two sites and limited content were compared over a short period.22,35 Due to the dynamic nature of the Web, some of our findings may change over time.35 Our study did not evaluate the advertised resources in eMedicine, which may have had more customized information.35 Accuracy was assessed only by the author; more objective measures for evaluation must be established.8,35 But we learned several lessons. Both our test sites, when run through the gauntlet of a quasi-mathematical objective scoring system, did not meet the UK government legal standards.16 ‘Popular’ resources are not good enough, and/or quality benchmarking tools employ criteria that are too difficult to fulfill. Quality benchmarking of online health information resources is a strenuous task. 
This is compounded by the fact that rating/scoring systems/tools and readability tools are not perfect, with considerable lack of conformity between them.15,16,19 Usability is a subjective assessment while reliability/content is more objective.35 Rating tools are more useful for researchers/informaticians rather than for patients and clinicians.35 People are relying more on the Internet for health information.1 Our study may provide a basis for clinicians to guide patients seeking relevant/reliable Web health information.35 Medical knowledge should be treated as a single blob/pool of knowledge with uniform accessibility to professionals and public. This viewpoint has its supporters and dissenters.23 Yet, the Internet is rife with different information sets for professionals and public (Internet in Health and Healthcare; slides 5-14/15-21).12 Our pilot study highlighted the essential differences between patient/consumer and professional medical/health information resources (NeLH/Cochrane, for example). Our final study methodology/results may be generalized to the former but not to the latter types of resources. For these we require different tools; Oxford CEBM,13 ANAES method (level of evidence for therapy),14 DISCERN guidelines (for treatment choices), Medical Matrix star ranking system, AMA Guidelines, HSWG Criteria,13 CONSORT statement (for randomized trials), QUORUM statement (for systematic reviews), and CHERRIES statement (for Internet E-surveys).46 Public health information resources are supposed to be gateways to public education. This involves providing reliable/accurate information, and informing them how to assess the quality of information. HealthInsite had taken cognizance of these points. This also entails keeping in mind the literacy/readability levels of the average population. Average public readability level is usually MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 12
  • 13. lower than the school grade level completed.37 The estimated reading age level of UK general population is 9 years.23,47 About 47% of US population demonstrate low literacy levels.48 OECD considers level 3 as minimum requirement for modern life; considerable proportion of the population is below that.49 Most of the evaluated documents required lengthy scrolling, carried small font, and ranked ‘Difficult’/‘Fairly difficult’ (Figure-24). Similar findings were noted by others.22,23,47,50-52 They should be scaled down to the level appropriate to the target audience, i.e. ‘Standard English’/‘Fairly easy’.48,50 Improving readability will enhance their public consumption.22,23,47,50 We had to employ different tools to evaluate Websites and readability. Ideally, quality benchmarking checklists should include parameters for testing readability also.23 Figure-24: FRE vs. Comparable literature (Breese et al, JAMA 2005) Figure 25 outlines the complex inter-relationships between quality, accuracy, trust and popularity of Websites, elucidated from various studies.8,53,54 But there is no uniformity between quality indicators,15,16 and current quality criteria cannot identify potentially harmful online health information.24 We found HealthInsite better than eMedicine, though both were HONcode- accredited and had comparable accuracy. Further studies are required to establish the true inter-relationships. HONcode logo Organisation domain Copyright display Accuracy Quality Author/medical credentials ≠ Accuracy Lack of currency ≠ Inaccuracy Presence of advertisements ≠ Inaccuracy Quality ≠ Popularity Type of Website → Popularity Trust Ref: Meric F et al 2002; Fallis et al 2002; Lampe et al 2003 Figure-25: Complex inter-relationships (quality/accuracy/trust/popularity) – far from perfect Making Evaluation/Comparison Better Generalisability: Ideally we should evaluate ~200 Websites,8 a variety of subjects (diabetes, hypertension, asthma, Alzheimer’s, lung/colon cancer, etc), and include more topics under each disease. To generalize our findings we need broader studies.22,35 Accuracy assessment: In our study the author’s personal knowledge of breast cancer was utilised to assess accuracy. But such may not be the case for all topics/illnesses. We need to use a panel of experts for each topic8 and/or develop an instrument from authoritative sources (viz. Harrison OnLine) for each topic, to assess accuracy of Web content.53 Likewise readability tests should ideally be supplemented by feedback from a panel of readability experts.23 Objectively measuring Web content: ‘Concise’ or ‘scannable’ or ‘objective’ Web content increases readability by 58%, 47% and 27% respectively; all three attributes increases readability by 124%.55 We should objectively measure Web content quality using Nielsen’s five usability matrices (Appendix-Box-7).55 Refining readability scoring: There are many readability tests/formulae (Appendix-Box-8)18-21,37. Ideally a combination of several tests37 that incorporates the best parameters from all tests should be utilized. Using different tests/formulae will serve to cross- check the readability scores of the text pieces under study, and also serve to validate one tool against the other. We have tried to achieve both these to a limited extent in our study. Comprehensive quality benchmarking system: A comprehensive quality benchmarking tool may be developed by pooling the best criteria from all systems currently available. 
Even better would be an intelligent software wizard, which automatically qualifies a Website according to pre-programmed criteria.13 It is emphasized that tools/wizards can never fully replace humans in quality benchmarking tasks; they can only help them work more efficiently and ensure they follow the required evaluation protocol.13
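As a minimal sketch of what such a wizard's scoring core might look like, the fragment below combines weighted benchmark criteria with a readability verdict in a single automated report. All names, weights and thresholds here are illustrative assumptions under a Net Scoring-style essential/important/minor weighting, not a published specification.

```python
from dataclasses import dataclass

@dataclass
class Criterion:
    name: str
    weight: int      # e.g. 9 = essential, 6 = important, 3 = minor (Net Scoring-style weighting, assumed)
    satisfied: bool

def benchmark_score(criteria: list[Criterion]) -> float:
    """Percentage of the weighted maximum achieved by the satisfied criteria."""
    maximum = sum(c.weight for c in criteria)
    achieved = sum(c.weight for c in criteria if c.satisfied)
    return 100 * achieved / maximum

def combined_report(criteria: list[Criterion], fre: float) -> str:
    """Combine a quality-benchmark percentage with a readability verdict in one automated report.
    The FRE >= 60 cut-off ('Standard English'/'Fairly easy') is an assumed threshold."""
    quality = benchmark_score(criteria)
    readability = "suitable for a general audience" if fre >= 60 else "too difficult for a general audience"
    return f"Quality score {quality:.1f}%; FRE {fre:.1f} ({readability})"

criteria = [
    Criterion("Authorship disclosed", 9, True),
    Criterion("Date of last update shown", 9, False),
    Criterion("Conflicts of interest stated", 6, False),
    Criterion("Help page present", 3, True),
]
print(combined_report(criteria, fre=43.1))  # 43.1 = HealthInsite combined-text FRE from Table-3
```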
  • 14. Such a system would require defining a standard (core) statement and developing criteria, perhaps based on AMOUR principle, which specifies the required quality level to satisfy the standard.13 The ideal scenario would be intelligent software that performs automatic site and readability scoring, using best-of-breed criteria for both measurements.13 Summary Quality control of Internet health information rests on four stanchions: consumer education, self regulation, third-party evaluation, and sanction enforcements.56 The basic ingredients for usefulness are Usability, Relevance, Integrability and Quality.29 Nielsen’s ‘usability heuristics’ should cover all aspects of usability.40 Education of online user is important. Silberg’s 4-core criteria/JAMA- benchmark (Authorship, Attribution, Currency, Disclosure) is the bare minimum to be looked for in a Website.8,14,42 Additional points are Accessibility, Privacy and Transparency (EU Guidelines)41. Website popularity is not an essential quality requirement.14 REFERENCES 1. Health Information online: an American survey from the Pew Internet Project. May 2005. URL: http://www.pewinternet.org/pdfs/PIP_Healthtopics_May05.pdf (Accessed 24 June 2005). 2. Eysenbach G, Ryoung Sa E, Diepgen TL. Shopping around the Internet today and tomorrow: towards the millennium of cybermedicine. BMJ November 1999;319:1294. URL: http://bmj.bmjjournals.com/cgi/content/full/319/7220/1294 (Accessed 1 June 2005). 3. Coiera EW. Will the Internet replace your doctor? Digital doctors. 1999. URL: http://abc.net.au/future/health.htm (Accessed 1 June 2005) 4. Coiera E. Information epidemics, economics, and immunity on the Internet. BMJ November 1998; 317:1469-1470. 5. Reaney P. Kylie's case shows breast cancer can strike early. Reuters website. May 2005. URL: http://www.reuters.co.uk/newsArticle.jhtml?type=healthNews&storyID=8517644&section=news&src=rss/uk/healthNews (Accessed 1 June 2005). 6. BREAST CANCER FACTS AND FIGURES. myDr homepage. March 2001. URL: http://www.mydr.com.au/default.asp?article=2942 (Accessed 1 June 2005). 7. Kylie's cancer surgery a success. Yahoo UK news. May 2005. URL: http://uk.news.yahoo.com/050521/325/fjhlq.html (Accessed 1 June 2005). 8. Meric F, Bernstam EV, Mirza NQ, Hunt KK, Ames FC, Ross MI, Kuerer HM, Pollock RE, Musen MA and Singletary Eva S. Breast cancer on the world wide web: cross sectional survey of quality of information and popularity of websites. BMJ 2002;324;577-81. URL: http://bmj.com/cgi/content/full/324/7337/577 (Accessed 1 June 2005) 9. HONcode. Health On the Net Foundation website. URL: http://www.hon.ch/HONcode/ (Accessed 1 June 2005). 10. HON Code of Conduct (HONcode) for medical and health Web sites. Health On the Net Foundation website. URL: http://www.hon.ch/HONcode/Conduct.html (Accessed 1 June 2005). 11. HONcode Toolbar. URL: http://www.hon.ch/HONcode/Plugin/Plugins.html (Accessed 1 June 2005) 12. Boulos MNK. Internet in Health and Healthcare. URL: http://www.e- courses.rcsed.ac.uk/mschi/unit5/KamelBoulos_Internet_in_Healthcare.ppt (Accessed 1 June 2005). 13. Boulos MNK, Roudsari AV, Gordon C, Gray JAM. The Use of Quality Benchmarking in Assessing Web Resources for the Dermatology Virtual Branch Library of the National electronic Library for Health (NeLH). J Med Internet Res 2001;3(1):e5 URL: http://www.jmir.org/2001/1/e5/ (Accessed 1 June 2005) 14. Boulos MNK. On quality benchmarking of online medical/health-related information resources. University of Bath School for Health. March 2004. 
URL: http://staff.bath.ac.uk/mpsmnkb/MNKB_Quality.PDF (Accessed 1 June 2005). 15. The LIDA Instrument version 1.2 – Minervation validation instrument for health care web sites. © 2005 Minervation Ltd. URL: http://www.minervation.com/mod_lida/minervalidation.pdf (Accessed 12 June 2005). 16. Net Scoring ®: criteria to assess the quality of Health Internet information. Last updated 2001. URL: http://www.chu- rouen.fr/netscoring/netscoringeng.html (Accessed 1 June 2005). MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 14
  • 15. 17. Boulos MNK. Activity: Readability of online public/patient health information services. Royal College of Surgeons of Edinburgh message board site. 2004. URL: http://www.e- courses.rcsed.ac.uk/mb3/msgs/dispmessage.asp?MID=MID2004417225527533 (Accessed I June 2005). 18. Everything you ever wanted know about readability tests but were afraid to ask. In: Klare, A Second Look at the validity of Readability Formulas. Journal of Reading Behaviour 1976; 8:129-52. URL: http://www.gopdg.com/plainlanguage/readability.html (Accessed 1 June 2005). 19. Readability.Info. © 2004 by Dave Taylor & Intuitive Systems. URL: http://www.readability.info/uploadfile.shtml (Accessed 12 June 2005). 20. Doak LG, Doak CC, eds. Pfizer Principles for Clear Health Communication, 2nd Ed. New York: Pfizer Inc., 2004. URL: http://www.pfizerhealthliteracy.com/pdfs/Pfizers_Principles_for_Clear_Health_Communication.pdf (Accessed 4 June 2005) 21. Readability Calculations. Micro Power & Light Co. Dallas, TX. URL: http://www.micropowerandlight.com/rdformulas.html (Accessed 5 June 2005) 22. Weeks WB, Wallace AE. Readability of British and American medical prose at the start of the 21st century. BMJ December 2002;325:1451-2. URL: http://bmj.bmjjournals.com/cgi/content/full/325/7378/1451 (Accessed 1 June 2005) 23. Boulos MNK. British Internet-Derived Patient Information on Diabetes Mellitus: Is It Readable? DIABETES TECHNOLOGY & THERAPEUTICS 2005; 7(3). © Mary Ann Liebert, Inc. URL: http://www.e- courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID200557131244705 (Accessed 1 June 2005). 24. Walji M, Sagaram S, Sagaram D, Meric-Bernstam F, Johnson C, Mirza NQ, Bernstam EV. Efficacy of Quality Criteria to Identify Potentially Harmful Information: A Cross-sectional Survey of Complementary and Alternative Medicine Web Sites. J Med Internet Res 2004;6(2):e21. URL: http://www.jmir.org/2004/2/e21/ (Accessed 20 June 2005). 25. Downs SH, Black N. The feasibility of creating a checklist for the assessment of the methodological quality both of randomised and non-randomised studies of health care interventions. J Epidemiol Community Health. 1998 Jun;52(6):377-84. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&dopt=Abstract&list_uids=9764259 (Accessed 24 June 2005). 26. Web Accessibility Initiative (WAI). W3C® Web Accessibility Initiative website. Last revised June 2005. URL: http://www.w3.org/WAI/ (Accessed 20 June 2005) 27. Kitemarks. Judge: web sites for health. Last updated September 2004. URL: http://www.judgehealth.org.uk/how_judge_kitemarks.htm (Accessed 20 June 2005). 28. Inaccessible website demo. Disability Rights Commission. 2005. URL: http://www.drc.org.uk/newsroom/demo.asp (Accessed 20 June 2005). 29. Boulos MNK. Optimising the Utility of the NeLH VBL for Musculoskeletal Diseases—Technical Considerations. URL: http://healthcybermap.semanticweb.org/publications/nelh27Nov02.ppt (Accessed 1 June 2005) 30. Boulos MNK. What classes as a website. Royal College of Surgeons of Edinburgh message board. 2003. URL: http://www.e-courses.rcsed.ac.uk/mb3/msgs/dispmessage.asp?MID=MID200312222458705 (Accessed 20 June 2005). 31. ICRA (Internet Content Rating Association). ©1999-2003 Internet Content Rating Association®. URL: http://www.icra.org/about/ (Accessed 1 June 2005). 32. HealthInsite Privacy Statement. HealthInsite Website. Last Updated Oct 2004. URL: http://www.healthinsite.gov.au/content/internal/page.cfm?ObjID=00063FB9-061E-1D2D-81CF83032BFA006D (Accessed 22 June 2005). 33. Privacy. 
eMedicine Health Website. URL: http://www.emedicinehealth.com/common/privacy.asp (Accessed 22 June 2005). 34. HealthInsite Disclaimer. Updated March 2005. URL: http://www.healthinsite.gov.au/content/internal/page.cfm?ObjID=0006CF21-0624-1D2D-81CF83032BFA006D (Accessed 22 June 2005). MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 15
  • 16. 35. Bedell SE., Agrawal A, Petersen LE. A systematic critique of diabetes on the world wide web for patients and their physicians. International Journal of Medical Informatics 2004. URL: http://www.e- courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID2004811113029705 (Accessed 1 June 2005). 36. Albert T. Letter to Editor: Transatlantic writing differences are probably exaggerated. BMJ March 2003;326:711. URL: http://bmj.bmjjournals.com/cgi/content/full/326/7391/711 (Accessed 1 June 2005). 37. CHAPTER 4 -READABILITY ASSESSMENT OF HEALTH INFORMATION ON THE INTERNET. URL: http://www.rand.org/publications/documents/interneteval/interneteval.pdf/chap4.pdf (Accessed 23 June 2005) 38. Microsoft® Office Word 2003 (11.5604.5606). Part of Microsoft Office Professional Edition 2003. Copyright © 1983-2003 Microsoft Corporation. 39. APPENDIX A-3. HEURISTIC GUIDELINES FOR EXPERT CRITIQUE OF A WEB SITE. Evaluation Design/Planning and Methodology for the NIH Web Site – Phase 1. URL: http://irm.cit.nih.gov/itmra/weptest/app_a3.htm#usability (Accessed 1 June 2005). 40. Bobby. © 2003-2004 Watchfire Corporation. URL: http://bobby.watchfire.com/bobby/html/en/index.jsp (Accessed 20 June 2005). 41. Quality Criteria for Health Related Websites. Europa European Union Website. Last updated March 2005. URL: http://europa.eu.int/information_society/eeurope/ehealth/quality/draft_guidelines/index_en.htm (Accessed 25 June 2005). 42. Silberg WM, Lundberg GD, Musacchio RA. Assessing, controlling, and assuring the quality of medical information on the Internet: Caveant lector et viewor--Let the reader and viewer beware. JAMA April 1997 16;277(15):1244-5. 43. Managing Web Resources for Persistent Access. National Library of Australia. March 2001. URL: http://www.nla.gov.au/guidelines/persistence.html (Accessed 26 June 2005). 44. MedCIRCLE The Collaboration for Internet Rating,Certification, Labeling and Evaluation of Health Information. Last updated 17 Dec 2002. URL: http://www.medcircle.org/ (Accessed 20 June 2005) 45. Ask Me 3 (Pfizer Inc.): Advancing Clear Health Communication to Positively Impact Health Outcomes (Professional Presentation Tool Kit). Internet document 2003. URL: http://www.askme3.org/PFCHC/professional_presentation.ppt (Accessed 24 June 2005). 46. Eysenbach G. Improving the quality of Web surveys: the Checklist for Reporting Results of Internet E-Surveys (CHERRIES). J Med Internet Res. 2004 Sep 29;6(3):e34. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=15471760&dopt=Abstract (Accessed 20 June 2005). 47. Chestnutt IG. Internet-derived patient information on common oral pathologies: is it readable? Prim Dent Care. 2004 Apr;11(2):51-4. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=15119094 (Accessed 23 June 2005). 48. Clear & Simple: Developing Effective Print Materials for Low-Literate Readers. National Cancer Institute Website. Updated 27 Feb 2003. URL: http://www.cancer.gov/aboutnci/oc/clear-and-simple/ (Accessed 24 June 2005) 49. Office for National Statistics, UK: Adult Literacy Survey: Literacy Level of Adults by Gender and Age. Internet document 1996. URL: http://www.statistics.gov.uk/StatBase/Expodata/Spreadsheets/D5047.xls (Accessed 25 June 2005) 50. Breese P, Burman W. Readability of Notice of Privacy Forms Used by Major Health Care Institutions. JAMA (Reprinted) April 2005; 293 (13):1593-4. 
URL: http://www.e- courses.rcsed.ac.uk/mb3/msgs/dispattachment.asp?AID=AID2005513141638705 (Accessed 1 June 2005). 51. Jaffery JB, Becker BN. Evaluation of eHealth web sites for patients with chronic kidney disease. Am J Kidney Dis. 2004 Jul;44(1):71-6. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=15211440 (Accessed 24 June 2005) 52. Kirksey O, Harper K, Thompson S, Pringle M. Assessment of Selected Patient Educational Materials of Various Chain Pharmacies. J Health Commun. 2004;9(2):91-93. URL: MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 16
  • 17. http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=15204820 (Accessed 24 June 2005) 53. Fallis D, Fricke M. Indicators of accuracy of consumer health information on the Internet: a study of indicators relating to information for managing fever in children in the home. J Am Med Inform Assoc. 2002 Jan-Feb;9(1):73-9. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=pubmed&dopt=Abstract&list_uids=11751805 (Accessed 1 June 2005). 54. Lampe K, Doupi P, van den Hoven MJ. Internet health resources: from quality to trust. Methods Inf Med. 2003;42(2):134-42. URL: http://www.ncbi.nlm.nih.gov/entrez/query.fcgi?cmd=Retrieve&db=PubMed&list_uids=12743649&dopt=Abstract (Accessed 24 June 2005). 55. Morkes J, Nielsen J. Concise, SCANNABLE, and Objective: How to Write for the Web. October 1997. URL: http://www.useit.com/papers/webwriting/writing.html (Accessed 4 June 2004). 56. Eysenbach G. Consumer health informatics. BMJ June 2000;320:1713-1716. URL: http://bmj.bmjjournals.com/cgi/content/full/320/7251/1713 (Accessed 1 June 2005). List of abbreviations AMA: American Medical Association AMOUR: Achievable, Measurable, Observable, Understandable Reasonable CEBM: Centre for Evidence Based Medicine CHERRIES: Checklist for Reporting Results of Internet E-Surveys DTD: Document Type Definition FKGL: Flesch-Kincaid Grade Level FRE: Flesch Reading Ease HIDDEL: Health Information Disclosure, Description and Evaluation Language HONcode: Health On the Net Foundation code of conduct HSWG: Health Summit Working Group HTML: Hypertext Markup Language IE: Internet Explorer MedCERTAIN: MedPICS Certification and Rating of Trustworthy and Assessed Health Information on the Net MedCIRCLE: Collaboration for Internet Rating, Certification, Labeling and Evaluation MS-IE: Microsoft Internet Explorer NeLH: National electronic Library for Health (now, National Library for Health) P3P: Platform for Privacy Preferences Project PICS: Platform for Internet Content Selection SMOG: Simple Measure of Gobbledegook W3C: World Wide Web Consortium WAI: Web Accessibility Initiative OECD: Organisation for Economic Co-operation and Development WIF: Web Impact Factor APPENDICES Appendix-Box-A: Websites included in pilot study 1. MedlinePlus: http://medlineplus.gov/ or http://www.medlineplus.gov (Accessed 1 June 2005) 2. healthfinder®: http://www.healthfinder.gov/ (Accessed 1 June 2005) 3. HealthInsite: http://www.healthinsite.gov.au/ (Accessed 1 June 2005) 4. HealthConnect: http://www.healthconnect.gov.au (Accessed 1 June 2005) 5. NHS Direct Online: http://www.nhsdirect.nhs.uk (Accessed 1 June 2005) 6. NeLH (National electronic Library for Health); now called NLH (National Library of Health): http://www.nelh.nhs.uk/ or http://www.nlh.nhs.uk (Accessed 1 June 2005) 7. Cochrane Library: Through NeLH; this also goes through Wiley Interscience interface http://www.nelh.nhs.uk/cochrane.asp; through Wiley Interscience interface http://www3.interscience.wiley.com/cgi-bin/mrwhome/106568753/HOME or http://www.mrw.interscience.wiley.com/cochrane/ (Accessed 1 June 2005) 8. DIPEx (Database of Individual Patient Experiences): http://www.dipex.org (Accessed 1 June 2005) 9. NIHSeniorHealth: http://nihseniorhealth.gov/ (Accessed 1 June 2005) 10. eMedicine Health: http://www.emedicinehealth.com/ (Accessed 1 June 2005) MSc Healthcare Informatics RCSEd+Univ of Bath; Unit 5-Remote Healthcare; 2005. Tutor: MNK Boulos; Student: Sanjoy Sanyal 17
Appendix-Box-1: Centre for Health Information Quality (C-H-i-Q) checklist13
1. Accessibility: Information is in an appropriate format for the target audience
2. Accuracy: Information is based on the best available evidence
3. Appropriateness: Information communicates relevant messages
4. Availability: Information is available to a wide audience
5. Currency: Information is up-to-date
6. Legibility: Written information is clearly presented
7. Originality: Information is not already produced for the same audience in the same format
8. Patient involvement: Information is specifically designed to meet the needs of patients
9. Reliability: Information addresses all essential issues
10. Readability: Words and sentences are kept short; jargon is minimized

Appendix-Table-1: Scores of Websites in Pilot Study
(HealthConnect was under re-development at the time of review, so its C-H-i-Q criteria could not be scored and its subtotal is 0. The Web Impact Factor score is banded from the AltaVista back-link count: 0-99 = 1; 100-999 = 2; 1,000-9,999 = 3; 10,000-99,999 = 4; 100,000+ = 5.)

Feature | MedlinePlus | healthfinder® | HealthInsite | HealthConnect | NHS Direct Online | NeLH/NLH | Cochrane Library | DIPEx | NIHSeniorHealth | eMedicine Health
Accessibility | 5 | 4 | 5 | - | 4 | 1 | 0 | 2 | 2 | 4
Accuracy | 4 | 3 | 5 | - | 4 | 5 | 5 | 2 | 2 | 4
Appropriateness | 4 | 4 | 5 | - | 5 | 3 | 2 | 1 | 2 | 5
Availability | 5 | 4 | 5 | - | 5 | 2 | 1 | 3 | 2 | 4
Currency | 3 | 4 | 4 | - | 4 | 4 | 4 | 3 | 3 | 3
Legibility | 5 | 2 | 5 | - | 5 | 2 | 2 | 2 | 3 | 5
Originality | 5 | 2 | 5 | - | 5 | 5 | 4 | 4 | 5 | 5
Patient involvement | 5 | 5 | 5 | - | 5 | 2 | 2 | 5 | 5 | 5
Reliability | 3 | 3 | 4 | - | 4 | 3 | 3 | 1 | 2 | 4
Readability | 3 | 3 | 4 | - | 3 | 1 | 1 | 2 | 3 | 4
C-H-i-Q subtotal | 42 | 34 | 47 | 0 | 44 | 28 | 24 | 25 | 29 | 43
Web Impact Factor (AltaVista results; banded score) | 53,100 (4) | 376,000 (5) | 36,000 (4) | 142 (2) | 231,000 (5) | 43,300 / 255 (4/2) | 72 (1) | 1,010 (3) | 1,800 (3) | 56,900 (4)
Homepage HONcode-accredited | 1 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1
Language option(s) | 1 | 1 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0
Breast cancer page HONcode-accredited | 0 | 1 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 1
Additional features | 0 | 0 | 0 | 0 | 1 (language options, size, contrast, speech) | 0 | 0 | 0 | 1 (text in audio clips) | 1 (ICRA label)
Miscellaneous subtotal | 6 | 8 | 7 | 2 | 6 | 4 | 2 | 3 | 4 | 7
Total score | 48 | 42 | 54 | 2 | 50 | 32 | 26 | 28 | 33 | 50
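The arithmetic behind Appendix-Table-1 is a simple sum of the C-H-i-Q subtotal and a miscellaneous subtotal, in which the AltaVista back-link count is first banded into a 1-5 Web Impact Factor score. A minimal Python sketch of that calculation follows; the function and variable names are illustrative and were not part of the original study tooling.

# Illustrative sketch (not part of the original study tooling) of how the
# totals in Appendix-Table-1 are derived. The Web Impact Factor banding
# follows the rule stated in the table:
# 0-99 = 1; 100-999 = 2; 1,000-9,999 = 3; 10,000-99,999 = 4; 100,000+ = 5.

def wif_band(backlink_count):
    """Map an AltaVista back-link count to the 1-5 Web Impact Factor band."""
    if backlink_count >= 100000:
        return 5
    if backlink_count >= 10000:
        return 4
    if backlink_count >= 1000:
        return 3
    if backlink_count >= 100:
        return 2
    return 1

def site_total(chiq_scores, backlinks, homepage_hon, language_opts,
               breast_cancer_hon, extra_features):
    """Total score = C-H-i-Q subtotal (ten criteria) + miscellaneous subtotal."""
    chiq_subtotal = sum(chiq_scores)
    misc_subtotal = (wif_band(backlinks) + homepage_hon + language_opts
                     + breast_cancer_hon + extra_features)
    return chiq_subtotal + misc_subtotal

# Example: HealthInsite's row from Appendix-Table-1 reproduces its total of 54.
print(site_total([5, 5, 5, 5, 4, 5, 5, 5, 4, 4], 36000, 1, 1, 1, 0))  # 54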
Appendix-Box-2: Minervation tool parameters15
Level 1 (Accessibility) (maximum 63 points)
1. Page setup | automated test (the four automated tests together carry a maximum of 57 points)
2. Access restrictions | automated test
3. Outdated code | automated test
4. Dublin Core title tags | automated test
5. Browser test (maximum 3 points)
6. Registration (maximum 3 points)
Level 2 (Usability) (maximum 54 points)
1. Clarity (6 questions; maximum 18 points)
2. Consistency (3 questions; maximum 9 points)
3. Functionality (5 questions; maximum 15 points)
4. Engagibility (4 questions; maximum 12 points)
Level 3 (Reliability) (maximum 27 points + 24 supplemental points)
1. Currency (3 questions; maximum 9 points)
2. Conflicts of interest (3 questions; maximum 9 points)
3. Content production (3 questions; maximum 9 points)
4. Content production procedure - supplemental (5 questions; maximum 15 points)
5. Output of content - supplemental (3 questions; maximum 9 points)

Appendix-Box-3: Flesch Reading Ease (FRE) score
This readability score is normally used to assess adult materials.21 It bases its rating on the average number of syllables per word (ASW) and the average sentence length in words (ASL). It rates text on a scale of 0 to 100; the higher the score, the easier the document is to understand. The score for 'plain English' is 65; FRE scores below 30 indicate extremely difficult reading, as in a legal contract.22
Formula: FRE = 206.835 - (1.015 x ASL) - (84.6 x ASW), where ASL = number of words / number of sentences and ASW = number of syllables / number of words.

Appendix-Box-4: Flesch-Kincaid Grade Level (FKGL) score
This score is most reliable when used with upper elementary and secondary materials.21 It also bases its rating on ASL and ASW, and rates text on a U.S. grade-school level (a rough measure of how many years of schooling it would take someone to understand the content, with a top score of 12). A score of 5.0 means that a ten-year-old fifth grader can understand the document; for most standard documents the aim should be a score of approximately 5.0.
Formula: FKGL = (0.39 x ASL) + (11.8 x ASW) - 15.59

Appendix-Box-5: Gunning's Fog Index
The Fog Index is widely used in the health care and general insurance industries for general business publications.21 Fog scores above 16 indicate extremely difficult reading, as in a legal contract.22
Calculating the Fog Index:18
(A) Divide the total number of words by the total number of sentences to obtain the average number of words per sentence.
(B) Divide the number of words of three or more syllables by the total number of words to obtain the percentage of difficult words.
(C) Add (A) and (B) and multiply the sum by 0.4; the result is the Fog Index, expressed in years of education.
Related formulae:19
Fog Index = 0.4 x (words/sentences + 100 x (words of >= 3 syllables)/words)
ARI = 4.71 x characters/words + 0.5 x words/sentences - 21.43
Coleman-Liau = 5.89 x characters/words - 0.3 x sentences/(100 x words) - 15.8
Lix = words/sentences + 100 x (words of >= 6 characters)/words
SMOG Grading = square root of (30 x (words of >= 3 syllables)/sentences) + 3

Appendix-Box-5a: Configuring MS Word to display readability scores
Tools menu → Options → Spelling & Grammar tab; select the 'Check grammar with spelling' check box; select the 'Show readability statistics' check box; click OK.
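For illustration, the formulae quoted above translate directly into code. The following is a minimal Python sketch, not part of the original study tooling: it assumes the caller supplies the raw counts (words, sentences, syllables, characters, long and complex words), since it is chiefly in the counting of syllables and sentences that MS Word and online calculators tend to differ.

import math

# Minimal sketch of the readability formulae quoted in Appendix-Boxes 3-5.
# The raw counts are assumed to be supplied by the caller.

def flesch_reading_ease(words, sentences, syllables):
    asl = words / sentences          # average sentence length
    asw = syllables / words          # average syllables per word
    return 206.835 - (1.015 * asl) - (84.6 * asw)

def flesch_kincaid_grade(words, sentences, syllables):
    asl = words / sentences
    asw = syllables / words
    return (0.39 * asl) + (11.8 * asw) - 15.59

def gunning_fog(words, sentences, complex_words):
    # complex_words = number of words of three or more syllables
    return 0.4 * (words / sentences + 100.0 * complex_words / words)

def ari(chars, words, sentences):
    return 4.71 * chars / words + 0.5 * words / sentences - 21.43

def coleman_liau(chars, words, sentences):
    return 5.89 * chars / words - 0.3 * sentences / (100.0 * words) - 15.8

def lix(words, sentences, long_words):
    # long_words = number of words of six or more characters
    return words / sentences + 100.0 * long_words / words

def smog_grading(complex_words, sentences):
    return math.sqrt(30.0 * complex_words / sentences) + 3.0

# Example: 250 words, 12 sentences, 420 syllables, 38 words of >= 3 syllables.
print(round(flesch_reading_ease(250, 12, 420), 1))   # approx. 43.6
print(round(flesch_kincaid_grade(250, 12, 420), 1))  # approx. 12.4
print(round(gunning_fog(250, 12, 38), 1))            # approx. 14.4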
Appendix-Table-2: Comparison on the basis of the Minervation online tool

Parameter | HealthInsite | eMedicine
Level 1 (Accessibility) (maximum 63 points) | |
First four automated tests (maximum 57 points) | 50 | 28
Browser test (maximum 3 points) | 3 | 3
Registration (maximum 3 points) | 3 | 3
Subtotal (% of 63) | 56 (88.9%) | 34 (54%)
Level 2 (Usability) (maximum 54 points) | |
Clarity (6 questions; maximum 18 points) | |
Is there a clear statement of who this web site is for? | 3 | 0
Is the level of detail appropriate to their level of knowledge? | 3 | 2
Is the layout of the main block of information clear and readable? | 2 | 2
Is the navigation clear and well structured? | 3 | 3
Can you always tell your current location in the site? | 2 | 3
Is the colour scheme appropriate and engaging? | 2 | 2
Consistency (3 questions; maximum 9 points) | |
Is the same page layout used throughout the site? | 2 | 3
Do navigational links have a consistent function? | 3 | 3
Is the site structure (categories or organisation of pages) applied consistently? | 3 | 3
Functionality (5 questions; maximum 15 points) | |
Does the site provide an effective search facility? | 3 | 3
Does the site provide effective browsing facilities? | 3 | 3
Does the design minimise the cognitive overhead of using the site? | 2 | 2
Does the site support the normal browser navigational tools? | 2 | 2
Can you use the site without third party plug-ins? | 3 | 3
Engagibility (4 questions; maximum 12 points) | |
Can the user make an effective judgment of whether the site applies to them? | 3 | 2
Is the web site interactive? | 3 | 0
Can the user personalise their experience of using the site? | 3 | 1
Does the web site integrate non-textual media? | 0 | 2
Subtotal (% of 54) | 45 (83.3%) | 39 (72.2%)
Level 3 (Reliability) (maximum 27 points) | |
Currency (3 questions; maximum 9 points) | |
Does the site respond to recent events? | 3 ('News' link) | 2 (eNewsletter would be sent)
Can users submit comments on specific content? | 3 ('Consumer participation' link) | 0
Is site content updated at an appropriate interval? | 3 (mentioned) | 1
Conflicts of interest (3 questions; maximum 9 points) | |
Is it clear who runs the site? | 3 (Australian government) | 3 (private company)
Is it clear who pays for the site? | 3 (Australian government) | 0 (cannot tell)
Is there a declaration of the objectives of the people who run the site? | 3 | 3
Content production (3 questions; maximum 9 points) | |
Does the site report a clear content production method? | 3 ('About HealthInsite' link) | 3 ('About us' link)
Is this a robust method? | 2 | 2
Can the information be checked from original sources? | 0 (cannot tell) | 0 (cannot tell)
Subtotal (% of 27) | 23 (85.2%) | 14 (51.8%)
Grand total (% of 144) | 124 (86.1%) | 87 (60.4%)
Appendix-Table-3: Automated Accessibility results of HealthInsite and eMedicine (Minervation)

HealthInsite (http://www.healthinsite.gov.au scores 87%; rating Medium)
Pass-rate summary: 1.1 Page Setup ~80% (Medium); 1.2 Access Restrictions ~66% (Medium); 1.3 Outdated Code ~100% (High); 1.4 Dublin Core Tags ~100% (High)
1.1 Page Setup (80%): 1.1.1 Document Type Definition 3; 1.1.2 HTTP-Equiv Content-Type (in header) 0; 1.1.3 HTML Language Definition 3; 1.1.4 Page Title 3; 1.1.5 Meta Tag Keywords 3
1.2 Access Restrictions (66%): 1.2.1 Image Alt Tags 3; 1.2.2 Specified Image Widths 2; 1.2.3 Table Summaries 0; 1.2.4 Frames 3
1.3 Outdated Code (100%): 1.3.1 Body Tags - Body Background Colour 3; 1.3.2 Body Tags - Body Topmargin 3; 1.3.3 Body Tags - Body Margin Height 3; 1.3.4 Table Tags - Table Background Colour 3; 1.3.5 Table Tags - Table Column (td) Height 3; 1.3.6 Table Tags - Table Row (tr) Height 3; 1.3.7 Font Tags - Font Color 3; 1.3.8 Font Tags - Font Size 3; 1.3.9 Align (non style sheet) 3
1.4 Dublin Core Tags (100%): 1.4.1 Dublin Core Title Tag 3
Accessibility / total rating: 87% (50/57)

eMedicine (http://www.emedicinehealth.com/ scores 49%; rating Low)
Pass-rate summary: 1.1 Page Setup ~60% (Medium); 1.2 Access Restrictions ~50% (Low); 1.3 Outdated Code ~48% (Low); 1.4 Dublin Core Tags ~0% (Low)
1.1 Page Setup (60%): 1.1.1 Document Type Definition 0; 1.1.2 HTTP-Equiv Content-Type (in header) 3; 1.1.3 HTML Language Definition 0; 1.1.4 Page Title 3; 1.1.5 Meta Tag Keywords 3
1.2 Access Restrictions (50%): 1.2.1 Image Alt Tags 1; 1.2.2 Specified Image Widths 2; 1.2.3 Table Summaries 0; 1.2.4 Frames 3
1.3 Outdated Code (48%): 1.3.1 Body Tags - Body Background Colour 0; 1.3.2 Body Tags - Body Topmargin 0; 1.3.3 Body Tags - Body Margin Height 0; 1.3.4 Table Tags - Table Background Colour 2; 1.3.5 Table Tags - Table Column (td) Height 3; 1.3.6 Table Tags - Table Row (tr) Height 3; 1.3.7 Font Tags - Font Color 0; 1.3.8 Font Tags - Font Size 3; 1.3.9 Align (non style sheet) 2
1.4 Dublin Core Tags (0%): 1.4.1 Dublin Core Title Tag 0
Accessibility / total rating: 49% (28/57)
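The Level-1 automated tests above look for specific markup features: a document type definition, an HTTP-Equiv content type, an HTML language attribute, a page title, meta keywords, image alt text and a Dublin Core title tag. The snippet below is only a rough approximation of that kind of check, written for illustration with Python's standard re module; it is not the Minervation implementation, and the check names and the pass/fail granularity are assumptions.

import re

# Rough approximation, for illustration only, of the markup checks behind the
# Level-1 automated tests in Appendix-Table-3. The real tool awards graded
# 0-3 scores; these hypothetical checks simply report whether a feature exists.

CHECKS = {
    "document_type_definition": lambda html: bool(re.search(r"<!DOCTYPE", html, re.I)),
    "http_equiv_content_type": lambda html: bool(re.search(r'http-equiv=["\']?content-type', html, re.I)),
    "html_language_definition": lambda html: bool(re.search(r"<html[^>]*\blang=", html, re.I)),
    "page_title": lambda html: bool(re.search(r"<title>\s*\S", html, re.I)),
    "meta_keywords": lambda html: bool(re.search(r'<meta[^>]*name=["\']?keywords', html, re.I)),
    # True when every <img> tag carries an alt attribute (vacuously true if no images)
    "image_alt_tags": lambda html: all("alt=" in tag.lower() for tag in re.findall(r"<img[^>]*>", html, re.I)),
    "dublin_core_title_tag": lambda html: bool(re.search(r'name=["\']?DC\.title', html, re.I)),
}

def run_checks(html):
    """Report which of the markup features each check looks for are present."""
    return {name: check(html) for name, check in CHECKS.items()}

sample = '<!DOCTYPE html><html lang="en"><head><title>Breast cancer</title></head><body></body></html>'
print(run_checks(sample))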
Appendix-Table-4: Comparison of Content category (Net Scoring)

Content (information) quality (Content category) (maximum 87 points)
Criterion | HealthInsite (Australia) | eMedicine Consumer Health (USA)
Accuracy (essential criterion) | 9 | 9
Hierarchy of evidence (important criterion) | 6 ('Reviews of Evidence for Treatments') | 0 (not specified in any page)
Original source stated (essential criterion) | 9 (most pages mentioned it) | 0 (not specified in any page)
Disclaimer (important criterion) | 6 (disclaimer provided) | 6 (disclaimer provided)
Logical organisation (navigability) (essential criterion) | 7 (pages redirected to partner websites, with notification) | 9
Quality of the internal search engine (important criterion) | 6 | 6
General index (important criterion) | 6 | 6
'What's new' page (important criterion) | 4 ('News' and 'HealthInsite Newsletter') | 3 ('eMedicine Spotlight')
Help page (minor criterion) | 3 | 0
Map of the site (minor criterion) | 3 | 3
Omissions noted (essential criterion) | 0 (none) | 0
Fast load of the site and its different pages (important criterion) | 6 | 4 (ads reduced loading speed)
Clear display of available information categories (factual data, abstracts, full-text documents, catalogue, databases) (important criterion) | 0 (none) | 0
SUBTOTAL (% of 87) | 65 (74.7%) | 46 (52.9%)

Appendix-Table-5: Comparison of Credibility category (Net Scoring)

Completeness / currency / usefulness of information (Credibility category) (maximum 99 points)
Criterion | HealthInsite (Australia) | eMedicine Consumer Health (USA)
Name, logo and references of the institution on each document of the site (essential criterion) | 9 (all pages, including partner sites, had them) | 9 (all pages)
Name and title of author on each document of the site (essential criterion) | 0 (none mentioned) | 0 (none mentioned)
Context: source of financing, independence of the author(s) (essential criterion) | 0 (none mentioned) | 0 (none mentioned)
Conflict of interest (important criterion) | 0 (none mentioned) | 0 (none mentioned)
Influence, bias (important criterion) | 0 (none mentioned) | 3 (mentioned partly in the disclaimer)
Updating: currency information of the site (essential criterion) | 9 | 6 (some pages mentioned it)
- date of creation | Yes | Yes
- date of last update / last version | Yes | No
Relevance/utility (essential criterion) | 9 (for public information) | 8 (for the public + healthcare professionals)
Editorial review process (essential criterion) | 9 (mentioned) | 9 (mentioned)
Webmastering process (important criterion) | 1 (mentioned in one partner site) | 0 (not mentioned anywhere)
Scientific review process (important criterion) | 6 ('Reviews of Evidence for Treatments') | 0 (not mentioned)
Target/purpose of the web site; access to the site (free or not, reserved or not) (important criterion) | 6 (free access to all pages) | 4 (site had a 'Registration' link; general public information could be freely accessed; sponsored links present)
Quality of the language and/or translation (important criterion) | 6 (good language; other language options provided) | 3 (good language; no other language options)
Use of metadata (essential criterion) | 0 | 10 (ICRA label v02)
SUBTOTAL (% of 99) | 55 (55.5%) | 52 (52.5%)
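The Net Scoring subtotals follow a simple weighting: judging from the point values used in Appendix-Tables 4-6, essential criteria are scored out of 9, important criteria out of 6 and minor criteria out of 3, and criteria judged not applicable are removed from the attainable maximum. A minimal sketch of that arithmetic, under those assumptions and with illustrative names, follows.

# Minimal sketch of the Net Scoring arithmetic, under the assumption (taken
# from the point values in Appendix-Tables 4-6) that essential criteria are
# scored out of 9, important out of 6 and minor out of 3, with not-applicable
# criteria (None) removed from the attainable maximum.

MAX_POINTS = {"essential": 9, "important": 6, "minor": 3}

def category_score(criteria):
    """criteria: list of (weight, points_awarded_or_None) tuples."""
    attainable = sum(MAX_POINTS[w] for w, pts in criteria if pts is not None)
    awarded = sum(pts for _, pts in criteria if pts is not None)
    return awarded, attainable, round(100.0 * awarded / attainable, 1)

# Example: the Hyperlinks category for HealthInsite from Appendix-Table-6.
hyperlinks_healthinsite = [
    ("essential", 9),    # selection
    ("important", 6),    # architecture
    ("essential", 9),    # content
    ("important", 4),    # Web Impact Factor (back-links)
    ("important", 0),    # regular verification of links
    ("important", None), # old/new document linking - not applicable
    ("minor", 3),        # internal vs external link distinction
]
print(category_score(hyperlinks_healthinsite))  # (31, 39, 79.5)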
Appendix-Table-6: User interface / Ease of finding information / Usability (Net Scoring)

Criterion | HealthInsite (Australia) | eMedicine Consumer Health (USA)
Hyperlinks category (maximum 45 points; minus 6 points for the NA parameter, so maximum 39 points) | |
Selection (essential criterion) | 9 | 9
Architecture (important criterion) | 6 | 4 (hyperlinks were a bit cluttered)
Content (essential criterion) | 9 | 9
Web Impact Factor: back-links (important criterion) | 4 (36,000 results from AltaVista) | 4 (56,900 results from AltaVista)
Regular verification that hyperlinks are functioning, i.e. no broken links (important criterion) | 0 (not mentioned, though no broken links were encountered) | 0 (not mentioned, though no broken links were encountered)
In case of modification of the site structure, link between old and new HTML documents (important criterion) | NA (not applicable) | NA
Distinction between internal and external hyperlinks (minor criterion) | 3 (specified) | 3 (there were no separate hyperlinks)
SUBTOTAL (% of 39) | 31 (79.5%) | 29 (74.4%)
Design category (maximum 21 points) | |
Design of the site (essential criterion) | 9 (neat and trim, user-friendly) | 7 (somewhat cluttered, likely to be confusing to some)
Readability of the text (important criterion) | 3 (see readability scores) | 4 (see readability scores)
Quality of the print (important criterion) | 3 (combination of Tahoma [font 9, 9.5, 10, 11.5], Verdana [font 7, 9.5, 12] and Arial [font 10]) | 5 (only Times New Roman [font 12] for headings and Verdana [font 7.5] for text)
SUBTOTAL (% of 21) | 15 (71.4%) | 16 (76.2%)
Accessibility category (maximum 12 points) | |
Accessibility from the main search engines and catalogues (important criterion) | 6 (dual mode of access - from the search box and from the A-Z site map; the latter gave a more logical arrangement of topics; the search engine ranked results by relevance) | 6 (same arguments apply)
Intuitive address of the site (important criterion) | 0 (not present) | 0
SUBTOTAL (% of 12) | 6 (50%) | 6 (50%)
Interactivity category (maximum 18 points) | |
Feedback mechanism: email of author on every document (essential criterion) | 5 ('Feedback'/'Contact us' links in main pages and some partner site pages; no author or contact info) | 5 ('Contact us' links in all site pages; no author or contact info)
Forums, chat (minor criterion) | 2 ('Consumer participation' link) | 0 (none)
Traceability, cookies etc. (important criterion) | 6 (cookies etc. specified) | 6 (same points)
SUBTOTAL (% of 18) | 13 (72.2%) | 11 (61.1%)
Ethics category (maximum 18 points) | |
Liability of the reader (essential criterion) | 9 ('Disclaimer' link) | 9 ('Disclaimer' link)
Medical privacy (essential criterion) | 9 ('Privacy' link) | 9 ('Privacy' link)
SUBTOTAL (% of 18) | 18 (100%) | 18 (100%)

Appendix-Box-6: User interface design principles by Nielsen (1994)39
1. Visibility of system status: The system should keep users informed about what is going on through appropriate, timely feedback.
2. System and real world match: Follow real-world conventions, making information appear in a natural and logical order.
3. User freedom: Users need a clearly marked 'emergency exit' from mistakes. Support undo and redo.
4. Consistency and standards: Follow platform conventions to avoid confusion among users.
5. Error prevention: Careful design prevents problems from occurring in the first place.
6. Recognition rather than recall: Make objects, actions and options visible. Instructions for use of the system should be visible.
7. Flexibility and efficiency: Allow users to tailor frequent actions.
8. Aesthetic design: Dialogues should not contain information that is irrelevant or rarely needed.
9. Help users recognize and recover from errors: Error messages should express the problem, and a solution, in plain language.
10. Help and documentation: Any such information should be easy to search, focused on the user's task, list concrete steps to be carried out, and not be too large.
Usability principles by Constantine (1994)39
A. Structure Principle: Organize the user interface purposefully, putting related things together and separating unrelated things.
B. Simplicity Principle: Make common tasks simple to do, communicate simply in the user's own language, and provide good shortcuts.
C. Visibility Principle: Keep all options and materials for a given task visible.
D. Feedback Principle: Keep users informed of actions/interpretations, changes of state/condition, and errors/exceptions.
E. Tolerance Principle: Be flexible and tolerant, reducing the cost of mistakes and misuse by allowing undoing and redoing while preventing errors.
F. Reuse Principle: Reduce the need for users to rethink and remember by reusing internal and external components and behaviors.
Appendix-Box-7: Usability metrics for objectively measuring Web content (Morkes and Nielsen)55
1. Task time: Number of seconds taken to find answers for tasks
2. Task errors: Percentage score based on the number of incorrect answers
3. Memory: Recognition and recall
   a. Recognition memory: A percentage score based on the number of correct answers minus the number of incorrect answers to questions
   b. Recall memory: A percentage score based on the number of pages correctly recalled minus the number incorrectly recalled
4. Sitemap time:
   a. Time to recall site structure: The number of seconds taken to draw a sitemap
   b. Sitemap accuracy: A percentage score based on the number of pages and connections between pages correctly identified, minus the number of pages and connections incorrectly identified
5. Subjective satisfaction: The subjective satisfaction index is the mean score of four indices - Quality, Ease of use, Likeability and User effect

Appendix-Box-8: Various readability tools, formulae and software18-21,37
• Dale-Chall: The original vocabulary-based formula, used to assess upper elementary through secondary materials
• Fry Graph: Used over a wide grade range of materials, from elementary through college and beyond
• Powers-Sumner-Kearl: For assessing primary through early elementary level materials
• FORCAST: Focuses on functional literacy; used to assess non-running narrative, e.g. questionnaires, forms and tests
• Spache: The original vocabulary-based formula widely used in assessing primary through fourth grade materials
• McLaughlin's SMOG (Simple Measure of Gobbledegook): Unlike the other formulas, SMOG predicts the grade level required for 100% comprehension
• Cloze procedure: The 'cloze' procedure (from the word 'closure') for testing writing is often treated as a readability test because a formula exists for translating the data from cloze tests into numerical results
• Lexiles Framework®: A software tool that measures readability in both English and Spanish
• ARI: The Automated Readability Index is typically higher than Kincaid and Coleman-Liau, but lower than Flesch
• Coleman-Liau: Usually gives a lower grade than Kincaid, ARI and Flesch when applied to technical documents
• Lix: Developed by Bjornsson in Sweden; very simple, and also employs a mapping table
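For completeness, the Morkes-Nielsen measures in Appendix-Box-7 reduce to straightforward arithmetic. The sketch below uses assumed parameter names and normalising denominators; the cited paper55 defines the exact scoring protocol.

# Small sketch of the Morkes-Nielsen measures summarised in Appendix-Box-7.
# Parameter names and the normalising denominators are assumptions made for
# illustration only.

def net_percentage(correct, incorrect, total):
    """Recognition/recall memory and sitemap accuracy: correct items minus
    incorrect items, expressed as a percentage of the items asked about."""
    return 100.0 * (correct - incorrect) / total

def task_error_rate(incorrect_answers, tasks):
    """Task errors: percentage of task answers that were incorrect."""
    return 100.0 * incorrect_answers / tasks

def subjective_satisfaction(quality, ease_of_use, likeability, user_effect):
    """Subjective satisfaction index: mean of the four sub-indices."""
    return (quality + ease_of_use + likeability + user_effect) / 4.0

print(net_percentage(correct=8, incorrect=1, total=10))  # 70.0
print(task_error_rate(incorrect_answers=2, tasks=10))    # 20.0
print(subjective_satisfaction(4.0, 3.5, 4.5, 4.0))       # 4.0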