Performance with Website Analysis in an EFL Classroom: Exploring Consistency in Coding
Takahide Ishii (s1170047), Supervised by Prof. Debopriyo Roy
University of Aizu, Graduation Thesis. March, 2012
Abstract
Literature on computer assisted language learning
is mostly silent on how web-based design analysis
could be effectively used as a tool and framework for
developing critical thinking skills and language
proficiency in an EFL classroom. This article
reports on how EFL learners perform with English
website analysis tasks in a language reception and
production context. English website analysis is
challenging for an EFL learner with low-moderate
English language proficiency. The website analysis
experiment with the Belize tourism website as
reported in this article was performed with a group of
16 students in an EFL classroom. The results mainly
discussed students' preliminary understanding of the
website content, design, navigation and usability,
rather than the aspects of how their use of English
(grammatically) for responding to design queries
during website analysis was correct or
deficient. Further, this study also analyzed how the
three coders, themselves non-native speakers of
English, who were employed to analyze the responses
to the open-ended design questions asked of the
participants, interpreted those responses based on the
grading rubric devised for the purpose. Results show relatively
higher levels of proficiency when answering
questions related to overall website organization,
design, layout and audience analysis. However,
performance scores dropped for more inference-based
queries related to the overall use of technology, the
validity of the content, etc. Some relative variations in
scoring could be observed between the coders, with
relatively large variation in the scores in the case of
Coder A compared to Coders B and C.
1 Introduction
There is substantial research in language studies
and cognition that establishes cognition and language
development to be closely related (Liaw et al., 2007).
Theorists and educators have long argued that
language and thinking skills are closely
related (Piaget, 1971; Vygotsky, 1962). It is
believed that developing students’ ability to reflect on
their own learning process can help them progress in
learning. The literature in foreign language studies has
clearly established a relation showing how higher-order
thinking skills promote higher-order learning
skills which in turn enable students to reach higher
levels of language proficiency (Renner, 1996).
Educators have identified multiple features and
elements of reading and writing that have strongly
influenced thinking skills (Moffett &
Wagner, 1983; Pearson & Tierney, 1984; Stanford &
Roark, 1974; Staton, 1984). There has always been a
strong appeal to promote higher order thinking in
ESL and EFL classrooms, and research has clearly
focused on the need to foster critical thinking in a
foreign language classroom (Chamot, 1995; Tarvin &
Al-Arishi, 1991; Chapple & Curtis, 2000; Davidson,
1994, 1995). Unfortunately, however, language
learning and thinking skills have almost always been
treated as independent processes (Miraman &
Tishman, 1988; Suhor, 1984; Pica, 2000).
This study on website analysis as a tool for critical
thinking in an EFL classroom is partly influenced by
the Kasper (2000) study which focused on extensive
and sustained content analysis using information
technology resources. Kasper's study establishes that such
attempts help with both linguistic and cognitive
information processing. This study is also
influenced by the fact that we have shifted from Web
1.0 towards Web 2.0 where there is an increased
emergence of computer-mediated communication,
social networking and active interaction between the
user and the web environment. Website analysis in
this environment helps EFL readers not only to scan
the website under consideration, but also encourages
them to access online discussion forums, testimonials
on the web, and additional resources from other sources,
to talk live with a service agent, to chat, etc. It brings to life
an environment which was long considered passive,
merely generating information which it was
completely up to the user to receive; information
reception was one-way traffic.
However, in this web-based communicative
environment, English website analysis is challenging
for an EFL learner with low-moderate English
language proficiency. English website analysis in a
typical EFL environment involves reading and
comprehending the website content in English,
performing selective translation of the content into the
native language, or switching back and forth between
the English and native versions if available,
comprehending the design queries and their scope in the
native language, framing the responses to the design
questions mentally in their native language, and then
some will try to respond directly in English, while
others will take the help of translation software
(Google Translate in most cases), and online and/or
portable dictionaries (e.g., Weblio). Besides, because
of the way a typical Web 2.0 environment is used, a
typical user might also get involved in searching and
reading comments in English or their native language
about a specific place, hotel, location etc. The
website analyzer might also want to ask questions to
a service agent. On a different note, if the
EFL website analyzer is not sure how to
approach a website design query being asked, he/she
could also start reading about website design in general.
The point here is that we can never be exactly sure
about the combination of tools being used for critical
thinking and language processing during website
analysis in an EFL context. The individual difference
in responses to website analysis could arise not only
from ability to think through the problem, but also
due to the difference in the online resources solicited
during the processing of a response.
Whatever may be the case, in a language context,
website or any other interface analysis with specific
design-based queries might be one way to promote
analytical thinking through its focus on creating,
evaluating and analyzing (Atherton, 2002) and
promotes active participation, argumentation,
problem solving, conducting investigations and
tackling subject matter that is complex (Tytler, 2004).
The experiment with website analysis performed
with a group of 16 students in an EFL classroom will
equip students to better understand the interfaces they
use for e-learning applications. The analysis of the
results from the experiment reported in this article
mainly discussed students' preliminary understanding
of the website content, design, navigation and
usability, rather than the aspects of how their use of
English (grammatically) for responding to design
queries during website analysis was correct or
deficient. Further, this study is also designed to
analyze how coders who are non-native speakers of
English, employed to analyze the responses to the open-
ended design questions, interpret those responses based on
the criteria grading rubric used for the reported
experiment.
The major point in the literature review concerning
the need to think and analyze in the target language is
more relevant since the entire analysis was conducted
in English for non-native speakers. This study is
important for various reasons:
● The study focuses on the extent to which
readers could process a critical response to a
website design query.
● The study tries to identify whether readers are
able to understand suitably and differentiate
between design queries asked during website
analysis.
● The study tries to identify the extent to which
novice website designers in an EFL context are
able to assess a design response based on a
specific criteria list.
2 Research Questions
The following research questions provide the
backbone of the experiment reported in this study.
● How did the EFL readers perform with various
design questions as asked during website
analysis?
● Is there any significant difference between the
coders who graded responses to the design
questions, which would suggest either a
significant difference between responses to a
design question, or that one or more coders
did not understand the questions and
responses correctly and, as a result, could not
use the assessment rubric correctly when
grading the design responses?
This is an exploratory analysis because the
literature on computer assisted language learning is
not rich on how website analysis could be used as a
tool for promoting critical thinking and language
proficiency in a language classroom.
3 Review of the Literature
Using the Internet for ESL/EFL (English as a
second/foreign language) writing instruction is a
common practice now (Krajka, 2000). The issue of
using web pages for teaching writing is raised in Tan
et al. (1999). Trokeloshvili and Jost (1997) concluded
that public displaying of student text on a student
home page highly motivates students to conduct
writing and publishing, and helps to remove mental
blocks associated with publishing ordinary writing.
There is research indicating that web analysis has the
potential to be a beneficial exercise (Bunz, 2001;
Spyridakis, 2000) and more so in an EFL context.
The information processing strategies reported
earlier will influence readers' ability to analyze
information organization, design and layout,
grouping, navigation, audience analysis, etc. Nielsen
(1997) has demonstrated that the website analysis
task is different from any other reading task, because
it requires an analytical mindset, analysis and
resultant English text production in a specific design
context. Also, ability to explain a design and layout
might not always incorporate reading and
comprehending the entire text in the web page.
Readers might get away with merely understanding
the headlines, the menu items, the introductory
sentence of a paragraph, etc. (Nielsen’s Alertbox,
1997). Research (Lynch et al., 2001) suggests that
extending critical thinking skills to the web is
important in a first language context and there is
nothing in the literature to suggest why the same
argument should not be valid in a foreign language
context.
Van Hoosier-Care (1997) describes the website
assignment as a rhetorical exercise in the technical
communication classroom. It is important for the
reader to understand the conceptual process of
designing a website, including the rationale of the
project, the target audience, the purpose of the website, etc.
(December and Ginsberg, 1995). The experimental
and goal-oriented nature of web design projects
involves tasks such as deciding with a partner where
to place a picture on a page being constructed, or
browsing, which requires active choices of where to
search next. These are claimed to help promote
higher-order thinking skills (Mike, 1996), which
include reviewing, scanning, selecting and
negotiating; the research and rhetorical skills that may
be developed are particularly important for EFL
students pursuing further studies in other disciplines.
Furthermore, Warschauer (1997) points out that web
design skills incorporate 'situated learning': that
which allows students: "to carry out meaningful tasks,
and solve meaningful problems, in an environment
that reflects their own personal interests as well as the
multiple purposes to which their knowledge will be
put in the future" (Collins, Brown, & Newman, 1989).
With the goal of designing and publishing web pages,
students can actively make use of new technologies,
skills, and knowledge. Warschauer (1997) also
acknowledges this, and supports the view that many
skills, in particular those involved in
collaboratively accessing and interpreting worldwide
information and communicating with people from different
cultures, will be critical for success in the 21st century.
The design and other questions asked during this
website analysis were based on the model proposed
by Garrett (2011). Specific questions were designed
based on audience and task analysis, product goals,
information design, interaction design, information
architecture, etc. Figure 1 shows an explanation of
Garrett’s (2011) user experience model that has been
simplified and referred to as part of this study.
Figure 1. Garrett’s (2011) User Experience Model:
An Explanation (source: http://www.netmagazine.com/features/content-first-content-left-right-and-centre)
4 Sample and Context
Participants (N=17) were junior-level students (age
group: 18-20 years) in the third year of an undergraduate
program specializing in computer science at a
Japanese technical university. With this specific
elective course named Writing and Design for World
Wide Web, students mostly focused on the process of
online writing, designing and analyzing websites
based on design principles, besides designing concept
maps on websites they analyzed. So, for most weeks
during the course, there was an all-round effort to
sharpen student skills on writing and thinking.
Students were given brief lectures on basic design
principles for website design, followed by at least
four weeks of regular practice on website design,
analysis and brainstorming activities on website
content which included designing concept maps.
5 Methods
5.1 Preparing for the experiment
During the first couple of weeks into the course, a
proper in-class lecture was delivered on the basics of
website design. The lecture focused on how issues
like organization, layout, formatting, and typography,
content chunks, simple wording, headings, titles, use
of white space, etc. are important design
considerations.
5.2 In-class Website Analysis
Assignment
As part of the website analysis assignment (named
Assignment A), students were asked to study a
specific website in a chosen domain (e.g., education,
entertainment, government, tourism, sports etc) and
then provide open-ended responses to 8 standard
questions asked of them. The questions were related
to content, presentation, navigation, technology used,
real-world application of content, website and content
usability, audience analysis and product goals.
Readers had one full week to complete this
assignment. Readers confronted the same set of
design questions every week, but the website to be
analyzed changed every week.
5.3 Actual Experiment
The experiment was conducted in a controlled
environment as an in-class activity, and over two
weeks. The first part of the website analysis activity
was conducted entirely in Moodle. The actual
experiment ran over two weeks and started during the
7th week into the course, at the time when students
already had two weeks of design lecture and four
weeks of experience with website design, planning
and analysis. During the 1st week of the actual
experiment, students analyzed the Belize tourism
website based on the same 8 open-ended questions
asked during the assignment over the
previous weeks (but with a different website each
week), and they had one week to complete the
analysis, besides the 90 minutes of class time where
they could consult their friends. Students entered
their responses in Moodle (learning management
system) as in-line text in open-ended format. To
encourage writing and proper explanation, the
minimum word limit for the assignment was set at
500 words. They had to write the responses in their
own words.
5.4 Instruments
The Belize tourism website was chosen with the
following reasonable conditions in mind.
● The content in the website is not text-heavy,
and clear navigation is possible.
● Information can be searched directly from
the home page.
● Attractive pictures are available to keep the
reader engaged in the task of finding
information.
● A Japanese version of the webpage is NOT
available, so that readers are forced to look for
information from the English version alone.
The instructions for the first week of the
assignment (where readers had to respond to 8 open-
ended questions) were all in English, largely because
readers already had practice from the weeks before
regarding what is expected of them.
The questions asked for the design analysis were as
follows.
Table 1
Open Design Questions asked of Participants
1. Explain whether the organization of
information in the site is user-friendly or not.
2. Explain whether the presentation of content
is appealing or not.
3. Explain whether the effective use of
technology is demonstrated.
4. Who is the target audience? Is the website
appropriate for the projected audience?
5. Explain the quality of the text content.
6. Is the information accessible?
7. Explain whether the resources use real-
world situations.
8. Here are some common reasons for building
this website. Rank them in order of importance to
you. Do you have a reason that is not listed?
5.5 Data Analysis – Use of Coders
Three undergraduate students (not part of the class
with the sample) who took the same class in an
earlier semester were appointed as coders, with the
task of grading the first-week assignment in which
readers participated in an open-ended evaluation of
the Belize tourism website. The coders were
advanced undergraduate students with reasonable
English language proficiency and greater experience
with website design and analysis, and had the ability
to grasp design lectures with reasonable success. The
coders were given a set of criteria (discussed in the
next section and shown in Figure 3), on the basis of
which they graded each open-ended response, for all
the 8 questions assigned. Coders were specifically
trained as to how they could grade each response (for
each of the 8 questions) on the basis of six criteria.
Each of the 8 open-ended responses for each of the
17 participants was rated thrice, once by each of the
three coders. The three coders went through a
practice session during the first week, wherein they
graded one student and then wrote a verbal report for
each criterion, justifying their grade. The group
(including the project supervisor and the three
coders) then discussed each grade for each question
and criterion. The three coders were then handed all
17 response sheets (with answers to each of the 8
questions) for coding.
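The rating layout this design implies can be sketched as follows; the participant labels and scores below are invented for illustration, and the study's actual data is not reproduced:

```python
# Hypothetical sketch of the rating structure: every response is rated once
# by each of the three coders, so each (participant, question) pair maps to
# three scores out of 6. All numbers below are invented.

ratings = {
    # participant: {question number: {coder: score out of 6}}
    "S1": {1: {"A": 4, "B": 5, "C": 5},
           2: {"A": 3, "B": 5, "C": 4}},
    "S2": {1: {"A": 5, "B": 6, "C": 6},
           2: {"A": 4, "B": 4, "C": 5}},
}

def coder_mean(coder):
    """Mean score a single coder assigned over all participants and questions."""
    scores = [per_question[coder]
              for per_participant in ratings.values()
              for per_question in per_participant.values()]
    return sum(scores) / len(scores)

# Comparing per-coder means is one quick way to spot a coder who grades
# systematically lower or higher than the others.
means = {coder: coder_mean(coder) for coder in ("A", "B", "C")}
```

With this invented data, Coder A's mean is visibly lower than B's and C's, the kind of pattern the Findings section discusses for the real data.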
6 Findings
The first important finding that readers would be
interested to know is how participants reading the
Belize tourism website in this EFL context have
performed with the 8 design questions they had to
answer based on their understanding of the content of
the website, and their impression of the website
design, its capabilities, their levels of comfort navigating
through the website, etc.
Each coder has graded each of the 8 questions
separately, based on 6 criteria, as reported in the
methods section. So, the mean and SD values we see
above are the sums of the mean scores obtained from the
three graders. The maximum sum of mean values for
each question could be no more than 18 (6 * 3, 6 being
the maximum score for each question). We see that
the highest mean value was obtained for the first
question on “whether the organization of the website
is user-friendly” (sum of mean = 14.56). A high sum
of mean score does not indicate that the Belize
tourism website is highly user-friendly. Rather, it
means that readers who were exploring the website
could explain the answer with maximum efficiency,
in line with the six criteria. Mostly we see sums of
mean scores in the range of 11 ~ 12, indicating an
average score for a single coder in the range of
3.8 ~ 4.0 on a 6-point scale.
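The ceiling of 18 comes from summing three per-coder question means, each capped at 6. A one-line check, with per-coder means invented so that they sum to the 14.56 reported for Q1:

```python
# Hypothetical per-question mean scores from the three coders (invented),
# each bounded above by 6, the maximum score for a question.
coder_means_q1 = [4.80, 5.00, 4.76]

# The "sum of mean" statistic reported in the text, capped at 6 * 3 = 18.
sum_of_means = sum(coder_means_q1)
```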
Next, let us see how the three coders have graded
them based on the 6 criteria used for each of the 8
questions. Each coder could put a grade of 1 or 0 for
each of the six criteria assigned for each of the 8
questions. That means each criterion has a minimum
total of 0 and a maximum total of 8 across the 8
questions. Now, if we consider all 6 criteria
together, each criterion's total would be in the range of 0 ~ 8.
So, the maximum total that a person can score for the
8 questions, for a given coder, is 48 (6 criteria
* 8 questions). So, the maximum mean score could
be 48 divided by 6 (criteria) = 8 (for each criterion),
and the minimum mean score could be 0 (for each
criterion).
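This arithmetic can be made concrete with a small sketch; the 0/1 criterion marks below are invented, and only the 6-criteria-by-8-questions structure follows the study:

```python
# Hypothetical sketch of the per-coder scoring arithmetic described above.
# One coder's marks for one participant: 8 questions x 6 binary criteria.

N_CRITERIA = 6
sheet = [
    [1, 1, 1, 1, 1, 0],  # Q1: 5 of 6 criteria met
    [1, 1, 1, 0, 1, 0],  # Q2: 4
    [1, 1, 0, 1, 1, 0],  # Q3: 4
    [1, 0, 1, 0, 1, 0],  # Q4: 3
    [1, 1, 1, 0, 1, 0],  # Q5: 4
    [1, 1, 1, 1, 0, 1],  # Q6: 5
    [0, 1, 1, 0, 1, 0],  # Q7: 3
    [1, 1, 0, 1, 1, 0],  # Q8: 4
]

# Each question's score is the number of criteria met (0..6).
question_scores = [sum(marks) for marks in sheet]

# Total over the 8 questions: at most 6 * 8 = 48.
total = sum(question_scores)

# Mean per criterion: total / 6, so at most 48 / 6 = 8.
mean_per_criterion = total / N_CRITERIA
```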
Data shows the highest sum of mean scores for Q1
(organization of information is user-friendly) at 14.56,
with the score for Q2 (explain whether the
presentation of content is appealing) close behind at
12.56. We see relatively lower scores for Q3 ~ Q7 in
the range of 10.33 ~ 11.67. A low score of 10.33 is
observed for Q7 (whether the resources use real
world situation). Analysis of actual responses in
Moodle show that readers were not exactly sure
about what to include as part of their response to Q7.
In some cases participants wrote about
technologies, while in other cases they actually
analyzed the content of the website for whether it
included any mention of technology or something
important for real-life applications. However,
coders observed relatively high levels of accuracy
and response quality for questions related to audience
analysis (Q4).
Data further shows us the mean and SD values for
participants as marked by Coder A. For Coder A, we
see that for most participants a mean of total mean
score is in the range of 4 ~ 5.5 or a little more.
However, participants S6, S13 and S14 have done
very well with a high score around 6.5 ~ 7. These
scores are the average for all the 6 criteria combined.
Data shows us the mean and SD values for
participants as marked by Coder B. For Coder B, we
see that for most participants a mean of total mean
score is in the range of 6 ~ 7.5 or a little more. These
mean scores are significantly more than what we
observed for Coder A. However, participants S3, S7
and S8 have scored less than other participants. We
see some similarity between the results of Coder A
and B.
For Coder C, similar to Coder B we see a
comparatively high mean of total mean score for
almost all the participants. For S8 we see a
consistently low score across all the coders,
confirming very low levels of proficiency in writing
the answers to the open-ended design questions. In all
other cases, we see a consistent high score in the
range of almost 7 and above, and S11 securing a full
score of 8. Surprisingly, Coder A gave S11 only 5.83,
and Coder A's scores for the other
participants are consistently lower as well.
More data allows us to see an overview of the
significant Pearson correlation values between the
scores on the 8 open-ended questions. We see a rather
high number of significant correlation values,
suggesting similarity of response. The score for each
question, as calculated in Table 6, is the mean of the
scores obtained from all three coders.
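A minimal sketch of that correlation computation, assuming per-question scores averaged over the three coders; the score vectors below are invented for illustration:

```python
# Hypothetical sketch of the Pearson correlation analysis between two
# questions' mean scores (averaged over three coders), one value per
# participant.

from math import sqrt
from statistics import mean

def pearson_r(x, y):
    """Plain Pearson product-moment correlation coefficient."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Mean-of-three-coders scores for two questions, one entry per participant.
q1 = [4.3, 5.0, 3.7, 4.7, 5.3, 4.0, 3.3, 4.7]
q2 = [4.0, 4.7, 3.3, 4.3, 5.0, 3.7, 3.0, 4.3]

r = pearson_r(q1, q2)  # close to 1: the two questions move together
```

A high r between two questions, as in this invented example, is what the text describes as "similarity of response" across the design questions.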
7 Discussion
Results show a relatively stronger performance on the
question of whether the Belize tourism website is user-
friendly, and readers could successfully explain
whether the content of the site is user-friendly. The
mean scores for the above two questions are
relatively higher when compared to other scores. A
high score also suggests that readers could provide
concrete examples in support of their argument, and
also that, technically, the language proficiency
demonstrated while writing the answers is of
acceptable quality. However, when we observe the
sum of mean scores for each question, we do not see
much of a difference that could qualify as
significantly different from each other. Given the fact
that each coder was grading the same response, it
could be safely concluded that there is large
variation in the scores in the case of Coder A, but
relatively less variation in the scores assigned by
Coders B and C. This makes one wonder if Coder A
was relatively stricter than Coders B and C. However,
strictness is a relative term, and it also makes us
wonder how Coders B and C interpreted the
results for any given criterion. One reason for choosing
multiple coders is to get a weighted average of the
scores and then consider it to explain the actual
scores on the open-ended design questions as asked.
However, in spite of this, the fact remains that the
ratings of individual coders will remain subjectively
varied in this EFL context. The variability arises from
the EFL orientation of the coders themselves, their
own moderate levels of language proficiency, and their
moderate levels of experience with interface design.
8 Conclusion
This is an interesting preliminary analysis of how
L2 learners in a typical EFL context could approach
English website analysis and how coders with better
language proficiency and understanding of the
specific design context could interpret responses in
terms of rubrics. For coders, it was a test not only of
understanding what constitutes valid information and
good organization of a response; it also required the
ability to read through the criteria rubric used in the
study and demonstrate at least moderate levels of
language proficiency. Future studies could
extensively look at website analysis in an EFL
context with more structured design questions,
with each design question having specific sub-
questions to channel readers’ thought processes in
the right direction. Similarly, coders could be trained
in very specific assessment mechanisms and
standards to bring about uniformity and eliminate
subjectivity among responses. Finally, these
experiments or assignments, when used in a language-
learning context, should clearly explain the language-
learning outcomes, processes and expectations. One
serious limitation of the study was the lack of systematic,
continuous feedback and input from the primary
investigator or instructor of the course to the coders
and to the respondents of the open-ended questions.
That process could have enriched the process of
assessment, but in a repeated measures design
context. That scenario is clearly beyond the scope of
this specific experiment. However, this study is one
of its kinds because of lack of any substantial
literature on website analysis in EFL language
learning context in the field of computer assisted
language learning.
References
[1] Hasnah Tang King Yee, Wong Su Luan, Ahmad
Fauzi Mohd Ayub and Rosnaini Mahmud , “A
Review of the Literature: Determinants of
Online Learning Among Students”, European
Journal of Social Sciences, Vol. 8, no. 2, 2009,
pp. 246-247.
[2] F. Genesee, G. R. Tucker and W. E. Lambert,
“Communication Skills of Bilingual Children”,
Child Development, Vol. 46, no. 4, Dec. 1975,
pp. 1010-1014.
[3] James E. Bailey, Adriana Sburlati, Vassily
Hatzimanikatis, Kelvin Lee, Wolfgang A.
Renner and Philip S. Tsai, ISI Journal Citation
Reports: Biotechnology and Bioengineering.
New York: Thomson Reuters, 1996.
[4] Meei-Ling Liaw, “Content-Based Reading and
Writing for Critical Thinking Skills in an EFL
Context”, English Teaching & Learning 31.2,
summer 2007, pp. 45-87.
[5] Servat Shirkhani and Mansour Fahim,
“Enhancing critical thinking in foreign
language learners”, 1st International
Conference on Foreign Language Teaching and
Applied Linguistics, May 5-7 2011, pp. 1091-
1095.
[6] Loretta F. Kasper, “New Technologies, New
Literacies: Focus Discipline Research and ESL
Learning Communities”, Language Learning &
Technology, vol. 4, no. 2, pp. 105-128,
September 2000.
[7] Gail Chittleborough, Wendy Jobling, Peter
Hubber, and Gerard Calnin, The use of Web 2.0
Technologies to promote higher order thinking
skills. Canberra: Australian Association for
Research in Education, 2008.
[8] Binnur Genc Ilter, “Effect of Technology on
Motivation in EFL Classrooms”, Turkish Online
Journal of Distance Education-TOJDE, ISSN
1302-6488, vol. 10, no. 4, Article 9, October
2009.
[9] Krajka, J, “Using the internet in ESL writing
instruction”, The Internet TESL Journal, vol. 6,
no. 11, 2000.
[10] Lawrence Chao, “Learning to write in English
via the internet”, January 2009.
[11] Ulla K. Bunz, Usability and Gratifications
[microform]: Towards a Website Analysis
Model. Washington, D.C.: ERIC Clearinghouse,
2001.
[12] Magda Pieczka, Public Relations: Critical
Debates and Contemporary Practice. London:
Routledge, 2006.
[13] Jakob Nielsen, “How Users Read on the Web”.
[Online]http://www.nngroup.com/articles/how-
users-read-on-the-web/
[14] Ulla Bunz, “The Website Assignment as a
Valuable Exercise – Beyond Establishing
Presence to Creating Significance”, German
Online Research Conference Göttingen, May
2001, pp. 10-15.
[15] December, John and Ginsberg, Mark, HTML
and CGI. Indiana: Sam's Publishing, 1995.
[16] Mike D, “Internet in the schools: A literacy
perspective”, Journal of Adolescent and Adult
Literacy, vol. 40, no. 1, pp. 1-13, 1996.
[17] Warschauer M, “Computer-Mediated
Collaborative Learning: Theory and Practice”,
Modern Language Journal, vol. 81, no. 3, pp.
470-481, 1997.
[18] Iain Davey, “The use of collaborative Web
page-design projects for teaching EFL, with a
focus on Japanese university students”.
[Online]http://callej.org/journal/3-1/davey.html