Assessing speaking skills: a workshop for teacher development
Ben Knight
Speaking skills are often considered the most important part of an EFL
course, and yet the difficulties in testing oral skills frequently lead teachers
into using inadequate oral tests or even not testing speaking skills at all.
This article describes a workshop used in teacher development
programmes to help teachers with one aspect of the problem of oral
testing: what should we look for when we assess a student’s ability to speak
English? The workshop looks first at the range of criteria that teachers
might use in such assessment. Then it examines how the selection and
weighting of those criteria should depend on the circumstances in which
the test takes place. The article also discusses issues raised by the
workshop, and considers its applicability to people working in different
circumstances.
Reasons for the workshop
Assessment of speaking skills often lags far behind the importance given
to teaching those skills in the curriculum. We recognize the importance of
relevant and reliable assessment for providing vital information to the
students and teachers about the progress made and the work to be done.
We also recognize the importance of backwash (the effect of the test on
the teaching and learning during the course). Most teachers would accept
that ‘if you want to encourage oral ability, then test oral ability’ (Hughes,
1989:44). But the problems of testing oral ability make teachers either
reluctant to take it on or lacking in any confidence in the validity of their
assessments. Such problems include: the practical problem of finding the
time, the facilities and the personnel for testing oral ability; the problem of
designing productive and relevant speaking tasks; and the problem of
being consistent (on different occasions, with different testees and
between different assessors). Another problem, which is the focus of the
workshop framework described here, is deciding which criteria to use in
making an assessment. The workshop has two principal aims:
1 to make teachers more consciously aware of the different possible
criteria they could be using to assess their students’ speaking skills;
2 to make teachers more aware of the way their selection and weighting
of those criteria depend on the context in which they are to be used.
Achieving these aims is crucial for making valid and reliable tests. Except
where tests are being marked holistically (simply in terms of degrees of
communicative success), marking involves the use of assessment criteria.
Even when the assessment is holistic on the surface, the assessor may be
thinking in terms of criteria in judging that overall communicative success
(Bachman, 1990: 329). It is doubtful whether the criteria can be
considered justified and validated if the assessor is not even explicitly
aware of them. The reliability of an assessor on different occasions with
different testees can be improved by more explicit criteria, as can the
reliability between assessors.
The workshop
The workshop takes between about 1½ and 2¼ hours and requires two or
three short video clips of students talking (see Note 1). Making your own video clips
is preferable, as you can make the task and situation reflect the type of test
which the teachers you are addressing are most likely to use.
Stage 1: Assessment criteria
a. Viewing and reflection (10 mins.)
Teachers are shown a video clip of a student (or students) talking and are
asked to reflect on the question ‘Which aspects of the students’ speaking
would affect the grade you would give the students for their speaking
skills?’. The presenter needs to say in advance how long the clip will be,
and what instructions were given to the students.
b. Discussion (15 mins.)
Teachers can compare their notes in pairs or small groups, and then this
discussion can open up into a plenary. The objective at this stage is to get
the teachers to be more conscious of what affects their own judgements,
and to see how others may view it differently. The presenter’s role will
include pinning people down on vague terms, such as ‘communicative’ or
‘passive’ (I have heard ‘communicative’ being used for: i) easy to
understand, ii) says a lot, iii) makes up for linguistic weakness with
gestures, etc., iv) interacts well with other person in role-play). Another
role is to elicit or suggest concrete examples (from the clip) of features
being discussed (e.g. an example of ‘inappropriate language’). There is no
need at this stage to try to resolve all the differences of opinion, as often
those differences stem from different assumptions about the testing
context, which will be looked at later. After this discussion, it is useful to
show the clip again, so that people can reconsider the points made by
themselves and others.
c. List of assessment criteria (10-15 mins.)
The presenter then hands out copies of the list of assessment criteria (see
Figure 1). This list is fairly comprehensive in its broad categories, though
within those there could be many more detailed criteria (for example, in
an investigation of ‘fluency’ alone (Lennon, 1990: 404-405), 12 different
variables were looked at, ranging from ‘words per minute’ to ‘mean pause
time at T-Unit boundaries’). The amount of explanation needed for the
terms on the list will of course depend on the teachers.

Figure 1 Assessment criteria
1 GRAMMAR
a. range
b. accuracy
2 VOCABULARY
a. range
b. accuracy
3 PRONUNCIATION
a. individual sounds (esp. phonemic distinctions)
b. stress and rhythm
c. intonation
d. linking/elision/assimilation
4 FLUENCY
a. speed of talking
b. hesitation while speaking
c. hesitation before speaking
5 CONVERSATIONAL SKILL
a. topic development
b. initiative (in turn taking, and topic control)
c. cohesion: i) with own utterances; ii) with interlocutor
d. conversation maintenance (inc. clarification, repair, checking, pause fillers, etc.)
6 SOCIOLINGUISTIC SKILL
a. distinguishing register and style (e.g. formal or informal, persuasive or conciliatory)
b. use of cultural references
7 NON-VERBAL
a. eye-contact and body posture
b. gestures, facial expressions
8 CONTENT
a. coherence of arguments
b. relevance
d. Viewing and comment (10-15 mins.)
The teachers are then shown another clip of students talking, and are
asked to think about the usefulness and relevance of the criteria on the list
for assessing the students’ speaking skills, adding or deleting as they think
necessary. The objective of this stage is to consider further the criteria
they think are relevant for assessing speaking skills, and also, by getting
them to relate their views to the terms on the list, to give them a common
vocabulary to aid discussion of differences of opinions. I have found
teachers tend to be less talkative here, since it is a point of mental
reorganization for them as they try to relate their own feelings and
experience with the list.
Stage 2: Assessment and the context
a. Introduction (5 mins.)
By this stage, the question of context should have arisen several times
(e.g. in the form of comments beginning ‘Well, it depends on why . . .
/what . . . /where . . . /how . . . ’). The presenter now recalls various
examples of these and notes how they show the importance of the context
in deciding the choice of assessment criteria.
b. Examples (10-15 mins.)
Teachers are then given the hand-out with the examples of different
selections and weightings of criteria together with descriptions of the
relevant contexts (see below). Note that the number under the
‘Weighting’ column does not represent maximum marks for that
criterion, but its value relative to the other criteria. For example, each
criterion might be given a mark out of ten, and each score would be
multiplied by its weighting number before being totalled up. Teachers can
then be encouraged to ask any questions about the criteria, the context and
the relationship between the two. For example: ‘Why did you include
listening comprehension in the placement test, but not in the end-of-term
test?’ It would, of course, be wise for you as presenter to use your own
examples (criteria you have used yourself) so that you are more likely to
be able to answer such questions.
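To make the arithmetic concrete, here is a minimal sketch (in Python, not part of the original workshop materials) of how such a weighted total might be calculated. The weightings follow the placement-test example below; the raw marks out of ten are invented purely for illustration.

# Weighted scoring sketch: each criterion receives a raw mark out of ten,
# each raw mark is multiplied by its weighting, and the products are totalled.
weightings = {
    "range of grammar and vocabulary": 3,
    "accuracy of grammar and vocabulary": 2,
    "phonemic distinctions": 2,
    "hesitation": 4,
    "initiative, topic development, conversational control": 4,
    "listening comprehension": 5,
}
raw_marks = {  # hypothetical marks out of 10 for one candidate
    "range of grammar and vocabulary": 6,
    "accuracy of grammar and vocabulary": 5,
    "phonemic distinctions": 7,
    "hesitation": 4,
    "initiative, topic development, conversational control": 6,
    "listening comprehension": 8,
}
total = sum(raw_marks[c] * w for c, w in weightings.items())
maximum = sum(10 * w for w in weightings.values())
print(f"{total} out of {maximum}")  # prints: 122 out of 200

A candidate's weighted total can then be compared against the maximum, or converted to a percentage, in whatever way the marking scheme requires.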
Criteria and context
Look at these two examples of differences in the selection and weighting of
assessment criteria. Note how these two examples of assessment criteria sets vary
according to the situation, and try to list the factors in the testing situation which
can affect such selection and weighting.
1 Placement test
A placement test for University students (to decide which level they go into for
their General English Communication course). This is an interview - basically
answering questions - with a teacher who is both interviewer and scorer at the
same time. It is taken with a single written gap-fill test, which assesses knowledge
of grammatical structures, vocabulary, and functionally appropriate structures.
In informal terms, the qualities we felt were important for success in a class were
the ability to understand the teacher, knowledge of vocabulary, structures and
functions (largely tested in the written gap-fill test), confidence and a willingness
to take chances and try things out, as well as the ability to distinguish
(productively) basic phonemes in English. The category of ‘range of grammar and
vocabulary’ aims to capture people with a wide experience of English (who we
thought would progress more quickly).
Criterion (numbers on main list)                                Weighting
1 Range of grammar and vocabulary (1a and 2a)                       3
2 Accuracy of grammar and vocabulary (1b and 2b)                    2
3 Phonemic distinctions (3a)                                        2
4 Hesitation (4b, c)                                                4
5 Initiative, topic development, and
  conversational control (5a, b, d)                                 4
6 Listening comprehension (not listed)                              5
2 End-of-term ESP test
This test was given at the end of one term of a course for receptionists in
international hotels, to see how much they had progressed and what they needed to
work on the following term. The speaking test was a role-play with two students,
and the teacher only observed and marked the score sheet. There were several other
tests - gap-fill and multiple-choice tests of grammar, vocabulary, and functions,
and a listening comprehension test.
Criterion (numbers on main list)                                Weighting
1 Grammar and vocabulary (1 and 2)                                  3
2 Pronunciation (3)
   a. individual sounds                                             1
   b. stress and rhythm                                             1
   c. intonation and linking                                        1
3 Fluency (4)
   a. hesitation before speaking                                    1
   b. hesitation while speaking
4 Conversational skill (5)
   a. cohesion
   b. conversation maintenance
5 Sociolinguistic skill (6)
   a. distinguishing register and style
   b. use of cultural references
6 Non-verbal (7)
   a. eye contact and body posture
   b. gestures and facial expressions
7 Content (relevance) (8)
c. Context variables (15 mins.)
By looking at the examples and thinking of their own experience, the
teachers are asked to abstract the variables in the context which may affect
the choice and weighting of the criteria. The variables could include the
following:
i. The purpose of the test:
-achievement, proficiency, predictive or diagnostic?
-and depending on that: the course objectives; the underlying theory
(formal or informal) of what constitutes language proficiency; the
future situations the students are being tested for; the types of
feedback the students in question would understand and could
benefit from.
ii. The circumstances of eliciting the sample of language being assessed.
-the degree of freedom or control over what the student could say
and do.
-the number of participants and their roles in the situation.
iii. Observation restrictions
-extent to which assessor participates in speaking situation (e.g.
interviewer or observer).
-whether recorded or not (on audio or video cassette).
iv. The other tests in the battery (e.g. the selection or weighting of a
criterion for grammatical accuracy may depend on how much it has
been assessed in accompanying written tests).
d. Using the different criteria sets (15 mins.)
(Optional if short of time.)
Teachers watch another video clip and assess the student’s oral skills
using first one of the example criteria sets and then the other. This is to
demonstrate how different criteria sets (appropriate in different contexts)
can produce different assessments of the same performance. Different
criteria sets will not always produce differing results, and so care needs to
be taken to use a clip which will make the point clearly.
e. Task (20-30 mins.)
Teachers are given details of a testing situation (preferably compatible
with their own) and are asked to decide on the criteria they would use and
the weighting for those criteria. There follows an example of such a task:
TASK (Selecting Assessment Criteria)
You should imagine you are responsible for the oral testing of 100 students who have
applied to study in U.S. colleges for one year (various subjects). You have to judge
whether they have sufficient speaking skills to survive and do well there. You can
conduct a 10-minute interview for each (with only one interviewer/assessor for each
interview). The interviewers are all experienced and trained EFL teachers. The other
tests in the battery will be multiple-choice tests of grammar and vocabulary, of
academic reading and lecture listening comprehension, and essay-type questions.
Decide on the criteria you would use in assessing their spoken English, and the
relative weighting of each.
The purpose of this task is:
-for teachers to think more concretely about the points raised so far, to
let them see how referring to a particular context (described in the task)
can reduce the differences in opinion when talking more generally.
-to provide an intermediary stage between the thinking in the earlier part
and the need for application of those thoughts to their own situation
after the workshop.
f. Conclusion (5 mins.)
The presenter can ask the teachers for their comments on the workshop - how
useful it was, how it could have been more useful, whether they think they
would change the way they assess their students’ speaking skills, and so
on.
Discussion
1. Objective criteria?
There is still a great deal of subjectivity in a) the selection of criteria, and
b) the way each criterion is measured (e.g. how exactly do you decide the
grammatical accuracy of a speaker’s performance?). The workshop aims
only to improve the quality of those subjective decisions about selecting
criteria by making it more conscious and explicit, and by giving the
teachers a chance to discuss other points of view. It assumes that teachers
do not have the resources to carry out their own research. A kind of
collective subjectivity can be reached for how each criterion is measured
by ‘training’ or ‘moderating’ sessions for assessors. But for those who
have the time and resources to look closely and objectively at these
questions, the following will be of interest: Hinofotis (1983), Hieke
(1985), Fulcher (1987), and Lennon (1990).
2. Analytic or holistic assessment?
Several tests, such as the RSA Profile Certificate or the ILR (Interagency
Language Roundtable) oral interview, use a different type of criterion to
this workshop. The speakers are observed in different speaking tasks and
they are simply judged for their degree of success in that task. This
holistic approach argues that, as we cannot observe directly mental
characteristics like grammatical knowledge or ability to maintain
conversations, it will be inaccurate to give (inferred) scores for them.
Rather we should simply assess the learner’s (observable) success in
performing authentic language tasks.
The approach behind this workshop, however, is one which argues that it
is those mental abilities (which we must infer from the learner’s
performance) that we are most interested in, for at least the following
reasons. Firstly, we cannot predict, let alone test for, every function and
situation which a learner might need English for. Therefore any claim
about general proficiency must involve a lot of inferences from the few
sample functions/situations we do test. Secondly, a lot of teachers’ own
tests are partly diagnostic, and teachers need to know more about why a
particular learner performed badly in some situations and better in others.
This will usually come down to inferred component abilities, such as
range of vocabulary or fluency. For a detailed discussion of the two
approaches, see Bachman (1990: 41-42, 301-333). Hughes (1989: 110)
recommends using both approaches, with one as a check on the other.
3. When to do this workshop?
This workshop on its own may seem peripheral as teachers often worry
more about such problems as making a practical test, setting fair tasks and
getting a fair sample of the students’ language, and being consistent.
However, it is probably helpful to tackle the problem of assessment
criteria before these other questions, since we need to start by deciding
what we want to measure, before deciding what is the most reliable and
practical means to measure it. The danger, otherwise, is that we choose
reliable and practical tests (e.g. multiple-choice tests) which do not give
us the information we really want about our students’ oral skills, and
which can have a negative effect on students’ attitudes to developing
those skills during the course.
4. Too complicated?
Considering the context in selecting assessment criteria does make the
discussion more complicated. So with teachers for whom this topic is
completely new, it would probably be better to leave such considerations
aside or condense them severely. I have found some untrained teachers
saying that they wished they could have come away from the workshop
with one fixed set of ‘best criteria’. Gebhard (1990: 158,160) reports that
handed-down direction is preferred by beginning teachers (quoting
research in Copeland, 1982) and by teachers in certain countries who feel
that ‘if the teacher is not given direction by the supervisor, then the
supervisor is not considered qualified’.
However, taking the testing context into account is valuable, despite the
added complexity, in dealing with two common problems with teacher-
development workshops.
Firstly, it makes it easier for teachers to apply what they learn in the
workshop to their own situations, especially when they are working in
contexts very different from that of the presenter. This is also helped by
the final task (which is an exercise in applying to a particular situation
principles learnt during the workshop).
Secondly, it helps resolve conflicts of opinion. Many of the
disagreements at the beginning of a workshop can be related to different
assumptions about the testing context and its effect on the selection of
criteria. Thus, it not only improves our understanding, but also improves
the conduct of the workshop: it avoids the ‘anything goes’ approach
which creates cynicism, and it reduces the ex cathedra judgements by the
presenter which can lead to resentment or passivity.
5. Usable in other circumstances?
The main limitations to this workshop are:
a. There must be a video player, and some video clips of students
speaking (this is best done by recording students speaking in situations
most likely to be used in tests by those teachers - i.e. a video camera is
helpful).
b. The teachers must have a sufficient level of English (or other target
language) to assess the students and to contribute to the discussion.
c. The teachers need to have some experience of assessing students’
speaking skills. Pre-service teachers would probably be overwhelmed
by the workshop in its present form.
The workshop, however, need not be limited to native speakers, trained
teachers, or to EFL teachers, as it has proved as useful to non-native
speakers, untrained (but practising) teachers, and teachers of other
languages. Each of these three latter groups may have certain
characteristics which should be taken into account.
Some non-native speakers tend to place far greater emphasis on
grammatical accuracy to begin with, though this is usually due to the way
they learnt English rather than to any difficulty in perceiving discourse
and sociolinguistic skills.
Experienced, untrained teachers often lack the relevant vocabulary to talk
clearly about learning and teaching, and sometimes appear to be more
dogmatic or to lose perspective (e.g. claiming the single most important
criterion in assessing speaking skills is whether the speaker keeps eye-
contact with the listener).
While it is obvious that the details of grammatical accuracy and
pronunciation will be different for other languages, that is probably also
true of the other criteria such as conversation maintenance and non-verbal
behaviour. However, provided their level of English is sufficient for the
tasks, the workshop’s aims (awareness of the range of different criteria
possible, and awareness of how their selection depends on the testing
context) can be met for teachers of different languages in the same group.
It should also be noted that these three groups bring different perspectives
which can only enrich the discussions: for example, suggesting as a
criterion for communication between non-native speakers ‘the ability to
adapt your level of English to that of your interlocutor’, or (from a
Japanese teacher) ‘the skill of stating an opinion in such a way that it is
easy for the listener to disagree without seeming argumentative’.
Conclusions
The workshop works as a way of stimulating teachers to think about and
discuss the way they assess their students’ speaking skills. It is rare for a
participant not to be absorbed by the tasks and the exchange of ideas. A
few participants have found it rather frustrating not to have a fixed set of
testing criteria at the end of the workshop, but most seemed to find the
process of relating criteria to context helpful in clarifying their own
positions. A large survey of teachers’ testing experience found ‘there is
evidence that most [teachers] prefer to use informal and flexible
approaches which can be adapted to different student populations’
(Brindley, 1989: 31). This workshop suits such preferences.
Received September 1991
Note
1 The video clips I have used (about 3-5 mins. long)
mainly showed students speaking in pairs, usually
in a simple role-play.
For example:
Student A: You want to go on a holiday to Hawaii.
Try to persuade your partner to come with you,
though he or she wants to go somewhere else.
Student B: You want to go on a holiday to Europe.
Try to persuade your partner to come with you,
though he or she wants to go somewhere else.
References
Bachman, L. 1990. Fundamental Considerations in
Language Testing. Oxford: Oxford University
Press.
Brindley, G. 1989. Assessing Achievement in the
Learner-centred Curriculum. Sydney: National
Centre for English Language Teaching and
Research.
Copeland, W. 1982. ‘Student teachers’ preference
for supervisory approach’. Journal of Teacher
Education. XXXIII/2: 32-36.
Fulcher, G. 1987. ‘Tests of Oral Performance: the
need for data-based criteria.’ ELT Journal XLI/4:
287-291.
Gebhard, J. 1990. ‘Models of Supervision: choices’.
in Richards J. and D. Nunan (eds.) 1990.
Hieke, A. 1985. ‘A Componential Approach to Oral
Fluency Evaluation’. The Modern Language
Journal. LXIX/2: 135-42.
Hinofotis, F. 1983. ‘The Structure of Oral
Communication in an Educational Environment: a
comparison of factor analytic rational procedures’.
in Oller, J. (ed.) 1983.
Hughes, A. 1989. Testing for Language Teachers.
Cambridge: Cambridge University Press.
Lennon, P. 1990. ‘Investigating Fluency in EFL: A
Quantitative Approach’. Language Learning
XL/3: 387-417.
Oller, J. 1983. Issues in Language Testing Research.
Rowley, Mass.: Newbury House.
Richards, J. and D. Nunan (eds.) 1990. Second
Language Teacher Education. Cambridge:
Cambridge University Press.
The author
Ben Knight teaches EFL and linguistics at Shinshu
University, Japan. He obtained an MSc in Applied
Linguistics from the University of Edinburgh in
1987, and has taught EFL/ESL in Britain, Kenya,
Italy, India, and Sri Lanka. His current interests
include the testing of spoken English, teacher
development, and language learning beyond the
classroom.
Mais conteúdo relacionado

Mais procurados

Assesing listening - language learning evaluation
Assesing listening - language learning evaluationAssesing listening - language learning evaluation
Assesing listening - language learning evaluationMuktia Amalina
 
Language testing approaches & techniques
Language testing approaches & techniquesLanguage testing approaches & techniques
Language testing approaches & techniquesShin Chan
 
Principles of language assessment
Principles of language assessmentPrinciples of language assessment
Principles of language assessmentAstrid Caballero
 
Chapter 6( assessing listening)
Chapter 6( assessing listening)Chapter 6( assessing listening)
Chapter 6( assessing listening)Kheang Sokheng
 
Teaching speaking brown
Teaching speaking brownTeaching speaking brown
Teaching speaking brownshohreh12345
 
Chapter 3(designing classroom language tests)
Chapter 3(designing classroom language tests)Chapter 3(designing classroom language tests)
Chapter 3(designing classroom language tests)Kheang Sokheng
 
Chapter 2(principles of language assessment)
Chapter 2(principles of language assessment)Chapter 2(principles of language assessment)
Chapter 2(principles of language assessment)Kheang Sokheng
 
ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...
ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...
ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...A. Tenry Lawangen Aspat Colle
 
Genre based approach
Genre based approachGenre based approach
Genre based approachPapa Kayla
 
Chapter 1 harmer 2007
Chapter 1 harmer 2007Chapter 1 harmer 2007
Chapter 1 harmer 2007Estela Braun
 
Speaking assessment-..1330508 (1)
Speaking assessment-..1330508 (1)Speaking assessment-..1330508 (1)
Speaking assessment-..1330508 (1)CristinaGrumal
 
Understanding Authenticity in Language Teaching & Assessment
Understanding Authenticity in Language Teaching & Assessment Understanding Authenticity in Language Teaching & Assessment
Understanding Authenticity in Language Teaching & Assessment Omaima Ayoub
 

Mais procurados (20)

Assesing listening - language learning evaluation
Assesing listening - language learning evaluationAssesing listening - language learning evaluation
Assesing listening - language learning evaluation
 
Assessing speaking
Assessing speakingAssessing speaking
Assessing speaking
 
Assessing listening
Assessing listening Assessing listening
Assessing listening
 
Language testing approaches & techniques
Language testing approaches & techniquesLanguage testing approaches & techniques
Language testing approaches & techniques
 
Assessing Writing
Assessing WritingAssessing Writing
Assessing Writing
 
Principles of language assessment
Principles of language assessmentPrinciples of language assessment
Principles of language assessment
 
Chapter 6( assessing listening)
Chapter 6( assessing listening)Chapter 6( assessing listening)
Chapter 6( assessing listening)
 
Teaching speaking brown
Teaching speaking brownTeaching speaking brown
Teaching speaking brown
 
Chapter 3(designing classroom language tests)
Chapter 3(designing classroom language tests)Chapter 3(designing classroom language tests)
Chapter 3(designing classroom language tests)
 
Chapter 2(principles of language assessment)
Chapter 2(principles of language assessment)Chapter 2(principles of language assessment)
Chapter 2(principles of language assessment)
 
Extensive Listening Assessment
Extensive Listening AssessmentExtensive Listening Assessment
Extensive Listening Assessment
 
Approaches to language testing
Approaches to language testingApproaches to language testing
Approaches to language testing
 
ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...
ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...
ASSESSMENT: DISCRETE POINT TEST, INTEGRATIVE TESTING, PERFORMANCE-BASED ASSES...
 
Genre based approach
Genre based approachGenre based approach
Genre based approach
 
Chapter 1 harmer 2007
Chapter 1 harmer 2007Chapter 1 harmer 2007
Chapter 1 harmer 2007
 
Speaking assessment-..1330508 (1)
Speaking assessment-..1330508 (1)Speaking assessment-..1330508 (1)
Speaking assessment-..1330508 (1)
 
Designing classroom language tests
Designing classroom language testsDesigning classroom language tests
Designing classroom language tests
 
Assessing listening
Assessing listeningAssessing listening
Assessing listening
 
Understanding Authenticity in Language Teaching & Assessment
Understanding Authenticity in Language Teaching & Assessment Understanding Authenticity in Language Teaching & Assessment
Understanding Authenticity in Language Teaching & Assessment
 
English - Assessing writing
English - Assessing writingEnglish - Assessing writing
English - Assessing writing
 

Destaque

Chapter 7(assessing speaking )
Chapter 7(assessing speaking )Chapter 7(assessing speaking )
Chapter 7(assessing speaking )Kheang Sokheng
 
Making A Difference Open Source and libraries June 2008
Making A Difference Open Source and libraries June 2008Making A Difference Open Source and libraries June 2008
Making A Difference Open Source and libraries June 2008Ken Chad Consulting Ltd
 
Report Writing by Prof.Pravin Mulay
Report Writing by Prof.Pravin MulayReport Writing by Prof.Pravin Mulay
Report Writing by Prof.Pravin Mulayguest8c8c25
 
Topic 6 Assessing Language Skills and Content
Topic 6 Assessing Language Skills and ContentTopic 6 Assessing Language Skills and Content
Topic 6 Assessing Language Skills and ContentYee Bee Choo
 
Factors that influence speech
Factors that influence speechFactors that influence speech
Factors that influence speechAlan Bessette
 
Compiler Construction Course - Introduction
Compiler Construction Course - IntroductionCompiler Construction Course - Introduction
Compiler Construction Course - IntroductionMuhammad Sanaullah
 
Speaking assessment test
Speaking assessment testSpeaking assessment test
Speaking assessment testmakarenasanchez
 
Introduction to Compiler Construction
Introduction to Compiler Construction Introduction to Compiler Construction
Introduction to Compiler Construction Sarmad Ali
 
Report writing: a way to polish your skills
Report writing: a way to polish your skillsReport writing: a way to polish your skills
Report writing: a way to polish your skillssyed ahmed
 
IMPACT OF INDIAN MEDIA ON OUR CULTURE
IMPACT OF INDIAN MEDIA ON OUR CULTUREIMPACT OF INDIAN MEDIA ON OUR CULTURE
IMPACT OF INDIAN MEDIA ON OUR CULTUREOwais Khokhar
 
Report Writing for Academic Purposes
Report Writing for Academic PurposesReport Writing for Academic Purposes
Report Writing for Academic PurposesLindsey Cottle
 
korean porn
korean porn korean porn
korean porn Meetav
 

Destaque (20)

Chapter 7(assessing speaking )
Chapter 7(assessing speaking )Chapter 7(assessing speaking )
Chapter 7(assessing speaking )
 
Assessing speaking
Assessing speakingAssessing speaking
Assessing speaking
 
Making A Difference Open Source and libraries June 2008
Making A Difference Open Source and libraries June 2008Making A Difference Open Source and libraries June 2008
Making A Difference Open Source and libraries June 2008
 
Future of Library Discovery Services
Future of Library Discovery ServicesFuture of Library Discovery Services
Future of Library Discovery Services
 
Assessing speaking skill
Assessing speaking skillAssessing speaking skill
Assessing speaking skill
 
Report Writing by Prof.Pravin Mulay
Report Writing by Prof.Pravin MulayReport Writing by Prof.Pravin Mulay
Report Writing by Prof.Pravin Mulay
 
Topic 6 Assessing Language Skills and Content
Topic 6 Assessing Language Skills and ContentTopic 6 Assessing Language Skills and Content
Topic 6 Assessing Language Skills and Content
 
Factors that influence speech
Factors that influence speechFactors that influence speech
Factors that influence speech
 
Compiler Construction Course - Introduction
Compiler Construction Course - IntroductionCompiler Construction Course - Introduction
Compiler Construction Course - Introduction
 
Speaking assessment test
Speaking assessment testSpeaking assessment test
Speaking assessment test
 
Introduction to Compiler Construction
Introduction to Compiler Construction Introduction to Compiler Construction
Introduction to Compiler Construction
 
Report writing: a way to polish your skills
Report writing: a way to polish your skillsReport writing: a way to polish your skills
Report writing: a way to polish your skills
 
Lex
LexLex
Lex
 
IMPACT OF INDIAN MEDIA ON OUR CULTURE
IMPACT OF INDIAN MEDIA ON OUR CULTUREIMPACT OF INDIAN MEDIA ON OUR CULTURE
IMPACT OF INDIAN MEDIA ON OUR CULTURE
 
Employee motivation
Employee motivation   Employee motivation
Employee motivation
 
Compilers
CompilersCompilers
Compilers
 
Report Writing for Academic Purposes
Report Writing for Academic PurposesReport Writing for Academic Purposes
Report Writing for Academic Purposes
 
Report writing
Report writingReport writing
Report writing
 
Technical Report writing
Technical Report writingTechnical Report writing
Technical Report writing
 
korean porn
korean porn korean porn
korean porn
 

Semelhante a Assesing speaking skills

Fundamental concepts and principles in Language Testing
Fundamental concepts and principles in Language TestingFundamental concepts and principles in Language Testing
Fundamental concepts and principles in Language TestingPhạm Phúc Khánh Minh
 
Summary of all the chapters
Summary of all the chaptersSummary of all the chapters
Summary of all the chapterskashmasardar
 
Howtoassessgrammar
HowtoassessgrammarHowtoassessgrammar
Howtoassessgrammaringridbelloa
 
Designing Calssroom Language Test and Test Methods
Designing Calssroom Language Test and Test MethodsDesigning Calssroom Language Test and Test Methods
Designing Calssroom Language Test and Test MethodsIin Widya Lestari
 
WAYS TO ASSESS PRONUNCIATION LEARNING.docx
WAYS TO ASSESS PRONUNCIATION LEARNING.docxWAYS TO ASSESS PRONUNCIATION LEARNING.docx
WAYS TO ASSESS PRONUNCIATION LEARNING.docxNikMan8
 
Assessment &testing in the classroom
Assessment &testing in the classroomAssessment &testing in the classroom
Assessment &testing in the classroomCidher89
 
Assessment &testing in the classroom
Assessment &testing in the classroomAssessment &testing in the classroom
Assessment &testing in the classroomCidher89
 
Chapter 3 Constructing Tests
Chapter 3  Constructing TestsChapter 3  Constructing Tests
Chapter 3 Constructing TestsIES JFK
 
UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)
UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)
UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)Videoconferencias UTPL
 
Rating scales Chapter 7 by Ahmet YUSUF
Rating scales Chapter 7  by Ahmet YUSUFRating scales Chapter 7  by Ahmet YUSUF
Rating scales Chapter 7 by Ahmet YUSUFأحمد يوسف
 
Design of a speaking test
Design of a speaking test Design of a speaking test
Design of a speaking test Gerardo Zavalla
 
Testing teacher's hand testing & examiner guide 2018
Testing teacher's hand testing & examiner guide 2018Testing teacher's hand testing & examiner guide 2018
Testing teacher's hand testing & examiner guide 2018Mr Bounab Samir
 
Building a base for better teaching
Building a base for  better teachingBuilding a base for  better teaching
Building a base for better teachingYadi Purnomo
 
3-_basic_principles_of_assessment-1.ppt
3-_basic_principles_of_assessment-1.ppt3-_basic_principles_of_assessment-1.ppt
3-_basic_principles_of_assessment-1.ppt640721115015
 

Semelhante a Assesing speaking skills (20)

Fundamental concepts and principles in Language Testing
Fundamental concepts and principles in Language TestingFundamental concepts and principles in Language Testing
Fundamental concepts and principles in Language Testing
 
Assessing speaking
Assessing speakingAssessing speaking
Assessing speaking
 
Summary of all the chapters
Summary of all the chaptersSummary of all the chapters
Summary of all the chapters
 
Howtoassessgrammar
HowtoassessgrammarHowtoassessgrammar
Howtoassessgrammar
 
Howtoassessgrammar
HowtoassessgrammarHowtoassessgrammar
Howtoassessgrammar
 
Designing Calssroom Language Test and Test Methods
Designing Calssroom Language Test and Test MethodsDesigning Calssroom Language Test and Test Methods
Designing Calssroom Language Test and Test Methods
 
L2 assessment
L2 assessmentL2 assessment
L2 assessment
 
A AND E.ppt
A AND E.pptA AND E.ppt
A AND E.ppt
 
WAYS TO ASSESS PRONUNCIATION LEARNING.docx
WAYS TO ASSESS PRONUNCIATION LEARNING.docxWAYS TO ASSESS PRONUNCIATION LEARNING.docx
WAYS TO ASSESS PRONUNCIATION LEARNING.docx
 
Assessment &testing in the classroom
Assessment &testing in the classroomAssessment &testing in the classroom
Assessment &testing in the classroom
 
Assessment &testing in the classroom
Assessment &testing in the classroomAssessment &testing in the classroom
Assessment &testing in the classroom
 
Chapter 3 Constructing Tests
Chapter 3  Constructing TestsChapter 3  Constructing Tests
Chapter 3 Constructing Tests
 
UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)
UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)
UTPL-LENGUAGE TESTING-I-BIMESTRE-(OCTUBRE 2011-FEBRERO 2012)
 
Rating scales Chapter 7 by Ahmet YUSUF
Rating scales Chapter 7  by Ahmet YUSUFRating scales Chapter 7  by Ahmet YUSUF
Rating scales Chapter 7 by Ahmet YUSUF
 
Design of a speaking test
Design of a speaking test Design of a speaking test
Design of a speaking test
 
Testing teacher's hand testing & examiner guide 2018
Testing teacher's hand testing & examiner guide 2018Testing teacher's hand testing & examiner guide 2018
Testing teacher's hand testing & examiner guide 2018
 
1 3
1 31 3
1 3
 
Building a base for better teaching
Building a base for  better teachingBuilding a base for  better teaching
Building a base for better teaching
 
50 3 10_norris
50 3 10_norris50 3 10_norris
50 3 10_norris
 
3-_basic_principles_of_assessment-1.ppt
3-_basic_principles_of_assessment-1.ppt3-_basic_principles_of_assessment-1.ppt
3-_basic_principles_of_assessment-1.ppt
 

Mais de syed ahmed

Creating a-business-letter
Creating a-business-letterCreating a-business-letter
Creating a-business-lettersyed ahmed
 
Speaking skill
Speaking skill Speaking skill
Speaking skill syed ahmed
 
Employment communication
Employment communicationEmployment communication
Employment communicationsyed ahmed
 
Effective Resume &Cover letter
Effective  Resume &Cover  letterEffective  Resume &Cover  letter
Effective Resume &Cover lettersyed ahmed
 
Abc is a key of capacity building
Abc is a key of capacity buildingAbc is a key of capacity building
Abc is a key of capacity buildingsyed ahmed
 
10 lessons-from-life-that-change-my-life (1)
10 lessons-from-life-that-change-my-life (1)10 lessons-from-life-that-change-my-life (1)
10 lessons-from-life-that-change-my-life (1)syed ahmed
 
Effective Resume & Cover letters
Effective Resume & Cover  lettersEffective Resume & Cover  letters
Effective Resume & Cover letterssyed ahmed
 
How to Read a Research Paper
How to Read  a Research Paper How to Read  a Research Paper
How to Read a Research Paper syed ahmed
 
Education past future
Education past future Education past future
Education past future syed ahmed
 
Pakistan teacher education and professional development program
Pakistan teacher education and professional development programPakistan teacher education and professional development program
Pakistan teacher education and professional development programsyed ahmed
 
Writing a problem_statement
Writing a problem_statementWriting a problem_statement
Writing a problem_statementsyed ahmed
 
239 325 ethics
239 325 ethics239 325 ethics
239 325 ethicssyed ahmed
 
Studying in netherlands a comprehensive guide
Studying in netherlands  a comprehensive guideStudying in netherlands  a comprehensive guide
Studying in netherlands a comprehensive guidesyed ahmed
 
Integrating ict as an integral teaching and learning tool into pre
Integrating ict as an integral teaching and learning tool into preIntegrating ict as an integral teaching and learning tool into pre
Integrating ict as an integral teaching and learning tool into presyed ahmed
 
Process of preparing effective business messages
Process of preparing effective business messagesProcess of preparing effective business messages
Process of preparing effective business messagessyed ahmed
 
Functions and responsibilities of
Functions and responsibilities ofFunctions and responsibilities of
Functions and responsibilities ofsyed ahmed
 
Chapter 14 affective
Chapter 14 affectiveChapter 14 affective
Chapter 14 affectivesyed ahmed
 
Emotional intelligence 3
Emotional intelligence 3Emotional intelligence 3
Emotional intelligence 3syed ahmed
 
Ten ways to enjoy blessings of life
Ten ways to enjoy blessings of lifeTen ways to enjoy blessings of life
Ten ways to enjoy blessings of lifesyed ahmed
 
Hope against hope
Hope against hopeHope against hope
Hope against hopesyed ahmed
 

Mais de syed ahmed (20)

Creating a-business-letter
Creating a-business-letterCreating a-business-letter
Creating a-business-letter
 
Speaking skill
Speaking skill Speaking skill
Speaking skill
 
Employment communication
Employment communicationEmployment communication
Employment communication
 
Effective Resume &Cover letter
Effective  Resume &Cover  letterEffective  Resume &Cover  letter
Effective Resume &Cover letter
 
Abc is a key of capacity building
Abc is a key of capacity buildingAbc is a key of capacity building
Abc is a key of capacity building
 
10 lessons-from-life-that-change-my-life (1)
10 lessons-from-life-that-change-my-life (1)10 lessons-from-life-that-change-my-life (1)
10 lessons-from-life-that-change-my-life (1)
 
Effective Resume & Cover letters
Effective Resume & Cover  lettersEffective Resume & Cover  letters
Effective Resume & Cover letters
 
How to Read a Research Paper
How to Read  a Research Paper How to Read  a Research Paper
How to Read a Research Paper
 
Education past future
Education past future Education past future
Education past future
 
Pakistan teacher education and professional development program
Pakistan teacher education and professional development programPakistan teacher education and professional development program
Pakistan teacher education and professional development program
 
Writing a problem_statement
Writing a problem_statementWriting a problem_statement
Writing a problem_statement
 
239 325 ethics
239 325 ethics239 325 ethics
239 325 ethics
 
Studying in netherlands a comprehensive guide
Studying in netherlands  a comprehensive guideStudying in netherlands  a comprehensive guide
Studying in netherlands a comprehensive guide
 
Integrating ict as an integral teaching and learning tool into pre
Integrating ict as an integral teaching and learning tool into preIntegrating ict as an integral teaching and learning tool into pre
Integrating ict as an integral teaching and learning tool into pre
 
Process of preparing effective business messages
Process of preparing effective business messagesProcess of preparing effective business messages
Process of preparing effective business messages
 
Functions and responsibilities of
Functions and responsibilities ofFunctions and responsibilities of
Functions and responsibilities of
 
Chapter 14 affective
Chapter 14 affectiveChapter 14 affective
Chapter 14 affective
 
Emotional intelligence 3
Emotional intelligence 3Emotional intelligence 3
Emotional intelligence 3
 
Ten ways to enjoy blessings of life
Ten ways to enjoy blessings of lifeTen ways to enjoy blessings of life
Ten ways to enjoy blessings of life
 
Hope against hope
Hope against hopeHope against hope
Hope against hope
 

Último

Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfAyushMahapatra5
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfciinovamais
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxheathfieldcps1
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdfQucHHunhnh
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingTeacherCyreneCayanan
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdfQucHHunhnh
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3JemimahLaneBuaron
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfAdmir Softic
 
General AI for Medical Educators April 2024
General AI for Medical Educators April 2024General AI for Medical Educators April 2024
General AI for Medical Educators April 2024Janet Corral
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactdawncurless
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDThiyagu K
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Celine George
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajanpragatimahajan3
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfJayanti Pande
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactPECB
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfchloefrazer622
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxVishalSingh1417
 

Último (20)

Class 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdfClass 11th Physics NEET formula sheet pdf
Class 11th Physics NEET formula sheet pdf
 
Activity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdfActivity 01 - Artificial Culture (1).pdf
Activity 01 - Artificial Culture (1).pdf
 
The basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptxThe basics of sentences session 2pptx copy.pptx
The basics of sentences session 2pptx copy.pptx
 
1029 - Danh muc Sach Giao Khoa 10 . pdf
1029 -  Danh muc Sach Giao Khoa 10 . pdf1029 -  Danh muc Sach Giao Khoa 10 . pdf
1029 - Danh muc Sach Giao Khoa 10 . pdf
 
fourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writingfourth grading exam for kindergarten in writing
fourth grading exam for kindergarten in writing
 
1029-Danh muc Sach Giao Khoa khoi 6.pdf
1029-Danh muc Sach Giao Khoa khoi  6.pdf1029-Danh muc Sach Giao Khoa khoi  6.pdf
1029-Danh muc Sach Giao Khoa khoi 6.pdf
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3Q4-W6-Restating Informational Text Grade 3
Q4-W6-Restating Informational Text Grade 3
 
Key note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdfKey note speaker Neum_Admir Softic_ENG.pdf
Key note speaker Neum_Admir Softic_ENG.pdf
 
General AI for Medical Educators April 2024
General AI for Medical Educators April 2024General AI for Medical Educators April 2024
General AI for Medical Educators April 2024
 
Accessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impactAccessible design: Minimum effort, maximum impact
Accessible design: Minimum effort, maximum impact
 
Measures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SDMeasures of Dispersion and Variability: Range, QD, AD and SD
Measures of Dispersion and Variability: Range, QD, AD and SD
 
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptxINDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
INDIA QUIZ 2024 RLAC DELHI UNIVERSITY.pptx
 
Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17Advanced Views - Calendar View in Odoo 17
Advanced Views - Calendar View in Odoo 17
 
social pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajansocial pharmacy d-pharm 1st year by Pragati K. Mahajan
social pharmacy d-pharm 1st year by Pragati K. Mahajan
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
Web & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdfWeb & Social Media Analytics Previous Year Question Paper.pdf
Web & Social Media Analytics Previous Year Question Paper.pdf
 
Beyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global ImpactBeyond the EU: DORA and NIS 2 Directive's Global Impact
Beyond the EU: DORA and NIS 2 Directive's Global Impact
 
Disha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdfDisha NEET Physics Guide for classes 11 and 12.pdf
Disha NEET Physics Guide for classes 11 and 12.pdf
 
Unit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptxUnit-IV- Pharma. Marketing Channels.pptx
Unit-IV- Pharma. Marketing Channels.pptx
 

Assesing speaking skills

  • 1. Assessing speaking skills:. a workshop for teacher development Ben Knight 294 Speaking skills are often considered the most important part of an EFL course, and yet the difficulties in testing oral skills frequently lead teachers into using inadequate oral tests or even not testing speaking skills at all. This article describes a workshop used in teacher development programmes to help teachers with one aspect of the problem of oral testing: what should we look for when we assess a student’s ability to speak English? The workshop looks first at the range of criteria that teachers might use in such assessment. Then it examines how the selection and weighting of those criteria should depend on the circumstances in which the test takes place. The article also discusses issues raised by the workshop, and considers its applicability to people working in different circumstances. Reasons for the Assessment of speaking skills often lags far behind the importance given workshop to teaching those skills in the curriculum. We recognize the importance of relevant and reliable assessment for providing vital information to the students and teachers about the progress made and the work to be done. We also recognize the importance of backwash (the effect of the test on the teaching and learning during the course). Most teachers would accept that ‘if you want to encourage oral ability, then test oral ability’ (Hughes, 1989:44). But the problems of testing oral ability make teachers either reluctant to take it on or lacking in any confidence in the validity of their assessments. Such problems include: the practical problem of finding the time, the facilities and the personnel for testing oral ability; the problem of designing productive and relevant speaking tasks; and the problem of being consistent (on different occasions, with different testees and between different assessors). Another problem, which is the focus of the workshop framework described here, is deciding which criteria to use in making an assessment. The workshop has two principal aims: 1 to make teachers more consciously aware of the different possible criteria they could be using to assess their students’ speaking skills; 2 to make teachers more aware of the way their selection and weighting of those criteria depend on the context in which they are to be used. Achieving these aims is crucial for making valid and reliable tests. Except where tests are being marked holistically (simply in terms of degrees of communicative success), marking involves the use of assessment criteria. Even when the assessment is holistic on the surface, the assessor may be thinking in terms of criteria in judging that overall communicative success (Bachman, 1990: 329). It is doubtful whether the criteria can be ELT Journal Volume 46/3 July 1992 © Oxford University Press 1992 articles welcome
  • 2. considered justified and validated if the assessor is not even explicitly aware of them. The reliability of an assessor on different occasions with different testees can be improved by more explicit criteria, as can the reliability between assessors. The workshop The workshop takes between about 1½ and 2¼ hours and requires two or three short video clips of students talking.] Making your own video clips is preferable, as you can make the task and situation reflect the type of test which the teachers you are addressing are most likely to use. Stage 1: a. Viewing and reflection. (10 mins.) assessment Teac h criteria ers are shown a video clip of a student (or students) talking and are asked to reflect on the question ‘Which aspects of the students’ speaking would affect the grade you would give the students for their speaking skills?‘. The presenter needs to say in advance how long the clip will be, and what instructions were given to the students. b. Discussion (15 mins.) Teachers can compare their notes in pairs or small groups, and then this discussion can open up into a plenary. The objective at this stage is to get the teachers to be more conscious of what affects their own judgements, and to see how others may view it differently. The presenter’s role will include pinning people down on vague terms, such as ‘communicative’ or ‘passive’ (I have heard ‘communicative’ being used for: i) easy to understand, ii) says a lot, iii) makes up for linguistic weakness with gestures, etc., iv) interacts well with other person in role-play). Another role is to elicit or suggest concrete examples (from the clip) of features being discussed (e.g. an example of ‘inappropriate language’). There is no need at this stage to try to resolve all the differences of opinion, as often those differences stem from different assumptions about the testing context, which will be looked at later. After this discussion, it is useful to show the clip again, so that people can reconsider the points made by themselves and others. c. List of assessment criteria (10-15 mins.) The presenter then hands out copies of the list of assessment criteria (see Figure 1). This list is fairly comprehensive in its broad categories, though within those there could be many more detailed criteria (for example, in Figure 1 Assessment criteria 1 GRAMMAR a. range b. accuracy 2 VOCABULARY a. range b. accuracy 3 PRONUNCIATION a. individual sounds (esp. phonemic distinctions) b. stress and rhythm Assessing speaking skills 295 articles welcome
   c. intonation
   d. linking/elision/assimilation
4 FLUENCY
   a. speed of talking
   b. hesitation while speaking
   c. hesitation before speaking
5 CONVERSATIONAL SKILL
   a. topic development
   b. initiative (in turn-taking and topic control)
   c. cohesion: i) with own utterances, ii) with interlocutor
   d. conversation maintenance (inc. clarification, repair, checking, pause fillers, etc.)
6 SOCIOLINGUISTIC SKILL
   a. distinguishing register and style (e.g. formal or informal, persuasive or conciliatory)
   b. use of cultural references
7 NON-VERBAL
   a. eye contact and body posture
   b. gestures, facial expressions
8 CONTENT
   a. coherence of arguments
   b. relevance

This list is fairly comprehensive in its broad categories, though within those there could be many more detailed criteria (for example, in an investigation of 'fluency' alone (Lennon, 1990: 404-405), 12 different variables were looked at, ranging from 'words per minute' to 'mean pause time at T-unit boundaries'). The amount of explanation needed for the terms on the list will of course depend on the teachers.

d. Viewing and comment (10-15 mins.)
The teachers are then shown another clip of students talking, and are asked to think about the usefulness and relevance of the criteria on the list for assessing the students' speaking skills, adding or deleting as they think necessary. The objective of this stage is to consider further the criteria they think are relevant for assessing speaking skills and also, by getting them to relate their views to the terms on the list, to give them a common vocabulary to aid discussion of differences of opinion. I have found that teachers tend to be less talkative here, since this is a point of mental reorganization for them as they try to relate their own feelings and experience to the list.

Stage 2: Assessment and the context

a. Introduction (5 mins.)
By this stage, the question of context should have arisen several times (e.g. in the form of comments beginning 'Well, it depends on why . . . / what . . . / where . . . / how . . .'). The presenter now recalls various examples of these and notes how they show the importance of the context in deciding the choice of assessment criteria.
b. Examples (10-15 mins.)
Teachers are then given the hand-out with examples of different selections and weightings of criteria, together with descriptions of the relevant contexts (see below). Note that the number in the 'Weighting' column does not represent the maximum mark for that criterion, but its value relative to the other criteria. For example, each criterion might be given a mark out of ten, and each score would then be multiplied by its weighting number before the scores are totalled (a short illustrative calculation follows the first example below). Teachers can then be encouraged to ask any questions about the criteria, the context, and the relationship between the two. For example: 'Why did you include listening comprehension in the placement test, but not in the end-of-term test?' It would, of course, be wise for you as presenter to use your own examples (criteria you have used yourself) so that you are more likely to be able to answer such questions.

Criteria and context
Look at these two examples of differences in the selection and weighting of assessment criteria. Note how the two criteria sets vary according to the situation, and try to list the factors in the testing situation which can affect such selection and weighting.

1 Placement test
A placement test for university students (to decide which level they go into for their General English Communication course). This is an interview (basically answering questions) with a teacher who is both interviewer and scorer at the same time. It is taken together with a single written gap-fill test, which assesses knowledge of grammatical structures, vocabulary, and functionally appropriate structures. In informal terms, the qualities we felt were important for success in a class were the ability to understand the teacher; knowledge of vocabulary, structures, and functions (largely tested in the written gap-fill test); confidence and a willingness to take chances and try things out; and the ability to distinguish (productively) basic phonemes in English. The category of 'range of grammar and vocabulary' aims to capture people with a wide experience of English (who we thought would progress more quickly).

Criterion (numbers on main list) / Weighting
1 Range of grammar and vocabulary (1a and 2a): 3
2 Accuracy of grammar and vocabulary (1b and 2b): 2
3 Phonemic distinctions (3a): 2
4 Hesitation (4b, c): 4
5 Initiative, topic development, and conversational control (5a, b, d): 4
6 Listening comprehension (not listed): 5
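To make the weighting mechanics concrete, here is a minimal sketch of the calculation described above, using the placement-test weightings. It is not part of the original workshop materials: the 0-10 scale for raw scores, the function name, and the sample raw scores are all assumptions introduced purely for illustration.

```python
# Minimal sketch of the weighted scoring described above.
# Assumptions (not from the article): each criterion is scored 0-10,
# and the sample raw scores below are invented for illustration.

placement_weights = {
    "range of grammar and vocabulary": 3,
    "accuracy of grammar and vocabulary": 2,
    "phonemic distinctions": 2,
    "hesitation": 4,
    "initiative/topic development/conversational control": 4,
    "listening comprehension": 5,
}

def weighted_total(raw_scores, weights):
    """Multiply each raw score (out of ten) by its weighting and sum the results."""
    return sum(raw_scores[criterion] * weight for criterion, weight in weights.items())

# Invented raw scores for one candidate, each out of ten.
sample_scores = {
    "range of grammar and vocabulary": 6,
    "accuracy of grammar and vocabulary": 7,
    "phonemic distinctions": 5,
    "hesitation": 4,
    "initiative/topic development/conversational control": 6,
    "listening comprehension": 8,
}

print(weighted_total(sample_scores, placement_weights))
# 6*3 + 7*2 + 5*2 + 4*4 + 6*4 + 8*5 = 122 out of a possible 10 * 20 = 200
```

The point of the weighting is visible in the result: listening comprehension (weight 5) moves the total far more than accuracy of grammar and vocabulary (weight 2), reflecting what the placement decision values.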
2 End-of-term ESP test
This test was given at the end of one term of a course for receptionists in international hotels, to see how much the students had progressed and what they needed to work on the following term. The speaking test was a role-play with two students, and the teacher only observed and marked the score sheet. There were several other tests: gap-fill and multiple-choice tests of grammar, vocabulary, and functions, and a listening comprehension test.

Criterion (numbers on main list) / Weighting
1 Grammar and vocabulary (1 and 2): 3
2 Pronunciation (3)
   a. individual sounds: 1
   b. stress and rhythm: 1
   c. intonation and linking: 1
3 Fluency (4)
   a. hesitation before speaking
   b. hesitation while speaking
4 Conversational skill (5)
   a. cohesion
   b. conversation maintenance
5 Sociolinguistic skill (6)
   a. distinguishing register and style
   b. use of cultural references
6 Non-verbal (7)
   a. eye contact and body posture
   b. gestures and facial expressions
7 Content (relevance) (8)
(The weightings for the remaining criteria are not recoverable from this copy.)

c. Context variables (15 mins.)
By looking at the examples and thinking of their own experience, the teachers are asked to abstract the variables in the context which may affect the choice and weighting of the criteria. The variables could include the following:
i. The purpose of the test: achievement, proficiency, predictive, or diagnostic? And, depending on that: the course objectives; the underlying theory (formal or informal) of what constitutes language proficiency; the future situations the students are being tested for; and the types of feedback the students in question would understand and could benefit from.
ii. The circumstances of eliciting the sample of language being assessed: the degree of freedom or control over what the student could say and do; the number of participants and their roles in the situation.
iii. Observation restrictions: the extent to which the assessor participates in the speaking situation (e.g. as interviewer or observer); whether the performance is recorded or not (on audio or video cassette).
iv. The other tests in the battery (e.g. the selection or weighting of a criterion for grammatical accuracy may depend on how much it has been assessed in accompanying written tests).

d. Using the different criteria sets (15 mins.) (Optional if short of time.)
Teachers watch another video clip and assess the student's oral skills using first one of the example criteria sets and then the other.
This is to demonstrate how different criteria sets (appropriate in different contexts) can produce different assessments of the same performance. Different criteria sets will not always produce differing results, and so care needs to be taken to use a clip which will make the point clearly.

e. Task (20-30 mins.)
Teachers are given details of a testing situation (preferably compatible with their own) and are asked to decide on the criteria they would use and the weighting for those criteria. There follows an example of such a task:

TASK (Selecting assessment criteria)
You should imagine you are responsible for the oral testing of 100 students who have applied to study in U.S. colleges for one year (various subjects). You have to judge whether they have sufficient speaking skills to survive and do well there. You can conduct a 10-minute interview for each (with only one interviewer/assessor for each interview). The interviewers are all experienced and trained EFL teachers. The other tests in the battery will be multiple-choice tests of grammar and vocabulary, of academic reading and lecture listening comprehension, and essay-type questions. Decide on the criteria you would use in assessing their spoken English, and the relative weighting of each.

The purpose of this task is:
- for teachers to think more concretely about the points raised so far, and to let them see how referring to a particular context (described in the task) can reduce the differences of opinion that arise when talking more generally;
- to provide an intermediary stage between the thinking done earlier in the workshop and the application of that thinking to their own situations after the workshop.

f. Conclusion (5 mins.)
The presenter can ask the teachers for their comments on the workshop: how useful it was, how it could have been more useful, whether they think they would change the way they assess their students' speaking skills, and so on.

Discussion

1. Objective criteria?
There is still a great deal of subjectivity in a) the selection of criteria, and b) the way each criterion is measured (e.g. how exactly do you decide the grammatical accuracy of a speaker's performance?). The workshop aims only to improve the quality of those subjective decisions about selecting criteria by making them more conscious and explicit, and by giving the teachers a chance to discuss other points of view. It assumes that teachers do not have the resources to carry out their own research. A kind of collective subjectivity about how each criterion is measured can be reached through 'training' or 'moderating' sessions for assessors. But for those who have the time and resources to look closely and objectively at these questions, the following will be of interest: Hinofotis (1983), Hieke (1985), Fulcher (1987), and Lennon (1990).
2. Analytic or holistic assessment?
Several tests, such as the RSA Profile Certificate or the ILR (Interagency Language Roundtable) oral interview, use a different type of criterion from the one used in this workshop. The speakers are observed in different speaking tasks and are simply judged on their degree of success in each task. This holistic approach argues that, as we cannot directly observe mental characteristics like grammatical knowledge or the ability to maintain conversations, it will be inaccurate to give (inferred) scores for them; rather, we should simply assess the learner's (observable) success in performing authentic language tasks. The approach behind this workshop, however, argues that it is those mental abilities (which we must infer from the learner's performance) that we are most interested in, for at least the following reasons. Firstly, we cannot predict, let alone test for, every function and situation for which a learner might need English. Therefore any claim about general proficiency must involve a lot of inferences from the few sample functions/situations we do test. Secondly, a lot of teachers' own tests are partly diagnostic, and teachers need to know more about why a particular learner performed badly in some situations and better in others. This will usually come down to inferred component abilities, such as range of vocabulary or fluency. For a detailed discussion of the two approaches, see Bachman (1990: 41-42, 301-333). Hughes (1989: 110) recommends using both approaches, with one as a check on the other.

3. When to do this workshop?
On its own, this workshop may seem peripheral, as teachers often worry more about such problems as making a practical test, setting fair tasks, getting a fair sample of the students' language, and being consistent. However, it is probably helpful to tackle the problem of assessment criteria before these other questions, since we need to start by deciding what we want to measure before deciding on the most reliable and practical means to measure it. The danger, otherwise, is that we choose reliable and practical tests (e.g. multiple-choice tests) which do not give us the information we really want about our students' oral skills, and which can have a negative effect on students' attitudes to developing those skills during the course.

4. Too complicated?
Considering the context in selecting assessment criteria does make the discussion more complicated. So with teachers for whom this topic is completely new, it would probably be better to leave such considerations aside or condense them severely. I have found some untrained teachers saying that they wished they could have come away from the workshop with one fixed set of 'best criteria'. Gebhard (1990: 158, 160) reports that handed-down direction is preferred by beginning teachers (quoting research in Copeland, 1982) and by teachers in certain countries who feel that 'if the teacher is not given direction by the supervisor, then the supervisor is not considered qualified'. However, taking the testing context into account is valuable, despite the added complexity, in dealing with two common problems in teacher-development work.
Firstly, it makes it easier for teachers to apply what they learn in the workshop to their own situations, especially when they are working in contexts very different from that of the presenter. This is also helped by the final task (which is an exercise in applying principles learnt during the workshop to a particular situation). Secondly, it helps resolve conflicts of opinion. Many of the disagreements at the beginning of a workshop can be related to different assumptions about the testing context and its effect on the selection of criteria. Thus, it not only improves our understanding, but also improves the conduct of the workshop: it avoids the 'anything goes' approach which creates cynicism, and it reduces the ex cathedra judgements by the presenter which can lead to resentment or passivity.

5. Usable in other circumstances?
The main limitations of this workshop are:
a. There must be a video player and some video clips of students speaking (this is best done by recording students speaking in the situations most likely to be used in tests by those teachers, i.e. a video camera is helpful).
b. The teachers must have a sufficient level of English (or other target language) to assess the students and to contribute to the discussion.
c. The teachers need to have some experience of assessing students' speaking skills. Pre-service teachers would probably be overwhelmed by the workshop in its present form.

The workshop, however, need not be limited to native speakers, trained teachers, or EFL teachers, as it has proved as useful to non-native speakers, untrained (but practising) teachers, and teachers of other languages. Each of these three latter groups may have certain characteristics which should be taken into account. Some non-native speakers tend to place far greater emphasis on grammatical accuracy to begin with, though this is usually due to the way they learnt English rather than to any difficulty in perceiving discourse and sociolinguistic skills. Experienced, untrained teachers often lack the relevant vocabulary to talk clearly about learning and teaching, and sometimes appear to be more dogmatic or to lose perspective (e.g. claiming that the single most important criterion in assessing speaking skills is whether the speaker keeps eye contact with the listener). While it is obvious that the details of grammatical accuracy and pronunciation will be different for other languages, that is probably also true of the other criteria, such as conversation maintenance and non-verbal behaviour. However, provided their level of English is sufficient for the tasks, the workshop's aims (awareness of the range of different possible criteria, and awareness of how their selection depends on the testing context) can be met for teachers of different languages in the same group.
It should also be noted that these three groups bring different perspectives which can only enrich the discussions: for example, suggesting as a criterion for communication between non-native speakers 'the ability to adapt your level of English to that of your interlocutor', or (from a Japanese teacher) 'the skill of stating an opinion in such a way that it is easy for the listener to disagree without seeming argumentative'.

Conclusions
The workshop works as a way of stimulating teachers to think about and discuss the way they assess their students' speaking skills. It is rare for a participant not to be absorbed by the tasks and the exchange of ideas. A few participants have found it rather frustrating not to have a fixed set of testing criteria at the end of the workshop, but most seemed to find the process of relating criteria to context helpful in clarifying their own positions. A large survey of teachers' testing experience found 'there is evidence that most [teachers] prefer to use informal and flexible approaches which can be adapted to different student populations' (Brindley, 1989: 31). This workshop suits such preferences.

Received September 1991

Note
1 The video clips I have used (about 3-5 mins. long) mainly showed students speaking in pairs, usually in a simple role-play. For example:
Student A: You want to go on a holiday to Hawaii. Try to persuade your partner to come with you, though he or she wants to go somewhere else.
Student B: You want to go on a holiday to Europe. Try to persuade your partner to come with you, though he or she wants to go somewhere else.

References
Bachman, L. 1990. Fundamental Considerations in Language Testing. Oxford: Oxford University Press.
Brindley, G. 1989. Assessing Achievement in the Learner-centred Curriculum. Sydney: National Centre for English Language Teaching and Research.
Copeland, W. 1982. 'Student teachers' preference for supervisory approach'. Journal of Teacher Education XXXIII/2: 32-36.
Fulcher, G. 1987. 'Tests of oral performance: the need for data-based criteria'. ELT Journal XLI/4: 287-291.
Gebhard, J. 1990. 'Models of supervision: choices' in Richards, J. and D. Nunan (eds.) 1990.
Hieke, A. 1985. 'A componential approach to oral fluency evaluation'. The Modern Language Journal LXIX/2: 135-42.
Hinofotis, F. 1983. 'The structure of oral communication in an educational environment: a comparison of factor analytic rational procedures' in Oller, J. (ed.) 1983.
Hughes, A. 1989. Testing for Language Teachers. Cambridge: Cambridge University Press.
Lennon, P. 1990. 'Investigating fluency in EFL: a quantitative approach'. Language Learning XL/3: 387-417.
Oller, J. (ed.) 1983. Issues in Language Testing Research. Rowley, Mass.: Newbury House.
Richards, J. and D. Nunan (eds.) 1990. Second Language Teacher Education. Cambridge: Cambridge University Press.

The author
Ben Knight teaches EFL and linguistics at Shinshu University, Japan. He obtained an MSc in Applied Linguistics from the University of Edinburgh in 1987, and has taught EFL/ESL in Britain, Kenya, Italy, India, and Sri Lanka. His current interests include the testing of spoken English, teacher development, and language learning beyond the classroom.