LLTI Archives

May 1999, Week 3

LLTI@LISTSERV.DARTMOUTH.EDU

Subject:
From: LLTI-Editor <[log in to unmask]>
Reply-To: Language Learning and Technology International Information Forum <[log in to unmask]>
Date: Wed, 19 May 1999 13:20:25 EDT
Content-Type: text/plain
Parts/Attachments: text/plain (73 lines)

--- Forwarded Message from "David Pankratz" <[log in to unmask]> ---

>Date: Wed, 19 May 1999 11:01:48 -0500
>From: "David Pankratz" <[log in to unmask]>
>To: [log in to unmask]
>Subject: #5013 BYU exam scoring -Reply

Cindy,

We, too, have had difficulty setting scores for placement. I think
that the moral of the story is that you have to be flexible and aware
of quite a few pitfalls. For example, you mention that you might be
having trouble getting accurate data because the exams don't "count."
I think that this is a critical problem. Students who see no personal
benefit to taking an exam often score ridiculously low. (I have even
seen scores for a student go _down_ over the course of the semester
because they see no reason to do well on the exam the second time
around.) Also, we have noticed that a student who takes the exam
twice may score several points differently from one attempt to the
next.

I think it is important to keep in mind that the computerized exam I
assume you are using evaluates only grammar, vocabulary, and reading
on a very passive level, i.e., no listening, speaking, or production
of any kind is measured. So, for placement, we also take into
consideration the amount of prior study, when and where that study
occurred, and the student's self-confidence, motivation, and
objectives. Ideally, oral and other production skills would be
measured as well.

I would like to see students tracked so that we can find out whether
placements were indeed appropriate, but that has not yet been done in
my department in any formalized way. Our feedback remains anecdotal
for the most part.

I would be happy to talk with you more about this if you would like to
give me a call.

David Pankratz
Loyola University Chicago



>>> LLTI-Editor <[log in to unmask]> 05/18 3:13 pm >>>
--- Forwarded Message from Cindy Evans <[log in to unmask]> ---

>Date: Mon, 17 May 1999 15:50:12 -0400
>To: [log in to unmask]
>From: Cindy Evans <[log in to unmask]>
>Subject: BYU exam scoring

We have administered the BYU exams at the beginning and end of each
semester this year to elementary and intermediate classes, with the
aim of establishing benchmark scores for placement. Even with one
year's data, we are having trouble setting the benchmarks, as our
scores are evenly spread over a wide range. I have a feeling that
this may have something to do with the fact that the exam didn't
"count" in any way.

I'd like to hear from others who are using the BYU exams.  How did you
establish your benchmark scores?  If you're willing to share the
information, I'd be interested to know what score ranges you use for
different levels.

Thanks for any advice.

_____________________________________________________________
Cindy Evans, Director
Foreign Language Resource Center        [log in to unmask]
and Lecturer in French                  phone:  518-580-5205
Skidmore College                        fax:    518-580-5230
_____________________________________________________________
