
Placement


In many schools, placement in class is done according to age, although there may be a system of streaming within year groups.  In language-teaching institutions, however, placing students in a class at the right level, following an appropriate syllabus and with a suitable social profile is a rather more challenging undertaking.



The balancing act

Good tests manage to balance reliability, validity and practicality but, as you will know if you have followed the guide to testing on this site or have taken a good training course, the grail of making a test which maximises the three variables is probably unobtainable.
Placement testing needs to compromise to get the best results for its purpose: putting learners in groups where they are challenged but not overwhelmed, and where they feel comfortable.


Validity

  1. Content validity
    Validity refers, among other things, to whether a test actually tests what we think, or claim, it tests.
    To maximise its validity, a placement test needs to match the kinds of teaching that our institution delivers.  For example, in a school dedicated to preparing people for examinations, the placement test has to measure how well the learner is already able to achieve the kinds of tasks we are training them to do in classes.
    In a school focused on other areas, for example, preparing people for study in an English-speaking institution or for the workplace, the test will need to reflect those aims.
    Despite the claims of those who develop placement tests for commercial purposes, there is no one-size-fits-all test that will be appropriate in all situations.
    Nevertheless, there are commercially available tests which may fit well with your institution's aims and approaches and which are affordable and can be easily administered.  Some will even come with an on-line marking system to give you instant results.  If an off-the-peg test is what you need, then a little research will enable you to track down professionally designed and attractive tests.
    If that is not appropriate, you, the Academic Manager, will have to select, design or develop testing materials which are valid for the aims of your teaching programme.
  2. Face validity
    This is an important criterion to consider because learners need to trust the outcomes of the test, and for that it has to look like a fair test and be long and searching enough for them to feel that the results are a proper measure.
    You may believe, and you may be right, that you can place a learner accurately in a class by chatting to them for five minutes or by assessing a 100-word written application.
    Most learners will not see that as a valid test and will not trust its outcomes.
    If, later, they feel that they have been misplaced, the feeling will be intensified by the thought that they were not properly assessed in the first place.

Reliability

Just as important is the knowledge that the test will produce comparable results whoever takes it, whoever marks it and wherever it is taken.

  1. The environment
    Few schools have the luxury of setting aside a space where only placement testing occurs so the physical conditions are standard and unchanging.
    Nevertheless, you need to give some thought to whether the results of a test completed by someone sitting in a noisy corridor with the paper balanced on one knee will be comparable, in terms of reflecting the test-takers' ability, with those of someone taking the test in a quiet office at a desk with a proper chair.
    Care needs to be taken to neutralise variability in the environment.
  2. The marker
    The more objectively marked the test is, the less subject the results will be to marker variability.  Many placement tests include assessment of learners' oral ability and this is often accomplished through an interview.
    Even if the same person conducts the tests, variability can creep in if the procedure isn't carefully scripted and consistent each time it is carried out.
  3. The test takers
    Some people prefer to work slowly and methodically through a test without making guesses.  Others are more willing to take risks and have a stab at the right answer even if they don't know.
    In the first case, someone who completes, say, only half a multiple-choice test but scores 40% overall may be at a higher level than someone who has completed the whole test and gets the same score because the second person might get 25% of the answers right by guessing from the four possibilities in each question.
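A rough calculation shows how this works.  The sketch below (Python, with purely illustrative figures, assuming a 100-item test with four options per item) compares a careful candidate who answers only the items they are sure of with a risk-taker who guesses everything they don't know.

    def expected_score(known, guessed, options=4, total=100):
        # Expected percentage score: items known for certain, plus a
        # 1-in-`options` chance on every item answered by blind guessing.
        return (known + guessed / options) / total * 100

    # Careful candidate: answers the 40 items they are sure of and leaves the rest blank.
    print(expected_score(known=40, guessed=0))    # 40.0

    # Risk-taker: genuinely knows only 20 items but guesses the other 80.
    print(expected_score(known=20, guessed=80))   # 40.0 -- same headline score, lower real level

The two candidates end up with identical raw scores even though their underlying levels are quite different, which is why completion rate and working style need to be kept in mind when interpreting results.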

Practicality

This is the final important area.

  1. In large organisations where many prospective new students have to be tested and placed quickly (for example, at the beginning of terms or in summer-school environments), practicality is possibly the most important criterion.  There is little point in administering a test if it takes so long to mark, rank and interpret the results that a day has slipped by before you are ready to start teaching.
  2. At other times, you may have the luxury of only having one or two tests to mark and assess, with the students' classes not starting imminently, so time pressures are eased.
  3. However, if a test is to perform both functions, as a mass testing system and for the odd individual, it is the former function which will take precedence.  For the sake of reliability, of course, you should not have two tests serving these two functions because outcomes will very probably not be comparable.
  4. Whether a test is a computer-based or a pen-and-paper exercise is also a question of practicality.  Computer-based tests are eminently practical if you want very quick results.  Pen-and-paper tests usually require some manual marking, and that is time-consuming.  There is also the possibility of errors creeping into the marking process.
    If you need to test 50 learners at the same time, a computer-based test will require 50 terminals.  If you don't have that kind of facility, then a pen-and-paper test is your only viable option.
  5. Getting learners to complete on-line tests before they arrive at your institution's door is a popular and eminently practical option in some settings but reliability is severely compromised because you have no control at all over the conditions.
  6. A final constraint on practicality is oral testing because, by its nature, it is at best a one-to-two setting and no one person can possibly carry out a test for 100 learners in a day.  Having multiple assessors impacts negatively on reliability unless test design, training and supervision are very efficient indeed.  That is not insuperable.


What to test

It is noted above that what you test will reflect the aims of your teaching programme.  If you are only teaching people to take part in business meetings, then the ability to do so is what you will test.
If, however, your concern is more with General English or the core grammatical, functional and lexical aspects of the language, then a much more general test is called for.
The first trap to avoid is only testing grammar.
In many settings around the world, multiple-choice grammar tests are commonplace in classrooms and outside them.  Learners from backgrounds like these may perform well in such tests but have a poor vocabulary and little idea of communicative functions.  They may also be good at recognising grammar, but less able to use it.
Lexis and appropriacy are also easy to test via a multiple-choice task format and should be mixed in to make the test as valid as possible.  Depending on your situation, you may also want to test listening comprehension, spelling and a range of other skills.


Here's some help

For an example of a 100-item placement test for General English purposes, you can do two things:

  1. Click here for an on-line test (it opens in a new tab, so close it to come back).
  2. Click on these links for the same test in pen-and-paper format.
    Part 1
    Part 2

The answer key is at the end of the paper versions.  If you use them, please make sure that you credit this site.  If you are using the test, consider whether it might be appropriate to donate the price of a cup of coffee to the charity of your choice every tenth time you use the test.
Thank you.

The test comes in two parts and there's a reason for that.  It was noted above that test-takers vary in their approaches so the first half of the test is easier than the second part.  Someone who takes a long time to do the test but gets 80% of the first 50 questions right may well be at a higher level than someone who rushes through and completes the test by guessing at the answers.  In a multiple-choice test like this, random answering will gain 25% of the marks on offer.
For levels between A1 and B1, the first 50 questions will be a reliable guide.  Above that level, you'll need to take the score from Part 2 into account.
Here's a scoring guide based on the Common European Framework levels which start at A1 (the lowest level) and go up through A2, B1, B2, C1 and C2 (the highest level).

Score     Level
0 – 15    A1
16 – 30   A2
31 – 50   B1
51 – 70   B2
71 – 90   C1
91 +      C2
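If you record results in a spreadsheet or script, a simple lookup converts a raw score out of 100 into a level.  The sketch below is one way of doing it in Python, using the band boundaries from the table above; the function name is our own illustration, not part of the test materials.

    def cefr_level(score):
        # Map a raw score out of 100 to a CEFR level using the bands above.
        bands = [(15, "A1"), (30, "A2"), (50, "B1"), (70, "B2"), (90, "C1")]
        for upper, level in bands:
            if score <= upper:
                return level
        return "C2"  # 91 and above

    print(cefr_level(42))  # B1
    print(cefr_level(95))  # C2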


Testing oral skills

In addition to a pen-and-paper test of lexis, grammar and so on, you probably want to gauge learners' oral / aural skills.  A simple way to do this is by talking to them, of course, but you need to be a bit more careful than just having a chat.

The approach taken here is to have a scripted conversation which is standard for all interviews.  This way, even if you have multiple interviewers, there is some chance of being able to standardise the marking and placement procedure.
The script below is designed for interviews in pairs but, if you are interviewing one to one, you'll need to make small adjustments.
The age of the interviewee will also need to be taken into account when selecting which questions to ask.

Interviews should not simply be question-and-answer sessions but should allow the students to demonstrate their oral communicative ability, so don’t be afraid to ask more open-ended follow-up questions and, with students of B1 / B2 ability and above, ask them to question their partner.
Remember: you are not judging formal grammatical accuracy but the ability to understand and be understood.

Level descriptors for listening and speaking ability are taken from the Common European Framework. They are:

A1: Can understand basic instructions or take part in a basic factual conversation on a predictable topic.
A2: Can express simple opinions or requirements in a familiar context.
B1: Can express opinions on abstract/cultural matters in a limited way or offer advice within a known area, and understand instructions or public announcements.
B2: Can follow or give a talk on a familiar topic or keep up a conversation on a fairly wide range of topics.
C1: Can contribute effectively to meetings and seminars within own area of work or keep up a casual conversation with a good degree of fluency, coping with abstract expressions.
C2: Can advise on or talk about complex or sensitive issues, understanding colloquial references and dealing confidently with hostile questions.

Don’t try to work through all the levels – stop the interview where it is clear that communication has broken down and enter a score for the last level at which communication was reasonably successful.

Suggested questions / elicitations

Later questions will be influenced by earlier answers so you’ll need to be flexible.
Questions – don’t ask all of them but don’t be afraid to ask follow-up questions.

A1-level students should be able to respond comprehensibly to these prompts but may not want or be able to volunteer much.
What’s your name?
Can you tell me your name?
Please tell me your name.
What’s your first language?
What’s your nationality?
How old are you?
What country do you come from?
Where do you live?
A2-level students should be able to respond comprehensibly to these prompts and may be able to volunteer a little.
How do you spell your name?
How long are you staying in England?
Have you been to England before?
When did you arrive in England?
Do you have any brothers and sisters?
How old is / are he / she / they?
Do you live in the city or in the country?
Where in ... do you live, exactly?
Where do you go to school?
What sports do you like?
B1-level students should be able to respond comprehensibly to these prompts and should also be able to volunteer information.
Tell me about / Ask your partner about:
His / her / your home town.
His / her / your family.
His / her / your best friend(s).
His / her / your house / apartment.
His / her / your last holiday.
His / her / your school and its teachers.
His / her / your country.
His / her / your class at school.
B2-level students should be able to respond comprehensibly to these prompts and easily and naturally be able to volunteer information.
What do you expect from your course with us?
What sorts of things do you want to learn?
What sort of social activities do you like doing?
Tell me about your hobbies and interests.  What do you enjoy doing in your spare time?
What are / were your best subjects at school?
What subjects at school do / did you enjoy most?
Ask your partner about …
(see the topics in this area).
C1-level students should be able to respond comprehensibly to these prompts and easily and naturally be able to volunteer considerable information.
What are your plans for the future?  Do you think you’ll need to speak English for that?  Why?
Where else have you been in the world?  Tell me about the things you liked in …
What did you do on your last holiday?  What did you like about it?
What sorts of sports do you play / books do you read and why do you enjoy them?
Ask your partner about …
(see the topics in this area).
C2-level students should be able to respond comprehensibly to these prompts and easily and naturally be able to volunteer information and take turns in the conversation.
Tell me about the best / most important thing that’s ever happened to you.  What made it so good / important?
Tell me about yourself.  What sort of a person are you?  What are your strengths and weaknesses?
Tell me about your ambitions.  What qualities would you need to do that?
Ask your partner about …
(see the topics in this area).
The ability to initiate, produce and respond to follow-up questions is important at this level.


Making class groups

The whole purpose of placement testing is, naturally, to give you the data from which you can build groups of learners or place individuals into already formed groups.
In an ideal world, groups would have a comfortable mix of sexes, ages, first languages, interests and personalities but the world isn't ideal.
In most settings, you should expect the results of testing lots of learners to give you a bell curve of abilities, with fewer learners at the bottom and the top of the range and most occupying a spot somewhere in the middle between A1 and C2 (if, indeed, you have any students at all at those extreme levels).
Typically, it will look something like this:
[Figure: a bell curve of learner levels]
This means that, with large numbers of learners, you will be able to build parallel classes at the levels in the centre of the range, and you can, and should, then take the mix of other, non-language factors into account.
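If you have the scores in electronic form, a simple routine can produce draft class lists from them.  The sketch below (Python; the names and the round-robin approach are our own illustration, not a prescribed method) groups learners by level and then deals each level group out across parallel classes so that first languages are spread rather than clustered.  Treat the output as a starting point: the final mix of ages, sexes and personalities still needs a human eye.

    from itertools import cycle

    def draft_classes(learners, class_size):
        # learners: a list of (name, level, first_language) tuples.
        # Group by level, then deal each group out round-robin so that
        # learners sharing a first language land in different parallel classes.
        by_level = {}
        for learner in sorted(learners, key=lambda x: x[2]):
            by_level.setdefault(learner[1], []).append(learner)

        classes = {}
        for level, group in by_level.items():
            n_classes = max(1, round(len(group) / class_size))
            targets = cycle(range(n_classes))
            for learner in group:
                classes.setdefault((level, next(targets)), []).append(learner)
        return classes

    students = [("Ana", "B1", "Spanish"), ("Yuki", "B1", "Japanese"),
                ("Luca", "B1", "Italian"), ("Marta", "A2", "Polish")]
    for (level, index), members in sorted(draft_classes(students, class_size=2).items()):
        print(level, index + 1, [name for name, _, _ in members])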