
Monday, May 30, 2016

Selfie Videos as a Tool for Language Learning


photo credit: Körsbärsblommorna i Kungsträdgården 2016 via photopin (license)



Having been a teacher for some time, I have seen firsthand the impact the adoption of technology has had on teachers' and students' lives. With that in mind, one cannot deny that it is important to adopt technology for teaching. In line with this premise, I would like to share something I learned in one of the many interesting presentations at the 2016 TESOL International Convention & English Language Expo in Baltimore, USA. This practice-oriented presentation (by Loni Thorson, Kyla Masciarelli, and Christine Discoe) was entitled "Using Selfies to Promote Language Learning."

What the presenters pointed out was that technology is what students want. Linking the drive to communicate with the technology available to us, selfies are a worldwide trend today. One point in favor of using selfies, the presenters argued, is that video chat is a growing trend. This is really true, and the proof is that if we look around, we will see people taking selfie videos or photos almost all the time. Besides that, video chat through FaceTime, Skype, or other channels is quite frequent among learners young and old. Educators have to admit that this is a sign that people in general are comfortable with this technology. This brings us to the first argument they presented in favor of using selfies as a means to learn a language: classroom comfort.

Classroom comfort tells us that in order to have effective and authentic tasks, students need to be comfortable with the assignment. We observe that students are very comfortable with their cell phones; in fact, they are uncomfortable without them. Social comfort is also important: students need to be comfortable with the technology (the cell phone). Being digital natives, students are used to seeing themselves in videos, and they want that image to be curated. They want to look good, they want to sound good, and they want their pronunciation to be good. This is exactly what we teachers want, too: students with a natural desire to self-correct in terms of how they sound and how they look. Video chat is a comfortable environment for them.

When people make a selfie video, they generally describe their surroundings and give an update on what they are doing; they also explain if they are having a problem or if they are sick. All this updating creates one-to-one interaction and, as a result, increases comfort between students and viewers. A comfortable relationship with the teacher is created through this open communication channel. It also creates comfort among students as they see themselves and their classmates in the videos. As time goes by, students who might not have been happy with how they looked or sounded feel more comfortable seeing and listening to themselves. Some report never having listened to or watched themselves before, and over time they report feeling more comfortable doing so.

Why are selfies important?
Some of the reasons that convince us that using selfies in the language classroom is useful relate to comfort and attention. There are two types of attention: inward attention and outward attention. They are mutually exclusive; you cannot have both going on at the same time. Why is it important to understand this concept when making selfie videos? While making a selfie video, students not only direct their attention outward, but they also have to direct it inward to see what is happening to themselves. They correct themselves during the video and sometimes after it. This kind of attention works as a meter against which they evaluate their performance, and as a result, they record multiple times just to make sure they get it right. They become aware of their own self-presentation, and as they make more selfies for assessment or as a class task, they grow more confident in their performance and become more confident and fluent speakers.

Monday, June 01, 2015

Google Hangouts: Not Your Regular Test Validation Meeting

An important component of the assessment design cycle is validating the instruments, and to that effect we count on the group of teachers working with that particular level/course. This collective validation process used to take the form of a traditional meeting at our school’s Main Branch, usually in a room big enough to accommodate a group of around thirty teachers (sometimes more).
I’d already been adopting some group work dynamics in order to optimize the use of time, hopefully enabling teachers to make the best of the experience of collectively analyzing the test. In a nutshell, I wanted a productive, pleasant atmosphere where not only the outspoken individuals had a go at critiquing and sharing their views. I wanted all of them to feel comfortable enough to voice their concerns and suggestions to tweak the assessment instrument at hand. Teachers worked in small groups of five to six people, appointing a spokesperson who would be in charge of communicating the group’s opinions/suggestions regarding the test.
That had been working quite well. So, it occurred to me: they worked so well within their small groups, usually sitting with fellow teachers from the same branch, who have been sharing their experiences on a regular basis. I couldn’t help but wonder if we could make the validation process even more practical. That was when I had the idea to try out Google Hangouts for Test Validation Meetings. This is how we did it.
Let’s Hangout
Teachers were asked to attend the Validation Hangout at their branches; therefore, they worked with small groups of fellow teachers with whom they connect/exchange every day. They appointed their Hangout representative/spokesperson and went about their business of analyzing the test.
Adjustments along the way
The three Hangouts we had this semester were two-hour-long events. In the first Hangout, I took the groups through the test exercise by exercise, asking them to look at one part of the test at a time. That ended up being as time consuming and noisy as a regular meeting.
After getting some feedback from them, which they gave via a Google Form Survey, we decided it would be best if I gave them about 40 minutes to work on their own first, and only then start gathering their feedback. That worked better. (That and using the mute button to lessen the noise, of course!)
However, the third time around was the best, indeed. We decided groups should be given even more time to look over the entire test before the feedback-giving stage. I gave them an entire hour, and it really paid off. The feedback stage ran more smoothly and rather fast.
Project Success
  • Convenience: teachers were free to attend the Hangout at a branch of their convenience, which most of the time meant the branch closest to their homes;
  • Capacity for collaborative self-management: teachers had to organize the analysis process themselves, preparing to report their impressions and suggestions to the Course Supervisor (yours truly) and the other branch groups in a clear and concise manner;
  • Agency and accountability: they worked hard to convey their opinions and provide pertinent suggestions, relying on the expertise of their own groups;
  • Voice: working with smaller groups of familiar faces made the more reserved people comfortable to speak their minds, something which tended not to happen with the large face-to-face traditional (very loud and somewhat messy) meetings;
And, last but not least,
  • Modeling innovation: teachers had the chance of trying out a new tool which they might find useful for other professional development opportunities.
This is an experience I would certainly like to replicate in the future, and which I would recommend other admins try out with their teaching staff.
What’s next?
Hangouts for Professional Development and innovating the adjacent possible.




Clarissa Bezerra

Friday, May 10, 2013

IATEFL 2013 - On Listening Tasks and Tests



Attending and presenting at both the TESOL and IATEFL conferences was a rewarding experience. I always have two different perspectives: one when I attend and one when I conduct a workshop. Attending a conference is a moment in which you see new trends in language teaching. We have contact with different and often revisited viewpoints on what we sometimes believe are unchangeable truths, and we have the priceless opportunity to meet old and new friends, professionals who have a lot to share with us. As a presenter, I feel that a conference is a moment for networking and for assessing the reception of the material I have been developing. Both are very motivating and make us want to share and learn even more. It is a never-ending endeavor. I am sharing here an enriching presentation that I attended at the IATEFL Conference in Liverpool, 2013 – Listening tests and tasks versus listening in the real world – by John Field (Oxford University Press). The talk outlined the types of mental processes involved in listening. Then it evaluated whether the recorded material, formats, and items of conventional second/foreign language tests really tap into these processes. Finally, suggestions were made for new forms of teacher-designed tests and tasks that are more closely linked to real-world communication needs and to the listening construct.

Listening is a process taking place in the mind of the listener. The only way we can test the skill – or check understanding in the classroom – is indirectly - by asking questions. ELT teachers have to ask questions for three reasons: to test, to check understanding and to diagnose listening problems. This already distances the behavior of a learner or test candidate from that of a real-world listener. Then, what does a language test actually test?

We know that it is crucial for the learning process to consistently develop and assess the listening skill. We must, therefore, bear in mind that it is impossible for a test to replicate the circumstances of real-life language use, but it is reasonable to ask to what extent a test (directly or indirectly) elicits from test takers mental processes like those they would use in a real-world situation. This is a critical question for tests that claim to predict how well a candidate will perform in a real-world context, such as an academic institution, a professional position, or an immigrant situation.

Cognitive validity is a well-established idea and educational researchers in the U.S. have investigated and questioned the following aspects of testing. Does a test of physics show that the learner can think like a physicist? Does a test of logical thinking test what it claims to test? Does a test in Medicine just show that learners have mastered facts – or does it show that they have the ability to diagnose? These intriguing questions lead us to reflect upon what listening consists of.

According to Mr. Field, the model of expert listening starts with a speech signal – decoding and word search – and is followed by parsing – separating sentences into grammatical parts, such as subject, verb, etc. – which eventually leads to meaning construction. This model leads us to question whether present listening test and task materials elicit behavior from the listener that resembles real-world listening processes, whether they are comprehensive enough to cover most or all of the processes involved in listening, and whether they are graded in a way that reflects learners’ development as listeners. He concluded that listening test and task materials provide listeners with scripted (or at best semi-scripted) recordings bearing little resemblance to natural everyday English: actors who mark commas and full stops, a lack of hesitations and false starts, quite long utterances, regular rhythm, and voices that do not overlap. Aside from that, test setters sometimes put in distractors, making the recording much more informationally dense than a natural piece of speech would be.

Part of the difficulty lies in the recording itself. Test designers and teachers tend to judge the difficulty of a piece of listening, and even which points of information to focus on, by referring to a tapescript. However, these decisions also need to be made while listening to the recording. What parts of the recording (words or points of information) are prominent and easy to recognize? What characteristics of the speakers might make the recording more difficult? When choosing recorded materials, teachers have to consider whether the material is authentic, recorded, scripted, or improvised; how naturally the speakers include hesitations, for example; how fast they speak; how precisely the speakers form their words; the degree of formality; accents; whether it is a dialog, conversation, or interview; the frequency of the vocabulary used; the complexity of the grammar; familiarity with the topic; the length of the recording; how dense the idea units are; how clearly structured the overall line of argument is; and how concrete or abstract the points made are.

Mr. Field concluded by affirming that conventional formats – multiple choice, gap filling, visual matching, true/false, multiple matching, identifying the speaker who said something – require the listener to map from written information to spoken, to eliminate negative possibilities as well as identify positive ones (multiple choice and true/false), to read and write as well as listen (gap filling), and to engage in complex logistical tasks that take us well beyond listening (multiple matching). He also claims that lower-level learners understand far less than we assume, listen out for prominent words and try to match them to words in their vocabulary, and are dependent on picking up salient words rather than chunks and whole utterances, a tendency that is increased by the use of gap-filling tasks that focus attention only at the word level.

He finally suggested that we provide items after a first playing of the recording and before a second. This ensures more natural listening, without preconceptions or advance information other than the general context. He insisted that we keep items short, since loading difficulty onto items just biases the test in favor of reading rather than listening. He urged us to use tasks that allow the test setter to ignore the order of the recording and to focus on global meaning rather than local detail. The information provided by Mr. Field may not be new to many of us, but it is always wonderful to listen to a specialist confirm or deny our assumptions, basing his conclusions on careful research and studies. That is why attending a conference can make a difference in our lives.



Monday, April 08, 2013

Thinking about Assessment - Part 2


THINKING ABOUT ASSESSMENT (part 2) – A follow-up on "Thinking About Assessment… Again"

Having decided that we were going to pilot the alternative assessment program, we had to inform students of our plans, and listen to what they had to say about it. We were ready to “abort the mission” in case of rejection. They accepted it with no reservations. Still, it was to my surprise that, at the end of the first lesson, one of them came to me and said (in L1, of course, as this is the beginner group) “See you next class… but you will only see me because you told us we won’t have to take that final test.” It took me a couple of seconds to grasp the meaning of what she was telling me. She went on: “I’m too old to suffer with tests. In my life, I’ve taken all the tests I needed to take… Now, I’m interested in learning!”
And that was what we needed to know that we were on the right path. The focus had naturally shifted from teaching and testing to learning. The learners had assumed their rightful place at center stage, taking control of the process. “Now, I’m interested in learning!”
Lately I’ve been following Adrian Underhill and Jim Scrivener’s blog on ‘Demand-High Teaching’, and two of their questions really hit a nerve: How can we stop “covering material” and start focusing on the potential for deep learning? How can I shift my attention from “successful task” to “optimal learning”? Well, this was exactly what we wanted to explore in our “assessment quest”.
Anyway, back to the tale I have to tell… After the first evaluation of their oral performance, we decided to give them a weekly “assessment opportunity”. In week 4 (of 10), the focus was “Listening”.
In the past we had been cautious about venturing into evaluating listening skills with the adult groups. Adults are afraid of listening, terrified by its unexpectedness, petrified by the possibility of failure. Adults are interesting language learners; they bring a whole lot of baggage with them:
  • Their beliefs, more often than not tainted by their previous language learning experience – usually their formal learning of the mother tongue (which they had already acquired in their childhood), with the grammar exercises, linguistic analysis, etc.
  • Their personal history. Your student is most likely a self-respecting human being, a skillful professional, someone who undoubtedly has a lot to teach you, who can tell a number of success stories, and learning English is not one. At least not yet. 
  • Their needs and expectations: They’ve come to us because they want to be part of the world that speaks English. That is the question, isn’t it? “Do you speak English?” or “Can you speak English?”
These learners, more than any other language learner, need to be able to speak, to communicate effectively! Well, communication implies a message that is sent and, consequently, received: Listening! How can we ever assess language learning without analyzing listening?  If they don’t understand what is said to them, how can they respond?
Anyway, we were assessing their listening skills after no more than 12 (twelve) classroom hours – for most of them, twelve contact hours. How do you do it? Preferably without any extraordinary acrobatic feat: just keep it simple and structured, with the appropriate scaffolding, and make sure the lesson is designed to enable optimal learning while providing you – the teacher – an opportunity to assess whether the goals have been achieved, and how far they have been developed.
Here is the step by step:
1. We had previously explored the following exponents:
  • What’s his/her name?  His/Her name is…
  • Where are you from? I’m from…
  • Where is he/she from?  He’s/She’s from…
  • Vocabulary: countries

2. On the second lesson, I showed them a PPT with international celebrities… At first I showed a photograph and asked the questions ‘What’s his name?’ and ‘Where is he from?’ (before revealing the name and the flag) Here are some samples:

3. After two or three samples, I invited the students to ask the questions: ‘X, ask Y.’
4. Then, they worked in their books, which brought an information gap activity. Both students had pictures of six people. One student had information on three of the people (names and countries of origin), while the other had to look at a different page, where they had information on the other three. The structure and vocabulary were very much the same as my PPT had prompted: What’s his/her name? Where is he/she from?
5. Next, they were asked to look at an incomplete dialogue – again from the book, and work in pairs to predict what was missing.
6. After a couple of minutes, I asked them to listen to the dialogue and check if they had made the correct choices.
7. Just before giving them the listening task, I replayed a recording from the previous lesson, and they repeated the names of the countries.
8. Next, I gave them the worksheet with the following task:
They heard the following dialogues:


 Dialogue I
A: Hello! I’m Luis, from Mexico.
B: Hello, Luis. I’m Akemi, from Japan.
Dialogue II
C: Hello. My name’s Charles. What’s your name?
D: Hi, Charles. I’m Mike. I’m from the United States. Where are you from?
C: I’m from London, in England.
D: Oh, yeah? I’m from Chicago.
Dialogue III
E: Hi, I’m Loretta. I’m from Sydney, Australia.
F: Hi, Loretta. I’m Jason. I’m from Australia, too.
E: Oh, wow! Are you from Sydney?
F: No. I’m from Melbourne.
They were graded both on the correct country, and the correct spelling of the country’s name.
As you may have noticed, nothing fancy. The PPT could have been easily substituted with good old flashcards. I used written and audio material from the book. My main worry was to make sure they were “comfortable” when they got to the listening task. The listening element was introduced with the dialogue (steps 5/6), but they had the chance to predict what they were going to hear before they heard it. It was safer that way.
They also had plenty of meaningful and varied practice on the target piece of language. The dialogues they heard were, in a way, familiar to them.  In this lesson, before getting to step 8, they were given at least three different opportunities to produce and listen to the names of the countries, as well as the language structures surrounding them.
Now, the important thing is that this lesson was, as the first one I described here, not designed to test. It was designed to teach, it had learning at its core. The assessment opportunity was created, but it only took as long as those three short dialogues – which, by the way, they heard only once.
So, once again, I invite your input. How do you see this project? Can you help us by suggesting activities and procedures we can use with these pilot groups? We are counting on your thoughts, your suggestions, your criticism… We are waiting for you!
Lueli Ceruti

Friday, March 15, 2013

Alternative Assessment - The Prime Experience


About two weeks ago, our colleague Lueli Ceruti wrote a really interesting post on the CTJConnected Blog. In short, her post described our reasons for questioning the way we assess our adult students’ EFL learning and our experimenting with what we have been calling the “alternative assessment system.”

Basically, what is being proposed is that the assessment of our adult students’ learning be carried out in a more ongoing manner. The objective is to make it possible for all of us, teachers and students, to know how well students are learning in time to take action, if necessary, before the last day of class. Also, with this “alternative assessment system”, our students will hopefully become less anxious about the idea of being evaluated at the end of the module.

In Lueli’s post, she described the first assessment activity she did with her Thomas Flex group. Here is the first one my Thomas Prime 1 students and I experimented with. Thomas Prime is a Casa Thomas Jefferson upper-intermediate/advanced course designed for adult students.

The Thomas Prime 1 Experiment:

In week 2 (of 10), we covered the grammar lesson “Suggest ways to enjoy life more”, and students learned about the verbs “stop”, “remember” and “forget” followed by the infinitive and the gerund.
First, we read and discussed the text “Finding Balance”, which opens the second lesson in the book Summit, published by Pearson Longman. Next, by analyzing the examples of the focus verbs in the text, we tried to come up with the different meanings each of them had when followed by infinitives and gerunds. This information was recorded on the board, and right after that, the students compared it with the chart on page 5. They then did the exercise on the same page, and we checked their answers. I assigned an extra exercise on the focus verbs for homework, with the students being responsible for checking their own answers (They had a copy of the answer key).
At the beginning of the following class, after the students had worked cooperatively to check their answers to the fill-in-the-blank sentences giving advice, I told them about my sister, a girl who led a very stressful life due to her inability to find balance. The students then individually wrote five suggestions on a chart I gave them, and we agreed on the five best suggestions to give to my sister.

This is what the board looked like:



Before the end of the class, I collected the charts with the students’ sentences and assessed their work at home. I used the following rubric as a guide.
  

Each of the sentences is worth two points.

      a)    Deduct two points if the student’s sentence does not make sense.
      b)    Deduct one point if the student makes a mistake with the target structure (verbs stop, remember, forget followed by the wrong verb form)
      c)    Deduct half a point if the student makes small mistakes (prepositions, articles, spelling).
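For colleagues who keep their grades in a spreadsheet or script, the deduction scheme above can be sketched as a small Python helper. This is only an illustrative sketch: the function names and the error-category labels are my own, not part of the original rubric.

```python
# Hypothetical sketch of the deduction-based rubric above.
# Each sentence starts at 2 points; deductions depend on the error type.
DEDUCTIONS = {
    "no_sense": 2.0,           # sentence does not make sense
    "target_structure": 1.0,   # wrong verb form after stop/remember/forget
    "minor": 0.5,              # prepositions, articles, spelling
}

def score_sentence(errors):
    """Return the score (0-2) for one sentence, given its list of error types."""
    points = 2.0
    for error in errors:
        points -= DEDUCTIONS[error]
    return max(points, 0.0)  # never go below zero for a single sentence

def score_task(sentences_errors):
    """Total score for the five-sentence task (maximum 10 points)."""
    return sum(score_sentence(errs) for errs in sentences_errors)

# Example: five sentences, one target-structure slip and one spelling slip.
total = score_task([[], ["target_structure"], [], ["minor"], []])
print(total)  # 8.5
```

The same arithmetic works on paper, of course; the point is simply that the rubric is consistent enough to be automated.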


We sent these suggestions to my sister, a Prime 3 student at the Casa, and I asked her to record a video segment to respond to the students. Here is the video:






Needless to say, the students really engaged in the activity and had lots of fun watching the response. The assessment was perfectly aligned with the learning outcomes and instructional strategies. As a result, my students didn’t even notice they were actually being assessed. Their major interest was in communicating authentically with my sister.


Wednesday, March 06, 2013

Thinking about assessment... again!

If you are a teacher, like me, then you have certainly spent countless coffee breaks discussing assessment: either criticizing or praising it, maybe questioning it, or even plain “cursing” it… no matter who we are, what or where we teach: assessment is close to our hearts… in oh so many ways! 

I have personally been working with and looking into this matter for many years, be it as a course supervisor, designing, writing and revising tests, probing into the process; be it as an examiner, participating in the final assessment of someone else’s handiwork, with cold analytical eyes, scrutinizing the final product; or even as a curious mind who wonders what it is that we do: Do we test to teach? Do we teach to test? And moreover, what it is that we should be doing? 

Anyway, back in January, when Isabela Villas Boas shared a blog post by Nick Provenzano with us, I dared ask her if we could dare… only to discover that she was the one daring us… In his post, Nick recounts how he spent a semester without his traditional testing system, and how he witnessed high levels of commitment, as well as strong evidence of his students’ skills and knowledge, through the use of alternative assignments, essays, projects, and different assessment opportunities. 

The discussion was not new to us. We had already been questioning the unquestionable… the effectiveness of our traditional system with our adult students… Why were we finding students who reached the higher levels – passing test after test – and were still not able to use the language? Why were some of our adult students discouraged? How come teachers were feeling frustrated? We decided to turn these difficulties into opportunities for development. The theoretical project had been ready – on paper – for a few months, as Isabela had taken an online program on assessment with the University of Oregon. All we had to do was “take the leap” and bring it to life. Right now, there are two groups – one Thomas Flex 1 and one Prime 1 – piloting an alternative assessment system. 

This series of posts is an attempt to share what we are trying to do, inviting you into this experience-experiment, summoning your thoughts and encouraging your input. 


The THOMAS FLEX 1 Experiment: 

In week 3 (of 10), having already worked with most of Unit 1, we wanted to ascertain that the students were able to interact using the following exponents:

  • What’s your name? My name is… 
  • How are you? I’m…, and you? 
  • What’s your telephone number? It’s (numbers 0-9) 
  • Nice to meet you/Nice to meet you too. 

The lesson was designed to build on the students’ recently acquired abilities, consolidate them, and finally invite linguistic output that could be assessed. The procedure was the following: 

  1. Each pair of students received a dialogue cut up into slips. They were asked to put the slips in a logical sequence to make the dialogue. 
  2. With the correct sequence, students were asked to personalize the dialogue, by substituting names and other elements with their own personal information. 
  3. Then, the slips were collected and the teacher elicited the dialogue on the board – leaving blanks for the students to complete with their own names, etc. 
  4. After that, the students were asked to stand up and “cocktail”, talking to at least three different classmates, using the dialogue on the board as a model. The teacher observed and monitored the exchanges, cleaning the board once most of the students had performed the dialogue at least once. 
  5. Finally, the teacher called on pairs of students and asked them to perform the dialogue out loud – no model available. As they did so, the teacher filled in an assessment sheet with the following criteria:


  • Correct greeting / response to greeting - Yes (2pts.) - Partially (1pt.) - No (0 pt.)
  • Asks name correctly - Yes (2pts.) - Partially (1pt.) - No (0 pt.)
  • Responds to question about name correctly - Yes (2pts.) - Partially (1pt.) - No (0 pt.)
  • Says and responds to “nice to meet you”  - Yes (2pts.) - Partially (1pt.) - No (0 pt.)
  • Asks and/or answers about phone number - Yes (2pts.) - Partially (1pt.) - No (0 pt.)
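The yes/partially/no sheet above also maps naturally onto a tiny tallying helper, which could be handy when compiling scores across several pairs. A minimal sketch in Python; the criterion keys and dictionary layout are my own invention, not part of the original sheet:

```python
# Hypothetical sketch: each criterion is rated Yes (2), Partially (1), or No (0).
POINTS = {"yes": 2, "partially": 1, "no": 0}

CRITERIA = [
    "greeting",      # correct greeting / response to greeting
    "asks_name",     # asks name correctly
    "answers_name",  # responds to question about name correctly
    "nice_to_meet",  # says and responds to "nice to meet you"
    "phone_number",  # asks and/or answers about phone number
]

def oral_score(ratings):
    """Sum one student's ratings over the five criteria (maximum 10 points)."""
    return sum(POINTS[ratings[criterion]] for criterion in CRITERIA)

# Example student: strong overall, partial on names, missed the phone number.
ratings = {
    "greeting": "yes",
    "asks_name": "yes",
    "answers_name": "partially",
    "nice_to_meet": "yes",
    "phone_number": "no",
}
print(oral_score(ratings))  # 7
```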

This was the end of their first “Oral Assessment Opportunity”. The plan is to have one of these every week, focusing on different skill areas such as speaking and writing – in which grammar would be either intrinsically embedded or clearly stated – and also listening and reading, for we see that one of these students’ main difficulties is understanding input so that they can formulate their own output. 

Anyway, we are using this medium with one sole purpose: hearing your thoughts. Do you have any ideas we can use in this pilot project? Are there any feelings or thoughts you would like to share? Is there anything in your experience that can add to this experiment? Let’s get this chat started… The ball is in your court!



Monday, November 12, 2012

Rethinking Test Reviews - A Digital Twist


Final tests are just around the corner. It is that time of the year when teachers, before starting to think about their well-deserved vacation, have to focus on how best to review the content for the tests. Though we always feel compelled to try something new and exciting, this is a period of intense tiredness, so we tend to go for the simple and easy. And we, the CTJ Ed Tech Team, feel that is the best approach. However, we’d like to invite you to re-frame your review classes: to think of how you can actively engage students in reinforcing what they’ve been learning, but, mainly, how you can have an exciting grand finale for your students, a memorable time together of practice and interaction. 

Our usual approach to reviewing is to ask students to do the review handout at home and correct it in class, or just to do the written activity in class. Here’s how you could re-purpose your review class, making students active producers of their own review for the test:

- Use your students´ cellphones:

  • Take advantage of note-taking apps. Ask your students to open their note-taking apps and give them an instruction card with what they should add to their note page. Invite them to flip through the lessons and add vocabulary notes and grammar points, writing their own examples to help them remember what they’ve been studying. 
  • Ask them to take photos with their cellphones of objects and situations and write sentences to highlight vocabulary or grammar. They can use an app to add the image and the sentences (and trust us, if they have a smartphone, they know how to do it!), or they can use the photos and write their sentences in their notebooks. 
  • If you have adult students with smartphones, ask them to download the Evernote app (http://evernote.com) before class. With Evernote, students can open a page and add images, sentences, and voice recordings to make their own review. Then they can share a link to their final review page with peers.
  • Students can go through the book and create a short quiz on their cellphones for their peers to answer.
- If you have a set of iPads available:
  • You can use the same ideas we shared above for cellphones.
  • Use simple book creator apps for students to create their own reviews. After students create their review pages, they can share them with peers and the teacher by sending the ebook as a PDF file via email, Dropbox, or Evernote. Here's an example with the app Book Creator (the Ed Tech Team likes it because it is super simple to use!).

  • In apps like Notability and Penultimate, students can make personalized review pages, recording their voices, adding photos and text to their pages. 
  • Students can also open the Pages app to create a page with the main review points
  • The Keynote app lets students produce well-designed reviews that can be shared with peers. One idea is for teachers to give different tasks to different groups of students (some groups are responsible for the vocabulary review, others for the grammar). Once their review is ready, they can plug the iPad into the projector and present to the whole group.
  • Students can also create a listening quiz for peers. Then they can exchange iPads, or the teacher can plug the iPad into the classroom loudspeakers and have students answer the audio quiz. (This activity can also be adapted for smartphones.)
  • Younger students can use very simple tools, like Skitch, to write sentences or practice vocabulary.
- If you have an iPad and a projector in your classroom:
  • Ask your students to prepare a quiz on a blank sheet of paper, then take a photo of the quiz and project it on the board for their classmates to answer.
  • Take photos around the class to practice certain vocabulary items/expressions/grammar points and do a photo dictation by projecting the images on the board. 

- If you have a computer and a projector in your classroom:
  • Here is a nice way to review vocabulary with intermediate and advanced groups using the laptop and the projector in the classroom. It requires no preparation; all you have to do is open a Word document to type in the vocabulary words that need to be reviewed.
>> Divide the class into two teams and explain that they are going to play against each other. One member of each team (at a time) sits at the front of the classroom with his or her back to the board, so that student cannot see what is projected. The teacher then types in a vocabulary word; the only student who doesn't see it is the one sitting at the front. The group then explains the word so that the student in the chair can guess it. Explain that the group has three chances to give an explanation (in other words, up to three different students in the group can raise their hands and explain the word in their own words). The group gets the point if the word is guessed correctly.
Tip: students can be given the power to choose the vocabulary words used in the game if you assign each team a unit in the book. This way they can pick the words they want to use to test the opposing team. If you decide to play the game this way, have them choose the words beforehand.

Remember that the most important aspect of spicing up your review class with digital tools is to make your students active participants in the review activity, producers of their own content. By doing that, you are helping them personalize learning, organize their learning strategies, and truly understand how they can become autonomous, self-directed learners.

Remember, however, to keep track of time for students' tasks so that all the main points are reviewed. Also, the paper review is always important focused practice, so assign it beforehand as homework and be sure to check the main points with students or let them check their answers against the answer key. Students need a tangible learning object for extra practice to feel safer and more confident when taking the test. So make sure they have either a handout or a digital page, or, even better, both!

You might also want to check what teacher Dani Lyra has done with her students to review for the test:

http://tryingoutweb24ed.blogspot.com.br/2012/11/interactive-reviews-2-phrasal-verbs.html
http://tryingoutweb24ed.blogspot.com.br/2012/11/when-assessment-meets-mlearning-phrasal.html


Any other tips or ideas that you've tried in your English classroom?

The Ed Tech Team



Vini Lemos, Sílvia Caldas, Carla Arena and Fábio Ferreira

Monday, November 05, 2012

Aligning learning outcomes, instructional strategies and assessment – an example using mLearning and Digital Images by Vinícius Lemos

In the October 2012 special issue of the ELT Journal – The Janus Papers – Stephen Stoynoff looks back at the changes in language assessment and analyzes the transitions under way. With the emerging dominance of a sociocultural paradigm in which learning is seen as a developmental, socially-constructed, interactive, and reflective process, classroom-based assessment will (pp. 527-528):

- integrate the teacher fully into the assessment process, including planning assessment, evaluating performance, and making decisions based on the results of assessment;
- be conducted by and under the direction of the learners' teacher (as opposed to an external assessor);
- yield multiple samples of learner performance that are collected over time and by means of multiple assessment procedures and activities;
- be applied and adapted to meet the teaching and learning objectives of different classes and students;
- integrate learners into the assessment process and utilize self- and peer-assessment in addition to teacher-assessment of learning;
- foster opportunities for learners to engage in self-initiated enquiry;
- offer learners immediate and constructive feedback;
- monitor, evaluate, and modify procedures to optimize teaching and learning.

Likewise, the National Capital Language Resource Center (2004) enumerates the following distinguishing features of alternative assessments, which:

1) are built around topics or issues of interest to the students;
2) replicate real-world communication contexts and situations;
3) involve multi-stage tasks and real problems that require creative use of language rather than simple repetition;
4) require learners to produce a quality product or performance;
5) include evaluation criteria and standards which are known to the student;
6) involve interaction between the assessor (instructor, peers, self) and the person assessed;
7) allow for self-evaluation and self-correction as learners proceed.


Hence, there’s been a growing interest in integrating classroom teaching, learning, and assessment. According to the Eberly Center for Teaching Excellence, Carnegie Mellon University, assessments, learning objectives, and instructional strategies need to be aligned so that they reinforce one another, as the image below shows.



Jon Mueller has a frequently updated website entitled Authentic Assessment Toolbox that not only provides solid theoretical background on authentic assessment, but also offers a variety of tools in which the assessments are perfectly aligned with the learning objectives and the instructional activities. Cecília Lemos has also written inspiring posts on alternative assessment in her popular blog Box of Chocolates.


Burger (2008) proposes the use of Outcomes-Based Education (OBE), in which the first step in planning teaching is identifying the learning outcomes; these outcomes then determine the teaching and assessment that follow so that the learning can be easily assessed via performance. Aligning learning objectives and instructional activities is not hard at all. The difficult part of the triangle is the assessment part, especially when it comes to oral performance.

How can the teacher possibly assess every student’s performance on an oral task designed to assess the attainment of a learning outcome that was developed by way of perfectly aligned instructional activities? 

How can learners be integrated into the assessment process?



I'm going to propose an example based on an earlier post on this blog by my colleague Vinicius Lemos, mLearning and Digital Images. What he describes in his post is an instructional strategy resulting from previous strategies in which students were taught clothing vocabulary and the present continuous to talk about what one is wearing. I will attempt here to close the triangle above by spelling out the learning objectives that are implicit in the task and suggest a way of assessing students' resulting performance.


  Learning outcome 1: Given a specific event, students will select and photograph the appropriate pieces of clothing to wear and describe their picture to their classmates using the present continuous and the correct indefinite article before each piece of clothing.

  Learning outcome 2: Given a picture with pieces of clothing that suggest a specific event, students will be able to ask questions using “Are you going to…” and vocabulary to talk about specific events.


 I suggest having students work in pairs rather than in groups to perform the activity, according to the outcomes above: Student A shows and describes his picture using the required language; student B asks questions to guess the event. Then they switch roles.

 Students can practice this exchange with two or three different pairs, as the teacher walks around and monitors their performance. The third or fourth time around, they are asked to record their exchanges, using their smartphones or, if available, the computer lab or a set of iPads. After they finish, they listen to their performance and engage in self-assessment of their part of the recording, according to a can-do checklist that can contain items such as:

 - I can name all the pieces of clothing. 
 - I can use the correct indefinite article (a/an) for singular pieces of clothing, depending on whether they begin with a vowel or a consonant sound, and no article for plurals.
- I can describe what I’m wearing using the present continuous. 
- I can name events such as school, work, picnic, wedding, etc. 
- I can ask questions about where a person is going based on their outfit. 
 - I can produce the language described above in a natural way, without too much hesitation or many long pauses to think. 


They judge their performance and if they think it needs improvement, they can record the conversation again, making the necessary adjustments. Then they send the recording to the teacher, who will use rubrics to assess students’ attainment of the two outcomes above. The teacher’s rubrics need to be similar to the students’, but should contain at least three levels of performance with appropriate descriptions.

Suppose each unit in the language program’s assessment cycle consists of five learning outcomes. Then each outcome can be worth 20 points. If the teacher conducts these types of assessments right after the instructional strategy, in such a way that the strategy is the assessment and vice-versa, at the end the student will have a grade on a 0-100 scale for oral performance.
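The arithmetic behind this grading scheme can be sketched in a few lines of code. This is purely an illustration of the "five outcomes, 20 points each" idea described above; the rubric level names and their weights are invented for the example and are not part of the original post:

```python
# Hypothetical sketch: five learning outcomes per unit, each worth 20
# points, with rubric levels mapped to a fraction of those points.
# Level names and weights below are invented for illustration only.

RUBRIC_LEVELS = {"developing": 0.5, "proficient": 0.8, "exemplary": 1.0}
POINTS_PER_OUTCOME = 20


def unit_grade(outcome_levels):
    """Convert one student's rubric level per outcome into a 0-100 grade."""
    return sum(POINTS_PER_OUTCOME * RUBRIC_LEVELS[level]
               for level in outcome_levels)


# A student rated on five outcomes:
grade = unit_grade(["exemplary", "proficient", "proficient",
                    "developing", "exemplary"])
print(grade)  # 20 + 16 + 16 + 10 + 20 = 82.0
```

A student rated "exemplary" on all five outcomes would reach the full 100 points, which is what makes a separate end-of-term oral test redundant.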


Who needs a midterm or end-of-term oral test after that?


The proposed assessment system here is in keeping with Stoynoff's (2012) list of characteristics of contemporary classroom-based assessment: it integrates the teacher fully into the process; it is conducted by the teacher; it can be one of a variety of samples of learner performance collected over time, using multiple procedures; it meets the learning objectives; it integrates learners into the assessment process; it offers immediate and constructive feedback; and it allows the teacher to monitor, evaluate, and modify procedures to optimize teaching and learning.


 References:

Burger, M. (2008). The alignment of teaching, learning and assessment in English home language grade 10 in District 9, Johannesburg (Dissertation). University of South Africa, Pretoria, South Africa.
 
 
National Capital Language Resource Center (NCLRC). (2004). Assessing learning: Alternative assessment. In The essentials of language teaching. Retrieved from http://www.nclrc.org/essentials/assessing/alternative.htm



Stoynoff, S. (2012). Looking backward and forward at classroom-based language assessment. ELT Journal, 66(4), Special Issue: The Janus Papers, 523-532.


This post is cross-posted in my blog http://isabelavillasboas.wordpress.com/
If you want to read more about assessment and other TEFL issues, pay me a visit there.