Alexander Worth, Osaka Institute of Technology, Osaka, Japan
Dani Fischer, Osaka Institute of Technology, Osaka, Japan
Erik Fritz, Kwansei Gakuin University, Osaka, Japan
Ashley R. Moore, Osaka Institute of Technology, Osaka, Japan
Worth, A., Fischer, D., Fritz, E., & Moore, A. R. (2017). CEFR-J based speaking program for a self-access learning center. Studies in Self-Access Learning Journal, 8(1), 44-59.
The following paper summarizes a work in progress at a Self-Access Learning Center (SALC) called the Speaking Program, including the design process carried out to create it and how the material is used and revised. Additionally, the paper provides some initial data on how the Speaking Program may have impacted the center’s overall user numbers, the number of repeat users, and the number of requests for each type of service offered in the consultation service. The data collected have implications for how interactional spaces that provide access to learning advisors and teachers can be used efficiently, and how they might be better served by more structured activities in certain educational contexts, such as a technical university.
Keywords: conversation practice, CEFR-J, materials design, SALC usage
The institution where the materials design project took place is a university specializing in technology and engineering in Japan. None of the students are English majors. The Self-Access Learning Center, known as the Language Learning Center (LLC), supports the students in a variety of ways, including preparation for overseas conferences and academic presentation training. Students also have access to teachers and learning advisors on a one-to-one basis through a reservation-based consultation service. Such services are a common feature of self-access learning centers. Students can talk with teachers for a total of 15 minutes for a variety of purposes, such as receiving advice about learning, practicing presentations, or getting an academic paper proofread. However, the most common request (before the Speaking Program was introduced) was for general English conversation practice, for which students were encouraged to come to the consultation service having already chosen a discussion topic to use with the teacher. Popular topics included hobbies or a recent travel experience. During the first two years of running the consultation service, however, teachers began to notice a number of issues relating to the service, and its ‘general conversation’ function in particular.
Historically, the consultation service has had regular users and consistent bookings during the beginning and middle of the first semester, followed by a decline in numbers towards the end of semester one, and lower overall numbers of users in semester two. For example, the percentage of the available reservations used by students during semester one 2014 was 86.9% whereas in semester two it was 66.5% (76.4% and 74.9% in 2013 and 2012 respectively). Additionally, as Lui (2013) noted of a similar student-centered service at a university in China, “only students with strong motivation are able to stay committed” (p. 32) and we indeed found that a large proportion of students used the consultation service just once. Although our students have a general awareness of the importance of English for success in their field, their motivation to continue studying wanes throughout the semester as they must prioritize more and more time for their research and laboratory work. Thus, the teachers felt that a speaking program that broke down the long-term goal of acquiring conversational fluency in English into a series of sub-stages (with rewards for completing each sub-stage) might help students to maintain their motivation for longer periods of time.
As noted above, students wishing to use the consultation service for such purposes were always encouraged by staff to decide on a specific topic for themselves at the time of making the reservation. However, over time they seemed to find it difficult to come up with concrete topics for each 15-minute session and simply started to write ‘free talk’ in the topic column of the reservation sheet. In such cases, the onus of choosing a topic and leading the conversation fell on the teacher, and students lost an opportunity to exercise autonomy over their learning. This may also have been due to discrepancies between what the learner and the teacher expected from a conversation session. As noted by Moore and Thornton (2011) in their study of a service for English conversation practice in another self-access center, there were key differences between how the teachers working in the center and the students using it viewed their roles in terms of who should decide the topic. Teachers also expressed dissatisfaction regarding the “frequency with which they were expected to talk about [commonly reoccurring generic topics]” (p. 98). To remedy this situation, the teachers thought it might be helpful to provide students with a range of topics designed to gradually push them out of their comfort zone and into the zone of proximal development.
Lastly, as Meddings and Thornbury (2009, p. 9) note, conversation is a prerequisite for grammatical acquisition, and thus the authors were keen to provide students with opportunities to practice general conversation in English. However, teachers anecdotally noted that students engaging in repeated general conversation sessions did not always demonstrate improvements in lexical range or in grammatical complexity and accuracy. This phenomenon could also be linked to the prevalence of the recurrent generic topics noted above. A need for some method of scaffolding students towards greater complexity was thus identified.
To address these issues in the consultation service, we decided to create and implement the Speaking Program in the LLC. It has three principal goals: firstly, to break down the long-term process of acquiring conversational fluency into a series of more manageable sub-stages. Secondly, and related to the first goal, to use this more formalized system to help students maintain their motivation over time, and thus both decrease the number of students accessing the service only once and increase reservation rates for the consultation service across both semesters. Thirdly, to create a program that pushed students gradually outside of their comfort zones in terms of the topics they talked about and their lexical and grammatical range and accuracy. This article will go on to summarize how the materials were designed, the annual review and revision process, and how the materials are currently being used in the consultation service.
Design Process: Speaking Program
A team of four, whose work is divided between teaching regular classes and advising duties within the LLC, collaborated to design the Speaking Program. The materials consist of a series of two-sided, topic-based speaking worksheets that are currently split into four levels of difficulty. The ‘can do’ descriptors for spoken interaction detailed in the Japanese version of the CEFR (Common European Framework of Reference for Languages), or CEFR-J, guided the formation of the program’s four levels: SA1, SA2, SB1, and SB2. Each level was assigned a color that matches one of the four color-coded levels the LLC had already been using to label its other materials (yellow, green, blue, and red, in order of increasing difficulty). The worksheets are kept in colored self-access folders that match the level of difficulty. This system makes it easy for students to find the appropriate materials for their level.
The rationale for using CEFR-J was to maintain the Speaking Program’s validity at an institutional level and, with regard to materials design, to provide consistency in the language used in each worksheet. Using the CEFR-J descriptors for spoken interaction, the team selected competencies from the different bands that contained a variety of language forms that could be developed into a discussion topic. For example, the CEFR-J A2 spoken interaction band requires that the learner be able to ‘understand sentences and frequently used expressions related to areas of immediate relevance (e.g. very basic personal and family information, shopping, local geography, employment)’. Thus, the Speaking Program features topics that include describing family members (family information) and describing one’s hometown (local geography). Initially, five worksheets were created for each level. However, it was always the team’s intention to add further worksheets to each level, as it would be somewhat naïve to expect students to move from one level to another having completed as few as five 15-minute speaking tasks. The current total is eight worksheets per level, with further additions expected in the future. We limited the content of each topic to fit on both sides of a single A4 sheet (Appendix A) in order to keep within the 15-minute consultation service time.
Table 1 details the four levels and the related topics drawn from CEFR-J. Some topics, particularly in level SB2, were specifically geared toward students at our technical university.
Table 1. The Four Levels and Topics for Each of the Eight Worksheets
The program also required a level check, or diagnostic, for new students starting the Speaking Program. The level check takes the form of a controlled interview that includes language forms from each of the four levels, starting at SA1. For example, one of the first questions is ‘What kind of food do you like?’, which is related to SA1.3. Each section of the level check becomes increasingly challenging, and if the student begins to struggle at any point the level check is politely ended and the student is assigned a level based on the successful exchanges that took place. Students also have the option to start at a lower level (than the one recommended by the instructor) if they would like to review any topics. Assessment tests for each level were also created, with more details described in the section below titled ‘How the Material is Used.’ The next stage of the design process required that the team review all of the material and make any necessary alterations. Once completely edited in English, the instructions for each worksheet were translated into Japanese.
The Speaking Program material is reviewed every year during an annual meeting, and any necessary edits are made. For example, a particular task may not be working and might need changes, or a completely different task might be suggested. Also, as was the case during the most recent review, when the grammar book linked to each worksheet (the Japanese bilingual edition of Basic Grammar in Use by Raymond Murphy) was replaced with a newer edition that restructured the units, the materials had to be altered to reflect the newer edition.
How the Material is Used
Each worksheet of the Speaking Program is organized into the following sections (see Appendix A for an example worksheet):
- The worksheet aims
- Short task description
- Homework component
- Preparatory section
- A task (making statements, answering and asking questions)
- Brief cultural or grammatical note
The Speaking Program level check, worksheets, and assessment tests were all designed to take 15 minutes so they could be used in the 15-minute reservation-based time slots of the consultation service. Forty-eight time slots are usually available each week. When students begin the Speaking Program, they are first required to do the level check. Once the teacher has assigned them a level, they can choose their first worksheet. Students should complete the homework and preparatory stages of the worksheet prior to their next consultation service reservation. The homework section of the worksheet consists of various activities to help a student understand the task, ranging from simple translation exercises and gap fills to matching exercises and role plays. On almost all of the worksheets there is also a reference to the relevant units of a grammar book held permanently in the LLC (the Japanese bilingual edition of Basic Grammar in Use by Raymond Murphy), should the students require additional support. The homework component for each worksheet was designed to introduce the language focus to the students and provide the opportunity to practice the language in a controlled manner. At the beginning of the session, the teacher checks the homework to ensure the student’s comprehension of the language form; in addition, the preparatory stage might also be checked if errors were detected by the teacher during the task stage. The worksheets also contain a ‘Did You Know?’ section that includes cultural information or further grammar support. In short, the process from the level check to the first session is as shown in Figure 1.
Figure 1. Work Flow for Speaking Program
Once a session has been completed, the teacher makes a comment on the student’s ‘progress chart’ (Appendix B), a sheet of paper on which the teacher writes notes so that the student can keep a record of the sessions they have completed and their performance. This record also shows how many times a student repeated a task before being able to proceed to the next worksheet; the tasks are designed to be repeated if necessary. The teachers also record comments in a secure Google database, which exists for all LLC teaching staff to check an individual student’s progress before meeting them in the consultation service. The progress chart and the Google database are used by student and teacher respectively to prepare for the assessment test. When a student feels ready, they can opt to take the assessment test in order to move up to the next level. The assessment test assesses students on three tasks drawn from the eight topics within a level: the teacher tests the student’s ability to produce and respond to certain language forms from the three chosen tasks, which are selected based on comments made on the student’s progress chart or, more commonly, comments recorded in the Google database. A student can fail the assessment test if he or she does not perform to the required standard stated in the CEFR-J competencies and fails to provide correct utterances based on the language contained in the worksheets. The aim of the assessment test is to confirm that students can understand and appropriately respond to questions in English, as well as ask appropriate questions of their interlocutor(s).
The teacher records any errors during the assessment test and then provides feedback immediately afterwards, pointing out errors and giving explicit instructions on how to improve if the assessment test must be retaken. It is not uncommon for students to fail the assessment test the first time but most pass the second time. When students pass the assessment test they are given a certificate to recognize their achievement and they move on to the next level.
Tasks in the final SB2 level are designed to be more open-ended so they can be repeated as many times as the student wants. This level gives the student more control over the content of each task as well as allowing them to practice the tasks continuously with the goal of helping them maintain their level of fluency.
Initial Impact of the Speaking Program on the Consultation Service Usage
Early data suggest that the Speaking Program has had an impact on the consultation service in three areas: firstly, the type of student request; secondly, the consistency of consultation service use across semesters one and two; and finally, the number of students making repeat visits to the consultation service. Figure 2 shows that reservations for the Speaking Program have all but replaced ‘free talk’, whereas reservations for the other services have largely remained the same (semester one of 2016 and 2014 were chosen for comparison because 2015 involved the gradual integration of the Speaking Program in the first semester and full integration in the second semester).
Figure 2. A Comparison of the Services Provided by the LLC in the Consultation Service
The percentage of available reservations used by students after the introduction of the Speaking Program stood at 93.3% during semester one 2016 and, more significantly, 92.8% for semester two 2016, an increase from the 2014 figures of 86.9% and 66.5% for semesters one and two respectively. Although a longer study would be required to fully attribute this large increase in semester two’s reservations to the Speaking Program, the data suggest a positive shift towards achieving the goal of more consistent consultation service usage over semesters one and two. Another goal of the Speaking Program was to increase the number of students who make repeat visits to the consultation service and, by design, the LLC. In the first ten weeks of semester one 2014, 49 students came to the consultation service for a single visit to practice free talk, with a large drop in repeat users beyond that initial visit. In the first ten weeks of semester one 2016, the number of students making repeat visits for the Speaking Program increased, with only 16 one-time users (Figure 3). A longer study would be required to fully evaluate whether there is any correlation between the number of times that students use the consultation service and the introduction of the Speaking Program. However, based on the time periods analyzed, the mean number of consultation service visits per student stands at 2.83 for the first ten weeks of semester one 2014 and 3.96 for the first ten weeks of semester one 2016.
Figure 3. Free Talk / Speaking Program Reservations per Student
The Speaking Program has proven successful in terms of its popularity amongst consultation service visitors and in increasing the number of repeat visitors to the LLC. Such has been the demand for the program that reservations are fully booked weeks in advance, and teachers have had to add extra consultation sessions to meet the increased demand. Most importantly, the Speaking Program has provided structured activities with clear goals that form the basis of the consultation sessions. The Speaking Program thus removes the pressure on students to continually generate new topics, as they were required to for ‘free talk’, whilst also providing a more tangible means for students to check their language progress.
Special thanks to Akiko Miyama and Junko Murao of the English Department at Osaka Institute of Technology and Misato Tachibana of the Language Learning Center for their help with translating the program.
Notes on the Contributors
Alex Worth is a learning advisor and lecturer at Osaka Institute of Technology, Japan. His primary area of study is materials design, with a recent focus on the use of tablet technology.
Danielle Fischer is currently employed at Osaka Institute of Technology’s Language Learning Center. She has been working in the field of EFL for 13 years. Her main interest is English for Scientific purposes, especially in the field of Engineering. She is currently focused on designing training materials for academic presentations.
Erik Fritz has taught ESL for 12 years and currently teaches at Kwansei Gakuin University. His research interests include study abroad, nationalism, and second language writing.
Ashley R. Moore is the director of the Language Learning Center at Osaka Institute of Technology. His research and professional interests include the management of self-access learning, and the intersections between identity, second language learning and motivation.
Lui, M. (2013). English Bar as a venue to boost students’ speaking self-efficacy at the tertiary level. English Language Teaching, 6(12), 27-37. doi:10.5539/elt.v6n12p27
Meddings, L., & Thornbury, S. (2009). Teaching unplugged: Dogme in English language teaching. Surrey, UK: Delta Publishing.
Moore, A., & Thornton, K. (2011). The ELI Practice Centre: Investigating role, purpose and satisfaction in a complex interactional space. The Journal of Kanda University of International Studies, 22, 77-110. Retrieved from http://id.nii.ac.jp/1092/00000934/
Murphy, R. (2011). Māfī no kenburijji eibunpō: Shokyūhen [Murphy’s Cambridge English Grammar: Basic Level] (3rd ed.). Singapore: Cambridge University Press.