Jo Mynard, Kanda University of International Studies, Japan
Mynard, J. (2016). Looking backwards and forwards: Evaluating a 15-year-old SALC for continued growth. Studies in Self-Access Learning Journal, 7(4), 427-436.
This reflective article gives an overview of how a self-access learning centre (SALC) in Japan approaches its ongoing evaluation. The author shares some retrospective evaluation approaches and also provides a description of a micro-evaluation as an example. The article concludes with some thoughts about two alternative approaches, one future-looking and one predictive, that might help a SAC to move in new directions.
Keywords: evaluation, strategic planning, self-access management
Evaluation is a necessary part of the overall SAC management as a way to ensure that users’ needs are being met and that both efficiency (i.e. whether resources are being used optimally) and effectiveness (i.e. whether learning is taking place) are maximised. However, evaluation is notoriously difficult due to the nature of the complex processes that we are working with (Gardner & Miller, 2015; Riley, 1996). For this reason, colleagues in the field have approached the task in a variety of ways (see Gardner & Miller, 2015 for a summary). What all the documented evaluation approaches have in common is that they appear to be mostly retrospective. In other words, they look back at what has been achieved and measure or describe it in some way in order to either simply document, take stock, or to inform a future change.
In this article, I will briefly summarise the mostly retrospective approaches we have been taking to evaluate the Self-Access Learning Centre (‘The SALC’) at Kanda University of International Studies (KUIS) in Japan. I will share an example of how we have evaluated one feature of our SALC in order to show how a micro-evaluation looks in practice, and then finally share some thoughts about two alternative approaches, one future-looking and one predictive, that might help a SAC to move in new and uncharted directions.
KUIS is a private university in Japan specialising in foreign languages and cultures. There are around 3800 students at the university who major in various European and Asian languages, and all are required to take some English classes. The SALC was established in 2001 as a place where students could continue to use and study English outside of class and get support in order to facilitate the development of learner autonomy. The SALC is a busy centre receiving between 500 and 600 visitors per day. Use is optional and students have access to a number of different services and events designed to support the development of their language skills while promoting learner autonomy. The SALC team comprises a director, four full-time administrative staff, two full-time designers, nine full-time learning advisors and around 35 part-time student staff.
Approaches to SALC Evaluation at KUIS
We have three approaches for evaluating the SALC at KUIS, all of them retrospective. The first is strategic planning, which is typically used in the business world. In many ways, a SALC operates as a small company, so using this approach can be very helpful for systematic planning and also for creating, implementing, and communicating a shared vision. The second is a more micro-level approach: conducting ongoing cycles of research on the various services and facilities in order to constantly evaluate and improve the SALC for students. The third approach is to establish cycles and timelines for the micro-evaluation.
The overarching approach to evaluating the SALC starts with a strategic plan which we visualise as an ongoing road map. The strategic plan is established periodically and then the evaluation involves evidencing whether the SALC has achieved its plan. Although this sounds simple, strategic planning is an ongoing endeavour and requires constant attention. Depending on university directions and events, the SALC team establishes the duration of each phase of the plan, typically between five and ten years.
Mission and vision
All of the full-time staff participate in updating the strategic plan, which involves (1) discussing the mission and vision statements and making changes if necessary, and (2) discussing and establishing broad focus areas. When possible, student staff also have a chance to participate in some of the meetings. Paying attention to the mission statement ensures that the SALC directions focus on core values and services. The vision statement is helpful for imagining future developments, drawing on global trends and technological advances and creating an image of an ideal future SALC scenario. As SALC director, I make sure that we revisit the mission and vision each year in order to keep us all on track.
Specific focus areas
In a series of meetings over the course of several months, sub-teams establish more specific focus areas and then break them down into achievable and measurable goals. Around three to four times a year, we meet to review what we have achieved and to confirm the priorities for the coming semester.
The current 2016-2026 plan has five focus areas, each with several sub-goals. The broad goals are as follows (the actual plan includes sub-goals, specific details, priorities, and timeframes):
Goal 1: To provide opportunities to develop language learner autonomy
Goal 2: To provide a suitable learning environment and resources for our students’ needs
Goal 3: To provide access to multiple learning communities to inspire and motivate learners
Goal 4: To increase language proficiency related to students’ current and future goals
Goal 5: To collaborate with others and continue to develop our professional expertise
Evaluation using a strategic plan
Establishing and monitoring a strategic plan is a useful ‘big picture’ approach to evaluating a SALC. It is rewarding to be able to ‘check off’ achievements at the end of each semester and feel a sense of progress. However, updating a strategic plan each year can feel like a never-ending ‘to do’ list unless there is a chance to regularly revisit the vision statement. Ideally, the strategic planning process benefits from including outside perspectives in order to generate alternative ideas and insights. This is something that we have not been doing at KUIS, but plan to initiate in the new academic year. In our case, this will require funding in order to invite SALC experts from other contexts to join our planning discussions.
Ongoing Research Cycles
A second approach to evaluating the SALC is to ensure that we engage in ongoing research projects as a way to systematically investigate aspects of the SALC detailed in the strategic plan. Each service, facility or event documented in the plan is evaluated periodically. The ultimate goal is to serve our students’ needs, so the first questions related to each research project are always:
- What are the needs of our students? (these change, so this question should be revisited every few years)
- What are the best ways to support our students?
For ongoing research designed to evaluate and improve the SALC’s features or services, the overall research questions tend to be the same:
- How well is this service/facility/event serving the needs of our students?
- How could it be further improved?
The research methods tend to have been tried and tested over many years, often using the same instruments in order to see the development over time. They tend to draw upon multiple (and reasonably convenient) data sources, for example:
- A literature review
- Usage figures
- Focus group discussions
- Questionnaires gathering learner perceptions, learning advisor perceptions, teacher perceptions, etc.
Some projects draw upon more innovative and/or time-consuming research methods such as:
- Discourse analysis
- Analysis of learner diaries or reflective reports
- Analysis of learner portfolios or other documents
- Interviews with users and staff members
- Longitudinal studies over several years
To illustrate how a SALC feature or service is evaluated according to a research cycle approach, I draw upon some research currently in progress and present an example project in the next section.
An example evaluation project
Purpose of the research. To evaluate the “Effective Language Learning Course” (ELLC)
Background. The ELLC aims to develop self-directed language learning skills in order to promote language learner autonomy. The content draws on the literature in the areas of learner autonomy, self-regulated learning and self-directed learning and is based on our students’ needs (see Thornton, 2013, and Takahashi et al., 2013, for details). The broad learning outcome areas are as follows (see Takahashi et al., 2013, for specific details):
- Knowing about support / opportunities outside class
- Setting and reviewing goals
- Selecting, using and evaluating resources
- Identifying, using and evaluating strategies
- Making, implementing and evaluating a learning plan
- Evaluating linguistic and learning gains
Research questions.
- How satisfied are the learners with the ELLC?
- After completing the ELLC, are students able to meet the course learning outcomes?
Methods.
- Student survey to investigate perceptions and level of satisfaction with the course, and students’ self-evaluations of learning gains as defined by the course learning outcomes.
- Analysis of learning journals, portfolios and reflective reports to investigate actual evidence of whether the learners demonstrated a working knowledge of the learning outcomes.
- Interviews with learners to reach a greater understanding of the findings.
Summary of the main findings. The questionnaire and interview data indicate a high level of student satisfaction with the course. In addition, the participant responses show ways in which the course influenced how the students thought about their language learning. Students also generally felt that the course helped them to achieve all of the learning outcomes. The analysis of journals, reports, and portfolios indicated that in most cases, the majority of the learners demonstrated evidence of meeting most of the learning outcomes. The only learning outcome that was not adequately met was the students’ ability to evaluate their linguistic development.
Outcome. As a result of the research, the SALC team can be confident that the course is mostly meeting students’ needs. However, there have been discussions about how realistic it is to expect learners to be able to evaluate their linguistic development after just one semester. It is likely that the learning outcome will be adjusted.
Benefits and challenges of SALC evaluation using research cycles
Using research cycles has been a highly useful approach to engaging in continued evaluation and improvement of a service or facility. It ensures that the approach is systematic and well documented. Much of our ongoing research includes journal publications or conference presentations by team members at intervals. This creates a sense of achievement and emphasises collaboration as different team members work together at various points. Establishing research cycles is also useful for enabling new staff to join existing projects and contribute to the ongoing development of the SALC even in their first year at KUIS. There are, however, a couple of points to be aware of. The first is that if the research cycles are the only approach, it is important to periodically take a ‘big picture’ view in order to allow for innovation rather than simply continue to offer almost exactly the same service or resource year after year. The second potential challenge is that the research can be quite time-consuming as it relies on multiple data sources. This can be managed by establishing a timeline depending on how often a service needs to be evaluated. I will discuss this point in more detail below.
Timelines and Cycles
Establishing timelines is something we have improved upon recently, having gained experience of several evaluation cycles. Ideally, timelines should be drafted alongside the strategic plan. Knowing how often to completely re-evaluate and how often to conduct micro-evaluations of a particular resource or facility is useful information for making the process efficient. For example, is it necessary to gather student feedback on courses each semester if the service remains unchanged, or is once every 3-4 years sufficient? A major re-evaluation, such as revisiting the SALC philosophy or evaluating its curriculum, might be needed every ten years. Other micro-aspects of the SALC, such as course evaluation, or evaluating the quality or usage of a service such as events, advising, orientations, or technology, might typically need to be evaluated every three or four years. A practical evaluation timeline (based on a simplified version of the plans at KUIS) might look something like this (the shaded spaces indicate where a research cycle is in progress):
Table 1. Sample Evaluation Timeline
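The logic of such a timeline can be sketched in a few lines of code. The sketch below is purely illustrative: the services and cycle lengths are invented for the example and do not represent KUIS’s actual schedule.

```python
# Hypothetical evaluation cycles (in years) for SALC services.
# The services and intervals are illustrative only.
cycles = {
    "SALC philosophy/curriculum": 10,
    "Course evaluation": 4,
    "Advising service": 3,
    "Events and workshops": 3,
    "Orientation": 4,
}

def due_in(year, start_year=2016):
    """Return the services whose evaluation cycle falls due in the given year."""
    elapsed = year - start_year
    return [service for service, every in cycles.items()
            if elapsed > 0 and elapsed % every == 0]

print(due_in(2019))  # advising and events fall due three years into the plan
print(due_in(2026))  # the major ten-year re-evaluation falls due
```

A schedule generated this way makes it easy to see at a glance which research cycles should be active in any given year and to spread the workload across the duration of the strategic plan.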
So far, I have described three retrospective approaches to SALC evaluation. As a team we have been discussing plans to test alternative future-looking and predictive approaches, but due to the lack of literature, guidance, and experience, these will be experimental. The first approach involves data mining, i.e. utilising large amounts of pre-existing data in order to learn something new about the SALC. The second turns to the business world to get insights from different industry leaders. Hayo Reinders is acknowledged for inspiring both of these ideas (personal communication, September, 2016).
Big data and learning analytics
Big data is a term used to describe very large amounts of data that tend to be beyond the abilities of common statistics software (Manyika et al., 2011). Learning analytics is the measuring, collecting, analysing and reporting of such data in order to optimise learning and learning environments (Long & Siemens, 2011). Long and Siemens (2011) describe big data and analytics as “the most dramatic factor shaping the future of higher education” (p. 31), but they have not yet been utilised to evaluate or predict self-access learning. Educational data mining and learning analytics would surely be a useful approach for predicting patterns of self-access use and seeing relationships between variables. According to Reinders (2016), drawing upon data we already have would allow us to conduct different kinds of analyses in order to (for example) visualise patterns, predict student performance, and identify student groups. We could use this information to plan ahead, and design courses and tailored programmes for certain users. In our case, we can only guess that patterns such as the following are true:
- Students who take our SALC courses are more likely to continue to engage in lifelong learning.
- Students who attend regular SALC workshops and events will increase their language proficiency more dramatically than non-users.
Although we do have some knowledge of trends from our small-scale research findings, drawing on large data sets would help us to consider many factors that affect success in learning. Examples of these factors are: proficiency when entering the university, gender, age, classes being taken, grades in high school, major, study abroad, club activities, and part-time jobs. We could also consider whether SALC-specific data, such as whether students take our courses, attend workshops, or regularly meet with learning advisors, have any effect on learning.
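As a minimal illustration of the kind of analysis this could enable, the sketch below compares the mean proficiency gains of hypothetical SALC course takers and non-takers. All of the records, field names, and scores are invented for the example; a real learning-analytics project would draw on institutional records and far more variables than this.

```python
from statistics import mean

# Hypothetical student records: entry/exit proficiency scores and whether
# the student took a SALC course (all values invented for illustration).
students = [
    {"entry": 450, "exit": 520, "salc_course": True},
    {"entry": 480, "exit": 530, "salc_course": True},
    {"entry": 460, "exit": 490, "salc_course": False},
    {"entry": 500, "exit": 515, "salc_course": False},
    {"entry": 440, "exit": 510, "salc_course": True},
]

def mean_gain(records, took_course):
    """Average proficiency gain for students who did/did not take a SALC course."""
    gains = [r["exit"] - r["entry"] for r in records
             if r["salc_course"] == took_course]
    return mean(gains) if gains else 0.0

print(f"SALC course takers: +{mean_gain(students, True):.1f}")
print(f"Non-takers:         +{mean_gain(students, False):.1f}")
```

Even a toy comparison like this shows the principle: with real data sets, the same grouping logic extended across many variables could begin to reveal which patterns of SALC use are actually associated with learning gains.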
Currently we lack expertise in this kind of data analysis and would certainly require help and training, but it seems to be a very powerful tool. This would be a completely different approach and potentially transform the current process.
External evaluators
The second approach, which would allow a completely different kind of evaluation, potentially identifying blind spots, oversights, and inefficiencies, would be to invite external evaluators to perform the evaluation according to their own criteria. The obvious place to start would be to invite an experienced director of another SALC to undertake the evaluation, which is not unusual in our field. However, it might be more beneficial to also invite experts from different fields to evaluate it. For example, perhaps the SALC could be evaluated by an accountant, a librarian, a restaurant owner, a department store manager, a manager of a language school, a careers counsellor, a bookshop manager, a high school teacher, and so on. As business owners and specialists, these professionals are likely to be skilled at running efficient systems and are likely to have significant training and experience in accounting, PR, advertising, marketing and other practices unfamiliar to SALC team members. The evaluation process would force us to ask and answer questions that we may not have considered before, prompting new kinds of reflection.
Writing this article has prompted me for the first time to document exactly how we evaluate our SALC and how all of the parts fit together. I recommend this process to anyone who (like me) has struggled to find the right way to show that their SALC is an effective and efficient entity. Each SAC will have its unique features, users, stakeholders, practices and priorities that will emerge and evolve over time and influence the evaluation process. The following points are some general recommendations, based on what has worked at KUIS, for guiding the evaluation process:
- Have an ongoing strategic plan
- Break down the plan into manageable chunks
- Celebrate successes frequently
- Involve the entire team in the process
- Draw upon research to guide changes
- Plan research cycles in advance
- Focus on learners’ needs first
- Revisit the mission and vision statements regularly
- Include diverse, outside perspectives in discussions
- Expect the evaluation to be an ongoing process
Notes on the Contributor
Jo Mynard is associate professor and director of the Self-Access Learning Centre (SALC) at Kanda University of International Studies (KUIS) in Japan. She has co-edited several books on learner autonomy and on advising in language learning and recently co-authored a book on reflective dialogue and advising. She has been the editor of SiSAL (Studies in Self-Access Learning) Journal since 2010.
References
Gardner, D. & Miller, L. (2015). Managing self-access language learning. Hong Kong: City University Hong Kong Press.
Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, September / October, 31-40. Retrieved from http://er.educause.edu/~/media/files/article-downloads/erm1151.pdf
Manyika, J., Chui, M., Brown, B., Bughin, J., Dobbs, R., Roxburgh, C., & Hung Byers, A. (2011, May). Big data: The next frontier for innovation, competition, and productivity. Report, McKinsey Global Institute. Retrieved from http://www.mckinsey.com/business-functions/business-technology/our-insights/big-data-the-next-frontier-for-innovation
Reinders, H. (2016). Educational data mining and learning analytics. Workshop given at Kanda University of International Studies, September, 2016.
Riley, P. (1996). The blind man and the bubble: Researching self-access. In R. Pemberton, E. S. L. Li, W. W. F. Or & H. D. Pierson. (Eds.). Taking control: Autonomy in language learning, (pp. 251-264). Hong Kong: Hong Kong University Press.
Takahashi, K., Mynard, J., Noguchi, J., Sakai, A., Thornton, K., & Yamaguchi, A. (2013). Needs analysis: Investigating students’ self-directed learning needs using multiple data sources. Studies in Self-Access Learning Journal, 4(3), 208-218. Retrieved from https://sisaljournal.org/archives/sep13/takahashi_et_al/
Thornton, K. (2013). A framework for curriculum reform: Re-designing a curriculum for self-directed language learning. Studies in Self-Access Learning Journal, 4(2), 142-153. Retrieved from https://sisaljournal.org/archives/june13/thornton/