With Dr Christopher Cheong from RMIT University in Melbourne, Australia. Christopher’s work is concerned with how students’ experiences of learning can be enhanced and how students can become better engaged and motivated as they self-regulate their learning. He is based in the School of Business IT and Logistics.
These rough notes were taken during the presentation to summarise the main points of interest.
The project presented looks at reflection rather than prediction, in contrast to the usual focus of learning analytics research on predicting student outcomes. The aim was to support students in reflecting on their own learning processes.
A key issue with learning analytics is the volume of data that can be amassed from VLEs, producing data overload for both individual students and staff, who may be unable to identify which data are relevant to learning and to teaching. The relevance of data can be made more visible through effective learning design – learning design being the coordination of resources, tasks and support mechanisms. Support mechanisms assist students in their learning and so include teacher and peer feedback as well as facilitation mechanisms.
Learning analytics are derived from different components of the learning design. Lead or checkpoint indicators provide metrics of student engagement, such as logins, downloads, etc., and so are concerned with the learning resources. Such metrics are useful but say little about the learning processes of students. Process analytics, such as social network analysis and content analysis, are derived from the task components of the learning design and provide data on the learning processes or behaviours of students.
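The distinction between checkpoint indicators and process analytics can be illustrated with a minimal sketch. The event-log format, names and event types below are assumptions for illustration, not details from the talk:

```python
from collections import Counter

# Hypothetical VLE event log: (student, event_type) pairs.
# The student names and event types are illustrative assumptions.
events = [
    ("alice", "login"), ("alice", "download"), ("alice", "login"),
    ("bob", "login"),
]

def checkpoint_indicators(events):
    """Count simple engagement events (logins, downloads) per student.
    These are checkpoint indicators: they show that a resource was
    touched, but say nothing about the learning process itself."""
    counts = {}
    for student, event_type in events:
        counts.setdefault(student, Counter())[event_type] += 1
    return counts

print(checkpoint_indicators(events))
```

Process analytics would instead mine the content or interaction structure of task submissions, which cannot be reduced to simple counts like these.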
The concept of the project is to analyse small, bite-sized learning tasks within a short feedback loop of task, test and monitor (TTM). Feedback is therefore immediate, both summative and formative, and is provided to students and staff individually and at aggregate levels. The TTM system provides short activities with detailed prerequisites and next tasks as scaffolding and sequencing. Feedback is provided on MCQs but, where an answer is incorrect, the correct answer is not given, so the student can usefully retake the test multiple times. The analytics provide performance information as progress graphs and average result scores at different levels of detail, including comparisons between different courses on the system. Some social comparison against the rest of the class is also provided. Learning behaviours are therefore understood as student interactions with the TTM system. Analysis was based on test submissions, and the tests were not part of the formal course assessment. TTM was used in three classes with about 50 students participating.
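The feedback loop described above – marking an MCQ attempt without revealing correct answers, then aggregating scores for the progress views – can be sketched roughly as follows. The question keys, data structures and scoring are my assumptions, not the TTM implementation:

```python
# A minimal sketch of a TTM-style feedback loop; all names and
# structures here are illustrative assumptions.

ANSWER_KEY = {"q1": "b", "q2": "d", "q3": "a"}  # hypothetical MCQ key

def grade_attempt(responses):
    """Mark an MCQ attempt. Incorrect items are flagged but the correct
    option is withheld, so retaking the test remains meaningful."""
    feedback = {q: ("correct" if responses.get(q) == ans else "incorrect")
                for q, ans in ANSWER_KEY.items()}
    score = sum(1 for v in feedback.values() if v == "correct") / len(ANSWER_KEY)
    return feedback, score

def progress_summary(attempt_scores):
    """Aggregate analytics: latest score per test plus the overall
    average - the kind of data behind the progress graphs."""
    latest = {test: scores[-1] for test, scores in attempt_scores.items()}
    average = sum(latest.values()) / len(latest)
    return latest, average
```

For example, `grade_attempt({"q1": "b", "q2": "a", "q3": "a"})` flags `q2` as incorrect without saying the right answer was `d`, which is what allows trial-and-error retakes as well as feedback-driven ones.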
The data showed students tended to fall behind in submitting tests and so were not necessarily working consistently on the course materials, with few on-time submissions from week 5 onwards, although they did use the tests to aid revision for exams. In the course, the TTM system was used in the classroom-based tutorials, resulting in a high rate of usage, with the vast majority of students finding the system somewhat or very useful for their learning. Students noted that the increased workload from other courses was a reason why their use of the tests declined over time. Students tended to look for very high scores on the tests, and a high number took a test multiple times to ensure they got a high score. Some students used the tests on a trial-and-error basis to improve their scores, while others used the feedback to do so. I would take from this that the system may well encourage a more surface approach to learning focused on test performance.
For the students, the benefits of the system lay in identifying gaps in their knowledge or in validating and reinforcing it. In terms of their learning process, as the course progressed students increasingly used the tests as diagnostic guides to what they needed to learn and as a revision tool to validate and reinforce their knowledge and understanding. It was also noted that the system helped with the students’ time management, including allocating study time on the basis of the stated task duration in the task descriptors. Most students also used the analytic data and found it useful for social comparison, for tracking progress in the course using an automatically generated time-based task list, and for tracking their progress against their learning goals. There was evidence that the analytics supported students in reflecting on and evaluating their own learning behaviours.
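A time-based task list generated from stated task durations might work along these lines. The talk only mentioned that durations appear in the task descriptors and that the list is generated automatically; the task names, pacing and scheduling logic below are assumptions:

```python
from datetime import date, timedelta

# Hypothetical task descriptors: (task name, stated duration in minutes).
tasks = [("Intro MCQ", 15), ("Process models MCQ", 20), ("Revision test", 30)]

def time_based_task_list(tasks, start, per_week=1):
    """Spread tasks across weeks from a start date, keeping the stated
    duration alongside each task so students can allocate study time."""
    schedule = []
    for i, (name, minutes) in enumerate(tasks):
        due = start + timedelta(weeks=i // per_week)
        schedule.append((due, name, minutes))
    return schedule
```

Each entry pairs a due date with the task's stated duration, which is the information students reportedly used to budget their study time.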
Future work will involve further analysis of the data on the frequency of submissions and the effects of test results, and improving the analytics features with a focus on prescriptive learning analytics that support students in improving their learning processes. It was noted that mature learners able to regulate their own learning would be less likely to be helped by the system, and that the effects will likely be greater for mid- and lower-performing students.
There was also an interesting discussion on students over-estimating how much time they spend studying, and on returning data to students on what they actually do as an important component of self-regulating their learning.