I’m back in the Ambient Labs again, this time for a workshop on learning analytics for staff here at The Open University.
Challenges for Learning Analytics: Visualisation for Feedback
Denise Whitelock described the SaFeSEA project, which centres on trying to give students meaningful feedback on their activities. SaFeSEA was a response to high student dropout rates: 33% of new OU students don’t submit their first TMA (Tutor-Marked Assignment). Feedback on submitted writing prompts ‘advice for action’: a self-reflective discourse with a computer. Visualizations of these interactions can open a discourse between tutor and student.
Students can worry a lot about the feedback they receive. Computers can offer non-judgmental, objective feedback without any extra tuition costs. OpenEssayist analyses the structure of an essay; identifies key words and phrases; and picks out key sentences (i.e. those that are most representative of the overall content of the piece). This analysis can be used to generate visual feedback, some forms of which are more easily understood than others.
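The talk didn’t go into how OpenEssayist scores sentences, but the idea of picking out the sentences most representative of the whole piece can be illustrated with a crude word-frequency heuristic (the function name and scoring rule here are my own illustration, not OpenEssayist’s actual method):

```python
from collections import Counter
import re

def key_sentences(text, n=2):
    """Return the n sentences whose words are most frequent in the
    text overall -- a rough proxy for 'most representative'."""
    sentences = [s.strip() for s in re.split(r'(?<=[.!?])\s+', text) if s.strip()]
    freq = Counter(re.findall(r'[a-z]+', text.lower()))

    def score(sentence):
        tokens = re.findall(r'[a-z]+', sentence.lower())
        # Average word frequency, so long sentences aren't favoured unduly.
        return sum(freq[t] for t in tokens) / max(len(tokens), 1)

    return sorted(sentences, key=score, reverse=True)[:n]
```

A real system would need stop-word filtering and better weighting (e.g. TF-IDF), but even this toy version shows how the “key sentences” that drive the visual feedback could be extracted automatically.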
Bertin (1977/81) provides a model for the visualization of data. Methods can include diagrams that show how well connected different passages are to the whole, or patterns that highlight different types of essay. These can be integrated with social network analysis and discourse analytics.
Can students understand this kind of feedback? Might they need special training? Are these tools that would be used primarily by educators? Would they also need special training? In both cases, it’s not entirely clear what kind of training this might be (information literacy?). Can one generic tool support writing across all disciplines, or are discipline-specific tools needed?
The Wrangler’s relationship with the Science Faculty
Doug Clow then presented on ‘data wrangling’ in the science faculty at The Open University. IET collects information on student performance and presents this back to faculties in a ‘wrangler report’ able to feed back into future course delivery / learning design.
What can faculty do with these reports? Data is arguably better at highlighting problems, or potential problems, than it is at solving them. The process can perhaps get better at identifying key data points or performance indicators, but faculty still need to decide how to act on this information. If we move towards the provision of more specific guidance, the role of faculty could arguably be diminished over time.
The relation between learning analytics and learning design in IET work with the faculties
Robin Goodfellow picked up these themes from a module team perspective. Data can be understood as a way of closing the loop on learning design, creating a virtuous circle between the two. In practice, there can be significant delays in processing the data in time for it to feed in. But the information can still be useful to module teams in thinking about the course in terms of:
- Information Management
- Learning Experience
This can give rise to quite specific expectations about the balance of different activities and learning outcomes. Different indicators can be identified and combined to standardize metrics for student engagement, communication, etc.
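The details of how indicators are combined weren’t specified, but one standard way to combine indicators on different scales (VLE clicks, forum posts, logins) into a single engagement metric is to standardize each as a z-score and average them. The indicator names and equal weighting below are illustrative assumptions, not the OU’s actual method:

```python
from statistics import mean, stdev

def standardise(values):
    """Convert raw indicator values to z-scores so indicators on
    different scales can be compared and combined."""
    mu, sigma = mean(values), stdev(values)
    return [(v - mu) / sigma for v in values]

def engagement_index(indicators):
    """indicators: dict of name -> list of per-student raw values.
    Returns one combined score per student: the mean of that
    student's standardised indicators (equal weighting assumed)."""
    z_cols = [standardise(vals) for vals in indicators.values()]
    return [mean(student) for student in zip(*z_cols)]
```

For example, `engagement_index({"vle_clicks": [10, 20, 30], "forum_posts": [1, 2, 3]})` gives one score per student on a common scale. The choice of weights is exactly the kind of normative decision the session warned about.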
In this way, a normative notion of what a module should be can be said to be emerging. (This is perhaps a good thing in terms of supporting course designers but may have worrying implications in terms of promoting homogeneity.)
Another selective element arises because it’s usually only possible to collect data on a selection of indicators: we might come to place too much emphasis on the data we do have, instead of thinking about the significance of data that has not been collected.
The key questions:
- Can underlying learning design models be identified in data?
- If so, what do these patterns correlate with?
- How can all this be bundled up to faculty as something useful?
- Are there implications for general elements of course delivery (e.g. forums, VLE, assessment)?
- If we only permit certain kinds of data for consideration, does this lead to a kind of psychological shift where these are the only things considered to be ‘real’ or of value?
- Is there a special kind of interpretative skill needed to make sense of learning analytics?
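On the first question, one simple way to look for underlying learning design models in data is to compare each module’s mix of activity types against a set of design prototypes and report the nearest match. Everything here is a hypothetical sketch: the prototype labels, the activity-mix tuples, and the distance measure are my own illustration, not an OU analysis:

```python
def nearest_design(module_mix, prototypes):
    """Match a module's activity mix (e.g. proportions of time on
    reading, discussion, and production) to the closest prototype
    design, by squared Euclidean distance over the proportions."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(prototypes, key=lambda name: dist(module_mix, prototypes[name]))

# Hypothetical prototypes: proportions of (reading, discussion, production).
prototypes = {
    "content-heavy": (0.7, 0.1, 0.2),
    "discussion-led": (0.3, 0.5, 0.2),
}
```

With these toy prototypes, a module spending most of its time on reading would be matched to "content-heavy". Whatever these patterns turn out to be, the second question — what they correlate with — would still need separate analysis against outcomes data.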
Learning Design at the OU
Annie Bryan drilled a little deeper into the integration of learning design into the picture. Learning design is now a required element of course design at The Open University. There are a number of justifications given for this:
- Quality enhancement
- Informed decision making
- Sharing good practice
- Improving cost-effectiveness
- Speeding up decision making
- Improving online pedagogy
- Explicitly representing pedagogical activity
- Managing student workload effectively
A number of (beta) tools for Learning Design have been produced. These are focused on module information; learning outcomes; activity planning; and mapping modules and resources. They are intended to support constructive engagement over the life of the course. Future developments will also embrace a qualification-level perspective, mapping activities against qualification routes.
These tools are intended to help course teams think critically about, and discuss, the purpose of the tools and resources chosen in the context of the course as a whole and the student learning experience. A design perspective can also help to identify imbalances in course structure or problematic parts of a course.