
Learning from disruption #oer2015

I’m in Sausalito, California in the shadow of the Golden Gate Bridge for the Hewlett Grantees meeting this week.  Today is the start of proceedings proper, and I’m going to blog some of the presentations and seminars.  First off is Douglas Gayeton of Lexicon of Sustainability, which attempts to explain basic principles of economy and environment to a wide audience.

As a film-maker and photographer, Douglas has often reflected on the way that pictures omit as much as they include.  How can a picture capture the full story?  One way is to take lots of pictures and then turn these into a composite image.  This image can then be used to explain the overall message, perhaps with textual cues added.  We need to recast information in ways that people can understand.  (Compelling graphics can also be especially memorable.)  At the other end of the scale of transparency, he suggests, we might think about Latin versions of the Bible, which made religion obscure and opaque.

An underlying assumption here is that when consumers are better informed they will make better choices.  This seems like a fairly big assumption to me.  (What about, for example, smokers who are fully aware of the dangers of their habit and yet continue to chuff away?  Motivation is also important.  Perhaps it is better to say that being well informed is a necessary but not sufficient condition for making good choices?  Alternatively, a paternalistic approach could make ‘good’ decisions on behalf of consumers without any need for the consumer to know what is good for them.)

Chains like Whole Foods now require clear labelling of products with GMO ingredients – Gayeton compares this to Martin Luther’s insistence that the Bible be translated into native languages.  Consumers are also demanding transparency around the use of antibiotics to increase the weight of livestock.  Labelling around when and where fish were caught is expected to follow, as is information about grain origins.

It is argued that making improved information about food available to consumers is sustaining a national ‘locavore’ movement where localism, greater co-operation and seasonal eating replace the industrialisation of food production.  The ‘New Corner Store’ movement encourages consumers to ask for better options in their local stores.  Project Localize brings the message into schools.  Douglas takes comfort from the various grass-roots movements and smallholdings: personal stories can be effective for communication.

I must admit that how this relates to OER wasn’t that clear to me, and there wasn’t much exploration of the disruptive elements which might be considered transferable.  I suppose that the idea is to change cultures through improved information though I suspect this is actually only half the battle.  There’s definitely a sense in which the message about OER, nuanced as it is, doesn’t always travel that far beyond the open education movement and its advocates.   But is the idea that we emphasize the hard data, analytics and metrics?  There doesn’t appear to be much of this in the materials we have been presented with today.   Should we instead focus on personal stories and narratives (which seem to be the focus here)?  Both?


Workshop Notes: #Ethics and #LearningAnalytics

This morning I’m attending a talk given by Sharon Slade about the ethical dimensions of learning analytics (LA), part of a larger workshop devoted to LA at The Open University’s library on the Walton Hall campus.

I was a bit late from a previous meeting but Sharon’s slides are pretty clear so I’m just going to crack on with trying to capture the essence of the talk.  Here are the guidelines currently influencing thinking in this area (with my comments in parentheses).

  1. LA as a moral practice (I guess people need to be reminded of this!)
  2. OU has a responsibility to use data for student benefit
  3. Students are not wholly defined by their data (Ergo partially defined by data?)
  4. Purpose and boundaries should be well defined and visible (transparency)
  5. Students should have the facility to update their own data
  6. Students as active agents
  7. Modelling approaches and interventions should be free from bias (Is this possible? What kind of bias should be avoided?)
  8. Adoption of LA requires broad acceptance of its values and benefits, and the development of appropriate skills (Not sure I fully grasped this one)

Sharon was mainly outlining the results of some qualitative research done with OU staff and students. The most emotive discussion was around whether or not this use of student data was appropriate at all – many students expressed dismay that their data was being looked at, much less used to potentially determine their service provision and educational future (progress, funding, etc.). Many felt that LA itself is a rather intrusive approach which may not be justified by the benevolent intention to improve student support.

While there are clear policies in place around data protection (as at most universities), there were concerns about the use of raw data and of information derived from data patterns. There was also considerable concern about whether analysts could adequately understand the data they were looking at and treat it responsibly.

Students want to have a 1:1 relationship with tutors, and feel that LA can undermine this; although at the OU there are particular challenges around distance education at scale.

The dominant issue concerned being able to opt out of data collection without this having an impact on students’ future studies or how they are treated by the university. The default position is one of ‘informed consent’, where students are currently expected to opt out if they wish. The policy will be explained to students at the point of registration, as well as through case studies and guidance for staff and students.

Another round of consultation is expected around the issue of whether students should have an opt-out or opt-in model.

There is an underlying paternalistic attitude here – the university believes that it knows best with regard to the interests of the students – though it seems to me that this potentially runs against the idea of a student centred approach.

Some further thoughts/comments:

  • Someone like Simon Buckingham-Shum will argue that the LA *is* the pedagogy – this is not the view being taken by the OU but we can perhaps identify a potential ‘mission creep’
  • Can we be sure that the analyses we create through LA are reliable?  How?
  • The more data we collect and the more open it is then the more effective LA can be – and the greater the ethical complexity
  • New legislation requires that everyone will have the right to opt-out but it’s not clear that this will necessarily apply to education
  • Commercialisation of data has already taken place in some initiatives

Doug Clow then took the floor and spoke about other LA initiatives.  He noted that the drivers behind interest in LA are very diverse (research, retention, support, business intelligence, etc.) and surveyed a number of projects of note.

Many projects are attempting to produce the correct kind of ‘dashboard’ for LA.  Another theme is around the extent to which LA initiatives can be scaled up to form a larger infrastructure.  There is a risk that with LA we focus only on the data we have access to and everything follows from there – Doug used the metaphor of darkness/illumination/blinding light. Doug also noted that machine learning stands to benefit greatly from LA data, and LA generally should be understood within the context of trends towards informal and blended learning as well as MOOC provision.

Overall, though, it seems that evidence for the effectiveness of LA is still pretty thin, with very few rigorous evaluations. This could reflect the age of the field (a lot of work has yet to be published) or, alternatively, that LA isn’t really as effective as some hope.  For instance, it could be that any intervention is effective regardless of whether it has some foundation in the data that has been collected (nb. the ‘Hawthorne effect’).

#oerrhub on the LSE Impact of Social Science blog

This is a duplicate of my article from the LSE Impact of Social Science blog which was published today.  You can find the original here.

Much sharing and use of open educational resources (OER) is relatively informal, difficult to observe, and part of a wider pattern of open activity. What the open education movement needs is a way to draw together disparate fragments of evidence into a coherent analytic framework. Rob Farrow provides background on a project devoted to consolidating efforts of OER practitioners by inviting the open community to contribute directly and submit impact narratives. Through the mapping of these contributions, the data can continue to grow iteratively and support the decisions made by educators, students, policymakers and advocates.

The Open Education movement is now around ten or twelve years old and has started to make a significant difference to education practices around the world. Open educational resources (OER) are resources (article, textbook, lesson plan, video, test, etc.) that might be used in teaching or learning. They are considered ‘open’ when they are openly licensed in ways that [permit] no-cost access, use, adaptation and redistribution by others with no or limited restrictions or, more simply, their free use and re-purposing by others.

This distinction might seem rather subtle and legalistic at first. But the whole of the open education movement is predicated on the idea that open licensing leads to far-reaching and beneficial change. By providing an alternative to traditional copyright, open licenses make it possible to share and repurpose materials at marginal cost. It is often stated, for instance, that OER have the potential to increase access to education by lowering the prohibitive cost of textbooks or journal subscriptions. Some claim that OER allows for more innovative teaching and closer bonds between teachers and learners as a result of a more reflexive syllabus. Others hold the view that open licensing will align existing pedagogies along more collaborative and networked lines.

Image credit: opensource.com via Flickr (CC BY-SA)

Since open licensing in conjunction with digital technology enables duplication and adaptation of materials almost anywhere in the world at next to no cost, it’s easy to see how the implications for educational institutions may be manifold. Perhaps the strongest evidence for this thus far comes from the open access movement, which continues to press academic publishers for better value.

Unsurprisingly, much research has gone into ascertaining the evidence that exists in support of these claims. A good portion of earlier OER research focused on establishing the relative quality of open materials and found that they are generally at least as good as equivalent commercial materials (though there are of course variations in quality). But there are reasons why establishing a clear picture of the wider impact of OER adoption is more complex.

Let’s leave aside for now issues around the much discussed and yet nebulous term “impact”. OER adoption is taking place within a world of education undergoing radical change. Where OER does change practices there are often multiple interventions taking place at the same time and so it is hard to isolate the particular influence of openness. Use contexts can vary wildly between countries and education levels, and cultural differences can come into play. Furthermore, much sharing and use of open educational materials (such as Wikipedia) is relatively informal, difficult to observe, and part of a wider pattern of activity. This is not to say that there isn’t good quality OER research out there, but the typical dependence on softer data might sometimes be thought unconvincing. Further complications can arise from inconsistencies in understanding what ‘open’ means to different groups.

Nonetheless, there remains a need for evidence that would support (or discount) the key claims expressed in the rhetoric around OER, as well as an overall picture of global activity. What the open education movement needs is a way to draw together disparate fragments of evidence into a coherent analytic framework that can support judgments about OER impact for a range of use cases.

OER Research Hub (OERRH) is a research project in IET at The Open University which addresses these issues in an open and collaborative way. Our project aspires to be open in both its focus and the methods we use to gather and share data. We’ve taken a mixed-methods approach to research depending on the context, and we’ve also undertaken some of the largest surveys about OER use and attitudes from a range of stakeholders. By using a survey template that is consistent across the different samples it becomes possible to see patterns across countries and sectors. Our research instruments and data are released under open licenses and we have an open access publication policy. By encouraging a culture of open sharing we have been able to consolidate the efforts of OER practitioners and help to build a shared understanding.

We work openly with a range of collaborators around the world to gather data and share practical experience and also have a fellowship scheme that helps to foster a worldwide network of experts. By focusing on collecting data around ‘impact’ in situ we are able to build up an evolving picture of changing practices.

The analytic framework for pulling together the data includes a set of research hypotheses which reflect some of the main claims made about OER. These help to provide focus, but further structure is provided by the use of geospatial coordinates (which are of course universal), allowing disparate data types to be plotted across a shared geographical base.

Image credit: OER Impact Map (OER Research Hub)

Mapping has become popular within the OER world, and there is a lot of interest in maps for strengthening communities and as tools for building a shared understanding of the world.  Accordingly, OERRH’s OER Impact Map acts as both research tool and dissemination channel.  By using a simple metadata structure for different data types it becomes possible to visualize (as well as simply ‘map’) information.  For instance, real-time reporting of the evidence gathered for each hypothesis, or visualising the sum of evidence gathered, helps us to understand the data.  Soon it will be possible to browse the project survey data directly as well as interact with more detailed, structured narratives about OER impact.  The map itself will continue to help us to see patterns in the data and cross-reference the evidence gathered.

Image credit: OER Impact Map (OER Research Hub)
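To illustrate why a simple, consistent metadata structure makes this kind of real-time reporting straightforward, here is a minimal sketch in Python.  The evidence records below are invented examples (not the actual OER Impact Map schema): once every item records which hypothesis it bears on, tallying the evidence per hypothesis for a dashboard is a one-liner.

```python
from collections import Counter

# Invented evidence records: each item notes which project hypothesis it
# bears on and whether it supports or challenges that hypothesis.
evidence = [
    {"hypothesis": "A", "polarity": "positive"},
    {"hypothesis": "A", "polarity": "negative"},
    {"hypothesis": "B", "polarity": "positive"},
]

# Counting evidence per hypothesis is trivial once the structure is shared.
counts = Counter(item["hypothesis"] for item in evidence)
print(counts["A"], counts["B"])  # prints: 2 1
```

The same counts could equally drive a bar chart or a live map layer; the point is that the visualization logic never needs to know anything about the individual data types.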

By no means is OER Impact Map complete; by its nature the data set continues to evolve. But openness is the key to the sustainability of a service like this: by inviting the open community to contribute directly and submit their impact narratives to OERRH, the data can continue to grow iteratively and support the decisions made by educators, students, policymakers and advocates. Furthermore, open licensing of evidence records allows us to close citation loops and archive data more easily, and the relative ease with which open access research can be found helps it find its way into the evidence base.

It is worth noting that the combination of mapping and curation can be flexibly applied to other research questions in educational and social science. The code for OER Impact Map is openly available on GitHub, meaning others can use it to build their own impact maps or adapt the code to their own needs. The impact map is based on a JSON information architecture which supports multiple programming languages and flexible use of the data (such as combining it with other datasets).
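As a rough illustration of what a JSON information architecture like this buys you, here is a hypothetical evidence record.  All field names are invented for the sake of the example and are not taken from the actual OER Impact Map schema.

```python
import json

# A hypothetical evidence record: a data type, the hypothesis it addresses,
# and geospatial coordinates so it can share a geographical base with other
# record types. Field names are illustrative only.
record = {
    "type": "impact_narrative",                # e.g. survey, citation, narrative
    "hypothesis": "A",                         # which project hypothesis it addresses
    "polarity": "positive",                    # supports or challenges the hypothesis
    "location": {"lat": 52.02, "lon": -0.71},  # anchor point for the map
    "source_url": "https://example.org/evidence/123",
    "licence": "CC BY 4.0",
}

# Plain JSON round-trips losslessly, so any language with a JSON parser can
# consume the data and combine it with other datasets without special tooling.
serialised = json.dumps(record)
assert json.loads(serialised) == record
```

Because the record is language-neutral, a JavaScript front end, a Python analysis script and an R notebook can all work from the same data file.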

What our project illustrates is that using openness to solve the challenges of a project can lead to innovative approaches to understanding impact. The combination of mixed-methods research into hypotheses with mapping and data visualization techniques can be flexibly applied in support of traditional research activity.

OER Research Hub is funded by The William and Flora Hewlett Foundation

Rob Farrow is a philosopher and educational technologist who researches open education at The Institute of Educational Technology, The Open University (UK). He blogs at openmind.ed and tweets as @philosopher1978.

Open Research into Open Education #calrg14

Here are my slides from today’s presentation: feedback welcome as always.

The project website is http://oerresearchhub.org and the OER Impact Map is available at http://oermap.org.

Thinking Learning Analytics

I’m back in the Ambient Labs again, this time for a workshop on learning analytics for staff here at The Open University.


Challenges for Learning Analytics: Visualisation for Feedback

Denise Whitelock described the SaFeSEA project, which is based around trying to give students meaningful feedback on their activities.  SaFeSEA was a response to high student dropout rates: 33% of new OU students don’t submit their first TMA.  Feedback on submitted writing prompts ‘advice for action’: a self-reflective discourse with a computer.  Visualizations of these interactions can open a discourse between tutor and student.

Students can worry a lot about the feedback they receive.  Computers can offer non-judgmental, objective feedback without any extra tuition costs.  OpenEssayist analyses the structure of an essay; identifies key words and phrases; and picks out key sentences (i.e. those that are most representative of the overall content of the piece).  This analysis can be used to generate visual feedback, some forms of which are more easily understood than others.
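To make the idea of ‘most representative sentences’ concrete, here is a minimal sketch of extractive key-sentence selection.  It illustrates the general technique only and is not OpenEssayist’s actual algorithm: each sentence is scored by the average document-wide frequency of its words, so sentences built from the essay’s most common vocabulary score highest.

```python
import re
from collections import Counter

def key_sentences(text, n=1):
    """Return the n sentences most representative of the whole text."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    # Word frequencies over the whole document.
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        if not tokens:
            return 0.0
        # Average corpus frequency of the sentence's words.
        return sum(freq[t] for t in tokens) / len(tokens)

    return sorted(sentences, key=score, reverse=True)[:n]

text = "Open licensing matters. Open licensing enables reuse. Cats sleep."
print(key_sentences(text))  # prints: ['Open licensing matters.']
```

A production system would use stemming, stop-word removal and a proper centrality measure, but the scoring-and-ranking shape is the same.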

Bertin (1977/81) provides a model for the visualization of data.  Methods can include diagrams which show how well connected different passages are to the whole, or the generation of different patterns that highlight different types of essay.  These can be integrated with social network analysis & discourse analytics.

Can students understand this kind of feedback?  Might they need special training?  Are these tools that could be used primarily by educators?  Would they also need special training?  In both cases, it’s not entirely clear what kind of training this might be (information literacy?).  Can one generic tool be used to support writing across all disciplines, or would discipline-specific tools be needed?

The Wrangler’s relationship with the Science Faculty

Doug Clow then presented on ‘data wrangling’ in the science faculty at The Open University.  IET collects information on student performance and presents this back to faculties in a ‘wrangler report’ which can feed into future course delivery / learning design.

What can faculty do with these reports?  Data is arguably better at highlighting problems or potential problems than it is at solving them.  This process can perhaps get better at identifying key data points or performance indicators, but faculty still need to decide how to act based on this information.  If we move towards the provision of more specific guidance then the role of faculty could arguably be diminished over time.

The relation between learning analytics and learning design in IET work with the faculties

Robin Goodfellow picked up these themes from a module team perspective.  Data can be understood as a way of closing the loop on learning design, creating a virtuous circle between the two.  In practice, there can be significant delays in processing the data in time for it to feed in.  But the information can still be useful to module teams in thinking about the course in terms of:

  • Communication
  • Experience
  • Assessment
  • Information Management
  • Productivity
  • Learning Experience

This can give rise to quite specific expectations about the balance of different activities and learning outcomes.  Different indicators can be identified and combined to standardize metrics for student engagement, communication, etc.
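A hypothetical sketch of combining indicators into a standardized engagement metric, as described above: normalize each indicator to a common scale, then take a weighted average.  The indicator names, scales and weights here are invented for illustration, not taken from any OU system.

```python
def composite_score(indicators, weights):
    """Weighted average of indicators already normalized to a 0-1 scale."""
    total = sum(weights.values())
    return sum(indicators[name] * w for name, w in weights.items()) / total

# Invented per-student indicators, each normalized to [0, 1].
student = {"forum_posts": 0.8, "vle_logins": 0.5, "tma_submission": 1.0}
# Invented weighting: assessment submission counts double.
weights = {"forum_posts": 1, "vle_logins": 1, "tma_submission": 2}

print(round(composite_score(student, weights), 3))  # prints: 0.825
```

The choice of weights is exactly where the normative questions discussed below enter: whoever sets them is deciding what ‘engagement’ means.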

In this way, a normative notion of what a module should be can be said to be emerging.  (This is perhaps a good thing in terms of supporting course designers but may have worrying implications in terms of promoting homogeneity.)

Another selective element arises from the fact that it’s usually only possible to collect data from a selection of indicators:  this means that we might come to place too much emphasis on data we do have instead of thinking about the significance of data that has not been collected.

The key questions:

  • Can underlying learning design models be identified in data?
  • If so, what do these patterns correlate with?
  • How can all this be bundled up to faculty as something useful?
  • Are there implications for general elements of course delivery (e.g. forums, VLE, assessment)?
  • If we only permit certain kinds of data for consideration, does this lead to a kind of psychological shift where these are the only things considered to be ‘real’ or of value?
  • Is there a special kind of interpretative skill that we need in order to make sense of learning analytics?

Learning Design at the OU

Annie Bryan drilled a little deeper into the integration of learning design into the picture.   Learning design is now a required element of course design at The Open University.  There are a number of justifications given for this:

  • Quality enhancement
  • Informed decision making
  • Sharing good practice
  • Improving cost-effectiveness
  • Speeding up decision making
  • Improving online pedagogy
  • Explicitly representing pedagogical activity
  • Effective management of student workload

A number of (beta) tools for Learning Design have been produced.  These are focused on module information; learning outcomes; activity planning, and mapping modules and resources.  These are intended to support constructive engagement over the life of the course.   Future developments will also embrace a qualification level perspective which will map activities against qualification routes.

These tools are intended to help course teams think critically about and discuss the purpose of the tools and resources chosen in the context of the course as a whole and the student learning experience.  A design perspective can also help to identify imbalances in course structure or problematic parts of a course.

Data visualization as simulacra

I just saw this quote over at Radical Cartography and thought it was really interesting to think about in relation to data visualization, which is essentially also making spatial representations of information.

Information is already an abstraction from experience, in that we regard it as knowledge rather than immediate sensation.  Creating representations of information thus moves us further from the referent and towards the ‘hyperreal’.  This is compounded when we visualize data in order to inform decision making, as the ‘map that precedes the territory’.

At the same time, there is something organic and biopolitical about the growth, flourishing and decline of different representations of the world which inevitably reflect and express surrounding power structures.

If we were able to take as the finest allegory of simulation the Borges tale where the cartographers of the Empire draw up a map so detailed that it ends up exactly covering the territory (but where the decline of the Empire sees this map become frayed and finally ruined, a few shreds still discernible in the deserts — the metaphysical beauty of this ruined abstraction, bearing witness to an Imperial pride and rotting like a carcass, returning to the substance of the soil, rather as an aging double ends up being confused with the real thing) — then this fable has come full circle for us, and now has nothing but the discrete charm of second-order simulacra. Abstraction today is no longer that of the map, the double, the mirror or the concept. Simulation is no longer that of a territory, a referential being or substance. It is the generation of models of a real without origin or reality: a hyperreal. The territory no longer precedes the map, nor survives it. Henceforth, it is the map that precedes the territory — PRECESSION OF SIMULACRA — it is the map that engenders the territory and if we were to revive the fable today, it would be the territory whose shreds are slowly rotting across the map. It is the real, and not the map, whose vestiges subsist here and there, in the deserts which are no longer those of the Empire but our own: The desert of the real itself.

Jean Baudrillard (1981) “The Precession of Simulacra” in Simulacra and Simulation.

There’s some quite interesting stuff over there, in fact.