
Workshop Notes: #Ethics and #LearningAnalytics

This morning I’m attending a talk given by Sharon Slade about the ethical dimensions of learning analytics (LA), part of a larger workshop devoted to LA at The Open University’s library on the Walton Hall campus.

I was a bit late from a previous meeting, but Sharon’s slides are pretty clear so I’m just going to crack on with trying to capture the essence of the talk. Here are the guidelines currently influencing thinking in this area (with my comments in parentheses).

  1. LA as a moral practice (I guess people need to be reminded of this!)
  2. OU has a responsibility to use data for student benefit
  3. Students are not wholly defined by their data (Ergo partially defined by data?)
  4. Purpose and boundaries should be well defined and visible (transparency)
  5. Students should have the facility to update their own data
  6. Students as active agents
  7. Modelling approaches and interventions should be free from bias (Is this possible? What kind of bias should be avoided?)
  8. Adoption of LA requires broad acceptance of its values and benefits, and the development of appropriate skills (Not sure I fully grasped this one)

Sharon was mainly outlining the results of some qualitative research done with OU staff and students. The most emotive discussion was around whether or not this use of student data was appropriate at all – many students expressed dismay that their data was being looked at, much less used to potentially determine their service provision and educational future (progress, funding, etc.). Many felt that LA itself is a rather intrusive approach which may not be justified by the benevolent intention to improve student support.

While there are clear policies in place around data protection (like most universities) there were concerns about the use of raw data and information derived from data patterns. There was lots of concern about the ability of the analysts to adequately understand the data they were looking at and treat it responsibly.

Students want to have a 1:1 relationship with tutors and feel that LA can undermine this, although at the OU there are particular challenges around delivering distance education at scale.

The most dominant issue was whether students could opt out of having their data collected without this having an impact on their future studies or how they are treated by the university. The default position is one of ‘informed consent’, where students are currently expected to opt out if they wish. The policy will be explained to students at the point of registration, as well as through case studies and guidance for staff and students.

Another round of consultation is expected around the issue of whether students should have an opt-out or opt-in model.

There is an underlying paternalistic attitude here – the university believes that it knows best with regard to the interests of the students – though it seems to me that this potentially runs against the idea of a student-centred approach.

Some further thoughts/comments:

  • Someone like Simon Buckingham-Shum would argue that the LA *is* the pedagogy – this is not the view being taken by the OU, but we can perhaps identify a potential ‘mission creep’
  • Can we be sure that the analyses we create through LA are reliable?  How?
  • The more data we collect and the more open it is then the more effective LA can be – and the greater the ethical complexity
  • New legislation requires that everyone will have the right to opt-out but it’s not clear that this will necessarily apply to education
  • Commercialisation of data has already taken place in some initiatives

Doug Clow then took the floor and spoke about other LA initiatives. He noted that the drivers behind interest in LA are very diverse (research, retention, support, business intelligence, etc.) and highlighted several projects of note.

Many projects are attempting to produce the correct kind of ‘dashboard’ for LA.  Another theme is around the extent to which LA initiatives can be scaled up to form a larger infrastructure.  There is a risk that with LA we focus only on the data we have access to and everything follows from there – Doug used the metaphor of darkness/illumination/blinding light. Doug also noted that machine learning stands to benefit greatly from LA data, and LA generally should be understood within the context of trends towards informal and blended learning as well as MOOC provision.

Overall, though, it seems that evidence for the effectiveness of LA is still pretty thin, with very few rigorous evaluations. This could reflect the age of the field (a lot of work has yet to be published) or alternatively the idea that LA isn’t really as effective as some hope. For instance, it could be that any intervention is effective regardless of whether it has some foundation in the data that has been collected (nb. the ‘Hawthorne effect’).


liveblog: Predicting Giants at #altc #altc2014

Here are my notes from this afternoon’s session at the ALT-C 2014 conference. There were three presentations in this session.


Richard Walker (University of York) – Ground swells and breaking waves: findings from the 2014 UCISA TEL survey on learning technology trends, developments and fads

This national survey started in 2001 and has since expanded from a VLE focus to all systems which support learning and teaching. The results are typically augmented by case studies which investigate particular themes. In 2014 there were 96 responses from the 158 HE institutions solicited (a 61% response rate). Some of the findings:

  • Top drivers for TEL are to enhance quality, meet student expectations and improve access to learning for off-campus students
  • TEL development can be encouraged by soliciting student feedback
  • Lack of academic staff understanding of TEL has re-emerged as a barrier to TEL development, but time is still the main factor
  • Institutions perceive a lack of specialist support staff as a leading challenge to TEL activity
  • In future, mobile technologies and BYOD will still be seen as significant challenges, though no longer the top challenge as they were last year
  • E-assessment is also a leading concern
  • Moodle (62%) is the most used VLE, with Blackboard (49%) the leading enterprise solution
  • Very small use of other open source or commercial solutions
  • Institutions are increasingly attempting to outsource their VLE solutions
  • Plagiarism and e-assessment tools are the most commonly supported tools
  • Podcasting is down in popularity, being supplanted by streaming services and recorded lectures, etc.
  • Personal response systems / clickers are up in popularity
  • Social networking tools are the leading non-centrally supported technology used by students
  • There is more interest in mobile devices (iOS, Android) but only a handful of institutions are engaging in staff development and pedagogic activity around these
  • Increasing numbers of institutions are making mobile devices available but few support this through policies which would integrate devices into regular practice
  • The longitudinal elements of the study suggest that content is the most important driver of TEL for distance learning
  • Less than a third of institutions have evaluated pedagogical activity around TEL.

 


Simon Kear (Tavistock & Portman NHS Foundation Trust; formerly Goldsmiths College, University of London) – Grasping the nettle: promoting institution-wide take-up of online assessment at Goldsmiths College

When we talk about online assessment we need to encourage clarity around processes and expected results, but learners don’t need to know much about the tools involved. Learners tend to want to avoid hybrid systems, preferring alternative routes for submitting their work and having it assessed.

There are many different stakeholders involved in assessment, including senior management, heads of department, administrators, and student representatives.

Implementation can be helped through regular learning and teaching committees. It’s important to work with platforms that are stable and that can provide comprehensive support and resources.

Simon concluded by advancing the claim that within 5 years electronic marking of student work will be the norm.  This should lead to accepting a wider variety of multimedia formats for student work as well as more responsive systems of feedback.


Rachel Karenza Challen (Loughborough College) – Catching the wave and taking off: Embracing FELTAG at Loughborough College – moving from recommendations to reality

This presentation focused on cultural change in FE and the results of the FELTAG survey.

  • Students want VLE materials to be of high quality because it makes them feel valued
  • The report recommends that all publicly funded programmes should have a 10% component which should be available online
  • SFA and ILR funding will require colleges to declare the amount of learning available online, and this will not count just any interaction which takes place online (such as meetings)
  • There is a concern that increasing the amount of learning that takes place online might make it harder to assess what is working
  • Changing curricula year by year makes it harder to prepare adequate e-learning – a stable situation allows for better planning and implementation
  • Ultimately, assessment requires expert input – machine marking and peer assessment can only get you so far
  • In future they intend to release a VLE plugin that others might be able to use
  • Within 5 years the 10% component will be raised to 50% – this means that 50% of provision at college level will be without human guidance and facilitation – is this reflective of the growing influence of the big academic publishers?  Content provided by commercial providers is often not open to being embedded or customised…
  • Ministerial aspirations around online learning may ultimately be politically driven rather than evidence-based.

ITC Distance Education Survey Results #elearning2014

Here at eLearning 2014, Fred Lokken of Truckee Meadows Community College presented the results of the most recent ITC survey into distance education. This is the 10th annual edition of the survey, which is the primary college-focused distance education survey. The results are sent to all college presidents as well as to key media outlets. The survey takes place each Autumn/Fall and is sent electronically. The 2013 survey received 140 complete responses, and statistical accuracy was reported at +/-4%.

Fred claims that it was around the 7th year of the survey (2011) that distance learning began to be recognised by the government as equivalent to classroom education in terms of quality of materials and instruction.  He pointed out that online education has overcome many barriers in a short space of time, instigating a paradigm shift that has yet to be fully understood.

The majority of community colleges manage their distance learning operations through a mix of centralised and decentralised administration.

Online enrolment is up while overall enrolment is marginally down. This is a trend seen consistently over the life of the survey and across a range of institutions. (40% of respondents attributed this to the downturn in the economy.) Web-facilitated classes and blended classes are on the increase.

Distance education administrators identified a number of challenges as the most pressing for the field.

After a period of some turbulence, most institutions have settled on a fixed LMS, with only 27% saying that they were considering switching their LMS in the next year.

OER was the main change since previous surveys, with 45% predicting a significant OER impact on their campus in the next 3-5 years. Half (50%) thought OER would have very little impact, while only 3% thought there would be no impact. A number of challenges were identified as barriers to institutional adoption of OER.