
Rethinking the Open Society #oer17

Here are my slides from yesterday’s presentation at OER17.  All feedback welcome.

Abstract

This presentation explores open education ideologies in light of educational technologies; recent political discourse; and the political philosophy of Karl Popper.  Since the latter half of the 20th century, “openness” has developed within stable frameworks of liberal/social democracy, and is now often tacitly assumed in many areas of society (such as open government, a free press, freedom of speech, etc.).  Over the last year we have witnessed considerable and sustained political upset around the globe, causing many to proclaim a crisis of liberal democracy. In the Anglo world, we observe a surge of support for ‘closed’ political positions, including ‘Brexit’ and the USA presidential election (Knapton, 2016).  There are indications that openness might form the basis of an alternative politics; the Píratar political party evolved from a single-issue focus on copyright reform to become the biggest party in Iceland, standing on a platform of civil rights and participatory democracy.  Slaughter (2016) proposes that the web is the new geopolitical theatre, and that the USA “should adopt a grand strategy of building and maintaining an open international order based on three pillars: open societies, open governments, and an open international system.”

Moe (2017) describes the difficulties inherent in developing and teaching critical thinking, especially within standardised education. In the connected age, access to information and control over media narratives are paramount to governance. In the age of ‘post-truth’ we need more than ever educational systems that promote information literacy and critical thinking. There is reason to reconsider the ideological basis and commitments of open education and its practices, many of which remain wedded to traditional academic structures. This may seem counterintuitive: as Weller (2014) suggests, the ‘battle for open’ is in many senses won, with a growing body of open access publication; open textbook uptake; open source tools for building learning environments; massive open online courses; and open sharing of research data. However, Rolfe (2016) has demonstrated through content analysis a fundamental shift in the discourse around open education. Articles from the 1970s tended to understand openness in terms of widening participation, and with this came a concomitant promotion of humane values, fostering autonomy, facilitating the development of others, and a wider social mission. This approach has in turn been disrupted by the rise of flexible learning in higher education and the wide availability of educational materials. By the time the OER movement had grown into a global force, much of the debate had moved on to licensing, technical and implementation issues (Weller, 2016).

A reconsideration of the role of ideology in OER will be framed by elements of Karl Popper’s The Open Society and its Enemies (1947). Popper’s approach was hugely influential for Western liberal democracy, and remains arguably the most sustained attempt to develop a vision of society from the idea of openness. Popper’s critical approach to education – which emphasizes the role of the learner as co-creator of knowledge – serves as a model for making explicit the connection between critical rationality and openness, and provides tools for systematically reflecting on educational practice (Chitpin, 2016).

References

Chitpin, S. (2016). Popper’s Approach to Education. London and New York: Routledge.

Knapton, S. (2016). Donald Trump is a ‘vulgar, demented, pig demon’ says Hillary Clinton’s ex adviser. The Telegraph, 30 May 2016.

Moe, R. (2017). All I Know Is What’s on the Internet. Real Life Mag. http://reallifemag.com/all-i-know-is-whats-on-the-internet/

Popper, K. (1947a). The Open Society and its Enemies. Vol. I: The Spell of Plato. London: Routledge. Available from https://archive.org/stream/opensocietyandit033120mbp.

Popper, K. (1947b). The Open Society and its Enemies. Vol. II: The High Tide of Prophecy: Hegel, Marx and the Aftermath. London: Routledge. Available from https://archive.org/details/opensocietyandit033064mbp.

Rolfe, V. (2016). Open. But not for criticism? Open Education 2016. http://www.slideshare.net/viv_rolfe/opened16-conference-presentation

Slaughter, A.-M. (2016). How to Succeed in the Networked World: A Grand Strategy for the Digital Age. Foreign Affairs. (Nov/Dec.) https://www.foreignaffairs.com/articles/world/2016-10-04/how-succeed-networked-world

Weller, M. (2014). The Battle for Open. Ubiquity Press.

Weller, M. (2016). Different Aspects of the Emerging OER Discipline. Revista Educacao e Cultura Contemporanea, 13(31) http://oro.open.ac.uk/4


#opened16 live blog: College Affordability and Social Justice

Preston Davis (aka @LazyPhilosopher) invites us to think about the early days of Western civilisation where philosophers like Plato and Aristotle formed educational institutions on the basis of their own privilege.  This kind of system persisted into Roman times, where males with the ability to pay could attend organised schools where they would learn to become educated citizens of the empire.

Education was further formalised in the Middle Ages, but mostly organised according to the strategic aims of the church. Formalised educational systems in the USA widened the curriculum and admitted women, but still remain ‘exclusive’ in many ways.

Rawlsian theories of social justice are reflective of conversations that are starting to take place in OER around stepping back from personal bias when making decisions.  If we disregard the considerations of race, gender, class and so on, we can support a more democratic and equally distributed educational system.

The remark is made that aspects of the USA educational system are exclusive rather than inclusive.  Much of the OER movement was organised around saving money on textbook costs, but this overlooks wider patterns of disenfranchisement.  The Sanders run for USA president foregrounded the idea of access to higher education as a matter of social justice.  Should education be ‘free’?

From the discussion:

  • Class divides are reinforced by higher education.  Some scholarships are set aside for students from disadvantaged backgrounds, but does this really change structural patterns of disenfranchisement?
  • If public education was made free, would this lead to a loss of resources through inefficiencies?
  • Can we really act as if we are ‘difference-blind’?
  • Is the difference between the student who goes on to higher education and the one who doesn’t a matter of money?  Disenfranchisement has other elements, e.g. confidence, role models, self-interpretation. Many of these are the kind of ‘differences’ stripped out of the Rawlsian model.
  • How can social justice be understood from the perspective of what is essentially privilege?
  • Low cost vs. free?

The Open Research Agenda

Here are the slides I’ll be using today for my presentation at the CALRG Annual Conference.  The Open Research Agenda is an international consultation exercise focused on identifying research priorities in open education.

You can read more about the project here:

The Open Research Agenda (2)

The Open Research Agenda (1)

Ethical principles of learning analytics – mini critique

This is just a short blog post to capture some thoughts on the ethical principles of learning analytics as set out in official documentation provided by The Open University.  I have attended various briefings at the OU around this subject, mainly because there is a lot of complexity here with regard to the ethical significance of these technologies.  I was also a member of the advisory panel for the JISC Code of Practice for Learning Analytics.

Here are the ‘ethical principles’ with my own brief annotations (click to enlarge).  (This is just an internal critique of these principles as they are set out here, not of the wider project of learning analytics.)


The principles are categorised in the original document, which you can see at http://www.open.ac.uk/students/charter/sites/www.open.ac.uk.students.charter/files/files/ecms/web-content/using-information-to-support-student-learning.pdf.

In essence, the points I would make about these principles are as follows:

  • Point 1. It is asserted that learning analytics is an ethical practice, but this has yet to be established.  Arguably we should state that it should be thought of as an ethical practice, but this is quite different in terms of ethical principle.  ‘Ought’ statements are much harder to justify.
  • Point 2. There is a confusing mix of deontological and consequentialist-utilitarian considerations here.  Unpicking it, I interpret it to mean that the university considers itself to have a responsibility to maximise the utility of the data about students that it owns.  The important points here are that: a) stakeholders are not clearly defined and could include, for instance, privately owned data brokers; b) there is no acknowledgment of the possible tension between different forms of self-interest; and c) no criteria are given for ‘feasibility’.
  • Point 2. It’s difficult to see how feasibility should be a criterion for whether something is ethical.  After all, ethics is something that regulates the realm of the feasible, the possible, the actual.  This would be a much stronger principle if this word were replaced with ‘ethical’ or ‘justified’.
  • Point 3 implies that students should be at least partly defined by their data and the university’s interpretation of it.  This may not be that contentious to most people, though without clear parameters for the other criteria that are considered it could be taken to mean ‘mostly’ defined by the data held by the university.  It’s not clear what this means in practice beyond adding some wording to ward off concerns about treating students as nothing more than a set of data points.
  • Point 4 seems right in setting out a principle of transparency in the process, purpose and use of student data.  But it doesn’t make a commitment to full transparency for all.  Why not?
  • This is brought into sharper relief in Point 5, which sets out a commitment to full transparency for data collection. Taken in conjunction with Point 4, it seems that transparency is endorsed for collection, but not use.
  • Point 6 is on the theme of student autonomy, and co-operation in these processes.  These are good things, though claims to have given informed consent are potentially undermined by the possible lack of transparency in use in Point 4.
  • A further possible undermining of student autonomy here is the lack of clarity about whether students can entirely opt out of these processes.  If not, how can they be considered ‘active agents’?
  • I’m not an expert in big data but I know a little bit about predictive modelling.  In Point 7, the idea is that modelling ‘should be’ free from bias.  Well, all modelling should be free from bias, but such effects cannot be truly eradicated.  It would make more sense as a principle to speak of ‘minimising’ bias (a sketch of what that might involve appears after this list).
  • Point 8 endorses adoption of learning analytics into the institutional culture, and vice versa.  It asserts that there are values and benefits to the approach, though these are largely hypothetical.  It basically states that the institutional culture of the university must change, and that this should be ‘broadly accepted’ (whatever that might mean).
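
To make the point about ‘minimising’ bias (Point 7 above) a little more concrete, here is a minimal sketch of one way bias could be monitored rather than assumed away. This is purely hypothetical code: the group names, data and figures are invented, and nothing here reflects any actual OU analytics system. It simply compares mean prediction error across student groups, which is one simple check a bias-minimising approach might use.

    # Illustrative only: compare mean prediction error across (invented) student
    # groups as one simple check on bias in a predictive model.
    from statistics import mean

    def group_calibration_gaps(records):
        """records: iterable of (group, predicted_score, actual_score).
        Returns the mean signed error per group; a gap far from zero for one
        group but not others suggests systematic bias to investigate."""
        errors = {}
        for group, predicted, actual in records:
            errors.setdefault(group, []).append(predicted - actual)
        return {group: mean(errs) for group, errs in errors.items()}

    # Hypothetical example data: scores predicted vs. actually achieved.
    sample = [
        ("group_a", 0.72, 0.70), ("group_a", 0.65, 0.65),
        ("group_b", 0.80, 0.68), ("group_b", 0.75, 0.66),
    ]
    print(group_calibration_gaps(sample))
    # approx. {'group_a': 0.01, 'group_b': 0.105} -- predictions for group_b run high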

The final point I’d make is that, for me, these are not generally worded as principles: they read more like vision statements or something intended to guide internal decision-making.  But when it comes to ethics, we really need clear principles if we are to understand whether they are being applied consistently, sensitively and systematically.

 

JiME Reviews Dec 2015

Here is the latest list of books available for review from JiME.  If you’re interested in reviewing any of the following then get in touch with me through Twitter or via rob.farrow [at] open.ac.uk to let me know which volume you are interested in and some of your reviewer credentials.

Reviews will be due at the end of February 2016, and should be in the region of 1500-2000 words.  You can see examples of previous reviews at http://jime.open.ac.uk/.

If you’re an academic publisher and you’re reading this, you may have noted that we have a lot of books from Routledge in the backlog.  If you’d like to have your books considered for review in JiME then please mail them for my attention at the address in the sidebar.


  • Curtis J. Bonk, Mimi M. Lee, Thomas C. Reeves & Thomas H. Reynolds (eds.) (2015) MOOCs and Open Education around the world. Routledge: Abingdon and New York. link
  • Charles D. Dziuban, Anthony G. Picciano, Charles R. Graham & Patsy D. Moskal (2016). Conducting Research in Online and Blended Learning Environments.  Routledge: Abingdon and New York. link
  • Susan Garvis & Narelle Lemon (eds.) (2016). Understanding Digital Technologies and Young Children: An International Perspective. Routledge: Abingdon and New York. link
  • Seth Giddings (2014). Gameworlds: Virtual Media and Children’s Everyday Play. Bloomsbury Academic. link
  • Lori Diane Hill & Felice J. Levine (eds.) (2015). World Education Research Yearbook 2015. Routledge: Abingdon. link
  • Wanda Hurren & Erika Hasebe-Ludt (eds.) (2014). Contemplating Curriculum – Genealogies, Times, Places. Routledge: London and New York.  link
  • Phyllis Jones (ed.) (2014).  Bringing Insider Perspectives into Inclusive Learner Teaching – Potentials and challenges for educational professionals. Routledge: London and New York. link
  • David Killick (2015). Developing the Global Student: Higher education in an era of globalization. Routledge: London and New York. link
  • Piet A. M. Kommers, Pedro Isaias & Tomayess Issa (2015). Perspectives on Social Media – a yearbook. Routledge: London and New York. link
  • Angela McFarlane (2015). Authentic Learning for the Digital Generation – realising the potential of technology in the classroom. Routledge: Abingdon. link
  • Jill Porter (ed.) (2015). Understanding and Responding to the Experience of Disability. Routledge: London and New York. link
  • Steven Warburton & Stylianos Hatzipanagos (eds.) (2013). Digital Identity and Social Media.  IGI Global: Hershey, PA.  link

Launching the Open Education Handbook

This morning I am taking part in a hangout to launch The Open Education Handbook, a collaboratively written document published by Open Knowledge Foundation for which I acted as the editor for the most recent edition.

Lots of people were involved in putting together the manual, both directly (in book ‘sprints’ and through a working group) and indirectly through their support of the LinkedUp project.  Originally the focus of the project was open data, but this quickly expanded to related areas (including OER, policy and education).  Open research data and open educational data have much potential for influencing education, and the working group opens up a space to think and collaborate around this.

The Open Education Working Group takes an open approach to collaboration, which has also been applied to the handbook.  The LinkedUp project was required to produce a handbook as a project deliverable and it was decided that an open, collaborative and community-based approach would be appropriate.  The idea of using ‘book sprints’ was new to the team, and was slightly watered down so that instead of trying to write the whole thing in three days there would be working groups and multiple sprints which attempted to improve the existing version.  The sprints were held at conferences and workshops with a ‘question and answer’ approach used to structure the content.  Around the time of the second sprint the workspace moved from Google Docs to BookType, which helped the organisation of the materials as well as version control.  The working group would regularly meet to chat about the project.

Translating the book into Portuguese allowed for further refinement of the draft as the translators queried the structure of the book as well as the possible Eurocentric quality of the earlier drafts. My own contribution was to try and pull into shape the rather fragmentary draft and apply a consistent editorial tone across the manuscript.  This involved moving away from the ‘question & answer’ model originally used to generate content, reassembling and rephrasing it as a more structured narrative while leaving open the possibility of reading sections in no particular order.

The latest version is still available for editing and there will undoubtedly be new versions as we move forward.

  • Martin Poulter reported that the book content is being imported into WikiBooks for ongoing wiki style editing and improvement;
  • Jo Paulger spoke about the FLOSS manuals site as another possible home for the handbook (and possibly the more ‘official’ versions as they are produced).

Here are Marieke Guy’s slides from the hangout:

Ethics, Openness and the Future of Education #opened14

By popular demand, here are my slides from today’s presentation at Open Education 2014.  All feedback welcome and if this subject is of interest to you then consider checking out the OERRH Ethics Manual and the section on ethics (week 2) of our Open Research course.

liveblog: Predicting Giants at #altc #altc2014

Here are my notes from this afternoon’s session at the ALT-C 2014 conference. There were three presentations in this session.


Richard Walker (University of York) – Ground swells and breaking waves: findings from the 2014 UCISA TEL survey on learning technology trends, developments and fads

This national survey started in 2001 and has since expanded out from a VLE focus to all systems which support learning and teaching. The results are typically augmented by case studies which investigate particular themes. In 2014 there were 96 responses from 158 HE institutions that were solicited (61% response). Some of the findings:

  • Top drivers for TEL are to enhance quality, meet student expectations and improve access to learning for off-campus students
  • TEL development can be encouraged by soliciting student feedback
  • Lack of academic staff understanding of TEL has re-emerged as a barrier to TEL development, but time is still the main factor
  • Institutions perceive a lack of specialist support staff as a leading challenge to TEL activity
  • In future, mobile technologies and BYOD will still be seen as significant challenges, though no longer the top challenge as they were last year
  • E-assessment is also a leading concern
  • Moodle (62%) is the most used VLE, with Blackboard (49%) the leading enterprise solution
  • Very small use of other open source or commercial solutions
  • Institutions are increasingly attempting to outsource their VLE solutions
  • Plagiarism and e-assessment tools are the most commonly supported tools
  • Podcasting is down in popularity, being supplanted by streaming services and recorded lectures, etc.
  • Personal response systems / clickers are up in popularity
  • Social networking tools are the leading non-centrally supported technology used by students
  • There is more interest in mobile devices (iOS, Android) but only a handful of institutions are engaging in staff development and pedagogic activity around these
  • Increasing numbers of institutions are making mobile devices available but few support this through policies which would integrate devices into regular practice
  • The longitudinal elements of the study suggest that content is the most important driver of TEL for distance learning
  • Less than a third of institutions have evaluated pedagogical activity around TEL.

 


Simon Kear (Tavistock & Portman NHS Foundation Trust; formerly Goldsmiths College, University of London) – Grasping the nettle: promoting institution-wide take-up of online assessment at Goldsmiths College

When we talk about online assessment we need to encourage clarity around processes and expected results but learners don’t need to know much about the tools involved.  Learners tend to want to avoid hybrid systems and prefer to have alternative ways of having their work submitted and assessed.

There are many different stakeholders involved in assessment, including senior management, heads of department, administrators, and student representatives.

Implementation can be helped through regular learning and teaching committees. It’s important to work with platforms that are stable and that can provide comprehensive support and resources.

Simon concluded by advancing the claim that within 5 years electronic marking of student work will be the norm.  This should lead to accepting a wider variety of multimedia formats for student work as well as more responsive systems of feedback.


Rachel Karenza Challen (Loughborough College) – Catching the wave and taking off: Embracing FELTAG at Loughborough College – moving from recommendations to reality

This presentation focused on cultural change in FE and the results of the FELTAG survey.

  • Students want VLE materials to be of high quality because it makes them feel valued
  • The report recommends that all publicly funded programmes should have a 10% component which should be available online
  • SFA and ILR funding will require colleges to declare the amount of learning available online and this will not include just any interaction which takes place online (like meetings)
  • There is a concern that increasing the amount of learning that takes place online might make it harder to assess what is working
  • Changing curricula year by year makes it harder to prepare adequate e-learning – a stable situation allows for better planning and implementation
  • Ultimately, assessment requires expert input – machine marking and peer assessment can only get you so far
  • In future they intend to release a VLE plugin that others might be able to use
  • Within 5 years the 10% component will be raised to 50% – this means that 50% of provision at college level will be without human guidance and facilitation – is this reflective of the growing influence of the big academic publishers?  Content provided by commercial providers is often not open to being embedded or customised…
  • Ministerial aspirations around online learning may ultimately be politically driven rather than evidence-based.

OCWC 2014 Recording Available

The video recording from my research presentation at OCWC 2014 is now available at http://videolectures.net/ocwc2014_farrow_oer_impact/.  It’s not possible to embed here but they have a nice player on their site.

This presentation gives an overview of the OER Research Hub project, some of the methodological and epistemological issues we encounter, and how we propose to ameliorate these through the technologies we use to investigate key questions facing the OER movement.


OER Impact: Collaboration, Evidence, Synthesis
Robert Farrow

Thinking Learning Analytics

I’m back in the Ambient Labs again, this time for a workshop on learning analytics for staff here at The Open University.


Challenges for Learning Analytics: Visualisation for Feedback

Denise Whitelock described the SaFeSEA project, which is based around trying to give students meaningful feedback on their activities.  SaFeSEA was a response to high student dropout rates: 33% of new OU students don’t submit their first TMA.  Feedback on submitted writing prompts ‘advice for action’: a self-reflective discourse with a computer.  Visualizations of these interactions can open a discourse between tutor and student.

Students can worry a lot about the feedback they receive.  Computers can offer non-judgmental, objective feedback without any extra tuition costs.  OpenEssayist analyses the structure of an essay; identifies key words and phrases; and picks out key sentences (i.e. those that are most representative of the overall content of the piece).  This analysis can be used to generate visual feedback, some forms of which are more easily understood than others.
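
To give a rough sense of how picking out ‘key sentences’ can work, here is a minimal frequency-based sketch. To be clear, this is not the OpenEssayist algorithm itself, just an assumed, simplified illustration of the general idea: sentences containing more of the essay’s frequent content words are treated as more representative of the whole.

    # Simplified, illustrative key-sentence extraction (not OpenEssayist itself):
    # score each sentence by the average frequency of its content words and
    # return the highest-scoring sentences as the most representative.
    import re
    from collections import Counter

    STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
                 "that", "this", "for", "on", "as", "are", "be", "with", "was"}

    def content_words(text):
        return [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]

    def key_sentences(text, top_n=3):
        sentences = re.split(r"(?<=[.!?])\s+", text.strip())
        freq = Counter(content_words(text))

        def score(sentence):
            words = content_words(sentence)
            return sum(freq[w] for w in words) / len(words) if words else 0.0

        return sorted(sentences, key=score, reverse=True)[:top_n]

    # Invented example text: the two sentences about open education score highest.
    essay = ("Open education widens participation in education. Openness also changes "
             "how institutions share open resources. The weather was pleasant that day.")
    print(key_sentences(essay, top_n=2))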

Bertin (1977/81) provides a model for the visualization of data.  Methods can include diagrams which show how well connected different passages are to the whole, or patterns that highlight different types of essay. These can be integrated with social network analysis and discourse analytics.

Can students understand this kind of feedback? Might they need special training?  Are these tools that could be used primarily by educators?  Would they also need special training?  In both cases, it’s not entirely clear what kind of training this might be (information literacy?).  Can one tool be used to support writing across all disciplines, or would such a tool need to be discipline-specific?

The Wrangler’s relationship with the Science Faculty

Doug Clow then presented on ‘data wrangling’ in the science faculty at The Open University.  IET collects information on student performance and presents this back to faculties in a ‘wrangler report’ that can feed into future course delivery and learning design.

What can faculty do with these reports?  Data is arguably better at highlighting problems or potential problems than it is at solving them.  This process can perhaps get better at identifying key data points or performance indicators, but faculty still need to decide how to act based on this information.  If we move towards the provision of more specific guidance then the role of faculty could arguably be diminished over time.

The relation between learning analytics and learning design in IET work with the faculties

Robin Goodfellow picked up these themes from a module team perspective.  Data can be understood as a way of closing the loop on learning design, creating a virtuous circle between the two.  In practice, there can be significant time delays in terms of processing the data in time for it to feed in.  But the information can still be useful to module teams in terms of thinking about aspects of the course such as:

  • Communication
  • Experience
  • Assessment
  • Information Management
  • Productivity
  • Learning Experience

This can give rise to quite specific expectations about the balance of different activities and learning outcomes.  Different indicators can be identified and combined to standardize metrics for student engagement, communication, etc.
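
As a purely hypothetical illustration of what combining indicators into a standardised metric can involve, the sketch below z-scores each indicator across a cohort and then takes a weighted sum per student. The indicator names and weights are invented for the example and do not reflect any actual OU model.

    # Hypothetical sketch: standardise engagement indicators as z-scores across
    # the cohort, then combine them into one weighted engagement metric per student.
    from statistics import mean, pstdev

    def standardise(values):
        """Return z-scores for a list of raw indicator values."""
        mu, sigma = mean(values), pstdev(values)
        return [(v - mu) / sigma if sigma else 0.0 for v in values]

    def engagement_scores(indicators, weights):
        """indicators: dict of name -> per-student raw values (same student order).
        weights: dict of name -> weight. Returns one combined score per student."""
        z = {name: standardise(vals) for name, vals in indicators.items()}
        n_students = len(next(iter(indicators.values())))
        return [sum(weights[name] * z[name][i] for name in indicators)
                for i in range(n_students)]

    # Invented example: three students, three indicators.
    indicators = {
        "vle_logins":    [12, 30, 5],
        "forum_posts":   [1, 8, 0],
        "quiz_attempts": [3, 4, 1],
    }
    weights = {"vle_logins": 0.4, "forum_posts": 0.3, "quiz_attempts": 0.3}
    print(engagement_scores(indicators, weights))  # higher = relatively more engaged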

In this way, a normative notion of what a module should be can be said to be emerging.  (This is perhaps a good thing in terms of supporting course designers but may have worrying implications in terms of promoting homogeneity.)

Another selective element arises from the fact that it’s usually only possible to collect data from a selection of indicators:  this means that we might come to place too much emphasis on data we do have instead of thinking about the significance of data that has not been collected.

The key questions:

  • Can underlying learning design models be identified in data?
  • If so, what do these patterns correlate with?
  • How can all this be bundled up for faculty as something useful?
  • Are there implications for general elements of course delivery (e.g. forums, VLE, assessment)?
  • If we only permit certain kinds of data for consideration, does this lead to a kind of psychological shift where these are the only things considered to be ‘real’ or of value?
  • Is there a special kind of interpretative skill that we need in order to make sense of learning analytics?

Learning Design at the OU

Annie Bryan drilled a little deeper into how learning design fits into the picture.  Learning design is now a required element of course design at The Open University.  There are a number of justifications given for this:

  • Quality enhancement
  • Informed decision making
  • Sharing good practice
  • Improving cost-effectiveness
  • Speeding up decision making
  • Improving online pedagogy
  • Explicitly representing pedagogical activity
  • Effective management of student workload

A number of (beta) tools for Learning Design have been produced.  These are focused on module information; learning outcomes; activity planning, and mapping modules and resources.  These are intended to support constructive engagement over the life of the course.   Future developments will also embrace a qualification level perspective which will map activities against qualification routes.

These tools are intended to help course teams think critically about and discuss the purpose of the tools and resources chosen in the context of the course as a whole and student learning experiences.  A design perspective can also help to identify imbalances in course structure or problematic parts of a course.