digital scholarship

Resisting the surveillance project: Siân Bayne keynote at #altc

This keynote presentation began by describing the ‘slow death’ of Yik Yak, an anonymous geosocial networking app launched in 2013.  The software allowed people in close proximity to post and read short, anonymous messages (‘Yaks’).  It was heavily marketed on university campuses.  Yik Yak was totally anonymous, and was moderated by community voting.  Edinburgh students used the app to ask questions.  The hyperlocality of the app meant that interactions were limited (e.g. discipline specific, location specific) rather than campus-wide.

Between July 2016 and May 2017, 46,637 Yaks were downloaded and analysed.  In addition, two undergraduate research assistants kept reflective diaries, and data was pulled from other studies the researchers were involved with at the time.

The research team assumed that the hyperlocality of Yik Yak was the key element, but in fact anonymity was the most important thing for students.  The app developers made a similar misjudgement.  Their tinkering with the app eventually removed the anonymity, and this was followed by outcry from the user base.  Anonymity was eventually restored, and the data gathered by Edinburgh showed that, for their local users, use of the app picked up again.  At a global level it never recovered, and Yik Yak went from #3 in the download charts in 2014 to #447 by 2016.  On 6 May 2017 the app was closed down and the company sold for $1 million – having been valued at $400 million just two years before.  The failure of the developers to understand the importance of anonymity in their app was symptomatic of a more general failure to understand the value of their product for their market.

There was widely reported abuse, harassment, victimisation and toxicity on Yik Yak, and this was to some extent enabled by the anonymity.  To deal with this, the developers tagged the app as for adults on the App Store, and prevented the app from working in schools.  This cut a lot of the user base.  A system of word filtering was also introduced to flag potentially offensive Yaks.  The community also tended to down-vote abuse and up-vote positive messages.  Black, Mezzina & Thompson (2016) found that there was abuse on Yik Yak, but not so much as to demonise the platform as a whole.

Bachmann et al (2017) argue that anonymity can enable disinhibition and create safe spaces:  anonymity need not be associated with toxicity.  Students at Edinburgh used Yik Yak as a support network, notably for mental health issues. Bayne argues that we need to stop seeing anonymity as evidence of some kind of deviancy or an unwillingness to reveal.  As Nissenbaum (1999) argues, the value of anonymity is in acting or participating while remaining out of reach and unattainable.  (Compare this with the ubiquity of Facebook and the way that it is practically essential for students who wish to have a social life or join groups.) Lanchester (2017) provides a good overview of worries about Facebook surveillance.  Bachmann et al (2017) make the point that anonymity is a barrier to both platform capitalism and surveillance culture more generally.

Two conclusions:

  1. When designing digital learning environments, we need to allocate space for un-namability and ephemerality.  E.g. designing pop-up tools that delete themselves.
  2. The surveillance project is opposed to our instincts for an effective life.  Zuboff et al (2015) suggest this leads to a kind of psychic numbing that makes us less attentive to the operations of surveillance capitalism.  In designing teaching we should actively educate against ‘psychic numbing’.

Launching the Open Education Handbook

This morning I am taking part in a hangout to launch The Open Education Handbook, a collaboratively written document published by Open Knowledge Foundation for which I acted as the editor for the most recent edition.

Lots of people were involved in putting together the manual, both directly (in book ‘sprints’ and through a working group) and indirectly through their support of the LinkedUp project.  Originally the focus of the project was open data, but this quickly expanded to related areas (including OER, policy and education).  Open research data and open educational data have much potential for influencing education, and the working group opens up a space to think and collaborate around this.

The Open Education Working Group takes an open approach to collaboration, which has also been applied to the handbook.  The LinkedUp project was required to produce a handbook as a project deliverable and it was decided that an open, collaborative and community-based approach would be appropriate.  The idea of using ‘book sprints’ was new to the team, and was slightly watered down so that instead of trying to write the whole thing in three days there would be working groups and multiple sprints which attempted to improve the existing version.  The sprints were held at conferences and workshops with a ‘question and answer’ approach used to structure the content.  Around the time of the second sprint the workspace moved from Google Docs to Booktype, which helped the organisation of the materials as well as version control.  The working group would regularly meet to chat about the project.

Translating the book into Portuguese allowed for further refinement of the draft as the translators queried the structure of the book as well as the possible Eurocentric quality of the earlier drafts.  My own contribution was to try and pull into shape the rather fragmentary draft and apply a consistent editorial tone across the manuscript.  This involved moving away from the ‘question and answer’ model originally used to generate content, reassembling and rephrasing material into something more like a structured narrative while leaving open the possibility of reading sections in no particular order.

The latest version is still available for editing and there will undoubtedly be new versions as we move forward.

  • Martin Poulter reported that the book content is being imported into WikiBooks for ongoing wiki style editing and improvement;
  • Jo Paulger spoke about the FLOSS Manuals site as another possible home for the handbook (and possibly for the more ‘official’ versions as they are produced).

Here are Marieke Guy’s slides from the hangout:

Ethics, Openness and the Future of Education #opened14

By popular demand, here are my slides from today’s presentation at Open Education 2014.  All feedback welcome and if this subject is of interest to you then consider checking out the OERRH Ethics Manual and the section on ethics (week 2) of our Open Research course.

liveblog: Predicting Giants at #altc #altc2014

Here are my notes from this afternoon’s session at the ALT-C 2014 conference. There were three presentations in this session.


Richard Walker (University of York) – Ground swells and breaking waves: findings from the 2014 UCISA TEL survey on learning technology trends, developments and fads

This national survey started in 2001 and has since expanded out from a VLE focus to all systems which support learning and teaching. The results are typically augmented by case studies which investigate particular themes. In 2014 there were 96 responses from the 158 HE institutions solicited (a 61% response rate). Some of the findings:

  • Top drivers for TEL are to enhance quality, meet student expectations and improve access to learning for off-campus students
  • TEL development can be encouraged by soliciting student feedback
  • Lack of academic staff understanding of TEL has re-emerged as a barrier to TEL development, but time is still the main factor
  • Institutions perceive a lack of specialist support staff as a leading challenge to TEL activity
  • In future, mobile technologies and BYOD will still be seen as significant challenges, but no longer the top challenges as in last year’s survey
  • E-assessment is also a leading concern
  • Moodle (62%) is the most used VLE, with Blackboard (49%) the leading enterprise solution
  • Very small use of other open source or commercial solutions
  • Institutions are increasingly attempting to outsource their VLE solutions
  • Plagiarism and e-assessment tools are the most commonly supported tools
  • Podcasting is down in popularity, being supplanted by streaming services and recorded lectures, etc.
  • Personal response systems / clickers are up in popularity
  • Social networking tools are the leading non-centrally supported technology used by students
  • There is more interest in mobile devices (iOS, Android) but only a handful of institutions are engaging in staff development and pedagogic activity around these
  • Increasing numbers of institutions are making mobile devices available but few support this through policies which would integrate devices into regular practice
  • The longitudinal elements of the study suggest that content is the most important driver of TEL for distance learning
  • Less than a third of institutions have evaluated pedagogical activity around TEL.

 


Simon Kear (Tavistock & Portman NHS Foundation Trust; formerly Goldsmiths College, University of London) – Grasping the nettle: promoting institution-wide take-up of online assessment at Goldsmiths College

When we talk about online assessment we need to encourage clarity around processes and expected results, but learners don’t need to know much about the tools involved.  Learners tend to want to avoid hybrid systems and prefer alternative ways of submitting their work and having it assessed.

There are many different stakeholders involved in assessment, including senior management, heads of department, administrators, and student representatives.

Implementation can be helped through regular learning and teaching committees. It’s important to work with platforms that are stable and that can provide comprehensive support and resources.

Simon concluded by advancing the claim that within 5 years electronic marking of student work will be the norm.  This should lead to accepting a wider variety of multimedia formats for student work as well as more responsive systems of feedback.


Rachel Karenza Challen (Loughborough College) – Catching the wave and taking off: Embracing FELTAG at Loughborough College – moving from recommendations to reality

This presentation focused on cultural change in FE and the results of the FELTAG survey.

  • Students want VLE materials to be of high quality because it makes them feel valued
  • The report recommends that all publicly funded programmes should have a 10% component which should be available online
  • SFA and ILR funding will require colleges to declare the amount of learning available online and this will not include just any interaction which takes place online (like meetings)
  • There is a concern that increasing the amount of learning that takes place online might make it harder to assess what is working
  • Changing curricula year by year makes it harder to prepare adequate e-learning – a stable situation allows for better planning and implementation
  • Ultimately, assessment requires expert input – machine marking and peer assessment can only get you so far
  • In future they intend to release a VLE plugin that others might be able to use
  • Within 5 years the 10% component will be raised to 50% – this means that 50% of provision at college level will be without human guidance and facilitation – is this reflective of the growing influence of the big academic publishers?  Content provided by commercial providers is often not open to being embedded or customised…
  • Ministerial aspirations around online learning may ultimately be politically driven rather than evidence-based.

Liveblog – Catherine Cronin keynote at #altc #altc2014

For one day only I’m at The University of Warwick for the ALT-C conference, where I’m speaking on OER Impact Map.  (You can access my slides for today here.)


Catherine Cronin (National University of Ireland, Galway) – Navigating the Marvellous: Openness in Education

Catherine began with a quote that illustrates her view of education:

“Education is inherently an ethical and political act.” (Michael Apple)

Catherine spoke about growing up in New York and the political milieu of the 1960s (including the assassinations of Martin Luther King and John F. Kennedy) that helped her to grow to political awareness, and about the role of education in supporting healthy political life.  Different people have different parts to play in the political process.  Education thus conceived necessitates criticism of what exists, pointing to what has been lost, and identifying possible futures.

Openness: Catherine identifies this with sharing resources and thoughts in a freely available way.  Lots of resources that claim to be ‘open’ aren’t necessarily licensed in appropriate ways, and open practices should be understood as a more radical level built on top of this.

“Openness is an ethos, not just a license.  It’s an approach to teaching and learning that builds a community of learners” (Jim Groom)

Catherine was keen to identify openness with a kind of humility rather than the hubris of seeking greater attention for one’s work:

“I don’t think education is about centralized instruction anymore; rather, it is the process [of] establishing oneself as a node in a broad network of distributed creativity.”  (Joichi Ito)

As networked individuals, we need to overcome the distinction usually recognised between formal and informal learning.  Students come with different expectations and experiences that they bring to the spaces within which they learn.  Couros (2006) refers to the ‘networked’ teacher who makes use of a range of digital technologies.

 

http://www.scribd.com/doc/3363/Dissertation-Couros-FINAL-06-WebVersion

from Couros, A. (2006). Examining the open movement: possibilities and implications for education. (Doctoral thesis, University of Athabasca.)

Learning spaces can be physical or online, and tend to be bounded in different ways. Different spaces can facilitate community building to different degrees, but in any space there will be some voices that are privileged and some which are excluded.  When online we experience fewer markers of identity, with differing ideas about the effects of presence and telepresence on pedagogy.  Open online spaces tend to disregard institutional, national or physical barriers to entry and so facilitate greater sharing and connectivity.

The network is the organising principle of open online spaces – but how should this work in practice?  Openness here refers not to licensing but to the practice of facilitating this connectivity.

When students enter institutions, we can ask them about the tools they use and their views on transparency, privacy, and experimental pedagogies.  These discussions can be open, and help to form a shared understanding and expectation.  Open discussions can take place on social media which draw on the idea of networked learning. Students should be encouraged to connect across cohorts and levels to build community and learning skills.

We can minimise the power differential between student and teacher through open approaches, though it should be noted that some students worry  about being judged for thoughts and contributions shared in the open.  Identity is key to understanding these concerns because identities are constructed through dialogue and sharing.  Students should be supported in building and trying out different identities because so doing will help build digital skills and confidence.  Online identity doesn’t so much transform one’s own sense of self but it can help us become more aware of the contingent and contextual nature of our identities, and help us to see possibilities for being otherwise.

We can see open learning spaces as ‘third spaces’ which are neither formal nor informal but draw on both the skills of formal learning and the informal identities that have a kind of authenticity.  One risk with developing e-learning is in believing in a kind of subjectless learner who does not bring their own identity to  their learning.  We need to recognise difference: gender, race, religion, disability and other potential sources of ‘Otherness’.  Open practices are a brilliant first step towards this.

Open Research into Open Education #calrg14

Here are my slides from today’s presentation: feedback welcome as always.

The project website is http://oerresearchhub.org and the OER Impact Map is available at http://oermap.org.

Guerrilla Research #elesig

https://i1.wp.com/upload.wikimedia.org/wikipedia/commons/thumb/6/69/Afrikaner_Commandos2.JPG/459px-Afrikaner_Commandos2.JPG

We don't need no stinking permissions....

Today I’m in the research laboratories in the Jennie Lee Building at The Institute of Educational Technology (aka work) for the ELESIG Guerrilla Research Event.  Martin Weller began the session with an outline of the kind of work that goes into preparing unsuccessful research proposals.  Using figures from the UK research councils he estimates that the ESRC alone attracts bids (which it does not fund) equivalent to 65 work years every year (2000 failed bids x 12 days per bid).   This work is not made public in any way and can be considered lost.

He then went on to discuss some different digital scholarship initiatives – like a meta educational technology journal based on aggregation of open articles; MOOC research by Katy Jordan; an app built at the OU; DS106 Digital Storytelling – all of which have elements of what is being termed ‘guerrilla research’.  These elements include:

  • No permissions (open access, open licensing, open data)
  • Quick set up
  • No business case required
  • Allows for interdisciplinarity unconstrained by tradition
  • Using free tools
  • Building open scholarship identity
  • Kickstarter / enterprise funding

Such initiatives can lead to more traditional forms of funding and publication; and the two at least certainly co-exist.  But these kinds of activities are not always institutionally recognised, giving rise to a number of issues:

  • Intellectual property – will someone steal my work?
  • Can I get institutional recognition?
  • Do I need technical skills?
  • What is the right balance between traditional and digital scholarship?
  • Ethical concerns about the use of open data – can consent be assumed?  Even when dealing with personal or intimate information?

Tony Hirst then took the floor to speak about his understanding of ‘guerrilla research’.  He divided his talk into the means, opportunity and motive for this kind of work.

First he spoke about the use of the CommentPress WordPress theme to disaggregate the Digital Britain report so that people could comment online.  The idea came out of a tweet but within 3 months was being funded by the Cabinet Office.

In 2009 Tony produced a map of MP expense claims which was used by The Guardian.  This was produced quickly using open technologies and led to further maps and other ways of exploring data stories.  Google Ngrams is a tool that was used to check for anachronistic use of language in Downton Abbey.
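That Downton Abbey check is easy to sketch in code.  Here the first-attestation years are a tiny invented stand-in for real Ngrams frequency data, purely for illustration:

```python
# Sketch of the anachronism-checking idea behind the Downton Abbey analysis.
# The attestation years below are illustrative assumptions, not real Ngrams data.
FIRST_ATTESTED = {
    "weekend": 1879,
    "boyfriend": 1909,
    "swag": 1794,
    "uplink": 1969,
}

def flag_anachronisms(dialogue_words, setting_year):
    """Return words whose (assumed) first attestation postdates the setting."""
    return [w for w in dialogue_words
            if FIRST_ATTESTED.get(w, 0) > setting_year]

# A drama set in 1912 should not contain vocabulary attested only later.
print(flag_anachronisms(["weekend", "uplink", "swag"], 1912))  # → ['uplink']
```

The real analysis works the same way, but uses Ngrams corpus frequencies rather than a hand-made lookup table.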

In addition to pulling together recipes using open tools and open data, another approach is to use innovative coding schemes.  Mat Morrison (@mediaczar) used this to produce an accession plot graph of the London riots.  Tony has reused this approach – so another way of doing ‘guerrilla research’ is to re-appropriate existing tools.

Another approach is to use data to drive a macroscopic understanding of data patterns, producing maps or other visualizations from very large data sets, helping sensemaking and interpretation.  One important consideration here is ‘glanceability’ – whether the information has been filtered and presented so that the most important data are highlighted and the visual representation conveys meaning successfully to the viewer.

Data.gov.uk is a good source of data: the UK government publishes large amounts of information under open licences.  Access to data sets like this can save a lot of research money, and combining different data sets can provide unexpected results.  Publishing data sets openly supports this method and also allows others to look for patterns that the original researchers might have missed.

Google supports custom searches which can concentrate on results from a specific domain (or domains) and this can support more targeted searches for data.  Freedom of information requests can also be a good source of data; publicly funded bodies like universities, hospitals and local government all make data available in this way (though there will be exceptions). FOI requests can be made through whatdotheyknow.com.  Google spreadsheets support quick tools for exploring data such as sliding filters and graphs.

OpenRefine is another tool which Tony has found useful.  It can cluster open text responses in data sets according to algorithms and so replace manual coding of manuscripts.   The tool can also be used to compare with linked data on the web.
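OpenRefine’s default clustering works by ‘key collision’: each value is reduced to a normalised fingerprint, and values sharing a fingerprint are grouped together.  A minimal sketch of the idea (not OpenRefine’s actual implementation):

```python
import re
from collections import defaultdict

def fingerprint(value):
    """OpenRefine-style fingerprint key: lowercase, strip punctuation,
    sort the unique tokens.  Variant spellings collapse to one key."""
    tokens = re.sub(r"[^\w\s]", "", value.lower()).split()
    return " ".join(sorted(set(tokens)))

def cluster(values):
    """Group free-text responses whose fingerprints collide."""
    groups = defaultdict(list)
    for v in values:
        groups[fingerprint(v)].append(v)
    return [g for g in groups.values() if len(g) > 1]

responses = ["Open University", "open university.", "University, Open",
             "Oxford University"]
print(cluster(responses))
# → [['Open University', 'open university.', 'University, Open']]
```

This is the sense in which clustering can replace manual coding: hundreds of variant spellings collapse into a handful of candidate groups for a human to confirm.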

Tony concluded his presentation with a comparison of ‘guerrilla research’ and ‘recreational research’. Research can be more creative and playful and approaching it in this way can lead to experimental and exploratory forms of research.  However, assessing the impact of this kind of work might be problematic.  Furthermore, going through the process of trying to get funding for research like this can impede the playfulness of the endeavour.

A workflow for getting started with this kind of thing:

  • Download openly available data: use open data, hashtags, domain searches, RSS
  • DBpedia can be used to extract information from Wikipedia
  • Clean data using OpenRefine
  • Upload to Google Fusion Tables
  • From here data can be mapped, filtered and graphed
  • Use Gephi for data visualization and creating interactive widgets
  • StackOverflow can help with coding/programming

(I have a fuller list of data visualization tools on the Resources page of OER Impact Map.)

Ethical Use of New Technology in Education

Today Beck Pitt and I travelled up to Birmingham in the midlands of the UK to attend a BERA/Wiley workshop on technologies and ethics in educational research.  I’m mainly here to focus on the redraft of the Ethics Manual for OER Research Hub and to give some time over to thinking about the ethical challenges that can be raised by openness.  The first draft of the ethics manual was primarily to guide us at the start of the project but now we need to redraft it to reflect some of the issues we have encountered in practice.

Things kicked off with an outline of what BERA does and the suggestion that consciousness about new technologies in education often doesn’t filter down to practitioners.  The rationale behind the seminar seems to be to raise awareness in light of the fact that these issues are especially prevalent at the moment.

This blog post may be in direct contravention of the Chatham convention


We were first told that these meetings would be taken under the ‘Chatham House Rule’, which suggests that participants are free to use information received but without identifying speakers or their affiliation… this cuts straight into the meat of some of the issues provoked by openness: I’m in the middle of live-blogging this as the suggestion is made.  (The session is being filmed but apparently they will edit out anything ‘contentious’.)

Anyway, on to the first speaker:


Jill Jameson, Prof. of Education and Co-Chair of the University of Greenwich
‘Ethical Leadership of Educational Technologies Research: Primum non nocere’

The Latin part of the title of this presentation means ‘first, do no harm’ and is a recognised ethical principle that goes back to antiquity.  Jameson wants to suggest that this is a sound principle for ethical leadership in educational technology.

After outlining a case from medical care Jameson identified a number of features of good practice for involving patients in their own therapy and feeding the whole process back into training and pedagogy.

  • No harm
  • Informed consent
  • Data-informed consultation on treatment
  • Anonymity, confidentiality
  • Sensitivity re: privacy
  • No coercion
  • ‘Worthwhileness’
  • Research-linked: treatment & PG teaching

This was contrasted with a problematic case from the NHS concerning the public release of patient data.  Arguably very few people have given informed consent to this procedure.  But at the same time the potential benefits of aggregating data are being impeded by concerns about sharing of identifiable information and the commercial use of such information.

In educational technology the prevalence of ‘big data’ has raised new possibilities in the field of learning analytics.  This raises the possibility of data-driven decision making and evidence-based practice.  It may also lead to more homogenous forms of data collection as we seek to aggregate data sets over time.

The global expansion of web-enabled data presents many opportunities for innovation in educational technology research.  But there are also concerns and threats:

  • Privacy vs surveillance
  • Commercialisation of research data
  • Techno-centrism
  • Limits of big data
  • Learning analytics acts as a push against anonymity in education
  • Predictive modelling could become deterministic
  • Transparency of performance replaces ‘learning’
  • Audit culture
  • Learning analytics as models, not reality
  • Datasets are not the same as information, and stand in need of analysis and interpretation

Simon Buckingham-Shum has put this in terms of a utopian/dystopian vision of big data.

Leadership is thus needed in ethical research regarding the use of new technologies to develop and refine urgently needed digital research ethics principles and codes of practice.  Students entrust institutions with their data and institutions need to act as caretakers.

I made the point that the principle of ‘do no harm’ is fundamentally incompatible with any leap into the unknown as far as practices are concerned.  Any consistent application of the principle leads to a risk-averse application of the precautionary principle with respect to innovation.  How can this be made compatible with experimental work on learning analytics and sharing of personal data?  Must we reconfigure the principle of ‘do no harm’ so it becomes ‘minimise harm’?  It seems that way from this presentation… but it is worth noting that this is significantly different to the original maxim with which we were presented… different enough to undermine the basic position?


Ralf Klamma, RWTH Aachen University
‘Do Mechanical Turks Dream of Big Data?’

Klamma started in earnest by showing us some slides:  Einstein sticking his tongue out; stills from Dr. Strangelove; Alan Turing; a knowledge network (citation) visualization which could be interpreted as a ‘citation cartel’.  The Cold War image of scientists working in isolation behind geopolitical boundaries has been superseded by building of new communities.  This process can be demonstrated through data mining, networking and visualization.

Historical figures of the like of Einstein and Turing are now more like nodes on a network diagram – at least, this is an increasingly natural perspective.  The ‘iron curtain’ around research communities has dropped:

  • Research communities have long tails
  • Many research communities are under public scrutiny (e.g. climate science)
  • Funding cuts may exacerbate the problem
  • Open access threatens the integrity of the academy (?!)

Klamma argues that social network analysis and machine learning can support big data research in education.  He highlights the US Department of Homeland Security, Science and Technology, Cyber Security Division publication The Menlo Report: Ethical Principles Guiding Information and Communication Technology Research as a useful resource for the ethical debates in computer science.  In the case of learning analytics there have been many examples of data leaks:

One way to approach the issue of leaks comes from the TellNET project.  By encouraging students to learn about network data and network visualisations they can be put in better control of their own (transparent) data.  Other solutions used in this project:

  • Protection of data platform: fragmentation prevents ‘leaks’
  • Non-identification of participants at workshops
  • Only teachers had access to learning analytics tools
  • Acknowledgement that no systems are 100% secure

In conclusion we were introduced to the concept of ‘datability’ as the ethical use of big data:

  • Clear risk assessment before data collection
  • Ethical guidelines and sharing of best practice
  • Transparency and accountability without loss of privacy
  • Academic freedom

Fiona Murphy, Earth and Environmental Science (Wiley Publishing)
‘Getting to grips with research data: a publisher perspective’

From a publisher perspective, there is much interest in the ways that research data is shared.  They are moving towards a model with greater transparency.  There are some services under development that will use DOIs to link datasets and archives to improve the findability of research data.  For instance, the Geoscience Data Journal includes bi-directional linking to original data sets.  Ethical issues from a publisher point of view include how to record citations and accreditation, manage peer review, and maintain security protocols.

Data sharing models may be open, restricted (e.g. dependent on permissions set by data owner) or linked (where the original data is not released but access can be managed centrally).

[Discussion of open licensing was conspicuously absent from this though this is perhaps to be expected from commercial publishers.]


Luciano Floridi, Prof. of Philosophy & Ethics of Information at The University of Oxford
‘Big Data, Small Patterns, and Huge Ethical Issues’

Data can be defined by three Vs: variety, velocity, and volume. (Options for a fourth have been suggested.)  Data has seen a massive explosion since 2009 and the cost of storage is consistently falling.  The only limits to this process are thermodynamics, intelligence and memory.

This process is to some extent restricted by legal and ethical issues.

Epistemological Problems with Big Data: ‘big data’ has been with us for a while and should generally be seen as a set of possibilities (prediction, simulation, decision-making, tailoring, deciding) rather than a problem per se.  The problem is rather that data sets have become so large and complex that they are difficult to process by hand or with standard software.

Ethical Problems with Big Data: the challenge is actually to understand the small patterns that exist within data sets.  This means that many data points are needed as ways into a particular data set so that meaning can become emergent.  Small patterns may be insignificant so working out which patterns have significance is half the battle.  Sometimes significance emerges through the combining of smaller patterns.

Thus small patterns may become significant when correlated.  To further complicate things:  small patterns may be significant through their absence (e.g. the curious incident of the dog in the night-time in Sherlock Holmes).

A specific ethical problem with big data: looking for these small patterns can require thorough and invasive exploration of large data sets.  These procedures may not respect the sensitivity of the subjects of that data.  The ethical problem with big data is sensitive patterns: this includes traditional data-related problems such as privacy, ownership and usability but now also includes the extraction and handling of these ‘patterns’.  The new issues that arise include:

  • Re-purposing of data and consent
  • The deontological demand to treat people not only as means, resources, types, targets, consumers, etc.

It isn’t possible for a computer to calculate every variable around the education of an individual, so we must use proxies: indicators of type and frequency which sacrifice the uniqueness of the individual in order to make sense of the data.  This, however, results in the following:

  1. The profile becomes the profiled
  2. The profile becomes predictable
  3. The predictable becomes exploitable

Floridi advances the claim that the ethical value of data should not be higher than the ethical value of the entity the data describes: the data demands at most the same degree of respect as its subject.

Putting all this together:  how can privacy be protected while still taking advantage of the potential of ‘big data’?  This is an ethical tension between competing principles or ethical demands: the duties to be reconciled are 1) safeguarding individual rights and 2) improving human welfare.

  • This can be understood as a result of the polarisation of our moral framework – we focus on the two duties, to the individual and to society, and miss the privacy of groups in the middle
  • Ironically, it is the ‘social group’ level that is served by technology

Five related problems:

  • Can groups hold rights? (it seems so – e.g. national self-determination)
  • If yes, can groups hold a right to privacy?
  • When might a group qualify as a privacy holder? (corporate agency is often like this, isn’t it?)
  • How does group privacy relate to individual privacy?
  • Does respect for individual privacy require respect for the privacy of the group to which the individual belongs? (big data tends to address groups (‘types’) rather than individuals (‘tokens’))

The risks of releasing anonymised large data sets might need some unpacking.  The example given was that during the civil war in Côte d’Ivoire (2010–2011), Orange released a large metadata set which gave away strategic information about the position of groups involved in the conflict, even though no individuals were identifiable.  There is a risk of overlooking group interests by focusing on the privacy of the individual.
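The mechanism behind this kind of leak can be sketched in a few lines (a hypothetical, simplified illustration; all tower names and figures are invented): an “anonymised” release of call metadata contains no individual identifiers, yet aggregate counts per cell tower still reveal where a group is concentrated.

```python
from collections import Counter

# Hypothetical anonymised call records: no names, no phone numbers,
# just the cell tower each (unidentified) call was routed through.
anonymised_records = (
    [{"tower": "north_ridge"}] * 180   # heavy activity near one tower
    + [{"tower": "city_centre"}] * 40
    + [{"tower": "river_east"}] * 35
)

# Aggregating the release exposes a group-level pattern:
# where people are massed, without identifying anyone.
activity = Counter(r["tower"] for r in anonymised_records)
hotspot, count = activity.most_common(1)[0]
print(hotspot, count)  # → north_ridge 180
```

Individual privacy is preserved throughout; it is the privacy of the group that is breached.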

There are legal or technological instruments which can be employed to mitigate the possibility of the misuse of big data, but there is no one clear solution at present.  Most of the discussion centred upon collective identity and the rights that might be afforded an individual according to groups they have autonomously chosen and those within which they have been categorised.  What happens, for example, if a group can take a legal action but one has to prove membership of that group in order to qualify?  The risk here is that we move into terra incognita when it comes to the preservation of privacy.


Summary of Discussion

Generally speaking, it’s not enough to simply get institutional ethical approval at the start of a project.  Institutional approvals typically focus on protection of individuals rather than groups and research activities can change significantly over the course of a project.

In addition to anonymising data there is a case for making it difficult to reconstruct the entire data set so as to prevent misuse by others.  Increasingly we don’t even know who learners are (e.g. in a MOOC) so it’s hard to reasonably predict the potential outcomes of an intervention.
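One standard technique in this space (my example, not something proposed in the discussion) is k-anonymity: suppress any record whose quasi-identifiers are shared by fewer than k other records, so that no released row is unique.  Field names and values below are invented.

```python
from collections import Counter

def k_anonymise(records, quasi_identifiers, k=3):
    """Drop records whose quasi-identifier combination appears < k times."""
    key = lambda r: tuple(r[q] for q in quasi_identifiers)
    counts = Counter(key(r) for r in records)
    return [r for r in records if counts[key(r)] >= k]

# Three learners share the same quasi-identifiers; one is unique
# and would be re-identifiable, so it is suppressed before release.
records = (
    [{"age_band": "18-25", "region": "Lothian", "grade": "A"}] * 3
    + [{"age_band": "46-55", "region": "Borders", "grade": "C"}]
)
released = k_anonymise(records, ["age_band", "region"], k=3)
print(len(released))  # 3 – the unique record is gone
```

The trade-off is the one the discussion identified: the more aggressively records are suppressed or generalised, the harder the data is to misuse, but also the less useful it becomes for welfare-improving research.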

The BERA guidelines for ethical research are up for review by the sounds of it – and a working group is going to be formed to look at this ahead of a possible meeting at the BERA annual conference.

My ORO report

I’ve just had a quick look at my author report from the ORO repository of research published by members of The Open University.  I’m quite surprised to learn that I’ve accrued almost 1,300 downloads of materials I have archived here!

An up to date account of my ORO analytics can be found at http://oro.open.ac.uk/cgi/stats/report/authors/31087069bed3e4363443db857ead0546/. I suppose a 50% strike rate for open access publication ain’t bad… but there is probably room for improvement…

JiME Reviews December 2013

We’ve got a great set of new books in for review in the Journal of Interactive Media in Education (JiME) at the moment – thanks to Routledge for the review copies.

If you’re interested in reviewing any of the following then get in touch with me through Twitter or via rob.farrow [at] open.ac.uk to let me know which volume you are interested in and some of your reviewer credentials.  First come – first served!

Sue Crowley (ed.) (2014). Challenging Professional Learning. Routledge: London and New York.

Andrew S. Gibbons (2014). An Architectural Approach to Instructional Design. Routledge: London and New York.

Wanda Hurren & Erika Hasebe-Ludt (eds.) (2014). Contemplating Curriculum – Genealogies, Times, Places. Routledge: London and New York.

Phyllis Jones (ed.) (2014). Bringing Insider Perspectives into Inclusive Learner Teaching – Potentials and challenges for educational professionals. Routledge: London and New York.

Marilyn Leask & Norbert Pachler (eds.) (2014). Learning to Teach Using ICT in the Secondary School – A companion to school experience. Routledge: London and New York.

Allison Littlejohn & Anoush Margaryan (eds.) (2014). Technology-enhanced Professional Learning – processes, practices and tools (3rd ed.). Routledge: London and New York.

Ka Ho Mok & Kar Ming Yu (eds.) (2014). Internationalization of Higher Education in East Asia – Trends of student mobility and impact on education governance. Routledge: London and New York.