
Research, Impact and the UK Parliament

This event took place in the Darwin Building at University College London on 7th June 2017 and was organised by the Universities Programme at the UK Parliament Outreach and Engagement Service.  These are my personal notes, which may be of interest to researchers who wish to raise the profile of their work among policymakers.

 

An Introduction to the UK Parliament

The UK Parliament is made up of The House of Commons; The House of Lords; and The Monarch.  The Monarch’s role is mainly ceremonial and is not a focus for impact activities.  There are typically 650 MPs in The House of Commons.

The Commons is the democratically elected chamber of Parliament.  The party (or parties) who can command the confidence of the House of Commons form the UK government – typically the party with a majority of members.  If no party commands a majority then a minority government or coalition government may be formed.

The House of Lords used to be largely filled with hereditary peers.  The 1958 and 1999 reforms did away with most of these and most current Lords are life peers.  It is possible to become a Lord through hereditary title; public nomination to the House of Lords Appointments Commission; or (most typically) Prime Ministerial prerogative.  Traditionally the 26 highest-ranking bishops and archbishops of the Church of England sit in the Lords.  Life peers can choose to retire but they typically serve for life.  There are 92 places for hereditary peers.  Many peers have an allegiance to a political party, but there are also cross-benchers who retain independence.  The House of Lords does not conventionally block bills that were in a government manifesto, and in general it can only delay legislation and request amendments, not block it outright.

What does Parliament do?  The main activities are:

  • Making new laws
  • Raising and debating issues
  • Scrutinising the work of the UK government

Parliament is not the same thing as Government (the party or parties that command the confidence of the Commons).  For the development of policy and legislation the focus should be on the Government.  If the focus is on applying pressure or criticising a piece of legislation then Parliament is likely to be a more appropriate place to start.

How does Parliament use academic research?

  • House of Lords/Commons Select Committees (groups of MPs/Lords involved in an inquiry into an area of government activity or spending)
  • Parliamentary Office of Science and Technology (POST)
  • House of Commons Library
  • House of Lords Library
  • Public Bill Committees

How can you contribute to legislation?

  • Respond to consultations (before legislation goes to Parliament, e.g. green & white papers)
  • Make sure the subject specialists at the House of Commons Library know you and your area of expertise
  • Submit evidence to pre-legislative scrutiny committees and/or Public Bill Committees
  • Brief opposition/backbench MPs and Peers to assist them in legislative debates

You can send 250-word summaries of subject expertise to papers@parliament.uk for the attention of subject specialists.  You will then be entered onto a register of experts.

 

Academics and the UK Parliament

The Parliamentary Office of Science and Technology (POST) supports and advances the use of research evidence in Parliament.  Core activities include:

  • Producing four-page briefings for MPs and Lords on topics deemed to be of policy relevance. The process includes literature reviews, interviews with stakeholders, etc.
  • Supporting Select Committees and Libraries (providing contacts and bespoke briefings)
  • Connecting Parliament with researchers through events, fellowship schemes, etc.
  • Capacity Building through providing training to Parliamentary staff about using research and research methods

 

There are four POST areas:

  • Biological sciences and health
  • Energy and environment
  • Physical sciences and ICT
  • Social sciences

 

As part of the social science strand some work has been done on Parliamentary engagement through analysis of REF2014 case studies.  20% of impact case studies (N=1282) referred to engagement with Parliament.  88% of UK universities are engaged with Parliament in this way (from all 36 subject areas).  The universities which engaged the most were UCL, Oxford, Cambridge, KCL, Manchester, Bristol and Edinburgh.  The areas of Parliament that were engaged with most commonly were Select Committees (35%); Individual MPs or peers (28%); legislation (11%); debate (11%); APPGs (10%); libraries (3%); parliamentary questions (3%) and POST (2%).

The most common form of engagement is through citation or mention (37%), followed by providing evidence (18%).  Other examples include consultation, speaking or presenting, and direct correspondence.

The POST note process provides a way to engage:

  • Written by postgraduate fellows over a period of three months
  • Topics are approved by the POST board
  • You can propose a POST note, or contribute to notes that are presently being worked on
  • First drafts of POST notes are usually written by non-specialists
  • Finished drafts are sent to all MPs and peers and made available through the Parliamentary website
  • The website shows notes that have been approved for drafting as well as work currently in progress
  • There is a mailing list and a Twitter account (@post_uk)
  • It can also be useful to follow the social media accounts of Select Committees or Library Sections, etc.
  • Fellowships are available through research councils, learned societies and charities
  • Academic fellowships are available for academics at institutions which hold an ESRC or EPSRC Impact Acceleration Account (currently being piloted, deadline 30th June 2017)
  • Fellows are increasingly offered the chance to work directly with a Select Committee or Library Section

 

Libraries are keen to work with academics but often too busy to seek them out.    The Commons Library is made up of specialists who produce briefing papers and debate packs; the Lords Library comprises generalists and is a smaller team who focus on answering enquiries.

The contact email address for Libraries is: papers@parliament.uk

 

Engaging with UK Parliamentarians

The first step is to contact your local MP, who can be found on the Parliament website (www.parliament.uk) or by calling the House of Commons Information Office on 020 7219 4272.  Many MPs have a constituency office where they can be contacted.

It may be appropriate to contact other MPs – one approach could be to ask which other MPs might share an interest.

Members of the House of Lords have no constituency, nor do they have the staff support that MPs have.  Many Lords are busy and have jobs outside Parliament.  It is important to identify peers who will support your campaign.  The email address contactholmember@parliament.uk can be used to contact any peer.  Don’t bulk-send information – if more than six copies are received, all are deleted.

Individual parliamentarians are free to evaluate your communication and act in whatever way they feel is appropriate. There is no formal quality assurance process, so finding a sympathetic ear can be useful.  Targeting communication is important; having some sense of the action that you wish Parliamentarians to take helps to structure and strategically focus the communication.

Ways to find out more about Parliamentary interests:

  • All-Party Parliamentary Groups
    • These informal groups function like clubs and have been demonstrated to be a good way for researchers to gain influence
    • APPGs typically focus on a particular issue (‘subject groups’) or country
    • They can operate in wildly different ways because they are not uniform in their organisation or structure and they are self-run
    • There is a register of APPGs on the parliament.uk website
    • It can also be useful to identify Parliamentarians who might be obstructive to your legislative agenda through searching APPGs
    • Granularity can be an issue: ‘health’ is quite a broad category but there can be groups for specific areas of medicine or even specific medical conditions. Find the right group for your particular agenda
    • Ask for a list of contact details for members of the relevant APPG
  • Hansard (records of debate)
    • These can be searched for key words
    • Useful for identifying Parliamentarians
  • Early Day Motions (suggestions for future debates)
  • Select Committees

Another approach is to identify the clerks or co-ordinators of committees or groups and contact them directly.  They are likely to be organised and quick to respond.

If you’re going to contact an MP or peer, how should you present yourself?

  • Be polite
  • Have a clear purpose for contacting them
  • Try to stand out from the hundreds of other emails they have received that day
  • Be clear about the new knowledge that has been produced by your research
  • Communicate broad lines first and drill down into the details
  • Parliamentarians are ‘intelligent non-specialists’ who are used to taking in complex information – there is no need to dumb down research for them but it is good practice to minimise jargon and communicate the main points clearly

 

What is good Select Committee evidence?

Select Committees:

  • Are intended to hold the government (or relevant governmental department) to account
  • Are independent in terms of their focus
  • Examine expenditure, administration and policy of each Government department
  • Do not investigate individual complaints
  • Are cross-party and reflect the makeup of the House of Commons – serving ministers do not sit on Select Committees

 

How Select Committees work:

  1. Choose inquiry
  2. Announce Terms of Reference (narrowing areas of focus)
  3. Open the call for evidence (typically open for about six weeks)
  4. Collect written evidence
  5. Commission research (this is infrequently done, but still happens)
  6. Visits (to relevant stakeholders)
  7. Take oral evidence (these sessions are open to the public and sometimes televised)
  8. Discuss conclusions & recommendations
  9. Draft and agree report
  10. Publish report
  11. Receive a response from the Government, who are obliged to respond to each of the specific recommendations

Recommendations may be accepted in full or in part, or rejected.

Why engage with Select Committees?

  • Evidence based policymaking
  • Publicise research
  • Impact

How should one engage with a Select Committee?

  • Submit written evidence
  • Oral evidence
  • Act as a specialist advisor
  • Highlight relevant research

What is good evidence?

  • Relevant to the inquiry
  • Accessible, not academic – minimise jargon
  • Provides context and assesses the significance of a piece of research
  • Gives clear recommendations to the Committee
  • Avoid political point-scoring, since Select Committees are cross-party
  • Bear in mind the original terms of reference (and possibly use this to structure the report)

One area of focus for Select Committees is to improve the diversity of those who are asked to provide evidence.

The best place to start when thinking about approaching a Select Committee is Twitter – every SC has a Twitter account where requests are made.

Things to think about:

  • How is your research relevant to public policy?
  • Which inquiries could you submit evidence to?
  • How will the REF influence your research?

On the subject of REF:  it is not entirely clear how one might use Parliamentary activity as a way of demonstrating impact.  Keeping records of engagement (e.g. a letter of thanks) is a good idea because this could potentially be used as part of a case study.

Select committees have no role in legislation, though they may be asked to provide scrutiny on bills that are in early stages.

Bill committees also take evidence on a particular subject and the related legislation.  They are run by the Public Bill Office and their members are chosen by party whips, so they are more political and more tightly controlled by the parties.

 

 

Critical issues in contemporary open education research #srhe

This presentation outlines some key considerations for researchers working in the fields of open education, OER and MOOCs. Key lines of debate in the open education movement are described and critically assessed. A reflective overview of the award-winning OER Research Hub project will be used to frame several key considerations around the methodology and purpose of OER research (including ‘impact’ and ‘open practices’). These will be compared with results from a 2016 OER Hub consultation with key stakeholders in the open education movement on research priorities for the sector. The presentation concludes with thoughts on the potential for openness to act as a disruptive force in higher education.

The Open Research Agenda #opened16

Today at Open Education 2016 I presented the provisional results of a research consultation exercise we have been doing at OER Hub over the last year.  Several people asked for copies of the slides, which are available here and on the OER Hub SlideShare account.

All feedback welcome.  You can still take part in the project by completing the form at tinyurl.com/2016ORA.

The Open Research Agenda

Here are the slides I’ll be using today for my presentation at the CALRG Annual Conference.  The Open Research Agenda is an international consultation exercise focused on identifying research priorities in open education.

You can read more about the project here:

The Open Research Agenda (2)

The Open Research Agenda (1)

JiME Reviews Dec 2015

Here is the latest list of books available for review from JiME.  If you’re interested in reviewing any of the following then get in touch with me through Twitter or via rob.farrow [at] open.ac.uk to let me know which volume you are interested in and some of your reviewer credentials.

Reviews will be due at the end of February 2016, and should be in the region of 1500-2000 words.  You can see examples of previous reviews at http://jime.open.ac.uk/.

If you’re an academic publisher and you’re reading this, you may have noticed that we have a lot of books from Routledge in the backlog.  If you’d like to have your books considered for review in JiME then please mail them for my attention at the address in the sidebar.


  • Curtis J. Bonk, Mimi M. Lee, Thomas C. Reeves & Thomas H. Reynolds (eds.) (2015) MOOCs and Open Education around the world. Routledge: Abingdon and New York. link
  • Charles D. Dziuban, Anthony G. Picciano, Charles R. Graham & Patsy D. Moskal (2016). Conducting Research in Online and Blended Learning Environments.  Routledge: Abingdon and New York. link
  • Susan Garvis & Narelle Lemon (eds.) (2016). Understanding Digital Technologies and Young Children: An International Perspective. Routledge: Abingdon and New York. link
  • Seth Giddings (2014). Gameworlds: Virtual Media and Children’s Everyday Play. Bloomsbury Academic. link
  • Lori Diane Hill & Felice J. Levine (eds.) (2015). World Education Research Yearbook 2015. Routledge: Abingdon. link
  • Wanda Hurren & Erika Hasebe-Ludt (eds.) (2014). Contemplating Curriculum – Genealogies, Times, Places. Routledge: London and New York.  link
  • Phyllis Jones (ed.) (2014).  Bringing Insider Perspectives into Inclusive Learner Teaching – Potentials and challenges for educational professionals. Routledge: London and New York. link
  • David Killick (2015). Developing the Global Student: Higher education in an era of globalization. Routledge: London and New York. link
  • Piet A. M. Kommers, Pedro Isaias & Tomayess Issa (2015). Perspectives on Social Media – a yearbook. Routledge: London and New York. link
  • Angela McFarlane (2015). Authentic Learning for the Digital Generation – realising the potential of technology in the classroom. Routledge: Abingdon. link
  • Jill Porter (ed.) (2015). Understanding and Responding to the Experience of Disability. Routledge: London and New York. link
  • Steven Warburton & Stylianos Hatzipanagos (eds.) (2013). Digital Identity and Social Media.  IGI Global: Hershey, PA.  link

ROER4D Workshop, Banff 2015 – Day Two

For the morning of Day Two of the workshop the group split into working groups.  I floated between the groups and tried to capture a sense of what was being discussed in each.

Qualitative Data Analysis (led by Freda Wolfenden & David Porter)

Freda presented some key things to consider:

  • What does data analysis mean in the context of a research inquiry?
  • The relation of data analysis and research dissemination
  • Alternative forms of data analysis
  • Drawing conclusions from data analysis and evaluating evidence
  • Findings should be relevant and credible
  • Be aware of the relationship between the research rationale and data analysis
  • Several approaches to data analysis might be taken within one research project in order to meet different needs or ask different questions
  • Four types of analysis:  thematic, frequency, discourse, causal


David then spoke about qualitative data analysis and OER studies.  He said that qualitative data is really important for understanding how practitioners perceive the influence of OER on their own practice. He then connected this to the idea of communicating research findings through images and stories, using the ‘Arctic Death Spiral’ as an example.

When David joined BC Campus in 2003 he became involved in the Online Program Development Fund (OPDF), a programme funded by the Canadian government to develop online learning materials under open licence.  It became apparent that liberal arts, health and science were the subject areas with most interest in this approach.  These projects were seen as successful, but the worry of the funders was that there were pockets of activity rather than wholesale adoption.  In 2012, the Canadian government tried to further stimulate adoption by funding the production of open textbooks.

To support this work, research was done into the impact, successes and failures of the OPDF project.  They proceeded by interviewing stakeholders and looking at the existing literature to structure the study, identifying gaps in knowledge and any potential methodological barriers.  Seven themes (comprising quality, instructional design, technologies, business models, cultures, policies, and localisation) were identified.  This also afforded an opportunity to reflect on the importance of establishing how well OER was understood in the various institutions.

Ultimately (Third Generation) Activity Theory (Engeström, Nardi) was selected as a model for understanding the impact of OER as a whole, and as a framework for producing and aligning interview questions.  Interviews took about an hour and were subsequently transcribed.

Qualitative data analysis can produce richer understandings of context (Weiss, 1995).  Coding was done in NVivo and ATLAS.ti (specialist software for qualitative analysis).  203 codes were identified, and their frequency and proximity to each other were analysed. This was still too many for a reasonable analysis, so the codes were clustered into nine overall themes (some including as many as 46 sub-themes).  (It’s worth thinking carefully about the relationship between the questions asked and the themes that emerged, since themes are bound to be in the transcript if questions are asked about them – RF.)
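The frequency-and-proximity step can be sketched in a few lines of Python.  This is an illustrative reconstruction with invented codes and segments, not the project’s actual NVivo/ATLAS.ti workflow:

```python
from collections import Counter
from itertools import combinations

# Hypothetical coded transcript segments: each set lists the codes
# applied to one passage of an interview transcript.
segments = [
    {"quality", "adoption"},
    {"quality", "licensing"},
    {"adoption", "policy"},
    {"quality", "adoption", "policy"},
]

# Frequency: how often each code appears across segments.
freq = Counter(code for seg in segments for code in seg)

# Proximity: how often pairs of codes co-occur in the same segment;
# frequently co-occurring codes are candidates for merging into one theme.
cooccur = Counter()
for seg in segments:
    for pair in combinations(sorted(seg), 2):
        cooccur[pair] += 1

print(freq.most_common())     # most frequent codes first
print(cooccur.most_common())  # most frequently co-occurring pairs first
```

Pairs that co-occur often (here ‘adoption’ and ‘quality’) would be grouped under one theme; with 203 real codes the same counts could feed a clustering algorithm.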

The outcome was that a deep understanding of OER implementation was still lacking, and new tools and practices would have to be introduced in order to drive open textbook adoption.  This influenced the design of the new framework for reviewing and distributing open resources.

Quantitative Data Analysis

When the Q&A session began I headed over to the discussion of quantitative data.  The discussion was already getting into the nitty-gritty when I arrived, but here are some of the points that I took away:

  • A consistent approach to categorising resources and how they are used (driven by subject understanding rather than by the data)
  • There are different ways of categorising OER – according to the reason they were produced; formatting; level of production (individual/institutional)
  • In producing a matrix for analysing quantitative data there should be some flexibility so as to account for regional / cultural difference, etc.; different groupings might be appropriate for different regions / countries
  • Piloting the method with 2-3 studies from within ROER4D is a good way to evaluate the approach taken
  • Stratification of the sample can be achieved by categorising the institutions according to size, level, purpose, etc.  This would allow for comparison across and within countries
  • One approach could be to use respondent codes (or some other naming convention) consistently across both the qualitative and quantitative analysis processes
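The last point can be illustrated with a minimal sketch: if every dataset uses the same respondent code, qualitative and quantitative records can be joined without ambiguity.  The respondent IDs and fields here are invented for illustration:

```python
# Hypothetical survey responses (quantitative), keyed by respondent code.
survey = {
    "R001": {"country": "Kenya", "oer_use": 4},
    "R002": {"country": "India", "oer_use": 2},
}

# Hypothetical interview themes (qualitative), keyed by the same codes.
interview_themes = {
    "R001": ["adaptation", "policy"],
    "R002": ["access"],
}

# Because the naming convention is shared, merging is a simple lookup.
merged = {
    rid: {**record, "themes": interview_themes.get(rid, [])}
    for rid, record in survey.items()
}
```

The same convention also supports cross-country comparison, since the stratifying attributes (size, level, purpose) travel with the respondent record.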

After coffee the focus moved to strategies for data curation and communication.

Data Curation and Communication 

The commitment to Open Research is the foundational principle that guides ROER4D in its approach to creating and sharing research data, documents and other outputs. The five key aspects of Open Research so far identified are:

  1. Transparency in the research process
  2. Open licensing on all project outputs
  3. Maximising human readability and accessibility through multiple locations and open formats
  4. Maximising machine readability through online open formats (such as .xml)
  5. Long-term preservation, curation and accessibility of outputs through a multi-platform data management plan

However, openness is a concept with which many researchers will be unfamiliar, and an uncritical approach to openness may result in problems. Thus, the commitment to openness is qualified through the two considerations below:

  1. Open access to resources where openness adds value
  2. Protection of the dignity and privacy of individuals involved

Information needs to be organised and communicated if it is to have impact and good visibility. This is especially important for ROER4D in raising awareness of OER use in the Global South.  Once this is in place it can be communicated to different audiences in an iterative feedback loop.  For ROER4D, there are particular challenges around languages, diverse cultures, and measuring the impact of the project. Effective use of metadata is crucial here – we might even call it a ‘love note to the future’!  URIs / DOIs should be employed to track the use of data by others.  It’s also important to make sure that you comply with your own institutional data curation policies.
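As a concrete (and entirely hypothetical) sketch of what such metadata might look like, the Python standard library can emit a minimal Dublin Core record.  The title, creator, DOI and licence values below are placeholders, not real project metadata:

```python
import xml.etree.ElementTree as ET

# Dublin Core element namespace; "dc" is the conventional prefix.
DC = "http://purl.org/dc/elements/1.1/"
ET.register_namespace("dc", DC)

record = ET.Element("record")
for term, value in [
    ("title", "Example OER survey dataset"),            # placeholder title
    ("creator", "Example Researcher"),                  # placeholder creator
    ("identifier", "https://doi.org/10.9999/example"),  # placeholder DOI
    ("rights", "CC BY 4.0"),                            # open licence statement
]:
    ET.SubElement(record, f"{{{DC}}}{term}").text = value

print(ET.tostring(record, encoding="unicode"))
```

Registering a DOI as the `dc:identifier` is what makes downstream reuse of the dataset trackable.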

When thinking about whether to release data on a CC0 licence it’s important to realise that CC0 does not require attribution, so there is nothing to stop anyone working with the data without crediting the original researchers.  Effective registration of metadata about the project outputs on repositories will encourage better propagation of the research.

One thing we didn’t have time to discuss was how it was anticipated that people would arrive at the UCT repository in the first place.  (Maybe the idea is that the OpenUCT repository has good integration with search engines.)

Both a book summarising the ROER4D project and an interactive research report are anticipated.  The latter could include multimedia content summarising different strands of the work and link through to the more detailed reports and the open data itself.  Through a modular approach to reporting it should be possible to generate reports with different emphases or geospatial dimensions.

Dissemination

After lunch, Atieno Adala gave a neat summary of things to think about when writing research articles and gave an overview of good practice.

I then presented some work from the OER Research Hub and OER World Map projects. This was an impromptu activity, but a good opportunity to bring the map project to the attention of a network who are potentially really important for uptake among the global community. Here are the slides I used, some of which are taken from the OER 15 presentation last week.

Next Patricia Arinto gave an overview of the different dimensions of impact that the ROER4D Impact Studies will look at.  These cover a range of the spectrum of potential OER impact, such as educator and student practice, institutional impact, and effects on the quality of resources.

From this point on the meeting broke into smaller working groups and I drifted off to the GO-GN (Global OER Graduate Network) meeting of PhD students, some of whom are likely to spend more time at The Open University (UK), which is taking over administration of the network.

Both the ROER4D project and the GO-GN network have tracks in the Open Education Global conference as we progress through the week in picturesque Banff.

OCWC 2014 Recording Available

The video recording from my research presentation at OCWC 2014 is now available at http://videolectures.net/ocwc2014_farrow_oer_impact/.  It’s not possible to embed here but they have a nice player on their site.

This presentation gives an overview of the OER Research Hub project, some of the methodological and epistemological issues we encounter, and how we propose to ameliorate these through the technologies we use to investigate key questions facing the OER movement.


OER Impact: Collaboration, Evidence, Synthesis
Robert Farrow

Open Research into Open Education #calrg14

Here are my slides from today’s presentation: feedback welcome as always.

The project website is http://oerresearchhub.org and the OER Impact Map is available at http://oermap.org.

Thinking Learning Analytics

I’m back in the Ambient Labs again, this time for a workshop on learning analytics for staff here at The Open University.


Challenges for Learning Analytics: Visualisation for Feedback

Denise Whitelock described the SaFeSEA project, which is based around trying to give students meaningful feedback on their activities.  SaFeSEA was a response to high student dropout rates: 33% of new OU students do not submit their first TMA.  Feedback on submitted writing prompts ‘advice for action’: a self-reflective discourse with a computer.  Visualizations of these interactions can open a discourse between tutor and student.

Students can worry a lot about the feedback they receive.  Computers can offer non-judgmental, objective feedback without any extra tuition costs.  OpenEssayist analyses the structure of an essay; identifies key words and phrases; and picks out key sentences (i.e. those that are most representative of the overall content of the piece).  This analysis can be used to generate visual feedback, some forms of which are more easily understood than others.
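A crude version of ‘picking out the most representative sentences’ can be sketched with simple word-frequency scoring.  This is a generic extractive-summarisation illustration, not the actual OpenEssayist algorithm:

```python
import re
from collections import Counter

def key_sentences(text, n=2):
    """Return the n sentences whose words are most representative
    of the text as a whole (frequency-based scoring)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z']+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z']+", sentence.lower())
        # Average word frequency, so long sentences aren't favoured unfairly.
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    return sorted(sentences, key=score, reverse=True)[:n]
```

Sentences that reuse the essay’s most frequent vocabulary score highest; a real system would also need stop-word handling and more robust sentence splitting.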

Bertin (1977/81) provides a model for the visualization of data.  Methods can include diagrams which show how well connected different passages are to the whole, or generating different patterns that highlight different types of essay.  These can be integrated with social network analysis & discourse analytics.

Can students understand this kind of feedback?  Might they need special training?  Are these tools that could be used primarily by educators?  Would they also need special training?  In both cases, it’s not entirely clear what kind of training this might be (information literacy?).  Can one generic tool support writing across all disciplines, or are discipline-specific tools needed?

The Wrangler’s relationship with the Science Faculty

Doug Clow then presented on ‘data wrangling’ in the science faculty at The Open University.  IET collects information on student performance and presents this back to faculties in a ‘wrangler report’ able to feed back into future course delivery / learning design.

What can faculty do with these reports?  Data is arguably better at highlighting problems, or potential problems, than it is at solving them.  This process can perhaps get better at identifying key data points or performance indicators, but faculty still need to decide how to act on this information.  If we move towards the provision of more specific guidance then the role of faculty could arguably be diminished over time.

The relation between learning analytics and learning design in IET work with the faculties

Robin Goodfellow picked up these themes from a module team perspective.  Data can be understood as a way of closing the loop on learning design, creating a virtuous circle between the two.  In practice, there can be significant time delays in processing the data in time for it to feed in.  But the information can still be useful to module teams when thinking about aspects of the course such as:

  • Communication
  • Experience
  • Assessment
  • Information Management
  • Productivity
  • Learning Experience

This can give rise to quite specific expectations about the balance of different activities and learning outcomes.  Different indicators can be identified and combined to standardize metrics for student engagement, communication, etc.

In this way, a normative notion of what a module should be can be said to be emerging.  (This is perhaps a good thing in terms of supporting course designers but may have worrying implications in terms of promoting homogeneity.)

Another selective element arises from the fact that it’s usually only possible to collect data from a selection of indicators:  this means that we might come to place too much emphasis on data we do have instead of thinking about the significance of data that has not been collected.

The key questions:

  • Can underlying learning design models be identified in data?
  • If so, what do these patterns correlate with?
  • How can all this be bundled up to faculty as something useful?
  • Are there implications for general elements of course delivery (e.g. forums, VLE, assessment)?
  • If we only permit certain kinds of data for consideration, does this lead to a kind of psychological shift where these are the only things considered to be ‘real’ or of value?
  • Is there a special kind of interpretative skill that we need in able to make sense of learning analytics?

Learning Design at the OU

Annie Bryan drilled a little deeper into the integration of learning design into the picture.   Learning design is now a required element of course design at The Open University.  There are a number of justifications given for this:

  • Quality enhancement
  • Informed decision making
  • Sharing good practice
  • Improving cost-effectiveness
  • Speeding up decision making
  • Improving online pedagogy
  • Explicitly representing pedagogical activity
  • Effective management of student workload

A number of (beta) tools for Learning Design have been produced.  These are focused on module information; learning outcomes; activity planning, and mapping modules and resources.  These are intended to support constructive engagement over the life of the course.   Future developments will also embrace a qualification level perspective which will map activities against qualification routes.

These tools are intended to help course teams think critically about, and discuss, the purpose of the tools and resources chosen in the context of the course as a whole and student learning experiences.  A design perspective can also help to identify imbalances in course structure or problematic parts of a course.