Perspectives on the student as e-learner


Today I attended an internal Open University (OU) seminar which I had been much looking forward to.  This blog post is my summary notes from that seminar and some personal reflections on it.  A much repeated “mantra” of mine over the last 10 years has been:

  • We don’t know enough about the experience of our students studying online

My particular interest is the experience of our approximately 20,000 disabled students.  However, in many ways, within the diversity of this cohort and that of the overall student population, their experience is not distinct in most respects. Putting that more simply, disabled students are people first, exhibiting the full range of human diversity in their dislikes and preferences.  For example, among all students, and among those declaring disabilities, there are significant proportions who do not like to read long texts on screen.

So I approached this seminar as a way of gaining insights into the student experience from three colleagues who have differing roles.  My hope was that this would provide enriched context for my work.


The seminar was part of an Arts and Social Sciences e-Learning Seminar Series. The abstract published in advance is quoted here:

Three presenters will offer different perspectives on students as e-learners. Megan Doolittle, Staff Tutor and Senior Lecturer in Social Policy and Criminology, will offer a module team member’s view based on DD206, The uses of social science. Richard Greaves, Associate Lecturer in the East of England who teaches Arts modules at all three undergraduate levels, will offer the tutor’s perspective on students as e-learners. And Robin Goodfellow, Senior Lecturer in New Technology in Teaching (IET), will examine the way that the faculties’ data wranglers use and interpret information about students.

Discussion around these presentations will focus on the question ‘what does this triangulation tell us about our students?’


The presenters’ web pages are listed here:

The PowerPoint presentations and a video recording of the session will be put online in a day or so and I will post the links to them here then.  These may be accessible to OU staff only.

A note on OU roles (OU colleagues please skip)

Here are a few notes to set the rest of this blog in context for any readers not familiar with The Open University.  They give a simplified picture. For more information see:

  • An undergraduate qualification with the OU is made up of studying a series of Modules.
  • The Open Degree is made up of 300 credit points, the equivalent of 5 years of 60 credit point modules. The BA (Honours) or BSc (Honours) adds a further 60 credit points.
  • There are requirements for the number of points that must be achieved at each of 3 Levels (1, 2 or 3), which approximately correspond to the levels of study you might experience in the 1st, 2nd and final years of a bachelor’s degree at a conventional British university.
  • A typical module is worth 60 credit points (although there are 30, 15 and 10 point modules too).
  • A 60 point module requires 16 hours of study each week over 9 months; equivalent to about three evenings a week and a full day each weekend. (Many students “laugh” at the university’s study time estimates.)
  • Modules are developed in Module Teams, each of which has a Module Team Chair (the role represented by Megan Doolittle at this seminar).  The teams are multidisciplinary, made up of several, sometimes many, academics from the host faculty, Curriculum Managers (who have a co-ordination and project management role) and other specialists.
  • Learning and Teaching Solutions (LTS) is the centre for the development, production and delivery of creative and cost-effective distance learning materials (both printed and electronic). Specialists in the production of the various media elements work closely with each Module Team.  They also work closely with IET.
  • Modules are essentially delivered by people the OU calls Associate Lecturers, who are tutors.  Richard Greaves represented this role in this seminar.  Students on any given module are split into tutor groups, each assigned one or two tutors.  Traditionally these tutor groups would typically have the opportunity to meet face-to-face about once a month, but tutor group interaction has moved increasingly online in recent years.
  • The OU makes extensive use of a Virtual Learning Environment (VLE) – based on Moodle[1] and other software tools including online forums, audio conferencing and remote presentation software.  Each module has a presence here and some are presented wholly in this way, although an approach integrating this with printed materials is more common, and face-to-face activities like summer schools and field trips are sometimes included.
  • The Institute of Educational Technology (IET) (where I work) supports all the other faculties and the university’s management in making pedagogically sound use of technology in its teaching and learning.  It does this based on internationally renowned research in this field.  It has many roles, but the one Robin Goodfellow represented in this seminar is that of learning analytics advisor to a faculty.  This is an ongoing role: communicating data on student performance and attainment to the Module Teams and faculty management, and assisting with its interpretation.  This role has gone on in different guises for decades.  Recently it has acquired the sexy title of “Data Wrangler” and the terminology of learning analytics has come to the fore. [Any apparent cynicism there is all mine.]

Notes taken during the seminar

Megan Doolittle – From the module team’s perspective

  • Module Team Chair for  DD206 “The uses of social science”, see:
    • 2nd Level, 60 point, more online than past social-science modules
    • 1000+ students first presentation, 84% completed, 95% of those passed
    • Students liked:
      • Integrated assessment strategy
      • The relevance of the online elements
  • Students did not like:
    • Extent of online materials
    • Inadequacies in some software
  • Collaborative Groupwork
    • Students in small groups were asked to find and assess evidence about a current debate (e.g. the planned UK high-speed train line HS2)
    • Assessed (participation in groupwork always higher if assessed)
      • But assessed individually and participation accounted for 15% of grade
    • Tasks for each group (set up by module team – facilitated by tutors)
    • Forum discussions main method of collaboration
    • Issues for offender learners (prisoners) and those that can’t access forums
  • How well did it work?
    • Anecdotally – almost all students participated (at least a bit)
    • Sample group (8 students)
      • 84 posts across a number of threads (given as a hand-out)
      • A larger group of 11 students made 104 posts
    • Whole tutor group considered too big (based on knowledge of face-to-face teaching)
    • Series of lead-in activities available, not assessed, but participation in these much lower
    • Sample exchange shows:
      • Common group dynamic patterns
      • Some posting with no comment
      • Students took task seriously
      • No complaints from tutors or students (although a few said they wouldn’t do it)
      • Often one student took the lead
      • Most cases kept to focus of thread well
      • They didn’t always stick to roles initially proposed
      • Resources were shared
      • Some evaluation of resources, even if superficial (more like that which is common on social network discussions like Facebook or Twitter)
      • Key discussion points were raised  – sometimes backed up with evidence
      • It was not easy to move into more in-depth discussion of the issues and evidence
      • Most groups self-organising
      • Tutors tried various strategies to manage the forums – e.g. e-mail prompts
  • Technical issues
    • Forums are “clunky” at OU end to set up – the tasks had to be simpler than had been envisaged in planning stages because of technical limitations
    • Support from LTS, Module team and tutors needed in presentation
  • Feedback from students
    • (Similar to the feedback you would see in face-to-face group work)
    • Complained about lack of participation from other students
    • Difficulties of those working at different paces or different schedules
    • Not a lot of specific feedback from students just a few grumbles on forums
  • 2/3 of teaching materials written for the online environment
    • Limited blocks of text – lot of audio and video
    • Enabled a real blend of skills and content to be taught simultaneously
    • Took staff long time to learn to write effective resources (had to see it online to figure if it worked then iterate)
    • Some things can be best taught in book – e.g. abstract concepts
    • Tension between writing styles used online and models of good academic writing for students – referencing is an important tool for this (how to reference VLE materials?) – Still don’t ask students to write for the online environment but rather essays and reports.
  • Student Experience
    • Students could work through the module from beginning to end – the tool has a strong linear push
    • The online annotate tool is currently very poor
    • Harder to dip in and out than print
    • Some students did not want to work on screen; others preferred it – most just got on with it.
  • Issue of the 15% participation mark and offender learners and others who could not get online?
    • Answer – It’s problematic

Richard Greaves – From the AL (tutor) perspective

  • Provision on different modules:
    • AA100 – face to face and cluster forum
    • A150 – face to face and Tutor Group forum
    • A230 – face to face and cluster forum and national forum
  • Module Websites
    • All provided some information
    • Newer modules more online
    • AA100 – “Inside Arts” Works well for nearly all students who access it
    • More problematic for weaker students – need tutor support
      • Feeling about more confident students’ posts: “I don’t know how they got to that answer; I must be thick”
  • Students inexperienced with IT find it confusing
  • Students unwilling to spend time learning to navigate the website (e.g. when encounter different layouts)
  • Important because
    • Can lead to student not accessing materials
    • Student – Tutor Interactions
  • E-contact via forums
  • Tutor “business posts”
  • If the tutor gives an answer on the forum rather than by e-mail, it is available to all on the forum
  • Tutor posts indicating additional materials e.g. relevant TV programmes
  • Online Tutorials
    • N.B.  long posts on forums do not get read
    • Elicit student engagement via discussion and questions
    • Students need an individual response from tutors not summary answers
  • Types of student:
  1. Confident and Intelligent posters
  2. Weak – showing little understanding but still participating
  3. Readers – Who do not post (lurkers is the term I would use c.f. social media)
  4. Non-engagers
  5. Students who see online activity as part of their study
  6. Students who see all OU activity as “work” and do not see tutorials and forums as support
  7. Students who want to work on their own – they may be bright and work effectively
  8. Students who are weak and struggling but will not engage because of this – student type 1 puts this student off (they may prefer to use Facebook[2] groups – these are sometimes a source of disinformation)
  • Element of chance in the mix of students types in any given tutor group
  • What does student resistance look like?
    • I haven’t got time
    • I feel intimidated prefer Facebook
    • I only want peer support Facebook is better for that
  • What does AL (tutor) want?
    • Students boosting understanding and skills
    • Student engaging with forum based tutorials
    • Students able to access peer-support
    • Don’t want – discussion posts that are literal interpretations copied from sources found by a Google search
  • Compulsory online activity
    • A150 Wiki assignment
    • Requires online group activity
    • Successful in itself but didn’t (in his case) carry over to later course tasks
    • Students are increasingly instrumental – only engaging when marks involved

Robin Goodfellow – From the IET “Data Wrangler’s” perspective
or “Where is the student in all these numbers?”

I have not noted this section live.  This is partly because I ran out of steam for near-verbatim note taking and partly because the presentation included lots of data tables and graphs. Please refer to the slides that will be linked under resources above as soon as they are available.

Key points:

  • What assumptions are we making about the students when looking at data from VLE?
  • Great graphic illustrating that there is no correlation between KPI1 (Satisfaction > 87%) and pass rates.  The SEaM[3] survey goes out after the final Tutor Marked Assignment (TMA), before the end-of-module exam.
  • Learning design (OU version) structured in 4 categories:
  1. – Guidance and Support
  2. – Communication and Collaboration
  3. – Content and Experience
  4. – Reflection and Demonstration
  • VLE tracking Data:
    • % of students that access VLE in each week
    • Average time (mins) students spend online in each week
    • Forum access – % of students in each week (module teams can specify which forums tracked)
    • Resource bank access per week (but we don’t know which students are which – so it might be different students in adjacent weeks)
    • What are the students saying?
      • Online collaborative activities usually get low satisfaction levels.
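The satisfaction/pass-rate point above can be made concrete with a few lines of code.  This is only an illustrative sketch — the per-module figures below are invented placeholders, not OU data — showing how one might check whether the two indicators actually move together:

```python
# Illustrative sketch: does a satisfaction KPI track pass rates?
# All figures below are invented placeholders, not OU data.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

satisfaction = [88, 91, 95, 87, 93, 90]  # % satisfied, per module (hypothetical)
pass_rate = [62, 85, 70, 91, 68, 78]     # % passed, per module (hypothetical)

# An r far from +1 means satisfaction alone tells you little about passing.
print(f"Pearson r = {pearson(satisfaction, pass_rate):.2f}")
```

If anything like the graphic shown in the seminar holds, a computation of this kind over real module data would be exactly what exposes the gap between “students are satisfied” and “students are learning”.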

Small group round table exercise:

A set of various data sheets and graphs were given out to each table of about 6 people.  These related to anonymised courses, 3 from Arts and 3 from Social Science, at Levels 1, 2 and 3.  Each table was asked to review the data presentations and:

  • Say what the data shows
  • Identify 2 or more modules  that appear to suggest  different learning experiences
  • Prepare a short  summary  of what the student  on each of the modules is “likely” or “unlikely” to do or experience


Time was cut short for discussion because of the inevitable over-running of the three excellent main presentations.  However here are some of the key points that were raised in the short discussion period:

  • Lots of points on the degree of confidence in the learning analytics.
  • What issues are specific to e-learning, many are the same as face-to-face, then how can this be used to improve e-learning?
  • Satisfaction and pass rates are really dangerous indicators! – Difference between learning and passing.
  • There is a danger of reducing the incentive to teach hard things, because the criteria being used to evaluate how good a module is are so reductive.
  • Does the data analytic view reflect anything? – Is it not just a process measure?

Robin – if we reject the data approaches that are beginning to pervade the university because they are reductive, we are absenting ourselves from the discussion and from steering towards good practice in their use.

“Know thy data!”

Personal reflections

I am not going to make extensive reflections in this blog post.  I have a meeting with Robin Goodfellow booked for the 30th Jan 2014 about the overlap between our respective work, interest and reservations about learning analytics.  I hope to make a further blog post after that which this seminar will contextualize.  However here are a few thoughts:

  1. It was a shame that there was insufficient opportunity to triangulate the 3 perspectives given in the seminar.  This is vital because of the scale at which the OU operates and because, unlike most courses at face-to-face universities, it is not the authors of the modules that deliver them.
  • Module teams need to know the tutor experience
  • Module teams need to know the students experience as understood by the tutors and the student surveys and other learning analytics approaches
  • Those collecting and interpreting the data about the student experience need to validate their data approaches against the students, tutors and module teams “stories” (I hope to expand on this in the next blog post)
  2. Robin’s concluding comment of “know thy data” was a fitting one.  There was quite a bit of anxiety expressed, as has been widely recognised in other learning analytics work, that the learning analytics was more about management evaluation of the academic staff and their modules than about enhancing student support and continual improvement of module design.  I can understand that anxiety, but I agree with Robin that the way to ensure a positive use of the data in improving teaching and learning is to understand the data: what it means and, importantly, what it does not mean; its limitations.
  3. Unlike many of my colleagues in IET working on learning analytics, I come from an engineering (cybernetics) background and not a social science one.  So, some of my thinking about this data comes from a measurement physics perspective that was part of my education and work as an engineer for about 10 years.  Key questions for the learning analytics approaches from this perspective include:
  • What are you actually measuring?
  • How does what you are measuring relate to what you really want to know?
  • What is your estimate of the error in your measurement?
  • What is the signal-to-noise ratio?
  • When are you approaching the fundamental limits of measurement for the approach in question?
  • How might your measurement approach and instrumentation be interfering with what you are trying to measure and to what degree?

The issue of instrumentation is important in learning analytics.  It is very easy to produce pretty graphs and dashboards that look like they are saying something important and useful, but without understanding the measurement approaches, the data and metrics, and their limitations, it is too easy to convey a false level of confidence in our learning analytics.
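To make one of those measurement questions concrete: even a simple weekly “% of students accessed the VLE” figure carries an uncertainty that dashboards rarely show.  The sketch below is my own illustration with hypothetical numbers (not real OU metrics), treating each student’s access as a Bernoulli trial to put a rough error bar on the headline percentage:

```python
# Putting an error bar on a weekly "% of students accessed the VLE" figure.
# The cohort size and access count are hypothetical, not real OU metrics.
from math import sqrt

def access_rate_with_error(accessed, cohort):
    """Proportion of the cohort seen in the logs, with a rough standard
    error from the normal approximation to the binomial distribution."""
    p = accessed / cohort
    se = sqrt(p * (1 - p) / cohort)
    return p, se

p, se = access_rate_with_error(accessed=640, cohort=1000)
print(f"VLE access: {p:.1%} +/- {1.96 * se:.1%} (approx. 95% interval)")
```

Even this crude estimate forces the questions above into the open: the interval says nothing about *which* students were counted, or whether logging itself (the instrumentation) changed their behaviour.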

Repeating Robin’s conclusion: “know thy data!”


[2] Facebook groups are rarely set up by tutors or Module teams but are frequently set up for peer-to-peer support by students themselves

[3] The SEaM Survey is the OU’s main method for collecting feedback about the study experience and is administered by IET. It is:

  • an evaluation which asks students to provide feedback on their tutor and their wider module experience in one combined survey managed jointly by the Student Statistics and Survey Team in the Institute of Educational Technology and the AL Services Team in Student Services.
  • sent to all students who successfully studied and completed modules with tutor support, circulated via email with an online link to the survey two or three weeks before the end of the module.

This new approach was first used in January 2013 and the results are proving to be very beneficial.
