Notes from CALRG Conference 2014

The Computers and Learning Research Group (CALRG) at the Open University (UK) holds an annual conference.  Today and tomorrow mark the 35th such conference.  This post is my notes on the presentations I attend (unfortunately I cannot make them all).  There is a conference Twitter feed with hashtag #calrg14.  A temporary conference website, with a link to the programme, is at: http://sites.google.com/site/calrg14/.

CALRG Annual Conference Day One – June 10 2014

Discussant: Prof Rupert Wegerif, Exeter University

9:30-9:45 OPENING NOTES

Patrick McAndrew (Director of IET) – his own experience over 16 years as an induction to the university.  A catch-up point.  Today the theme is mostly Open Education.

Session I – Chair: Doug Clow

9:45-10:15 MOOCs, Learning Analytics and Higher Education: Perspectives on a recent study leave visit to the USA
Eileen Scanlon

  • The Americans sometimes slightly hallucinate our experience of Ed Tech
  • First stop – ACM Conference on learning at scale (single track)
  • Best thing – keynote from Chris Dede of Harvard – “New wine in new bottles”
    • It is not about the platform but what you do on the platform
    • Use of metaphors from film
    • Going big requires thinking small
    • Micro-genetic studies of online learning
    • People had forgotten all the learning science previously done
  • Distance Learning, OERs and MOOCs (Eileen’s presentation at the conference)
    • The Open Science Lab
    • Edinburgh experience – professional development of surgeons
  • Next stop Berkeley (Invitational Summit of 150 people)
    • Impact on residential campus based universities
    • Relying on schools of education to measure student learning
    • Reflection on edX platform
      • Transforming the institution (MIT in this case)
      • Learn about learning
      • E.g. required physics course – group learning – lot of use of online assessment
      • Comparison of performance in MOOCs of those taking residential course versus those not
      • Easy to drown in information if you Google assessments of edX
    • Simon initiative at Carnegie Mellon
      • AI and Cognitive Tutors
      • Broader than the institution
      • Global learning council
      • Spin-out company called “Cognitive Tutors”
      • Individualized instruction seen as gold standard for education
  • Then visited Stanford
    • The Lytics Lab (Learning Analytics)
      • Using learning science with open educational delivery
      • Moving from fragmented approach to systematic improvement of this type of pedagogy
      • CSCL (conversation) -> MOOC space
      • Scale of work in Stanford on MOOCs is staggering
      • Still individual academic driven
  • Then various other conferences
  • Future Learn Academic Network
    • Originally 26 partners; now expanding and becoming more global
  • ESRC proposal on future of higher education
    • Partners: OU, University of Edinburgh, Carnegie Mellon, Oxford University

10:15-10:45 Squaring the open circle: resolving the iron triangle and the interaction equivalence theorem
Andy Lane

  • Visual Models
    • How visualization can help with understanding/sense making
    • They can equally conceal
    • The Iron Triangle – sides: Scale, Quality, Cost
      • If one dimension is changed significantly it will compromise the others
    • John Daniel – open distance learning could break the iron triangle
    • Interaction Equivalence Theorem (EQuiv)
    • Supply-side vs demand-side (what about the students?)
    • Adding a circle of success to the iron triangle
    • A student centred iron triangle
      • motivation, preparation, organisation
    • A student centred Interaction Engagement Equivalence Theorem

10:45-11:15 Exploring digital scholarship in the context of openness and engagement
Richard Holliman, Ann Grand, Anne Adams and Trevor Collins

See: http://open.ac.uk/blogs/per

  • Public engagement with a research mandate
  • Research councils fund catalysts
  • An “ecology” of openness
  • Action Research [Lewin 1946]
  • The Edge tool
  • How do we find ways of assessing where staff are and then supporting them?
  • Research Questions
    • What methods and technologies are researchers using to: make research public, make public research, enable the public to collaboratively research (citizen science)?
    • how do researchers conceptualize the role of students?
  • Scholarship reconsidered
    • discovery
    • integration
    • application
    • teaching
  • Awareness / Responsibility / Sustainability
  • Institutional strategy for open, digital and engaged scholarship
    • What should we try to change?
  • Types of researcher: the fully wired; the dabbler; the brave trier; the unimpressed
  • “The Open Scholar is someone who makes their intellectual projects digitally visible …”
  • Policies / Procedures / Practices

[I was not able to attend the remaining sessions of Day 1, but the programme is included here]

Session II – Chair: Ann Jones

11:30-11:55 The OpenupEd quality label: benchmarks for MOOCs
Jon Rosewell

11:55-12:20 From theory to practice: can openness improve the quality of OER research?
Rebecca Pitt, Beatriz de-los-Arcos, Rob Farrow

12:20-12:45 Open Research into Open Education: The Role of Mapping and Curation
Rob Farrow

12:45-13:10 Strategies for Successful MOOC learning: The Voice from the World Record Breaker
Bernard Nkuyubwatsi
Session III – Chair: Rebecca Ferguson

14:00-14:25 The role of feedback in the under-attainment of ethnic minority students: Evidence from distance education
John T.E. Richardson, Bethany Alden Rivers and Denise Whitelock
14:25-14:50 Evaluating serious experiences in games
Jo (Ioanna) Iacovides
14:50-15:15 Social media for informal minority language learning: exploring Welsh learners’ practices
Ann Jones
15:15-15:30 TEA/COFFEE
Session IV – Chair: Inge de Waard
15:30-15:55 What students want: designing learning to optimise engagement in digital literacy skills development
Ingrid Nix and Marion Hall
15:55-16:20 Recording online synchronous tutorials to support learning
Pauline Bloss, Elisabeth Clifford, Chris Niblett and Elke St.John
16:20-16:45 Open Education needs Education for Openness: a dialogic theory of education for the Internet Age
Rupert Wegerif
16:45-17:00 Discussant – Rupert Wegerif
and CLOSE

CALRG Annual Conference – Day 2 – June 11 2014

Session V – Chair: Mark Gaved

9:40-10:05 ‘nQuire-it’: The design and evaluation of a mission-based web platform for citizen inquiry science learning
Christothea Herodotou, Eloy Villasclaras- Fernández , Mike Sharples

Notes from this presentation lost in the ether 😦

10:05-10:30 3D Virtual Geology Field Trips: Opportunities and Limitations
Shailey Minocha, Sarah-Jane Davies, Brian Richardson and Tom Argles

  • Can do things you cannot do on a real field trip – e.g. drape maps over mountains, see geological cross-sections
  • Uses the Unity 3D game engine to build a 10km x 10km area, mapping and imaging the real world (around Skiddaw, England)
  • Can pick up rocks and examine under microscope
  • Includes a chat facility for tutor group communication
  • Leave these tools out of the application so as not to compromise the immersion
  • Addresses accessibility with transcripts and full keyboard only access
  • Able to “fly” and “teleport” (on a real field trip a lot of time wasted travelling between sites)
  • Avatar based environment
  • Students use a paper based notebook as they would in the field
  • Integrate the virtual microscope (existing facility) but now contextualized learning
  • Cloud server can handle up to 500 students at one time

10:30-10:55 Juxtalearn: From Practice into Practice
Anne Adams and Gill Clough

  • Large EU project
  • Driver – not enough students taking science and technology at school – employment implications
  • Science and technology engagement through “creative performance” and “reflective learning”
  • Threshold concept (TC)
    • Where students find challenges
    • When they get it, it is transformative
    • Irreversible – not readily forgotten
    • Integrative – brings concepts together
  • Learning Pathways and Threshold Concepts (different ways from introduction of concept to internalisation of it)
  • Develop understanding through creative video making
  • Tricky Topic Tool
    • Teachers identify tricky topic
    • Teachers create an example
    • Teachers write down student problems
    • Teachers fill in Taxonomy (linked to student problems)
      • e.g. terminology, intuitive beliefs, incomplete pre-knowledge, …
  • Taxonomy scaffolds quiz creation (a toy sketch of this structure follows these notes)
    • Tool to facilitate this
    • Integrates detailed feedback to the student
  • Demo
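
These notes imply a simple underlying data structure. Here is a toy sketch of it in Python – the class, field names and quiz-stem helper are invented for illustration and are not part of the actual Juxtalearn tool:

```python
# Toy model (invented names, not the Juxtalearn tool itself) of what the
# Tricky Topic workflow captures: a topic, a worked example, observed
# student problems, and taxonomy labels linked to each problem.
from dataclasses import dataclass, field

@dataclass
class TrickyTopic:
    topic: str
    example: str
    # student problem -> taxonomy labels (terminology, intuitive beliefs, ...)
    problems: dict = field(default_factory=dict)

    def quiz_stems(self):
        # The taxonomy scaffolds quiz creation: one question stem per problem.
        return [f"Probe: {problem} (targets: {', '.join(labels)})"
                for problem, labels in self.problems.items()]

voltage = TrickyTopic(
    topic="Voltage in circuits",
    example="Two bulbs in series",
    problems={
        "thinks current gets used up": ["intuitive beliefs"],
        "confuses voltage with current": ["terminology", "incomplete pre-knowledge"],
    },
)
print("\n".join(voltage.quiz_stems()))
```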

Session VI – Chair: Anne Adams

11:15-11:40 Citizen Inquiry: From rocks to clouds
Maria Aristeidou, Eileen Scanlon, Mike Sharples

  • Citizen Science + Inquiry based Learning -> Citizen Inquiry
  • Inquiring – Rock Hunters (Initial Study)
    • 24 participants
    • 12 rock investigations
    • discussion and feedback on chat and forums
    • Data collection – questionnaires, System Usability Scale [John Brooke, 1986], …
  • [Note taking interrupted]

11:40-12:05 Imagining TM351 – Virtual Machines and Interactive Notebooks
Tony Hirst

  • TM351 – New Level 3 30 point module on data
  • Two new things:

1. Virtual machines (to overcome the diversity of machines being used by students)

    • Interfaces increasingly browser based
    • Virtual box installed on student machine and browser used as interface
    • Virtual machine can be on cloud server – then can use on a tablet

2. Notebook Computing

  • Literate programming / reproducible code and research
  • Code should read like an essay (self-documenting) – reading well to a human and executable by the machine
  • Can’t reproduce data analysis from traditional academic papers – reproducible research includes the tools to enable this
  • Using IPython
  • A counterpart to spreadsheets
  • Task-oriented productivity tools
  • Cells
    • some cells are for writing text
    • these use Markdown, a simple text-based mark-up
    • other cells contain Python code
    • e.g. the software creates the table – avoids errors in production and editing (a toy example of such a cell follows these notes)
    • similarly with maps and paths
  • IPython server in VM – interface in browser
  • Exploring using OpenDesignStudio so students can share and critique each others code in executable form (see: http://design.open.ac.uk/atelier-d/cdi1.htm)
  • Example shown
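
As a concrete illustration of the point about generated tables, this is the kind of code cell such a notebook might contain. It is a minimal sketch: it assumes pandas is installed in the module VM, and the module codes and figures are invented:

```python
# The table is generated by code from the data, so editing the data cannot
# leave a stale, hand-edited table behind. All figures are invented.
import pandas as pd  # assumed available in the module VM

results = pd.DataFrame({
    "module": ["TM351", "TU100"],  # hypothetical module codes
    "students": [350, 1200],       # invented figures
})
results["share"] = (results["students"] / results["students"].sum()).round(2)
print(results.to_string(index=False))
```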

 

12:05-12:30 MASELTOV – mobile incidental learning services to support language learning and the social inclusion of recent immigrants
Agnes Kukulska-Hulme, Eileen Scanlon, Ann Jones, Mark Gaved

  •  Using smart phones to support language learning
  • Addressing those with low educational level and from different culture
  • Incidental learning approach
  • MApp: a range of services
    • Field local mapping
    • Social network
    • Information resources
    • Translation
    • Navigation guide
    • Language learning
    • Serious game
  • These are separate apps but integrated in the platform
  • High penetration of smart phone among target audience
  • Technology uncertainty period
    • Many purchase phone ahead of travel
    • Android phones most popular
    • May have multiple phones
    • Seek out free WiFi
    • Word of mouth expertise highly valued
  • How do we enable the transition from problem solving to reflective learning?
    • relating immediate situation to broader context
    • Feedback and progress indicators
      • Study planning and goal setting
      • Indicating completion
      • Supporting sense of community
      • Building confidence
      • Gamified approach
      • Quizzes
  • What evidence that this approach to language learning is effective?
  • Are there clusters of tool use?
  • Demo

12:30-12:55 Knowledge Transfer Partnership: Booktrust and the Open University
Natalia Kucirkova, Karen Littleton, Teresa Cremin and Laura Venning

  • Ongoing project started this year
  • KTP-objectives:
    • Extending Booktrust’s work on promoting reading for pleasure
    • Contribute to digital literacy
    • New knowledge and understanding of digital technologies and the opportunities they provide
  • Synergy of two organisations
  • Looking at books created on iPads (created by children or parents using words and images)
  • The ability to search for meaning is enhanced by creating stories
  • Book Trust:
    • Charity founded in 1920s
    • Encouraging reading for pleasure among children and families
    • Run book gifting programmes
    • Bookstart – packs delivered by health visitors and via libraries
    • Reception year programme
    • Now seeking to develop the digital side of their work
    • Undertake research on reading habits and how reading contributes to people’s lives
    • Reading Habits survey 2013-14

Session VII – Chair: ?

14:00-14:25 Flipped teachers’ views of the impact of open practices on students
Beatriz de los Arcos

  • Flipped teaching – move the instruction online; more discussion and analysis in class
  • Help with “homework” given by experts
  • Survey of OER use by teachers and how impact on students
  • “I do not treat this curriculum as mine – it belongs to the class and the world”
  • http://sites.google.com/a/byron.k12.mn.us/stats4g
  • Example of a learning activity on Chaucer’s Canterbury Tales – kids turned in 82% of homework on time
  • OER enables new ways of teaching and learning
  • How do we measure the success of the flipped model?
    • A lot of teachers’ responses relate to student motivation and engagement
  • Most teachers adopt OER practice informally (e.g. uploading to YouTube) but don’t know about CC licenses etc.
  • Does flipping with OER give a better “flip” than working with closed resources?

 

14:25-14:50 The pedagogical design, user profile and evaluation of a Mobile app to teach beginners’ Chinese characters
Fernando Rosell-Aguilar and Kan Qian

  • Examples of tones in Chinese where the same syllable means different things – but context means that in practice mistakes are not significant
  • About 10,000 characters in common use with typically 12 strokes
  • No space between characters to denote separation of words
  •  Stroke order is important – but this also aids memory of characters – in Chinese primary schools they would chant this
  • Pinyin (Roman letters) is used to teach pronunciation because no correspondence between character and pronunciation
  • Grammar very simple (no past or future tense) – verbs stay the same – no plural singular
  • Rationale
    • To provide an aid to learning
    • To raise profile of the introduction to Chinese course
    •  To fulfill KMi objective to produce revision aids
  • Pedagogical design
    • Bite-sized learning
    • Progressive learning – the 20 lessons must be taken in order
    • Integrating writing, listening, reading and vocabulary
    • Gaming feature
    • Personalised learning
  • 4 Sections
  • Challenges of working with App Developers
    • Matching what can be done with what is desired
    • Timing issues
    • Technical affordances vs pedagogy
  • User profile and evaluation
    • More males than females (unlike other modern languages, more males than females study Chinese)
    • Median Age 30-39
    • 91.9% describe themselves as beginners
    • 75% learning Chinese informally
    • Why learn Chinese:
      • Personal interest
      • Family ties
      • Non-Chinese living in China
      • Business use
    • False expectation of ability to learn fluent Chinese from app
    • App rated positively: 86% very good or good
    • Good ratings for learning to write, but better for learning to recognise characters
    • 82% use the app as additional to other learning, but 18% use it as their main resource
  • Conclusion
    • Met objectives to a large degree, but no evidence of people using the app then signing up for the course
    • Varied mix of users (gender, age, etc.)
    • Android version has a limited character set; iOS is more comprehensive
  • App Chinese Characters First Steps – http://itunes.apple.com/gb/app/chinese-characters-first-steps/id441549197?mt=8#

14:50-15:15 Models of Disability, Models of Learning, Accessibility and Learning Technologies
Martyn Cooper

My presentation so not noted but slides are available on SlideShare at: http://www.slideshare.net/martyncooper/models-of-disability-models-of-learning-accessibility-calrg2014

 

Session VIII – Chair: Canan Blake

15:30-15:55 Computer-marked assessment as learning analytics
Sally Jordan

  • Using iCMAs in teaching since 2000
  • Ellis (2013): assessment is often excluded from learning analytics, but this is “stupid”
  • Assessment gives deep information about learner engagement
  • Analysis at the cohort level
    • Look at questions that students struggle with (from hard data, not student opinion) – a toy tally of this kind is sketched after these notes
  • Example of a graphic illustrating the number of tries students take to answer a question correctly in a maths assessment
  • Look at reasons for repeated wrong answers
  • Measuring student engagement – “750 students used my iCMA”
  • iCMAs in formative use reveal those who just click on them but don’t engage (about 10%)
  • When do students use iCMAs?
    • Strong bias towards cut-off dates
  • Length of responses to short-answer questions – if there is a word limit, students tend to write near to that limit (they see it as a hint)
  • Student engagement with feedback – comparisons between students and comparison between modules
  • Generally students do what they believe their teachers want
  • Engagement with computer marked assessment can be used as a proxy for deeper behaviour
  • Transcend the testing paradigm and see assessment for learning not assessment of learning
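
A minimal sketch of that cohort-level tally, assuming a hypothetical attempt log – this is an illustration only, not the OU’s actual analytics pipeline:

```python
# Tally how many tries students needed to answer correctly, from a
# hypothetical log of iCMA attempts (student, question, try number, correct).
from collections import Counter

attempts = [
    ("s1", "q1", 1, False), ("s1", "q1", 2, True),
    ("s2", "q1", 1, True),  ("s2", "q2", 1, False),
    ("s3", "q1", 1, False), ("s3", "q1", 2, False), ("s3", "q1", 3, True),
]

tries_needed = Counter(t for _, _, t, correct in attempts if correct)
for tries, students in sorted(tries_needed.items()):
    print(f"correct on try {tries}: {students} student(s)")
```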

15:55-16:20 Open Essayist: Opening automatic support for students drafting summative essays
Denise Whitelock, John Richardson, Debora Field, Stephen Pulman

  • The SAfeSEA Project, see: http://www.open.ac.uk/researchprojects/safesea/
  • Present summaries of students’ essays back to students to facilitate their reflection
  • Not tell students what to write (or what is right)
  • Identifies Intro, Main Section, Conclusions, Keywords
  • Generates different visual representations of the essay – one research question is what representations the students find most helpful
  • Node graphs represent repeated notions
  • Marked contrast between highly-marked and low-marked essays
  • Nodes are closer together in the better essays – vector length represents the connectivity between sentences (a toy connectivity measure is sketched after these notes)
  • In 2014 made available to students on MAODE, at the University of Herts and at the British University in Dubai
  • Non-native speakers said they found it very helpful
  • A lot of students do not see how a computer system could help them with their essays
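
To make the “connectivity between sentences” idea concrete, here is a toy similarity measure – a plain word-count cosine, which is my illustration of the general technique and not the actual Open Essayist algorithm:

```python
# Score "connectivity" between two sentences as the cosine similarity of
# their word counts: sentences sharing repeated notions score closer to 1.
import math
from collections import Counter

def cosine(a, b):
    dot = sum(a[w] * b[w] for w in a.keys() & b.keys())
    norm_a = math.sqrt(sum(v * v for v in a.values()))
    norm_b = math.sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

sentences = [
    "Learning analytics can support student reflection.",
    "Analytics on essays can support reflection by students.",
]
vec_a, vec_b = (Counter(s.lower().strip(".").split()) for s in sentences)
print(f"connectivity = {cosine(vec_a, vec_b):.2f}")
```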

16:20-16:45 Findings from a survey of undergraduate use of mobile devices for OU study
Authors: Simon Cross, Graham Healing, Mike Sharples

  • ePedagogies of handheld devices
  • Document and analyse the patterns of use of OU students
  • Align with other surveys – e.g. OU Student Survey
  • Becoming a longitudinal study
  • Modules are like a Lego set – what students do with them may be different from what was intended and may be influenced by the technologies they use
  • 82% of students have mobile phones, 50% tablets, 37% e-readers, 8% none of these
  • 30% bought tablet for OU study
  • 16% bought e-reader for OU study
  • Evolving data set – resource for future research
  • Insights for module development
  • Evolving survey instrument
  • Evolving analytical framework
  • Technology barriers -> learning barriers

16:45-17:00 CLOSE

No discussant today – shame because I like this feature of CALRG Conferences.

JISC Digital Festival 2014 – Notes Day 1 (Cont…)

The other presentation I attended on day 1 of the festival was given by Prof. David de Roure of the University of Oxford.  He spoke on “Big Data for the Social Sciences“, which I hoped would be relevant to my own work on Learning Analytics.  This blog post is my notes from his talk.

How does technology get used in research?

-> What is this new “big data” and what does it tell us?

  • Big data does not respect disciplinary boundaries
  • Data has been around a long time
  • There is a lot of “hype” around big data that has led to inflated expectations of it
  • We can consider 2013 as the year we sought to define big data and 2014 as the year we began to use it effectively
  • It is big data because of both the velocity and volume of the data being generated
  • “Data deluge” is now a phenomenon across the disciplines
  • In the past analysis moved from the universities to business, now it is from the business world to the universities.
  • There is huge unsatisfied demand for “data scientists”
  • Moore’s Law vs The Big Social

[Diagram: Moore’s Law vs The Big Social]

  • We use digital tools because it is the ecosystem – Research 2.0
  • What is the relevance of Social Science to Big Data?
    • We need to think through the implications
  • RCUK’s definition of “big data” is: big enough that we can’t deal with it as we did before
  • Why do we want it?
    • To do things in new ways
    • To do new things
    • e.g. Twitter data – we can look at the evolution of social processes in real-time
  • We need the expertise of those from classical Social Science
    • e.g. food vs consumption
    • can obtain new data from new sources (e.g. supermarket loyalty cards)
  • We can use different data sets to correlate
  • Real-time uses of big data, e.g. Twitter
    • spread of infectious diseases
    • riots
  • Visualisation can lead to better analysis
  • Underpinned by available infrastructure
  • Wikipedia is an example of  a social medium
    • behaviorally it is socially constructed
    • different in different countries/languages

— end —

Random Quotes from JISC Digital Festival 2014

Here are a few random quotes I noted down while at the JISC Digital Festival 2014 in Birmingham this week. Apologies for when I didn’t note who said them.

Academics need to stay on top of the analytics movement and not get pushed around by it!
[Anon]

And related to the above:

How does technology get used in research -> What is this new “big data” and what can (can’t) it tell us?
[Prof. David de Roure, Oxford University]

From a different perspective:

Research and Teaching have now diverged at the Universities
[On Twitter]

From the presentation by the originator of the “Hole in the Wall” experiment:

Children will learn to do what they want to learn to do!
[Sugata Mitra, Prof. of Educational Technology at Newcastle University]

I will add to these as I review the archived talks that I did not attend, which you can do by going to: http://www.jisc.ac.uk/events/jisc-digital-festival-2014-11-mar-2014/expert-speakers

Notes from JISC Digital Festival (Day 1)

I attended the JISC Digital Festival in Birmingham, UK.  This blog post is a set of notes from Day 1 (11-March-2014).

Keynote – Diana Oblinger (CEO Educause)

“Why are we still talking about this digital stuff?”

It is about 25 years since we moved from the analogue world to a world in which society, education and work are based on digital computing technologies. But we still use the term “digital”, either because we see it as something special or because we get concerned about “man vs machine”.  It is not just about digital – it is about demographics. In the US only 17% of learners are traditional college students. Many are now studying as adults who may not previously have had a positive educational experience.  This changes what we need from education and “how” we deliver that education.

Engagement

When you are engaged you learn better – leading to the hypothesis that face-to-face is always the best solution. But face-to-face often just presents text on a screen (a board or projector screen). Digital technologies, however, allow greater interaction between students, and between them and data about what they are investigating. We want to promote deep learning and develop skills through practice.  In these online practice environments there is not only student activity but also data coming back from the students’ interaction. When we have massive amounts of data we can begin to realise the long-held goal of personalised learning.  This can lead to adaptive learning systems.

Student Empowerment

There are different types of students.  Two example profiles:

  1. ROI Skeptics – not sure education will be worth the effort; external barriers; lack of vision; juggling work and family etc.
  2. Highly motivated students who always expected a college (university) education

Students need help with their complicated lives to be able to give education the right priority. Case management – dealing with the student holistically; early-alert programmes, for example, have a significant and lasting impact. Some students are blissfully unaware that they might be at risk of failing the course. Here lies a potential for learning analytics. However, beyond that, teachers need support on how to deliver those messages to the student, e.g. by mobile phone text message? (BTW, 43 words seems to be about the right length of message.)

Too much choice can be the enemy of student success if students choose courses they are not prepared for, or at too high a level for where they are in their studies (cf. consumer choice problems).  Software tools are emerging to address this.  An example was shown of a tool for students and their advisers to support them in their choices.

Alternative Models

IT in education can spawn over-complexity and disorientation.  Interconnected elements:

  • Mission
  • Market
  • Margin

Part of this is coming from outside our institutions – e.g. MOOCs and large-scale commercial educational providers.  Many customers (students) feel they are over-served by the traditional university system. There is now a big push on competency-based education – prove you have particular skills irrespective of where you acquired them.  However, current IT systems focus internally on the educational establishment.

Time is an opportunity cost.  Example: “Direct to Degree” from the University of Kentucky, enabling students to rapidly acquire a degree and reach their employment goals. If students never go to campus, where do you provide their support? Another example, from College for America, provides the support in the workplace. These can be low-cost models – e.g. one student completed a degree in 3 months at a cost of $1,350.

There is lots yet to do in the digital environment.  We need to design from the digital, with “man & machine” not “man vs machine” in mind.  A great frontier for all of us.

Three questions:

  1. What does it take to exceed expectations in this digital world?
  2. What capabilities (personnel, budget, skill) are required to deliver the value from IT?
  3. How do we optimise for a digital future? – Answer yet to be told!
[Image: steps and clapboard. Credit: http://www.livebinders.com/play/play/759398]

Response and Reflections on keynote

Prof. Paul Curran, VC of City University London

City was in a situation of ailing IT, with many mismatches between demand and supply.  (Here the focus is on education, not research or admin, but all three had to be addressed.) How do you develop a personalised experience for the students and enable academic staff to give the support they want to provide?  Further, how do we enhance the educational experience? And how can we do this in a way that engenders a community experience? City now has multiple campuses.

Trying to achieve a clear vision (2016) of where they wanted the university to be.   Investing in academic staff, IT and estate.  Focused on having sector-leading IT in education.  In 2010/11: a large IT service, 142 IT staff, £14.6 million budget.  But very devolved; software was often developed in-house and not fully documented, and big problems occurred when staff moved on.

Formulated a strategic plan with projects each under a PVC.

  • Engaging IT services around the student (e.g. brought in Moodle, Office 365) and organised around the concept of “The One City”.
  • Sourced commodity from external suppliers
  • Now spending less on IT, with fewer staff but more junior staff at the coal-face and fewer IT managers
  • Standard high quality equipment in the student spaces
  • Listened to what students, staff and professional services members wanted
  • Moved core services to a central base in London

Where are we now?:

  • The student registration system now stabilised
  • Monitor student access to Moodle to check on student engagement (reduced drop out rates)
  • Early initiatives to ensure things were scalable and multi-platform (including mobile, to enable work while travelling)
  • Provided easy access to student records
  • Using the big providers (e.g. SAP) linked to increasing in-house skills
  • IT staff now viewed differently – previously seen around their skill base (e.g. UNIX) – now seen around their relationship management and knowledge of integrating systems

Challenges for staff:

  • Academic staff need to move seamlessly between a “digital” and “real” world – provided a lot of training
  • IT staff – understanding their new role around the user/systems integration – again investment in training
  • Outsourcing and agreeing the boundaries

Conclusions

  • On a journey
  • Downtime reduced
  • Student Satisfaction Scores with IT increased

Martyn’s reflections

The key points that stood out for me from these two linked presentations were:

  1. We need to accept digital as being here and now and move to a “Man & Machine” mindset and not a “Man vs Machine” one
  2. Systems integration is key and outsourced systems are often the cost-effective way to go
  3. IT support staff need to see their role as focussed on the users (students, academics, admin staff, management) not on a particular area of technology that forms the core of their skill set
  4. It’s a ‘win-win’ situation of better services at lower cost that is achievable this way

A personal reflection from me:

The IT systems should be the servants of the educational, support and management processes not the other way round!


 

Ethics, Learning Analytics and Disability

Today I have been writing a contribution for a paper requested by the Open University’s Ethics Committee about ethics in Learning Analytics.  This blog post is adapted from that.

There are two broad use case scenarios where learning analytics approaches may benefit disabled students:

  1. Targeting support to disabled students or their tutors (Support)
  2. Identifying online activities that seem to be problematic for some disabled students (Accessibility)

As far as we are aware these approaches are yet to be deployed anywhere world-wide, but we are actively researching them here at the Open University, where we have approximately 20,000 disabled students.  We envisage that, if the early promise of this research holds up, deployment is on about a 3-year horizon.  These approaches, especially the accessibility one, are reported in more detail in Section 5 of Cooper et al. (2012).

Firstly, a few definitions:

IMS Global Learning Consortium offered education-specific definitions of both disability and accessibility when introducing its work on the development of technical standards for accessibility in e-learning:

[…] the term disability has been re-defined as a mismatch between the needs of the learner and the education offered. It is therefore not a personal trait but an artifact of the relationship between the learner and the learning environment or education delivery. Accessibility, given this re-definition, is the ability of the learning environment to adjust to the needs of all learners. Accessibility is determined by the flexibility of the education environment (with respect to presentation, control methods, access modality, and learner supports) and the availability of adequate alternative-but-equivalent content and activities. The needs and preferences of a user may arise from the context or environment the user is in, the tools available (e.g., mobile devices, assistive technologies such as Braille devices, voice recognition systems, or alternative keyboards, etc.), their background, or a disability in the traditional sense. Accessible systems adjust the user interface of the learning environment, locate needed resources and adjust the properties of the resources to match the needs and preferences of the user. (IMS Global 2004)

Thus disability is not an attribute of a person, but an attribute of the relationship between that person and the tools they are using to meet their goals; in this case online learning.  And accessibility is a property of the learning resources that makes them usable by all, including those traditionally labelled as disabled.
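
On this re-definition, matching is the key operation: select or adjust the form of a resource so that it fits the learner’s needs. A toy sketch of that idea follows – the structures are invented for illustration and are not the AccessForAll schema:

```python
# Pick the first alternative form of a resource whose properties cover a
# learner's stated needs. Invented structures, for illustration only.
def pick_alternative(needs, alternatives):
    for name, properties in alternatives.items():
        if needs <= properties:  # every need is met by this form
            return name
    return None  # no adequate alternative: a mismatch, i.e. "disability"

video_lecture = {
    "video": {"visual", "audio"},
    "captioned video": {"visual", "audio", "captions"},
    "transcript": {"text"},
}
print(pick_alternative({"captions"}, video_lecture))  # -> captioned video
print(pick_alternative({"text"}, video_lecture))      # -> transcript
```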

The principal ethical dilemma when approaching learning analytics and learners who might experience a disability in the context of online learning is:

  • For what purpose have individual students declared their disability to the university or other educational establishment, and is this consistent with how that information is to be used in the learning analytics approaches?

No other literature has been found that explicitly addresses this issue, so this blog post might represent the first public statement of the problem.

At the Open University, students declare a disability so that they can be provided with support in their studies.  This is consistent with the first use case scenario (Support).  It is a moot point whether it is consistent with the second use case scenario (Accessibility).  More critically, at this stage of development of these approaches, it is not obvious that it is consistent with research into them.  Is it ethical to use historic or current data relating to students with disabilities to undertake research into future approaches to applying learning analytics?

References

Cooper, M., Sloan, D., Kelly, B. and Lewthwaite, S. (2012) A Challenge to Web Accessibility Metrics and Guidelines: Putting People and Processes First, Proc. W4A2012, April 16-17, 2012, Lyon, France. Co-located with the 21st International World Wide Web Conference.

IMS Global Learning Consortium (2004), IMS AccessForAll Meta-data Overview. Available online at: http://www.imsglobal.org/accessibility/accmdv1p0/imsaccmd_oviewv1p0.html (accessed 17/02/14)

Perspectives on the student as e-learner

Introduction

Today I attended an internal Open University (OU) seminar which I had been much looking forward to.  This blog post is my summary notes from that seminar and some personal reflections on it.  A much repeated “mantra” of mine over the last 10 years has been:

  • We don’t know enough about the experience of our students studying online

My particular interest is the experience of our approximately 20,000 disabled students.  However, in many ways, within the diversity of this cohort and of the overall student population, their experience is not distinct in most respects. Putting that more simply, disabled students are people first, exhibiting the full range of human diversity in their likes, dislikes and preferences.  For example, among all students, and among those declaring disabilities, there are significant proportions that do not like to read long texts on screen.

So I approached this seminar as a way of gaining insights of the student experience from three colleagues who have differing roles.  My hope was that this would provide enriched context for my work.

Abstract

The seminar was part of an Arts and Social Sciences e-Learning Seminar Series. The abstract published in advance is quoted here:

Three presenters will offer different perspectives on students as e-learners. Megan Doolittle, Staff Tutor and Senior Lecturer in Social Policy and Criminology, will offer a module team member’s view based on DD206, The uses of social science. Richard Greaves, Associate Lecturer in the East of England who teaches Arts modules at all three undergraduate levels, will offer the tutor’s perspective on students as e-learners. And Robin Goodfellow, Senior Lecturer in New Technology in Teaching (IET), will examine the way that the faculties’ data wranglers use and interpret information about students.

Discussion around these presentations will focus on the question ‘what does this triangulation tell us about our students?’

Resources

The presenters’ web pages are listed here:

The PowerPoint presentations and a video recording of the session will be put online in a day or so, and I will post the links to them here then.  They may be accessible to OU staff only.

A note on OU roles (OU colleagues please skip)

Here are a few notes to set the rest of this blog in context for any readers not familiar with The Open University.  They give a simplified picture. For more information see: http://www.open.ac.uk/about/main/

  • An undergraduate qualification with the OU is made up of studying a series of Modules.
  • The Open Degree is made up from 300 credit points, the equivalent of 5 years of 60 credit point modules. The BA (Honours) or BSc (Honours) adds a further 60 credit points.
  • There are requirements for the number of points that must be achieved at each of 3 levels (1, 2 and 3), which approximately correspond to the levels of study you might experience in the 1st, 2nd and final years of a bachelor’s degree at a conventional British university.
  • A typical module is worth 60 credit points (although there are 30, 15 and 10 point modules too).
  • A 60 point module requires 16 hours of study each week over 9 months; equivalent to about three evenings a week and a full day each weekend. (Many students “laugh” at the university’s study time estimates.)
  • Modules are developed in Module Teams each of which has a Module Team Chair (the role represented by Megan Doolittle at this seminar).  The teams are multidisciplinary made up of several, sometimes many, academics from the host faculty, Curriculum Managers (who have a co-ordination and project management role) and other specialists.
  • Learning and Teaching Solutions (LTS) is the centre for the development, production and delivery of creative and cost-effective distance learning materials (both printed and electronic). Specialists in the production of the various media elements work closely with each Module Team.  They also work closely with IET.
  • Modules are essentially delivered by people the OU calls Associate Lecturers, who are tutors.  Richard Greaves represented this role in this seminar.  Students on any given module are split into tutor groups, each assigned one or two tutors.  Traditionally these tutor groups would typically have the opportunity to meet face-to-face about once a month, but tutor group interaction has moved increasingly online in recent years.
  • The OU makes extensive use of a Virtual Learning Environment (VLE) – based on Moodle[1] and other software tools including online forums, audio conferencing and remote presentation software.  Each module has a presence here and some are presented wholly in this way. Although an approach integrating this with printed materials is more common and sometimes face-to-face activities like summer schools and field trips are included.
  • The Institute of Educational Technology (IET) (where I work) supports all the other faculties and the university’s management in making pedagogically sound use of technology in teaching and learning.  It does this based on internationally renowned research in this field.  It has many roles, but the one Robin Goodfellow represented in this seminar is that of learning analytics advisor to a faculty.  This is an ongoing role: communicating data on student performance and attainment to the Module Teams and faculty management, and assisting with its interpretation.  For decades this role has gone on in different guises.  Recently it has acquired the sexy title of “Data Wrangler” and the terminology of learner analytics has come to the fore. [Any apparent cynicism there is all mine.]

Notes taken during the seminar

Megan Doolittle – From the module teams perspective

  • Module Team Chair for  DD206 “The uses of social science”, see: http://www3.open.ac.uk/study/undergraduate/course/dd206.htm
    • 2nd Level, 60 point, more online than past social-science modules
    • 1000+ students first presentation, 84% completed, 95% of those passed
    • Students liked:
      • Integrated assessment strategy
      • The relevance of the online elements
  • Students did not like:
    • Extent of online materials
    • Inadequacies in some software
  • Collaborative Groupwork
    • Students in small groups were asked to find and assess evidence about a current debate (e.g. the planned UK high-speed train line HS2)
    • Assessed (participation in groupwork always higher if assessed)
      • But assessed individually and participation accounted for 15% of grade
    • Tasks for each group (set up by module team – facilitated by tutors)
    • Forum discussions main method of collaboration
    • Issues for offender learners (prisoners) and those that can’t access forums
  • How well did it work?
    • Anecdotally – almost all students participated (at least a bit)
    • Sample group (8 students)
      • 84 posts across a number of threads (given as a hand-out)
      • A larger group of 11 students made 104 posts
    • Whole tutor group considered too big (based on knowledge of face-to-face teaching)
    • A series of lead-in activities was available, not assessed, but participation in these was much lower
    • Sample exchange shows:
      • Common group dynamic patterns
      • Some posting with no comment
      • Students took task seriously
      • No complaints from tutors or students (although a few said they wouldn’t do it)
      • Often one student took the lead
      • Most cases kept to focus of thread well
      • They didn’t always stick to roles initially proposed
      • Resources were shared
      • Some evaluation of resources, even if superficial (more like what is common in social network discussions on Facebook or Twitter)
      • Key discussion points were raised  – sometimes backed up with evidence
      • It was not easy to move into more in-depth discussion of the issues and evidence
      • Most groups self-organising
      • Tutors tried various strategies to manage the forums – e.g. e-mail prompts
  • Technical issues
    • Forums are “clunky” at the OU end to set up – the tasks had to be simpler than envisaged at the planning stage because of technical limitations
    • Support from LTS, Module team and tutors needed in presentation
  • Feedback from students
    • (Similar to the feedback you would see in face-to-face group work)
    • Complained about lack of participation from other students
    • Difficulties of those working at different paces or different schedules
    • Not a lot of specific feedback from students just a few grumbles on forums
  • 2/3 of the teaching materials were written for the online environment
    • Limited blocks of text – lot of audio and video
    • Enabled a real blend of skills and content to be taught simultaneously
    • Took staff a long time to learn to write effective resources (they had to see it online to figure out if it worked, then iterate)
    • Some things are best taught in a book – e.g. abstract concepts
    • Tension between writing styles used online and models of good academic writing for students – referencing is an important tool for this (how do you reference VLE materials?) – We still don’t ask students to write for the online environment, but rather essays and reports.
  • Student Experience
    • Students could work through the module from beginning to end – the tool has a strong linear push
    • The online annotation tool is currently very poor
    • Harder to dip in and out of than print
    • Some students did not want to work on screen; others preferred it – most just got on with it.
Question
  • Issue of the 15% participation mark and offender learners and others who could not get online?
    • Answer – It’s problematic

Richard Greaves – From the AL (tutor) perspective

  • Provision on different modules:
    • AA100 – face-to-face and cluster forum
    • A150 – face-to-face and tutor group forum
    • A230 – face-to-face, cluster forum and national forum
  • Module Websites
    • All provided some information
    • Newer modules more online
    • AA100 – “Inside Arts” works well for nearly all students who access it
    • More problematic for weaker students – need tutor support
      • Feeling about more confident students’ posts: “I don’t know how they got to that answer, I must be thick”
  • Students inexperienced with IT find it confusing
  • Students unwilling to spend time learning to navigate the website (e.g. when encounter different layouts)
  • Important because
    • Can lead to student not accessing materials
    • Student – Tutor Interactions
  • E-contact via forums
  • Tutor “business posts”
  • If a tutor gives an answer on the forum rather than by e-mail, it is available to all on the forum
  • Tutor posts indicating additional materials, e.g. relevant TV programmes
  • Online Tutorials
    • N.B.  long posts on forums do not get read
    • Elicit student engagement via discussion and questions
    • Students need an individual response from tutors not summary answers
  • Types of student:
  1. Confident and Intelligent posters
  2. Weak – showing little understanding but still participating
  3. Readers – Who do not post (lurkers is the term I would use c.f. social media)
  4. Non-engagers
  5. Students who see online activity as part of their study
  6. Students who see all OU activity as “work” and do not see tutorials and forums as support
  7. Students who want to work on their own – they may be bright and work effectively
  8. Students who are weak and struggling but will not engage because of this – student type 1 puts this student off (they may prefer to use Facebook[2] groups – these are sometimes a source of disinformation)
  • Element of chance in the mix of students types in any given tutor group
  • What does student resistance look like?
    • I haven’t got time
    • I feel intimidated prefer Facebook
    • I only want peer support Facebook is better for that
  • What does AL (tutor) want?
    • Students boosting understanding and skills
    • Student engaging with forum based tutorials
    • Students able to access peer-support
    • Don’t want – discussion posts that are literal interpretations copied from sources found by Google search
  • Compulsory online activity
    • A150 Wiki assignment
    • Requires online group activity
    • Successful in itself but didn’t (in his case) carry over to later course tasks
    • Students are increasingly instrumental – only engaging when marks involved

Robin Goodfellow – From the IET “Data Wrangler’s” perspective
or “Where is the student in all these numbers?”

I did not take notes on this section live, partly because I ran out of steam for near-verbatim note-taking and partly because the presentation included lots of data tables and graphs. Please refer to the slides, which will be linked under Resources above as soon as they are available.

Key points:

  • What assumptions are we making about the students when looking at data from VLE?
  • Great graphic illustrating that there is no correlation between KPI1 (satisfaction > 87%) and pass rates (a minimal correlation check of this kind is sketched after this list).  The SEaM[3] survey goes out after the final Tutor Marked Assignment (TMA), before the end-of-module exam.
  • Learning design (OU version) structured in 4 categories:
  1. Guidance and Support
  2. Communication and Collaboration
  3. Content and Experience
  4. Reflection and Demonstration
  • VLE tracking Data:
    • % of students that access VLE in each week
    • Average time (mins) students spend online in each week
    • Forum access – % of students in each week (module teams can specify which forums tracked)
    • Resource bank access per week (but we don’t know which students are which – so it might be different students in adjacent weeks)
    • What are the students saying?
      • Online collaborative activities usually get low satisfaction levels.
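
For what it’s worth, a check of that kind is tiny in code – the figures below are invented, purely to illustrate the computation:

```python
# Pearson correlation between module-level satisfaction and pass rates.
# All figures are invented; with these, r comes out weak, echoing the
# "no correlation" graphic described above.
from statistics import correlation  # Python 3.10+

satisfaction = [0.91, 0.88, 0.93, 0.86, 0.90]  # hypothetical SEaM scores
pass_rate = [0.72, 0.78, 0.75, 0.74, 0.71]     # hypothetical pass rates
print(f"Pearson r = {correlation(satisfaction, pass_rate):.2f}")
```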

Small group round table exercise:

A set of various data sheets and graphs were given out to each table of about 6 people.  These related to anonymised courses, 3 from Arts and 3 from Social Science, at Levels 1, 2 and 3.  Each table was asked to review the data presentations and:

  • Say what the data shows
  • Identify 2 or more modules that appear to suggest different learning experiences
  • Prepare a short summary of what the student on each of the modules is “likely” or “unlikely” to do or experience

Discussion

Time was cut short for discussion because of the inevitable over-running of the three excellent main presentations.  However here are some of the key points that were raised in the short discussion period:

  • Lots of points on the degree of confidence in the learning analytics.
  • What issues are specific to e-learning, many are the same as face-to-face, then how can this be used to improve e-learning?
  • Satisfaction and pass rates a really dangerous indicator! – Difference between learning and passing.
  • There is a danger of reducing the incentive to teach hard things, because the criteria being used to evaluate how good a module is are so reductive.
  • Does the data analytic view reflect anything? – Is it not just a process measure?

Robin – if we reject the data approaches that are beginning to pervade the university because they are reductive, we absent ourselves from the discussion and from steering towards good practice in their use.

“Know thy data!”

Personal reflections

I am not going to make extensive reflections in this blog post.  I have a meeting with Robin Goodfellow booked for the 30th Jan 2014 about the overlap between our respective work, interests and reservations about learning analytics.  I hope to make a further blog post after that, which this seminar will contextualise.  However, here are a few thoughts:

  1. It was a shame that there was insufficient opportunity to triangulate the 3 perspectives given in the seminar.  Such triangulation is vital because of the scale at which the OU operates, and because, unlike most courses at face-to-face universities, it is not the authors of the modules who deliver them.
  • Module teams need to know the tutor experience
  • Module teams need to know the students experience as understood by the tutors and the student surveys and other learning analytics approaches
  • Those collecting and interpreting the data about the student experience need to validate their data approaches against the students, tutors and module teams “stories” (I hope to expand on this in the next blog post)
  2. Robin’s concluding comment of “know thy data” was a fitting one.  There was quite a bit of anxiety expressed, as has been widely recognised in other learning analytics work, that the learning analytics was more about management evaluation of the academic staff and their modules than about enhancing student support and continual improvement of module design.  I can understand that anxiety, but I agree with Robin that the way to ensure a positive use of the data in improving teaching and learning is to understand the data: what it means and, importantly, what it does not mean; its limitations.
  3. Unlike many of my colleagues in IET working on learning analytics, I come from an engineering (cybernetics) background and not a social science one.  So some of my thinking about this data comes from the measurement physics perspective that was part of my education and my work as an engineer for about 10 years.  Key questions for learning analytics approaches from this perspective include:
  • What are you actually measuring?
  • How does what you are measuring relate to what you really want to know?
  • What is your estimate of the error in your measurement?
  • What is the signal-to-noise ratio?
  • When are you approaching the fundamental limits of measurement for the approach in question?
  • How might your measurement approach and instrumentation be interfering with what you are trying to measure and to what degree?

The issue of instrumentation is important in learning analytics.  It is very easy to produce pretty graphs and dashboards that look like they are saying something important and useful, but without understanding the measurement approaches, the data and metrics, and their limitations, it is too easy to convey a false level of confidence in our learning analytics.

Repeating Robin’s conclusion: “know thy data!”

Footnotes:


[2] Facebook groups are rarely set up by tutors or Module teams but are frequently set up for peer-to-peer support by students themselves

[3] The SEaM Survey is the OU’s main method for collecting feedback about the study experience and is administered by IET. It is:

  • an evaluation which asks students to provide feedback on their tutor and their wider module experience in one combined survey managed jointly by the Student Statistics and Survey Team in the Institute of Educational Technology and the AL Services Team in Student Services.
  • sent to all students who successfully studied and completed modules with tutor support, circulated via email with an online link to the survey two or three weeks before the end of the module.

This new approach was first used in January 2013 and the results are proving to be very beneficial.