Internal project proposal: Learning Analytics for Disabled Students in STEM subjects

I am currently working on an internal project proposal: Learning Analytics for Disabled Students in STEM subjects (LA4DS-STEM). Hopefully it will run from April – December 2014.

The LA4DS-STEM project will review the potential of Learning Analytics in higher education, specifically in STEM, and with an emphasis on supporting disabled students and facilitating accessibility enhancements.

Learning Analytics is defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. The LA4DS-STEM project will explore specific STEM application areas for Learning Analytics. A key output of the project will be an external funding bid for a larger-scale collaborative project. The work of LA4DS-STEM will inform pilots in this project. Provided the envisaged benefits are confirmed, this should lead to enterprise-level implementation within the OU and across HE.

The findings of the LA4DS-STEM project will be disseminated, firstly throughout the Science and MCT faculties, then to the wider university. External dissemination will highlight the OU’s lead in this field.

Learning Analytics for STEM – disabled student support/accessibility LA4STEM (#la4stem)

Today I submitted an internal Open University project bid to a programme called eSTEeM.

I post here the project description. N.B. at this stage this is just a proposal; however, we should hear by 31 October 2012 whether it has been supported as an eSTEeM project and funded. If so, I may well be blogging much more about this work and its findings.

May I remind readers that I set up a LinkedIn Group to try to tease out whether anyone worldwide is doing anything in the area of Learning Analytics and Accessibility. There has been some interest (the group currently has 75 members) but no one has yet shared that they are doing substantive work. So, you never know, LA4STEM may in the future be seen as seminal. 😉

If you are interested in this field may I commend to you SoLAR – the Society for Learning Analytics Research: http://www.solaresearch.org/

I will be giving a 30 min presentation about this work at the event below – it's a long way to travel for me 😉 – it's at the OU main campus where I work:

SoLAR Flare UK (19 Nov 2012) #flareUK

Mon 19 Nov 2012, The Open University
Jennie Lee Building, Walton Hall, Milton Keynes, MK7 6AA

http://www.solaresearch.org/flare/solar-flare-uk/

Feel free to post comments or questions!

LA4STEM Project Description

The LA4STEM project will review the potential of Learning Analytics in higher education, specifically in STEM, and with an emphasis on supporting disabled students and facilitating accessibility enhancements.

Learning Analytics is defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs. Learning analytics is a “hot topic” in eLearning and was the second headline topic in the 2–3 year time-to-adoption section of the 2012 NMC Horizon Report on Higher Education[1]:

“The larger promise of learning analytics, however, is that when correctly applied and interpreted, it will enable faculty to more precisely understand students’ learning needs and to tailor instruction appropriately far more accurately and far sooner than is possible today.”

The LA4STEM project will specifically explore the following STEM application areas for Learning Analytics:

  • Student support (with an emphasis on support for disabled students)
  • Tutor support (facilitating their support of disabled learners)
  • Module review (identifying accessibility enhancements)
  • Retention and attainment (focussing on where disabled students appear disadvantaged)
  • Learning analytics in remote labs (because of their potential for enhancing access to STEM)
  • Recommender systems (the timely direction of disabled students to support and study skills aids; including scaffolding of STEM specific learning activities)

A key output of the project will be an external funding bid for a larger-scale collaborative project. The work of LA4STEM will inform pilots in this project. Provided the envisaged benefits are confirmed, this should lead to enterprise-level implementation within the OU and across HE.

The findings of the LA4STEM project will be disseminated, firstly throughout the Science and MCT faculties, then to the wider university. External dissemination will highlight the OU’s lead in this field.


[1] Johnson, L., Adams, S. and Cummins, M. (2012) The NMC Horizon Report: 2012 Higher Education Edition. The New Media Consortium, Austin, Texas: http://www.nmc.org/publications/horizon-report-2012-higher-ed-edition

A role for Learner Analytics in identifying intervention points for accessibility improvement

With 3 colleagues from other UK universities I have just had the following paper accepted for W4A2012:

A Challenge to Web Accessibility Metrics and Guidelines: Putting People and Processes First

[By Martyn Cooper, David Sloan, Brian Kelly, Sarah Lewthwaite]

In this paper we argue that web accessibility guidelines such as WCAG 2.0 are insufficient to ensure accessibility is achieved in any web-based resource or service. A key deficiency is the lack of an appropriate level of understanding of the users, their needs and behaviours. In a higher education context, one approach to address this that I am currently exploring is based on Learner Analytics. This blog post expands on the ideas floated in the above paper and invites comment on them. I am just beginning to draft a project proposal to fund a pilot project exploring these ideas with real data and real students in their learning contexts. If this project might be of interest then please e-mail me: m.cooper@open.ac.uk.

The 1st International Conference on Learning Analytics & Knowledge defined Learner Analytics as:

… the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.

The Open University is currently hosting various projects exploring how such approaches may be adopted to enhance its educational offering and support an ethos of continued improvement. A survey report on this area, Ferguson, R. (2012). “The State Of Learning Analytics in 2012: A Review and Future Challenges.” Technical Report KMI-12-01, Knowledge Media Institute, The Open University, UK, is freely available at:
http://kmi.open.ac.uk/publications/techreport/kmi-12-01.

But how can Learner Analytics be used to enhance accessibility?

A possible starting point was outlined in the above cited W4A2012 paper and is quoted here:

One area of interest is withdrawals – e.g. when students stop study before completion of a module towards a degree. Such “drop-outs” are a high-stakes issue for universities, because they form part of the assessment of the quality of their teaching, which in turn impacts on levels of funding from government and from student-paid fees.

It is envisaged that, with a learner analytics approach, it will be possible to map, for the whole student body, at what points on paths of study withdrawals occur. It will further be possible to analyse this data, comparing withdrawals by disabled students in particular with the student body in general. It is worth noting that the Open University is a large institution with more than 200,000 students, more than 12,000 of whom declare a disability. Hence there is a reasonable chance that data about the point of withdrawal across the educational context will reveal something of significance relating to that context, over and above the more random distribution of withdrawals for non-study-related issues such as health problems and family circumstances.

An analysis has been made of completion and pass rates on all undergraduate modules presented in 2010-11 with a minimum of 10 students who had declared at least one disability (164 modules). The differences in module completion and pass rates between disabled students and non-disabled students fall in the range [-33% to +29%]. On the majority of modules (67%), disabled students fail to complete, or do not pass, in a greater proportion than non-disabled students. If this data were routinely reviewed in a learner analytics approach, investigations could be triggered into which factors in module design might be leading to poor completion and pass rates for disabled students. Further, what are the factors in the modules where disabled students are doing as well as, or better than, the non-disabled members of their cohort? The reasons could be diverse, but might include issues of accessibility at the teaching and learning level, or at the technical level of how the teaching and learning is mediated, which is increasingly web-based. The learner analytics here only indicate where there might be a problem, not what it is.
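
To make this concrete, here is a minimal sketch (in Python/pandas) of the kind of routine review I have in mind. It is illustrative only: the DataFrame layout, the 10-student minimum and the flagging threshold are my own assumptions for the example, not the analysis actually run at the OU.

```python
# Illustrative sketch only: flag modules where disabled students' completion or
# pass rate trails that of their non-disabled peers. The input layout -- one row
# per student-module result with columns "module", "disability_declared" (bool),
# "completed" (bool), "passed" (bool) -- is assumed for the example.
import pandas as pd

def flag_modules(results: pd.DataFrame,
                 min_disabled: int = 10,
                 gap_threshold: float = 0.05) -> pd.DataFrame:
    """Return modules where the completion or pass gap exceeds gap_threshold."""
    rows = []
    for module, grp in results.groupby("module"):
        disabled = grp[grp["disability_declared"]]
        others = grp[~grp["disability_declared"]]
        if len(disabled) < min_disabled or others.empty:
            continue  # too few declared-disabled students to compare
        rows.append({
            "module": module,
            "completion_gap": others["completed"].mean() - disabled["completed"].mean(),
            "pass_gap": others["passed"].mean() - disabled["passed"].mean(),
        })
    gaps = pd.DataFrame(rows)
    if gaps.empty:
        return gaps
    # A large positive gap is only a trigger for investigation, not a diagnosis.
    return gaps[(gaps["completion_gap"] > gap_threshold) |
                (gaps["pass_gap"] > gap_threshold)]
```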

Now, the above approach is only possible in a university context because data is held on which students identify themselves as having a disability. This would not normally be the case with most public websites, for example. However, knowing which students have a disability says little about what their needs and preferences might be in interacting with, for example, eLearning resources. The university does collect slightly more refined data, but this is to meet the requirements of national statistics agencies. It is coarse-grained and based on medical-model classifications of disability rather than on functional requirements with respect to interaction with computer environments. To illustrate, the most recent data for all OU disabled students is shown here:

Code  Category            Students (Feb 2012)
0     –                    1
1     Sight                1376
2     Hearing              978
3     Mobility             3938
4     Manual Skills        2522
5     Speech               441
6     Dyslexia             3530
7     Mental Health        4755
8     Personal Care        862
9     Fatigue/Pain         5486
10    Other                2041
11    Unseen Disability    2177
12    Autistic Spectrum    325
      Total                28432

[Source: OU internal data. N.B. some students declare more than one disability; the actual total of disabled students currently registered with the OU is 13,884.]

Now we can infer that students in the Sight, Hearing, Manual Skills, and Dyslexia categories are likely to have web access needs; those in the other categories may too, but we do not know anything about the detail of their needs. Further, the access needs will be diverse within any given category. Probably the most effective way of addressing this problem is asking the students to create for themselves, on initial registration, a profile of their detailed access needs and approaches. One candidate standard to base such a set of profiles on is AccessForAll 3.0, which is currently nearing finalisation within the IMS Accessibility Working Group. Note that I will blog about the AccessForAll 3.0 specification when it goes to public draft, which I am informed is imminent.

Suffice it to say, for this discussion, a learner’s needs and preferences with respect to how they can best interact with digital resources are represented using the IMS GLC Access For All Personal Needs and Preferences (PNP) v3.0 specification. This specification includes descriptors for all envisaged access approaches and can be encoded in a variety of ways; in the application considered here, most likely as user profiles made up of sets of RDF triples as defined by the vocabulary of the specification. We can set aside the “under the bonnet” discussion for now. All we need to know is that, from the student perspective, they can complete (once and for all, but amend if necessary) a web form detailing their access needs. The university then has this information, mapped to the PI (Personal Identification) number of each student that does so. Thus, for any Learner Analytics approach, we now know not just which students have a disability but specifically the nature of their access needs and preferences. These profiles could also be used for other purposes, such as personalisation, managing alternative formats, and quality assurance of services to disabled students, but I will not discuss those here.
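
For readers who like to see what such a profile might look like "under the bonnet", here is a toy sketch using Python's rdflib. The namespace and property names are placeholders I have invented for illustration; a real implementation would use the terms defined in the published PNP v3.0 vocabulary, keyed to the student's PI number much as shown.

```python
# Toy sketch: a student's needs-and-preferences profile as RDF triples keyed by
# PI number. The namespace and property names are placeholders of my own, NOT
# the actual IMS Access For All PNP v3.0 vocabulary.
from rdflib import Graph, Literal, Namespace

PNP = Namespace("http://example.org/pnp#")          # placeholder vocabulary
STUDENT = Namespace("http://example.org/student/")  # identified by PI number

g = Graph()
student = STUDENT["X1234567"]  # hypothetical PI number
g.add((student, PNP.prefersTextAlternativesForAudio, Literal(True)))
g.add((student, PNP.requiresKeyboardOnlyInteraction, Literal(True)))
g.add((student, PNP.preferredFontSize, Literal("x-large")))

print(g.serialize(format="turtle"))
```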

Another type of information that is being collected for Learner Analytics purposes is “Click Rate”. This is generated from the automatic monitoring of the frequency of clicks of individual students on all learning resources on the VLE. This gives a reasonable measure of what resources each student accessed, for what period and how actively they interacted with them. This information is stored against PI number for each student.
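
As a rough illustration of how such raw click logs might be turned into per-student, per-resource activity measures, here is a hedged sketch. The log layout (one row per click, with "pi", "resource" and "timestamp" columns) is an assumption for the example, not the OU's actual VLE schema.

```python
# Illustrative aggregation of raw VLE click logs into per-student, per-resource
# activity measures: how often, and over what period, each resource was used.
import pandas as pd

def activity_summary(clicks: pd.DataFrame) -> pd.DataFrame:
    clicks = clicks.copy()
    clicks["timestamp"] = pd.to_datetime(clicks["timestamp"])
    summary = clicks.groupby(["pi", "resource"]).agg(
        clicks=("timestamp", "size"),       # how actively they interacted
        first_access=("timestamp", "min"),
        last_access=("timestamp", "max"),
    )
    # Rough "for what period" measure: span between first and last access.
    summary["days_active"] = (summary["last_access"] - summary["first_access"]).dt.days
    return summary.reset_index()
```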

Now, in the 3 sets of data described above we have some powerful tools to assess the actual performance and attainment of disabled students compared with their non-disabled peers. Where there appears to be a disparity, we can analyse whether web accessibility is likely to be a key factor. If so, targeted remedial action can be instigated to improve accessibility. Further, this accessibility improvement is strategically focused where it will have the greatest impact on student learning and attainment. This makes best use of the limited resources and staff expertise available to address accessibility issues.

In summary I remind you what the 3 sets of data here are:

  1. Information about the students
    Disability flag, disability type, and access needs and preferences profiles completed by the students
  2. Progression and attainment information
    Student module pass rates, grades, and withdrawal data
  3. Information about activity in the VLE
    Information about individual student interaction levels with all specific learning resources
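
To illustrate what "ready and simultaneous access" to these 3 sets might enable, here is one deliberately simplified sketch: join them on PI number and surface disabled students whose module outcome and VLE activity both lag the norm, as candidates for an accessibility-focused investigation. All column names are assumptions for the sake of the example.

```python
# Simplified sketch: join the three data sets on PI number and flag disabled
# students who did not pass a module AND show below-median VLE activity on it.
# Column names ("pi", "module", "passed", "disability_declared", "clicks") are
# assumptions for illustration, not the OU's actual schemas.
import pandas as pd

def candidate_investigations(profiles: pd.DataFrame,   # 1. student/disability/PNP data
                             outcomes: pd.DataFrame,   # 2. progression and attainment
                             activity: pd.DataFrame    # 3. VLE interaction levels
                             ) -> pd.DataFrame:
    merged = (outcomes
              .merge(profiles, on="pi", how="left")
              .merge(activity, on=["pi", "module"], how="left"))
    median_clicks = merged.groupby("module")["clicks"].transform("median")
    disabled = merged["disability_declared"].fillna(False).astype(bool)
    not_passed = ~merged["passed"].fillna(False).astype(bool)
    low_activity = merged["clicks"] < median_clicks
    # The flags point to where a problem might be, not what the problem is.
    return merged[disabled & not_passed & low_activity]
```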

Given ready and simultaneous access to this data, it will be possible to construct a wide range of specific accessibility investigations that will identify issues which, when addressed, will have real impact on the learning of disabled students. What is more, these will be based on actual student interactions with the resources and not just measures of accessibility focussed on the properties of the resources. This approach directly takes into account user experience and context, both of which are excluded in approaches based solely on evaluation against WCAG 2.0.

Comments, questions, discussion and suggestions of collaboration are all welcome.