A role for Learner Analytics in identifying intervention points for accessibility improvement

With three colleagues from other UK universities, I have just had the following paper accepted for W4A2012:

A Challenge to Web Accessibility Metrics and Guidelines: Putting People and Processes First

[By Martyn Cooper, David Sloan, Brian Kelly, Sarah Lewthwaite]

In this paper we argue that web accessibility guidelines such as WCAG 2.0 are insufficient to ensure that accessibility is achieved in any web-based resource or service. A key deficiency is the lack of an appropriate level of understanding of the users, their needs and behaviours. In a higher education context, one approach to address this that I am currently exploring is based on Learner Analytics. This blog post expands on the ideas floated in the above paper and invites comment on them. I am just beginning to draft a project proposal to fund a pilot project exploring these ideas with real data and real students in their learning contexts. If this project might be of interest then please e-mail me: m.cooper@open.ac.uk.

The 1st International Conference on Learning Analytics & Knowledge defined Learner Analytics as:

… the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.

The Open University is currently hosting various projects exploring how such approaches may be adopted to enhance its educational offering and support an ethos of continuous improvement. A survey report on this area (Ferguson, R. (2012). “The State Of Learning Analytics in 2012: A Review and Future Challenges.” Technical Report KMI-12-01, Knowledge Media Institute, The Open University, UK) is freely available at:
http://kmi.open.ac.uk/publications/techreport/kmi-12-01.

But how can Learner Analytics be used to enhance accessibility?

A possible starting point was outlined in the above cited W4A2012 paper and is quoted here:

One area of interest is withdrawals – e.g. when students stop study before completion of a module towards a degree. Such “drop-outs” are a high-stakes issue for universities, because they form part of the assessment of the quality of their teaching, which in turn impacts on levels of funding from government and from student-paid fees.

It is envisaged that, with a learner analytics approach, it will be possible to map, for the whole student body, the points on paths of study at which withdrawals occur. It will further be possible to analyse this data, comparing withdrawals by disabled students in particular with the student body in general. It is worth noting that the Open University is a large institution with more than 200,000 students, more than 12,000 of whom declare a disability. Hence there is a reasonable chance that data about the point of withdrawal across the educational context will reveal something of significance relating to that context, over and above the more random distribution of withdrawals for non-study-related issues such as health problems and family circumstances.

An analysis has been made of completion and pass rates on all undergraduate modules presented in 2010-11 with a minimum of 10 students who had declared at least one disability (164 modules). The differences in module completion and pass rates between disabled and non-disabled students fall in the range [-33% to +29%]. On the majority of modules (67%), disabled students fail to complete, or do not pass, in a greater proportion than non-disabled students. If this data were routinely reviewed in a learner analytics approach, investigations could be triggered into which factors in module design might be leading to poor completion and pass rates for disabled students. Further, what are the factors in the modules where disabled students are doing as well as, or better than, the non-disabled members of their cohort? The reasons could be diverse, but might include issues of accessibility at the teaching and learning level, or at the technical level of how the teaching and learning is mediated, which is increasingly web-based. The learner analytics here only indicate where there might be a problem, not what it is.
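
To make the idea of a routine review concrete, the following is a minimal sketch of how modules with a large completion/pass-rate gap might be flagged automatically. The field names, module codes, thresholds, and figures are all illustrative assumptions, not OU data or the actual analysis method:

```python
# Illustrative sketch: flag modules where disabled students pass at a
# notably lower rate than non-disabled students. Field names and the
# sample records below are hypothetical, not real OU data.

def rate(passed, enrolled):
    """Pass rate as a percentage (0 if nobody enrolled)."""
    return 100.0 * passed / enrolled if enrolled else 0.0

def flag_modules(modules, threshold=10.0, min_disabled=10):
    """Return (module code, gap) pairs where the disabled-student pass rate
    trails the non-disabled rate by more than `threshold` percentage points,
    considering only modules with at least `min_disabled` disabled students."""
    flagged = []
    for m in modules:
        if m["disabled_enrolled"] < min_disabled:
            continue  # mirror the 10-student minimum used in the analysis
        gap = (rate(m["nondisabled_passed"], m["nondisabled_enrolled"])
               - rate(m["disabled_passed"], m["disabled_enrolled"]))
        if gap > threshold:
            flagged.append((m["code"], round(gap, 1)))
    return flagged

modules = [
    {"code": "A101", "disabled_enrolled": 40, "disabled_passed": 20,
     "nondisabled_enrolled": 400, "nondisabled_passed": 300},
    {"code": "B202", "disabled_enrolled": 25, "disabled_passed": 20,
     "nondisabled_enrolled": 300, "nondisabled_passed": 240},
    {"code": "C303", "disabled_enrolled": 5, "disabled_passed": 1,
     "nondisabled_enrolled": 100, "nondisabled_passed": 80},  # too few to compare
]

print(flag_modules(modules))  # [('A101', 25.0)] - 75% vs 50% pass rate
```

A flagged module would then trigger the human investigation described above; the analytics only locate the problem, they do not diagnose it.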

Now, the above approach is only possible in a university context because data is held on which students identify themselves as having a disability. This would not normally be the case with most public websites, for example. However, knowing which students have a disability says little about what their needs and preferences might be in interacting with eLearning resources. The university does collect slightly more refined data, but this is to meet the requirements of national statistics agencies; it is coarse-grained and based on medical-model classifications of disability rather than functional requirements with respect to interaction with computer environments. To illustrate, the most recent data for all OU disabled students is shown here:

Category                 Feb 2012
0                               1
1   Sight                    1376
2   Hearing                   978
3   Mobility                 3938
4   Manual Skills            2522
5   Speech                    441
6   Dyslexia                 3530
7   Mental Health            4755
8   Personal Care             862
9   Fatigue/Pain             5486
10  Other                    2041
11  Unseen Disability        2177
12  Autistic Spectrum         325
Total                       28432

[Source: OU internal data. N.B. some students declare more than one disability; the actual total of disabled students currently registered with the OU is 13,884.]
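
As a quick consistency check on the figures above, the category counts do sum to the stated total of declarations, and the excess over the distinct-student count is explained by multiple declarations:

```python
# Consistency check on the table above: declarations sum to the stated
# total, and exceed the distinct-student count because some students
# declare more than one disability.
declarations = {
    "0": 1, "Sight": 1376, "Hearing": 978, "Mobility": 3938,
    "Manual Skills": 2522, "Speech": 441, "Dyslexia": 3530,
    "Mental Health": 4755, "Personal Care": 862, "Fatigue/Pain": 5486,
    "Other": 2041, "Unseen Disability": 2177, "Autistic Spectrum": 325,
}
total_declarations = sum(declarations.values())
distinct_students = 13884

print(total_declarations)                      # 28432, matching the table
print(total_declarations - distinct_students)  # 14548 repeat declarations
```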

Now, we can infer that students in the Sight, Hearing, Manual Skills, and Dyslexia categories are likely to have web access needs; those in the other categories may too, but we do not know anything about the detail of their needs. Further, the access needs will be diverse within any given category.

Probably the most effective way of addressing this problem is to ask students, on initial registration, to create for themselves a profile of their detailed access needs and approaches. One candidate standard on which to base such a set of profiles is AccessForAll 3.0, which is currently near finalisation within the IMS Accessibility Working Group. (I will blog about the AccessForAll 3.0 specification when it goes to public draft, which I am informed is imminent.) Suffice it to say for this discussion that a learner's needs and preferences with respect to interacting with digital resources are represented using the IMS GLC Access For All Personal Needs and Preferences (PNP) v3.0 specification. This specification includes descriptors for all envisaged access approaches that can be encoded in a variety of ways; probably most likely in the application considered here as user profiles made up of sets of RDF Triples as defined by the vocabulary of the specification. We can set aside the “under the bonnet” discussion for now.

All we need to know is that, from the student perspective, they can complete (once and for all, but amend if necessary) a web form detailing their access needs. The university then holds this information, mapped to the PI (Personal Identification) number of each student who does so. Thus, for any Learner Analytics approach, we know not just which students have a disability but specifically the nature of their access needs and preferences. These profiles could also be used for other purposes, such as personalisation, managing alternative formats, and quality assurance of services to disabled students, but I will not discuss those here.
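
To make the “under the bonnet” idea a little more concrete, here is a purely illustrative sketch of a needs-and-preferences profile held as simple triples keyed by PI number. The predicate names and values are invented placeholders for illustration only; they are not the actual AccessForAll 3.0 PNP vocabulary, and a real implementation would use the terms defined by the specification:

```python
# Purely illustrative: a student's access needs and preferences stored as
# (subject, predicate, object) triples keyed by PI number. The predicate
# names below are invented placeholders, NOT the real AccessForAll 3.0
# PNP vocabulary.

profile = [
    ("student:PI1234567", "pnp:prefersDisplayTransform", "high-contrast"),
    ("student:PI1234567", "pnp:requiresAlternative", "captions"),
    ("student:PI1234567", "pnp:inputMethod", "keyboard-only"),
]

def needs_of(triples, pi):
    """Collect, in sorted order, the (predicate, object) pairs one student declared."""
    return sorted((p, o) for s, p, o in triples if s == f"student:{pi}")

print(needs_of(profile, "PI1234567"))  # lists the three declared needs
```

The point is only that, once such profiles are mapped to PI numbers, they can be queried alongside the other data sets discussed below.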

Another type of information that is being collected for Learner Analytics purposes is “Click Rate”. This is generated by automatically monitoring the frequency of clicks by individual students on all learning resources in the VLE. This gives a reasonable measure of which resources each student accessed, for what period, and how actively they interacted with them. This information is stored against the PI number of each student.
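
How such raw click data might be reduced to a per-student, per-resource interaction measure can be sketched as follows. The log format (PI number, resource, timestamp) is an assumption for illustration, not the actual VLE schema:

```python
# Illustrative sketch: aggregate raw VLE click events into a per-student,
# per-resource interaction count. The (pi, resource, timestamp) log format
# is an assumption, not the actual VLE schema.
from collections import Counter

clicks = [
    ("PI0001", "week1-podcast", "2012-02-01T09:00"),
    ("PI0001", "week1-podcast", "2012-02-01T09:05"),
    ("PI0001", "week2-quiz",    "2012-02-08T10:00"),
    ("PI0002", "week1-podcast", "2012-02-02T20:00"),
]

def interaction_counts(log):
    """Count clicks per (student PI, resource) pair."""
    return Counter((pi, resource) for pi, resource, _ in log)

counts = interaction_counts(clicks)
print(counts[("PI0001", "week1-podcast")])  # 2
```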

Now, in the three sets of data described above, we have some powerful tools to assess the actual performance and attainment of disabled students compared with their non-disabled peers. Where there appears to be a disparity, we can analyse whether web accessibility is likely to be a key factor. If so, targeted remedial action can be instigated to improve accessibility. Further, this accessibility improvement is strategically focused where it will have the greatest impact on student learning and attainment, making best use of the limited resources and staff expertise available to address accessibility issues.

In summary, I remind you what the three sets of data are:

  1. Information about the students
    Disability flag, disability type, and access needs and preferences profiles completed by the students
  2. Progression and attainment information
    Student module pass rates, grades, and withdrawal data
  3. Information about activity in the VLE
    Information about individual student interaction levels with all specific learning resources
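
Joined on PI number, the three data sets above would support queries such as “do students declaring a given access need interact with a particular resource noticeably less than their peers, and did they go on to withdraw?”. A minimal sketch, in which all record shapes, PI numbers, and figures are illustrative assumptions:

```python
# Minimal sketch of joining the three data sets on PI number. All record
# shapes, PI numbers, and figures below are illustrative assumptions.

profiles = {"PI0001": {"needs": {"captions"}},
            "PI0002": {"needs": set()},
            "PI0003": {"needs": set()}}

attainment = {"PI0001": {"completed": False},
              "PI0002": {"completed": True},
              "PI0003": {"completed": True}}

clicks = {("PI0001", "week1-podcast"): 1,
          ("PI0002", "week1-podcast"): 9,
          ("PI0003", "week1-podcast"): 11}

def mean_clicks(resource, pis):
    """Mean click count on a resource over a group of students."""
    values = [clicks.get((pi, resource), 0) for pi in pis]
    return sum(values) / len(values) if values else 0.0

def usage_gap(resource, need):
    """Mean clicks on a resource: students declaring `need` vs everyone else."""
    with_need = [pi for pi, p in profiles.items() if need in p["needs"]]
    without = [pi for pi, p in profiles.items() if need not in p["needs"]]
    return mean_clicks(resource, with_need), mean_clicks(resource, without)

print(usage_gap("week1-podcast", "captions"))  # (1.0, 10.0)

# Cross-checking against attainment: the low-usage student also withdrew.
non_completers = [pi for pi, a in attainment.items() if not a["completed"]]
print(non_completers)  # ['PI0001']
```

A result like this would only indicate where to look, for example at whether the podcast lacked captions; the diagnosis still requires human investigation.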

Given ready and simultaneous access to this data, it will be possible to construct a wide range of specific accessibility investigations that will identify issues which, when addressed, will have real impact on the learning of disabled students. What is more, these will be based on actual student interactions with the resources, and not just on measures of accessibility focussed on the properties of the resources. This approach directly takes into account user experience and context, both of which are excluded in approaches based solely on evaluation against WCAG 2.0.

Comments, questions, discussion and suggestions of collaboration are all welcome.

5 thoughts on “A role for Learner Analytics in identifying intervention points for accessibility improvement”

  1. Hi Martyn – a few comments on your most interesting blog post on Learning Analytics:

    1. Re. reasons given by students and particularly disabled students for withdrawal – I recall some broad discussion – possibly with Mary – about ‘presenting problems/reasons’ given by disabled students for withdrawing e.g. ‘health problems and family circumstances’, possibly masking deeper reasons for withdrawal e.g. disability-related ‘confidence’ issues. We need to have more personal/qualitative insights.

    2. Web form detailing access needs – my experience as an Assessor is that some students register for modules without any input from Learner Support – leading to:
    a. Registering on wrong (for student) modules in terms of Level and/or interest
    b. Registering for too many modules ‘by mistake’
    In the 2 cases I’m referring to, I believe without my intervention and suggested contact with Learner Support, both students would have withdrawn very early.
    I’ve also had cases where student insight into their condition and associated needs has not been sufficient to cover their actual needs eg. student with epilepsy who had serious manual dexterity issues (due to injuries that had occurred over the years due to his seizures), but until I noted how uncomfortably he was sitting and moving and that his study environment exacerbated this and possibly contributed to stress which could then be a factor in provoking a seizure, he had no idea that this i) was a problem ii) could/should be addressed.

    3. There needs to be some information that joins up what happens after the Assessment. Beyond apparently knowing that disabled students who have a DSA do better than disabled students who don’t, and also apparently knowing that disabled students who take up training do better than disabled students who don’t, we have no real insight into the effect/value of the whole DSA process beyond the Access Centre ‘customer care’ sheets about the DSA process in terms of did the Assessor do a ‘good job’ (arrange appointment on time, deliver report on time, when equipment/training was delivered etc.). We know nothing about whether the kit is used and how – not only in the year of the DSA, but in subsequent years – the quality of the training, whether/how NMH was used, whether Region/AL support was sufficient etc.

    All these issues will impact on the disabled student learning experience and go beyond what any analytics will reveal and I’d suggest without these insights resource may be wasted making improvements that may either be insufficient and/or ill-directed.

    Or to put it another way, I’d be most interested to work on any of this and particularly looking at the impact (if any) of my intervention as an Assessor.

  2. Martyn,

    As an author of IMS Access For All 3.0 I need to point out that there are inaccuracies in your report of it. For example:

    “This specification includes descriptors for all envisaged access approaches that can be encoded in a variety of ways; probably most likely in the application considered here as user profiles made up of sets of RDF Triples as defined by the vocabulary of the specification.”

    Is wrong in respect of both RDF and also in its claim that the specification “includes descriptors for all envisaged access approaches” – it does not.

    There may be other inaccuracies but this is enough to make the point. I trust that inaccuracies such as this did not make it into the paper because such high profile exposure may lead people to wrong conclusions about Access For All 3.0 if they did.

    I acknowledge there were no bad intentions here in this blog but as accuracy is important you might like to consider revising the content of this post. I would be very happy to work with you on this with appropriate source attribution.

    Best wishes

    andy
    Andy Heath axelafa.com

    1. Andy,

      Thanks – very helpful – I did not intend to mislead. I did check with Madeline Rothberg (Co-chair IMS Accessibility Working Group) the current status of AfA3.0. She suggested that you were probably 2 months away from public draft. So I decided not to re-visit the detail of the specification until it was published.

      The only point I was trying to make – which I admit I did badly – was that the IMS A4A3.0 PNP specification has an information model that enables the way a disabled person interacts with their computer, and the content it mediates, to be comprehensively described. Further that this model can be represented in RDF which I am suggesting might be the best technical approach in this use case.

      Feel free to provide more detailed information about the IMS A4A3.0 PNP specification here if you wish, or we can all wait until the public draft is available.

      Cheers,

      Martyn
