Archive for the 'Learner Analytics' Category

Open University’s CALRG Conference 2015 – Notes Day 3: #calrg2015

CALRG Annual Conference   17 June 2015

Day 3: Paper Presentations

Jennie Lee Building, Ambient Lab

9:30-9:45 Opening remarks – Canan Blake
  Session V – Chair: Ann Jones
9:45-10:15 Helen Farley

Making the Connection: eLearning and mobile learning for prisoners

  • Project has been running for the last 18 months
  • Prisons are overcrowded
  • Education reduces recidivism
  • When back in the workforce, released prisoners will have to deal with the digital world
  • Australia is at an all-time prison population high of 34,000
  • Increasing number from non-English speaking backgrounds
  • Asylum seekers
  • Aboriginal people are about 4% of the population but about 27% of the prison population
  • So the project puts emphasis on Aboriginal prisoners
  • Prisoners have no access to the internet and limited access to computers
  • Have been offering education into prisons for about 25 years
  • Use a Moodle based Learning Management System and eBook readers
  • Restricted devices because of security concerns – leads to time consuming systems support etc.
  • The prisoners like the dictionaries on the LMS for their Scrabble contests
  • Can now send 2-3 DVDs to the prison educational centre to update server software
  • EEE – eLearning, Empowerment and ???
  • Reverted to hard-copy books because eBook readers were not allowed, but then prison authorities suggested using tablets
  • In 2012, 44% of tertiary students could not access the internet
  • Project Scope
    • – Also now in Victoria and Western Australia (with interest from other states)
    • – Funded by Australian Government
    • – Aboriginal students are half as likely to finish Year 12 school education
    • – Have now been invited into a women’s prison
  • Technology designed to be robust and easily maintainable
  • Important after the project that everything does not just fall over
  • Self marking quizzes, games, etc. but not blogs
  • Now narrowed the spec down to one laptop, one tablet and one notebook
  • Moodle does not run well on the notebooks, so introduced an HTML layer
  • It’s not easy – system and processes are complex to ensure life after the project
  • Now have an off-line authoring environment
  • Course development environment deposits content into a repository – developed a compiler for this so isolation from internet maintained
  • Educational officer can download courses for their institution
  • It’s not just about the technology
  • Looking at English for academic purposes courses – just a 10-week course so OK for prisoners with short sentences
  •  Diplomas:
    • – Arts
    • – Business
    • – Science
  • Looking at incorporating OpenLearn courses
  • Problems:
    • – copyright
    • – prejudice
  • High level endorsement from the University and the Government Authorities
  • Issues getting funding for prisoner-related work – so funding bids are framed around non-internet users
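The off-line authoring pipeline in the notes above – a course development environment that compiles content into a repository, which educational officers then download, keeping prison systems isolated from the internet – could be sketched roughly as follows. All names, the archive format and the manifest layout are my own illustrative assumptions, not the project’s actual tooling:

```python
import hashlib
import json
import zipfile
from pathlib import Path

def compile_course(course_dir: str, out_path: str) -> dict:
    """Package a course directory into a self-contained archive.

    Returns a manifest (file list + SHA-256 checksums) that an education
    officer's server could verify before installing, so course updates can
    travel on physical media and no internet connection is ever required.
    """
    course = Path(course_dir)
    manifest = {"course": course.name, "files": {}}
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(course.rglob("*")):
            if f.is_file():
                rel = str(f.relative_to(course))
                data = f.read_bytes()
                # Checksum each file so tampering or corruption in transit
                # (e.g. on a DVD sent to the prison) can be detected.
                manifest["files"][rel] = hashlib.sha256(data).hexdigest()
                zf.writestr(rel, data)
        zf.writestr("manifest.json", json.dumps(manifest, indent=2))
    return manifest
```

The design point the talk stresses is the isolation: the “compiler” produces a complete, verifiable artefact, so nothing on the prison side ever needs to reach out to the internet.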

Brief question time 

10:15-10:40 Anne Pike

What makes the difference? Understanding the interactions and experiences of ‘at risk’ learners

[This presentation not blogged] 

10:40-11:05 Annie Bryan and Lisette Toetenel

Designing for inclusion: Supporting disabled students at the OU

  • Why design for inclusion?
    • – OU’s mission
    • – Equality Act (2010)
    • – Increasing number and proportion of disabled students (>18,000, 14% in 2014)
  • Legislation is a limited driver
  • Beginning of a journey
  • Increasing online curriculum
  • Online delivery may or may not be more accessible – e.g. OU Live is problematic for many disabled students
  • Satisfaction and pass rates are lower for disabled students
  • Learning Design – should have a big impact if it takes accessibility into account (provides visualisations that help analysis of access issues)
  • SeGA (Securing Greater Accessibility)  – a cross university programme to embed good practice for inclusion in business as usual processes
  • Method:
    • – looked at all modules (200+) with accessibility information
    • – Analysed SeAM (end of module survey) data
  • Findings:
    • – modules in development might not yet have considered the accessibility issues
    • 49 modules had Accessibility Guides
      • – only 29 Accessibility Guides had specific information for the module concerned
    • SeAM analysis
      • – students value several key things:
        • a supportive tutor
        • special exam arrangements
        • comb bound books
    • Module materials need to be provided in a wide range of formats
    • Learner needs must be anticipated in the learning design
      • – This aligns to the (mostly American) term of Universal Design
    • It is difficult to measure how the increasing number of disabled students is impacting on practice [Seale 1996]

Brief question time 

11:05-11:30    TEA/COFFEE
   Session VI – Chair: Doug Clow
11:30-11:55 Martyn Cooper

Learning Analytics and Accessibility – what can be done and pragmatic considerations

[I can’t live blog my own presentation but here is a link to the slides instead]

Learning Analytics and Accessibility: http://www.slideshare.net/martyncooper/learning-analytics-and-accessibility-calrg-2015-mc

11:55-12:20 Jenna Mittelmeier, Bart Rienties, Denise Whitelock

The Role of Culture in Student Contributions to an Online Group Learning Activity

[This presentation was not blogged as I had to be elsewhere on campus]

12:20-12:45 Ann Jones

Informal language learning with mobile technologies: reflections on three recent studies

[This presentation was not blogged as I had to be elsewhere on campus]

12:45-13:10 Ann Grand, Richard Holliman, Helen Donelan, Peter Devine

Linking research and practice: the evolution of “the snakes and ladders of social media”

  • Defining the problem – what do we mean by engaged research?
  • What does “public” mean?
  • A definition that applies across the institution.
  • Used the EDGE tool to identify the issues (a categorization of levels of engagement)
  • Idea of an Open Research University (Profs. Eileen Scanlon and Martin Weller in IET)
  • People start with a blank piece of paper
  • Open/Digital/Engaged – Venn Diagram
  • Do social media work for researchers?
  • Should they work for researchers?
    • The answers to these differ across the university – some parts have no digital presence at all!!!
  • There are so many social media tools
  • Describe an activity as public engagement with research
    • – Very few units identified social-media as part of this
  • Ideal types:
    • – The fully wired
    • – The unconvinced
    • – The experimenter
  • Throw away remark – “Let’s do a board game” -> The snakes and ladders of social-media
    • You sit around, facing each other and talk and play
  • Did a blog post about the game and then others got interested in it
  • Used in PhD student workshops; public engagement ambassadors workshop; research Councils
  • The game is a conversation tool
  • Also work across a team to produce online resources – e.g. what is a digital identity?
  • Research and Scholarship Unit were not on the Web!

Brief question time 

13:10-14:00    LUNCH
   Session VII – Chair: Simon Cross
14:00-14:25 Katy Jordan

Characterising the structure of academics’ personal networks on academic social networking sites and Twitter

[This presentation was not blogged because it is not possible to convey the social-network diagrams in text.]

14:25-14:50 Anne Adams and Gill Clough

The E-assessment burger:  Supporting the before and after in e-assessment systems

[I was not present for this presentation.]

14:50-15:15 Tim Coughlan

Creating Structures for Engagement with Open Knowledge: Interpreting the links between art and location in the ArtMaps project

[I was not present for this presentation.]

15:15-15:40 Lucia Rapanotti, Canan Blake​, Jon Hall

Enriched student context in online professional learning

  • Not a research project but a perspective on evaluating a pedagogical approach
  • Context:
    • – Postgraduate professional computing courses
    • – Students already in a rich context – let’s make use of it
    • – Part of the evaluation of the learning occurs within their professional context
    • – Module on Software Development M813 (there are two other modules as well)
    • – Wide range of learning theories, principles, techniques for software development
    • – Learning organised around fundamental software lifecycle
  • [Diagrammatic overview of the course]
  • M813 Evaluation Questions:
    •  – To what extent does the rich context matter?
  • Use specialised forums
  • Data from formal assessment
  • Tracking the leading-edge
  • Organisation and Scope of the Forum:
    • – Students discuss privately with their tutor the sort of context they are in
    • – Determine if their proposed project is suitable
    • – Narrative ideas
  • Analysis of the narratives:
    • – We now know the sort of enterprises the students come from (this data was not previously collected)
    • – Many diverse sectors covered
    • – 90% use real organisations with 80% being private
    • – Usually students choose their own projects
    • – 62% pass but completion rate high – pass rate not as high as wished for but consistent with similar modules
    • Qualitative data: “Loved the project”; “Helped me to identify sloppy practices”; …
    • One negative one: “Useless for me”
    • “This is good in theory but it does not work in practice” – useful feedback to revise the course
  • Does the new assessment model work?
    • How is online learning working in this context?
    • Consider the collaborative learning aspects of the course
    • 100s of messages in dozens of threads
    • Looking for evidence that they are making use of their professional learning in their group forum interactions
    • Labour intensive research but no better (automated) way
    • [Examples from the forums given]
    • “Are real programmers a dying breed?”
    • Initial categories of interactions:
      • Greetings
      • Like or don’t like
      • Links
      • Professional experience or work context
      • Discussion of justified opinions
    • Going to follow Kerchner’s framework to look at affordances and other aspects in future work
    • Big data but qualitative
    • Much more to do! – But need to provide evidence that the modules are effective.
    • Appeal for help from the “experts” in the audience
    • Discussion –
      • Could you use NLP methods to code the forums?
      • Coding (of forum posts) is a negotiated discourse
      • A programme can go wrong very quickly
      • Forum threads evolve
      • Even in human coding need to work with more than one person doing the coding to get accurate/consistent coding
      • Identify critical incidents (automatically?) then analyse in depth by human assessment

[End of conference presentations]

15:40-16:00 TEA/COFFEE and Close of the Conference

Notes from CALRG Conference 2014

The Computers and Learning Research Group (CALRG) at the Open University (UK) has an annual conference.  Today and tomorrow is the 35th such conference.  This post is my notes on the presentations I attend (unfortunately I cannot make them all).  There is a conference Twitter feed with hashtag #calrg14.  A temporary conference website is at: http://sites.google.com/site/calrg14/ with a link to the programme.

CALRG Annual Conference Day One – June 10 2014

Discussant: Prof Rupert Wegerif, Exeter University

9:30-9:45 OPENING NOTES

Patrick McAndrew (Director of IET) – Own experience over 16 years as an induction to the university.  A catch-up point.  Today the theme is mostly on Open Education.

Session I – Chair: Doug Clow

9:45-10:15 MOOCs, Learning Analytics and Higher Education: Perspectives on a recent study leave visit to the USA
Eileen Scanlon

  • The Americans sometimes slightly hallucinate our experience of Ed Tech
  • First stop – ACM Conference on learning at scale (single track)
  • Best thing – keynote from Chris Dede, of Harvard – “New wine in new bottles”
    • It is not about the platform but what you do on the platform
    • Use of metaphors from film
    • Going big requires thinking small
    • Micro-genetic studies of online learning
    • People had forgotten all the learning science previously done
  • Distance Learning, OERs and MOOCs (Eileen’s presentation at conference)
    • The Open Science Lab
    • Edinburgh experience – professional development of surgeons
  • Next stop Berkeley (Invitational Summit of 150 people)
    • Impact on residential campus based universities
    • Relying on schools of education to measure student learning
    • Reflection on EdX platform
      • Transforming the institution (MIT in this case)
      • Learn about learning
      • E.g. required physics course – group learning – lot of use of online assessment
      • Comparison of performance in MOOCs of those taking residential course versus those not
      • Drown in information if you Google assessment of EdX
    • Simon initiative at Carnegie Mellon
      • AI and Cognitive Tutors
      • Broader than the institution
      • Global learning council
      • Spin out company called “Cognitive Tutors”
      • Individualized instruction seen as gold standard for education
  • Then visited Stanford
    • The Lytics Lab (Learning Analytics)
      • Using learning science with open educational delivery
      • Moving from fragmented approach to systematic improvement of this type of pedagogy
      • CSCL (conversation) -> MOOC space
      • Scale of work in Stanford on MOOCs is staggering
      • Still individual academic driven
  • Then various other conferences
  • Future Learn Academic Network
    • Originally 26 partners now expanding and going more global
  • ESRC proposal on future of higher education
    • Partners: OU, University of Edinburgh, Carnegie Mellon, Oxford University

10:15-10:45 Squaring the open circle: resolving the iron triangle and the interaction equivalence theorem
Andy Lane

  • Visual Models
    • How visualization can help with understanding/sense making
    • They can equally conceal
    • The Iron Triangle – sides: Scale, Quality, Cost
      • If one dimension changed significantly it will compromise others
    • John Daniel – open distance learning could break the iron triangle
    • Interaction Equivalence Theorem (EQuiv)
    • Supply-side vs demand-side (what about the students?)
    • Adding a circle of success to the iron triangle
    • A student centred iron triangle
      • motivation, preparation, organisation
    • A student centred Interaction Engagement Equivalence Theorem

10:45-11:15 Exploring digital scholarship in the context of openness and engagement
Richard Holliman, Ann Grand, Anne Adams and Trevor Collins

See: http://open.ac.uk/blogs/per

  • Public engagement with a research mandate
  • Research councils fund catalysts
  • An “ecology” of openness
  • Action Research [Lewin 1946]
  • The Edge tool
  • How do we find ways of assessing where staff are and then support them?
  • Research Questions
    • What methods and technologies are researchers using to: make research public, make public research, enable the public to collaboratively research (citizen science)?
    • how do researchers conceptualize the role of students?
  • Scholarship reconsidered
    • discovery
    • integration
    • application
    • teaching
  • Awareness / Responsibility / Sustainability
  • Institutional strategy for open, digital and engaged scholarship
    • What should we try to change?
  • Types of researcher: the fully wired; the dabbler; the brave trier; the unimpressed
  • “The Open Scholar is someone who makes their intellectual projects digitally visible …”
  • Policies / Procedures / Practices

[The remaining sessions of Day 1 I was not able to attend but the programme is included here]

Session II – Chair: Ann Jones

11:30-11:55 The OpenupEd quality label: benchmarks for MOOCs
Jon Rosewell

11:55-12:20 From theory to practice: can openness improve the quality of OER research?
Rebecca Pitt, Beatriz de-los-Arcos, Rob Farrow

12:20-12:45 Open Research into Open Education: The Role of Mapping and Curation
Rob Farrow

12:45-13:10 Strategies for Successful MOOC learning: The Voice from the World Record Breaker
Bernard Nkuyubwatsi
Session III – Chair: Rebecca Ferguson

14:00-14:25 The role of feedback in the under-attainment of ethnic minority students: Evidence from distance education
John T.E. Richardson, Bethany Alden Rivers and Denise Whitelock
14:25-14:50 Evaluating serious experiences in games
Jo (Ioanna) Iacovides
14:50-15:15 Social media for informal minority language learning: exploring Welsh learners’ practices
Ann Jones
15:15-15:30 TEA/COFFEE
Session IV – Chair: Inge de Waard
15:30-15:55 What students want: designing learning to optimise engagement in digital literacy skills development
Ingrid Nix and Marion Hall
15:55-16:20 Recording online synchronous tutorials to support learning
Pauline Bloss, Elisabeth Clifford, Chris Niblett and Elke St.John
16:20-16:45 Open Education needs Education for Openness: a dialogic theory of education for the Internet Age
Rupert Wegerif
16:45-17:00 Discussant – Rupert Wegerif
and CLOSE

CALRG Annual Conference – Day 2 – June 11 2014

Session V – Chair: Mark Gaved

9:40-10:05 ‘nQuire-it’: The design and evaluation of a mission-based web platform for citizen inquiry science learning
Christothea Herodotou, Eloy Villasclaras- Fernández , Mike Sharples

Notes from this presentation lost in the ether 😦

10:05-10:30 3D Virtual Geology Field Trips: Opportunities and Limitations
Shailey Minocha, Sarah-Jane Davies, Brian Richardson and Tom Argles

  • Can do things unable to do in a real field trip – e.g. drape maps over mountains, see geological cross sections
  • Uses Unity 3D Game Engine to build a 10km x 10km area, mapping and imaging the real world (around Skiddaw, England)
  • Can pick up rocks and examine under microscope
  • Includes a chat facility for tutor group communication
  • Leave these tools out of the application so as not to compromise the immersion
  • Addresses accessibility with transcripts and full keyboard only access
  • Able to “fly” and “teleport” (on a real field trip a lot of time wasted travelling between sites)
  • Avatar based environment
  • Students use a paper based notebook as they would in the field
  • Integrate the virtual microscope (existing facility) but now contextualized learning
  • Cloud server can handle up to 500 students at one time

10:30-10:55 Juxtalearn: From Practice into Practice
Anne Adams and Gill Clough

  • Large EU project
  • Driver – not enough taking science and technology at school – employment implications
  • Science and Technology engagement through “creative performance” and “reflective learning”
  • Threshold concept (TC)
    • Where students find challenges
    • When they get it it is transformative
    • Irreversible – not readily forgotten
    • Integrative – brings concepts together
  • Learning Pathways and Threshold Concepts (different ways from introduction of concept to internalisation of it)
  • Develop understanding through creative video making
  • Tricky Topic Tool
    • Teachers identify tricky topic
    • Teachers create an example
    • Teachers write down student problems
    • Teachers fill in Taxonomy (linked to student problems)
      • e.g. terminology, intuitive beliefs, incomplete pre-knowledge, …
  • Taxonomy scaffolds quiz creation
    • Tool to facilitate this
    • Integrates detailed feedback to the student
  • Demo

Session VI – Chair: Anne Adams

11:15-11:40 Citizen Inquiry: From rocks to clouds
Maria Aristeidou, Eileen Scanlon, Mike Sharples

  • Citizen Science + Inquiry based Learning -> Citizen Inquiry
  • Inquiring – Rock Hunters (Initial Study)
    • 24 participants
    • 12 rock investigations
    • discussion and feedback on chat and forums
    • Data collection – questionnaires, System Usability Scale [John Brooke, 1986], …
  • [Note taking interrupted]

11:40-12:05 Imagining TM351 – Virtual Machines and Interactive Notebooks
Tony Hirst

  • TM351 – New Level 3 30 point module on data
  • Two new things:

1. Virtual machines (to overcome the diversity of machines being used by students)

    • Interfaces increasingly browser based
    • VirtualBox installed on the student’s machine and the browser used as interface
    • Virtual machine can be on cloud server – then can use on a tablet

2. Notebook Computing

  • Literate programming / reproducible code or research
  • Code should be able to be read as an essay (self-documenting) – reads well to a human and is executable by the machine
  • Can’t reproduce data analysis from traditional academic papers – reproducible research includes the tools to enable this
  • Using IPython
  • A counterpart to spreadsheets
  • Task orientated productivity tools
  • Cells
    • write text
    • uses “Markdown”, a simple text-based mark-up
    • other cells contain python code
    • e.g. the software creates the table – avoids errors in production and editing
    • similarly with maps and paths
  • IPython server in VM – interface in browser
  • Exploring using OpenDesignStudio so students can share and critique each others code in executable form (see: http://design.open.ac.uk/atelier-d/cdi1.htm)
  • Example shown
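As a toy illustration of the “software creates the table” point above – in a notebook, a code cell generates the table from the data, so production and editing errors cannot creep in. This is my own sketch (the data is invented), not the TM351 materials:

```python
def markdown_table(headers, rows):
    """Render data as a Markdown table, as a notebook code cell might,
    so the table is regenerated from the data rather than hand-edited."""
    lines = ["| " + " | ".join(headers) + " |",
             "| " + " | ".join("---" for _ in headers) + " |"]
    for row in rows:
        lines.append("| " + " | ".join(str(c) for c in row) + " |")
    return "\n".join(lines)

# Hypothetical figures, purely for illustration
print(markdown_table(["Module", "Students"], [("TM351", 200), ("M813", 150)]))
```

Rerun the cell after the data changes and the rendered table is always consistent with it – the same property that makes notebook-based analyses reproducible.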

 

12:05-12:30 MASELTOV – mobile incidental learning services to support language learning and the social inclusion of recent immigrants
Agnes Kukulska-Hulme, Eileen Scanlon, Ann Jones, Mark Gaved

  •  Using smart phones to support language learning
  • Addressing those with low educational level and from different culture
  • Incidental learning approach
  • MApp: a range of services
    • Field local mapping
    • Social network
    • Information resources
    • Translation
    • Navigation guide
    • Language learning
    • Serious game
  • These are separate apps but integrated in the platform
  • High penetration of smart phone among target audience
  • Technology uncertainty period
    • Many purchase phone ahead of travel
    • Android phones most popular
    • May have multiple phones
    • Seek out free WiFi
    • Word of mouth expertise highly valued
  • How do we enable transition from problem solving to reflective learning?
    • relating immediate situation to broader context
    • Feedback and progress indicators
      • Study planning and goal setting
      • Indicating completion
      • Supporting sense of community
      • Building confidence
      • Gamified approach
      • Quizzes
  • What evidence that this approach to language learning is effective?
  • Are there clusters of tools use?
  • Demo

12:30-12:55 Knowledge Transfer Partnership: Booktrust and the Open University
Natalia Kucirkova, Karen Littleton, Teresa Cremin and Laura Venning

  • Ongoing project started this year
  • KTP-objectives:
    • Extending book trust work on promoting reading for pleasure
    • Contribute to digital literacy
    • New knowledge and understanding of digital technologies and the opportunities they provide
  • Synergy of two organisations
  • Looking at books created on iPads (created by children or parents using words and images)
  • The ability to search for meaning is enhanced by creating stories
  • Book Trust:
    • Charity founded in 1920s
    • Encouraging reading for pleasure among children and families
    • Run book gifting programmes
    • Book-start – packs delivered by health visitors and via libraries
    • Reception year programme
    • Now seeking to develop the digital side of their work
    • Undertake research on reading habits and how reading contributes to peoples’ lives
    • Reading Habits survey 2013-14

Session VII – Chair: ?

14:00-14:25 Flipped teachers’ views of the impact of open practices on students
Beatriz de los Arcos

  • Flipped teacher – move the instruction online, more discussion and analysis in class
  • Help with “homework” given by experts
  • Survey of OER use by teachers and how impact on students
  • “I do not treat this curriculum as mine – it belongs to the class and the world”
  • http://sites.google.com/a/byron.k12.mn.us/stats4g
  • Example of a learning activity on Chaucer’s Canterbury Tales – kids turned in 82% of homework on time
  • OER enables new ways of teaching and learning
  • How do we measure the success of the flipped model?
    • A lot of teachers respond to do with student motivation and engagement
  • Most teachers informally adopt OER practice (e.g. uploading to YouTube) but don’t know about CC licenses etc.
  • Does flipping with OER give a better “flip” than working with closed resources?

 

14:25-14:50 The pedagogical design, user profile and evaluation of a Mobile app to teach beginners’ Chinese characters
Fernando Rosell-Aguilar and Kan Qian

  • Examples of tones in Chinese where same syllable means different things – but context means in practice mistakes not significant
  • About 10,000 characters in common use with typically 12 strokes
  • No space between characters to denote separation of words
  •  Stroke order is important – but this also aids memory of characters – in Chinese primary schools they would chant this
  • Pinyin (Roman letters) is used to teach pronunciation because no correspondence between character and pronunciation
  • Grammar very simple (no past or future tense) – verbs stay the same – no plural singular
  • Rationale
    • To provide an aid to learning
    • To raise profile of the introduction to Chinese course
    •  To fulfill KMi objective to produce revision aids
  • Pedagogical design
    • Bite-sized learning
    • Progressive learning – 20 lessons must be taken in order
    • Integrating writing, listening, reading and vocabulary
    • Gaming feature
    • Personalised learning
  • 4 Sections
  • Challenges of working with App Developers
    • What can be done vs what is desired
    • Timing issues
    • Technical affordances vs pedagogy
  • User profile and evaluation
    • – More males than females (unlike other modern languages, more males than females study Chinese)
    • Median Age 30-39
    • 91.9% describe themselves as beginners
    • 75% learning Chinese informally
    • Why learn Chinese:
      • Personal interest
      • Family ties
      • Non-Chinese living in China
      • Business use
    • False expectation of ability to learn fluent Chinese from app
    • App rated positively 86% very good or good
    • Good ratings for learning to write but better for learning to recognise characters
    • 82% use the app as additional to other learning but 18% use it as their main resource
  • Conclusion
    • Met objectives to a large degree but no evidence of people using the app then signing up for the course
    • Varied mix of users (gender, age, etc.)
    • Android version limited character set iOS more comprehensive
  • App Chinese Characters First Steps – http://itunes.apple.com/gb/app/chinese-characters-first-steps/id441549197?mt=8#

14:50-15:15 Models of Disability, Models of Learning, Accessibility and Learning Technologies
Martyn Cooper

My presentation so not noted but slides are available on SlideShare at: http://www.slideshare.net/martyncooper/models-of-disability-models-of-learning-accessibility-calrg2014

 

Session VIII – Chair: Canan Blake

15:30-15:55 Computer-marked assessment as learning analytics
Sally Jordan

  • Using iCMAs in teaching since 2000
  • Ellis (2013): assessment is often excluded from learning analytics, but this is “stupid”
  • Assessment gives deep information about learner engagement
  • Analysis at the cohort level
    • Look at questions that students struggle with (from hard data not student opinion)
  • Example of graphic illustrating number of tries students take to get correct question answer in a maths assessment
  • Look at reasons for repeated wrong answers
  • Measuring student engagement – “750 students used my iCMA”
  • iCMAs in formative use reveal those that just click on them but don’t engage (about 10%)
  • When do students use iCMAs?
    • Strong bias towards cut-off dates
  • Length of response to short answer questions – if there is a word limit, students tend to write near to that limit (they see it as a hint)
  • Student engagement with feedback – comparisons between students and comparison between modules
  • Generally students do what they believe their teachers want
  • Engagement with computer marked assessment can be used as a proxy for deeper behaviour
  • Transcend the testing paradigm and see assessment for learning not assessment of learning
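The cohort-level “number of tries to get a question right” analysis mentioned above could be sketched minimally as below. The event-log format is my assumption; the OU’s actual iCMA data pipeline is not described in the talk:

```python
from collections import Counter

def tries_to_correct(events):
    """Given (student, question, correct) events in time order, count how
    many attempts each student needed before their first correct answer
    to each question, and return the cohort-level distribution."""
    attempts = Counter()
    done = set()
    distribution = Counter()
    for student, question, correct in events:
        key = (student, question)
        if key in done:
            continue  # ignore attempts after the first correct answer
        attempts[key] += 1
        if correct:
            distribution[attempts[key]] += 1
            done.add(key)
    return distribution

# Invented log: s1 needs 2 tries, s2 needs 1, s3 needs 3
log = [("s1", "q1", False), ("s1", "q1", True),
       ("s2", "q1", True),
       ("s3", "q1", False), ("s3", "q1", False), ("s3", "q1", True)]
print(tries_to_correct(log))
```

Plotting this distribution per question is the kind of hard-data graphic the talk showed for spotting questions the cohort struggles with.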

15:55-16:20 Open Essayist: Opening automatic support for students drafting summative essays
Denise Whitelock, John Richardson, Debora Field, Stephen Pulman

  • The SAfeSEA Project, see: http://www.open.ac.uk/researchprojects/safesea/
  • Present summaries of students essays back to students to facilitate their reflection
  • Not tell students what to write (or what is right)
  • Identifies Intro, Main Section, Conclusions, Keywords
  • Generates different visual representations of the essay – one research question is what representations the students find most helpful
  • Node graphs represent repeated notions
  • Marked contrast between highly marked and low marked essays
  • Nodes closer together in the better essays – vector length represents the connectivity between sentences
  • In 2014 made available to students on MAODE, University of Herts and British University in Dubai
  • Non-native speakers found it very helpful
  • A lot of students do not see how a computer system could help them with their essays
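The inter-sentence “connectivity” that the node graphs above visualise could be approximated crudely with bag-of-words cosine similarity. This is purely my illustrative stand-in; OpenEssayist’s actual NLP is far more sophisticated:

```python
import math
from collections import Counter

def sentence_similarity(s1, s2):
    """Cosine similarity between bag-of-words vectors of two sentences --
    a simple proxy for the connectivity that determines how close two
    sentence nodes sit in an essay graph (an assumption, not the real model)."""
    v1, v2 = Counter(s1.lower().split()), Counter(s2.lower().split())
    dot = sum(v1[w] * v2[w] for w in set(v1) & set(v2))
    norm = (math.sqrt(sum(c * c for c in v1.values()))
            * math.sqrt(sum(c * c for c in v2.values())))
    return dot / norm if norm else 0.0
```

Under this proxy, essays whose sentences share more vocabulary produce higher pairwise similarities, i.e. nodes drawn closer together – consistent with the observation that nodes sit closer in the better essays.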

16:20-16:45 Findings from a survey of undergraduate use of mobile devices for OU study
Authors: Simon Cross, Graham Healing, Mike Sharples

  • ePedagogies of handheld devices
  • Document and analyse the patterns of use of OU students
  • Align with other surveys – e.g. OU Student Survey
  • Becoming a longitudinal study
  • Modules are like a Lego set – what students do with them may differ from what was intended and may be influenced by the technologies they use
  • 82% of students have mobile phones, 50% tablets, 37% e-readers, 8% none of these
  • 30% bought tablet for OU study
  • 16% bought e-reader for OU study
  • Evolving data set – resource for future research
  • Insights for module development
  • Evolving survey instrument
  • Evolving analytical framework
  • Technology barriers -> learning barriers

16:45-17:00 CLOSE

No discussant today – shame because I like this feature of CALRG Conferences.

IET Learning Analytics Workshop (15-05-14)

Today I attended and presented at an OU internal Learning Analytics workshop organised by my institute – the Institute of Educational Technology, at the Open University, UK.  This blog post is my notes from that event.

Introduction
Eileen Scanlon

3rd in this workshop series – IET researchers joined by those from KMi and visitors from University of Amsterdam

Starfish – Networked Knowledge Sharing
Sander Latour and Natasa Brouwer (University of Amsterdam)

  • Starfish – finding a better way to disseminate best practice
  • (Video) http://www.youtube.com/watch?v=H6YrdOppk8
  • Community driven network
  • TPACK (Model of teaching good practice and technology) based labels.  See: http://www.tpack.org/
  • Entities linked into a network – aid to exploration
  • c.f. Google+ Communities
  • First working system in place – building network with other faculties/institutions
  • Potential for EU project
  • Research topics –
    • Dealing with difference in vocabulary
    • Effective search and exploration
    • Evaluating quality of information
    • SNA / Expert finding
    • Effect on Teachers of TPACK beliefs (Mishra and Koehler 2006) – e.g. the technology used affects how we teach

Analytics insights into Epistemic Commitments in collaborative information seeking
Simon Knight, KMi

  • Link to epistemic games group in Madison
  • Evolution in forms of assessment – moving away from pure summative towards performance assessment
  • Epistemic beliefs – a lens on learners views (Broome, 2009)
  • Removing the thought element – not decontextualised beliefs but situated and contextualised
  • Moving away from psychometrics
  • In “Search”
    • selecting sources
    • collating information
    • etc.
  • Epistemic commitments
    • The connections people make
    • Certainty characterised as …
  • Surface answers or deeper reasoning – use of search results
    • E.g. question on Marie Curie
  • Epistemic Frames for Epistemic Commitments
    • Views on how we see the world
    • described as skills, knowledge, values, identities, and epistemological rules
    • Discourse orientated
    • E.g. “we should try looking on Wikipedia”
      • select token, make connections
    • Epistemic Frames allows classification of activities in this process
  • Epistemic Network Analysis
    • Edge – indicates communication between nodes – edge gets thicker in proportion to level of discourse between the nodes
    • Move from log data – to keywords and concepts
    • Some maths happens – analysis works over “stanzas” – don’t worry about how many times a word occurs but how it is sourced
    • Principal Component Analysis (PCA) across stanzas
    • As analysis builds up some nodes become more central
    • Used a pair and two trios of 11-year-olds
    • Hypothesis – collaboration might be linked to number of sources researched
    • Some questions in exercise open – some closed – students asked to justify their answers and cite sources
    • Differences in groups
      • G1 – Successful: “it has got all the important information” – i.e. less sense-making, more about whether the source had answers to the questions
      • G2 – Also successful but used different strategies, so its discourse was distinct from G1 (talked a lot about authority)
      • G3 – Poor results so discourse related to this
  • Claim that ENA offers a representational tool and can be used for hypothesis testing
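The co-occurrence-plus-PCA idea behind ENA can be sketched in a few lines. This is a minimal illustration with invented epistemic codes and stanzas (the real ENA toolkit and coding scheme differ), showing how binary code occurrence per stanza yields both the weighted network edges and a PCA projection:

```python
import numpy as np

# Hypothetical coded discourse: each "stanza" (a short window of talk)
# is tagged with the set of epistemic codes it contains.
CODES = ["source_selection", "authority", "corroboration", "answer_matching"]

stanzas = [
    {"source_selection", "answer_matching"},
    {"authority", "corroboration"},
    {"source_selection", "authority"},
    {"answer_matching"},
]

# Binary occurrence matrix: rows = stanzas, columns = codes.
# As in the talk: what matters is whether a code appears in a stanza,
# not how many times it occurs within it.
X = np.array([[1 if c in s else 0 for c in CODES] for s in stanzas], float)

# Network edges: weight = number of stanzas in which two codes co-occur
# (a thicker edge indicates more joint discourse between the nodes).
adjacency = X.T @ X
np.fill_diagonal(adjacency, 0)

# PCA across stanzas: centre the occurrence matrix, then project each
# stanza onto the top two eigenvectors of the covariance matrix.
Xc = X - X.mean(axis=0)
eigvals, eigvecs = np.linalg.eigh(np.cov(Xc, rowvar=False))
projection = Xc @ eigvecs[:, ::-1][:, :2]  # top two components

print(adjacency)
```

Nodes with large row sums in `adjacency` are the ones that become “more central” as the analysis builds up.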

Papers: http://oro.open.ac.uk/39254  http://oro.open.ac.uk/39181

Learning Analytics approaches to target support for disabled students in particular and to identify accessibility deficits in teaching and learning presented on the VLE
Martyn Cooper, IET

My presentation so not noted but slides on SlideShare at: http://www.slideshare.net/martyncooper/learning-analytics-and-disabled-students-iet-la-workshop-may14a

Learning Analytics for Academic Writing
Duygu Simsek, KMi

  • Machine analysis to identify attributes of good academic writing
  • How to use this to support students and academics
  • Academic Writing –
    • Critical for students
    • Need to communicate validity of claims of automatic system
    • Meta-discourse analysis
    • Students find it challenging to learn academic writing but also find it difficult to understand meta-discourse cues
  • XIP – Xerox Incremental Parser uses NLP
    • Pulls out key features in academic writing
    • XML format output but not suitable for the learners
  • What are key features in student writing:
    • relevance
    • demonstration of knowledge
    • linguistic quality
    • argumentation
    • etc.
  • Argumentation:
    • mapping between good academic writing and XIP rhetorical functions
  • XIP needs a learning analytics approach to be useful in this context
  • Research Questions:
    • To what degree can XIP be used to identify good academic writing practice?
    • In what way should XIP output be communicated to students and educators?
      • Used a dashboard in a pilot study
    • To what extent do students value this approach?
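The point that XIP’s XML output is “not suitable for the learners” is essentially a transformation problem. A hedged sketch, using an invented XML shape (the real XIP schema differs), of reducing sentence-level rhetorical labels to simple counts a student-facing dashboard could display:

```python
import xml.etree.ElementTree as ET
from collections import Counter

# Hypothetical parser output: sentences tagged with rhetorical functions.
# This is illustrative only; XIP's actual output format is different.
xml_output = """
<document>
  <sentence label="SUMMARIZING">This paper reviews prior work on X.</sentence>
  <sentence label="CONTRAST">However, earlier studies ignored Y.</sentence>
  <sentence label="NOVELTY">We propose a new approach to Y.</sentence>
  <sentence label="NONE">The data were collected in 2013.</sentence>
</document>
"""

def summarise(xml_text):
    """Count rhetorical-function labels, dropping unlabelled sentences."""
    root = ET.fromstring(xml_text)
    labels = [s.get("label") for s in root.iter("sentence")]
    return Counter(label for label in labels if label != "NONE")

summary = summarise(xml_output)
print(summary)
```

A dashboard would then render these counts (e.g. as a profile of the essay’s argumentation moves) rather than exposing raw XML.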

Six different learning analytics metrics, but which one(s) best predict performance
Bart Rienties, IET

  • Simon Buckingham-Shum: “We should move towards dispositional learning analytics”
  • Learner Data vs Learning Data
  • E.g. from footballer tracker data
  • Study at Maastricht of students on compulsory maths/stats course presented on Blackboard
    • deep learner vs surface learner
    • motivation
    • diagnostic pre-test
    • demographic data
    • Blackboard data
    • Results on quizzes
  • Research Question 1 – to what extent predict performance?
    • Level of clicking in VLE poor predictor
    • More sophisticated tools shown to be better predictors
    • Combined metrics better predictors
    • The best predictor of assessment is assessment itself – so predictions get better after the initial assessment on a course
  • Research Question 2 – When should “coaches” intervene?
    • After first test good prediction but too late to intervene?
  • Research Question 3 – Dispositions or Learning Analytics?
    • Dispositions combined with early assessment provides good early warnings
    • Dispositions can be changed – feedback
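The finding that clicking behaviour alone predicts poorly while combined metrics predict better can be illustrated with a toy regression. A sketch on synthetic data (invented coefficients, not the Maastricht dataset), comparing the fit of VLE clicks alone against clicks combined with a diagnostic pre-test and motivation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Synthetic student data (illustrative only, not the study's data):
clicks = rng.normal(size=n)       # VLE click activity
pretest = rng.normal(size=n)      # diagnostic pre-test score
motivation = rng.normal(size=n)   # self-reported motivation
# Final grade driven mainly by pre-test and motivation, weakly by clicks.
grade = (0.1 * clicks + 0.7 * pretest + 0.5 * motivation
         + rng.normal(scale=0.5, size=n))

def r_squared(X, y):
    """Fit ordinary least squares (with intercept); return R^2 of the fit."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    return 1 - resid.var() / y.var()

r2_clicks = r_squared(clicks[:, None], grade)
r2_combined = r_squared(np.column_stack([clicks, pretest, motivation]), grade)

print(f"clicks alone: R^2 = {r2_clicks:.2f}")
print(f"combined:     R^2 = {r2_combined:.2f}")
```

Under these assumed weights the combined model explains far more variance than clicks alone, mirroring the talk’s conclusion that more sophisticated, combined metrics are the better predictors.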

 

— Close, drinks (Dutch treat) and onward discussion —

 

 

JISC Digital Festival 2014 – Notes Day 1 (Cont…)

The other presentation I attended on day 1 of the festival was that given by Prof. David de Roure of the University of Oxford.  He spoke on “Big Data for the Social Sciences”, which I hoped would be relevant to my own work on Learning Analytics.  This blog post is my notes from his talk.

How does technology get used in research?

-> What is this new “big data” and what does it tell us?

  • Big data does not respect disciplinary boundaries
  • Data has been around a long time
  • There is a lot of “hype” around big data that has led to inflated expectations of it
  • Can consider 2013 as the year we sought to define big data and 2014 the year we began to use it effectively
  • It is big data because of both the velocity and volume of the data being generated
  • “Data deluge” is now a phenomenon across the disciplines
  • In the past analysis moved from the universities to business, now it is from the business world to the universities.
  • There is a huge unsatisfied demand for “data scientists”
  • Moore’s Law vs The Big Social

Moore's Law vs Big Social diagram

  • We use digital tools because it is the ecosystem – Research 2.0
  • What is the relevance of Social Science to Big Data?
    • We need to think through the implications
  • RCUK’s definition of “big data” is: big enough that we can’t deal with it as we did before
  • Why do we want it?
    • To do things in new ways
    • To do new things
    • e.g. Twitter data – we can look at the evolution of social processes in real-time
  • We need the expertise of those from classical Social Science
    • e.g. food vs consumption
    • can obtain new data from new sources (e.g. supermarket loyalty cards)
  • We can use different data sets to correlate
  • Real-time uses of big data, e.g. Twitter
    • spread of infectious diseases
    • riots
  • Visualisation can lead to better analysis
  • Underpinned by available infrastructure
  • Wikipedia is an example of a social medium
    • behaviorally it is socially constructed
    • different in different countries/languages

— end —

Internal project proposal: Learning Analytics for Disabled Students in STEM subjects

I am currently working on an internal project proposal: Learning Analytics for Disabled Students in STEM subjects (LA4DS-STEM). Hopefully it will run from April – December 2014.

The LA4DS-STEM project will review the potential of Learning Analytics in higher education, specifically in STEM, and with an emphasis on supporting disabled students and facilitating accessibility enhancements.

Learning Analytics is defined as the measurement, collection, analysis, and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs.  The LA4DS-STEM project will specifically explore the following STEM application areas for Learning Analytics. A key output of the project will be an external funding bid for a larger-scale collaborative project. The work of LA4DS-STEM will inform pilots in this project. Provided the envisaged benefits are confirmed, this should lead to enterprise-level implementation within the OU and across HE.

The findings of the LA4DS-STEM project will be disseminated, firstly throughout the Science and MCT faculties, then to the wider university. External dissemination will highlight the OU’s lead in this field.

Random Quotes from JISC Digital Festival 2014

Here are a few random quotes I noted down while at the JISC Digital Festival 2014 in Birmingham this week. Apologies for when I didn’t note who said them.

Academics need to stay on top of the analytics movement and not get pushed around by it!
[Anon]

And related to the above:

How does technology get used in research -> What is this new “big data” and what can (can’t) it tell us?
[Prof. David de Roure, University of Oxford]

From a different perspective:

Research and Teaching have now diverged at the Universities
[On Twitter]

From the presentation by the originator of the “Hole in the Wall” experiment:

Children will learn to do what they want to learn to do!
[Sugata Mitra, Prof. of Educational Technology at Newcastle University]

I will add to these as I review the archived talks that I did not attend; you can view them at: http://www.jisc.ac.uk/events/jisc-digital-festival-2014-11-mar-2014/expert-speakers


Martyn Cooper
