Notes from CALRG Conference 2014

The Computers and Learning Research Group (CALRG) at the Open University (UK) holds an annual conference.  Today and tomorrow mark the 35th such conference.  This post contains my notes on the presentations I attend (unfortunately I cannot make them all).  There is a conference Twitter feed with hashtag #calrg14.  A temporary conference website is at: http://sites.google.com/site/calrg14/ with a link to the programme.

CALRG Annual Conference Day One – June 10 2014

Discussant: Prof Rupert Wegerif, Exeter University

9:30-9:45 OPENING NOTES

Patrick McAndrew (Director of IET) – Own experience over 16 years as an induction to the university.  A catch-up point.  Today the theme is mostly on Open Education.

Session I – Chair: Doug Clow

9:45-10:15 MOOCs, Learning Analytics and Higher Education: Perspectives on a recent study leave visit to the USA
Eileen Scanlon

  • The Americans sometimes slightly hallucinate our experience of Ed Tech
  • First stop – ACM Conference on learning at scale (single track)
  • Best thing – keynote from Chris Dede of Harvard – “New wine in new bottles”
    • It is not about the platform but what you do on the platform
    • Use of metaphors from film
    • Going big requires thinking small
    • Micro-genetic studies of online learning
    • People had forgotten all the learning science previously done
  • Distance Learning, OERs and MOOCs (Eileen’s presentation at conference)
    • The Open Science Lab
    • Edinburgh experience – professional development of surgeons
  • Next stop Berkeley (Invitational Summit of 150 people)
    • Impact on residential campus based universities
    • Relying on schools of education to measure student learning
    • Reflection on the edX platform
      • Transforming the institution (MIT in this case)
      • Learn about learning
      • E.g. required physics course – group learning – lot of use of online assessment
      • Comparison of performance in MOOCs of those taking residential course versus those not
      • You would drown in information if you Googled assessment of edX
    • Simon initiative at Carnegie Mellon
      • AI and Cognitive Tutors
      • Broader than the institution
      • Global learning council
      • Spin out company called “Cognitive Tutors”
      • Individualized instruction seen as gold standard for education
  • Then visited Stanford
    • The Lytics Lab (Learning Analytics)
      • Using learning science with open educational delivery
      • Moving from fragmented approach to systematic improvement of this type of pedagogy
      • CSCL (conversation) -> MOOC space
      • Scale of work in Stanford on MOOCs is staggering
      • Still individual academic driven
  • Then various other conferences
  • Future Learn Academic Network
    • Originally 26 partners now expanding and going more global
  • ESRC proposal on future of higher education
    • Partners: OU, University of Edinburgh, Carnegie Mellon, Oxford University

10:15-10:45 Squaring the open circle: resolving the iron triangle and the interaction equivalence theorem
Andy Lane

  • Visual Models
    • How visualization can help with understanding/sense making
    • They can equally conceal
    • The Iron Triangle – sides: Scale, Quality, Cost
      • If one dimension is changed significantly it will compromise the others
    • John Daniel – open distance learning could break the iron triangle
    • Interaction Equivalence Theorem (EQuiv)
    • Supply-side vs demand-side (what about the students?)
    • Adding a circle of success to the iron triangle
    • A student centred iron triangle
      • motivation, preparation, organisation
    • A student centred Interaction Engagement Equivalence Theorem

10:45-11:15 Exploring digital scholarship in the context of openness and engagement
Richard Holliman, Ann Grand, Anne Adams and Trevor Collins

See: http://open.ac.uk/blogs/per

  • Public engagement with a research mandate
  • Research councils fund catalysts
  • An “ecology” of openness
  • Action Research [Lewin 1946]
  • The Edge tool
  • How do we find ways of assessing where staff are and then support them?
  • Research Questions
    • What methods and technologies are researchers using to: make research public, make public research, enable the public to collaboratively research (citizen science)?
    • how do researchers conceptualize the role of students?
  • Scholarship reconsidered
    • discovery
    • integration
    • application
    • teaching
  • Awareness / Responsibility / Sustainability
  • Institutional strategy for open, digital and engaged scholarship
    • What should we try to change?
  • Types of researcher: the fully wired; the dabbler; the brave trier; the unimpressed
  • “The Open Scholar is someone who makes their intellectual projects digitally visible …”
  • Policies / Procedures / Practices

[The remaining sessions of Day 1 I was not able to attend, but the programme is included here]

Session II – Chair: Ann Jones

11:30-11:55 The OpenupEd quality label: benchmarks for MOOCs
Jon Rosewell

11:55-12:20 From theory to practice: can openness improve the quality of OER research?
Rebecca Pitt, Beatriz de-los-Arcos, Rob Farrow

12:20-12:45 Open Research into Open Education: The Role of Mapping and Curation
Rob Farrow

12:45-13:10 Strategies for Successful MOOC learning: The Voice from the World Record Breaker
Bernard Nkuyubwatsi
Session III – Chair: Rebecca Ferguson

14:00-14:25 The role of feedback in the under-attainment of ethnic minority students: Evidence from distance education
John T.E. Richardson, Bethany Alden Rivers and Denise Whitelock
14:25-14:50 Evaluating serious experiences in games
Jo (Ioanna) Iacovides
14:50-15:15 Social media for informal minority language learning: exploring Welsh learners’ practices
Ann Jones
15:15-15:30 TEA/COFFEE
Session IV – Chair: Inge de Waard
15:30-15:55 What students want: designing learning to optimise engagement in digital literacy skills development
Ingrid Nix and Marion Hall
15:55-16:20 Recording online synchronous tutorials to support learning
Pauline Bloss, Elisabeth Clifford, Chris Niblett and Elke St.John
16:20-16:45 Open Education needs Education for Openness: a dialogic theory of education for the Internet Age
Rupert Wegerif
16:45-17:00 Discussant – Rupert Wegerif
and CLOSE

CALRG Annual Conference – Day 2 – June 11 2014

Session V – Chair: Mark Gaved

9:40-10:05 ‘nQuire-it’: The design and evaluation of a mission-based web platform for citizen inquiry science learning
Christothea Herodotou, Eloy Villasclaras-Fernández, Mike Sharples

Notes from this presentation lost in the ether :(

10:05-10:30 3D Virtual Geology Field Trips: Opportunities and Limitations
Shailey Minocha, Sarah-Jane Davies, Brian Richardson and Tom Argles

  • Can do things unable to do on a real field trip – e.g. drape maps over mountains, see geological cross sections
  • Uses the Unity 3D game engine to build a 10km x 10km area, mapping and imaging the real world (around Skiddaw, England)
  • Can pick up rocks and examine under microscope
  • Includes a chat facility for tutor group communication
  • Leave these tools out of the application so as not to compromise the immersion
  • Addresses accessibility with transcripts and full keyboard only access
  • Able to “fly” and “teleport” (on a real field trip a lot of time wasted travelling between sites)
  • Avatar based environment
  • Students use a paper based notebook as they would in the field
  • Integrate the virtual microscope (existing facility) but now contextualized learning
  • Cloud server can handle up to 500 students at one time

10:30-10:55 Juxtalearn: From Practice into Practice
Anne Adams and Gill Clough

  • Large EU project
  • Driver – not enough students taking science and technology at school – employment implications
  • Science and technology engagement through “creative performance” and “reflective learning”
  • Threshold concept (TC)
    • Where students find challenges
    • When they get it it is transformative
    • Irreversible – not readily forgotten
    • Integrative – brings concepts together
  • Learning Pathways and Threshold Concepts (different ways from introduction of concept to internalisation of it)
  • Develop understanding through creative video making
  • Tricky Topic Tool
    • Teachers identify tricky topic
    • Teachers create an example
    • Teachers write down student problems
    • Teachers fill in Taxonomy (linked to student problems)
      • e.g. terminology, intuitive beliefs, incomplete pre-knowledge, …
  • Taxonomy scaffolds quiz creation
    • Tool to facilitate this
    • Integrates detailed feedback to the student
  • Demo
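As a rough sketch of how a taxonomy could scaffold quiz creation, the data shape and `scaffold_quiz` helper below are my own invention (hypothetical field names, not the actual Juxtalearn tool):

```python
# Hypothetical data shape for the Tricky Topic workflow described above.
# Teachers record a topic plus student problems labelled with taxonomy
# categories; the tool then scaffolds one quiz item per recorded problem.
tricky_topic = {
    "topic": "Electrical resistance",
    "problems": [
        {"description": "Confuses resistance with current",
         "taxonomy": "intuitive beliefs"},
        {"description": "Misreads circuit symbols",
         "taxonomy": "terminology"},
    ],
}

def scaffold_quiz(topic):
    """One placeholder question per recorded student problem, tagged with
    its taxonomy label so feedback can point back to the difficulty."""
    return [
        {"question": f"[{topic['topic']}] Probe: {p['description']}",
         "feedback_tag": p["taxonomy"]}
        for p in topic["problems"]
    ]

quiz = scaffold_quiz(tricky_topic)
print(len(quiz), quiz[0]["feedback_tag"])  # 2 intuitive beliefs
```

The taxonomy tag on each item is what lets the quiz integrate the "detailed feedback" mentioned above: a wrong answer can be traced back to the kind of difficulty it probes.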

Session VI – Chair: Anne Adams

11:15-11:40 Citizen Inquiry: From rocks to clouds
Maria Aristeidou, Eileen Scanlon, Mike Sharples

  • Citizen Science + Inquiry based Learning -> Citizen Inquiry
  • Inquiring – Rock Hunters (Initial Study)
    • 24 participants
    • 12 rock investigations
    • discussion and feedback on chat and forums
    • Data collection – questionnaires, System Usability Scale [John Brooke, 1986], …
  • [Note taking interrupted]

11:40-12:05 Imagining TM351 – Virtual Machines and Interactive Notebooks
Tony Hirst

  • TM351 – New Level 3 30 point module on data
  • Two new things:

1. Virtual machines (to overcome the diversity of machines being used by students)

    • Interfaces increasingly browser based
    • VirtualBox installed on the student machine and the browser used as interface
    • Virtual machine can be on cloud server – then can use on a tablet

2. Notebook Computing

  • Literate programming / reproducible code and research
  • Code should read like an essay (self-documenting) – reads well to a human and is executable by the machine
  • Can’t reproduce data analysis from traditional academic papers – reproducible research includes the tools to enable this
  • Using IPython
  • Analogous to spreadsheets
  • Task orientated productivity tools
  • Cells
    • write text
    • uses “Markdown”, a simple text-based mark-up
    • other cells contain python code
    • e.g. the software creates the table – avoids errors in production and editing
    • similarly with maps and paths
  • IPython server in VM – interface in browser
  • Exploring using OpenDesignStudio so students can share and critique each others code in executable form (see: http://design.open.ac.uk/atelier-d/cdi1.htm)
  • Example shown
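The “software creates the table” point above can be illustrated with a minimal notebook-style cell; `to_markdown_table` and the sample data are invented for illustration, not TM351 material:

```python
# A notebook-style code cell that renders a table from data, so the table
# can never drift out of sync with the numbers it reports.

def to_markdown_table(headers, rows):
    """Render rows as a Markdown table, as a notebook cell might."""
    lines = ["| " + " | ".join(headers) + " |",
             "| " + " | ".join("---" for _ in headers) + " |"]
    for row in rows:
        lines.append("| " + " | ".join(str(v) for v in row) + " |")
    return "\n".join(lines)

data = [("2014J", 1450), ("2015J", 1520)]  # invented sample rows
print(to_markdown_table(["Presentation", "Students"], data))
```

Because the Markdown output is generated from the data in the cell, editing the data re-generates the table, avoiding the production and editing errors mentioned above.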

 

12:05-12:30 MASELTOV – mobile incidental learning services to support language learning and the social inclusion of recent immigrants
Agnes Kukulska-Hulme, Eileen Scanlon, Ann Jones, Mark Gaved

  •  Using smart phones to support language learning
  • Addressing those with low educational level and from different culture
  • Incidental learning approach
  • MApp: a range of services
    • Field local mapping
    • Social network
    • Information resources
    • Translation
    • Navigation guide
    • Language learning
    • Serious game
  • These are separate apps but integrated in the platform
  • High penetration of smart phone among target audience
  • Technology uncertainty period
    • Many purchase phone ahead of travel
    • Android phones most popular
    • May have multiple phones
    • Seek out free WiFi
    • Word of mouth expertise highly valued
  • How do we enable transition from problem solving to reflective learning?
    • relating immediate situation to broader context
    • Feedback and progress indicators
      • Study planning and goal setting
      • Indicating completion
      • Supporting sense of community
      • Building confidence
      • Gamified approach
      • Quizzes
  • What evidence that this approach to language learning is effective?
  • Are there clusters of tool use?
  • Demo

12:30-12:55 Knowledge Transfer Partnership: Booktrust and the Open University
Natalia Kucirkova, Karen Littleton, Teresa Cremin and Laura Venning

  • Ongoing project started this year
  • KTP-objectives:
    • Extending book trust work on promoting reading for pleasure
    • Contribute to digital literacy
    • New knowledge and understanding of digital technologies and the opportunities they provide
  • Synergy of two organisations
  • Looking at books created on iPads (created by children or parents using words and images)
  • The ability to search for meaning is enhanced by creating stories
  • Book Trust:
    • Charity founded in 1920s
    • Encouraging reading for pleasure among children and families
    • Run book gifting programmes
    • Bookstart – packs delivered by health visitors and via libraries
    • Reception year programme
    • Now seeking to develop the digital side of their work
    • Undertake research on reading habits and how reading contributes to people’s lives
    • Reading Habits survey 2013-14

Session VII – Chair: ?

14:00-14:25 Flipped teachers’ views of the impact of open practices on students
Beatriz de los Arcos

  • Flipped teacher – moves the instruction online; more discussion and analysis in class
  • Help with “homework” given by experts
  • Survey of OER use by teachers and its impact on students
  • “I do not treat this curriculum as mine – it belongs to the class and the world”
  • http://sites.google.com/a/byron.k12.mn.us/stats4g
  • Example of a learning activity on Chaucer’s Canterbury Tales – kids turned in 82% of homework on time
  • OER enables new ways of teaching and learning
  • How do we measure the success of the flipped model?
    • A lot of teachers respond to do with student motivation and engagement
  • Most teachers informally adopt OER practice (e.g. uploading to YouTube) but don’t know about CC licenses etc.
  • Does flipping with OER give a better “flip” than working with closed resources?

 

14:25-14:50 The pedagogical design, user profile and evaluation of a Mobile app to teach beginners’ Chinese characters
Fernando Rosell-Aguilar and Kan Qian

  • Examples of tones in Chinese where the same syllable means different things – but context means that in practice mistakes are not significant
  • About 10,000 characters in common use with typically 12 strokes
  • No space between characters to denote separation of words
  •  Stroke order is important – but this also aids memory of characters – in Chinese primary schools they would chant this
  • Pinyin (Roman letters) is used to teach pronunciation because no correspondence between character and pronunciation
  • Grammar very simple (no past or future tense) – verbs stay the same – no plural singular
  • Rationale
    • To provide an aid to learning
    • To raise profile of the introduction to Chinese course
    •  To fulfill KMi objective to produce revision aids
  • Pedagogical design
    • Bite-sized learning
    • Progressive learning – 20 lessons must be taken in order
    • Integrating writing, listening, reading and vocabulary
    • Gaming feature
    • Personalised learning
  • 4 Sections
  • Challenges of working with App Developers
    • What can be done with what desired
    • Timing issues
    • Technical affordances vs pedagogy
  • User profile and evaluation
    • More males than females (unlike other modern languages, more males than females study Chinese)
    • Median Age 30-39
    • 91.9% describe themselves as beginners
    • 75% learning Chinese informally
    • Why learn Chinese:
      • Personal interest
      • Family ties
      • Non-Chinese living in China
      • Business use
    • False expectation of ability to learn fluent Chinese from app
    • App rated positively 86% very good or good
    • Good ratings for learning to write but better for learning to recognise characters
    • 82% use the app as additional to other learning but 18% use it as their main resource
  • Conclusion
    • Met objectives to a large degree but no evidence of people using the app then signing up for the course
    • Varied mix of users (gender, age, etc.)
    • Android version limited character set iOS more comprehensive
  • App Chinese Characters First Steps – http://itunes.apple.com/gb/app/chinese-characters-first-steps/id441549197?mt=8#

14:50-15:15 Models of Disability, Models of Learning, Accessibility and Learning Technologies
Martyn Cooper

My presentation so not noted but slides are available on SlideShare at: http://www.slideshare.net/martyncooper/models-of-disability-models-of-learning-accessibility-calrg2014

 

Session VIII – Chair: Canan Blake

15:30-15:55 Computer-marked assessment as learning analytics
Sally Jordan

  • Using iCMAs in teaching since 2000
  • Ellis (2013) assessment often excluded from learning analytics but this is “stupid”
  • Assessment gives deep information about learner engagement
  • Analysis at the cohort level
    • Look at questions that students struggle with (from hard data, not student opinion)
  • Example of graphic illustrating number of tries students take to get correct question answer in a maths assessment
  • Look at reasons for repeated wrong answers
  • Measuring student engagement – “750 students used my iCMA”
  • iCMAs in formative use reveal those that just click through but don’t engage (about 10%)
  • When do students use iCMAs?
    • Strong bias towards cut-off dates
  • Length of response to short answer questions – if say a word limit students tend to write near to that limit (see it as a hint)
  • Student engagement with feedback – comparisons between students and comparison between modules
  • Generally students do what they believe their teachers want
  • Engagement with computer marked assessment can be used as a proxy for deeper behaviour
  • Transcend the testing paradigm and see assessment for learning not assessment of learning
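The cohort-level analysis described above (tries-to-correct per question) can be sketched in a few lines; the attempt log and struggle threshold below are invented for illustration, and the OU's actual iCMA analytics are more sophisticated:

```python
from collections import defaultdict

# Invented attempt log: (student, question, tries until correct answer).
attempts = [
    ("s1", "Q1", 1), ("s2", "Q1", 2), ("s3", "Q1", 1),
    ("s1", "Q2", 3), ("s2", "Q2", 4), ("s3", "Q2", 3),
]

def mean_tries(attempts):
    """Average tries-to-correct per question, across the cohort."""
    per_question = defaultdict(list)
    for _student, question, tries in attempts:
        per_question[question].append(tries)
    return {q: sum(t) / len(t) for q, t in per_question.items()}

def struggle_questions(attempts, threshold=2.5):
    """Questions whose cohort average tries exceed the threshold:
    hard data on where students struggle, not student opinion."""
    return sorted(q for q, m in mean_tries(attempts).items() if m > threshold)

print(struggle_questions(attempts))  # ['Q2']
```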

15:55-16:20 Open Essayist: Opening automatic support for students drafting summative essays
Denise Whitelock, John Richardson, Debora Field, Stephen Pulman

  • The SAfeSEA Project, see: http://www.open.ac.uk/researchprojects/safesea/
  • Present summaries of students essays back to students to facilitate their reflection
  • Not tell students what to write (or what is right)
  • Identifies Intro, Main Section, Conclusions, Keywords
  • Generates different visual representations of the essay – one research question is what representations the students find most helpful
  • Node graphs represent repeated notions
  • Marked contrast between highly marked and low marked essays
  • Nodes closer together in the better essays – vector length represents the connectivity between sentences
  • In 2014 made available to students on MAODE, University of Herts and British University in Dubai
  • Non-native speakers found it very helpful
  • A lot of students do not see how a computer system could help them with their essays
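One crude way to approximate the sentence-connectivity idea behind the node graphs is word-overlap similarity; this is an illustrative stand-in of my own, not the SAfeSEA system's actual method, and the stopword list and sentences are invented:

```python
import re

# Hypothetical stopword list for stripping function words.
STOPWORDS = {"the", "a", "of", "and", "is", "in", "to", "are", "when"}

def keywords(sentence):
    """Lower-cased content words of a sentence."""
    return {w for w in re.findall(r"[a-z]+", sentence.lower())
            if w not in STOPWORDS}

def connectivity(s1, s2):
    """Crude link strength between two sentences: Jaccard overlap
    of their content words (sentences sharing notions sit 'closer')."""
    k1, k2 = keywords(s1), keywords(s2)
    return len(k1 & k2) / len(k1 | k2) if k1 | k2 else 0.0

a = "Learning analytics can improve feedback to students."
b = "Feedback to students improves when analytics are used."
c = "The weather in June was pleasant."
print(connectivity(a, b) > connectivity(a, c))  # True
```

On this toy measure, sentences repeating the same notions score higher, which is the intuition behind better essays showing tighter node clusters.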

16:20-16:45 Findings from a survey of undergraduate use of mobile devices for OU study
Authors: Simon Cross, Graham Healing, Mike Sharples

  • ePedagogies of handheld devices
  • Document and analyse the patterns of use of OU students
  • Align with other surveys – e.g. OU Student Survey
  • Becoming a longitudinal study
  • Modules like a Lego set – what students do with it may be different than intended and may be influenced by the technologies they use
  • 82% of students have mobile phones, 50% tablets, 37% e-readers, 8% none of these
  • 30% bought tablet for OU study
  • 16% bought e-reader for OU study
  • Evolving data set – resource for future research
  • Insights for module development
  • Evolving survey instrument
  • Evolving analytical framework
  • Technology barriers -> learning barriers

16:45-17:00 CLOSE

No discussant today – shame because I like this feature of CALRG Conferences.

IET Learning Analytics Workshop (15-05-14)

Today I attended and presented at an OU internal Learning Analytics workshop organised by my institute – the Institute of Educational Technology, at the Open University, UK.  This blog post is my notes from that event.

Introduction
Eileen Scanlon

3rd in this workshop series – IET researchers joined by those from KMi and visitors from University of Amsterdam

Starfish – Networked Knowledge Sharing
Sander Latour and Natasa Brouwer (University of Amsterdam)

  • Starfish – finding a better way to disseminate best practice
  • (Video) http://www.youtube.com/watch?v=H6YrdOppk8
  • Community driven network
  • TPACK (Model of teaching good practice and technology) based labels.  See: http://www.tpack.org/
  • Entities linked into a network – aid to exploration
  • c.f. Google+ Communities
  • First working system in place – building network with other faculties/institutions
  • Potential for EU project
  • Research topics –
    • Dealing with difference in vocabulary
    • Effective search and exploration
    • Evaluating quality of information
    • SNA / Expert finding
    • Effect on teachers of TPACK beliefs (Mishra and Koehler 2006) – e.g. the technology used affects how we teach

Analytics insights into Epistemic Commitments in collaborative information seeking
Simon Knight, KMi

  • Link to the epistemic games group in Madison
  • Evolution in forms of assessment – moving away from pure summative towards performance assessment
  • Epistemic beliefs – a lens on learners views (Broome, 2009)
  • Removing the thought element – not decontextualised beliefs but situated and contextualised
  • Moving away from psychometrics
  • In “Search”
    • selecting sources
    • collating information
    • etc.
  • Epistemic commitments
    • The connections people make
    • Certainty characterised as …
  • Surface answers or deeper reasoning – use of search results
    • E.g. question on Marie Curie
  • Epistemic Frames for Epistemic Commitments
    • Views on how we see the world
    • described as skills, knowledge, values, identities, and epistemological rules
    • Discourse orientated
    • E.g. “we should try looking on Wikipedia”
      • select token, make connections
    • Epistemic Frames allows classification of activities in this process
  • Epistemic Network Analysis
    • Edge – indicates communication between nodes – edge gets thicker in proportion to level of discourse between the nodes
    • Move from log data – to keywords and concepts
    • Some maths happens – stanzas – don’t worry about how many times word occurs but how sourced
    • Principal Component Analysis (PCA) across stanzas
    • As analysis builds up some nodes become more central
    • Used a pair and two trios of 11-year-olds
    • Hypothesis – collaboration might be linked to number of sources researched
    • Some questions in exercise open – some closed – students asked to justify their answers and cite sources
    • Differences in groups
      • G1 – Successful: “it has got all the important information” – i.e. less sense making, more whether the source had answers to questions
      • G2 – Also successful but used different strategies, and discourse was thus distinct from G1 (talked a lot about authority)
      • G3 – Poor results, so discourse related to this
  • Claim that ENA offers representational tools and can be used for hypothesis testing
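The edge-weighting idea above (counting co-occurrence within stanzas rather than raw word frequency) can be sketched as follows; the codes and stanzas are invented for illustration, not data from the study:

```python
from collections import Counter
from itertools import combinations

# Invented coded stanzas: each stanza is the set of epistemic codes
# observed in one chunk of a group's discourse.
stanzas = [
    {"source_selection", "authority"},
    {"source_selection", "sense_making"},
    {"authority", "source_selection", "sense_making"},
]

def edge_weights(stanzas):
    """Edge weight = number of stanzas in which two codes co-occur
    (co-occurrence within stanzas, not how many times a word occurs)."""
    weights = Counter()
    for stanza in stanzas:
        for a, b in combinations(sorted(stanza), 2):
            weights[(a, b)] += 1
    return weights

w = edge_weights(stanzas)
print(w[("authority", "source_selection")])  # 2
```

Edges with larger weights would be drawn thicker, and codes appearing in many stanzas become more central as the analysis builds up.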

Papers: http://oro.open.ac.uk/39254  http://oro.open.ac.uk/39181

Learning Analytics approaches to target support for disabled students in particular and to identify accessibility deficits in teaching and learning presented on the VLE
Martyn Cooper, IET

My presentation so not noted but slides on SlideShare at: http://www.slideshare.net/martyncooper/learning-analytics-and-disabled-students-iet-la-workshop-may14a

Learning Analytics for Academic Writing
Duygu Simsek, KMi

  • Machine code to identify good attributes in academic writing
  • How to use this to support students and academics
  • Academic Writing –
    • Critical for students
    • Need to communicate validity of claims of automatic system
    • Meta-discourse analysis
    • Students find it challenging to learn academic writing but also find it difficult to understand meta-discourse cues
  • XIP – Xerox Incremental Parser uses NLP
    • Pulls out key features in academic writing
    • XML format output but not suitable for the learners
  • What are key features in student writing:
    • relevance
    • demonstration of knowledge
    • linguistic quality
    • argumentation
    • etc.
  • Argumentation:
    • mapping between good academic writing and XIP rhetorical functions
  • XIP needs a learning analytics approach to be useful in this context
  • Research Questions:
    • To what degree can XIP be used to identify good academic writing practice?
    • In what way should XIP output be communicated to students and educators?
      • Used a dashboard in a pilot study
    • To what extent do students value this approach?

Six different learning analytics metrics, but which one(s) best predict performance
Bart Rienties, IET

  • Simon Buckingham-Shum: “We should move towards dispositional learning analytics”
  • Learner Data vs Learning Data
  • E.g. from footballer tracker data
  • Study at Maastricht of students on compulsory maths/stats course presented on Blackboard
    • deep learner vs surface learner
    • motivation
    • diagnostic pre-test
    • demographic data
    • Blackboard data
    • Results on quizzes
  • Research Question 1 – to what extent do these metrics predict performance?
    • Level of clicking in VLE poor predictor
    • More sophisticated tools shown to be better predictors
    • Combined metrics better predictors
    • The best predictor of assessment is assessment itself – so predictions get better after the initial assessment on a course
  • Research Question 2 – When should “coaches” intervene?
    • After first test good prediction but too late to intervene?
  • Research Question 3 – Dispositions or Learning Analytics?
    • Dispositions combined with early assessment provides good early warnings
    • Dispositions can be changed – feedback
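The finding above, that a more sophisticated metric (an early assessment) out-predicts raw VLE click counts, can be illustrated with a univariate least-squares fit; the data below are synthetic toy values, not from the Maastricht study:

```python
# Synthetic illustration: an early quiz tracks final performance more
# closely than noisy click counts, so it explains more variance.

def r_squared(xs, ys):
    """Share of variance in ys explained by a least-squares line on xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return (sxy * sxy) / (sxx * syy) if sxx and syy else 0.0

clicks = [120, 80, 200, 150, 90]   # invented VLE click counts (noisy proxy)
quiz   = [55, 40, 75, 70, 45]      # invented early assessment scores
exam   = [58, 42, 78, 72, 44]      # invented final performance

print(r_squared(quiz, exam) > r_squared(clicks, exam))  # True
```

Combining both predictors (e.g. in a multiple regression) would typically do better still, matching the "combined metrics" point above.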

 

— Close, drinks (Dutch treat) and onward discussion —

 

 

Large Scale Collaborative Funding Bids

Today I attended a training workshop on large-scale funding for projects.  I have built my research career on such projects, but it is never too late to learn and gain from others’ experiences.

Anne Adams (IET):

Tensions – Turning Tension

  • Understand the drivers of the partners
  • Manage expectations
  • Managing cultural differences
    • differences can be enablers
  • Working at the OU is like working at a mini-EU (committee structures, scale, layers, etc.)
  • Learn by mistakes
  • Use processes and support : Trouble shooting
  • Funders can be a useful connection point – use them
  • Check understanding
  • Impact and dissemination from early on – impact and engagement
  • Balancing different objectives – your partners are doing the same

Small Projects

  • More focused objectives
  • Smaller budgets
  • Tighter timeframes
  • Researchers often have to do project management as well

Large Projects

  • Inverse of above but effectively made up of a series of small projects

Managing Time

  • Set aside dedicated bid-writing time
  • Co-ordinate multiple objectives
  • Larger projects may have tighter financial restrictions

Managing People

Stakeholders

  • Project partners
  • Outside the project who are they?
  • Involve in writing the bid
  • Involve users
  • Expectations
  • Highlight how stakeholders have been involved in bid writing
  • Shifting timescales
  • OU Catalyst Project – Research School
  • Breakdown what will be available when

Roles

  • Administrators
  • Project manager
  • Researchers
  • Academics as researchers / teachers / communicators

Lines of management

  • Issue of part-time staff
  • Commenting on objectives
  • People are overworked
  • Use e-mail sparingly

KPIs / Partners

  • Co PIs / Partners
    • Institutional differences
    • Cultural differences
    • Tensions from misunderstandings

Manage expectations

  • Competing expectations from a project
    • From partners
    • From funders
    • Your own objectives

Be Brave

  • Proactive in getting key players in the bid who may have a previous funding record with the funders
  • Summary of objectives and ideas early on
  • Leverage OU
    • Scale
    • Contacts
    • Systems and procedures e.g. ethics
    • Management systems
  • Don’t be afraid
    • Change adapt ideas, partners even near deadline
    • Bring in additional partners – mid-project if necessary
  • If lose trust in a partner – deal with it – don’t want a disaster during the project

Turning Tensions Around

  • transferable skills from teaching at the OU
  • Learn to take risks
    • Allow exploration
    • Keep it focused
  • Allow partners to shine

Using people available

  • Colleagues
  • Research school
  • Successful bids

Using Processes

  • Re-use processes
  • Other external projects
  • Get feedback from potential reviewers
  • Funders have resources to support

Share your ideas

  • Share ideas early on – get the project name out there!
    • Leaving it to the end means missing opportunities
  • Share your ideas with your partners
  • Different ways of sharing:
    • Posters
    • Speed dating events
    • Plan to create a video early on
    • Create eBooks
    • Websites / data sharing (in project and public)

John Domingue (KMi):

Why:

  • Funding for staff and the latest equipment and travel
  • Good for CV – funding necessary for promotion to Chair
  • Networking
  • It gives you autonomy

How:

  • Need a great idea for a project – the elevator test – can you sell it in 2 mins
  • The larger the funding the more political
  • In Europe saying the US has it is a seller
  • E.g. “turning the web from one of data into one of services”

Reviews

  • Writing for the reviewers – those able to give up a week of their time tend to be good researchers but not the best.  They have to read a lot of bids – make yours stand out
  • Also have to write for the EC Section who select the funded bids
  • Make it something beyond the state-of-the-art
  • Clear, pertinent, transparent

Official Criteria

  • Excellence (threshold 3/5)
  • Impact (threshold 3/5)
  • Quality and Efficiency
    • The plan has to match the idea … if you are going to change the world in X, do you have the resources to do it?
  • The Consortium – probably counts 50%
    • Do you have the big players?  If not why not?
  • Use the relevant industrial associations if appropriate
  • EU projects are a game – play by the rules
  • Make sure the objective aims of the proposals are aligned with the big partners

Consortium

  • Make sure every partner is playing a specific role
  • Exploitation partners hardest to find – but most important
  • Academics will always come aboard
  • SMEs / Industrial players
  • Leading Research Institutions
  • Balance by Country, region, and type

Process

  • A year to 9 months ahead of the deadline, hold a meeting of core partners to set out the core idea
  • Pre-consortium beauty contest
    • Needs to be handled carefully
  • E-mails, Skype, etc. to develop the bid
  • Set a small team of people who will write the bid
    • The consortium may well change during the bid writing
  • Talk to the funder

Commission Dialogue

  • Go to Brussels for a day, e.g. an info day
  • Get feedback from the Commission after the call is out
  • Be prepared to radically change the bid in response to feedback
  • Study the Workplan early – before the call is published

Proposal Document

  • Template
  • Stick within page limit
  • Coherent
    • Text
    • Workplan (spreadsheets/ Gantt charts etc.)
    • Risk Management
  • Note – unit heads will have their own goals – how does your project fit?
  • Take into account previous EU projects in State-of-the-Art
  • Use strategic reports from the Commission and others to give background information
  • Typically WPs: Management; 3 Technical WPs, Dissemination and Exploitation
  • Get balance of roles between the partners across the WPs right – balance the effort to match the objectives
  • Impact – who are the authoritative sources?  Quote from key reports e.g. Gartner

Writing the Proposal

  • Small team of good writers (native English speakers), separated away from other work, usually in a shared office – use study leave

Submission

  • The deadline is final!

Networking

  • The difference between a good academic and a good academic with project money is networking
  • Info days
    • Often include networking session to find partners/projects
    • ICT Events (different in different fields)
    • Keynotes – invite a Commission representative to conferences you organise

Easy way to start

  • Become a project or proposal reviewer
  • The OU is a world leader in Pedagogy – lead training workpackages

Sarah Gray (Research Office)

Research Support

E-mail: Research-Grants-Contracts@open.ac.uk

  • Work closely with faculty administrators
  • Review and approve all external bids (e.g. to Leverhulme and the UK Research Councils)
  • Sarah is EU co-ordinator
  • Finding funding opportunities
    • Research.professional.com
    • Current opportunities page
    • Visit UKRO and register e-mail address
    • National Contact Points in UK.Gov
  • Open calls on WWW page URL: search EU Horizon 2020 Participant Portal

 

— end —

Doug’s blogs from LAK14

I have not made it to LAK14 but am following the conference through Doug Clow’s blog posts.  I commend them to you if you are interested in Learning Analytics.

See: http://dougclow.org/2014/03/25/lak14-tuesday-am-3/ 

 

 

Why Educators Need to Know Learning Theory

I highly recommend the following blog post on Learning Theory: Why Educators Need to Know Learning Theory.

Personalisation for Accessibility (EU4ALL)

An animation illustrating the principles of personalisation for accessibility. Produced by the EU4ALL project in 2011.

