It has been a more hectic couple of weeks in some ways, with:
- more exam boards, as it's that time of year
- continuing to plan course staffing for next year, so my head was buried in spreadsheets for a while
- researching literature on communities on Twitter and how the affective aspects of communities might distinguish them from networks
- meetings, lots of meetings …
- assessing applications to be part of an exciting new initiative to launch in the New Year
- reading up on Open Badges for possible inclusion in a new course launching in January 2016
- attending the ReCon conference on open data and open publishing at Edinburgh University Business School. My notes on the conference can be found here.
- supervising a number of super dissertations, which is great!
So here we are at ReCon, Research in the 21st Century: Data, Analytics and Impact, at the University of Edinburgh's Business School. I'll be taking notes here throughout the day, but these will be partial, picking up the main points of interest to me.
The conference is opening with Jo Young from the Scientific Editing Co giving the welcome and introduction to the event.
The first session is from Scott Edmunds of GigaScience on "Beyond Paper". Have the 350-year-old practices of academic publishing had their day, with the advertising of scholarship now formulated around academic clickbait? Taken to extremes, we can see bribery around impact factors, papers written to order, guaranteed publication and so on. This has led to an increase in retractions (fifteen-fold in the last decade), such that by 2045 as many papers will be retracted as published – and then we're into negative publishing.
We need to think of new systems of incentives, and we now have the infrastructure to do this, especially data publishing such as GigaScience provides.
GigaScience has its own data publishing repository as well as an open access journal with an open and transparent review process. Open data and data publishing are not new: it was how Darwin worked, depositing collections in museums and publishing descriptions of finds before the analysis that led to On the Origin of Species.
Open data has a moral imperative regarding data on natural disasters, disease outbreaks and so forth. Releasing data leads to sharing and analysis of that data, as in the E. coli genome analysis: traditional academic outputs were created, but it is also used as an example of the impact of open data (see the Royal Society report here). The crowdsourced approach to genome sequencing is being used on, e.g., Ebola and on rice genomes addressing the global food crisis. But the publishing of analysis remains slow and needs to move closer to real-time publishing.
So we're now interested in executable data, looking at a research cycle of interacting data and analysis leading to publications, and at micro- and nano-publications that retain DOIs. A lot of this is collected on GitHub.
Also looking at the sharing of workflows using the Galaxy system, again giving DOIs to particular workflows (see GigaGalaxy), and at sharing virtual machines (via Amazon).
Through analysis of published papers, they found high rates of errors, but also that replication was very costly.
So the call is "death to the publication, long live the research object", to reward replication rather than scholarly advertising.
Question: how is the quality of the data assured?
Journal publications are peer reviewed, and checks are done using GigaScience's own data scientists, while open data is not checked. Tools are available, and being developed, that will help improve this.
Now on to Arfon Smith from GitHub on predicting the future of publishing, looking at open source software communities for ideas that could inform academic publishing. GitHub is a solution to the issues of version control for collaboration, using Git technology. People use GitHub for different things: from single files through to massive software projects involving 7m+ lines of code. There are about 24m projects on GitHub, and it is often used by academics.
He will be talking about the publication of software and data rather than papers. Assumptions for the talk are: 1. open is the new normal; 2. the PDF is an increasingly unsatisfactory way of sharing research; and 3. we are unprepared to share data and software in useful ways.
GitHub is especially being used in the data-intensive sciences. There is the argument that we are moving into a new paradigm of science, beyond computational science into data-intensive science (data abundance) and Big Science.
Big Science requires new tools, ways of working and ways of publishing research. But as we become more data intensive, reproducibility declines under traditional publishing. In the biosciences, many methods are black boxed and so it is difficult to really understand the findings – which is not good!
To help, GitHub have a guide on how to cite code by giving a GitHub repository a DOI (via Zenodo) for academics.
The open source practices that are most applicable are:
1. rapid verification, e.g. through verification of pull requests, where the community and third-party providers undertake testing or use metrics that check the quality of the code (e.g. Code Climate). So verification can and should be automated, and open source is "reproducible by necessity". In academia we can correspondingly see the rise of benchmarking services – see, for example, Recast, or the benchmarking of algorithm performance.
2. innovation where there are data challenges, by drawing on a culture of reuse around data products to filter out noise in research and enable focus on the specific phenomena of interest (by eliminating data from other analyses).
3. better credit: normal citations are not sufficient for software, and academic environments do not reward tool builders. So there is an idea of distributing credit to authors, tools, data and previous papers, making the credit tree transparent and comprehensive.
These innovations depend on the forming of communities around challenges and/or where open data is available.
The open software community has developed a number of solutions to the challenges faced in academic publishing.
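As a toy illustration of the automated verification described above, here is a sketch in Python. The checks, names and submission format are invented for the example – this is not any real CI service's API, just the shape of the idea: mechanical checks that run on every submission, the way tests run on every pull request.

```python
# Illustrative only: hypothetical checks run automatically against a
# submitted research object, in the spirit of CI checks on a pull request.

def check_has_license(files):
    """Pass if the submission ships a licence file."""
    return any(name.lower().startswith("license") for name in files)

def check_data_readable(rows):
    """Pass if every data row has the same number of columns."""
    widths = {len(row) for row in rows}
    return len(widths) <= 1

def run_checks(submission):
    """Run each check and return a {check_name: passed} report."""
    return {
        "license": check_has_license(submission["files"]),
        "data_shape": check_data_readable(submission["rows"]),
    }

submission = {
    "files": ["LICENSE", "analysis.py", "data.csv"],
    "rows": [[1, 2, 3], [4, 5, 6]],
}
print(run_checks(submission))  # {'license': True, 'data_shape': True}
```

Services like Code Climate run far richer versions of such checks; the point is simply that they are mechanical, and therefore automatable.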
Now we've moved on to Stephanie Dawson, CEO of ScienceOpen, on "The Big Picture: Open Access content aggregators as drivers of impact", which is framed in terms of information overload – a growth trend that is not going away. This is reinforced by the economic advantage in open access of publishing more, along with increased interest in open data, micro-publications etc. At the same time, the science information market is extending to new countries such as India, Brazil and China.
Discovery is largely through search engines, indexing services (Scopus, Web of Science), personal and online networking (conferences, Mendeley) and so on. But these do not rank knowledge in ways that provide reputation, orientation, context and inspiration.
Current tools: the journal impact factor is a blunt tool that doesn't work at the individual paper level, but it is still perceived as important by academics – and by publishers, as pricing correlates with impact factor. Article-based tools such as usage and dissemination metrics are common.
There is an opportunity for open access to make access to published papers easier, which may undermine publishing paywalls and encourage academics to look to open access channels. But open access publications are about 10% of the total and on a lower growth trajectory, so further incentives are needed for academics to support open access publication.
ScienceOpen is an open access communication platform with 1.5m open access articles plus social networking and collaboration tools. The platform allows commenting on, disseminating, reviewing or 'liking' an article. It will develop an approach to enable the ranking of individual articles, which can be bundled with others, e.g. by platform users or by publishers [so there is a shift towards alternative and personalised forms of article aggregation that can be shared as collections?].
Question: impact factors can be gamed, as can alternative metrics. What is key is the quality of the data used and the analysis – metrics for how believable articles are?
We're looking at how to note the reproducibility of article findings, but this isn't always possible, so edited collections are a way forward.
Q: this issue of trust is not about people but should be about the data and analysis and the transparency of these – how the data came about?
So there is a need to rethink how methods sections are written. We’re also enhancing the transparency of the review process.
The final session in this section is Peter Burnhill, Director of EDINA, on "Where data and journal content collide: what does it mean to 'publish your data'?". He looked at two case studies:
1. a project on reference rot (link rot + content drift) to develop ways of archiving the web and capturing how sites/URLs have changed over time. It tracked the growth in web citations in academic articles and found 20%+ of URLs are 'rotten', with the original pages cited having disappeared, including from open archives. A remedy is to use reference management software to snapshot and archive web pages at the time of capture. The project has developed a Zotero plug-in to do this (see video here).
2. an ongoing project on URL preservation by publishers. There are many smaller publishers that are 'at risk' of being lost. This considers data as working capital (which can be private as work-in-progress) or as something to be shared.
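The "snapshot at time of capture" remedy can be sketched against the Internet Archive's public endpoints. The two endpoint URLs below are the Archive's documented ones (Save Page Now and the availability API); the helper function names are my own, and this is not how the Zotero plug-in itself is implemented – just a minimal illustration of the idea:

```python
# Sketch: archive a cited URL at the time of citation, and later check
# for an existing snapshot. Endpoint URLs are the Internet Archive's
# public ones; function names are hypothetical, for illustration.
from urllib.parse import quote

def save_page_now_url(url):
    """Requesting this URL asks the Wayback Machine to archive `url`."""
    return "https://web.archive.org/save/" + url

def availability_api_url(url):
    """JSON API reporting the closest existing snapshot of `url`."""
    return "https://archive.org/wayback/available?url=" + quote(url, safe="")

cited = "https://example.org/paper/figure1"
print(save_page_now_url(cited))
print(availability_api_url(cited))
# An actual snapshot request would then be, e.g.:
#   import urllib.request
#   urllib.request.urlopen(save_page_now_url(cited))
```

Building this into reference managers means the citation records a fixed snapshot rather than a URL that may rot or drift.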
The idea of open data is not new to science and can be seen in comments on science from the 19th Century.
The web and archiving problematise the issues of fixity and malleability of data.
We’re back following a brief coffee break.
Next up is Steve Wheeler on "The Future is Open: Education in the digital age". He will be talking about 'openness' and what we do with the content and knowledge that we produce and have available. Publishing is about educating our community, and so should be as open as possible and freely accessible, to better educate that community.
Pedagogy comes first and technology provides the tools: we don't want technological determinism. You have to have a purpose in subscribing to a tool – technology is not a silver bullet.
"Meet Student 2.0": Student 2.0 has been using digital tools from six months old onwards. Most of our students are younger than Google! and are immersed in the digital. But I don't follow the digital natives idea, though I do see merit in the digital residents and visitors concept from White and Le Cornu.
Teachers fear technology for three reasons: 1. how to make it work; 2. how to avoid looking like an idiot; 3. they'll know more than me. For learners the concerns are about access to WiFi and power. He uses the example of the floppy disk, recognised as the save icon but not as a storage device.
Students sit in lectures with laptops as 'windows on the world' to check on and expand on what is being presented to them. And what do these windows do? Find information, engage in conversations. Another example: asking about a text on Twitter leads to a response directly from the author of that text. UNESCO talks about communities of users (2002).
Openness is based on the premise of sharing and becomes more prominent as technology makes sharing possible at scale. He mentions Martin Weller's Battle for Open and how openness as an idea has 'won', though implementation still has a lot left to do.
Community is key, based on common interest rather than proximity – communities of practice and of interest. Online, en masse, reduces the scope for anonymity and drives towards open scholarship, where academics open themselves up for constructive criticism. Everything can be collaborative if we want it to be.
Celebration, connection, collaboration and communication all go into User Generated Content (UGC). He defines UGC as having *not* been through peer review, but there is peer review through blog comments, Wikipedia and Twitter conversations. He notes Wikipedia as the largest human rhizomatic structure in the world.
Moving on to copyleft and the Creative Commons. Rheingold sees networking as a key literacy of the 21st century, in terms of amplifying your content and knowledge.
Communities of learning and professional learning networks – with a nod to six degrees of separation, though he thinks it is down to two or three degrees as we can network with people much more easily. Collaborative open networks are where information counts as knowledge if it is useful to the community. David Cormier (2007) on rhizomatic knowledge, which has no core or centre, where the connections become more important than the knowledge itself. Knowledge comes out of the processes of working together. This can be contrasted with the closed nature of the LMS/VLE, and students will shift as much as possible to their personal learning environments.
He has to mention MOOCs, and the original cMOOCs were very much about opening content at a massive scale, led by students. The xMOOC has closed and boxed the concept, generating accusations of a shallow learning experience.
Open access publishing: he gives the example of two of his papers, one in an open access journal that underwent open peer review. The original paper, the reviewer comments, the response and the final paper were all published – open publishing at its best! The other paper went to a closed journal and took three years to publish – the open journal took five months. The closed journal paper has 27 citations against 1,023 for the open journal.
Open publishing amplifies your content, e.g. the interactions generated through sharing content on SlideShare. His blog has about 100k readers a month and is another form of publication, all available under Creative Commons.
This is about adaptation to make our research and knowledge more available and more impactful.
Question: how are universities responding to openness?
It depends on the university's business model – he cites the freemium model, with a basic 'product' available for free. In the example of FutureLearn, partner content is given away for free, with either paid-for certification or use as a way of enhancing recruitment to mainstream courses.
Now time for lunch.
Now back, and looking at measuring impact with Euan Adie from Altmetric.
He uses the idea that the impact of research is about making a difference. Impact includes:
- quality: rigour, significance, originality, replicability
- attention: the right people see it
- impact: it makes a difference in terms of social, economic and cultural benefits.
In the REF, research is assessed on quality and impact. A 'high impact journal' assumes the journal is of quality and that the right people see it (attention).
Impact is increasingly important in research funding across the world, and so it is important to look at impact.
Traditional citation counts measure attention – scholars reading scholarship.
The altmetrics manifesto: acknowledging that research is available and used online, we can capture some measures of attention and impact (though not quality). This tends to look at non-academic attention through blog posts and comments, tweets and newspapers, and at impact on policy-makers. But what this gives is data – a human has to interpret it and put it into context via narrative.
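The shape of such attention metrics can be illustrated with a toy weighted count. To be clear, Altmetric's actual score is proprietary and more sophisticated; the source types and weights below are assumptions for illustration only:

```python
# Illustrative only: different attention sources weighted differently,
# then summed. The weights are invented, not Altmetric's real ones.

WEIGHTS = {
    "news": 8.0,
    "blog": 5.0,
    "policy_doc": 3.0,
    "tweet": 0.25,
}

def attention_score(mentions):
    """Sum weighted mention counts, ignoring unknown source types."""
    return sum(WEIGHTS.get(source, 0.0) * count
               for source, count in mentions.items())

mentions = {"news": 2, "blog": 1, "tweet": 40, "unknown": 7}
print(attention_score(mentions))  # 2*8 + 1*5 + 40*0.25 = 31.0
```

The number on its own says nothing about quality – which is exactly the point made above: a human still has to put it into context via narrative.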
Anna Clements on the university library at the University of St Andrews. What are the policy drivers for the focus on data? Research assessments, open access requirements (HEFCE, RCUK) and research data management policies (EPSRC, 2015). These require HE to focus on the quality of research data, with a view to REF2020, asset exploitation, promotion and reputation, and managing research income – as well as student demand/expectations, especially following the increase in fees. So libraries are taking the lead in institutional data science, within a context of financial constraints and ROI, working with academics.
St Andrews is developing metrics jointly with other HEIs as Snowball Metrics, involving the UK, US and ANZ as well as publishers; the metrics are open and free to use.
Kaveh Bazargan from River Valley Technologies on "Letting go of 350 years' legacy – painful but necessary". The company specialises in typesetting maths-heavy texts, but has more recently developed publishing platforms.
It has been, in many ways, a fairly quiet week, as I was working on:
- exam boards, as it's that time of year
- planning course staffing for next year, so my head was buried in spreadsheets for a while
- researching literature on communities on Twitter and considering the role of hashtags and trending topics in generating a sense of being part of an imagined (virtual) community
- virtual meetings with various students, and with Yulia Sidorova to discuss researching social media.
But I've mainly been feeling tired and a bit wiped out, so could probably do with a break … luckily, it's the weekend!
- attending meetings and listing tasks as I have a new academic management role.
- lots of marking and moderating marks and exam boards.
- meetings with students on Skype and Spreed.
- reading and writing on the concept of communities on Twitter.
- presented at the #mscde dissertation festival in Second Life on the supervision process that students can look forward to as part of the programme. A video of my presentation is available here, and others will become available at the MSc in Digital Education programme YouTube channel soon.
- attended an excellent presentation by Dragan Gasevic on learning analytics and the importance of context in making sense of such analytics. The presentation emphasised the importance of data literacy among students, teaching staff and institutional leadership *if* learning analytics are to make an effective contribution to improving education.
I’m attending the eLearning@Ed 2015 conference and will be attempting to live blog throughout the day.
Melissa Highton, Director of Learning, Teaching and Web Services here at Edinburgh, is starting the conference with its theme of Designing for 21st Century Learning. She wanted to ask what 21st-century learning might be and how it might be different from 20th-century learning. Many aspects of learning and education have stayed the same, but there are differences around scale, technology, teachers and teaching and, in particular, "it's not ok to not understand the internet anymore".
She highlights some trends in the sector from the New Media Consortium, around maker spaces, changing spaces for learning, BYOD, personalised learning and the wicked problems of recognition and reward for teaching.
Now moving on to a panel of Chairs in Digital Education on views of 21st century learning.
First up is Judy Hardy, School of Physics and Astronomy, with a personal view and concerns, looking at what the student experience in 2020 will be like. In many ways, it will be very similar to now: lectures, workshops, tutorials and self-study, but with much more extensive use of digital technologies. She uses an anecdote about a research methods course for honours students that includes a self-reflective assignment: many used cloud-based tools and Facebook groups, and these sorts of tools and working methods will be mobilised. She also cited research comparing active engagement in classroom teaching against more traditional (didactic) learning design, which shows active engagement has massive benefits for learning achievement.
But why is there lecturer resistance? She cited a survey showing lecturers want to teach and take pride in their teaching competences. So what are the challenges? Time – which is a proxy for many things – and pedagogical context, where innovations are abandoned early or there is perceived to be too much choice. So there are challenges of awareness, 'how-to' knowledge and why innovations in learning are important – 'principles' knowledge – and understanding these three forms of knowledge is crucial to implementing improved teaching.
Next is Sian Bayne, based in the School of Education and Professor of Digital Education. Sian is talking about Dave Cormier's Rhizo MOOC, which included tweets on one of Sian's papers that was a set reading. The paper was about striated and smooth space in online learning: striated space is formal, goal-orientated and ordered, while smooth space is nomadic, open and wandering-orientated, and these two metaphorical spaces do merge and their boundaries blur. We can map learning spaces onto striated and smooth spaces: striated spaces as VLEs/LMSs, and smooth spaces as hypertext, linkages, multimodal assessments, wikis and blogs. How do these metaphors work in 2015? We continue to have striated spaces in VLEs, progression, e-portfolios, personalisation, adaptive learning, learning analytics and gamification, but also increasingly smooth(er) spaces such as Twitter, YikYak, augmented realities, flipped classrooms, maker spaces and crowd-based learning. The bigger point is that this field is predominantly future-orientated, with lots of trend forecasts which generate an acceleration of change to adapt practices to the 'next big thing'. But trends are contingent on the situated context (the University of Edinburgh), leading to questions of what sort of institution we want to be and what the purpose of higher education is.
Judy Robertson, Chair in Digital Learning, is talking about current work on using technology to support learner goal setting. A lot of her work involves user-centred design, mainly for school pupils, related to behavioural change in education and in public health. Typically, games set goals for users, but the interest here is in users setting their own, appropriate goals. She is currently developing a game to encourage behavioural change to increase activity levels. This can also be extended to realistic goal setting for students in their study skills. So the question is how to design technology that is helpful but not intrusive.
Critter Jam (FitQuest) is an exercise game for a mobile phone to encourage children to run around. The game includes being chased by a virtual wolf, or picking up virtual coins. Children can select different goals, such as topping the leaderboard, beating their PB or setting points targets (but how to select an appropriate points goal?). Her research is on self-efficacy and on patterns of goal setting related to increased performance. It also links to resilience in the context of goal failure and adjusting goals accordingly – and this could be adapted to, for example, undergraduates.
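The appropriate-goal problem can be sketched as a simple adaptive rule: stretch the target after success, ease it back after failure, so goals stay realistic. The rule and its numbers are illustrative assumptions of mine, not FitQuest's actual algorithm:

```python
# Sketch of adaptive goal setting: stretch the target after success,
# ease it after failure. The 10% stretch and the halfway ease-back are
# invented for illustration.

def next_goal(current_goal, achieved):
    """Return the next points target given the latest performance."""
    if achieved >= current_goal:
        return round(current_goal * 1.10)   # met it: stretch by 10%
    # missed it: move partway back towards what was actually achieved
    return round((current_goal + achieved) / 2)

goal = 100
for score in [120, 90, 130]:   # a run of game sessions
    goal = next_goal(goal, score)
    print(goal)                # prints 110, then 100, then 110
```

Run over a sequence of sessions, the target tracks performance rather than sitting fixed, which is the kind of resilience-friendly, realistic goal adjustment described above.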
David Reay from GeoSciences, talking on distance education and the development of the MSc in Carbon Management involving the Schools of Business, GeoSciences and Economics. There was a clear demand from students for applied experience, and so online learning was developed as a response. Initially he developed a role-play simulation with face-to-face learning, then adapted this for online learning delivered as part of the MSc in Global Challenges. Now there is a fully online MSc in Carbon Management launching in September. He is also developing an online course in sustainability for campus-based students, linked to graduate attributes around understanding sustainability: each student will look at sustainability in their own subject area, to understand what sustainability means while having an excellent online learning experience. His research is on climate change, including the environmental impacts of online teaching and conferencing, measuring the total carbon emissions of the online programmes. The intention is to offset the carbon emissions generated by the programme – to be the greenest masters ever!
Dragan Gasevic, Professor of Learning Analytics at the Schools of Education and of Informatics, on why learning analytics is important: especially the provision of personalised feedback loops for students that acknowledge their diverse needs. We use VLEs/LMSs but also rely on many other digital technologies for learning, including on the web, through social learning, and through reflective learning with annotation technologies and blogs. In using digital technologies we leave a digital footprint; we have been collecting some of this data since the start of universities. We want to leverage this data to assist teaching, learning, policy-making etc., and this is the point of learning analytics. Learning analytics is about learning, and this must not be forgotten – it is not data crunching for its own sake but purposive. Learners are not black boxes but individuals with many different and not permanent traits, knowledge and understanding. The black box needs to be opened up to deliver the benefits of learning analytics. He looks to CLAS – the collaborative lecture annotation system – but the key is to encourage learners to use beneficial technologies. So we have a duty to inform students of the benefits of a technology and to scaffold support for students using that technology. He found that students were more engaged with technologies in graded courses, and came to internalise the use of the tool in either graded or ungraded courses. So if we teach our students to use a tool, they will continue to use that tool even if its use is not required. Learning analytics should support and validate pedagogy.
"Counts don't count much if decontextualised!" We need to account for pedagogical context in learning analytics. Also, visualisations can be harmful, especially when shown to learners/students, so we need to develop analytics literacy for students. We also need to scale up qualitative analysis to improve understanding of learners, and to develop institutional policies to support the use of analytics. But the use of learning analytics is contingent on each institutional context – one size does not fit all!
Jonathan Silvertown, Biological Sciences, is talking about the project 'virtual edinburgh', which will turn the city into a pervasive learning environment for formal and informal education. The future is already here – WiFi on buses, but also apps such as Walking Through Time, LitLong (Palimpsest), Mesh, iSpot etc. – but virtual edinburgh will also allow interaction between users. Look also to the 'nearby' function on Wikipedia. These apps and functions will be linked together through virtual edinburgh, drawing on the teaching and learning strategy priorities of giving learners agency and providing the technology to do that. Modes of interaction will involve existing and new apps, peer interaction, game play, new data layers, mashups etc. that can be used in courses or as part of self-directed (informal) learning. The ultimate objective is to make Edinburgh the City of Learning.
Question: One of the themes is on student digital literacy and what baseline of literacy should we expect students and staff to have?
Judy R: That's a really interesting question, as we cannot assume that students will know how to use technology for learning.
Judy Hardy: we need to think about how institutional and personal technologies are used, with students perhaps preferring their personal technologies.
Dragan: the focus should be on study and learning skills – these will not change, though abilities in them may decline due to the affordances of new technologies.
Dave Reay: a confession – at the start of an online course he assumed students would know about and be able to use particular technologies. Preparation with students is key.
Sian: the research is busting the idea of the digital native. The evidence is that what students come to the university with is less important than what we expect them to do. As many of the talks have suggested, context is key.
Question: on engaged learning[??]
Judy H: the flipped classroom is important in using technology to engage with larger cohorts of students, as the large lecture will not disappear.
Question: I teach honours and postgraduate students and try to get them to use newer technologies; if they are not introduced to these technologies earlier, it may be too late for them to learn to use them for learning.
Judy H: do we need to be more explicit in encouraging students to develop relevant technology skills?
Dave Reay: this will improve in patches, and it should be a question for programme convenors to develop online learning experiences within degree programmes.
Dragan: we have academic autonomy, so top-down solutions will not work. We need to consider what technologies academics are aware of and can use, and what incentives are provided to encourage the use of technologies. He suggests greater emphasis on, and recognition of, teaching.
Question: are the learning technologies we are developing taking into account accessibility and the ethical responsibilities of the university?
Dave Reay: the technologies and online courses increase the accessibility of the programmes to new and different students, avoiding some of the challenges of cost, visas and personal circumstances.
Sian: need to differentiate between learning and education – wanting to learn is different from seeking qualifications via formal education.
Dragan: accreditation is an important factor. Also, students don't come to Edinburgh just for the content but also for the experience and networks. Online learning also demands more developed abilities in self-regulated learning. We also tend to think in terms of credit hours rather than outcomes, and this can be seen in shifts towards competence-based education, including graduate attributes.
Question: what practical measures could be taken to keep academic staff up to date with what is happening with learning technologies at school level?
Judy R: the CoE does include technology in primary school, such as using Microsoft Office, but there is also extreme paranoia about anything social online and about allowing pupils outside the walled garden of, e.g., GLOW.
Judy H: not all our students come through the Scottish education system, and we need to encourage self-regulated learning for students coming from a vast range of education systems.
Jonathan S: that would be a good topic for the conference next year.
We're back from a break with Dash Sekhar, VPAA, and Tanya Lubicz-Nawrocka from Edinburgh University Students' Association on "Co-Creation: Student Ownership of Curriculum". They start with the many forms of student engagement, such as Kuh's focus on time and effort aligned to institutionally desired outcomes, while Bovill emphasises respect, reciprocity and shared responsibility between students and academics.
Co-creation operates on a continuum, from student feedback/evaluation, through students as experts in their own learning experiences expressed through student representation, to co-creation of the curriculum. So co-creation is a mutuality between students and academics, and shifts power relations between staff and students.
Putting the ideas of co-creation into action through student-led content, where students create their own projects to meet learning outcomes and assessment criteria. Technology allows for more flexible and remote learning.
Student partnerships in assessment: students select and negotiate the assessment components and weightings to create a sense of joint ownership of the assessments. This involved a democratic process for selecting the final assessment.
Social bookmarking: in a statistics course, as part of the course the students had to tag sites and posts related to the course content. These posts were used in a running 'live feed'. While fairly surface-level, this involved a shift in how students relate to course content.
We’re now moving to small group discussion so I’ll stop here and be back later.
Group work over, and we're on to Prof. Ian Pirie, Assistant Principal Learning Developments, on the use of portfolios and e-portfolios in art & design; Simon Riley (CMVM) will talk about portfolios in medicine. Portfolios are used to demonstrate research, process, methods, outcomes etc., curated for submission for assessment. Portfolios are central to the method of art & design education in the context of sustained practice – including art, design, architecture, medicine, engineering, healthcare etc. – linked to the demonstration of competence.
In the case of art, design & architecture, the portfolio is used from recruitment through to almost all assessments. Portfolios include all forms of media and are crucial for entry to the next stages of education and to professional careers.
Simon Riley, on portfolios in medicine. Medical education is governed by the GMC as a competency-based curriculum with an interest in allowing student choice. To enable the student-choice element of the curriculum, portfolios have been used since the 1990s.
The university curriculum is closely mapped to the GMC requirements. The different themes of the curriculum are pulled together through the portfolio. Portfolios include case reports, essays, project reports, reflective analyses of professional skills and of experiences, assessment (by viva) and project organisation. The reflective analysis components continue to have room for further development.
There is also a professional development portfolio including capturing the graduate attributes using Pebble+ in parallel to the programme portfolios.
Gives the example of a Group Project that uses an open WordPress site. This involves the collection and synthesis of information and knowledge.
The portfolios are being used for the demonstration of competence and reflection. Portfolios also train students for progression to postgraduate study and professional development. There is a huge amount of commonality between how medicine and art & design use portfolios.
Back to Prof. Ian Pirie on the shared pedagogy based on Kolb’s model of experiential learning. In the remaining time, the range of e-portfolios being used at Edinburgh is shown. A key issue is transferring the e-portfolio so students can use it outside of, and after, their time at the University.
Melissa Highton is in the last slot before lunch to talk about Open Educational Resources: new media for learning, and recent developments on OER at Edinburgh.
Openness is seen as a bold and positive move for the University. Initially, the University set up a task group on the development of an OER strategy; OER underpins a lot of the themes of this conference. The task group involved a range of academic and support-services stakeholders. Cites the Cape Town Declaration of 2007 as a fit with stated intentions around sharing and developing knowledge. This sharing of knowledge and learning resources is enabled by technology. But resources need amending to the local context, and we’re not sure if this is possible/legal. There are also strong opinions that publicly funded resources should be open.
A problem with the word ‘open’ is that it means different things: available, available online, accessible. There is a definition of open: “open data and content can be freely used, modified and shared by anyone for any purpose”. There is a need for rigour in the definition, in part to manage the reputational risks of stating that the university is using open resources, and so that staff understand the licensing, sharing and publishing of material. Licensing tends to be on Creative Commons licences, which fits nicely with the notion of teaching as a creative act – and this is a growing phenomenon, with 882 million items under CC licence in 2014, up from 50 million in 2006.
Fourteen countries have made national commitments to open education, including Scotland. CC-licensed material is available from all over the world – which would help in internationalising and diversifying the curriculum.
Edinburgh has launched open.ed for its open content resources. CC licences also allow us to renew and amend resources, so as technologies change, resources can be updated and so are sustainable.
…and now it’s time for lunch… and I’ll have to finish here as I’ve run out of power and the plug points don’t work…
I am currently enjoying my first extended experience of teaching in (and on) Second Life. Here are a few images of the initial orientation sessions and later tutorials in action at two of our teaching spaces:
[Update: the paper was published in January and can be found here] In the New Year, a short paper by me is to be included in a special edition of TechTrends. The abstract is:
This article explores professional learning through online discussion events as sites of communities of learning. The rise of distributed work places and networked labour coincides with a privileging of individualised professional learning. Alongside this focus on the individual has been a growth in informal online learning communities and networks for professional learning and professional identity development. An example of these learning communities can be seen in the synchronous discussion events held on Twitter. This article examines a sample of these events where the interplay of personal learning and the collaborative components of professional learning and practice are seen, and discusses how facilitation is performed through a distributed assemblage of technologies and the collective of event participants. These Twitter-based events demonstrate competing forces of newer technologies and related practices of social and collaborative learning against a rhetoric of learner autonomy and control found in the advocacy of the personalisation of learning.
I’m looking forward to it coming out – along with other excellent papers from colleagues here.
These are my notes taken during the presentation and then tidied up a bit in terms of spelling and sense – so they may well be limited, partial and mistaken!
Welcome from Sian Bayne with the drama of the day “fire! Toilets!” and confirmed that the event is being livestreamed and the video is available here.
Lesley Gourlay, as chair for the day, also welcomed participants from across the UK and Copenhagen. The SRHE (Society for Research in Higher Education) seeks to provide a forum for a more theorised and critical perspective on technology in higher education. Prof Richard Edwards at the School of Education gained funding for international speakers for today’s events. Unfortunately Richard is ill and can’t be here.
The theme of the event is developing the theoretical, ethical, political and social analysis of digital technologies, shifting away from an instrumentalist perspective. The event Twitter hashtag is #srhe
The first presentation is by Alex Juhasz on the distributed online network FemTechNet. FemTechNet as a network does not often speak to the field of education, so this is a welcome opportunity (she has also blogged on the event here).
FemTechNet is an active network of scholars, technologists and artists interested in technology and feminism. The network is focused on both the history and future of women in technology sectors and practices. FemTechNet is structured through committees and has a deep, process-focused approach to its work that is important in terms of feminist practices. Projects involve the production of a white paper, teaching and teaching practices, workshops, open office hours, co-teaching etc., modelling the interaction of theory and practice. But it has been difficult to engage students in collaborative projects, while staff/professors are much more engaged. Town halls are events for collaborative discussion, with an upcoming event on Gamergate to include a teach-in. FemTechNet has also produced a ‘rocking’ manifesto as “feminist academic hacktivism” and “cyberfeminist praxis”.
FemTechNet values are made manifest in Distributed Open Collaborative Courses (DOCCs), themed on Dialogues on Feminism and Technology (2013) and Collaborations in Feminism and Technology (2014). DOCCs push against the xMOOC model to promote a participative approach to course design and distributed approaches to collaboration. The DOCC was labelled the feminist anti-MOOC, based on deep feminist principles including wikistorming, and it attracted much press and other interest, some positive and some ‘silly’ (Fox News). FemTechNet has lots of notes on using tools and teaching approaches that can be used across lots of different critical topics beyond feminism alone.
DOCCs are designed to be distributed and with a flatter hierarchy with less of a focus on massiveness. Using technology in an open way to co-create knowledge beyond transmission. More details on the DOCC as a learning commons vs course can be found here.
The FemTechNet commons is now housed and redesigned at the University of Michigan, although this may be a way for universities to avoid Title IX violations. But as a result, the newer commons has become less open and collaborative as an online space.
Much of FemTechNet’s work involved overcoming technological hurdles and was based on the unpaid work of members. FemTechNet engages with critiques of labour practices and contexts in higher education.
The DOCC networks involve a wide scope of different types of institutions, from Ivy League universities to community colleges and community organisations, working collaboratively.
Student numbers are fairly small, with approx. 200 students, but very high completion rates and very positive feedback and evaluations. Between 2013–14 there was not really growth in scale, partly due to limitations of infrastructure. Now, with the support of the University of Michigan, there is an increased aspiration to develop international collaborative work.
DOCCs involve networking courses from many different fields of study, ranging from on-campus to fully online courses. The basic components of courses are keynote dialogue videos, smaller keyword dialogues and five shared learning activities. See also the situated knowledge map [link]. There is a big emphasis on shared resources, cross-disciplinarity and inter-institutional working and learning.
So while DOCCs emerged from a feminist network, the tools, models and approaches can be used in many subject areas.
Ben Williamson is presenting on Calculating Academics: theorising the algorithmic organisation of the digital university. The opening slide is of a conceptualisation of a digital university that can react to data and information that it receives. Ben will be presenting on a shift in the understanding of the university as mediated by the digital, with a focus on the role of algorithms.
One of the major terms being used is the smart university, based on big data to enhance teaching, engagement, research and enterprise, and to optimise and utilise the data universities generate. This turn is situated in the wider concept of ‘smart cities’.
Smart cities are ‘fabricated spaces’ that are imaginary and unrealised and perhaps unrealisable. Fabricated spaces serve as models to aspire to realise.
Smart universities are fabricated through:
– technical devices, software and code;
– social actors, including software producers and government; and
– discourses of texts and materials.
The algorithm is seen in computer science as a set of processes to produce a desired output. But algorithms are black-boxed, hidden in IP and impenetrable code. They are also hidden in wider heterogeneous systems involving languages, regulation and law, standards etc.
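As a toy illustration of that computer-science definition (my own sketch, not from the talk): even a trivial feed-ranking algorithm is a defined process producing a desired output, yet its embedded editorial choices (here, hypothetical weights for recency and popularity) stay invisible to anyone who only sees the ordered result.

```python
# A toy sketch (not from the talk): a tiny "algorithm" in the compsci sense --
# inputs, a defined process, a desired output -- whose embedded choices
# (the weights) are invisible to users who only see the ranked output.

def rank_items(items, weight_recency=0.7, weight_popularity=0.3):
    """Rank items by a weighted score; the weights are contestable choices."""
    return sorted(
        items,
        key=lambda it: (weight_recency * it["recency"]
                        + weight_popularity * it["popularity"]),
        reverse=True,
    )

feed = [
    {"name": "a", "recency": 0.9, "popularity": 0.1},
    {"name": "b", "recency": 0.2, "popularity": 1.0},
]
print([it["name"] for it in rank_items(feed)])
```

Changing the weights reorders the feed entirely, which is the sense in which such systems are “social products” of the choices built into them.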
Also, algorithms emerge and change over time and are, to an extent, out of control, as well as complex and emergent.
Socio-algorithmic relationality: algorithms co-constitute social practice (Bucher 2012); generate patterns, order and coordination (Mackenzie 2006); and are social products of specific political, social and cultural contexts that go on to be productive themselves.
They involve the translation of human action through mathematical logics (Neyland 2014). Gillespie (2014) argues for a sociological analysis of algorithms as social and political as well as technical accomplishments.
Algorithms offer (Gillespie 2014): technical solutions; synecdoche – an abbreviation for a much wider socio-technical system; a stand-in for something else, around corporate ownership for example; and a commitment to procedure, as they privilege quantification and proceduralisation.
Big data is a problem area in this idea of the smart university. Is there a different epistemology for big data? Big data cannot exist without algorithms and has generated a number of discourses. Wired magazine has suggested that big data is leading to the end of theory, as there is no need to create a hypothesis when big data will locate patterns and results – a challenge to traditional academic practice. There is also the rise of commercial social science, such as the Facebook social science team, often linked to nudging behaviours and “engineering the public” (Tufekci 2014). This is replicated in policy development, such as the centre for the analysis of social media at Demos using new big data sets. We’re also seeing new academic initiatives, such as social physics at MIT building a predictive model of human behaviour. See also the MIT laboratory for social machines in partnership with Twitter.
This raises the question of what expertise is being harnessed for smarter universities. Points to the rise of alternative centres of expertise that can conduct big data analysis, labelled ‘algorithmists’ (Mayer-Schönberger and Cukier). Such skills and interdisciplinarity do not fit well in the university. Sees the rise of non-sociologist sociologists doing better social research?
Mayer-Schönberger and Cukier’s Learning with Big Data points to predictive learning analytics, new learning platforms etc., which is reflected in the discourses on the smarter university. Big data generates the university in immediate and real time – it doesn’t have to wait for assessment returns. See, for example, IBM’s education for a smarter planet, focused on smarter and prescriptive analytics based on big data.
Knewton talks of inferred student data that suggests the algorithm is objective and consistent. But as Seaver (2014) points out, these algorithms are created and changed through ‘human hands’.
So we’re seeing a big data epistemology that uses statistics to explain and predict human behaviour (Kitchin 2014): algorithms can find patterns where science cannot, and you don’t need subject knowledge to understand the data. But he goes on to say that this is based on fallacies of big data – big data is partial, based on samples, on what analysis is selected and on what data is or can be captured. Jurgenson (2014) also argues for an understanding of the wider socio-economic networks that create the algorithms – the capture of data points is governed by political choices.
Assumptions of big data are influencing academic research practices. Algorithms are increasingly entwined in knowledge production when working with data – such as NVivo, SPSS and Google Scholar – the algorithmic creation of social knowledge (Beer 2012). We are also seeing the emergence of digital social research around big data and social media, e.g. the social software studies initiative – social science is increasingly dependent on digital infrastructure not of our making.
Noortje Marres rethinks social research as a distributed and shared accomplishment involving the human and non-human.
This in turn influences academic self-assessment and identity through snowball metrics on citation scores, Researchfish etc., translating academic work into metrics. See Eysenbach’s (2011) study linking tweets and rates of citation. So academics are subject to increasing quantified control mediated through software and algorithms – the emergence of the quantified academic self. Yet academics are socialised in and by these social media networks, which exacerbates this e-surveillance (Lupton 2014), while shared research develops its own lively social life outside of the originator’s control.
Hall (2013) points to a new epistemic environment in which academics are becoming more social (media) entrepreneurial. Lyotard (1979) points to the importance and constraints of the computerisation of research.
Finished with questions:
– how do cognitive-based classrooms learn?
– what data is collected to teach?
– should academics learn to code?
A lot of discussion on the last question. It was also pointed out that it’s not asked whether coders should learn to be sociologists.
It was also pointed out that people demonstrate the importance of embodied experiences through protests and demonstrations, which reflects what is lost in the turn to data.
After a short break, we now have Norm Friesen on “Education Technology or Education as always-already Technological”. Talking about educational technology as not new, but as going through a series of entwinements over time. Norm will look at the older technologies of the textbook and the lecture, looking back at older recognisable forms.
Looking back, we can argue that educational technologies now are not presenting particularly novel problems for higher education. Rather, since higher education has always been constituted alongside its educational practices and technologies, we can see how practices can adapt to newer technologies now.
Technologies in education have always been about inscription and symbols as well as performance. We can understand the university as a discourse network – see Kittler’s discourse networks in the analysis of publishing in the 19th century. Institutions like universities are closely linked to technology, storing and using technologies and modifying technologies for their practices.
Examples include tablets going back to ancient times, the hornbook, and other forms tightly coupled with institutions of learning and education – such as clay tablets dating back to 2500–2000 BCE that show student work and teacher corrections as symbolic inscriptions of teaching and learning practices. Such tablets work at the scale of individual student work or as larger epic literatures. We can see continued institutional symbolic practices through to the iPad. Here, technologies may include epistemic technologies such as knowledge of multiplication tables or the procedures of a lecture – technologies as a means to an end, so technologies are ‘cultural techniques’.
The rest of the presentation will focus on the textbook and lecture as technologies that are particularly under attack in the revisioning of the university. Ideas of the flipped classroom still privilege the lecture through video capture. Similarly, the textbook has yet to be overtaken by the e-textbook. Both provide continuities from over 800 years of practice and performance.
The lecture goes back to the earliest universities, originally to recite a text – so for transmission rather than generation of knowledge, with a focus on the retention of knowledge. Developing one’s own ideas in a lecture was unknown, and student work involved extensive note-taking from oral teaching (see Blair 2008). The lecture is about textual reproduction. Even following the printing press, this lecture practice continued, although slowly the lecturer’s own commentary on the text was introduced, manifested as interlineations between lines written from the dictated text. Educational practice tended not to change as rapidly as the technologies of printing, such that education was about 100 years behind.
But around 1800 we see the first lectures delivered only from the lecturer’s own notes, so the lecture was recast around the individual as the creator of knowledge. The individual lecturer and student, not the official text, became the authoritative sources of knowledge. The notion of performance also became increasingly important in the procedure of the lecture.
In textbooks we see pedagogical practice embedded in the text as end of chapter questions for the student to reflect and respond to (the Pestalozzian method, 1863). This approach can be seen in Vygotsky, Mead and self-regulated learning.
Specific technological configurations have supported the increased emphasis on performance, such as podcasting, PowerPoint, projectors etc. (see TED talks).
In the textbook, similar innovations are happening in terms of layout, multimedia and personalised questioning (using algorithms). The textbook becomes an interactional experience but continues from much older forms of the textbook. What is central is the familiar forms – the underlying structures have persisted.
But it is also the case that lecturers no longer espouse their own theories; they do not create new knowledge in the lecture.
These are my notes taken during the presentation and then tidied up a bit in terms of spelling and sense – so they may well be limited, partial and mistaken!
The welcome came from Robin Williams, noting that Kalle has a wide range of appointments and research interests and often acts as a bridge builder across different subject disciplines and between the American and European research communities. Kalle has been particularly supportive of research on IT infrastructures and of the development of research communities on IT infrastructure.
Kalle starts the presentation with a discussion of the background of this paper that has been developing over the last five years. His research is positioned within science and technology studies (STS) but with a more behaviourist focus. This paper investigates issues of regulation which is fundamental to social interactions through establishing what is and is not acceptable behaviour within a specific context.
The example of the Société Générale fraud by Jérôme Kerviel, who fooled the control systems to undertake fraudulent trading, resulting in losses for the bank of approximately €5bn. This fraud was contrasted with old-fashioned approaches to bank robbery and the regulatory regimes aimed at preventing such robberies, to highlight that digital banking requires new and different regulatory regimes.
IT systems embed rules that have regulatory functions on access to and the use of resources. Yet a key concern remains with how social actors comply with and work around these rules. So this research is concerned with how IT can be seen as materially based organisational regulation in interaction with the social.
What is a rule? Rules tend to be defined as a purely social statement on the expectations on behaviours by participants in a system and it is assumed that such rules are generally reciprocal. The expectations should create stabilities of behaviour yet are not mechanistic and so variances occur through misunderstanding, reinterpretation and resistance. For organisations, what is key is the materiality of rules through systems, processes, expressions in space design and so forth, that also generate stability over space and time. Regulation combines social and material components intertwined in a practice that decrease variance in behaviours and also facilitate the coordination of collective action.
Regulation is a meeting point of tensions between structure and agency raising questions on, for example, centralisation vs decentralisation of decision-making.
An IT system is a dynamic and expansive resource through which regulatory power is exercised by the materialisation of rules. Rules are stored, diffused and enforced through IT. IT systems encode and embed rules (Latour 1996, 2005), while rules become more complex through IT systems that allow complex combinations of rules. IT can track, record and identify events at large scale, high speed and low cost – which is where big data can help identify and enforce new rules. Through IT, regulation becomes less visible as it is embedded in, for example, user interfaces.
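A minimal sketch of this materialisation (my own illustration, not from the paper, with hypothetical roles and actions): an organisational rule such as “only instructors may publish grades” stops being a social expectation and becomes an automatic check in code, enforced whether or not users remember the rule exists.

```python
# Hypothetical sketch of a rule materialised in an IT system: the
# organisational rule "only instructors may publish grades" becomes an
# automatic permission check, largely invisible to users.

ROLE_PERMISSIONS = {
    "student": {"view_grades"},
    "instructor": {"view_grades", "publish_grades"},
}

def perform(role, action):
    """Enforce the embedded rule before carrying out the action."""
    if action not in ROLE_PERMISSIONS.get(role, set()):
        raise PermissionError(f"role '{role}' may not '{action}'")
    return f"{action} done"
```

The rule’s enforcement no longer depends on anyone invoking it socially; it is exercised by the system itself, which is the sense in which regulation becomes “less visible”.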
The example of high-frequency trading and how IT rules are established that limit what types of trades can be operationalised – see Lewis’s Flash Boys.
Regulation has three dimensions: (1) rules that are materialised as (2) an IT artefact that is interdependent with (3) practices. Rules are coupled over time with practices (such that a rule may be forgotten as it is embedded in the IT artefact).
IT regulation research from the 1970s to the 90s viewed regulation as oppressive and deterministic, while from the 1990s onwards research was more concerned with deviation in practice. A lot of research on regulation positioned IT as a contextual variable, while a much smaller number of studies looked specifically at the IT in terms of materialisation, the enactment of rules in practices, and temporal aspects (Leonardi 2011). So research on IT and regulation is limited.
The research focuses on the co-existence of multiple IT-based regulations, which generates heterogeneous and conflicting regulations and so has multiple consequences.
Our focus is on practices of maintaining and transforming rules that mediate collective activity. Regulations have three types of origins: (i) autonomy, where people agree on behaviours; (ii) control-orientated, based on explicit rules and laws; or (iii) joint. The research is interested in practices in IT-rich environments as rules become more invisible when they are ‘inscribed’ into technology and/or material. The same rule can be embedded in different ways, e.g. speeding rules embedded in speed bumps and/or in a vocal warning from the speedometer.
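The speeding example can be sketched in code (my own illustration, not from the paper): the same rule can be materialised as a hard constraint that makes violation impossible (like a speed bump or governor) or as a soft, informational one that leaves compliance to the driver (like a vocal warning).

```python
# Illustrative sketch: one rule, two materialisations.
SPEED_LIMIT = 50  # the rule itself, as a single shared value

def governed_speed(requested):
    """Hard embedding: like a speed bump, the system makes violation impossible."""
    return min(requested, SPEED_LIMIT)

def speed_warning(current):
    """Soft embedding: like a vocal warning, the rule informs but does not constrain."""
    if current > SPEED_LIMIT:
        return f"Warning: {current} exceeds the limit of {SPEED_LIMIT}"
    return None
```

Both functions encode the identical rule, but they produce very different practices: the first removes the possibility of deviation, while the second preserves it and merely makes it visible.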
The study was a 7-year longitudinal study of regulatory episodes in a virtual learning environment (VLE): how teaching and learning behaviours are regulated through the VLE. Data was gathered from email logs, interviews and document analysis. The analysis focused on critical incidents, simple statistics and lexical analysis of emails.
The research questions were: (1) what is the focus of the regulatory episodes, and (2) what was the temporal coupling between regulation and behaviour? The VLE provides a rich environment with alternative forms of regulation, dynamic in terms of wider changes in higher education, with rules embedded in the application and in how it is used.
Five types of regulatory episodes, all of which changed over time:
1. Functional – restrictions on how users use the VLE based on the functionality of the VLE
2. Tool orientated – specific tools are imposed for specific activities
3. Role orientated – which roles can use which aspects of the VLE
4. Procedure orientated – where learning processes such as course activities are practiced in new ways
5. Opportunity orientated.
Material regulation is dominant in functional and tool orientated rules while the social was dominant in role and procedure orientated rules.
The complexity of the multiplicity of rules and sources of rules led to confusion and difficulties in enforcing rules but, with low levels of constraint, were also sources of innovation in practices. Also, increasing the formal limits of the IT systems generated conflict over the rules.
As the operationalisation of the VLE continued over time so the complexity and volume of rules increased.
Over time, the central administration of the university asserted increased control over the VLE for purposes of efficiency and uniformity of provision, but also to legitimise its existence. This increased control also removed a lot of local innovations. The materialisation of the rules in the VLE enabled greater centralised control, but IT choices then limit what future flexibility may be possible.