I’m listening now to Tara McPherson on humanities research in a networked world, the opening session of the Digital Scholarship day of ideas. (I started late due to a change in the start time.)
Discussing how large data sets can be presented in a variety of interfaces: for schools, researchers and publishers; we are only now beginning to realise the variety of modes of presenting information across all discipline areas. Humanities scholars are not trained in tool building but should engage in it, drawing on their historic work on text, embodiment etc., and she points to working with artists on such interpretive tool building – see Mukurtu, an archive platform designed by an anthropologist based on work with indigenous people in Australia. The tools allow indigenous people to control access to knowledge according to their own knowledge exchange protocols.
The Open Ended Group creates immersive 3D spaces that are designed to be engaging rather than realistic, and are more usually found in an experimental art gallery. She also showed an example of a project based on audio recordings of interviews with drug users at needle exchanges.
Vectors is a journal examining these sorts of interactive and immersive experiences and research. It involves ‘papers’ that interact, mutate and change, which challenges the notion of scholarship as stable. The interactive experiences are developed in collaboration with scholars in a long, iterative process that is not particularly scalable.
The development of a tool-building process was a reaction to the problematising of interaction with data sets. An example is HyperCities, which extends Google Maps across space and time.
The Alliance for Networking Visual Culture brings together universities and publishers to reconsider the scales of scholarship and to use material from visual archives. The process starts with the development of prototypes. Scalar emerged from the Vectors work as a publishing platform for scholars using visual materials. It allows scholars to explore multiple views of visual materials linked to archives and to associated online materials on Critical Commons (under US ‘fair use’, allowing legal use of commercial material). Scalar allows a high level of interactivity with the material of (virtual) books and learning materials.
The aim is to expand the process of scholarly production and to rethink education. For example, USC has a new PhD programme in media studies in which PhD students make (rather than write) a dissertation: see Take Action Games as an example.
Thinking about scholarly practice in an era of big data and archives:
- valuing openness
- thinking of users as co-creators
- assuming multiple front-ends/interfaces
- scaling scholarship from micro to macro
- learning from experimental and artistic practices
- engaging designers and information architects
- valuing and rewarding collaboration across skill sets
Scalar treats all items in a data set as being at the same ‘level’, affording alternative and different ways of examining and interacting with the data.
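As a rough illustration of that flat model (my sketch of the idea, not Scalar’s actual implementation or API – all names here are invented), every item can be thought of as a node at the same level, with alternative views being nothing more than different orderings over the same nodes:

```python
# Hypothetical sketch of a "flat" content model: every item is a node at the
# same level, and alternative views are just different paths (orderings)
# over the same set of nodes.

nodes = {
    "clip-01": {"type": "video", "title": "Archive clip"},
    "essay-01": {"type": "text", "title": "Critical commentary"},
    "map-01": {"type": "image", "title": "Location map"},
}

# Two alternative "paths" through the same flat set of nodes.
chronological_view = ["clip-01", "map-01", "essay-01"]
thematic_view = ["essay-01", "clip-01", "map-01"]

def render(path, nodes):
    """Return the titles of a path's nodes in order."""
    return [nodes[key]["title"] for key in path]
```

Because no item is subordinate to another, a new view is just a new list of keys; the underlying nodes never change.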
The USC School of Cinematic Arts has a long history of using multimedia in assessment practices and of developing criteria for it. They have also developed guidance on the evaluation of digital scholarship for appointment and tenure. The key issue here has been dealing with attribution in collaborative production.
Now moving on to the next session of the day with Jeremy Knox, who is researching open education, questioning the current calls for restructuring higher education around autonomous learning, and developing a critique of the open education movement. He is discussing data collection in MOOCs in terms of:
- Objectives of education
- Bodies and how the human body might be involved in online education
He starts by discussing what a MOOC is: free, delivered online and massive. MOOCs are delivered via universities on platforms provided by the main players such as Udacity, Coursera and edX.
Most MOOCs involve video lectures and quizzes supported by discussion forums, and are assessed through an automatic process (often multiple-choice quizzes) due to the number of students.
Data collection in MOOCs is an example of big data in education, allowing learning analytics to optimise the educational experience, including through personalisation.
Data is collected specifically from the MOOC platforms. edX claims to use the data both to inform its MOOC delivery and to inform development of campus-based programmes at MIT.
Space – where is the MOOC? The edX website includes images of campus students congregating around the bricks and mortar of the university. Coursera makes use of many images of physical campus buildings. There are also many images of where students are from, shown through images of the globe – see here.
The metaphor of the space of the MOOC is both local and global.
Jeremy taught on one of the six MOOCs delivered by the University of Edinburgh. Students often used visual metaphors of space in their experience of the MOOC: network spaces, flows and spaces of confusion. The space metaphor is also used by instructors in delivering MOOCs, such as in video tours of spaces. The instructors seek to project the campus building as the ‘space of the MOOC’, and this impacts on the student experience of the MOOC. The buildings may have agency.
What else might have agency in the experience of education? For example, the book as a key ‘tool’ of education. Jeremy developed an RFID system so that tagged books send a tweet with a random sentence from the book when placed on a book-stand/sensor, as a playful way of collecting data. So Twitter streams include tweets from students/people and from books.
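The core of that playful setup could be sketched roughly as below. This is only my guess at the logic, with the RFID reader and Twitter client stubbed out; all names (`on_tag_read`, `send_tweet`) are invented for illustration, not the actual project code.

```python
import random
import re

def random_sentence(text: str) -> str:
    """Split the book's text into sentences and return one at random."""
    sentences = [s.strip() for s in re.split(r"(?<=[.!?])\s+", text) if s.strip()]
    return random.choice(sentences)

def on_tag_read(book_title: str, book_text: str, send_tweet) -> str:
    """Hypothetical callback fired when the RFID reader detects a tagged book.

    Picks a random sentence, prefixes the title, trims to tweet length,
    and hands the result to the (stubbed) send_tweet function.
    """
    tweet = f"{book_title}: {random_sentence(book_text)}"[:140]
    send_tweet(tweet)  # in the real system this would call the Twitter API
    return tweet
```

For example, `on_tag_read("A Book", "First one. Second one!", print)` would print the title followed by one of the two sentences, chosen at random.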
Another example is how YouTube’s recommended videos recontextualise a video among other videos, as a mesh of videos and algorithms.
The body in MOOCs? It is taken into account through Signature Track, which uses the body to track the individual student. He is now showing a Kinect sensor used to analyse how body position changes interaction with a MOOC course, which allows the body to intervene in and impact on the course space.
How can the body of the teacher be other than the body of the external gaze?
Now moving to a Skype session with Sophia Lycouris, Reader in Digital Choreography at Edinburgh College of Art, who is researching the use of haptic technologies to enable people with impaired sight to experience live dance performance – see here. A prototype has been developed to allow users to experience some movements of the dance through vibrations. Again, it uses a Kinect.
The project explores the relationship between arts and humanities and innovations in digital technology as trans-disciplinary, alongside accessing and experiencing forms of performing arts. In particular, she is interested in how technology changes the practice itself and how arts practice can drive technological change (not just respond to it).
The Kinect senses movement, which is transformed into vibrations in a pad held by the participant.
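A minimal sketch of what such a movement-to-vibration mapping might look like – this is my assumption about the approach, not the project’s actual code: the magnitude of a tracked joint’s movement between frames is scaled to a vibration intensity between 0 and 1.

```python
import math

def movement_magnitude(prev, curr):
    """Euclidean distance between two (x, y, z) joint positions in metres."""
    return math.sqrt(sum((c - p) ** 2 for p, c in zip(prev, curr)))

def vibration_intensity(prev, curr, max_movement=0.5):
    """Scale frame-to-frame movement to a 0..1 vibration intensity.

    Movement at or beyond max_movement metres (an assumed calibration
    value) is clipped to full intensity.
    """
    return min(movement_magnitude(prev, curr) / max_movement, 1.0)
```

The intensity value would then drive the vibration pad, so larger or faster movements on stage are felt as stronger pulses.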
She discussed some problems, as Microsoft is now limiting the code changes needed for the project.
The device does not translate the dance but provides an alternative experience equivalent to seeing it. The haptic device becomes a performance space in its own right that is not necessarily similar to the visual experience. So the visual landscape of a performance becomes a haptic landscape to be explored by the wandering fingers of blind users.
The project is part of a number of projects around the world looking at kinesthetic empathy.
Question: what models are being used to investigate the intersection of the human and the digital? Sophia focuses on using the technology as a choreographic medium, moving away from the dancing body. Jeremy’s research is underpinned by theories of post-humanism that decentre the human: socio-materialism, Actor Network Theory and spatial theory.
Now on to Mariza Dima on design-led knowledge exchange with creative industries in the Moving Targets project, focusing today on the methodological approach to knowledge exchange.
Moving Targets is a three-year project funded by the SFC for creative industries in Scotland, including sector intermediaries and universities, to involve audiences in collaboration and co-design. The interdisciplinary research team includes design, games and management. The project targets SMEs as well as working with BBC Scotland.
Knowledge exchange is an alternative to the transfer model: the exchange model emphasises interaction between all participants to develop new knowledge and experiences. The project used design as a methodological approach in the co-design of problem identification and problem solving.
They used experiential design, which is design as experience: the designer is not an expert but supports collaboration; it is transdisciplinary; experience and knowledge are closely related; and it is interactional, working in a context of complexity.
The process has stages of research, design and innovation, with innovation tending towards incremental improvement that returns to research. Knowledge is developed as a concept through research and as an experience through design and innovation. Phases:
Research involves secondments into companies as immersion: researching areas for improvement, gaining and sharing knowledge, and undertaking tasks/activities. An example is working with CulturalSparks on community consultation related to the cultural programme of the Commonwealth Games 2014. Research workshops were also held on a quarterly basis.
Design of interventions with companies and audiences used an e-business voucher scheme. They ran a number of prototyping projects, including one looking at pre-consumption theatre audience engagement.
Innovation is based on two streams: (a) application of knowledge within the company and (b) identifying transferable knowledge. They have developed new processes, digital tools and products with the aim of creating longer-term impact through process improvements and tacit understandings by both the companies and the universities/intermediaries.
The experience of the clients was very variable. Agencies were much more receptive to working with higher education, while micro-enterprises were more cautious as they have limited resources. So with companies they took a more business-like approach focused on outcomes, and this has had positive impact.
The project’s focus is on supporting creative industries companies to engage with rapid changes in audiences driven by technological change.
Now on to looking at invisible work in software development, data curatorship and invisible data consumption in industry, government and research. The research framework is based on the social shaping of technology, infrastructure studies and the sociology of business knowledge.
The research focused on climate science, due to the importance of the interface between data and modelling projections through software, and also on modelling data in manufacturing. In manufacturing there is a question of generic software vs localisation via specific vagueness, where metadata is under-emphasised and under-developed. Sharing data in government involved a more specific focus on curation and on sharing data without affecting data ownership. Discourse on disintermediation tends to downplay the costs of co-ordination, particularly in respect of trust relations.
Data consumption is linked to issues in data visualisation that aggregates and simplifies data presentation, leading to careless consumption of data. Consumers have a preference for simplified visualisations, such as the two-by-two matrix to aid prioritisation. Such matrices become the shared language for users and the market, or are amended into different simplified visualisations such as waves or landscapes.
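The two-by-two matrix mentioned here can be sketched as a simple classifier: each item is scored on two axes and placed in one of four quadrants. The axis names ("impact" and "effort"), thresholds and quadrant labels below are my own illustrative assumptions, not anything from the talk.

```python
def quadrant(impact: float, effort: float, threshold: float = 0.5) -> str:
    """Place an item in a 2x2 matrix by comparing each 0..1 score to a threshold."""
    if impact >= threshold:
        return "quick win" if effort < threshold else "major project"
    return "fill-in" if effort < threshold else "avoid"

# Hypothetical items scored as (impact, effort) pairs.
items = {"tool A": (0.8, 0.2), "tool B": (0.9, 0.7), "tool C": (0.1, 0.9)}
matrix = {name: quadrant(i, e) for name, (i, e) in items.items()}
```

The simplification (and the carelessness it can invite) is visible in the code: two continuous scores collapse into one of just four labels.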
The specific vagueness of the software ontologies makes comparability of the data across platforms and contexts impossible.
The study on ERP involved videoed observation; situational analysis was used in the study on government software to generate grounded data analysis; and the study on data visualisation involved direct interviews with providers and users of data.
Ontologies were discovered to be useless – a life-changing discovery!