
Theorising Technology in Digital Education

These are my notes taken during the presentation and then tidied up a bit in terms of spelling and sense – so they may well be limited, partial and mistaken!

Sian Bayne opened with the practical drama of the day ("fire! toilets!") and confirmed that the event is being livestreamed and that the video is available here.
Lesley Gourlay, as chair for the day, also welcomed participants from across the UK and Copenhagen. The event seeks to provide a forum for a more theorised and critical perspective on technology in higher education within the SRHE (Society for Research into Higher Education). Prof Richard Edwards at the School of Education gained funding for international speakers for today's events. Unfortunately Richard is ill and can't be here.

The theme of the event is developing the theoretical, ethical, political and social analysis of digital technologies, shifting away from an instrumentalist perspective. The event Twitter hashtag is #srhe

The first presentation is by Alex Juhasz on the distributed online network FemTechNet. FemTechNet as a network does not often speak to the field of education, so this is a welcome opportunity (she has also blogged on the event here).

FemTechNet is an active network of scholars, technologists and artists interested in technology and feminism. The network is focused on both the history and future of women in technology sectors and practices. FemTechNet is structured through committees and has a deep process-focused approach to its work that is important in terms of feminist practices. Projects involve the production of a white paper, teaching and teaching practices, workshops, open office hours, co-teaching, etc., modelling the interaction of theory and practice. But it has been difficult to engage students in collaborative projects, while staff/professors are much more engaged. Town halls are collaborative discussion events, with an upcoming event on Gamergate to include a teach-in. FemTechNet have also produced a 'rocking' manifesto as "feminist academic hacktivism" and "cyberfeminist praxis".
FemTechNet values are made manifest in Distributed Open Collaborative Courses (DOCCs), themed on Dialogues on Feminism and Technology (2013) and Collaborations in Feminism and Technology (2014). DOCCs stand against the xMOOC model, promoting a participative approach to course design and distributed approaches to collaboration. The DOCC was labelled the feminist anti-MOOC, based on deep feminist principles including wikistorming, and has attracted much press and other interest, some positive and some 'silly' (Fox News). FemTechNet has lots of notes on using tools and teaching approaches that can be applied across many critical topics beyond feminism alone.
DOCCs are designed to be distributed, with a flatter hierarchy and less of a focus on massiveness, using technology in an open way to co-create knowledge beyond transmission. More details on the DOCC as a learning commons vs course can be found here.
The FemTechNet commons is now housed and redesigned at the University of Michigan, although this may be a way of universities avoiding Title IX violations. But as a result, the newer commons has become less open and collaborative as an online space.
Much of FemTechNet's work involved overcoming technological hurdles and was based on the unpaid work of members. FemTechNet engages with critique of labour practices and contexts in higher education.
The DOCC networks involve a wide scope of different types of universities, from Ivy League universities to community colleges and community organisations, working collaboratively.
Student numbers are fairly small, at approximately 200 students, but with very high completion rates and very positive feedback and evaluations. Between 2013 and 2014 there was not really growth in scale, partly due to limitations of infrastructure. Now, with the support of the University of Michigan, there is an increased aspiration to develop international collaborative work.
DOCCs involve networking courses from many different fields of study, ranging from on-campus to fully online courses. The basic components of courses are keynote dialogue videos, smaller keyword dialogues and five shared learning activities. See also the situated knowledge map [link]. There is a big emphasis on shared resources, cross-disciplinarity and inter-institutional working and learning.
So while DOCCs emerged from a feminist network, the tools, models and approaches can be used in many subject areas.

After lunch

Ben Williamson is presenting on Calculating Academics: theorising the algorithmic organisation of the digital university. The opening slide is of a conceptualisation of a digital university that can react to data and information that it receives. Ben will be presenting on a shift to understanding the university as mediated by the digital, with a focus on the role of algorithms.
One of the major terms in use is the smart university: based on big data to enhance teaching, engagement, research and enterprise, to optimise and utilise the data universities generate. This turn is situated within the wider concept of 'smart cities'.
Smart cities are 'fabricated spaces' that are imaginary and unrealised, perhaps unrealisable. Fabricated spaces serve as models that actors aspire to realise.
Smart universities are fabricated through:
– technical devices, software and code;
– social actors, including software producers and government; and
– discourses of texts and materials.
In computer science, an algorithm is seen as a set of processes to produce a desired output. But algorithms are black-boxed, hidden in IP and impenetrable code. They are also hidden within wider heterogeneous systems involving languages, regulation and law, standards, etc.
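As a toy illustration of that computer-science sense (my example, not the speaker's), even a few lines of code form a set of explicit steps producing a desired output, while a user of the deployed system would only ever see the result:

```python
# A toy 'algorithm' in the computer-science sense: explicit steps that turn
# inputs into a desired output. In a deployed system these steps would be
# hidden - only the ranked output would be visible to users.
def rank_items(items, scores):
    # Pair each item with its score, sort by score descending, keep the items.
    return [item for item, _ in sorted(zip(items, scores), key=lambda pair: -pair[1])]

print(rank_items(["a", "b", "c"], [0.2, 0.9, 0.5]))  # ['b', 'c', 'a']
```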
Algorithms also emerge and change over time; they are, to an extent, out of control, and are complex and emergent.
Socio-algorithmic relationality: algorithms co-constitute social practice (Bucher 2012); generate patterns, order and coordination (Mackenzie 2006); and are social products of specific political, social and cultural contexts that go on to be productive themselves.
Algorithms involve the translation of human action through mathematical logics (Neyland 2014). Gillespie (2014) argues for a sociological analysis of algorithms as social and political as well as technical accomplishments.
Algorithms offer (Gillespie 2014): technical solutions; synecdoche – an abbreviation for a much wider socio-technical system; a stand-in for something else, for example corporate ownership; and a commitment to procedure, as they privilege quantification and proceduralisation.
Big data is a problem area in this idea of the smart university. Is there a different epistemology for big data? Big data cannot exist without algorithms and has generated a number of discourses. Wired magazine has suggested that big data is leading to the end of theory: there is no need to create a hypothesis, as big data will locate patterns and results, and this is a challenge to traditional academic practice. There is also the rise of commercial social science, such as the Facebook social science team, often linked to nudging behaviours and "engineering the public" (Tufekci 2014). This is replicated in policy development, such as the Centre for the Analysis of Social Media at Demos using new big data sets. We're also seeing new academic initiatives, such as social physics at MIT building a predictive model of human behaviour; see also the MIT Laboratory for Social Machines in partnership with Twitter.
This raises the question of what expertise is being harnessed for smarter universities. Ben points to the rise of alternative centres of expertise that can conduct big data analysis, whose staff Mayer-Schönberger and Cukier label 'algorithmists'. Such skills and interdisciplinarity do not fit well within the university. Are we seeing the rise of non-sociologist sociologists doing better social research?
Mayer-Schönberger and Cukier's Learning with Big Data points to predictive learning analytics, new learning platforms, etc., which are reflected in the discourses on the smarter university. Big data generates the university in immediate and real time – it doesn't have to wait for assessment returns. See, for example, IBM's education for a smarter planet, focused on smarter and prescriptive analytics based on big data.
Knewton talks of inferred student data, suggesting the algorithm is objective and consistent. But as Seaver (2014) points out, these algorithms are created and changed through 'human hands'.
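As a hedged sketch of what 'inferred student data' can look like (my toy example, not Knewton's actual algorithm), consider a mastery score updated from observed answers, where every parameter is a human design choice:

```python
# Toy mastery-inference sketch (not Knewton's system). The gain and decay
# weights, and the starting value, are human design choices - Seaver's
# 'human hands' - even though the output looks objective and consistent.
def update_mastery(mastery, answered_correctly, gain=0.3, decay=0.1):
    if answered_correctly:
        return mastery + gain * (1.0 - mastery)  # move towards full mastery
    return mastery * (1.0 - decay)               # drift back down

mastery = 0.5  # initial guess, itself a human choice
for outcome in [True, True, False, True]:
    mastery = update_mastery(mastery, outcome)
print(round(mastery, 2))  # a single authoritative-looking number
```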
So we're seeing a big data epistemology that uses statistics to explain and predict human behaviour (Kitchin 2014): algorithms can find patterns where science cannot, and you don't need subject knowledge to understand the data. But he goes on to argue that this is based on fallacies: big data is partial, based on samples, on what analysis is selected and on what data is or can be captured. Jurgenson (2014) also argues for understanding the wider socio-economic networks that create the algorithms – the capture of data points is governed by political choices.
Assumptions of big data are also influencing academic research practices. Algorithms are increasingly entwined in knowledge production when working with data – such as NVivo, SPSS and Google Scholar – in what Beer (2012) calls the algorithmic creation of social knowledge. We are also seeing the emergence of digital social research around big data and social media, e.g. the social software studies initiative – social science is increasingly dependent on digital infrastructure not of our making.
Noortje Marres calls for rethinking social research as a distributed and shared accomplishment involving the human and non-human.
This in turn influences academic self-assessment and identity through Snowball Metrics, citation scores, Researchfish etc., translating academic work into metrics. See Eysenbach's (2011) study linking tweets and rates of citation. So academics are subject to increasing quantified control mediated through software and algorithms, and we are seeing the emergence of the quantified academic self. Yet academics are socialised by these social media networks in ways that exacerbate this e-surveillance (Lupton 2014), while shared research develops its own lively social life outside of the originator's control.
Hall (2013) points to a new epistemic environment in which academics are becoming more social (media) entrepreneurial. Lyotard (1979) points to the importance and constraints of the computerisation of research.
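To make the quantified academic self concrete, here is a minimal sketch (my example) of one widely used citation metric, the h-index – the largest h such that h of an author's papers have at least h citations each:

```python
# Minimal h-index computation: a career's worth of work reduced to one
# number that software can store, compare and rank.
def h_index(citations):
    h = 0
    for rank, count in enumerate(sorted(citations, reverse=True), start=1):
        if count >= rank:  # the rank-th most-cited paper has at least rank citations
            h = rank
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4
```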
Finished with questions:
– how do cognitive-based classrooms learn?
– what data is collected to teach?
– should academics learn to code?

There was a lot of discussion on the last question. It was also pointed out that it's not asked whether coders should learn to be sociologists.
It was also pointed out that people demonstrate the importance of embodied experiences through protests and demonstrations, which reflects what is lost in the turn to data.

After a short break, we now have Norm Friesen on "Education Technology, or Education as always-already Technological". He talks about educational technology not as new but as going through a series of entwinements over time. Norm will look at the older technologies of the textbook and the lecture, looking back at older recognisable forms.
Looking back, we can argue that educational technologies now are not presenting particularly novel problems for higher education. Rather, if higher education has always been constituted together with its technologies, then we can see how practices can adapt to newer technologies now.
Technologies in education have always been about inscription and symbols as well as performance. We can understand the university as a discourse network – see Kittler's discourse network analysis of publishing in the 19th century. Institutions like universities are closely linked to technology, storing, using and modifying technologies for their practices.
Tablets go back to ancient times, as do the hornbook and other forms tightly coupled with institutions of learning and education: clay tablets dating back to 2500–2000 BCE show student work and teacher corrections as symbolic inscriptions of teaching and learning practices. Such tablets work at the scale of individual student work or of larger epic literatures. We can see continued institutional symbolic practices through to the iPad. Technologies here may include epistemic technologies such as knowledge of multiplication tables or the procedures of a lecture – technologies as a means to an end – so technologies are 'cultural techniques'.

The rest of the presentation focuses on the textbook and the lecture as technologies that are particularly under attack in the revisioning of the university. Ideas of the flipped classroom still privilege the lecture through video capture. Similarly, the textbook has yet to be overtaken by the e-textbook. Both provide continuities from over 800 years of practice and performance.
The lecture goes back to the earliest universities, where it was originally a recitation of a text – transmission rather than generation of knowledge, with a focus on retention. Developing one's own ideas in a lecture was unknown, and student work involved extensive note-taking from oral teaching (see Blair 2008). The lecture is about textual reproduction. Even following the printing press, this lecture practice continued, although slowly the lecturer's own commentary on the text was introduced, manifested as interlineations written between the lines of the dictated text. Educational practice tended not to change as rapidly as the technologies of printing, such that education lagged about 100 years behind.
But around 1800 we see the first lectures given only from the lecturer's own notes, and the lecture was recast around the individual as the creator of knowledge. So the individual lecturer and student, not the official text, became the authoritative sources of knowledge. The notion of performance also became increasingly important in the procedure of the lecture.
In textbooks we see pedagogical practice embedded in the text as end-of-chapter questions for the student to reflect and respond to (the Pestalozzian method, 1863). This approach can be seen in Vygotsky, Mead and self-regulated learning.
Specific technological configurations have supported the increased emphasis on performance, such as podcasting, PowerPoint, projectors, etc. (see TED talks).
In the textbook, similar innovations are happening in terms of layout, multimedia and personalised questioning (using algorithms). The textbook becomes an interactional experience but continues much older forms of the textbook. What is central is the familiarity of the forms – the underlying structures have persisted.

But it is also the case that lecturers no longer espouse their own theories; they do not create new knowledge in the lecture.

Making & Breaking Rules in IT Rich Environments

These are my notes taken during the presentation and then tidied up a bit in terms of spelling and sense – so they may well be limited, partial and mistaken!

Prof Kalle Lyytinen, Case Western Reserve University.

The welcome came from Robin Williams, noting that Kalle has a wide range of appointments and research interests and often acts as a bridge builder across different subject disciplines and between American and European research communities. Kalle has been particularly supportive of research on IT infrastructures and of the development of research communities on IT infrastructure.

Kalle starts the presentation with a discussion of the background to this paper, which has been developing over the last five years. His research is positioned within science and technology studies (STS) but with a more behaviourist focus. This paper investigates issues of regulation, which is fundamental to social interaction through establishing what is and is not acceptable behaviour within a specific context.

He opened with the example of the Société Générale fraud by Jérôme Kerviel, who fooled the control systems to undertake fraudulent trading, resulting in losses for the bank of approximately €5bn. This fraud was contrasted with old-fashioned approaches to bank robbery and the regulatory regimes aimed at preventing such robberies, to highlight that digital banking requires new and different regulatory regimes.

IT systems embed rules that have regulatory functions over access to and use of resources. Yet a key concern remains how social actors comply with and work around these rules. So this research is concerned with how IT can be seen as materially based organisational regulation in interaction with the social.

What is a rule? Rules tend to be defined as purely social statements of expectations on the behaviour of participants in a system, and it is assumed that such rules are generally reciprocal. These expectations should create stabilities of behaviour, yet they are not mechanistic, and so variances occur through misunderstanding, reinterpretation and resistance. For organisations, what is key is the materiality of rules – through systems, processes, expressions in space design and so forth – which also generates stability over space and time. Regulation combines social and material components, intertwined in a practice, that decrease variance in behaviours and also facilitate the coordination of collective action.

Regulation is a meeting point of tensions between structure and agency raising questions on, for example, centralisation vs decentralisation of decision-making.

An IT system is a dynamic and expansive resource through which regulatory power is exercised through the materialisation of rules. Rules are stored, diffused and enforced through IT. IT artefacts encode and embed rules (Latour 1996, 2005), while rules become more complex through IT systems that allow complex combinations of rules. IT can track, record and identify events at large scale, high speed and low cost – which is where big data can help identify and enforce new rules. Through IT, regulation becomes less visible as it is embedded in, for example, user interfaces.

He gave the example of high-frequency trading and how IT rules are established that limit what types of trades can be operationalised – see Lewis's Flash Boys.

Regulation has three dimensions: 1. rules that are materialised as 2. an IT artefact that is interdependent with 3. practices. Rules become coupled over time with practices (such that the rule may be forgotten as it is embedded in the IT artefact).

IT regulation research from the 1970s to the 1990s viewed regulation as oppressive and deterministic, while research from the 1990s onwards was more concerned with deviation in practice. A lot of research in regulation positioned IT as a contextual variable, while a much smaller number of studies looked specifically at the IT in terms of materialisation, the enactment of rules in practices and temporal aspects (Leonardi 2011). So research on IT and regulation is limited.

This research focuses on the co-existence of multiple IT-based regulations, generating heterogeneous and conflicting regulations with multiple consequences.

Our focus is on practices of maintaining and transforming rules that mediate collective activity. Regulations are based on three types of origins: (i) autonomy, where people agree on behaviours; (ii) control-orientated, based on explicit rules and laws; or (iii) joint. The research is interested in practices in IT-rich environments, as rules become more invisible when they are 'inscribed' into technology and/or material. The same rule can be embedded in different ways, e.g. speeding rules embedded in speed bumps and/or in a vocal warning from the speedometer.
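A minimal sketch of that last point (my own example, not Kalle's): the same speed rule materialised in software rather than in concrete, with both the threshold and its enforcement inscribed by a developer:

```python
# The speeding rule materialised in code rather than in a speed bump.
# The constant is the rule; the warning is its enforcement at the interface.
SPEED_LIMIT_KMH = 30  # a design choice inscribed by the developer

def check_speed(current_kmh):
    if current_kmh > SPEED_LIMIT_KMH:
        return f"Warning: {current_kmh} km/h exceeds the {SPEED_LIMIT_KMH} km/h limit"
    return "OK"

print(check_speed(42))
```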

The study was a seven-year longitudinal study of regulatory episodes in a virtual learning environment (VLE): how teaching and learning behaviours are regulated through the VLE. Data were gathered from email logs, interviews and document analysis. The analysis focused on critical incidents, simple statistics and lexical analysis of emails.
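As an illustration of the sort of simple lexical analysis that might be run over email logs (my sketch; the messages and method are invented, not taken from the study):

```python
# Hypothetical lexical analysis over email logs: tokenise the messages
# and count word frequencies.
from collections import Counter
import re

emails = [
    "Please use the discussion forum for all course queries",
    "The forum tool is now restricted to tutor accounts",
]

word_counts = Counter(
    word
    for message in emails
    for word in re.findall(r"[a-z]+", message.lower())
)
print(word_counts.most_common(3))
```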

The research questions were: 1. what is the focus of the regulatory episodes? and 2. what is the temporal coupling between regulation and behaviour? The VLE provides a rich environment with alternative forms of regulation, dynamic in terms of wider changes in higher education, with rules embedded in the application and in how it is used.

Five types of regulatory episodes, all of which changed over time:

1. Functional – restrictions on how users use the VLE based on the functionality of the VLE

2. Tool orientated – specific tools are imposed for specific activities

3. Role orientated – which roles can use which aspects of the VLE

4. Procedure orientated – where learning processes such as course activities are practiced in new ways

5. Opportunity orientated.

Material regulation is dominant in functional and tool orientated rules while the social was dominant in role and procedure orientated rules.
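As an illustrative sketch of a role-orientated rule materialised in software (my own example, not from the study), a VLE might encode which roles can use which features, enforcing the rule invisibly at the interface:

```python
# Hypothetical role-orientated rule in a VLE: which roles may use which
# features. Users never see the rule itself, only its effects.
VLE_PERMISSIONS = {
    "tutor": {"grade_assignments", "post_announcements", "upload_content"},
    "student": {"submit_assignment", "post_to_forum"},
}

def can_use(role, feature):
    return feature in VLE_PERMISSIONS.get(role, set())

print(can_use("student", "grade_assignments"))  # False: the rule, materialised
```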

The complexity of the multiplicity of rules and sources of rules led to confusion and difficulties in enforcing rules but, where levels of constraint were low, was also a source of innovation in practices. Also, increasing the formal limits of the IT systems generated conflict over the rules.

As the operationalisation of the VLE continued over time, so the complexity and volume of rules increased.

Over time the central administration of the university asserted increased control over the VLE for purposes of efficiency and uniformity of provision, but also to legitimise its own existence. This increased control, however, also removed a lot of local innovations. The materialisation of the rules in the VLE enabled greater centralised control, but IT choices then limit what future flexibility may be possible.