PJ Evans

Working & teaching in Second Life

Posted on | March 8, 2015 | No Comments

I am currently enjoying my first extended experience of teaching in (and on) Second Life. Here are a few images of the initial orientation sessions and later tutorials in action at two of our teaching spaces:

[Image: Orientation at the beach]

Context, personalisation and facilitation – new paper to be published

Posted on | November 27, 2014 | No Comments

[Update: the paper was published in January and can be found here] In the New Year, a short paper by me will be included in a special edition of TechTrends. The abstract is:

This article explores professional learning through online discussion events as sites of communities of learning. The rise of distributed work places and networked labour coincides with a privileging of individualised professional learning. Alongside this focus on the individual has been a growth in informal online learning communities and networks for professional learning and professional identity development. An example of these learning communities can be seen in the synchronous discussion events held on Twitter. This article examines a sample of these events where the interplay of personal learning and the collaborative components of professional learning and practice are seen, and discusses how facilitation is performed through a distributed assemblage of technologies and the collective of event participants. These Twitter-based events demonstrate competing forces of newer technologies and related practices of social and collaborative learning against a rhetoric of learner autonomy and control found in the advocacy of the personalisation of learning.

I’m looking forward to it coming out – along with other excellent papers from colleagues here.

Theorising Technology in Digital Education

Posted on | November 7, 2014 | No Comments

These are my notes taken during the presentation and then tidied up a bit in terms of spelling and sense – so they may well be limited, partial and mistaken!

Sian Bayne welcomed us with the drama of the day (“fire! Toilets!”) and confirmed that the event is being livestreamed and the video is available here.
Lesley Gourlay, as chair for the day, also welcomed participants from across the UK and Copenhagen. The day seeks to provide a forum for a more theorised and critical perspective on technology in higher education within the SRHE (Society for Research in Higher Education). Prof Richard Edwards at the School of Education gained funding for international speakers for today’s events. Unfortunately Richard is ill and can’t be here.

The theme of the event is developing the theoretical, ethical, political and social analysis of digital technologies and shifting away from an instrumentalist perspective. The event Twitter hashtag is #shre

The first presentation is by Alex Juhasz on the distributed online network FemTechNet. FemTechNet as a network does not often speak to the field of education, so this is a welcome opportunity (she has also blogged on the event here).

FemTechNet is an active network of scholars, technologists and artists interested in technology and feminism. The network is focused on both the history and future of women in technology sectors and practices. FemTechNet is structured through committees and has a deep process-focused approach to its work that is important in terms of feminist practices. Projects involve the production of a white paper, teaching and teaching practices, workshops, open office hours, co-teaching and so on, modelling the interaction of theory and practice. But it has been difficult to engage students in collaborative projects, while staff/professors are much more engaged. Town halls are collaborative discussion events, with an upcoming event on Gamergate to include a teach-in. FemTechNet have also produced a ‘rocking’ manifesto as “feminist academic hacktivism” and “cyberfeminist praxis”.
FemTechNet values are made manifest in Distributed Open Collaborative Courses (DOCCs) themed on Dialogues on Feminism and Technology (2013) and Collaborations in Feminism and Technology (2014). DOCCs push against the xMOOC model to promote a participative approach to course design and distributed approaches to collaboration. The DOCC was labelled the feminist anti-MOOC, based on deep feminist principles including wikistorming, and attracted much press and other interest, some positive and some ‘silly’ (Fox News). FemTechNet has lots of notes on using tools and teaching approaches that can be applied across many different critical topics beyond feminism alone.
DOCCs are designed to be distributed, with a flatter hierarchy and less of a focus on massiveness, using technology in an open way to co-create knowledge beyond transmission. More details on the DOCC as a learning commons vs a course can be found here.
The FemTechNet commons is now housed and redesigned at the University of Michigan, although this may be a way of universities avoiding Title IX violations. But as a result, the newer commons has become less open and collaborative as an online space.
Much of FemTechNet’s work involved overcoming technological hurdles and was based on the unpaid work of members. FemTechNet engages with critiques of labour practices and contexts in higher education.
The DOCC networks involve a wide scope of different types of institutions, from Ivy League universities to community colleges and community organisations, working collaboratively.
Student numbers are fairly small, with approx. 200 students, but there are very high completion rates and very positive feedback and evaluations. Between 2013 and 2014 there was not really growth in scale, partly due to limitations of infrastructure. Now, with the support of the University of Michigan, there is an increased aspiration to develop international collaborative work.
DOCCs involve networking courses from many different fields of study, spanning both on-campus and fully online courses. Basic components of courses are keynote dialogue videos, smaller keywords dialogues and five shared learning activities. See also the situated knowledge map [link]. There is a big emphasis on shared resources, cross-disciplinarity and inter-institutional working and learning.
So while DOCCs emerged from a feminist network, the tools, models and approaches can be used in many subject areas.

After lunch

Ben Williamson is presenting on Calculating Academics: theorising the algorithmic organisation of the digital university. The opening slide is of a conceptualisation of a digital university that can react to data and information that it receives. Ben will be presenting on a shift to understanding the university as mediated by the digital, with a focus on the role of algorithms.
One of the major terms being used is the smart university, based on big data to enhance teaching, engagement, research and enterprise, and to optimise and utilise the data universities generate. This turn is situated in the wider concept of ‘smart cities’.
Smart cities are ‘fabricated spaces’ that are imaginary and unrealised and perhaps unrealisable. Fabricated spaces serve as models to aspire to realise.
Smart universities are fabricated through:
technical devices, software and code;
social actors, including software producers and government; and
discourses of texts and materials.
An algorithm is seen in computer science as a set of processes to produce a desired output. But algorithms are black boxed, hidden in IP and impenetrable code. They are also hidden in wider heterogeneous systems involving languages, regulation and law, standards and so on.
Also, algorithms emerge and change over time and are, to an extent, out of control, complex and emergent.
Socio-algorithmic relationality: algorithms co-constitute social practice (Bucher 2012); generate patterns, order and coordination (Mackenzie 2006); and are social products of specific political, social and cultural contexts that go on to produce effects themselves.
They involve the translation of human action through mathematical logics (Neyland 2014). Gillespie (2014) argues for a sociological analysis of algorithms as social and political as well as technical accomplishments.
Algorithms offer (Gillespie 2014): technical solutions; synecdoche – an abbreviation for a much wider socio-technical system; a stand-in for something else, around corporate ownership for example; and a commitment to procedure, as they privilege quantification and proceduralisation.
Big data is a problem area in this idea of the smart university. Is there a different epistemology for big data? Big data cannot exist without algorithms and has generated a number of discourses. Wired magazine has suggested that big data is leading to the end of theory, as there is no need to create a hypothesis when big data will locate patterns and results – a challenge to traditional academic practice. There is also the rise of commercial social science, such as the Facebook social science team, often linked to nudging behaviours and “engineering the public” (Tufekci 2014). This is replicated in policy development, such as the centre for analysis of social media at Demos using new big data sets. We’re also seeing new academic initiatives, such as social physics at MIT building a predictive model of human behaviour; see also the MIT laboratory for social machines in partnership with Twitter.
This raises the question of what expertise is being harnessed for smarter universities. Points to the rise of alternative centres of expertise that can conduct big data analysis, labelled ‘algorithmists’ (Mayer-Schönberger and Cukier). Such skills and interdisciplinarity do not fit well in the university. Sees the rise of non-sociologist sociologists doing better social research?
Mayer-Schönberger and Cukier’s Learning with Big Data points to predictive learning analytics, new learning platforms, etc., which is reflected in the discourses on the smarter university. Big data generates the university in immediate and real time – it doesn’t have to wait for assessment returns. See, for example, IBM’s education for a smarter planet, focused on smarter and prescriptive analytics based on big data.
Knewton talks of inferred student data that suggests the algorithm is objective and consistent. But as Seaver (2014) points out, these algorithms are created and changed through ‘human hands’.
So we’re seeing a big data epistemology that uses statistics to explain and predict human behaviour (Kitchin 2014): algorithms can find patterns where science cannot, and you don’t need subject knowledge to understand the data. But he goes on to argue that this is based on the fallacies of big data – big data is partial, based on samples, on what analysis is selected and on what data is or can be captured. Jurgenson (2014) also argues for an understanding of the wider socio-economic networks that create the algorithms – the capture of data points is governed by political choices.
Assumptions of big data are also influencing academic research practices. Algorithms are increasingly entwined in knowledge production when working with data – such as NVivo, SPSS and Google Scholar (Beer 2012: the algorithmic creation of social knowledge). We are also seeing the emergence of digital social research around big data and social media, e.g. the social software studies initiative – social science is increasingly dependent on digital infrastructure not of our making.
Noortje Marres suggests rethinking social research as a distributed and shared accomplishment involving the human and non-human.
This in turn influences academic self-assessment and identity through snowball metrics on citation scores, Researchfish and so on, translating academic work into metrics. See Eysenbach’s (2011) study linking tweets and rates of citation. So academics are subject to increasing quantified control mediated through software and algorithms, and we are seeing the emergence of the quantified academic self. Yet academics are socialised by these social media networks, which exacerbates this e-surveillance (Lupton 2014). Meanwhile, shared research develops its own lively social life outside of the originator’s control.
Hall (2013) points to a new epistemic environment in which academics are becoming more social (media) entrepreneurial. Lyotard (1979) points to the importance and constraints of the computerisation of research.
Finished with questions:
– how do cog based classrooms learn?
–  what data is collected to teach?
– should academics learn to code?

A lot of discussion on the last question. It was also pointed out that it is not asked whether coders should learn to be sociologists.
It was also pointed out that people demonstrate the importance of embodied experiences through protests and demonstrations, reflecting what is lost in the turn to data.

After a short break, we now have Norm Friesen on “Education Technology or Education as always-already Technological”. He talks about educational technology as not new but as going through a series of entwinements over time. Norm will look at the older technologies of the textbook and the lecture as long-recognisable forms.
Looking back, we can argue that educational technologies now are not presenting particularly novel problems for higher education. Rather, higher education has always been co-constituted with educational technologies and practices, so we can see how practices can adapt to newer technologies now.
Technologies in education have always been about inscription and symbols as well as performance. We can understand the university as a discourse network – see Kittler’s discourse network analysis of publishing in the 19th century. Institutions like universities are closely linked to technology, storing and using technologies and modifying technologies for their practices.
Examples of tablets go back to ancient times, along with the hornbook and other forms tightly coupled with institutions of learning and education – such as clay tablets dating back to 2500–2000 BCE that show student work and teacher corrections as symbolic inscriptions of teaching and learning practices. Such tablets work at the scale of individual student work or as larger epic literatures. We can see continued institutional symbolic practices through to the iPad. Here technologies may include epistemic technologies, such as knowledge of multiplication tables or the procedures of a lecture – technologies as a means to an end – so technologies are ‘cultural techniques’.

The rest of the presentation will focus on the textbook and the lecture as technologies that are particularly under attack in the revisioning of the university. Ideas of the flipped classroom still privilege the lecture through video capture. Similarly, the textbook has yet to be overtaken by the e-textbook. Both provide continuities from over 800 years of practice and performance.
The lecture goes back to the earliest university, originally to recite a text – so for transmission rather than generation of knowledge, with a focus on the retention of knowledge. Developing one’s own ideas in a lecture was unknown, and student work involved extensive note taking from oral teaching (see Blair 2008). The lecture is about textual reproduction. Even following the printing press, this lecture practice continued, although slowly the lecturer’s own commentary on the text was introduced, manifested as interlines written between the lines of the dictated text. Educational practice tended not to change as rapidly as the technologies of printing, such that education was about 100 years behind.
But around 1800 we see the first lectures given solely from the lecturer’s own notes, so the lecture was recast around the individual as the creator of knowledge. The individual lecturer and student, not the official text, became the authoritative sources of knowledge. The notion of performance also becomes increasingly important in the procedure of the lecture.
In textbooks we see pedagogical practice embedded in the text as end of chapter questions for the student to reflect and respond to (the Pestalozzian method, 1863). This approach can be seen in Vygotsky, Mead and self-regulated learning.
Specific technological configurations supported the increased emphasis on performance, such as podcasting, PowerPoint and projectors (see TED talks).
In the textbook, similar innovations are happening in terms of layout, multimedia and personalised questioning (using algorithms). The textbook becomes an interactional experience but continues from much older forms of the textbook. What is central is the familiar forms – the underlying structures have persisted.

But it is also the case that lecturers no longer espouse their own theories; they do not create new knowledge in the lecture.

Making & Breaking Rules in IT Rich Environments

Posted on | November 5, 2014 | No Comments

These are my notes taken during the presentation and then tidied up a bit in terms of spelling and sense – so they may well be limited, partial and mistaken!

Prof Kalle Lyytinen, Case Western Reserve University.

The welcome came from Robin Williams, noting that Kalle has a wide range of appointments and research interests and often acts as a bridge builder across different subject disciplines and between American and European research communities. Kalle has been particularly supportive of research on IT infrastructures and of the development of research communities on IT infrastructure.

Kalle starts the presentation with a discussion of the background of this paper that has been developing over the last five years. His research is positioned within science and technology studies (STS) but with a more behaviourist focus. This paper investigates issues of regulation which is fundamental to social interactions through establishing what is and is not acceptable behaviour within a specific context.

The example given was the Société Générale fraud by Jérôme Kerviel, who fooled the control systems to undertake fraudulent trading resulting in losses for the bank of approximately €5bn. This fraud was contrasted with old-fashioned approaches to bank robbery and the regulatory regimes aimed at preventing such robberies, to highlight that digital banking requires new and different regulatory regimes.

IT systems embed rules that have regulatory functions on access to and the use of resources. Yet a key concern remains with how social actors comply with and work around these rules. So this research is concerned with how IT can be seen as materially based organisational regulation in interaction with the social.

What is a rule? Rules tend to be defined as a purely social statement on the expectations on behaviours by participants in a system and it is assumed that such rules are generally reciprocal. The expectations should create stabilities of behaviour yet are not mechanistic and so variances occur through misunderstanding, reinterpretation and resistance. For organisations, what is key is the materiality of rules through systems, processes, expressions in space design and so forth, that also generate stability over space and time. Regulation combines social and material components intertwined in a practice that decrease variance in behaviours and also facilitate the coordination of collective action.

Regulation is a meeting point of tensions between structure and agency raising questions on, for example, centralisation vs decentralisation of decision-making.

An IT system is a dynamic and expansive resource through which regulatory power is exercised by the materialisation of rules. Rules are stored, diffused and enforced through IT. IT systems encode and embed rules (Latour 1996, 2005), while rules become more complex through IT systems that allow complex combinations of rules. IT can track, record and identify events at large scale, high speed and low cost – which is where big data can help identify and enforce new rules. Through IT, regulation becomes less visible as it is embedded in, for example, user interfaces.

The example of high frequency trading shows how IT rules are established that limit what types of trades can be operationalised – see Lewis’ Flash Boys.

Regulation has three dimensions: (1) rules that are materialised as (2) an IT artefact that is interdependent with (3) practices. Rules are coupled over time with practices (such that the rule may be forgotten as it is embedded in the IT artefact).

IT regulation research from the 1970s to the 1990s viewed regulation as oppressive and deterministic, while from the 1990s onwards research was more concerned with deviation in practice. A lot of research on regulation positioned IT as a contextual variable, while a much smaller number of studies looked specifically at the IT in terms of materialisation, the enactment of rules in practices and temporal aspects (Leonardi 2011). So research on IT and regulation is limited.

The research focuses on the co-existence of multiple IT-based regulations, which generates heterogeneous and conflicting regulations and so has multiple consequences.

Our focus is on practices of maintaining and transforming rules that mediate collective activity. Regulations have three types of origins: (i) autonomy, where people agree on behaviours; (ii) control-orientated, based on explicit rules and laws; or (iii) joint. The research is interested in practices in IT-rich environments as rules become more invisible when they are ‘inscripted’ into technology and/or material. The same rule can be embedded in different ways, e.g. speeding rules embedded in speed bumps and/or in a vocal warning from the speedometer.
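As a rough sketch of this point (my own illustration, not from the talk – the function names and the 30 km/h limit are invented), the same speeding rule can be materialised as a hard constraint, like the speed bump, or as a soft warning, like the speedometer:

```python
from typing import Optional

SPEED_LIMIT = 30  # km/h - an invented limit for illustration

def speed_bump(requested_speed: int) -> int:
    """Hard materialisation: the rule is enforced by the material itself;
    the actor simply cannot exceed the limit."""
    return min(requested_speed, SPEED_LIMIT)

def speedometer_warning(current_speed: int) -> Optional[str]:
    """Soft materialisation: the rule is made visible,
    but compliance is left to the actor."""
    if current_speed > SPEED_LIMIT:
        return f"Warning: {current_speed} km/h exceeds the {SPEED_LIMIT} km/h limit"
    return None
```

The rule itself is identical in both functions; what differs is where the variance in behaviour is allowed to occur – which is exactly the sense in which a VLE’s hard functional restrictions and its softer procedural conventions embed the same regulation differently.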

The study was a 7-year longitudinal study of regulatory episodes in a virtual learning environment (VLE), looking at how teaching and learning behaviours are regulated through the VLE. Data was gathered from email logs, interviews and document analysis. The analysis focused on critical incidents, simple statistics and lexical analysis of emails.

The research questions were: 1. what is the focus of the regulatory episodes and 2. what was the temporal coupling between regulation and behaviour. The VLE provides a rich environment with alternative forms of regulation, dynamic in terms of wider changes in higher education, rules embedded in the application and how it is used.

Five types of regulatory episodes, all of which changed over time:

1. Functional – restrictions on how users use the VLE based on the functionality of the VLE

2. Tool orientated – specific tools are imposed for specific activities

3. Role orientated – which roles can use which aspects of the VLE

4. Procedure orientated – where learning processes such as course activities are practiced in new ways

5. Opportunity orientated.

Material regulation is dominant in functional and tool orientated rules while the social was dominant in role and procedure orientated rules.

The complexity of the multiplicity of rules and sources of rules led to confusion and difficulties in enforcing rules but, with low levels of constraint, were also sources of innovation in practices. Also, increasing the formal limits of the IT systems generated conflict over the rules.

As the operationalisation of the VLE continued over time so the complexity and volume of rules increased.

Over time the central administration of the university asserted increased control over the VLE for purposes of efficiency and uniformity of provision but also to legitimise its existence. But this increased control also removed a lot of local innovations. The materialisation of the rules in the VLE enabled greater centralised control. But also that IT choices then limits what future flexibility may be possible.



Facebook network

Posted on | October 23, 2014 | No Comments

I am currently trying to catch up on the Coursera MOOC on social network analysis. My main aim in taking the course is to force myself to learn about using Gephi for network analysis. The course so far has been clear and well presented, but it is early days. Also, using Gephi on the Mavericks version of OS X has been a pain, largely due to Java, as Gephi won’t run on the default install of Java. The solution can be found on the Gephi forums here, although I’m still having some problems with Java.

I don’t use Facebook much and was a bit surprised at the density of the network as a whole but having that number of sub-clusters was less surprising considering the stop-start nature of how the network developed. I’ll have to find out who the single unconnected nodes are once the Java issues have been resolved.
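Out of interest, the same basic measures Gephi reports – density, sub-clusters (connected components) and unconnected nodes (isolates) – can be sketched with networkx; the tiny graph below is an invented stand-in for my actual Facebook network:

```python
import networkx as nx

# A made-up ego network: one dense triangle, one small pair,
# and one unconnected node.
G = nx.Graph()
G.add_edges_from([
    ("alice", "bob"), ("bob", "carol"), ("alice", "carol"),  # dense sub-cluster
    ("dan", "erin"),                                          # smaller cluster
])
G.add_node("fred")                                            # unconnected node

density = nx.density(G)                      # edges present / edges possible
clusters = list(nx.connected_components(G))  # the sub-clusters
isolates = list(nx.isolates(G))              # single unconnected nodes

print(f"density={density:.2f}, clusters={len(clusters)}, isolates={isolates}")
```

On a real network the edge list would come from an export of the Facebook data rather than being typed in by hand, but the measures are computed the same way.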

What is wrong with ‘Technology Enhanced Learning’

Posted on | October 4, 2014 | No Comments

Last Friday I attended a Digital Cultures & Education research group presentation by Sian Bayne on her recent article What’s the matter with ‘Technology Enhanced Learning’?

These are my notes taken during the presentation and then tidied up later – so they may well be limited, partial and mistaken!


16th century French cypher machine in the shape of a book with arms of Henri II. Image from Uploadalt

While Technology Enhanced Learning (TEL) is a widely used term in the UK and Europe, the presentation positions TEL as an essentially conservative term that discursively limits what we do as practitioners and researchers in the field of digital education and learning. Sian’s critique draws on three theoretical perspectives:

* Science & Technology Studies (STS) for a critique of ‘Technology’
* Critical posthumanism for a critique of ‘Enhancement’
* Gert Biesta’s language of learning for ‘Learning’

For Technology, we don’t tend to define it but rather black box it as unproblematically in service to teaching practices. This black-boxing of technology as supporting learning and teaching creates a barrier between the technology and the social practices of teaching. As Hamilton & Friesen discuss, there are two main perspectives on technology: an essentialist perspective of inalienable qualities of the technologies, or an instrumental treatment of it as a neutral set of tools. In both cases technology is understood as being independent of the social context in which it is used. Hamilton & Friesen argue we need to take a more critical stance, especially in terms of technology as the operationalisation of values, and to engage in larger issues such as social justice, the speed of change and globalisation, the nature of learning or what it is to be human.

By using the term Enhanced, TEL adopts a conservative discourse, as it assumes there is no need to radically rethink teaching and learning practices but just a need to enhance or tinker with existing practice. So enhancement aligns with transhumanism – a humanist philosophy of rationality and human perfectibility where technological advances remove the limitations of being human (Bostrom 2005).
Critical posthumanism (Simon 2003) is a philosophical critique of the humanism of the Enlightenment, its assumptions about human nature and its emphasis on human rationality, arguing that these assumptions are complicit in dominatory practices of oppression and control. The human being is just one component in a complex ecology of practice that also includes machines and non-human components in symmetry. So posthumanism is more about humility and an appreciation that our involvement as humans in our context is complex, inter-related and interactional. Yet TEL buys into a dominant transhumanism emphasising the cognitive enhancement of the mind, and so could include the use of drugs as a ‘technology’ to enhance learning – see the Technology Enhanced Learning System Upgrade report.
Transhumanism positions technology as an object acted on by the human subject, ignoring how humans are shaped by and shape technology, and does not ask whether ‘enhancement’ is good, who benefits from enhancement and whether enhancement is context-specific. It is argued that TEL could learn from the posthumanist critique of transhumanism.

The ‘problem’ of Learning draws on Gert Biesta’s writing on the new language of learning and, more specifically, the ‘learnification’ of discourses of education. This involves talking about “learning” rather than “teaching” or “education”. Learning as a term is used as a proxy for education that takes discussions away from considerations of structures of power in education itself. So learnification discursively instrumentalises education – education is provided/delivered to learners based on predefined needs rather than needs emerging and evolving over time. So learners are positioned as customers or clients of education ‘providers’, and TEL gets bound up with this neo-liberal discourse/perspective.

So the label of TEL tacitly subordinates social practice to technology while also ontologically separating the human from the non-human. The TEL discourse is aligned with broader enhancement discourse that enrols transhumanism and instrumentalisation so entrenching a particular view of the relationships between education, learning and technology.

Rather, education technologies involve complex assemblages of human and non-human components, and as practitioners and researchers we need to embrace that complexity. Posthumanism as a stance is a way of doing this, understanding learning as an emergent property of complex and fluid networks of human and non-human elements coming together. In posthumanism, the human is not an essence but rather a moment.

Distributed governance of technological innovation through the case of WiBro in S. Korea.

Posted on | September 29, 2014 | No Comments

I attended the Social Informatics Cluster meeting to hear Jee Hyun Suh present on: Co-evolution of an emerging mobile technology and mobile services: distributed governance of technological innovation through the case of WiBro in S. Korea. These are rough notes taken during the presentation.

She presented the story of WiBro and the implications for the governance of large scale technological innovations for technology companies and government. WiBro was initiated from 2001 as a national R&D programme for high speed portable internet; it was harmonised with national and international standards (WiMAX) and went to a commercial launch in 2006. It is widely seen as a case of market failure despite a successful technological innovation.

The research objectives were initially to examine the socio-technical factors in the development of the technology and the gap between the visions and outcomes of the technology commercialisation and explore the governance of large scale and complex innovations. The technology’s development was interpreted through social learning processes with a particular focus on building alignments between the technology, service evolution, standardisation and social learning within a wider development arena of R&D.

Over the course of the research period, 2001 to date, the focus of interest shifts from the design and development of the technologies to a commercial focus, and then on to a focus on service evolution. The WiBro development was linked to broader policy imperatives of positioning S. Korea as an innovation leader.

The technology itself was predicated on a problematisation of the inefficient use of the 2.3 GHz band and then the enrolment of stakeholders to co-shape a generic vision of using the bandwidth for a portable internet service. This co-evolved with the drive towards a high performance portable internet and processes of standardisation. Standard setting was closely linked to bandwidth/spectrum allocation. It became conceived as a seamlessly interlinked innovation process, but different interests and objectives across stakeholders remained unresolved, especially between a focus on technology development versus commercial exploitation through existing technologies, along with shifting alignments around the adoption of differing international standards. The technology was successfully developed and, as a pre-commercial product, was showcased at APEC 2005.
Commercialisation occurred around processes of spectrum licensing. Again, there were different visions for WiBro, e.g. as an extension of fixed line services, as a differentiated service and as a complementary service to existing mobile networks. These different visions were rolled into different commercial aims, e.g. early market advantage versus an emphasis on interoperability, adoption or blocking of VoIP, as well as the emergence of 3G services. The later development of 4G mobile resulted in shifts to the vision of WiBro and how it should evolve.
The commercial focus also bifurcated between domestic and global markets. In the domestic market there were dynamics of trial and error in finding niche markets for WiBro, e.g. mobile routers, digital shipyards, WiBro-Taxi. These market learning processes occurred despite tensions between players and their visions for the service.
The argument presented was that the 'problem' of WiBro should be framed in terms of uncertainties in innovation processes rather than as a failure of diffusion/commercialisation. The coordination challenges and dispersed arenas of innovation enabled key players to interact in the social shaping of this particular technology, highlighting the importance of stakeholder reflexivity and flexibility in large-scale technological innovations.
It was also noted during the Q&A that WiBro coincided with the testing, and general failure, of attempts at developing national technology champions that could then be exported into global markets.

Unbundling higher education

Posted on | September 10, 2014 | No Comments

These are my notes from a seminar by Amy Collier, Stanford University, titled The Good, the Bad and the Unbundled, on 27 August 2014. These notes were taken live and then cleaned up a bit, with links added etc., but they remain partial and sketchy in places. For a more thoughtful and reflective take on the presentation, see Hazel Christie's post here. Amy's own post on her visit can be found here.

The good, the bad, and the unbundled from Amy Collier

The presentation is looking at this emerging phenomenon in the US Higher Education sector and the possible lessons for UK Higher Education.

Amy has been at Stanford for two and a half years working on MOOCs and supporting the increasing interest in online learning at Stanford, an institution with a weak tradition of online learning. Her role initially focused on the operational aspects of course design. She has now developed a more strategic role, asking what they are doing, who is being targeted, and why they are adopting online learning.

Unbundling is an increasingly prominent topic in US higher education. It should also be noted that unbundling has a long history in UK HE, in particular through the Open University.

The unbundling idea has taken hold in the US as part of a wider discourse of 'disruption'. The US has a weird love affair with the term 'disruption', based on a 'dis-ease' with how things are currently done: higher education is 'broken' and should be disrupted, and that disruption is often undertaken through unbundling. Yet that discourse of dis-ease with a broken education system is often promoted by others as a means to sell 'solutions'.

Unbundling is the separation of ownership of infrastructure and processes of service provision to gain efficiencies and savings. So unbundling involves the compartmentalisation of components of HE that are then outsourced to other providers, rather than the traditional model of provision by a single institution.

As an example, the music industry traditionally produced a bundled product such as the album, but iTunes disrupted this product by allowing the purchase of single songs, users creating their own playlists, and so on. Apple and iTunes allowed us as customers to do things with the purchased products independently of music businesses. This development led on to Pandora and Spotify, and took place within a discourse of 'freedom' and 'access to artists', and hence the democratisation of the music industry. Similarly, we're seeing an emerging discourse on the democratisation of higher education in the US.

So what is the problem? What is lost when we unbundle? In the case of the music industry, we can see a counter-trend with the return of the cassette as a 'product': a piece of art that cannot be unbundled (popular in Portland – who knew?), a single, indivisible and cohesive whole. Similar examples of rebundling can be seen in free music bundled with the purchase of phone X, or in playlists created by Pandora. So unbundling and then rebundling leads to a loss of control and, more importantly, a loss of a sense of the whole – replaced by another interpretation of that whole, the art of the album. Also, while obscure artists can be found online, they don't have the sales volumes to make money through these unbundled services.

How does this apply to HE? Returning to the notion of HE as broken, there is "disaster porn" such as the IPPR report, An Avalanche is Coming. The report cites the diversity of pressures on HE in terms of purpose, funding and public policy in the context of a globalised economy in which HE is no longer fit for purpose. HE should, therefore, look to technological solutions, and these are to be found in the private sector.

A particular recent emphasis is on questioning the value of university: is it worth going? The degree is dead; reimagine higher education. Jose Ferreira (of Knewton) claims bundling works to trick people into believing a service is worth more than it is, hiding the real cost-benefit.

Unbundling in HE may involve splitting: content; social networks; accreditation; delivery; testing; and research (see Henry Brady, UC Berkeley). But what are the tensions then between economic efficiencies and the holistic integrity of education?

MOOCs have inflated this discussion of disruption and unbundling. Clay Shirky argues that HE is being, and should be disrupted and, returning to the music industry analogy, the “MP3 is our MOOC“.

And we can see examples of MOOCs unbundling accreditation from HE now. The American Council on Education is offering credentialisation of MOOCs through member HEIs, so separating/unbundling the delivery and accreditation of courses. Antioch College told its own students that they could receive credit for MOOCs, thereby unbundling content, credit and, in this case, the tutoring and support of learning.

But unbundling has been going on in HE at least since the 19th century, for example in the unbundling of academics from pastoral roles.

The problems of unbundling:

While many authors of the disruption discourse make this comparison to the music industry, as George Siemens states, education is a social and cultural as well as a content 'industry'. From that perspective, a number of problems with, or questions about, unbundling can be identified:

1. The who, how and what of rebundling. Who does the rebundling, and what power do they take through it? Things that get unbundled tend to be rebundled with a change of ownership and control – what does this mean, for example, for the student experience?

The Minerva Project provides access to higher education at reduced cost by focusing on (transferable) skills rather than content/domain knowledge, relying on MOOCs for the domain knowledge of introductory courses. So Minerva rebundles MOOCs provided by others while focusing on project-based and experiential learning.

A dark side of this is that there will still be very bundled education institutions, and there is a danger that these highly bundled experiences become the expensive premium service for an elite minority. So the unbundling and rebundling 'disruptions' may widen divisions in access to high-quality education.

So, while for some students the unbundled experience may be what they want and need, a key question remains: if unbundling is about widening access to HE, then for whom, and to what form of HE?

Also, both bundled and unbundled experiences collect data. HEIs are generally trusted to handle data with care and respect, but what happens when services are unbundled and rebundled, with the concomitant opportunities for the commercial exploitation of student data? For example, the backlash over the recent Facebook experiment was directed not just at Facebook but also at Cornell University for its role in analysing the data.

2. Impacts on teaching and other staff.
We can see the unbundling of the academic's role – e.g. supporting the development of student social networks, advising, admissions, instructional design, teaching, research etc. – especially to para-academics, but this is problematic if you view the academic's role as holistic.
In the case of MOOCs, courses are being designed by people who may not deliver or teach them. But this approach can also be seen in the development of Online & Distance Learning (ODL) programmes from the 1990s, which considered how learning technologists interacted with academic staff. Different models of ODL can be seen:

(i) the craft model, where faculty did it all; (ii) the collegial model, where academics helped each other; and (iii) a virtual assembly line that produced a course for academics to deliver. In the craft model, academics identified themselves as autonomous experts, whereas this identity was lost in the assembly-line model. So unbundling also affects academic self-identification.

But why is an integrated faculty role of value? Because it engages academics in their work and highlights the integrative role of research and teaching. On the other hand, unbundling does allow faculty to focus on individual areas of strength – why force a shy researcher into teaching?

There are other models such as Patricia Ianuzzi’s (University of Nevada) team-based model involving the co-production between academics and para-academics of student experiences.

3. The lost art of the university: what happens when unbundling leads to a loss of the serendipity and synergies of the bundled student experience?

On a positive note, unbundling may provide opportunities for the redesign of HE and to challenge assumptions of the institutions.

Examples of how redesign, rather than unbundling, has changed HE:
1. Domain of One's Own at the University of Mary Washington, a push-back against VLEs and MLEs. Each student was provided with a domain where they could use any tools they wanted for their learning. This initiative allows students to experiment with online learning both personally and in groups. Another initiative is Thought Vectors at Virginia Commonwealth University, enabling student learning on open websites.

2. The Stanford 2025 project involved both students and staff in considering the redesign of Stanford for 2025 – for example, moving away from semesters and academic years to a much more flexible programme structure built around micro-learning opportunities, as 'Paced Education'. In effect this unbundles the curriculum, and it is being implemented through The Impact Lab. This social innovation work is focused on the food system and involves students researching (immersion), prototyping and piloting interventions in the food system.

The key point of this talk is to examine the issues and opportunities in the unbundling of higher education.

Q: Can you separate the neo-liberal drivers of the rise of idea of unbundling and the more positive opportunities of redesign? How suspicious should we be of unbundling in HE?
A: I'm very suspicious, mainly because I work in Silicon Valley and see unbundling projected as a way for start-ups to access investment and for government to 'solve' higher education through the private sector.

Q: Can you comment on adjunct faculty in the US, as it appears to be linked?
A: Unbundling the faculty role leads to the deskilling of faculty, so we see the rise of adjunct faculty with very specialist skills alongside precarious employment positions. See the alt-ac (alternative academic) movement in the US.

Q: A comment on the music industry: senior managers saw that the internet would change their business but didn't know how to change. Also, the UK has the OU's experience of the team development of courses. Finally, HE is very diverse, but that diversity is hidden from many of us. Some HEIs rebundle through, e.g., accreditation of prior learning (the US military was cited as an example).
A: RPEL is really important. A key danger of unbundling is that it imposes a monolithic view of HE and a sense of diversity is lost.

Q: I'm interested in your views on a model from Cornell University of faculty housing – free housing if you live with the students – as a rebundling of student services?
A: Stanford has a strong ethos of living on campus and the creation of a learning community.

Q: Who is the customer and what is the product? Are students viewed as a product and society the customer?
A: The student as customer is a strong aspect of the unbundling discourse. People's ideas of education as a public good and as the promotion of citizenship have changed – now less of a priority given the end of the Cold War.

Q: I'm worried that there may be an oversimplification into good or bad unbundling; is there a need for a bigger discussion of what the university is for?
A: I'm not opposed to unbundling per se, but more discussion is needed – beyond the binary of good and bad – that allows the assumptions of educational institutions to be challenged.

Learning techniques – for education and life

Posted on | August 15, 2014 | No Comments

An interesting and useful read from Harold Jarche on learning techniques, framed in terms of PKM and sense-making. As with many areas of knowledge and learning, the post (and the research article it cites – summarised here) highlights the tendency towards shallow learning techniques and the avoidance of the more valuable, but harder, techniques of sense-making and critical thinking. The two key techniques here, elaborative interrogation and self-explanation, seem to me to be crucial steps in situated knowing and in being able to think through the nitty-gritty pragmatic aspects of applying knowledge/information in actual problem-solving situations. It is these approaches that should provide the situational links between education and professional practice.

Personal learning environments

Posted on | August 8, 2014 | No Comments

I'm currently writing up some ideas on open online professional learning, including a consideration of personal learning networks. I came across this interesting post from Martin Weller on the apparent decline in interest in, or discussion of, personal learning networks. The reasons suggested include the mainstreaming of the practices associated with PLEs and a consolidation of the tools used into a fairly generic set of software, but also that the (research) agenda has shifted from personal learning to institutionally provided personalised learning, partly driven by learning analytics.




