Report from the Jisc/CNI workshop

Transforming opportunities in scholarly discourse

Workshop report: Jisc/CNI, Birmingham, UK – 5/6 July 2012

Foreword

Traditional methods of scholarly communication are changing – and they are changing swiftly and dramatically. Traditional measures of impact are no longer capturing the whole picture. Innovative researchers using new forms of scholarly communications risk not receiving recognition and reward for their work. The way we analyse scholarly activity and research is inevitably going to have to change to respond to the technological and cultural revolution under way. Institutions that fail now to respond and adapt to the transformations taking place in scholarly communications risk their research reputation in the future.

Developments in each of these individual areas have been intensively studied and analysed through conferences, workshops and reports. But the reality is that all of these changes are interconnected. Particularly for high-level policy-makers and institutional leaders, what is most needed, and most lacking, is a synthesis that can account for these interconnections and offer overall guidance.

Further, these are international – indeed, global – developments; the various aspects of the move to openness, for example, are playing out in numerous ways across the world. But institutional impacts and responses are also profoundly shaped by national policy and funding structures.

Earlier this year, Jisc and the Coalition for Networked Information (CNI) gathered thought leaders from the United States and Europe at a horizon-scanning event to explore how institutions need to respond to the changes in scholarly communications, and to discuss how institutions can build reputation and recognise research quality in this new environment.

Over an intense two days, experts from both sides of the Atlantic debated and discussed the evolution of scholarly research, behaviour and outputs. They envisioned the future – one in which scholarly discourse recognises the opportunities that digital technologies can bring, in which research outputs evolve into complex but seamless sources of methods, data and narrative, and in which innovation comes to the fore through barrier-free access to research for small and medium-sized companies.

This report clearly distils these key issues from the diverse discussions at the workshop and summarises them in a way that reflects the workshop’s tenor and title – transforming opportunities.

Jisc and CNI intend to build on this work and to continue to provide analysis and guidance to institutions facing the challenges and opportunities of the changing environment of scholarly communication.

This report concludes with a list of the most significant and promising options for next steps, from policy development on open access to encouragement and reward for new forms of participation and collaboration.

As we recognised going into the symposium, we are dealing with global developments that play out in unique ways within different national settings. Jisc, with its enormously deep knowledge of the UK higher education world, can help institutions stay ahead of these changes to both protect and boost their international research reputations. It has a wide range of tools and advice to help UK universities identify opportunities to gain future advantage – through research data management, open access, open data and other strategies. CNI is primarily focused on the very different landscape of higher education within the United States and Canada, and has a long history of engagement with many of the key issues discussed at the workshop, such as data-intensive scholarship and its implications for both institutions and scholarly communication. Jisc and CNI, working together through our commitments to international collaboration, will continue to help our members with the insights that can be obtained through the kind of multi-national perspectives on global developments that characterised this symposium.

Clifford Lynch, Executive Director, CNI


Introduction and context

Scholarly outputs should be discoverable, accessible, understandable, translatable and applicable. Digital technologies and the web are both enabling and transforming factors in the way scholarly research is conceived, carried out and recorded.

We are seeing great changes in these activities. The natural sciences are seeing the evolution of collaborative, open, participatory behaviours where sharing can become the norm; the humanities have a growing wealth of new technological approaches to exploit the digital research environment. Teaching and learning, too, are undergoing a transformation through digital technologies as the ability to deliver instruction and instructional materials in new ways, and remotely, changes the way teachers work and students learn. Along with those changes come different expectations on the part of learners.

New research practices have brought new forms of output – digital, varied and often complex. There are marked disciplinary differences in what is produced from research activities, how those outputs can be used, how – and whether – research is applied, the shape that communication activities take, and the routes to acknowledgement, reward and impact.

Yet scholarly communication is lagging behind these developments in both technological and behavioural terms. The traditional system of scholarly communication is deeply embedded. Legacies from the print-on-paper age still dictate practices, attitudes and values within higher education, and these persist despite the web offering countless opportunities for new approaches to scholarly discourse, ones that can take full advantage of digital technologies and align with new ways of doing research.

Systems for assessment, reward and recognition have not changed in substance over the last 15 years of ubiquitous web use. Institutional and funder expectations are rooted in the established system and are acting as constraints on change. Scholars themselves are equally constraining: researchers who in their daily lives take full advantage of web commerce, new selling models and community systems of assessing and rating products and services are proving remarkably resistant to developing and adopting these things on a wide scale in scholarly life.

One central factor, though not the only one, is that the whole issue of scholarly discourse generally receives only cursory attention from institutional management. Given the strategic centrality of the issue, this is a disappointing oversight. There is, however, some excuse. The research life cycle remains centred around publications in the traditional sense. There is pressure for increasing openness, but there is a perceived tension between this and the issue of maximising impact, both of which commonly appear on funders’ agendas. How to give away intellectual property (IP) and at the same time exploit it? Tensions exist within higher education that can paralyse progress; those who could force change have personally benefited from existing ways of doing things and are likely to see little reason for change; the vision for a better future is not in the main well articulated and benefits are not clearly evidenced.

The workshop discussed developments in these contexts. The session topics were varied and diverse, yet on many occasions the same core issues surfaced within different topics. This report does not record the workshop chronologically but instead attempts to distil these key issues from the whole and to report them in a way that reflects the workshop’s discussions.


The workshop discussions


The vision

Whatever the nature of the system or systems of scholarly discourse that replace the traditional one, change and continuing evolution are likely to become the norm rather than one stable system simply being replaced by another. Attitudes will need to change within the researcher community itself and, with resistance to change proving so entrenched, significant shifts are likely to occur over the medium term rather than the short term.

The vision for future scholarly discourse recognises the opportunities that digital technologies can bring. It also anticipates that the ‘scientific conversation’ that characterised the early days of scientific endeavour is the ultimate goal – even though in those times the number of individuals involved was relatively tiny, the nature of investigation and experimentation relatively basic, and the information to be communicated relatively restricted and simple. Digital technologies can enable scientific ‘conversations’ to happen in real time or whatever time-based variant is most convenient for the participants, whether in speech or by the exchange of words or data through one-to-one, one-to-many or many-to-many channels.

It is understood that the nature of the research outputs that form the basis of this conversation will continue to evolve. The range of outputs already embraces not only the traditional narrative text in the form of journal articles but also software, video or audio recordings of music, performances or happenings in the natural world, websites and so forth. Numerical or textual datasets, in both the sciences and humanities, are generated using machines that create digital files, sometimes vast ones. Increasingly frequently, these are combined into complex objects. Such outputs are becoming commonplace.

Often these files benefit from annotation and manipulation, activities that are carried out by the creator and, commonly, by other researchers in the field in a community effort to produce an increasingly meaningful dataset. Computer tools designed to locate facts and relationships in disparate sources can create new knowledge from them. Some of this is in its infancy – text-mining is a developing area with great potential for a range of fields and disciplines.
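
To make the text-mining point concrete, the fragment below is a minimal, hypothetical sketch of one elementary step – counting which terms co-occur in the same abstract – of the kind of machine processing an open, computable corpus would allow. The abstracts and the term list are invented for illustration and do not come from the workshop.

    from collections import Counter
    from itertools import combinations

    # Hypothetical abstracts and terms, for illustration only.
    abstracts = [
        "Open access to genomics data supports collaborative analysis.",
        "Collaborative analysis of climate data benefits from open access.",
    ]
    terms = ["open access", "genomics", "collaborative analysis", "climate"]

    # Count how often each pair of terms appears in the same abstract.
    co_occurrence = Counter()
    for text in abstracts:
        lowered = text.lower()
        present = sorted(t for t in terms if t in lowered)
        for pair in combinations(present, 2):
            co_occurrence[pair] += 1

    for pair, count in co_occurrence.most_common():
        print(pair, count)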

Further into the future, workshop participants envisage a scenario that includes research outputs as a seamless source of methods, data and narrative, fully enabling reproduction of an experiment or investigation (so-called liquid publication). Alongside this, personal mobile applications will enable researchers to discover and locate content of interest, institutional repositories could be used for interactive discussion and social media feeds, and computation will be done on the move using cloud facilities. There is considerable doubt as to whether the academic journal, in its usual form, will be part of the scenario in the longer term. The package of information that a journal represents is one of the legacies of the print-on-paper age, necessary to fulfil the demands of cost-effectiveness in physically distributing research results to customers across the world. Journals still play an important role at the moment, primarily as guides to research quality, but the importance of this may be diminishing as methods to assess quality at article or author level are developed.

Finally, included in the vision is the innovation imperative. This is influencing developments quite strongly, especially in the UK where the Department for Business, Innovation and Skills (whose areas of responsibility encompass higher education) moved on the open access agenda markedly during 2012. An open system will provide barrier-free access for small and medium-sized companies (SMEs in the UK, SMBs in the US) for innovation using basic research results. With this, it is argued, will come wealth creation and jobs. Certainly evidence shows that currently the SME/SMB sector is hampered by lack of access, or access at too high a price, to research findings and innovation activity suffers as a result, costing businesses and economies considerable sums.

Openness

At an institutional level openness is a true cross-cutting agenda, applicable over much of what a higher education institution (HEI) does. It is a philosophy that needs to be adopted from top to bottom in higher education to really effect wholesale change.

All these kinds of activity are predicated upon an open research outputs corpus, a corpus that includes both narrative and data outputs along with the tools with which to interrogate and manipulate these things. In other words, a research results base that allows barrier-free access and use for humans and machines. Of course, there will be special cases and exceptions, especially for data where confidentiality, privacy or security are involved, but these will be a minority of instances where special arrangements can be put in place to protect vested interests.

The current state of affairs is that around 20% of global research literature is openly available, though very little of that is computable, as open access papers exist mainly in PDF form. Research data is shared openly in specific cases (notably certain areas of biomedicine, astronomy, and a few other fields) but not routinely in most disciplines.

The growing number of mandatory policies from funders and institutions on the accessibility of both publications and research data is helping to raise levels of open content, but only very slowly. This is partly because the number of policies is growing only slowly, and mostly because policy-makers are not monitoring and policing compliance. In many cases, policies are not implemented well, so the researchers targeted by them remain uninformed about the reasons for the policy and the benefits that can accrue. Coupled with an entrenched reluctance to share (entrenched because the traditional system has worked to encourage the opposite of sharing), especially with respect to data, unsupported policies are little more effective than voluntary ones.

There are signs of an increasing recognition amongst policy-makers that compliance is a problematic issue and that they need to put in place systems to police this, probably alongside sanctions for noncompliance.

Nonetheless, there are some promising advances in the policy arena. On open access to publications, it was noted that we are now seeing mandates arising from faculty members themselves rather than being introduced by management, Harvard and MIT being the most conspicuous examples of this type of policy-making. On open data, the Royal Society’s report Science as an Open Enterprise is the most recent of several reports in this area, indicating the growing interest in the issues at public policy level. One of the areas of action highlighted in that report was also raised as an issue during the workshop reported here – that common standards must be developed for sharing information in order to make it more widely usable.


Current practice and emerging trends


There is a large gap between the vision articulated at the workshop and current practice. That said, there are encouraging signs that the benefits of change are increasingly being recognised by key players and some progress is being seen on many fronts. A number of topics were raised for particular discussion.

Research data

Recent interest in open data from research funders and the growing number of open access mandates, increasingly emanating from faculties themselves, testify to policy-level interest. There are hiccups, most commonly due to arguments along the lines of, ‘if we open this data, it might help people in other countries/institutions’. Where these have been overcome, funders now commonly require data management plans as part of research grant proposals. In some fields, at least, research teams include a new position of data manager or data scientist, whose role is to develop such plans, manage data-related issues during the research process, advise on data handling and processing, on data protection and confidentiality, and on IP issues more generally, write software for data manipulation, and ensure the proper storage and curation of data.

It is hoped that a clearer policy on open data will help to reduce the marked reluctance to share on the part of what is probably the majority of researchers. The main reasons for this reluctance are the entrenched habit of revealing only what is necessary to achieve a publication, fears that data may be ‘stolen’ or exploited by others more successfully than by the creators themselves, the wish to keep data closed until there is no chance of the creator wanting to use or mine it again and, in the case of young researchers in the humanities, the fear that they will not get their first book published if they make their doctoral thesis open.

One concern was the current attitude, in the natural sciences mainly, towards the issue of secondary data. Only the creation of primary data is viewed as a legitimate activity and rewarded, while analysis of secondary (other people’s) data is generally looked down upon as inferior science, despite the many benefits.

Data storage and curation are topics that raise concerns. Should all research data be stored? One school of thought holds that it should, contending that storage is cheap and getting cheaper, so it is an investment well made. The other view is that it is not feasible to store all data and that some form of selection process must be undertaken. Whichever of these prevails, the cost and logistics of data storage will be major issues to address in the coming years.

The assessment of datasets as a form of research output was also discussed. There is currently little peer review undertaken on datasets, primarily because the size or complexity of the data makes review impractical, or because specialised tools are needed to access and analyse them. It is recognised, however, that if researchers are to produce and share data as part of their outputs, and are to be required to do this, then some incentive needs to be provided. Publication of journal articles is the main academic currency at present, and the traditional two-pronged system of peer review along with the journal hierarchy that has emerged over decades is the means for assessing quality, at least until better metrics are embedded in the system. The challenge of assessing the quality and value of datasets, and rewarding their creators appropriately, stalks the future.

Collaboration and competition

Despite the current system being characterised more by competition than collaboration, the emergence of participatory and collaborative approaches to research – approaches that involve new players (volunteer-based computation, citizen science, digital humanities) – and the development of new research tools take us part way to a new world of scholarly discourse at the level of practice.

Collaborative research is becoming much more the norm than hitherto. Research teams can be composed of members in more than one (sometimes many) institution, motivated by the fact that data-driven scholarship (commonly called ‘e-science’ in the natural sciences and ‘digital scholarship’ in the humanities) can only be carried out on a collaborative basis due to the need for many skills – and often much equipment.

There remain tensions between collaboration and competition that can have pernicious effects on the progress of openness as well as on research progress more generally. There is often a misalignment between local and national incentives – for example, for managing research data. Competition between institutions continues to be a strong influence on institutional behaviour and values, exacerbated by the growing number of university rankings (each measuring something different from the others) and exercises to grade research and teaching performance. With respect to the latter, it was mentioned that we may look forward to battles between institutions over educators as well as over researchers. Reconciling the imperatives for collaboration with this ongoing competitive culture is very difficult.

Participation

Away from the institutional level, however, collaboration is a growing norm, and new players are being involved, including governments, private industry/commerce and, increasingly, citizens.

Citizens are engaging with research as both creators of research information and as users of information that is becoming available to them through greater accessibility. Funders and governments are keen to see better connection between the academic research community and the public in general, and new programmes to secure citizen engagement are being promoted. And partnerships or less formal interactions are improving the connection between HEIs and the library and museum sectors.

Two major issues emerged in the discussion. First, for those people who are seeking information for their own use, there is the double-sided problem of discovery and data presentation. For research articles, discovery is not always easy, despite Google’s ability to locate repository content. Part of the issue here is the art of searching itself, since lay people usually do not use the kinds of search terms that retrieve articles relevant to them: SME personnel, for example, tend to use more common parlance for topics of interest rather than the scientific terms that would result in a successful web search. For research data the issue is even more complex. For example, the volume of genomics data in the public domain doubles every five months: anyone without the specialised tools developed and owned by researchers will find such a corpus inaccessible, even though it is freely available. Related to this is the issue of translating research data into a form that can be used by non-specialists.

Second, there is the issue of recognising and rewarding the contribution of citizen scientists (the term is used here to embrace all people outside academic institutions who participate in research activities): the contribution can be extensive and even critical, but other than naming such individuals as authors on an academic article, there is no recognition or reward structure for this type of participation.
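
Returning to the discovery problem raised above, the sketch below illustrates one possible approach – expanding a lay query with the scientific terms that indexed articles actually use. The synonym table is a hypothetical illustration, not an existing thesaurus or service.

    # Hypothetical mapping from everyday phrasing to scientific vocabulary.
    lay_to_scientific = {
        "heart attack": ["myocardial infarction"],
        "high blood pressure": ["hypertension"],
        "gene editing": ["genome engineering"],
    }

    def expand_query(query):
        """Return the original query plus any scientific synonyms it triggers."""
        expanded = [query]
        for lay_term, scientific_terms in lay_to_scientific.items():
            if lay_term in query.lower():
                expanded.extend(scientific_terms)
        return expanded

    # A search service could run the expanded set of queries on the user's behalf.
    print(expand_query("recent research on heart attack recovery"))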

Technology and the cyberinfrastructure

The term cyberinfrastructure can be used to cover a range of services from the provision for e-science on the largest scale through local networks to institutional services. Funding can be at institutional, regional or national level, and occasionally international (e.g. European Union research infrastructures).

The discussion around this focused mainly on the funding issue, notably who should fund what, whether the investment is sensibly made and how future investments should be planned and shared. One major consideration is the infrastructure for research data sharing which, it was suggested, should optimally be established and shared on a global scale. There are some successful examples of international cooperation of this kind, the European ELIXIR infrastructure for biological data being one.

At a quite different level, institutional infrastructure services were discussed in the contexts of the need to improve interoperability between institutions and the optimal approach to dealing with IT developments, which can happen on a different timescale from institutional decision-making cycles. Some detailed issues were discussed, such as whether there is a need for an agreed metadata set for institutional repositories or, conversely, whether natural language processing can do the discovery job effectively. Institutional versus individual provision of hardware and software tools is another strategic issue with which to grapple. Should institutions provide the information space and the tools to interact with it, a one-size-fits-all approach, or should ‘bring your own device’ solutions be encouraged? What effect will the increasing use of the cloud have on these things? Is there a danger of an institution getting too firm a grip on what it thinks is happening, only to find that its researchers are innovating their own solutions where they prefer to do this? And can institutions usefully distinguish between commodity services common across many institutions and those differentiated institutional offerings serving individual institutions at the strategic level?
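
To illustrate the metadata question raised above, the sketch below shows what a minimal agreed record for a repository item might contain, loosely following Dublin Core element names. The field values are hypothetical, and no particular standard was endorsed at the workshop.

    # A hypothetical minimal repository record, loosely based on Dublin Core fields.
    record = {
        "title": "River temperature measurements, 2010-2012",
        "creator": ["A. Researcher", "B. Collaborator"],
        "date": "2012-07-05",
        "type": "Dataset",
        "format": "text/csv",
        "identifier": "doi:10.xxxx/example",  # placeholder identifier
        "rights": "CC BY",
        "description": "Hourly temperature readings from three monitoring stations.",
    }

    # Even a small shared field set like this would let repositories exchange
    # and aggregate records without bespoke crosswalks.
    for field, value in record.items():
        print(field + ":", value)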

Peer review

The subject of peer review – or, rather, the system by which it happens – was raised a number of times during the course of the workshop. There was a considerable level of agreement that it has to change, if only to streamline the dissemination process, but the question of how remains firmly unanswered. It was noted that peer review is, and will persist in being, the gold standard, with the exceptions of economics, where working papers are the main publication currency, and high-energy physics, where preprints serve the same role.

Some form of peer review will always be needed as a sifting mechanism to save researchers time in dealing with information overload, but the current system is under strain and appears unsustainable in its present form. Open peer review is not proving very popular with researchers in general, though in a few fields it has gained some traction.

In a more open world there is a clear role here for learned societies, though current thinking on their part appears to be on the whole conservative with a general preference for the status quo. It will take considerable time and effort to change the ingrained views about their role in the research dissemination system.

Impact, recognition, reputation

Research assessment dominates research activities and programmes in the UK. In the US there is no similar national research assessment exercise as such, though assessment systems are still in place. In the UK, the Research Excellence Framework (REF), as the national assessment exercise is called, dictates funding, helps perpetuate the publish-or-perish imperative and reinforces the quest to publish in ‘high impact’ journals. In some ways, then, it has pernicious effects on the progress of the open agenda.

League tables and ranking systems are very influential, too, at institutional level and in this case this applies across the world. Global rankings of universities for the student experience, for research prowess, for web presence and so on are considered important and universities do perform to the test in many cases, especially in respect of the more prestigious rankings.

Impact metrics

The measurement of impact is changing. Traditionally, only academic impact has been seriously taken into account, but there is much emphasis now on other kinds of impact. These include impact on innovation, on social developments and on the knowledge society more generally, in terms of better-informed citizens pursuing a closer interest in research endeavours. The relationship between academia and the wider society is changing and should become even closer, a view now being pushed by funders.

The problem is how to assess impact of these kinds and reward the information creators. Academic impact can still be measured by citations and, as they develop, by new metrics of usage and influence. There will be much more work in this area to develop metrics that measure more immediate factors than citations, or which provide a richer picture of citation profiles. The development of these ‘alt-metrics’ is work-in-progress. This issue of alternative, new ways of measuring impact is important because impact assessment is currently acting as a brake on development in scholarly discourse: reliance on the Journal Impact Factor as the main metric hampers the acceptance of new open access journals or other dissemination channels. The other kinds of impact are much harder to assess and this is exacerbated by the fact that their manifestation can be in the long term.
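
As a simple illustration of the kind of article-level measurement being explored, the sketch below combines several usage indicators into one composite score. The indicators, values and weights are purely illustrative assumptions; no standard alt-metrics formula is implied.

    # Hypothetical article-level indicators and weights, for illustration only.
    indicators = {"downloads": 1250, "bookmarks": 40, "blog_mentions": 3, "citations": 7}
    weights = {"downloads": 0.001, "bookmarks": 0.1, "blog_mentions": 1.0, "citations": 2.0}

    # A weighted sum gives one possible composite score for the article.
    composite = sum(weights[name] * value for name, value in indicators.items())
    print("Composite article-level score: %.2f" % composite)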

Notable here is the influence of funders. We are increasingly seeing funders acknowledge that they wish scholarly research to support the innovation, education and society benefit agendas. Some require an explicit statement in grant proposals of the intended or likely wider impact of the research proposed. This perspective is likely to continue.

Roles and rewards

New perspectives on the issue of rewarding research performance were also discussed. There are unresolved problems in how best to reward desirable research practices.

First, how can the individuals in large (or even huge) research teams receive recognition for their part in the research effort? Teams may also form, disband and coalesce over time. Sharing out recognition in an equitable way will not be easy.

Second, what can we do to reward researchers for creating new types of research output – things that are valid research products, like videos or software, or numerical datasets? These things are not cited in the traditional way journal articles and books are cited, yet they are valid contributions in the digital research age.

Third, how can we reward researchers for successfully engaging with the wider society, and what can be measured? This dilemma highlights the difference between research outputs that can have an immediate effect and those which take longer (years or decades in some cases) before their true impact can be assessed.

Fourth, how can we reward sharing? Researchers who currently share data say that they do so largely ‘to be seen as a good guy in the community’, yet in many research communities such goodwill carries little formal weight. We need more explicit rewards for sharing, whether of articles or data.

And fifth, there are new roles evolving in the research community that are also outside the traditional citation-based recognition system and, more often than not, outside the traditional academic reward system (tenure, career progression, reputational capital). These new roles – data scientist, specialised technician, data analyst, informatician, librarian and so forth – have appeared as the need for people with the skills to cope with e-research has grown, but for the most part they remain marginalised from the normal reward system of academia. Sometimes the names of technicians or data scientists find their way onto a paper as part of the team of authors, but this is an academic reward that may very well have no effect on that individual’s career prospects.

The same basic problem applies to other roles within institutions, notably in IT where a new role, that of strategic developer, was discussed. This shares with those other roles above the lack of a career path and the danger that the skills developed in that role, skills that transfer readily to the commercial sector for better reward, are invested in by the institution and then lost. New thinking is needed on how to better integrate all such roles into the system – for they are critical to modern research.

Finally, there was some discussion of the accountability that goes alongside recognition of these new or evolving roles. What level of responsibility should these roles have?

Government expectations

The influence of research assessment exercises has already been mentioned. Governments are keen to encourage research excellence, perhaps at the expense of the ‘utility’ research that forms the bulk of global research effort.

The question, ‘Should we always expect research activity to keep increasing?’ was voiced. Demographics do not necessarily signal this and budgetary constraints in these economic times may change the pattern of decades of ever-increasing research budgets.

Government expectations with respect to exploitation of IP by universities clash with new expectations of sharing. There are tensions here that will be difficult to moderate, though perhaps reality will intervene – only 3% of patents ever generate any income and the proportion of income for universities from IP is at best 4% of the total. Exploitation of IP by academic institutions is not going to solve the problem of tight research budgets.

Other government-favoured targets may be easier to hit: spin-off companies or the nurturing of a halo of innovative businesses; students who achieve appropriate employment within a specific period of time after graduating; significant and economically beneficial influence on the local community; all these are types of outcome that governments increasingly indicate they wish universities to achieve.

Students and learning

Alongside research there is the teaching and learning agenda. The roles of students may be changing in a number of ways, and where research is concerned it was acknowledged that there must be an effort to find ways to better involve them in genuine research and give them a better understanding of the process. One suggestion was to use the cloud to put virtual research environments (VREs) in front of students to enable them to participate more fully.

Motivation to get such things under way is crucial, and to achieve this the same sort of requirements as for data or impact may be useful – that is, requiring teachers to demonstrate outreach or educational outcomes from their projects.

Moreover, student outcomes, too, need to be measured in better ways. It was noted that the next generation can change things for the better, permanently, if taught in the right ways, and the open agenda should be fundamental to this. Unfortunately, teaching is still seen as second to research and many people in HEIs are not involved in it at all but are employed on research-only contracts. The view has always been that teaching is done most effectively in institutions where research is carried out at an advanced level. Now this is being challenged, especially where undergraduate teaching is concerned. The ability to acquire and deliver teaching remotely is changing the whole set of expectations about this part of the work of HE institutions.

Public expectations

As well as government influence, there are the expectations the general public has of higher education. Perhaps the primary one is that higher education will fit society’s young people to obtain a job, and fit people with the skills to create jobs. There are expectations around this issue of the kind of jobs and what overall contribution these will make to society: in other words, there is a payoff calculation being made, albeit a loose one. The public expects to be able to assess, somehow, the worth of an institution and its programmes to that public.

Institutional responsibilities

The biggest problem identified during the workshop was the lack of institutional engagement with the issues around scholarly discourse. Management needs to be made more aware of the strategic opportunities offered by the evolution of scholarly communication. Some of these opportunities fit squarely into the current strategic agenda, such as the chance to measure impact in new ways and to give management the tools to take a more critical look at how this has been done in the past; institutions need to improve the way they collect data about themselves and the techniques they use to compare themselves with others. Other opportunities require new ways of thinking strategically. The concept of competitive sharing, where competitive advantage can be derived from openness, has still to be properly considered and adopted by institutional managers, yet it has the potential to provide huge benefits to institutions.

There is a danger of the individual’s agenda being in conflict with that of the institution. It was ever thus, of course, but now that individual researchers or their teams can take significant and very effective steps to promote themselves, this may be at the expense of the institution, with potential reputational capital not being gathered by the institution.

This points to a bigger issue of institutional strategy. Scholarly communication issues need to be properly coupled with the research strategies of the institution: if they are not, there is a risk of ‘something else’ emerging – a system in which institutions are side-lined as the individual or team becomes the unit of reputation. A converse side to this arose in the discussion around open data: the risk of making mistakes in opening up data – unintentionally contravening privacy or confidentiality rules – through which the institution, having provided the enabling system for sharing, could lose considerable reputational capital, its greatest asset. Where kudos can accrue, it will go to the researcher, but where condemnation is due, the institution is the loser. This whole issue of reputational capital and where it will reside in future is of great strategic importance.

How money is spent on scholarly communication and its related costs needs to be examined more closely in the context of the institution’s overall strategy and in the spirit of helping to reshape the future. There is concern that management can often take only a superficial interest in this, allowing parties keen to preserve the status quo to impose their rules without questioning contracts or associated agreements. There is also a fear that this reluctance of managers to engage fully with the issues could mean that any new system that emerges is little better than the current one. Specifically, there was discussion of the issue of paying for ‘Gold’ open access and the fairness of a scenario in which a hierarchy of dissemination outlets is created where only the wealthiest can afford to pay to publish in the ‘cream’ layer of journals.

What is needed in each institution is a ‘champion with a wallet’ who can steer progress along a desirable path. There is rarely a well informed champion, however, and the topics of scholarly discourse and the strategic use of IT appear only infrequently on the agenda at strategic level. As they are not considered a strategic opportunity, policies do not emerge. And without policies, actions are forced to be bottom-up and suffer due to lack of institutional support.

There is another influence on institutional policy, however, and this is extramural. Research funders can dictate policy to institutions and can very strongly influence effective implementation. We are starting to see this happen now on both sides of the Atlantic and elsewhere in the world, with funders requiring institutions to partner with them in progressing the open agenda.

Barriers and opportunities

The main barriers and opportunities identified during the workshop are summarised below.

Table 1: barriers and opportunities in scholarly discourse

Policy-based

Barriers:
  • Lack of engagement at institutional management level
  • Slow adoption of policies on openness, especially by institutions
  • Bad policy wording or implementation
  • Poor compliance-monitoring and policing
  • Government and funder expectations can conflict with the open agenda and constrain its adoption
  • Government and funder expectations around exploitation

Opportunities:
  • Government and funder expectations of sharing can drive openness
  • Governments’ innovation agenda is gaining strength and can drive openness
  • Governments and funders are recognising the importance of data as recognisable, rewardable, reusable outputs

Technological

Barriers:
  • Interoperability issues not fully resolved
  • Discovery mechanisms do not suit the needs of non-academic users
  • Investments in cyberinfrastructure are too often piecemeal and locally made rather than on a collaborative, wide-scale basis
  • Data management, curation and storage issues are not yet adequately addressed in terms of finding workable, collaborative ways forward

Opportunities:
  • An open research corpus will allow computation
  • Liquid publications will enrich the scholarly ‘literature’
  • The cyberinfrastructure is ripe for clever, collaborative development that will bring huge benefits and payoffs
  • New metrics will allow better assessment of scholarly endeavours
  • Technological solutions can be found to join up the assessment of individuals, teams, departments and institutions

Cultural and practical

Barriers:
  • Competition culture currently trumps sharing culture
  • A reward system for sharing is not established, either for articles or research data
  • Reward systems for new roles and for non-academic participants are not established
  • The traditional measure of impact (Journal Impact Factor) is entrenched
  • New metrics for impact of various types are not widely used
  • There is a lack of metrics for measuring non-academic impact properly
  • Research assessment exercises exert a constraint on adoption of new impact metrics
  • League tables and rankings can help entrench the competitive culture

Opportunities:
  • Policy-makers are starting to listen to the arguments for openness
  • Faculty are starting to listen to the arguments for openness
  • Students represent fertile ground for the arguments for openness
  • There is a growing collaborative approach within the sector
  • Increasing participation from outside academic life
  • Re-engagement with the public will allow better demonstration of the value for society created within HEIs
  • Evolution of peer review will streamline the process and minimise problems
  • An opportunity exists for scholarly societies to take responsibility for the development of better peer review processes in their fields
  • A return to the ‘scientific conversation’

What can be done?

So, what can be done to exploit opportunities and overcome barriers? The list below comprises points made during the workshop or derived from its notes and outputs. It is not exhaustive, but it serves to summarise the most significant, promising or do-able options. As such, it can form a basis for further discussion and debate.

  • Policy development on open access and open data
  • Monitoring and enforcing of policies
  • Better embedding of scholarly communication in institutional strategies
  • Identification and empowerment of institutional champions of the open agenda
  • Better monitoring and recording of research outcomes – at the level of institutions, teams and individuals
  • Maintenance of the focus on affordability of the scholarly communication system
  • Encouragement of plurality of outputs, eligible for reward
  • Development of workable attribution mechanisms for non-text outputs
  • Research and development on unresolved interoperability issues
  • Recognition and reward for sharing
  • Encouragement and reward for innovation in scholarly communications
  • Encouragement and rewarding of new forms of participation
  • Encouragement and rewarding of new forms of collaboration


Workshop attendees

Delegates

  • Kevin Ashley, Digital Curation Centre
  • Niamh Brennan, Trinity College Dublin
  • Laura Brown, Ithaka
  • Peter Burnhill, EDINA
  • Catriona Cannon, Bodleian Libraries, University of Oxford
  • Keith Cole, Mimas, University of Manchester
  • Louisa Dale, Jisc
  • David De Roure, Oxford e-Research Centre
  • Mathew Dovey, Jisc
  • Marc Dupuis, SURF
  • Matthew Grist, Demos
  • Clem Guthro, Colby College
  • Robert Haymon-Collins, Jisc
  • Charles Lowry, Association of Research Libraries
  • Richard Otlet, Jisc
  • Leo Plugge, Stichting SURF
  • David Prosser, Research Libraries UK
  • Cecilia Preston, CNI
  • Seb Schmoller, Association for Learning Technology
  • Alma Swan, Key Perspectives Ltd
  • Alex Wade, Microsoft Research
  • Norman Wiseman, Jisc
  • Nicola Yeeles, Jisc

Speakers/Moderators

  • Paul Ayris, UCL
  • Philip Bourne, University of California San Diego
  • Josh Brown, Jisc
  • Rachel Bruce, Jisc
  • Ian Carter, University of Sussex
  • Paul N. Courant, University of Michigan Library
  • Jeremy Frey, University of Southampton
  • Diane Harley, Center for Studies in Higher Education, UC Berkeley
  • Jeff Haywood, University of Edinburgh
  • Neil Jacobs, Jisc
  • Joan Lippincott, CNI
  • Clifford Lynch, CNI
  • Elizabeth Lyon, UKOLN, University of Bath
  • Tara McPherson, Vectors
  • Mark Patterson, eLife
  • Stephen Pinfield, University of Nottingham
  • Kevin Schurer, University of Leicester
  • Mike Taylor, Elsevier Labs
  • Amber Thomas, Jisc
  • Sarah Thomas, Bodleian Libraries, Oxford
  • Paul Walk, Innovation Support Centre, UKOLN, University of Bath
  • Donald Waters, The Andrew W. Mellon Foundation


 
Author: Alma Swan
Publication date: 19 November 2012