Call for contributions: Failures and fallacies of innovative research methods
Background and rationale
The current climate of higher education across the globe is characterised by research excellence discourses, increased rivalry for ever-shrinking funding and grant pots, and tougher competition to publish within scientific communities. As a consequence, pressures to be original and to innovate are heightened. Indeed, innovation and originality are set criteria for doctorates, funding applications, fellowships and teaching positions. It is no wonder, then, that in recent years we have seen a veritable ballooning of innovations in research methods aimed at making research less hierarchical, more participatory, more accessible, more modern and more in line with developments in our social and cultural worlds. There are no longer clear boundaries between qualitative and quantitative methods, or indeed within them. Where ethnography was once a specific approach to carrying out research that required weeks or even months in the field, harnessing social media data, for example, allows months’ worth of information to be collected in a much shorter period of time.
Innovative research methods
In the spirit of research excellence, innovation and originality, these new approaches are regularly reported on and published. Such publications tend to report the success stories. However, there are limits to what can be innovated and how, and innovative research methods can have unforeseen and/or negative consequences. Every researcher knows that research is messy, chaotic, untidy and disorderly, and yet research reports rarely account for this nature of research. For this special issue I propose to publish articles that focus on the difficulties, failures and fallacies of attempted innovations, highlighting when and how things go wrong and what can be done about it. For example, data collection methods may not transfer as easily from one research context to another as some researchers assume, especially where different countries and cultures are concerned. As a consequence, an innovative data collection method may yield data quite different from what was expected, or data sets may be incomplete. The special issue will focus specifically on how researchers salvaged such situations.
Article details
The articles will specifically address questions of what went wrong and how researchers dealt with their failures; which fallacies researchers originally fell prey to; how we can distinguish bad research from innovation gone wrong; and whether there is an argument for continuing to push innovation as a driving force, or whether errors and faults should be accepted as such and set aside.
Articles will be between 6,000 and 8,000 words in length (including bibliography, abstract, footnotes and endnotes).
Submit abstracts of 300 words via the following contact form.