Using IT effectively: a guide to technology in the social sciences

Edited by Millsom Henry
University of Stirling
© Millsom Henry 1998, the collection and introductory material; © the contributors, individual chapters, 1998.

This book is copyright under the Berne Convention. No reproduction without permission. All rights reserved.

First published in 1998 by UCL Press.
UCL Press Limited, 1 Gunpowder Square, London EC4A 3DE, UK
and 1900 Frost Road, Suite 101, Bristol, Pennsylvania 19007–1598, USA

This edition published in the Taylor & Francis e-Library, 2005. “To purchase your own copy of this or any of Taylor & Francis or Routledge’s collection of thousands of eBooks please go to www.eBookstore.tandf.co.uk.”

The name of University College London (UCL) is a registered trade mark used by UCL Press with the consent of the owner.

British Library Cataloguing-in-Publication Data: a catalogue record for this book is available from the British Library.
Library of Congress Cataloging-in-Publication Data are available.

ISBN 0-203-98452-8 (Master e-book)
ISBN 1-85728-794-0 (Print Edition, HB)
ISBN 1-85728-795-9 (Print Edition, PB)

Cover design by Amanda Barragry
CONTENTS

FOREWORD Professor Howard Newby vi
LIST OF FIGURES AND TABLES vii
NOTES ON CONTRIBUTORS viii
EDITOR’S INTRODUCTION Millsom Henry xiii

SECTION ONE: NEW CHALLENGES FOR TEACHING AND LEARNING
1 EXPONENTIAL EDUCATION Peter Cochrane 2
2 PEDAGOGY, PROGRESS, POLITICS AND POWER IN THE INFORMATION AGE Stephen Heppell 12
3 TECHNOLOGY AND SOCIETY: AN MP’S VIEW Anne Campbell 15
4 INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY Adrian Kirkwood 19

SECTION TWO: DEVELOPING COURSEWARE FOR THE SOCIAL SCIENCES
5 EXPECTATIONS AND REALITIES IN DEVELOPING COMPUTER-ASSISTED LEARNING: THE EXAMPLE OF GraphIT! Ruth Madigan, Sue Tickner, Margaret Milner 29
6 THE DATA GAME: LEARNING STATISTICS Stephen Morris, Jill Szuscikiewicz 36
7 CONVERSION OF THE IDEOLOGIES OF WELFARE TO A MULTIMEDIA TEACHING AND LEARNING FORMAT David Gerrett 47
8 DESIGNNET: TRANSNATIONAL DESIGN PROJECT WORK AT A DISTANCE Stephen Scrivener, Susan Vernon 54

SECTION THREE: IMPLEMENTING COMPUTER-ASSISTED LEARNING IN THE SOCIAL SCIENCES
9 COMPUTER-AIDED LEARNING AS A LEARNING TOOL: LESSONS FROM EDUCATIONAL THEORY Graham R. Gibbs, David Robinson 62
10 ANORAKS AND TECHIES: A CALL FOR THE INCORPORATION OF NON-TECHNICAL KNOWLEDGE IN TECHNOLOGICAL DEVELOPMENTS Vernon Gayle 73
11 EVANGELISM AND AGNOSTICISM IN THE TAKE-UP OF INFORMATION TECHNOLOGY Danny Lawrence, Ken Levine, Nick Manning 80
12 STANDARDS FOR THE NON-STANDARD: THE IMPACT OF NEW TECHNOLOGY ON THE NON-STANDARD STUDENT Ann Wilkinson 91

SECTION FOUR: THE EFFECTIVENESS OF THE NEW TECHNOLOGIES IN TEACHING AND LEARNING ENVIRONMENTS
13 INFORMATION TECHNOLOGY AND TEACHING QUALITY ASSESSMENT: REFLECTIONS OF A SOCIOLOGIST Chris Turner 98
14 WHY COSTS ARE IMPORTANT IN THE ADOPTION AND ASSESSMENT OF NEW EDUCATIONAL TECHNOLOGIES David Newlands, Alasdair McLean, Fraser Lovie 106
15 USING MULTIMEDIA TECHNOLOGY FOR TEACHING: A CASE STUDY APPROACH David Crowther, Neil Barnett, Matt Davies 115
16 INFORMATION TECHNOLOGY AND TEACHING THE SOCIAL SCIENCES: OBSTACLES AND OPPORTUNITIES Duncan Timms 125

GLOSSARY 135
INDEX 137
FOREWORD
Professor Howard Newby Vice-Chancellor, University of Southampton
This book could not be more timely. The shape, structure and content of higher education in the UK are once more under intense public scrutiny, and the role of information technology (IT) in teaching and learning remains a key issue. “Using IT effectively” is central to this. There are those with influence over policy in Brussels, Whitehall and Westminster who still regard IT as some sort of panacea: put students in front of VDU screens, the argument runs, and mass higher education can be provided on the cheap. Such “technological fix” arguments are simply false. But so is their contrary, that new technology has no relevance to the future provision of higher education in this country. Clearly the nature of teaching and learning will be affected—sometimes in quite profound ways—by new developments in IT.

The key words in the title of this book are not so much “IT” (most developments over the next decade are known or knowable), but “using” and “effectively”. The technology will stand or fall by its use. Education is—and will, for the foreseeable future, remain—at heart a social process. New technology can assist in raising the quality of this process, but it must go with the grain of conventional pedagogy. Without this, the sheer quantity of information available via modern information technology will disable, rather than enable, participation in a genuinely educational process.

In the social sciences these issues are particularly pertinent. The social sciences thrive on debate. While the acquisition of empirical data is an important component of the understanding of society, the facts are never self-evident: they require interpretation. IT is, therefore, an essential tool in the social sciences and brings on to the desktop the capacity to assimilate and analyze information at a speed and a cost undreamt of less than a generation ago. But it is not a substitute for the informed understanding that also comes from debate and discussion.
Social science is, in many respects, the epitome of education as a social process. The contributions to the collection are not, therefore, narrowly concerned with the technology per se, startling though the advances in this continue to be. Most of the contributors are concerned with the organizational, social and pedagogical use of the new teaching and learning technologies. Relating these technologies to those contexts is crucial if the promise evinced by the new technologies is to be fulfilled. As we look forward to developing a higher education system attuned to the needs of a new century, these chapters remind us of the great care with which new technology must be handled if, indeed, it is to be used effectively.
LIST OF FIGURES AND TABLES

Figures
Figure 1.1 Children using computers: an on-line school session. 5
Figure 1.2 The surrogate head surgeon. 6
Figure 1.3 The virtual university. 7
Figure 4.1 Households in the UK with video recorder, home computer and audio CD player, 1985–94. 21
Figure 4.2 Proportion of UK households with selected media technologies, 1994. 22
Figure 4.3 Access to information and communications technologies in UK households, 1994. 23
Figure 6.1 A painless way to absorb basic information. 40
Figure 6.2 Repeated, varied and purposeful experimentation. 40
Figure 6.3 Explaining the pattern recognition principle of choosing a test. 41
Figure 6.4 Interactivity allows the student to follow their own line of interest. 41
Figure 6.5 Knowing which tests are appropriate. 42
Figure 6.6 More information on each test is available. 42
Figure 6.7 Self-testing enables students to monitor their learning in an exploratory environment. 42
Figure 7.1 Steps in the authoring process (A–M). 49
Figure 7.2 Example of a lesson in Ideologies of Welfare. 50
Figure 9.1 Correlation Explorer. 69
Figure 9.2 A screen from Inspiration. 70
Figure 9.3 The Polygraph from MacLaboratory. 70

Tables
Table 6.1 Areas covered by the software. 37
Table 6.2 Confidence and understanding before and after using the software. 42
Table 7.1 Steps in the authoring process (A–M). 48
NOTES ON CONTRIBUTORS
Neil Barnett is Lecturer in Public Sector Management at Leeds Metropolitan University. His research interests include local government structure and decentralization. He is interested in developing multimedia teaching/learning material for public sector managers and social science students in an interdisciplinary environment.

Anne Campbell is the first woman MP for Cambridge and the city’s third Labour MP. Since her election in April 1992, she has taken a special interest in science and technology, education and economic affairs. She was a member of the House of Commons Select Committee on Science and Technology from 1992 to 1997 and was Vice-Chair of the Parliamentary Office of Science and Technology. Anne Campbell worked with David Blunkett, Secretary of State for Education, on the future of information technology (IT) in education and research from 1995 to 1997. She is currently Parliamentary Private Secretary to John Battle, Minister for Science, Energy and Industry. She has also chaired a sub-group of Labour’s Policy Commission on the Information Superhighway. Educated at Newnham College, Cambridge, she taught mathematics at Cambridge secondary schools before becoming a Senior Lecturer in Statistics at Cambridgeshire College of Arts and Technology (now Anglia Polytechnic University). She was Head of the Statistics and Data Processing Department at the National Institute of Agricultural Botany from 1983 to 1992. She is a Fellow of the Institute of Statisticians, the Royal Statistical Society and the Royal Society of Arts.

Peter Cochrane is Head of Research at BT Laboratories. A graduate of Trent Polytechnic and Essex University, he is also a visiting professor to UCL, Essex and Kent universities. Peter Cochrane has published and lectured widely on technology and its implications for society. He received the IEE Electronics Division Premium in 1986; the Queen’s Award in 1990; and the Martlesham Medal for contributions to fibre optic technology, the Computing and Control Premium, and the IERE Benefactors Prize in 1994.

David Crowther is Lecturer in Management Accounting at Aston Business School, Aston University. His main area of research is corporate performance measurement and behavioural accounting. He is interested in teaching the use of technology, and in particular multimedia, as a teaching/learning tool.

Matt Davies, a qualified chartered accountant, is Lecturer in Financial and Management Accounting at Aston University. His main research interests are the use of shareholder value performance measures and the use of technology in the teaching process.

Vernon Gayle is Lecturer in Sociology at the University of Stirling and responsible for teaching research methods and data analysis to both undergraduates and postgraduates. His own research is mainly concerned with analyzing social and economic data using generalized linear models. He is a committed
GLIM4 user and has published a paper that reflects his interests in ordered categorical data analysis. He has also conducted research into a complementary treatment regimen for cancer patients.

David Gerrett received his Bachelor of Pharmacy degree from Queensland University in 1977. After professional registration and work as a community and hospital pharmacist, he returned eight years later to full-time academia and read for a Masters in Hospital Pharmacy. Continuing part-time, he received his doctorate in 1995; it considered the role of community pharmacists as advisers on prescribed medication. In researching this role he was led towards a greater understanding of the literature on social policy and professionalism. An outlet for this understanding was provided by his role as course leader and primary author for the Postgraduate Programme in Social and Administrative Pharmacy, which uses multimedia as its sole teaching and learning method. The conversion of the Ideologies of Welfare was found to have generic appeal and is currently in use on three further postgraduate programmes.

Graham R. Gibbs is Principal Lecturer in Sociology at the University of Huddersfield. He has wide experience of teaching research methods using IT at undergraduate and postgraduate levels. His most recent research has focused on the use of CAL in teaching the social sciences, and especially the use of computer-aided co-operative learning in teaching theoretical subjects.

Millsom Henry is the Deputy Director of SocInfo (see Glossary). She graduated from the universities of Durham and Stirling as a social scientist, with specific teaching and research interests in the sociology of ethnicity and gender, especially in relation to culture/media and the social implications of the new technologies. She has presented numerous papers at international conferences, published over 12 articles and is editor of three regular publications. Millsom Henry was commissioned to edit three books and to write two chapters for publications due out in 1997. Finally, but not least, she somehow finds time to complete a part-time PhD on the identities of Caribbean women in Britain at the University of Stirling.

Stephen Heppell is Professor of Information Technology in the Learning Environment at Anglia Polytechnic University and Director of ULTRALAB. Stephen Heppell is a member of a number of public committees and acts as a consultant in both the public and private sectors; he also has a long list of television appearances and writes regularly for the popular press. He is on the editorial board of the Journal of Information Technology for Teacher Education and the Journal of Multimedia and has contributed many chapters in books and journals; a full list can be viewed at: http://www.ultralab.anglia.ac.uk/pages/ultralab/team/stephen/contents.html

Adrian Kirkwood is Head of the Programme on Learner Use of Media within the Open University’s Institute of Educational Technology. He undertakes research and evaluation studies related to access and applications of media and information technologies, both within and outside the Open University. He has been a consultant to organizations including British Telecom and the National Council for Educational Technology and has made invited contributions to international conferences. He co-authored Personal computers for distance education (Paul Chapman 1992) and has published widely on the subject of using media in education and training.

Danny Lawrence is Senior Lecturer in Sociology in the School of Social Studies at the University of Nottingham. His early research in “race” and ethnic group relations led to Black migrants: white natives (Cambridge University Press, 1974, reprinted and reissued by Gregg, 1992) and he has subsequently published many articles in this field.
He has since conducted research and published on transmitted deprivation; youth unemployment; the professional aspiration and changing circumstances of the occupational groups responsible for the delivery of careers guidance; the disestablishment of teacher careers and, most recently, higher education and the labour market.
Ken Levine lectures at the School of Social Studies at Nottingham University. His main sociological research interests are adult literacy and architects as a professional group. He has gained considerable experience using computers with both undergraduate and postgraduate students in a variety of contexts, including courses on statistics and survey design and analysis, as well as introductions to networks, word processing, email and databases. He has collaborated with colleagues (including Danny Lawrence) on the production of CAL courseware (using Authorware Professional) designed for a module on “official” statistics. A long spell as the departmental Computing Officer attempting to meet the needs of staff and student users has taught him that, despite massive advances in technology, the gap between expectations and reality in educational IT is more or less unchanging.

Fraser Lovie is Research Assistant in the Department of Politics and International Relations at the University of Aberdeen. He is also working with Alasdair McLean on the research project on the Internet delivery of International Relations courses.

Ruth Madigan is a Senior Lecturer in Sociology at the University of Glasgow. She has taught urban sociology and methods of social research, in particular data analysis using SPSS, for many years. Her article “Gender issues: teaching with computers in sociology” (SocInfo Journal I, 1995) arose directly out of this experience. The TLTP-TILT project (University of Glasgow) offered an opportunity to explore new approaches to computer-aided learning in the area of basic research statistics.

Alasdair McLean is Lecturer in the Department of Politics and International Relations at the University of Aberdeen and is also the Convenor of the Faculty IT User Group. He has considerable experience of distance education through audio-conferencing and leads a research project on the Internet delivery of international relations courses.

Nick Manning is Professor of Social Policy and Sociology, University of Nottingham. He has been interested in introducing IT into the social science curriculum since the late 1980s, both as a general environment for student learning and for specific courses. This has included both data analysis and the construction of lectures in hypertext. He has also developed postgraduate degrees combining social policy and IT, with European Union (EU) funding. His research work is mainly on eastern Europe. This started in the 1980s on social policy, changed to environmental and housing movements in the early 1990s, and is now on (un)employment policy and household experience in Russia. His other areas of work have included medical sociology and health policy, comparative social policy among various OECD countries, and general theories of social problems and social policy.

Margaret Milner is a Lecturer in Quantitative Methods in the Department of Accounting and Finance at the University of Glasgow. She has a keen interest in developing computer applications for the teaching of statistics and quantitative methods to accountancy students, and also teaches MBA students and students interested in IT. Her research topics include the distributional properties of accounting ratios, and decision-making and report format choices. As a member of the team developing GraphIT!, a TLTP-TILT project, the strategic use of graphs and graphical analysis is also an important teaching and research interest.

Stephen Morris has been producing CAL for many years, and was an original ITTI project holder for the production of CAL in medical education, for which the software Statistics for the Terrified was produced. He is one of the prime movers of the MIDRIB project, which will bring together a comprehensive database of peer-reviewed medical images for UK higher education over the Internet. He has also been the head of successful higher education computer units at St Bartholomew’s Hospital (1984–94) and, currently, St George’s Hospital Medical School.
Professor Howard Newby took office as Vice-Chancellor of the University of Southampton on 1 September 1994, moving from the Economic and Social Research Council, where he was first Chairman (1988–94) and then Chief Executive (1994). Professor Newby has a background of research and writing in rural affairs, including many books and articles on social change in rural England, and is a Rural Development Commissioner. He is also a member of a number of government bodies concerned with the funding of research in the UK, including the Dearing Committee Research Working Group; Chairman of the Centre for the Exploitation of Science and Technology; and a member of the executive council of the European Science Foundation. He is Vice-Chair of the Committee of Vice-Chancellors and Principals and serves on a number of its steering and sector groups.

David Newlands is Senior Lecturer in Economics at the University of Aberdeen. His principal research interests in economics include regional economics and the economics of the welfare state. He has also conducted research on new educational technologies and is directing a major project to examine the impact of such technologies on educational provision in Scotland.

David Robinson is Senior Lecturer in Psychology at the University of Huddersfield. His work has included involvement in the development and evaluation of a software system to support general practitioners. His current research interest is the relationship between autobiographical memory and fantasy.

Stephen Scrivener is Professor and Director of the Design Research Centre at the University of Derby. He has published four books and over 50 papers in learned journals, has refereed conferences and has made numerous presentations.

Jill Szuscikiewicz has worked on CAL with Stephen Morris since the original ITTI grant was obtained and, more recently at St George’s Hospital Medical School, has worked on projects with a more medical base, such as Immunology Explained and Heartbeat. She is now project manager of MIDRIB, funded under the eLib programme as a joint project with Bristol University.

Sue Tickner is a Teaching and Learning Support Consultant at the University of Glasgow. Originally an English graduate, she worked as a teacher in Britain and Spain before returning to the UK for an MSc in IT and learning (by distance learning). After working as an independent developer/trainer and as an Open University tutor, Sue Tickner joined the University of Glasgow’s TLTP project as designer/co-ordinator for the Numerical Data Group. She has recently been conducting research into distance education and remote learning technologies.

Duncan Timms is Director of SocInfo (see Glossary) and Professor of Sociology in the Department of Applied Social Science at the University of Stirling. He is also Director of Project VARESTILE (the Value-Added Reuse at Stirling of Existing Technology in the Learning Experience), an institutional project funded under the Teaching and Learning Technology Programme. His teaching and research interests encompass two main areas: the social correlates of health and illness, and the social implications of information and communications technologies. He is also the Director of a series of Scottish-Nordic Winter Schools on Comparative Social Research funded by the EU under the Training and Mobility of Researchers Programme.

Chris Turner is Professor of Sociology, University of Stirling. His teaching and research interests include the social constructions of childhood; processes of transition from childhood to adulthood; the impact of state intervention in children’s lives; and state policies on children since 1945.

Susan Vernon is Director of Applied Arts at the University of Derby. Having studied at Gray’s School of Art in Aberdeen (jewellery and printmaking) and at the University of Central England, Birmingham (MA in Industrial Design), Susan regularly exhibits her work at an international level.
Ann Wilkinson is a Senior Research Fellow working as the coordinator of the CTI Centre for Human Services in the Department of Social Work Studies, University of Southampton. The Centre is also funded to provide information and advice to CTI Centres on antidiscriminatory practice. Her previous research includes a European-funded project to provide information on access to higher education for disabled students. This was published as an information system and forms part of her MPhil thesis.
EDITOR’S INTRODUCTION Millsom Henry
This book represents a selection of edited papers, most of which were presented at the first SocInfo (see Glossary) International Conference on Technology and Education in the Social Sciences (TESS) in September 1995 at the University of Stirling. This event provided a unique opportunity for social scientists to share ideas and experiences around the issues of innovation in teaching and learning. The proliferation of technology initiatives, the growth of more generic computer-assisted courseware for higher education, the rapid expansion of higher education, the emergence of the Internet and the Teaching Quality Assessment (TQA) exercises in the UK all provided the backdrop for many of the issues raised at the conference. As a result, the papers in this collection represent an attempt to document, as well as to stimulate, some critical debate on the impact of these technologies on teaching and learning.

Although studies are emerging in the social sciences on the varying forms of technology and its related sub-cultures, more work is needed on how technology affects everyday life. The social, political and economic implications of the new technologies should be a central concern for the social sciences. Interestingly, this was recognized by the Economic and Social Research Council (ESRC) with the launch in spring 1997 of a new research programme on the role of virtual technologies in society.1 This programme will pay attention to the wide-ranging implications of new technology, attention which to date has been missing in social scientific research. This must be good news. It is heartening to note that the themes surrounding the effective use of technology in education were also highlighted as key areas in the new ESRC programme.

Using IT effectively: a guide to technology in the social sciences examines in detail some of the major issues associated with the development, impact, implementation and assessment of technology, particularly within the social sciences.
In UK higher educational institutions, the teaching and learning process is currently undergoing a major revolution, and a more sustained examination of the development, implementation, assessment and impact of technology within that process is required. This book should be read as a basis for further investigations into both the positive and negative consequences of technology.

Section One contains four key contributions from business, politics, academia and the technological field on the changing effect of technology on the teaching and learning process. In Chapter 1 Peter Cochrane points out the significance of expanding the traditional way of looking at teaching and learning. As Head of Research at BT Laboratories, Cochrane contends that while the younger generation “are embracing IT and rapidly gaining skills, the teaching profession remains dominated by a population resisting, or unable to see the need for, change”. This has led to a gap between students and staff and is, according to Cochrane, a
1. ESRC Virtual Societies Programme directed by Professor Steve Woolgar at Brunel University. For more details refer to: http://www.esrc.ac.uk/programmes
cause for concern. In addition, there is a failure to appreciate the significance of IT as an effective tool for teaching and learning. Cochrane insists that “IT is not an ‘instead of’ but an ‘as well as’ technology. It is unlikely to replace the teacher or the institution but it will change their nature. In future, education will have to be more available, just-in-time, and on-line as it becomes a continuous life-long process.” The key point of this chapter is that there is a dramatic shift in the way formal education is viewed, which has implications for staff, students and the nature of higher education disciplines. Social scientists should be interested in these shifts not only as an area for external enquiry, but also to investigate ways in which IT can be used more effectively as a tool within their own discipline areas.

In Chapter 2 Stephen Heppell, the Director of ULTRALAB, points out how developments in educational technology contrast sharply “between rapidly expanding advancing technological potential and slower pedagogical, social and political development”. This has led to a number of tensions “between the emergent capabilities both of the technology itself and of the ‘children of the information age’; the challenge that these capabilities pose for existing models of education and assessment; the challenges posed for public policy and the social implications of technology for work, gender, family and education”. These tensions are of central concern to the social science disciplines, yet to date we have failed to engage critically in the debate.

Chapter 3 focuses on the need to use the technologies effectively to encourage innovative research and to assist in the development of good teaching and learning skills.
Anne Campbell, the Labour MP for Cambridge, demonstrates how technology can be used to improve access to the political as well as the learning process by describing the design and management of her online political surgery, one of the first in the UK. Campbell highlights the issues of access and empowerment which are central to the debate about the implications of the new technologies, and insists that the information revolution should “not worsen the divisions in society, but is used to enable opportunity, equality and democracy”.

In the final chapter of this section, Adrian Kirkwood also picks up on some of the negative consequences of technological development and considers whether it will exacerbate social differences. The evidence to date suggests that, whether in the home or at work, technology has tended to reinforce existing social differences in relation to gender and class. In this regard, Kirkwood argues that these concerns represent a valid subject for social scientific enquiry.

In Section Two, a few examples of the development of courseware in the social sciences are highlighted. Ruth Madigan, Sue Tickner and Margaret Milner describe their work in Chapter 5, as part of an interdisciplinary team producing a basic statistics CAL program. As self-confessed “relative novices in courseware development”, the authors describe how the design and implementation of their program forced a fundamental re-evaluation of their own teaching methods. Their case study provides some salutary lessons about collaboration in courseware development and raises issues about the effective use of technology for teaching and learning.

In Chapter 6, Stephen Morris and Jill Szuscikiewicz also outline their attempts to develop, assess and implement a statistical program for students. The chapter demonstrates how to exploit the technology and introduce graphics and simulations to encourage practical experimentation.
By employing these methods, Morris and Szuscikiewicz assert, students will gain a deeper appreciation of statistics. In Chapter 7, David Gerrett describes how he developed a programme based on an existing core course. Gerrett utilized the literature about the ideologies of welfare in social and public policy to create what he called “one-to-one, non-judgemental tuition”.

Chapter 8 documents Stephen Scrivener’s and Susan Vernon’s vision of a future in which collaborative group work by designers, working in international teams supported by computer- and electronically-mediated communication and CSCW tools, will predominate. This
form of learning has implications for all discipline areas and may be of particular use to the social science community, which should be able to exploit fully the capability of computer-mediated communications.

The contributions in Section Three focus more directly on the implementation of CAL programs in the social sciences. In Chapter 9, Graham R. Gibbs and David Robinson argue that attempts to replace the teacher with technology are unhelpful and relate to a much wider societal process of deskilling. Developments in CAL, they argue, should be used to enhance teaching skills by providing flexible learning tools rather than seeking to replace or deskill teachers.

In Chapter 10, Vernon Gayle points out how the influx of IT into higher education has been poorly conceived and ineffectively implemented. Gayle argues that the inclusion of a sociological account of the teaching and learning environment serves to provide “an empirical account of the sociality of the teaching and learning environment and incorporates the knowledgeability held within the non-technical perspectives”.

In Chapter 11, Nick Manning, Danny Lawrence and Ken Levine examine some of the reasons why academics have been reluctant to embrace technology in their teaching and research. Paying particular attention to the attitudes, organizational ethos and context of academia, the authors maintain that a number of factors have hindered the successful development and use of IT. In response to Brackenbury and Hague’s (1995) article, Manning et al. argue that, rather than being dismissed as irrational, the actions of academics who do not embrace technology may actually be based on calculation.

Ann Wilkinson’s contribution in Chapter 12 addresses how the implementation of technology affects groups defined as “non-standard” students. Wilkinson points out that educational technologies should be providing the opportunity to look at different approaches to teaching and learning that benefit all.
In Section Four, Chris Turner reflects on his recent experience of TQA (Teaching Quality Assessment) in Scotland by exploring both the current and potential uses of IT in the learning and teaching of sociology in higher education. Focusing in more detail on cost, David Newlands, Alasdair McLean and Fraser Lovie in Chapter 14 stress the importance of comparing the costs of different technologies as well as looking at the evidence for the learning achievements and experiences of students. David Crowther, Neil Barnett and Matt Davies assess how the introduction of computer-based-learning programs has been driven by the desire to achieve efficiency savings. However, as the authors point out, “efficient teaching may not represent efficient learning”. Crowther, Barnett and Davies proceed to outline the general failure to exploit multimedia technology, particularly in the social sciences, before going on to describe how to maximize both effective learning and efficient teaching. Finally, in Chapter 16, Duncan Timms examines the obstacles and the opportunities of IT teaching in the social sciences. According to Timms, the development of computer-based learning in the social sciences has been slow, with the exception of its use in data collection and analysis. Consequently, teaching IT in the social sciences has remained largely unchanged. The reasons for this are both general and specific, and as a result the pressures are also both positive and negative. The chapter ends with a brief look ahead to the direction that the social science community should take in relation to the pervading role of technology. The issues surrounding developments in technology are undoubtedly wide-ranging, as these chapters have shown. There are issues which still need to be identified as well as resolved. This book should be seen as an attempt to provide a working document on some of the issues raised. 
It is clear that as a research area, the social, political and economic implications of technology will expand over the next few years; and in the area of education it is evident that with the continuing expansion of higher education, issues of access, quality, effectiveness and choice will be central. Consequently, the pedagogic nature of disciplines, the structure of universities, the teaching and learning styles of both staff and students and any partnership with industry, commerce and public policy must be strategically reviewed. This will undoubtedly involve a huge investment in time as well as financial and other resources. Nevertheless, for the first time, the social
science community are well-placed to take the lead and to shape policy in this area. The role of technology, then, must be an issue that is placed high on the social science agenda.
Section One NEW CHALLENGES FOR TEACHING AND LEARNING
Chapter 1 EXPONENTIAL EDUCATION Peter Cochrane
We are living at a time of unprecedented change, with technology advancing faster, and producing more new opportunities, than ever before (Lilley 1995). IT has created not only the mechanisms to do more with less, but also the means of storing, accessing and transporting information on a scale inconceivable just ten years ago (Emmot 1995). Technology feeding technology, with machines used to design better machines, is the evolutionary process responsible for the exponential capability growth now driving society. In contrast, our wetware (the brain between our ears) has seen no significant change during the past 150,000 years, and in evolutionary terms mankind is in stasis (Calvin 1991). So if we are to survive in a technologically driven world that is changing faster than we can biologically accommodate, we have to use the very technology that engendered our predicament to help us cope; it is our only course of action. Going back to earlier, and in many respects simpler, times is not an option—no matter how distortedly attractive it may appear (Bronowski 1973). The progress of our species has always been, and remains, irrevocably linked to innovation and technology—and it is one way only! We just could not support the world’s population of over 5 billion without the technology we have come to take for granted (Toffler 1971).

Human-technology perspective

Only 2,000 years ago most of humankind lived in tribal communities of just a few hundred individuals, meeting and knowing fewer than 1,000 people in a lifetime. For this life we were well equipped, with all of the tribe’s knowledge contained in the human brain and passed on from father to son, mother to daughter. For most, all the information they ever required was within the tribe. Civilization, cities and trade changed all this, and in a period of less than 200 years the transition from a farming and rural existence to the Industrial Age was completed (Bronowski 1973). 
During this transition, the ability to transport large quantities of goods and people across the planet emerged, creating a demand for good telecommunications. It is interesting to reflect that colonization and supremacy in war were the primary motives for the development of much of our industry, and have led directly to today’s revolution in IT. More impressively, we have created a new era in much less than 100 years. When De Forest invented the triode valve in 1906 he could never have guessed the revolution that he was starting. The next major step was the invention of the transistor in 1947 by Shockley, Bardeen and Brattain, to be followed by the integrated circuit in 1958, the laser in 1960, and optical fibre in 1966. In the last 50 years we have seen the world become dominated by electronics (chips) and optical fibre. As a result, computers and communication are now ubiquitous, and we have created more information, achieved and understood more, than all of the past generations since we first discovered fire. This pace of change will not only continue but accelerate, and the trajectory is now clear—it is exponential. Every year
(or thereabouts) sees optical fibre transporting twice as much traffic (Cochrane & Heatley 1995), memory chips storing twice as much data, and computers running twice as fast (Emmot 1995).

Many people consider English to be the planet’s primary language, and speech to be the most sophisticated and dominant form of communication. Well, they are wrong. The dominant form is now binary, and it is between machines, which have more conversations per day than mankind has had in its entire existence (Drexler 1990). We can now wear more computing power in a wristwatch than was provided 30 years ago by a commercial computer the size of a domestic washing machine. In 10 years the PC will be around 1,000 times more powerful than today, and in 20 years nearly 1,000,000 times more. By about 2015 supercomputers will have reached human equivalence in terms of information storage and processing, and by 2025 that power will be available on our desks. About five years later (Calvin 1991, Regis 1991) computers will be wearing us! If we are to maintain a primary role on this planet, we must understand technology and use it to advance our own limited brain capacity. It is not possible to ignore these changes, for they are inexorable and will promote even more change (Cochrane 1995b, Kennedy 1993). In short, you can opt out, but you cannot escape.

Antagonistic technology

There is absolutely no doubt that most IT interfaces seem to have been designed by people who feel we should all be computer scientists (Norman 1988). This is definitely the wrong approach. Most people have great difficulty driving a VHS video recorder, let alone a PC. Unless we humanize machines (make devices extremely user friendly and easy to use), a society divided by its abilities with machines will be born. This would be a disastrous society of IT “haves and have-nots”, full of tension, and sub-optimal for our own productivity, progress and survival. 
It is vital, therefore, that technology is bent into people, and people are not bent further into technology (Emmot 1995). Today the primary interface is the button, switch, knob, mouse, keyboard and screen. This can only be viewed as archaic, and as something that should not survive. Fortunately, technology is now reaching a point where voice control and command, and even limited conversations between people and machines, are possible (Cochrane & Westall 1995). This Star Trek vision is the first step in the journey to a symbiotic relationship between carbon (us) and silicon (chips) life forms. It is also the first example of directly linking the nervous systems of two different entities. The next extension will be our sense of touch, as fingertips and other sensory areas are coupled directly into machines (Drexler 1990). In the meantime we have to make do with sight and sound, head-mounted screens, cameras, microphones and earphones. But even with this limited technology we can achieve a tremendous expansion and change in our abilities and society as we increase the access to, and throughput of, information and experience (Earnshaw & Vince 1995).

Education

In the slow-moving world of the ancients, who wrote and drew in the sand, on clay and on parchment, education followed the master-disciple model, whereby only a select few were chosen to be educated by a very few teachers. The world was a slow-moving place in which innovation and technology were alternately promoted and constrained by war and religion (Bronowski 1973). With the invention of the printing press major changes evolved: this was a new means of propagating the written word and, more importantly, ideas. Mass education started to take off; formal systems, teachers and classes grew in size and number throughout the developing world. Up to, and throughout, the industrial revolution this “Sage-on-the-Stage” system of imparting knowledge was very effective. 
Regimented classes of 30–50 children, drilled by a single teacher, proved an efficient and essential means of educating the armies of people required to fuel
the transition of society from agriculture and cottage industry to mass production and industrialization. Up to the end of this era change was still relatively modest and within the grasp of the individual, and so was education. However, at the dawn of the information age, the system and individuals were beginning to creak under the pace of change and the demand for more diversity (Toffler 1971). Long-held wisdoms of science, technology, economics and commerce started to shift or became increasingly challenged. In contrast, other topics such as mathematics, history, sociology and law remained relatively stable for a further 30 years. Today, nothing is stable, nothing goes unchallenged, and certainly our accepted modes of education and training are under threat as the world accelerates into the information age (Emmot 1995).

Let us examine this change in more detail. Thirty years ago the vast majority of children came from homes with few books and went to school for education. Today, unfortunately, the reverse is often true: many children with top-end computers, CDs and network access at home see school as having little to offer. Interestingly, in numerous programmes with children, it has become abundantly clear that the primary impediment to progress is not the young people. It is the older generation, trying to impart their experience and knowledge, who present the key limitation (Cochrane 1995a). For the most part our society appears divided at about the age of 29, with those older largely computer illiterate and those younger fully able. So it is not unusual to find a class dominated by a teacher who is IT illiterate and feels threatened by a class full of capability. This problem is compounded by the lifestyle of children, which is now partly governed by the games environment (Martyn, Vickers, Feeney 1990) of intuitive learning and a “crash and burn” culture. 
They feel no inhibition in discovering by doing, and in coming to grief in full public gaze, while the cultural background of their elders is the converse. It is perhaps not surprising to find that many youngsters view university, college and school as boring places where the teaching methods have not changed in aeons. These young people come from a world of instant gratification, of IT, of rapid access and experience, and of new and dynamic skills learnt in new ways (Cochrane (ed.) 1994a).

Examples of new ways

Just two decades ago a young child would have learnt to tell the time on an analogue clock-face, and the digital form would have been unusual. If they later developed an interest in science, engineering or flying, they would come to grips with the vernier scale and the altimeter by a single-step analogy with telling the time. Today, the converse is often the case. Children learn about flying very early, and you cannot fly an F16 simulator if you do not learn about cockpit instrumentation. So telling the time on an analogue display involves analogical reasoning in the reverse direction to that of 20 years ago.

Finding information has always been a social activity. The Dickensian library offered a degree of order and mapping that allowed a fair degree of success by the individual. However, much of the information retrieval process of this old world involved finding knowledgeable people; teachers, friends and colleagues could usually help steer us in the right direction. In the IT world we now have search engines and Gophers or Agents that serve the same purpose (Milne & Montgomery 1994). We can also communicate electronically with vastly more people to gain their assistance and steering. With such devices, students can search, find, sort and assemble information hundreds of times faster than previous generations could. Curiously, older people, especially teachers, often consider this cheating. 
There are now over 24,000 CD titles available for use with PCs, containing everything from classic books and whole-body interactive encyclopaedias to scientific experiments and university degree courses. The teaching of some difficult topics in science, statistics and engineering can now be enhanced significantly
Figure 1.1 Children using computers: an on-line school session.
through computer animation and visual representation. Instead of static words and two-dimensional pictures on paper, students can interact with three-dimensional entities on the screen to experience cause and effect at first hand. There is a growing library of standard experiments and situations available, along with medical operations, Shakespearean plays and legal cases. In this regard, interactive multimedia is providing an often superior alternative (Martyn, Vickers, Feeney 1990) to individual teachers and books for large tranches of education. It is now possible to illustrate and explain immensely complex systems and situations with the technology of visualization and virtual reality (MacDonald and Vince 1994). Unlike Crick and Watson, students should not have to construct a model of DNA using cardboard and coathangers (Crick 1994). Access to mathematical representations in a visual form that is exciting, stimulating and edifying is now a given in modern industry. Leading manufacturers no longer construct prototypes: they build the real thing in virtual space, and then go straight to the production line with the finished product (Earnshaw & Vince 1995; Yates 1992). Education needs this technology too.

Shared experiences

With telepresence technology it is now possible for a one-to-many or one-to-one experience to be realized efficiently on a massive scale. The surrogate head is just one development, in which miniature television cameras above the eyes and microphones above the ears collect information in real space and time. This can then be transmitted and displayed on a screen, or a VR headset, to one or more people in any location on the planet. So a surgeon can perform an operation with a thousand students standing inside his or her head looking out. Conversely, when a protégé performs the same operation, the surgeon can stand inside and advise in the closest possible sense (Cochrane 1994).
Figure 1.2 The surrogate head surgeon.
Within the next 15 years the addition of touch to such systems will make this human experience almost complete. This might sound far-fetched, but it exists in the laboratory today, and has been used for real operations on humans over standard dial-up ISDN circuits (Cochrane, Heatley, Pearson 1995). This technology is applicable to a wide range of disciplines, and has the potential to change the education and training paradigm completely, to just-in-time.

Half-life education

In fast-moving areas of technology many degrees now have a half-life of less than five years. Moreover, the time when a single-discipline degree was sufficient for a lifetime of work has long gone (Gell & Cochrane 1995). For example, it is not unusual to find electrical engineers now concerned with biology, sociology and genetics. So it seems time to create a new form of degree that is much lower, broader, more generic, and able to equip people for a world that will change rapidly over a working lifetime. In addition, a series of higher degrees is necessary that can be rapidly acquired as technology and work practices change. However, as business life and industry also accelerate and demand increases, so does the pressure to hang on to the scarce well-trained resource that is key to the success of the very enterprise itself (Hague 1991).

Virtual university

It is partly in response to the above paradox that five years ago BT created a series of internal degree courses. Their organization and running were placed under the auspices of several conventional universities banded together to create the desired profile and course content (Cochrane 1995b). Interestingly, this content is increasingly dynamic, as each year sees the course material change to meet the needs of a fast-moving business. Everyone wins: the students, who become empowered and capable; the company, which has the workforce it requires; and the universities, which gain access to key people and activities in industry.
Figure 1.3 The virtual university.
At first the courses were conventional, with students and teacher gathered in a lecture theatre for a few hours a week, followed by tutorials and assignments. More recently a new format began to unfold, whereby lecturers from North America and other regions were teleported into the lecture theatre by suitably mounted cameras and ISDN dial-up lines. They appear on a three-metre-square back-projected screen to give their lectures eye-to-eye. Only two years ago such a lecture cost £60 for the communication connection; today it is only £40, much less than the hotel charges for a real lecturer in a real hotel. There are those who would argue that this is not a real experience, and that it is not as good as the real thing. While this may be true, the choice is rather more stark: either you have the electronic experience, or none at all! On that basis, the students would sooner have world experts in front of them electronically than never have their presence.

More recently the next step has been taken: teleporting the event to the desks of individual students, so that they no longer have to break away from work and do not have to crowd into a lecture theatre. They can now attend courses or tutorials, and interact with each other, directly on the screen. Within BT this experiment has now been ratified as the primary model for future company education and training. The key discovery has been that the downside of apparent isolation at the desk can be overcome by a series of short communal periods in which everyone on the course meets and works together. To date, technology presents a poor meeting environment for people: the images are small and distorted, often with sound and vision slightly out of sync. After a first real face-to-face meeting, however, these deficiencies tend to be overlooked and the participants just get on with working together. 
In the not-too-distant future new display and audio technology will provide life-size images of near-zero distortion and daylight brightness. This is expected to extend this education and training regime significantly, and may totally remove the current need for real interaction. Experiments on breaking down
the social barriers (Cooper 1994) and establishing trust and relationships will thus form a primary target in the next phase of development.

The critics

There are very few of us who look forward to, or enjoy, change on a large scale (Toffler 1971). This is certainly true of the education establishment and of many who are involved indirectly. The primary direction of criticism always seems to be: “That’s not the way they did it in my day!” I suppose if we went back to the time of Archimedes and Aristotle, people were saying much the same thing about their methods of teaching. The reality is that just 50 years ago lecturing and teaching practice in British universities was totally different. Today you can still see the benches at the front of lecture theatres where experiments on a grand scale would be conducted in front of an enthralled class. This was real experience, and teaching in a manner that is now long lost. Why? Because education has been squeezed and changed continually. This has resulted in small universities with very small departments trying to do far too much in too short a time. Students are being asked to take in more and more information and experience in less time, while staff-student contact time continues to decline (Gell & Cochrane 1994). Ultimately, education is becoming impossible relative to the rate and breadth of change in a world of technologically driven progress (Ravitch 1995). Most active university staff have far too many research students and far too many classes to teach. To exacerbate the problem, most university departments seriously lack the necessary number of people with the right abilities. No doubt all of the abilities required to create a suitably well qualified, skilled and able department are available in the country. However, they are seldom, if ever, available in one location—a university (Hague 1991). 
The virtual university, an ethereal space in the information world, overcomes this problem, and allows groups of people with the right interests and skills to come together to work and be proactive. The problem is that it does mean a different mind-set, and a different way of doing things (Lyons & Gell 1994). Unfortunately for the traditionalists, there is no other solution that will allow us to meet the challenge of technologically driven change in our society. It is therefore imperative that we embrace the technology, and experiment to find out what works and what doesn’t (Handy 1990).

The virtual world today

On a Saturday morning I can struggle into Ipswich, park my car, walk across town, buy some software at a high price and pay VAT. Alternatively, I can go onto the Net, access the software directly from the supplier in the USA, pull it down the network, and pay for it without even leaving my machine. The advantage is not only in the time and inconvenience saved but in the lower cost of a product that no longer requires a wholesaler, distributor, retail outlet or VAT. The same is true for the library and the bookstall, and potentially for all forms of “soft products”. Such thoughts alarm many people when they ought to make them feel relaxed, for this virtual world is not an instead-of but an as-well-as technology. It opens opportunities for new ways of doing things, new forms of trading and enterprise, and new dynamic markets. Shopping, entertainment, education and training from your desktop at home, at work, or wherever you happen to be are now very real options (Heldman 1988).
Society and change

While technology changes our world irrevocably, there are some features of it that will remain for many decades to come—but not many. For example, consider such stable institutions as government, banking and the City of London. We currently have a governing mechanism that involves people sitting two sword-lengths apart, acting like demented schoolchildren in lengthy eyeball-to-eyeball debates. The decision-making processes of this system are orders of magnitude slower than their counterparts in the virtual (electronic) world. Similarly, financial institutions are being touched by technology in ways that are changing them, and having an impact on our society. The vast majority of bank branches are no longer required: it is possible to run the entire operation from one location, or even from no location at all. The same is true of the City, as it now deals primarily with information rather than money, for gold has become an abstract concept, as are the pound and other currencies. If such solid institutions are being challenged (Drucker 1993) by technologically driven change, then the role of an education system is to prepare the population for the new world that will result. It is vital that the education and training sector produces the right people with the right skills. This will not happen by following the market; education (Gell & Cochrane 1994) has to get ahead.

The difference between the old world and the new is exemplified by the typing pool. Only a decade ago most large organizations had such a resource staffed by young women, whose sole purpose was to take handwritten or spoken text and transcribe it onto the typed page. The process could take several days, depending on the queue length. The very thought is inconceivable today; who would operate in such a way? Things are now turned around in a matter of minutes, not days. 
Modern companies operate with telephone, fax, e-mail and video conferencing; they have very low, flat structures (Lyons & Gell 1994), with people empowered to make local decisions and get on with the job fast. Any form of delay is just an invitation to the competition to take away your business and markets. The same is increasingly true in education and training—any school, college, university or training establishment that sits back and continues to use exclusively the old chalk-and-talk methods is destined for extinction (Gell & Cochrane 1995; Ravitch 1995). We have to move forward with the technology if we are going to keep up with a world that is changing ever faster.

The future

My father had a working life of 100,000 hours; I can now do his work in 10,000 hours; my son will be able to do it in 1,000 hours, and so on (Handy 1990). The work that took me a whole morning as a young engineer is now completed in less than 15 seconds by the power of computer-based automation. This level of progress is assured for at least another decade, as we can see all of the techniques, and all of the technologies, on the laboratory bench today. It is likely that this progress will continue for at least another two decades and probably three, but after that we reach the ultimate limit of using sub-atomic particles as components (Drexler 1990). There is little doubt, as history shows, that our innate curiosity, creativity and inventiveness will generate even more technology (Bronowski 1973) and cause more change beyond silicon and silica. However, we are at a unique epoch, and there is a new proviso: for the first time in our entire history, we have to keep up with the technology. We have to stay ahead, stay educated and trained, and somehow understand things that currently defy our limited wetware. Tapping the exponential power of the technology itself (Pagels 1988) appears the only option if we are to live and prosper as individuals and as a society (Cochrane (ed.) 1994b).
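The numerical claims running through this chapter—the 100,000-to-10,000-to-1,000-hour working life above, and the earlier projection of a PC roughly 1,000 times more powerful in 10 years and 1,000,000 times in 20—are both simple compound growth. The sketch below checks the arithmetic; the one-doubling-per-year rate and the tenfold-per-generation compression are the chapter's own rules of thumb, not measured figures, and the function names are illustrative only.

```python
# Compound-growth sketches of the chapter's rules of thumb.
# Assumptions (taken from the text, not exact data): computing
# capability doubles roughly once a year; each generation does the
# same work in one tenth of the hours of the one before.

def capability_gain(years, doubling_period=1.0):
    """Multiplicative capability gain after `years` of steady doubling."""
    return 2 ** (years / doubling_period)

def working_hours(generation, base_hours=100_000):
    """Hours per working life, falling tenfold with each generation."""
    return base_hours // 10 ** generation

# Annual doubling: ~1,000x in 10 years, ~1,000,000x in 20.
print(capability_gain(10))   # 1024.0
print(capability_gain(20))   # 1048576.0

# 100,000 hours (father) -> 10,000 (author) -> 1,000 (son).
print([working_hours(g) for g in range(3)])  # [100000, 10000, 1000]
```

Note that 2¹⁰ = 1,024 and 2²⁰ = 1,048,576, so the text's "around 1,000" and "near 1,000,000" are exactly what annual doubling predicts.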
References

J. Bronowski, “The Ascent of Man” (television series) (London: BBC, 1973).
W. H. Calvin, The ascent of mind (New York: Bantam, 1991).
P. Cochrane, “Communications, care and cure”, Telemed ’94 conference, Hammersmith Hospital, London, 1–4 September 1994.
P. Cochrane (ed.), “The potential for multimedia, information technology and public policy”, The Journal of the Parliamentary Information Technology Committee 13, 3, Summer 1994a.
P. Cochrane (ed.), Special Series on “The 21st Century”, British Telecommunications Engineering 13, Pt 1, April 1994b, continuing into 1996. Features a wide range of articles on technologies and applications concerned with education and training.
P. Cochrane, “Desperate race to keep up with children”, The Times Educational Supplement, 23 June 1995a, p. 25.
P. Cochrane, “The virtual university”, Business of Education 5, March 1995b.
P. Cochrane & D. J. T. Heatley, “Aspects of optical transparency”, British Telecommunications Engineering 14, 1, April 1995, pp. 33–7.
P. Cochrane, D. J. T. Heatley, I. D. Pearson, “Who cares?”, British Telecommunications Engineering 14, 3, October 1995, pp. 225–32.
P. Cochrane & F. Westall, “It would be good to talk!”, paper presented at the Second Language Engineering Convention, London, October 1995.
M. Cooper, “Human Factors in Telecommunications Engineering”, special issue of British Telecommunications Engineering 13, 1994.
F. Crick, The astonishing hypothesis: the scientific search for the soul (London: Simon & Schuster, 1994).
K. E. Drexler, Engines of creation: the coming era of nanotechnology (Oxford: Oxford University Press, 1990).
P. Drucker, Post-capitalist society (Oxford: Butterworth-Heinemann, 1993).
R. A. Earnshaw & J. A. Vince, Computer graphics: developments in virtual environments (London: Academic Press, 1995).
S. Emmot, Information superhighways: multimedia users and futures (London: Academic Press, 1995). 
M. Gell & P. Cochrane, “Education and the birth of the experience industry”, paper presented at the European Technology in Learning Conference, Birmingham, 16–18 November 1994.
M. Gell & P. Cochrane, “Turbulence signals a lucrative experience”, The Times Higher Education Supplement, 10 March 1995, p. 11.
D. Hague, “Beyond universities: a new republic of the intellect”, Hobart Paper, Institute of Economic Affairs, London, 1991.
C. Handy, The age of unreason (London: Arrow, 1990).
R. K. Heldman, ISDN in the information marketplace (Blue Ridge Summit, PA: TAB Books, 1988).
P. Kennedy, Preparing for the 21st century (London: HarperCollins, 1993).
R. Lilley, Future proofing (London: Radcliffe Press, 1995).
M. Lyons & M. Gell, “Companies and communications in the next century”, British Telecommunications Engineering 13, 2, 1994, p. 112.
L. MacDonald & J. Vince, Interacting with virtual environments (Chichester: Wiley, 1994).
J. Martyn, P. Vickers, M. Feeney, “Information UK 2000”, British Library Research (London: Bowker-Saur, 1990).
R. Milne & A. Montgomery, “Proceedings of Expert Systems 94”, British Computer Society (Oxford: Information Press, 1994).
D. A. Norman, The psychology of everyday things (New York: HarperCollins, 1988).
H. R. Pagels, The dreams of reason: the computer and the rise of the sciences of complexity (London: Bantam New Age Books, 1988).
D. Ravitch, “When school comes to you: the coming transformation of education and its underside”, The Economist, 11 September 1995, pp. 53–5.
E. Regis, Great mambo chicken and the transhuman condition (London: Penguin Books, 1991).
A. Toffler, Future shock (London: Pan Books, 1971).
I. Yates, Innovation investment and survival (London: The Royal Academy of Engineering, 1992).
Chapter 2 PEDAGOGY, PROGRESS, POLITICS AND POWER IN THE INFORMATION AGE Stephen Heppell
The development of educational technology, from Skinnerian teaching machines onwards, has offered a contrast between rapidly advancing technological potential and slower pedagogical, social and political development. Initially this was not necessarily disadvantageous: in many cases technology failed to deliver on its potential, often embodying models of learning that owed more to convenience than to cognition. Technology’s contribution was often touted in a triumph of hype over hope, and institutional learning’s conservatism provided a pragmatic, and welcome, buffer against the tide of misplaced optimism. However, it is wrong to assume blithely that this will continue to describe the state of affairs in learning technology. Technology continues to advance its potential exponentially and, inexorably, we reach a point where rhetoric is eclipsed by reality and learning technology holds out the hope of simply better learning, whatever we mean by that. Unfortunately it is not as simple as “sitting and waiting” for progress to take root. Many systemic barriers to progress must be surmounted, and many confusions result from the mismatch of technological and pedagogical progress. Far from presenting a model of irredeemable technological determinism, the choices thrown up by these confusions are the stuff of political and social debate, with real choices to be made: for example, it could be suggested that increasingly affordable micro-technology liberates individuals from old forms of capital. In publishing and communications, for example, we have seen the economies-of-scale barriers to the entry of new competition fall away rapidly as everyone with a desktop micro and a laser printer becomes a publishing house, and the Internet has offered access to vast audiences for minimal capital outlay. But equally we could argue that, although the barriers are lower, lack of access to new communication technology has created a further disenfranchized techno-poor minority. 
Similar political and social debate should surround concepts of what public service looks like in cyberspace, whether information is a new factor of production or a new form of capital, whether teleworking liberates or imprisons…and so on. Technological progress in this way is posing some fundamental questions for nations which are a long way from the simple grasping of “the white heat of technology”. Indeed, as telecommunications reduce our reliance on geographical proximity the concept of the nation state itself becomes challenged; will I vote and pay taxes with my geographical neighbour, or with the electronic community that I work, shop and socialize with? However, these are broad and general issues for future debate. This paper will reflect on more pressing concerns: first, the emergent capabilities both of technology itself and of the “children of the information age”, and the challenge that these capabilities pose for existing models of education and assessment. As I demonstrated in my conference plenary address1, technology allows us to offer powerful support for small cultures, whether they are linguistically determined (for example Catalan) or (like Deaf culture) based on some other common circumstances. We can dictate to our computers; they can reward our engagement with multiple media types—speech, text, graphics, aural ambience, video—a tapestry of cues, clues and primary information. This should not mean that we require our school students to be media eclectic, strong in every media type that technology can support; already we disadvantage those not fluent in textual notation (for example dyslexics) by an insistence that we filter much of our children’s learning through the ability to represent it textually. Requiring them to be strong in all other media too would be to further narrow the corridor of success. What we are seeking is media redundancy, where children can derive and represent meaning from a menu of media types, and this of course has profound implications for the assessment and examination system. Children too pose challenges to that assessment system. It is clear from research at ULTRALAB that children are adept performers with (and through) technology. Faced with new tasks and problems they adopt strategies (for example Observe, Question, Hypothesize, Test and Reflect), they represent Process to each other (“look at how I did this” rather than “look at what I did”), they work collaboratively and they multi-task. When asked, for example, to watch multiple television programmes simultaneously, they adopted a strategy which reflected their own understanding of media (for example they used their knowledge of genre and of the role of aural information) which allowed them to answer detailed questions afterwards about minutiae (“what colour was the…”) and also successfully to tackle meta-level questions about character development and production decisions. They showed themselves to be highly media literate, and yet much of our pedagogy and assessment fails to allow them to reflect this capability. Worse still, as we abandon (for good reasons) our reliance on norm-referenced testing in favour of criterion referencing, we find that technology moves the criteria faster than we can pin them down, with the result that either the assessment model becomes an unacceptable drag on progress or we are uncertain about the quality of our assessment procedures.
Ten years ago I could have gained a recognized qualification by typing at n words per minute on a manual Remington typewriter. Now “speech to text” technology lets me dictate to a portable computer faster and with fewer errors; do I still qualify for the certificate? Our constant problem with technology has been to look at its impact on an existing model of behaviour. Too often we make judgements from a deficiency model of both people and technology (“they can’t use it and it doesn’t work”). In 1939 the New York Times commented that “The problem with television is that people must sit and keep their eyes glued to the screen. The average American family doesn’t have the time for it”, which undervalued the ability of technology and of individuals to modify behaviours. In 1967 Chu & Schramm looked back on half a decade of research into the impact of colour television on learning. They concluded from the research data that “Where learning is concerned colour television has no distinct advantage over monochrome”, which was true in retrospect because at that point television companies had failed to grasp the new ways that colour might allow them to represent knowledge and entertainment. One of the first TV entertainment programmes to be converted into colour was The Black and White Minstrel Show, which shows how easy it is to miss both social and technological change. Similarly today much of the output of publishers on CD ROM is simply in the form of electronic books, and the resultant multimediocre both misrepresents the potential of the technology and undervalues the capability and new literacy of its users. A second important area for current debate centres around the shape of our media services and the institutional or public policy that attempts to keep pace with them.
As computers put communication tools into more and more hands the national debate about preserving standards and about what is appropriate or inappropriate is reminiscent of the church’s rearguard action to preserve literacy for itself as printing began to impact on our social lives. From pirate radio onwards the democratization of communications has been characterized by stout defences of position by existing institutions. In the context of learning, schools and universities as institutions too have worked to preserve and strengthen their role in formal learning. Schools even encourage parents to create little institutional microcosms in the home by sending students back with homework, while parents respond in some cases by setting up little school desks and trying to recreate the classroom in the bedroom (“you wouldn’t have the radio on in the classroom would you?”). Suddenly, however, the learning industry looks a lot bigger than schools and universities, and high quality learning will be occurring through other channels like the digital annotative side channels offering parallel commentary to TV programmes. A huge challenge for educational institutions will be the way in which they respond to these new learning environments. There are already popular “project collaboration and exchange” areas available on the Internet and these can either be seen as an appropriate and imaginative use of technology or as cheating. Education’s response (and the way it addresses the issues of social equity raised) will determine its future significance in the learning industry, just as the church’s response to mass literacy was crucial in shaping its own destiny. For politicians looking to build policy in the information age it should be clear that alternative scenarios can and will result and the extrapolation of policies to build those scenarios will become a key differentiator of political perspectives. Not so long ago in US politics it was a universal tenet that “motherhood and apple pie” would always be a Good Thing, but now our understanding of the changing dynamics of the family, and of nutrition and diet, leaves many credible shades of opinion about just how “good”. Similarly our view of technology as a Good Thing needs to evolve levels of sophistication; the critical awareness that social sciences bring will be crucial to this process and education needs to be at the heart of the debate if it is not to be excluded.
1. At the First International SocInfo Conference: (TESS) Technology and Education in the Social Sciences, 5–7 September 1995. For details refer to: http://www.stir.ac.uk/socinfo
References
G. C. Chu & W. Schramm, Learning from television: what the research says (Stanford, California: Institute for Communication Research, 1967).
A. M. Gillaume & G. L. Rudney, “Student teachers’ growth towards independence: an analysis of their changing concerns”, Teaching and Teacher Education 9, 1993, pp. 65–80.
M. C. Heck, “The ideological dimension of media messages”, in Culture, Media, Language, S. Hall et al. (eds) (London: Hutchinson, 1980).
S. Papert, “Literacy and letteracy in the media ages”, Wired 1(2), May/June 1993, pp. 50–52.
Chapter 3 TECHNOLOGY AND SOCIETY: AN MP’S VIEW Anne Campbell
We are living through a period of intense technological change. The number of people employed in manufacturing industry is down from around 7 million in 1979 to 4 million today. These changes have left many scars and caused insecurity and unease among those who are still employed as well as those who have given up hope. Much of the population has been left with a feeling of deep suspicion. In many quarters an “anti-science” culture is developing, particularly among the young. The free market approach to technological development has left many people without access to the new technologies and consequently the gap between the “haves” and the “have-nots” has grown. What is required is more vision. Technology could be employed to open up the opportunities for education and research for millions of people. The government’s role must be to ensure that such access is available to everyone and that the information revolution is used to encourage opportunity, equality and democracy. Social scientists need to engage in this process and find ways to understand the implications of these technological advances. Using the example of my own constituency in Cambridge, this paper illustrates how, through free public access points across the city, citizens can access socially useful information about the city, council services and information from government agencies.
Science, technology and government policy
In May 1993, the Conservative Government produced a White Paper called “Realizing our potential: a strategy for science, engineering and technology”. Their strategy was to improve the nation’s competitiveness and quality of life by maintaining the excellence of science, engineering and technology. However, this was accompanied by a sharp reduction in the funds available for science and technology across government departments.
The Government also expressed its concern that public money spent on science and technology might not always be directed in a way that best satisfied industrial needs. A Technology Foresight exercise was established in order to predict the future needs of society. Few would argue that this exercise has no merit. It is helpful for the Government, academics and the business community to sit down together and discuss issues of common interest. Nevertheless there are many concerns that focus on the way the results of this exercise might be used. Some parts of the White Paper certainly sent a chill through the scientific community. Many scientists believed that the Government would try to restrict its spending on science and technology to those areas most useful for industrial application. In fact, it has been proved on numerous occasions that attempts to pick industrial winners often meet with failure. The pressure group Save British Science sent out a Christmas card to MPs in 1993 which cited examples of research projects that had been turned down for funding in the past, because they did not appear to have any industrial application. Liquid crystal display technology, which was invented in the UK, was developed abroad because nobody in the grant-awarding bodies believed that it was an idea which had any commercial future. Many scientists have argued that many of the best commercial ideas come from “blue skies research”. We risk missing the more innovative and exciting developments if the direction of our scientific effort is determined by society’s existing needs. It is important that scientists are given the freedom to continue with blue skies research to ensure that we do not extinguish future opportunities to improve our lifestyles. It is not too difficult to predict some of the areas in which scientists and engineers will develop the technology in the five or ten years which lie ahead, but it is much harder to imagine how people will adapt to the changes which it can bring. The role that social scientists can play here is therefore quite important. We need to understand the new social processes that are emerging with the developments in technology. Since technology can create new needs, we should not limit ourselves to merely doing more efficiently or more cheaply those things which we can do already. Word processors are not simply about typing letters more quickly; they allow people to think in a different and more flexible way. Mobile telephones are not just different ways of using the telephone; they enable people to be in touch anywhere at any time. The Internet is not only a means of downloading information; it encourages communication and interaction across national boundaries as well. All these changes have been supported and welcomed by the people who could afford to pay for them. Successful technological development is often a leap of faith. It depends on being able to predict the new needs which are generated by the scientific progress which has been made. Thirty years ago it was not anticipated that computers would be used to do anything other than high-speed mathematical calculations. Now we see them being used to organize information in a way that has revolutionized our lives.
The ways in which the national communication networks are used in future will depend very much on human ingenuity and imagination.
The information gap
We must ask whether these advantages will benefit everyone or will leave us with an underclass of information poor. Will they increase employment so that everyone can afford to have shopping, entertainment, education, business, and so on, all available from home? From present trends that seems doubtful. Without intervention, the free market will force open the divisions in society even wider than they are at present. People who are employed will acquire the experience and up-to-date technological skills to flourish in the new age. The “haves” will rapidly accumulate more and the “have-nots” will have no relevant skills to pull themselves out of the poverty trap. When 80 per cent of the population are using electronic mail, what happens to those dependent on the daily mail deliveries when the postman disappears? Does electronic surveillance drive the homeless even further from the centres of civilization, to dark hidden corners where they become invisible? Will the corner shop disappear completely with the advent of teleshopping? How will the “have-nots” manage in those circumstances? There are some serious and difficult issues to do with access and equality. It may be easy if you have the necessary computer and modem and can afford to pay the subscription to an Internet provider, as well as the expensive phone bills which arrive when you have been surfing and forgotten the time. If you have never been able to afford a telephone, which is the situation for up to 75 per cent of households on some housing estates in Britain, then life is bound to be much more difficult. These issues are the cornerstone of social scientific research and more work should be done in this area.
The Cambridge experience
About a year ago, I decided to make use of the new technologies by giving my constituents the facility to contact me by e-mail. It is probably a more viable prospect in Cambridge than in most other constituencies, since about 30,000 of my 70,000 constituents already have access to e-mail. About 25 per cent of my constituency mail arrives this way. I also give my constituents the chance to contact me at an e-mail advice surgery. This specifies an exact time when I shall be sitting at a terminal ready to receive messages and I try to respond to them immediately. But for the other 40,000 of my 70,000 constituents there is no access and probably little inclination. What is the point of connecting when you have never used a computer anyway and you just do not believe that there is anything on the Internet which would be of any conceivable use to you? In Cambridge we have attempted to tackle these problems by launching the Cambridge On-line City Project, with the aim of providing socially useful information and free network access for people to whom it would not normally be available. In its first phase, six public access points were provided in public buildings such as libraries, community centres and council offices around the city. This has now increased to 17. The information is provided via a Website.1 This contains an A–Z of council services, an index of voluntary groups, advertisements of council leisure facilities, information on where to get benefits advice, and some links to job vacancy databases. We hope to add doctors’ lists, NHS dentists, council house exchange lists, chemists which are open late and many others. Many public service organizations have been consulted, and are enthusiastic about having their information provided over the Web. In these early stages, the project has relied on the generosity and goodwill of local companies and local councils.
Cambridgeshire County Council and Cambridge City Council have contributed officer time, Cambridge Cable have provided telephone lines, and Uunet have given server space and technical support, with further support from CMS and Software AG. In order to progress, the project will need funds to employ a manager and to expand the system. The success of the recent lottery bid will ensure that this happens. It is also hoped to have an IT learning centre so that people can pick up the skills required in order to be comfortable with the technology. In the future, this facility could also be used to provide a feedback mechanism so that users can comment on council services and on other public services as well. In a properly developed framework, it could greatly increase the accountability of councillors, MPs and other elected public representatives, particularly if the comments were accessible in an open and public way. Another project that will link in to the Cambridge On-line City is the Cambridge Childcare Information Project, now called Opportunity Links. The purpose of this venture is to help parents get back to work by providing most of the information that they need in one location. There has been a very generous response from the organizations and firms which we approached about sponsoring this project. Initially, £10,000 was raised from commercial and public sources which has enabled us to employ a part-time project worker to collect the information. A Website was launched in 1996 to give parents some basic advice on childcare: the different kinds available, their relative costs, local nurseries and playgroups. Other information about ways in which to look for a job, how to find appropriate training, and benefits advice together with “better-off” calculations are also supplied. The continuation of this project will rely on local government and businesses donating sufficient funds to continue to employ project workers. The need for such a service is clearly there.
In the future, these information services will revolutionize libraries and welfare information provision. It will be cost-effective to spend some public money but it is difficult to envisage government, local or central, being able to afford the expenditure to assure completely open access. There are commercial advantages for the private sector wishing to provide additional entertainment and leisure facilities. This could stimulate the development of public-private partnerships that will provide the systems and access facilities, giving benefit to both community and business.
1. http://www.worldserver.pipex.com/cambridge/
Access and empowerment: some lessons
There are many issues that I have had to consider in my own personal use of the Net. A fundamental belief is that participation in the information society should be available to all, and not just the privileged few.2 There must be equality of access and we must seek to empower citizens both as participants and consumers, as well as providing equal access for the providers of services. The new networks must help to increase citizen participation in decision-making and contribute to the development of a more open society. At the same time, legislation should be framed which enables privacy to be respected and legitimate rights to the ownership of information to be acknowledged. Government itself can become more open and accessible through this process. The implications for education and research are central to issues of access.3 The opportunities to learn will undergo the same massive expansion as occurred when the first public libraries opened. But this is a new kind of active learning, since it will involve interaction with individuals and not just passive absorption of information. How much more, then, will learners need guidance through the maze of information, learning packages, electronic courses and offers of tuition. The role of teachers and lecturers will change for the better: there will be more individual direction and guidance, less bureaucratic record-keeping and fact-giving. Programmes can be tailored to the needs of individuals, but that individual will still want human contact and human input to learn in the most effective way.
The national communication networks have the potential to open up learning channels for very many more people than those who benefit from further and higher education at present. It is through this new technology that we see “The Learning Society” within our grasp. It is vitally important that we take hold of these chances and use them to improve the quality of life for all our people. But that takes more than pure commercial development. This is not an area that we can safely leave to the scientists and the business community. It is one that requires government intervention and the social scientific community to ensure that the benefits are available to everyone. Let us make sure that the information revolution does not worsen the divisions in society, but is used to enable opportunity, equality and democracy.
References
Office of Science and Technology, Realizing our potential: a strategy for science, engineering and technology (London: HMSO, 1993).
Office of Science and Technology, Progress through partnership, 1–15 (London: HMSO, 1995).
2. The Labour Party held a Superhighway Policy Forum in 1995, a wide-ranging investigation into the effects of the new networks and how government can manage them to benefit the many, not just the few. Its findings (Labour Party 1995) were adopted by the Party conference in October 1995. Labour Party, Communicating Britain’s future (London: The Labour Party, 1995).
3. For papers and information on the development and effects of IT in education see ULTRALAB’s Website at: http://www.ultralab.anglia.ac.uk/pages/ultralab
Chapter 4 INFORMATION TECHNOLOGY: A CASE FOR SOCIAL SCIENTIFIC ENQUIRY Adrian Kirkwood
Although there has recently been a significant growth in the use of information technologies in the workplace, in education and in the home, there is little evidence to support the technologically deterministic predictions about IT becoming ubiquitous throughout society and about radical social changes that would follow. In Western countries, the impact of IT upon different groups in society has been varied, tending to reinforce rather than ameliorate existing inequalities. This chapter will examine some of the differences that exist between groups (primarily within the UK) in the extent of access to and use of IT in the home and in education. In particular, it will consider variations that exist in terms of gender, age and socio-economic group. As well as presenting the quantitative evidence for the existence of these differences, the chapter will consider whether IT is likely to exacerbate social differences.
Introduction
For at least two decades predictions have been made about the imminent ubiquity of IT throughout society and the radical social changes that would follow. Alvin Toffler’s view of a future society (1980) had at its centre the home; an “electronic cottage” in which not only paid work, but also leisure and service consumption, would be mediated through information and communication technologies. Although there has been a significant growth in the use of information technologies in the workplace, in education and even in the home, the technologically deterministic prediction of IT bringing about fundamental social change has failed to materialize. In Western countries, the impact of IT upon different groups in society has been varied, tending to reinforce existing inequalities (e.g. Forester 1988; Miles 1988). Those people with good access to IT and familiarity with its use often assume that their situation is typical.
For example, Eliot Soloway (a professor at the University of Michigan, USA) introduced his keynote speech at an international conference in 1994 with these words: “There is no longer a problem of access to computers.” Perhaps access to IT is not a problem if, like Soloway, you are a male, white American in a middle-class professional occupation—if not, the situation might be viewed differently. In fact, the pattern of ownership and use of IT varies considerably between social groups. This chapter will examine some of the differences that exist between groups (primarily within the UK) in the extent of access to and use of IT in the home and in education. As well as presenting the quantitative evidence for the existence of these differences, the chapter will consider whether IT is likely to ameliorate or exacerbate social differences. It will also examine some of the social factors that tend to have been overlooked (or dismissed) by those who expound technological determinist predictions.
Access to information technology in the home
Computers are not accessible to all, even in the richer industrialized Western countries. Many of the claims made by computer manufacturing and marketing companies about the numbers of machines available in particular countries are based upon their measures of output, usually “deliveries to the trade”—i.e. machines shipped out from their own factories, assembly plants or warehouses to retailers or other distributors. Even the consumer sales figures of computer retailers or other distributors provide little or no indication of who the purchasers are and whether the machines are being sold into existing markets (i.e. additional or replacement equipment) or penetrating new markets (i.e. first-time buyers). National social surveys can provide independent information about the extent to which households have computers and other media technologies. The US Bureau of the Census (1993) reported that there was a computer available in about 45 per cent of households in the USA, but only about 10 per cent of the homes of blacks or Hispanics contained a computer. In the UK, data from the General Household Survey for 1994 (OPCS 1996) indicates that less than a quarter of households (24 per cent) contained a computer, compared with 77 per cent having a video recorder and 47 per cent an audio CD player. (This figure for computer access is in line with data from commercial market research.) Even more revealing is the extent to which access in the UK has changed over the last decade. Figure 4.1, below, uses data from successive General Household Surveys from 1985 to 1994 to reveal trends in access to computers and media technologies. The growth in access to these three domestic technologies exhibits strikingly different patterns over this period. Home access to a video recorder steadily rose to over three-quarters of households (increasing by almost two and a half times, from 31 per cent to 77 per cent).
There was a similar (but slightly more rapid) rate of growth in access to an audio CD player over a shorter period; more than tripling, from 15 per cent in 1989 to 47 per cent in 1994. Over the whole period, access to a home computer increased, but only very gradually (from 13 per cent to 24 per cent, often increasing by only 1 per cent per year). But why has the computer failed to penetrate more than three-quarters of UK households despite the high-profile marketing campaigns of the 1980s and 1990s? One of the reasons why computers have not become as ubiquitous as video recorders in the home (a forecast that was commonly made throughout the 1980s) is that many people are uncertain about what the multi-function computer could usefully do for them in the domestic setting. A video recorder and an audio CD player have clearly defined and easily understood functions within a household. Both offer increased convenience to users (extending control over when and what TV programmes and films can be watched, or the quality of music reproduction) and also a degree of continuity—unlike home computers, they have not been subject to a rapid succession of changes that give rise to problems of technical incompatibility and obsolescence. Another factor must surely be the marketing and pricing policies of the hardware manufacturers, who prefer to increase the technical specification of computers on a regular basis rather than reduce the base price. Increasingly powerful machines are being marketed as “entry level” computers, but these still require a large financial outlay for many people. Software developers reinforce this strategy by frequently producing enhanced programs that require ever more computer memory to operate.
Information technology at home: differences between social groups
In the UK, domestic access to computing equipment is clearly not universal and the penetration of new households has been very slow. So who does have computing equipment at home?
For more than a decade, much of the computer companies’ marketing effort has targeted families with children of school age: a computer at home was desirable, if not essential, as it would extend the educational opportunities for
ACCESS TO INFORMATION TECHNOLOGY IN THE HOME
21
Figure 4.1 Households in the UK with video recorder, home computer and audio CD player, 1985–94. Sources: OPCS, 1989 and 1996.
children and allow them to practise and consolidate the skills they learn in the classroom. Has this strategy had any effect? Families with children of school age There is evidence from the annual survey undertaken by the Independent Television Commission (ITC) that homes with children are more likely than others to contain domestic media technologies, including a computer (e.g. ITC 1995). Figure 4.2 shows differential rates of access to a range of technologies. Data from the General Household Survey 1994 (Central Statistical Office 1996) provides confirmation that households containing dependent children are more likely to possess a video recorder, audio CD player and home computer. So, households that include children are more likely than others to contain domestic media technologies. What other differences can be identified between groups in society? Gender differences Survey data on home access to technologies often fails to identify which members of the household make (or control) use of particular items of equipment. Even if there is a computer at home, it may not be equally accessible to all members of a household. Or, looked at another way, not all household members might choose to make use of a computer for leisure or entertainment, for educational purposes or for other domestic or business purposes. Market research surveys and studies of adult students, teenagers and children have consistently found that males are more likely than females to have access to a computer and to spend more time using a computer at home. For example, a study of 12-year-old children in England (Kirkman 1993) revealed that
Figure 4.2 Proportion of UK households with selected media technologies, 1994. Source: ITC, 1995.
55 per cent of the students in the sample used a computer at home: 70 per cent of the boys compared with 38 per cent of the girls. The amount of time spent using a computer at home averaged 7.1 hours per week for the boys, but only 4.2 hours per week for the girls. Another study of English secondary school students (Robertson et al. 1995) found that 47 per cent of students had access to a computer at home, but that ten times as many boys as girls had sole access to a home computer and that there was a significant difference in the extent to which they made use of them. In the USA, a study of high school students (Shashaani 1994) found 68 per cent of boys and 56 per cent of girls reporting the presence of a computer at home; when asked who used the home computer, two-thirds of the primary users identified were male. In a study of Norwegian undergraduate students, Busch (1995) found that more male students than female students had had a home computer before entering higher education (41 per cent compared with 24 per cent), and the difference persisted, although not to the same extent, once they were at college.

In many of these studies it has been found that the extent of use correlates with attitudes and perceptions about the potential value of computer-related activities and with performance on learning tasks involving the use of a computer. Home computers are more likely to be bought for the use of men and boys, and even when a machine is acquired as a family resource, the main users are very infrequently reported to be female. This might reflect the fact that computers were initially marketed towards the male leisure market (Haddon 1988). It is also associated with the greater control that men tend to exert over domestic finances.
Research undertaken with large numbers of adults studying with the Open University has consistently indicated that men are more likely than women to have access to a computer, either at home or at their place of work (Kirkwood & Kirkup 1991; Kirkwood et al. 1994; Taylor & Jelfs 1995; etc.). Furthermore, men were much more likely than women to have made the decision to acquire or upgrade home computing equipment and to make use of such equipment in the home. For example, in a large-scale survey conducted in 1995 (Taylor & Jelfs 1995), over 40 per cent of female students had no access to a computer (either at home or at work) compared with only 25 per cent of male students. When students with a computer at home
Figure 4.3 Access to information and communication technologies in UK households, 1994 (by economic status of head of household). Source: OPCS, 1996.
were asked who in the household provided the main impetus to acquire computing equipment, 77 per cent of men, but only 41 per cent of women, answered “self”; 26 per cent of women indicated that their spouse or partner had been the main decision-maker, compared with only 4 per cent of men who answered that way. Patterns of use also favoured men: half of the female students with access to a computer at home reported that their spouse or partner made frequent use of the equipment, compared with only 26 per cent of male students.

Occupation and social class differences

The basic data on domestic access to media technologies also conceals social differences. Where the head of a household has a high-status occupation (i.e. classified as being in the categories “Professional” or “Employers and managers”), the home is more likely to contain a telephone, video recorder, audio CD player and home computer than if the occupation is classified as “Semi-skilled manual”, “Unskilled manual” or “Economically inactive” (OPCS 1996). Data from the 1994 General Household Survey is presented in Figure 4.3.

It is not just a matter of those using stand-alone computers: those in the higher socio-economic groups are more likely than others to participate in computer-mediated communication and have access to networks. A survey conducted by Continental Research in September 1995 (quoted in ITC 1996) found that less than 7 per cent of the UK population had access to the Internet, which was mainly available at the workplace; the user profile was biased towards younger men in the higher occupational categories. However, an earlier survey by the same organization (quoted in Screen Digest) indicated that 23 per cent of UK company executives had access (either at home or at work) in June 1995.
Is information technology bringing about fundamental social change?

So the evidence does not support the predictions of IT becoming ubiquitous in Western countries in the near future, particularly in terms of home access. But what of the fundamental social changes that were expected to be brought about through the widespread use of IT? Has IT made any contribution to changes in society and, if so, have these tended to ameliorate or exacerbate social differences? A number of aspects will be considered, paying particular attention to home-based activities.

Changing employment patterns

The overall pattern of employment in the UK has changed in recent years. Since the mid-1980s part-time working has become more common for both men and women, which has led to a rise in the number of women in paid employment (Central Statistical Office 1996). However, while the number of women in full-time work has also increased, full-time employment for men has declined.

Information and communication technologies have had both positive and negative effects upon the level of employment in the UK, as they have in most other developed countries. The impact of new technologies can be seen in the creation of new employment opportunities as well as the destruction of jobs in certain industries and services (Freeman 1995). Many of the new jobs made possible by greater use of information and communication technologies have involved changes in the geographical location of companies, particularly in the service sector. For example, the organization of banking, insurance and other financial services has been transformed in recent years, with a much greater emphasis on access to “remote” rather than “High Street” provision. But while the use of IT in the workplace has permeated a large proportion of companies and organizations, there is little evidence of significant changes in the practice of homeworking, an essential element of the predictions for a new electronic society.
Homeworking encompasses many categories of worker: farmers, self-employed building and maintenance workers, those in creative fields (writers, designers, artists, etc.), as well as people undertaking unskilled or semi-skilled assembly jobs or other forms of piece-work. Few of these activities lend themselves to being IT-based. Although some people are engaged in “teleworking” (i.e. working from home with information and communication technologies rather than travelling to a place of work located elsewhere), much of this appears to be done as only part of the normal work pattern, or by people engaged in professional and creative occupations. Many companies would be reluctant to encourage or facilitate homeworking because it would entail a loss of control over employees’ time and the tasks they undertake. Furthermore, many homes are not suitable for teleworking: using IT for homeworking requires not only appropriate facilities, but also space and arrangements that allow work to proceed without too much disruption (both to the homeworker and to other members of the household). Much of the growth in professional homeworking arises not so much from developments in IT as from economic changes that have brought about an increase in self-employment and home-based consultancy work. In 1995, more than three-quarters of all UK homeworkers owned their own business or worked on their own account (Central Statistical Office 1996).

Leisure and service consumption at home using information and communication technologies

It was predicted that information and communication technologies would bring about significant changes in the patterns of leisure and service consumption. IT would make it unnecessary for people to leave their homes for many forms of entertainment, or to undertake activities such as shopping, banking or gaining access to information and advice on a wide range of topics.
The convergence of computing and digitized telecommunications services has made possible the development of an infrastructure that is often referred to as an Information Highway (or even Superhighway). This would comprise linked networks of high-capacity fibre-optic (broadband) cables capable of conveying very large volumes of data (audio, text, video, etc.) at high speed to and from a very high proportion of business and domestic properties, as well as institutions such as schools, libraries and hospitals. A high level of investment has already been made in installing the necessary infrastructure, and this will continue for at least the next decade. The principal actors involved are the telecommunications providers (BT, Mercury, etc.) and the cable television companies.

In terms of the domestic market, cable television has not achieved a high degree of penetration of UK homes since it was established in 1984. Figure 4.2, above, showed that in 1994 only 7 per cent of UK households were connected to cable TV services (ITC 1995). These companies are seeking a target of 75 per cent of households being connectable by the year 2000.

It is frequently claimed that there is an enormous demand for consumer services using information and communication technologies, but to what extent are the services currently provided being used? To date, the limited number of services that have been offered have achieved only a modest amount of success. In recent decades, leisure and recreation time has increasingly been spent in the home rather than in the public sphere. In Western societies, attendance at public performances (e.g. cinema, concerts, theatre and attending sports events as spectators) has declined in favour of home consumption using audio-visual means (television, video, etc.). However, there are other activities which still take people outside the home: restaurants, shopping expeditions, day-trips, etc.
Increased leisure services using information and communication technologies are unlikely to replace these “outside the home” activities to any great extent: social contacts are important, and people do not want to remain at home unless they have no alternative. The new services are more likely to be in competition with other home-based leisure activities. “A new supply of information, communication and entertainment services is more likely to result in increased competition to win round the consumer. The idea that as a result of new services, new markets will open up is a distortion of the facts” (Punie 1995, p. 33).

Economic capacity as a limiting factor

Those who have predicted the ubiquity of home computers have tended to adopt a diffusion model, seeking to explain patterns of adoption and use by relating the characteristics of computers to the needs and attitudes of potential users. There has been a tendency to assume that people would perceive the benefits to be gained from the use of IT applications and acquire equipment for home use. If existing uses and applications were unable to convince the reluctant to become involved, then efforts were needed to develop a “killer application”, i.e. a service or use for IT that met so many needs for so many people that it was impossible to resist. The economic capacity of a household was largely overlooked, because such a model:

took it for granted that everybody was a potential computer owner and that the diffusion curve would follow other major innovations in domestic electronics, such as the television set, with adoption trickling steadily down the income scale. (Murdock et al. 1994, p. 271)

Despite enormous promotional activity, home computer ownership remains concentrated within the professional and managerial groups, often increasing opportunities for those who already have them.
Who’s using the Net?

The Internet enables communication between computers to be established for the purpose of data transfer, email, access to remote databases and information sources, etc. In a newspaper article, Bowen (1996) sought to draw the attention of the business community to the commercial possibilities offered by the Internet: “It is not fanciful to compare the potential of the Internet with that of the motor car.” He used a “comparative history” of the motor car and the Internet to draw an analogy with early scepticism about the potential of the car. However, his historical “facts” about the motor car are very selective, with no mention whatsoever of the negative and detrimental effects brought about by the dominance of the motor trade in Western countries. For example, although one-third (32 per cent) of UK households were without a car in 1994 (OPCS 1996), transport policies make travel in rural and remote areas very difficult, while retail and leisure activities in very many town centres have declined as a result of the growth in out-of-town shopping and entertainment developments. The analogy also ignores the economic and environmental effects of traffic congestion, which renders many journeys very time-consuming.

Conclusions

A number of sources have been used to provide evidence that significant differences exist between social groups in terms of access to and use of IT, particularly in the home. Some of those differences have been examined with a view to assessing the likely impact of IT. Little evidence has been found to support the idea that IT is bringing about fundamental changes to the existing social structures.

References

D. Bowen, “Is anybody out there?”, Independent on Sunday, 10 March 1996.
T. Busch, “Gender differences in self-efficacy and attitudes towards computers”, Journal of Educational Computing Research 12, 1995, pp. 147–58.
Central Statistical Office, Social Trends 26 (London: HMSO, 1996).
T. Forester, “The myth of the electronic cottage”, Futures, June 1988, pp. 227–40.
C. Freeman, “Unemployment and the diffusion of information technologies: the two-edged nature of technical change”, PICT Policy Research Paper no. 32, Programme on Information and Communication Technologies, Economic and Social Research Council, 1995.
L. Haddon, “The home computer: the making of a consumer electronic”, Science as Culture, no. 2, 1988, pp. 7–51.
Independent Television Commission, Television: the public’s view 1994 (London: Independent Television Commission, 1995).
Independent Television Commission, “Surfin’ UK”, Spectrum, Issue 19, 1996.
C. Kirkman, “Computer experience and attitudes of 12-year-old students: implications for the UK national curriculum”, Journal of Computer Assisted Learning 9, 1993, pp. 51–62.
A. Kirkwood & G. Kirkup, “Access to computing for home-based students”, Studies in Higher Education 16, no. 2, 1991, pp. 199–208.
A. Kirkwood, A. Jelfs & A. Jones, “Computing access survey 1994: foundation level students”, Paper no. 51, Programme on Learner Use of Media, Institute of Educational Technology, The Open University, 1994.
I. Miles, “The electronic cottage: myth or near-myth?”, Futures, August 1988, pp. 355–66.
G. Murdock, P. Hartmann & P. Gray, “Contextualizing home computing: resources and practices”, in Information technology and society, N. Heap et al. (eds) (London: Sage, 1994).
Office of Population Censuses and Surveys, General Household Survey 1987 (London: HMSO, 1989).
Office of Population Censuses and Surveys, Living in Britain: results from the 1994 General Household Survey (London: HMSO, 1996).
Y. Punie, “Media use on the information highway: towards a new consumer market or towards increased competition to win round the consumer?”, paper presented at the PICT International Conference on the Social and Economic Implications of Information and Communication Technologies, London, 10–12 May 1995.
S. I. Robertson, J. Calder, P. Fung, A. Jones & T. O’Shea, “Attitudes to computers in an English secondary school”, Computers and Education 24, 1995, pp. 73–81.
L. Shashaani, “Gender-differences in computer experience and its influence on computer attitudes”, Journal of Educational Computing Research 11, 1994, pp. 347–67.
E. Soloway, “Reading and writing in the 21st century”, keynote address to EDMEDIA 94, World Conference on Educational Multimedia and Hypermedia, Vancouver, Canada, 1994.
J. Taylor & A. Jelfs, “Access to new technologies survey (ANTS) 1995”, Report no. 62, Programme on Learner Use of Media, Institute of Educational Technology, The Open University, 1995.
A. Toffler, The Third Wave (London: Pan, 1980).
US Bureau of the Census, “Current Population Reports: computer use in the United States, 1993”, Washington, DC. The data is available at the following URL: http://www.census.gov/population/www/socdemo/computer.html
Section Two DEVELOPING COURSEWARE FOR THE SOCIAL SCIENCES
Chapter 5 EXPECTATIONS AND REALITIES IN DEVELOPING COMPUTER-ASSISTED LEARNING: THE EXAMPLE OF GraphIT! Ruth Madigan, Sue Tickner and Margaret Milner
Working with an interdisciplinary team to produce CAL courseware (a tutorial package introducing basic statistics) proved more difficult than anticipated. More training, organization and sustained teamwork were needed to establish a common language and mode of operation. The design and use of CAL forced a fundamental re-evaluation of teaching methods. Defining how students learn may be as important as defining what they learn. This is necessary in order that academics (and other teachers) can come to terms with a new medium and its integration into the curriculum.

The aim of this paper is to pass on a few honest reflections on the problems encountered in developing a piece of interdisciplinary courseware under the umbrella of TLTP (the Teaching and Learning Technology Programme). We are taking a risk here, since we are focusing on our mistakes rather than our successes, but we are doing this in order to clarify our own thoughts and in the hope that others can learn from us. Despite our mistakes, we do believe we have produced a tutorial program which others will find useful.1 Our particular objective was to create an independent learning package (GraphIT!) which could serve as an introduction to basic statistics across a number of university departments: accounting and finance, sociology and statistics were represented on the development team.2

Introductory statistics appeared to be an appropriate area in which to make use of such a program. Many of the social sciences are essentially discursive and evaluative subjects, which do not generally lend themselves to a simple rehearsal of factual knowledge or established routines. Basic statistics, on the other hand, requires a certain amount of repetitive exercise to grasp its application and does produce some right and wrong answers. Moreover, it is an area of study which many students (and staff!) find difficult, so any additional aid to learning would be welcomed.
The computer has the obvious advantage of an interactive dimension and the capacity for rapid calculation, so the drudgery is removed and the student can concentrate on the application. Moreover, the interactive, dynamic aspect of CAL can introduce an element of fun or play, which is often welcome in a subject that many experience as rather dry: a means to an end, perhaps, rather than interesting in its own right (apologies to all those statisticians who evidently love their subject, but many teachers will recognize the problem). Introductory statistics therefore seemed an appropriate area in which to develop a CAL package:

• At this introductory level at least, it rests on a well-defined paradigm.
• It can be presented as exercises which are susceptible to right and wrong answers.
• It is an area in which students are likely to find repetitive practice helpful.

1. For those who are interested, a copy can be found at: http://www.elec.gla.ac.uk/TILT/cat-of-software/downloadGraphIT.html
• The fun/play element of CAL helps in an area of learning which many regard as necessary rather than popular.

Tutorial programs

Only some of our team had any experience of authoring systems and, as a consequence perhaps, some of us at the start had very little understanding or “feel” for what could be achieved with CAL. It was only later, when we discussed and read some of the literature about the role of IT in education, that we came to understand that the “drill-and-practice” tutorial has attracted a lot of criticism from CAL professionals, because it appears to rely on a rather old-fashioned approach to learning, with built-in assumptions about a fixed body of knowledge and narrowly prescribed learning objectives. This is seen, understandably, as a rigid and non-exploratory approach to learning. “It is judged to offer poor approximations to what is itself a rather poor model of the teaching process in the first place (didactic encounters guided by the IRE3 pattern of dialogue)” (Crook 1994, p. 13).

In planning our own tutorial package we were happy to include some elements of “drill-and-practice” routines. None the less, our original conception aimed to be rather more discursive and adaptive (Laurillard 1993, pp. 94–5) than we finally achieved. We had hoped that GraphIT! would in effect provide a front end for Minitab (a commercially available statistical package), so that the student could access and analyse the data sets in a rather more creative, flexible way than has actually proved possible. A series of technical problems, and the consequent pressure of time, meant we had to abandon the direct use of Minitab, and as a consequence we lost the more exploratory dimension. We were also keen to retain a more interpretational dimension, to encourage students to realize, for example, that there is not always agreement about the best way of presenting or indeed interpreting data.
The problem here is not just technical, but may also reflect the inexperience of the academics as scriptwriters, who found it difficult to think themselves into a new medium (see below). What we have produced, then, is a series of modules arranged in a hierarchy of learning (from the simple to the more complex). It is possible for the student to go back and forth at will, but it is essentially a linear tutorial program following a fairly traditional, didactic model of learning. This is less than we had originally envisaged, but still has a useful role in many courses. As Crook (1994) points out, this sort of tutorial is popular with teachers because it is easy to assimilate into prevailing patterns of teaching practice, and because the drill-and-practice approach is appropriate for some types of material and some forms of learning. We can recognize the value of this type of teaching technique in certain situations: “it need not presume a wholesale reduction of educational activity to the rehearsal of discrete subskills…. It needs to be made sense of rather than automatically disparaged” (Crook 1994, p. 14). Its value must depend to a great extent on how successfully the tutorial is integrated into the rest of the course and other complementary teaching methods.
2. The courseware development was carried out within the context of a wider initiative within the University of Glasgow: Teaching with Independent Learning Technologies (TILT).
3. I-R-E: “verbal exchanges taking the form of a (teacher) Initiation, a (pupil) Response and a (teacher) Evaluation” (Crook 1994, p. 11).
Interdisciplinary work

At a purely practical level, we found an interdisciplinary project more difficult than anticipated. The academics in our group were originally located in four departments spread across the campus, between five and fifteen minutes’ walk from each other. All the academics involved had heavy teaching timetables and other commitments, so even when the number of participating departments was reduced to three (because one member moved department) it remained very difficult to get the whole group together on a regular basis. The RAs (research assistants) were located at some distance from the Chair of the group, who was responsible for administration and liaison with the centre (i.e. the TILT Steering Group overseeing all the subgroups). At the beginning of the project not everyone had access to e-mail. It is easy to say we should have given more thought to these practical issues, but they arose as a result of the resources available, the fact that the central project administration was not yet established and the interdisciplinary nature of the group. They were, however, very serious for the operation of the group.

The fact that we found it so difficult to meet regularly as a group meant that we also had difficulty in developing a common language and were slow to pick up divergent views. Part of our problem was also an intellectual one. We used the same words, the same statistical terms for instance, but we did not necessarily speak the same language: the relative importance of categorical versus interval data for different disciplines, for example, or what constitutes an attractive data set. At one level we knew these differences existed before we began, which is indeed why we have included interchangeable data sets so that to some extent the package can be customized to suit each subject area, but one can know these things without realizing their full implications.
With hindsight we should have spent more time at the planning stage (though in our defence, we worked through all the recommended stages of defining objectives, distinguishing our project from comparable software, creating a common framework, agreeing on key design features, etc.). The pressure of time, the fact that the RAs had already been recruited and the difficulty of getting together as a group encouraged us to subdivide the task of scriptwriting as soon as we had an agreed framework. This seemed a practical way to progress in the circumstances, but it had the unintended effect of reinforcing a disciplinary divide, the statisticians on the one hand and the social sciences/accounting on the other, and allowed two sets of interests to develop in isolation. This slowed down the development process considerably, as material then had to be rewritten and reshaped at a later stage. It left the RAs in the difficult situation of trying to reconcile the two groups.

Authoring and a new medium for academics

As well as interdisciplinary problems there were also problems of communication between those with experience of authoring software (in particular, though not exclusively, the RAs) and those without. Again this was a gap which tended to be reinforced, rather than reduced, by organizational arrangements. The RAs, across all the subgroups, not just ours, were appointed at an early stage in the project, before the central organization had really been established. This had advantages and disadvantages: they were in at the beginning and consequently able to make their own contribution to developments, but at the same time they were newcomers to an organization which had not yet established its own lines of communication and administration. Thrown on their own resources, the RAs developed a camaraderie and a lively network of working relationships right across the university.
This has been enormously beneficial, in that the RAs have been able to swap technical knowledge, offer each other support and spontaneously advance one of the aims of such a project, that is, to evaluate the role of IT across a diversity of disciplines and teaching situations. The academic teaching staff were often marginal to this process and slow to benefit. Unlike the RAs, they were not newcomers: they already had an established niche in the university and they worked to a different
timetable and a different set of imperatives. Many of the academics knew nothing about authoring software, had never attended a CAL conference and had not read any of the debates about teaching with IT. They were experienced in teaching in a verbal medium (written and spoken) but had difficulty in envisaging the possibilities of the new medium of the computer. As Bunderson et al. suggest:

instruction has been trapped in a “lexical loop” perpetuated by print based media and methodology… the skill/knowledge of the expert [is translated] into a list of verbal abstractions descriptive of the critical tasks [and] given to students. The student is expected to translate the verbal abstraction back into the skills/knowledge of the expert. They are expected to create a model of the performance of the expert from the verbal abstraction. This then is the lexical loop (1981, p. 206).

The alternative approach suggested by Bunderson et al. is to provide working models in which the learner can perform. Computers are valuable because they can provide elements of simulation, but it requires imagination and experience to be able to take advantage of these possibilities. Yet at the outset of the project it was the RAs, not the academics, who were offered training (on the grounds, presumably, that they were the people entering a new situation). It seems obvious now that it was the academics who required the training and who needed the introduction to CAL philosophy and educational debates. It was they who were having to shift to a new medium of presentation and a new pedagogy. We learnt the hard way, through our own mistakes, and in the end that may be the only way to learn, but a bit of basic training would have speeded up the process and made for easier communication between the academics, with mostly teaching experience, and the RAs, with mostly development experience.
Editorial function

Again with hindsight, we should have been much more specific about defining the working relationships within our subgroup. We had a good range of skills for a courseware development team (as defined by Laurillard 1993, p. 237), either within the subgroup or the wider TILT project. After one or two false starts, we established satisfactory procedures for dealing with accounts and routine administration. What we failed to do was to establish an appropriate editorial structure.

Academics are used to working within a broadly collegiate environment where, at least in theory, everyone contributes as equals. As any social scientist will recognize, this is a rather naïve view of academic life, but it allowed us to believe that working relationships would develop organically. We had assumed that we could work with a division of labour (referred to above) in which different individuals and combinations of individuals went off and were responsible for writing different parts of the initial script, and then came together on a series of “Design Days” for collective approval, editorial decision and so on. This system failed, or at least worked only intermittently, for a variety of reasons already alluded to: the group found it difficult to come together on a regular basis and the distance between the disciplines was greater than we had anticipated. As a consequence we were left with a very weak editorial decision-making structure.

The RAs were particularly affected by this. They would produce alternative designs and receive a range of comments and preferences, when what they needed was a decision. The irony is, of course, that the RAs, who had most direct experience of the working practices involved in a project of this sort, were the least able to dictate or change the group structure: they were the newcomers to the institution, part-time and less well paid, and the academics were the original instigators of the project.
The organic collegiate model tends to ignore these differences and pretend they do not exist. This can have its plus side if it embraces a genuine respect for
people’s expertise, but a more formal structure of decision-making is needed and can also, we think, be empowering. In our group there was sufficient goodwill that we did evolve ways of working together, but we would have done so more efficiently and with less frustration had we recognized at an earlier stage that our existing model of editorial control was not working as intended and needed to be replaced.

Evaluation

The university-wide TILT project was initially set up by inviting groups of academics throughout the university to submit ideas for projects in their area of work. These proposals were then combined into cognate areas (the subgroups), which in turn were combined into the single TILT programme. At the outset the academics, in our group at least, tended to be focused on their own project and cognate areas rather than the TILT project as a whole, and tended to resent the demands made by the centre for information and participation in activities which appeared to have more to do with the central project than their own subgroup. In particular the role of evaluation, which was crucial to the overall project, caused a great deal of initial misunderstanding at the subgroup level. In fact the RAs, who were better integrated as a group across the university and closer to the centre than the academics, had a better understanding of the role of evaluation in the project as a whole. This changed as the project progressed and it became clear that the subgroup with special responsibility for evaluation could offer something of value to the other subgroups, rather than seeming to intrude and demand more paperwork, more reports and so on. In the end we all came to appreciate the value of having a group of independent evaluators who had the time and expertise to design instruments (before-and-after questionnaires, observation schedules, video recordings) with which to evaluate the effectiveness of the software we were producing.
They carried out their evaluations in laboratory conditions with selected groups and in genuine classroom situations. Quite apart from anything else, positive feedback from such thorough external evaluation has done a great deal to restore confidence in moments of self-doubt. It is important, though, to recognize that our own subgroup also carried out evaluation exercises which were crucial at a formative stage. These tended to be smaller in scale and more informal, but allowed us to try out an unfinished piece of software which was still rough around the edges and could not therefore be used in a fully fledged teaching situation. The classroom evaluations were extremely valuable in focusing our attention on the importance of locating such a package within the course structure (Laurillard 1993, p. 213). Different teachers will want to use the package in different ways, but it is extremely important that it is properly introduced to students and that it is clear what they are expected to do with it. There is a temptation for teachers everywhere, when given an independent learning package (equally video or film), to treat it as a “child-minder”, something just to occupy a classroom hour or so. Courseware is only of interest if it promotes learning. However, to the extent that it does, it only does so in conjunction with the wider teaching context in which it is used: how it is supported by handouts, books, compulsory assessment, whether the teacher seems enthusiastic about it, support among learners as a peer group, and many other factors (Draper 1995). Both formal and informal evaluation were found to be important as part of the formative development process and as part of the transition to the classroom. Summative evaluation is more difficult to accomplish. Our students, for example, were questioned and “tested” before and after classroom sessions using GraphIT!.
For the most part they reported that they had enjoyed using the tutorial package, they believed it to be useful, and the “tests” showed that they had acquired new knowledge or information (Henderson et al.
1995). At one level then, this example of CAL courseware appears to be effective, but we cannot say whether it is more effective than other methods because we did not compare our CAL tutorial with alternative, more conventional teaching and learning methods. We had always intended that the use of this courseware should be integrated with other coursework to supplement or reinforce, not to replace, conventional teaching, though it might obviously reduce the time spent on certain topics. In these circumstances it is very difficult to “pinpoint the precise variables that determine the superiority of a particular approach” (Booth et al. 1993, p. 83). We believe that CAL is attractive because it adds to the diversity of teaching methods available and it offers the student an additional source of independent learning. Whether it is cost-effective can be substantiated only in the longer term. GraphIT! has been very expensive to produce (two part-time staff working for three years) and, if we ignore the research and learning experience involved, could be justified only if it is widely adopted. So far we have received many expressions of interest, but only time will tell if it is widely used in practice. Generally CAL has not been taken up as enthusiastically as its developers would like (Booth et al. 1993, p. 83). One of the problems with the evaluation of CAL is that it is often done by CAL professionals and enthusiasts, who are already committed to developing and expanding its use. They are faced with the issue of overcoming the conservatism of course teachers and ensuring that genuine opportunities for the constructive use of CAL are created. But the real evaluation must in the end come from the long-term patterns of usage which emerge, and we must allow for the possibility that in many areas of education these evaluations may be negative and the use of CAL will be rejected. 
This is not an easy finding for a project like TLTP, which is committed to expanding the use of CAL, to contemplate objectively.

Conclusion

We have produced what we believe to be an attractive tutorial introduction to basic statistics and graphical presentation. The evaluation, from within our own institution, where it has been piloted in classroom situations, and from other institutions, where we have received teacher evaluation, has been most encouraging. We hope the final product, which has a teacher’s editing facility so that data sets which have particular relevance to individual courses can be included as part of the exercise set, will also be well received. We have not achieved everything we set out to achieve: we were over-ambitious given our resources. The whole courseware development took much longer than we had anticipated; as a consequence the tutorial package is shorter than intended. We had planned a number of additional modules which would have taken the student on to a slightly more advanced level.

What we have learned:

• Keep talking to each other! It is not enough to identify learning objectives at the outset: the same words may mean different things to different people. This is particularly true where people are coming from different intellectual backgrounds.
• Defining how you want students to learn may be more important than exactly what you want them to learn.
• Although there is bound to be a division of labour and of expertise within the group, it helps to identify a minimum training scheme and/or literature review with which you expect everyone to be familiar.
• Do not fall into the trap of thinking that the practical day-to-day arrangements and working relationships will take care of themselves: they need regular review.
• No package is a “stand-alone”, even if it is designed for independent learning. It has to be integrated into the rest of the course and its success or failure will depend in part on how it is used.

References

J. Booth, J. Foster, D. Wilkie, K. Silber, “Evaluating CAL”, Psychology Teaching Review 2, 1993, 2.
C. V. Bunderson, A. S. Gibbons, J. B. Olsen, G. P. Kearsley, “Work models: beyond instructional objectives”, Instructional Science 10, 1981, pp. 205–15.
C. Crook, Computers and the collaborative experience of learning (London: Routledge, 1994).
S. W. Draper, “Two notes on evaluating CAL in HE”, University of Glasgow, 1995. WWW URL: http://psy.gla.ac.uk/steve
F. P. Henderson, C. Duffy, L. Creanor, S. Tickner, “Teaching with Independent Learning Technology Project: University of Glasgow”, paper presented at CAL Conference, Cambridge, 10–13 April 1995.
D. Laurillard, Rethinking university teaching: a framework for the effective use of educational technology (London: Routledge, 1993).
Chapter 6 THE DATA GAME: LEARNING STATISTICS Stephen Morris and Jill Szuscikiewicz
Learning statistics is a perennial problem for students and research workers from non-mathematical backgrounds. The social sciences and medicine in particular rely on high-quality analysis and interpretation of data. However, the teaching of statistics throughout higher education assumes a high degree of mathematical competence, even when the students are from non-mathematical disciplines. This is without doubt a major reason for students’ perceived lack of statistical judgement (Jamart 1992). Clearly a different approach is called for. With imagination it is possible to convert difficult statistical problems into simpler problems of pattern recognition. Furthermore, with computerization the extra ingredient of interactivity can be added, enabling the student to manipulate the raw data while observing the changing patterns and thereby build up an intuitive understanding of how statistics works. In effect statistics is turned into a game, the data game. In this paper we describe this approach to the teaching of statistics.

The problem

Imagine learning to play chess, with a set of principles and examples in a book but without the board and pieces. Or learning music composition without sound. Statistics learnt solely from the pages of a book (or lectures) suffers the same problem, and unfortunately most students and researchers are expected to gain a practical grasp of the subject in exactly this way. A further problem is that much statistics teaching is based on mathematics, which is beyond the easy reach of most students. This makes it more like learning chess without a board in a foreign language. Finally, most statistics textbooks and courses take a few sets of data and work through them, which sounds acceptable but does not actually teach statistics. Students learn a handful of analyses instead: examples of statistics rather than statistics itself. Statistics is particularly in need of a new approach.
Its purpose is to present a large and probably complex body of raw data in a meaningful, summarized form. Everyone understands what raw data is, because they collect it and it is largely self-explanatory. Students accept the concept that you come to a conclusion, and that that is the end of the process—however, they do not understand what goes on in between. Statisticians work with a series of steps culminating in the table of test statistics, each step condensing the data and making it more manageable. This forms a kind of “Information Funnel”, large and raw at one end, and clear and informative at the other. Statistical analysis programs used by those in education and research every day emphasize the two ends of the funnel, and hide the intermediate steps. How is a student to understand what is happening when they enter a vast array of data and a moment later are presented with a few probabilities? Although pride of place appears to go to the raw data and the final statistics, educationally the in-between steps are of primary importance. Most of what we thoroughly know has been learnt by observation, trial and error. Statistics cannot be taught this way within the current mathematical framework using a selection of data sets, since the time
required (and the number of prepared analyses) would be far too great. However, most students of statistics are not interested in it as an academic discipline, and by approaching the subject instead as a tool to be used, a higher degree of teaching flexibility can be achieved. Mathematical proofs become irrelevant; when learning to ride a bicycle, a child needs no knowledge of angular velocity, frictional forces or gravitational pull; an intuitive understanding of all of them will be impressed on him/her more or less painfully. While deep mathematics can prove or disprove assertions, proof does not necessarily lead to enlightenment (Jamart 1992), and much of what we truly understand requires no proof at all but repeated, varied and directed observation. A deep appreciation of almost any subject comes after practical experimentation. This can be achieved in statistics teaching by making it into a game, after which it can be at least as interesting as chess, and certainly a lot more useful. Now that the IT revolution has made faster, more powerful PCs available to university education at reasonable cost, their advantages to both teachers and students are widely recognized. When used with imagination, the increased interactivity of PC software is a powerful ally in the move away from didactic teaching; and the potential availability of networked software 24 hours a day enables students to work with complex concepts, at their own pace, whenever it suits them (Simpson 1995).

Our solution

With this in mind we have approached the problem of teaching statistics by creating and computerizing a series of challenges and games which the user plays by changing the data. This is a radical departure from traditional statistics teaching, where correct statistical practice is mirrored unnecessarily closely in making the data sacred and immutable.
It is still possible to impress on students that real-life data cannot be changed; and by giving them the opportunity in the classroom to experiment in a way impossible in real life, they become experienced in recognizing patterns and exploring strategies without danger. They are exposed to a wide variety of situations which might take a decade or more to accumulate through real research. Although traditional teaching styles may be able to show some variety to students, the interactivity of our approach involves them directly, making it more successful than a strictly didactic method. However, if the results of the interactivity remain complicated, the students will simply have a deeper understanding of their confusion. By transposing the data into a simple graphical representation, whether a fitted line or a set of normal plots, the results of the interactivity become clear and the student gains a genuine understanding. This gives an entirely natural representation of the middle stage of the Information Funnel, connecting the original data in a clear way to the otherwise slightly mysterious test results and conclusions. The data points are the game pieces, which may be moved in any direction. The student can be guided through a number of scenarios within which they are encouraged to experiment and observe changes in the resulting test statistics, finally being challenged to generate particular outcomes. Through these exercises, the students recognize that the processes directly connect the data to the results, and that they can be understood. Although they may not understand the processes at first, by the end a clear intuitive understanding will be established. We tested this teaching approach by producing a suite of PC-based game-playing scenarios designed to demystify a wide range of statistical concepts. These were incorporated into a comprehensive teaching package called Statistics for the Terrified.
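The essence of the data game, changing the data and watching the summaries respond, can be sketched in a few lines of Python. This is our own illustration, not code from the package, and the numbers are invented:

```python
from statistics import mean, median, stdev

# Invented data set: in the package the student drags points on screen;
# here we simply overwrite one value and recompute the summaries.
data = [4.0, 5.0, 5.0, 6.0, 5.0]
print(mean(data), median(data), round(stdev(data), 3))  # 5.0 5.0 0.707

# "Move" one data point far away, as a student might, and watch the
# summaries react: the mean and s.d. chase the outlier, the median does not.
data[0] = 40.0
print(mean(data), median(data), round(stdev(data), 3))  # 12.2 5.0 15.547
```

In the software the same recomputation happens instantly as a point is dragged, which is what lets the pattern, rather than a formula, carry the lesson.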
Although the software does cover some quite advanced topics, it approaches everything in a basic, commonsense way. The areas covered by the software include:

Table 6.1 Areas covered by the software.

How to choose a test: The importance of Groups in statistics; Identifying appropriate tests by data layout; When to use: Chi Square, Kruskall-Wallis, Mann-Whitney, Oneway Analysis of Variance, Paired and Two-sample t-tests, Wilcoxon.

Basic data description: Descriptive statistics; Advantages and disadvantages of median and mean, range and variance, and coefficient of variation; The importance of the normal curve and how the mean and s.d. affect its shape and position; Standard error and confidence intervals.

Testing for differences between groups: The differences and similarities between the two-sample t-test and oneway analysis of variance; Role of the normal distribution; The differences and similarities between the Mann-Whitney test and the Kruskall-Wallis test; Role of box and whisker plots; When and how to use oneway and twoway analysis of variance; Developing the ability to visualize data in graph form.

Uncovering hidden influences: Uncovering influences! Reducing bias and variance; Analysis of covariance (when the influence is a measurement); Twoway analysis of variance (when the influence is a category, such as gender); Matching groups to prevent bias; Reduced variance enhances the likelihood of a significant result.

Fitting lines to data: Regression; Judging the value of a linefit; Using the fitted line; Describing the line.

Analyzing repeated measurements: Before and after studies; About the normal curve; The paired t-test; Why area under a curve? The differences between areas; Different repeated measurement shapes.

Analyzing 2 × 2 classification tables: What are classification tables? Interpreting proportions; Risk difference; Relative risk and relating two proportions; Constructing a hypothesis of no difference between the groups; Issues surrounding the Chi Square Test (Fisher’s Exact Test).

What does p < 0.05 actually mean?: What is going on when a hypothesis is tested? Type 1 error, Type 2 error, and Power; The effect of sample size in the accuracy of a statistical trial; Use of blocking in experiments.
The Computer Unit ran a series of regular statistics courses open to research staff and students who had previously had university-level statistics teaching. Their level of knowledge and confidence was assessed before and after CAL-based teaching sessions via a questionnaire, on such areas as Correlation, Outliers, t-test, etc. Responses were made by marking a 0–10 scale. Information on attitudes to using computers for
learning, IT skill levels, etc. was also gathered to ensure that participants were all of a similar skill level and attitude. These questionnaires were received from a total of 51 students. The data was analyzed using a paired t-test, after applying the Shapiro-Wilk test to confirm normality.

Examples of the data game approach

Interactive graphs

The module on linefitting (regression) is a good example of our approach. It can be viewed as three sections:

• Introduction and overview
• General exploration
• Challenges.

In the first part, general concepts are introduced, with animated illustrations. This covers what linefitting is, what it is for (prediction and influence), how to judge the usefulness of a linefit (correlation coefficient), and how to describe the line (gradient and intercept). At this stage we don’t include too much detail, just a commonsense definition with a clear demonstration. The next section provides users with a graph, ten movable points, and a line which is automatically recalculated (see Figure 6.1). Next to the graph are the essential parameters: Correlation, Gradient and Intercept. More information on the meaning of these is available to students who require it; however, as it is not important at this stage, it is available only as hypertext. For a period of two minutes, students are invited to move the points and watch how the line changes, and also to observe the effect on the parameters. This section performs a dual function. Primarily, the student begins to gain an intuitive feel for the way a line reacts to data (we have found they pick up particularly on outliers and on data with poor correlation) and begins the process of learning to “eyeball” scatters of points. Since they will not always have a plotted graph to work from in real life, the data values are also given (colour-coded to the points) next to the graph.
However, it also painlessly teaches students the minimal software skills which they will need in order to work through the challenges in the next section: dragging the points, and spotting where on the screen the relevant changes occur. In the final section, the students work through a series of four challenges, in which they have to progressively change the data to produce a Correlation, Intercept and Gradient value by manipulating the points, and finally a really difficult challenge where a particular Correlation and a Gradient must be obtained together (see Figure 6.2). The values for these exercises are generated randomly; this means that they can be repeated as often as desired, without any actual repetition. Although they have been exposed to absolutely no mathematics in this module, all students successfully complete these exercises, and enjoy the learning process. Depending on the complexity of the random challenges and the student’s initial knowledge, this may take between 30 minutes and an hour. At the end they have a good grasp of what linefitting is for, what correlation, gradient and intercept are, and what they say about the data. They are also able to make an educated guesstimate of the correlation of a given set of data. In other words, they are aware of all stages of the Information Funnel, and connect them together naturally and easily.
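The recalculation behind the movable-points screen is ordinary least squares. As a sketch (our own Python, not the package’s code), the three on-screen parameters can be recomputed from the points each time one is dragged:

```python
from math import sqrt

def linefit(points):
    """Least-squares line through (x, y) points: gradient, intercept, correlation."""
    n = len(points)
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)          # spread of x
    syy = sum((y - my) ** 2 for y in ys)          # spread of y
    sxy = sum((x - mx) * (y - my) for x, y in points)  # co-variation
    gradient = sxy / sxx
    intercept = my - gradient * mx
    correlation = sxy / sqrt(sxx * syy)
    return gradient, intercept, correlation

# Ten invented points lying exactly on y = 2x + 1:
pts = [(x, 2 * x + 1) for x in range(10)]
print(linefit(pts))  # (2.0, 1.0, 1.0)

# "Drag" the last point down, as a student would, and refit: the outlier
# pulls the gradient and the correlation down with it.
pts[9] = (9, 0)
gradient, intercept, correlation = linefit(pts)
```

Watching these three numbers change as a point moves is precisely the middle stage of the Information Funnel made visible.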
Figure 6.1 A painless way to absorb basic information.
Pattern recognition

A slightly different approach was used in the module “How to choose a test”. Many of those using statistics as a tool (both students and researchers) find the choice of a test to use in a real-life situation a baffling and fundamental problem. This is because statistics textbooks present different sets of data and mathematically generate appropriate tests from first principles. The mysterious use of mathematics in this context merely distracts the student from the data. However, it is possible to concentrate on the data and still choose an appropriate test. By presenting it in terms of pattern recognition (a skill available to all of us) rather than complex maths, we were able to make some basic concepts clear without any attendant bewilderment. As with the linefitting module, it can be viewed in three sections:

• Introduction and overview
• General exploration
• Quiz.

The introduction is a little more detailed than with linefitting, since it often has to overcome a lack of confidence resulting from previous bad experiences and confusion. The three most common data layouts are shown on the screen and explained (see Figure 6.3). Throughout the whole module it is demonstrated that 90 per cent of data obtained in research is based around one of these clearly different layouts. Deciding on a test is simply a matter of matching patterns. The General exploration and Quiz sections are designed to be worked through a number of times. We recommend that students return to the exploration at least twice, as there is a huge volume of information stored within it. The basic screen display remains the same as for the introduction. However, everything on the screen is “hot”, and by simply clicking anywhere the student can gain further information. For example, by clicking on a name in the Repeat Measurements panel, the student can obtain information on the concept of tracking
Figure 6.2 Repeated, varied and purposeful experimentation.
one person at repeated intervals, and how that differs from Group Comparison (see Figure 6.4). This section is totally interactive. The user clicks on items of interest and receives information accordingly. In the Help window they are introduced to further ideas to follow up, such as parametric/non-parametric testing. Perhaps the most important facet of this section is the opportunity to explore the Permissible Tests for each layout. Once this mode has been entered, a mouse click brings up a summary of the appropriate tests, and again clicking on a test calls up detailed information on the test itself (Figures 6.5 and 6.6). After spending some time working through this section, the student acquires a body of information concerning the major statistical tests and the way they work, and the confidence that they can apply them appropriately. Finally, there is a Test Quiz, in which six questions selected at random from a question bank test the student’s acquired knowledge of choosing a test (see Figure 6.7). Feedback is provided to the student at the end, so that they can return to the previous section and explore a little further.

Conclusion

We have been using the software now for some time in the teaching of statistics, and those who have used it report that they feel much more confident of their ability to cope with the subject. In addition to classroom sessions, it is available on the St George’s Medical School Network, and in this way it has reached a large number of researchers. These users are particularly pleased with its practical emphasis, and find that the method of using the PC on their desk for a quick subject refresher is very convenient, and does not involve the loss of face that comes from asking for advice. Alarmingly, a number of experienced researchers have confessed to not having understood some very basic concepts before working through this package under their own steam.
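In outline, the pattern-matching idea of the choose-a-test module is a lookup from data layout to permissible tests. The layout names and groupings below are our own illustration, assembled from the tests named in Table 6.1, not the module’s actual tables:

```python
# Hypothetical layout-to-test lookup illustrating the pattern-matching idea:
# the shape of the data, not mathematics, selects the candidate tests.
PERMISSIBLE_TESTS = {
    "two independent groups": ["Two-sample t-test", "Mann-Whitney"],
    "several independent groups": ["Oneway Analysis of Variance", "Kruskall-Wallis"],
    "repeat measurements (before/after)": ["Paired t-test", "Wilcoxon"],
    "2 x 2 classification table": ["Chi Square", "Fisher's Exact Test"],
}

def permissible_tests(layout):
    """Match a recognized data layout against the known patterns."""
    return PERMISSIBLE_TESTS.get(layout, [])

print(permissible_tests("repeat measurements (before/after)"))
# ['Paired t-test', 'Wilcoxon']
```

Choosing between the parametric and non-parametric entry in each pair is exactly the kind of refinement the module’s hypertext then introduces.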
Figure 6.3 Explaining the pattern recognition principle of choosing a test.
These results were extremely encouraging, not least because all users enjoyed the experience of learning statistics for the first time. The most effective learning took place when the data game was also accompanied by graphical representations, such as the fitted line in the regression module or multiple normal curves (or Box Plots) in the Analysis of Variance module, although quizzes were also popular.

Table 6.2 Confidence and understanding before and after using the software. Each entry gives the average confidence/understanding before and after using the software (0% implies zero confidence/understanding; 100% implies total confidence/understanding), followed by the paired t-test significance level.

• How confidently can you explain what the Intercept is? Before 20.7%, after 84.2%, p = 0.0000006.
• How confidently can you explain what Gradient is? Before 29.2%, after 77.8%, p = 0.000004.
• How confidently can you explain what Correlation is? Before 64.2%, after 85%, p = 0.0056.
• How confident are you about the way outliers affect a linefit? Before 27.8%, after 78.5%, p = 0.00001.
• Could you explain how the standard deviation affects the normal curve? Before 52.8%, after 82.8%, p = 0.0006.
• Could you explain what the two-sample t-test is used to test for? Before 48.5%, after 74.2%, p = 0.0007.
• Could you explain the circumstances in which the two-sample t-test gives a significant difference? Before 44.2%, after 67.8%, p = 0.0002.

Figure 6.4 Interactivity allows the student to follow their own line of interest.

The overall effect of the software was to make the Information Funnel clearly
visible, with large amounts of raw data at one end, a small number of interactive graphical representations and numeric statistics in the middle, leading to the final simple interpretation. This reflects one function of statistics itself as an informative summary tool. The software has also generated an enormous volume of comments from users and from teachers in other institutions. The feedback was obtained from verbal comments, from questionnaires, and in large numbers through our Talkback feature, which allows the user to enter comments at any time from within the software. The Talkback feature was originally designed by us as a way of obtaining student feedback on areas of difficulty and misunderstanding, its in-built anonymity encouraging users to be as frank as humanly possible. We used this frankness to perform an evaluation of the software, asking users to enter comments on the usefulness (or otherwise) of learning statistics by the data-game approach. The most common reaction from researchers was that they could not believe that statistics had been so simple all along: “If
choosing a test can be made this simple, why hasn’t anyone told me before?” Students enjoy it: classes consistently overrun because students do not want to leave, unusually for statistics. Teams of two to a PC working in the classroom on interactive exercises have been observed competing in a race against time to complete a set of challenges, and even to go back for a “best of three”! The accompanying graphical reinforcement of the data was commented on frequently as being especially helpful to the intuitive understanding of the way a test works. The simple description of apparently complex items, such as an analysis of variance table, was also well received. Most students found the package approachable, unlike the typical statistics textbook; however, many said that the experience of using the data-game approach would allow them to look at traditional statistical textbooks again in a more informed light. An unsuspected benefit was highlighted by a number of students who felt that their confidence in, and grasp of, basic numeracy had improved. There is some concern about the lack of basic numeracy skills in a number of disciplines, and in areas such as nursing (Jacobsen et al. 1991) there is growing evidence that this is worsening with increased use of calculators and computers. Having gained an understanding at this level, the user is able to approach lectures and textbooks in a more informed manner. We do not feel that our approach replaces the existing teaching of statistics, but it is a valuable precursor and accompaniment to it.

Figure 6.5 Knowing which tests are appropriate.

References

M. Cartwright, “Numeracy needs of the beginning Registered Nurse”, Nurse Education Today 16, 1996, pp. 137–43.
Figure 6.6 More information on each test is available.

B. S. Jacobsen, R. S. Jacobsen, L. Tulman, “The computer as a classroom participant in teaching statistics”, Computers in Nursing, May–June 1991.
J. Jamart, “Statistical tests in medical research”, Acta Oncologica, 1992.
J. M. Simpson, “Teaching statistics to non-specialists”, Statistics in Medicine, January 1995.
Figure 6.7 Self-testing enables students to monitor their learning in an exploratory environment.
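As a footnote to the evaluation reported in Table 6.2, the paired comparison it rests on can be sketched in a few lines of Python. The statistic below is the standard paired t; the scores are invented for illustration, and in practice (as in our analysis) the differences should first be checked for normality and the p-value read from the t distribution on n − 1 degrees of freedom:

```python
from math import sqrt
from statistics import mean, stdev

def paired_t(before, after):
    """Paired t statistic: mean of the differences over its standard error."""
    diffs = [a - b for b, a in zip(before, after)]
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Invented 0-10 confidence scores for five participants, before and after.
before = [2, 3, 5, 1, 4]
after = [5, 4, 7, 5, 4]
t = paired_t(before, after)
print(round(t, 3))  # 2.828, on 4 degrees of freedom
```

The point of the pairing is visible in the code: each participant is compared only with themselves, so differences between participants do not inflate the variance.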
Chapter 7 CONVERSION OF THE IDEOLOGIES OF WELFARE TO A MULTIMEDIA TEACHING AND LEARNING FORMAT David Gerrett
This paper describes the process developed for, and lessons learned from, the conversion of the Ideologies of Welfare into a Multimedia teaching and learning format. Yardsticks for the time and effort required to convert intellectual material are provided. In the case of the Ideologies of Welfare lesson, producing a second-generation package required 260 hours of staff time at a cost of approximately £4,000. A brief description of the lesson and specific student feedback on the use of the package are included.

Background

The Ideologies of Welfare (IofW) are groupings of often opposing social constructs which provide a rationale for differentiating policy decision-making concerning public welfare. A knowledge of the ideology which currently underpins the direction of the British health service is of particular interest to health care professionals such as pharmacists. They are legally responsible for monitoring the process whereby therapeutic medicines are made available to the public. As drug costs rise they are relied upon to assist in the rationalization of services and are themselves increasingly becoming the focus of policy-making. As such, their ability to make decisions perceived to be rational by those in power depends on their knowledge of the “in vogue” ideology. There is currently no formal undergraduate instruction on the IofW for pharmacists. The only postgraduate instruction occurs on the Postgraduate Programme in Social and Administrative Pharmacy (The Course) run by the University of Derby. However, the theory and practical application of the IofW may become more important in pharmacy education following an independent and particularly influential assessment of the occupation in 1986, which recognized the general educational deficit in the Social Sciences (The Nuffield Foundation 1986).
In response, the official body responsible for monitoring curriculum content and ultimately registration of pharmacists, the Royal Pharmaceutical Society of Great Britain, recommended that “teaching of social sciences should be an element of all years of the undergraduate course” (Royal Pharmaceutical Society 1989). Furthermore, postgraduate pharmacy teaching has begun focusing on practice in its broadest sense which necessitates an understanding of human action, a social science domain. The change in emphasis is in keeping with Pharmacy Administration courses in America (Teachers of Pharmacy Administration of the American Association of Colleges of Pharmacy 1985; American Association of Colleges of Pharmacy 1992).
The Course and Multimedia teaching and learning

The University of Derby validated The Course in September 1993. The Course Planning Team (CPT) comprises staff of the School of Health and Community Studies, pharmacists from the School’s Academic Pharmacy Practice Unit, plus senior hospital pharmacists at the Derbyshire Royal Infirmary. The first of the four modules making up the Certificate stage of the award is Pharmacy and Health Policy. This core module was validated at 100 hours of student effort, of which 30 hours were allocated to instruction on the IofW.

A critical feature of The Course is its sole use of Multimedia. This term is generally understood to mean the integration of several media, such as text, graphics, video and sound, into a single computer application. At the time of validation, only two other courses in Britain were known to be so essentially dependent on technology. Multimedia teaching and learning (MTL) refers to the use of computers and programmes to present educational information to students in an interactive manner. It is a form of resource-based learning commonly described as self-directed, independent and individual in nature. Through its use the educational needs of a significant population, including those unable or unwilling to attend face-to-face courses, can be satisfied (Gilroy 1992; McDonough, Strivens, Rada 1994). Multimedia teaching and learning has been shown to be a viable alternative to lectures generally (Clem et al. 1992) and, specifically for pharmacy, as part of undergraduate (Stevens & Sewell 1993) and postgraduate (Pugh et al. 1993) provision.

Pharmacists are an ideal audience for Multimedia. Many are unable to attend a university, as they are legally committed to be available for discussion with patients concerning medication, yet they have access to and experience with computers. The conversion of intellectual material to lessons in a Multimedia format is termed “authoring”.
In order to ensure that the aims and outcomes of lessons are reflected in student understanding and action, and, further, that the lessons of a module form a cohesive learning experience, a series of stages in the process of authoring was identified for validation of The Course. The flow diagram in Figure 7.1 and the corresponding descriptions in Table 7.1 define the stages involved. These chart the relationships in the production of lessons between the CPT, Module Teams, Authors, the Multimedia Teams, the MTL Unit and, most importantly, students. Careful note should be made of the implicit responsibilities of those groups involved in authoring and of the quality assurance implicit in the feedback mechanisms. Note also that for the full period when a lesson is made available on The Course, evidence of the effectiveness of the student learning experience is required to be compiled and passed between the groups involved. Evidence may take the form of, for example, student comment, assessment, or specific research conducted for the purpose of eliciting the student response. In ensuring quality, reports are required to address the central question of whether lessons achieve the student outcomes specified in the module content.

Figure 7.1 Steps in the authoring process (A–M).

Table 7.1 Steps in the authoring process (A–M)

(A) The CPT notify Module leaders six months prior to delivery of lessons. Module leaders convene meetings of the module team and produce a first report detailing module structure and the aims and outcomes for each lesson. The CPT receive the first report and may provide suggestions and request conditional changes.
(B) Within criteria agreed, Module Leaders commission production of authored lessons.
(C) Lessons are authored. Communication between Authors and Module Teams concerns how authored lessons meet the aims and objectives specified.
(D) Lessons are authored. Communication between Authors and the Multimedia team concerns the application of current knowledge of the HCI and available technology to optimizing the student learning experience.
(E) The Multimedia team communicate with the MTL Unit for advice on the latest HCI strategies and provide lessons for student evaluation.
(F) The MTL Unit conduct research to verify the student outcome from interaction with the lesson.
(G) The Multimedia team communicate the results of student evaluation to the Module team.
(H) The Module team produce a second report to the CPT, including evidence of the student experience, for approval to run the authored lessons.
(I) The CPT comment on the second report and notify their recommendations to the Course Committee. Within policy laid down by the Course Committee, direction is given to the Module team.
(J) Having considered recommendations and met all conditions, Module teams supervise student access to authored lessons.
(K) Student feedback is monitored by the Module teams.
(L) Module teams compile a yearly report for the CPT, including student responses to lessons.
(M) The CPT respond to the yearly report and may initiate changes or the production of new lessons following steps B to L.

Figure 7.2 Example of a lesson in Ideologies of Welfare.
A Multimedia Ideologies of Welfare lesson

Following the process outlined previously (Figure 7.1; Table 7.1), on 24 June 1993 the first report (Table 7.1 (A)) on the structure and content of the Pharmacy and Health Policy Module was sent to the CPT. The description for lesson 4 of the module stated: “Lesson four ‘Ideologies of welfare’ takes the negotiated policies in lesson two and considers the underpinning political ideologies which have shaped the Welfare State. This lesson considers developments from the political perspective.” The aim of the lesson was: “To direct the student to an understanding of the belief systems which underpin ideological stances.” The stated outcome was: “The student will demonstrate an ability to identify and critique the fundamental ideological stances which underpin health policies.” Details of the proposed intellectual author were submitted. It was noted in the submission that the topic had been taught for two years at postgraduate level to health care workers attending the Postgraduate Programme in Research Methods. A subsequent meeting of the CPT agreed to progress the lesson, and funds for conversion to Multimedia were released.
After an estimated 20 hours’ focused activity, a lesson covering the IofW for conversion to a Multimedia format was committed to paper by the intellectual author. No further reading was necessary: lecture notes had been prepared and a clear idea of the structure of the Multimedia lesson was already known. The Multimedia lesson was designed to provide four hours of student interaction and directed approximately 26 hours of reading. Reading time was based on one hour of student effort for 10 pages of text. Students based in Britain were required to obtain three classic references (Lee & Raban 1983; George & Wilding 1985; Clarke, Cochrane, Smart 1987) and those outside were required to access one further text (Cochrane & Clarke 1993). Reasons for “outsourcing” much of the didactic material were as follows:

1. The amount of technical authoring required was kept to a minimum.
2. Students were saved from either printing off, or reading from the computer, reams of text.
3. Student time in front of the computer was more interactive.
4. Copyright issues for the use of intellectual material in a Multimedia format were, and still are, vague.
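The reading-time yardstick (one hour of student effort per 10 pages of text) is easily expressed; the page counts below are illustrative only, not the lengths of the actual set texts:

```python
def reading_hours(pages, pages_per_hour=10):
    """Estimate directed-reading effort: one hour per 10 pages of text."""
    return pages / pages_per_hour

# Hypothetical page counts for a set of directed readings (not the real texts).
set_texts = {"text_a": 120, "text_b": 90, "text_c": 50}
total = sum(reading_hours(p) for p in set_texts.values())
```

With these invented page counts the estimate comes to 26 hours, matching the directed reading time allocated to the lesson.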
The lesson centred around four questions on each of 20 themes, which were piloted and shown to differentiate anti-collectivist, reluctant collectivist, collectivist and revolutionary collectivist ideologies. Students were first asked to respond to the 80 questions and view a graphical representation of their ideology. They were then directed to specific reading and asked to respond a second time to the same questions in a different order. Feedback to students was based on an examination of how their ideology may have changed through the reading, as evidenced by a single graphical representation of their pre- and post-reading ideologies. The graphical representation of one student’s response is provided in Figure 7.2. Students were able to record their views and responses to specific posed situations by typing in text entry boxes.

The first version of the Multimedia lesson was designed to run in the Windows™ operating environment. This defined the recommended minimum hardware specification as a 486 processor running at 33 MHz with 4 Mbytes of random access memory (RAM). No other specific software was required. Using Authorware Pro™ as the authoring software tool, an estimated 120 hours of software development was required to produce the first version. In the design of Multimedia lessons the literature (Preece et al. 1994; Christie & Gardiner 1990) was used to provide some guidance for authoring. Text and graphics were the principal media used. Subsequent software testing and quality assurance required a further 40 hours.

Four students provided initial feedback (Table 7.1). In the case of lesson 4 additional student testing was conducted. Twenty students from the 1994 cohort of the Postgraduate Programme in Research Methods, studying the core module entitled Health Policy, read the set texts and interacted with the first version of the IofW lesson. Verbal student feedback on the experience was favourable.
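The before-and-after scoring just described might be sketched as follows. The mapping of questions to ideologies and the agree/disagree response format are assumptions for illustration; the actual Authorware lesson recorded responses and drew a graphical profile rather than printing counts:

```python
from collections import Counter

IDEOLOGIES = ("anti-collectivist", "reluctant collectivist",
              "collectivist", "revolutionary collectivist")

def profile(responses, key):
    """Tally agreement per ideology.

    responses: {question_id: True/False agreement} for the 80 questions.
    key: {question_id: ideology} mapping each question to the ideology
         it was piloted to differentiate (hypothetical mapping).
    """
    counts = Counter({i: 0 for i in IDEOLOGIES})
    for q, agreed in responses.items():
        if agreed:
            counts[key[q]] += 1
    return counts

def shift(pre, post):
    """Change in each ideology score after the directed reading."""
    return {i: post[i] - pre[i] for i in IDEOLOGIES}
```

Plotting the `pre` and `post` profiles on one chart would give a single graphical representation of how a student's ideology changed through the reading, as the lesson did.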
Indeed, one student indicated that the lesson “allowed me to be ignorant and did not show me up to the rest”. It was not felt necessary to go over the material in the formal, traditional lecture-plus-tutorial format, to which six hours would normally have been allocated. Subsequent students in this programme have all been provided with the Multimedia lesson and told to “take the lesson” at their own pace, but to have completed their interaction by a set date, thereby releasing valuable staff time.

In April 1994, 12 students on The Course were provided with Module 1, including the Multimedia IofW (Table 7.1(J)). Feedback was automated by the Multimedia programme and saved to a floppy disk. Assessment was dependent on receipt of the feedback disk. Evaluation of the success of the lesson was based on before-and-after reading responses to the 80 questions and on the transcripts of students’ rationalization of their ideologies. Further evidence of the student learning experience was provided by students’ text entries of their understanding of how, over time, developments in the NHS had paralleled changes in ideologies.
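A feedback record of the kind described (access times, the path taken through the lesson and free-text entries, written to disk for assessment) might be structured as in this sketch; the field and method names are assumptions, not those of the actual Authorware programme:

```python
import json
import time

class FeedbackLog:
    """Accumulates a student's interaction trace for later assessment."""

    def __init__(self, student_id):
        self.record = {"student": student_id, "accesses": [],
                       "path": [], "text_entries": []}

    def open_lesson(self, timestamp=None):
        # Record the exact time the lesson was accessed.
        self.record["accesses"].append(timestamp or time.time())

    def visit(self, section):
        # Record the specific path chosen through the lesson.
        self.record["path"].append(section)

    def enter_text(self, section, text):
        # Free-text responses typed into the lesson's entry boxes.
        self.record["text_entries"].append({"section": section, "text": text})

    def save(self, path):
        # On The Course this record went to a floppy disk and was posted in.
        with open(path, "w") as f:
            json.dump(self.record, f, indent=2)
```

Making assessment conditional on receipt of the saved record, as the module did, guarantees that every assessed student also contributes evaluation data.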
By April 1995 all student marks had been externally moderated and all students had passed. Thirty per cent of the coursework mark was allocated to demonstration of an understanding of the IofW. From the 12 students a total of 105 A4 pages of feedback were received. A typical student’s feedback contained 3,000 words of text. In addition, the exact times when the lesson was accessed and the specific path chosen through the lesson were all recorded. Students were encouraged to make comments on their experience of the lesson. These tended to reflect on technical aspects, as the following verbatim examples demonstrate (cited with permission of Student K. Rosenbloom):

When going through the program it would help to know if you have done this before…
I hope that I have done this already…and that it is stored
I have changed my views now that I have done this reading what has happened
The program would be better if it knew that you have done this section
the bookmark is not working
this is different to last time, I was a reluctant collectivist
I have done this but the information was lost, I will do this again…
where have my results gone again?
I can’t go back and I forgot to press F 12
how can you say that 1980 is not anti collectivist?
I give up
I tried this one first

From such feedback it was possible to determine the following critical features requiring change in the revised lesson, namely the ability for the student to:

• move backwards and forwards in all situations
• have permanent records kept of their interaction
• start again if required
• exit from and return to all parts of the lesson
• print all text inserted.
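Taken together, these revisions amount to lesson navigation with a persistent history. A minimal sketch (the section names and method names are invented, not taken from the Authorware lesson):

```python
class LessonNavigator:
    """Back/forward navigation with a persisted position (bookmark)."""

    def __init__(self, sections):
        self.sections = list(sections)
        self.pos = 0
        self.saved = {}  # permanent record of the student's position

    def forward(self):
        # Move forwards; never run off the end of the lesson.
        self.pos = min(self.pos + 1, len(self.sections) - 1)
        return self.sections[self.pos]

    def back(self):
        # Move backwards in all situations; never before the start.
        self.pos = max(self.pos - 1, 0)
        return self.sections[self.pos]

    def bookmark(self):
        # Keep a permanent record so work is not lost on exit.
        self.saved["bookmark"] = self.pos

    def resume(self):
        # Exit from and return to any part of the lesson.
        self.pos = self.saved.get("bookmark", 0)
        return self.sections[self.pos]

    def restart(self):
        # Start again if required.
        self.pos = 0
        return self.sections[self.pos]
```

The point of the sketch is that every complaint in the student feedback maps onto a missing operation of this small state machine; the revised lesson supplied them all.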
In order to make the changes to the lesson, a further 80 hours of authoring time was required. Up to the time of demonstration the IofW lesson had required an estimated 260 hours of academic and authoring effort. Conservative estimates put the staff costs at £4,000. External assessment of students’ coursework has proved the lesson is capable of leading students through an academic learning experience. Exactly how this experience compares to that of “chalk ’n’ talk” plus tutorials is yet to be determined. It is the author’s suspicion that the two experiences are simply different, each with strengths and weaknesses but, demonstrably, the same outcome.

Conversion of the IofW is an important test for Multimedia. In the author’s previous experience, this topic is most effectively taught with a minimum of formal lecture input, copious handouts and proportionally greater time allocated to small group tutorials discussing questions across a wide variety of politically, socially and emotionally charged topics. The test is: how can Multimedia emulate such interaction and lead to similar student learning outcomes? Where emulation is concerned it clearly fails the real-time interaction test; however, students’ questions can be anticipated and the Multimedia software programmed to react on cue. In terms of outcome, our limited experience is that it passes. Exactly how and why have not been elucidated, but the proof is there. Our experience suggests that the limitations of
Multimedia in terms of real-time interaction are not terminal, and that other discursive topics may also be amenable to such delivery. The implication for the social sciences is considerable, in that many of its topics are taught in this manner.

In summary, this paper describes the changing orientation of pharmacy undergraduate and postgraduate education in Britain. Many practising pharmacists have not had the benefit of a broader education. Pharmacists typically work in community, hospital and industrial settings. Many are unable to access campus-based courses and require a distance-learning delivery format. The Postgraduate Programme in Social and Administrative Pharmacy has been developed to meet their needs. The challenge has been to convert material which has been successfully taught by lecture and tutorials to the one-to-one, interactive Multimedia format. At validation the process of conversion was laid down. An example of one lesson, the IofW, is described. The incorporation of full navigation facilities is a fundamental principle in authoring. Student feedback confirms the desirability of such facilities. Finally, our answer to the group learning experience has been to incorporate student feedback into subsequent versions of the lesson. That Multimedia cannot truly emulate the discursive real-time tutorial interaction does not appear to terminally affect student learning outcomes. The implication is that other topics taught in a similar manner may be amenable for conversion to MTL. This is a long-term and expensive commitment to the student learning experience. It remains for future researchers to determine the effect of this approach.

References

American Association of Colleges of Pharmacy, “Mastering change”, 93rd annual meeting (Washington DC: American Association of Colleges of Pharmacy, 12–15 July 1992).
B. Christie & M.M. Gardiner, “Evaluation of the human-computer interface”, in Evaluation of human work: a practical ergonomics methodology, J.R. Wilson & E.N. Corlett (eds) (London: Taylor & Francis Ltd, 1990).
J. Clarke, A. Cochrane, C. Smart, Ideologies of welfare: from dreams to disillusion (London: Hutchinson, 1987).
J.R. Clem, D.J. Murray, P.J. Perry, B. Alexander, “Performance in a clinical pharmacy clerkship: computer-aided instruction versus traditional lectures”, American Journal of Pharmaceutical Education 56, 1992, pp. 259–63.
A. Cochrane & J. Clarke, Comparing welfare states: Britain in international context (London: Sage Publications, 1993).
V. George & P. Wilding, Ideology and social welfare (London: Routledge & Kegan Paul, 1985), pp. 19–119.
L. Gilroy, “Problem-based distance learning, the role of the pharmacist. Part 1”, Hospital Pharmacy Practice 2(12), 1992, pp. 743–4, p. 753.
P. Lee & C. Raban, “Welfare and ideology”, in Social policy and social welfare, M. Loney, D. Boswell & J. Clarke (eds) (Milton Keynes: Open University Press, 1983), pp. 18–32.
D. McDonough, J. Strivens, R. Rada, “University courseware development: comparative views of computer-based teaching by users and non-users”, Computers & Education 23, 1994, pp. 211–20.
J. Preece, Y. Rogers, H. Sharp, D. Benyon, S. Holland, T. Carey, Human-computer interaction (Wokingham: Addison-Wesley, 1994).
J. Pugh, C. Moss-Barclay, R. Sharratt, N. Boreham, “The effectiveness of an interactive computerised education program”, Pharmaceutical Journal 251, 1993, pp. E1–3.
Royal Pharmaceutical Society, “Working party recommends social sciences teaching to undergraduates”, Pharmaceutical Journal 243, 1989, p. 228.
R. Stevens & R. Sewell, “The replacement of pharmacology practicals by multimedia computer technology”, Pharmaceutical Journal 251, 1993, pp. E11–5.
Teachers of Pharmacy Administration of the American Association of Colleges of Pharmacy, Commissioned report: a history of the discipline of pharmacy administration (Washington: American Association of Colleges of Pharmacy, 1985).
The Nuffield Foundation, Pharmacy. The report of a committee of inquiry (London: The Nuffield Foundation, 1986).
Chapter 8 DESIGNNET: TRANSNATIONAL DESIGN PROJECT WORK AT A DISTANCE Stephen Scrivener and Susan Vernon
This paper does not deal specifically with the use of computing in the social sciences; rather, the subject considered is design. We believe, however, that the way of working described has general application in the social sciences.

In design, computers are often seen as offering new forms of media, image-making and information resource, for example virtual reality, three-dimensional modelling, painting systems and databases. Working with computer-based media is different to working with pen and paper, paint, models or the like, and design practice is bound to change as practitioners learn to deal with both its limitations and possibilities. This is understood to the extent that most design courses now include modules dealing with IT, computer-aided design, computer-based image-making and design databases. Important as these uses of the computer are, other equally important applications of computer-based technology should be considered by both designers and educators. For example, computers can provide an infrastructure for mediating collaborative design. When computers are used in this way the final artefacts, even their visualization and representation during the design process, may be largely non-digital and produced using conventional media and tools.

Computer systems that support team communication and collaboration are usually called Computer-Supported Co-operative Work (CSCW), or Groupware, systems (see Scrivener & Clark 1994a for a review of CSCW systems). This application of computer-based technology is likely to have as great an impact on design practice as digital media, modelling and database tools, and yet at present there are few instances where this technology is used in practice or in the curriculum. However, a future can be envisaged in which designers work as part of international teams supported by computer- and electronically mediated communication and CSCW tools. It will be important to prepare designers and students to work in this way.
Indeed, we hope to demonstrate not only that this technology is something that students should understand and know how to use, but also that it makes it possible for students to work together as part of multinational and multidisciplinary teams; educators can use the technology to bring such student teams together. Very importantly, the students do not have to be brought together in a given country: it is the technology that brings them together.

This chapter describes the DesignNet project, which aimed to explore further the possibilities of CSCW usage, of which earlier projects had led us to postulate the following:

1. Users will be committed and motivated to complete a task if it is perceived to be purposeful and valuable.
2. Users’ motivation to complete a task will drive them to exploit the resources at their disposal, even if this involves radical changes in work and communication methods.
3. Users will choose what they perceive to be the most efficient and effective means at their disposal in order to complete a task.
4. Users will choose from the resources at their disposal those they perceive to be necessary and sufficient for the task in hand.

(For a fuller account of the collected knowledge from an earlier project (ROCOCO) on design at a distance, refer to Scrivener & Clark 1994b.) Computer- and electronically mediated communication was used in the DesignNet project to enable multidisciplinary, transnational student groups separated by distance to work together on a shared design project in order to produce an agreed outcome.

DesignNet

The DesignNet project was partly funded by the Commission of the European Community’s Task Force for Human Resources, Education, Training and Youth, as part of the preparatory phase of the Arts Education and Training Initiative. This initiative will seek

To enhance transnational co-operation between education and training establishments in the European Union Member States in the field of the Arts, to increase mobility of students and teaching staff in this field, to promote the use of innovative techniques through measures to enhance dissemination of information and good practice, to encourage international masterclasses and the production of special modules and courses which add a European dimension to education and training in the arts, and in general to support activities which, through the medium of European co-operation, enhance the quality of education and training in the arts throughout the European Union. (Arts Education and Training Initiative 1994)

The project

The project brief was the result of a collective decision taken by staff at an initial planning workshop. It was designed to focus on life-style, cultural issues and the interaction of design influences and objects, to encourage an exchange of ideas from different countries. Key words were used in the brief to help overcome language difficulties and misinterpretation.
The project provided a unique opportunity to compare different modes of group working and uses of various media technologies; it also highlighted the importance of cultural factors and the positive interaction created between peoples of different backgrounds. The project culminated in an exhibition, presentation and feedback workshop attended by all staff and students. It provided an opportunity to see, document and evaluate all of the work, containing elements that reflected both the combination of different skills and different cultural viewpoints.

Primary objectives

The aims of the project, as set out in our proposal, were:

1. To share the experience of and evaluate the earlier ROCOCO electronic link project.
2. To explore the interaction and transition between two-dimensional representation and three-dimensional construction.
3. To explore computer- and electronically mediated distance communication and how design is communicated across language barriers through visual discourse.
4. To demonstrate how electronic links can support communications between collaborating transnational universities and enhance the student learning experience.

The partners and disciplines

DesignNet involved four institutions: Applied Arts, School of Art and Design, University of Derby, Derby, UK; the Ceramics Institute, Bergen College of Art, Bergen, Norway; the Faculty of Industrial Design, TU Delft, Delft, The Netherlands; and The Centre of Art and Design, Escola Massana, Barcelona, Spain. Ceramics, industrial design, jewellery and graphic design staff and students took part in the project. Six teams were formed linking Derby with Derby, Delft, Barcelona and Bergen, and Bergen with Barcelona. In all but one case, Derby-Derby, the links were transnational. In this latter case, the students were located at different sites in the city and were unknown to each other.

The media and technology

Some teams had more resources to choose from than others. Fax and telephone were the primary communication resource in four of the six links. The other two links, Derby-Derby and Derby-Delft, were computer-supported and included Talk, a computer application that allows users present at the same time at each end of a link to “talk” by means of typed messages, and Aspects, which allowed users at both ends of a link to simultaneously write and draw on a common worksurface. The video connection was achieved using CU-SeeMe, which allows multiple users to share the video. Eudora was available for electronic mail and file transfer, and additionally files could be transferred by FTP. The conclusions presented in this chapter are based on observations during the project, and on student and staff feedback at the end of the project.

Electronically and computer-mediated communication

Art and Design is generally taught as an individualistic activity: group work is not the normal mode of working, and neither is distance communication.
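A Talk-style link exchanges typed messages between two users present at the same time. As an illustration only (this is not the software used in the project), a minimal typed-message link can be sketched with a local socket pair standing in for the two ends of a connection:

```python
import socket

def send_line(sock, text):
    """Send one typed line, newline-terminated."""
    sock.sendall(text.encode("utf-8") + b"\n")

def recv_line(sock):
    """Receive one line from the other end of the link."""
    chunks = []
    while True:
        b = sock.recv(1)
        if not b or b == b"\n":
            break
        chunks.append(b)
    return b"".join(chunks).decode("utf-8")

# A local stand-in for the two ends of a link (e.g. Derby and Delft).
derby, delft = socket.socketpair()
send_line(derby, "Shall we agree the agenda first?")
reply = recv_line(delft)
```

Over a real network the two sockets would belong to different machines, but the synchronous typed exchange, each side seeing the other's lines as they arrive, is the essence of the Talk mode of working.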
Students and staff are used to working in a face-to-face situation for individual tutorials and seminars; communicating ideas visually and verbally is an inevitable part of the designing process. DesignNet posed quite different working methodologies for staff and students, as face-to-face visual and verbal discussions in the real sense were not possible and alternative technological methods were adapted to enable design communication to take place. Misunderstandings did occur, particularly in the groups where only fax was used and students did not share a common language. Students were required to be more specific about conveying their thoughts and designs than would be necessary in a face-to-face situation, as information could not be gathered from the subtleties of body language and expression.

All teams were satisfied with the media available to them. This evidence supports Postulate 2 above, which predicts that users will adapt their behaviour to accommodate technological impoverishment in order to complete the task in hand. For example, the students who had only fax and telephone did not express more or less dissatisfaction with the technology than those who also had video and e-mail. Some students commented that while installing the work for the final workshop they came to realize that some misunderstandings had occurred as a result of the restricted forms of communication, but they also agreed that they did not, at the time, associate these problems with the communication media. In other words, in all cases, the students were able to exploit the media in ways they perceived satisfactory. Put another way,
users will accept restricted means of communication if they can find ways of completing a task to their satisfaction. What is remarkable is users’ robustness; they seem to be able to accommodate very severe restrictions. Adaptation of working method was clear: the teams developed protocols and practices geared to their conditions of working. For example, one group described how they prepared for a synchronous meeting (i.e. of fax exchanges), how they made a telephone connection to agree the agenda for the meeting, and how at the end of the meeting they would summarize the agreements reached. It appears that each group devised a different strategy for co-ordinating their activities. The more successful groups maintained regular contact at fixed times: weekly meetings, whether by telephone or CSCW, were planned in advance, and correspondence ensued asynchronously in between real-time contact.

In all groups students initially needed to “visualize” their partners (except in the Derby-Derby link), by faxing images of themselves or, in the Derby-Delft link, by initially spending more time scrutinizing the video link through CU-SeeMe. Initial fax messages exchanged were self-conscious regarding the quality of drawn and written information being transmitted (e.g. spelling, legibility, finished drawings); this soon loosened up as more spontaneous communication took place, especially when time was an issue. Nearly all groups said later that they would have liked more time, but in many ways the time constraint kept the momentum of the project going.

It is difficult to state categorically that the addition of computer-mediated communication resources led to qualitative enhancements in either process or outcome as compared to those obtained using only electronically mediated communication. However, the Derby-Delft students displayed a strong sense of team commitment and identity. Of all the projects, this was probably the most integrated and unified.
Had one not known otherwise, one would have thought that the work was produced by a single hand. On the evidence of the exhibition and presentations, it is reasonable to hypothesize that computer-mediated communication of the kind used in the DesignNet project offers positive advantages over electronically mediated communication such as fax and phone. Observation would suggest that the Derby-Delft group communicated more than the other groups. For them, video was very important as, although of very low bandwidth and hence poor visual quality, they were able to gauge gesture and facial expression. Interestingly, they did not use the telephone very often, although one was available. First indications are that this reflected a strategy rather than a preference: it seems that they decided to do as much as they could using the computer. Neither did they use Aspects, but this is hardly surprising as the network latency of the Internet makes drawing difficult.

To summarize so far: design at a distance mediated by electronic and computer technology is perfectly feasible. Students are able to adapt both communication and working strategies to accommodate technological impoverishments. Initial evaluation would suggest that computer-mediated resources, such as video, synchronous text exchange, e-mail, file transfer and shared worksurfaces, offer positive benefits in comparison to fax and telephone.

The method of working

The brief permitted two ways of working. The first required a team to work together to develop what was essentially an agreed installation; the second required the designers to agree on an object to be produced by each individual. Three teams chose to work in one way and three in the other. The different methods of working were clearly represented in the outcomes: the agreed-installation method led to single, unified pieces; the agreed-object approach to mini-exhibitions of individual pieces, the obvious connection being a common starting point.
Perhaps not surprisingly, the teams adopting
58
DESIGNNET: TRANSNATIONAL DESIGN PROJECT WORK AT A DISTANCE
the agreed-installation method (apart from one group, to which we will come later) appeared to have a stronger sense of team identity than those choosing the agreed-object method. When they presented their work they did so as a team, each member speaking for the team and seeking agreement from other team members about statements made on their behalf. Teams adopting the agreed-object method tended to report as two national teams. Interestingly, the agreed-installation teams appeared to be more positive about the whole experience overall. Task meaningfulness Perhaps the reason for these apparent differences in the value attached to the experience by different groups is related to the purposefulness of the task. If you are being asked to work as part of a team, the production of an agreed installation perhaps seems more purposeful and more valuable than being asked to work on an agreed object. Perceived purposefulness and value (Postulate 1 above, “Users will be committed and motivated to complete a task if it is perceived to be purposeful and valuable”) perhaps explain why the Derby-Derby link, an agreed-installation project, worked less successfully from the teamworking point of view than the other two agreed-installation projects. Obviously, things could have been arranged such that the students in Derby worked together, for all or part of the project. Both options were essentially prohibited, as students were asked not to work face-to-face. Postulate 2 predicts that users will be less tolerant of communication impoverishments when they lack motivation, with Postulate 1 suggesting that the perceived purposefulness and value of the task is a strong motivational factor. Multi-disciplinary and transnational team working Generally, students found the experience of working in multidisciplinary and transnational teams rewarding. Where problems arose between teams this seemed to have little to do with the communication media or language.
Indeed, students seemed highly tolerant of problems of this kind. Personality seemed to be an important factor in determining successful group dynamics. One student found it difficult to work with other people, and this caused problems for the group as a whole. Students who made valuable contributions to the group were well motivated and well organized, had a flexible attitude and an ability to value others’ opinions. An enthusiastic approach was also important as well as an enjoyment of working with others. Decision-making was a shared activity with all group members participating. In one instance there was almost telepathic communication: as one design was faxed an almost identical one was received. Personality was directly related to learning activity, as those students who enjoyed working in groups and were more enthusiastic about the process gained more from the experience and felt a greater sense of satisfaction with what was achieved. This also directly related to the quality of the exhibited work: those groups who gained more from the whole process of communication also produced the most innovative and well co-ordinated work. Generally, students were able to give and take, even when they wanted to retain a strong individualistic element in the work. Interestingly, when questioned about the quality of the work produced by groups with good team dynamics, the highly independent student admitted to being very impressed. One feature that did emerge in support of multi-disciplinary teams was that team members found it easy to accommodate another team’s contributions when they concerned experience and skills not possessed by the others. Finally,
although individuals agreed that compromises had to be made they regarded this as generally enhancing rather than diminishing the outcome. The work It is difficult to say whether transnational multi-disciplinary teamwork leads to better outcomes. What we can say with reasonable confidence is that the work produced for DesignNet contained elements that reflected both the combination of different skills and different cultural viewpoints. It is tempting to believe that these conjunctions and unifications enhanced the quality of the work; they certainly produced interesting and novel results. Conclusions and future work We have suggested that design at a distance involving multidisciplinary, transnational teams is likely to be an increasingly common feature of design practice. It may also offer some salutary lessons to other disciplines such as the social sciences. Electronically and computer-mediated communication and collaborative work technology may enhance design practice; this remains to be determined. However, it cannot be disputed that this technology enables design at a distance to be achieved. It has to be recognized that collaboration using this technology is impoverished in terms of media, communication, and work pattern possibilities as compared to working in the same place at the same time. On the other hand, design at a distance offers potential benefits that may counterbalance or override these impoverishments. The primary aim of our previous studies of design at a distance and the DesignNet project was to investigate the problems and potential of this way of working, with a view to establishing ways of minimizing the problems and maximizing potential. The DesignNet project extended our earlier work in a number of ways, the most important being the differences in culture, language, and discipline of the collaborating individuals. Consequently, the overall aims of the DesignNet project were successfully realized. 
Both staff and students found it a purposeful, meaningful, valuable, interesting and enjoyable work and social experience. The students gained experience of unfamiliar technology, communication and work methods, and of multidisciplinary, transnational group-working. Feedback indicates that students were able to overcome the problems of technology, language, culture, discipline and group-working, and to draw positive lessons and insights from the experience. In the first place, student resistance to new technology may be reduced, since a design-at-a-distance project cannot easily be completed without it. Thus students can gain experience of new technology in a non-threatening and meaningful context. Having used new technology in this context, they may be motivated to explore its potential in their day-to-day activities. Secondly, computer-supported communication and work may provide a means of maintaining regular interactions with linked institutions, especially if technology is available to support routine project work forming part of the curriculum. Staff also gained insights into the benefits and limitations of this kind of working. Routine design-at-a-distance projects, of the DesignNet kind, would enable an ongoing and constant level of interaction to persist as the foundation of other, perhaps more changeable, forms of interaction. Furthermore, such projects may provide a sensible precursor to staff and student exchange, as they allow staff and students in each collaborating institution to gain some prior knowledge and practical experience of the people, place, values and working and learning methods of their partners. We might expect that this would assist the assimilation of exchange staff and students into the host organization.
Since completing this project an Art and Design communications module using Internet connections world-wide has been validated; this will help consolidate institutional links by integrating them into the curriculum. We have also recently completed DesignNet 2, a pilot scheme with student teams linking institutions in Finland, Sweden, USA, Canada and Colombia, which used only the Internet. Acknowledgements Thanks go to all those who contributed to the success of the DesignNet project: Raghu Kolli, Richard Launder, Joan Sunyol, Lindon Ball, Nigel Billson, Paula Bourges, Sean Clark, Gail Ferriman, Tim Willey, Quinten Drakes, Joan Ainley, Gemma Carcaterra, Joost de Keijzer, Marjolein Rains, Irene Osborne, Amanda Simes, Elisabeth Fornas Dos-Santos, Cora Egger, Paul Rodriquez, Sarah Matthews, Nicola Williams, Anneli Belsvik, Heidi Bourgan, Carol Cooling, Roger Davies, Nacho Garcia Del Rio, Franscisco Juan Tent Petrus, Robin Reeves, Lynn Butler, Raakesh Nath, Howard Dean, Elin Andreasson, Anna Maria Jacobsdottir, Lluis Serra, Anna Aibar, Wendy Proctor; to all those who contributed behind the scenes; and to the Arts Education and Training Initiative that partly funded the project. References Arts Education and Training Initiative, Preparation Phase: Support for Demonstration Project, Commission of the European Communities, Task Force Human Resources, Education and Training, 22 January 1994, p. 1. S.A.R. Scrivener & S.M. Clark, “Introducing computer-supported co-operative work”, in Computer-supported cooperative work, S.A.R. Scrivener (ed.) (Alperton: Ashgate Publishing, 1994a), pp. 51–66. S.A.R. Scrivener & S.M. Clark, “Experiences in computer-mediated communication”, Information Systems Architecture and Technology Workshop, Szklarska Poreba, Poland, September 1994b.
Section Three IMPLEMENTING COMPUTER-ASSISTED LEARNING IN THE SOCIAL SCIENCES
Chapter 9 COMPUTER-AIDED LEARNING AS A TOOL: LESSONS FROM EDUCATIONAL THEORY Graham R. Gibbs and David Robinson
This paper argues against a dominant design philosophy in CAL: the attempt to replace the teacher with technology. This is an example of the wider societal process of deskilling, and recent research in CAL and allied areas suggests that it may be counterproductive if improved teaching and learning is the goal. In developing CAL packages for use in higher education teaching we need to identify what teachers are good at: giving lectures, explaining complex ideas, communicating with large numbers of people, having in-depth knowledge of specific areas. CAL should be used to enhance these skills by providing flexible learning tools rather than seeking to replace or deskill them. Tools which enhance the pedagogic skills of teachers include software gadgets, simulations, databases/hypertext systems, knowledge tools and communications systems. Advantages of the tools approach include recognizing teachers as experts, minimizing the effect of the “not invented here” syndrome, allowing for the different learning styles of students and, in many cases, encouraging learning by exploration and by doing. Examples of learning tools are examined, and the paper concludes with the suggestion that the development and use of learning tools will avoid the pitfalls associated with replacing the expertise of teachers. Introduction In recent years there have been several initiatives in the UK, such as the Computers in Teaching Initiative Centres, Phases 1 and 2 (CTI 1996), the TLTP 1 and 2 (TLTP 1996), and the ITTI (ITTI 1996), each aimed at promoting the use of CAL packages in academic institutions. This paper looks at issues associated with these developments and in particular addresses the key question: “what kind of CAL software is best supportive of teaching in higher education?” There are many factors which determine whether CAL software will be widely used by lecturers in higher education.
Software is more likely to be used where institutions give lecturers support to develop and modify software, where there are sufficient computers, and where students have adequate access. Such factors, however, are generally beyond the control of CAL software developers. On the other hand, there are factors which they can control. These include making software relevant to teaching needs and making it easy to integrate into teaching programmes. A key aspect of this is an understanding of how students learn. Indeed, in recent years there has been much investigation of how to produce CAL that is suitable for and supportive of the different ways students learn: for example, Kwok & Jones (1995), Patterson & Rosbottom (1995) and Groat & Musson (1995). However, a focus on learning is only half the equation. It is important for the uptake of CAL not to forget the role of the teacher. We need to develop an understanding of what good teaching practice is and how CAL software can best support it. Sadly, much recent CAL software seems intent on replacing the teacher rather than asking
how the teacher can be supported. Even Groat & Musson, who do at least consider the relationship between teaching approaches and learning styles, are still mainly concerned with how CAL can be “tailored” to provide the kind of teaching which matches students’ learning styles. Their model is still fundamentally one of teacher substitution. The developers of CAL, along with many other software developers, have commonly claimed that a major advantage of their software is efficiency gains. Indeed, in some cases an important criterion for the funding of CAL development was the identification of improvements in teaching efficiency. This was true of the TLTP programme, where potential projects had to identify the number of tutor hours their proposals saved. One unfortunate consequence of this focus has been that much CAL has been designed to replace lectures, tutorials and seminars rather than to support or enhance teachers’ existing skills.1 Not only does this mean that some of the most satisfactory, and often the most efficient, parts of teaching are replaced, but it also suggests that CAL designers are using a very limited idea of what teaching expertise is. In this, CAL is repeating the salutary experience of software developers in other fields, such as computer-based decision support in the medical domain (Young, Chapman, Poile 1990). To date there has been little success in producing medical systems which are widely used in routine clinical practice. In part this is because the systems on offer are not sufficiently useful to justify the effort required to use them (Rector et al. 1992) but, as Heathfield & Wyatt (1993) argue, it is also because they are generally replacement systems. In discussing the development of knowledge-based systems, Rector (1989) makes a broad distinction between those systems which aim to augment skilled performance and those which aim to substitute for some skill or knowledge.
Substitution systems assume that their users are ignorant, or at most novices, in the field. Augmentation systems assume that their users are “broad experts” who are skilled in the field and exercise ultimate judgement, although they may make slips or lack particular items of knowledge. Augmentation systems are used, whereas substitution systems are consulted (Rector 1989). He argues that the goal of augmentation systems is to “become part of the regular tools of the trade”. In contrast, substitution systems tend to deskill the experts they seek to replace, and for Rector this is a mistaken strategy. Not only is it probably an impossible goal to replace the expertise of professionals in this way, but it is based on a false view of the nature of professional expertise, one in which knowledge-based systems are simply a question of capturing and reproducing a large body of internalized knowledge. Recent research on teaching The current state of knowledge about the nature of teaching expertise indicates that it is not a simple matter of defining a set of knowledge to be acquired and then producing optimal conditions for the transfer of that knowledge to the student. The work of authors like Marton & Säljö (1984), Ramsden (1992), Entwistle (1987) and Laurillard (1993) suggests a general consensus that good teaching practice brings together two things. First is an understanding on the part of the teacher of the scientific and intellectual

1. Of course this does not necessarily follow from the need to show efficiency gains. Gains could also arise from a reduced need for remedial work, lower failure rates, faster learning and so on. But it is perhaps indicative of the poverty of much understanding of the teaching process that efficiency is nearly always seen in terms of reducing student contact and/or increasing staff–student ratios.
concepts of the subject matter—what is to be learned. Second is an awareness of the learner’s conception of that same subject matter, i.e. what their understanding of the “thing” is and the nature of their mistakes. In this essentially phenomenographic or constructivist approach, the skill of teaching in higher education consists of progressively modifying the learner’s conception to bring it closer to the teacher’s while dealing with a large number of students. Laurillard suggests there are four prescriptive implications of this approach:
• there must be a continuing dialogue between teacher and student [not a monologue]
• the dialogue must reveal both participants’ conceptions
• the teacher must analyse the relationship between the student’s and the target conception to determine the focus for the continuation of the dialogue
• the dialogue must be conducted so that it addresses all aspects of the learning process (Laurillard 1993, p. 85).
There are several consequences of these with relevance to CAL development. First, replacing the teacher by software is likely to be detrimental to the dialogue between teacher and learner—it may even eliminate it. Nevertheless, attempts have been made to produce software that substitutes for the teacher’s diagnostic role. In particular some developers have produced Intelligent Tutoring Systems (Anderson 1994; Anderson, Boyle, Yost 1986). These contain within them a cognitive model of the learner and give feedback and modify the learner’s path through the material depending on the kind of misconceptions suggested by a comparison between the learner’s answers and the model. However, such systems seem limited at the moment to knowledge domains which are logically well defined and uncontested. Examples include learning computer programming languages and geometry. Unfortunately, in most areas of knowledge in higher education, and in the social sciences in particular, concepts and ideas are commonly challenged and debated.
There is as yet no CAL system which can diagnose students’ misconceptions in such “ill-structured knowledge domains” (Spiro et al. 1991). Indeed, Perry suggests that recognizing that a discipline consists not of simple, agreed facts but of contested ideas about which one must take a stand is indicative of advanced learning (Perry 1970). Moreover, recent research with the Geometry Tutor (Schofield et al. 1994) suggests that, in fact, rather than replacing the teachers the program became an additional, enriching resource. Second, there have to be occasions on which the student’s (mis)conceptions are made apparent so they are amenable to correction by the teacher. This is not just a matter of detecting wrong answers, something which intelligent tutoring systems can do. Learners may be doing the right things but for the wrong reasons. For example, we found, while examining students’ use of CAL without intervention from the teacher, that some students while apparently doing sensible things with the program actually held quite mistaken views about the underlying logic of the subject matter (Gibbs & Robinson, 1995). Third, a key problem is getting the student to modify their conceptions (or develop a coherent one). One difficulty here is that pointed to by Laurillard (1993), namely the distinction between percept and precept. Knowledge of dogs, for example, is a percept which comes about through ordinary, direct and common experience. In that sense it is easy to acquire. On the other hand, and more common in higher education, is knowledge of a more abstract nature such as knowledge of molecules. Laurillard calls this knowledge of precepts. These are learned only indirectly through analogy or metaphor since we have no direct contact with things such as molecules. Understanding here is thus more difficult to acquire; indeed it is fraught with difficulties arising from both the lack of concrete metaphors and the need to keep analogy within bounds. 
There is clearly a role here for CAL in providing fertile models and analogies that can make precepts more concrete. The advantages of computers—they can do calculations quickly, they can change with time and they can interact with the learner—suggest a rich seam of applications. At the same time, these are only
models or analogies. There remains a role for the teacher in ensuring that the student does not generalize erroneously beyond the limits of the metaphor. Entwistle & Entwistle (1991), following the principles of cognitive psychology (e.g. Anderson 1990; McKeachie et al. 1990), suggest four functions which teaching has to meet. These put Laurillard’s prescriptions into a more concrete form and elaborate the relationship between the use of CAL and teaching skills. The four functions are presentation, remediation, consolidation and elaboration. Each is associated with particular teaching methods and each suggests different roles for CAL. Presentation This is most often done via lectures. Lecturers present new knowledge which should be related to learners’ prior understanding and knowledge. It should have a clear, logical structure to help the students establish a personal organizing framework. Entwistle (1992) suggests that good lecturing has a deliberate focus and promotes self-confidence in learners and knowledge acquisition by them. It should also strike a balance between serialist and holist thinking. This distinction is made by Pask (1988), who suggests that some students prefer a holist approach in their learning—taking a broad overview, using good illustrations and examples and analogies but consequently giving insufficient attention to detail. They gain more from holist approaches in lectures. Others tend to be serialists. They prefer a narrow focus, looking at detail and logical connections in building up to an understanding, but they tend to miss important analogies and connections between ideas. Pask presented evidence which suggests that a mismatch between students’ preferred approach and that of lecturers has a detrimental effect on learning. However, the best learning occurred when lecturers used a versatile approach with holist and serialist elements, encouraging both holist and serialist thinking in their students. 
On another, but related, dimension, Ramsden (1992) suggests it is important to ensure that students undertake deep learning as well as surface learning. In a parallel to versatile approaches to lecturing, he suggests that students need to be encouraged to adopt flexible approaches to deep and surface learning. The main danger here is that the form in which knowledge is presented and especially how it is assessed may encourage learners to adopt mainly surface strategies. This enables them to perform well in the immediate tests, but hinders the development of a deeper and more conceptual understanding. There is clearly a role in presentation for CAL, especially of the hypertext or hypermedia kind, along with lectures and videos. But the dangers here are significant. Unlike a lecture or a video, there is much less control over how the student uses, or navigates around, a hypermedia system. That, indeed, is often thought a major advantage of such systems. However, the dangers are those identified above. Students tend to use the system in a way that simply reflects their prior preferences, holist or serialist, and may fail to develop a versatile approach that produces the best learning. They tend to skim the surface of the content of the hypermedia system and gain little depth understanding. Their use of the system may lack focus and, in the absence of guidance, they may just “wander around”. Browsing is not learning. Moreover, without some teaching guidance the information and concepts the system presents them with may be at the wrong level for their stage of understanding—too hard or too simple. None of this rules out the use of CAL in presentation, but it does mean its use is likely to need the continuing intervention of the teaching expert, to give guidance and modify the student’s explorations. Admittedly, some of these considerations could be built into the software. 
For example, the package could ensure that the level of material presented is appropriate for the student by using a built-in pre-test. However, we should be careful of trying too hard to match the content and approach of CAL systems with students’ preferred learning styles. There have been
several attempts at this (Clarke 1993; Groat & Musson 1995). Done automatically, there is a danger that it will lead merely to the reinforcement of existing habits and the underdevelopment of versatile approaches. Remediation Remediation is where students “fill in the gaps” and try to catch up with the concepts and knowledge presented in lectures. Entwistle (1992) suggests this is most often done by students themselves through reading textbooks. Of course this could now be achieved by their use of CAL systems. Tutorials can also play a role in helping learners to assimilate new concepts, where their background understanding is insufficient or they are having difficulties grasping new ideas. However, both educational research and the common experience of lecturers suggest that tutorials are notoriously varied in their adequacy. Attendance is often poor and, as a result of student reluctance to engage or their insufficient preparation, tutorials are often far too didactic. Research suggests tutorials work best if their focus is on meta-cognitive strategies. For instance, Baumgart (1976) suggests the best role for teachers in seminars or tutorials is “the reflexive judge” or “probing”. A device that is being increasingly used in small group discussions is work based on case studies and simulations. This is a good way of promoting problem-solving approaches. But, as Entwistle (1992) cautions, good and careful debriefing after the case study is needed so that students can see the general relevance of the material they have examined. Thus both as a replacement for textbooks and as supportive material for case studies there is a role for CAL. Indeed, there is a current TLTP project in Politics which is aimed at developing computer-supported case study materials. But again, as in the case of presentation, without the intervention of the skilled teacher students may gain only limited benefit from the materials. 
There is a need for careful debriefing to identify and correct learners’ misconceptions, and again, unless teachers counteract it, learners may just skim the surface, not properly getting to grips with underlying concepts. In the case of both the presentation and remediation uses of CAL there is a further limiting factor. Teachers and learners want to pick and choose the materials they use. This reflects varying styles and approaches, but it is also because teachers need to ensure that materials appropriate to the students’ pre-existing conceptual sophistication are used. Printed materials are varied and widely available, and fit this need well. Parallel materials in CAL packages, and in hypermedia systems particularly, are much less available, sometimes expensive and much less varied. Also, and this is crucial, whereas it is easy for both lecturers and learners to gauge the level and content of printed materials (readers have already acquired the skills for doing this and the content is relatively open to access), this is much more difficult with hypermedia systems. The open-structuredness of hypermedia, often seen as a major advantage, may in fact militate against their use. It is difficult to find out quickly whether such programs have contents at the required conceptual level; even if they do, there is still a need for the skilled teacher to modify their presentation to meet pedagogic needs. Consolidation Consolidation is the use of the newly acquired knowledge, concepts and skills in wider contexts. Students’ tentative and initially fragile understanding is reinforced in a variety of settings and ways, such as in laboratory work, fieldwork, essays, projects, examinations and other assignments. Again, case studies and simulations are especially good for this. But, as Entwistle (1992) points out, both project work and case studies need good quality teaching resources and learners need to adapt to a different self-discipline.
Learners may be left more to their own devices than they are used to. They need to develop new coping
strategies in order that they do not procrastinate over getting work done. Entwistle suggests a mix of approaches is best and, above all, good debriefing is needed. There are several ways in which CAL can help here. As mentioned above, case studies can be presented through the use of computers. In particular, case study and simulation may be brought together in microworlds (Hartog 1989; Isaacs 1990). Second, the process of thinking about what has been learned, of processing and structuring knowledge, can be supported by other kinds of program, collectively known as mindtools. Software such as Inspiration and Skate (discussed below) can support concept mapping, which encourages the student to investigate the logical structuring of what they know, and to investigate both the links between what they know and the gaps in their understanding. Third, assessment can be computer-assisted. While at the moment this is limited to highly structured and deterministic assessments, the approach does have the great advantages that students get almost instantaneous feedback and most of the assessment requires no teacher involvement. Again, there are dangers. In addition to the need for debriefing mentioned above, in many cases what is presented in software could just as easily, and probably more cheaply, be presented in printed or paper form. Indeed, the use of mindtools as a form of consolidation often requires generous resources of printed materials which students read in order to build their concept maps. In the case of computer-assisted assessment, the biggest danger is the tendency to focus development efforts on what is assessable in that way rather than on what it is important to assess. Like the drunken man at night looking for his keys under the street light because, although he didn’t drop them there, that’s the only place he might see them, the temptation is to assess what is easily assessed, even if it is educationally trivial.
In particular, depth learning, which is educationally most significant, may never be assessed this way because of the technical problems of writing the software. Even in those domains where assessment can be done successfully by computer, there remains an important role for the skilled teacher in maintaining a balanced diet of assessment and feedback. Elaboration The final stage of learning identified by Entwistle (1992) is elaboration. This refers to the learning where students acquire additional meanings, examples of evidence and a deeper and broader understanding. It is widely accepted within cognitive psychology that greater elaborative processing of information results in better understanding and recall of the material. Whether the elaborations are generated internally by individuals or are generated externally, for example by CAL software or a lecturer, is not necessarily important. It is the precision with which the elaboration relates to the concepts and material to be learned and understood that is the issue. If Anderson (1990) is correct in his assertion that question-making contributes most to elaborative processing then as teachers we should aim to construct situations that facilitate this process. Traditionally this has been done through the use of lectures, tutorials, textbooks, laboratory work and fieldwork, but clearly there is a role for CAL here in examples such as mindtools, microworlds and other case studies. However, the potential dangers and pitfalls faced by CAL software outlined in the discussion of consolidation also apply here. In particular there is a need for proper debriefing so that students recognize what they have learned and for appropriate monitoring of their elaborations to ensure that misconceptions are identified and corrected.
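The kind of highly structured, deterministic computer-assisted assessment discussed above can be sketched in miniature. The following is an invented illustration, not drawn from any of the systems cited in this chapter: the question, the answer options and the feedback messages are all hypothetical. It shows both the appeal (instantaneous feedback, with recognized slips receiving targeted remediation) and the limit (unanticipated errors can only be referred back to the teacher).

```python
# Minimal sketch of computer-assisted assessment with instant feedback.
# Everything here (question, answers, messages) is invented for illustration.

QUESTION = "What is the mean of the values 2, 4 and 9?"

# Feedback keyed to anticipated wrong answers, so that a recognized
# misconception gets targeted remediation rather than a bare "wrong".
FEEDBACK = {
    5.0: "Correct: (2 + 4 + 9) / 3 = 5.",
    4.0: "That is the median; the mean requires summing and dividing.",
    15.0: "You summed the values but forgot to divide by n = 3.",
}

def mark(answer: float) -> str:
    """Return instant feedback; unanticipated errors are referred on."""
    return FEEDBACK.get(
        answer,
        "Incorrect, and not a recognized slip: discuss this with your tutor.",
    )

print(QUESTION)
print(mark(15.0))
```

Note that the final, catch-all branch is doing real work: it marks exactly the point at which the software's model of the learner runs out and the skilled teacher must take over.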
COMPUTER-AIDED LEARNING AS A LEARNING TOOL
Software tools for learning

What kind of picture of appropriate CAL software emerges from the preceding discussion? We suggest four attributes: CAL software should complement the expertise of teachers; it should be flexible and adaptable; it should promote deeper learning; and it should be usable only when integrated into a wider teaching context (i.e. it should not stand alone). Such software has been termed software tools or learning aids (Chute 1995). In not substituting for the knowledge of the teacher, such software may have little apparent informational content. The function of the tool is not to get the student to learn by acquiring surface knowledge from manifest information in the software, but to get the student to master new ideas by manipulating the program in ways made possible by the latent concepts on which its design is based. A parallel can be drawn with what is needed in order to repair a car: a manual and some tools. The manual alone is useless, as are the tools without some knowledge of how the car works. But the tools can be used on a variety of cars, in a variety of ways, and they can be used alone once the mechanic is experienced enough not to need the manual. We believe CAL software should more often provide the tools for learning than the manual. The provision of factual knowledge is done very well at the moment by books, videos and, to some extent, CD-ROM. However, as we have argued above, presentation of knowledge is merely one stage in learning. Students need remediation, and they need to consolidate and elaborate. For this they need learning tools as well as the knowledge base. Software tools for learning are relatively free of content; they rely on content being provided in other ways. But they support the learner in focusing on remediation, consolidation and elaboration, and they do so in a way that does not eliminate the lecturer but in fact requires the intervention of the skilled teacher to manage the student's learning.
Software tools for learning, therefore, do not substitute for teaching skills; rather they support and augment teachers' essential skill: helping learners to manage their development. Moreover, they do so in a way that is flexible and adaptable. Because software tools have little in the way of content, they are easier to adapt to specific teaching needs and for that reason are likely to be usable in a wide variety of teaching contexts by a large number of different teachers. Examples of CAL software tools include software gadgets (such as experiment generators), simulations (such as models of political or ecological systems), databases/hypertext systems and knowledge tools (all of which require learners to structure and process the knowledge they are gaining), and communications systems (such as computer-assisted co-operative learning and e-mail, which promote collaboration, peer tutoring and asynchronous co-operation). These general types can be illustrated by three examples.

Correlation Explorer

This is a simple model of the statistical concept of correlation (see Figure 9.1). It is based on the scattergram. Students can modify the data set by changing the number of points and by moving them around the scattergram. As they do so they can see the impact on various statistics and on the regression line. There is almost no text in the program. Instruction about correlation takes place outside the program; its role is to provide a relatively concrete metaphor for correlation, so that in manipulating the scattergram students can consolidate and elaborate their understanding.

Inspiration

This is a general concept mapping and chart-making program (see Figure 9.2). Its use as a teaching tool would be either as an analysis tool, for instance as a means of analyzing qualitative data (Weitzman &
Figure 9.1 Correlation Explorer.
Miles 1995), or as a concept mapping tool. In the latter case the student would use the program as a means of structuring and representing their understanding of the nature and relationship of ideas, theories, hypotheses, evidence, conjectures and so on from the area they are studying. Inspiration is fairly open about how this could be done; in this sense it is close to a word processor (in fact, like many word processors, Inspiration contains an outline facility). Other similar programs, such as Skate (Reader & Hammond 1994), are stricter about how they can be used. For example, Skate allows only certain kinds of connections between kinds of concept: a hypothesis cannot support a theory.

MacLaboratory

This consists of a set of modules or CAL tools along with a large amount of hypermedia material (see Figure 9.3). It is a commercial package in use in many psychology departments around the world. Individual parts of the package, such as the Skinner box or the Polygraph tool, can be used separately as part of standard laboratory exercises or even as research tools. The hypermedia materials are configurable and the package comes with tools to facilitate this. Teachers can thus produce their own hypermedia documents to support the use of the tools in a learning context they have selected. All these examples of CAL tools are relatively content-free, standalone programs and are thus adaptable and flexible. But above all they are not designed for use without the intervention of a teacher. The teacher must explain their use and prepare students with appropriate basic understanding before the tool is used and, of course, must provide essential diagnostic assistance while the tool is being used.
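The behaviour Correlation Explorer makes visible, recomputing Pearson's r and the least-squares regression line each time a point on the scattergram is moved, can be sketched in a few lines. The sketch below is a hypothetical illustration of the underlying statistics, not code from the published program:

```python
import math

def correlation_and_regression(points):
    """Return (r, slope, intercept) for a list of (x, y) points."""
    n = len(points)
    mean_x = sum(x for x, _ in points) / n
    mean_y = sum(y for _, y in points) / n
    # Sums of squares and cross-products about the means
    sxx = sum((x - mean_x) ** 2 for x, _ in points)
    syy = sum((y - mean_y) ** 2 for _, y in points)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in points)
    r = sxy / math.sqrt(sxx * syy)   # Pearson's correlation coefficient
    slope = sxy / sxx                # least-squares regression line
    intercept = mean_y - slope * mean_x
    return r, slope, intercept

# "Dragging" a point and recomputing is all the exploration requires:
data = [(1, 2), (2, 4), (3, 6), (4, 8)]
r, slope, intercept = correlation_and_regression(data)  # perfect fit, r = 1
data[-1] = (4, 2)                                       # move one point down
r2, _, _ = correlation_and_regression(data)             # r falls sharply
```

Everything the student sees in the scattergram, the clustering of points about the line and the size of r, follows from these few sums, which is one reason the program itself needs almost no text.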
Figure 9.2 A screen from Inspiration.
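The constraint that distinguishes Skate from a free-form tool like Inspiration, namely that only certain kinds of connection are legal between certain kinds of concept, amounts to a small typed data structure. The sketch below is hypothetical; the node kinds, link types and class names are illustrative, not Skate's actual vocabulary:

```python
# Permitted (source kind, relation, target kind) triples: evidence may
# support a hypothesis, but a hypothesis may not "support" a theory.
ALLOWED_LINKS = {
    ("evidence", "supports", "hypothesis"),
    ("evidence", "contradicts", "hypothesis"),
    ("hypothesis", "elaborates", "theory"),
}

class ConceptMap:
    def __init__(self):
        self.nodes = {}   # name -> kind
        self.links = []   # (source, relation, target)

    def add_node(self, name, kind):
        self.nodes[name] = kind

    def link(self, source, relation, target):
        triple = (self.nodes[source], relation, self.nodes[target])
        if triple not in ALLOWED_LINKS:
            raise ValueError(f"{triple[0]} may not {relation} {triple[2]}")
        self.links.append((source, relation, target))

cmap = ConceptMap()
cmap.add_node("survey data", "evidence")
cmap.add_node("contact hypothesis", "hypothesis")
cmap.add_node("social identity theory", "theory")
cmap.link("survey data", "supports", "contact hypothesis")   # allowed
try:
    cmap.link("contact hypothesis", "supports", "social identity theory")
    blocked = False
except ValueError:
    blocked = True   # the Skate-style constraint rejects the link
```

A free-form mapper like Inspiration corresponds to leaving the permitted-links table unrestricted; tightening the table is what forces students to think about how evidence, hypotheses and theories may legitimately relate.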
Concluding remarks

This paper has looked at two related issues: first, the philosophy behind the development of CAL and, second, the nature of teaching expertise. We have identified two radically different approaches to the development of CAL: CAL as replacement and CAL as tool. We have argued that the different philosophies behind the development of CAL have implications for the nature of CAL software development, and we propose that the approach to CAL as tool is consistent with recent constructivist approaches to understanding the nature of teaching expertise. Teaching is seen as a facilitative process, in which the student is an active, rather than passive, participant in the learning environment and the expertise and skills of the teacher are recognized. CAL aimed at replacing teachers is based on an erroneous view of the nature of teaching. Evidence from other domains suggests that it is unlikely to achieve widespread acceptance and use, and the replacement approach is also likely to have a detrimental effect on the student experience. We argue that the most appropriate form of CAL is learning tools that support and augment the skills of the teacher and engage the learner in an active process. We are thus in agreement with Chute (1995), who claims that:

The most important and the most successful of academic software products are those applications that function as tools or productivity aids. They empower students to do what would otherwise be difficult. They are superior in student evaluations and in content mastery (Chute 1995, p. 6).

Currently, one of the major driving forces behind the funding and development of CAL is "efficiency". Unfortunately, the policy-makers' and funding bodies' definition of efficiency seems to focus rather narrowly on measures of student contact and staff-student ratios. Unless funding bodies are
Figure 9.3 The Polygraph from MacLaboratory.
encouraged to take a wider and more enlightened view of efficiency, to include measures such as lower failure rates, student satisfaction, faster learning and better understanding of complex concepts, then it is unlikely that the potential of CAL in higher education will ever be fully realized.

References

J.R.Anderson, Cognitive psychology and its implications (New York: Freeman, 1990).
J.R.Anderson, "Cognitive tutors: lessons learned", presentation at Computers in Psychology '94 Conference, University of York, September 1994. See also: http://www.york.ac.uk/inst/ctipsych/web/CTI/WebCiP/Anderson.html
J.R.Anderson, C.F.Boyle, G.Yost, "The geometry tutor", Journal of Mathematical Behaviour 5, 1, 1986, pp. 5–19.
N.L.Baumgart, "Verbal interaction in university tutorials", Higher Education 5, 1976, pp. 301–17.
D.L.Chute, "Things I wish they had told me: developing and using technologies for psychology", Psychology Software News 6, 1, 1995, pp. 4–9.
J.Clarke, "Cognitive style and computer-assisted learning: problems and a possible solution", ALT-J 1, 1, 1993, pp. 47–59.
CTI. http://info.ox.ac.uk/cti/, June 1996.
N.J.Entwistle, "A model of the teaching-learning process", in Student learning: research in education and cognitive psychology, J.T.E.Richardson, M.W.Eysenck, D.Warren (eds) (London: SRHE/Open University Press, 1987), pp. 13–28.
N.J.Entwistle, The impact of teaching on learning outcomes in higher education: a literature review (Sheffield: Committee of Vice-Chancellors and Principals of the Universities of the United Kingdom, 1992).
N.J.Entwistle & A.C.Entwistle, Developing, revising and examining conceptual understanding: the student experience and its implications (Edinburgh: Centre for Research on Learning and Instruction, University of Edinburgh, 1991).
G.R.Gibbs & D.Robinson, "Correlation Explorer", Psychology Teaching Review 4, 2, pp. 110–20. See also http://www.york.ac.uk/inst/ctipsych/web/CTI/WebCiP/Gibbs.html
A.Groat & T.Musson, "Learning styles: individualising computer-based learning environments", ALT-J 3, 2, 1995, pp. 53–62.
R.J.H.Hartog, "Computer-assisted learning: from process control paradigm to information resource paradigm", Journal of Microcomputer Applications 12, 1989, pp. 15–31.
H.A.Heathfield & J.Wyatt, "Philosophies for the design and development of clinical decision support systems", Methods of Information in Medicine 32, 1993, pp. 1–8.
ITTI. http://www.icbl.hw.ac.uk/itti/, June 1996.
G.Isaacs, "Course and tutorial lesson design: helping students take control of their learning", Education and Training Technology International 27, 1990, pp. 85–91.
K.Kwok & C.Jones, "Catering for different learning styles", ALT-J 3, 1, 1995, pp. 5–11.
D.Laurillard, Rethinking university teaching: a framework for the effective use of educational technology (London: Routledge, 1993).
F.Marton & R.Säljö, "Approaches to learning", in The experience of learning, F.Marton, D.J.Hounsell, N.J.Entwistle (eds) (Edinburgh: Scottish Academic Press, 1984), pp. 36–55.
W.J.McKeachie, P.R.Pintrich, L.Yi-Guang, D.A.F.Smith, R.Sharma, Teaching and learning in the college classroom: a review of the research literature (University of Michigan: National Center for Research to Improve Postsecondary Teaching and Learning, 1990).
G.Pask, "Learning strategies, teaching strategies and conceptual or learning style", in Learning styles and strategies, R.R.Schmeck (ed.) (New York: Plenum Press, 1988), pp. 83–100.
P.Patterson & J.Rosbottom, "Learning style and learning strategies", ALT-J 3, 1, 1995, pp. 11–21.
W.G.Perry, Jr, Forms of intellectual and ethical development in the college years: a scheme (New York: Holt, Rinehart & Winston, 1970).
P.Ramsden, Learning to teach in higher education (London: Routledge, 1992).
W.Reader & N.Hammond, "Computer-based tools to support learning from hypertext: concept mapping tools and beyond", Computers and Education 22, 1/2, 1994, pp. 99–106.
A.L.R.Rector, "Helping with the humanly impossible task: integrating knowledge based systems into clinical care", in Proceedings of the Second Scandinavian Conference on Artificial Intelligence, SCAI '89, Tampere, Finland, J.Hanmu & L.Seppo (eds), 1989, pp. 560–72.
A.L.R.Rector, M.Fitter, B.Horan, S.Kay, P.Newton, A.Nowlan, A.Wilson, D.N.Robinson, "User centred design and development of a general practice medical workstation: the PEN & PAD experience", in Proceedings of Computer Human Interaction CHI '92, P.Bauersfield, J.Bennett, G.Lynch (eds), Association for Computing Machinery (Monterey: Addison Wesley, 1992), pp. 1420–26.
J.W.Schofield, R.Eurich-Fulcer, C.L.Britt, "Teachers, computer tutors and teaching: the artificially intelligent tutor as an agent for classroom change", American Educational Research Journal 31, 3, 1994, pp. 579–607.
R.J.Spiro, P.J.Feltovich, M.J.Jacobson, R.L.Coulson, "Cognitive flexibility, constructivism and hypertext: random access instruction for advanced knowledge acquisition in ill-structured domains", Educational Technology 31, 5, 1991, pp. 24–33.
TLTP. http://www.icbl.hw.ac.uk/tltp/, June 1996.
E.A.Weitzman & M.B.Miles, Computer programs for qualitative data analysis (London: Sage, 1995).
D.Young, T.Chapman, C.Poile, "Physician reveal thyself", British Journal of Healthcare Computing 7, 9, 1990, pp. 16–21.
Chapter 10 ANORAKS AND TECHIES: A CALL FOR THE INCORPORATION OF NON-TECHNICAL KNOWLEDGE IN TECHNOLOGICAL DEVELOPMENTS Vernon Gayle
British higher education has undergone a quiet revolution (Daniel 1993). The revolution has extended to teaching and learning, which have recently, and to some extent necessarily, been affected by new and often innovative information technologies with awesome potentiality. The desire to incorporate technology into higher education teaching and learning has been placed on the higher education agenda by a number of technology-related programmes, such as CTI and TLTP, and by the recent Teaching Quality Assessment (TQA) audit exercise. The incorporation of IT seems to proceed from a "technology is obviously a good thing" approach, which in some quarters is considered axiomatic. While I am sympathetic to the idea of incorporating technology into higher education teaching and learning, it is the abandoning of this axiom that is essential if IT is to be successful. The driving forces behind IT in the university setting have been enthusiasts on the one hand, whom I have playfully termed "anoraks", and technical specialists on the other, whom I have labelled "techies". I will refer to their shared outlook as the "technologist perspective". My argument is that the "technology is obviously a good thing" approach is a flawed departure point. Because of the hegemony of technical expertise within the technologist perspective, the design and implementation of new information technologies in teaching and learning environments will have very limited success. I will be advancing an argument against the "technologist" approach that calls for the incorporation of non-technical knowledge into technological developments in university teaching and learning. This will draw upon some of the advances that have come out of computer-supported co-operative work (CSCW) approaches. I will also argue that a particular theoretical conception of "work", and of the role of social science in technological design and implementation, is appropriate.
Technology and teaching social science

Part of my disquiet with the technologist perspective's approach to the design and implementation of new IT in teaching and learning environments is the generality of the argument that technology is necessarily a good thing. The lack of specificity in this approach engenders a poor understanding of the particularities of university teaching and learning. It would be easy to talk equally generally about technology and university teaching and learning, but to avoid this pitfall I will confine this discussion to the social sciences in particular. In the social sciences, students' experiences of IT will mostly be in the form of desktop computing on either the PC or the Macintosh. Their introduction to computing will often form part of research methods or study skills courses. The role of the computer in this instance is not to deliver computer-based or computer-assisted learning; rather, the personal computer is employed as a tool to undertake a particular task. In the research methods context it is likely that students will use personal computers to run established packages such as SPSS or Minitab for tasks like the computation of statistics. In the study skills setting, the personal computer is likely to be used as a tool for tasks such as
library searches or word processing. In both the research methods and the study skills settings, software and hardware are used as tools rather than as learning technologies. Although funding has been directed towards computer-based or assisted learning, in the social sciences there has been an absence of completed and usable bespoke software packages (see Timms 1992). The failure of these endeavours is evident insofar as there are few examples that are routinely incorporated into social science curricula. The alternative to bespoke software is the re-use of existing technology. In these endeavours software, and to a much lesser extent hardware, are directed toward a teaching and learning requirement in an attempt to add value to a specific learning experience. From my own experience and that of colleagues, these attempts to re-use existing technology are at best problematic. The re-use of existing technology is generally attempted on an ad hoc basis, and its development is time-consuming and labour-intensive. In the majority of cases the development has not proceeded from a clear pedagogical requirement, and the end-products lack the sophistication required to deliver the high quality educational experience that is the hallmark of universities. The "value added" by these endeavours is not necessarily demonstrable, especially once cost is entered into the equation. Personal computing for social science teaching and learning has awesome potentiality, an issue fully investigated in Henry (1997, 1998, forthcoming). Despite this potentiality, attempts to introduce bespoke software and to re-use existing technology have been ill conceived. This is because of untested assumptions about the actual nature of the teaching and learning environment and the lack of a comprehensive empirical understanding of the processes that are in motion. This in turn leads to a shallow understanding of the impact of new technologies in social science teaching.
In the world of commerce and industry many technical endeavours based around personal computing have also failed to furnish adequate results. This has been because the technology failed to take sensitive account of what I shall term the "innate sociality" of the environments into which it was being introduced. This parallels the situation in social science teaching in universities. One solution to the problem of new technology and the workplace has been the development of CSCW as a design paradigm. I do not wish to argue that the work environment in the world of commerce and industry is the same as it is in higher education, although there are some obvious similarities at a generic level. I maintain that a particular theoretical sociological conception of "work" can inform CSCW designs, and that this is appropriate to the design and implementation of computer-based learning technologies in higher education.

The problem of Human Computer Interaction

Fundamental to an understanding of the propriety of CSCW is an appreciation of the problems of the Human Computer Interaction (HCI) approach.1 HCI was a new and radical approach to systems design that achieved prominence in the 1980s and sought to provide a better cognitive coupling between human users and computers (Bannon 1989). CSCW can reasonably be considered a response to the failings of HCI approaches. HCI is a general framework for innovation aimed at developing interaction techniques, analysis methods, software and computer systems within a controlled context in order to create enhanced products. HCI endeavoured to go beyond simply providing improved surface characteristics, and hoped to address wider issues surrounding human interaction with computers. In this sense HCI is a design and engineering science, as it aims to produce artefacts of hardware and software within satisfactory frameworks of compromise that take functionality, performance and cost into account (Brooks 1990). The general HCI perspective recognized that there were human consequences to the introduction of new technology, and that the way technology was developed and applied could profoundly affect work. Questions of usability, applicability and acceptability were viewed as being of legitimate concern to HCI practitioners. An applied psychological dimension located within a problem-centred approach, it was argued, would enable HCI practitioners to undertake research that would inform future designs2 (Blackler & Osborne 1987). However, the organization of work is in fact endlessly richer and more complex than the majority of formal psychological models could convey. Owing to the rigid frameworks that such systems imposed, human actors were not furnished with sufficient flexibility to make the system function (Bannon 1989). Another drawback, according to Brooks (1990), stemming from this psychological foundation, is that HCI fashioned itself as a general paradigm for innovation and design on the basis of limited and controlled environments. Much of the early HCI work was confined to rather small-scale controlled experiments, with the presumption that the findings could be generalized to other settings (Barnard & Grudin 1988). For Carroll (1987), the hoped-for contribution of HCI to the design of computer systems and novel interfaces did not materialize in the 1980s. Gray & Atwood (1988) note the lack of examples of developed HCI systems, which Luff and Heath (1990) argue is largely due to inherent deficiencies in HCI approaches. An alternative theoretical and methodological orientation is required.

1. It is important to note that there is no clear or coherent answer to the question "what is, or was, the goal of HCI?" One of the more traditional answers is that HCI intends to provide methods and metrics for evaluating the usability of computer systems. This stems from what can be loosely termed the "human factors" approach. This is in contrast to cognitive scientists, who argue that HCI is a workbench for the application of cognitive psychology to a real problem domain. Computer scientists assert that HCI must help to guide the definition, invention and introduction of new computing tools and environments. This argument is to some extent a product of the exigencies of the computer industry. The point here is to illustrate that HCI is a diverse discipline with fragmented foci and interests, a feature not often drawn out in HCI literature.
2. For example, Beynon (1984) examined working at Ford, Hobbs (1988) studied detective work in East London and Pollert (1981) treated the working lives of women in a factory setting.

Computer Supported Co-operative Work

The expression CSCW, a comparatively new one in the IT vocabulary, was first coined in the mid-1980s by researchers in the USA. The term, most notably used by Greif (1988), has been applied as an umbrella term taking in anything to do with computer support for activities involving more than one person. Alternative terminology for CSCW includes the expressions "groupware" and "workgroup computing" (Clark & O'Donnell 1991). The role of an effective CSCW system, as its name suggests, is to support the co-operative nature of work. Hirschheim & Klein (1989) assert that a good system must not be designed in what they term the "usual sense", but has to be designed and developed within the framework of the social interactions that are embedded in the environment in which the technology is to be incorporated. The caveat that must be issued here is that in no sense is there an objective set of criteria that forms a typology for an effective system. The system must be developed within what they term the "user's perspective". A CSCW system is not, however, simply an electronic cloning or duplication of a working environment. Rather, it is a pragmatic attempt to support the co-operative tasks of work in context, within its natural social and physical environment. In terms of sociological inquiry, ethnography is the tool of sociological research most applicable in CSCW endeavours. As with all research methods ethnography has advantages and disadvantages, but the potency,
in the CSCW context, is that it depicts the activities of social actors from their own perspective. This challenges the preconceptions that alternative social science approaches often bring to phenomena (Hammersley & Atkinson 1983). Ethnography is not simply description: rather, it is detailed explication. It is about capturing the real movement of experience in the concrete world. Ethnography achieves something which theory and commentary in the majority of cases cannot, namely presenting human experience without minimizing it and without making it a passive reflex of structures, organizations and social conditions. "Human productions are all of a piece, indivisible and always summed. The metal cannot be simply smelted out from the ore of experience in human affairs" (Willis 1978, p. 180). Sociologists have long used ethnographic techniques to study work in general.3 What is required in CSCW is an ethnographic analysis of settings that are due to be "technologized", by which I mean settings where new forms of IT are to be implemented. In this regard, the settings are those where social science teaching and learning take place. Straightforward ethnographies, such as those developed in the sociology of work, are not sufficient, however. Instead, this calls for what Button & Dourish (1996) term a "technomethodologically" informed approach. This, I believe, will lead to the successful design and implementation of new information technologies in social science teaching settings. The technomethodological approach takes as its point of departure the desire to make conspicuous what actors are doing when they organize their activities in particular settings. It draws heavily upon ethnomethodology, which turns away from the structures and theorizing of traditional sociology and concentrates instead upon the details of the practices through which action and interaction are accomplished (Button & Dourish 1996).
This approach is underwritten by the work of Garfinkel,4 who is concerned with that most pervasive sociological question: how is it that actions recur and reproduce themselves? He insists that this orderliness be viewed as arising from within activities themselves and from the work being done by the parties to those activities. Garfinkel eschews the traditional sociological strategy of seeking to explain this orderliness and the organization of social activities by identifying causes and conditions beyond the activities themselves (Benson & Hughes 1983). The concept of the "egological organization" advanced by Anderson, Hughes and Sharrock (1989) is an ethnomethodologically informed view of organizations that begins with a bottom-up understanding of them.5 The conception of the egological organization proceeds from an enquiry into the daily or routinized experiences of individuals. The value of this approach is that it places the actor's point of view at the centre of the analysis. By employing the concept of the egological organization, Anderson, Hughes and Sharrock develop the idea of the "working division of labour". In Working for profit they argue that it ought to come as no surprise that
actors in work settings see themselves as part of an elaborate working division of labour. From the way that they talk about their work, both to each other and to outsiders, it is clear that the notion of a working division of labour is one they use to interrelate and explicate the things they see going on about them, on a daily basis and on ordinary occasions. These accounts depict a body of activities marshalled by "a working division of labour". Technomethodology, then, requires a sociological analysis of the organization of social action and interaction and of the organization of work and work settings. The fullest possible description that captures the essence of the "working division of labour" must be furnished. The thrust of technomethodology is the conception that sociological descriptions of the ways in which people routinely organize their actions and interactions can be furnished and compared with what is or is not possible using technology. In this sense the term "technomethodology" identifies the need to incorporate ethnomethodologically informed accounts of the working division of labour that place the actor's perspective at the centre. The object of the ethnographic exercise is therefore to provide what Geertz (1975) terms "thick descriptions", which will inform the design and the implementation of the new information technologies.6 CSCW can inform the development and implementation of new information technologies directed towards teaching and learning environments in the social sciences. I do not wish to argue that these environments are the same as the commercial and industrial settings which have so far been the foci of CSCW. At a generic level, university teaching and learning settings are similar insofar as they also require co-operation, co-ordination and collaboration to accomplish work tasks.

3. For example, Beynon (1984) examined working at Ford, Hobbs (1988) studied detective work in East London, and Pollert (1981) treated the working lives of women in a factory setting.
4. Garfinkel states that "the policy is recommended that any social setting be viewed as selforganising with respect to the intelligible character of its own appearances as either representations of or as evidences-of-a-social-order. Any setting organises its activities to make its properties as an organised environment of practical activities detectable, countable, recordable, reportable, tell-a-story-aboutable, analysable—in short, accountable" (Garfinkel 1967, p. 33). See especially Garfinkel (1986 & 1991).
5. The conception of the egological organization is motivated by the desire to provide descriptions and analysis, but it raises a deep methodological question. Sociological descriptions, like other theoretical accounts, are thematically constructed. The methodological question at issue is that employing this approach is an attempt to provide a third-person account of first-person experience. This does not mean incorporating first-person accounts into sociological depictions; rather, a sociological reconstitution of that experience is required. The concern is not with particular people's experience, but with the organization of experience, as it is encountered in social life, as a readily accountable, known and shared scheme of interpretation (Anderson, Hughes, Sharrock 1989).
And while the work carried out in university teaching and learning settings is, arguably, often of a more individual nature, CSCW has attended to the issue of individualistic work.7 If we treat the concept of "doing work" as the active process of "sensemaking" that individuals undertake in settings, then the same kinds of issue that impinge upon actors in commercial work settings are also present in university teaching and learning settings.

Conclusions

The incorporation of technology into higher education teaching and learning, now firmly on the higher education agenda, is both desirable and essential to the future development of British higher education. Technologies will most probably continue to be innovative, and technical advances will increase their potentiality. If, in the social sciences, we wish to move from using computers as tools to a scenario in which we develop computer-based learning technologies, I believe it is fundamental to incorporate non-technical knowledge and to pursue a more circumspect and strategic development of IT than can be achieved by enthusiasts ("anoraks") and technical specialists ("techies") alone. The development of computer-based learning technologies for the social sciences must proceed from clear sets of pedagogical requirements. This is not to argue that objective sets of criteria forming typologies for effective systems exist in any sense. The system requirements in each setting will be context specific, and a technomethodologically informed CSCW strategy attempts to uncover the pedagogical requirements of the computer-based learning technology under development. This is the level of sophistication required to develop high quality computer-based learning tools that will be useful and used in social science departments.
6. An example of such an attempt is Harper et al. (forthcoming), an account of technology and air traffic control that forms part of an interdisciplinary attempt to design and implement a technological system. 7. My own work on London taxi drivers (Gayle 1991) and the work of Thimbleby et al. (1990) are two examples.
A CSCW approach to the design and implementation of new computer-based technologies in social science teaching and learning will be liberating. The need for a clear understanding of teaching and learning environments is as critical for bespoke software development as it is for the re-use of existing technology, since similar problems face both approaches. Technomethodologically informed CSCW, when brought to bear upon the design and implementation of new computer-based information technologies in social science teaching settings, is an attempt to improve what Grudin (1988) and Bannon & Schmidt (1991) term the "distinctly random success" of new information technologies. References R. Anderson, J. A. Hughes, W. W. Sharrock, Working for profit: the social organisation of calculation in an entrepreneurial firm (Aldershot: Gower, 1989). L. Bannon, From human factors to human actors: the role of psychology and HCI studies in systems design (Dublin: Computing Services, University College, 1989). L. Bannon & K. Schmidt, "CSCW: four characters in search of a context", in Studies in CSCW: theory, practice and design, J. Bowers & S. Benford (eds) (Amsterdam: Elsevier, 1991). P. Barnard & J. Grudin, "Command names", in Handbook of human computer interaction, M. Helander (ed.) (Amsterdam: North Holland Press, 1988). D. Benson & J. A. Hughes, The perspective of ethnomethodology (London: Longman, 1983). H. Beynon, Working for Ford (Harmondsworth: Penguin, 1984). F. Blackler & D. Osborne, "Designs for the future: IT and people", paper presented to the British Psychology Society Conference, Leicester, 1987. R. Brooks, "The contribution of practitioner case studies to human-computer interaction science", Interacting with Computers 2, 1, April 1990. G. Button & P. Dourish, "Technomethodology: paradoxes and possibilities", Technical Report EPC-1996-101 (Cambridge: Rank Xerox, 1996). J. Carroll, Interfacing thought: cognitive aspects of human computer interaction (Cambridge: MIT Press, 1987).
B. M. Clark & S. O'Donnell, "Computer supported co-operative work", British Telecom Technical Journal 9, 1, January 1991, pp. 47–56. J. Daniel, "The challenge of mass higher education", Studies in Higher Education 18(2), 1993, pp. 197–203. H. Garfinkel, Studies in ethnomethodology (Englewood Cliffs, NJ: Prentice Hall, 1967). H. Garfinkel, Ethnomethodological studies of work (London: Routledge & Kegan Paul, 1986). H. Garfinkel, "Respecification", in Ethnomethodology and the human sciences, G. Button (ed.) (Cambridge: Cambridge University Press, 1991). V. J. Gayle, "CSCW and individual work: an investigation of work distribution in a London taxi firm" (Masters thesis, Department of Sociology, University of Lancaster, 1991). C. Geertz, "Thick description", in The interpretation of cultures (London: Hutchinson, 1975). W. Gray & M. Atwood, "Review of Interfacing thought", in Interfacing thought: cognitive aspects of human computer interaction, J. Carroll (ed.) (Cambridge: MIT Press, 1988). I. Greif, "Remarks in panel discussion on 'CSCW: what does it mean?'", Proceedings of the conference CSCW '88, Portland, Oregon, 1988. J. Grudin, "Why CSCW applications fail: problems in the design and evaluation of organisational interfaces", Proceedings of the conference CSCW '88, Portland, Oregon, 1988. M. Hammersley & P. Atkinson, Ethnography: principles in practice (London: Tavistock, 1983). R. Harper & D. Randall, "Rogues in the air: an ethnomethodology of 'conflict' in socially organised airspace", Technical Report EPC-1992-109 (Cambridge: Rank Xerox, 1992). R. Harper, J. Hughes, D. Randall, D. Shapiro, W. Sharrock, Ordering the skies (London: Routledge, forthcoming).
M. S. Henry (ed.), Studying and using technology: a guide for social and political science students (Oxford: Blackwell, 1997). M. S. Henry, "The role of computers in social policy", in The SPA student's companion to social policy (Oxford: Blackwell, 1997). R. Hirschheim & K. Klein, "Four paradigms of information systems development", Social Aspects of Computing 32, 10, October 1989, pp. 1199–216. D. Hobbs, Doing the business: entrepreneurship, the working class and detectives in the East End of London (Oxford: Clarendon, 1988). P. Luff & C. Heath, "Collaboration and control: the introduction of multimedia technology on London Underground" (Cambridge: Rank Xerox, 1990). A. Pollert, Girls, wives, factory lives (London: Macmillan, 1981). H. Thimbleby, S. Anderson, I. Whitten, "Reflexive CSCW: supporting long-term personal work", Interacting with Computers 2, 3, 1990, pp. 330–36. D. W. G. Timms, "Computers and the teaching of sociology and the policy sciences", Computers and Education Journal 19, 1/2, 1992, pp. 97–104. P. Willis, Profane culture (London: Routledge, 1978).
Chapter 11 EVANGELISM AND AGNOSTICISM IN THE TAKE-UP OF INFORMATION TECHNOLOGY Danny Lawrence, Ken Levine and Nick Manning
Academics vary greatly in the extent to which they have embraced IT in their teaching and research. Many continue to work, as they have always done, without recourse to the computers, peripherals and other electronic props to which others are now devoted. Such indifference is often deplored by those who have embraced IT with an almost evangelical zeal. The charge is not only that their more traditional colleagues are denying themselves the benefits of IT; it is that, through both their outmoded personal example and their failure to incorporate IT into their teaching, they are letting down their students. As teachers, they have a responsibility to introduce students to all the ways in which disciplinary knowledge can be acquired and analyzed. Their failure to incorporate IT into that repertoire is, therefore, tantamount to professional negligence. Discussions of why colleagues fail to embrace IT, and how best to remedy this situation, are usually based on implicit but recognizable models of human behaviour. One such model starts from the premise that our behaviour is sometimes irrational, i.e. not based on logical considerations. Frequently, this model also incorporates a moral dimension: behaviour which is irrational is unreasoning and therefore unreasonable. Models of behaviour based on assumptions about human irrationality have been employed to try to explain a wide range of behaviour patterns, e.g. racial prejudice and discrimination, collective violence, the belief in magic, and resistance to change within organizations. In these instances, members of ethnic minorities, law-abiding citizens, gullible souls and shareholders are represented as the victims of irrational behaviour. In all the examples cited above, the attempts to explain behaviour in terms of a putative irrationality have been found wanting (Lawrence 1974; Wellman 1977; Feagin & Hahn 1973; Tilly 1969; Peel 1969; Rose 1978).
We would maintain that the same holds for attempts to explain resistance to IT in such terms. In the specific case of IT, it has been argued that the (allegedly) clear advantages to be derived from adopting innovations in IT have often not been realized because of irrational attitudes formed before today's user-friendly IT became available, and/or because of conservative workplace cultures which have generated an ethos of inertia. In the case of students, the full benefits of IT have not been made available to them because of the behaviour of their teachers: students are the victims of the unreasoning and unreasonable behaviour of staff who are failing in their obligations towards them. While there may be useful insights in these approaches, we would argue that a more promising way to understand the differential take-up of IT is to assume that such behaviour is calculating and purposive. Whether or not behaviour is in accord with a Weberian notion of formal rationality is immaterial: what matters in understanding individual and group responses is how the situation appears from the standpoint of individual actors. In essence, our argument is that, instead of simply assuming that right and rationality are on our side, those of us who are enthusiastic converts to IT must acknowledge both that there may be
irrational zealots among our number and that, when some of our colleagues and students prove reluctant to adopt developments in IT, their actions may be based on calculation rather than irrationality or inertia. Evangelism and information technology The most striking recent example of what might be called IT evangelism is found in Brackenbury & Hague's (1995) article "Converting the sinners". Though their tongues may have been firmly in their cheeks when they devised their title, their article makes it clear that they do consider that there is a genuine ethical and professional dimension to their argument. They approach the problem of the take-up of IT like committed evangelical Christians. They, unlike their agnostic colleagues, enjoy a blessed assurance: the truth about IT and its take-up has been revealed to them, and their mission is to proclaim it. Ethos is the single most important factor determining the penetration of new technologies into the teaching curriculum…. (Brackenbury & Hague 1995, p. 3) Attitudes are the weak link: high scepticism, low-awareness and not-my-job syndrome inhibit the more widespread use of new technologies in the teaching curriculum…. These attitudes are sinful and should not be tolerated…seamless teaching techniques are the key to winning the battle for hearts and minds (Brackenbury & Hague 1995, p. 10). Academics who fail to embrace the new technologies are setting a bad example to their students and are in need of redemption. In any form of teaching, the goal staff set for their students is for them to produce original, well-researched, well-presented work. Computers offer powerful yet easy-to-use means with which to achieve this goal. Failure by staff to show their students how to use these tools for whatever reason… creates a serious paradox: staff are setting goals for their students without providing them with the means to achieve them.
This raises the whole issue of the relationship between staff and students' attitudes towards the use of computers in a subject discipline (Brackenbury & Hague 1995, p. 4). Brackenbury & Hague seem unwilling to accept that their reluctant colleagues may be able to offer a reasoned defence of their agnosticism towards IT. They consider unfavourable attitudes towards it to be the irrational products of prejudice and ignorance. Alleviate that ignorance by revealing the truth of the wonders of modern software and hardware to the disbelievers, and the scales will fall from the eyes of the unbelievers. So strong is their faith in IT, and in their method of proselytizing, that they offer us their "action plan of how your department can go about creating the ethos of low-pain, high-gain IT awareness": a plan which, amongst other things, includes the use of the academic equivalent of the wayside pulpit and the distribution of religious tracts. Moreover, like the Salvation Army, they refuse to let the devil have all the best tunes (Brackenbury & Hague 1995, pp. 5 and 10). There can be no substitute for making lessons fun as a means of winning hearts and minds. However, the main element in the plan is that of personal witness to one's IT faith: sinners can best be made to see the light by the personal example of the already converted, i.e. by what Brackenbury & Hague call "show and help" teaching.1
In defence of agnosticism Unlike Brackenbury & Hague, we are not convinced that it is irrational attitudes that are inhibiting the spread of IT, or that ethos is the single most important factor determining its penetration in academic departments. Nor is it necessarily the case that "the attitudes of staff towards computers in their discipline will be reflected by their students" (Brackenbury & Hague 1995, p. 5). While we would not wish to deny that attitudinal and cultural factors may be related to patterns of IT use (e.g. in the rivalry which still exists between the users of different platforms), we would maintain that, in relation to the take-up of IT, this is a limited and limiting view of the problem. Staff and students do not necessarily have a general disposition towards the new technologies which determines whether or not they will be in favour of IT. While we probably all have colleagues who either ignore IT altogether or appear to have an emotional craving for every new gadget that comes along, academics and students much more commonly adopt some elements of IT but not others, and they vary in the pace with which they adopt any particular element. Nor are we persuaded that most academic departments have a sufficiently strong prevailing cultural ethos to generate a sense of identity in staff and students which either inhibits or promotes the take-up of IT. Again, this approach fails to explain the marked variation in the take-up of IT among those working in the same local occupational culture. Instead of assuming that academic staff who seem resistant to IT are driven by irrational attitudes, or trapped within conservative occupational cultures, we would suggest that it is more profitable to explore the possibility that variations in the pattern of IT take-up are the result of considered choices made by intelligent and discerning individuals.
The problem with deploying the concepts of attitudes and/or departmental cultural ethos to explain the take-up of IT is that these approaches do not help us to explain why the same individuals are happy to embrace some parts of the new technologies but not others. We will not understand the variable take-up of IT by adopting the equivalent of the evangelical form: "if you are not for me [IT] you are against me [IT]". We must adopt an approach that enables us to explain the particular pattern in the take-up of IT in any given individual or institution. That involves an explicit acknowledgement that rationality is individually and situationally specific. What is a rational IT action for person A may not be a rational IT action for person B. Moreover, for any individual, a rational action may involve the use of IT in situation X but not in situation Y. It is not uncommon to encounter students who are reluctant to word-process their essays but have embraced e-mail with enthusiasm. Such students do not appear to have an attitude problem towards computers, nor do they seem to be the victims of out-of-touch teachers. They are more likely to have calculated that, in their particular situation, the costs of the word-processing systems available to them outweigh the benefits, but that e-mail represents an easy, quick and free way of keeping in touch with friends living at a distance. While from the standpoint of a third party, and especially an IT enthusiast, it may appear obvious that word-processing is inherently good for you, seen from the standpoint of some students mastering word-processing may appear a needless, time-consuming chore when they have already succeeded in passing their exams and getting to university using the well-tried alternative of joined-up writing.2 Calculations of self-interest in relation to IT can take more complex and even devious forms. An academic friend of one of the authors, when at work, actively perpetuates an image of himself as a
1. Curiously, given that the main thrust of their argument is about the need to change hearts and minds, their action plan is actually competency-based, just like the much-criticized approach to learning embodied in NVQs. For Brackenbury & Hague, "computer training is a practical skill: like trying to tie your shoelaces…students without shoelace-tying skills will trip up and hurt themselves".
hopeless incompetent at IT who does not own his own computer. He does so in order to help protect his existing share in a secretary. In the privacy of his own home, away from the prying eyes of those with the authority to allocate scarce secretarial resources, he uses his wife's computer in a moderately competent way. Attitudes Attitudes are not as important in determining how we behave as Brackenbury & Hague appear to assume. Indeed, we have known since LaPiere's classic study, conducted over 60 years ago, that in some situations there can be virtually no correspondence between attitudes and behaviour (LaPiere 1934). LaPiere's respondents may have held irrational attitudes (which led them to state that they would not cater for Chinese customers), but those attitudes did not prevent them from behaving rationally (i.e. taking money from Chinese customers). LaPiere's findings were no fluke. Over 30 years later, Wicker undertook a thorough review of the research literature and concluded that there was "little evidence to support the postulated existence of stable underlying attitudes within the individual which influence both his verbal expressions and actions" (Wicker 1969, p. 75). The impact of the review, according to Sabini, was that "gloom prevailed among attitude researchers: indeed there followed a period of about five years during which social psychologists all but abandoned research on attitudes" (Sabini 1992, p. 675). The key point about Fishbein and Ajzen's subsequent attempt to rescue the study of attitudes from such a devastating critique is that it involved the introduction of a model of reasoned (not unreasoning) action. Fishbein & Ajzen were obliged to concede that general attitudes do not determine specific behaviour.
Of equal significance is the fact that, in their new model, they demoted attitudes to just one of a variety of determinants of human action, acknowledging in the process the situationally specific nature of human behaviour (Fishbein & Ajzen 1974). So, while we do not want to deny that individuals may have attitudes, nor that these attitudes may in some way be implicated in determining how individuals relate to IT, we do want to suggest that trying to change attitudes is unlikely to be an effective way of increasing the scale and pace of the take-up of IT. Organizational ethos While Brackenbury & Hague claim that "attitudes are the weak link" in the take-up of IT, they insist that "ethos is the single most important factor determining the penetration of new technologies into the teaching curriculum". Again, the suggestion is that non-rational factors are at work. If, however, we examine this contention in relation to the literature on organizational resistance to change, we are presented with a second example of the dangers of denying rationality to actors, with lessons similar to those identified in the section above.3 The origins of the resistance-to-change discourse lie in Elton Mayo's writings on human relations at work (Mayo 1949), together with the group dynamics of Kurt Lewin (1951) and several of the theoreticians associated with the Tavistock Institute of Human Relations (e.g. Bion 1961; Rice 1963; Trist et al. 1963). The bulk of this literature revolves around the fate of attempts by change agents to introduce 2. It is ironic, given the praise Brackenbury & Hague accord the "increasingly refined word processors (with tools to improve document structure, style and grammar)", that there are several points in their joint paper which are written in the first person singular.
new technologies into the workplace including, in some instances, computer technologies (e.g. Mumford 1964, 1965). This resistance-to-change thinking started from a presumption that it was the pre-existing normative commitments of individuals and workgroups that were the main obstacles to change (Watson 1970). The role of the organizational change agent was perceived as the design and execution of "influence" programmes capable of altering these norms. The function of research activity was to identify and understand the nature of "resisters", that is, the minority of deviant individuals or groups who rejected new arrangements or who failed fully to adapt to the new regime. Normative commitments arising from asymmetries of power and the "vested interests" present in the social or work environment were acknowledged only weakly (e.g. Watson 1970, p. 495), the implication being that these too could be overcome by participative influence programmes. In fact, the industrial situations selected for analysis in this fashion were rarely if ever characterized by active confrontation between organized groups, permitting the issue of whose interests the proposed changes advanced (and therefore of the rationality, or otherwise, of resistance to them) to be left in the background. An illustration of these assumptions in operation can be seen in the classic and extremely influential experimental study conducted by Coch & French (1948) of garment workers who had been responding negatively to the continual introduction of new work methods in the context of a piecework system. Coch and French's experiments set out to demonstrate that groups allowed full participation in the decision-making over new methods and piece-rates were more likely to achieve production targets, and less likely to restrict production or show other symptoms of frustration such as absenteeism and labour turnover, than groups allowed less or no participation.
The unequivocal conclusion to their study is that: It is possible for management to modify greatly or to remove completely group resistance to changes in methods of work and the ensuing piece-rates. This change can be accomplished by the use of group meetings in which management effectively communicates the need for change and stimulates group participation in planning the changes (Coch & French 1948, p. 531). The stance adopted by Coch & French (and their successors) is that resistance to change is essentially a non-rational or irrational phenomenon that involves a complex combination of psychological components, such as tendencies to see the future as hopeless, and social components reflecting group memberships and status considerations. Because much of this research was directly sponsored by management and uncritically endorsed managerial concerns, it was, perhaps, inevitable that it would fail to conceive of resistance to change as an instrumental attempt by employees to defend legitimate interests. Yet, in the discussion of the poor post-change performance of the control group in their experiment, Coch & French did concede that: The operator transferred [to new tasks] by the usual factory procedure (including the control group) has in fact a realistic view of continued failure because, as we have already noted, 62% of transfers do in fact fail to recover to standard production (Coch & French 1948, p. 531).
3. A large social science and historical literature exists which is devoted to understanding attempts by individuals, groups and communities to resist economic, political or cultural changes which the participants perceive to be external in origin and which are judged to threaten cherished institutions and values. "Resistance" can range in form from the ad hoc efforts of a small clique of individuals to counter the authority structure of a school (as in Willis 1977) to episodes of collective violence which, directly or indirectly, challenge controlling political systems (e.g. Hobsbawm 1959; Tilly 1969; Quinault & Stevenson 1974; Stevenson 1979).
This is a most important point. In the plant under study, the workers assigned to tasks requiring new methods of work were given a temporary "transfer bonus" while they learned the new techniques. Although the duration of the temporary transfer bonus was supposed to be derived from the average length of time it took workers to reach the standard production levels defined by time-and-motion engineers, 62 per cent actually failed to do so in the time allotted. This cost the workers concerned their transfer bonus, returning them to minimum wage levels which could be 25 per cent or more below their pre-transfer earnings. In the light of such punitive financial arrangements, resistance to the new work methods does not seem to be either irrational or non-rational, and hardly requires sophisticated social psychological explanation. IT evangelism and resistance-to-change thinking share certain key presumptions. Perhaps the most obvious is that new methods and technologies are represented as irresistible and overwhelming external forces of change that bear down on reactive communities, organizations and individuals. An adjective that commonly accompanies "change" in Coch and French's article is "necessary". There is little recognition that the take-up of new technology is always the outcome of choices made by particular interest groups. Such decisions may draw on a more or less sophisticated cost-benefit analysis of likely efficiency gains and resource implications, but they will also reflect how congruent the adoption of the new technologies will be with the existing political and economic interests of the parties affected. In this way, the decision to introduce change and the decisions not to accept change, or to resist it, may have more in common, and be more symmetrical in terms of their rationality, than some accounts indicate. A second, and perhaps less obvious, shared presumption concerns the time horizons within which change is evaluated.
The organizational sponsors of IT and industrial managements tend to adopt a relatively long-term perspective from which to evaluate the costs and benefits of change. In most cases, this will be a longer period than the one used by employees and students. While the former groups may regard the transition costs of new technologies as entirely acceptable over the longer run, the latter groups may be prepared to contemplate only the short term, e.g. by focusing on the temporary loss of earnings or study time while new skills are acquired, the disruption of existing services during equipment installation, and the teething problems of new systems and service provision. Political context and the calculation of self-interest Discussions about the take-up of IT in higher education have not taken place in a political vacuum. While IT may have a momentum of its own, the very fact that it needs to be resourced from public funds makes it inevitable that it will be affected by the widespread political changes which have affected the public sector in general, and higher education in particular, since the early 1980s. Mrs (now Baroness) Thatcher came to power convinced that the policies of her predecessors, both Conservative and Labour, had helped bring about Britain's economic decline. She maintained that those who worked in the public sector had been cosseted at the expense of those who worked in the private sector, and embarked on a major programme of reforms, based on the introduction of quasi-market mechanisms designed, ultimately, to cut costs and obtain greater value for money for the taxpayer. The impact of the reforms on staff and students is difficult to exaggerate. For staff, the massive expansion in student numbers has been coupled with a fall in staffing levels and, therefore, a marked increase in teaching loads, but, because of public sector pay policies, no corresponding productivity-based salary increases.
In real terms, academic salaries have remained almost static since 1979. During the same period, academic staff have been compelled, via the market mechanisms of the Higher Education Funding Council for England’s Research Assessment Exercise, to become even more productive
in research terms and, via TQA exercises, to institute complex, time-consuming mechanisms in order to demonstrate publicly that they are taking proper care of their students despite reduced staff contact. For today's students, higher education is characterized not only by reduced staff contact, but also by diminishing grants and an increasing reliance on student loans and bank overdrafts. Many now have to work during term (or semester) to survive. Against such a background, it would be remarkable if staff and students were not calculating in the way in which they reacted to the changes imposed or urged on them by others. This, inevitably, includes their reactions to IT. As Davies & Crowther remind us, much of the investment in IT in higher education has been financed in the hope that, especially in the long run, it will cut institutional costs as well as enhance productivity (1995, p. 3). The allocation of personal workstations to members of staff is an obvious example of this dual managerial objective. Workstations have been provided not only to allow academic staff to take advantage of the new technologies but also to reduce their dependency on secretarial services. So, whatever the long-run benefits to their institutions, in the short run, especially for staff without keyboard skills, the introduction of workstations represents yet another demand on the already diminishing time available for teaching and research. Routine tasks previously undertaken by a secretary, such as the typing of reading lists, have now been shunted onto academics. This is just one of a great many rational reasons why some academic staff may not welcome advances in technology, despite what may seem to IT enthusiasts to be its obvious benefits. Problems may sometimes be imagined or exaggerated, but they are very often real and, in the concluding section of this chapter, we draw attention to some of the more obvious of them as they relate to both staff and students.
Incentives and disincentives In an increasingly market-orientated system of higher education, in which we are all encouraged to compete by pursuing our own self-interest, there need to be genuine incentives if staff and students are to embrace IT and incorporate it into their teaching and learning to the extent that enthusiasts and university managements would like. The most obvious potential attraction of IT is that it can take some of the routine drudgery out of our work and make us more productive with reduced effort. However, as we have already noted, the allocation of workstations has obliged many academics to undertake what they regard as routine and even demeaning tasks that would, before the invention of the word-processor, have been undertaken by others. For those reluctant users who, in order to perform such tasks, have laboured hard to become proficient in the use of an operating system such as DOS, and a word processor they were told was an "industry standard", such as WordPerfect 5.1, it can then be particularly galling to be told that they must now move on to Windows 3.1 and WordPerfect 6.2, and, shortly afterwards, perhaps, to Windows 95 and Microsoft Word for Windows. But what of the more confident and adventurous academics who we might expect to be developing CAL packages to enhance their teaching? What do they have to gain? The answer is, often, very little. IT enthusiasts are often coy when it comes to acknowledging how much work has gone into developing reasonable quality courseware. As Reid notes, we tend not to conduct full cost-effectiveness studies of most of the courseware we are currently producing because, if we did, "our funding agencies would be scared off" (Reid 1994, p. 7).
Two related problems are that because of the huge variation in the way in which higher education courses are constructed, there is often no market for courseware outside the institution in which it is developed and that, even when the content of courseware does not require major revision, rapid advances in computer hardware can still render its presentation effectively obsolete within a short time of it being completed. As Reid notes, courseware which was initially full of “technodazzle” is soon made to look
EVANGELISM AND INFORMATION TECHNOLOGY
pedestrian by advances in technology. Given the time and energy needed to develop and continually upgrade courseware, academics must have clear incentives if they are going to choose to invest their limited time in its development. Yet the reality is that those involved are unlikely to get anything like the kind of credit or return that would follow from a comparable effort being invested in a programme of research or publication. What matters most for individuals in the promotion stakes are conventional forms of research output and grant income—and what matters most for institutions is the additional income that can be generated via good departmental performances in the Research Assessment Exercise.

Access

If we expect students to make effective use of developments in IT, whether by using standard applications, locally developed courseware, or Janet to access sources of information not available locally, they must have ready access to suitable equipment. For most of them, this means centrally provided computer suites. Yet the provision of such general access workstations in higher education institutions has never approached the levels recommended by the Nelson (1983) and McDonough (1991) working parties, i.e. 5 students per workstation by 1990 and 4 students per workstation by 1996. Even the average of 17 students per workstation suggested by Rosner’s 1994 survey (Reid 1994, p. 6) is optimistic because, at any one time, a proportion of machines will be out of action, and the statistics probably include older computers unable to run a full range of current software. Perhaps more significantly, it is only a sub-set of these workstations (of an unknown size) that are freely available in the evenings and weekends when students will be most likely to have extended periods free from classes to devote to coursework preparation.
This is an important limitation, particularly in urban universities where students may live far from central facilities, but, even in networked campus institutions, out-of-hours IT use may be difficult or inconvenient because the capital costs and high overheads have greatly restricted the levels of IT provision in halls of residence and self-catering accommodation. Such factors often increase the in-hours competition among students for the computers that are available to discouraging levels and introduce a general uncertainty about whether machines will be available at critical times (e.g. before submission deadlines). Thus many students regard the use of IT sceptically not because of any attitudinal or cultural resistance to it but because IT-mediated coursework raises for them substantial additional planning and scheduling problems over and above those posed by the non-IT equivalents with which they are already familiar.

Compatibility

There is an old jest to the effect that standards are such a good thing that the IT industry has adopted as many of them as it possibly can. The situation with respect to hardware and software compatibility in higher education is, at first sight, no worse than in other sectors. Under central encouragement, higher education has historically tended to favour non-proprietary, “open”, standards and inter-operability (or, at the minimum, connectivity) between platforms, while most institutions have pursued policies which actively encourage standardization on a limited number of hardware platforms and major software applications, sometimes through purchase restrictions but, more often, by limiting central support to these designated types and titles. However, despite this strategy to contain potential compatibility problems, rapid innovations in personal computer hardware and associated developments in operating systems and software still create significant problems for both academic staff and students.
Recent advances have led to the emergence of what is effectively a broad dual standard: third-generation workstations that support WIMP GUIs in colour and are multimedia-capable, coexisting with second-
generation machines lacking some or all of these capacities. Currently, institutions are gradually replacing second-generation machines with the third generation but, as a consequence, new compatibility problems are being generated. The problem of differential access to facilities between campus and off-campus users already referred to above is now being exacerbated by a “specifications gap”. Few students, given their already difficult financial predicaments, are likely to be able to afford to purchase third-generation equipment for their personal use, despite falling hardware costs. Even academic staff may not feel able to justify further computer purchases given how quickly it appears to become obsolete. The key point is not that home-based equipment will necessarily be less up-to-date than that provided within higher education but that the leapfrogging in standards will continually re-present the problems of incompatibility in new guises. Similar problems exist in relation to software. Universities have benefited greatly from the centrally negotiated CHEST deals which have made a wide range of advanced commercial and educational software available at preferential prices through computer centres. Usually, it is this discounted software which staff and students are encouraged by their institutions to use and for which training and support are most widely available. However, the standard CHEST licence terms specifically restrict the use of software to the site for which it was purchased and this increases the likelihood that staff will use different software on their home machines from that which they have access to on campus, giving rise in the process to a variety of potential compatibility and conversion problems. The traditional, informal solution to this incompatibility problem, that of software piracy, will become less easy to exploit as more sophisticated software protection is employed by publishers and stricter policing of software licences begins to bite. 
For both staff and students, the issue of compatibility is not just a matter of the degree of convergence of platforms and operating systems that has occurred since the first generation of personal computers became widely available in Britain at the beginning of the 1980s. The fact is that the systems provided by educational institutions now invariably involve networks and other functionalities and software applications that cannot necessarily be acquired economically for use on personal equipment, often frustrating attempts, even by those able and willing to spend money, to acquire a hardware/software combination that effectively complements those available at their university.

Support

Problems of compatibility are related to the more general problem of support for users of IT. There was, of course, early recognition by both national agencies, including the Computer Board and its successors, and individual institutions, that the provision of effective support and advice infrastructures was a key element in encouraging the take-up and effective use of IT by staff and students in British higher education. As a consequence, a wide variety of top-down and bottom-up support mechanisms have developed in parallel. The strategically orientated focus of the discipline-based CTI Centres has tended in a fairly rough-and-ready way to complement the course-specific roles of departmental computer staff and faculty advisers, with university-wide CAL groups, computer learning units and computing centre support staff operating in the (often unmapped) territory between them. While a more or less continuous pressure on staffing levels has meant that many individuals and groups in advisory functions find it extremely difficult to provide adequate support for their user communities, an additional problem for users is finding their way around the often complex sets of support mechanisms which are provided.
When operational problems arise, both staff and students can be faced with what appears to be a bewildering set of alternative sources of help.
With the increasing use of complex software in networked environments, it is frequently not clear to the user where the real origin of their problem lies. Nor will it necessarily be instantly obvious to those support staff nearest to hand. Even when relatively clear official divisions of labour and responsibility exist, many staff and students will still have to contend with what for them seem to be interminable delays and experience the frustration of being “bounced around” from one source of support to another while the different facets of their problem begin to unravel. Such memories are not erased when the fault is satisfactorily resolved, and they can undermine confidence in both the technology and their use of it. An enthusiast’s response to this kind of situation might be to say that comparable difficulties can arise in staff or student use of other services, for example libraries, where support services are also increasingly specialized and differentiated. However, the problems that arise in the case of IT are much more likely to be beyond the comprehension of the user and more disconcerting as a result. It is surely not irrational to fear that an increasing dependency on technologies which one neither understands nor can control when things go wrong may be unnecessarily risky when still adequate, conventional alternatives exist.

Concluding remarks

This chapter has been written by IT enthusiasts—not by Luddites or people raised in a pre-IT culture with an attitude problem. We know that IT is good for us. What we felt we could not leave unanswered was the charge that academics who are unenthusiastic or sceptical about IT should be regarded as irrational and that, by failing to set their students a good example, they are coming close to being guilty of professional negligence. We also reject the notion that students should be seen as little more than the hapless victims of their teachers in relation to IT.
What we have sought to argue is that there are a good many sound reasons why staff and students may be wary about the claims that some of us make for IT. We wish to emphasize in conclusion that there are general problems relating to the new technologies and their deployment which, though they will manifest themselves in different guises, are unlikely ever to go away. The kinds of problem we have tried to identify above do not permit technological solutions. On the contrary, although each major advance in hardware and software may provide us with the means of becoming more productive, it will also create new versions of old problems for us to resolve. Though it is entirely appropriate for us to celebrate the remarkable developments made in IT in recent decades, we should not assume that there is a golden age just around the corner that we can enter if only we can change the attitudes and ethos of our more recalcitrant colleagues. We also want to emphasize that, while there is already a wide recognition of the nature of many of these problems, there is less general awareness of the degree to which IT developments are embedded in the wider social and political context within which higher education is now operating. The nature of this setting needs to be clearly understood by everybody involved in IT—not least because it is changing at a rate which is just as unprecedented as that which characterizes the world of computer technologies.

References

W.G.Bennis, K.D.Benne, R.Chin (eds), The planning of change, 2nd edn (New York: Holt Rinehart & Winston, 1970).
W.R.Bion, Experiences in groups (London: Tavistock Publications, 1961).
S.Brackenbury & R.Hague, “Converting the sinners: seamless, cost-effective use of the new technologies in teaching”, SocInfo journal, 1, December 1995, pp. 3–11.
L.Coch & J.R.P.French, “Overcoming resistance to change”, Human Relations 1, 1, 1948, pp. 512–32.
M.L.Davies & D.E.A.Crowther, “The benefits of using multimedia in higher education: myths and realities”, Active Learning, 3, 1995, pp. 3–5.
J.R.Feagin & H.Hahn, Ghetto revolts (New York: Macmillan, 1973).
M.Fishbein & I.Ajzen, “Attitudes towards objects as predictors of single and multiple behavioural criteria”, Psychological Review 81, 1974, pp. 59–74.
E.J.Hobsbawm, Primitive rebels: studies in archaic forms of social movement in the 19th and 20th centuries (Manchester: Manchester University Press, 1959).
R.T.LaPiere, “Attitudes v actions”, Social Forces 13, 1934, pp. 230–37.
D.Lawrence, Black migrants, white natives (Cambridge: Cambridge University Press, 1974; reprinted: Gregg, 1992).
K.Lewin, Field theory in social science (New York: Harper, 1951).
W.R.McDonough, Report of the working party on the provision of computing facilities for teaching (Inter-University Committee on Computing, 1991).
E.Mayo, The social problems of an industrial civilisation (London: Routledge, 1949).
E.Mumford, Living with a computer (London: Institute of Personnel Management, 1964).
E.Mumford, “Clerks and computers”, Journal of Management Studies 2, 1965, pp. 138–52.
D.Nelson, Report of a working party on computer facilities for teaching in universities (Computer Board for Universities and Research Councils, 1983).
J.D.Y.Peel, “Understanding alien belief systems”, British Journal of Sociology xx 1, 1969, pp. 69–84.
R.Quinault & J.Stevenson, Popular protest and public order: six studies in British history 1790–1920 (London: Allen & Unwin, 1974).
T.A.Reid, “Perspectives on computers in education: the promise, the pain, the prospect”, Active Learning, 1, December 1994, pp. 4–10.
A.K.Rice, The enterprise and its environment: a system theory of management organisation (London: Tavistock Publications, 1963).
M.Rose, Industrial behaviour: theoretical developments since Taylor (Harmondsworth: Penguin Books, 1978).
J.Sabini, Social psychology (London: Norton, 1992).
J.Stevenson, Popular disturbance in England 1700–1870 (London: Longman, 1979).
C.Tilly, “Collective violence in European perspective” in Violence in America: historical and comparative perspectives, H.D.Graham & T.R.Gurr (eds) (New York: Bantam, 1969).
E.L.Trist, G.W.Higgin, H.Murray, A.B.Pollack, Organisational choice (London: Tavistock Publications, 1963).
G.Watson, “Resistance to change” in The planning of change, W.G.Bennis, K.D.Benne, R.Chin (eds) (New York: Holt Rinehart & Winston, 1970).
D.T.Wellman, Portraits of white racism (Cambridge: Cambridge University Press, 1977).
A.W.Wicker, “Attitudes v actions: the relationship of verbal and overt behavioural responses to attitude objects”, Journal of Social Issues 2, 1969, pp. 541–78.
P.Willis, Learning to labour (Farnborough: Saxon House, 1977).
Chapter 12 STANDARDS FOR THE NON-STANDARD: THE IMPACT OF NEW TECHNOLOGY ON THE NON-STANDARD STUDENT Ann Wilkinson
Traditional methods of teaching are currently being revised to take into account more innovative methods of teaching and learning. This provides a useful opportunity for educators to look at the different teaching and learning approaches and consider their effectiveness. As a co-ordinator of the CTI Human Services (one of the 24 national centres, which has a mission to maintain and enhance the quality of learning and to increase the effectiveness of teaching through the application of appropriate learning technologies), the author of this paper addresses how technology can be used to support teaching and learning by the “non-standard” student.

Introduction

Now that technology is developing more rapidly than ever before and is being incorporated into all spheres of teaching, this is a good time to pause and ask if it is good innovation for everyone. Traditional methods of teaching, often referred to as “chalk and talk”, are known to disadvantage a significant number of students and teachers, for example those who are deaf, have specific learning difficulties such as dyslexia, or are not native English speakers. These methods also assume that the teacher imparts information to the receptive student with little need for active dialogue. Software developers, managers and educators all need to consider whether they are creating a learning environment that is suitable for an increasingly diverse student population.

A few definitions

For the purposes of this paper “educational technology” is used in its widest sense and refers to the use of computers as a tool for producing assignments (e.g. word processing, spreadsheets and databases); the use of a variety of presentation media, assessment tools and CAL.
The term “non-standard” student is proposed to include students who have not recently experienced traditional mainstream education in the UK: as a result, they tend to have been educated in a country outside the UK, not to be native English speakers, or to be mature, disabled and/or working or studying part-time. In addition, this diverse group tends to be technophobic and to have had little access to technology in education. These students, however, exemplify the diversity which is now being encouraged in higher education and are what Ehrmann (1993) categorizes as the “new majority”. Teaching staff are equally diverse and the majority lack the time, opportunity or experience to experiment with technology in general and, more specifically, in relation to their teaching. Evidence from CTI workshops supports this. Even teachers who can manage their own PCs and review materials are sometimes reluctant to experiment with teaching using technology.
Teaching and learning environments

Many teachers are beginning to use educational technology to augment some part of their teaching and/or assessment procedures. In addition, teachers are beginning to expect students to have or to gain competence in basic IT skills, for example by being able to submit word-processed work assignments. In some departments, CAL is well integrated into the curriculum, whereas in others it hovers on the edges as staff move tentatively to introduce change. The very nature of the change, which requires staff to become involved in using educational technology, demands a revolution in teaching methods which places pressure on individual members of staff. This may well be more acute in the social sciences and humanities areas where the use of technology has not traditionally been prominent. Consequently, although lecturers may have developed skills in handling statistical packages and using word processing programs and spreadsheets, many have not changed their mode of delivering information to students. The reasons for this are complex and ultimately require a fundamental review of entire courses as well as the mode, the means of delivery and assessment. Using bespoke software also requires more work. For example, lecturers evaluating the Interpersonal Skills in Social Work for ProCare1 described writing supplementary questions for the program in order to customize it for their own needs. In addition, lecturers felt that they needed to be more than “one step ahead” of students and therefore needed to understand completely both how the program is structured and how the student should approach the required tasks. In more sophisticated packages, lecturers are able to add their own material or make decisions about which paths the student must take in order to complete the program effectively. However, this involves a lot of work and lecturers still may experience difficulty in helping the students to move through more complex packages.
Consequently, lecturers have to develop more guidance and support systems. Some lecturers committed to using educational technology feel that programs rarely contain suitable or relevant material and therefore attempt to develop their own. Unfortunately, this is very time- and resource-consuming and requires the acquisition of skills which are probably unrealistic for most lecturers. Even the task of converting lecture notes into presentation material involves a degree of re-skilling.

The role of educational technology

Currently there is little conclusive research available to innovators about how students use educational technology. There is a growing body of research on generic studies in educational technology as well as on different cognitive teaching and learning styles which are well summarized in Laurillard (1993). In addition, a number of articles recently have begun to consider how institutions can make their courses accessible to a diverse body of students (Ehrmann 1993, Henry & Rafferty 1995). Some of the studies in educational technology have focused on the fact that students (and teachers) vary from those who require very structured material to those who prefer to have little structure and develop their own method of using resources to obtain information (Clarke 1993, Kwok & Jones 1995). This suggests that resources can be used in a variety of ways and lecturers should embrace the variety of methods available. In particular, technology should be used as an effective tool to make possible deep and rich teaching and learning environments. Lecturers also need to consider their own style of communicating when deciding which media to use (Armstrong 1991).
1. ProCare Courseware has been developed by a partnership of Bournemouth and Southampton Universities with the involvement of the Open Learning Foundation.
Teaching with technology

The basic IT tools such as word processors, spreadsheets, databases and statistical packages that are now so widely available in all discipline areas of higher education require both lecturers and students to develop a set of transferable skills. However, there must be some recognition that these skills are not always acquired and this has implications for both lecturers and students. For example, if students are being asked to present word-processed material, will they be compared favourably against students who have submitted handwritten material even though the content of the latter may be better?

Presentation materials

Both teachers and students are beginning to experiment with producing lectures or seminar material on presentation programs such as PowerPoint. These may be used to produce professional-looking acetates for projection via an overhead projector. Reports from delegates at CTI workshops recount incidents of how students and teachers who are presenting seminars using hand-drawn acetates feel deskilled when they follow a technically able person who has made a more glossy presentation using a computer. The use of technology may lead to a perception that printed acetates are more authoritative and therefore more valid.

Assessment

One of the fastest developments of computer use has been in using software to assist in the assessment process. It is important to know what is being tested: short-term memory, understanding of the subject, ability to use the technology or some other unforeseen variable. Some of the advantages of these programs are that they create databanks of questions which can either be randomly assigned or selected to cover a particular piece of learning.

Modelling and simulation

In some disciplines computers are used to model situations from database material, for example, GIS and population projections.
There are suggestions from teachers that some of the more sophisticated programs create problems for “non-standard” students who have not been taught some of the basic concepts in previous education, e.g. the laws of physics, basic algebra or rules of grammar. Simulation programs are usually graphical and are controlled by a GUI. However, users who are blind, partially sighted or colour blind may not find these programs easy to use. In the biological and chemical sciences there has been a great increase in software for use in dissection practicals and demonstration of crystal structures, but this has led to problems for some staff and students. For example, graphics need to have a brief description in text to enable blind students to have a basic understanding of the illustration (NCET 1993), and users who are colour blind should be able to change the colour palette on programs. In addition, speech synthesizer programs, which work well for text, are wholly inadequate for graphics and this needs to be taken into consideration for certain users.

Computer-based learning programs

Many institutions are incorporating programs which teach skills which can be learnt only by repetitive work or drill and practice. These are often frustrating to teach but can be handled well by programs which
introduce material in small batches with continuous assessments. In the teaching of aspects of sociology and politics, for example, a number of basic concepts can be learned before the student can do more advanced work. Students often require time and a number of repetitions to learn this material. However, a few pedagogical and practical issues arise. Does the software allow for different learning styles? If courses are taught using computers, are there enough computers available in workstation areas with open access? Are the programs usable by students who do not have good vision? Some of the most innovatory teaching material is that which involves the use of courseware. In the social work field this would include such programs as Social Work and Information Technology and TLTP materials in the ProCare Consortium which have been developed at the Centre for Human Service Technology.2 These programs have a degree of interactivity and require students to handle GUI interfaces confidently. Such programs include graphics and, in some cases, video. Students can access notebooks, annotate material and from some programs call other software for statistical or database use. The best way of delivering these programs is via CD-ROM as this format can handle large quantities of data, graphics and video. On many campuses CD-ROMs are available through the library network.

Teaching with the Internet

A growing number of courses are being taught via the Internet. This takes different forms, including general distance learning modules, computer-mediated conferencing, “chat line” tutorials, searches of remote databases, downloading and transferring files, e-mail and fax and a number of other computer-mediated forms of communication. These developments offer advantages to the “non-standard” students. The problems that exist in this type of delivery are access, infrastructure and interface design. A case that I can recall highlights this point.
At one university, a mature student who was both deaf and using a wheelchair was handed a disk at the start of his course with the outlines and reading list from the Internet. However, he had no computer skills and no computer. He tried to obtain assistance from computing services but the building had very poor physical access and he had to wait to be admitted. The public workstation areas were arranged in such a way as to prevent a wheelchair user reaching any but the end machines in a row. If these were occupied he had to queue until they were vacated. Some months later through a charitable trust he obtained a computer which was placed in his hall of residence. The computer arrived without any software and he did not have the resources to purchase mainstream programs or get training in their use. Eventually, he did get help from his computing services adviser but he was disadvantaged in the early stages of his course by the department’s failure to ensure that all students had access to the resources to use the materials provided. As Yeomans (1995) points out, the promotion of learning on-line raises many issues about the delivery of learning and the role of the university and its relationship with students. Among his concerns were the gender bias found on electronic campuses and the drive towards increased competitiveness between institutions rather than co-operation.

Access

It is difficult to visualize a whole cohort of students purchasing multimedia machines to use at home3 with their grants or loans, even if the software licence allowed them to have copies, and yet at a recent CTI workshop teachers were reporting that a small but growing number of students were obtaining loans to buy computers. Those who are unlikely to have resources to provide personal machines would be the “non-standard” students for whom money and access are major issues. Although in some universities resource centres are open 24 hours a day, this is not yet universal. Students who are studying and working part-time and those who live off campus have restricted access. For overseas students, there are additional concerns. Some of these students have limited budgets and access only to basic levels of technology in their home countries. I recall a student on one postgraduate course who collected data in this country and used SPSS for Windows to analyse the statistics. The student returned to Africa to write up the thesis and experienced difficulty in manipulating the data as the only platform available to her was a PC AT. An additional journey to this country had to be made to resolve the problem by converting the data to a lower-level format. Students with medical conditions which are exacerbated by the over-use of computers also need to be considered. The CTI Centre at Southampton recently received an enquiry about a student who has tenosynovitis (strain of the tendons of the arm, probably caused by repetitive actions such as typing). He needs to use a computer in the course of a maths degree to do calculations. He is about to return to study after time out but is still trying to find a way of inputting to the computer and managing daily living which does not exacerbate his condition. Other conditions which may result in students presenting with problems are epilepsy, back problems and ME (myalgic encephalomyelitis). All of these may vary in their severity and impact and may recur intermittently. The need for good preliminary skills training to prepare all students to use technology has become evident, but there is a continuing argument about who is responsible: the lecturer, the student or the support services.

2. Social Work and IT is a module developed by the Centre for Human Service Technology and the University of Wales Cardiff with support from CCETSW.
Students who enter institutions needing help with study skills and support with access to learning materials, whether because of disability or a different educational background, should have access to support services. Unfortunately, these are not adequate in many universities.

Conclusion

Universities are being motivated to change by many factors, among which are budgetary changes, competition, increasing numbers of students and increased demands on staff. Teaching and learning have to encompass a number of different approaches and the aim of CTI is to help teaching staff to introduce technology into their teaching in an appropriate manner. More systematic research is required to find out why some students and teachers take enthusiastically to information and education technology yet others are deterred. This paper has referred to a number of examples collected in a brief period from teachers and professionals in a number of settings. Material was not difficult to find and this, together with the issues outlined above, suggests that a properly conducted piece of research would be of value to developers, teachers and students.

References

H.Armstrong, Developing training skills (London: Longman, 1991).
J.A.Clarke, “Cognitive style and computer-assisted learning: problems and a possible solution”, ALT-J 1, 1, 1993, pp. 47–55.
3. The recent Dearing report recommends that students should have access to good portable multimedia notebooks. Recommendation 13.43–13.50 in the “Report of the National Committee of Inquiry into Higher Education (The Dearing Committee)”, 1997. For details refer to: http://www.leeds.ac.uk/educol/ncihe/
STANDARDS FOR THE NON-STANDARD
S. Ehrmann, “US higher education in 1998: how it might use technologies for undergraduate education”, The CTISS File 16, December 1993, pp. 8–17.
S. Ehrmann, “The bad option and the good option”, Active Learning 2, July 1995.
M. Henry & J. Rafferty, “Equality and CAL in Higher Education”, Journal of Computer Assisted Learning 11, 1995, pp. 72–8.
M. Kwok & C. Jones, “Catering for different learning styles”, ALT-J 3, 1, 1995, pp. 5–11.
D. Laurillard, Rethinking university teaching (London: Routledge, 1993).
National Council for Educational Technology, CD-ROM: a matter of access (NCET, 1993).
K. Yeomans, “Higher education on the information superhighway: negotiating on a ramp”, Active Learning 2, July 1995, pp. 19–21.
Section Four THE EFFECTIVENESS OF THE NEW TECHNOLOGIES IN TEACHING AND LEARNING ENVIRONMENTS
Chapter 13 INFORMATION TECHNOLOGY AND TEACHING QUALITY ASSESSMENT: REFLECTIONS OF A SOCIOLOGIST Chris Turner
This chapter is an exploration of both current and potential uses of IT in the learning and teaching of sociology in higher education in Britain. The ideas and arguments presented have been refined as a result of the author’s recent experience of teaching quality assessment (TQA) in Scotland. The arguments and evidence are restricted to undergraduate learning. The primary questions underpinning the analysis are:
1. To what extent do universities have strategies for ensuring that students have opportunities, encouragement and support to use everyday IT skills as an integral part of their own learning process?
2. To what extent are existing strategies contributing to the development of effective IT skills among sociology students?
3. What are the general policy implications?

Background

The huge potential of IT for transforming higher education is widely acknowledged and proclaimed, but there are extremely divergent ideas about how the “great IT transformation” can be achieved. A number of well-recognized factors inhibit long-term strategic planning and decision-making. The extremely rapid pace of technological development and associated reductions in IT unit costs make prediction and planning an extremely hazardous task for relatively small-scale investors such as British universities. The well-publicized problems of the London Stock Exchange and the NHS in achieving satisfactory outcomes for large-scale management applications, together with British universities’ similar but smaller-scale experience of the gap between promise and delivery on management and administrative computing initiatives, highlight the potential risk associated with ambitious innovative systems development projects. The net effect of rapid technological development and chronic under-funding of the British university sector has been to produce a piecemeal, incrementalist approach to IT development in universities. There have been both major and obvious benefits, and instances of waste and ineffective management.
A great deal has been learned about what not to do. One of the key arguments of this chapter is that, by the mid-1990s, the development and costs of IT hardware have reached a stage at which the investment risks to universities of committing themselves to forceful IT strategies are strongly outweighed by the potential advantages, especially to students, of well-planned developments. A second set of IT issues in British universities arises from the structure and organization of university management. Traditional distinctions are drawn between central university management and administrative functions, and academic applications of IT. Within the academic sphere, a further significant distinction is made between research and teaching applications. Most universities have either deliberately chosen, or
accepted by lack of decision, distinctions between different types of IT use as a basis for both capital and recurrent funding. The argument advanced in this chapter is that IT uses in research and certain aspects of management and administrative computing are crucial for the development of the most effective student learning strategies.

Teaching Quality Assessment

TQA in Scotland is broadly based. The major foci for all disciplinary evaluations ensure that attention is given to:
• each university’s mission statement, the broad range of aims and objectives associated with it, and patterns of quality control
• the key resources upon which learning and teaching depend
• the design, organization and modification of curricula
• learning and teaching practices, forms of assessment and monitoring of student learning, and the quality of learning as indicated by the work produced
• patterns of academic and non-academic support for students
• current learning outcomes and their implications for the future lives and careers of students.
An important aspect of TQA is the relative emphasis placed on learning as opposed to teaching. This varies considerably by discipline. For sociology, the dominant emphasis is on learning processes and outcomes. This is not to deny the importance of explicit curriculum design, of a variety of formal methods of providing teaching inputs, and academic guidance or support from staff when students encounter difficulties in their learning. Both personal experience of diverse aspects of social life and theoretical understanding of how what currently passes for knowledge is constructed and used are particularly relevant for students of sociology, compared with many other disciplines. This has significant implications for the role of IT in the education of sociologists.
For sociology students, practical development of IT skills, the use-value of IT in sociological analysis, and a theoretical understanding of how IT has developed and of prospects for future development and their impact on global society are all relevant.

The incorporation of information technology into the learning strategies of sociology students

It is useful, analytically, to distinguish three broad types of strategy for IT use by sociology students: casual incorporation of IT; IT as an aspect of disciplinary teaching; and IT as an integral element of effective learning. “Casual incorporation” denotes situations in which use of IT by students in their study of sociology is a by-product. For example, it may be a spillover effect from a specific student’s prior experience and development of IT skills in the home, school, college and/or workplace. Alternatively the spillover may be from staff research—a lecturer may encourage a selected student or student group to use resources which he/she has developed in his/her research for a project or dissertation. Alternatively a university may seek to stimulate use of IT by organizing specialist voluntary workshops or courses to introduce students to generic IT skills (word-processing; use of spreadsheets and databases; production of tables and charts in association with text; development of computer-based presentations for tutorials, seminars, workshops and job interviews). The hallmark of “casual incorporation” is that IT use by sociology students is voluntary. This type of
strategy allows some students to avoid IT use altogether while also enabling others to develop varying levels of knowledge and skills, including extremely sophisticated use in sociology. Formal disciplinary teaching of IT in sociology, when not incorporated into a broader strategy for use of IT as an element of effective learning, has a strong tendency to focus on the application of specific IT skills. The link between IT skills and research methods is emphasized because of the obvious benefits of harnessing the precision and processing power of desktop computing to the organization and analysis of sociological data. The most common elements are: 1. The introduction of word processing accompanied by a requirement that the skills are demonstrated in assessed coursework, and 2. Teaching batches of students to use statistical packages, especially SPSS. This is often extended into project work requiring secondary analysis of data sets from staff research, the Economic and Social Research Council data archive, and/or the Office of Population Census and Surveys. Less common at undergraduate level are: 1. The introduction of students to the use of software packages for the analysis of qualitative data and 2. The use and/or development of simulation models to analyze the effects of selected sociological ideas such as the basic principles of population growth, or of decision-making in organizations. This type of strategy ensures that students (usually honours students) are required to develop limited IT skills and an awareness of certain specialist applications of IT for the production of data sets and their analysis. It serves as an invitation to students to take responsibility for the use of IT in their own work, and to make informed choices about the use of IT in their own subsequent sociological education. IT as an integral element of effective learning for students in higher education involves the drawing together of many different strands of learning. 
This type of strategy must cater both for students without any previous IT background or experience, and for students building on prior but diverse knowledge, skills and experience of IT. The total experience of IT use, whether for administrative purposes, accessing library and information service resources, or the production of academic work, is offered in such a way as to be mutually reinforcing. The essence of this type of approach is the integration of learning across the different strands from a student perspective, in contrast to parallel and piecemeal learning across the various strands. The range of potential uses covers a complex variety of elements which can be approached in a parallel or integrated manner, or some combination of the two. In practical terms, the following list indicates potentially significant purposes of student IT use, but is by no means exhaustive:
• access to university degree regulations, programme requirements, and/or course unit information
• access to student bulletins, handbooks and information on both academic and non-academic student support services
• access to university calendars and timetables
• direct university registration
• access to library catalogues
• access to information from publishers
• access to abstracts of articles, monographs and other relevant publications
• access to electronic publications (including editions specifically reproduced in electronic form)
• interactive use for communications with staff and other students
• interactive use for ongoing academic projects
• production of individual presentations for tutorials, seminars and workshops
• direct submission and logging of coursework
• receiving individual comments on coursework submitted
• accessing transcripts of grades achieved and progress reports
• preparation and updating of a student’s curriculum vitae and portfolio of evidence of academic achievements and transferable skills
• accessing the full range of Internet facilities, and contributing to Internet information and debates.

Assessing current strategies and patterns of provision in Scotland

Some higher education institutions in Scotland offering sociology have not proceeded beyond the stage of casual incorporation. None of these institutions made sociology submissions for TQA. The seven institutions which submitted self-assessments for the sociology TQA ranged from one which was in process of shifting from a casual incorporation to a formal disciplinary teaching strategy to one which was developing towards an integrated learning strategy. In all seven institutions there was evidence of significant developments from session 1994–5 to session 1995–6. In each institution there were experts among the social science staff with both clear visions about the potential of IT in the lives of sociology students, and the requisite technical expertise to initiate and implement significant developments from the existing base. In all seven institutions there were major resource allocation dilemmas at institutional and social science/sociology levels. The effects of a piecemeal, predominantly finance-led, approach to IT strategies and the associated pursuit of parallel developments in different university sections have militated against an integrated learning approach. A number of practical illustrations can be given.
There were two or more cases of each of the following:
• dedicated sociology/social science IT facilities without direct links to library/information services for sociology undergraduates to use
• e-mail facilities not available as a matter of course to sociology undergraduates
• over-concentration on the use of dedicated sociology/social science IT facilities for quantitative analysis
• Internet facilities not available as a matter of course to sociology undergraduates
• sociology/social science IT facilities not introduced into the formal teaching programme before third year honours
• independent choice of hardware and associated software by different subjects in a university, producing a situation in which students could be required to use two or three different IT systems for their academic studies on one undergraduate programme.
These examples indicate the lack of a thoroughgoing student-centred approach to the planning and provision of IT facilities geared to student learning. The argument advanced here is that the combination of finance-led and staff-centred IT development in universities, coupled with the extremely rapid rate of technological development, has militated against the adoption of a thoroughgoing student-centred IT strategy. It is possible to draw together and summarize the conditions which sociology students with very varied experiences of IT in higher education believe would encourage effective IT use. The major elements are:
• short, clear, jargon-free summaries for prospective students about university IT facilities and expectations about their academic use
• written advice to prospective students to help them prepare effectively for use of university IT facilities
• flexible modular IT workshops, available to all in the first term/semester, tackling introductions to basic information and skills, and incorporating an appropriate range of subject-specific “exercises”
• a graduated introduction to basic skills and academic applications, with opportunities for students to control the pace of learning and to gain a positive sense of IT benefits for their own studies
• user-friendly formal support systems both for the development and updating of generic skills, and for appropriate assistance with subject-specific applications
• continuity and mutual reinforcement of IT uses in different locations by a degree of standardization of supported equipment and software
• integration of library/information services access and subject specialist applications
• sustained patterns of IT use throughout the undergraduate programme for a wide variety of academic purposes
• open access to IT facilities at times and in locations to suit the diverse personal scheduling needs of students
• the availability of excellent detailed examples of effective IT use in the discipline/s being studied, in which the various steps of the research process are highlighted
• positive encouragement for IT use from all tutors in a subject, and willingness of hard-pressed tutors to communicate via IT networks
• strong networks of student interaction and mutual support in the use of IT.
These conditions emphasize (i) sensitive, thorough and supportive introductions to IT use in higher education; (ii) the development of IT skills so that IT use becomes a taken-for-granted aspect of study; (iii) the significance of strong staff encouragement and support; and (iv) the importance of students achieving a sense of both short-term and longer-term benefits of IT use. The highest quality of student IT use was associated with final year dissertation/project supervision and small group honours-level learning and teaching. A few students used Internet sources—ranging from the latest government reports and statistics, information from interest and pressure groups including qualitative experiences of their participants, through to topical interactive debates on current policy issues—extremely well. In a few instances, students engaged on group projects used IT facilities efficiently both to sustain group dialogue and debate, and to deliver their final project report. Some individual dissertations/projects demonstrated excellent use of a range of IT skills—the effective integration of text, tables, charts, diagrams and other visual materials into overall presentations. The indications were that nearly all the students achieving this level of success had taken firm responsibility for developing IT use as an integral part of their own learning, usually encouraged by staff with specific research interests in the substantive topics under investigation. Another area in which extremely good exemplars of effective IT use were found was that of learning support. A number of institutions were in process of making available computer-based systems dealing with study and transferable skills—i.e. for student self-assessment, diagnosis and exploration of problem areas, and advice on constructive steps towards improvement of oral presentations and written submissions.
In addition, recent Scottish Higher Education Funding Council Disability Initiatives had stimulated institutional provision of specialist IT facilities for students with special academic needs. Developments such as use of personal profiling or voice synthesizer software for students with visual impairments, dyslexia and associated conditions were particularly impressive.
Paradoxically, for mass teaching and learning in sociology (using as a crude indicator courses designed to achieve economies of scale in teaching inputs for classes with rolls of more than 100 students), where substantial benefits might have been expected, IT use was much less effectively developed. There appeared to be two main reasons for this. The first was lack of investment (personal and institutional) in equipment on a sufficient scale to permit sustained IT use to be required. The second was the relatively high human resource costs. Staff emphasized the extremely heavy investment in time and energy necessary to develop and pilot stimulating and effective subject-based applications for large classes. Students reported that their greatest need for sustained support was in the early stages of learning, when they faced the twin challenges of rapid IT skills development and introduction to subject-based applications. Staff encouragement and support were seen as particularly important in “getting started” and “gaining confidence in application of IT skills”. Thereafter, as students achieved a basic competence in IT applications and were able to take greater responsibility for IT use in their own learning, mutual support and collaborative working were generally recognized as having a crucial part to play, given that staff were available to offer guidance and support when technical or academic problems were encountered. An associated weakness became apparent from discussions with students. Little guidance was offered on IT use for the taking and organization of academic notes. Even in institutions where there was easy access to IT facilities, students developed personal styles of note-taking through often lengthy trial-and-error processes.

Summary and conclusions

IT use in sociology has both advantages and disadvantages.
A major argument advanced for harnessing IT to learning and teaching in sociology and other disciplines is expressed in terms of key transferable skills. The essence of this argument is that, given the rate of growth of IT applications in modern society, current graduates will need IT knowledge and skills both for career advancement and for meaningful participation in everyday life. The general proposition causes little dissent, but there are considerable differences about where, when and how generic skills and competences as opposed to specialized subject IT applications should be incorporated into the curriculum. The development of general library, information scanning and information retrieval skills is seen as of general relevance for academic subjects. It is widely agreed that the “information explosion” accompanying the development of electronic databases and publications poses a definite dilemma for sociologists, and requires increasing sophistication in knowledge and use of search techniques. The advantages of IT uses in this field are generally acknowledged to outweigh any disadvantages, whether in sociology or other disciplines. Certain specialist applications in sociology are identified by sociology staff as of great importance. Two areas on which there was strong agreement are the construction of large data sets, and the use of software applications for quantitative and qualitative analyses. The advantages for high volume data handling and for speed and power of analysis are widely acknowledged. Major advantages accrue when the same IT facilities, wherever they are located, can be used for accessing library services, communicating with staff (especially in large classes), note-taking and the production of scholarly work. At present, sociology students in several universities either experience problems of regular access to equipment and appropriate levels of support, or have access to equipment in different locations for different purposes. 
The scope for significant improvements is clear. Some staff have strong reservations about encouraging unlimited IT use by students because of the issue of plagiarism. There is little doubt that the scope for various forms of plagiarism has been greatly widened by the combined effects of the decreasing examination component in formal academic
assessment, the advent of mass teaching and learning in introductory sociology in higher education, and the increasing ease of electronic reproduction and communication. The evidence, however, on the extent of plagiarism is mainly anecdotal. It should also be noted that there is considerable scope for developing effective electronic means for detecting plagiarism, should it emerge as a major problem. Staff expressed serious concern about the declining unit of resource per student in higher education generally. It is by no means accepted that additional IT investment is in the highest priority category. Indeed, the argument that the effective incorporation of IT use into student learning strategies requires additional recurrent resources needs to be fully evaluated. No doubt there are major benefits to be gained by introducing students to specialized sociological applications of IT, but it is relatively inefficient to do this by means of stand-alone provision. The advantages gained by sociology honours students who are fully conversant with, and regular users of, IT facilities for multiple purposes are impressive, but the relative advantage to them would be lessened by universal incorporation of IT use as part of an integrated learning strategy in higher education. The development of IT use in sociology in Scottish higher education is currently at a critical stage. There are substantial but diverse achievements, some extremely promising pilot/demonstration projects in process, and scope for important new developments in learning, teaching and efficient communication. However, if universities are to commit themselves fully to an integrated IT learning strategy, there are a number of important steps to be taken. 
Medium-term university-wide planning, resource allocation and associated patterns of staff development are required along the following lines:
• Each institution would need to determine the most appropriate division of labour between computing services, library and academic staff, and the associated patterns of quality control.
• Staff development and training programmes would need to be geared to enabling staff to work cooperatively, given the selected pattern of division of labour, to ensure that students receive the necessary training, encouragement and support for regular and productive IT use at all levels.
• Each university would need to assess what essential additional human and/or material resources are required and make the necessary investments.
A systematic, well-planned approach along these lines would provide the basis on which an integrated learning strategy could be designed and implemented. There are important implications for staff development. Perhaps the greatest difficulty is likely to be encountered in developing a team approach in first year undergraduate programmes. The very flexibility of choice which permits students in most Scottish universities to select a mix of different disciplines, often across faculty boundaries, means that collaboration across different disciplines is likely to be complicated. Ideally the introduction of students to effective integrated IT use in higher education would be based upon a sophisticated team approach involving computing services staff, library staff and academic staff/subject-specific IT staff in relevant disciplines. Whatever the potential difficulties and problems, it is important to acknowledge that rapid and effective developments in varied types of IT use by sociology students have taken place in Scottish universities during the present decade. It would appear that the current impetus can be sustained, but not without additional dedicated funding.
IT developments in teaching and learning tend to make heavy demands on staff time, and the major benefits from piloting and introducing new systems of working often take years rather than months to reach a significant payoff. There needs to be some explicit recognition in terms of
advancement and promotion for staff who successfully and innovatively contribute to major advances in the use of IT in student learning. The major policy arguments which merit consideration are:
• The most effective IT strategy is likely to be one which develops IT as an integral element of effective learning.
• The implementation of this type of strategy requires a common framework of institutional level planning and management, and, at the same time, adequate scope for subject level specialist applications.
• For the foreseeable future, an integrated approach to the development of IT knowledge, skills and applications in sociology has far more to offer in terms of enhancement of the quality of student learning than in terms of economies of scale for learning and teaching in sociology.
• The most challenging area for development is in the field of mass learning and teaching.

References

Scottish Higher Education Funding Council, Quality assessors’ handbook, 1995–6 edn (Edinburgh: SHEFC, 1995).
Scottish Higher Education Funding Council, Reports of a quality assessment in sociology (Edinburgh: SHEFC, 1996).
Chapter 14 WHY COSTS ARE IMPORTANT IN THE ADOPTION AND ASSESSMENT OF NEW EDUCATIONAL TECHNOLOGIES David Newlands, Alasdair McLean and Fraser Lovie
Social scientists in higher education are users of new technologies but, in that respect, they are no different from other academics. Where they can make a distinctive contribution is in drawing upon their expertise as social scientists to advise on which technologies should be adopted and how precisely they should be used. This chapter seeks to demonstrate that an economic approach can help in making such decisions.

Introduction

Bates (1982) has identified a number of factors which are relevant to the selection of an educational technology or system: the accessibility of the medium, whether it is widely available or can at least be provided cheaply; convenience and ease of use by students, without undue additional training; academic control over the design and preparation of materials, again without undue additional training; the “human touch”, making possible relatively natural communication between learners and teachers; cost; and what is available. The last factor—what is available—is frequently a dominant consideration, suggesting that much educational innovation is technology driven. There are legitimate arguments as to why the choice of educational media should be influenced by the pattern of technological development (Pisapia 1994). For example, given that computers are now pervasive throughout society and that people with good computer skills have better employment prospects, many universities view it as important that their students develop confidence and competence in the use of computers. Nevertheless, it is argued here that new educational technologies should not be introduced without firm evidence of their academic and economic effectiveness. An economic approach need not mean a parsimonious concentration on costs. Perhaps surprisingly, an economic approach can also encourage much more attention to be paid to the educational benefits of new media, including the learning achievements and experiences of students.
The chapter is in three main sections. The next section introduces a framework for identifying and analyzing costs. It is then shown how this framework can be employed to inform decision-making about the use of new technologies. The final main section reviews some of the existing studies of the costs of different educational media.

A framework for identifying and analyzing costs

Cost-benefit analysis and cost-effectiveness analysis

In theory, the benefits of an educational system, in terms of learning experiences and achievements, and its costs can be brought together into a single comparison. There is a well-developed technique, cost-benefit analysis, for evaluating alternative projects (Pearce 1983). Cost-benefit analysis seeks to convert all the
costs and benefits of the proposals under consideration into money terms. Investment in a new educational technology should take place only if the money value of benefits exceeds the money value of costs. In practice, very little of this is possible. In particular, many of the benefits of higher education are difficult to specify at all, let alone put in money terms. The benefits of higher education extend well beyond students’ achievement of course or degree passes. They include access to higher education, the wider intellectual and social skills which students acquire, and the role of universities in the propagation of cultural values. This may prevent the carrying out of a comprehensive cost-benefit analysis but it does not pose an insuperable obstacle to the systematic comparison of different educational systems. At least two other possible procedures could be adopted. One way of proceeding is to assume that the benefits of the different educational media under consideration are broadly comparable. This does not mean that each of the various individual benefits is assumed to be comparable. What it does mean is that, while one educational system is in some sense “better” than another when judgement is made about a particular type of benefit, it is “worse” when judged against another type of benefit. The assumption therefore is that the overall benefits of different educational systems are comparable. For example, the learning outcomes of students in a traditional face-to-face system may be considered to be better than those of students studying in a distance learning system but the latter has the advantage of permitting greater access to higher education. The mix of benefits in the two cases is quite different but it may be that overall the benefits are thought to be broadly similar in extent.
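The cost-benefit decision rule just described can be sketched in a few lines of code; Python is used here purely as a modern illustration, and all figures, including the 6 per cent discount rate, are invented assumptions rather than numbers drawn from the chapter.

```python
def present_value(cash_flows, discount_rate):
    """Discount a stream of annual amounts (year 0 first) to present value."""
    return sum(amount / (1 + discount_rate) ** year
               for year, amount in enumerate(cash_flows))

def worth_investing(benefit_stream, cost_stream, discount_rate=0.06):
    """The cost-benefit rule: invest only if the money value of benefits
    exceeds the money value of costs, both discounted to the present."""
    return (present_value(benefit_stream, discount_rate)
            > present_value(cost_stream, discount_rate))

# Invented figures: a large up-front outlay with modest recurrent costs,
# set against benefits that build up over the life of the project.
costs = [100_000, 20_000, 20_000, 20_000]
benefits = [0, 40_000, 70_000, 90_000]
print(worth_investing(benefits, costs))
```

The sketch also makes the chapter's practical difficulty concrete: the decision rule is trivial once benefits are expressed in money terms, and the entire burden falls on producing the `benefits` stream, which for higher education is precisely what cannot be done.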
If we are unable to quantify the benefits of different educational systems—and thus cannot carry out a full cost-benefit analysis—but are reasonably happy to assume that the benefits of the alternatives under consideration are broadly comparable, we can proceed by conducting what economists term a cost-effectiveness analysis (Pearce 1983). We remove benefits from the picture, assuming they are the same for all the alternatives, and try to decide which of the options can be pursued most cost-effectively.

A second way of proceeding is a modified cost-benefit analysis which recognizes the inability to measure specific benefits accurately but maintains that the benefits of alternative educational media should continue to be borne in mind, particularly if the assumption of roughly equal benefits is not acceptable. For example, distance learning may be thought of as less desirable than face-to-face learning, even when the benefits of greater access to higher education are taken into account. This is not to say that the achievements and experiences of distance learning students are thought to fall below some minimum acceptable standard. We assume that all the systems under review meet such a minimum standard, although the quality of learning experience associated with different educational media will always be a relevant concern.

Thus, a cost-effectiveness analysis in which there is reasonable confidence in the robustness of the cost information used might yield the result that a distance learning system has average costs which are 95 per cent of those of a face-to-face system. This would suggest that the distance learning system should be pursued. However, if we are not satisfied that a distance learning system is as good as a face-to-face one, we are unlikely to find the 5 per cent cost advantage of distance learning sufficient. We would probably arrive at the opposite conclusion to that of the cost-effectiveness analysis, namely that the face-to-face system is to be preferred.
If, however, a distance learning system has average costs which are 85 per cent or 75 per cent of those of a face-to-face system, either of these cost differentials may be thought sufficiently large for us to prefer the former, although it may not be as educationally desirable as the latter. As this illustration of what might be involved in practice shows, the final decision can be made by "feel". Indeed, we would argue that even accurate information on costs should not be allowed to displace other criteria in the final judgement about the adoption or use of educational technologies. This type of modified or simplified cost-benefit analysis is preferable to a cost-effectiveness analysis. By prompting us to think about what the benefits of different educational systems are, it ensures that as much
WHY COSTS ARE IMPORTANT IN NEW EDUCATIONAL TECHNOLOGIES
relevant information as possible is brought to bear in making decisions about how higher education resources are used, even if it is not possible to quantify that information precisely.

Some simple economic concepts

There is no single measure of what an educational technology or system costs. Instead, there are various related measures which are more or less relevant depending on the decision or judgement which has to be made. An understanding of some simple economic concepts can help improve the effectiveness with which educational technology is used.

Total costs, as the expression suggests, are the sum of all costs involved in putting on and running a degree or individual course. They may nevertheless be difficult to calculate. For example, resources may be used for a variety of purposes of which a particular educational programme is only one. The problem is how to allocate costs between the different uses to which a person's time, or a room, or an audio conferencing link, is put.

Total costs can be divided into fixed and variable costs. At any point in time, a substantial proportion of costs may be fixed. They will be incurred regardless of whether use is made of the resource or not. Thus, if a university has bought audio conferencing equipment, the purchase cost of this equipment is fixed. It has to be paid, whether the equipment is being used or not. Variable costs are ones which depend upon the use made of the resource. The costs of the telephone links and tutor's time incurred when a tutorial is conducted via audio conferencing are variable costs. The variable costs (and total costs) of audio conferencing will increase if more tutorials take place and decrease if fewer take place.

Average costs are total costs divided by the volume of "production". There is no single interpretation of what is meant by production. In particular, this may refer to the number of courses or to the number of students. The two will almost always give different answers.
For example, average costs per student for 100 students on a single course will differ from average costs per student for 100 students equally split between two courses. In many situations, it will be sensible to compute average costs both per course and per student.

Marginal costs are those incurred when producing one more unit, whatever the unit is. Again, marginal costs could be couched in terms of courses or students. The marginal cost of an extra course is the addition to total costs from including one more course in an existing programme of courses. The marginal cost of an extra student is the addition to total costs from one more student joining a class. Marginal costs may often be very low. For example, if an additional student joins an existing class, it may be that the only extra work involved is a trivial amount of administration plus the extra effort in assessment.

The implication of low marginal costs is that educational provision may be characterized by significant economies of scale, where average costs fall as the scale of activity increases. If one more course is added to an existing programme of courses, the average cost of each of those courses may fall, perhaps significantly. If one more student is added to an existing course, the average cost of teaching each student may fall. What is happening is that the fixed costs of the educational system are being spread over a larger and larger number of courses and students. If audio conferencing technology costing £10,000 is purchased but used to teach only one student, the average cost per student is £10,000 (ignoring all other costs). However, if 100 students a year are taught over a ten-year lifetime of the technology, the average cost per student is £10.

It is important to use consistent and accurate measures of costs. Often, the treatment of costs is both inconsistent and wildly inaccurate.
For example, if a programme is simply an extension of an existing course and is taught by established, permanent staff, institutions might treat staff time as of zero cost, since the staff are salaried and the university is not involved in any additional payment. If, however, a programme is
completely new and additional, there may be an argument that the lecturers and tutors involved, even if permanent staff, should receive additional remuneration at a set hourly rate. The costs of staff time in the latter case are treated very differently from those in the former case. This difference may be very relevant to the comparison of traditional, face-to-face university education and distance learning, where the former case corresponds to traditional courses and the latter case to distance learning courses. Neither of the examples just quoted may be at all accurate measures of true costs.

True costs are ones which reflect the opportunity cost of staff time. The opportunity cost of an action is the value of the alternative action which is forgone. Opportunity cost can be completely different from money cost. In the first example above, staff are salaried and the use of their time involves the institution concerned in no additional money outlay, but this does not mean that the staff time is costless. An hour or week spent on developing or teaching a new course is an hour or week which is not spent on another course or on research. It may be very difficult to conceptualize, let alone measure, the appropriate opportunity cost in different situations, but at least this concept draws attention to the fact that staff time always has some cost. Even if the use of someone's time does not incur any financial cost, we should take account of it. One reasonable working procedure is to proxy the value of people's time by their salary. It may even be acceptable to reduce the number of rates to just a handful, such as a tutor's rate, lecturer's rate and professor's rate. The important thing for a comparison of different systems is consistency. Staff time should be valued, and valued in a consistent fashion.

A further problem of costing educational systems or media concerns the treatment of investment in educational technology.
Some educational systems may involve very considerable initial outlays on new technology, including such things as purpose-built accommodation. It is not appropriate to write off these outlays in full at the outset. The appropriate procedure is to spread the costs of the investment over the period of its life and attribute these costs to the programme concerned. Among the difficulties involved is attributing the costs of a technology which is an input into many different programmes. A further problem is deciding what the life of a technology is. It may well be that a technology can operate well in a technical sense for many years or decades but is obsolete in terms of accepted standards of practice after two or three years. The latter is the more appropriate definition, but there is of course no precise interpretation of "accepted standards".

Finally, we should note that, in addition to all these categories of costs incurred, there is a further category of costs saved or avoided. This is particularly true of distance learning, which, by giving students the freedom to learn at home or at work, can avoid substantial costs of student accommodation, subsistence, travel and specialized academic buildings. While they impinge on some of the choices universities have to make about the nature of their educational provision, these cost avoidance savings are more relevant to broader decisions about the shape of higher education, which are properly made by government.

The use of costs to inform decision-making

Choosing the appropriate concept of costs

The different measures of costs may be relevant in one situation or another depending on the type of decision which has to be made. Total costs are of interest to those deciding whether or not to establish a technology, or to those working within a fixed overall budget.
Thus, while the adoption of a particular technology may reduce the average cost per student, or allow for a rapid expansion in student numbers, it could require an increase in total expenditure beyond the assigned budget. This assumes that the institution or department or unit has to bear all the costs of the investment and cannot obtain a contribution from
others. If other contributions are available, another cost variable may be more relevant in making a decision whether or not to pursue the project.

Average costs are particularly relevant if students are to be expected to contribute to the cost of an educational programme (whether students actually pay the fees from their own pocket or have them paid on their behalf). If a course is to be self-financing, the fee should be set equal to the average cost per student of the course. Of course, this will usually only be known with any certainty after the event. If fees are set at a certain level in advance, the number of students prepared to pay those fees will have to be estimated. The problem is that, if fees are set at the level required to cover costs, students may be deterred from taking a particular programme. Nevertheless, progress will have been made. Decision-makers can decide to discontinue the programme, explore the possibility of employing a different technology, set a different level of fees in the future, or choose to continue with the existing level of fees and bear the losses involved.

It may be that courses are not expected to be self-financing. One alternative would be that students are expected to pay only a proportion, whether 20 per cent, 50 per cent, 80 per cent or whatever, of the total. However, the average cost per student is still the relevant cost variable to guide the decision as to the level of fees to be set. Another possibility, which is in a sense less arbitrary, is to say that students should be asked to pay for the variable costs of an educational programme while the institution meets the fixed costs. Students as a body would be paying for the element of total costs which is attributable to a course being put on, while the institution would be paying for the costs—of technology, buildings and permanent staff—which would have been incurred anyway.

Low marginal costs and falling average costs are both indicators of economies of scale.
If there are significant economies of scale, then it could be argued that it is important to attract large numbers so as to realize the reduction in average costs. Economies of scale will be more important the greater the proportion of total costs accounted for by fixed costs. This is likely to be true the more technologically advanced the educational medium under consideration, because the necessary purchases of equipment, specialized premises and skilled staff will be greater. This conclusion is subject to two qualifications. First, the exploitation of continuous economies of scale will be possible only while there is spare capacity. If a new machine has to be purchased or a new team of tutors or technicians hired, there will be a jump in fixed, and average, costs. Secondly, there may be "diseconomies" of scale. The most likely situation in which average costs will begin to rise as the scale of activity continues to expand is if the task of administration and co-ordination becomes so considerable that the costs of managing the programme escalate.

There is a final important point which decision-makers must bear in mind if the fixed costs of a particular technology are large. In such a situation, average costs are very sensitive to the number of students. Decision-makers should be cautious about investment in a technologically intensive educational medium if student numbers are uncertain.

Establishing the costs of an educational technology

The introduction of a more explicit economic approach to decisions about the adoption or use of educational systems is easier than it might seem. The first step consists of a description of the alternatives under consideration. It is probably the most important stage of the process. The very act of description allows many of the costs involved in an educational innovation to be readily identified.
It also prompts decision-makers to think about learning outcomes and experiences and the other perceived benefits of a particular technology as well as the costs involved. The type of simplified cost-benefit analysis advocated in the previous section can then be attempted.
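The fixed, variable and average cost relationships described earlier can be reduced to a few lines of code. The sketch below is purely illustrative: only the £10,000 audio-conferencing figure comes from the text, and the function name is our own.

```python
# Illustrative sketch of the cost framework described in this chapter.
# Only the £10,000 audio-conferencing figure comes from the text; the
# function and its name are invented for illustration.

def average_cost_per_student(fixed_cost, variable_cost_per_student, students):
    """Average cost = (fixed + variable) / students; the fixed costs are
    spread over however many students actually use the system."""
    total = fixed_cost + variable_cost_per_student * students
    return total / students

# Audio conferencing equipment costing £10,000 (all other costs ignored):
print(average_cost_per_student(10_000, 0, 1))     # one student bears it all: 10000.0
print(average_cost_per_student(10_000, 0, 1000))  # 100 students/year for 10 years: 10.0
```

The second call reproduces the arithmetic in the text: spreading the same fixed outlay over 1,000 students brings the average cost per student down from £10,000 to £10.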
There are a number of cost factors associated with all educational media. These include hardware, software, facilities, course development, training and staff. The latter category should be taken to include not only teaching staff but also those involved in developing software, or in the provision of training, or in the management of educational programmes. Attaching figures to the relevant cost categories will seldom be easy but it need not be a horrendous process either, or involve abdicating educational decisions to university accountants. Obviously, the more reliable the information available the better. However, even in the absence of comprehensive and accurate data, progress can still be made by employing rules-of-thumb, best estimates and plausible ranges of values of the key cost categories.

The use of sensitivity analysis will also lead to improved information about costs (and benefits). Sensitivity analysis involves changing key estimates and studying how this affects the overall calculations. It thus permits the identification of the factors to which the overall calculations are most sensitive and may also be used to help decision-makers determine where the effort of seeking out better quality data would be justified. In many situations, the most important application of sensitivity analysis will be to see how costs vary with student numbers.

In this whole process of identifying and estimating costs, the golden rule is to be systematic and consistent. In making comparisons between educational programmes or technologies, the same categories and methods of calculation should be used in each case. It is essential to ensure that comparisons are fair, of like with like. As explained earlier, no single measure of costs is the most applicable in all situations. Decision-makers may have multiple goals and there may be multiple decision-makers. Nevertheless, the procedure outlined here should allow estimates to be derived for all the relevant measures.
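A sensitivity analysis of the kind just described can be sketched very simply: vary the student-number estimate and observe the effect on average cost. All the figures below are hypothetical; it is the structure of the exercise, not the numbers, that matters.

```python
# A minimal sensitivity analysis: vary the estimate of student numbers
# and observe the effect on average cost per student.
# All figures are hypothetical illustrations.

def average_cost(fixed, variable_per_student, students):
    # Fixed costs are spread over students; variable costs are per head.
    return (fixed + variable_per_student * students) / students

FIXED = 50_000   # assumed outlay on technology and course development
VARIABLE = 40    # assumed tutoring and administration cost per student

for n in (50, 100, 200, 400):
    print(f"{n} students: average cost £{average_cost(FIXED, VARIABLE, n):.2f}")
```

Under these assumptions the average cost falls steeply (from £1,040 at 50 students to £165 at 400), showing immediately that the student-number estimate is the figure most worth refining.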
Finally, it is worth reiterating the point made in the previous section that, even if the information to hand is reliable, it would be unwise to make any important decisions on the basis of small differences in costs. All the economic data should simply be treated as one, albeit important, input into a wider decision-making process.

The costs of different educational technologies

Comparing distance teaching universities and traditional universities

There have been various attempts to compare the costs of distance teaching universities and conventional universities in the same country. In the UK, this has tended to involve comparisons of the Open University (OU) with conventional universities. In an important early study, Wagner (1972) concluded that the average cost per OU student was only some 25 per cent of that per conventional university student. Subsequent contributions cast some doubt on this result (Carter 1973; Laidlaw & Layard 1973). It was argued that Wagner had not made proper allowance for research activity and the different educational experiences of OU students by comparison with traditional university students. Moreover, the cost advantage of the OU, and indeed other distance universities, appears to be greater expressed in terms of cost per student than in terms of cost per graduate. The implication is that, for various reasons, drop-out and repetition rates are higher at distance universities.

Despite these disagreements and the fact that the precise figures are now so dated as to be largely irrelevant, some of the general conclusions which emerged from this early debate remain relevant. The distinction between fixed costs, which are typically higher for distance universities, and variable costs, which are typically lower, emerges again as very important. The major variable cost at all educational institutions "is that concerned with the provision of personal tuition" (Wagner 1972, p. 167).
Thus, the great advantage of distance universities is “the potential economies of scale which can be reaped by substituting
capital for labour" (Laidlaw & Layard 1973, p. 452). A significant part of total costs thereby becomes fixed and invariable with respect to student numbers. The important conclusion is that, in general, "conventional teaching systems are cheaper for low numbers of students, while distance teaching systems are cheaper for high numbers of students" (Rumble & Keegan 1982, p. 220).

While there is evidence that distance universities can be cost-effective compared to conventional institutions, their cost advantage can be undermined in certain circumstances. This might happen if investment in distance media is excessive relative to the number of students in the system or if distance universities cannot attract sufficient students to warrant investment in the development of materials. If the variable cost per student at distance universities is only marginally below that at conventional universities, the number of distance students required to exploit economies of scale sufficiently may be impossibly high (Keegan 1990).

The cost structures of distance universities have taken on an increased importance because more and more conventional universities are adopting technologies previously associated with distance learning. Larger student numbers have permitted the greater exploitation of the economies of scale offered by new technologies. The balance of advantage has shifted from labour-intensive media such as traditional lecturing and tutoring to more capital-intensive media such as computer-based learning. In a parallel development, the increase in the number of multi-campus universities and networks of universities and associated colleges has stimulated the use of such media as audio and video conferencing to teach students.
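The conclusion quoted above, that conventional systems are cheaper for low student numbers and distance systems for high numbers, can be illustrated as a break-even calculation between two cost structures. All the cost figures below are invented for illustration; only the shape of the comparison follows the text.

```python
# Sketch of the break-even logic: a conventional system (low fixed, high
# variable costs) versus a distance system (high fixed, low variable
# costs). Every figure here is hypothetical.

def total_cost(fixed, variable_per_student, students):
    return fixed + variable_per_student * students

CONVENTIONAL = {"fixed": 20_000, "variable": 500}   # hypothetical
DISTANCE = {"fixed": 200_000, "variable": 100}      # hypothetical

# The break-even point is where the two total-cost lines cross:
# 20,000 + 500n = 200,000 + 100n  =>  n = 180,000 / 400 = 450 students
break_even = (DISTANCE["fixed"] - CONVENTIONAL["fixed"]) / (
    CONVENTIONAL["variable"] - DISTANCE["variable"]
)
print(f"Break-even at {break_even:.0f} students")
```

Below 450 students the conventional system is cheaper; above it, the distance system's large fixed outlay is spread thinly enough for its low variable cost to dominate.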
Comparing the costs of different educational technologies

There have been various attempts to compare the costs of different educational technologies over the last 20 years (UNESCO 1977; Jamison, Klees & Wells 1978; Wagner 1982; Levin, Glass & Meister 1987; Rumble 1989; Fletcher, Hawley & Piele 1990; Henry 1991, 1994). Care is needed in making comparisons. Some studies are based on hypothetical cases. Even when data has been derived from actual cases, it is necessary to make certain assumptions, and changes in those assumptions can lead to significant changes in the resultant cost figures. If the data is not robust, only substantial differences in costs can be considered significant. Precise comparisons are also difficult because "costs vary markedly according to the quality of the product, the mix of media used and the particular form of development, production and presentation method employed" (Henry 1994, p. 12).

Nevertheless, a few general conclusions can be drawn. First of all, it is dangerous to look at a particular technology in isolation. For instance, while print still often appears to be the cheapest medium for courses with large numbers of students (several hundred or more per annum), the figures generally exclude the costs of tutoring, student assessment and student motivation. Among the new technologies, computer conferencing is a low-cost medium for small numbers. It starts to become more expensive when numbers increase. Recorded instructional television is an expensive medium until student numbers reach several hundred per course; then it increasingly becomes a low-cost medium, especially if a steady volume of production can be maintained. Live interactive broadcasts and video conferencing are very expensive technologies at all levels of output.

In choosing between different educational technologies, a major advantage of computer conferencing is that it can be asynchronous: it does not have to take place in real time.
Computer conferencing maximizes the possibilities of open learning and is also relatively cheap, although it is obviously more expensive in terms of staff time to maintain a dialogue with a given number of students individually rather than in a group. However, where computer conferencing is most inadequate is in terms of the limited interactive nature of the medium.
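The staff-time point made above can be put in miniature: maintaining an online dialogue with each student individually consumes far more tutor time than conducting the same dialogue with the group as a whole. The rates and times below are entirely hypothetical.

```python
# Illustrative comparison of tutor-time cost for individual versus group
# dialogue in computer conferencing. All rates and times are hypothetical.

TUTOR_RATE = 30.0         # assumed cost of tutor time, £ per hour
HOURS_PER_EXCHANGE = 0.5  # assumed tutor time per conference exchange
STUDENTS = 20
EXCHANGES = 10            # exchanges over the life of the course

# Individually, each exchange must be repeated once per student;
# with the group, one exchange serves all students at once.
individual = TUTOR_RATE * HOURS_PER_EXCHANGE * EXCHANGES * STUDENTS
group = TUTOR_RATE * HOURS_PER_EXCHANGE * EXCHANGES

print(f"Individual dialogue: £{individual:.0f}; group dialogue: £{group:.0f}")
```

Under these assumptions the individual approach costs twenty times as much in staff time, which is why the group mode is where the medium's cost advantage lies.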
Audio and video conferencing have to be synchronous. This, together with the cost of the technology, means that classes have to be a certain minimum size to be viable. The virtue of these two means of communication is that, by comparison with computer-based learning, they make possible a more natural form of human interaction. The choice between audio and video conferencing is largely one of cost versus the standard of learning support, with audio conferencing the cheaper but lower quality medium. This last example illustrates the more general point that costs have to be balanced against quality and the degree of interaction. Apart from computer conferencing, none of the low-cost technologies allows for personal interaction. Indeed, it may be more cost-effective to combine two low-cost technologies, such as print and computer conferencing, than to invest in an expensive interactive technology such as video conferencing.

Conclusions

The assembly and analysis of information on the costs of new educational media is less daunting than it may at first appear, and such efforts as are involved are rewarding since they make possible better-informed decision-making. Different measures of costs are relevant in one situation or another depending on the type of decision which has to be made. Total costs are of interest to those deciding whether or not to establish a technology, or to those working within a fixed overall budget. If students are to be expected to contribute to the cost of an educational programme, average costs are particularly relevant; alternatively, students may be asked to pay for the variable costs of an educational programme while the institution meets the fixed costs. If there are significant economies of scale, then it becomes vitally important to attract large numbers so as to realize a reduction in average costs. Economies of scale will be more important the greater the proportion of total costs accounted for by fixed costs.
This is likely to be true the more technologically advanced the educational medium under consideration. The value of a broad economic approach is that, by prompting us to think about what the benefits of different educational systems are, it ensures that as much relevant information as possible is brought to bear in making decisions about how higher education resources are used, even if it is not possible to quantify that information precisely.

References

A. Bates, "Trends in audio-visual media", in Learning at a distance: a world perspective, J. Daniel, M. Stroud, J. Thompson (eds) (Edmonton: International Council for Correspondence Education, 1982).
C. Carter, "The economics of the OU: a comment", Higher Education 2, 1973, pp. 285–8.
J. Fletcher, D. Hawley, P. Piele, "Costs, effects and utility of microcomputer assisted instruction in the classroom", American Educational Research Journal 27, 4, 1990, pp. 783–806.
J. Henry, Cost factors affecting the future of IT based education, Report No. 53, Institute of Educational Technology (Milton Keynes: The Open University, 1991).
J. Henry, "Resources and constraints in open and distance learning", in Materials production in open and distance learning, F. Lockwood (ed.) (London: Paul Chapman, 1994).
D. Jamison, S. Klees, S. Wells, The costs of educational media (Beverly Hills: Sage, 1978).
D. Keegan, Foundations of distance education (London: Routledge, 1990).
B. Laidlaw & R. Layard, "Traditional versus OU teaching methods: a cost comparison", Higher Education 3, 1973, pp. 439–68.
H. Levin, G. Glass, G. Meister, "Cost effectiveness of computer assisted instruction", Evaluation Review 11, 1987, pp. 50–71.
D. Pearce, Cost-benefit analysis (London: Macmillan, 1983).
J. Pisapia, "Cost analysis and learning technologies", presentation at Florida Educational Technology Conference, Tampa, Florida, 1994.
G. Rumble, "On line costs: interactivity at a price", in Mindweave: communication, computers and distance education, R. Mason & A. Kaye (eds) (Oxford: Pergamon, 1989).
G. Rumble & D. Keegan, "General characteristics of the distance teaching universities", in The distance teaching universities, G. Rumble & K. Harry (eds) (London: Croom Helm, 1982).
UNESCO, The economics of new educational media (Paris: UNESCO, 1977).
L. Wagner, "The economics of the Open University", Higher Education 2, 1972, pp. 159–83.
L. Wagner, The economics of educational media (London: Macmillan, 1982).
Chapter 15 USING MULTIMEDIA TECHNOLOGY FOR TEACHING: A CASE STUDY APPROACH David Crowther, Neil Barnett and Matt Davies
The introduction of computer-based learning (CBL) courseware into the education process in higher education has arguably been driven by a desire to reduce teacher-student contact time in an attempt to achieve significant efficiency savings (Davies & Crowther 1995). The use of such courseware, however, is not necessarily a cheap option, for a number of reasons. First, a considerable investment in computer hardware, rooms to house the computers and technical support to maintain the facilities is required. Secondly, although the subsequent use of CBL courseware may be at minimal marginal cost, initial development costs can be considerable. Finally, efficiency defined solely from the educator's perspective fails to recognize students' involvement in the learning process. Efficient teaching may not represent efficient learning. In any case, even if the use of CBL courseware could lead to genuine efficiency savings, what is the impact upon the effectiveness of learning? Often, it would seem that designers of such courseware have been mesmerized by the novelty gimmicks and features inherent in the technology, and there is a general assumption that effective learning will inevitably occur provided students are equally mesmerized.

More recently, higher education has experienced the emergence of multimedia technology, with developments apparently driven largely by the desire to increase teaching efficiency. In general, courseware developed using this technology has been based on the traditional model of learning, which is concerned with the interaction between teacher and learner, and in which the teaching strategy is dependent upon students' needs and preferred learning styles (Crowther & Davies 1996). Technology has generally been regarded as having a subordinate role in this process.
As a result, ways have been found to transplant existing teaching methods and materials into the multimedia environment with little attention paid to the specific capabilities and limitations associated with these technologies. Consequently, there has been a general failure to exploit the full potential of multimedia technology. This has been particularly the case in the more discursive/qualitative subjects within social science teaching.

In this chapter, the authors consider the use of multimedia technology as a teaching vehicle. The chapter consists of three main sections. First, the use of multimedia technology in a higher education context is evaluated. Secondly, the strengths and limitations of using case studies for teaching in a higher education context are reviewed, and this evaluation is then used as the basis for considering the compatibility of the case study method and multimedia technology. The benefits of the method are believed to be applicable to a range of subjects in the social sciences, although their development so far has been focused on business and accounting applications. Finally, the authors describe their experience of developing multimedia courseware and argue that the differential advantages of using the technology in the way described maximize both effective learning and efficient teaching, thereby meeting both the pedagogic and curricular needs currently predominant within the discourse of teaching in higher education.
USING MULTIMEDIA TECHNOLOGY FOR TEACHING
Multimedia courseware in higher education

The introduction of multimedia technology into the education process in higher education not only provides an opportunity to reconsider the teaching strategies to be adopted but also requires such a reconsideration. Multimedia has been viewed as a way to increase teaching efficiency through the substitution of self-teaching via this technology, and courseware designed to run on this platform, for more traditional lecturer-led teaching. Implicit in this view of multimedia is that the technology is directly substitutable for other forms of teaching and that students find this substitution acceptable, or even preferable. There is an acceptance within the discourse that multimedia is an effective vehicle for all forms of teaching.

In an educational context, multimedia technology has several important capabilities. First, material can be presented in an interesting, stimulating and easily accessible manner, utilizing sound, video, graphics and animation. There is, however, a temptation to overplay the motivational impact of multimedia. Novelty features alone are unlikely to provide more than a short-term incentive for students to engage with the technology, and this by itself will not ensure that they engage with the subject matter and, therefore, enter the learning process. Ultimately, though stimulating presentation of the material can positively affect students' motivation to learn, it is the intellectual stimulation provided by the material itself which is the most important design consideration. Students will not learn from a package which is of dubious educational content simply because it looks and sounds impressive.

The second advantage offered by multimedia technology is that it facilitates active learning. Active learning embraces the view that effective learning occurs when students actively engage in the subject matter concerned (Thorpe 1992).
CAL and multimedia courseware can be designed to include built-in questions, problems, tasks and tests, all requiring some form of thought and action on the part of the student. The interactive and non-linear capabilities of the technology provide two further advantages in this respect, since the courseware can be designed to react differently depending on the student input. These two aspects are discussed further in the context of experiential learning.

Experiential learning is related to active learning and basically refers to the belief that students learn well by doing. The key capability of multimedia technology in this respect is that it offers the opportunity to simulate reality. Through video and sound, a real-life scenario can be portrayed, and the interactive and non-linear access capabilities then enable the student to explore the situation as if in real life. There are limitations concerning the extent to which the complexity of a real-life situation can be replicated, but nevertheless the technology's ability to contextualize knowledge and to offer non-linear access means that it does have the potential to enhance learning for any subject with a practical dimension.

Additionally, the use of multimedia technology is potentially consistent with the student-centred approach to learning which has become the dominant paradigm. Student-centred learning recognizes that students learn well when they take responsibility for their own learning, and also that different students have different learning styles and different learning needs. Multimedia courseware can offer flexibility in terms of what is studied, when it is studied, the pace at which it is studied, the order in which it is studied and so on. Importantly also, the interactive capability means the courseware can adapt to the specific abilities of the student.
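The non-linear, interactive behaviour described above, courseware that reacts differently to student input, can be illustrated with a small sketch. This is a hypothetical illustration rather than the authors' software: the node names, case content and branching structure are all invented for the example, and a real package would attach video, sound and graphics to each node.

```python
# Minimal sketch of branching courseware: each node presents material and a
# question, and the student's answer selects the next node, so two students
# can traverse the same case by different routes. All content is hypothetical.

CASE_NODES = {
    "intro": {
        "text": "The company's sales have fallen for three quarters.",
        "question": "Investigate (a) the market or (b) internal costs?",
        "branches": {"a": "market", "b": "costs"},
    },
    "market": {
        "text": "Interview transcripts suggest a new competitor.",
        "question": "Respond by (a) cutting prices or (b) differentiating?",
        "branches": {"a": "summary", "b": "summary"},
    },
    "costs": {
        "text": "The cost breakdown shows rising overheads.",
        "question": "Recommend (a) restructuring or (b) outsourcing?",
        "branches": {"a": "summary", "b": "summary"},
    },
    "summary": {
        "text": "Write a memo justifying your recommendation.",
        "question": None,
        "branches": {},
    },
}

def run_case(answers):
    """Walk the case graph, consuming pre-supplied answers; returns the route
    taken, which a tutor could inspect to see how a student reasoned."""
    route, node = [], "intro"
    answers = iter(answers)
    while node is not None:
        route.append(node)
        info = CASE_NODES[node]
        # A node with no question is terminal; otherwise the answer picks the branch.
        node = info["branches"].get(next(answers, None)) if info["question"] else None
    return route
```

Because the route itself is recorded, such a design can support the adaptive behaviour discussed above: the courseware can respond to *how* a student moved through the material, not merely to final answers.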
The pedagogic advantages of case studies

There has been increasing use of case studies in education over recent years, to the point where case studies have become an established teaching method in most, if not all, management and social science disciplines. This trend has been experienced within higher education at both undergraduate and postgraduate levels and, indeed, many new textbooks contain case studies. There are at least two significant reasons for this trend. First, it is widely acknowledged that case studies can promote a deeper understanding, and secondly, in
recent years there has been growing emphasis on teaching transferable skills, the development of which arguably may be aided through the use of the case study approach. It is argued by the authors that the appropriate use of well-written case studies can promote the achievement of the above objectives. Indeed, for some of these objectives, case studies can be an extremely effective method in a way in which other teaching methods cannot. In any consideration of the effectiveness of case studies in the higher education environment it is first necessary to define the objectives which a higher education course seeks to serve. These educational objectives can be classified into two categories: subject-specific objectives and more general transferable skills.

Subject-specific learning objectives

In terms of subject-specific learning objectives, Bloom's (1956) hierarchical taxonomy provides a useful frame of reference. Bloom argues that a course should, in ascending order, equip a student to:

• recall and recognize specified information
• comprehend and digest the information
• apply what they have learned
• analyze the subject, with an understanding of the component parts and their interrelationships
• synthesize the subject, taking an overview
• evaluate their knowledge, understanding and competence critically.

Developing transferable skills
Transferable or generic skills are not specific to a particular subject, course or job. They include communication, analytical, interpersonal, quantitative, problem-solving, decision-making and learning skills, as well as computer literacy. In the current environment, increasing importance is being attached to the teaching of transferable skills. This is evidenced by the Enterprise in Higher Education Initiative of the then Department of Employment, which commenced in 1987 with the aim of helping higher education institutions to equip students for the "world of work" through the development of transferable skills required in their future working lives.

Further evidence of the importance of transferable skills is provided by the Accounting Education Change Commission's (AECC) Position Statement Number 1 (1990), which outlined the Commission's views on the objectives of education for accountants. The AECC developed a taxonomy of objectives consisting of three components: skills, knowledge and professional orientation. Greater emphasis on communication, intellectual and interpersonal skills was advocated, along with an appreciation of ethical issues. Indeed, with regard to the knowledge requirement, subject-specific knowledge was seen as only one important aspect, alongside general knowledge and organizational and business knowledge. Additionally, it was considered that the focus of teaching should be on developing analytical and conceptual thinking rather than on knowledge transfer. Such an approach can be likened to focusing on the higher objectives in Bloom's taxonomy.

It is clear, therefore, that there is a growing recognition of the importance of what students can do, as well as of what they know. This has been accompanied by a shift away from the traditional teaching culture of higher education towards a learning culture, in which students are encouraged to take responsibility for their own learning.
It is argued that the use of case studies has a significant role to play in promoting the development of transferable skills within this more student-centred learning environment.
Many authors have advocated the use of case studies in education. For example, Argyris (1980) describes the following benefits:

• hearing others' views
• confronting differences
• making decisions
• becoming aware of the complexity of reality
• realizing that there are rarely right or wrong answers, since cases are incomplete, as are real-life situations.
It can be seen that Argyris' list of benefits refers mainly to transferable skills as opposed to subject-specific objectives. A note of caution is required, however. For the full potential of the case method to be realized, a number of factors require consideration. First, to be most effective the case must be well written, case details should be clear and unambiguous, and above all the case should be interesting. Students tend to have a very low tolerance for complex, unstructured information, ambiguity and excessive data in cases, although it could be argued that these features enhance the realism of a case study and promote the development of problem-formulation and problem-solving skills. Students therefore generally require some instruction on how to approach case study analysis, and the rationale for using case studies in the educational process should be made clear.

Another related problem arises from the amount of time students are required to spend reading and assimilating case details before they are in a position to undertake the more important analysis of the issues involved. Lecturers must ensure that cases are properly integrated into a particular module, so that sufficient time is available for students to explore a case study and to discuss and reflect upon their views and findings. Lastly, cases may quickly become out of date, although this depends to a great extent on the subject and topic concerned. This limitation is more difficult to overcome, though the increasing popularity of the case approach should ensure that a sufficiently large bank of relevant cases exists.

Multimedia courseware and case studies

The capabilities of multimedia technology have important implications when the use of the case study method is considered. The technology's ability to present material in a stimulating way offers a very significant advantage.
One of the potential limitations of the case approach referred to earlier was that students are often required to spend a relatively large amount of time reading and assimilating case details before they are in a position to tackle the important issues involved. Using multimedia technology, the amount of time required to comprehend case material may be reduced. Also, when compared with a traditional paper-based version of the same case, the student is more likely to be motivated to engage in the multimedia equivalent, with its more dynamic and interesting approach. It is important to emphasize, however, that a poorly written case of little educational value cannot be transformed into an effective learning vehicle merely by transforming it into a multimedia case. It is with respect to the experiential learning capability that the most significant synergies lie. Case studies are essentially real-life scenarios through which various issues, concepts and techniques can be explored. Multimedia technology, through its facility to incorporate sound, graphics and video, enables such a real-life scenario to be conveyed in an interesting and dynamic manner. More important than this aspect, however, the interactive and non-linear capabilities enable reality to be simulated. In other words, rather
than just representing a novel and stimulating way in which case material can be presented to the student, multimedia provides the opportunity for the student to engage actively in the case situation. For example, a multimedia case could involve the student taking the role of one of the key players and then working through the information and issues in the case in an interactive manner. To incorporate the active learning benefits referred to above, various tasks and activities relating to the role need to be integrated into the courseware. These could include interviews, quantitative analyses, memo- and report-writing exercises and so on. This approach clearly also helps to develop students' transferable skills. Additionally, there is the potential for students to encounter and address conceptual, political and ethical problems, and to identify situations which have reference to a variety of theoretical and conceptual positions.

Multimedia case studies should not require the considerable developmental investment which has been required to produce many of the current CBL and multimedia courseware products, which are essentially knowledge transfer mechanisms. Converting a conventional textbook into a computerized textbook with flashing lights and impressive graphics is time-consuming and expensive, and any improvements in the effectiveness of student learning are likely to be relatively minor. Combining the case study approach with multimedia technology, on the other hand, offers a significant opportunity to achieve efficient and effective learning: the learning process is transformed from an essentially passive to an essentially active one, and the achievement of higher learning objectives and the development of transferable skills are facilitated.
It is argued by the authors that, while multimedia courseware provides a valuable teaching resource in the continuing drive towards efficiency in teaching, it is only by recognizing the differential advantage which multimedia gives to the teaching process that the best use of the medium can be made, in order to maximize both efficiency in the teaching process and effectiveness in the student learning process. It is further argued that case studies delivered via a multimedia platform derive maximum benefit from the capabilities of the medium: this delivery mechanism facilitates both active learning and experiential learning, and is fully consistent with the student-centred approach favoured by current teaching paradigms. Interdisciplinary knowledge integration is one significant differential advantage of multimedia courseware.

Developing interdisciplinarity

Drawing upon the above evidence, which is generally consistent with their own experiences of using case studies, the authors believe that case studies are an extremely effective teaching tool, facilitating both the achievement of subject-specific learning outcomes and the development of transferable skills and qualities. Perhaps most importantly, students seem to enjoy the case study approach and respond positively to the challenges posed.

One of the features of the current higher education environment is the polarization of teaching and research into increasingly discrete disciplines and sub-disciplines while, at the same time, metadisciplines extend their boundaries to new knowledge domains in the pursuit of relevance and legitimation. For individual academics this polarization is reflected in the increasingly local focus of their teaching and research, as their individual subject areas narrow and increasingly seek legitimation from within their own knowledge domain.
Thus for individual academics the direction of polarization is towards increased localization, whereas overall the polarity is towards increased globalization, as schools compete with each other for resources, customers and recognition through the vaunting of their universal specialisms and relevance. The localization of focus for academics, as far as teaching is concerned, is manifest in the increasing number of disciplines studied, researched and taught, together with the increasing separation of
these disciplines from each other. Increasingly these disciplines are taught as discrete subjects, with little or no overlap between them, and with little perceived relationship and relevance of one to another. At the same time, however, the courses taught are constructed from components made up of segments of these discrete disciplines, and the discourse, adopting the modernist metanarrative of the whole uniting the parts (Lyotard 1984), assumes that this form of aggregation provides a learning vehicle which is implicitly useful and relevant to students and to the organizations who will eventually employ the successful outputs (i.e. graduates) of higher education institutions. There is little questioning of this relevance within the discourse, even though an examination of the marketplace demonstrates that this is not the case. This specialism is perhaps best viewed as a reflection of a higher education agenda which places research output, implying increasing specialism, as superordinate to the teaching of students.

While this academic discourse continues to develop through an increasing array of specialisms, the world continues to change, and to change in a different manner from this increasing specialization. For organizational management, for example, the current era is manifest not so much in the local/global polarization of business schools but rather in the space-time compression suggested by Harvey (1990). This is particularly evident in organizations through the increased technologies and information architectures which have become prevalent in the recent past. These technologies have facilitated the major changes within organizations which have taken place under the label of downsizing and which have removed vast numbers of middle managers and subject specialists.
Thus while academe, and the academics within it, continue to seek legitimation from within their own community, through increasingly self-referential discourse and specialization, the world of their customers increasingly requires generalization rather than specialization. This suggests the need to teach students in an interdisciplinary manner rather than assuming that a set of discrete subjects will be assimilated by them into a complete knowledge domain. It is in this arena that the use of multimedia has one of its most significant differential advantages.

The differential advantages of multimedia courseware

It has been stated that current teaching paradigms acknowledge that the use of case studies in teaching provides a means of integrating the knowledge acquired by students through the various teaching mechanisms employed, and of addressing the higher skills identified in Bloom's taxonomy of learning. Equally, it has been stated that such knowledge integration has tended to be considered from the viewpoint of single-subject specialisms, whereas in the education environment cross-disciplinary knowledge integration is of particular importance. Before arguing that multimedia courseware produces a significant differential advantage in overcoming this specialism-centred view of teaching, it is first necessary to review some of the generic advantages of multimedia.

The subject areas which are most appropriate for the use of multimedia technology require careful consideration, as does the type of student for whom multimedia as a means of learning is most appropriate. Beishuizen (1992) has suggested that a complex knowledge domain is best engaged with not by a linear exploration of that domain but by navigating around it using different routes. This type of knowledge domain therefore seems more appropriate for multimedia technology than does a simpler or more linear knowledge domain.
It is argued, therefore, that it is the more complex subjects which can be taught most effectively using multimedia as the teaching vehicle, and it is further argued that most social science subjects fall into this category. This is particularly true when it is considered that in most cases solutions to problems are arrived at not linearly but through an interaction between various factors, and through the often complex analysis needed to arrive at possible solutions. This suggests an interdisciplinary approach.
Some areas of social science teaching are appropriate for the use of such technology, particularly where students can benefit from being placed in a decision-making environment which involves deliberation on theoretical and conceptual problems. For example, a case could put the student in a political environment, facing a variety of competing interests, values and cultures. The student could be confronted with situations which require reflection on such concepts as pluralism, elitism and corporatism, but also on more immediate party political and short-term considerations. In addition, the student could be required to draw upon knowledge gained from, for example, organization theory, accounting and human resource management. To be effective, however, the design and content of the courseware need careful consideration. This is particularly true when the simulation of "real world" situations is attempted through the use of multimedia, and the social implications of attempting this, given that it cannot be fully successful, need to be considered.

The different needs of different groups of students are recognized by Grupe & Connolly (1995), who argue for the distinct needs of adult learners, and by Filipczak (1995), who argues, within an organizational learning context, that training must match the preferred learning style of the people involved if it is to be effective.

All modern learning theory, and the teaching strategies developed from it, is founded in the student-centred learning paradigm initiated by Rogers (1951, p. 389), who states: "We cannot teach another person directly: we can only facilitate his [sic] learning." Rogers (1961, p. 268) also considers that learning takes place best in the context of a relevant situation and occurs most readily within the context of problem solving.
He states: "Significant learning occurs more readily in relation to situations perceived as problems." Thus appropriate courseware is needed to enhance learning, but it is important to recognize the transaction-cost-minimizing imperative of the current environment. There is therefore a need for low-cost, high-quality courseware, a need recognized in the CTISS File editorial (1992, p. 2), which states: "The key to reducing costs is flexible high quality courseware capable of substituting for substantial amounts of lecturing time." This, then, is the agenda of the discourse for higher education: how to reduce teaching costs while maintaining quality. It is argued in this paper that the authors achieve this through the interdisciplinary approach taken to the development of multimedia case-study-based courseware.

Dixon (1992) recognizes this need for cost reduction in claiming that the main driver for the uptake of IT in higher education is cost reduction, but he also identifies drivers in the form of improving the image of the institution concerned and creating competitive advantage for it. He implies, however, that merely using more IT than other institutions will provide these benefits, rather than considering how that IT is used within the institution and within its teaching: quantity rather than quality being the key to success! Patterson (1994), by contrast, criticizes the way technology has been used in teaching and compares it to the Bermuda Triangle, in that resources have been poured in only to disappear without trace.

It has been argued that people learn in different ways and at different paces, and one of the claimed advantages of using computer-based courseware is that it enables learning to take place under the control of the student and at a pace that accommodates each individual's learning style and speed (Mullins & Mullins 1994).
The flexibility of multimedia in terms of its non-linearity of access to material is particularly suited to meeting these differences in learning need. This of course requires considerable investment in the development of courseware to enable the course material to be designed to capitalize on this non-linear capability. This potentially conflicts with the efficiency driver which demands rapid development. It also means that the transposition of conventional textbooks or lectures on to multimedia, enhanced with video, sound and graphics, does not really maximize the potential capability of multimedia. The development of appropriate courseware requires a complete redesign of course material and the order in which it is to be delivered.
The interdisciplinary approach to multimedia development

For cost-effectiveness it is essential that multimedia courseware should not require the considerable developmental investment which has been needed to produce many of the current CBL products, which are essentially knowledge transfer mechanisms. Combining the case study approach with multimedia technology offers a significant opportunity to achieve efficient and effective learning: the learning process is transformed from an essentially passive to an essentially active one, and the achievement of higher learning objectives and the development of transferable skills are facilitated. It is for this reason that a case study approach has been adopted by the authors for the development of multimedia courseware. In this paper, it has been argued that the case study approach is applicable within social science disciplines, where case studies are already used extensively. In the final part of this paper, the authors describe their experience of developing multimedia case studies and their programme for extending this development.

The primary focus of the authors' development of multimedia courseware has been upon facilitating effective learning, rather than upon cost reduction through reduced student contact time. However, an effective learning vehicle also reduces the cost of delivering material, by enabling students to learn by themselves in their own time. Thus cost reduction is achieved as a by-product of effective learning rather than being sought as an end in itself. Achieving this cost reduction through reduced student contact time, however, requires an initial investment in the development of material to meet this need.

The development of multimedia courseware has generally been constrained by the popular conception of extremely high development costs.
Such costs, however, have tended to be evaluated in the context of current material, which has primarily been of the knowledge transfer type. By concentrating upon the higher objectives of Bloom's taxonomy through the use of case studies, the authors have been able to reduce significantly both the costs and the timescales of courseware development. This has been achieved within the context of the development of generic shell programming, which enables subsequent products to be developed rapidly from the shell initially built for the first product. Although development costs have been reduced through this approach, they do not, of course, become insignificant. They can, however, be reduced further for the institution concerned by recognizing the commercial possibilities of such products. Thus the products developed by the authors are aimed not just at the higher education market but also at the commercial training market, as a means of recouping some of the development costs. To this end, commercial partnerships have been formed to market the products and at the same time meet some of the development costs.

An interdisciplinary approach to courseware development implies designing material that is suitable for use on a variety of subject-specific courses as well as on integrating courses. It also implies knowledge of different disciplines, which necessitates the involvement of specialists from those disciplines. Accordingly, the authors (who themselves teach within the accounting and public sector management areas) are working in partnership on the development of courseware in a multimedia format. Development of the first product commenced in December 1995 and it was launched commercially in July 1996. Development of two further products commenced in May 1996 and they were launched in October 1996, demonstrating the reducing timescale of this generic shell approach.
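The "generic shell" idea described above can be sketched in outline: the engine (task sequencing and presentation) is written once, and each subsequent product is authored as content data plugged into that shell, which is why later products are quicker to build. This is a hypothetical sketch of the principle, not the authors' actual system; the class names, fields and the sample case are all invented.

```python
# Sketch of a reusable courseware shell: CourseShell is the generic engine,
# written once; each new product is just a CaseStudy content object fed into
# it. All names and content here are hypothetical illustrations.

from dataclasses import dataclass, field

@dataclass
class Task:
    prompt: str
    kind: str            # e.g. "interview", "analysis", "memo"

@dataclass
class CaseStudy:
    title: str
    scenario: str
    tasks: list = field(default_factory=list)

class CourseShell:
    """Generic engine: sequences the tasks of any case study in order."""
    def __init__(self, case: CaseStudy):
        self.case = case

    def task_schedule(self):
        # Number the tasks so the shell, not the content, controls ordering.
        return [(i + 1, t.kind, t.prompt) for i, t in enumerate(self.case.tasks)]

# Authoring a second product then means writing only new content, not new code:
strategy_case = CaseStudy(
    "Strategic Management",
    "A retailer facing new competition.",
    [Task("Interview the marketing director.", "interview"),
     Task("Draft a memo of recommendations.", "memo")])

schedule = CourseShell(strategy_case).task_schedule()
```

The design choice mirrors the cost argument in the text: the expensive part (the engine) is amortized across products, while each case study reduces to comparatively cheap content authoring.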
Further products are under development as part of a continuing programme. The first product is in the general area of strategic management and the main objective of the case is to provide a business setting which allows students to apply their knowledge of appropriate concepts and techniques, enabling them to gain a deeper understanding of the subject areas, while at the same time developing some key transferable skills. There is no single right answer to the case. Instead, the key requirement is for the student to build a set of recommendations. This in turn requires the ability to deal
with an ambiguously defined practical problem through careful analysis of the case details, the presentation of a set of clear and logical arguments supported with appropriate evidence and analysis, and the recognition of the importance of assumptions and of any extra information required. The case, therefore, contains incomplete and partially conflicting information, some facts and many opinions. Of the later cases, one is concerned with setting up a business and the other with human resource management from the viewpoint of a business manager with business problems. There is potential for the development of further cases within more discursive subjects; as previously stated, the potential is perhaps greatest where a student can be placed in a decision-making or legislative environment which requires the prior knowledge, understanding and application of several competing theoretical concepts and value systems. Cases currently under development are concerned with the political environment of a local authority and with cost accounting. Further cases planned include those dealing with the management of change, with business ethics, and with operational planning.

Conclusions

For business subjects, the case study method is widely accepted as an effective teaching tool, and advocates have argued that it promotes a deeper understanding of the subject, supporting the achievement of Bloom's higher learning objectives, such as application, analysis and synthesis, as well as facilitating the development of transferable skills. The use of multimedia technology is well suited to the case approach, since it not only builds upon the differential advantages of this teaching method but also helps to overcome some of its limitations. Through such technology, case material can be made available to the student in a stimulating, realistic and interactive manner, providing scope for active learning, experiential learning and student-centred learning.
Multimedia technology can also help to reduce the amount of time the student requires to absorb and understand case information, due to the video, sound and graphics features. At the same time, the case study approach is well suited to the multimedia environment, since this use of the technology should not require the considerable development costs generally associated with such courseware.

References

Accounting Education Change Commission, Objectives of education for accountants (Position Statement Number 1) (AECC, 1990).
C. Argyris, "Some limitations of the case method: experiences in a management development programme", Academy of Management Review 5, 2, 1980.
J. J. Beishuizen, "Studying a complex knowledge domain by exploration or explanation", Journal of Computer Assisted Learning 8, 2, 1992, pp. 104–17.
B. S. Bloom (ed.), Taxonomy of educational objectives: cognitive domain (London: Longmans, 1956).
D. Crowther & M. Davies, "Multimedia, mesmerism and teaching strategy", CALECO 95 Conference Proceedings, 1996, pp. 11–17.
The CTISS File (editorial), "Teaching and learning technology programme", 13, 1992, p. 2.
M. Davies & D. Crowther, "The benefits of using multimedia in higher education: myths and realities", Active Learning 3, 1995, pp. 3–6.
M. Dixon, "The uptake of IT as a teaching aid in higher education: a social science perspective", The CTISS File 14, 1992, pp. 57–8.
B. Filipczak, "Different strokes: learning styles in the classroom", Training 32, 3, 1995, pp. 43–8.
F. H. Grupe & F. W. Connolly, "Grownups are different: computer training for adult learners", Journal of Systems Management 46, 1, 1995, pp. 58–64.
D. Harvey, The condition of postmodernity (Oxford: Blackwell, 1990).
J. F. Lyotard, The postmodern condition, G. Bennington & B. Massumi (transl.) (Minneapolis: University of Minnesota Press, 1984).
B. Mullins & B. Mullins, "The human side of technology", Canadian Insurance 99, 10, 1994, p. 22.
J. Patterson, "Teaching, learning and computers: a Bermuda Triangle?", The CTISS File 17, 1994, pp. 58–9.
C. R. Rogers, Client centred therapy (London: Constable, 1951).
C. R. Rogers, On becoming a person (London: Constable, 1961).
R. Thorpe, "Alternative theory of management education", Journal of European Industrial Training 14, 2, 1992, pp. 2–15.
Chapter 16 INFORMATION TECHNOLOGY AND TEACHING THE SOCIAL SCIENCES: OBSTACLES AND OPPORTUNITIES Duncan Timms
Early developments in digital computing were much influenced by the demands of social scientists for means to monitor and analyze demographic and economic features, such as those revealed in censuses. The use of computers to analyze the results of surveys is a taken-for-granted characteristic of modern social science, and all except a few die-hard qualitative analysts are likely to have at least a nodding acquaintance with one of the major statistical packages, such as SPSS, SAS, Minitab or BMDP (and, even among qualitative researchers, there are many who routinely make use of such packages as NUDIST or the Ethnograph). Graduates from social science departments are employed in large numbers in occupations in which the collection, manipulation, analysis and presentation of information is the prime activity, and their computer literacy may be taken for granted by employers.

Yet the take-up of computer-based technology for teaching and learning in British social science, especially sociology, social policy and political studies, remains generally low. There are exceptions but, on the whole, even in those institutions which teach courses on such topics as the social implications of new developments in information and communications technology, or the role of social factors in the development of new technologies, much social science is still taught in the same way as it was a generation ago.

Reasons for this paradox involve both factors which are general to higher education and others which are specific to the social sciences. The pressures are both positive and negative.
External encouragement: computer literacy, efficiency and effectiveness
A succession of reports from employers and the funding councils has emphasized the need to ensure that graduates are computer literate and has suggested that the development of modern computing and information technologies provides the means to expand higher education in an era of tight public spending limits. Official encouragement to expand the use of ICT in teaching and learning has been propelled by at least three motives which flow from combinations of these judgements: an ambition to ensure that graduates are computer literate, a hope that the use of ICT will save money and a desire to use it to improve the effectiveness of the learning process. The three motives have rarely been spelled out, cannot be assumed to be necessarily congruent and may have different resonances in different academic disciplines. A number of UK initiatives—the CTI, which has gone through a number of phases, starting in 1986, ITTI (1991–4), TLTP (1992–6), and the New Technology Initiative (1994–)—have been financed by the higher education funding councils and designed to encourage the uptake of new technology in university teaching. Action at the UK level has been paralleled by similar developments in Northern Ireland and Scotland (e.g. the Learning Technology Dissemination Initiative (1993–7) and the Use of the Metropolitan Area Networks Initiative (1995–)). The total funding devoted to the encouragement of new technology in university teaching in Britain must now run into hundreds of millions of pounds. At the exit point it is clear
that the demonstration of IT skills is a normal requirement of many employers; at the entry point, the efforts put in by the Department for Education and Employment with respect to the introduction of computers and of computer-assisted learning in schools have led to a growing sophistication among new students. Despite this apparently propitious background, the uptake of IT for teaching and learning within universities remains patchy, especially in the social sciences. The development of digital information and communications technologies has coincided with a period of unparalleled change in the nature of higher education in virtually all Western societies. In Britain, the number of students enrolling in universities has expanded dramatically, reflecting a change from an elite to a mass system of higher education. An increasing percentage of the students are returning to education after a gap of several years and many have non-traditional entry qualifications. A high proportion of the new students are attracted to social science disciplines; many, generally the majority, are women. There has also been a striking change in the way in which higher education is regarded by other sectors of the economy: the importance given to the knowledge industries in the growth of the economy is reflected in the assertion that university education must become more relevant to the needs of the economy and that it must be open to life-long learning. The calls for greater relevance and for a greatly increased throughput of students have occurred just as wider developments in society and the economy have demanded that tough constraints be placed on public expenditure. Universities find themselves in the position of having to teach new skills to greatly increased numbers of students with declining per capita resources.
It is in this context that the apparent promise held out by some proponents of ICT, that its use may reduce the need for the input of skilled and expensive academic labour, makes its adoption an alluring prospect for university administrators and funders (Dixon 1992). To increase the attractions further, the use of CAL can be presented as a way of providing students with sets of transferable computing skills which are said to be a necessary requirement for employability. Universities which can demonstrate that their graduates have a wide familiarity with a range of computing skills can use this as a bonus point in an increasingly fierce competition for recruits (Gardner 1990). The claims that the use of ICT can improve the efficiency of teaching and impart computer literacy are essentially arguments about the ability of new technology to augment the delivery of current forms of education. At a more fundamental level, the use of IT may be presented as a way of increasing the flexibility and effectiveness of higher education, enabling people to study at places and times of their own choosing and of leading to new, deeper forms of learning. The notion of life-long learning, largely based on part-time study at or from home, and involving students in active co-operative learning, has the potential of heralding a radical restructuring of the higher education system. In this view, not only will the number and type of student change, but so also will the nature of the educational process itself. Long confined to a number of visionaries working in the field of adult education, the concept of active life-long learning has recently entered the mainstream in policy documents emanating from the European Commission (e.g. the Bangemann report1) and the Joint Information Systems Committee of the UK higher education funding councils (The Implications of Information Technology for Higher Education, 1995).
Institutions such as the Open University (OU) in the UK have shown that the model of active life-long learning can be supported using relatively crude communications and information technologies, but the expansion of the model demands the use of wideband computer networks, such as the metropolitan area networks being laid down in Scotland, and the development of new forms of computer-mediated interaction. The evolution of technology and of
1. http://www.ispo.cec.be/infosoc/backg/bangeman.html
approaches to higher education is occurring together (Laurillard 1993). The fact that much of the investment in the infrastructure of communications is being supported by national funding initiatives, ahead of local demand, increases the pressure for universities to come up with effective means of exploiting the opportunities available.
Internal influences on the use of information and communications technologies in teaching and learning in universities
External pressures for universities to make more use of information and communications technologies have been matched by internal influences. Within universities, pressures for the development of CAL and other computer-based applications have come mainly from two sources: computer enthusiasts and those concerned with forward planning, who see investment in ICT as a way of attracting more or better-qualified students and/or of making more efficient use of existing staffing, accommodation and other resources. The role of computer enthusiasts, often hobbyists, has had both positive and negative consequences. On the positive side, enthusiasts have put in far more hours of development work than could have been expected; on the negative side, their efforts have sometimes been idiosyncratic and their concentration on computer applications has sometimes led to both them and their products being regarded as unduly eccentric. In order to influence university policy, the enthusiasts have either to occupy high positions within the university hierarchy or to persuade those who do that their enthusiasm is worth supporting. The enthusiast may, of course, be a useful asset in the external relations of the university, presented to visiting delegations as an instance of the university’s commitment to being at the forefront of technology. To the extent that universities are under external pressure to demonstrate that they have an institutional IT strategy for teaching and learning, the role of the enthusiast may be crucial.
Changes in the funding and accountability of universities have been reflected in changes in the nature of university planning. One feature of the changes has been an increased role for those senior members of the university charged with planning and resource allocation. In order to support their activities, considerable effort has been devoted to the development and/or importation of management information systems, designed to enable such activities as financial planning, the keeping of student records and timetabling to be automated. Progress in this area has been somewhat slower than was originally hoped or planned, but use of the technology is likely to rub off on academic areas. To the extent that those concerned are convinced that the use of ICT will help in the pursuit of the university’s mission, will give it a competitive edge in the struggle to attract good students and profitable research grants, or will help it to score points in quality assessment exercises, they are likely to allocate an increasing proportion of university resources to the area, often using “top-sliced” funds for the purpose. The development of institutional information strategies, becoming a prerequisite for funding council support, brings consideration of the use of information and communications technologies for teaching, research and administration to a more central position in the planning exercise.
Inhibiting factors on the uptake of ICT for teaching and learning
In their ardour, computer enthusiasts have sometimes overlooked the range of other factors which need to be taken into account in the successful use of new technology in teaching and learning. In a survey of some of the experiments in the development of electronic campuses in US universities in the mid-1980s, Gardner (1992, p. 5) points out that the uptake of the new technologies is heavily influenced by social factors:
Some of the more publicised US experiments in campus computing, particularly those initiated in the period 1982 to 1986, were directed by individuals who were apparently mesmerised by a vision of a silicon future, strangely oblivious to the fact that the real hurdles facing those seeking to exploit information and communications technologies in universities, as elsewhere, are ultimately not technical; the challenges in bringing the electronic campus to reality are economic, political, social and educational in character.
Inhibiting factors which have reduced the speed with which ICT has been introduced into university teaching include a variety of technical, cultural, pedagogic and structural elements. Despite the efforts in the UK of the Computer Board/Information Systems Committee, standards for the delivery of computer software remain diverse and the speed of development has meant that any standards which achieve acceptance are soon perceived to be obsolete. The attempt by computer directors to regulate matters, a relatively simple thing when the box on the desk or in the lab was simply a dumb terminal attached to a central server, has become a Canute-like activity in the face of the flowing tide of development of the personal computer and of the Network. At the same time, computer interfaces remain far from transparent to the naïve user and the fact that each teaching package or laboratory may follow slightly different conventions can give rise to considerable confusion. The waves of enthusiasm generated by new developments such as the Internet and multimedia CD-ROMs raise the hackles of those who have seen many previous technological breakthroughs (e.g. language laboratories, programmed learning, the video player) which have been accompanied by extravagant claims but have failed to deliver. University teaching is, in many ways, a deeply conservative profession.
Lecturers are attached to a version of academic freedom which means that each believes they have a distinctive perspective on their subject and leads to what Sir Peter Swinnerton-Dyer, former Chief Executive of the Universities Funding Council, called the “minor uniqueness” of lectures (THES, 5 July 1991). The professional ideology of university lecturers emphasizes a highly personalized approach to teaching. The “not invented here” syndrome, long a justification for not using a single set text (other, perhaps, than the one the lecturer has written), does not provide a fertile environment for the adoption of off-the-shelf courseware. Teaching, especially lecturing, is viewed by its practitioners as essentially an art form, with the prizes going to prima donnas. Under direct questioning the majority of lecturers may claim that the fact that a particular piece of computer courseware has been developed elsewhere is no bar to its use in their own courses (e.g. Allen et al. 1996), but, in practice, there has to date been little take-up of programs developed by others. The first round of the CTI, in the mid-1980s, funded the development of many dozens of pieces of courseware, the vast majority of which have sunk without trace. In the US an attempt to establish a national clearing-house for social science courseware was similarly unsuccessful. The claim by the developers and vendors of some technological innovations that adoption of their product will, by itself, lead to a rapid improvement in student learning is, not surprisingly, likely to be met with considerable scepticism. Part of the problem lies in the lack of attention to pedagogical concerns which sometimes seems to characterize CAL materials.
Although it can be claimed that many forms of traditional teaching—the mass lecture with the actor-lecturer, the tutorial based on notions of a Socratic dialogue, the workshop based on images of apprentice training, the reading list based on images of the gentleman-scholar—have themselves evolved in relative isolation from knowledge about how students may most effectively learn, lecturers have themselves learnt through them and they have the great advantage of having stood the test of time. New techniques need to demonstrate their pedagogical advantages.
Even the demonstration of the effectiveness of a piece of CAL software is not without its hazards, given the relatively low esteem in which self-conscious attention to teaching has been held in British academic life. Despite the establishment of national quality assessment procedures, it remains the case that the route to both personal and institutional success in academia is generally perceived to be through published research. Institutions may be able to capitalize on achieving a rating of excellence in teaching quality exercises and, in Scotland at least, receive modest financial reward from such an attainment, but they will receive far more rewards in terms of both money and prestige by achieving a high grading in the research assessment exercise. The relative advantage of pursuing research rather than teaching is even more pronounced for individual members of staff: the route to promotion is likely to be much shorter via the publication of research than through the demonstration of good teaching. And in the research assessment exercise, where each member of staff has to be considered separately to see whether they are “research active”, the publication of material designed for teaching counts for little. Although some attempts have been made to promote the use of new technology for teaching in terms of its effect in increasing the efficiency of the teaching process—and thus releasing lecturers’ time for research—there is little general evidence that either individual members of staff or institutions which have invested heavily in IT have received much recognition or reward. In order to establish the credibility of CAL it must be possible to show that the use of IT leads to real improvements in both the effectiveness of student learning and the efficient use of resources, including academic staff. The available evidence is, at best, sketchy.
Attempts to evaluate the effectiveness of CAL frequently turn out to be little more than uncritical descriptions of a piece of software, extracted from context, and with little comparative evidence. Dixon (1992, p. 24) notes:
There is a marked absence of rigorous evaluation techniques in CAL publications and, since articles are generally written by the same enthusiastic people who implemented the software, there is room for concerns about objectivity.
It is not simply that many people appear unconvinced that CAL represents a clear advance over existing forms of teaching; scepticism may well be a sensible stance in the face of unbridled enthusiasm. More seriously, some teachers, including a number of respected social science professors (SocInfo 1989, unpublished), appear to believe that the use of CAL necessitates a mechanistic form of teaching and will lead to stereotyped and uncritical forms of learning. Although some holders of this position concede that the use of computers in teaching may have some benefits for the development of simple skills and for rote learning, they appear to believe that it has little to offer for the discursive forms of learning required to grasp the social sciences. At its extreme the use of CAL is equated with the alienation of the students from learning and the academic community. Claims that the use of computers in teaching will release staff time are, if anything, treated with even less belief than those which focus on the effectiveness of CAL. The preparation of lectures and tutorials is a familiar activity; the inclusion of computer-based material in a course requires new skills and new patterns of work. The fact that the yardstick is frequently in terms of the amount of time required to produce new pieces of software gives rise to horrifying figures: Whiting (1990) suggests that one hour of commercial-quality courseware may require the input of 300 person-hours of preparation.
Even if the number of hours required can be reduced to a fraction of this figure, it seems unlikely that those who devote themselves to the preparation of courseware will reap much reward. It can, in any case, be argued that the model of the individual courseware developer is profoundly wasteful.
It is not only the ideology of academia that militates against the wholesale and enthusiastic take-up of new technology. There are also a number of financial and structural obstacles which need to be overcome. Some, relating to patterns of promotion and prestige, have already been mentioned; others relate to the way in which resources are distributed and controlled within universities. The supply and maintenance of computers and their peripherals is generally handled through existing committees set up to manage scientific equipment. This may well lead to disparities in the take-up of CAL. Departments which have no previous history as major equipment-users find themselves at a disadvantage in arguing the case for new tranches of equipment, money and space; conversely, those departments which do have a history as equipment-users may be able to argue that by devoting a certain proportion of their budgets to, say, student computer laboratories rather than a wet laboratory, they are making a direct contribution to improved efficiency. The argument is not just about hardware and software: it extends to “liveware”. Few departments in the arts and social sciences have their own technical support; in the sciences it may be relatively easy to re-train a laboratory technician to become a provider of computer support. Even in the case of well-endowed science departments, however, the pace of change in IT may lead to severe strains being placed on budgets. The replacement cycle for micro-computers is short and the effort to keep up with acceptable technology at a time when there are an increasing number of other claims on relatively static university budgets may prove an impossible financial burden. At the very least, it is likely that equipment and finance committees will require a clear demonstration of the cost-effectiveness of the investment proposed.
The provision of such data is difficult and, in most cases, involves more acts of faith than of solid accountancy. Changes in the conditions of work of academic staff make it difficult for them to depart from old ways and invest the time and effort needed to acquire new skills. Staff-student ratios, which historically have been much lower in British universities than in most other parts of the world and have enabled the relatively lavish use of personal tutorials, have suffered a considerable degradation in the last ten years, often doubling (to around 1:20). At the same time, the need for all staff to contribute to the income-raising business of research has reduced the incentive to take time out to develop new teaching methods. Although there is now much more overt attention to staff training in teaching methods than used to be the case, it is not always immediately clear what the benefits to individual members of staff are in undertaking the training. And, even if staff are interested in acquiring the skills necessary to explore the possibilities of ICT for teaching and learning, it may be difficult for them to find the time and energy to devote to the task. The point is often made that the uses of technology are not value-free, but reflect the values of those who implement them. Given the inertia built into institutions it is not surprising that ICT may be introduced as a way of doing the same thing more efficiently. CAL may be seen as a way of reinforcing the learning done in lectures, reducing the amount of contact required. Computer-assisted assessment may be seen as a way of improving the efficiency of the marking process in examinations. Networks may be seen as a means of concentrating resources: enabling staff to be located in one institution, with students taking the courses remotely. The end result of this process might be a small number of central institutions, with others serving simply as satellites.
Hardly surprisingly, this prospect is likely to be welcomed only in those institutions which feel confident that they will be the hub sites. Most evaluations of the use of ICT in teaching and learning are based on old paradigms, examining the technology simply as a tool for pursuing existing objectives. Given the constraints of funding it may be necessary to show that ICT can be used to improve the efficiency and effectiveness of conventional higher education, but this ignores the potential of the new technologies for a more radical restructuring of the educational process. It is in relationship to these possibilities that the relevance of the social sciences
becomes most apparent and the possibilities for the use of ICT for teaching and learning become most exciting.
The special case of the social sciences
Some of the reasons for the slow take-up of ICT in the teaching of social science in the UK reflect factors already mentioned: the emphasis on research rather than teaching, the lack of well-funded teaching laboratories and support staff in most social science departments, and the ideological preferences of some senior academics who have occupied influential positions. The funds devoted to the development of courseware and other computer-based learning resources for the social sciences are a small fraction of those devoted to the natural and physical sciences. The particular nature of the learning task required of social science students, with its emphasis on engagement and criticism, has also been a deterrent to the early adoption of ICT in the area. While it may be relatively easy to produce a computer-based module on, say, the technical aspects of population processes, it has proved more difficult to date to use CAL to engage students in consideration of the social, cultural and ethical issues involved in such elements of the processes as the determination of family size, considerations of the quality of life, risk-taking behaviour, etc. Much of the courseware being produced by such initiatives as TLTP in Britain or Högskolans Grundutbildningsrådet in Sweden is directed to the needs of the physical and natural sciences and emphasizes the development of analytical skills based on individual repetition. Although recent software may be a far cry from early rote-learning programs, there remains relatively little interaction between the student and the program. There is also a selective factor at work in terms of the sorts of student who are attracted to social science courses, especially those in sociology, social policy and social work. A high proportion are female, many are mature (i.e.
in their late 20s–40s), with considerable experience of work and/or child-rearing. To the positive reasons for their choice of subject is often added a negative one: a dislike—even fear—of the mathematics assumed to characterize the natural sciences and, less obviously, computing. Given the nature of the student intake it will be several years before the investment in school computing is translated into computer proficiency at the entry to undergraduate studies. Surveys of in-coming students to the Stirling Department of Applied Social Science show that fewer than half claim any previous familiarity with computers. In response to questions about their attitudes towards computers, a significant proportion, overwhelmingly female, confess that they are afraid of looking stupid if they make a mistake while using a computer and that they believe that working with computers is likely to lead to social isolation. At school the common story is that computers are monopolized by boys in out-of-class games use, while during classes the machines are used principally in support of mathematics, physics and language courses (in all of which excellent CAL software exists). Language courses are dominated by girls, but mathematics and the physical sciences remain largely male preserves. At primary school level the story is more equitable, with CAL packages being used across a wide range of the curriculum, but at secondary level (13–18) the identification of computers with mathematics has an unfortunate sequel. Students who avoid mathematics at school also avoid computers, and may face an anxious time when confronted with either or both in the research methods courses which form such an important part of most core programmes in the social sciences.
The fact that a high proportion of the students may be returning to formal education after a prolonged absence, may have a rather fragile sense of their academic competence and are frightened of “making mistakes” may make the computer laboratory a traumatic experience. Far from encouraging the open flexible approach to learning espoused by CAL enthusiasts, the experience may result in a rigid and formal approach in which most effort is expended on ensuring that the right keys are pressed rather than attempting to understand what is intended to be imparted. Efforts to use the computer literates and enthusiasts among the class may
sometimes help, but may also serve to reinforce the latent insecurity lurking beneath the surface in some students. In the majority of cases, it is possible to overcome the barriers, but the effort to do so is likely to be expensive in terms of labour input and may severely dent the claims for gains in efficiency. The picture is not entirely depressing. The most recent edition of the SocInfo Guide to Resources (1997) lists 79 course packages. There are instances of good uses being made of CAL in such areas as the teaching of urban dynamics (SimCity), decision-making in social work (Ethics), introductory economics (WinEcon) and a host of packages for teaching survey research and statistics. Social science discussion groups are alive and well.2 Some branches of the social sciences, such as demography, have been revolutionized by the development of the spreadsheet and similar tools. The availability of large databases and resources such as national newspapers on CD-ROM makes it possible for students to browse among original source materials to a much greater extent than was previously the case. At the senior undergraduate level the ability of students to use a word processor and to produce graphs rapidly and accurately has greatly improved the level of presentation expected of final projects and has probably led to an increase in the sophistication of analyses. The use of questionnaire packages, such as PinPoint, has made the production of student questionnaires more professional and left less room for error. But one is still left with a feeling that there should be more to show for the investment of time, money and effort which has gone into CAL than these desirable but relatively minor achievements.
Future possibilities: computer networks and educational communities
To date most applications of ICT in the teaching of the social sciences have been substitutes for, or additions to, conventional approaches.
This is not to deny that these applications may represent an important augmentation of the teaching and learning processes. Rather than getting students to take (partial) notes of a (selective) lecture on, say, a piece of government legislation, they can be referred to the CD-ROM containing the text of the original parliamentary debate; rather than an exercise based on artificial (and often sanitized) data they can use a questionnaire package to conduct their own survey or make use of one of the data sets downloadable either from CD-ROM (e.g. British census data or OECD datasets) or across the academic network (e.g. the General Household Survey). With the use of appropriate utility and tutorial programs they can revise and practise their skills in statistics (Statistics for the Terrified) or the interpretation of tables. More exciting possibilities lie on the horizon, making use of improvements in the processing power of hardware and of the coming-together of information and communication technologies. The increased power available on the desktop means that GUIs can become standard, removing the necessity for typing arcane command-language instructions. Although it cannot be assumed that all users can immediately interpret icons or, indeed, manipulate mice, the number who experience difficulty appears to be much lower than was the case in systems dependent on DOS. In addition to allowing the rapid manipulation of textual and numerical information the increase in power also presages further developments in the use of graphics, audio and video modes. When the desktop provides a transparent interface to the outside world through the Internet, as promised by the new generation of set-top boxes, the possibilities for new forms of learning are radically increased. And just over the skyline is the possibility of virtual reality.
2. [email protected]; [email protected]; [email protected]; [email protected]; [email protected]; [email protected]
In early 1997 the EU established a working party on the development of multimedia educational applications, and under the Fourth Framework Telematics Applications Programme a large number of collaborative projects are being considered, making use of a variety of networked facilities. The particular significance of these developments for teaching and learning in the social sciences lies in the analogy between social and electronic networks and in the facility of computer-aided communication to allow the development of shared understandings. In remote areas the use of telematics networks to deliver interactive multimedia modules, supported by computer conferencing techniques, will provide both a valuable educational resource and a basis for the renegotiation of a sense of community. Video-conferencing is a further area which promises to provide both a rich context for research and a further opening-up of teaching and learning. The development of computer-mediated communication for the remote supervision of student placements in practice-oriented disciplines such as social work or nursing will enable these to take place in a variety of settings which are currently difficult to establish. The use of telematics will enable the sharing of placement and other educational experiences across national boundaries. The boundaries between education, leisure and information may also be expected to become more flexible. “Edutainment” is already with us, in the guise of such programs as SimCity, but there remains great potential for the development of programs based on interactive game-playing in such areas as group processes, decision-making and conflict resolution. The opportunity exists for universities to make their stores of knowledge available in user-friendly forms, perhaps on the Web.
There are already a large number of Websites containing material of immediate use to the social science student.3 The material does not have to be passive: a project under development at Stirling concerns the provision of information and advice on drugs. Based on the facilities of the Scottish Drugs Training Project, the intention is to develop and evaluate a multimedia drugs information system which will include textual, graphical, audio and video information on substance use and abuse, and which will also support e-mail and conferencing facilities, providing opportunities for drug users to receive advice and counselling and to record their experiences in a controlled environment. Similar possibilities exist in many other areas where social science departments have skills and knowledge relevant to real-life issues. More generally, developments based on the model of multi-user discussion groups, themselves descended from the earlier multi-user "dungeons and dragons" games, promise the possibility of collaborative learning in the electronic replication of social networks: the University of Huddersfield, in collaboration with SocInfo (see Glossary), has recently obtained a grant from the Joint Information Systems Committee for the development of just such an on-line discussion group.4 The main limitation to developments in this field is our imagination.

The use of information and communication technologies in university teaching has so far been a supplement to existing methods. Given the negotiated nature of knowledge in the social sciences, it is hardly surprising that the spread of CAL has been relatively slow, and that much of the most valuable software to date has consisted of productivity tools such as document-processors, spreadsheets and databases, rather than courseware itself.
In contrast to earlier twentieth-century developments in teaching and learning technologies, however, the spread of computer-based information and communication technologies presages a wholesale revolution in the nature of teaching and learning, and promises to be the matrix of the learning society of the twenty-first century.
3. http://www.stir.ac.uk/socinfo
INFORMATION TECHNOLOGY AND TEACHING IN THE SOCIAL SCIENCES
References

P.Allen, S.Booth, P.Crompton, D.Timms, Case studies: integrating learning with technology (Stirling: University of Stirling: Project VARSETILE, 1996). (http://www.annick.stir.ac.uk)
M.Dixon, The uptake of IT as a teaching aid in higher education: a social science perspective (Oxford: Computers in Teaching Initiative Support Service, 1992).
J.Gardner, "Computers in higher education teaching and learning", in Computing across the curriculum, J.Gardner & F.McBride (eds) (Oxford: Computers in Teaching Initiative Support Service, 1990).
N.Gardner, "Bringing the electronic campus to reality". Paper presented at Conference on the Electronic Campus, quoted in Dixon (1992).
Joint Information Systems Committee, Exploiting information systems in higher education: an issues paper (Bristol: HEFCE, 1995).
D.Laurillard, Rethinking university teaching: a framework for the effective use of educational technology (London: Routledge, 1993).
J.Whiting, "Interactive learning in Europe", in The interactive learning revolution, J.Barker & N.Tucker (eds) (London: Kogan Page, 1990).

Software

SimCity
PinPoint
WinEcon
OECD
Statistics for the Terrified

Further reading

P.Allen, S.Booth, P.Crompton, D.Timms, Integrating learning with technology: case studies II. Project VARSETILE (Stirling, 1997). http://annick.stir.ac.uk.
M.S.Henry (ed.), SocInfo newsletters, SocInfo publications (Stirling: University of Stirling, 1990–97).
M.S.Henry (ed.), Studying and using technology: a guide for social and political science students (Oxford: Blackwell, 1997).
M.S.Henry, "The role of computers in social policy", in The SPA student's companion to social policy (Oxford: Blackwell, 1997).
GLOSSARY
BMDP: a computer program for statistical analysis
CAL: Computer Aided Learning
CBL: Computer Based Learning
CHEST: Combined Higher Education Software Team
CSCW: Computer-Supported Co-operative Work
CTIC: Computers in Teaching Initiative Centre
CU-SeeMe: a video-conferencing program
DesignNet: a program
eLib: Electronic Libraries Programme
FTP: File Transfer Protocol
GIS: Geographical Information System
GLIM4: a computer program for generalized linear models
GraphIT!: a computer program
GUI: Graphical User Interface
HCI: Human Computer Interaction
ICT: Information and Communications Technologies
ISDN: Integrated Services Digital Network
ITTI: Information Technology Training Initiative
MIDRIB: Medical Images Digitized Reference Information Bank
Minitab: a computer program for statistical analysis
MTL: Multimedia Teaching and Learning
NUD.IST: Non-Numerical, Unstructured Data Indexing Searching and Theorising: a computer program for qualitative data analysis
PinPoint: a computer program for quantitative data analysis
PowerPoint: a computer program for presentations on- and off-line
ROCOCO: RemOte Cooperation and COmmunication
SAS: a computer program for quantitative data analysis
SPSS: Statistical Package for the Social Sciences: a computer program for quantitative data analysis
TILT: Teaching with Independent Learning Technologies
TLTP: Teaching and Learning Technology Programme
TQA: Teaching Quality Assessment
ULTRALAB: a research unit specializing in multimedia at the University of East Anglia
VARSETILE: Value-Added Reuse at Stirling of Existing Technology in the Learning Experience, an institutional TLTP project based at the University of Stirling
WIMP: Windows, Icons, Menus, Pointer

SocInfo

SocInfo is the national CTI Centre for Sociology, Politics and Social Policy, based at the University of Stirling and funded under a joint programme by all the UK Higher Education Councils until July 1999.
Since its inception in 1989, SocInfo’s remit has been to encourage academics in the social and political sciences to improve the quality of teaching and learning by using the range of new technologies effectively.
Consequently, SocInfo offers a number of core services: information and advice, reviews of courseware and software, training seminars and workshops, and a range of regular publications on- and off-line. To assist in the huge task of dissemination, SocInfo has established a network of representatives who act as local gateways to the regions across the UK. In addition, a formal Advisory Group meets three times a year, chaired by Professor Sally Brown, Deputy Vice-Principal of Teaching and Learning at the University of Stirling. The SocInfo Advisory Group is made up of members from SocInfo, the British Sociological Association, the Political Studies Association, the Social Policy Association and other related associations. The Centre also has close collaborative links with a number of CIT (Communication and Information Technologies) related projects. For further details about SocInfo, refer to: http://www.stir.ac.uk/socinfo/

Further information

For further details of software and more information on technical terminology please refer to: G.R.Gibbs (ed.), M.S.Henry (Series ed.), SocInfo guide to IT resources in sociology, politics and social policy, edn 4, no. 2 (SocInfo: University of Stirling, 1997). ISBN: 0951 5746 3 9.
INDEX
academic staff
  agnostic approach to IT 116–17
  evangelistic approach to IT 114–15
  incentives and disincentives for IT uptake 122–3
  influence of organizational ethos on IT uptake 120
  interdisciplinary approach to IT 44, 148, 176
  resistance to IT 113–14, 184–6
  use of educational technology 130–1
  see also teaching
access to technology
  by students 123–4, 134–5
  Cambridge On-Line City Project 24–5
  economic limitations 36
  gender differences 31–2
  in home 28–33
  inequality 21, 23, 25
  social class differences 32–3
active learning 167, 170–1, 181–2
administrative planning 182–3
agnostic approach to IT 116–17
Argyris, C. 169
assessment 18–19, 94–5, 132
  teaching quality assessment 140–1, 143, 185
  see also evaluation
attitudes 117–18
audio CD players 28–30
audio conferencing 161
authoring 45–6, 67, 70
Bates, A. 151
behaviour, impact of technology on 19
Beishuizen, J.J. 173
Bloom, B.S. 168
Brackenbury, S. 114–15
BT, virtual degree courses 9–11
CAL see computer aided learning
Cambridge Childcare Information Project 25
Cambridge On-Line City Project 24–5
case studies
  pedagogic advantages and limitations of 167–70
  use in multimedia courseware 170–1, 175–7
change
  in society 12–13
  workplace resistance to 118–20
childcare services 25
children, technology in homes of 30–1
co-operation see computer-supported co-operative work
compatibility, of hardware and software 124–5
computer aided learning (CAL)
  academic responses to 185–6
  evaluation of 185
  role in teaching 92, 93–6
  as substitution or support for teaching 87–9, 90–1, 96, 99–100
  see also courseware
computer conferencing 161
computer enthusiasts 182
computer literacy 6, 180–1
computer networks 191–2
computer-based learning (CBL) courseware 133–4, 165
computer-supported co-operative work (CSCW) 75–6, 105, 107–9, 110
  communication in DesignNet project 78–80
  future of 82–3
  team work 81–2
computers
  compatibility 124
  household ownership of and access to 28–33
  student ownership of 135
consolidation in teaching 94–5
Correlation Explorer 96–7
cost-benefit analysis 152–3
cost-effectiveness analysis 153–4
costs
  comparison of educational technologies 160–1
  concepts of 154–6
  of courseware development 165, 176, 186
  of distance and traditional teaching 159–60
  identifying and estimating technology costs 156–9
courseware
  academic resistance to 113–14, 184–6
  adaptation of 131
  advantages of multimedia courseware 166–7, 173–5
  costs of 165, 176, 186
  development of see data game approach to statistics; DesignNet project; GraphIT!; Ideologies of Welfare
  examples of CAL software 96–9
  incentives and disincentives to develop 122–3
  interdisciplinary approach to 44, 175–7
  role of CAL software in teaching 92, 93–6
  role in interdisciplinary knowledge integration 171–3
  for social sciences 188, 189–90
  for social work students 133–4
  as substitution or support for teaching 87–9, 90–1, 96, 99–100
  use of case studies in 170–1, 175–7
data game approach to statistics 53–5
  evaluation of 61–3
  examples of 55–61
design, use of computers in collaborative work 75–6
DesignNet project 76–8, 82–3
  methods of working 80
  task meaningfulness 81
  team work 81–2
  technological communication 78–80
disabled students 133, 134, 135, 146
disciplines
  specialization of 172
  see also interdisciplinary approach
distance learning 156, 159–60
Dixon, M. 174–5, 185
e-mail 24
economic approach to educational systems 152–6
economic approach to educational technologies 152
  comparing costs 160–1
  costs of distance and traditional teaching 159
  identifying and estimating costs 156–9
economies of scale 157–8, 160, 162
editorial structure in courseware development 46–7
education
  evolution of 5–8, 11–12
  half-life education 9
  impact of IT on 26
  responses to technological change 20
  virtual degree courses 9–11, 12
  see also higher education
educational communities 191–2
educational systems
  concept of costs 154–6
  cost-benefit analysis 152–3
  cost-effectiveness analysis 153–4
educational technologies
  comparing costs of 160–1
  costs of distance and traditional teaching 159–60
  definition of 130
  factors influencing choice of 151–2
  identifying and estimating costs 156–9
  non-standard students' use of 133, 134, 135, 146
  student access to and use of 123–4, 131–2, 134–5
  see also ICT
effectiveness, and use of ICT 181–2, 185–6
efficiency, and use of ICT 181, 186
egological organizations 108–9
elaboration in teaching 95
employment patterns, impact of IT on 34–5
equality see access to technology
ethnography 107–9
evaluation
  of CAL 185
  in courseware development 47–9, 67, 68, 71
  of data game approach to statistics 61–3
  see also assessment
evangelistic approach to IT 114–15
exams see assessment
experiential learning 167, 170
gender differences
  in access to IT 31–2
  and student attitudes to ICT 188–9
government policy 21–2, 25–6
GraphIT! 41–2
  authoring of 45–6
  editorial structure 46–7
  evaluation of 47–9
  interdisciplinary work 44
  tutorial programs 42–3
groupware see computer-supported co-operative work
Hague, R. 114–15
half-life education 9
hardware see computers
higher education
  access to technology in 123–4, 134–5
  barriers to ICT uptake in teaching 183–8
  future use of ICT in social science 190–2
  ICT in context of 121–2, 139–40, 180–1, 186–7
  ICT uptake in social science 188–90
  influences on ICT use 118, 120, 180–3
  virtual degree courses 9–11, 12
home, IT in 28–33, 35–6
homeworking 34–5
Human Computer Interaction (HCI) approach 105–6
humans, relationship with technology 3–4, 5
ICT (information and communication technology)
  academic resistance toward 113–14, 184–6
  advantages and disadvantages of use in sociology 146–9
  agnostic approach to 116–17
  barriers to uptake in teaching 183–8
  economic limitations on impact of 36
  evangelistic approach to 114–15
  hardware and software compatibility 124–5
  in higher education context 121–2, 139–40, 180–1, 186–7
  impact on employment patterns 34–5
  impact on leisure and service consumption 35–6
  incentives and disincentives for uptake 122–3
  influences on use in higher education 118, 120, 180–3
  liberatory potential of 17–18
  relationship with social science 179
  sociology students' use of 141–6
  support for users 125–6, 135, 145–6
  uptake in social science teaching 104–5, 110, 188–92
  see also access to technology; educational technologies
Ideologies of Welfare (IoW) 65–6
  multimedia lesson in 68–73
Inspiration 97–8
interactive benefits of courseware 167, 170–1
interactive graphs 55–7
interdisciplinary approach
  role of courseware 171–3
  to IT development 44, 81–2, 148, 175–7
Internet 36–7, 134
  see also World Wide Web
IT (information technology) see ICT
Laurillard, D. 90, 91
learning
  active and experiential 167, 170–1, 181–2
  CAL as tool for 95–6
  impact of courseware on motivation 166–7
  student-centred 167, 169, 174
  see also teaching
leisure, impact of IT on 35–6
life-long learning 181–2
linefitting, in statistics courseware 55–7
MacLaboratory 98–9
modelling see simulation and modelling
motivation
  impact of multimedia courseware on 166–7
  to use ICT in teaching 122–3, 180–2
multidisciplinary work see interdisciplinary approach
multimedia teaching and learning (MTL) 66–8, 165–6
  Ideologies of Welfare lesson 68–73
  see also courseware
non-linearity of multimedia courseware 167, 170, 173–4, 175
non-standard students
  definition of 130
  use of educational technology 133, 134, 135, 146
occupation differences, in access to IT 32–3
organizational ethos, influence on behaviour 118–20
pattern recognition, in statistics courseware 57–61
pharmacy, multimedia teaching and learning 66–8, 73
plagiarism 147
policy on technology 19–20, 21–2, 25–6
presentation 91–2, 132, 166–7
public services
  impact of IT on consumption 35–6
  information about 24–5
quality see evaluation; teaching quality assessment
Rector, A.L.R. 89
remediation in teaching 93–4
research
  funding for 22
  rewards of 185
resources for IT in higher education 147–8, 186–7
Rogers, C.R. 174
sensitivity analysis 158–9
services
  impact of IT on consumption 35–6
  information on public services 24–5
simulation and modelling 133, 167, 170, 174
social class differences in access to IT 32–3
social science
  ICT uptake in teaching and learning 188–92
  relationship with IT 179
  technology in teaching 104–5, 110
society, technological change in 12–13
sociology
  advantages and disadvantages of IT in 146–9
  role of IT for students of 141–6
software
  compatibility 124–5
  higher education standards of 124, 183–4
  see also courseware
special needs students 133, 134, 135, 146
statistics
  data game approach to teaching 53–5
  evaluation of data game approach 61–3
  examples of data game approach 55–61
  GraphIT! courseware 42
  problem of learning and teaching 51–2
student-centred learning 167, 169, 174
students
  access to and use of ICT 123–4, 131–2, 134–5
  attitudes of social science students to ICT 188–9
  definition of non-standard students 130
  non-standard students' use of ICT 133, 134, 135, 146
  sociology students' use of ICT 141–6
subject-specific learning objectives 168
support for IT users 125–6, 135, 145–6
teachers see academic staff
teaching
  barriers to ICT uptake in 183–8
  CAL software as substitution or support for 87–9, 90–1, 96, 99–100
  functions of 91–5
  ICT uptake in social science 104–5, 110, 165–6, 179, 188–92
  influences on ICT use 122–3, 180–3
  with Internet 134
  recent research on 89–91
  use of case studies see case studies
  see also academic staff; courseware
teaching quality assessment (TQA) 140–1, 143, 185
technologist perspective 103, 104, 182
technology
  and change in society 12–13
  communication between humans and 5
  developing relationship with humankind 3–4
  government policy on science and 21–2, 25–6
  impact on behaviour 19
  see also access to technology; educational technologies; ICT
technomethodological approach 108–9
telematics 191
telepresence technology, and shared experience 8–11
test choice module of statistics courseware 57–61
TILT (teaching with independent learning technology) 47
transferable skills 168–9
transnational team work 81–2
tutorial programs 42–3
tutorials 93
video conferencing 161, 191
video recorders 28–30
Wagner, L. 159
welfare ideology see Ideologies of Welfare
workgroup computing see computer-supported co-operative work
World Wide Web 24–5, 191
  see also Internet