Developments in Marine Technology, 12
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
Challenges for the XXI Century
DEVELOPMENTS IN MARINE TECHNOLOGY

Vol. 1  Marine and Offshore Safety (P.A. Frieze, R.C. McGregor and I.E. Winkle, Editors)
Vol. 2  Behaviour of Offshore Structures (J.A. Battjes, Editor)
Vol. 3  Steel in Marine Structures (C. Noordhoek and J. de Back, Editors)
Vol. 4  Floating Structures and Offshore Operations (G. van Oortmerssen, Editor)
Vol. 5  Nonlinear Methods in Offshore Engineering (S.K. Chakrabarti)
Vol. 6  CFD and CAD in Ship Design (G. van Oortmerssen, Editor)
Vol. 7  Dynamics of Marine Vehicles and Structures in Waves (W.G. Price, P. Temarel and A.J. Keane, Editors)
Vol. 8  A Course in Ocean Engineering (S. Gran)
Vol. 9  MARIN Jubilee 1992: Special Jubilee Volume (MARIN, Editor)
Vol. 10 Hydrodynamics: Computation, Model Tests and Reality (H.J.J. van den Boom, Editor)
Vol. 11 Practical Design of Ships and Mobile Units (M.W.C. Oosterveld and S.G. Tan, Editors)
Vol. 12 Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century (L. Beranzoli, P. Favali and G. Smriglio, Editors)
Developments in Marine Technology, 12
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
Edited by
Laura Beranzoli, Paolo Favali, Giuseppe Smriglio†
Istituto Nazionale di Geofisica e Vulcanologia, Roma, Italy
ELSEVIER SCIENCE B.V. Sara Burgerhartstraat 25 P.O. Box 211, 1000 AE Amsterdam, The Netherlands
© 2002 Elsevier Science B.V. All rights reserved.
This work is protected under copyright by Elsevier Science, and the following terms and conditions apply to its use:

Photocopying
Single photocopies of single chapters may be made for personal use as allowed by national copyright laws. Permission of the Publisher and payment of a fee is required for all other photocopying, including multiple or systematic copying, copying for advertising or promotional purposes, resale, and all forms of document delivery. Special rates are available for educational institutions that wish to make photocopies for non-profit educational classroom use.

Permissions may be sought directly from Elsevier Science Global Rights Department, PO Box 800, Oxford OX5 1DX, UK; phone: (+44) 1865 843830, fax: (+44) 1865 853333, e-mail: [email protected]. You may also contact Global Rights directly through Elsevier's home page (http://www.elsevier.com), by selecting 'Obtaining Permissions'.

In the USA, users may clear permissions and make payments through the Copyright Clearance Center, Inc., 222 Rosewood Drive, Danvers, MA 01923, USA; phone: (+1) (978) 7508400, fax: (+1) (978) 7504744, and in the UK through the Copyright Licensing Agency Rapid Clearance Service (CLARCS), 90 Tottenham Court Road, London W1P 0LP, UK; phone: (+44) 207 631 5555; fax: (+44) 207 631 5500. Other countries may have a local reprographic rights agency for payments.

Derivative Works
Tables of contents may be reproduced for internal circulation, but permission of Elsevier Science is required for external resale or distribution of such material. Permission of the Publisher is required for all other derivative works, including compilations and translations.

Electronic Storage or Usage
Permission of the Publisher is required to store or use electronically any material contained in this work, including any chapter or part of a chapter. Except as outlined above, no part of this work may be reproduced, stored in a retrieval system or transmitted in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without prior written permission of the Publisher. Address permissions requests to: Elsevier Science Global Rights Department, at the mail, fax and e-mail addresses noted above.

Notice
No responsibility is assumed by the Publisher for any injury and/or damage to persons or property as a matter of products liability, negligence or otherwise, or from any use or operation of any methods, products, instructions or ideas contained in the material herein. Because of rapid advances in the medical sciences, in particular, independent verification of diagnoses and drug dosages should be made.
First edition 2002

Library of Congress Cataloging in Publication Data
A catalog record from the Library of Congress has been applied for.

ISBN: 0-444-50591-1

∞ The paper used in this publication meets the requirements of ANSI/NISO Z39.48-1992 (Permanence of Paper).

Printed in The Netherlands.
In memory of Giuseppe Smriglio

To our friend Giuseppe, who passed away suddenly, shortly before the publication of this volume. In conceiving GEOSTAR, Giuseppe left an indelible mark on the scientists and technologists involved in deep-sea exploration. His insight lives on, guiding us as we continue along the path he traced.

Laura Beranzoli - Paolo Favali
Table of Contents

Acknowledgements
L. Beranzoli, P. Favali, G. Smriglio

Preface ... xiii
J. Boissonnas
Part I - Why deep-sea observatories?

Perspectives and challenges in marine research
G. Ollier, P. Favali, G. Smriglio and F. Gasparoni

Research for the protection of the Deep Sea ... 11
H. Thiel

Deep physical oceanography experimentation and benefits from bottom observatories ... 19
C. Millot

Why global geomagnetism needs ocean-bottom observatories ... 27
F. J. Lowes

Ocean-bottom seismology in the Third Millennium ... 37
J. Bialas, E. R. Flueh, J. P. Morgan, K. Schleisiek and G. Neuhaeuser
Part II - Seafloor observatories: state of the art and ongoing experiments ... 45

Development of seismic real-time monitoring systems at subduction zones around Japanese islands using decommissioned submarine cables ... 47
J. Kasahara

Geophysical Ocean Bottom Observatories or Temporary Portable Networks? ... 59
J.-P. Montagner, J.-F. Karczewski, E. Stutzmann, G. Roult, W. Crawford, P. Lognonné, L. Béguery, S. Cacho, G. Coste, J.-C. Koenig, J. Savary, B. Romanowicz and D. Stakes

H2O: The Hawaii-2 Observatory ... 83
A. D. Chave, F. K. Duennebier, R. Butler, R. A. Petitt, Jr., F. B. Wooding, D. Harris, J. W. Bailey, E. Hobart, J. Jolly, A. D. Bowen and D. R. Yoerger

The MBARI Margin Seismology Experiment: A Prototype Seafloor Observatory ... 93
D. S. Stakes, B. Romanowicz, M. L. Begnaud, K. C. McNally, J. P. Montagner, E. Stutzmann and M. Pasyanos

Towards a permanent deep sea observatory: the GEOSTAR European experiment ... 111
P. Favali, G. Smriglio, L. Beranzoli, T. Braun, M. Calcara, G. D'Anna, A. De Santis, D. Di Mauro, G. Etiope, F. Frugoni, V. Iafolla, S. Monna, C. Montuori, S. Nozzoli, P. Palangio, G. Romeo

NEMO: a project for a km3-scale neutrino telescope in the Mediterranean Sea near the south Italy coasts ... 121
A. Capone
Part III - Marine technologies for deep-sea observatories ... 131

Deep Sea Challenges of Marine Technology and Oceanographic Engineering ... 133
G. Clauss and S. Hoog

From Abel to GEOSTAR: development of the first European deep-sea scientific observatory ... 143
F. Gasparoni, D. Calore and R. Campaci

Design and realization of Communication Systems for the GEOSTAR project ... 161
J. Marvaldi, Y. Aoustin, G. Ayela, D. Barbot, J. Blandin, J. M. Coudeville, D. Fellmann, G. Loaec, Ch. Podeur, A. Priou

Gravimeter for Deep Sea Measurements ... 183
V. Iafolla and S. Nozzoli
Part IV - Marine environmental and risk assessment ... 199

The Deep-sea as an Area for Geotechnical Intervention ... 201
H. U. Oebius and H. W. Gerber

Offshore hydrocarbon leakage: hazards and monitoring ... 217
G. Etiope, P. Carnevale, F. Gasparoni, M. Calcara, P. Favali and G. Smriglio

The use of a Coastal HF Radar system for determining the vector field of surface currents ... 229
G. Budillon, G. Dallaporta and A. Mazzoldi

The Italian Tsunami Warning System: state of the art ... 247
A. Maramai, A. Piscini, G. D'Anna and L. Graziani

Advanced technologies: equipments for environmental monitoring in coastal areas ... 261
G. Zappalà
Acknowledgements

This volume is one of the most significant results of the conference "Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century", held in Erice and Ustica, Italy, in September 1999. The volume aims to present the state of the art of developments in technology and scientific research applied to deep-sea long-term observatories. The scientific goals of these observatories in Earth sciences and environmental studies are presented, together with some of the most recent long-term monitoring experiments worldwide (sessions I and II). Session III is devoted to the description of new technologies enabling the deployment/recovery and operation of deep-sea long-term observatories. Session IV is a miscellany of contributions mainly addressing risk assessment in the marine environment, from the deep sea to coastal regions, describing systems that complement monitoring by deep-sea long-term observatories.

The Editors wish to thank the Institutions that supported the conference and this volume: the Istituto Nazionale di Geofisica e Vulcanologia and its President, Prof. Enzo Boschi; the European Commission (MAST Programme), the former Director of MAST-III, Dr. Jean Boissonnas, and the Scientific Officer, Dr. Gilles Ollier; the Natural Marine Reserve of Ustica and its Director, Dr. Roberto Sequi; the Commune of Ustica and its Mayor, Dr. Attilio Licciardi; the Ettore Majorana Centre for Scientific Culture and its Director, Prof. Antonino Zichichi. The Editors also wish to thank the referees, for their competence and care in reviewing the papers, and the authors for their valuable contributions. Special thanks are due to Mrs. Sabina Vallati for her careful and patient work as editorial assistant and in arranging the contributions within the volume. Many thanks also to Mrs. Francesca Di Stefano for helping Mrs. Vallati with the preparation of some of the images. The Editors are also very grateful to Dr. Caterina Montuori for her help in preparing the final version of the volume.

Finally, the Editors wish to remember the scientists and friends recently deceased who, in different ways, contributed to the conference and the volume: Dr. Alberto Gabriele, Dr. Vladimir Rukol, Dr. Gianfranco Dalla Porta and Dr. Luc Floury.
Laura Beranzoli Paolo Favali Giuseppe Smriglio
Preface

Pensioners are often assumed to have plenty of spare time for reading. When I undertook to write this Preface at the kind insistence of the editors, I began by refreshing my memory about the history of underwater exploration. It seems that fishermen, even well into the 18th century, believed the sea in the Gulf of Lions to be bottomless. I discovered, or re-discovered, the legend of Alexander the Great's dive in the Arabian Sea; the account given by the Scriptures (Acts XXVII) of St. Paul caught in a storm at sea, his crew trying to measure sea depth to avoid shipwreck; the first attempts, dating from the Renaissance, to design primitive submersibles and diving bells; the efforts by increasing numbers of inventors throughout the 17th, 18th and early 19th centuries to overcome the constraints of underwater exploration. In those days, the main objectives of dives, made with primitive submersibles and equipment, were treasure retrieval and warfare. In parallel, but especially after 1800, information on bathymetry began to accumulate with the multiplication of soundings, and some indications of the possible presence of deep-sea life began to appear.

The mid 19th century is a crucial period. The emergent cable industry needed a reasonably accurate knowledge of bathymetry and sea-floor topography; measurements had of course to be carried out from surface ships. Notable advances also began at that time in underwater navigation, and by 1870, when he imagined the Nautilus, Jules Verne was already able to draw upon a number of recent developments. However, modern oceanography in the accepted sense was not yet born: as we all know, the turning point was the Challenger expedition of 1872-1876. Jules Verne's visions of the sea floor reveal both the power of his imagination and the extent of the misconceptions still prevailing in his day. For a long time, progress in scientific understanding was rather slow.
As a student in geology, I learned about the existence of a mid-Atlantic ridge (was it perhaps a belt of folded sediments with volcanoes all along?) and of deep oceanic trenches. The deep sea floor was deemed to be covered with red clay ("l'argile rouge des grands fonds"). We had not yet heard about plate tectonics. I do not need to recall here the extraordinary advances that began in the 1960s and led to the now generally accepted concept of the deep sea floor as a dynamic environment, a fundamental component of the Earth System. Greatly improved means of accessing ocean depths, with bathyscaphes and later with submersibles, were a determining factor of progress. By the end of the 1980s, manned submersibles had brought in a wealth of information. Much insistence was still placed on the necessity of direct observation by man. Characteristically enough, a Special Issue of the Marine Technology Society Journal devoted to deep ocean research, in June 1990, is entitled "A deepest ocean presence" and specifies: "...Touching bottom is one thing; establishing an effective, meaningful presence is quite another. The latter implies extended durations at depths...in order to conduct experiments, observations, and work at will. It should further imply efficient and versatile operations, with favourable cost, time and logistics elements."

Meanwhile, an important shift of approach was under way. Although the objectives set out above in the quotation from the MTS Journal remained valid, we began to hear about fixed sea-floor stations. Today, monitoring the marine environment, including sea-floor processes, has become a key objective of scientific activity. New requirements can best be met by automated and modular observatories, networked together. These observatories must be multidisciplinary; ideally, they should encompass multiple observing sites on a spatial scale
appropriate to the processes under study. I am pleased to stress that Europe has been quite active in this field. The MAST programme supported, from 1991 to 1994, a feasibility study for an abyssal benthic laboratory. ECOPS (a former joint EC/ESF Committee devoted to Ocean and Polar Science) in 1994 mentioned the possible role of such stations in implementing its so-called Grand Challenge on the Deep Sea Floor as a Changing Environment. The GEOSTAR 1 proposal followed in 1995, benefitting from the earlier feasibility study, and as I write this preface, the GEOSTAR 2 project is in full activity. In the context of another discipline, I should also mention the extensive European collaboration on ANTARES, a project to deploy a neutrino observatory off the coast of SE France.

So, the scene is now set for a new era of investigations on the ocean floor, relying on the most advanced technologies. What can be the place of imagination and excitement in this context, I mean, the sort of feelings conveyed by Jules Verne, that can still be experienced during any deep dive of a manned submersible? Only 50 years ago, Rachel Carson, in "The sea around us", could write of man, whose distant ancestors of geological times came from the ocean: "Over the centuries, with all the skill and ingenuity and reasoning powers of his mind, he has sought to explore and investigate even its most remote parts, so that he might re-enter it mentally and imaginatively.... He found ways to probe its depths, he let down nets to capture its life, he invented mechanical eyes and ears that could re-create for his senses a world long lost.... And yet he has returned to his mother sea only on her own terms..." It may be that man is now substituting his "own terms" for those of the sea. Should this worry us? Perhaps not. Every researcher will claim that, with the powerful means now at his disposal, opportunities for exciting discovery are far from over!
In organising the Erice conference, the editors of this volume have not only provided a striking venue in a very beautiful landscape; their objective was, above all, to stimulate the scientific community, particularly in Europe, in planning frontier research on the sea floor. They should be thanked for their efforts.
Jean BOISSONNAS
former head of the EU Marine Science and Technology (MAST) research programme
List of Authors

Y. Aoustin, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
G. Ayela, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
J. W. Bailey, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
D. Barbot, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
M. L. Begnaud, MBARI, 7700 Sandholdt Road, Moss Landing, CA 95039-0628, U.S.A. (now at Los Alamos National Laboratory, Los Alamos, NM 87545, U.S.A.)
L. Beranzoli, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
L. Béguery, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
J. Bialas, GEOMAR, Research Center for Marine Geosciences, Christian Albrechts University Kiel, Wischhofstr. 1-3, D-24148 Kiel, Germany
J. Blandin, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
J. Boissonnas, European Commission, Rue de la Loi 200, 1049 Brussels, Belgium
A. D. Bowen, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
G. M. Bozzo, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
T. Braun, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy, and INGV-Osservatorio Sismologico Centralizzato Arezzo, Via Uguccione della Faggiuola 3, 52100 Arezzo, Italy
G. Budillon, Istituto Universitario Navale, Istituto di Meteorologia e Oceanografia, Via Acton 38, 80133 Napoli, Italy
R. Butler, Incorporated Research Institutions for Seismology, 1200 New York Ave. NW, Suite 800, Washington, DC 20005, U.S.A.
S. Cacho, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
M. Calcara, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Calore, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
R. Campaci, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
A. Capone, Dipartimento di Fisica, Università di Roma "La Sapienza", P.le Aldo Moro 2, 00185 Roma, Italy
P. Carnevale, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
A. D. Chave, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
G. Clauss, Institute of Naval Architecture and Ocean Engineering, Technische Universität Berlin, Germany
W. Crawford, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
J. M. Coudeville, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
G. Dallaporta, C.N.R., Istituto Studio Dinamica Grandi Masse, S. Polo 1364, 30125 Venezia, Italy
G. D'Anna, Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Geofisico di Gibilmanna, 90015 Cefalù, Palermo, Italy
A. De Santis, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Di Mauro, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
F. K. Duennebier, School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
G. Etiope, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
P. Favali, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Fellmann, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
E. Flueh, GEOMAR, Wischhofstr. 1-3, 24148 Kiel, Germany
F. Frugoni, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
F. Gasparoni, Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
H. W. Gerber, TFH Berlin, Luxemburger Str. 10, 13553 Berlin, Germany
L. Graziani, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. Harris, School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
E. Hobart, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
S. Hoog, Institute of Naval Architecture and Ocean Engineering, Technische Universität Berlin, Salzufer 17-19, 10587 Berlin, Germany
V. Iafolla, Istituto di Fisica dello Spazio Interplanetario (IFSI) CNR, Via del Fosso del Cavaliere 100, 00133 Rome, Italy
J. Jolly, School of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
J.-Y. Jourdain, Thomson Marconi Sonar S.A.S., 525 Route des Dolines, BP 157, 06903 Sophia Antipolis Cedex, France
J.-F. Karczewski, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
J. Kasahara, Earthquake Research Institute, University of Tokyo, 1-1-1, Bunkyo-Ku, Tokyo 113, Japan
J.-C. Koenig, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
G. Loaec, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
F. J. Lowes, University of Newcastle upon Tyne, NE1 7RU, Newcastle upon Tyne, United Kingdom
P. Lognonné, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
A. Maramai, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Roma, Italy
J. Marvaldi, IFREMER, Centre de Brest, BP 70, 29280 Plouzané, France
K. C. McNally, UC Santa Cruz, Dept. Earth Sciences, Santa Cruz, CA 95064, U.S.A.
A. Mazzoldi, C.N.R., Istituto Studio Dinamica Grandi Masse, S. Polo 1364, 30125 Venezia, Italy
C. Millot, Antenne LOB-CNRS, BP 330, 83507 La Seyne-sur-Mer, France
J.-P. Montagner, IPG, Place Jussieu 4, 75252 Paris CEDEX 05, France
S. Monna, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
C. Montuori, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
G. Neuhaeuser, Send GmbH, Signal-Elektronik und Netz-Dienste, Stubbenhuk 10, D-20459 Hamburg, Germany
S. Nozzoli, Istituto di Fisica dello Spazio Interplanetario (IFSI) CNR, Via del Fosso del Cavaliere 100, 00133 Rome, Italy
H. U. Oebius, Technische Universitaet Berlin, FB 6/ITU-WUM, Sekr. VWS, Mueller-Breslau-Str. (Schleuseninsel), 10623 Berlin, Germany
G. Ollier, European Commission, Rue de la Loi 200, 1049 Brussels, Belgium
P. Palangio, Istituto Nazionale di Geofisica e Vulcanologia, Osservatorio Geomagnetico, Castello Cinquecentesco dell'Aquila, 67100 L'Aquila, Italy
M. Pasyanos, UCB Seismographic Station, Berkeley, CA, U.S.A. (now at Lawrence Livermore National Laboratory, Livermore, CA 94551, U.S.A.)
R. A. Petitt, Jr., Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
J. Phipps Morgan, GEOMAR, Research Center for Marine Geosciences, Christian Albrechts University Kiel, Wischhofstr. 1-3, D-24148 Kiel, Germany
A. Piscini, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
Ch. Podeur, IFREMER-Centre de Brest, B.P. 70, 29280 Plouzané, France
A. Priou, ORCA-Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
B. Romanowicz, UCB Seismographic Station, Berkeley, CA, U.S.A.
G. Romeo, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
G. Roult, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
H. Roe, University of Southampton, Waterfront Campus, European Way, Southampton SO14 3ZH, U.K.
J. Savary, Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
K. Schleisiek, Send GmbH, Signal-Elektronik und Netz-Dienste, Stubbenhuk 10, D-20459 Hamburg, Germany
G. Smriglio†, Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
D. S. Stakes, MBARI, 7700 Sandholdt Road, Moss Landing, CA 95039-0628, U.S.A.
E. Stutzmann, IPG, Dept. Seismology, Case 89, 75252 Paris, France
H. Thiel, Poppenbuettler Markt 8a, 22399 Hamburg, Germany
F. B. Wooding, Department of Geology and Geophysics, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
D. R. Yoerger, Department of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
G. Zappalà, Istituto Talassografico-CNR, Spianata San Ranieri 86, 98122 Messina, Italy
Part I - Why deep-sea observatories?
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.

Perspectives and challenges in marine research

G. Ollier (a), Chairman, P. Favali (b) and G. Smriglio (b), Co-Chairmen, F. Gasparoni (c), Rapporteur

(a) European Commission, Rue de la Loi 200, 1049 Brussels (Belgium)
(b) Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Roma (Italy)
(c) Tecnomare SpA, San Marco 3584, 30124 Venezia (Italy)
INTRODUCTION

In the period 8-13 September 1999, the Ettore Majorana Centre for Scientific Culture (Erice, Sicily) hosted the 16th Course of the International School of Geophysics, devoted to "Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century". Over 70 representatives from European and international scientific institutions and industries attended. The Conference focused on the most recent developments in marine technologies and scientific research, with particular emphasis on environmental studies and on the monitoring of major natural risks. The Course concluded with a Round Table specifically devoted to discussing the main topics raised by the conference and, more generally, perspectives and challenges in marine research.

The Conference had a strong focus on benthic observatories, and in particular those aiming at long-term monitoring were reviewed. In geophysical research, where activities relating to deep-sea observatories are already somewhat ahead of other domains, the objective is either to gain better access to the parameters of earthquake sources or to investigate the deep structure of the Earth. Biological research is also a good candidate for benthic laboratories, in particular to help assess deep marine biodiversity. Physical oceanographers also call for long-term
Report of the Round Table on "Science-Technology Synergy for Research in Marine Environment: Challenges for the XXI Century", Ustica, Italy, 14th September 1999.
monitoring of the deep-water mass, because of its impact on global processes and in particular on climate. The GOOS/EUROGOOS/MEDGOOS community (Global Ocean Observing Systems) in particular is interested in such devices. Industry is nowadays moving into deeper waters, which also implies more technology development relating to long-term deep-sea observatories. Such operations in deeper waters carry higher risk and require complete monitoring of the environmental parameters (currents, sediment stability, impact of the exploration/exploitation activities). Even astronomers are now seeking the benefits of deep-water research, giving high priority to deep-water neutrino telescopes that will enable them to receive information from regions of the universe beyond the reach of optical telescopes.

An interpretation rather well accepted by the audience is that, to address the requests put forward by the scientific community, an optimised baseline approach is needed in order to tackle the common technological problems, which are basically: (1) communications, (2) energy, (3) deep-sea component availability and (4) servicing of the deep-sea stations. It is therefore quite clear that collaboration across disciplines in the domain of deep-sea observatories will remain a key approach and will even need to be strengthened. The technology needed for such developments is quite demanding, whereas the resources available are limited, in a context of several possible approaches to long-term monitoring of the deep-sea environment.

The Round Table extended the conference through discussions of the various technical possibilities and frameworks for moving deeper, complementing what was said during the sessions of the conference itself. In fact, the central question put forward during the Round Table was how to move into the deep sea in technological and economic terms. The discussion was structured around the following four axes: (1) the need for marine instrumentation in the near future, (2) the role and framework of science-industry interactions, (3) deep-sea observatories: progress achieved so far and further needs, and (4) key components for marine observatories.
PERSPECTIVES IN MARINE INSTRUMENTATION
by Howard Roe (Southampton Oceanography Centre, U.K.)

Understanding processes in the marine environment requires, in particular, effective data collection. Technology is the tool to achieve this objective, especially in the deep-sea environment, where extreme robustness, minimum maintenance and minimum calibration are mandatory requirements for instrument packages. At the same time, the need to observe the variability of phenomena in time and space obliges us to collect data at the right scale; this means that technology development must be oriented towards designing what is effectively needed by the users. Scientific benefit and cheaper approaches should therefore be considered the leading factors influencing future technological developments in marine instrumentation. Multidisciplinary approaches, such as those characterising the new benthic observatories under development in Europe and the U.S.A., represent a solution to the problem. Needs for the future may be summarised as follows:

- Extend the number and type of platforms, with particular reference to landing systems, benthic observatories and Remotely Operated Vehicles; at the same time, the more traditional stations (like buoys, subsurface moorings, etc.) need to be kept in operation to continue time series already started.
- Collect data in real time; increase the number and type of instrument packages that may be remotely operated continuously (like plankton recording samplers).
- Develop systems that may be compatible with the logistics and requirements of different organisations; this will widen the number of disciplines that may benefit from the development of new fixed stations and observatories.

A promising perspective is the link of fixed stations (observatories, moorings) with AUVs, which have the potential to be more cost-effective than survey ships. Possible proposals for increasing the role and presence of the European community are:

- make an inventory of large-scale facilities;
- develop new platforms at European level;
- stimulate technology transfer from fields like space and medical science;
- promote and run interactive programmes (for schools, education, etc.), taking example from the U.S.A.
SCIENCE-TECHNOLOGY INTERACTION: THE INDUSTRY POINT OF VIEW by G. M.Bozzo (Tecnomare, Italy) Three main issues justify the need of more fruitful relationship between science and technology: improving the understanding and knowledge of the sea; identifying better tools and procedures for management and safeguard of the sea and marine resources; - securing the sustainable development of marine resources; These issues feature important and direct implications on significant aspects such as: quality of life, understanding of the global phenomena of the earth, socio-economical development, etc. For this reason it is becoming more and more important to be able to explain to "normal people" and to society the importance of the sea, why it is important to invest in improving its understanding, why to go deeper, etc. On this regard, a better communication between the marine community and society about sea and related problems is necessary, with the involvement of all the different actors, with the commitment to provide qualified information on fundamental issues such as: quality of water and status of the marine ecosystem. This approach will also avoid the emotional approach, which often characterises the debate on marine issues. As regard the relationship between Scientific and Technological Communities, these have been considered more a "client-vendor" relationship rather than a real co-operation where both parties could have a complementary role to play. In this regard the difficulties in accepting each other, between science and technology, which characterised the discussion during the first years of implementation of the European Program MAST is quite over. Nobody questions now about the importance of co-operation, but how to improve and increase such co-operation. Two good examples proving the need for a better synergy are relevant to deep-sea activities. -
Deep-sea research, for a better understanding of the role of the deep ocean in global phenomena, calls for the availability of suitable enabling technologies as well as of robust, economic and reliable instruments. Some of these technologies are already available to the offshore industry. On the other hand, the hydrocarbon industry is moving towards the exploration and exploitation of deep-sea reservoirs in several areas of the world (Gulf of Mexico, offshore Brazil, Mediterranean Sea, North Sea, Caspian Sea, Far East). To proceed in this development, industry has to face the basic problem of achieving a better understanding of the physical and biological phenomena affecting the deep sea, not yet well known, in order to properly design the production systems. In this case science may provide the solution to the problem. One important concept needs to be understood: marine technology is not a commodity; the time necessary to develop new marine technology, suitable to operate in very severe environmental conditions, is often more than 6 to 8 years. In addition, proper training of the persons involved is necessary to secure a successful result. In this regard the MAST Programme was the most important tool to foster the development of a modern marine science and technology community and to create an effective co-operation at European level. Marine reserves can play an important role in the support of pilot projects for new technological developments.
DEEP SEA OBSERVATORIES
by Barbara Romanowicz (Berkeley University, U.S.A.) The role and importance of deep-sea observatories in marine and Earth sciences is now recognised worldwide. Seismologists, together with geomagnetists and geodesists, were the first communities to launch the idea of a global coverage of the Earth's surface with permanent observatories. For obvious logistic reasons, monitoring stations relevant to these disciplines have traditionally been located on continents and islands, but it soon became evident that, to achieve a uniform distribution of the observation networks and, consequently, to improve our understanding of the Earth's interior, underwater stations have to be developed and operated on the deep seabed. In 1993 an international organisation (International Ocean Network) was established to co-ordinate activities in this specific field; a workshop (Multidisciplinary Observatories on the Deep Sea Floor, Marseille, France, 1995) was dedicated to discussion, between scientists and technologists, of the various aspects of the problem. Deep-sea observatories represent the technological solution for reaching the target of global observations. As regards the initiatives in progress within the European Union, Geostar represents the most significant one, focusing on multidisciplinarity, communications and simplified deployment. Standardisation is one of the key issues for globality; modularity and common installation and recovery tools could provide easier maintenance procedures and lower costs. In this regard the European Union should promote participation in global programmes and not limit attention to local problems. Real-time data recovery from ocean-bottom observatories represents another crucial aspect; for some of the possible applications a real-time data connection is an essential requirement that cannot be met by the presently available solutions. Experimentation with different approaches should be encouraged to ensure a fruitful evolution of these systems.
Observatories need to be designed taking into account the requirements of multidisciplinarity, so as to enable scientists to deploy different instruments and share infrastructures in which observations of several phenomena are carried out simultaneously and over a time scale of one year or more. In conclusion, deep-sea observatories call for a strong relationship between science and technology; it is a matter of fact that scientific requirements represent a stimulus to develop new solutions, but at the same time only new technological developments may enable the achievement of the most challenging goals of deep-sea observatories (like autonomy, real-time data communication, etc.).
KEY TECHNOLOGIES FOR MARINE OBSERVATORIES
by Jean-Yves Jourdain (Thomson, France) The advantages of co-operation between scientists and industry have been widely underlined during the conference. It is important for European industry to be involved from the very beginning in ambitious scientific projects and to face these challenges together with scientists. For industry there is the problem of finding the synergy between many different marine applications because, except for oil exploration and exploitation, each application cannot by itself constitute a market. One of the most critical features influencing industrial strategy is the perspective of a clear market. In this regard the scientific market is not always obvious from the industry point of view, while the oil industry represents a good example of a clear market. With particular reference to underwater acoustics, it is possible to mention several topics where significant progress has been made: low-frequency and wide-band sources, imaging (thanks to synthetic aperture processing), and positioning. Acoustic links could now be efficient whatever the application to be addressed; yet nowadays marine systems never use such links for transmitting commands or images (even though they often have an acoustic link for security reasons when all other possibilities fail: a paradoxical situation). Finally, developments in autocalibration techniques, particularly those exploiting correlation of scattered signals, are of paramount importance for future systems.
SUMMARY OF DISCUSSION TOPICS The conference was mainly oriented to marine research in the deep sea and represented an important opportunity for discussion between the scientists and technologists involved in this field. The most significant contributions were relevant to unmanned deep-sea observatories, which provide a suitable solution, complementary to more traditional systems like deep-sea ROVs, manned submersibles, buoy networks, etc., enabling the development of permanent monitoring networks. For this reason they are the subject of growing interest from many scientific missions and disciplines and at the same time represent one of the most challenging goals of technical development for the next century. In this regard, the conference gave the opportunity to present in detail the main experiments that are being carried out worldwide.
Another aspect that has qualified the conference has been the presence of a wide range of scientific disciplines, all interested in working in the deep sea and all potential beneficiaries of common approaches and infrastructures. Most of the interventions made by the participants pointed out the importance of a multidisciplinary approach, both for the advantage in terms of costs (sharing common facilities) and for the added value of carrying out synoptic observations from different instruments. This trend seems to be well advanced for the geophysical sciences, less so for more traditional marine sciences, like physical oceanography. Other disciplines need to be involved, in particular biologists and chemical oceanographers. At the same time it must be kept clearly in mind that certain disciplines have very specific needs, so multidisciplinarity should not be interpreted as a "dogma", but must be adapted, case by case, to find the best solution. In some cases the absence of some disciplines (like biology) in deep-sea observatories is simply because sensing techniques are not yet ready. In this regard it is essential to take advantage of know-how available in other fields, like space and laboratory analytical techniques. Two significant examples of the potential of technology transfer were presented at the conference, both relevant to instruments originally developed for space applications that are now being adapted to deep-sea use in the framework of the Geostar project. One is a gravity meter developed by the "Istituto per la Fisica dello Spazio Interplanetario" of the Italian Research Council, the other a three-axial broad-band seismometer developed by the French Institut de Physique du Globe. In other cases there is a problem of achieving an active involvement of specific disciplines, and of obtaining a clear definition of their requirements and objectives.
It has been pointed out that in past cases the response from the scientific community to initiatives oriented to investigate their expectations and goals was too limited (see for example the questionnaire distributed for the MAST-2 Abyssal Benthic Laboratory feasibility study: the percentage of replies was only 20%). After the conclusion of the MAST programme, marine science and technology is now dispersed across the European Framework Programme. The marine community now faces two problems:
- to be able to communicate to public opinion the reasons for and advantages of investing in marine research;
- to link what is achieved during the pre-competitive phase (typical of MAST projects) to fully operative devices.
Community funding covers a small percentage of the total research budget; the larger part derives from national funds, which are however not easily available for initiatives that feature an intrinsic risk. There is a difficult gap to fill, and sharing of resources is therefore a need. The development of large-scale infrastructures like deep-sea geophysical observatories may represent a good opportunity for use by other communities. In the Fifth Framework Programme there are budgetary lines for large-scale infrastructures. Also, collaboration between the EU and other countries is welcome where funding is not otherwise possible. Besides the availability of infrastructures, the use of ships at European level represents another problem, especially for deep-sea activities, which are characterised by high costs. Many interventions underlined how the U.S.A. seems to be well ahead of Europe in the field of education in marine sciences. Initiatives like exhibitions (as already done by astrophysical institutes) should be promoted at European level to reach and involve the public.
CONCLUSIONS The Round Table was concluded by Attilio Licciardi, Mayor of Ustica, who welcomed the participants and underlined the importance of the event for the whole Sicilian community. Ustica, the island that hosts the first Italian Marine Reserve, intends to be a meeting place for marine scientists; in this regard the co-operation agreement signed between the Ustica Marine Reserve and the Istituto Nazionale di Geofisica for the GEOSTAR project represents a first important achievement.
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century L. Beranzoli and P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Research for the Protection of the Deep-Sea
Hjalmar Thiel
Poppenbüttler Markt 8A, 22399 Hamburg, Germany
The exploitation of deep-sea resources began about 50 years ago and is expected to increase in the future. In these activities, impact assessments must be conducted to avoid any misuse of the environment. Short accounts are presented of the potential activities in waste deposition, ore mining and hydrocarbon extraction; in each case the nature of the required environmental research is characterised. Finally, technical requirements and the current international legal framework governing activities in the deep sea are briefly summarised.
1. BACKGROUND
Marine science has seen substantial developments during the second half of the 20th century, with a number of remarkable discoveries. These have included the final proof of the existence of plate tectonics and the many related processes: for example, the occurrence of warm and cold water venting from the seafloor, the predominance of chemoautotrophically fuelled animals at these localities and the specialised communities thriving around them [1-5]. However, these amazing discoveries were founded on basic research by scientists curious about the oceans' functional relationships and processes, which gradually developed into more general insights about the oceanic and atmospheric systems. In the 1950s, for example, the El Niño and La Niña situations were believed to be rather localised effects; today they are recognised as important phenomena within a world-wide weather oscillation system [6]. Basic science is of the utmost importance for an understanding of ocean processes. However, the last decade has seen an increasing shift towards applied science in many countries, including those of the European Community, directed by political decision makers. In many fields this may be justified by the needs of modern human society. However, one eminent human need is the proper functioning of the Earth's ecosystems, the human environment. Ecology must therefore be regarded as an applied science; it is wealth creating, not in the sense of financial resources, but in the sense of human health and quality of life. Most publications on the marine ecosystem deal with relatively shallow waters and much less is known about the remote deep ocean. However, human society already uses the deep sea and is in the process of preparing further large-scale intrusions into this inner space.
Oceanographers from all disciplines must consider the need for precautionary investigations into the potential effects of this intrusion to enable governments and industry to avoid environmentally harmful activities and to undertake appropriate a posteriori studies. Thinking ahead, and considering potential impacts of current developments, will guide mankind into
wise uses of our environment and will help remove the mysteries of the deep ocean and bring it fully into the human ambit [7].
2. USE OF DEEP-SEA RESOURCES AND REQUIREMENTS FOR THE ENVIRONMENTAL RESEARCH
The terrestrial parts of the Earth are already severely impacted by man [8,9]. Consequently, the potential use of the deep oceans, both for waste disposal and as a source of potentially valuable minerals, is frequently referred to. Indeed, the deep sea has already been used for the storage of wastes and the possible extraction of metalliferous resources is being actively explored. There is no valid ethical argument against the use of the deep ocean for the permanent disposal of waste material when the available land areas are exhausted. But the available data on the deep sea are not yet sufficient to allow adequate discussion of the best practicable environmental options for permanent waste disposal into this environment or the extraction of resources from it. As part of the process of correcting this deficiency, wastes already disposed of in the deep ocean should be viewed as experiments, albeit unintentional ones, and should be studied accordingly. Examples of such waste disposal activities that might be studied in this way, and experiments that might be conducted to investigate the possible impact of future disposals and resource extraction, are summarised below. For more details, the reader is referred to Thiel et al. [10,11]. Munitions have been dumped in the sea for many years, in both shallow and deep waters [12]. Much of this material was dumped following World War II, but munitions dumping has also occurred subsequently. Despite the existence of the London Convention (the former London Dumping Convention) there has been no control over such dumping and there are no publicly available data on its extent since, under the sovereignty of states, naval authorities have been exempted from the convention. As a scientist, I cannot understand why a particular section of society should be exempted in this way from rules which apply to all other sections.
The natural environment does not recognise such distinctions, and impacts on it are the same wherever they originate. In fact, munitions dumping may not be particularly harmful to the wider environment. But any exploitation of the environment, including dumping munitions in the deep sea, demands study of the potential effects and this has never been done. The last munitions dumping in European waters probably occurred in 1994 when Portugal scuttled a ship containing 2200 metric tonnes of various types of munitions close to the outer boundary of its Exclusive Economic Zone [13]. However, no investigation of the short-term effects on the environment has been conducted. This represents a missed opportunity to obtain valuable information, though it is still not too late to study the longer-term effects. Large structures include those such as the oil storage platform Brent Spar which was intended to be dumped in the deep sea in 1995. As a result of public pressure following the intervention of Greenpeace, this plan was abandoned in favour of onshore re-use and disposal [14]. Scientists would have been in a better position to advise on the likely environmental consequences of the original Brent Spar disposal plan if the impacts of other large structures, already on the seafloor, had been studied before. There are many such structures lying on or buried within deep-sea sediments, including the munitions carrier mentioned above and hundreds of sunken commercial vessels and warships. A study of their near- and far-field
environments would provide valuable data for impact predictions for other large structures as and when these are required. Radioactive wastes were dumped in the sea from the end of World War II until 1982 when, on the advice of the International Atomic Energy Agency, the London Convention banned this mode of final storage. Most of the dumped material was disposed of in the NE Atlantic between 1949 and 1982 [15,16]. Low-level radioactively contaminated material from hospitals and industry was packed in barrels with concrete or bitumen as a matrix and dumped mainly in water depths between 4500 and 5300 metres. Environmental investigations were conducted in the general areas of the waste and no contamination above background levels was discovered. But, except for a few which were trawled, the barrels were never thoroughly inspected and the near-field environment, at the centimetre to metre scales, was not studied. Fine-scale sediment coring surveys and analysis of near-field sediment and animal samples would provide useful information for the evaluation of past and potential future waste disposals. The technology for such studies around barrels of varying in situ times and degrees of deterioration already exists: manned submersibles and remotely operated vehicles. Oil and gas have been extracted in shelf and upper slope regions from rock formations deep below the seafloor for many years, and the industry is progressively penetrating deeper down the continental slope. The extraction process itself does not seriously impact the deep sea unless the riser connection fails and oil leaks into the ocean. But the impacts of drilling muds and cuttings piles around drilling rigs are known from shallow water studies [17]; with appropriate modifications, these results should be applicable to deep water situations. Gas hydrates were discovered only recently, but they have a wide distribution in continental slope regions.
They constitute a solid material with a high energy content and may replace other resources in the future [18]. But the effects of such exploitation remain speculative. Extraction of the hydrates may lead to instabilities in the continental slopes and may cause sediment slides with far-reaching environmental consequences. Precautionary environmental studies such as large-scale experiments cannot be contemplated, but thorough site surveys would be an absolute prerequisite for any decision on exploitation, since sedimentary slope structures vary and each potential extraction region would require special consideration of the possible environmental risks. Sewage sludge and dredge spoils from harbours and rivers, when not contaminated with heavy metals, organochlorines or oil residues, might be recycled to agricultural areas, but contaminated material requires permanent storage. Potential disposal sites are becoming increasingly rare on land and interest in ocean dumping may recur. This is not to promote deep-sea disposal, but to urge oceanographers to be aware of potential future attacks on the deep ocean floor and the overlying water column. At present, they are not prepared for a best practicable environmental option comparison with terrestrial storage. Municipal waste dumping has occurred into the deep western Atlantic at the surface 106 nautical miles off New Jersey in over 2500 metres of water [19]. This notorious dump site was used between 1986 and 1992 and the environmental effects were studied carefully. Further disposal activities are not being considered at the moment, but various technologies have been developed for dumping such waste at mid-ocean depths [20], suggesting that this option is still being considered. Surface release of wastes is certainly not advisable. At the very least, they should be delivered to the seafloor either in some form of riser or packed in geotextile container bags in order to confine any impact to a restricted area.
In either case, more information than is currently available will be needed to evaluate the impact of any proposed future deep ocean waste dumping.
As in the case of deep-sea mining, only a large-scale experiment could produce this information. It is simply not possible to extrapolate from data gathered by traditional oceanographic techniques to the large-scale intrusions expected from any commercial disposal activity. Commercial-scale disposal might involve the order of 1 million tonnes per year, but a feasible experiment should approach this scale of dumping through a series of steps with intermittent monitoring activities. For example, an initial dumping of 100,000 tonnes might be envisaged. If the results from monitoring the effects of this quantity were encouraging, it could be followed by a single disposal of around 1 million tonnes. Again, if the results were encouraging, a final stage could involve monitoring the effects of dumping 1 million tonnes each year for ten years. Such a nested experiment would need to allow sufficient time for any environmental impacts to occur and for the analysis and evaluation of the monitoring results [10]. Carbon dioxide (CO2) sequestration into the ocean has already been discussed for some years, aiming at a reduction of further increases in the concentration of greenhouse gases in the atmosphere [21,22]. The ocean surface takes up CO2 in natural processes, but the deep ocean has a much higher storage capacity. Eventually, any CO2 naturally or artificially sequestered in the deep ocean will return to the atmosphere, but the time-lag of several hundreds of years should be sufficiently long to allow other methods of CO2 sequestration to be found in the meantime. However, CO2 is poisonous at high concentrations, by changing the pH of the water, so that the possible effect of deep-sea storage on the marine fauna would require careful study. Again, it is difficult to extrapolate from laboratory data and a step-wise approach should therefore be employed.
Such an approach might begin with laboratory-scale experiments using aquaria or mesocosms, move on to an intermediate-scale experiment in, for example, a relatively isolated deep, cold fjord, and, finally, to a major experiment in the deep ocean. As in the case of large volume waste disposal, progression to each successive stage would depend upon the acceptability of the impact of the previous stage. Deep-sea ores have been known for about 130 years. However, the widespread occurrence of polymetallic nodules on the sediment surface at depths between about 4000 and 6000 m, and their potential as an industrial resource, has become apparent only in recent decades with the development of modern oceanographic techniques and particularly of deep-sea photography [23]. In fact, it was the presence of these curious concretions which prompted the most severe controversy between developing and industrialised nations during the negotiations of the United Nations Convention on the Law of the Sea (UNCLOS) and delayed this treaty entering into force. In the meantime, other potentially exploitable ores have been discovered, including polymetallic crusts, metalliferous muds and massive sulphides, in addition to phosphoritic nodules with potential in the production of fertilisers. Most efforts, both in exploration and in exploitation technology development, have been concentrated on polymetallic nodules and metalliferous muds, and technical mining tests were successfully conducted in the late 1970s. Environmental problems were foreseen from the beginning, and impact assessments accompanied the commercial sea-going activities [24,25]. The most stimulating insight was gained in the early 1980s during evaluation of the environmental studies for metalliferous mud mining in the Red Sea [26].
Data collected by traditional methods proved to be unsuitable for the prediction of environmental mining impacts since it was impossible to extrapolate from small scale impacts to assess the likely effects of much larger commercial activities.
As a result, the first large-scale disturbance experiment (by Germany) was developed [27] as a new approach to observe the re-establishment of the deep-sea benthic community after a severe disturbance of several square kilometres of seafloor in a polymetallic nodule area [28]. Other large-scale re-sedimentation and faunal recovery experiments followed, by the United States, Japan, India and the Interoceanmetal Joint Organisation headquartered in Poland [29,30,31,32,33]. Further experiments, on a variety of scales, must be conducted for the evaluation of the impacts of nodule and metalliferous mud mining, and the potential impacts of polymetallic crust and massive sulphide mining must be considered. An additional essential step will also be the monitoring of early mining tests lasting several months. All of these activities will be large-scale undertakings which should be financed through international co-operation, and the resulting knowledge shared. Little is known about plans for the exploitation of massive sulphides. The environmental problems are certainly different from those involved in polymetallic nodule mining since the ore body is not, in this case, confined to the sediment surface and mining may take place close to hydrothermal vents with their associated specialised communities. This situation may have similarities to the mining of metalliferous muds in the Red Sea, where the basic resource is relatively similar to massive sulphides. Moreover, the environmental studies conducted during the exploration of the Atlantis II Deep in the Red Sea were unable to exclude the possible existence of hydrothermal vent communities on the upsloping outskirts of the Deep.
3. REQUIRED TECHNIQUES Much of the applied deep-sea research envisaged for the future can be achieved with standard oceanographic equipment available on most research vessels. However, more specialised technology will be essential to answer many of the current questions. As noted above, barrels containing radioactive waste should be inspected closely and this can be achieved only with remotely operated vehicles (ROVs) or with manned submersibles. The same is true for the collection of samples close to dumped munitions and large structures, both because of the risk of loss or damage to conventionally towed sampling devices and because these cannot be positioned sufficiently accurately to obtain the relevant samples. Even monitoring large-scale dumpsites, such as those for sewage sludge or dredge spoils, may require fine-scale sampling achievable only with ROVs or submersibles. Long-term measurements and observations may also be needed. These can be achieved with deployed abyssal laboratory type installations [34] such as GEOSTAR (see this volume), with the equipment carried tailored specifically to the task in hand. Studies of hydrothermal vents, for example, would require a broad spectrum of equipment including cameras, corers, water bottles, current and turbidity meters and a range of sensors. Similarly, at sewage sludge or dredge spoil dumpsites the equipment should detect particle sedimentation and erosion, sediment and contaminant transport, life and bioturbation.
4. LEGAL CONDITIONS FOR THE PROTECTION OF THE DEEP-SEA
The United Nations Convention on the Law of the Sea [35], which has been ratified by 132 State Parties (131 States and the European Union), contains many paragraphs relating to the deep seabed environment. Responsibility for environmental protection of the
deep-sea floor, referred to as the Area in the UNCLOS document, is assigned to the International Seabed Authority (ISA). A code for the exploration for polymetallic nodule mining is under negotiation within the ISA and environmental guidelines for this activity have already been drafted [36]. No other use of the Area has been discussed thoroughly since the Law of the Sea entered into force in November 1994 and the ISA was established. A general statement related to the use of other deep-sea resources might be a useful position document before such discussions take place. However, any ISA environmental provisions have no legal force outside the Area, that is, in the Exclusive Economic Zones (EEZs) of individual states. UNCLOS agreed on a 200 nautical mile wide EEZ for coastal states (with exceptions), and Article 193 states that <<States have the sovereign right to exploit their natural resources pursuant to their environmental policies and in accordance with their duty to protect and preserve the marine environment>>. But legal provisions to protect the environment may be weakly developed in many countries, and in most countries the marine environment, particularly the offshore parts of the EEZ, may not be included within the legal framework. Thus, the missing legal provisions for EEZs leave a potential for severe disturbances in deep-sea regions where no international legal regulations may be effective. The legal basis for the protection of the sea should be further developed, and this should include clear formulations of the need for precautionary research, environmental impact assessments, conservation of species and habitats [37,38], and monitoring. It should also insist that, in order to develop the best practicable environmental options, the best available and appropriate techniques should be employed.
ACKNOWLEDGEMENTS
I am most grateful to Prof. Dr. Paolo Favali and Prof. Dr. Giuseppe Smriglio for inviting me to participate in the course Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century. I thank Dr. Anthony L. Rice for the language correction of my manuscript.
REFERENCES
1. V. Tunnicliffe, Oceanogr. Mar. Biol., Ann. Rev., 29 (1991) 319. 2. L.M. Parson, C. L. Walker and D. Dixon (eds.), Hydrothermal Vents and Processes. Spec. Publ. The Geological Society, London, 1995. 3. Ch. R. Fisher, Biosystem. Ecol. Ser., 11 (1996) 313. 4. C. R. German, L. M. Parson and R. A. Mills, in C.P. Summerhayes and S.A. Thorpe (eds.), Oceanography. An Illustrated Guide, Manson Publishing Ltd., London, 1996. 5. V. A. Gebruk, S. V. Galkin, A. L. Vereshchaka, L. I. Moskalev and A. J. Southward, Advanc. Mar. Biol., 32 (1997) 94. 6. R.W. Schmitt and S. E. Wijffels, Geophys. Monogr., 75 (1993) 77. 7. K.J. Hsü and J. Thiede (eds.), Use and Misuse of the Seafloor, John Wiley & Sons Ltd, Chichester, 1992. 8. WBGU (German Advisory Council on Global Change), World in Transition: The Threat to Soils, Economica, Bonn, 1995.
9. WBGU (German Advisory Council on Global Change), World in Transition: Ways Towards Sustainable Management of Freshwater Resources, Springer, Berlin, 1999. 10. H. Thiel, M. V. Angel, E. J. Foell, A. L. Rice and G. Schriever, Environmental Risks from Large-Scale Ecological Research in the Deep Sea: A Desk Study, Contract No. MAS2-CT94-0086, Commission of the European Communities, Directorate General for Science, Research and Development, 1998. 11. H. Thiel, Anthropogenic impacts on the deep sea, in: P. A. Tyler (ed.), Ecosystems of the World: The Deep Sea, Elsevier Science, Amsterdam, in press. 12. ACOPS, ACOPS Yearbook 1986/87, Advisory Committee on Pollution of the Sea, London, 1988. 13. SEBA (Working Group on Sea-bed Activities), Oslo and Paris Conventions for the Prevention of Marine Pollution, Summary Record 95/8/1-E (1995). 14. T. Rice and P. Owen, Decommissioning the Brent Spar, E. & F.N. Spon, London and New York, 1999. 15. F. G. T. Holliday, Report of the Independent Review of Disposal of Radioactive Wastes in the Northeast Atlantic, HMSO, London, 1984. 16. IAEA (International Atomic Energy Agency), Inventory of Radioactive Material Entering the Marine Environment, Sea Disposal of Radioactive Waste, IAEA TECDOC-588, Vienna, 1991. 17. F. Olsgard and J. S. Gray, Mar. Ecol. Progr. Ser., 122 (1995) 277. 18. M. Kelland, Mar. Poll. Bull., 29 (1994) 307. 19. M. H. Bothner, H. Takada, I. T. Knight, R. T. Hill, B. Butman, J. W. Farrington, R. R. Colwell and J. F. Grassle, Mar. Environm. Res., 38 (1994) 43. 20. P. J. Valent and D. K. Young, Abyssal Seafloor Waste Isolation: Environmental Report, Naval Research Laboratory, Stennis Space Center, NRL/MR/7401-95-7576, 1995. 21. T. Ohsumi, Mar. Techn. Soc. Journ., 29 (1996) 58. 22. B. Ormerod and M. V. Angel, Ocean Storage of Carbon Dioxide: Workshop 2 Environmental Impact, IEA Greenhouse Gas R&D Programme, Cheltenham, 1996. 23. J. L. Mero, The Mineral Resources of the Sea, Elsevier Oceanogr. Ser.
1, Elsevier, New York, 1965. 24. E. Ozturgut, J. Lavelle and R. E. Bums, in: R.A. Geyer (eds.), Marine Environmental Pollution, Dumping and Mining, Elsevier Oceanogr. Ser. 27B, Elsevier Science Publishers, Amsterdam, 1981. 25. H. Thiel, E. J. Foell and G. Schriever, Ber. Zentr. Meeres- Klimaforsch. Univ. Hamburg, 26 (1991) 1. 26. H. Thiel, H. Weikert and L. Karbe, Ambio, 15 (1986) 34. 27. H. Thiel, Mar. Mining, 10 (1991) 369. 28. H. Thiel and G. Schfiever, Ambio, 19 (1990) 245. 29. T. Kaneko (Sato), K. Ogawa and T. Fukushima, Proc. 1st Ocean Mining Syrup., Tsukuba, Japan (1995) 181. 30. T. Radziejewska, Proc. Intern. Syrup. Environm. Stud. in Deep-Sea Mining, Tokyo (1997) 223. 31. D. D. Trueblood and E. Ozturgut, Proc. 7 th Intern. Offshore Polar Engin. Conf., Honolulu (1997) 481. 32. R. Sharma, Proc. 3~dOcean Mining Syrup., Goa (1999) 118. 33. H. Thiel and Forschungsverbund Tiefsee-Umweltschutz, Deep-Sea Res. II, (in press).
18 34. H. Thiel, K.-O. Kirstein, C. Luth, U. Luth, G. Luther, L.-A. Meyer-Reil, O. Pfannkuche and M. Weydert, Jour. Mar. Syst., 4 (1994) 421. 35. United Nations, The Law of the Sea, Division of Ocean Affairs and the Law of the Sea, Office of Legal Affairs, United Nations, New York, 1997. 36. ISA (International Seabed Authority), Deep-seabed Polymetallic Nodule Exploration: Development of Environmental Guidelines, International Seabed Authority, Office of Resources and Environmental Monitoring, Kingston, 1999. 37. H. Thiel and T. J. A. Koslow, Ocean Challenge, 9 (1999) 14. 38. J. A. Koslow, G. W. Boehlert, J. D. M. Gordon, R. L. Haedrich, P. Lorance and N. Parin, ICES J. Mar. Sci., 57 (in press).
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century
L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Deep physical oceanography experimentation and benefits from bottom observatories

Claude Millot
Antenne LOB-COM-CNRS, BP 330, F-83507 La Seyne/mer, France
Physical oceanography is probably the marine sciences discipline that has benefited the most from in situ autonomous sampling. The major reason is that it deals with state and dynamical equations relying on parameters (temperature, salinity, pressure, current) that can be measured relatively easily, at least locally (current can also be sampled remotely). Simple instruments and more sophisticated devices have thus been deployed for a long time, either in the bottom layer (on which this paper focuses) or in the water column. Even though they obviously have limits, bottom observatories appear to be potentially valuable devices in physical oceanography.
1. INTRODUCTION

This paper is a review article. As emphasized by the referees, it should therefore have provided the reader with a set of references on up-to-date technologies and figures of general interest. Nevertheless, because of the tremendous and rapid technological progress, the available references are relatively old and not of great interest, while space is too limited here to show an exhaustive set of figures. Therefore, instead of giving classical references and figures, we have thought it more valuable to provide the reader with some www addresses that will help in initiating a search for more specific, up-to-date and well-illustrated information.

Sampling the water column is usually done with instruments that are either ship-handled, left drifting, or set in place on moored supports. The first group generally allows a rather manageable description of the phenomena of interest, since the sampling strategy can be adjusted and modified conveniently. But ship mobility prevents one from obtaining fully synoptic data sets, and shiptime is expensive. Moreover, ships do not allow us to acquire long time series, and thus to monitor a specific area. This can be partly done with drifting (i.e. Lagrangian [1,2]) instruments, as long as they remain in an area where they can be located. This area is the world ocean for instruments drifting at the surface (located from space), but it practically reduces to a few hundred up to a few thousand kilometres for instruments drifting at intermediate depths (the sole location technique being acoustics). Such instruments are generally aimed at drifting within the same water mass, and are thus especially efficient in describing its circulation, not in specifying its temporal variations (the recorded variations being both spatial and temporal). In order to get sufficiently long time series directly related to equations written in an easily "legible" (i.e.
Eulerian [3]) form and suitable for analysis with the available techniques (statistics, correlations, spectral analysis, etc.), data sets must be recorded at fixed locations. Oceanographic instruments can be set in series on a line moored on the bottom (reaching the surface or not), or directly set on the bottom on more or less
sophisticated observatories. Integrating sound travel times (acoustic tomography) over relatively small or large distances provides information on average parameters, current or temperature, respectively. For global experiments (e.g. WOCE and TOGA [4,5]) as well as for regional ones (e.g. ELISA [6]), the in situ sampling strategy ideally combines both Eulerian and Lagrangian instruments. Some of these instruments have to be built by the research teams for their own use, but most of them are marketed and available everywhere. Most of the largest companies specializing in physical oceanography instrumentation present their products at well-known exhibitions (e.g. the Oceanology International Exhibition [7]) and are thus indexed by the organizing committees (information available upon request). Now, physical oceanography deals with (i) a state equation involving three parameters (temperature, salinity, pressure) that can be measured locally fairly easily, and the density (which is computed from them), and (ii) dynamical equations involving the density, the 3-D current (which can be measured both locally and remotely, over a few hundred metres on the vertical), and the external forcings. Space variables can either be specified easily with the required accuracy, especially for bottom-mounted observatories, or constitute the major sampling problem, especially for drifting devices. Due to the relatively large time scales of the major oceanographic phenomena, the time variable is generally measured easily with sufficient accuracy; measuring time becomes a problem only with tomography, that is, when measuring slight differences in sound travel time over long distances. The major characteristics of the physical parameters, as well as the performances and specific problems of the physical sensors that could be operated with bottom observatories, are first presented in Section 2.
Benefits and limits of bottom observatories, mainly with respect to simple mooring lines, are then analysed in Section 3. Finally, possible developments of bottom observatories are discussed in Section 4.
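As a rough numerical illustration of how the state equation ties temperature, salinity and pressure to density, the following sketch uses a simplified linear equation of state; the function name and coefficient values are illustrative assumptions, not the full oceanographic (UNESCO/TEOS) formula:

```python
def density(T, S, p, rho0=1028.0, alpha=2.0e-4, beta=7.8e-4,
            kappa=4.5e-10, T0=10.0, S0=35.0):
    """Simplified linear equation of state for seawater (illustrative
    coefficients): density increases with salinity and pressure and
    decreases with temperature.

    T in degC, S in psu, p in Pa; returns kg/m^3.
    """
    return rho0 * (1.0 - alpha * (T - T0)
                   + beta * (S - S0)
                   + kappa * p)

# Warm, salty Mediterranean-like surface water vs. colder deep water:
rho_surface = density(T=15.0, S=38.0, p=0.0)
rho_deep = density(T=13.0, S=38.5, p=2.0e7)  # ~2000 m depth
```

Such a linearization is only adequate for order-of-magnitude reasoning; operational work uses the full empirical formulae with the calibrated sensors discussed below.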
2. THE PHYSICAL OCEANOGRAPHY PARAMETERS AND SENSORS

2.1. Temperature
Being basically measured with a thermistor, temperature is certainly the state parameter that is most easily sampled at any place and for any duration. The major difficulty is to reach a sufficient accuracy, since the temperature gradients, which are relatively large on the vertical in the surface layers, can be very small and without any specific direction in the deep ocean. Indeed, the surface layers warmed up by the sun, either temporarily (in spring and summer at mid-latitudes) or in some global zones (the tropical ocean), become lighter and thus remain at the surface. Even though they are mixed over a few tens of metres by the surface agitation (mainly the waves), they come to be separated from the deeper layers by a thermocline where the vertical gradients are of the order of a few tenths of a °C per metre (and even more). Down to a few tens (respectively, hundreds) of metres, an accuracy/resolution of ~0.1 (respectively, 0.01) °C is thus generally sufficient and can be achieved without any sophisticated calibration. In some other places (the polar zones of the Atlantic Ocean, the northern parts of semi-enclosed seas such as the Mediterranean), and during severe meteorological conditions (wintertime cold and strong winds), surface waters are cooled and evaporated. They thus become denser, tend to sink, locally homogenize the whole water column, and then spread horizontally in the whole deep ocean/sea, driven by the horizontal pressure gradients and the Coriolis effect. Being no longer in contact with the atmosphere, dense water masses keep for
years, decades and even centuries, the characteristics they acquired in their formation zones, which are basically different from zone to zone. In this way, they can be recognized more or less everywhere (i.e. far away from their formation zone) and classified. Note that, because of pressure, the temperature of a sinking homogeneous water mass increases (a few hundredths to a few tenths of a °C per km, depending on salinity and pressure), so that physical oceanographers do not deal with in situ but with potential (i.e. at zero pressure) temperatures. Now, while propagating, these dense waters adjust themselves on the vertical according to their relative densities. They also mix, even if only slightly, with the surrounding waters. Therefore, and except in the vicinity of topographic features such as a trench between two basins or a continental slope, gradients on both the vertical and the horizontal can be extremely low, and sensors must be able to give evidence of variations of a few 0.001 °C. Such sensors are sophisticated and must be adequately calibrated before and after experiments lasting several months/years (e.g. WOCE [4]). Acoustic tomography is the sole technique able to measure slight temperature variations (related, for instance, to some general global warming) at a global scale and in intermediate layers; it has recently been used and validated at the scale of the Western Mediterranean Sea during the Thetis-2 MAST project. The temperature of the deepest layers can be as low as −2 °C (in the southern Atlantic Ocean) while it is ~13 °C in the Mediterranean Sea.
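The in situ versus potential temperature correction described above can be sketched as follows; the constant lapse-rate value is an illustrative assumption, since, as noted in the text, the true adiabatic gradient varies with salinity and pressure:

```python
def potential_temperature(T_insitu_degC, depth_m, lapse_degC_per_km=0.12):
    """Approximate potential temperature (i.e. temperature referred to
    zero pressure) by removing the adiabatic warming accumulated while
    sinking, assuming a constant lapse rate (illustrative value)."""
    return T_insitu_degC - lapse_degC_per_km * depth_m / 1000.0

# A homogeneous water mass measured at 13.2 degC at 2000 m has a
# potential temperature slightly lower than its in situ temperature:
theta = potential_temperature(13.2, 2000.0)
```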
2.2. Salinity

Salinity is not measured directly but via a related parameter, the one universally used being conductivity. In a cylindrical cell, a primary circuit induces an electromagnetic field which, in turn, induces a current (proportional to the conductivity) in a secondary circuit. The salinity-conductivity relationship strongly depends on the temperature, which must therefore be simultaneously measured in a convenient way. Problems can arise in relatively heterogeneous conditions, due to the different response times of the temperature and conductivity sensors swept by moving waters. They are solved by setting the sensors in a tube where water is flushed at a constant rate by a pump during the measurements. Such problems are not expected in deep, relatively homogeneous and slowly moving waters. Another problem with conductivity measurements comes from turbidity. Indeed, organic particles produced in the whole water column, and inorganic ones originating from the coastal zones, tend to sink and accumulate in the deep basins, generally forming more or less compact muds. Mainly due to the current, and partly to any topographic obstacle, these particles are more or less permanently resuspended. They thus form a bottom nepheloid layer a few metres up to a few tens of metres thick. These particles obviously penetrate into the conductivity cell and tend to settle on the cell's inner sides, thus temporarily or continuously modifying the water volume and hence the measured conductivity. Such a cell must be cleaned, which can probably be achieved efficiently with the pump already mentioned. From an oceanographic point of view, the surface salinity is modified at a global scale by air-sea interactions, through the evaporation/precipitation processes. More locally, it depends on the proximity of inputs from rivers and melting ice sheets.
Dense waters formed in the Mediterranean Sea thus have much higher salinities (more than 38 g of salt per litre, or practical salinity units, psu) than those formed in the Labrador or Weddell Seas (less than 35 psu). As for temperature, an accuracy/resolution of a few tenths up to hundredths of a psu is sufficient in the surface and intermediate layers (and easily achieved as long as the conductivity sensor cleaning is efficient). In the deeper layers (see WOCE [4]), significant information requires reaching a few thousandths of a psu and dedicated calibrations.
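The response-time mismatch problem mentioned above can be illustrated numerically: two first-order sensors with different time constants, swept through the same sharp front, transiently disagree, which is what produces spurious salinity spikes in derived data. The time constants and sampling interval below are illustrative assumptions:

```python
def first_order_response(signal, dt, tau):
    """Output of a sensor with first-order response time tau (s)
    sampling `signal` at interval dt (s)."""
    out = [signal[0]]
    gain = dt / (tau + dt)
    for x in signal[1:]:
        out.append(out[-1] + gain * (x - out[-1]))
    return out

# A sharp temperature front seen by a fast conductivity cell and a
# slower thermistor (illustrative time constants):
step = [13.0] * 10 + [14.0] * 40
fast = first_order_response(step, dt=0.1, tau=0.1)  # conductivity-like
slow = first_order_response(step, dt=0.1, tau=0.5)  # temperature-like
mismatch = max(abs(f - s) for f, s in zip(fast, slow))
```

Both sensors eventually converge on the true value, but during the transient their disagreement is large; pumping water past co-located sensors at a constant rate, as described above, keeps this transient mismatch controlled.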
2.3. Pressure
Pressure is measured with piezoelectric quartz sensors, which have an accuracy proportional to their measurement range. Setting aside the effect of waves (complex and limited to a depth of about half their wavelength; note that wave characteristics can be inferred from pressure sensors immersed close to the surface), the pressure is said to be hydrostatic, since it only depends on the mean height and density of the water column above the sensor. Pressure is needed to compute, together with in situ temperature and salinity, the in situ density of the water mass in which these parameters have been measured (i.e. locally); for this purpose, adequate basic sensors provide sufficient accuracy. But pressure can also be considered as a remote parameter, used to specify the water height (and thus the sea level) and the mean density; sophisticated pressure tide gauges [8,9] reaching relative accuracies/resolutions of 10⁻⁸ (i.e. 0.1 mm at 10,000 m!) are now available and very easily managed. Separating height from density variations can be impossible. For instance, the seasonal warming of the surface layers in the Mediterranean Sea (by ~10 °C over a few tens of metres) decreases their density (by 1-2 kg/m³) and increases the sea level (by 10-20 cm), without leading to any pressure variation. Generally speaking, height and density effects on pressure variations could be separated according to the time scales of the variations and a knowledge of the environmental background. In practice, since mean (over a large depth interval) density variations are relatively small, pressure variations will be directly related to sea-level ones, especially at mesoscale, that is, periods of hours to days and weeks.
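The direct link between bottom-pressure and sea-level variations (when the mean density is essentially unchanged) is the hydrostatic relation Δp = ρ g Δh; a minimal sketch, with an assumed mean water-column density:

```python
RHO_MEAN = 1030.0  # mean density of the water column (kg/m^3), assumed
G = 9.81           # gravitational acceleration (m/s^2)

def sea_level_change_m(delta_p_pa):
    """Convert a bottom-pressure variation (Pa) into the equivalent
    sea-level variation (m), assuming the mean density is unchanged."""
    return delta_p_pa / (RHO_MEAN * G)

# A 1 hPa (100 Pa) bottom-pressure rise corresponds to about 1 cm:
dh = sea_level_change_m(100.0)
```

The inverse situation, discussed above for the Mediterranean seasonal warming, is precisely the case where this conversion fails: the density term changes while the pressure does not.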
Phenomena of specific interest that could be monitored with sampling intervals of the order of 1 hour are tides, wind effects pushing waters against (respectively, away from) a coast and thus increasing (respectively, decreasing) the sea level, long waves (associated with phenomena such as El Niño), and eddies (such as the Gulf Stream rings); higher sampling rates could allow monitoring phenomena such as seiches (natural oscillations of oceanic basins) and tsunamis.

2.4. Currents
The largest amplitudes (a few m/s) are encountered in the open ocean within the core of the major currents (e.g. the Gulf Stream), within straits (e.g. Gibraltar), or around capes and between islands/rocks (there generally due to tides). In the abyssal domain, currents are relatively low (of the order of cm/s) but can be large under specific conditions (50 cm/s has been measured at −2000 m in the Mediterranean Sea). Currents close to the bottom are almost horizontal, and can thus be easily measured (with an accuracy of a few cm/s) with any basic current meter [10,11,12]. Such instruments use mechanical sensors (for instance, a rotor for the speed and a vane for the direction), or involve acoustic elements (measuring travel times over pathways of a few decimetres), or measure electrical quantities (since saline water moving across a magnetic field induces potential differences in a perpendicular direction). Note that, because of friction, the current is zero right at the bottom, while it is sheared over a few tens of metres of height (within what is called the Ekman bottom layer). Now, currents can also be measured remotely with relatively sophisticated instruments generally called Acoustic Doppler Current Profilers (ADCPs [13,14]). Basically, an ADCP measures the Doppler shift between an emitted frequency and the one echoed from suspended particles entrained by the current in a water layer; the shift is proportional to the radial velocity of the current. The analysis of the echoes at different time lags after the emission gives the radial velocity in cells at different distances from the source. Because of the overall horizontal stratification of the water column, and especially close to the bottom, the current is
expected to be quite homogeneous horizontally at a specific location, while it can be markedly different on the vertical. Therefore, orienting an antenna horizontally gives radial velocities that are significant but similar for all cells, while orienting it vertically gives zero radial velocities (even if they correspond to different currents in the various cells). A satisfying compromise is found with an orientation of 20-30° with respect to the vertical and at least 3 (in practice 4, to allow redundancy) antennae regularly distributed around the vertical. The 3-D current profile, classically associated with the x-y-z orthogonal axes, can thus be computed from the radial velocities in a series of cells. The ADCP performances (accuracy, range) basically depend on the working frequency. They also depend on various specifications that can be modified, such as the cell size, the number of pings over which averages are computed, and the emission intensity, as well as on the density of adequate suspended particles. The layer thickness can range from 400-500 m for a 75 kHz frequency to a few tens of metres for a ~600 kHz one. The cell size may have to be increased up to several metres (with low frequencies) but can be as small as a few decimetres (with high frequencies), while, in general, several tens up to a few hundred pings are averaged to give relatively accurate measurements. Accuracy/resolution is similar for all components (whereas, for instance, the vertical velocity cannot be measured with mechanical sensors). Enlarged battery packs generally allow, for instance, 1-h measurements during ~1 year for a long-range (i.e. 75 kHz) ADCP. The current field can be defined with a more or less fine resolution in a box domain, either in the mid-ocean or within a channel, using acoustic tomography.
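The ADCP geometry described above can be sketched as follows: the Doppler relation gives each beam's radial velocity, and a four-beam "Janus" configuration tilted 20° from the vertical recovers the three current components. The beam numbering and beam-to-axis convention here are illustrative assumptions:

```python
import math

def radial_velocity(f0_hz, doppler_shift_hz, c=1500.0):
    """Radial velocity (m/s) from the Doppler shift of backscattered
    sound; the factor 2 arises because the particle both receives and
    re-emits the shifted wave. c is the sound speed (m/s)."""
    return c * doppler_shift_hz / (2.0 * f0_hz)

def janus_uvw(vr1, vr2, vr3, vr4, theta_deg=20.0):
    """Recover (u, v, w) from four radial velocities: beams 1/3 lie in
    the x-z plane and beams 2/4 in the y-z plane, all tilted theta_deg
    from the vertical (radial velocities counted positive away from an
    upward-looking instrument)."""
    th = math.radians(theta_deg)
    u = (vr1 - vr3) / (2.0 * math.sin(th))
    v = (vr2 - vr4) / (2.0 * math.sin(th))
    w = (vr1 + vr2 + vr3 + vr4) / (4.0 * math.cos(th))
    return u, v, w
```

Each single-ping estimate is noisy, which is why, as stated above, several tens up to a few hundred pings are averaged before a value is recorded.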
3. BOTTOM OBSERVATORIES: BENEFITS AND LIMITS

3.1. The hydrological sensors

Conductivity, temperature, and pressure sensors are commonly set on basic instruments available on the market and known as CTDs (D standing for depth, even though depth cannot be measured directly) [15,16]. When operated from a ship, either yo-yoed (at ~1 m/s, ship stopped) or tow-yoed (yo-yoed at ~1 m/s while towed at a few m/s [17]), they have to sample more or less stratified layers, so that the various sensors must be flushed by a pump (especially when yo-yoed) while the response times must be correctly taken into account. Now, as long as CTDs can be operated with an electric cable, energy and memory are not limited and data can be recorded at high sampling rates (several tens of Hz). Also, sensors can be sophisticated (i.e. sensitive even if fragile), easily calibrated, and changed if necessary. No doubt such CTDs give the most reliable data sets, even though they do not allow collecting time series at a fixed location. Long (weeks to months) time series can be collected with fully autonomous CTDs, also available on the market and set on moorings at any depth (in line between a float and a weight). But, since energy and memory are limited, they generally do not acquire data at sampling rates higher than a few times per hour. Sensors must be relatively reliable and robust, which generally means relatively rough. In any case, calibration before and after deployment, and even in situ comparisons with ship-handled measurements, will allow specifying eventual drifts. Note that moorings are supports not stable enough, especially away from the bottom, to allow pressure measurements from which sea-level and mean-density variations could be inferred. Therefore, a bottom observatory offers several advantages. Essentially, energy and memory can be less restricted than for marketed autonomous CTDs. In order to clean the
conductivity cell, a relatively powerful pump (or another cleaning device) can be operated. Such a pump can also be operated more frequently to allow a higher sampling rate, as long as more memory is available (even though this might not be necessary, due to the overall homogeneity of the deep ocean). Since the observatory is accurately located and close to the bottom, in situ comparisons with ship-handled instruments are relatively easy and efficient for estimating sensor drifts. To be emphasized is the specific interest of comparing and combining the pressure measurements from deep bottom observatories with sea-level heights from satellite altimeters, even though the former have a maximum accuracy/resolution much better than the latter (a few hundredths of a cm vs. a few cm). For instance, the sea level being directly given by an altimeter will allow defining, with a pressure sensor, the mean density (and mean temperature, in case of global warming) of the world ocean, or even the thickness of a surface layer whose density could be estimated by other means (as in the case of El Niño, for instance). In this respect, specifying the tides in the middle part of the ocean is relatively easy (series of periodic signals of constant amplitudes and phases); it will allow refining tidal models and thus markedly improving the corrections currently made to altimetric data sets, which appears to be of major global interest. Apart from the fact that bottom observatories only allow, for the time being, measurements at a few dm/m from the bottom, a limitation could arise from the turbulence they create in the current field, which will necessarily increase the resuspension of sedimented particles. This will occur whatever the location of the sensors (within the observatory and with respect to the current direction), and will mainly contribute to making the conductivity cell dirtier.

3.2. The dynamical sensors

Currents close to the bottom are surely, even if more or less, disturbed by any observatory; the larger the observatory, the stronger the perturbation of the current field. Considering also the shear in the Ekman bottom layer, it is clear that any purely physical study of bottom-layer dynamics will need a specific support. Even though such a support can be more than a single mooring line, and thus assimilated in some sense to a bottom observatory, it will necessarily have to disturb the current field as little as possible, which prevents deploying other instruments or devices there. Local current measurements from an observatory can thus only be aimed either at roughly defining the environmental characteristics in the study area, or at accurately specifying the small-scale current (i.e. a noise source) close to some sensitive sensor, such as a seismometer, for instance. Obviously, ADCPs can be set advantageously on bottom observatories. Apart from the few metres of bottom layer where the current is more or less disturbed by the observatory, they give data sets of good quality. But ADCPs are equipped with complementary sensors, such as those measuring the tilt and the orientation of the instrument, so that good quality is also expected from instruments set on moorings. Ideally, both long-range (relatively coarse resolution) and short-range (sharp resolution) ADCPs should be mounted on a bottom observatory, which can also easily be done on a mooring. The major advantage provided by an observatory is thus the supplementary amount of both energy and memory, which allows more energetic and/or more frequent pings and thus more accurate and/or longer data sets.
3.3. Relationships with other sensors and devices

In addition to the increased amount of energy and memory, an obvious advantage provided by a bottom station is the complementarity with sensors dedicated to other disciplines (as already mentioned in this and other papers in this publication). Also, the possibility to check the functioning of the various instruments, through messengers or more classical acoustic systems, is certainly an interesting opportunity. Finally, the possibility to modify the sampling rate of the physical sensors, based on events detected by other sensors and in order to analyse specific phenomena of non-physical-oceanographic origin more finely, is an interesting possibility offered by a bottom observatory. Nevertheless, the complexity of acoustic propagation close to the bottom (due to both reflection and refraction) does not make bottom observatories very efficient for acoustic tomography.
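The event-triggered change of sampling rate mentioned above can be sketched as a toy policy; the interval values, the burst window, and the trigger are purely illustrative assumptions:

```python
def next_sampling_interval_s(event_detected, since_event_s,
                             base_s=3600, burst_s=60,
                             burst_window_s=6 * 3600):
    """Toy adaptive-sampling policy for a bottom observatory: sample
    once per hour by default, but switch to one-minute bursts for a
    few hours after another sensor (e.g. a seismometer) flags an
    event (all values illustrative)."""
    if event_detected or since_event_s < burst_window_s:
        return burst_s
    return base_s
```

The extra energy and memory available on an observatory are what make such burst modes affordable, compared with a battery-limited autonomous instrument on a mooring.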
4. POSSIBLE DEVELOPMENTS OF BOTTOM OBSERVATORIES
Obviously, more or less complicated (thus expensive and hardly realizable!) devices can be imagined on the basis of presently or soon-available technology. Local hydrological and dynamical sensors are probably already used at the maximum of their potential performance. But the technology of remote sensors such as the ADCP is making marked progress, so that, in the forthcoming years, the range and accuracy/resolution of current measurements will certainly increase; in particular, ranges of a few thousand metres will probably be possible soon. A major limitation of current bottom observatories might be their inability to sample scalar parameters (hydrological or similar ones, such as oxygen, nutrients, and suspended matter) over relatively large depth ranges. One could imagine that messengers, instead of being only carriers of information, could also be equipped with their own sensors (acting as CTDs) and thus obtain profiles over the whole water column while rising up to the surface. Also, using at least 3 acoustic sources deployed for the whole experiment in the surroundings of the bottom observatory could permit locating the rising messengers, and thus obtaining vertical profiles of the horizontal current over the whole depth. Whatever the location of such observatories (i.e. in the coastal zone or in the mid-ocean) and whatever the sampling rate (in terms of profiles per day, week or month), it is clear that the technical capacity of satellite systems allows transmission of convenient data sets in real time, and thus an efficient monitoring of a specific area at a relatively low cost (compared to ship and crew). Autonomous profiling devices equipped with hydrological sensors are commonly operated. Some are based on a string [10] of sensors (mainly temperature and conductivity), connected or not to a single recorder, and distributed along a line a few tens to a few hundreds of metres in length kept vertical by floats.
Others are based on a single set of sensors, typically a CTD sometimes equipped with a current meter, yo-yoing along a cable that can be as long as a few thousand metres. Others are free-falling expendable probes (XBTs, XCTDs [18]) that are connected to a recorder on a ship through a very thin cable that breaks at some nominal depth. Obviously, the accuracy and sampling rates of these devices are variable, and can be more or less adequate according to the hydrological characteristics of the water masses in which they are operated. Nevertheless, such profiling devices (including inverted expendable ones that could easily be developed) could also be operated from a bottom station, benefiting from the advantages already mentioned (energy, memory, monitoring, checking), since their deployment from the observatory after the observatory has been set, as well as their release from the observatory before its recovery (for both a string and a
yo-yo, in order to simplify the observatory manipulations), do not in theory raise major problems. To conclude, and even though most physical oceanography sensors are presently operated in a rather convenient way from simple moorings, it is clear that current bottom observatories offer interesting possibilities, mainly in terms of increasing the capacities of the standard equipment, adjusting the sampling rates to better describe phenomena (possibly of non-physical-oceanographic origin), checking the functioning, and sending messages. In the future, such observatories could be the least costly solution to conveniently monitor the global ocean through a new generation of profiling sensors.
ACKNOWLEDGMENTS

I would like to warmly thank J.-L. Fuda for fruitful discussions about this manuscript and his overall valuable involvement in the GEOSTAR project.
REFERENCES
1. http://www.rsmas.miami.edu/groups/ldmg.html
2. http://rafos.gso.uri.edu/
3. http://stommel.tamu.edu/~baum/reid/book1/book/node5.html
4. http://www-ocean.tamu.edu/WOCE/uswoce.html
5. http://www.ncdc.noaa.gov/coare/
6. http://www.com.univ-mrs.fr/LOB/ELISA
7. http://www.spearhead.co.uk/
8. http://www.nbi.ac.uk/psmsl/sea_level.html
9. http://www.paroscientific.com/site_map.htm
10. http://www.aanderaa.com/
11. http://www.interoceansystems.com/index.htm
12. http://www.falmouth.com/
13. http://www.rdinstruments.com/
14. http://www.sontek.com/
15. http://www.seabird.com/
16. http://www.business1.com/genocean/
17. http://matisse.whoi.edu/seasoarintro.html
18. http://www.sippican.com/
Why global geomagnetism needs ocean-bottom observatories

Frank J. Lowes
Physics Department, University of Newcastle, Newcastle upon Tyne, NE3 3QE, United Kingdom

The main geomagnetic field is mostly large scale, coming from electric currents induced by motions in the fluid conducting core. But there is also a significant, mostly small-scale, contribution from the magnetization of near-surface crustal rocks, which acts as noise for surface observations. The main field is usually modelled by a spherical harmonic expansion, truncated at about degree n=10-13. However, there is a significant change of the core field with time, and this so-called secular variation is very poorly known and difficult to predict. Because of the crustal noise, the accurate measurement of the local secular variation needs long-term measurements at a fixed site - a magnetic observatory. Because of the large gaps in the distribution of surface geomagnetic observatories, the secular variation can at present be modelled, at best, to about degree n≈8. Some of the gaps in the distribution could be filled by observatories at island sites, but there is a need for at least 8 sites in mid-ocean; in practice this means ocean-bottom observatories. Although satellite surveys will in future give good models of the field at a particular epoch, there will still be a need for a global network of surface stations to act as a baseline, and to better determine the instantaneous secular variation.
1. THE MAIN GEOMAGNETIC FIELD
Most people have come across the Earth's magnetic field simply as the source of the mysterious torque that makes a compass needle point north. In fact, this torque comes only from the horizontal part of a magnetic field; the field is actually a three-dimensional vector which also points upwards or downwards, as well as roughly northwards. And the field is not confined to the Earth's surface, but extends inwards and outwards. To a crude approximation, the field is dipolar, like that of a bar magnet near the centre of the Earth (though at present tilted about 10° to the geographic axis) (Fig. 1). At the Earth's surface, near the equator the field is roughly horizontal, with a magnitude of about 30,000 nT, while at the north and south magnetic poles it is vertical, with a magnitude of about 60,000 nT; I will use 45,000 nT as a typical global average. However, the dipole nature of the field is only very approximate; if we remove the global best-fitting dipole (a vector subtraction), then at the surface we are still left with a field of magnitude about 15,000 nT, the so-called non-dipole field. This non-dipole part of the main field has a length scale of thousands of kilometres.
Figure 1. Schematic picture of dipole field
Figure 2. The main field (about 45,000 nT vector rms over the surface) is produced by electric currents in the core, while the crustal field (about 200 nT rms) comes from near-surface rocks.
This main field has its origin in motions in the fluid core, 3000 km below us (Fig. 2). Because of our large distance from the source, it is not surprising that what we see is of mainly long length scale. Note that, as far as the main field is concerned, the continents and oceans do not exist; the field does not "see" the Earth's surface, or the details on it; such details are added to diagrams only to help the reader locate features. In the early days it was sufficient just to produce a chart showing how the field varied over the Earth's surface. But now we are also interested in the field above and below the surface, and it is standard practice to use a mathematical model of the field. For convenience the modelling is in terms not of the magnetic field vector B itself, but of the corresponding magnetostatic potential, V, from which B is obtained as B = -grad V. It is conventional to express V as a sum of spherical harmonics, so that at any one time (and slightly simplifying)

V(r,θ,λ) = a Σ_n Σ_m g_n^m (a/r)^(n+1) S_n^m(θ,λ).
Here (r,θ,λ) are the usual spherical polar geocentric coordinates, a is the Earth's radius, and the g_n^m are numerical coefficients having the dimensions of magnetic field. The series of surface harmonics S_n^m(θ,λ) is the equivalent of a two-dimensional Fourier series, but one appropriate to a spherical surface. A relevant parameter is the degree n of the harmonic; this is essentially a spatial wavenumber, or spatial "frequency", with the corresponding minimum wavelength 2πa/n ≈ 40,000/n km. Theoretically the series should go to n=∞, but in practice we put an upper limit on n, corresponding to a lower limit on the length scale. This limit is not arbitrary, but arises because at shorter wavelengths there is a significant source of "noise" which we cannot avoid. Although most of the internal field comes from the core, there is also a small contribution from the top 20 km or so of the rocks of the continents and ocean floor (Fig. 2). This thin shell is weakly magnetic because of the ferromagnetic minerals in the rocks. Deeper than about 20 km the temperature is too high for the minerals to be ferromagnetic. In some cases these minerals were magnetized by the geomagnetic main field existing when they were formed, and in all cases they have induced magnetization because they are in the present main field. The overall contribution at the surface is a field of magnitude typically 200 nT. Most of this crustal field is on quite a small length scale, of the order of metres and kilometres; however, there is some contribution at long wavelengths. If we have measurements of the field only on or above the surface, there is no way we can separate this crustal field from the core field. However, if we look, as in Figure 3, at the spatial power spectrum of the field observed at the Earth's surface (e.g. [1]), we see a steeply falling part at low frequency/long wavelength, and a much flatter part at high frequency/short wavelength.
The shallow slope of the high-frequency part shows that this part almost certainly comes from a very local source, and it is consistent with that source being in the crust; it would need sources of completely unrealistic magnitude if it were to come from the core. On the other hand, the steep slope of the low-frequency part is consistent with an origin in the core. Although we cannot rigorously prove this separation, it is reasonable to assign almost all of the long-wavelength part to electric currents in the core, and to assign almost all of the short-wavelength part to the magnetization of crustal rocks. If we fit the spectrum by two straight lines, they intersect at about n=13, corresponding to a wavelength of about 3000 km. So, those of us interested in the main field coming from the core restrict our interest to spherical harmonic degrees n = 1-12 or 13; this means that to specify the field at a given time
Figure 3. Spatial power spectrum: mean square field at the surface, R_n = (n+1) Σ_m [(g_n^m)² + (h_n^m)²], coming from all the harmonics of degree n. The steeply falling part at low degree is the core field (totalling about 45,000 nT); the flatter part at high degree is the crustal field (about 200 nT).
we need to determine 168 or 195 numerical coefficients. Of course one man's noise can be another man's signal! For someone interested in the structure of continental plates and the history of their motion, or in mineral exploration, it is the crustal magnetic field which is the signal, and the core field which is the noise.
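The bookkeeping behind these figures can be checked with a short script: each degree n contributes 2n+1 Gauss coefficients, so a model truncated at degree n_max has n_max(n_max+2) of them, and each degree corresponds to a minimum wavelength of roughly 40,000/n km. A minimal sketch (the function names are ours):

```python
# Number of Gauss coefficients for a model truncated at degree n_max.
# Each degree n contributes 2n+1 coefficients (g_n^0..g_n^n, h_n^1..h_n^n),
# so the total for degrees 1..n_max is n_max * (n_max + 2).
def num_coefficients(n_max):
    return n_max * (n_max + 2)

# Minimum wavelength (km) resolved by degree n: 2*pi*a/n, with the Earth's
# circumference 2*pi*a taken as 40,000 km, as in the text.
def min_wavelength_km(n):
    return 40000.0 / n

print(num_coefficients(12))          # 168, as quoted for n_max = 12
print(num_coefficients(13))          # 195, as quoted for n_max = 13
print(round(min_wavelength_km(13)))  # ~3080 km, the core/crust crossover scale
```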
2. TIME VARIATION OF THE FIELD
So much for the geomagnetic field at a given time. However, unlike almost all other geophysical phenomena of internal origin, the geomagnetic field is changing quite rapidly in time. This ∂B/∂t (itself a three-dimensional vector) has a magnitude of typically 70 nT/yr, corresponding to more than 1% in 10 years! Fig. 4 gives an example. And this ∂B/∂t is difficult to predict as it is itself changing with time. So if we are going to produce an accurate model or map of the present field, we can only use recent observations. An observation made at a particular site 100 years ago does not add very much to our knowledge of what the field there is at present! This rate of change of the main field is called the secular variation ("secular" meaning slow, by comparison with the daily variations). It is lack of knowledge of this secular variation that is probably the greatest problem in global geomagnetism, both from the point of view of specifying the present field, and for theoreticians interested in processes in the fluid core.
(Figure 4 plots the Z ≈ 45,000 nT, X ≈ 25,000 nT and Y ≈ 2,000 nT components recorded at Tashkent observatory, 1940-1980.)
Figure 4. Time variation of three orthogonal components of the field at a typical observatory.

In Fig. 5, the lower full line shows schematically how the (long-wavelength) core field varies with position at Time 1. At the later Time 2, the secular variation (SV, the time variation of the core field) will have moved this line to the higher position. But superimposed on this is the (effectively constant, short-wavelength) crustal field represented by the dotted line. So, if the observation point (large dot) is moved between the two times, there could be a large error in the deduced secular variation. Unfortunately, because the crustal magnetization produces a field which can vary very rapidly with position, for field measurements at non-identical sites this magnetization effectively introduces random noise, equivalent in magnitude to a few years' secular variation; see Fig. 5. Thus, it is very difficult to determine the secular variation from measurements which are scattered in space as well as in time. We could certainly estimate an average secular variation over several decades, but then we are smearing out the way in which the secular variation is changing in time. To measure the secular variation we need to have our magnetometers accurately fixed in the same position for a long time; that is, we need a magnetic observatory. Ideally this observatory needs instruments having stable baseline and calibration, and known temperature coefficients, mounted on physically stable pillars and sheltered by a non-magnetic building, in a location where man-made magnetic disturbance is, and is expected to remain, small, and it must be maintained for a hundred years!
Figure 5. How the crustal field confuses measurement of secular variation.
Figure 6. Present distribution of magnetic observatories; not all are fully operational (Hammer equal area projection)
If we are going to determine the secular variation down to wavelengths of about 3000 km (corresponding to fitting harmonics up to n=13), we would need a network of at least 180 observatories, uniformly spaced over the surface. We do in fact have about 180 observatories, but they are anything but uniformly spaced (Fig. 6). We see that, for this purpose, they are too close together in the developed countries, very sparse in the developing countries, and, not surprisingly, essentially absent in the oceans!
3. THE NEED FOR OCEANIC OBSERVATORIES

The lack of observatories in oceanic areas means that there are quite large areas of the surface where we cannot measure the secular variation. If you are navigating a boat in the South Pacific, you would be wise not to put too much trust in the statement of magnetic declination (the horizontal angle between magnetic north and true north), and of rate of change of declination, given on the chart! Such a lack of data does not significantly affect our knowledge of the field in Europe, say, just as lack of ocean current measurements in the Pacific does not affect our knowledge of the currents in the Mediterranean. But there are users (chart makers, if nobody else!) who do need information about the field in the Pacific. And there are users of our models of the main geomagnetic field who are concerned not only with the Earth's surface; theoreticians want to know the secular variation at the surface of the core, and space scientists want to know it in the ionosphere, 100 km above the surface, and out to the magnetosphere at 10 Earth radii. It is particularly in this context of downward and upward continuation that these large data gaps become important, and lead to significant degradation of our global models. Fig. 7, from a simulation by Langel et al. [2], shows quantitatively how using good data (±10 nT/yr) from the existing observatory network still leads to errors of magnitude 100 nT/yr in the South Pacific; if the 180 observatories had been uniformly distributed over the surface we would have expected errors of magnitude only about 8 nT/yr, with a maximum error of about 20 nT/yr; there would have been only one contour on the chart! This large amplification of the errors shows up one major problem we have.
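One way to see why downward continuation punishes short-wavelength errors so severely is the radial scaling of an internal-source field: the degree-n part of B grows as (a/r)^(n+2) when continued from the surface radius a down to radius r. The sketch below uses standard Earth and core radii (6371 km and 3480 km; these values are not quoted in this paper) to show that a degree-1 error grows only about six-fold at the core-mantle boundary, while a degree-13 error grows by a factor of several thousand:

```python
# Radial scaling of an internal-source field: the degree-n part of B varies
# as (a/r)**(n+2), so an error of degree n in a surface model is amplified
# by this factor when the model is continued down to radius r.
# The radii below are standard values, not taken from this paper.
A_KM = 6371.0   # Earth's mean radius
C_KM = 3480.0   # core-mantle boundary radius

def amplification(n, r_km=C_KM, a_km=A_KM):
    """Growth factor for a degree-n field continued from radius a_km down to r_km."""
    return (a_km / r_km) ** (n + 2)

for n in (1, 8, 13):
    print(n, round(amplification(n)))
# Low degrees are amplified modestly; the n=10-13 coefficients, the ones
# least constrained by the observatory gaps, are amplified the most.
```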
If we limit our modelling to low degrees/long wavelengths, then our model effectively interpolates between our observation sites, and gaps are less important; for these long wavelengths the numerical coefficients are less affected by data gaps. But this limitation to long wavelengths means that we are ignoring a significant amount of field at shorter wavelengths. It is when we try to estimate the coefficients for increasingly shorter wavelength contributions that a data gap has more and more effect in increasing the error of these coefficients; it was in attempting to determine the n=10-13 coefficients that these very large errors of prediction were introduced in the South Pacific region. For these comparatively short wavelengths the other data points could not adequately constrain the values in the South Pacific. In fact, at present we do not try to model the instantaneous secular variation beyond n=8. Thus, it is clear that we desperately need to fill in the gaps in the current distribution of observatories. Langel et al. [2] found that, even when all possible island sites had been included, there was still a need for at least 8 ocean-bottom observatories, as shown on Fig. 8. Of course no ocean-bottom station is going to survive for 100 years, so some compromise is needed; but with a good, stable magnetometer, even a year or two will add another value of
Figure 7. Magnitude of the expected error in the vertical component of an n = 13 model of the secular variation, using existing observatories assumed to have ±10 nT/yr uncertainty. Contour interval 10 nT/yr (reproduced by permission from Langel et al., 1995)
Figure 8. The locations of existing observatories are shown as squares. The locations of possible new island observatories are shown as open circles, and the locations of the 8 ocean-bottom observatories that would still be needed to give a reasonably uniform coverage are shown by large filled circles (after Langel et al., 1995)
secular variation in a region where it will have a significant effect on improving our model at that time. Obviously, the longer the occupation, the more valuable the data will be. So far I have ignored the use of data obtained from satellite surveys. Back in 1980 we had 6 months of MAGSAT data, so that for the first time we had truly global coverage, and could feel reasonably confident that we had a good knowledge of the main geomagnetic field. But ever since then our knowledge has been getting worse again; the recent surface data are nowhere near adequate in themselves, and we could not update the 1980 model because we did not know the secular variation sufficiently accurately and with sufficient detail. We now have the Oersted satellite in orbit, and, after some initial problems, it is now producing good data. So we should soon again have a good knowledge of the main field. Also, there are plans for various other magnetometer missions. However, these missions will be comparatively short-lived, so it will be quite difficult to determine an instantaneous secular variation from any one satellite. In addition there is the ever-present problem with satellite measurements that they are made from an instrument which is moving quite fast in space. There are rapid time variations of the field due to time-varying sources in the ionosphere and magnetosphere, and it is quite difficult to separate the variation of the field with position from the variation of the field with time. We are not yet able to do this separation uniquely, and there is no doubt that measuring the time variations at a sufficient number of fixed ground stations is going to be necessary for the foreseeable future. So I can only emphasize that we in global geomagnetism desperately need ocean-bottom observatories, and I hope that GEOSTAR will be one step on the way to achieving this.
ACKNOWLEDGEMENT

Some of the diagrams in this paper were prepared using the Generic Mapping Tools produced by Wessel & Smith [3].

REFERENCES
1. R. A. Langel and R. H. Estes, Geophys. Res. Lett., 9 (1982) 250.
2. R. A. Langel, R. T. Baldwin and A. W. Green, J. Geomag. Geoelectr., 47 (1995) 475.
3. P. Wessel and W. H. F. Smith, EOS Trans. Amer. Geophys. Union, 79 (1998) 579.
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century, L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Ocean-Bottom Seismology in the Third Millennium

J. Bialas a, E. R. Flueh a, J. Phipps Morgan a, K. Schleisiek b and G. Neuhaeuser b

aGEOMAR, Research Center for Marine Geosciences, Christian Albrechts University Kiel, Wischhofstr. 1-3, D 24148 Kiel, Germany
bSend GmbH, Signal-Elektronik und Netz-Dienste, Stubbenhuk 10, D 20459 Hamburg, Germany
Marine seismic wide-angle data acquisition and earthquake seismology observations are on the verge of a quantum leap in data quality and density. Advances in micro-electronic technology facilitate the construction of instruments that enable large data volumes to be collected and that are small and cheap enough that large numbers can be built and operated economically. The main improvements are a dramatic decrease in power consumption (< 250 mW) and an increase in clock stability (< 0.05 ppm). Several scenarios for future experiments are discussed in this contribution.
1. INTRODUCTION

Seismic studies in the marine environment have a long tradition. During the last decades, acquisition of multichannel near-vertical reflection data has been developed to near-perfection by the hydrocarbon exploration industry. We will soon see specialized vessels towing up to 100 km of streamers for 3-D seismic investigations. The collection of supplementary seismic wide-angle data has historically been promoted by academia, and recent advances in recording technology have led to a major development of new instruments during the last decade. We are on the verge of making experiments where up to 100 ocean bottom instruments will be simultaneously deployed. Using airguns as sources, the data volume collected in marine work is generally much higher than in land work, at a much lower cost per data record. In contrast to active source seismology, marine earthquake seismology is clearly not nearly as developed as land-based observations. Instead, it is comparable to what was done on land about 20 to 30 years ago. Several factors contribute to this situation, although it is primarily missing technology that has led to this imbalance. Deployments of at least a few months are needed to study local seismicity. As was convincingly shown by Hasegawa et al. [1], reliable epicenter location in subduction zones can only be obtained by incorporating marine stations. At reasonable sampling rates, this requires instruments capable of storing a few GBytes of data. Recently, this class of marine instrument has become available and has been successfully used for local marine seismic networks [2,3]. Networks recording teleseismic events for local tomographic studies require deployment times ranging from at least six months to possibly years, which is currently beyond the
capacity of most existing instruments. Perhaps the best example of this type of experiment was the MELT study of the structure beneath the southern East Pacific Rise [4]. The scientific needs for even longer-duration seafloor observatories have been formulated at several recent workshops. Seismic stations that meet this role are still at the edge of what is technologically possible. So far, only a few trial deployments have been made to develop and test seafloor observatory technology [5,6]. Besides power supply and data capacity, the coupling of a seismic sensor to the ground remains the most prohibitive obstacle to advances in seismic recording capability. ROVs or submersibles offer the optimum solution, though the costs involved are high. Alternatively, free-falling and gimballed instruments are easier to deploy, but the collected data are typically noisier and less broadband. The U.S. Ocean Seismic Network Pilot Experiment (OSNPE [7]), carried out in 1998 at ODP site 843B, used a simple but heavy mechanical frame to push the seismometer into the seabed, which had a profound effect on data quality. In this contribution, we summarize the scientific aims of future marine seismological work, present a new data logger that meets most of the requirements and speculate about possible scenarios that may be made operational in the first decade of the new millennium.
2. CURRENT TECHNOLOGY

For more than a decade the need to extend land-based broadband instrument arrays to the sea has been recognized [8], and in 1993 the International Ocean Network (ION) committee was formed to coordinate the effort towards broadband marine observatories. Several DSDP/ODP drillholes were used for test deployments [9,6] or drilled on purpose (site 396B; site 794; OSN1 site 843B; leg 179; leg 186). These experiments all showed that the ocean floor is a suitable environment for broadband observations, and that the technological challenges can be met. Data storage capacity has meanwhile become a relative non-problem due to advances in portable computer recording technology, but installation of sensors and data retrieval at a reasonable cost is still limited by technological issues. Currently, sensors can only be installed in the seafloor using a drill ship, a submersible or a ROV. Data retrieval is somewhat simplified when obsolete telecommunication cables can be used, such as for the H2O observatory [10] or off Japan [11]. However, these are limited in number and are not at the optimum locations. New installations of purpose-designed fibre-optic cables are only practical close to the shore [12,13]. Free-falling instruments have been developed which, based on 1 Hz seismometers using electronic feedback, extend into the semi-broadband (50 sec) range [14]. These instruments, as well as those used for active seismic experiments, all use gimballed seismometers that rest on the seafloor. Their easy handling comes at the expense of poor coupling and a significantly increased noise level compared to buried sensors, as shown by many detailed tests [15]. Often, they are complemented by an additional hydrophone or differential pressure gauge (DPG), which always provide excellent coupling but retain little information on particle motion.
At GEOMAR we have currently created a pool of 40 ocean bottom hydrophones (OBH) or seismometers (OBS), mainly used for active seismic experiments including high-resolution work (up to 4 kHz, e.g. [16]). These low-power (1.5 W) and high-data-capacity (4 GByte) instruments have also been used for local seismicity studies at the Chilean margin [3], and recently we have started to extend the application range into the broadband spectrum required for future teleseismic and regional seismicity studies. In a first step, a new recorder was developed, which is described below. We are now in the process of testing several sensor packages. The GEOMAR instruments [17,18] have a modular design, so they can be adapted
to the needs of a particular experiment. They cover the frequency range from 120 sec to 4 kHz, with different sensors available.
3. FUTURE INSTRUMENT NEEDS IN OCEAN BOTTOM SEISMOLOGY

As on land, marine seismic packages would be deployed for different purposes that have somewhat different instrument needs. One use is as four-component stations (a three-component seismometer plus a hydrophone) that "fill spatial gaps" caused by the uneven global distribution of land and island stations, which would improve the data coverage in global seismic studies. Stations used for this purpose would ideally have very broadband sensors (~0.003-1 Hz) and continuous recording. In this case, they would be a seafloor "analogue" to instruments in the current land-based networks of digital seismic stations. For earthquake location studies, quasi-permanent stations are desirable. However, for global tomographic studies, 1-2 year deployments of mobile local arrays (discussed further in the next paragraph) may actually be preferable, since they can currently obtain significantly more information at a much lower cost than a permanently maintained seafloor or drillhole seismic station. In addition, tomographic studies do not require real-time data return, which allows the use of recording packages that are collected after a single 1-2 year deployment, thus greatly reducing the station costs as no telemetry will be necessary. Since most of the accessible mantle and core ray paths to a given station location can be "illuminated" by a few years of natural seismicity, a longer deployment at a single site will typically gather less new information about Earth's internal seismic structure than a redeployment at a different seafloor site. In addition, the use of mobile arrays in place of a single permanent station would allow beam-forming and stacking analysis that could improve resolution. Another use of marine stations is for earthquake and nuclear test monitoring.
For this purpose, real-time (or near real-time) data recovery is necessary, which will require the development of either data telemetry or a method to quickly recover data from a field instrument, e.g. through messengers [5]. Stations are also envisaged as a component of seafloor observatories to monitor tectonic or volcanic activity in "natural laboratories". Here local seismicity would be a key target, which would ideally require an array of four-component seismic stations that can record microseismicity, which contains much higher-frequency (~10 Hz) information than teleseismic arrivals. Telemetry from at least a few stations would be extremely desirable for monitoring purposes. Finally, we anticipate that a major future use will be PASSCAL-like arrays of marine seismometers that are used in portable 1-2 year deployments to study the regional crustal, lithosphere, and upper mantle structure beneath specific regions, e.g. for regional studies of the lithosphere and upper mantle structure beneath an island arc or oceanic hotspot, and to study 50-200 km scale variations in upper mantle asthenosphere structure associated with upwelling and melting at mid-ocean spreading centers and mantle plumes, plume-spreading center interaction, and island arc and back-arc spreading and volcanism. Many sites will meet both regional and global seismic objectives with 1-2 years of passive recording coupled with an active seismic experiment to constrain crustal and shallow lithospheric seismic structure. For these experiments telemetry is not necessary, so that the required instrument can be much simpler, with lower deployment and recovery costs.
Figure 1. Photograph showing the complete GEOLON-MLS recorder housing all electronic parts including digitizer and clock.

At GEOMAR, we will focus on year-long deployments in array-seismic studies. We believe that this instrument is both feasible and relatively affordable with current microprocessor and sensor technology. A similar approach has recently been initiated in the U.S. to create the U.S. National Ocean Bottom Seismic Instrumentation Pool (OBSIP [19]).
4. GEOLON-MLS: THE LOW-POWER DATA LOGGER
In 1998 we decided to develop a new data logger that can be easily used for long-term deployments. This means that high data capacity and low power consumption were the two main requirements. In addition, we wanted to keep the modular design of our instrument fleet, and therefore the physical dimensions of the new recorder were to be close to those of the METHUSALEM MBS recorder used for active source profiling [18]. The design of the new system (Fig. 1), which is optimized for long-term seismological investigations on the ocean bottom, led to some important modifications. Because of the smaller range of planned applications the flexibility and functional extent could be reduced for significant cost-savings. On the other hand, because of the need for long-term stand-alone operation, the following had to be optimized:
• space and weight constraints for the batteries needed for 12-month deployments require much lower power consumption than that of the OBH recorder;
• the necessary clock stability and precision demands a better clock; the Seascan(R) oscillator was selected, which limits the deviation to 1.5 sec max. over 1 year (< 50 ppb);
• to realize a data storage capacity of 6-12 GBytes the new instrument was equipped with 12 PCMCIA slots. The availability of already-announced PCMCIA elements with larger capacities will extend the overall capacity correspondingly in the near future;
• the new instrument should fit into the same pressure cylinder of 15 cm inner diameter as used for METHUSALEM-MBS; it will thus be mechanically compatible with the existing OBH/OBS equipment;
• it should be easily possible to use the recorder with different types of seismometers, hydrophones and pressure sensors. For this reason the analogue front-end was designed as an exchangeable unit which is plug-replaceable to adapt to the particular interface requirements of the sensors used.

The key features of the GEOLON-MLS are summarized in Table 1.

5. FUNCTIONAL DESCRIPTION OF THE DATA LOGGER

5.1. Hardware
The analogue front-end has four input channels. Three channels are prepared for a three-component seismometer. The fourth channel is prepared for hydrophones or pressure sensors based on strain-gauges. The input sensitivity of its low-noise preamplifier can be set via switches.

Table 1
Key features of the GEOLON-MLS

Analogue inputs:
  Seismometer: 3 channels, input sensitivity custom configurable
  Hydrophone (or pressure sensor): 1 channel with low-noise preamplifier (gain switch-selectable)
Time synchronization: DCF77-coded signals or single pulse
Internal time base drift: < 0.05 ppm, < 1.5 sec/year (from 0°C to +30°C)
Power supply:
  external: from 6.2 V to 16.5 V
  internal: 3 AA alkaline cells to ease handling during preparation
Power consumption:
  Recording: from 230 mW to 250 mW
  Lowbat standby: 100 mW
Storage medium: PCMCIA flash-disk / hard disk
Storage capacity: 12 PCMCIA slots type II or 6 PCMCIA slots type III (at present 12 GBytes)
Weight: 1.5 kg without batteries and PCMCIA storage modules
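The time-base specification can be sanity-checked: a constant fractional frequency offset of 0.05 ppm accumulates about 1.6 s of timing error over a year, of the same order as the quoted < 1.5 sec/year bound (in practice the oscillator will usually drift below its worst-case limit). A minimal sketch:

```python
# Convert a fractional frequency error (in ppm) into the clock error
# accumulated over one year of unattended recording.
SECONDS_PER_YEAR = 365 * 24 * 3600  # 31,536,000 s

def drift_seconds_per_year(ppm):
    return ppm * 1e-6 * SECONDS_PER_YEAR

print(drift_seconds_per_year(0.05))  # ~1.58 s at the 0.05 ppm spec limit
print(drift_seconds_per_year(1.0))   # ~31.5 s for an ordinary 1 ppm oscillator
```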
Table 2
Sample Rates and Resolution

Samples per Second | f-3dB (Hz) | Resolution (Bits) | Signal-to-Noise Ratio (dB)
1                  | 0.3        | 21                | 120
2                  | 0.7        | 21                | 120
5                  | 1.7        | 21                | 120
10                 | 3.3        | 20                | 114
20                 | 6.7        | 20                | 110
30                 | 10.0       | 19                | 106
50                 | 16.7       | 18                | 100
100 *              | 33         | 17                | 96
200 *              | 67         | 14                | 78
* optional
5.2. Preparation for measuring campaign The instrument can be parameterized using an ASCII terminal via its RS232 interface. The high precision oscillator is synchronized using DCF77 compatible pulses.
5.3. Data recording
After low-pass filtering, the signals of the four input channels are digitized using Sigma-Delta A/D converters. A final decimating sharp digital low-pass filter is realized in software by a Digital Signal Processor. The effective signal resolution depends on the sample rate, as shown in Table 2. Finally the samples are permanently stored on PCMCIA flash-disk or hard-disk memory modules. The first file on each PCMCIA card is a special system file containing all control, status, and identification information of the current experiment and the particular PCMCIA card. The synchronization time from the beginning of the experiment and the skew time at the end, giving the time deviation over the experiment duration, are also documented in this file, as well as all used filter coefficients and the filter delay. The remaining storage capacity is available for data files. Each data file can be identified by an automatically generated name. A Huffman-coded delta modulation is used for data compression, resulting in 22-bit signals. Proper recording operation or correct parameter settings may be checked by visualizing the currently acquired signal on a selected channel via the RS232 interface. This is even possible on the ocean bottom in the deep sea, provided a modem and suitable data link are available.
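The exact coding used in the recorder is not detailed here, but the principle behind Huffman-coded delta modulation is easy to illustrate: differencing turns a slowly varying seismic trace into mostly small integers, which an entropy coder such as Huffman can then pack well below the raw word length. A minimal round-trip sketch (illustrative only, not the GEOLON format):

```python
def delta_encode(samples):
    """Replace each sample by its difference from the previous one; a slowly
    varying trace becomes mostly small integers, which an entropy coder
    can represent with short codewords."""
    out, prev = [], 0
    for s in samples:
        out.append(s - prev)
        prev = s
    return out

def delta_decode(deltas):
    """Invert delta_encode by cumulative summation."""
    out, acc = [], 0
    for d in deltas:
        acc += d
        out.append(acc)
    return out

raw = [1000, 1003, 1005, 1004, 1001]
enc = delta_encode(raw)
print(enc)                       # [1000, 3, 2, -1, -3]: mostly small values
assert delta_decode(enc) == raw  # lossless round trip
```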
5.4. Data analysis
The recorded data are retrieved by plugging the PCMCIA storage modules into a PC. Available standard software supports data transfer and decompression into 32-bit signals. The PASSCAL format is used as the standard data transfer format, allowing generation of the SEG-Y format as a basis for different individual analysis methods. An example of data collected during the maiden dive of the GEOLON-MLS recorder in October 1999 offshore Costa Rica is shown in Fig. 2.
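The specifications in Tables 1 and 2 translate directly into a deployment budget. The sketch below assumes four channels and roughly 3 bytes per stored sample (a rough stand-in for the ~22-bit compressed samples of Section 5.3; both figures are our assumptions, not the recorder's exact format) and shows why continuous 50 sps recording would overrun the present 12 GBytes within a year, while 20 sps fits, and why the sub-250 mW draw sets the battery budget:

```python
SECONDS_PER_DAY = 86400

def data_volume_gbytes(channels, sps, bytes_per_sample, days):
    """Recording volume in GBytes (1 GB = 1e9 bytes) for continuous sampling."""
    return channels * sps * bytes_per_sample * days * SECONDS_PER_DAY / 1e9

def battery_energy_wh(power_mw, days):
    """Energy (Wh) needed to sustain a constant draw for the whole deployment."""
    return power_mw / 1000.0 * days * 24

# Assumed parameters: 4 channels, ~3 bytes per stored sample.
print(data_volume_gbytes(4, 50, 3, 365))  # ~18.9 GB: 50 sps overruns 12 GB
print(data_volume_gbytes(4, 20, 3, 365))  # ~7.6 GB: 20 sps fits the 12 slots
print(battery_energy_wh(250, 365))        # ~2190 Wh at the 250 mW maximum
```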
6. FUTURE ISSUES

The cost of using a large research vessel is now the largest cost component in future passive marine experiments. As ship costs are often "hidden" as part of the operating budget, there is often little concern over ship costs in a particular experiment, but instead a large problem with ship scheduling. Deploying a buried instrument-sensor will require a specialized marine vessel for the foreseeable future (to operate a ROV or a more specialized deployment tool), but instrument recovery should be made as simple as possible so that any ship of opportunity (e.g. a fishing boat where available) can potentially be used. Free-fall instruments should also be suitable for deployment by ships of opportunity. Therefore, they should be easy to prepare, relatively low-weight and easy to move. The development of a relatively portable "installation tool", or the ability to use a standard simple ROV for installation, would allow instrument deployment from a large fraction of the marine research fleet. Acoustic telemetry of data to a moored surface transponder is now technologically possible. However, routine real-time data telemetry requires considerable power that is beyond the capabilities of current battery technology. Perhaps fuel-cell technology will be able to significantly improve the power-to-cost and power-to-weight ratio of high-power seafloor installations. The power supply could also come from the moored surface transponder (i.e. fuel cell, wave-action, solar power, or all of the above). This type of surface-plus-seafloor deployment is obviously much more expensive than the deployment of simple stand-alone recording packages. At present this type of real-time seismic monitoring seems best suited to military applications (e.g. nuclear bomb test monitoring) where cost is a lower priority.
Figure 2. Example of marine airgun shots recorded during the first deployment of the GEOLON-MLS data logger.
ACKNOWLEDGEMENTS

The data shown in Figure 2 were collected during cruise SO144-1 off Costa Rica in October 1999. This cruise was funded by the German Ministry for Science and Technology (BMBF), and we especially thank master H. Papenhagen and his crew for their expert assistance during the cruise.
Part II. Seafloor observatories: state of the art and ongoing experiments
Development of seismic real-time monitoring systems at subduction zones around Japanese islands using decommissioned submarine cables

J. Kasahara

Earthquake Research Institute, University of Tokyo, 1-1-1 Yayoi, Bunkyo, Tokyo 113-0032, Japan
Two submarine OBS systems were developed using decommissioned submarine cables: the IZU OBS, using the GeO-TOC cable at the Izu-Bonin Trench, and the VENUS multidisciplinary station, using the GOGC cable at the Ryukyu Trench. The former system uses digital data acquisition but borrows much from traditional submarine cable technology. The latter system is more sophisticated than GeO-TOC, and it was deployed by manned and unmanned submersibles, deep-tow equipment, and cable ships. Many new technologies have been developed for both OBSs. The IZU OBS has been operated for three and a half years and has contributed to improving hypocenter determination accuracy along the Izu-Bonin subduction zone. The VENUS system proved the usefulness of decommissioned submarine cables for multidisciplinary measurements on the deep-sea floor.
1. INTRODUCTION

Several subduction zones surround the Japanese archipelago: the Kuril Trench, the Japan Trench, the Izu-Bonin Trench, the Nankai Trough, and the Ryukyu (Nansei-shoto) Trench. The Pacific plate and the Philippine Sea plate subduct beneath the Japanese archipelago at these sites. The subducting movement of the plates generates destructive earthquakes along the plate boundaries and in the forearc slopes. To minimize fatalities and damage to buildings caused by large earthquakes, it is important to study the nature of earthquakes and seismic structures at subduction zones. Since 1965, we have developed two major kinds of Ocean Bottom Seismometer (OBS): the mobile OBS and the submarine cable OBS. The first system [1] has been used extensively for seismic surveys during the past several years. Real-time monitoring of seismicity, however, is also very important to understand ongoing movements along the subducting plate boundary. To perform real-time monitoring of the subduction zone, we have developed two submarine cable OBS systems using decommissioned submarine cables. In this article, the author describes only the cable systems.
2. USE OF DECOMMISSIONED SUBMARINE CABLES FOR REAL-TIME SEISMIC MONITORING
Real-time seismic observation on the deep-sea floor is sought from the viewpoint of earthquake hazard. One of the best technologies to achieve real-time observation is the use of submarine cables, which have a long history of technological development and proven field performance in telecommunications. Although fiber-optic submarine cables employ very advanced and reliable technology, using new fiber-optic submarine cables is extremely costly. Another kind of submarine cable is the coaxial cable, which can provide electrical power and real-time telemetry similar to fiber-optic cables. Due to the extremely rapid growth of fiber-optic technology and the huge demand for global telecommunications, a number of fiber-optic submarine cables with gigabit-to-terabit capacity have been deployed. Although many submarine cable OBSs have been deployed around the Japanese coast during the past two decades [2], each system required a huge investment. Therefore, construction of a similar system far from shore would not be easy. For example, a submarine cable OBS at the Izu-Bonin arc would require much longer cables, and a fiber-optic system is not practical there. With the installation of fiber-optic systems, the rather old TPC-1 and TPC-2 coaxial submarine cables ended their long commercial service lives in 1990 and 1994, respectively. TPC-1 was the first Japan-US submarine cable, constructed in 1964, and TPC-2 was the second, constructed in 1976. By reusing such resources, real-time geophysical observatories on the deep-sea floor can be realized with high reliability and at reasonable cost [3,4]. Sections of TPC-1, between Ninomiya, Japan and Guam Island, and of TPC-2, between Okinawa Island, Japan and Guam Island, were donated to the ERI (Earthquake Research Institute) and the Incorporated Research Institutions for Seismology (IRIS) in 1990, and to the ERI in 1996, respectively. Both cables cross geophysically important regions (Fig. 1).
TPC-1 is routed from Guam, along the Mariana-Izu-Bonin Trenches, to Sagami Bay. TPC-2 is routed from the Mariana Trough, across the mid-Philippine Sea plate, to the Ryukyu Trench. These areas contain seismically very active subduction zones and a rifting backarc basin. There are two major projects for the scientific reuse of decommissioned submarine cables in Japan. The Geophysical and Oceanographical Trans Ocean Cable (hereafter GeO-TOC) project uses the TPC-1 cable. The other is the Versatile Eco-Monitoring Network of the Undersea-Cable System (VENUS) project, which uses the TPC-2 cable (the Guam-Okinawa Geophysical Cable; hereafter GOGC). In the VENUS project, a multidisciplinary observatory was installed at the Ryukyu Trench.
3. GeO-TOC SYSTEM

The length of the GeO-TOC submarine cable is 2,659 km (Fig. 1). The GeO-TOC system has 74 repeaters [5], which contain dual vacuum-tube amplifiers to compensate for the gain loss caused by the submarine cables. The submarine cable is a 1-inch diameter coaxial cable. The power supply allowance for the TPC-1 system is 1940 V and 370 mA for instruments. If each station uses 30 W, more than 20 stations can be installed along the cable. TPC-1 had 138 voice channels during commercial use. If data telemetry uses several voice channels, there remains enough capacity for scientific use. A question arose over the remaining service life of the cable system, because the official life of the system was 25 years. However, because the design service life of submarine electronics is roughly more than 50 years, and the estimated lives of submarine cables are longer than those of the electronics, the remaining service life of the GeO-TOC system should be sufficient for scientific observations [6].

Figure 1. Cable routes of GeO-TOC and GOGC, and locations of the IZU and the VENUS OBS.

The IZU OBS [6,7] was designed around well-developed submarine cable technology because this was the first trial in the world in which an old coaxial cable system was converted to scientific use. For this reason, the OBS package resembles a submarine cable repeater. The basic consideration was to reuse the old telephone equipment to minimize construction cost, without losing data accuracy. To simplify the system, 4 kHz was adopted as the frequency width of data transmission, which corresponds to the traditional analog telephone bandwidth. The electronics of the IZU OBS comprise sensors, A/D converters, control CPUs, a data telemetry unit, PSFs (Power Separation Filters), and DC-DC converters (Fig. 2). The PSFs separate the high-frequency signals from the DC high voltage. The DC-DC converters supply the necessary DC voltages to the electronics from the high-voltage DC feed. The OBS contains three orthogonal accelerometers (JAE-5V-IIIA, manufactured by Japan Aviation Electronics Industry, Ltd.), a hydrophone (ITC-1010, manufactured by International Transducer Corp.), an external quartz thermometer-pressure gauge (Paroscientific 8B7000-15), and six thermometers to monitor temperature changes of the electronics. Two 16-bit CPUs (Hitachi H-8) are used for data acquisition, data formatting, modem level changes, and gimbal rotation. Three delta-sigma 24-bit A/D converters (Crystal Semiconductor CS5322/5323) with a 125/62.5 Hz sampling rate digitize the accelerometer outputs. The resolution of the accelerometers is estimated to be 0.1 mGal over 0.1-62.5 Hz. The frequency response is flat over DC-62.5 Hz.
The resolution of the quartz thermometer is 0.001 °C. Data from each accelerometer are transmitted to shore at 9600 bps through one phone channel by FDM (Frequency Division Multiplexing). Slow data, such as temperature and pressure, are multiplexed and transmitted to shore through a single phone channel. In total, the IZU OBS uses only five phone channels for the data uplink and two phone channels for downlinking system control commands. The hydrophone signals, however, are transmitted in analog form to avoid losing all data in the case of a malfunction of the digital system. On January 13, 1997, the IZU OBS was deployed on the forearc slope of the Izu-Bonin Trench (31°N, 140°E; water depth 2,708 m), approximately 400 km south of Tokyo, using the cable ship M/V KDD Ocean Link (Fig. 1) [6]. The operation was carried out as follows. First, the cable midway between two repeaters of the GeO-TOC cable was hooked by a set of grapnels, a kind of anchor with three or four flukes, and lifted to the shipboard. The main cable was cut on board, and the OBS unit was spliced into the main cable using a traditional cable repair technique. After verifying communication between the OBS and the shore through the actual submarine cable, the OBS was deployed into the sea through the drum cable engine and the stern chute of the ship. A DC feed of +4170 V at 370 mA, from the same DC power supply as used in commercial service, has been applied to the GeO-TOC cable from the Guam station; this is 90 V higher than the previous feed voltage. The other end of the submarine cable lands at Ninomiya. The FDM-modulated signals are separated into individual channels at Ninomiya and transferred to the ERI, Tokyo, over a commercial data line at 64 kbps.
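The power and telemetry figures quoted above are internally consistent, as a short budget sketch shows. All numbers here are taken from the text (1940 V at 370 mA line power, ~30 W per station, five 9600 bps uplink phone channels, a 64 kbps shore line); the arithmetic itself is the only addition.

```python
# Budget sketch for the GeO-TOC/IZU system, using figures quoted in the text.
LINE_VOLTAGE_V = 1940.0
LINE_CURRENT_A = 0.370
STATION_POWER_W = 30.0

line_power_w = LINE_VOLTAGE_V * LINE_CURRENT_A        # ~718 W available on the line
max_stations = int(line_power_w // STATION_POWER_W)   # consistent with "more than 20"

# Uplink: five 9600 bps phone channels must fit within the 64 kbps shore line.
uplink_bps = 5 * 9600
fits_shore_line = uplink_bps <= 64_000

print(line_power_w, max_stations, uplink_bps, fits_shore_line)
```

The sketch confirms that roughly 23 stations of 30 W each fit the line-power allowance, and that the five-channel uplink (48 kbps) leaves headroom on the 64 kbps commercial line.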
[Block diagram of the IZU OBS with couplers, showing the SD main coaxial feeds to Guam (900 V DC, 370 mA) and to Ninomiya (810 V DC), the PSFs, DC-DC converters, CPUs, A/D converters, and sensor inputs (hydrophone, accelerometers, temperatures, pressure). Abbreviations: PSF: Power Separation Filter; ARC: Acoustic Receiving Circuit; HB: Hybrid Circuit; FCC: Frequency Converter Circuit; MOD: Modulator/Demodulator.]
Figure 2. Block diagram of the IZU OBS sensors and electronics.

The hydrophone data are digitized by a 16-bit A/D converter at Ninomiya. A GPS clock at Ninomiya provides timing for the data. Continuous data have been stored on a 3.5-inch MO disk every five days since the installation in January 1997. The typical size of one hour of data is approximately 4.5 MB when the ground is quiet. The IZU data have been processed regularly since March 1998 by the University Micro-Earthquake Network to determine hypocenters in Japan. The IZU OBS has provided seismic data to the University Micro-Earthquake Network as its southernmost station and has contributed to improving hypocenter accuracy, especially in depth, for earthquakes occurring along the Izu-Bonin subduction zone [7]. Generally speaking, earthquake hypocenters along the Izu-Bonin Trench and its forearc slope are determined several tens of kilometers shallower and 15 km further east when the IZU station is added to the land stations, because the nearest land seismic stations to IZU are at Aogashima (32°N, 139°E) and Iwo-jima (24°N, 141°E), and the inter-station distances are of the order of 1,000 km. The data have also been made available to IRIS and the Tokyo Metropolitan Seismic Network. In the three and a half years since deployment, the OBS unit, the main cables, and the repeaters have experienced no major problems. There were some minor problems in shifting the center frequency of the receiving unit, which were solved by adjusting the center frequency. Several earthquakes along the Japan Trench and Izu-Bonin Trench subduction zones are observed every day by the IZU OBS. One waveform example, a seismic record of a deep-focus event in the Izu-Bonin subducting slab, is shown in Fig. 3.
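The archiving scheme described above (roughly 4.5 MB per quiet hour, written out every five days) can be checked with simple arithmetic. Interpreting the text's "Mb" as megabytes is an assumption here, and the 640 MB disk capacity is a typical value for late-1990s 3.5-inch MO media, not a figure from the text.

```python
# Storage-budget check for the IZU OBS MO-disk archiving cycle.
MB_PER_HOUR = 4.5          # typical quiet-ground data volume (from the text)
HOURS_PER_DUMP = 24 * 5    # one MO disk written every five days

mb_per_dump = MB_PER_HOUR * HOURS_PER_DUMP
print(mb_per_dump)  # 540.0 MB per five-day dump
```

At ~540 MB per five-day cycle, the continuous record would indeed fit on a single high-capacity MO disk of the period, under the stated assumptions.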
Figure 3. Records of the deep-focus earthquake (at 11:16, January 15, 1999; 560-km depth). EW, NS, UD and hydrophone components from top to bottom. Record length is 2 minutes.

The noise spectrum is always of interest for OBS performance. The noise power spectra of the IZU OBS show high peaks at periods of 0.2-0.3 sec. on the horizontal accelerometers, and similar power spectra for the vertical accelerometer and the hydrophone (Fig. 4). Spectra at periods shorter than 0.1 sec. are similar to the HNM (High Noise Model), but the level of the 0.2-0.3 sec. peak is a little lower than the HNM [8]. The noise level at 0.2-0.3 sec. tends to be strongly influenced by the sea state. Because the IZU station is located in the middle of the typhoon corridor and also below the main stream of the Kuroshio Current, both might affect the noise level, even though the depth of the OBS is 2,700 m.
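The kind of noise-spectrum analysis described above can be sketched as follows: estimate the power spectral density of an acceleration record and locate the period of its peak. The record here is synthetic (a 4 Hz line inside the 0.2-0.3 sec. microseism band discussed in the text, plus white noise); a real IZU analysis would apply the calibrated accelerometer response and Welch averaging rather than a single periodogram.

```python
import numpy as np

fs = 62.5                              # Hz, one of the IZU accelerometer sampling rates
t = np.arange(0, 60.0, 1.0 / fs)       # one minute of samples

# Synthetic microseism-like record: 4 Hz line (0.25 sec. period) + white noise.
rng = np.random.default_rng(0)
x = np.sin(2 * np.pi * 4.0 * t) + 0.1 * rng.standard_normal(t.size)

# Simple Hann-windowed periodogram PSD estimate.
X = np.fft.rfft(x * np.hanning(t.size))
psd = (np.abs(X) ** 2) / (fs * t.size)
freqs = np.fft.rfftfreq(t.size, d=1.0 / fs)

peak_period = 1.0 / freqs[np.argmax(psd[1:]) + 1]   # skip the DC bin
print(round(peak_period, 2))  # -> 0.25, i.e. inside the 0.2-0.3 sec. band
```

Comparing such a PSD estimate against reference curves like the High Noise Model is exactly the diagnostic shown in Fig. 4.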
4. VENUS-GOGC SYSTEM

In contrast to the GeO-TOC OBS, the VENUS project (1995-1999) was intended to develop new technologies for using decommissioned submarine cables for environmental measurements on the ocean bottom. A multidisciplinary observatory using the GOGC cable [7,9] was installed at a depth of 2,200 m on the forearc slope of the Ryukyu Trench, 50 km from the main island of Okinawa, in the fall of 1999 (Fig. 1). The objective of the VENUS project [9] was to develop a multidisciplinary station to study deep-sea environmental changes due to the subduction of the Philippine Sea plate at the Ryukyu Trench.
Figure 4. Noise power spectral density for the IZU OBS. (a) EW component of the accelerometers. (b) Vertical component of the accelerometers. (c) Hydrophone channel (Pa²/Hz). The absolute hydrophone noise level below 4 Hz may be underestimated due to the hydrophone response and the analog data transmission.
Nine Japanese institutions jointly worked on this project. The cable length of the GOGC is 2,400 km. The system uses 1.5-inch diameter coaxial cables. The former TPC-2 system had 845 voice channels. Although +1080 V DC from Okinawa and -1080 V DC from Guam were supplied to the cables at constant current during commercial use [10], the electric power supply in the VENUS project was modified to a single supply from Okinawa. The bottom system comprises seven bottom sensor units, a bottom telemetry system, and main coaxial cables. The land system comprises a shore station and a data center. The total power dissipation of the bottom units is approximately 53.5 W. To minimize corrosion during the long observation period, all pressure cases and the major parts of the frames of the bottom units were made of titanium and plastics. Interfaces between titanium and stainless steel, where they existed, were protected by plastic insulators. The sensor units comprise ocean bottom broadband seismometers (OBBS), a tsunami pressure sensor, a hydrophone array, a multi-sensor unit, geodetic instruments, geoelectric-geomagnetic instruments, and a mobile unit (Fig. 5) [9,11]. Seismic measurement uses Guralp CMG-1T triaxial broadband seismometers with gimbals. The seismometers respond to periods between 300 and 0.05 sec. The OBBS outputs are digitized at 24 bits and 100 Hz. The tsunami gauge uses a quartz pressure sensor, and its resolution for sea-level change is 0.5 mm. The multi-sensor unit comprises short-period seismometers, a hydrophone, a digital still camera, a CTD, a current meter, a nephelometer, and sub-bottom temperature probes. The hydrophone array is composed of five hydrophones with a 700-m spacing, due to budget limitations [12]. Sixteen-bit data are transmitted to shore. Geodetic changes are determined acoustically by precise baseline measurements between pairs of transponders.
Three units are placed in a triangular formation, and the distance between any two units is approximately 1 km. The estimated accuracy of the geodetic measurements will be a few centimeters per year, which may be smaller than the precursory crustal deformation expected near the trench if an M7-class earthquake occurs just beneath the site [13]. The geoelectric-geomagnetic unit comprises a proton magnetometer, flux-gate magnetometers, and orthogonal geo-potentiometers. The baseline of the geo-potential measurement is 10 m. The mobile unit consists of an acoustic communication unit and a remote instrument. A separate report describes the details of each sensor [11]. The bottom telemetry system comprises a data-coupling unit, a data-telemetry unit, and a junction box. The data-coupling and data-telemetry units generate 24 V DC from the 3,000 V feed, separate the high-voltage DC component from the high-frequency carriers, and again mix the high-frequency carrier with the DC component. The data-telemetry unit multiplexes the data and sends them to shore using a 240-kHz carrier bandwidth. The transmission rate for the multiplexed data is 96 kbps. Each instrument, however, uses a particular transmission rate, e.g. 19.2 kbps for the OBBS. Hydrophone data use another 240-kHz bandwidth. If an instrument does not operate correctly, users can shut it down remotely from the land base. The junction box has nine ROV (Remotely Operated Vehicle) underwater-mateable connectors, which allow units to be plugged in and unplugged on the ocean floor with the assistance of a manned submersible or an ROV. The route of the GOGC cable, on ocean floor with a thin sediment cover, was identified by the deep-tow camera of the R/V Yokosuka in February 1998. In March 1998, the submersible Shinkai 6500 (Dive #411/YK98-02-Leg3) cut the GOGC at 25°N, 128°E, at a water depth of 2,200 m, using a newly developed cable cutter.
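The uplink figures quoted above can be sanity-checked: three components of 24-bit OBBS samples at 100 Hz must fit into the instrument's 19.2 kbps channel, which itself is one share of the 96 kbps multiplexed uplink. The raw-rate arithmetic is the only addition; framing and timing overhead are not specified in the text, so the "headroom" below is simply whatever the raw payload leaves unused.

```python
# Sanity check of the VENUS OBBS uplink budget.
OBBS_COMPONENTS = 3
OBBS_SAMPLE_RATE_HZ = 100
OBBS_BITS_PER_SAMPLE = 24

raw_obbs_bps = OBBS_COMPONENTS * OBBS_SAMPLE_RATE_HZ * OBBS_BITS_PER_SAMPLE
obbs_channel_bps = 19_200
headroom = obbs_channel_bps - raw_obbs_bps   # available for framing/timing/status

print(raw_obbs_bps, headroom)  # 7200 raw bits/s, 12000 bps unused by raw samples
```

So the raw seismic payload uses well under half of the allocated channel, leaving ample margin within the 96 kbps shared uplink for the other instruments.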
Three major legs were carried out by the M/S Kuroshio-Maru, the R/V Kaiyo, and the R/V Kairei/ROV Kaiko, for deployment of the telemetry system, deployment of instruments, and extension and connection of cables, respectively. The instruments were confined within an area of approximately 1-km radius around the junction box (Fig. 5). The data-coupling unit was spliced into the main cable on the deck of the M/S Kuroshio-Maru in August 1999. The M/S Kuroshio-Maru installed the bottom telemetry system, with the tsunami sensor and the hydrophone array, on the ocean bottom. The location of the telemetry system is 25°N, 128°E, at a water depth of 2,157 m. The deep-tow equipment of the R/V Kaiyo installed five instrument units on the ocean bottom in September and October 1999. The ROV Kaiko connected the nine ROV connectors of the cable end to connectors on the junction box on the ocean bottom in October 1999. The OBBS was placed 80 m north of the bottom telemetry system at a water depth of 2,154 m. Figure 6 shows the bottom telemetry system at 2,200 m depth. The OBBS was not buried in the sediment, as no appropriate burial equipment was available. Instead of burial, 80-kg weights were placed on the OBBS frame, but this seems to have been of no help in reducing the low-frequency noise [11].
Figure 5. Instruments and telemetry system configuration at the VENUS multi-disciplinary observatory.
A shore station is located in Okinawa. Some shore equipment was retained from the previous TPC-2 station. The shore-receiving unit demodulates the signals and sends them to Yokosuka, Japan, using two 64-kbps lines. It also supplies 3,100 V to the cable. Data from the ocean bottom instruments are stored at the Japan Marine Science and Technology Center (JAMSTEC). Scientists can obtain their own data, and can communicate with a particular instrument, using a data terminal at JAMSTEC. A number of earthquakes were observed, including two large events: the southern California earthquake (Ms = 7.3) of October 16 and the Taiwan earthquake (Ms = 6.1) of November 1, 1999 [11]. However, the noise level of the broadband seismometers seems to be extremely large (approximately 5 µm/s at 300 sec. for the horizontal components), due to the strong influence of infragravity waves and ocean currents, because the OBBS was not buried in the ocean-bottom sediments [8].
5. CONCLUSIONS AND FUTURE PLAN

The two projects (GeO-TOC and VENUS) use decommissioned submarine cables for real-time geophysical monitoring on the deep-sea floor. Data from the ocean bottom can be monitored in the laboratory. Three and a half years of operation of the GeO-TOC IZU OBS confirm the usefulness and the reliability of the system. During operation, the system has served to improve the hypocenter accuracy of the University Micro-Earthquake Network as its southernmost station. The OBBS records of VENUS show the necessity of installing seismometers in sediments to reduce noise [e.g., 8]. The submarine cable between Okinawa and Ninomiya (Okinawa Cable) will be used to monitor future large earthquakes along the Nankai Trough.
Figure 6. Photograph of the bottom telemetry system at 2,200 m depth. Deep-sea ROV connectors can be seen.
ACKNOWLEDGEMENTS
The Ministry of Education, Science, Culture and Sports, Japan, supports the GeO-TOC project. The Science and Technology Agency, Japan, supports the VENUS project. The author thanks JAMSTEC for its great assistance in operating research boats and submersible vehicles. The author also thanks ERI for permission to use the GeO-TOC and GOGC cables for seismological research.
REFERENCES
1. J. Kasahara, T. Matsubara, T. Sato and K. Mochizuki, J. Mar. Acous. Soc. Jpn., 24 (1997) 39.
2. H. Kinoshita, in: Proceedings of the International Workshop on Scientific Use of Submarine Cables, Okinawa, 1997, 119.
3. S. Nagumo and D.A. Walker, EOS Trans. AGU, 70 (1989) 673.
4. J. Kasahara, H. Utada and H. Kinoshita, J. Phys. Earth, 43 (1995) 619.
5. KDD Technical Jour., 42 (1964) 1.
6. J. Kasahara, H. Utada, T. Sato and H. Kinoshita, Phys. Earth Planet. Inter., 108 (1998) 113.
7. J. Kasahara, T. Sato, H. Momma and Y. Shirasaki, Earth Planets Space, 50 (1998) 913.
8. S.C. Webb, Rev. Geophys., 36 (1998) 105.
9. J. Kasahara, Y. Shirasaki and H. Momma, IEEE J. Ocean. Engin., 25 (2000) 111.
10. KDD Technical Jour., 88 (1976) 3.
11. J. Kasahara (ed.), Final Report on the VENUS Project, ERI, 2000.
12. K. Watanabe, in: J. Kasahara (ed.), Final Report on the VENUS Project, ERI, 17 (2000).
13. Y. Nagaya, in: J. Kasahara (ed.), Final Report on the VENUS Project, ERI, 148 (2000).
Geophysical Ocean Bottom Observatories or Temporary Portable Networks?

J-P. Montagner a, J-F. Karczewski a, E. Stutzmann a, G. Roult a, W. Crawford a, P. Lognonné a, L. Béguery a, S. Cacho a, J-C. Koenig a, J. Savary a, B. Romanowicz b and D. Stakes c

a Seismological Laboratory, URA CNRS 195, Institut de Physique du Globe, Paris, France
b Seismological Laboratory, U.C. Berkeley, Berkeley, California, U.S.A.
c MBARI, Moss Landing, CA 95039-0628, California, U.S.A.
Most scientific issues are multiscale in both space and time. There is a general consensus that it is necessary to install geophysical stations in the oceans, which are presently instrumental deserts. In this paper, we review the general constraints on a network of ocean bottom stations that should enable us to address these scientific issues by covering the different spatial scales, from global down to local. The recent progress made by Japanese, French, German, Italian and U.S. groups shows that the technical challenge of installing long-term or, even better, permanent geophysical ocean bottom observatories (coined GOBOs) is not out of reach. Different technological developments are presently being explored and prefigure the future geophysical ocean bottom stations and networks. They integrate the concept of the multiparameter station, which has demonstrated great scientific interest. The installation of different kinds of sensors at the same place as a seismic station enhances the signal-to-noise ratio and opens wide the possibility of new discoveries; the discovery of the excitation of normal modes in the absence of large earthquakes is emblematic in that respect. Such a multiparameter oceanic observatory includes at least broad band seismometers, microbarometers or pressure gauges, and microthermometers, and possibly other sensors (e.g. electromagnetic sensors, strain meters, GPS). The design of the complete chain of acquisition, from the sensor to the distribution of data, will imply the integration of all the technical progress made in micromechanics, electronics, computer science, space science, and telecommunication systems. There is also a real need for developing seismological, and more generally geophysical, arrays enabling us to address scientific issues at regional and local scales, particularly for understanding active processes in seismic and volcanic areas.
The networks at all scales must be coordinated in order to constitute a hierarchical, or multiscale, network, which will be the basic tool for addressing scientific issues in the geosciences. However, the strategy for developing reference GOBOs and temporary stations is not necessarily the same. The technological developments depend largely on the period of operation of the station: a long-term geophysical observatory is much more difficult to install and maintain than a time-limited station, due to the problems of power and data transmission. For example, one solution for ensuring long-term operation is the installation of cables. The cables can either be laid on the seafloor between the station and the seashore, or installed vertically, in connection with a surface buoy, in order to ensure the link between the sea floor and the sea surface, where data can be teletransmitted and power can be provided by solar panels, diesel generators, or windmills. Such a "heavy" observatory necessitates the use of a manned submersible, ROV or AUV. On the other hand, the installation of typically one hundred temporary "light" stations must be performed by usual oceanographic vessels using simple dropping procedures. The design of both kinds of stations will be detailed. Whereas a GOBO should follow the multiparameter concept, a temporary station must, for practical reasons, be dedicated to a single-parameter measurement. All the efforts necessary to achieve and maintain such a multiscale, multiparameter network represent a formidable technological challenge for the next decade.
1. INTRODUCTION

The last twenty years have seen the explosion of a new kind of seismology, broad band seismology. The Federation of Digital Seismographic Networks (FDSN) played a key role in promoting this new seismology and in coordinating the broad band projects of several countries, by proposing standards for the sensors and a data distribution format, and by avoiding the duplication of national efforts [1]. The same philosophy is followed by the geomagnetism community, which has launched the InterMagnet program. However, in spite of these international efforts, the global coverage of the Earth by the digital seismic stations of global networks such as GEOSCOPE, IRIS/GSN and GEOFON, and of regional networks (e.g. MedNet, CSN, CNSN, POSEIDON), is still very uneven. Most of the stations are located on continents, primarily in the Northern Hemisphere. Therefore, a large part of the oceanic areas (2/3 of the surface of the Earth) is devoid of seismic and geophysical instrumentation. At the same time, the broad band revolution also concerned local networks and portable instrument arrays (e.g. PASSCAL in the U.S.A., SKIPPY in Australia, GEOFON in Germany, RLBM in France). Such portable networks make it possible to investigate geodynamic processes at regional scales (around 100 km), such as continental roots, tectonic processes, and the deep origin of plumes. The lateral resolution of tomographic models, and detailed studies of active processes (e.g. earthquakes, volcanic activity of plumes and ridges, landslides), are primarily limited by the poor station coverage of the oceanic areas. Investigating such processes without ocean coverage is equivalent to looking at an object from only one side, or to being blind in one eye.
Consequently, a recommendation of recent prospective workshops (see, for example, the proceedings of the recent ION/ODP workshop [2]) was to promote the installation of long-term oceanic seismographic stations or, even better, geophysical stations and observatories. However, this is a very difficult task, due firstly to the hostile environmental conditions prevailing at the bottom of the ocean, and secondly to the difficulty of correctly installing stations on the sea floor, maintaining stable long-term observations, retrieving data in real time, and supplying power for a long period of time. Due to the high cost of such observatories, the geosciences community has realized that these observatories have to be multidisciplinary. These multiparameter geophysical ocean bottom observatories (hereafter referred to as GOBOs) are interesting not only from a financial point of view but also from a scientific point of view. Beauduin et al. [3] and Roult and Crawford [4] demonstrated that the co-location of broad band seismometers and microbarometers makes it possible to improve the signal-to-noise ratio at land stations, and the concept of the multiparameter station is valid for land stations as well as for ocean bottom observatories. The international organization, the International Ocean Network (ION), was launched in 1993 in order to coordinate the international efforts in the design, site location, and installation of ocean bottom observatories [5]. The same effort must be made for portable ocean bottom stations. In this paper, we review recent developments regarding instrumentation (sensors, pilot experiments on the sea floor), multiparameter stations, and data teletransmission. The relationship between technological developments and the duration of operation (short-term experiments, semi-permanent and permanent stations) will be discussed. We highlight the French contribution to these international efforts.
2. TOWARDS AN INTERNATIONAL OCEAN NETWORK

A uniform coverage of the Earth with geophysical observatories at different scales is particularly important from a scientific point of view for understanding the Earth's dynamics. Different spatial scales can be considered: global scale (characteristic station spacing around 2,000 km), regional scale (typical dimensions: 1,000 km, spacing of the order of 100 km), and local scale (dimensions smaller than 100 km). At the global scale, most emerged lands are now covered by broad band stations (Fig. 1). The site locations are coordinated through the FDSN [1]. Its initial goal was to obtain the most uniform coverage of the Earth possible, with a station spacing of around 2000 km, which corresponds to a number of stations of around 100. This goal has been largely surpassed and more than 200 stations are now part of the Federation network. This network includes all stations of the global networks (GEOSCOPE, GEOFON, IRIS/GSN), and selected stations of regional networks such as the Chinese Seismograph Network (CSN), the Canadian National Seismograph Network (CNSN), the Mediterranean Network (MedNet, Italy), POSEIDON (Japan) and the Australian National Seismograph Network (ANSN). Though most continents and emerged lands are adequately covered, the station coverage is still very uneven and dramatically biased towards the continents, particularly in the northern hemisphere. The same problem exists for geomagnetic stations. A large part of the oceanic areas, particularly in the southern hemisphere, is devoid of instruments. From the scientific point of view, this means strong aliasing of tomographic models and the impossibility of correctly investigating active processes occurring in these areas.
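The quoted correspondence between a 2000 km station spacing and roughly 100 stations can be checked by dividing the Earth's surface area by the area nominally covered by one station; a quick sketch (the hexagonal-cell approximation is our assumption, not part of the FDSN plan):

```python
import math

R_EARTH_KM = 6371.0
surface_km2 = 4 * math.pi * R_EARTH_KM**2   # ~5.1e8 km^2

spacing_km = 2000.0
# Treat each station as covering a hexagonal cell whose centers are one
# spacing apart: area = (sqrt(3)/2) * spacing^2 (a packing assumption).
cell_km2 = (math.sqrt(3) / 2) * spacing_km**2

n_stations = surface_km2 / cell_km2
print(f"~{n_stations:.0f} stations for a {spacing_km:.0f} km spacing")
```

The result is on the order of 100-150 stations, consistent with the initial FDSN target and with the fact that the present network of 200+ stations exceeds it.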
Figure 1. Broad band digital stations.

With the present station coverage, the lateral resolution of global tomographic models is limited to about 1000 km. For source studies, the azimuthal coverage of seismic sources is also very uneven. The different scientific issues regarding global studies and active processes were extensively discussed at the ION/ODP workshop [2] held in Marseilles in January 1995. Even if all islands were instrumented, large parts of the oceans would remain unsampled, particularly in the Pacific Ocean and in the Indian Ocean.

2.1. Pilot experiments on the ocean floor
The installation of a network of GOBOs represents a "formidable" technological challenge, and several pilot experiments have been carried out in order to unravel different technical issues. Several groups in Japan, France and the U.S.A. have performed preliminary experiments focused on the goal of installing permanent seismic stations. In March 1991, a downhole set of CMG3 broad band seismometers was successfully placed in ODP hole 794D in the Japan Sea but not recovered [6]; teleseismic events were recorded and broad band seismic noise spectra (0.03 s - 200 s) were obtained [7].
Figure 2. Sketch of the OFM/SISMOBS experiment [8].

2.1.1. SISMOBS/OFM

In May 1992, the French pilot experiment OFM/SISMOBS was successfully conducted: two sets of CMG3 broad band seismometers were installed, operated for more than one week and recovered [8]. The experiment (Fig. 2) took place in the North Atlantic Ocean, at 23°N and 43.3°W, at the location of DSDP hole 396B. A first set of CMG3 seismometers (called OFM) was installed on the sea floor at 20 m from the hole and was semi-buried within the sediments. It was possible to install a second set of CMG3 seismometers (named OFP) in the hole, down to 296 m below the sea floor. After the installation of both sets of seismometers, seismic signals were recorded continuously during 8 days for OFM and 5 days for OFP, at a sampling rate of 5 samples per second. The different instrumentation tools were designed by a team of the Technical Division of INSU, now integrated into IPG-Saint-Maur. The experiment required the simultaneous use of the oceanographic vessel NADIR, the submersible NAUTILE and the re-entry logging system NADIA. All the logistical support was provided by IFREMER. From a technological point of view, this experiment was a complete success. The most important scientific results are the following [9]: the seismic noise is smaller in the period range 4-30 s for both OFM (sea floor seismometers) and OFP (downhole seismometers) than in a typical broad band continental station such as SSB (France, GEOSCOPE). But, more importantly, the noise is smaller than the noise at SSB up to 600 s for
OFM. This low level of seismic noise implies that the detection threshold for earthquakes is very low, and it has been possible to correctly record teleseismic earthquakes of magnitude as small as 5.2 at an epicentral distance of 105° (Fig. 3). Another important qualitative result is that the noise level tends to decrease as time goes on, for both OFM and OFP [10]. The amplitude of the noise decreases systematically and rapidly for OFP at long periods (T > 50 s). For OFM, there is some tendency for the noise to decrease with time, but its variations are more erratic and may correspond to the normal noise variations of a seismic station. The noise level decrease for OFP can be approximated by an exponential, but the asymptotic level corresponding to t → ∞ is still larger for OFP than for OFM. However, the duration of operation for OFP (5 days) is too short to obtain an accurate estimate of the exponential decay, which means that the equilibrium stage had not yet been attained by the end of the experiment. Therefore, the key issue of whether it is necessary to install seismometers down boreholes, or whether the sea floor suffices, was still unresolved. This experiment demonstrated that a broad band seismometer carefully installed on the sea floor and semi-buried can present an excellent signal/noise ratio and provide useful seismic data.

Figure 3. An example of an earthquake recorded on both OFM and OFP vertical components (Hokkaido, Japan, May 7, 1992, 06h23'56.1", 41.175N 144.7E, depth: 15 km, mb = 5.8; bandpass filter: 10-50 mHz).
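The exponential decay of the OFP noise level described above can be quantified by a least-squares fit of log-amplitude versus time once an asymptotic level is assumed; a minimal sketch on synthetic data (the noise values, the assumed asymptote and the resulting decay constant are illustrative, not measurements from the experiment):

```python
import math

# Hypothetical daily noise-level readings (arbitrary units) over a 5-day
# deployment, following N(t) = N_inf + A * exp(-t / tau) plus scatter.
days = [0, 1, 2, 3, 4]
noise = [10.0, 6.1, 4.4, 3.6, 3.2]
n_inf = 3.0  # assumed asymptotic level (t -> infinity)

# Linearize: log(N - N_inf) = log(A) - t / tau, then fit a straight line.
x = days
y = [math.log(n - n_inf) for n in noise]
n = len(x)
sx, sy = sum(x), sum(y)
sxx = sum(v * v for v in x)
sxy = sum(a * b for a, b in zip(x, y))
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
tau = -1.0 / slope  # e-folding time in days
print(f"decay constant tau = {tau:.1f} days")
```

With only 5 days of data, the fitted tau depends strongly on the assumed asymptote n_inf, which is precisely why the OFP record was too short to settle the borehole-versus-seafloor question.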
2.1.2. MOISE: Monterey Bay Ocean Bottom International Seismic Experiment

Following the international ION workshop in Marseilles [2], a cooperative multiparameter project between IPG (Paris), UBO (Brest), MBARI (Monterey, California) and UC Berkeley was launched in order to test the feasibility of installing, operating and recovering different geophysical sensors (primarily broad band seismometers and electromagnetometers) on the sea floor during three months, and to investigate the covariations of the different signals recorded by these sensors. This experiment, named MOISE, was conducted from June to September 1997 off the California coast in Monterey Bay, on seafloor sediments at a depth of 1015 m. The different geophysical instruments were deployed using the remotely operated vehicle (ROV) Ventana of the Monterey Bay Aquarium Research Institute (MBARI). The experiment (Fig. 4) is described in [11]. Preliminary results, presented in [12], demonstrate the strong correlation between seismic noise and deep water currents. A systematic study of the seismic noise level variations is presented in [13]. It is shown that the seismic noise level was stable throughout the experiment; it is comparable to terrestrial station noise below 15 s, and displays strong diurnal variations at long periods. These diurnal variations can be removed from the vertical component by subtracting the effect of the horizontal components, decreasing the vertical noise by up to 40 dB. Coherence between long-period seismic, electromagnetic and environmental data was also investigated. The coherence between the different signals peaks near a period of 12 hours, a consequence of tidal effects. The coherence between the vertical seismic signal, pressure and current velocity is high throughout the experiment. There is no significant coherence with the vertical magnetic field.

2.1.3. OSN-1: February-June 1998

Plans in the U.S.A. called for a three-phase approach [14].
In phase 1 (completed), pilot experiments addressed the fundamental problems of sensor coupling in holes and noise, devising solutions for power, data retrieval and reliability on the multi-year time scale. In phase 2 (in progress), a small number of prototype observatories will be installed, immediately contributing data to the seismological community. In phase 3, the complete Ocean Seismic Network (OSN) of 20-25 stations will be installed and will complement the IRIS/GSN. An important experiment was carried out at the OSN-1 drill site (ODP hole 843B), 225 km south-west of Oahu, Hawaii, in water 4407 m deep. The noise level in the Pacific Ocean is known to be larger than in the Atlantic Ocean (see [15] for a review). A complete description of this experiment can be found in Stephen et al. [16]. Three broad band seismic systems were tested: a seismometer (Guralp CMG-3T) resting on the seafloor, a seismometer (CMG-3T) buried within 1 m of the seafloor, and a seismometer (Teledyne KS5400) clamped at 248 m beneath the seafloor in the hard rock basement. The instruments were deployed in early February and recovered in early June 1998. The results of the experiment confirm the previous results of the Japanese experiments and of the French SISMOBS/OFM experiment. In the microseism and short-period band, the borehole sensor had the quietest ambient noise levels, particularly
on the horizontal components. But at periods longer than 10 s (the noise notch and infragravity wave bands, following the classification of Webb [15]), the buried sensor was as quiet as, or quieter than, the borehole sensor. The good news, again, was that broad band seafloor seismic installations can yield data of a quality comparable to land stations, not only in the Atlantic Ocean but in the Pacific Ocean as well. The debate regarding borehole versus buried sensors is now closed. The borehole system can be used for permanent observatory sites, provided that the sensor is cemented in place, which should reduce the noise level above 10 s.

Figure 4. Monterey Bay Ocean Bottom International Seismic Experiment. The ROV Ventana of MBARI is holding the broad band seismic package [11].

2.2. Reference Geophysical Ocean Bottom Observatories at the Global Scale

The same three-phase scheme is also followed by other members of the ION community, and the phase which requires installing a small number of prototype observatories is ongoing.
These sites were detailed in the ION proposal [5]. In May 1998, a hole was drilled in the middle of the Indian Ocean on the Ninetyeast Ridge. The installation should have been carried out in the framework of a French-Japanese cooperative project, but both partners are still waiting for funding from their respective agencies. In November 1998, U.S. groups installed a junction box, including broad band seismometers, about half-way along the cable linking the Hawaiian archipelago and the California coast [17]. An unfortunate failure delayed the operation of this first ocean bottom observatory, named H2O. It was repaired during fall 1999 and the instruments are providing seismic data to the IRIS DMC in Seattle. H2O is a good example of how to re-use retired telecommunication cables. The main advantage of a cable is that it can provide power to the instruments and transmit the data. Unfortunately, most of the ION selected sites are located far away from available cables, particularly in the southern hemisphere, and the installation of cables several thousand kilometers long would be far too expensive to constitute an alternative to completely autonomous GOBOs. In June 1999, during ODP Leg 186, two sites off the northeast coast of Japan were successfully drilled, in order to monitor seismic and aseismic crustal deformation associated with the subduction of the Pacific plate beneath Japan. Borehole strainmeters, tiltmeters and broad band seismometers (Guralp CMG-1) were installed at the bottom of the drilled holes. At least one set of instruments turned out to be operating correctly (Suyehiro, personal communication).
3. SHORT-TERM STATIONS, REGIONAL AND LOCAL SCALE ARRAYS

Long-term GOBOs will enable us to investigate global scale geodynamics and will serve as reference stations. However, in order to study active processes, the global ION network must be complemented by regional and local scale networks. Permanent regional networks are devoted to the study of structure at spatial wavelengths from tens of kilometers to a maximum of 1000 km, and to the localization and study of regional earthquakes. The scales between 100 km and 1000 km are so far naturally out of reach in most of the oceanic areas. As for local networks (scales between a few kilometers and hundreds of kilometers), the number of permanent networks is very limited. They are located either where active processes occur (seismic or volcanic areas) or in quiet areas for nuclear test monitoring. There is a real need, as well, for high resolution experiments (scales smaller than 1 km, or even 100 m) able to map interesting three-dimensional geological objects (e.g. fault systems, magma chambers). So far, regional and local networks cannot provide a uniform sampling of intermediate spatial wavelengths (between 10 km and 1000 km) and mini-scales (smaller than 1 km) for the large variety of geological environments. Only a limited number of tectonically active areas are covered, and our knowledge of the active processes occurring at plate boundaries below the oceans (ridges, passive or active margins), or in the middle of plates (intraplate volcanism, plumes), is still very limited. The short wavelength structure below the crust, in the deep mantle, in the D"-layer and in the core is very poorly known. The present coverage of stations does not enable us to address these issues with the
available networks. The development of broad band portable networks such as SKIPPY in Australia [18] or PASSCAL in the U.S.A., among many other initiatives in different European and Asian countries, has demonstrated that hot scientific issues on continents can be addressed by short-term (usually less than one year long) experiments. The new American initiative, USArray, consists of developing a network of 1000 broad band (BB) instruments and recording systems. However, such a portable broad band network is still missing in the oceans. During the MELT experiment [19], it was possible to record long-period surface waves and body waves by using short-period instruments with broadened bandwidth, but the quality of the data was rather poor due to the inappropriate sensors. Nonetheless, this pilot experiment demonstrated the value of such data for understanding the behaviour of ridges and dynamic processes.
3.1. GEODIS: Geophysical Diving Saucer

In order to be efficient and achievable, a portable broad band (BB) seismic network must be easy to install and to recover, must have a low power consumption and must be reliable. If we imagine a network of at least 100 instruments, the main limitation of such a network will be its financial cost and the associated expenses in terms of sea campaigns and maintenance. Different institutions around the world are working on the design of such a network. We detail here the project named GEOphysical Diving Saucer (GEODIS) of IPG Paris, which is developing an instrument fulfilling these requirements. On a platform of small diameter (0.93 m), different boxes including three-component broad band seismometers, power, the recording system and optionally environmental sensors (e.g. pressure, temperature, currentmeter) are interconnected (Fig. 5). The height is 43 cm and the architecture is designed to look like a saucer, in order to minimize the influence of bottom currents. The weight in water will be 140 kg (216 kg in air), and the round flat frame at its base ensures a good coupling with the sea floor. The power will be provided by lithium batteries and the consumption should be smaller than 1 watt, in order to ensure operation for at least one year. All components of GEODIS are designed to minimize their power consumption. The centerpiece of the instrument is the seismic sensor, which is derived from space technology (see next section). Such an instrument can be connected to other platforms through a cable. A first experiment will take place in autumn 2000 off Ustica Island in the Tyrrhenian Sea (Gasparoni et al., this volume). GEODIS will be connected to the GEOSTAR platform [20]. This experiment is intended to test the feasibility of the GEODIS concept and to make a quantitative comparison of the seismic noise with the CMG-3T seismic sensor installed on the GEOSTAR platform by the ING team.
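The sub-1-watt budget quoted for GEODIS translates directly into a lithium-battery mass requirement; a rough check (the cell energy density is our assumption, typical of lithium primary cells, not a GEODIS specification):

```python
# Energy needed for one year of operation at the quoted power budget.
power_w = 1.0
hours_per_year = 365 * 24
energy_wh = power_w * hours_per_year  # watt-hours for one year

# Assumed energy density for lithium primary cells (order of a few
# hundred Wh/kg; an assumption, not a measured GEODIS figure).
wh_per_kg = 400.0
battery_mass_kg = energy_wh / wh_per_kg
print(f"{energy_wh:.0f} Wh -> about {battery_mass_kg:.0f} kg of cells")
```

A few tens of kilograms of cells fits comfortably within the 216 kg air weight of the platform, which is why the 1 W target makes a one-year autonomous deployment plausible.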
GEODIS can be installed in two different ways: either as an autonomous station dropped overboard, or by using an installer. In the first case, it is necessary to add a weight acting as ballast, connected to the bottom of GEODIS with an acoustic release. To obtain a positive weight balance, 6-8 Benthos spheres can be assembled as a toroid. The spheres can be used to store enough Li batteries to
operate for at least one year. A solid state mass memory similar to the ones designed for a small station on planet Mars (8 Gbytes) can be used inside the cylinder. In the second case, a "universal" installer must be designed. It is connected either to a classical cable or to a mixed cable with optical fiber and copper wire. The extremity of the cable is connected to the installer, which includes standard submarine lights and a video camera in order to select a convenient site for the installation and to recover GEODIS at the end of the experiment. A simple device enabling communication between the installer and the station will be implemented, in order to check from the vessel the quality of the installation and the good health of GEODIS. The same device can be used during the recovery to correctly stop all cycles and to lock the different moving parts of the seismometer. The choice between these two options is dictated only by cost: the second option is more expensive, but should ensure a higher reliability of the installation. The installer can even be improved by adding a digger to bury or semi-bury the sphere of the seismometer.
Figure 5. GEODIS: GEOphysical Diving Saucer designed by the Technical Team of OFM.
3.2. The sensor

The present technology does not make it possible to use a very broad band (VBB) seismometer for the automatic or robotic installations required for the deployment of seismometers in hostile environments: the ocean bottom, boreholes and comparably hot environments, cold locations such as Antarctica, or the surfaces of the other telluric planets of the solar system. Less sensitive seismometers, like the STS2 or CMG3, are then used, even when the site noise is comparable to, or even lower than, the levels recorded in seismic vaults. An improvement of the global network VBB instruments, in sensitivity, miniaturisation and automatic installation capability, is therefore highly desirable. This improvement will be reached through the development in France of a new VBB 3-axis instrument, derived from the prototype presently being designed for future planetary missions. This seismometer is the second generation of space-qualified seismometer, after the OPTIMISM seismometer, which was onboard the two Small Surface Stations of the Russian mission MARS96, launched in November 1996, which unfortunately failed and was swallowed up by the South Pacific Ocean. Both sensors were designed to survive high g-loads (up to 350 g during 10 ms, where g is the acceleration of gravity). The seismometer which will equip GEODIS is the simplified terrestrial version of the SEIS-NL seismometer developed to monitor Martian seismic activity within the framework of the NETLANDER mission [21], planned for 2005. It is under development through a CNES R&T Program, and its terrestrial version could be developed by the SODERN company in cooperation with IPG Paris (Fig. 6). The external part of the package is efficiently thermally insulated. Optional equipment, composed of a thermal/current shield and an inclinometer, will be available to reduce installation cost. Cost issues will, however, forbid the use of such ultra-sensitive sensors in very dense portable networks.
Moreover, even though field tests have proved that very low noise levels can be reached with surface installations [23], the micro-seismic noise of seismometers deployed at the surface or in small holes is generally one order of magnitude greater, which allows, without great loss, the use of noisier but much cheaper miniaturised seismometers. Such instruments will probably allow the development of seismic stations based either on a "bury and forget" or a "drop and forget" philosophy. The "bury and forget" philosophy may be used after a major earthquake in order to rapidly deploy thousands of sensors on a regional scale, for an operation time of a few weeks. The "drop and forget" strategy makes it possible to deploy seismic stations from the sea surface, using mini-penetrators, developed by the Geko-Prakla company for their internal use, able to penetrate the ground for a better seismic coupling.

3.3. Scientific targets: plumes, seismogenic zones

The scientific targets of such broad band ocean bottom portable arrays are numerous. Most of the active processes in the Earth take place below the oceans or at the boundary between oceans and continents. The manifestations of these active processes can induce catastrophic events,
Figure 6. Very broad band (VBB) seismometer, designed for future Martian missions [24].

in the form of strong earthquakes or volcanic eruptions. Generally, they are related to large scale material flow in the mantle: upwellings, downwellings, or even differential flows on both sides of a transform fault (the North Anatolian fault and the San Andreas fault, for example). In subduction zones, the installation of seismic stations on the oceanic side might enable us to understand the interaction of the subducting plate with the overlying plate, the role of fluids, and the processes involved in the digestion of sediments and oceanic crust. The most advanced projects are those of the Japanese, where a few ocean bottom stations, connected to the seashore by cables, are already operating. Such a project also exists in France, devoted to the investigation of the structure of the Antilles Arc, in order to understand the relationship between the seismogenic zone and the volcanic belt. The recent earthquake in Turkey (Aug. 17, 1999) draws attention to the part of the North Anatolian fault located below the Marmara Sea, just south of Istanbul, where the next damaging seismic event might occur within the next 20 years [25]. As for the upwellings, their surface manifestations, mid-ocean ridges and plumes, are usually less dramatic, but their structure is not well understood. Plumes are probably the most mysterious geological objects. Their origin at depth (asthenosphere, transition zone between
400 and 1000 km, D"-layer) is still debated. Their geodynamic role in the opening of ridges, in plate reorganization and in biological crises is probably the most exciting scientific issue of the next decade. Several ongoing initiatives in Europe propose to address these issues. A project named MAGIA (Multiscale Approach of Geohazards Investigation in the Azores) is going to be proposed to the European Community. The proposal, foreseeing coordination by CGUL (Lisbon, Portugal) and the involvement of French, German, Italian and Spanish groups, proposes a multiscale approach in order to find the origin at depth of the Azores plume, and to understand its interaction with the mid-Atlantic ridge and the associated geohazards. Figure 7 presents the area of investigation, where a regional broadband network on the seafloor and on the islands will be complemented by dense, classical OBS deployments along lines. These two scales of study should enable us to relate the regional scale heterogeneities (larger than 50 km) to short scale heterogeneities (smaller than 50 km). At an even smaller scale, some seismic reflection experiments should shed some light on the crustal
Figure 7. The MAGIA experiment (Multiscale Approach of Geohazards Investigation in the Azores) is a European project involving CGUL (Lisbon, Portugal) and several institutions in France, Germany, Italy and Spain.
structure between the Azores and the Mid-Atlantic Ridge. All these projects will be coordinated under a larger umbrella, the project Monitoring of the Mid-Atlantic Ridge (MOMAR). Another ambitious program, devoted only to the investigation of oceanic plumes, was launched in 2000 under the initiative of GEOMAR (Kiel, Germany), IPG (Paris, France) and the University of Cambridge (UK). In order to achieve its scientific objectives, the Plume Oceanic Project plans to develop a network of 100 broad band ocean bottom stations. A first target might be the La Reunion plume in the Indian Ocean.
4. DISCUSSION: LONG-TERM OBSERVATORIES VERSUS TEMPORARY STATIONS

The technological issues regarding long-term observatories and temporary stations are quite similar. However, the technological developments depend largely on the period of operation of the station: a long-term geophysical observatory is much more difficult to maintain than a temporary ocean bottom station, due to the problems of power supply, failures, and data retrieval and transmission. The scientific purposes are different as well, though complementary. Long-term observatories can be used as reference stations and as nodes for short-term experiments with a dense coverage of stations. The best solution for ensuring long-term operation is the installation of cables, and the type of installation depends on the distance from the shore. For a short distance from the coast (less than 200 km), the cables can be laid down on the seafloor between the station and the shore. For larger distances (more than 200 km), the solution consists of installing a moored buoy, connected through a vertical cable which ensures the link between the buoy at the sea surface and the sea floor. Such an ocean floor observatory, collecting and teletransmitting data in real time, is the only way to operate on a continuous and long-term basis. Power can be provided by solar panels, diesel generators or windmills.

4.1. General strategy

In spite of the many advances made by American, Japanese and French groups, many technical problems are still pending. Due to the high cost of GOBOs, the ongoing plan consists of trying to associate the different geophysical communities interested in long-term ocean bottom observations. Such a coherent strategy should enable scientists from different fields to develop a synergy by coordinating their efforts.
A GOBO must be able to address several scientific issues at a global scale and fulfill different scientific constraints; it will therefore need several sensors, which must be independent in order to constitute scientific modules. The concept of a modular observatory provides a large degree of freedom to the different scientific groups in the design of their specific sensors, and should make the maintenance of the system easier. The notion of standardization is also intimately associated with the concept of independent modules: the different scientific modules have to be connected to a central module by standard connectors.
The central common module is the brain and the heart of the GOBO; the whole observatory is organized around it. It is composed of several units having different functions and providing different facilities: power supply, data storage, sending commands to each dedicated module, communication with the outer world through a cable connected to a buoy at the surface, and teletransmission to data centers. The cable between the ocean floor and the surface makes it possible to provide data on a timely basis and, if needed, to send commands to the different modules from a vessel. From a technical point of view, the observatory shares common modules (power, dataloggers, recording systems, data transmission) which should be very versatile and are almost independent of any particular scientific need. A first step should be achieved in the GEOSTAR-2 project, where messengers carrying Gbytes of data can be automatically released from the bottom station up to the sea surface and recovered on a regular basis. The design of a GOBO must therefore follow the basic philosophy presented previously of multidisciplinarity and modularity. We detail in the next section what might be the scientific interest of multiparameter stations. Such a philosophy is not necessarily the best one for temporary deployments. The multiparameter concept tends to make the design of the station, its installation (use of manned submersibles, ROVs or AUVs) and its recovery during sea campaigns more complex; thus, it tends to multiply the causes of failure. In order to reduce the cost of a large number of non-permanent stations, we can accept a lower degree of reliability than for permanent observatories. Therefore, it is likely that for portable ocean bottom arrays each station must be dedicated to one sensor, seismic or electromagnetic. However, the recording of environmental parameters such as pressure, temperature and currents is highly desirable.
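The gigabyte-scale data volumes carried by the GEOSTAR-2 messengers are plausible for continuous broad band recording; a back-of-the-envelope sizing (the channel count, sampling rate and sample width are assumptions for illustration, not GEOSTAR specifications):

```python
# Assumed recording parameters for one three-component broad band sensor.
channels = 3
samples_per_s = 20          # continuous broad band rate (assumed)
bytes_per_sample = 4        # 24-bit samples stored in 32-bit words (assumed)
seconds_per_day = 86400

bytes_per_day = channels * samples_per_s * bytes_per_sample * seconds_per_day
gb_per_year = bytes_per_day * 365 / 1e9
print(f"{bytes_per_day / 1e6:.1f} MB/day, {gb_per_year:.1f} GB/year uncompressed")
```

Roughly 20 MB/day, or under 8 GB/year uncompressed, which is consistent with the 8 Gbyte solid state memory mentioned for GEODIS and with periodic messenger-based recovery of Gbyte batches.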
4.2. Multiparameter stations

Since the beginning of the nineties, the concept of multiparameter stations has been emerging. Firstly, it enables us to improve the signal-to-noise ratio or, equivalently, to separate the signal due to seismic waves from the signal due to the fluid envelopes of the Earth (ocean and atmosphere). Secondly, the purpose is, in a sense, more empirical and regards the investigation of correlations between independent physical parameters relevant to a complex scientific problem (earthquake prediction and, more generally, active processes). For example, the multiparameter observations around the Gulf of Corinth show a nice correlation on land between seismic anisotropy and magnetic anisotropy [26]. Thirdly, the purpose is to realize economies of scale by allowing the sensors required by different disciplines to use the same power source, recording and data transmission systems, and to simplify the organization of operations and maintenance. For the first kind of application, some progress has been made and preliminary results were obtained at the GEOSCOPE stations of SSB (Saint-Sauveur-en-Badole, France) and TAM (Tamanrasset, Algeria). Since 1989, microbarometers have been installed at SSB, where two sets of STS1 seismometers [27] were present. The complete design of the experiment is presented in Beauduin et al. [3]. It is observed that the microbarometric pressure is correlated with the horizontal acceleration components of the seismic noise. The vertical component
Figure 8. Excitation of normal modes in the absence of earthquakes. Top: before pressure correction. Bottom: after pressure correction [4].
of seismic noise is only correlated with pressure at long periods (longer than 500 s). After selecting a seismic record without earthquakes, it is possible to calculate the transfer function (amplitude and phase) between pressure and seismic noise. It is then possible to remove the effect of pressure from the seismic signal. Montagner et al. [28] present an example of such a calculation, where two earthquakes hidden in the noise are clearly observed after correction. However, Beauduin et al. [3] show that the correlation between pressure and the seismic signal is not systematic, and depends largely on the quality of the installation of the sensors. This means that in a conventional land-based station, where there is easy access to the sensors, the simultaneous recording of pressure and seismic signal can only serve as an indicator of a problem in the installation. The same kind of approach was followed by Roult and Crawford [4], who showed that it is possible to detect more clearly the free oscillations of the Earth in the absence of earthquakes by subtracting the effect of the atmospheric pressure (Fig. 8). Multicomponent seafloor instruments have also proven useful for improving the seismic signal and for determining crustal structure independently of seismic sources. Crawford et al. [29] use autonomous seafloor packages containing a broad band seismometer and a differential pressure gauge to measure the seafloor deformation under pressure forcing by ocean waves (Fig. 9). They invert the deformation/pressure transfer function to determine crustal structure and, in particular, to detect and quantify low shear velocity regions such as magma chambers and regions of high hydrothermal circulation.
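The pressure correction described above amounts to estimating the transfer function between the pressure channel and the seismic channel, then subtracting the predicted, pressure-driven part. A minimal sketch on synthetic data, assuming for simplicity a single frequency-independent coupling coefficient (the real correction is computed per frequency band, with amplitude and phase; all signal values here are invented):

```python
import math
import random

random.seed(0)
n = 2000
# Synthetic long-period pressure signal, and a vertical seismic channel
# containing a scaled copy of it plus incoherent instrument noise.
pressure = [math.sin(2 * math.pi * t / 500.0) for t in range(n)]
coupling = 0.8  # "true" pressure-to-seismic coupling (arbitrary units)
vertical = [coupling * p + 0.1 * random.gauss(0, 1) for p in pressure]

# Least-squares estimate of the coupling coefficient (zero-mean signals).
num = sum(v * p for v, p in zip(vertical, pressure))
den = sum(p * p for p in pressure)
c_hat = num / den

# Subtract the coherent (pressure-driven) part from the vertical channel.
corrected = [v - c_hat * p for v, p in zip(vertical, pressure)]

def rms(s):
    return math.sqrt(sum(x * x for x in s) / len(s))

print(f"c_hat = {c_hat:.2f}, RMS before = {rms(vertical):.3f}, "
      f"after = {rms(corrected):.3f}")
```

The same least-squares subtraction, with the horizontal components as reference channels instead of pressure, underlies the tilt-noise removal applied to seafloor sensors by Crawford et al. [29].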
The compliance signal is generally between 0.002 and 0.05 Hz, requiring instruments sensitive to lower frequencies than typical OBS seismometers and hydrophones. They use LaCoste-Romberg vertical gravimeters or Streckeisen STS-2 broad band seismometers as the acceleration sensor, and differential pressure gauges [30] to measure the pressure variations. These instruments have also been used to demonstrate that low-frequency seafloor seismic noise can be reduced by subtracting this "compliance" signal [31] and that, in the case of a nonstable emplacement, a significant low-frequency tilt noise signal can be removed from the vertical component either by precisely leveling the seismometer or by subtracting coherent horizontal noise from the vertical signal [29]. In the future, the compliance sensors will be constructed using broad band seismometers (either the STS-2 or a Guralp CMG-3) and a combined differential pressure gauge/hydrophone sampled at 50 Hz, to create a truly broad band instrument sensitive to high-frequency processes such as airgun shots and microearthquakes as well as to low-frequency information such as seafloor compliance and teleseismic earthquake arrivals. These results demonstrate the utility of this simultaneous recording for stations installed in hostile environments (planet Mars, ocean bottom, drill holes), where it is almost impossible to check and modify the installation of seismometers on a regular basis. The next question is: what should an ideal standard multiparameter station be, and can this concept be adapted to portable stations? A multiparameter station classically includes broad band seismometers, high frequency seismometers, a microbarometer, a microthermometer, electromagnetic sensors, and a GPS receiver. According to local requirements, it should be possible to add strainmeters, gravimeters, and any kind of environmental modules and geochemical sensors. Such a philosophy was followed in the design of the GEOSTAR platform [20],
Figure 9. Compliance sensor: a) description of the experiment; b) transfer function between the acceleration and the sea surface displacement.
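The restriction of the compliance band to very low frequencies follows from linear surface-wave theory: the pressure that a surface gravity wave exerts on the seafloor is attenuated by a factor 1/cosh(kh), so at full ocean depth only the longest-period waves load the bottom measurably. A rough numerical check (the 5000 m depth is chosen for illustration):

```python
import numpy as np

def wavenumber(f, h, g=9.81):
    """Solve the linear dispersion relation (2*pi*f)**2 = g*k*tanh(k*h)
    by damped fixed-point iteration."""
    w2 = (2.0 * np.pi * f) ** 2
    k = w2 / g  # deep-water starting guess
    for _ in range(200):
        k = 0.5 * (k + w2 / (g * np.tanh(k * h)))
    return k

h = 5000.0  # water depth in m (illustrative deep-ocean value)
for f in (0.002, 0.01, 0.05):
    k = wavenumber(f, h)
    atten = 1.0 / np.cosh(k * h)  # seafloor pressure attenuation factor
    print(f"{f} Hz: attenuation {atten:.3g}")
```

At 0.002 Hz the surface-wave pressure reaches the seafloor almost undiminished, while by a few hundredths of a hertz it is attenuated by many orders of magnitude; the usable upper edge of the compliance band therefore shrinks with increasing water depth.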
but the limitations of such a station are its cost, its weight and its inherent complexity. It is likely that a reasonable approach might be to co-locate different dedicated stations, following the procedure used during the MOISE experiment [11,13].
4.3. Temporal scale
According to the previous discussion, the main difference between GOBOs and portable stations concerns the period of operation. An important benefit of long-term observatories results from the long time series of signals they provide. Observatories enable us to investigate time-dependent Earth processes. With the recent, controversial discovery of the differential rotation of the Earth's inner core [33,34], a time-dependent seismology is emerging. The compilation of past magnetic observations since the 18th century has made it possible to investigate the secular variations of the magnetic field and to gain insight into the flow in the outer core at the core-mantle boundary. Since most sensors record present physical fields (magnetic, seismic, gravity fields), they only provide information on the instantaneous Earth, by comparison with the geological time scale. If we want to study the time evolution of phenomena, the upper bound of the temporal scale is the human lifetime or, in the best case, the history of mankind. Therefore, for time scales larger than thousands of years, the paleosciences (e.g. paleomagnetism, geology, geochemistry) will still be necessary for providing invaluable information on the evolution of the Earth system. The future networks will only be able to record data on time scales around a century. However, even at this time scale, there are very interesting geophysical phenomena, such as earthquake and volcanic cycles. These examples demonstrate the necessity of installing a global network of ocean bottom observatories which will have to work on a long-term basis (several decades).

4.4. Multiscale approach
As advocated previously, scientific issues involve many spatial scales. The consequence of this statement is that it is necessary to investigate and relate these different scales and, therefore, to have geophysical networks at different scales.
The basis of such a network, built on the concept of a hierarchical or multiscale network, was presented in Montagner et al. [28]. The goal of this network is to provide observations at all spatial scales, with a uniform distribution of sub-networks. It can be easily understood that such a perfect network leads to an exponential increase in the number of stations and is rather unrealistic. In order to solve this problem, the selection of nodes can be dictated by scientific needs and interests. The community investigating active processes will propose to install local and mini-scale networks where active processes can provide the most valuable observations, i.e. in tectonic areas (along plate boundaries, or for investigating intraplate volcanism or oceanic plumes). As for the instruments in the stations, it is unlikely that there will be only one kind of sensor for the whole network. Since there is a direct relationship between wavelength λ and frequency f, it is obvious that, as the scale decreases, sensors must be sensitive to higher frequencies. However, due to technological progress, the number of instrument types might be quite limited (two or three types of sensors). For example, in seismology, broad band seismometers can cover the frequency range 0.001-80 Hz and industrial short period
seismometers can span higher frequencies. So far, there are networks at all scales, but the important point regards the coordination of the different networks, which have to share the same philosophy on the standardization of sensors, data format (such as the SEED format) and data distribution. The complete coverage of the global scale is presently ongoing with the extension of the global network of the FDSN towards the ocean, thanks to ION. This global network should enable us to investigate the structure of the whole Earth down to scales around 1000 km. The scales between 10 km and 1000 km can be investigated by portable broad band seismic networks, which already exist on land and are under development for oceanic areas.
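The scale/frequency trade-off above follows directly from λ = v/f. A quick illustration, taking an assumed representative seismic wave speed of 4 km/s (the exact value depends on wave type and depth, and is only illustrative):

```python
v = 4000.0  # assumed seismic wave speed, m/s (illustrative)
for f_hz in (0.001, 0.01, 1.0, 80.0):
    wavelength_km = v / f_hz / 1000.0
    print(f"{f_hz} Hz -> wavelength ~ {wavelength_km:g} km")
```

So the 0.001-80 Hz band of a broad band seismometer spans wavelengths from thousands of kilometres down to tens of metres, which is why a small number of sensor types can serve the whole hierarchy of network scales.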
5. CONCLUSION
The networks which are presently under development must be considered as the instrument of the whole geoscience community. The design of the future network will have to present an even distribution of stations at all scales and to make the whole dataset available on a timely basis. It includes not only the sensors but the complete chain, from acquisition to data storage and distribution. This whole chain has to be designed in a coherent manner, in order to save time and money. If the geoscience community agrees on this concept of a hierarchical (or equivalently multiscale) network, a global strategy must be defined, and the networks at all scales have to be coordinated. The part of the world where most of the scales are poorly investigated is the ocean. The priority in the next years might be to achieve the global scale coverage of the Earth and the development of a portable ocean-bottom array with at least 100 stations. The efforts necessary to achieve such a multiscale network represent a formidable technological challenge for the next decade.
ACKNOWLEDGMENTS
This work was partly supported by INSU-OFM (1999). Seismological Laboratory, URA/CNRS 7580, Institut de Physique du Globe, Paris.
REFERENCES
1. B. Romanowicz and A.M. Dziewonski, EOS, 67 (1986) 541.
2. J.-P. Montagner and Y. Lancelot (eds.), Multidisciplinary Observatories on the Deep Sea Floor, I.O.N. Workshop, Marseilles, 1995.
3. R. Beauduin, P. Lognonné, J.P. Montagner, S. Cacho, J.F. Karczewski and M. Morand, Bull. Seism. Soc. Am., 86 (1996a) 1760.
4. G. Roult and W. Crawford, Phys. Earth Planet. Int., 121 (2000) 325.
5. K. Suyehiro, in: J.-P. Montagner and Y. Lancelot (eds.), Multidisciplinary Observatories on the Deep Sea Floor, I.O.N. Workshop, Marseilles, 1995.
6. K. Suyehiro, T. Kanazawa, N. Hirata, M. Shinohara and H. Kinoshita, Proc. ODP, Scientific Results, 127/128 (1992).
7. T. Kanazawa, K. Suyehiro, N. Hirata and M. Shinohara, Proc. ODP, Scientific Results, 127/128 (1992).
8. J.P. Montagner, B. Romanowicz and J.F. Karczewski, EOS, Trans. AGU, 75 (1994a) 150.
9. J.P. Montagner, J.F. Karczewski, B. Romanowicz, S. Bouaricha, P. Lognonné, G. Roult, E. Stutzmann, J.L. Thirot, D. Fouassier, J.C. Koenig, J. Savary, L. Floury, J. Dupond, A. Echardour and H. Floc'h, Phys. Earth Planet. Int., 84 (1994b) 321.
10. R. Beauduin, J.P. Montagner and J.F. Karczewski, Geophys. Res. Lett., 24 (1996b) 493.
11. D. Stakes, B. Romanowicz, J.P. Montagner and P. Tarits, EOS, Trans. AGU, 79 (1998) 301.
12. B. Romanowicz, D. Stakes, J.P. Montagner, P. Tarits, R. Uhrhammer, M. Begnaud, E. Stutzmann, M. Pasyanos, J.F. Karczewski, S. Etchemendy and D. Neuhauser, Earth Planet. Sci., 50 (1998) 927.
13. E. Stutzmann, W. Crawford, J.L. Thirot, J.P. Montagner, P. Tarits, D. Stakes, B. Romanowicz, A. Sebai, J.F. Karczewski, D. Neuhauser and S. Etchemendy, Bull. Seism. Soc. Am., (2000) submitted.
14. G.M. Purdy and J.A. Orcutt (eds.), Broad Band Seismology in the Oceans: Towards a Five-Year Plan, Ocean Seismic Network/Joint Oceanographic Institutions Inc., Washington, D.C., 1995.
15. S.C. Webb, Rev. Geophys. Space Phys., 36 (1997) 105.
16. R. Stephens, J.A. Collins, J.A. Hildebrand, J.A. Orcutt, K.R. Peal, F.N. Spiess and F.L. Vernon, EOS, Trans. AGU, 1999.
17. R. Butler, A.D. Chave, F.K. Duennebier, D.R. Yoerger, R. Petitt, D. Harris, F.B. Wooding, A.D. Bowen, J. Bailey, J. Jolly, E. Hobart, J.A. Hildebrand and A.H. Dodeman, EOS, Trans. AGU, 81 (2000) 157.
18. R.D. Van der Hilst, B.L.N. Kennett, D. Christie and J. Grant, EOS, 75 (1994) 177.
19. D.W. Forsyth and the MELT Seismic Team, Science, 280 (1998) 1215.
20. L. Beranzoli, A. De Santis, G. Etiope, P. Favali, F. Frugoni, G. Smriglio, F. Gasparoni and A. Marigo, Phys. Earth Planet. Int., 108 (1998) 175.
21. P. Lognonné and the NETLANDER team, Planet. Space Sci., 48 (2000) 1289.
22. Coste
23. P. Lognonné, J. Gagnepain-Beyneix, W.B. Banerdt, S. Cacho, J.F. Karczewski and M. Morand, Planet. Space Sci., 44 (1996) 1237.
24. S. Cacho, Étude et réalisation d'un prototype de sismomètre très large bande, 3 axes, qualifié spatial, Thèse, Université Paris VII, 1996.
25. A. Hubert-Ferrari, G.C.P. King, A. Barka, E. Jacques, S. Nalbant, B. Meyer, R. Armijo, P. Tapponnier and G. King, Nature, 404 (2000) 269.
26. P. Bernard, G. Chouliaras, A. Tzanis, P. Briole, M.-P. Bouin, J. Tellez, G. Stavrakakis and K. Makropoulos, Geophys. Res. Lett., 24 (1997) 2227.
27. E. Wielandt and G. Streckeisen, Bull. Seism. Soc. Am., 72 (1982) 2349.
28. J.P. Montagner, P. Lognonné, R. Beauduin, G. Roult, J.F. Karczewski and E. Stutzmann, Phys. Earth Planet. Int., 108 (1998) 155.
29. W.C. Crawford, S.C. Webb and J.A. Hildebrand, J. Geophys. Res., 104 (1999) 2923.
30. C.S. Cox, T. Deaton and S.C. Webb, J. Atmos. Oceanic Technol., 1 (1984) 237.
31. S.C. Webb and W.C. Crawford, Bull. Seism. Soc. Am., 89 (1999) 1535.
32. W.C. Crawford and S.C. Webb, Bull. Seism. Soc. Am., 90 (2000) 952.
33. X.D. Song and P.G. Richards, Nature, 382 (1996) 221.
34. Dziewonski
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century. L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.

H2O: The Hawaii-2 Observatory

A. D. Chave a, F. K. Duennebier b, R. Butler c, R. A. Petitt, Jr. a, F. B. Wooding d, D. Harris b, J. W. Bailey a, E. Hobart a, J. Jolly b, A. D. Bowen a and D. R. Yoerger a

aDepartment of Applied Ocean Physics and Engineering, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
bSchool of Ocean and Earth Science and Technology, University of Hawaii at Manoa, Honolulu, HI 96822, U.S.A.
cIncorporated Research Institutions for Seismology, 1200 New York Ave. NW, Suite 800, Washington, DC 20005, U.S.A.
dDepartment of Geology and Geophysics, Woods Hole Oceanographic Institution, Woods Hole, MA 02543, U.S.A.
A permanent deep ocean scientific research facility, the Hawaii-2 Observatory or H2O, was installed on the retired HAW-2 commercial submarine telephone cable in mid-1998. H2O consists of a seafloor submarine cable termination and junction box in 5000 m of water, located halfway between Hawaii and California. The H2O infrastructure was installed from a large research vessel using the Jason ROV and standard over-the-side gear. The junction box provides two-way digital communication at variable data rates of up to 115 kbit/s using the RS-422 protocol, and a total of 400 watts of power for both junction box systems and user equipment. Instruments may be connected by an ROV to the junction box at 8 wet-mateable connectors. The H2O junction box is a "smart" design which incorporates redundancy to protect against failure, with full control of instrument functionality from shore. Initial instrumentation at the H2O site includes broad band seismometer and hydrophone packages.
1. INTRODUCTION
The Hawaii-2 (HAW-2) submarine telephone cable was laid in 1964 between San Luis Obispo, California and Makaha, Oahu, Hawaii. It is a second generation vacuum tube repeater (AT&T SD series) analog system [1] which continued in service until 1989, when a cable break off California led to its retirement from commercial service. In 1996, the entire HAW-2 wet plant was acquired by the Incorporated Research Institutions for Seismology (IRIS) from AT&T on behalf of the US scientific community. H2O was installed close to the midpoint between two repeaters (which are spaced 20 nm apart) near 28°N, 142°W at about 5000 m water depth (Fig. 1). The lithosphere west of 140°W in this area was formed between the Pacific and Farallon plates under normal spreading conditions, but at a fast half-rate of 7 cm/year [2,3]. The crustal age based on magnetic lineations is about 45 Ma (isochron 20), or mid-Eocene. The regional physiography is one of
abyssal hills with a nominal but variable 50-100 m cover of terrigenous clay sediment. The local relief around the H2O junction box is quite subdued; a deep-towed survey for 1 km around that point reveals no rock outcrops and very gentle relief of a few tens of meters on a smoothly sedimented bottom. The science drivers behind the installation of H2O are primarily in global geophysics, especially seismology. The H2O site is located at a point on Earth's surface where there is no land for about 2000 km in any direction. For this reason, it is a high priority site for the Ocean Seismic Network (OSN) component of the Global Seismic Network (GSN) [4], and serves as the first operational OSN station. While the present H2O seismometer utilizes a buried broad band sensor, the Ocean Drilling Program (ODP) is scheduled to drill a re-entry borehole close to the H2O site at the end of 2001 for subsequent installation of a downhole seismometer. The H2O site is also one of eight seafloor locations identified by the geomagnetic community where permanent observatories are required [5]. Finally, H2O is located at a logistically convenient place for testing permanent seafloor instrumentation and observatory concepts in the deep ocean.
Figure 1. The Hawaii-2 Observatory is sited 1750 km east-northeast of Honolulu. The path of the Hawaii-2 cable runs between Oahu and Kauai, then heads northeast toward California.
2. SHORE INSTALLATION
Originally, HAW-2 was powered from both the California and Hawaii ends of the system using opposite polarity power supplies and a telecommunications-standard seawater return. In 1992, the California end of HAW-2, from the shore station out to deep water, was removed by AT&T, and hence H2O was designed to be a single-end system powered from Hawaii with a local sea ground. Because the shore plant had been removed from the Makaha cable station, this also required the reinstallation of the high voltage power supply and high (radio) frequency line equipment. In the original HAW-2 installation, Makaha was a B terminal with a negative power supply operating at a nominal 5000 V and low band (150-550 kHz) receive/high band (650-1050 kHz) transmit, yielding a total of 138 analog telephone channels, each occupying 3 kHz, using frequency domain multiplexing. The original performance characteristics of the system were maintained at re-installation by using actual SD shore equipment previously salvaged from a commercial installation on Guam. The power supply was reduced from quad to dual redundancy and the high frequency line was modernized by replacing vacuum tubes with solid state devices, but all passive components remain the same, and hence the equalization properties of the system have not been altered. In fact, system tests after shore plant reinstallation showed that repeater performance was not substantially different from the initial values measured in 1964. In addition to the analog high frequency line, it was necessary to install modulation/demodulation equipment and associated computing hardware at Makaha to provide a mirror image of the H2O junction box electronics, as described in the next section. This provides a series of digital data streams which are transmitted over a frame relay from Makaha to the University of Hawaii, and then onto the Internet.
One of the design goals of H2O was minimization of the need for local data archiving and instrument control by utilizing the Internet. In effect, instrument owners can control their bottom packages from anywhere on Earth.
3. THE H2O JUNCTION BOX
From the outset, it was deemed necessary to design a seafloor installation that minimizes the cost of instrument connection and provides a simple mechanical and electronic interface for scientific users. The former precludes the connection of dedicated in-line instruments using standard industry cable handling practices, both because of the high initial expense and because future instrument failures or upgrades would require a second costly recovery and re-installation. In addition, this approach deters future usage of the cable by the broad community due to the necessity for sophisticated and expensive installation tools. Instead, an approach was used which concentrates the electronic complexity in a seafloor junction box into which scientific users can simply plug their instruments using standard deep submergence assets, and which provides a comparatively simple digital communications interface as well as DC power. Finally, it was also required that the junction box be installed using a conventional, large oceanographic ship assisted by a remotely operated vehicle (ROV) rather than a specialized cable ship. The H2O junction box was designed with a number of goals intended to maximize reliability and flexibility.
Figure 2. Cartoon showing the H2O site. The termination frame is shown at the left, while the junction box is on the right. The sea ground is off to the right of the picture.
The mechanical aspects include (1) minimizing corrosion concerns, (2) maintaining compatibility with a generic ROV so that a specialized vehicle will not be necessary for future instrument installations, (3) simplifying the deployment, and (4) ensuring in situ stability. Figure 2 shows a cartoon of the mechanical layout of a seafloor installation that satisfies these criteria. The first condition is met by constructing the junction box of titanium alloys and plastic, thus avoiding the usual corrosion problems encountered with aluminum. The SD cable is terminated at a gimbal recovered from an SD repeater and attached to a titanium frame containing a wet-mateable underwater connector. Use of this termination frame allowed the SD cable to be lowered to the seafloor during installation without the complications which follow from attachment to the main H2O junction box, which is deployed separately. It also allows the junction box to be easily retrieved and serviced, either for upgrades or in the event of a failure. The SD cable is connected to the main junction box by a short (~30 m) oil-filled underwater-mateable umbilical. The power conditioning pressure case on the junction box contains a shunt regulator to extract power from the constant current SD system, and is terminated by a sea ground which is deployed far enough from the junction box to eliminate corrosion concerns. The junction box electronics pressure case contains all of the systems necessary to control the power to and communicate digitally with instruments, multiplex the digital data they produce, and transmit them to Makaha on the submarine cable. It also contains the control systems necessary to adjust the communications systems and control power to individual instruments.
The electronics pressure case is connected to an oil-filled connector manifold which provides eight ROV-compatible, wet-mateable 8-pin connectors, with four connectors on either side of the manifold. These provide an RS-422 communications interface with external instruments as well as 48-volt power. The connector manifold also houses two additional 4-pin connectors to which the termination frame and sea ground are attached. The connector manifold is designed to provide space for ROV access, and the connectors are
specifically intended to be compatible with a standard ROV manipulator, being based on the Ocean Design Nautilus family. The entire junction box sits on a broad weighted base and frame that protects vulnerable pressure cases and associated connectors, yet places the connector manifold well clear of the seafloor. The electronic systems for the H2O junction box were also designed to meet a set of criteria, including (1) making use of the available bandwidth on HAW-2 in an efficient manner, (2) accommodating both low and high data rate users, (3) making the main interface between plug-in instruments and the junction box digital, to simplify future design work and minimize noise, (4) protecting the cable system from interference or damage by users, (5) providing a down-link capability to control junction box and user instrument functions, (6) making use of commercial hardware whenever possible to minimize engineering costs, and (7) using high reliability design principles with failsafes and fallbacks in case of partial failures. The first two criteria preclude simply maintaining the existing frequency division multiplex (FDM) SD architecture and assigning channel space directly to individual instruments, as this makes inefficient use of the available bandwidth for low data rate instruments and may be inadequate for higher rate ones.
Figure 3. Block diagram of the H2O junction box electronics. See text for details.
The DC and RF components arriving from Hawaii on the SD cable are separated at a power separation filter (PSF) located in the oil-filled manifold. The DC component is sent to the power supply pressure case. Because the SD system operates in a constant current mode at 370 mA, it is necessary to produce a voltage drop across a stack of shunt regulators to extract power from the system. The shunt regulators must extract a fixed amount of power from the cable, dissipating that which is not used by the junction box or instruments as heat. There are eight shunt regulators in series, each providing a nominal 50 watts, so that about 400 watts is available to power both the junction box electronics and user instruments. When the cable is powered on, two stacks come up automatically to support the system bus, and additional stacks can be brought on line as needed to power the user bus. Both the system and user bus power are generated by DC-DC converters to give electrical isolation of the cable and junction box systems. Standard levels of 48 volts are sent to the main electronic pressure case. The power conversion electronics can be monitored and adjusted using a control computer which is completely independent of the remainder of the junction box, providing a degree of failsafe operation. The RF component is sent to the main electronic pressure case, where it is fed into the SD channel analog interface (Figure 3). The main electronic pressure case contains the remainder of the junction box systems for communications and control. The SD channel analog interface is a passive filter and summing network which separates the uplink (to H2O) high band and downlink (from H2O) low band signals, and further combines or splits the FDM spectrum in each band while matching the impedance of the SD cable.
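The shunt-regulator arithmetic implied above can be made explicit: in a constant-current system, the power each stack extracts fixes the voltage it must drop (P = V·I). The figures below are the nominal values quoted in the text:

```python
I_cable = 0.370      # SD cable constant current, A (from the text)
p_stack = 50.0       # nominal power extracted per shunt-regulator stack, W
n_stacks = 8

v_stack = p_stack / I_cable   # voltage drop each stack must develop
p_total = n_stacks * p_stack  # total power for junction box + users

print(f"per-stack drop ~ {v_stack:.0f} V, total ~ {p_total:.0f} W")
```

So each stack drops roughly 135 V, and the eight stacks together supply the ~400 W shared by the junction box electronics and user instruments; any power not drawn by a load must be dissipated as heat, exactly as the text states.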
Control of most junction box functions is provided through a system controller, which is a dual tone multiple frequency (DTMF) interface to the junction box and user power systems, and to the more fundamental electronic functions of the junction box. The system controller provides the ability to turn junction box subsystems on and off, select backup systems in the event of failure, halt and reboot junction box computer systems individually or collectively, and control power distribution to users. A separate system telemeters junction box electronic parameters, such as system and user bus voltages and currents or subsystem status information, back to shore on a dedicated channel. It also provides a ground fault detection capability to protect connectors and pressure cases from damage in the event that a component failure allows electric current to pass through the pressure case wall. Each user connector is protected against overcurrent by a foldback current limiter, preventing a single user from drawing more than about 50 watts. The heart of the junction box is a communications block consisting of an FDM interface, a set of five V.34 protocol modems, and a PC104 computer. Each of the five blocks provides bidirectional communications at up to 80 kbps using the standard RS-422 protocol. In addition to the five blocks, there are seven individual modem channels for low data rate users. A multiplexer or crosspoint switch allows a given block or modem to be connected to any of the six connectors under command of the system controller. The PC104 is a compact implementation of the PC bus which satisfies reduced space requirements and power constraints for embedded control applications. These computers serve to package and time stamp the data arriving from instruments and distribute the packets among four modems. Precise (about 1 ms) timing is provided by an IRIG board in each stack. The reverse function is performed for data arriving from shore.
Each modem occupies a 10-kHz dual sideband channel which is frequency-domain multiplexed onto a section of the downlink low or uplink high band. Each communications block occupies about 40 kHz of spectrum including guard bands, so the five blocks plus seven modem channels require about 270 of the available 400
kHz. The remaining approximately 130 kHz of bandwidth is held in reserve to accommodate future expansion. A mirror image of the junction box set of communications blocks and modems is connected to the high frequency line at Makaha. A computer system at Makaha provides full remote control of all multiplexing, monitoring, and stack control functions through a graphical user interface. It also separates the control and user data streams to enhance security.

4. INSTALLATION
The H2O junction box was installed in September 1998 using a large US oceanographic research vessel (R/V Thomas G. Thompson) and the Woods Hole Oceanographic Institution Jason ROV. The installation site was selected, based on prior site survey data, to be approximately halfway between the lay positions of repeaters to minimize the possibility of damage while handling the SD cable. The cable was first located visually using the ROV and found to be about 3/4 nm south of the nominal lay position (which is based on 1964 navigation capability). The ROV then transited along the cable for 5 km (one water depth) toward California. The cable was cut at this point using a hydraulic cable cutter on the vehicle. The Hawaii end of the cut cable was recovered using a flatfish grapnel attached to the oceanographic 9/16" trawl wire on the Thompson. The recovery point was 5 km toward Hawaii from the cut point, so that an equal amount of cable would hang from either side of the grapnel to avoid slippage and cable loss during retrieval. The cable recovery took about 24 hours, with trawl wire tension continuously near the operating limit of 24,000 lb. The vessel was maneuvered using its dynamic positioning system to minimize cable tension and maintain a nearly vertical cable throughout the operation. The cable was recovered to the fantail of the research vessel through a stern chute and secured to bitts on deck. It was then cut at the lift point so that the Hawaii end could be identified.
The remaining 5 km cable stub was discarded. The cable termination depicted in Figure 2, which had an SD pigtail already attached to its gimbal, was then spliced into the main cable using standard industry methods. The junction box next underwent a thorough alignment and final checkout on deck after the cable was repowered from the Makaha end. This required about 4 days of closely coordinated effort between shipboard and Makaha engineers, during which the cable was hung from the Thompson's stern A-frame and the ship maintained station using dynamic positioning. Once testing was complete, the junction box was unplugged from the cable termination, and the latter was maneuvered over the stern to be lowered to the seafloor on the end of the trawl wire through a set of acoustic releases. During this operation, the 1/2" chain holding the load on the acoustic releases failed, sending the cable and termination plunging to the seafloor. The resulting pile of cable was surveyed using a towed camera system, and the termination frame was found to be intact, upright, on top of the cable pile, and within 2 m of open seafloor. The system was tested by installing a shorting plug on the termination frame with Jason and powering it up. Functionality was found to be normal, although the final location of the termination frame was about 2 km west of the intended installation site. The junction box was lowered to the open seafloor north of the termination frame on an acoustically-navigated trawl wire through acoustic releases. Jason proceeded to hook up the umbilical cables linking the junction box to the termination frame and the sea ground, respectively. The system was tested and found to function correctly, but a total failure
occurred about 12 hours later, which necessitated recovery of the junction box. This was done by hooking a lift line dropped from the ROV depressor weight to the top of the junction box with Jason and recovering all 3 components (depressor, junction box, and ROV). The problem was quickly traced to contaminated oil in the manifold and repaired. The junction box was then re-installed. The primary sensor which was deployed at the H2O site is a ULF seismic system consisting of two packages. The main acoustic sensor package (ASP) houses most of the electronics along with both absolute and differential pressure sensors, a hydrophone, a temperature sensor, and a two component current meter. This package is connected to the H2O junction box with a short tether to provide communications and power. The ground motion sensor package (GMSP) is pulled away from the ASP by Jason and lowered into a caisson that the ROV had previously buried in the sediments using a hydraulic pump. The GMSP sensors include a Guralp CMG-3 broadband triaxial seismometer along with tiltmeters, a temperature sensor, and a leveling system. The sensor stage also
Figure 4. Earthquake recorded at the H2O seismic system on May 4, 2000. This magnitude 7.3 quake occurred at 04:21:33 Z at 1.41°S, 123.6°E, at a distance of 90° from the H2O observatory. The X and Y traces show movement in the horizontal direction, while the Z trace shows movement in the vertical direction. The hydrophone shows changes in pressure. The data have been filtered to display signals at frequencies below 0.12 Hz.
contains a mechanical shaker consisting of an offset cam on a motor that can be used to vibrate the package at frequencies from 5 to 60 Hz for calibration and coupling tests. Further calibration can be performed on command by sending specified inputs to the Guralp feedback coils. The Guralp sensor is digitized over both high and low gain ranges, effectively yielding 24-bit resolution, as is standard Global Seismic Network practice. Figure 4 shows a sample seismogram from the broad band sensor package. About 2 months after installation in 1998, the seismic package failed due to catastrophic flooding of the current meter. In September 1999, the H2O site was visited to repair this package and install some upgrades to the junction box to improve performance. A separate high-frequency hydrophone was also added to the instrument suite. The entire system has performed well since this time. Seismic data from H2O are archived at the IRIS Data Management Center in Seattle, from which they are freely available to scientists around the world. Further plans for instrument installations at H2O include a benthic biology experiment and a seafloor geomagnetic observatory. A borehole seismometer will be installed in 2003 or 2004. It is anticipated that H2O will serve the scientific community for many years.
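The dual-gain digitization scheme can be illustrated with a short sketch. The gain ratio, full-scale count, and clipping threshold below are illustrative assumptions, not the actual H2O design values; the idea is simply that the high-gain channel resolves small signals while the low-gain channel preserves large amplitudes that would clip the high-gain range:

```python
def merge_gain_ranged(high_gain, low_gain, gain_ratio=256,
                      full_scale=2**15 - 1, clip_frac=0.9):
    """Combine two 16-bit digitizer streams of the same signal recorded at
    different gains into one extended-range series (values are illustrative)."""
    merged = []
    for h, l in zip(high_gain, low_gain):
        if abs(h) < clip_frac * full_scale:
            merged.append(h)               # high-gain sample: fine resolution
        else:
            merged.append(l * gain_ratio)  # near clipping: use scaled low-gain sample
    return merged
```

With a 16-bit converter and a gain ratio of 256, the merged series spans roughly the dynamic range of a 24-bit sample, which is the sense in which dual-gain recording "effectively yields" 24-bit resolution.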
The MBARI Margin Seismology Experiment: A Prototype Seafloor Observatory
Debra S. Stakes a, Barbara Romanowicz b, Michael L. Begnaud c, Karen C. McNally d, Jean-Paul Montagner e, Eleonore Stutzmann e and Mike Pasyanos f
aMBARI, 7700 Sandholdt Road, Moss Landing CA 95039-0628, U.S.A.
bUCB Seismographic Station, Berkeley, CA, U.S.A. cMBARI (now at Los Alamos National Laboratory, Los Alamos, NM 87545, U.S.A.) dUC Santa Cruz, Dept. Earth Sciences, Santa Cruz, CA 95064, U.S.A. eIPG, Dept. Seismology, Case 89, Paris 75252, France fUCB (now at Lawrence Livermore National Laboratory, Livermore, CA 94551, U.S.A.)
The MBARI Margin Seismology Experiment, conducted from 1996 through 1999, had both technical and scientific goals. The technical goals were to develop new sensors and methods for long-term seafloor geophysical observatories. The scientific goals were to constrain the seismicity of the major faults that crosscut the continental margin of Central California. The 1997 component of this project was MOISE (Monterey Bay Ocean Bottom International Seismic Experiment), an international cooperative pilot experiment that successfully deployed a suite of geophysical and oceanographic instrument packages on the ocean floor using MBARI's tethered remotely operated vehicle (ROV) Ventana. The goal of MOISE was to advance the global Seafloor Observatory effort through the development and installation of a prototype suite of instruments placed on the western side of the San Andreas fault system offshore of Central California. The centerpiece of the MOISE instrument suite was a digital broad band seismometer package partially buried within the sediment-covered floor of Monterey Bay. Several regional earthquakes of magnitude 3.5 and larger, as well as several large teleseisms, were well recorded during the three-month deployment. The seismic data from MOISE suggest that burial of the broad band sensor package in the continental margin sediments adequately reduces the noise from bottom currents such that both regional and teleseismic events can be usefully recorded, at least in the "low-noise notch" (a band of periods between 5 and 50 sec).
Both conventional and well-coupled, ROV-installed, short-period instruments were deployed in conjunction with the MOISE experiment. During 1998, an offshore network of five ROV-installed instruments was continuously deployed for 8 months. These MBARI "corehole" seismometers include short-period geophone packages mounted in an underwater housing that can be inserted into a 2.5" diameter borehole to provide improved mechanical coupling with the seafloor. The results of this field program demonstrate that placing instruments in offshore sites reduces azimuthal
gap and horizontal and vertical location errors, and provides more robust focal mechanism solutions to constrain seismicity on nearshore faults.
1. NEED FOR LONG-TERM SEAFLOOR OBSERVATORIES
The limited distribution of continents and islands around the world precludes adequate coverage by land-based geophysical observatories to address many important scientific issues related to plate tectonics and the deep structure and dynamics of Earth (e.g. [1]). Long-term seismic observatories on the ocean floor are necessary both for global dynamics studies of deep Earth structure and for regional active-process studies that focus on the seismicity, tectonics and hydrothermal and volcanic activity of Earth's crust (e.g. [2]). Continuous measurements from seafloor instruments also provide the ability to characterize episodic events, such as undersea volcanic eruptions and avalanches, as well as unrecognized linkages with biogeochemical processes which may not be noted with traditional expeditionary oceanographic approaches. Such long-term stations should be as low-noise, multidisciplinary and broad band as possible, with deployment periods of at least 5 years (e.g. [3, 4]). Arrays of multidisciplinary sensors that include both short- and long-period ocean bottom seismometers placed in accessible near-shore sites can be used to constrain critical regional processes that may have significant impact on heavily populated coastal areas. In northern California, existing public broad band and short-period stations are predominantly located on the eastern side of the North America/Pacific plate boundary, and the seismic activity on the offshore fault system related to this plate boundary is very poorly documented. The complex of active faults that crosscut the continental margin is considered part of the San Andreas system or relicts of the pre-San Andreas Oligocene plate reorganization. The San Gregorio (SGF) and Monterey Bay fault zones (MBFZ) are the major offshore faults in central California. Seismic activity along these structures has been correlated with the distribution of benthic cold seep communities [5, 6] and submarine mass wasting events [7].
These faults may pose an earthquake hazard for the adjacent populations from the Monterey Peninsula to Santa Cruz [8]. The estimated location, mechanism and size of moderate to large events associated with the SGF and the MBFZ are biased by the uneven distribution of seismograph stations. The historical catalogue of seismic events suggests that earthquakes of magnitude 4.0 have not been uncommon. For example, a magnitude 6.2 earthquake doublet was recorded for Monterey Bay in 1926. In addition, a recent analysis by the U.S. Geological Survey suggests that the northern San Gregorio is capable of a magnitude 7.0 event [8]. The sparse distribution of seismograph stations near Monterey Bay, combined with the absence of stations on the west side of the faults, created large errors in the determination of hypocenters and focal mechanisms for the characteristic moderate to small events (M<2.0). In addition, the levels of microseismicity or creep are unknown. During the 1997-1998 Monterey Bay Aquarium Research Institute (MBARI) Margin Seismology Project, a suite of three-component broad band and short-period seismometers was deployed in Monterey Bay to supplement the measurements made by the onshore seismograph network (Fig. 1). The initial results of this field program demonstrate that placing instruments in offshore sites reduces azimuthal gap, horizontal and vertical location errors, and focal mechanism uncertainties [9].
In addition, the use of offshore stations provides needed phase arrivals for events far offshore, contributes to velocity studies of the SGF and MBFZ [10], and offers unique opportunities for seismological studies that land-based instruments are unable to provide (i.e., studies of T-phase and marine mammal acoustics).
Figure 1. Bathymetric map of the Monterey Bay area showing the locations of instruments deployed during the MBARI Margin Seismology Program in 1997 and 1998. The MOISE instrumentation, including the broad band seismometer, EM sensor package and current meter/pressure gauge, was deployed at a depth of 1015 m at the site indicated by the star, station MOIS. Other instrumentation was placed near MOIS during 1997 to facilitate instrument comparisons (MA2C, MFRG, MSPW, MDUO, MCGR). During 1998, the short-period array was consistently reoccupied at
select sites (MDUO, MCGR, MHRS, MNSR, MPTP) to attain better azimuthal control over the known fault segments.
2. INSTRUMENTS AND METHODOLOGY
Although the scientific need for time-series data from seafloor instruments has been recognized for at least a decade, the technical limitations have proven to be daunting. Technical challenges common to different types of seafloor observatories include: (a) the optimal installation of sensors to maximize signal-to-noise; (b) a source of power for months to years of operation; and (c) timely data retrieval to confirm that the instruments are functioning properly. Conventional seismometers are deployed by releasing them over the rail of the ship, with minimal control over the specific site or attitude. This type of instrument requires an acoustic release and flotation for autonomous retrieval, both of which compromise the data quality by reducing the mechanical coupling between the seafloor and the instrument. Periods of observation are frequently limited to the duration of a single expedition or to the life of the batteries deployed with the instrument. Deployments longer than a few months are the exception. Longer deployments incorporate increasing risk, as an instrument malfunction will go undetected until the completion of the experiment. As a result, baseline phenomena are assumed to be extrapolations of results from limited observation periods. Major catastrophic or episodic events may remain completely undetected between the finite windows of observation.
2.1. The MOISE methodology
MOISE [11, 12] was the centerpiece of a series of proof-of-concept experiments conducted in Monterey Bay to develop an enhanced methodology for seafloor installations that exploits the capabilities of tethered ROVs. Although the instrumentation was deployed only for a few months, the strategies invoked for its installation emulated those required for a permanent deployment.
These included: in situ assembly of instruments using underwater connectors manipulated by submersibles, improved coupling of sensors by burial or installation into boreholes, repeated access to sensor data during the experiment, and the addition of external battery packs to extend the instrument deployment period [13]. The most complex instrument deployed for the MOISE experiment was a three-component broad band seismometer system, based on a Guralp CMG-3T, modified for this experiment by DT/INSU (Division Technique, Institut National des Sciences de l'Univers, Paris). Broad band seismic sensors require very precise leveling and can be damaged by rough treatment. The MOISE package contained the sensors on leveling gimbals, a small CPU with a 16-bit A/D and clock module, and a rechargeable battery (Fig. 2a). The sensor could be initially leveled via commands from the ROV, and new software developed for this experiment re-centered the sensors autonomously. The housing included the male side of an 8-pin Nautilus underwater connector. The ROV was used to sink a 55-cm diameter PVC caisson by suctioning mud from the center, a technique adapted from Duennebier and Sutton [14]. The result of this procedure was a hole approximately 50 cm deep in the cohesive, organic-rich mud typical of the California continental margin. The ROV then carried the sensor to the seafloor and placed it into the prepared site. After deployment, the ROV connected the broad band seismometer sensor to an "LCHEAPO" datalogger (provided by Scripps Institution of Oceanography/IGPP) using an
mounted on a metal tripod with supports for three Nautilus connectors. Following a 3-day period during which the sensor was allowed to settle, the ROV returned and filled the annulus between the sensor housing and the caisson with glass beads. The ROV then re-connected to the datalogger. Commands sent from the surface ship to the datalogger via the ROV were echoed to the sensor package for unlocking and leveling the sensors. Once the sensors were leveled, the data collection was initiated. The ROV monitored the initial data stream as it was recorded on board the datalogger to confirm proper operation of the system. The ROV disconnected at this point and left the instrument to collect data for 1 week. After this period, the ROV returned to the site and connected a second Benthos sphere (without disturbing the sensor) with additional lithium batteries to power the instrument for a total deployment period of up to 100 days. Reconnection to the datalogger permitted daily samples of the background noise and one event to be downloaded from the logger hard-drive while simultaneously monitoring the real-time data from the sensor package. The instrument was visited again after 1 month and the data download procedure was repeated. The sensors were also re-leveled by a few degrees during this visit to return them to the mid-point of their dynamic range. Such re-leveling is necessary for instruments installed in sediments because of gradual compaction and settling. In September, at the end of the deployment period, the ROV re-connected and commands were sent to shut down and lock the sensor package. During the first week of the broad band seismometer deployment, other instruments were placed on the MOISE site to collect contemporaneous data. Collaborators from the Laboratoire de Géophysique at Université de Bretagne Occidentale (UBO) provided an electromagnetic sensor package to monitor the co-variation of the magnetic field and seismic events.
The EM package consisted of a three-component fluxgate magnetometer, an Overhauser magnetometer (measuring the total intensity of the earth's magnetic field), and a two-component electric field sensor using a salt-bridge chopper for electrode drift removal. The magnetometer was carried down by the ROV and placed into position to orient the sensors as close to north as possible. To address the question of bottom current and tidal effects, an S4 current meter, a Seagate CTD (conductivity-temperature-depth for salinity measurement), and a Paroscientific pressure gauge, all mounted on a tripod support frame, were placed a few tens of meters from the other instruments. The bottom currents were anticipated to be the dominant source of marine noise in the seismic record due to the shear coupling at the sediment-water interface [14, 15, 16]. The presence of strong bottom currents at the site induced large seismic and electric noise, which facilitated a correlation between currents, electric field and seismic signal [17]. A variety of conventional and well-coupled short-period seismometers and hydrophones were also deployed as stand-alone, contemporaneous instruments to provide more comparative data for regional studies. These instruments included conventional, surface-ship deployed short-period seismometers with either 4.5-Hz or 1-Hz sensors and a single-channel hydrophone. In addition, ROV-deployed "corehole seismometers" were installed in small-diameter boreholes in the granite walls of the Canyon for most of the experiment. These corehole seismometers were the instrument packages used to extend the seismic measurements through repeated deployments during 1998 as the continuation of the MBARI Margin Seismology Project. In addition to the ocean-bottom deployments, ten IRIS-PASSCAL RefTeks were placed along the Monterey coast to increase the density of land-based instruments in the area (Fig. 1).
Figure 2. (a) The ROV Ventana holding the Guralp broad band sensor package just prior to deployment. The housing contains the three sensors on gimbals in a locked position, and a small computer with an independent clock and a 16-bit A/D connected to a rechargeable battery. The housing was carried to the seafloor by the ROV and placed into the prepared site (adapted from Dawe et al., 1998). (b) The MBARI short-period corehole sensor sitting adjacent to the Benthos sphere containing the datalogger and Li-batteries. The sensor can either be carried down attached to the datalogger, or the two can be connected in situ. The ROV maintains a communication link with the logger until successful installation is confirmed. Sensor leveling is confirmed visually by the LEDs in the vertical handle (adapted from Stakes et al. [12]).
2.2. The MBARI Corehole Seismometers
The three-component, wide-response (1-90 Hz), miniature sensor package mainly used for the MBARI Seismology program was developed by the Jet Propulsion Laboratory in collaboration with MBARI (Fig. 2b). The sensor technology and instrument response characteristics are described in Stakes et al. [12]. These short-period geophone packages are mounted in an underwater housing which can be inserted into a 2.5" diameter borehole to provide improved mechanical coupling with the seafloor. These small-diameter boreholes (dubbed coreholes) were drilled into the vertical walls of Monterey and Carmel submarine canyons for the seismometer deployment using an underwater diamond coring system mounted on the ROV Ventana [18]. Accessible sites west of the SGF are all sedimented, requiring a "portable corehole": a cylindrical hole within a low-profile cement block dubbed a "seismonument". Waveforms from these instruments, compared to several traditional ocean floor seismometers [19], show them to have higher signal-to-noise ratios and lower environmental noise (such as from bottom currents), features that enhance their capability to record small near-shore seismic events (Fig. 3). The sensor packages and dataloggers were repeatedly deployed and recovered during 1998 using one of the MBARI ROVs. The robotic arms of the ROV placed the sensor packages into the coreholes and rotated them until they were properly leveled, as indicated by an LED on the handle. The seismonuments were carried to the seafloor by the ROV and similarly positioned by the manipulator. Low-power dataloggers were connected to these sensor packages, either on the surface prior to deployment, or in situ after the sensor packages were deployed. The loggers' electronics and their Li-battery packs were housed in Benthos spheres anchored several meters from the sensor packages. The deployment period for each instrument varied from 6 to 12 weeks.
Timing for each instrument was maintained by an onboard temperature-compensated crystal oscillator with a rated drift of 1.5 sec/year. The MBARI/JPL seismometers, because of their improved mechanical coupling, collect data in which the S-wave arrivals are resolvable for both large and small events. The capability to resolve the horizontal shear waves as well as the vertical compressional waves is critical to constraining the depth of events, especially for offshore events that typically have a large azimuthal gap. Traditionally deployed ocean-bottom seismometers can display high background noise levels on the horizontal channels due to non-linear coupling between the geophones and Earth [14, 20, 21] and to ocean floor currents. Accurate direct measurement of the sensor orientation by the ROV, combined with the well-displayed shear wave data, permitted consistent rotation of the horizontal channels into radial and transverse components. This allowed for more accurate shear wave arrival picks, added phases for event relocations, and led to more detailed waveform analyses (Fig. 4). The seafloor deployment sites for 1997 and 1998 of MBARI's "Margin Seismology Project" are shown in Fig. 1. During 1997, the corehole deployments were geographically limited to facilitate instrument comparisons during MOISE. Much of the development and the most continuous data have been obtained from site Duoseismo (MDUO), where two coreholes were placed 17 m apart in Cretaceous granitic basement for side-by-side comparisons [19]. A comparison of data from MOIS (the broad band site) and MDUO (the corehole site within the granite) is provided in Fig. 5.
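The rotation of the north/east horizontal channels into radial and transverse components can be sketched as follows. The sign convention shown is one common seismological choice (the convention actually used in the project's processing is not specified here); the back-azimuth is the direction from station to epicenter, measured clockwise from north:

```python
import math

def rotate_ne_rt(n, e, back_azimuth_deg):
    """Rotate north (n) and east (e) horizontal traces into radial and
    transverse components, given the back-azimuth in degrees.
    Sign convention is one common seismological choice."""
    ba = math.radians(back_azimuth_deg)
    radial = [-ei * math.sin(ba) - ni * math.cos(ba) for ni, ei in zip(n, e)]
    transverse = [-ei * math.cos(ba) + ni * math.sin(ba) for ni, ei in zip(n, e)]
    return radial, transverse
```

In the rotated frame, P and SV energy concentrates on the vertical and radial components while SH energy appears on the transverse component, which is what makes the shear-wave picks easier.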
Figure 3. Two events recorded in 1998 were co-located within a few hundred meters of each other. Waveforms for the two events recorded at each of three seafloor short-period seismometer sites show good correlation due to the high fidelity of the digital instruments, both for corehole (MDUO) and sedimented seismonument sites (MCGR, MNSR).
Figure 4. Unfiltered record section for the short-period corehole instrument at site DuoSeismo (MDUO). The high fidelity of the digital instruments and control of the sensor orientation permitted consistent rotation of the shear waves into radial and transverse components. Use of both horizontal and vertical components resulted in better-constrained focal mechanisms. The grey waveforms are associated with events cataloged by the Northern California Earthquake Data Center; these waveforms were used to estimate P and S travel time curves for MDUO. Many uncataloged events were also detected at MDUO, with their associated waveforms shown in black. The travel time curves were used to help constrain the epicentral distance of the uncataloged events.
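The way an S-P time constrains epicentral distance can be illustrated with a uniform-velocity approximation: distance = (tS - tP) · Vp·Vs / (Vp - Vs). The velocities below are generic crustal values chosen for illustration, not those of the project's 1-D model, which would give somewhat different distances:

```python
def epicentral_distance_km(ts_minus_tp, vp=6.0, vs=3.5):
    """Estimate distance (km) from the S-P arrival time difference (s),
    assuming straight rays and uniform crustal P and S velocities (km/s).
    The default velocities are illustrative generic values."""
    return ts_minus_tp * vp * vs / (vp - vs)
```

With these values, each second of S-P time corresponds to roughly 8.4 km of distance, so a 5 s S-P time places an event about 42 km away.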
Figure 5. Comparison of broad band data for a regional earthquake (Nevada, distance ~400 km) recorded at MOIS and MDUO, a nearby continental station, SAO, and an adjacent island station, FARB. The data have been high-pass filtered at 50 sec (modified after [12]).
3. RESULTS
The results from MOISE are both technical and scientific: the successful strategy for deployment of a broad band sensor package into soft sediment, and the correlation of seismic and electromagnetic data with physical oceanographic information. In addition, data from MOISE have been incorporated into the MBARI Seismology Program database and used to accurately determine hypocenters and mechanisms for the offshore faults. Details of these results are provided by Romanowicz et al. [11], Stutzmann et al. [17], Stakes et al. [12], Begnaud and Stakes [9] and Begnaud et al. [10]. The instrument sites at 800-1000 m depth were well within the SOFAR channel as well as being within a marine mammal sanctuary. This location was the deepest site easily accessible to the ROV Ventana operating on a daily basis from Moss Landing. As a result, large numbers of marine mammal calls and T-phase arrivals (water-borne acoustic signals) are found within the seismometer data records [22]. The continental margin seismometer sites provided records of T-phases prior to their complete conversion and recording by land-based stations.
3.1. Correlation of noise and currents
The deployment of the broad band sensor package was successful, though less than perfect. Although the sensors and gimbals were completely buried in the sediment and stabilized by the glass beads, the upper third of the housing and the support for the connector were exposed to highly variable bottom currents. In spite of the less-than-optimal installation, we were able to identify all large teleseisms (M>6.0), for which 30-50 sec surface waves were prominent, as well as numerous regional M>4 earthquakes in the continuous data recorded by this instrument. Three-component broad band seismic data were acquired continuously on the MOISE broad band package from 21 June through 11 September 1997, at a sampling rate of 20 samples/sec. Several regional earthquakes of magnitude 3.5 and larger as well as several large teleseisms were well recorded during that time period. Comparison of the records obtained at the ocean-bottom site (station name: MOIS) with those of nearby land sites of the Berkeley Digital Seismic Network (BDSN) and of the Geoscope Network (SCZ) provides useful insight into the quality of the data (Fig. 5). The teleseismic events are band-pass filtered between 5 and 50 sec, a frequency band of minimum noise (or low-noise "notch"), consistent with previous seafloor experiments (e.g. [23, 24, 25]). Background noise levels were found to fluctuate in this period band (Fig. 6), and the best recordings were obtained during the quietest periods, when noise in the minimum-noise window was comparable to that observed commonly at nearby land sites [12]. Investigation of the source of the large noise fluctuations at station MOIS in the low-noise window was made possible owing to the contemporaneous recording of physical oceanographic data. In the period band 10-50 sec, background seismic noise is often strongly correlated with the bottom current velocity, which exhibits large fluctuations that in general can be related to tides [12, 17].
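A band-pass restricted to periods between 5 and 50 sec (0.02-0.2 Hz at the 20 samples/sec rate) can be sketched as follows. This is a crude brick-wall filter in the frequency domain, used only to illustrate the band selection; the actual processing would typically use a tapered recursive filter such as a Butterworth:

```python
import numpy as np

def bandpass_5_50s(trace, fs=20.0):
    """Crude frequency-domain band-pass keeping only periods of 5-50 s
    (0.02-0.2 Hz). Illustrative only: a brick-wall cut causes ringing
    that a tapered Butterworth design would avoid."""
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fs)
    spec = np.fft.rfft(trace)
    spec[(freqs < 1.0 / 50.0) | (freqs > 1.0 / 5.0)] = 0.0  # zero out-of-band bins
    return np.fft.irfft(spec, n=len(trace))
```

Energy at a 20 s period passes essentially unchanged, while 100 s swell-band or tidal-band energy is removed, which is the sense in which teleseismic surface waves stand out in the low-noise notch.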
Figure 6. Comparison of noise power density spectra on the vertical component of MOISE, computed on 3 different days, using 8 hours of data starting at 0 hours GMT. Depending on the date and the time of day, noise fluctuated by over 20 dB, both in the microseismic band (due to atmospheric conditions) and in the low-noise notch between periods of 10 and 50 sec (due to tidal currents).
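Noise power density spectra of the kind shown in Fig. 6 are commonly estimated by averaging windowed periodograms (Welch's method). The sketch below is a generic estimator in relative dB units, with an assumed segment length; it is not the processing actually used to produce the figure:

```python
import numpy as np

def psd_db(trace, fs=20.0, seg_len=4096):
    """Welch-style power spectral density in relative dB: average the
    periodograms of Hann-windowed, 50%-overlapping, demeaned segments."""
    win = np.hanning(seg_len)
    norm = fs * np.sum(win ** 2)          # periodogram normalization
    psds = []
    step = seg_len // 2                   # 50% overlap between segments
    for start in range(0, len(trace) - seg_len + 1, step):
        seg = trace[start:start + seg_len]
        seg = (seg - np.mean(seg)) * win  # demean and taper each segment
        psds.append(np.abs(np.fft.rfft(seg)) ** 2 / norm)
    freqs = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    return freqs, 10.0 * np.log10(np.mean(psds, axis=0) + 1e-30)
```

Averaging over many 8-hour windows on different days is what reveals the 20 dB noise fluctuations in the microseismic band and the low-noise notch.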
3.2. Hypocenters and mechanisms for offshore faults
During the 1997-98 MBARI ocean-bottom instrument deployments, 30 local Monterey Bay seismic events were listed in the Northern California Earthquake Data Center (NCEDC) catalog [26]. Phase data for these events were obtained from the NCEDC and combined with the MBARI phase data. The majority of Monterey Bay events detected in 1997-98 were in the 1.4-2.5 magnitude range. The 65 best-constrained recent and historical events (denoted "master events") were used to develop an improved 1-D crustal velocity model. The relocated master events and their composite mechanisms are shown in Fig. 7. This improved 1-D model was used to relocate all major events within the historical seismic catalogue [10] using the HYPOINVERSE location program [27]. Much of the observed offshore seismicity in 1997-98 occurred on the northern SGF, but there were important events on the MBFZ (Fig. 7). The MBARI ocean-bottom dataset collected in 1997 also included 79 seismic events not catalogued by the NCEDC. There were many additional events observed on single
instruments that could not be confirmed as seismic. Similar small events were observed for the seafloor deployments in 1998. Even with the addition of these smaller events, there were no recorded events on the southern SGF that defines the Carmel Canyon off Pebble Beach and Carmel Bay. Events observed on the MBFZ varied in depth and magnitude. Events on the northern SGF show a depth distribution consistently deeper on the eastern side. This suggests a fault plane that dips to the east offshore of Santa Cruz. The capability to resolve shear waves, as well as compressional waves, is critical to constraining depth and epicenter location, which then permit more accurate focal mechanism determination. First-motion focal mechanism solutions (generated by the FPFIT program [28]) were calculated for the well-located master events. The ocean-bottom instrument locations provide key positions on the focal sphere for better constraining the nodal planes, reducing the strike, dip, and rake uncertainties, reducing the number of multiple solutions, and allowing for more consistent mechanisms [9]. Composite focal mechanisms for the master events are provided in Fig. 7. Mechanisms for the MBFZ are consistently strike-slip with fault planes that are near vertical. West of Santa Cruz, the SGF strikes at N27° ± 3° W and dips to the east at 61° ± 5°, with primarily thrust motion and a component of right-oblique slip [29]. The discovery of this dipping focal plane and thrust mechanisms highlights a potential hazard for the adjacent populated coastal region.
4. THE FUTURE: MOISE-II AND THE MOOS
Plans are underway to install a permanent broad band sensor in Monterey Bay. This project (MOISE-II) is a collaboration between UC Berkeley and MBARI. The MOISE-II instrument array will include a CTD, current meter and pressure gauge similar to the original MOISE. However, the sampling rates for the instruments will be better matched to permit some removal of the bottom current noise. The sensor package will also be completely buried to minimize the impact of the tidal currents. The permanent broad band station will be the first seafloor node of the Berkeley Digital Seismic Network (BDSN [30]). Initially, this instrument will operate autonomously on batteries, with data transfer conducted during ROV dives. It is anticipated that MOISE-II will ultimately benefit from other, ongoing MBARI efforts to provide continuous real-time telemetry back to the shore-based laboratory. The connection to a MOOS mooring (MBARI Ocean Observing System) will provide the data link for the offshore site to deliver continuous data on regional and teleseismic events. The MOOS effort is in a nascent stage, with intense planning only completed in early 2000. Primary components of MOOS are telemetered moorings that can support both upper water column and seafloor experiments (Fig. 8). A suite of seafloor or near-bottom instruments could be networked to a single MOOS junction box for truly multidisciplinary experiments. For the MBARI effort, much effort is being placed on in situ chemical and microbiological sensors, as well as the low-power electronics to support them. Advanced software protocols will eventually operate within the MOOS central hub to allow in situ data processing for event recognition and response. One of the next major operational steps will be to test methodologies for using the ROV to cable instrument packages to a central hub.
Autonomous underwater vehicles (AUVs) are expected to play a major role in the MOOS effort, providing three-dimensional surveys around the MOOS moorings for projects as diverse as climate studies or following hydrothermal plumes.
Figure 7. Major scientific results of the Margin Seismology Project study. Relocated master events define the zones of high seismicity as the northern San Gregorio in Monterey Bay and some strands of the Monterey Bay Fault Zone. The southern San Gregorio appears to be aseismic. Relocation of the full historical seismic dataset follows this same trend (see [9]). On the San Gregorio in northern Monterey Bay, events to the east are deeper, reflecting a fault plane dipping to the east. Composite fault plane solutions show right-lateral strike-slip mechanisms on the MBFZ and thrust or oblique mechanisms on the SGF (figure modified after [29]).
Figure 8. Schematic of the telemetered mooring that will be the foundation of the MBARI Ocean Observing System. Seafloor instruments would be connected by an ROV to the benthic hub using fiber optic cable. The instruments will be battery powered so that low power electronics are emphasized. Communication with distant instrument sites as well as three-dimensional mapping will be accomplished via an AUV. Local data processing will enable event identification and response for catastrophic events such as volcanic eruptions or mass wasting events. The surface communications will be two-way to permit land-based modification of instrument protocols.
5. CONCLUSIONS

The MOISE experiment was a technological success and demonstrates the feasibility of broad band instruments deployed and maintained by submersible within the sediments on continental margins. MOISE-II will open the door for permanent seafloor nodes that extend land-based seismic arrays. The ROV deployment and subsequent reconnection in mid-experiment has demonstrated that this type of vehicle can install complex instrumentation on the seafloor in a dependable and relatively simple manner. The low level of background noise observed at long periods during intervals of low bottom currents indicates that complete burial of the sensor package in future experiments should result in broad band stations of sufficient quality to usefully complement the land-based stations. This is particularly encouraging for improving coverage in northern California, where the current seismic station distribution is less than optimal for monitoring and understanding strain release and tectonics of this region. Solving the long-term issues of power and of data acquisition and retrieval, preferably continuously and in real time [30], through connections to land, should be the focus of the next pilot experiments and is within reach of currently available technology. MOOS will ultimately contribute to the solution of these daunting technical problems. The scientific results of the MBARI Seismology Project highlight the value of a modest number of offshore stations. Even with a relatively small number of instruments (3-5) and a limited period of observation (less than 18 months), many of the inconsistencies of the historical seismic catalogue have been resolved. The Monterey Bay Fault Zone is clearly active, with the potential for occasional moderate to large earthquakes. The pure strike-slip character of the historical events and the vertical slip planes perhaps limit the potential hazards for the Monterey Peninsula.
Evidence within the Monterey Canyon, however, does suggest linkages with mass wasting events that contribute to sedimentary processes. The segmentation of the San Gregorio is apparent within the historical seismic catalogue. The northern segment has a comparatively high level of seismicity, mostly characterized by moderate (M 3-5) to potentially large (M 6-7) earthquakes. Of greater concern, however, is the consistent evidence of a fault zone that dips to the east beneath Santa Cruz with focal mechanisms that have a significant thrust component. These results clearly demonstrate the importance of extending land-based seismic networks to the seafloor to constrain both regional and global seismic patterns.
ACKNOWLEDGEMENTS

The MOISE experiment would not have been possible without the technical prowess of Jean Francois Karczewski, Jean Claude Koenig and Jean Savary, from DT/INSU in Paris, Doug Neuhauser, from U.C. Berkeley, and Steve Etchemendy, T. Craig Dawe and Paul McGill from MBARI. Karen Salamy and Annette Gough assisted with manuscript preparation. Funding for the MBARI Margin Seismology project was provided by the Lucile and David Packard Foundation. Vicky Gallardo and Jesse Williams provided technical assistance with the data analysis. Additional support for MOISE was provided by U.C. Berkeley for software development. DT/INSU provided funds for the participants from France as well as the broad band sensor used in the MOISE experiment.
REFERENCES
1. COSOD II (European Science Foundation), Rep. 2nd Conf. Sci. Ocean Drilling, Strasbourg, 1987.
2. G. M. Purdy, O.S.N. workshop, Broad band Seismology in the Oceans: Towards a Five-Year Plan, La Jolla, 1995.
3. B. Romanowicz and A. M. Dziewonski, Eos Trans. AGU, 67 (1986) 541.
4. FUMAGES, Future of Marine Geology and Geophysics, in: P. Baker and M. McNutt (eds.), Workshop Proc., Dec. 5-7, CORE, Washington, DC, 1996, 259 pp.
5. J. P. Barry, H. G. Greene, D. L. Orange, C. H. Baxter, B. H. Robison, R. E. Kochevar, J. W. Nybakken, D. L. Reed and C. M. McHugh, Deep Sea Res., 43 (1996) 1739.
6. D. L. Orange, H. G. Greene, D. Reed, J. B. Martin, C. M. McHugh, W. B. F. Ryan, N. Maher, D. Stakes and J. Barry, Geol. Soc. Am. Bull., 111 (1999) 992.
7. H. G. Greene, in: R. E. Garrison, H. G. Greene, K. R. Hicks, G. E. Weber and T. L. Wright (eds.), Geology and Tectonics of the Central California Coastal Region, San Francisco to Monterey, (1990) 31.
8. Working Group on California Earthquake Probabilities, Earthquake Probabilities in the San Francisco Bay Region: 2000 to 2030 - A Summary of Findings, Open File Report, (1999) 99.
9. M. L. Begnaud and D. S. Stakes, Bull. Seism. Soc. Am., 90 (2000) 414.
10. M. L. Begnaud, K. C. McNally, D. S. Stakes and V. A. Gallardo, Bull. Seism. Soc. Am., (2000) in press.
11. B. Romanowicz, D. Stakes, J. P. Montagner, P. Tarits, R. Uhrhammer, M. Begnaud, E. Stutzmann, M. Pasyanos, J. F. Karczewski, S. Etchemendy and D. Neuhauser, Earth, Planets and Space, 50 (1998) 927.
12. D. S. Stakes, B. Romanowicz, J. P. Montagner, P. Tarits, J. F. Karczewski, S. Etchemendy, C. Dawe, D. Neuhauser, P. McGill, J. C. Koenig, J. Savary, M. Begnaud and M. Pasyanos, Eos Trans. AGU, 79 (1998a) 301.
13. T. C. Dawe, D. S. Stakes, P. McGill, S. Etchemendy and J. Barry, Proc. Oceans 98, IEEE, 1998.
14. F. K. Duennebier and G. H. Sutton, Mar. Geophys. Res., 17 (1995) 535.
15. S. C. Webb, IEEE J. Oceanic Eng., 13 (1988) 263.
16. S. C. Webb, Rev. Geophys., 36 (1998) 105.
17. E. Stutzmann, J.-L. Thirot, J.-P. Montagner, P. Tarits, D. Stakes, B. Romanowicz, A. Sebai, J.-F. Karczewski, D. Neuhauser and S. Etchemendy, Bull. Seism. Soc. Am., (2000) in review.
18. D. S. Stakes, G. L. Holloway, P. Tucker, T. C. Dawe, R. Burton, J. A. R. McFarlane and S. Etchemendy, Marine Tech. Soc. J., 31 (1997) 11.
19. D. Stakes, J. McClain, T. VanZandt, P. McGill and M. Begnaud, Geophys. Res. Lett., 25 (1998b) 2745.
20. G. H. Sutton and F. K. Duennebier, Mar. Geophys. Res., 9 (1987) 47.
21. J. C. Osler and D. M. F. Chapman, J. Geophys. Res., 103 (1998) 9879.
22. J. E. De Laughter, D. S. Stakes, M. L. Begnaud and J. S. McClain, Eos Trans. AGU, 78 (1997).
23. S. C. Webb, X. Zhang and W. Crawford, J. Geophys. Res., 96 (1991) 2723.
24. J. P. Montagner, B. Romanowicz and J. F. Karczewski, Eos Trans. AGU, 75 (1994a) 150.
25. J. P. Montagner, J. F. Karczewski, B. Romanowicz, S. Bouaricha, P. Lognonne, G. Roult, E. Stutzmann, J. L. Thirot, J. Brion, B. Dole, D. Fouassier, J.-C. Koenig, J. C. Savary, L. Floury, J. Dupond, A. Echardour and H. Floc'h, Phys. Earth Planet. Inter., 84 (1994b) 321.
26. B. Romanowicz, D. Neuhauser, B. Bogaert and D. Oppenheimer, Eos Trans. AGU, 75 (1994) 258.
27. F. W. Klein, User's guide to HYPOINVERSE, a program for VAX computers to solve for earthquake locations and magnitudes, U.S. Geol. Surv. Open-File Rept. 89-314, (1989) 58 pp.
28. P. Reasenberg and D. H. Oppenheimer, FPFIT, FPPLOT, and FPPAGE: FORTRAN computer programs for calculating and displaying earthquake fault-plane solutions, U.S. Geol. Surv. Open-File Rept. 85-739, (1985) 109 pp.
29. K. C. McNally and D. S. Stakes, Eos Trans. AGU, 79 (1998).
30. B. Romanowicz, L. Gee, M. Murray, D. Neuhauser and R. Uhrhammer, IRIS Newsletter, XVI-1 (1997) 6.
Towards a permanent deep sea observatory: the GEOSTAR European experiment

P. Favali a, G. Smriglio a, L. Beranzoli a, T. Braun a, M. Calcara a, G. D'Anna a, A. De Santis a, D. Di Mauro a, G. Etiope a, F. Frugoni a, V. Iafolla b, S. Monna a, C. Montuori a, S. Nozzoli b, P. Palangio a and G. Romeo a

a Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
b Istituto di Fisica dello Spazio Interplanetario (IFSI) CNR, Via del Fosso del Cavaliere 100, 00133 Rome, Italy
GEOSTAR is the prototype of the first European long-term, multidisciplinary deep sea observatory for continuous monitoring of geophysical, geochemical and oceanographic parameters. GEOSTAR is an example of a strong synergy between science and technology, directed at the development of new technological solutions for the realisation and management of the observatory. The GEOSTAR system is described, outlining the enhancements introduced during five years of project activity. An example of data retrieved from the observatory while the deep sea mission was running is also given.
1. INTRODUCTION

The deep sea plays a key role in global climate changes and is a strategic place for observations in solid Earth geophysics [1]. Recent enhancements of both marine intervention systems and auxiliary systems for sea monitoring have stimulated the scientific community to extend the existing land-based network of permanent observatories into the ocean basins, especially at abyssal depths, which have been poorly explored. The present distribution of observatories is still constrained by the presence of the oceans, and consequently it is inadequate to address important scientific issues related to many features of our planet at global scale (Earth's interior and dynamics), and to regional investigations of active processes in geophysics (tectonics of plate boundaries, seismicity, volcanism and degassing activity) and in environmental sciences (oceanography, biology, global climate change). In particular, there is the need to deepen the knowledge of the Earth system as a whole and to investigate suitable strategies for the limitation of natural hazards related to global changes. In recent decades scientists have turned their attention to understanding the interactions between ocean environmental processes and terrestrial and atmospheric phenomena [2]. Accordingly, multiparametric monitoring through observatories at the sea bottom has been recognised as a fundamental task for the years to come. A sea floor observatory can be defined as an unmanned system at a fixed site on the ocean floor providing power, command, control and communication to sensors located nearby. In skeletal form, the system is foremost a junction box connected to a series of scientific
instruments, installed in a main central module, or located at a distance as cable-linked satellites. Deep sea observatories represent a real challenge for industry because of the sophisticated technology for auxiliary systems and logistics required to work under extreme conditions (e.g. energy consumption, communication means, software development, operational cable technology, positioning). International institutions and associations have been established to promote deep sea activities (e.g. the Global Ocean Observing System (GOOS) and the International Ocean Network (ION)), and workshops and congresses have been organised to foster international co-operation, to exchange ideas and to disseminate the presently available data [3, 4, 5, 6]. Presently, scientific institutions and companies already co-operate in the framework of projects aimed at the realisation of deep sea instrumented platforms and observing stations. For a review of the state of the art, the main references are reported in Montagner et al. (this volume). The synergy between geosciences and marine technology will hopefully strengthen in the coming decades, covering the Earth's surface with a worldwide, permanent, reliable, multidisciplinary deep sea floor observing network for systematic Earth monitoring. In this contribution, the scientific aims of deep sea observatories with respect to geophysics and environmental disciplines are briefly described, and the first European experiment involving the prototype of a deep sea observatory, GEOSTAR, is presented.
2. SCIENTIFIC AIMS OF DEEP SEA FLOOR OBSERVATORIES

Sea floor observatories open new perspectives for multidisciplinary studies in Earth sciences based on long-term time series data, from coastal areas to the deep sea. In seismology, long-term sea floor observations can improve studies of local and regional structures (e.g., underwater volcanic structures, ridges, trenches) as well as global scale investigations (e.g., tides, global tomography), the latter through integration with on-land observatory data. This is crucial for improving the reliability of present 3-D tomographic models of the Earth's structure, and overcomes the drawbacks of traditional Ocean Bottom Seismometers (damage caused by the free landing, uncontrolled deposition and orientation, poorly accurate estimation of the site position, relatively short autonomy). In geomagnetism, the availability of sea-bottom stations will give a unique opportunity to measure the geomagnetic field and its long-term variations at great depth (e.g. [7]). A relatively small number of geomagnetic observations in the oceans would be sufficient to statistically double the quality of the present global geomagnetic models [8, 9]. The use of sea-bottom stations will be especially significant in the case of simultaneous measurements at different altitudes, from the deep sea to outer space, jointly with aeromagnetic campaigns (by planes and/or balloons) and satellite magnetic missions such as the Oersted Project [10]. This aspect is crucial to obtain a reliable data inversion of potential fields such as the geomagnetic field. In gravimetry, the availability of continuous measurements of field variations could contribute to increasing the accuracy of the geoid definition and of temporal short-term changes, Earth tides and free oscillations, leading to a better comprehension of the deep Earth structure and dynamics.
In geochemistry, long-term measurements of significant deep water parameters such as pH, Eh, temperature, electrical conductivity, turbidity, and dissolved gases (CO2, CH4, O2, H2S, etc.) are demanded by the scientific community in order to better characterise processes
occurring in both hydrothermal and hydrocarbon-prone areas, and to record signals related to environmental changes [2]. In physical oceanography, short-term time series of water currents are usually collected at abyssal depths by instruments recording in local mode. The prospect of having long time series of data from the deep sea will deepen the knowledge of the water current regime in the sea, leading to more complete and reliable water circulation modelling. Scientific requirements and the potential of seafloor observatories in biology are discussed by Thiel et al. [2]. Finally, beyond the Earth science disciplines, seafloor observatories can be considered as open platforms to host new instruments to be tested, or as laboratories for physics experiments (e.g., Capone et al., this volume).
3. THE GEOSTAR EXAMPLE
GEOSTAR (Geophysical and Oceanographic STation for Abyssal Research) is a deep sea, multidisciplinary, scientific, modular unmanned observatory (Fig. 1) realised through projects (1995-1998; 1999-2001) funded by the Marine Science and Technology E.C. Programme (MAST III). GEOSTAR is one of the few experiments on deep sea scientific observatories currently being conducted around the world (e.g., see Kasahara et al. and Montagner et al., this volume). GEOSTAR is presently devoted to the measurement of geophysical and environmental parameters during long-term operation at abyssal depths (up to 4000 m.w.d.). The GEOSTAR concept is based on a sea bottom observatory (Bottom Station) designed to support the operation of a wide range of sensors and to ensure the management of a complete mission, also in remote mode. All sensors are controlled by an intelligent unit, the Data Acquisition and Control System, which manages the whole data flow from the sensors to the communication systems and performs preliminary data quality checks. A high-precision clock ensures a long-term, highly stable and accurate time reference for all the data. A complete description of the equipment of the Bottom Station is given in Table 1. A dedicated cable-suspended thrustered module, named the Mobile Docker, ensures accurate and safe positioning at the sea floor, re-entry and recovery of the Bottom Station. This innovative deployment and recovery procedure is derived from the "two-module" concept successfully applied by NASA in the Apollo missions, where one module performs tasks for the other. The Mobile Docker, which can be driven by the winch of a medium-size research vessel, can carry heavy loads (up to 20 tons) and retrieve them at the end of the mission. Manoeuvring accuracy is in the range of 5% of the water depth.
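Since the quoted manoeuvring accuracy scales with water depth, the expected placement error at a given site is easy to estimate. A back-of-envelope sketch (the 5% figure is from the text; the depths are example values, with 4000 m the maximum rated depth):

```python
# Mobile Docker positioning accuracy, quoted as roughly 5% of water depth.
def positioning_accuracy_m(depth_m: float, fraction: float = 0.05) -> float:
    """Expected placement error (m) for a given water depth (m)."""
    return depth_m * fraction

# Example depths: a mid-depth site and the maximum rated depth.
for depth in (2000.0, 4000.0):
    print(f"{depth:.0f} m water depth -> ~{positioning_accuracy_m(depth):.0f} m")
```

So even at the maximum rated depth the station is expected to land within a couple of hundred metres of the target, far better than free-fall deployment allows.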
The deployment/recovery solution adopted in GEOSTAR overcomes typical limitations of other systems, such as the need for neutral buoyancy in water, free-falling deployment (very imprecise and rough), short autonomy and limited payload. Communication capabilities are mainly ensured by two means:
- buoyant data capsules (Messengers), released automatically by the Bottom Station when filled (see Marvaldi et al., this volume), or from the sea surface by acoustic command. Once at the sea surface, the Messenger transmits the data to a shore station through a satellite link;
- a Near-Real-Time Communication System allowing bi-directional connection between a shore operator (via satellite) and the Bottom Station (via acoustics).
A standard two-way acoustic communication system is also available to allow direct data exchange between the sea surface and the Bottom Station. An interface for submarine cables is also available for real-time communication.
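For scale, the acoustic path is slow compared with the volume of data an observatory accumulates. The sketch below is illustrative only: it ignores protocol overhead and retransmissions, and takes the 32 kByte expendable-capsule size and the 2400 bit/s modem rate from Table 1 to estimate the idealised time to move one Messenger's worth of data over the acoustic link.

```python
# Idealised acoustic transfer time for one expendable Messenger payload.
CAPSULE_BYTES = 32 * 1024      # expendable Messenger capacity (Table 1)
ACOUSTIC_BPS = 2400            # multimodulation acoustic modem rate (Table 1)

def transfer_seconds(n_bytes: int, bits_per_second: int) -> float:
    """Lower-bound transfer time: payload bits divided by raw link rate."""
    return n_bytes * 8 / bits_per_second

print(f"{transfer_seconds(CAPSULE_BYTES, ACOUSTIC_BPS):.0f} s")  # ~109 s
```

Roughly two minutes per capsule-sized payload explains why the acoustic channel serves status checks and commands, while bulk data travel by Messenger or, ultimately, by cable.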
Figure 1. The Mobile Docker (above) connected to the Bottom Station (below) on the R/V Urania deck, before deployment.
Table 1
Characteristics and instrumentation of GEOSTAR 1 and GEOSTAR 2

Scientific Payload
  GEOSTAR 1: 3-comp. broad-band seismometer; scalar (proton) magnetometer; fluxgate (x-y) magnetometer; 300 kHz ADCP; CTD; transmissometer.
  GEOSTAR 2: 3-comp. broad-band seismometer; scalar (proton) magnetometer; fluxgate (x-y-z) magnetometer; 300 kHz ADCP; CTD (with pump); transmissometer; gravity meter; space broad-band seismometer; 3-D current meter; chemical package (pH, H2S); water sampler.

Bottom Station
                        GEOSTAR 1                        GEOSTAR 2
  Dimensions (mm)       3500 x 3500 x 2900               3500 x 3500 x 2900
  Weight (kg)           2160 (in air); 1055 (in water)   3100 (in air); 1475 (in water)
  Material              Al 5083                          Al 5083, Ti grade 5
  Battery pack          24 V, 300 Ah                     24 V, 2400 Ah
  Mass memory (Gbyte)   2                                4

Mobile Docker
                        GEOSTAR 1                        GEOSTAR 2
  Dimensions (mm)       2955 x 2748 x 2032               2878 x 2348 x 1540
  Weight (kg)           954 (in air); 748 (in water)     1050 (in air); 850 (in water)
  Material              Al, stainless steel              Al, stainless steel
  Power (kW)            7                                25
  Thrust (N)            2 x 700 (horiz.)                 2 x 700 (horiz.), 2 x 700 (vert.)
  Telemetry/Comm.       2 video, 3 RS232 lines           2 video, 12 RS232 lines
  Video cameras         2 b&w                            3 b&w, 1 color
  Lighting (W)          4 x 250                          4 x 250 (+ 1 optional)
  Addit. devices        -                                Sonar, Altimeter

Messengers
  GEOSTAR 1: 2 expendable (32 kByte, ARGOS transmitter); 1 storage (40 Mbyte, ARGOS positioning)
  GEOSTAR 2: 3 expendable (32 kByte, ARGOS transmitter); 2 storage (40 Mbyte, ARGOS positioning)

Acoustic systems
  GEOSTAR 1: multimodulation acoustic modem, 12 kHz, up to 2400 bit/s bidirectional link
  GEOSTAR 2: multimodulation acoustic modem, 12 kHz, up to 2400 bit/s bidirectional link with a buoy
The GEOSTAR abyssal station is complementary to other deep sea observational methods (modular observatories, arrays, instrumented submarine cables) and can be easily integrated with other monitoring systems. The Mobile Docker might also be adapted in the future for industrial submarine operations. The possibility of re-entering submarine scientific boreholes to deploy and recover specific long-term monitoring instrumentation using the Mobile Docker is also foreseen. The first phase of the GEOSTAR project (1995-1998), named GEOSTAR 1, was aimed at the design and realisation of the system, and at the verification of its performance and reliability in a demonstration mission performed in the Adriatic Sea in summer 1998 [11, 12, 13]. The second phase of the project (GEOSTAR 2), started in 1999, was aimed at demonstrating, through a long-term (6-8 months) scientific mission in deep sea conditions, the complete functionality of an improved version of the benthic multidisciplinary observatory. Technical details are reported in Gasparoni et al. (this volume). Specific objectives of the mission included continuous long-term acquisition of magnetic, gravimetric, seismological, oceanographic and environmental parameters (water current speed and direction, temperature, salinity) and automatic water sampling. The communication system was enhanced with the near-real-time link. The deep sea mission started on 23 September 2000, with the deployment of the Bottom Station at about 2000 m b.s.l. in the Tyrrhenian Sea, south of Ustica Island. The buoy for the near-real-time link was deployed in December 2000 (see Fig. 2). Scientific and status data, recovered through Messenger releases and periodic checks from shore via the near-real-time communication, indicated the correct functioning of all the sensors and devices of the Bottom Station. The dissemination and analysis of the scientific data thus began while the observatory was still operational on the seabed.
As an example, data from the gravity meter (seismological signal) and the CTD (temperature) are reported in Figs. 3 and 4.
4. CONCLUSIONS

Long-term, multidisciplinary submarine observatories offer the opportunity for innovative Earth monitoring based on continuous measurements of geophysical and environmental parameters, extended into areas where observational data are presently scarce or totally lacking. GEOSTAR is the result of a strong synergy between science and technology, and demonstrates that scientific requirements are a stimulus to develop new technological solutions for the realisation of observatories. Station modularity can easily widen the present range of investigations, and be the basic element of future sea floor networks managed by a number of deployment/recovery and maintenance vehicles that is small in comparison with the number of observation sites. The sharing of the installation procedure and the recovery tools among different scientific and technological partners could provide easier and cheaper maintenance procedures. Features such as multidisciplinarity and long-term operation capability need to be at the basis of the design of observatories. In this way simultaneous observations with different instruments of a wide set of phenomena, over a significant time-scale, ensure a correct approach to the study of parameter variations and possible correlations.
Figure 2. The GEOSTAR deep sea mission: (a) deployment phase; (b) the Bottom Station visualised by the sonar signal; (c) the Bottom Station after deployment, seen by the video cameras of the Mobile Docker; (d) the buoy of the near-real-time communication system.
Figure 3. The New Ireland event of 17 November 2000 at 21:01:56 UTC (Mb 6.2), as recorded by the gravity meter.
Figure 4. Extract of the water temperature data vs. time (in days from 23 September 2000).
Real-time data recovery from ocean bottom observatories represents one of the crucial aspects; for some of the possible applications (e.g., tsunami warning systems) real-time data connection is an essential requirement that is not yet met by any present communication system other than a cable connection. Experimentation with different approaches should be encouraged to ensure a fruitful evolution of these systems. New developments in technology may enable the achievement of the most challenging goals of deep-sea observatories (such as autonomy and real-time data communication).

ACKNOWLEDGEMENTS
The GEOSTAR co-ordinators at the Istituto Nazionale di Geofisica e Vulcanologia acknowledge the extraordinary work of all the partners and local project managers: Istituto per la Geologia Marina - CNR (M. Marani), Tecnomare S.p.A. (F. Gasparoni), Technical University of Berlin (H. Gerber), Technische Fachhochschule Berlin (G. Clauss), Ifremer (J. Marvaldi), Laboratoire d'Océanographie et de Biogéochimie (C. Millot), Orca Instrumentation (J.-M. Coudeville), Institut de Physique du Globe de Paris (J.-P. Montagner). Many thanks to colleagues of the Istituto Nazionale di Geofisica e Vulcanologia: Roberto D'Anna, Matteo Grimaldi and Giuseppe Passafiume for technical assistance, and Luigi Innocenzi for the photo/video report of the missions. The whole GEOSTAR team is particularly grateful to Claudio Viezzoli, Capt. Emanuele Gentile and the R/V Urania crew for their skill and careful assistance.
REFERENCES
1. C. Wunsch, EOS, Trans. AGU, 79 (1998) 62.
2. H. Thiel, K. O. Kirstein, C. Luth, U. Luth, G. Luther, L. A. Meyer-Reil, O. Pfannkuche and M. Weydert, J. Mar. Sys., 4 (1994) 421.
3. J.-P. Montagner and Y. Lancelot (eds.), Proceedings of the ION/ODP International workshop on Multidisciplinary observatories on the deep seafloor, Marseille, 1995, 229 pp.
4. Abstracts of the 25th General Assembly of the European Seismological Commission, Reykjavik, 1996, 35.
5. Proceedings of the International workshop on Scientific Use of Submarine Cables, Okinawa, 1997, 184 pp.
6. National Research Council, Illuminating the hidden planet: the future of seafloor observatories, National Academy Press, Washington, D.C., 2000.
7. J.-P. Montagner, J.-F. Karczewski, E. Stutzmann, G. Roult, W. Crawford, P. Lognonné, L. Béguery, S. Cacho, J.-C. Koenig, J. Savary, B. Romanowicz and D. Stakes, this volume.
8. J. Segawa and K. Koizumi, J. Geomag. Geoelectr., 46 (1994) 381.
9. J. R. Heirtzler, J. Booker, A. Chave, A. W. Green Jr., R. Langel and N. W. Peddie (U.S. Geomagnetic Observatory Task Group), A report to the U.S. Geodynamics Committee, (1994) 62.
10. Oersted Project: http://www.magnet.oma.be/home/oersted/oersted.html
11. L. Beranzoli, A. De Santis, G. Etiope, P. Favali, F. Frugoni, G. Smriglio, F. Gasparoni and A. Marigo, Phys. Earth Plan. Inter., 108 (1998) 175.
12. J. Y. Jourdain, European Commission Project information booklet EUR 18885 (ed. Gilles Ollier), 1999, 31 pp.
13. L. Beranzoli, T. Braun, M. Calcara, D. Calore, R. Campaci, J.-M. Coudeville, A. De Santis, G. Etiope, P. Favali, F. Frugoni, J.-L. Fuda, F. Gamberi, F. Gasparoni, H. Gerber, M. Marani, J. Marvaldi, C. Millot, C. Montuori, P. Palangio, G. Romeo and G. Smriglio, EOS, Trans. AGU, 81 (2000) 45.
NEMO: a project for a km3-scale neutrino telescope in the Mediterranean Sea near the southern Italian coasts

A. Capone, for the NEMO* Collaboration

Department of Physics, University "La Sapienza" and INFN-Rome, P.le Aldo Moro 2, 00185, Roma, Italy

The quest for neutrino astronomy could lead the scientific community to deploy a km3-scale deep-sea neutrino telescope in the Mediterranean area in the next years. Several international collaborations have started studies on the feasibility of this scientific and technological challenge. The NEMO Collaboration, funded by INFN, is involved in developing technologies relevant for detector design and optimisation. Deep-sea measurements (light transmission, optical background, current intensity, salinity, density, sedimentation, bio-fouling) have been carried out, or are at present going on, at several stations in the Ionian and the Tyrrhenian Sea close to the Italian coasts. An overview of the experiment and the results of the measurements is presented.
1. THE ROLE OF NEUTRINO ASTRONOMY
So far, the observation of the Universe has been carried out mainly by looking at the electromagnetic emissions of stars: very low frequency radio waves, optical frequencies and gamma rays have allowed us to investigate inside and outside our galaxy. Ultra-GeV gamma rays have been observed from galactic sources such as the Crab Nebula and from the active galaxies Markarian 421 and Markarian 501 [1]. Very high-energy particles, with energies in excess of 10^20 eV, have been detected [2], but the originating sources are unknown, and the way such particles are produced is one of the open problems in astrophysics and particle physics. Several models suggest that, together with the emission of gammas, the sources also emit neutrinos. If galactic or extra-galactic accelerators produce high-energy photons through the production and decay of neutral pions, it is reasonable that they are also a source of neutrinos through the production and decay of charged pions. Neutrinos are nearly massless, weakly interacting particles: they are not affected by strong or electromagnetic interactions. As a consequence, neutrinos are neither absorbed nor deflected during their travel from the source to the Earth, which offers the possibility to identify their sources by tracking them back: this is the basic assumption of neutrino astronomy. Moreover, high-energy gamma ray fluxes are reduced by interaction with the low energy background radiation of the Universe. Neutrino fluxes are not reduced during their journey to the Earth and offer the possibility to study regions of space invisible to gamma ray astronomy. Neutrino astronomy will complement and extend traditional gamma ray astronomy.
High-energy neutrinos can be detected through the muons produced in their interactions. When these muons propagate in a transparent medium they travel faster than light in that medium (which propagates at a speed v_light = c/n) and produce a plane light wave front, Cerenkov radiation, along their track. The detection of the Cerenkov radiation is a signature of the high-energy muon and can allow for its track reconstruction. Due to the very low neutrino interaction cross section, a large amount of matter is needed to make neutrinos interact and thus detect them. Calculations show that a detector suitable for neutrino astronomy should reach the enormous mass of 10^9 tons! Four decades ago it was proposed [3, 4, 5, 6] to study several aspects of particle physics and astrophysics in underground and/or underwater detectors able to detect high-energy neutrinos produced in the atmosphere or coming from space. In a deep-sea neutrino telescope, seawater is both a huge (1 km3) target for neutrinos and a Cerenkov radiator [3]. Moreover, a thick water shield can effectively filter out most of the intense cosmic ray flux that reaches the Earth's surface and that, in these experiments, is a source of background. If the telescope is located at a depth of more than 3000 m, the flux of cosmic rays is reduced by a factor of about 10^5. Experimental attempts to deploy an underwater neutrino detector started only two decades ago with the pioneering challenge of the DUMAND Collaboration, which proposed [7] to instrument a large volume of water at a deep Pacific Ocean site close to Hawaii, at 4000 m depth. After the DUMAND experience other projects have been started and several groups aim to test neutrino telescope prototypes. BAIKAL [8], in Lake Baikal (Siberia), and AMANDA [9] (in the South Pole ice pack) are at present the only working detectors, but their instrumented volume is still too small for neutrino astrophysics. The Mediterranean area seems to offer optimal conditions for the deployment of the telescope.
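The Cerenkov geometry fixes the light-cone angle from the refractive index alone: cos(theta) = 1/(n*beta). A small sketch of this relation (the index n ≈ 1.35 used below is a typical assumed value for seawater at optical wavelengths, not a measured NEMO site property):

```python
import math

def cherenkov_angle_deg(n: float, beta: float = 1.0) -> float:
    """Cerenkov cone half-angle (degrees) for a particle of speed beta*c
    in a medium of refractive index n; emission requires beta > 1/n."""
    if beta * n <= 1.0:
        raise ValueError("below Cerenkov threshold")
    return math.degrees(math.acos(1.0 / (n * beta)))

# For a relativistic muon (beta ~ 1) in seawater (n ~ 1.35, assumed):
print(f"{cherenkov_angle_deg(1.35):.1f} deg")  # ~42.2 deg
```

Because this angle is essentially fixed for relativistic muons, the arrival times of the Cerenkov photons at an array of optical sensors constrain the muon track geometry, which is the basis of the reconstruction mentioned above.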
Three European Collaborations are independently working towards the "km³ telescope": ANTARES [10] in the Gulf of Marseilles (France), NESTOR [11] in the Western Ionian Sea (Greece) and, recently, the NEMO Collaboration in the Eastern Ionian Sea (Italy).
2. DEEP SEA ENVIRONMENT
At the beginning of its activity, the NEMO Collaboration selected four sites (Fig. 1) close to the Italian coasts that could be appropriate for the construction of the km³ astrophysical neutrino detector. The approximate coordinates of these sites are:

- 35° 50' N, 16° 10' E in the Ionian Sea, South-East of "Capo Passero"
- 39° 05' N, 13° 20' E in the Tyrrhenian Sea, North-East of Ustica Island
- 39° 05' N, 14° 20' E in the Tyrrhenian Sea, North of Alicudi Island
- 40° 40' N, 12° 45' E in the central Tyrrhenian Sea, South of Ponza Island
The choice of a suitable site for the telescope requires the knowledge of many oceanographic parameters. The main characteristics required are depth, proximity to the coast, good optical transmission, low sea-current intensity, low biological activity and low optical background.
Figure 1. Map of the Italian coasts. The sites selected by NEMO are marked: Capo Passero (circle), Ponza (triangle), Ustica (cross), Alicudi (square) and the Catania Test site (star).

The site has to be deep enough to filter out the down-going atmospheric muon background and allow the detection of up-going tracks (originated by up-going neutrino interactions below the detector). The development of computer codes for event simulation and reconstruction allows us to identify a genuine "down-going" muon as "up-going" with a probability lower than 10⁻⁵. At 3500 m below sea level, where the atmospheric muon flux is reduced by 5 orders of magnitude, we have the possibility to isolate events due to up-going neutrinos. The site has to be as close as possible to the coast. The data transmission to the on-shore laboratory, as well as the transmission of power from the laboratory to the off-shore detector, will be carried out via an electro-optical multi-fibre cable. Data transmission over distances of less than 100 km is possible, with DFB lasers, without the signal amplifiers that would imply a huge
increase of costs and difficulties in deployment. The electronics design must also take into account the difficulties in power transmission: about 10,000 phototubes, together with their read-out electronics, must be deployed and biased. The NEMO Collaboration is developing custom low-power electronics able to reach this goal. The optical properties of deep-sea water play a fundamental role in choosing the site. The detector effective area is not only determined by the extension of the instrumented volume but is also strongly affected by light attenuation in water. A muon track crossing the water at a distance of 50 m from the detector can still be observed, since the emitted photons have a non-vanishing probability of reaching the optical sensors. Two microscopic processes mainly affect the propagation of light in water: absorption and scattering. Light absorption directly reduces the number of photons reaching the detector; light scattering influences the measurement of the photon arrival time on the photon detectors, which is a fundamental parameter in track reconstruction. Both absorption and scattering have been measured at several sites. In the selected region the particulate concentration and its sedimentation must have very low values. Particulate in water increases light scattering and worsens the track reconstruction and the detector angular resolution. Sedimentation and fouling deposited on the sensitive parts of the photon detectors reduce the global detector efficiency. The site has to be "quiet", i.e. the water current has to be limited and sufficiently stable in order to avoid special requirements on the mechanical structure and the detector deployment. In low current intensity conditions the optical noise due to bioluminescence, often excited by variations of the water currents, is also reduced.
Measurements of other oceanographic parameters, such as salinity, density and temperature, have been carried out at several sites in different seasons, offering a useful map of the seasonal changes in the Western Ionian area. Optical noise measurements must also be carried out. Part of this optical noise is due to ⁴⁰K decay. This radioisotope is present with an abundance of 0.0117% in natural potassium (for a seawater salinity of 38.7 g/l). When ⁴⁰K decays, the emitted relativistic electron produces Cerenkov light and gives rise to an optical noise that can reach 50 kHz on a 10" diameter photomultiplier at the single photoelectron level. Optical noise can also originate from bioluminescence. Luminescent organisms belong to several kinds of marine species, ranging from bacteria and fungi to larger fish. The observation of this phenomenon with photon detectors able to study the pulse time characteristics will probably reveal the characteristic emission behaviour of individual biological light sources.
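As a rough cross-check of the ⁴⁰K rate (our back-of-the-envelope estimate, not from the paper; the potassium concentration of ~0.4 g/l for this salinity and the ⁴⁰K half-life of 1.25×10⁹ yr are standard handbook values):

```python
import math

# Rough estimate of the 40K decay rate per litre of seawater.
N_A = 6.022e23              # Avogadro's number [1/mol]
K_conc = 0.4                # potassium concentration [g/l], typical seawater
f_40K = 1.17e-4             # isotopic abundance of 40K (0.0117 %)
A_40K = 40.0                # molar mass of 40K [g/mol]
t_half = 1.25e9 * 3.156e7   # 40K half-life converted to seconds

n_atoms = K_conc * f_40K / A_40K * N_A       # 40K atoms per litre
activity = n_atoms * math.log(2) / t_half    # decays per second per litre
print(round(activity, 1), "Bq/l")            # roughly 12 Bq/l
```

Integrated over the large water volume seen by a 10" photomultiplier, this per-litre activity is consistent in order of magnitude with the tens-of-kHz single photoelectron rate quoted above.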
3. DEEP SEA MEASUREMENTS

3.1. Optical Measurements
Light transmission in water is characterised by the attenuation length, L_c, which is a function of the absorption length, L_a (i.e. the path a photon travels before being absorbed with probability P = 1 - 1/e), and the scattering length, L_b (which measures light diffusion on molecules and particulate). In order to measure attenuation and absorption, a compact device is commercially available: the AC-9 transmissometer manufactured by WetLabs. The device dimensions (68 cm height, 10.2 cm diameter) and its good accuracy in the measurement of a (the absorption coefficient) and c (the attenuation coefficient) are excellent for our purposes. The AC-9 performs the a and c measurements independently, using two different light paths, at 9 different wavelengths
(412, 440, 488, 510, 532, 555, 650, 676, 715 nm). In our experimental set-up (Fig. 2) the AC-9 is connected to a CTD (Ocean MK31, by IDRONAUT) able to record data on depth, water temperature and salinity. The set-up is connected, via telemetry, to a data acquisition system on the ship. During several cruises on board the Italian research oceanographic vessel URANIA we have been able to collect data at the Ponza site (October '98), at Capo Passero (January '99, February '99, August '99) and in the vicinity of the Matapan Abyss (Lat: 36°30' N, Long: 21°12' E, depth = 5000 m; January '99). In Fig. 3 we compare values of the absorption (a) and attenuation (c) coefficients measured in the area of Capo Passero to the pure water coefficients [12,13]. A preliminary analysis of the collected data shows that the optical properties of deep-sea water at the three sites are close to those of pure water.
Figure 2. Experimental set-up for optical measurements. The AC-9 (left) and the CTD (right) are clearly visible. The battery pack for power is hidden behind the CTD.
The measured absorption coefficient close to the Capo Passero site (at ~3000 m depth) amounts to a(440 nm) ≈ 0.014 m⁻¹, while the value of the attenuation coefficient at the same site is c(440 nm) ≈ 0.025 m⁻¹. Deep-sea water light transmission suffers from both Rayleigh (on molecules) and Mie (on particulate) scattering. We are, at present, developing an instrument to measure the angular distribution of the diffused light. The knowledge of this angular distribution will allow us to evaluate the light transmission length in water, strongly related to the effective attenuation coefficient a + (1 - ⟨cos θ⟩)b, where θ is the light scattering angle and b = c - a the scattering coefficient. For now, assuming for Mie scattering a conservative value of the average cosine of the diffusion angle (⟨cos θ⟩ ≈ 0.9), the blue light transmission length in Capo Passero reaches 60 m.
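The coefficients quoted above translate into characteristic lengths by simple inversion; a sketch of the arithmetic (the split c = a + b between absorption and scattering coefficients is the standard definition, and ⟨cos θ⟩ ≈ 0.9 is a conservative average cosine for forward-peaked Mie scattering):

```python
# Attenuation, absorption and effective transmission lengths at 440 nm,
# from the coefficients measured near Capo Passero.
a = 0.014                 # absorption coefficient [1/m]
c = 0.025                 # attenuation coefficient [1/m]
b = c - a                 # scattering coefficient [1/m], since c = a + b

L_a = 1.0 / a             # absorption length, ~71 m
L_c = 1.0 / c             # attenuation length, 40 m

# Forward-peaked Mie scattering barely disturbs photon trajectories, so
# an effective coefficient a + (1 - <cos(theta)>) * b is often used.
cos_theta = 0.9           # conservative average cosine (assumption)
L_eff = 1.0 / (a + (1.0 - cos_theta) * b)   # ~66 m

print(round(L_a), round(L_c), round(L_eff))
```

The ~66 m effective length obtained this way is consistent with the ~60 m blue-light transmission length quoted in the text.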
Figure 3. Measured light absorption and attenuation coefficients (defined as 1 over length) in Capo Passero (3000-3300 m), as a function of wavelength (solid symbols), compared to pure water (open symbols).
3.2. Current Measurements
Since July 1998 we have been carrying out a campaign of studies at the selected sites to obtain all the needed information. In the region of Capo Passero we have positioned a current-meter chain made of two Aanderaa RCM8 current-meters, placed 10 m and 100 m above the sea bottom respectively. The currents measured by the two instruments are in very good agreement. The deep-sea water current measured from August '98 to February '99 is quite stable in intensity (the average value is about 2.8 cm/s) and direction (the water flows from SE to NW and the average angle is 38° NW), as shown in Figure 4. Another set of measurements, carried out from February '99 to August '99, is under analysis; preliminary results show that average current intensity and direction have no seasonal changes.

3.3. Bio-fouling Measurements
Bio-fouling and sedimentation are also relevant parameters in choosing the site for a km³ detector. The ANTARES Collaboration has shown that biological growth on optical modules can reduce, in a few months, the transparency of upward-looking glass surfaces, so that light detection decreases significantly. In December 1999, at the Capo Passero site, NEMO will deploy a deep-sea station to measure sedimentation and the bio-fouling rate over a long-term period (about one year). A standard sedimentation trap, already deployed in August '99, will acquire information on particulate density and dimension. Sedimentation information will be integrated over a 1-month period. The bio-fouling rate will be evaluated by measuring the transparency of a glass sphere containing an array of photodiodes and illuminated by a blue LED light. The deep-sea station (Fig. 5) is also equipped with a CTD (IDRONAUT MK317) and a current-meter (Aanderaa RCM8) in order to relate the bio-fouling data to oceanographic parameters. An acoustic modem (Datasonics ATM-877) will allow us to transmit data to the sea surface at any time.
The station has been tested during an oceanographic campaign in August '99.
4. OTHER PLANS FOR THE NEAR FUTURE
The NEMO Collaboration is working to build, in a short time, a test site near Catania (see Fig. 1), easily reachable from the port of the town. The station will be devoted to testing, in a medium-depth sea environment (2000 m depth), optical modules, electronics, the data transmission set-up, and electrical and optical connectors. A 20 km long electro-optical submarine cable will connect the shore to the deep sea, where a splitter box will divide the cable into several lines. Three lines will be devoted to NEMO, one to a seismological monitoring station, in collaboration with ING (Istituto Nazionale di Geofisica, Italy), and the remaining ones could be used by interested institutions or industries supporting the experiment.
Figure 4. Deep-sea current velocity at the Capo Passero site (Lat: 36° N, Long: 15° E, depth: 3289 m) over 6 months (23 Aug 98 - 11 Feb 99). Average intensity = 2.75 cm/s; average angle = 38° NW.
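For orientation, the velocity diagram of Figure 4 plots east and north current components; the quoted averages convert as follows (our illustration, reading "38° NW" as 38° west of north):

```python
import math

# Decompose the average deep-sea current (2.75 cm/s, flowing towards
# 38 degrees west of north -- our reading of "38 NW") into the
# east/north components of the velocity diagram.
speed = 2.75                        # [cm/s]
angle = math.radians(38.0)          # measured from north, towards west

v_north = speed * math.cos(angle)   # ~2.2 cm/s northward
v_east = -speed * math.sin(angle)   # ~-1.7 cm/s, i.e. westward
print(round(v_north, 2), round(v_east, 2))
```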
Figure 5. The set-up for the bio-fouling study.
REFERENCES
1. J. Quinn et al., ApJ 456 (1996), L83; J. Quinn et al., Proc. 25th I.C.R.C. (Durban), Vol. 3 (1997), 249.
2. M. Takeda et al., Phys. Rev. Lett. 81 (1998), 1163.
3. M.A. Markov, Proc. 10th Int. Conf. on High Energy Physics, Rochester (1960), 579.
4. M.A. Markov and I.M. Zheleznykh, Nucl. Phys. 27 (1961), 385.
5. K. Greisen, Proc. Int. Conf. on Instrumentation for High Energy Physics, Berkeley (1960), 209.
6. V.S. Berezinskii and G.T. Zatsepin, Sov. Phys. Usp. 20 (1977), 36.
7. H. Blood, J. Learned, F. Reines and A. Roberts, in: H. Faissner (ed.), Proc. 1976 International Neutrino Conf., Aachen (1977), 688.
8. V. Balkanov, Proc. 26th I.C.R.C. (Salt Lake City), Vol. 2 (1999), 176; O. Streicher, Proc. 26th I.C.R.C. (Salt Lake City), Vol. 2 (1999), 192.
9. J. Kim et al., Proc. 26th I.C.R.C. (Salt Lake City), Vol. 2 (1999), 196; F. Halzen, Proc. 26th I.C.R.C. (Salt Lake City), Vol. 2 (1999), 428.
10. J.R. Hubbard et al., Proc. 26th I.C.R.C. (Salt Lake City), Vol. 2 (1999), 436.
11. S. Bottai et al., Proc. 26th I.C.R.C. (Salt Lake City), Vol. 2 (1999), 456.
12. R.M. Pope, Ph.D. Thesis, Texas A&M University, College Station, TX (1993).
13. H. Buiteveld et al., The optical properties of pure water, in: J.S. Jaffe (ed.), Ocean Optics XII, Proc. SPIE 2258 (1994), 174-183.
Part III
Marine technologies for deep-sea observatories
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century. L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Deep Sea Challenges of Marine Technology and Oceanographic Engineering

G. Clauss and S. Hoog
Institute of Naval Architecture and Ocean Engineering, Technische Universität Berlin, Germany
Symbiosis is the close relationship between members of different species from which both derive some advantage. Good and reliable relationships between competent partners yield the best and longest-lasting advantages for both sides, not only in animal or plant symbiotic structures but also in symbiotic relations in the technical world of marine sciences and marine technology. A challenging focus in marine science lies in the possibility of gaining multidisciplinary geophysical and oceanographic data from long-term observations on deep-sea plains. These data should be analysed in near real-time. The conquest of the deep seas is a present task and future challenge for marine science and offshore technology. Consequently, some impulses for symbiotic structures between marine sciences and offshore technology in the field of deep-sea data acquisition and observation are identified in this paper.
1. INTRODUCTION

As in the maritime living space, marine science has an intensive, symbiotic relationship with marine technology. Marine science is a methodically oriented branch with major fields like meteorology and physical oceanography as well as marine chemistry, biology, geology and geography. Marine technology is related to maritime engineering, with major fields like naval architecture, coastal engineering, mining, process technology, materials technology, electronics and food technology. Marine technology, as a problem-oriented branch, also traditionally delivers the technical support for marine science applications. This co-operative field of marine technology is called "oceanographic engineering", and aims at the development of equipment, methods and procedures for all kinds of oceanographic and meteorological studies in an environmentally protective manner. One major task is the exploration and exploitation of the deep seas, widely supported by "man in the sea" operations in shallow waters and by Remotely Operated Vehicles (ROVs) in deep waters. With GEOSTAR [1], a new specialized ROV with high payload capacity and a multidisciplinary, autonomous benthic station is introduced in this paper.
2. THE DEEP-SEA FRONTIER IS DESCENDING
The recent increase of crude oil prices on the world market allows oil companies to develop even marginal and deep-sea hydrocarbon deposits. Fig. 1 illustrates the latest deep-water records of offshore production facilities in the North Sea, the Gulf of Mexico (GoM) and the Campos Basin in Brazil. Other promising deep-sea areas are found in the waters off West Africa and Southeast Asia / Australia. The related technical developments are based both on the intensive and consistent redesign and adaptation of existing oil production systems and on the invention of new concepts. Of course, environmental protection is a major condition.
Figure 1. Increase in water depth of offshore production facilities [3].
An overview of currently established oil production structures in the deep sea reveals three major development lines [2]:

- Tension Leg Platforms (TLPs), e.g. the Ursa TLP reached a water depth of 1190 meters in the GoM in 1998
- Sub-sea completions, e.g. an FPSO (Floating Production, Storage and Offloading vessel) has been operating at a depth of 1863 meters in the Roncador field (Campos Basin) since 1998
- SPAR buoys with surface trees, e.g. a SPAR buoy started to operate in the Diana field (GoM) at a water depth of 1420 meters in 1999.
Worldwide leasing activities for ultra-deep-sea areas are promising and will further enhance this challenging trend.
2.1. Going into the deep in the offshore oil and gas industry

2.1.1. Manifolds and Christmas Trees
An essential issue for the successful development of deep-sea oil deposits is the new sub-sea completion technology. Figure 2a shows the deployment of the central manifold for the Troika field in the GoM. It looks like a giant cage. A manifold ties together the fluid flows produced from satellite wells and pumps them up to the storage vessel or directly into an export pipeline network. A key technology is the ROV (Remotely Operated Vehicle), used for guidelineless deployment, installation and maintenance of sub-sea structures. This technology comprises the deployment and connection of risers and control umbilicals at "easy access interfaces" between the sub-sea structures. Christmas Trees, as shown in Fig. 2b, are integral components of sub-sea developments. A great number of these trees cover a hydrocarbon deposit, allowing simultaneous production from up to 59 satellite wells (Asgard field, North Sea) [2]. Christmas Trees provide blowout prevention, pressure and temperature monitoring as well as data read-back from downhole instrumentation through control umbilicals. Via umbilicals, some of these parameters are interactively controlled from the central producing vessel. In addition, electrical power is transmitted to the sub-sea installations. The entire sub-sea development must be especially prepared and proven to be remotely operable. For this reason the development of standardized, or universal, interfaces and automated, task-based intervention methods is necessary to advance to the next step. Enhancements in the ROV field as well as in the field of remote, online data acquisition will lead to safe and cost-effective sub-sea operations, improving the deep-sea activities of the oil companies.

2.1.2. Sub-sea developments cover huge hydrocarbon deposit areas
The size and depth of hydrocarbon deposits as well as water depth and environmental conditions are initial criteria for choosing the optimum technology for development. Figure 2c gives an impression of these criteria. Crude oil or gas production from marginal fields or deep-sea areas needs the application of all the deep-sea technologies mentioned previously. It includes drilling with semi-submersibles or drill ships, production with FPSOs, SPARs or similar vessels, and crude oil transfer with shuttle tankers or pipelines. The production vessel is held on location with a mooring system or by dynamic positioning. Flexible or steel risers provide the necessary connections to the sub-sea equipment on the ocean floor. Satellite wells are spread over the whole deposit area. The dimension of the covered sub-sea area can easily be several hundred square kilometers, while the monitoring
Figure 2. (a) Deployment of a sub-sea manifold, Troika field, GoM (© BPAmoco). (b) Christmas tree, Marlim field, Campos Basin, Brazil (© Petrobras). (c) Schematic of possible sub-sea developments using FPSOs and semi-submersibles (© BPAmoco).
and control of any fluid flow parameter is possible thanks to bi-directional umbilicals. Thus, the online connection to different satellite stations is a key factor for the safe and effective operation of these "out of human reach" facilities.

2.2. Going into the deep in marine sciences
Installation and maintenance of sub-sea developments in the offshore oil and gas industry are major tasks for ROV interventions, which has led to an increasing role of ROV technology in the deep seas. Due to increasing demands on the capability, productivity and reliability of remote intervention tools and techniques, ROV technology is continuously improving. Key issues of future developments are high depth ratings, equipment modularity, reduced size, universal interfaces and automation of intervention tasks. Currently about 90 companies and institutions worldwide are involved in the development of the next ROV generation, which has led to about 300 different ROV types and almost 3000 units presently on the market [4]. Although these numbers imply a high grade of coverage of most demands, the market for specialized and prototype ROVs for the relatively low budgets of marine scientific programs is still growing. A recent example is the development of the MODUS ROV, which has been developed in co-operation with these authors.

2.2.1. GEOSTAR: A new toolset for marine sciences deep-sea operations
Funded by the EU in the framework of the Marine Science and Technology programme (MAST3), GEOSTAR (GEophysical and Oceanographic STation for Abyssal Research) is a current example of a successful co-operation between marine science and marine technology. Participating companies and institutions come from Italy (ING, Tecnomare), France (Ifremer, LOB, IPG, Orca) and Germany (TU Berlin, TFH Berlin).
The GEOSTAR system consists of two main components: the deep-sea intervention tool named MODUS (MObile Docker for Underwater Sciences), designed and built at the University of Applied Sciences (TFH) Berlin and the Technical University Berlin (TUB), and the innovative deep-sea observatory, designed and built in Italy and France. This benthic station is capable of performing autonomous, long-term geophysical, geochemical and oceanographic observations at abyssal depths. It is deployed, recovered and serviced by the MODUS via an electro-mechanical umbilical-winch-ship system. The concept of GEOSTAR was evaluated during the European ABEL and DESIBEL projects (MAST 2) in the early 90s. As one result, the MODUS was specified to be able to frequently deploy and recover heavy payload stations at abyssal depths. A successful demonstration mission in August 1998 in the Adriatic Sea proved the basic functionality of the GEOSTAR system in shallow waters (GEOSTAR phase 1: water depth 40 m). The second phase of GEOSTAR (GEOSTAR 2) has led to the overall upgrade of the main components, the MODUS and the benthic laboratory (up to 4000 m water depth). Utilizing the services of the Italian R/V Urania in September 2000, sea trials of the upgraded system proved the GEOSTAR deep-water capabilities and led to the successful deployment of the benthic station at 2000 m water depth in the Tyrrhenian Sea near Ustica Island, north of Sicily (Fig. 3). MODUS is responsible for the exact positioning and retrieval of the benthic station at/from the seafloor: Figure 3 shows the enhanced deep-sea version of MODUS latched with the frame of the benthic station. The 'catch' operation is aided by a 2 m diameter wide-open funnel, which fits exactly on top of the benthic station. A flexible pin provides the
electromechanical connection. Ascent and descent operations down to 4000 m are managed by the ship-mounted winch and umbilical. This umbilical supports the vertical movements and provides power and data transmission to the submerged devices. Horizontal movements are driven by brushless DC thrusters (2x700 N thrust) with the aid of a ship-mounted differential Global Positioning System (dGPS) and an Ultra-Short-Base-Line (USBL) system, while the final approach to the station is supported by sonar and a video system mounted on the MODUS. To stabilize the horizontal position adjustment during coupling procedures, two vertically oriented thrusters are installed. All remote operations and data transfers are monitored and stored on videotapes or log files supplied by the operation control unit on board the ship.

Figure 3. GEOSTAR system with umbilical-tethered MODUS ROV and the benthic station (left: dummy, right: prototype), with the A-frame of R/V Urania. Tyrrhenian Sea, September 2000.
Once deployed at its location, the benthic station is capable of staying on the ocean floor from 6 months up to one year. During this time the extensive multidisciplinary sensor equipment, like seismometers, scalar and vectorial magnetometers, a gravity meter, an ADCP (Acoustic Doppler Current Profiler, 300 kHz), a CTD (Conductivity-Temperature-Depth probe), a transmissometer, a hydrophone, an electrochemical package (pH, H2S) and a water sampler (48x500 ml), autonomously surveys the location. The frame of the station also hosts all the equipment necessary for the missions, such as the battery pack, the Data Acquisition and Control System (DACS) and a communication system. The main characteristics of the modular GEOSTAR system are reported in Table 1.
Table 1
Properties of the GEOSTAR 2 system main components

                        MODUS                              Benthic Station
Purpose                 Umbilical-driven frequent          Long-term missions (6-12 months)
                        operations                         in deep ocean waters
Material                Aluminium / Stainless Steel /      Aluminium / Stainless Steel /
                        Titanium (pressure boxes)          Titanium (pressure boxes)
Weight in air (kg)      1070                               2942
Weight in water (kg)    730                                1416
Total length (mm)       2878                               3500
Total width (mm)        2348                               3500
Total height (mm)       1700 (without cable                2900 (without docking pin and
                        termination)                       magnetometer booms)
Thrust lateral (N)      2x700                              -
Thrust vertical (N)     2x700                              -
Depth rated (m)         4000                               4000
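To give a feeling for the autonomy requirement of such long-term missions, a rough data budget can be sketched; the channel counts and sampling rates below are purely illustrative assumptions, not GEOSTAR specifications:

```python
# Back-of-the-envelope data budget for a 6-month autonomous mission.
# The channel list follows the sensor suite described above; all rates
# are invented illustrative values, not GEOSTAR specifications.
channels = {
    # name: (channels, samples_per_second, bytes_per_sample)
    "seismometer":   (3, 100.0, 3),
    "magnetometers": (4, 1.0, 3),
    "gravity_meter": (1, 1.0, 4),
    "ADCP":          (40, 1.0 / 60, 2),   # 40 depth cells, 1 profile/min
    "CTD":           (3, 1.0 / 10, 2),
}

seconds = 183 * 24 * 3600                 # ~6 months
total_bytes = sum(n * r * b * seconds for n, r, b in channels.values())
print(round(total_bytes / 1e9, 1), "GB")  # ~15 GB, dominated by the seismometer
```

Even with invented rates, the exercise shows why continuous broadband channels (seismometer, hydrophone) dominate storage and why acoustic telemetry can only carry summaries, not the raw stream.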
The acquired data are transmitted via pop-up buoys and, in addition, by acoustic telemetry. For near real-time transmission the use of a dedicated mooring system and the INMARSAT infrastructure is foreseen. Due to its moderate dimensions and the need for only three operators for deployment and recovery, the MODUS ROV is a very cost-effective tool. In addition, two persons are required to check the functionality of the sensors and to start operation of the benthic laboratory. For reliability and cost reduction reasons, the GEOSTAR system uses deep-sea proven "off the shelf" components like thrusters, sonar, altimeter, underwater lights and cameras, and acoustic releases. Even taking advantage of the multitude of theoretical and experimental investigations for "standard" ROVs published in the literature over 25 years, the design process for a special configuration still has to face both the environmental and especially the inner space requirements and constraints of the specific application. In various project stages the hydrodynamic and hydroelastic behaviour of the tethered GEOSTAR system has been investigated and further developed [5-7]. Computational Fluid Dynamics (CFD) analysis is utilized to optimize structure-induced drag losses of MODUS with respect to power consumption and thruster arrangement, while simulations of the system dynamics are performed to prevent slack conditions with dangerous snap loads during operation. The analysis comprises various research ship headings relative to the prevailing sea during the many-hour deployment and recovery procedures. The response of the ship to sea spectra (calculated with the Response Amplitude Operators (RAOs) of the ship) is the input of a nonlinear dynamic time-domain analysis of the motions of the submerged system, with umbilical, ROV and benthic station at its end. Simulations for the operation with R/V Urania result in a limiting significant wave height of Hs = 0.5 m for paying out 3500 m of cable in beam seas. Different values are obtained for shorter cable lengths and different ship-wave headings [5].
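The slack/snap-load check can be illustrated with a minimal one-degree-of-freedom lumped model (entirely our sketch with invented parameter values, not the authors' analysis [5-7]): the payload hangs on an elastic cable whose upper end follows the ship's heave, and any instant where the computed cable tension would go negative marks a slack event, followed by a snap load on re-tensioning:

```python
import math

# Minimal heave model of a payload on a long elastic cable whose upper
# end follows the ship motion; tension < 0 would mean a slack cable.
# All parameter values are invented examples, not GEOSTAR design data.
m = 3000.0             # wet mass of ROV + station [kg]
L = 3500.0             # paid-out cable length [m]
EA = 5.0e6             # axial stiffness of the umbilical [N]
k = EA / L             # equivalent spring constant [N/m]
c_d = 2000.0           # linearised hydrodynamic damping [N s/m]
g = 9.81
w, amp = 0.8, 1.5      # heave frequency [rad/s] and amplitude [m]

dt = 0.01
z, v = -m * g / k, 0.0          # start from the static sag, at rest
t_min = float("inf")
for i in range(int(120.0 / dt)):          # simulate 120 s
    top = amp * math.sin(w * i * dt)      # imposed top-end motion
    raw_tension = k * (top - z)           # tension if the cable stays taut
    t_min = min(t_min, raw_tension)
    tension = max(raw_tension, 0.0)       # a cable cannot push
    a = (tension - m * g - c_d * v) / m
    v += a * dt
    z += v * dt

slack = t_min < 0.0
print("minimum cable tension: %.1f kN, slack: %s" % (t_min / 1e3, slack))
```

Sweeping the heave amplitude, frequency and cable length in such a model is the simplest way to see how a limiting significant wave height for a given pay-out length arises.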
This short excursion should only give an idea of the global considerations made during the development of the GEOSTAR system. The dynamics of MODUS is similar to that of other vertically umbilical-tethered systems like caged ROVs, stand-alone CTDs, drill strings, water sampling rosettes and piston corers, and therefore the analytical approach is highly developed (e.g. [8-10]). In the next section the paper focuses on some specific aspects concerning the ability of the modular GEOSTAR system to be part of a symbiotic structure with "conventional" offshore sub-sea facilities like manifolds, templates and satellite wells.
3. SYMBIOSIS IS POSSIBLE
Symbiotic structures require the consensus between two partners having equal fights. The knowledge of each starting point allows articulating common goals. Consequently, our suggestions concerning symbiotic structures in the field of marine science and offshore technology are based upon the technical and scientific status quo in the outgoing 20 th and beginning 21 st century described in this paper. The possibility to combine state-of-the-art deep-sea data collection systems for long-term applications with sub-sea structures, which are presently used in the offshore industry, is convincing. As a first approach, the main developed during the GEOSTAR project the MODUS ROV, the benthic laboratory with its deep
Figure 4. Sketch of symbiosis scenarios between sub-sea oil and gas production facilities and multidisciplinary benthic laboratories (here: Symbolic open frames of GEOSTAR lab and MODUS).
141 ocean proved sensor modules (or other systems) and the Data Acquisition and Control System (DACS) are proposed for the following applications: Deployment of specifically equipped benthic stations for exploration purposes at predetermined locations to evaluate time dependent environmental data starting before exploitation activities (durability: autonomous up to one year); deployment, service and recovery of a variety of different bottom stations with just one MODUS ROV. -
Deployment of benthic stations to a location nearby an already operating sub-sea development for various purposes, e.g. for environmental quality control or monitoring of actual working conditions for service ROVs (durability: not terminated due to direct power supply and online data connection via umbilicals with sub-sea structures); deployment, service and recovery of the benthic stations with the MODUS ROV.
- Installation of the newly developed and deep-sea-proved GEOSTAR sensor packages, including the DACS, on existing or new sub-sea developments as standard data collection units (durability: same as the hosting structures, owing to direct power and data interconnection); service and recovery with the aid of conventional maintenance ROVs.
- Interactive real-time Internet connection to the DACS for data retrieval and/or reconfiguration of mission control; access to all available data both for the oil and gas company in charge and for the marine science community.
The range of collectible data meets the increasing demand of marine scientists and offshore engineers for reliable long-term site surveying and observation, and for a better understanding of deep-ocean processes. As sub-sea developments in offshore hydrocarbon production are already continuously controlled and managed, data collection and real-time transmission for marine science and offshore technology purposes offers a great opportunity for safe and cost-effective symbiotic structures, now and in the near future.
4. CONCLUSIONS
The status quo in offshore technology offers opportunities for a wide range of national and international co-operation on geological and oceanographic data collection systems. The deep-sea frontier keeps descending: the latest water-depth records in the exploration and production of hydrocarbons point to further developments. Long-term data collection from the deep seas offers a reliable basis for geological and oceanographic archives and for improved sub-sea designs. Feasibility studies for symbiotic structures should be promoted to guarantee maximum sensor performance and undisturbed operation of sub-sea facilities. Symbiotic structures for marine science and offshore technology promise to be safe, reliable and cost-effective, and to comply strictly with safety and environmental standards.
ACKNOWLEDGEMENTS

The GEOSTAR project is carried out under contracts EU/DG XII MAST III-CT95-0007 (GEOSTAR 1) and MAST III-CT98-0183 (GEOSTAR 2). The present partnership includes: Istituto Nazionale di Geofisica (Coordination) and Tecnomare SpA (Italy), University of Applied Sciences Berlin and Technical University Berlin (Germany), IFREMER, Orca
Instrumentation, Laboratoire d'Océanographie et de Biogéochimie - CNRS and Institut de Physique du Globe de Paris (France).
REFERENCES
1. GEOSTAR, EU/DG XII MAST III-CT98-0183, 1999.
2. Offshore Technology, Internet: www.offshore-technology.com, Net Resources International Ltd, London, UK, 2000.
3. G.F. Clauss, Herausforderungen und Innovationen der Meerestechnik, Yearbook of the Schiffbautechnische Gesellschaft STG, Hamburg, 2000.
4. J. Westwood, What the Future Holds for ROVs and AUVs, UnderWater Magazine, article reprint May/June 2000, Internet: www.diveweb.com/rov/, 2000.
5. M. Manenti, GEOSTAR 2: Installation Procedure Preliminary Dynamic Analysis, Project Report, Tecnomare, Italy, 2000.
6. G. Clauss and S. Hoog, Challenges and Innovations in Marine Technology, slide presentation, 16th Course of the International School of Geophysics: Science-Technology Synergy for Research in the Marine Environment, Erice, Sicily, 1999.
7. F. Gasparoni, P. Favali, G. Smriglio, H. Gerber, G. Clauss, J. Marvaldi, D. Fellmann, C. Millot, J.F. Montagner and M. Marani, Results and Perspectives from the First Mission of the European Abyssal Observatory GEOSTAR, contribution to Oceanology International 2000 Conference, Brighton, UK, 2000.
8. F.R. Driscoll, R.G. Lueck and M. Nahon, Development and Validation of a Lumped-Mass Dynamics Model of a Deep-Sea ROV System, Appl. Ocean Res., 22 (2000) 169-182.
9. F.S. Hover, M.A. Grosenbaugh and M.S. Triantafyllou, Calculation of Dynamic Motions and Tensions in Towed Underwater Cables, IEEE J. Oceanic Eng., Vol. 19, No. 3, July 1994.
10. M. Caccia, G. Indiveri and G. Veruggio, Modeling and Identification of Open-Frame Variable Configuration Unmanned Underwater Vehicles, IEEE J. Oceanic Eng., Vol. 25, No. 2, April 2000.
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
From ABEL to GEOSTAR: development of the first European deep-sea scientific observatory
F. Gasparoni, D. Calore and R. Campaci
Tecnomare S.p.A., San Marco 3584, 30124 Venezia, Italy
Since the adoption of the Marine Science and Technology programme in 1989, marine research in the European Union has grown substantially, contributing significantly to the challenge of understanding the deep-sea environment and its processes. Deep-sea observation and monitoring depend on the availability of techniques to operate at the sites efficiently, on several temporal and spatial scales and according to a variety of scientific objectives. Autonomous systems hold the greatest promise for widespread use. They are experiencing rapid technological development, and the research activities performed by a number of companies and organisations in the last decade demonstrate that it is now possible to overcome some of their most significant limitations. It is now possible to introduce working prototypes and to begin thinking about future routine applications of autonomous systems. In the present paper, we describe the recent progress of the first European deep-sea scientific observatory. First, we introduce what has been a pioneering feasibility study, awarded by the European Commission to bridge the gap between the practice of offshore technology and the possibility of developing engineered technologies for scientific investigation: the ABEL study. Second, we illustrate the main aspects of the GEOSTAR project, financially supported by the European Commission in the context of the MAST-3 programme, which demonstrated the validity of the proposed approach in practice by successfully developing and operating the first prototype of an innovative multidisciplinary deep-sea observatory.
1. ABEL
In 1991, the European Commission (in the context of the MAST-2 programme) promoted a feasibility and economic study aimed at identifying the scientific requirements, the possible technological solutions and the opportunities for the development of an Abyssal BEnthic Laboratory (ABEL). The basic requirement of the study can be summarised as follows: ensure the possibility to carry out scientific, multidisciplinary, autonomous observations and experiments, in situ, at water depths up to 6000 m. Carried out between 1992 and 1993, the study examined a variety of benthic systems and proved the feasibility of a benthic laboratory concept able to operate in both autonomous and controlled mode for periods of several months up to one year. The
configuration proposed by Tecnomare [1,2,3] was a network of co-operating stations that could be reconfigured according to the specific mission requirements. The system, shown in Fig. 1, included a main station (the "Benthic Investigation Laboratory") devoted to the execution of the most complex tasks, a number of secondary fixed stations ("Satellite Stations") acting as nodes of the measuring network, and a mobile vehicle ("Mobile Station") that extends the capabilities of the fixed stations by enabling surveys in the investigation area and interaction with the fixed stations. The mobile vehicle concept has been particularly promoted by the Woods Hole Oceanographic Institution (WHOI) with the construction of the Autonomous Benthic Explorer (ABE). ABE can reach depths of up to 4500 m, carrying out long-term experiments and tests. Its range varies between 10 and 100 km, depending on the batteries, and it reaches a speed of up to 2 knots. The ABEL architecture also included a dedicated module for the deployment and recovery of the various stations, as well as a shore station. The ABEL concept represents the deep-sea-bed equivalent of a multidisciplinary meteorological or geophysical laboratory. Significant analogies may also be identified with respect to the on-going preliminary design studies of planetary stations [4].
Figure 1. Abyssal Benthic Laboratory concept (sketch showing the land station, DP vessel, sea surface, sea bottom and investigation area).
The approach used to overcome the technical limitations of the systems traditionally used to conduct scientific research at the deep-sea bottom, and to assign priorities to technological development, was based on three key elements: (a) extension of the capabilities of the instrumented benthic stations, to ensure adequate support to the multiplicity of scientific packages foreseen; (b) surface-assisted deployment and recovery by means of a dedicated tool, to enable accurate and reliable positioning at the seafloor, as well as re-entry on the stations for maintenance tasks and recovery at the end of the mission; (c) maximum interaction with the operator during all phases of the mission, to allow remote mission management (scientific data recovery, housekeeping data recovery) without requiring marine operations. In parallel, other studies and activities were carried out at the European level, aimed at defining needs and expectations for long-term investigations at abyssal depths [5] and at investigating methods for deployment and intervention on future benthic stations [6]. As an example, within the DESIBEL study four concepts were investigated, namely:
- an active docking system with a mobile hook (LOMOS);
- an active docking system with a special ROV (REMORA);
- a light scientific ROV (ROV 6000);
- a free-swimming vehicle (FREE MODULE).
For each concept, engineering studies were conducted, as well as cross-comparisons based mainly on simulations of a variety of operational conditions. In particular, the LOMOS concept proved the most suitable solution when heavy payloads have to be managed, such as the advanced benthic stations identified in the ABEL study.
Moreover, several international conferences and workshops [7,8] reconfirmed the need to "join forces" to achieve, even with different scientific objectives, the realisation of multidisciplinary sea-bottom observatories and to extend to a global scale the existing land-based networks of permanent observatories of Earth processes [9]. It was argued that such an approach would allow a significant overall cost reduction and contribute significantly to the development of a new generation of "carriers" of scientific packages, which are indeed required to advance the present understanding of a variety of Earth processes.
2. GEOSTAR
To proceed further along the development line traced by the ABEL study, eight scientific and technological European organisations (Istituto Nazionale di Geofisica e Vulcanologia, Tecnomare, CNR - Istituto per la Geologia Marina, Technische Universität Berlin, IFREMER, ORCA Instrumentation, University of Newcastle upon Tyne, Thetis GmbH) joined their efforts in the GEOSTAR project (GEophysical and Oceanographic STation for Abyssal Research), aimed at developing the prototype of an innovative deep-sea benthic observatory able to carry out long-term geophysical, geochemical and oceanographic observations at abyssal depths (4000 m water depth). This project, successfully proposed to the first call of the MAST-3 programme, can be considered the first realisation of the Abyssal Benthic Laboratory concept. The GEOSTAR prototype was designed around a "scientific reference mission" to be conducted at a significant site in the deep Mediterranean Sea [10,11]. The tasks to be accomplished by GEOSTAR are:
- continuous measurement of seismic and magnetic parameters;
- continuous measurement of the 3D current profile in the Bottom Boundary Layer;
- continuous measurement of physical and geochemical water parameters in the Bottom Boundary Layer.
Conceived as a tool open to a variety of applications, including the possibility of being reconfigured for different missions and different sites both near the coast and in the open sea, GEOSTAR (Fig. 2 and [12,13]) is composed of four main subsystems:
- the Bottom Station (BS): the deep-sea observatory, hosting the scientific packages, data acquisition, data storage, energy supply and communication equipment;
- the Scientific Packages: the payload specific to the mission (see Table 3);
- the Communication System (CS): based on expendable data capsules (the so-called ARGOS Messengers) and an Acoustic Telemetry Link, ensuring the interface with the operator during all phases of the mission;
- the Mobile Docker (MD): the tool for BS deployment and recovery, derived from the LOMOS concept.
Figure 2. GEOSTAR concept. Cable connection is an option not implemented in the prototype version.
2.1. Bottom Station
The GEOSTAR Bottom Station (BS) is a stand-alone autonomous unit, based on a four-legged frame supporting all the scientific packages and the mission equipment (such as the data acquisition and control system, the battery pack and the communication systems). The station is characterised by specific technological solutions introduced to enable the execution of long-term deep-sea missions, among them:
- a lightweight and modular frame;
- active devices for the deployment of specific packages from the station;
- dedicated data acquisition and control hardware;
- autonomous mission control capabilities, including power management and self-diagnostics;
- the possibility of being reconfigured according to different mission requirements;
- multiple possibilities of interfacing with external devices (communication systems, deployment system) for continuous control of the system status both during the deployment phase and during the mission.
The design of the station (Fig. 3 and Table 1) takes into account the specific requirements of the various scientific packages as regards the physical "mounting constraints" and the minimisation of possible electromagnetic interference. These requirements apply in particular to the magnetometers and the seismometer, whose specific requirements do not allow direct mounting on the Bottom Station frame, as the resulting measurements would be affected by noise and disturbances induced by the station itself and by the devices mounted on it. The glass spheres housing the magnetometers have been mounted at the ends of two long booms (approx. 2.5 m) hinged at two opposite corners of the Bottom Station frame. These booms are extended, once at the seabed, on a command applied by the surface operator; this ensures that the sensors are located far from the Bottom Station structure and from possible sources of electromagnetic noise.
Another "active device" is used to deploy the seismometer, in order to release it from the station and put it in direct contact with the seafloor. Again, this actuation is commanded by the surface operator. The actuation systems have been designed and manufactured specifically for this application; basically they consist of modified acoustic releases directly interfaced to the Bottom Station electronics. Automatic levelling systems have also been developed for all the sensors (such as the seismometer) which are sensitive to the tilt of the station.

Table 1. GEOSTAR Bottom Station
Dimensions: 3500 x 3500 mm (overall), height 2900 mm
Weight in air: 2160 kg (GEOSTAR 1 version); 2942 kg (GEOSTAR 2 version)
Weight in water: 1055 kg (GEOSTAR 1 version); 1416 kg (GEOSTAR 2 version)
Material: Aluminium 5083 (frame), Titanium grade 5 (vessels), Stainless Steel (docking pin)
Water depth: 4000 m
Figure 3. GEOSTAR Bottom Station frame; the magnetometer boom on the left of the Bottom Station is shown in extended position. Mobile Docker visible in the background.
The GEOSTAR mission is managed by a dedicated Data Acquisition and Control System (DACS), designed to meet the following main requirements: (a) power consumption as low as possible, given the limited energy supplied by the battery pack; (b) communication channels with high redundancy, given the number of instrumentation packages and peripheral devices the Bottom Station can potentially host; (c) expandability, to support new packages to be added in future configurations; (d) high data storage capacity, for the amount of data produced by the instruments, particularly the seismometer; (e) self-diagnostic monitoring, to give the system a certain degree of "fault tolerance" and increase its intrinsic reliability. Low power consumption is obtained by means of hardware and software solutions. The hardware solutions are:
- use of "CMOS technology";
- adoption of "electronic switches", allowing each device to be powered on only when required by the user or by the mission schedule;
- use of high-efficiency "DC/DC converters" to make the different supply voltages available to the external devices (self-diagnostic sensors or scientific packages).
The software contributes to the power saving by keeping the microprocessors in "sleep mode" during the waiting periods between data acquisition, data exchange or management of commands from the surface unit. Moreover, self-diagnostic sensors, hard-disk drives and serial communication drivers are powered on only when necessary. Modularity and expandability are ensured by a "distributed processing architecture" based on several intelligent boards, each devoted to a specific set of tasks. In the present GEOSTAR configuration, one board is completely dedicated to the management of the communication devices and self-diagnostics, whereas two boards manage data acquisition from the scientific packages.
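As a back-of-the-envelope sketch of this duty-cycling strategy, the daily charge budget can be estimated by summing an always-on idle draw and the switched loads. The 180 mA idle figure is taken from Table 2; the device names, currents and sampling schedules below are purely illustrative assumptions, not actual GEOSTAR figures:

```python
# Illustrative daily charge budget for a duty-cycled acquisition system.
# Idle draw (180 mA) is from Table 2; all device figures are assumptions.
def daily_charge_mAh(devices):
    """devices: list of (name, current_mA, seconds_on_per_day) tuples."""
    return sum(i_mA * s / 3600.0 for _, i_mA, s in devices)

idle_mA = 180.0                          # IDLE-mode draw (Table 2)
duty_cycled = [
    ("CTD",       60.0, 24 * 60),        # assumed: 1 min sampling per hour
    ("ADCP",     250.0, 24 * 120),       # assumed: 2 min ensemble per hour
    ("self_diag", 40.0, 4 * 30),         # assumed: 30 s check, 4 times a day
]
total_mAh = idle_mA * 24.0 + daily_charge_mAh(duty_cycled)  # about 4545 mAh/day
```

The point the figures make is that with aggressive switching the always-on electronics, not the instruments, dominate the budget, which is why the sleep-mode and electronic-switch measures above matter.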
Each board makes available at least 6 "full duplex" serial links, 15 digital output lines (4 of them also configurable as "opto-isolated" inputs) and 6 "TTL" lines, each configurable as digital I/O, "UART", input capture, input transition counter or "PWM". Two further dedicated boards manage mission data logging on the hard disks. Besides that, on each board a "black box" has also been implemented (based on flash memory cards) to log the main events and store the short daily status and data files produced during a mission. Another essential aspect of the management of long-term missions in the deep-sea environment is the capability to monitor the integrity and internal status of the system itself; for this purpose a set of dedicated sensors and self-diagnostic tasks has been implemented. The monitored parameters are:
- Battery Vessel Water Detector;
- Battery Vessel Temperature;
- Battery Vessel Voltage;
- Battery Vessel Current;
- DACS Vessel Water Detector;
- DACS Vessel Temperature;
- Bottom Station Pitch;
- Bottom Station Roll;
- Distance from seabed (during deployment), settlement (while at seabed).
The analog and digital signals supplied by the status sensors are conditioned by two dedicated boards, powered on only during the self-diagnostic monitoring phase. Another peculiar aspect of GEOSTAR is data management. Owing to the limited resources available inside the station (energy, volume and mass memory), the need to enable periodic checks during the mission, and the need to make available after recovery a comprehensive set of data for the subsequent analysis work, a data management strategy [14] has been implemented in the Bottom Station and in the dedicated Surface Unit.
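The self-diagnostic task essentially compares each monitored parameter against a safe range and flags out-of-range values. A minimal sketch of such a check is given below; the limit values and key names are invented for illustration and are not the actual GEOSTAR thresholds:

```python
# Sketch of a self-diagnostic range check; all limits are illustrative
# assumptions, not actual GEOSTAR thresholds.
LIMITS = {
    "battery_vessel_water":  (0, 0),       # any reading above 0 means a leak
    "battery_vessel_temp_C": (-2.0, 45.0),
    "battery_voltage_V":     (21.0, 29.0),
    "battery_current_A":     (0.0, 4.0),
    "dacs_vessel_water":     (0, 0),
    "dacs_vessel_temp_C":    (-2.0, 45.0),
    "pitch_deg":             (-5.0, 5.0),
    "roll_deg":              (-5.0, 5.0),
}

def check_status(readings):
    """Return the list of parameters outside their safe range."""
    return [k for k, v in readings.items()
            if not (LIMITS[k][0] <= v <= LIMITS[k][1])]
```

In an embedded implementation the resulting alarm list would drive the "fault tolerance" logic, e.g. triggering an out-of-schedule Messenger release on a major failure.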
2.2. Scientific Packages

The Scientific Packages hosted during the GEOSTAR 1 and GEOSTAR 2 missions are listed in Table 3. A number of advantages can be ascribed to their characteristics, for example their intrinsic ability to operate in deep water and harsh environments. Concerning the interface requirements, all sensors have been designed for minimum power consumption and in such a way as to operate and provide information to the DACS upon request by the DACS itself.

2.3. Communication System

To make information on the state of the station and on the collected data available during the mission, a dedicated Communication System has been developed in the project [15], based on a set of releasable capsules, called "Messengers", able to transfer data by satellite telemetry (ARGOS) once they arrive at the sea surface. Two types of Messengers are used:
- Expendable Messengers, released periodically (depending on the mission duration) or under particular conditions (i.e., in case of major failure detection in the BS); they can store up to 48 kbytes of data;
- Storage Messengers, released on external request (e.g., by operators on a ship) and storing up to 40 Mbytes of data.
Besides that, an Acoustic Telemetry Link, connected to a surface buoy or a ship of opportunity, ensures a near-real-time interface with the operator during all mission phases. A standard interface for data transmission via cable is also available, enabling real-time connection of the BS to external units or directly to shore stations.

2.4. Mobile Docker

The handling of the Bottom Station is ensured by a dedicated vehicle (the Mobile Docker), basically a special, simplified version of a ROV able to deploy heavy payloads at the sea floor and subsequently recover them in a shipborne procedure. The Mobile Docker (Fig. 3 and [16]) is equipped with two thrusters ensuring mobility in the horizontal plane (x, y), while the winch of the support vessel regulates the descent/ascent (z).
By means of visual (TV cameras, lights) and instrumental (sonar, LF transponder) systems, the Mobile Docker is able to locate the pre-determined installation area or to find the Bottom Station for recovery. For the prototype, an operation range greater than 5% of the water depth (i.e. a horizontal excursion of about 100 m at a 2000 m deep site) has been calculated, and it was subsequently verified during the missions carried out in 1998 and 2000.
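The release policy of the Expendable Messengers described in Section 2.3 (periodic release, or release upon major failure detection) can be sketched as a simple decision function. The 48 kbyte capacity is from the text; the release interval and the backlog-triggered release are assumptions added for illustration:

```python
# Sketch of an Expendable Messenger release policy. Capacity (48 kbytes)
# is from the text; the interval and backlog trigger are assumptions.
EXPENDABLE_CAPACITY_BYTES = 48 * 1024
RELEASE_INTERVAL_H = 24 * 30          # assumed: one release per month

def should_release_expendable(hours_since_last, major_failure, backlog_bytes):
    """Release on schedule, on major failure, or when a capsule is full."""
    return (major_failure
            or hours_since_last >= RELEASE_INTERVAL_H
            or backlog_bytes >= EXPENDABLE_CAPACITY_BYTES)
```

Storage Messengers, by contrast, would be released only on an explicit external command over the acoustic link.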
Table 2. GEOSTAR Data Acquisition and Control System
Configuration: 3 CPU boards, 2 hard disk units, 1 switch board, 2 status boards
Serial lines: 18 full-duplex RS232, 9 full-duplex TTL
Analog input: 40 analog inputs coded in 12-bit registers
Digital I/O: 69 digital outputs (12 configurable as opto-isolated inputs)
RAM memory: 2 Mbyte per CPU board; 1 Mbyte per hard drive unit
Mass memory: 2.1 Gbyte per hard drive unit; 48 Mbyte (on flash card) per CPU board
Electronic switches: 20 solid-state switches, up to 4 A each, fused
Power consumption: 180 mA @ 24 V in IDLE mode (only data acquisition and control electronics powered on); 450 mA @ 24 V in MISSION mode (all scientific packages powered on)
Table 3. GEOSTAR Scientific Payload (GEOSTAR 1 and GEOSTAR 2 missions)
- Triaxial broad-band seismometer
- Scalar magnetometer
- Fluxgate magnetometer
- 300 kHz Acoustic Doppler Current Profiler
- CTD
- Transmissometer
- Electrochemical package (pH, H2S)
- Water sampler (48 x 500 ml samples)
- Hydrophone
- Gravity meter
- Triaxial broad-band seismometer (derived from a space-qualified prototype)
- Single-point current meter (incl. tilt, heading, T, P)
Table 4. Typical GEOSTAR operational sequence
Pre-mission: mission configuration file created and downloaded into the Bottom Station; Bottom Station and Mobile Docker connected on ship deck.
Deployment: ship on site; GEOSTAR launched into seawater; guided descent with continuous monitoring of Bottom Station and scientific package status; touchdown; actuation of releases (seismometer and magnetometer deployment), final checks and mission START; Mobile Docker disconnected and recovered onboard the ship.
Mission: Bottom Station executing the programmed mission; periodic release of Messengers; periodic interrogation from shore or from a ship of opportunity; (on command from shore, or at a programmed time) STOP mission, lock seismometer.
Recovery: ship on site; Mobile Docker launched; guided descent; Bottom Station localisation through the Mobile Docker's instrumental and visual systems, with the Mobile Docker position actively controlled by the pilot; Mobile Docker re-entry and connection to the Bottom Station; system recovery onboard the ship.
Post-mission: mission data download and dissemination.
Monitoring and guidance is ensured by a Surface Unit including video recorders, monitors, a control panel showing all data transferred via telemetry from the MD, two joysticks for steering, and an interface for the BS data. The umbilical provides the power supply to the MD (thrusters and electronic devices) and fibre optic lines. The typical operational procedure is summarised in Table 4.
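The operational sequence of Table 4 can be read as a simple linear state machine. The sketch below is illustrative only, not mission-control software:

```python
# Minimal linear state machine over the phases of Table 4 (illustrative).
PHASES = ["pre-mission", "deployment", "mission", "recovery", "post-mission"]

def next_phase(current):
    """Advance to the next operational phase; post-mission is terminal."""
    i = PHASES.index(current)
    return PHASES[min(i + 1, len(PHASES) - 1)]
```

In practice the "mission" phase can last months, and the transition to "recovery" is commanded from shore or triggered at a programmed time, as the table notes.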
Figure 4. GEOSTAR recovery at the end of mission-1 (August-September 1998)
2.5. Mission-1
In August 1998, GEOSTAR was deployed in the Adriatic Sea at a depth of 42 m and operated continuously for 450 hours, during which data on seismic events, magnetic field variations, water currents and water characteristics were collected (see Fig. 4 and [17-20]). During the mission, all system capabilities and marine operations were successfully verified in real conditions, demonstrating: (a) the capability of the Bottom Station Data Acquisition and Control System to support the payload and manage the various mission tasks; (b) the capability of the Communication System based on expendable capsules (automatic and commanded release, as well as satellite data transfer, were verified); (c) the usefulness of the acoustic telemetry link, which guaranteed the possibility to interact with the Bottom Station during all phases of the mission (to check the status of the system while at the seabed, collect data, command Storage Messenger release, stop the mission and lock the seismometer masses) and at the same time represented an essential back-up of the cable link used during deployment; (d) the capability of the Mobile Docker to handle deployment and recovery operations, in particular re-entry and connection to the Bottom Station; (e) the possibility to handle the observatory from a medium-sized oceanographic ship (the Italian R/V Urania, 1115 t gross tonnage, 61.3 m overall length, 11.1 m width); (f) the cost-effectiveness of the GEOSTAR concept, whose deployment and recovery missions were managed by a vessel with a daily cost of around 12,500 Euro, required only a few days for their completion and involved simple operational procedures; (g) the capability of the Bottom Station to operate as a multidisciplinary observatory (all scientific packages correctly supported and significant data collected).

2.6. Mission-2
GEOSTAR mission-2 (September 2000 - March 2001) represents the first long-term scientific mission at a deep-sea site. For this purpose the system was enhanced as follows:
- increased scientific payload (see Table 3);
- dedicated electromechanical umbilical cable and winch;
- near-real-time communication system (NRTCS), based on the development of a surface buoy (moored in the vicinity of the Bottom Station deployment site) managing the operation of a satellite link (surface part) and an acoustic link (underwater part);
- Mobile Docker hydrodynamic optimisation and increased navigation payload (sonar, colour camera, altimeter).
The mission site was selected in the Southern Tyrrhenian Sea (offshore Ustica Island, approx. 2000 m water depth). GEOSTAR deployment and recovery operations (see Fig. 5) were managed by the same ship used for mission-1. All system capabilities were confirmed in deep-sea conditions; moreover, continuous data recovery through the NRTCS (one interrogation per day) and the periodic release of Messengers (one release per month) made possible complete checks of the system functionality, as well as the start of the scientific data analysis while the mission was still ongoing.
3. THE FUTURE

3.1. Lessons Learned

The experience of the first two operational missions demonstrated the validity of the ABEL/GEOSTAR concept and, at the same time, represented an important step toward the construction of ever more reliable deep-sea scientific observatories. The fundamental technologies have been validated, a number of critical limitations have been overcome, and the remaining technological limitations are better recognised. A number of basic principles for the design and construction of new generations of deep-sea scientific observatories can now be made explicit. In addition to the basic quantitative scientific and technical requirements, these criteria constituted the guidelines for the definition of the GEOSTAR architecture. (a) Modularity and Interface Standardisation. The introduction of system modularity provided the required flexibility and expandability (i.e., the possibility to modify the system configuration with savings in design time and cost), the possibility to develop and test each component separately, and the possibility to use each component as a stand-alone investigation system. (b) Reliability, Maintainability and Redundancy. Increased reliability and simplified maintenance operations are a primary factor in the success of deep-sea systems. Their structure, mechanical and electronic equipment and control logic have to be designed so that a single, possibly minor, failure does not substantially affect the overall set of investigations, i.e. the mission. Equally, a failure must not affect the survival of the system or the possibility to recover the system components, even within the framework of emergency procedures. Redundancy of critical components is considered a guideline for the achievement of adequate reliability targets; for each system component a reliability target shall be defined.
The single-failure criterion (namely, that a single failure must not lead to the loss of the system) is an important design requirement. (c) Functional Autonomy and Hardware Simplicity. These are necessary to guarantee reliable operability after very prolonged deployments in hostile environments. In addition, functional autonomy is a key issue to guarantee autonomous investigations and achieve adequate cost savings. In line with the reliability and maintainability requirements, the complexity of the system components should be kept to a minimum. (d) Friendliness and Workability. Friendliness and workability should guide all phases of the definition of a "mission-specific configuration", taking into account all the possible operational constraints (given by the environmental conditions and by power, communication and material endurance).

3.2. Opportunities

GEOSTAR represents the precursor of a new generation of observatories: the prototype, the individual scientific modules and the specific technologies developed within the project can be exploited in the context of different applications (e.g. offshore activities), especially in the new era of increasing environmental awareness. Some examples are described below. (a) Site Characterisation. This application has many similarities with the scientific use; in this case the data collected are used for the characterisation of a specific site where, for example, an offshore structure or subsea system has to be installed. Besides standard environmental measurements, such as current profiling, the station may also host instrumented
Figure 5. GEOSTAR deployment for the first deep-sea mission (September 2000)
packages devoted to the acquisition of measurements related to the design of cathodic protection systems and to geotechnical parameters. The possibility of managing the long-term collection of seismological data represents another important feature. (b) Environmental Protection and Global Change Studies. Thanks to its modular design, GEOSTAR foresees the possibility of including additional sensor packages, samplers and analysers, and even of managing other satellite units in line with the ABEL philosophy. These features would enable the use of GEOSTAR for environmental protection purposes. The range of applications includes water quality monitoring, pollution studies, tsunami forecasting and water-sediment interface monitoring. (c) Integration into More Complex Experiments. Technologies developed in the GEOSTAR project may be of great importance for other scientific and industrial applications where autonomous operation of instrumented systems is required. Typical fields of application include the design and development of underwater control systems, autonomous vehicles, and remote data acquisition and control systems. Observatories like GEOSTAR may also be used to work in parallel with other experiments for the collection of environmental parameters. One of the most significant opportunities is represented by underwater particle physics experiments such as deep-sea neutrino telescopes. (d) Support Platform for Technological Activities. One of the most critical aspects in the successful development of instrumented systems and packages for deep-sea applications is the conduct of significant validation tests in real conditions, especially when verification of the long-term performance of the system is required. In this regard, GEOSTAR may represent a suitable support for the execution of experimentation and validation tests on a wide range of systems and tools.
Some examples: a testing facility for deep-sea ROV performance evaluation, long-term tests of material performance, and long-term tests of instrumented packages or special components (valves, connectors, etc.). Building upon the ABEL concept, major savings are expected by addressing several technological programmes at the same site, sharing common infrastructure (the support station, the data gathering system, the power source) and using the same logistic facilities (ship, etc.).
4. CONCLUSIONS GEOSTAR makes available the first European deep-sea autonomous observatory devoted to long-term, continuous and multiparameter observations. From the scientific point of view, the quantity and type of data collected will enable significant progress in geophysics, geochemistry and physical oceanography for several applications in marine and Earth studies. In addition, thanks to its modular design and expandability, GEOSTAR foresees the possibility of including both additional sensor packages and other satellite apparatuses, and of operating as a node of future monitoring networks. Therefore, it may respond to the expectations of a wider scientific community, including biologists and environmental engineers. The experience of the first two GEOSTAR missions was of paramount importance in assessing the potential and limitations of a number of enabling technologies and in improving the reliability of deep-sea scientific observatories. GEOSTAR also demonstrated that it is nowadays possible to reduce significantly the cost of technologies that have traditionally been the exclusive province of military applications and hydrocarbon exploitation.
Apart from continuing to work on improving the reliability of deep-sea scientific observatories, we consider the next challenge to be "networking capabilities". This poses important requirements for issues such as: (i) mechanical standardisation, (ii) electrical standardisation, (iii) energy, (iv) scientific and communication protocols and (v) real-time scientific data transfer. But, now that the technological road to the deep sea has been traced, it is just a matter of time before we travel along it with the new deep-sea science.
ACKNOWLEDGMENTS
The authors would like to thank their colleagues Michele Capobianco, for kind assistance and helpful suggestions in reviewing this paper, and Felice Da Prat, for the mechanical design work. Luigi Innocenzi of Istituto Nazionale di Geofisica provided all the photos used in this paper. The ABEL technical and economic feasibility study was carried out by Tecnomare (I) under EU-DGXII contract MAST-CT91-0075. The GEOSTAR project was carried out under EU-DGXII contracts MAS3-CT95-0007 (GEOSTAR 1) and MAS3-CT98-0183 (GEOSTAR 2). The GEOSTAR 2 partnership includes: Istituto Nazionale di Geofisica (I), Ifremer (F), CNRS - Laboratoire d'Océanographie et de Biogéochimie (F), Technische Fachhochschule Berlin (D), Technische Universität Berlin (D), Tecnomare (I), Orca Instrumentation (F), Institut de Physique du Globe (F).
REFERENCES
1. F. Gasparoni, M. Berta and G.M. Bozzo, Proc. Oceanology International Conference, Brighton, 1994.
2. F. Gasparoni, Technical and Economical Feasibility Study of an Abyssal Benthic Laboratory for Long-term Experiments, Study contract no. MAST-CT91-0075, Final report, 1993.
3. M. Berta, F. Gasparoni and M. Capobianco, J. Mar. Sys., 6 (1995) 211.
4. S. Di Pippo, W. Prendin and F. Gasparoni, Proc. I-SAIRAS, Pasadena, 1994, p. 18.
5. H. Thiel, K.O. Kirstein, C. Luth, U. Luth, G. Luther, L.A. Meyer-Reil, O. Pfannkuche and M. Weydert, J. Mar. Sys., 4 (1994) 421.
6. DESIBEL, New Methods for DEep Sea Intervention on future BEnthic Laboratories, EU contract MAS2-CT94-0082, Reports, 1997.
7. ECOPS Euroconference, The Deep-Sea Floor as a Changing Environment, Report, San Feliu de Guixols, 1994, p. 48.
8. ECOPS Workshop, Identifying priorities for investment in developing new technology for the ECOPS Grand Challenges, Report, Brighton, 1994.
9. J.-P. Montagner and Y. Lancelot (eds.), Multidisciplinary observatories on the deep sea floor, I.O.N. workshop, Marseilles, 1995.
10. L. Beranzoli, A. De Santis, G. Etiope, P. Favali, F. Frugoni, G. Smriglio, F. Gasparoni and A. Marigo, Phys. Earth Plan. Inter., 108 (1998) 175.
11. F. Gasparoni, P. Favali, G. Smriglio, M. Marani, L. Floury, J.M. Coudeville, C. Millot, H. Gerber, H. Amann and J. Dobson, Proc. Oceanology International Conference, Brighton, 1998, p. 225.
12. F. Gasparoni, A. Marigo, H. Gerber, D. Schulze, P. Favali and G. Smriglio, GEOSTAR: the first European Observatory for geophysical and oceanographic characterisation of deep sea sites, Proc. OMAE, Lisbon, 1998.
13. F. Gasparoni, D. Calore, R. Campaci and A. Marigo, GEOSTAR - Development and test of an innovative benthic station for long-term investigations at abyssal depths, Proc. Oceans '98 Conference, Nice, 1998, p. 1091.
14. P. Favali, G. Smriglio, L. Beranzoli, G. Etiope, F. Frugoni, F. Gasparoni, D. Calore and R. Campaci, Data from GEOSTAR: a contribution to Ocean Monitoring, Symposium on Ocean Data for Scientists, Dublin, 1997.
15. J. Marvaldi, J. Blandin, C. Podeur and J.-M. Coudeville, GEOSTAR - Development and test of a communication system for deep-sea benthic stations, Proc. Oceans '98 Conference, Nice, 1998.
16. H.W. Gerber and D. Schulze, GEOSTAR - Development and test of a deployment and recovery system for deep-sea benthic observatories, Proc. Oceans '98 Conference, Nice, 1998.
17. F. Gasparoni, P. Favali, G. Smriglio, H. Gerber, G. Clauss, J. Marvaldi, D. Fellmann, C. Millot, J.F. Montagner and M. Marani, Results and perspectives from the first mission of the European Abyssal Observatory GEOSTAR, Proc. Oceanology International Conference 2000, p. 21.
18. L. Beranzoli, T. Braun, M. Calcara, D. Calore, R. Campaci, J.-M. Coudeville, A. De Santis, G. Etiope, P. Favali, F. Frugoni, J.-L. Fuda, F. Gamberi, F. Gasparoni, H. Gerber, M. Marani, J. Marvaldi, C. Millot, C. Montuori, P. Palangio, G. Romeo and G. Smriglio, EOS Trans. AGU, 81 (2000) 45.
19. F. Gasparoni, D. Calore, R. Campaci and A. Marigo, GEOSTAR - Development and test of an innovative observatory for geophysical and oceanographic characterisation of deep sea sites, Proc. OMC'99 Conference, Ravenna, 1999.
20. H. Gerber, D. Schultze and F. Gasparoni, GEOSTAR: first mission results and future perspectives for intervention with heavy deep-sea long-term laboratories, Proc. OMAE, St. John's, Newfoundland, 1999.
Design and realization of Communication Systems for the GEOSTAR project

J. Marvaldi a, Y. Aoustin a, G. Ayela b, D. Barbot b, J. Blandin a, J.M. Coudeville b, D. Fellmann b, G. Loaec a, Ch. Podeur a and A. Priou b

a IFREMER - Centre de Brest, B.P. 70, 29280 Plouzané, France
b ORCA Instrumentation, 5, rue Pierre Rivoalon, 29200 Brest, France
In all experiments involving data acquisition systems deployed on the sea bottom, one is faced with the need of transmitting data to the user. A range of solutions is available, each with its own limits, while developments are in continuous progress. The choice of the most convenient means depends on the specific features of the system and its deployment site. Two types of Communication System are implemented in GEOSTAR, the autonomous deep-sea station able to collect long-term records of geophysical and oceanographic data. The first type consists of a set of buoyant data capsules, the MESSENGERS, which carry the data to the surface with a typical periodicity of one month; they are recovered by a ship or transmit the data via the ARGOS system. The other type is a bi-directional Near Real Time Communication System based on an acoustic link between the station and a relay surface buoy, and then a satellite or radio link from the buoy to the shore. The development of the Messenger Communication System, already tested during the first GEOSTAR test mission, is reviewed, and basic considerations on the design of the Near Real Time Communication System are presented.
1. INTRODUCTION GEOSTAR (GEophysical and Oceanographic STation for Abyssal Research) is a scientific and technological project supported by the European Community. The objectives of the project are to design, build and test a deep-sea observatory operating autonomously for long periods of time and capable of collecting data in various scientific fields: seismology, geomagnetism, geochemistry and physical oceanography. The GEOSTAR system is fitted with two Communication Systems (CS) to transfer the data from the Bottom-Station (BS) to the final scientific users. The first one, the Messenger Communication System (MCS), consists of a set of buoyant data capsules, the MESSENGERS (MES), which carry the data to the surface with a typical periodicity of one month and are either recovered by a ship or transmit the data via the ARGOS system. The other one is a bi-directional Near Real Time Communication System (NRTCS) based on an acoustic link between the station and a relay surface buoy, and then a satellite or radio link
from the buoy to the shore. This system is intended to provide daily communications with the Bottom-Station. During the first phase of the project (November 1995 - December 1998), a GEOSTAR prototype system, fitted with an MCS only, was tested in shallow water for about three weeks in the Adriatic Sea. The second phase (January 1999 - June 2001) consists of a general upgrade of the whole system for deep-sea operations down to 4000 m depth, the implementation of the NRTCS and, finally, a deep-sea mission of 6 months offshore from the island of Ustica (Sicily) at 3400 m depth. We will review the functions of the Communication Systems, then describe the architecture and components of the MCS, already tested in the test mission at sea, and present basic considerations on the NRTCS to be implemented in the future.
2. FUNCTIONS OF THE COMMUNICATION SYSTEMS In the absence of a communication link towards the surface, the users of underwater sensors and autonomous sea-floor stations are confronted with several problems. The collected scientific data are only available when the instruments are recovered at the end of the mission, possibly several months after deployment, and the scientists cannot be rapidly informed of the occurrence of particularly interesting events. Nor do the users have any information on a possible malfunction, which can endanger the station or stop or corrupt the data acquisition, and consequently make the forthcoming stay of the station on the bottom unfruitful. The first step in the design of the Communication Systems to be installed is to define the types and volume of information to be transmitted and to select the appropriate procedures. Two types of information must be dealt with: the scientific data, collected continuously or according to prescribed sequences by the various instruments of the Scientific Package, and the technical data, which give information on the operating state of the system. Within each group a distinction is made between normal and abnormal data, abnormality being defined with reference to prescribed criteria, mainly relating to high or low levels. Two basic types of data transmission system are conceivable: by transferring data blocks to underwater capsules periodically released by the sea-floor station, or by emitting the data through a permanent link from the bottom to the shore. In the first case (see [1,2]) a set of data capsules, as small as possible, is stored on the station. Each capsule encloses a data storage unit in a pressure-resistant housing, which must also be conceived as a small underwater vehicle with adequate floating capacity. When reaching the surface, either the capsule is recovered by a ship on site or it emits its data.
At present the ARGOS satellite system provides an efficient solution to transmit rather limited amounts of data from small floating equipment. The limits of this type of system are multiple: capsule data memory capacity, ARGOS data transmission rate, and the physical volume of the capsules, which necessarily restricts the number of units stored on the station. As a broad conclusion, the data capsule system is convenient for the recovery of summaries of the recorded data, typically at a monthly rate. The implementation of a permanent link brings notable advantages as regards the data transmission capacity, e.g. the possibility of making contact with the station at almost any
moment and even of sending commands to it. However, a crucial factor to be taken into account is the specific energy consumed per unit of information transmitted, as the energy resources of the system may be limited. Several link configurations are conceivable. Direct cable link to the shore: the station is connected to the shore either by a purpose-laid cable [2] or by reusing an out-of-service telecommunication cable, as is the case for experiments in the Pacific Ocean. The cable connection brings the advantages of a high data transmission rate, true real-time communication and the possibility of providing electrical energy to the bottom station. Conversely, the laying of a dedicated cable is restricted to sites not too far from the coast, as even over short distances it is a costly operation. Link through a relay surface buoy [3,4]: the buoy is connected to shore by a radioelectric link, either satellite, radio or cellular phone, while the station is connected to the buoy through a cable or a wireless acoustic link. The choice of radioelectric link depends on the experiment site; at short distance from the coast, VHF radio or cellular phone is an efficient solution, but a satellite is the only solution for remote sites. Concerning the satellite, the choice is between a "real-time interactive" satellite in geostationary orbit (GEO), requiring an active antenna on the buoy, and a "store and forward" satellite in low earth orbit (LEO), operating with a much simpler omnidirectional antenna. One still awaits the advent of a "real-time interactive" LEO satellite system. The acoustic underwater link can be implemented on a rather small surface buoy, but it is subject to the limitation of the energy stored on the station. An underwater cable link offers the advantages of a high data transmission rate and of providing the station with energy produced on the buoy, by solar panels for instance.
Conversely, it requires a more complex buoy and fittings and heavier means for putting the buoy in place and connecting the cable to the station. As to the communication procedures for GEOSTAR, it was specified that the systems should be able to: transfer periodically, on request, complete sequences of the scientific data collected since the former request; transfer periodically a report on the technical state of the BS; transfer periodically a selection of the scientific data or statistical values derived from them; inform the users of all abnormal events of a technical nature with a minimum delay; and inform of possibly interesting scientific events shortly after their occurrence. The data acquisition and storage unit of the Bottom-Station keeps continuous records of all the scientific and technical data generated. Of course, the data acquisition rate depends on the type of instrument, from several tens of samples per second to, for instance, one per hour. The recovery of this exhaustive set of data is made at the end of the mission, typically several months after its beginning, and its storage requires some Gbytes of memory capacity. According to the requirements specified above, the Bottom-Station periodically generates and stores reports elaborated from the scientific and technical data. At regular time intervals, the latest generated report is transferred to the MCS. Each Messenger can store some tens of these reports. As the number of MESSENGERS fitted on the station is limited to 5 to 10 units, due to their physical size, the MES release periodicity is typically one month and the periodicity of data report transfer to the MES is adapted to fill their memory capacity accordingly. The Bottom-Station also generates other types of records to be transmitted on request through the NRTCS. The periodicity of this information transfer is typically from one to a few interrogations per day, and the message size must be dimensioned in accordance with the
channel data rate capacity and with the amount of energy stored on the Bottom-Station and the surface relay buoy. The data structures to be transmitted through the MCS are the Data Records and the Summary Messages. The Data Records are data packages of a maximum size of around 800 Kbytes, generated every hour. They are composed of several data blocks corresponding to each instrument of the BS. The block size and the nature of the data contained depend on the volume of data produced by the equipment. One record out of every N is transferred to the Storage-type MES in activity. With a memory capacity of 40 Mbytes, each Storage-Messenger can store about 40 Data Records. The Summary Messages are data packages of a maximum size of around 600 bytes, generated every day. They are composed of several data blocks corresponding to each instrument on the BS and to a certain number of technical operating parameters of the BS. The instrument blocks contain only statistical values derived from the measurements. Summary Messages are transferred to the Expendable-type MES. With a memory capacity of 32 Kbytes, each Expendable-Messenger can store about 30 Summary Messages, that is, the data of one month. Two other data structures are transmitted through the NRTCS: the Data Summaries, of about 250 bytes and with content similar to that of the Data Records, and the Status Summaries, of about 150 bytes and with content similar to that of the Summary Messages.
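The memory capacities and message sizes above imply simple sizing arithmetic. The sketch below (our helper, not project software) divides capsule memory by the maximum report size; the results are upper bounds, slightly above the "about 40" and "about 30" quoted in the text, which presumably allow for framing overhead and housekeeping data.

```python
# Sizing sketch for the GEOSTAR data structures (figures from the text).

KB = 1024
MB = 1024 * KB

def reports_per_capsule(memory_bytes, report_bytes):
    """How many fixed-size reports fit in a capsule's memory."""
    return memory_bytes // report_bytes

# Storage Messenger: 40 Mbytes of memory, Data Records up to ~800 Kbytes each.
mes_s = reports_per_capsule(40 * MB, 800 * KB)   # upper bound at maximum record size

# Expendable Messenger: 32 Kbytes of memory, Summary Messages up to ~600 bytes each.
mes_e = reports_per_capsule(32 * KB, 600)        # upper bound on daily summaries

print(mes_s, mes_e)
```

At one Summary Message per day, a single MES-E therefore comfortably covers the one-month release period quoted for the system.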
3. THE MESSENGER COMMUNICATION SYSTEM 3.1. Architecture of the system The Messenger Communication System (MCS, Fig. 1) is composed of four main sub-systems: (i) The Messenger Drive Unit (MDU), the intelligent electronic unit which manages the data transfer from the Bottom-Station to the MESSENGERS and controls their release. (ii) The MESSENGERS (MES), the vehicles used to bring the data to the surface. Basically, the MES are pressure-resistant housings with a positive buoyancy, which contain the electronics managing the data transfer and storage, an ARGOS emitter and antenna, and the powering batteries. MES are of two types, Expendable and Storage. The Expendable MESSENGERS (MES-E) are released on alarms or at predetermined time intervals. When reaching the sea surface they deliver their data through the ARGOS system, and are then considered non-reusable equipment. The Storage MESSENGERS (MES-S) are released by an acoustic command sent from a ship on the site of the experiment. They are localized by the ARGOS system or by goniometry, recovered by the ship, and their data unit is extracted and read on a micro-computer. (iii) The Acoustic-Transmission-System (ATS), the acoustic link and equipment by which the command to release a Storage Messenger is sent to the BS. It also allows the operator to recover a summary of the recorded data. (iv) The ARGOS-Transmission-System, which transfers the data load of an Expendable Messenger to the exploitation shore station. Two ancillary sub-systems include the interconnections between the BS, MDU and MES, consisting of a set of cables and connectors, and the MES storing and releasing sub-system,
composed of the container of the MES and the foot-blocks of the MES, which are the attachment points of the MES to the container and incorporate an underwater demateable connector and a burn-wire release.
3.2. Description of the sub-systems 3.2.1. The Messenger-Drive-Unit (MDU) The MDU (overall length, 604 mm; outer diameter, 137 mm; mass, 15 kg) manages the information transfer from the BS to the MES. It can accommodate up to nine MES. The data are received through the main link (BS to MDU) and directed towards the proper MES. A secondary link is provided for initialisation or test purposes before deployment. When a MES-E is full, it is immediately released, while a full MES-S is only considered releasable and is effectively released when the MDU receives the proper command.
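The asymmetric release policy described above (a full MES-E leaves at once; a full MES-S only becomes releasable and waits for the acoustic command) can be sketched as follows. All names are ours, for illustration only, not the actual MDU firmware.

```python
# Sketch of the MDU release policy: expendable capsules are released as
# soon as they are full, storage capsules are only marked releasable and
# depart when the acoustic command from the surface arrives.

def on_capsule_full(kind, state):
    if kind == "MES-E":
        state["released"].append(kind)     # expendable: release immediately
    elif kind == "MES-S":
        state["releasable"].append(kind)   # storage: wait for the command

def on_release_command(state):
    # Acoustic command from the ship releases every marked MES-S.
    state["released"].extend(state["releasable"])
    state["releasable"].clear()

state = {"released": [], "releasable": []}
on_capsule_full("MES-E", state)   # leaves at once
on_capsule_full("MES-S", state)   # held until commanded
on_release_command(state)
```

The same state machine would also cover the watch-dog case described later, where an alarm triggers an early MES-E release even before the capsule is full.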
Figure 1. The Messengers Communication System installed on the URANIA R/V (CNR) during the GEOSTAR first mission.
As the MDU may act as a watch-dog of the system activity, a MES-E may be released even if not full, if the MDU receives an alarm signal or itself detects an abnormal situation. The MDU electronics is housed in a pressure vessel composed of a wound glass filament - epoxy tube with aluminium end-closures. The end-closures are tightened by a system of wedges inserted into the slots of a collar extending out of the pressure tube. One of them is removable; it is fitted with the Seacon connectors and supports the frame of the electronics. The electronics is based on a 68HC11F1 CPU card and a specific application card (see functional synopsis in Fig. 2). A quad UART is used for the different links. Two links are associated with RS-232 drivers for the main and test links, while a third one is distributed to up to nine RS-485 drivers (one for each MES). A flash EEPROM is used as a 1 Mbyte data buffer. An absolute time clock is mainly used for the watch-dog function. The MDU has its own energy, provided by a pack of lithium batteries dimensioned for more than one year of operation.

3.2.2. The Expendable MESSENGERS (MES-E)
A MES-E stores a limited amount of data (32 Kbytes), either alarms or a sampling of technical and scientific data. It conveys these data to the surface, either after an alarm or according to a preprogrammed sequence, and emits them to the final user's shore station through the ARGOS system. A MES-E (Fig. 3) is mainly composed of: a tubular pressure housing containing the electronics, with top and foot end-closures; a floatability collar; and a connector and burn-wire release mounted on the foot end-closure. The MES are designed for a service depth of 4000 meters. The main characteristics of the MES-E are given in Table 1. The pressure housing is built with composite materials for reasons of transparency to the ARGOS emission and also for corrosion resistance. The cylindrical housing consists of a tube made of wound E-glass filaments in an epoxy resin matrix. The top end-closure is machined from a composite material consisting of unwound glass fiber sheets bound in a resin polymerized under pressure. The closure is fitted with a peripheral O-ring seal and fixed to the tube by a screwed cap. The foot end-closure is made from aluminium bronze and contributes to loading the MES foot for stability purposes. The foot closure is fitted with three components: an OD-Blue 4-contact male connector from Ocean-Design Inc.; a burn Inconel wire release from Seascan; and a depressurization screw for putting the MES under vacuum. The foot closure is removable and supports the frame of the electronics. It is fitted with a peripheral O-ring seal and fixed to the tube by the wedge-in-slot device described before. The MES-E foot is further weighted by a bronze collar. The MES-E floatability is provided by a cylindrical block of syntactic foam surrounding the pressure-resistant tube. The foam density is 640 kg/m3 and it has a 6000-meter depth capacity.
The necessity of giving the MES the proper buoyancy and a positive hydrostatic stability, both in full immersion and when floating at the surface, requires precise weight, volume and stability calculations and then a careful weight and dimension control of the as-built components. The electronic circuits (see functional synopsis in Fig. 4) have been designed to keep the on-board energy consumption as small as possible.
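The weight/buoyancy bookkeeping mentioned above reduces to Archimedes' principle. As a hedged sketch using the MES-E figures from Table 1 (mass 14.5 kg, net floatability 23.5 N), with a seawater density that is our own nominal assumption:

```python
# Displaced-volume sketch for the MES-E: rho*g*V - m*g = F_net,
# so V = (F_net + m*g) / (rho*g). Figures from Table 1; density assumed.

G = 9.81          # m/s^2
RHO_SEA = 1025.0  # kg/m^3, nominal seawater density (our assumption)

def displaced_volume(mass_kg, net_buoyancy_n, rho=RHO_SEA, g=G):
    """Volume the capsule must displace to yield the given net buoyancy."""
    return (net_buoyancy_n + mass_kg * g) / (rho * g)

v = displaced_volume(14.5, 23.5)   # ~0.0165 m^3 for the MES-E
print(f"{v * 1000:.1f} litres displaced")
```

The same bookkeeping, repeated with the waterline volume only, gives the floatation-collar emersion and the metacentric height (GM) figures listed in Table 1.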
Figure 2. Messenger Drive Unit functional synopsis
Figure 3. Expendable Messenger (MES-E) structure (labels: GRP closure, ARGOS antenna, GRP tube, water line, ARGOS emitter, foam float, electronics, batteries, bronze weight, Inconel wire release).
Figure 4. Expendable Messenger functional synopsis.
Table 1. Messenger characteristics

                                    Expendable MES-E   Storage MES-S
Dimensions
  Overall length                    1360 mm            1360 mm
  GRP tube(s): inner diameter       55 mm              55/105 mm
  GRP tube(s): outer diameter       69 mm              69/137 mm
  GRP tube(s): length               1250 mm            785/410 mm
Floatability collar
  Outer diameter                    200 mm             200 mm
  Height                            420 mm             510 mm
Mass and floatability
  Total mass                        14.500 kg          21.700 kg
  Floatability                      23.50 N            20.30 N
Various
  Stability immersed (GB)           93 mm              73 mm
  Stability floating (GM)           37 mm              27 mm
  Floatation collar emersion        23 mm              12 mm
  Heave period                      1.2 s              2.0 s
  Swing period                      4.0 s              5.0 s
  Ascent speed                      1 - 1.2 m/s        1 - 1.2 m/s
Typically, about 1 mW is required in sleep mode. The system comprises a 68HC11F1-based CPU, an application card, an ARGOS emitter card and an ARGOS antenna. The application card offers the following resources: a wake-up system activated by detection of activity on the RS-485 link (from the MDU) or by a real-time clock; a flash EEPROM used as a 32 Kbyte data buffer; a current generator to burn the release; and the ARGOS transmitter controls. The CPU UART is shared by the RS-485 link and the ARGOS emitter. The energy, provided by eight C-size lithium batteries, is sufficient for more than one year of bottom operation followed by one month of ARGOS emission.
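A back-of-envelope budget shows why the 1 mW sleep figure matters. In the sketch below the per-cell energy is our own assumption for a C-size lithium primary cell (the text does not quote one); only the sleep power and the one-year duty come from the text.

```python
# Energy budget sketch for the MES-E. Sleep power is from the text;
# the usable cell energy is OUR assumption, not a documented figure.

HOURS_PER_YEAR = 8760

sleep_power_w = 1e-3                 # ~1 mW in sleep mode (from the text)
cell_energy_wh = 25.0                # assumed usable Wh per C-size lithium cell
pack_energy_wh = 8 * cell_energy_wh  # eight cells in the MES-E

sleep_year_wh = sleep_power_w * HOURS_PER_YEAR   # ~8.8 Wh per year asleep
margin = pack_energy_wh / sleep_year_wh

# Sleeping consumes only a few percent of the pack; the rest is available
# for wake-ups, data transfers and the final month of ARGOS emission,
# consistent with the ">1 year + 1 month" claim above.
print(f"{sleep_year_wh:.1f} Wh/year asleep, ~{margin:.0f}x pack margin")
```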
3.2.3. The Storage-MESSENGERS (MES-S) A MES-S stores a rather large volume of data (40 Mbytes), corresponding to sets of continuous records of scientific data, and conveys these data to the surface after receiving an acoustic order from a ship on site. It is positioned by the ARGOS system and recovered by the ship, which extracts and reads the data unit.
Although the MES-S structure is basically the same as that of the MES-E, some differences are worth pointing out (Fig. 5). The pressure housing is an assembly of two tubes of different diameters connected by a composite-material junction piece. The upper tube is similar to the upper part of the MES-E housing and contains the ARGOS emitter and antenna. The lower tube, of a larger diameter, contains the batteries, the electronics and the data storage unit. The floatability is given by the same foam block as for the MES-E, with an additional machined part around the junction piece of the tubes. The main characteristics are given in Table 1. The functional synopsis of the electronics is given in Fig. 6. Using the same base as the MES-E, the MES-S needs one more module to drive the PCMCIA card. It consists of a small-format PC-XT compatible module, based on a PC/Chip CPU (GCAT6000), a PCMCIA interface card and a parallel interface to the application card. Using the ATA standard, this system allows the user to increase the MES-S storage capability. The flash EEPROM is used here as a 4 Mbyte data buffer. The batteries have a greater capacity than those of the MES-E, because of the higher energy consumption of the data transfers. They consist of twelve D-size lithium batteries.
3.2.4. The Acoustic-Transmission-System (ATS) The ATS is basically an acoustic underwater modem. It is used for half-duplex underwater transmissions at data rates up to 2400 baud. The ATS is composed of a surface unit and a bottom unit. The ATS surface module communicates with the GEOSTAR Surface Control Unit (SCU) through an RS-232C serial link. A graphic interface makes it easy to set the acoustic modulation parameters and to test the quality of the acoustic transmission, allowing the operator to optimize the data link. The ATS seabed module communicates with the Bottom-Station, also through an RS-232C serial link. Consequently, the ATS is operated as an acoustic modem by the GEOSTAR system, allowing a two-way dialog between the Surface Control Unit and the Bottom-Station. The main commands that can be sent from the SCU to the BS via the acoustic link are: Autotest, providing data concerning the BS attitude, the battery capacity, leak detection, temperatures inside the BS containers, the operating status of the BS, etc.; Disk Check, providing the raw data saved on the hard drives of the Bottom-Station; Summary Message Recovery, providing the summary messages of the BS containing statistical information on the data supplied daily by all sensors; and Release MESSENGERS, to release the active MES-S.
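The half-duplex SCU-to-BS command set above can be pictured as a simple request/reply dispatcher. The sketch below is purely illustrative; the command names, reply strings and framing are our invention, not the actual ATS protocol.

```python
# Illustrative dispatcher for the SCU -> BS acoustic command set listed
# above. All identifiers and reply formats are hypothetical.

COMMANDS = {
    "AUTOTEST":        "attitude, battery capacity, leaks, temperatures, status",
    "DISK_CHECK":      "raw data saved on the Bottom-Station hard drives",
    "SUMMARY_RECOVER": "daily statistical summaries from all sensors",
    "RELEASE_MES_S":   "release the active Storage Messenger",
}

def handle_command(cmd):
    """Half-duplex exchange: one request, one reply; unknown commands refused."""
    if cmd not in COMMANDS:
        return "NAK"
    return f"ACK {cmd}: {COMMANDS[cmd]}"

print(handle_command("AUTOTEST"))
```

In the real system each exchange also has to fit within the 2400-baud acoustic channel budget, which is why the replies are summaries rather than bulk data.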
3.2.5. The ARGOS-Transmission-System As soon as a MES-E reaches the sea surface, it begins to transmit its 32 Kbytes of data via the ARGOS system. Data emission is managed by a protocol based on some basic requirements: the emission of the alarms has priority; the emission of the scientific data is made by packages randomly selected from the chronological series; and each package is re-emitted more than once. These principles have been set to give the final user better chances of effectively receiving the critical information, and of recovering blocks of data spread over the whole duration of the acquisition, in spite of possible variations in the ARGOS transmission quality. The procedure also ensures that an overview of the whole data set is obtained in spite of an accidental premature disabling of the MES-E.
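The emission policy above (alarms first, random selection from the chronological series, repeated emission of each package) can be sketched as a scheduling function. This is our own implementation of the stated principles, not the flight code; the repeat count is an assumption.

```python
# Sketch of the ARGOS emission policy: alarms take priority, then data
# packages are drawn in random order, each emitted several times so that
# coverage of the whole acquisition survives patchy ARGOS reception.

import random

def emission_schedule(alarms, packages, repeats=3, rng=None):
    rng = rng or random.Random(0)     # deterministic here for illustration
    schedule = list(alarms)           # alarms always go first
    pool = list(packages) * repeats   # every package re-emitted 'repeats' times
    rng.shuffle(pool)                 # random order spreads temporal coverage
    return schedule + pool

sched = emission_schedule(["ALARM-1"], ["pkg-%02d" % i for i in range(5)])
assert sched[0] == "ALARM-1"
assert all(sched.count(p) == 3 for p in ("pkg-00", "pkg-04"))
```

Even if the capsule dies partway through the schedule, the shuffled repetition means the packages already sent sample the whole chronological series rather than only its beginning.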
Figure 5. Storage Messenger (MES-S) structure (labels: GRP closure, ARGOS antenna, GRP tube 1, water line, foam float, ARGOS emitter, junction tube, electronics, mass memory, GRP tube 2, batteries, removable closure, OD-Blue plug, Inconel wire release).
Figure 6. Storage Messenger functional synopsis.
3.2.6. Interconnections The interconnections (Fig. 7) comprise four links, as listed in Table 2. All the connectors are Seacon products, except the connection to the MES, made with an underwater demateable OD-Blue 4-contact connector from Ocean-Design. The male part is inserted in the MES foot end-closure and the female box is fitted to the JB-MES cable.
3.2.7. Messenger storing and releasing sub-system The MES are contained in storing silos in a vertical position, free of obstacles in the upper part. The silos apply an upward force to the MES, which adds to their own floatability for disconnecting the OD-Blue connector. The individual MES silos are composed of two concentric lengths of PVC tube. The MES is introduced into the upper tube and its floatability collar rests on the lower tube, which glides inside the upper one and is connected to it by springs (Fig. 8). When the springs are extended, the reaction force reaches 80 N, which adds to the 20 N MES floatability, pushing the MES upwards with a force more than twice as great as the connector disconnecting effort measured under pressure. At the sea surface the MES mass is sufficient to drive the lower tube of the silo into a low position, where it must be maintained by the release link when the system is immersed. The MES silos are fixed vertically in a glass-reinforced resin container bolted to the Bottom-Station structure. The MES foot-block is a machined piece of polyamide. It is inserted inside the foot collar of the MES and acts as a mechanical link between the MES and the storage container, while also maintaining a good connection between the OD-Blue female box and the male connector of the MES. The foot-block, on one side, is attached to the MES by a small-diameter synthetic cable inserted inside the ring of the burn-wire release and, on the other side, is pinned to the storage container. In immersion the flexible link counteracts the spring push and the MES floatability. For security during handling at the air-water interface, rods are inserted through the MES container to prevent the MES launching in case of an accidental break of the flexible link.
Table 2. MCS interconnection links.

Link                       Type                  Cable
Bottom-Station to MDU      Serial link RS-232    One 5-conductor neoprene cable
MDU to Junction-Box (JB)   Serial link RS-485    Six 2-stranded-conductor cables in equipressure in oil; the JB can receive up to 6 plugs
JB to MES                  Serial link RS-485    One 4-conductor neoprene cable
MDU test-link              Serial link RS-232    One 4-conductor neoprene cable, for tests in air
Figure 7. MCS interconnections (electrical connection scheme: MDU, MES, test plug, neoprene and equipressure cables, equipressure junction box, OD-Blue connector to the BS).
Figure 8. Messenger container (storing and releasing system: container, PVC tubes, burn-wire release, OD-Blue connector, foot block).
3.3. The Messenger Communication System tests

3.3.1. Laboratory tests of the sub-systems
The MCS sub-systems have been submitted to a series of tests in Ifremer's testing installations. Some tests are commonly performed on oceanographic equipment; others are specific to the MCS.

Mechanical tests. In view of the first GEOSTAR demonstration mission at less than 50 meters depth during the summer of 1998, the service pressure was fixed at 20 bars. All the pressure vessels have been tested according to the program: 24 hours at 1.3 times the service pressure, and at least 70 hours at service pressure. Tests on spare MES pressure housings fitted with strain gauges have been performed to assess the maximum operational depth, with the aim of confirming the design objective of 4000 meters. The test program consisted of: a 1-hour stay at 412 bars; a 1-hour stay at 536 bars (1.3 times the service pressure); a creep test consisting of 3 successive stays at 412 bars lasting 1, 10 and 100 hours, with intermediate relaxation periods of 10 and 30 hours; and a final test up to the destruction pressure. The tests were satisfactory. The destruction occurred at 810 bars, that is, at about twice the service pressure, which is satisfactory as regards both the resistance and the long-term creep in immersion. In addition, a non-instrumented MES has been left under pressure at 412 bars for more than 4 months, with no signs of degradation.

All the MCS sub-systems have been tested in vibration, with checks of the electronics operation before and after the tests. The program consists of a frequency sweep from 5 to 16 Hz with a double amplitude of 1 mm, and from 16 to 55 Hz with a double amplitude of 0.1 mm. The interconnected MCS sub-systems have been submitted to a temperature of 40 °C and 93 % relative humidity during 12 hours, with functional checks at the beginning and the end of the stay in hot conditions and after the return to ambient conditions.
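The quoted pressure-test figures can be cross-checked with a few lines of arithmetic, using only values stated above:

```python
# Cross-check of the pressure-test figures: the 1.3x proof pressure and the
# ratio of destruction pressure to service pressure (all values from the text).
SERVICE_BAR = 412.0      # service pressure for the 4000 m design objective
PROOF_BAR = 536.0        # quoted as 1.3 times the service pressure
DESTRUCTION_BAR = 810.0  # measured destruction pressure

proof_factor = PROOF_BAR / SERVICE_BAR
destruction_factor = DESTRUCTION_BAR / SERVICE_BAR
print(f"proof factor: {proof_factor:.2f}, destruction factor: {destruction_factor:.2f}")
```

The destruction factor of about 2 is the "about twice the service pressure" noted in the text.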
After a second 6-hour stay at 40 °C the components have been immersed in a water tank at 17 °C, reconnected and tested. Electronic and software functions have been tested on the bench before conditioning the electronics, and then at numerous steps during the execution of the mechanical tests.

During the hydrodynamic tests on the MES a rigorous weight control procedure in three steps was followed. First, an initial estimation of the weight, centre of gravity, volume and inertia was made, based on the component dimensions and specified or measured material densities. Then all the as-built individual parts or mid-integrated components were weighed and the weight file was re-calculated. Finally, the weight and floatability of the assembled MES were measured. The stability module, or restoring moment, of the MES, both completely immersed and floating at the surface, was deduced from the weight estimations and measurements. Resonant periods in heave and pitch have been calculated and checked by tests in calm water. Behaviour tests have been performed in regular waves up to the performance limits of the available wave canal: wave periods < 2 s, wave heights < 0.25 m. Maximum pitch amplitudes and submergences have been checked on video records. Manual release and ascent tests have been performed in Ifremer's 20 m deep basin to check the proper exit of the MES from the silo and the subsequent upward trajectory. A measured ascending speed of around 1 m/s was in accordance with the theoretical estimation.
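The order of magnitude of the ascent speed can be reproduced with a simple drag balance, net buoyancy = ½ρC_dAv². This is a sketch under assumed hydrodynamic parameters: the 20 N net buoyancy is from the text, while the drag coefficient and MES diameter are placeholders, not values from the source:

```python
import math

# Hedged estimate of the MES terminal ascent speed from a drag balance.
# Net buoyancy is from the text; Cd and the frontal diameter are assumptions.
RHO_SEAWATER = 1025.0   # kg/m^3
NET_BUOYANCY_N = 20.0   # MES net floatability quoted above
CD = 1.0                # assumed drag coefficient for a blunt cylinder
DIAMETER_M = 0.20       # assumed MES frontal diameter

area = math.pi * (DIAMETER_M / 2) ** 2
v_terminal = math.sqrt(2 * NET_BUOYANCY_N / (RHO_SEAWATER * CD * area))
print(f"estimated terminal ascent speed: {v_terminal:.2f} m/s")
```

With these assumed values the estimate is of the same order as the ~1 m/s measured in the basin tests.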
3.3.2. Basin tests of the MCS integrated to the Bottom-Station
These tests have been performed in Ifremer's 20 meter deep sea-water test basin in order to check the system in conditions as close as possible to the ones foreseen for the demonstration mission in the Adriatic Sea. An exhaustive test program of the MCS operation was performed, comprising acquisition of data by the BS, transfer of the data to the MES and launch of the MES.

3.3.3. Demonstration mission in the Adriatic Sea
This mission took place offshore Ravenna during about 3 weeks in the summer of 1998, at about 40 meter depth. The MCS comprised two MES-E and one MES-S. About 440 hours of good quality data have been recorded by the BS. After 1 week, the first MES-E was automatically released at full capacity after the reception of 24 Summary Messages of good quality. It was left drifting during 2 days for emission via the ARGOS system. The second MES-E was secured to the BS and was recovered at the end of the mission with 29 good quality Summary Messages stored. Before the recovery of the BS, the MES-S was released by sending the proper acoustic command and 5 Mbytes of data were finally recovered. This mission confirmed the good operation of the MCS, both electronically and mechanically.

3.3.4. Prospects for the future
The MCS will be used in the near future during the 6-month deep-sea mission of GEOSTAR at 3400 meter depth, offshore the Ustica island. For this operation the MCS will comprise three MES-E and two MES-S. As the present design of the components has been validated by the high-pressure tests, it was kept, and the deep-sea mission was a rigorous bench test for endurance in real deep-sea conditions. In the longer term, studies and preliminary tests are devised with a view to simplifying the design of a number of parts, such as the tube end-closure or the MES electrical connection.
4. THE NEAR REAL TIME COMMUNICATION SYSTEM

4.1. General considerations
In order to get daily information, it is necessary to put in place a permanent link between the BS and the shore. The system is basically composed of three segments:
- an underwater link from the BS to a relay buoy;
- a relay buoy;
- a radio or satellite link from the buoy to the shore.
The performances of the whole system are determined by a number of operating characteristics relating to each of these segments.
The underwater link. Two main options are available: (1) to put in place a vertical data cable or (2) to transmit the data by acoustics. The cable offers a higher data rate capacity and a safer transmission than the acoustic link, which works in an open environment and is subjected to any disturbance of that environment. On the other hand, the acoustic link is simpler to install, as there is no physical link between the bottom and the surface.
With a data cable link, a number of technical problems of various difficulties have to be dealt with. The use of a ROV could be required to connect the underwater cable to the BS. The design of a connection between the cable and the buoy, capable of enduring the millions of fatigue cycles due to the buoy movements in the waves, could be difficult. Placing a bigger buoy to support the weight of the cable could require more complex logistics. For both cable and acoustic links, an important factor is the energy required to transmit a certain amount of data, as the BS is powered by batteries which generally cannot be recharged during the mission. Ways to overcome this energy limitation could be either to convey energy through the BS-surface buoy cable or to have energy sources on the BS which could be replaced underwater.

The surface to shore link. If the experiment site is close enough to the shore, one can use a VHF radio link. An even simpler solution is to use the cellular phone, if there is good quality coverage of the site. The satellite link is the only solution free of any geographic constraints. The satellite communications make use of geostationary satellites (GEO), towards which the buoy antenna has to be actively directed, and Low Earth Orbit satellites (LEO), which can be utilized with a passive omnidirectional antenna. In the case of "real time" transmission, a dialogue can be arranged between the BS and the user. In the case of "store and forward" transmission, time delays take place between the emission of a command from the shore, its effective transmission to the BS and the return of the command execution acknowledgment. The "real time" transmission is preferred by the user, especially if he has the possibility to adjust some variables in real time, such as the acoustic emission parameters.
A LEO satellite link is the best solution for implementing a link to and from a buoy of rather limited dimensions, which is subjected to harsh movements in the waves and on which the available energy is limited. This energy limitation could, however, be overcome by the use of solar panels for recharging the batteries. On the contrary, a satellite link requiring an active antenna will suffer from various limitations, e.g., the operation of the antenna will no longer be possible above certain values of the amplitude or speed of the movements of the buoy and, in any case, the orientation of the antenna is an energy consuming process.
4.2. The NRTCS system for the GEOSTAR experiment
The definition of the GEOSTAR NRTCS results from some initial basic choices. The use of a data cable between the BS and the surface buoy was discarded for reasons of complexity within the context and timetable of the Project. This choice led necessarily to the use of an acoustic link. The use of an interactive link between the buoy and the shore was chosen so that the user can operate from the shore in the same way as from a ship on site. The transmission of Data Summaries of 250 bytes and of Status Summaries of 150 bytes was chosen because of the limitation in the transmission capability of the BS due to the amount of energy available.

The Acoustic-Transmission-System (ATS). The acoustic link is based on transmission at 12 kHz. Several types of modulation are available, with data rates ranging from 200 to 2400 bits/second. The main equipment of the ATS comprises the transducer, the electronics and the batteries conditioned in two pressure resistant housings, placed on the BS, and the transducer and the electronics placed on the surface buoy. Similar equipment can be used to communicate directly with the BS from a ship on site.

The surface to shore link. This is based on an INMARSAT mini-M satellite system, which uses an active antenna and will necessarily be out of operation above a certain level of sea conditions on the site. As redundancy, a VHF radio link is available between the buoy and the Ustica island, in the absence of good quality cellular phone coverage of the GEOSTAR site.

The surface relay buoy. This is foreseen to be a floating body of rather limited dimensions which has to accommodate the surface acoustic transducer and electronics, the radio/satellite equipment, the electronics unit managing the dialogue between the acoustic and satellite links, an energy storage for the acoustic and satellite equipment, and the positioning and signaling equipment (ARGOS emitter, lights, radar reflector). The mooring will consist of two major lengths of nylon and polypropylene cables, with a steel cable in the upper 100 meters. The dead weight will have to be placed at a distance from the already laid-down BS which is compatible with the aperture of the acoustic transducer emission cones. On the buoy, special attention will be devoted to providing a quiet acoustic environment to the surface transducer, lowered some tens of meters below the surface.
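As an illustration of why the available energy constrains the message sizes, the on-air time and energy of one Data Summary can be sketched. The message size and data-rate range are from the text; the transmit power is an assumed placeholder:

```python
# Rough on-air-time and energy budget for one acoustic Data Summary.
# Message size and the 200-2400 bit/s rate range are from the text; the
# electrical power while transmitting is an assumed placeholder.
SUMMARY_BYTES = 250        # Data Summary size
RATE_BPS = 2400            # upper end of the quoted modulation rates
TX_POWER_W = 20.0          # assumed power drawn while transmitting

t_s = SUMMARY_BYTES * 8 / RATE_BPS
energy_j = TX_POWER_W * t_s
print(f"one Data Summary: {t_s:.2f} s on air, ~{energy_j:.0f} J per transmission")
```

At the lowest quoted rate (200 bit/s) the same message would stay on air twelve times longer, with a proportionally larger energy cost.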
5. CONCLUSION
Within the scope of the GEOSTAR project, two different types of communication systems with a benthic station will be developed: the Messenger Communication System and the Near Real Time Communication System. The Messenger Communication System is suitable for a monthly transfer of information. The MCS design has already been tested with success during the short-duration shallow-water mission of summer 1998 in the Adriatic Sea. The deep-sea 6-month mission started in summer 2000 confirmed its endurance in real deep-sea conditions. In the future, various design upgrades are foreseen to simplify the components and make the system more cost effective. The Near Real Time Communication System is suitable for transmitting daily information. The system designed for the next GEOSTAR experiment reflects a number of circumstantial basic choices (e.g., acoustic underwater link) and equipment availability constraints (e.g., present unavailability of an interactive LEO satellite data channel), and will thus probably be more subject to evolution.
ACKNOWLEDGMENT
These developments have been carried out in the framework of the European Commission DG XII MAST 3 program, contract MAS3-CT95-0007, and are pursued under contract MAS3-CT98-0183.
REFERENCES
1. P.R. Foden and R. Spencer, Proceedings of OCEANS'95, IEEE Conference, San Diego, 1995.
2. H. Momma, K. Kawagushi and R. Iwase, Proceedings of the 9th International Offshore and Polar Engineering Conference, Brest, 1999, 603.
3. F.I. Gonzalez, H.M. Milburn, E.N. Bernard and J.C. Newman, Proceedings of the International Workshop on Tsunami Disaster Mitigation, Tokyo, 1998.
4. G. Meinecke, V. Ratmeyer and G. Wefer, Proceedings of the Oceanology International 2000 Conference, Brighton, 2000, 33.
Gravimeter for Deep Sea Measurements

V. Iafolla and S. Nozzoli
Istituto di Fisica dello Spazio Interplanetario (IFSI) CNR, Via del Fosso del Cavaliere 100, 00133 Rome, Italy

A gravimeter with a sensitivity of 10⁻⁹ g/√Hz in the range 10⁻⁵-1 Hz, to be used for deep-sea measurements in the Mediterranean, is described. The instrument, developed in our laboratory, will be used in the first GEOSTAR mission, to be performed close to the Ustica island coast. The technology used to produce this instrument has been developed in the framework of a space program, with the financial support of the Italian Space Agency (ASI), concerning the realisation of a space-borne high-sensitivity accelerometer named ISA (Italian Spring Accelerometer) [1-4]. The gravimeter's main characteristics are described, together with the problems of integration on the GEOSTAR station. The instrument is intended to measure the vertical component of tides and to record earthquakes and teleseisms. Problems connected with the necessary dynamics restrict the instrument's intrinsic sensitivity, while the thermal stability conditions on the Mediterranean sea floor limit its low-frequency range to 10⁻⁵ Hz.
1. INSTRUMENT DESCRIPTION
Measurements of tiny gravitational or inertial accelerations are of primary importance in a large variety of frameworks in science and technology. In all these frameworks, the problem consists in the detection of small displacements produced by accelerations acting on a proof mass. By gravitational accelerations we mean those produced by gravitational forces acting directly on the proof mass, while the inertial ones are produced by forces acting on the reference frame where the proof mass is located. Usually the proof mass is connected to the reference frame by a restoring force (mechanical, electrical or magnetic). A transducer detects the displacements of the proof mass, converting them into an electrical signal. The gravimeter we are realising has a proof mass suspended by a mechanical spring. The spring restoring force is in equilibrium with the gravitational force produced by the Earth's gravity. Any change in this equilibrium gives a change of the proof mass position.

1.1. Mechanical oscillator
As was already said, the main part of the gravimeter is a mechanical oscillator. The realised prototype is shown in Fig. 1. It is obtained by machining one single piece of aluminium (Al 5056). The proof mass is connected to an external frame through two torsion arms, so that its restraint is due to the suspension torsion elements. Forces acting along the sensitive axis produce a torque and a consequent rotation around the torsion axis.
Two sensing capacitors are obtained from two additional plates connected to the opposite sides of a central one and electrically separated from it by insulating washers; they are used to detect the signal, as explained below. The mechanical oscillator is described by the following equation:

$$I\ddot{\vartheta} + \eta_{\vartheta}\dot{\vartheta} + K_{\vartheta}\vartheta = M \qquad (1)$$

where $K_{\vartheta}$ represents the torsional elastic constant, $I$ the proof mass moment of inertia, $\eta_{\vartheta}$ the dissipation coefficient and $M$ the moment of the forces acting on the proof mass. The moment is produced by the resulting gravitational forces, which act directly on the proof mass, or by the accelerations which act on the reference frame. The mechanical resonance frequency is given by:

$$f_0 = \frac{1}{2\pi}\sqrt{\frac{K_{\vartheta}}{I}}.$$

In our case the experimental frequency obtained is 15 Hz, in agreement with the estimated value. For small angles of rotation, and to simplify the following evaluations, we will refer to the linear displacement of the centre of mass of the harmonic oscillator instead of its rotation. The relation between the rotation angle and the centre-of-mass displacement is:

$$\vartheta\, b = x,$$

where $b$ is the distance between the rotation axis and the centre of mass. With this assumption Eq. (1) can be written in the following way:

$$\frac{I}{b^2}\,\ddot{x} + \frac{\eta_{\vartheta}}{b^2}\,\dot{x} + \frac{K_{\vartheta}}{b^2}\,x = F.$$

If we put

$$m_r = \frac{I}{b^2}, \qquad \eta = \frac{\eta_{\vartheta}}{b^2}, \qquad k = \frac{K_{\vartheta}}{b^2},$$

we get $m_r\ddot{x} + \eta\dot{x} + kx = F$. The resonance frequency and the mechanical quality factor are expressed by the following formulas:

$$f_0 = \frac{1}{2\pi}\sqrt{\frac{k}{m_r}}, \qquad Q_m = \frac{m_r\,\omega_0}{\eta}.$$

For accelerations at frequencies lower than $f_0$, at which the gravimeter is used, the transfer function between accelerations and displacements of the centre of mass is:

$$\frac{x(\omega)}{a(\omega)} = \frac{\vartheta(\omega)\,b}{a(\omega)} = \frac{1}{\omega_0^2}.$$
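The low-frequency limit of this transfer function can be checked numerically. A minimal sketch follows, with f₀ = 15 Hz from the text; the quality factor is an arbitrary assumption, since it drops out well below resonance:

```python
import math

# Numerical check of the low-frequency transfer function: for the damped
# oscillator m_r*x'' + eta*x' + k*x = m_r*a, the magnitude |x/a| tends to
# 1/omega_0^2 well below resonance. f0 = 15 Hz is from the text; Q is an
# arbitrary assumption (irrelevant far below resonance).
F0 = 15.0
Q = 0.5
omega0 = 2 * math.pi * F0

def response(f):
    """|x(omega)/a(omega)| for the damped harmonic oscillator."""
    w = 2 * math.pi * f
    return 1.0 / math.sqrt((omega0**2 - w**2)**2 + (omega0 * w / Q)**2)

low_f = response(0.01)       # deep inside the gravimeter's measurement band
print(low_f, 1 / omega0**2)  # the two values nearly coincide
```

At 0.01 Hz the response differs from 1/ω₀² by far less than a part in a thousand, confirming the flat low-frequency behaviour used throughout the paper.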
Figure 1. Mechanical structure of the gravimeter.

In its working position the gravimeter's sensitive axis is directed along the local vertical. The gravity acceleration (9.8 m/s²) acting on the proof mass gives a displacement equal to 1.1·10⁻³ m. The central plate is machined in such a way as to compensate this large displacement: the external reference frame faces and the proof mass faces are coplanar only under the action of gravity. We recall that the minimum detectable acceleration (10⁻⁸ m/s²) produces a displacement equal to 6.3·10⁻¹³ m.

1.2. Read-out system
Figure 2 shows the gravimeter read-out electrical scheme. The sensing capacitors, C₁ and C₂, and two external fixed capacitors, C_a and C_b, connected in a bridge configuration, are used to get the signal. The bridge is driven through a transformer T by a voltage
$V_p = V_{p0}\cos(2\pi f_p t)$ at the pump frequency $f_p = \omega_p/2\pi$. Usually the bridge is close to its equilibrium position and between the points A and B the driving voltage is reduced to a few mV. Displacements of the proof mass give a variation of the two sensing capacitors, producing a bridge unbalance and a consequent modulation of the driving voltage at the mechanical signal frequency. The capacitive bridge output signal is sent to a low noise amplifier, characterised by a very high input impedance $Z_i$, an equivalent noise voltage generator $e_n$ and an equivalent noise current generator $i_n$. Table 1 lists the elements appearing in the scheme of Fig. 2, with their meaning. After the low noise amplifier the signal is demodulated and acquired.
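The synchronous read-out can be sketched numerically: a proof-mass displacement unbalances the bridge and appears as an amplitude on the 10 kHz pump; multiplying by the reference and averaging recovers it. The bridge-output amplitude below is an arbitrary illustrative value:

```python
import math

# Minimal numeric sketch of synchronous detection: a static bridge unbalance
# modulates the pump carrier; multiplying by the reference and averaging over
# an integer number of carrier periods recovers the amplitude.
FP = 10_000.0   # pump frequency (10 kHz, as in the electronics)
V0 = 0.002      # bridge output amplitude for some displacement (arbitrary)
DT = 1e-6       # sample interval
N = 100_000     # 0.1 s, i.e. an integer number of carrier periods

acc = 0.0
for i in range(N):
    ref = math.cos(2 * math.pi * FP * i * DT)
    acc += (V0 * ref) * ref    # modulated bridge signal times reference
recovered = 2 * acc / N        # demodulated estimate of V0
print(recovered)               # ~0.002
```

In the real instrument the averaging is performed by the low pass stage following the demodulator.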
Figure 2. Electrical scheme for signal detection: $a_x$ is the equivalent generator associated with the signal; $v_b$ is the equivalent generator associated with the Brownian noise; $e_n$ is the equivalent noise voltage generator associated with the amplifier; $i_n$ is the equivalent noise current generator associated with the amplifier; $Z_i$ is the amplifier input impedance and $v_{rc}$ is the equivalent voltage generator associated with the capacitor losses.
1.3. Sensitivity
At the capacitive bridge output, in the case $Z_i \gg 1/(2\pi f_p C)$, where $C$ is the capacity between the points A and B, there is the following voltage:

$$v_s = V_p\,\frac{C_1(\omega) - C_2(\omega)}{C_1(\omega) + C_2(\omega)}.$$

Considering that

$$C_1(\omega) \simeq C_0 + C_0\,\frac{b\,\vartheta}{d_1}, \qquad C_2(\omega) \simeq C_0 - C_0\,\frac{b\,\vartheta}{d_2},$$

if the distances between the capacitor faces are $d_1 \simeq d_2 = d$, we have:

$$v_s = \alpha\,\frac{x_s}{2}, \qquad \alpha = \frac{V_{p0}}{2d},$$

where $\alpha$ (the transducer factor between central-mass displacement and output voltage) represents the maximum field value inside each capacity detector. This formula establishes the link between the mechanical signal amplitude at frequency $f_s$ and the bridge output signal at the frequencies $f_p + f_s$ or $f_p - f_s$.

Brownian noise of the harmonic oscillator. In the mechanical oscillator the dissipations are associated with a Brownian noise generator, which produces an acceleration given by:

$$a_b^2(\omega) = \frac{F^2(\omega)}{m_r^2} = \frac{4 k_b T\,\eta}{m_r^2} = \frac{4 k_b T\,\omega_0}{m_r\,Q_m}.$$

In this equation $k_b$ is the Boltzmann constant and $T$ is the oscillator thermodynamic temperature. To obtain a low Brownian noise level a high mechanical quality factor is required.

Preamplifier noise. The amplifier noise contribution can be taken into account considering the equivalent noise generators $e_n$ and $i_n$, given by:

$$e_n^2 = 4 k_b T_n Z_n, \qquad i_n^2 = \frac{4 k_b T_n}{Z_n},$$

where $Z_n = e_n/i_n$ is the noise impedance and $T_n = \dfrac{e_n i_n}{4 k_b}$ is the amplifier noise temperature.

Total system noise. The total gravimeter noise, in terms of acceleration acting on the proof mass, can be expressed by the following formula:

$$a^2(\omega) = \frac{4 k_b\,\omega_0}{m_r}\left[\frac{T}{Q} + \frac{4\,T_n Z_n C_0\,\omega_0}{\beta}\right]\Delta f,$$

where $Q$ is the total system quality factor, including the mechanical oscillator dissipation and the thermal noise associated with the transducer losses, given by the equation:

$$\frac{1}{Q} = \frac{1}{Q_m} + \frac{1}{Q_{ele}}.$$

The new factors appearing in the previous formulas are: the inverse of the acquisition time, $\Delta f = 1/\Delta t$; the electromechanical factor

$$\beta = \frac{\alpha^2 C}{m_r\,\omega_0^2},$$

indicating the ratio between the electrical energy inside the transducer and the mechanical energy of the harmonic oscillator; and the electrical dissipation, given by

$$\frac{1}{Q_{ele}} = \frac{\omega_0\,\mathrm{tg}\,\delta}{\omega_p\,\beta},$$

where $\mathrm{tg}\,\delta$ is the loss tangent associated with the transducer. Table 2 lists the experimental values measured for each parameter and the evaluated gravimeter sensitivity.

1.4. Low dissipation electronics
A particular gravimeter arrangement has been studied to keep the total system dissipation below 300 mW. Figure 3 gives the instrument block diagram. A high stability quartz oscillator drives the gravimeter transducer with a voltage at 10 kHz. After the low noise AC amplifier the signal is sent to a demodulator which uses as reference a signal coming from the same oscillator. The demodulated signal is sent to a 24-bit A/D converter. A microcontroller device, every 10-60 s, starts the converter, reads the data and sends them to the RS-232 serial port. Through this port the GEOSTAR DACS (Data Acquisition and Control System) collects the data. The DACS can also give a signal for synchronising the gravimeter acquisition with the other instruments. In Figure 3 the power dissipation is indicated close to each block; the total gravimeter power budget is 190 mW.
Table 2
Gravimeter experimental values

a_min     Sensitivity                            (g/√Hz)       10⁻⁹
m_r       Proof mass                             (kg)          0.22
f₀        Frequency of resonance                 (Hz)          15
f_p       Bias frequency                         (kHz)         10
P         Pressure                               (mbar)        10⁻³
C₁, C₂    Sensing capacitors                     (pF)          300
tg δ_c1   Loss tangent                                         4·10⁻⁴
C_a, C_b  External fixed capacitors              (pF)          300
tg δ_ca   Loss tangent                                         3·10⁻⁴
C₃, C₄    Control capacitors                     (pF)          300
          Electronic device (Analog Devices)                   AD743/AD745
e_n       Voltage noise of amplifier             (V/√Hz)       3·10⁻⁹
i_n       Current noise of amplifier             (A/√Hz)       7·10⁻¹⁵
T_n       Noise temperature of amplifier         (K)           0.76
α         Transducer factor                      (V/m)         10⁵
β         Electromechanical transducer factor                  15·10⁻⁴
Q_e       Electrical quality factor                            10³
Q_m       Mechanical quality factor                            10⁻¹
Q         Total quality factor                                 10⁻¹
a_b²      Brownian noise                         ((m/s²)²/Hz)  7·10⁻¹⁷
a_el²     Electronic noise                       ((m/s²)²/Hz)  29·10⁻²⁰
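The Brownian-noise entry of Table 2 can be cross-checked against the formula $a_b^2 = 4k_bT\omega_0/(m_r Q)$ of Section 1.3. A minimal sketch, with parameter values from Table 2 and an assumed room temperature:

```python
import math

# Cross-check of the Brownian-noise entry in Table 2 using
# a_b^2 = 4*k_b*T*omega_0/(m_r*Q). Parameters are from Table 2;
# T = 293 K is an assumed room temperature.
KB = 1.380649e-23   # Boltzmann constant, J/K
T = 293.0           # assumed thermodynamic temperature, K
M_R = 0.22          # proof mass, kg
F0 = 15.0           # resonance frequency, Hz
Q = 0.1             # total quality factor (kept low by gas damping)

a_b2 = 4 * KB * T * (2 * math.pi * F0) / (M_R * Q)
print(f"Brownian noise: {a_b2:.1e} (m/s^2)^2/Hz")  # close to the 7e-17 in Table 2
```

The result agrees with the tabulated 7·10⁻¹⁷ (m/s²)²/Hz.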
Figure 3. Gravimeter electronics block diagram.
2. MECHANICAL ARRANGEMENT
During measurement operations the gravimeter must be almost horizontal, so that the gravity acceleration g acts parallel to the sensitive axis. A variation α of the angle of the gravimeter support plane gives a variation of the g component acting on it equal to $g_{com} = g\,(1-\cos\alpha)$. For small angle variations, as in the operative phase when GEOSTAR is steady on the sea floor, $g_{com} \simeq g\,\alpha^2/2$ is usually negligible. But in the first phase of the mission, when GEOSTAR is deployed on the sea floor, its position can be as much as 10° out of the horizontal plane, and such an angle produces a signal variation of 1.5·10⁻² g acting on the proof mass; such a large signal would drive the instrument out of its dynamic range. To avoid this without readjusting the gravimeter position, a particular arrangement will be implemented, since during the mission there is no possibility to communicate with the gravimeter.
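The tilt-induced component g_com = g(1 − cos α) can be evaluated directly for the two angles quoted in the text:

```python
import math

# Tilt-induced gravity component g_com = g*(1 - cos(alpha)), in units of g,
# evaluated for the two angles quoted in the text.
def g_component(alpha_deg, g=1.0):
    """Fraction of g appearing along the sensitive axis for a tilt alpha."""
    return g * (1 - math.cos(math.radians(alpha_deg)))

print(g_component(10))  # ~1.5e-2 g for a station 10 deg off horizontal
print(g_component(1))   # ~1.5e-4 g after pendulum recovery to within 1 deg
```

Both values match the figures quoted in the text and in Section 3.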
Figure 4. Mechanical arrangement of the gravimeter inside a spherical glass housing, allowing it to keep the vertical position.

The gravimeter is suspended with a wire inside a deep-sea glass instrument housing (see Fig. 4); in this condition the verticality of the sensitive axis is automatically recovered with a precision better than 1°. For such a value the g component acting on the gravimeter is 1.5·10⁻⁴ g. As we will see below, this value is low enough considering the dynamics of the instrument. The gravimeter position and the bridge read-out equilibrium are adjusted at sea level, and the pendulum enables us to recover this position also on the Mediterranean sea floor.
3. DYNAMICS
To establish the gravimeter dynamics for its operation during the scheduled 6-month mission, it is necessary to estimate the expected signal levels. In addition, it is necessary to estimate the reference signal value with respect to which the gravimeter provides its values (it should be remembered that the gravimeter measures variations of the vertical component of the acceleration with respect to an initial reference value). In the following we give an estimation of the gravimeter dynamics, evaluating the output signal and disturbance variations.

Minimum detectable signal. For a sensitivity of 10⁻⁹ g/√Hz and using a mechanical oscillator with a frequency of 15 Hz, it is necessary to detect proof mass displacements of 6.3·10⁻¹³ m. Such displacements are converted into an electrical signal by the capacitive transducer. Recalling that the preamplifier voltage noise is $v_n = 10^{-8}\ \mathrm{V}/\sqrt{\mathrm{Hz}}$, the transducer factor must be α > 10⁴ V/m. In our case the distance between the transducer capacitor faces is d₀ = 150 µm and the biasing voltage is V_b = 2.5 V.
Maximum signal variation. The signals that can be detected with the gravimeter are: vertical components of tides, teleseisms from remote earthquakes and low frequency movements of the Earth's crust surface. Predictions for these effects indicate a value not bigger than 10 µg.

Thermal variations. The experimental value of the gravimeter thermal stability is Δg/ΔT = 10⁻⁴ g/°C. This value is connected to the thermal variation of the elastic constant of the restoring springs which hold the proof mass. The daily temperature variation is supposed to be less than 10⁻³ °C, while during the 6 months of the mission it could be approximately 1 °C. The maximum signal related to this temperature variation is 10⁻⁴ g. In Section 4.3 we give the procedure used to correct this undesired signal.

Time-dependent variations. Experimental characterisations indicate a time-dependent output signal variation due to the relaxation of the spring torsional elastic constant and a consequent change of the proof mass equilibrium position. These displacements decay exponentially with time. Their experimental evaluation shows an upper limit equal to Δg/Δt = 5·10⁻⁵ g/day. During the 180-day mission, this exponential drift gives a variation of 9·10⁻³ g. The exponential signal is easy to remove, but it is necessary to take it into account in the dynamics evaluation.

Reference position signal. The mechanical arrangement allows the gravimeter to recover the verticality with a precision of 1°. Supposing that the gravimeter output is fixed at "zero" at the sea surface, when GEOSTAR reaches the sea floor a position 1° away from the old one gives a new reference level equal to 1.5·10⁻⁴ g. Hence, the value we consider for the dynamics estimation is 9·10⁻³ g, due to the time signal variation.

Signal dynamics evaluation. The instrument dynamics necessary to detect all the signals has to be:

Sig_max / Sig_min = 9·10⁻³ / 10⁻⁹ = 9·10⁶.

In order to reach such a value it is necessary to consider the dynamics of the mechanical and electronic parts.

Dynamics of the mechanics. The gravimeter mechanical part does not pose problems for the dynamics. The minimum acceleration signal gives proof mass displacements of 6.3·10⁻¹³ m, while the maximum displacement is limited only by the fixed plates of the capacitors facing the proof mass, at a distance d₀ = 150 µm.

Dynamics of the electronics. The low noise amplification gain is set to 100, in such a way that the input noise of the demodulator is negligible. Since its saturation voltage is 2.5 V, the dynamics (ratio between the saturation voltage and the minimum voltage to detect, i.e. the amplifier noise voltage) is 1.2·10⁶. The A/D (analogue to digital) converter is a 24-bit one, with a dynamics equal to 2²⁴ ≈ 1.6·10⁷. The amplifier dynamics is not enough to cover all the signal variations, but it must be considered that the time-dependent signal variations are over-estimated, and we are confident that this effect will be much lower during the mission, several months after the mounting of the gravimeter mechanical part.
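The dynamic-range bookkeeping above can be summarised in a few lines, using only numbers quoted in the text:

```python
# Dynamic-range bookkeeping of Section 3: required signal dynamics versus what
# the amplifier and the 24-bit A/D converter provide (all values from the text).
SIG_MAX_G = 9e-3        # dominated by the exponential drift over 180 days
SIG_MIN_G = 1e-9        # sensitivity target
required = SIG_MAX_G / SIG_MIN_G

AMP_DYNAMICS = 1.2e6    # saturation voltage over amplifier noise voltage
ADC_DYNAMICS = 2 ** 24  # ~1.6e7 counts

print(f"required: {required:.1e}, amplifier: {AMP_DYNAMICS:.1e}, ADC: {ADC_DYNAMICS:.1e}")
```

The comparison makes the text's point explicit: the converter covers the required 9·10⁶ while the amplifier alone does not.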
4. EXPERIMENTAL CHARACTERISATION
In this section we describe some experimental characterisations of the gravimeter we are realising.

4.1. Calibration
An easy way to calibrate the gravimeter is to detect the gravity acceleration component acting on it. It is possible to vary this component by changing the angle of the instrument support plane with respect to the horizontal position. As we have seen in Section 2, the gravity acceleration component is related to the angle variation α by the formula $g_{com} = g\,(1-\cos\alpha)$. The gravimeter is supported by two fixed legs and one that can be adjusted by a micrometric screw; changes of the micrometric screw position give a variation of the support plane angle. Figure 5a shows the output voltage of the gravimeter vs. the micrometric screw displacements, referred to an arbitrary position. In accordance with the linearity of the gravimeter response, the curve is an arc of sinusoid. Figure 5b shows the gravimeter output signal vs. the variation of the gravity component, evaluated from the previous values. In this case the calibration factor (ratio between input acceleration in g and gravimeter output in V) is: a_r = 5.7·10⁻¹ g/V.
4.2. Transfer function measurement
Figure 6a shows the gravimeter transfer function. It is obtained by forcing the system with accelerations of equal amplitude but swept in frequency from 1 Hz to 20 Hz. The accelerations are obtained using the control capacitors (see Figs. 1-2) as actuators, driven by a sinusoidal voltage added to a DC voltage. The driving signal and the gravimeter signal are sent to a spectrum analyser which gives the amplitude and phase graphs of the transfer function. The gravimeter is a second order system: it shows a resonance at 15 Hz, after which the signal is attenuated by 40 dB per decade, while the response is flat below this frequency. It should be recalled that the gravimeter frequency range is below its resonance frequency. In these measurements the mechanical quality factor is kept low by leaving gas trapped in the capacitor gaps, in order to allow the measurements (the spectrum analyser does not work properly when the system has a high quality factor). Figure 6b shows the gravimeter transfer function after the introduction of a low pass filter to lower the signal at the mechanical resonance frequency, when the system is under vacuum inside the deep-sea glass instrumentation housing.

4.3. Correction for the temperature and time signal variations
Figure 7 shows the vertical acceleration (black line) recorded by the gravimeter in about 2 days of acquisition, and the simultaneous value of the temperature measured with a thermometer connected to the central plate of the gravimeter (grey line). The data were detrended and the temperature multiplied by a constant value (a_temp = 10⁻⁴ g/°C) to convert it into acceleration variations. The starts of the two series of data are shifted along the vertical
Figure 5. (a) Output voltage of the gravimeter vs. the micrometric screw displacements; (b) value of the gravimeter output signal vs. the variation of the gravity component evaluated from the preceding values.
Figure 6. (a) Gravimeter transfer function; (b) gravimeter transfer function after the introduction of a low pass filter.
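The second-order behaviour seen in Figure 6a can be sketched with a normalised mass-spring transfer function; the resonance frequency of 15 Hz is from the text, while the quality factor below is an assumed illustrative value, not the measured one:

```python
import math

def second_order_gain(f, f0=15.0, q=0.7):
    """Amplitude response |H(f)| of a second-order oscillator with
    resonance f0 (15 Hz, from the text) and quality factor q (assumed),
    normalised to unity well below resonance."""
    r = f / f0
    return 1.0 / math.sqrt((1.0 - r * r) ** 2 + (r / q) ** 2)

# Flat below resonance; about 40 dB per decade attenuation well above it.
g1 = second_order_gain(150.0)    # one decade above f0
g2 = second_order_gain(1500.0)   # two decades above f0
slope_db_per_decade = 20.0 * math.log10(g2 / g1)  # close to -40
```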
Figure 7. Vertical acceleration (black line) recorded by the gravimeter in about 2 days of acquisition, and the simultaneous value of the temperature measured with a thermometer connected to the central plate of the gravimeter.

axis to make them coincide. The difference at the end of the data series is related to the time signal variation, which in this case is about 10⁻⁵ g/day. By removing this exponential signal variation it is possible to estimate a linear temperature correction, reducing the thermal effect by a factor > 100. The gravimeter will be equipped with a very sensitive thermometer, but it is still not clear whether it will be possible to use it to correct temperature variations as small as the ~10⁻³ °C expected at the Mediterranean sea floor.
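The linear temperature correction described above can be sketched as follows, assuming a simple least-squares detrend and the coefficient α_temp = 10⁻⁴ g/°C quoted in the text; the synthetic series are illustrative only:

```python
import math

def detrend(x):
    """Remove a least-squares straight line from a uniformly sampled series."""
    n = len(x)
    t_mean = (n - 1) / 2.0
    x_mean = sum(x) / n
    num = sum((i - t_mean) * (xi - x_mean) for i, xi in enumerate(x))
    den = sum((i - t_mean) ** 2 for i in range(n))
    slope = num / den
    return [xi - x_mean - slope * (i - t_mean) for i, xi in enumerate(x)]

def temperature_correct(acc, temp, alpha_temp=1e-4):
    """Subtract the temperature-induced part (alpha_temp * T, in g)
    from the detrended acceleration record, as described in the text."""
    a = detrend(acc)
    t = detrend(temp)
    return [ai - alpha_temp * ti for ai, ti in zip(a, t)]

# Synthetic check: an acceleration record that is purely thermal
# should be reduced essentially to zero by the correction.
temp = [0.01 * i + 0.2 * math.sin(i / 7.0) for i in range(100)]  # degC
acc = [1e-4 * ti for ti in temp]                                  # g
residual = temperature_correct(acc, temp)
```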
5. UNDERGROUND LONG-TERM MEASUREMENTS

It is necessary to stress that, to simplify the system and to limit power consumption, the gravimeter is not thermally stabilised. As we wrote, we are confident that on the Mediterranean sea floor the temperature variations will be less than 10⁻³ °C over several days. The above-mentioned experimental characterisations were carried out in the lab. In these conditions the seismic noise level and the thermal variations are very high, and it was not possible to perform measurements at high sensitivity levels. To demonstrate the system capability at low signal levels, we show here some geophysical measurements performed with an inclinometer similar to the gravimeter described here [5, 6]. The sensitive mass is placed in a vertical position so that the gravity moment acting on it is almost zero. Angle variations from verticality give the variations of the moment to be detected.
Figure 8. Earthquake from Japan recorded with our inclinometer located in the Gran Sasso, L'Aquila underground laboratory of the National Institute of Nuclear Physics (INFN).
Figure 9. Vertical component of the Earth solid tides.
Figure 8 shows an earthquake from Japan recorded with our inclinometer located in the Gran Sasso underground laboratory of the National Institute of Nuclear Physics (INFN). Figure 9 shows the vertical component of the solid tides of the Earth recorded in the same place.

6. CONCLUSIONS

The main characteristics of the gravimeter realised for the GEOSTAR mission are its very high sensitivity (10⁻⁹ g/√Hz) in a frequency range of 10⁻⁵ to 1 Hz and its high dynamic range (1.2 × 10⁶). The high thermal stability conditions on the Mediterranean sea floor and the intrinsic thermal stability of the instrument permitted us to avoid electronic thermal stabilisation; these conditions, together with the new electronic technology, allowed us to keep the power dissipation at a level of 300 mW. The instrument described is suitable for use in deep-sea long-term measurements, in particular as an element of a network covering the whole planet. Measurements of tides and free oscillations of the Earth, performed in extremely stable conditions of temperature and pressure and in the absence of local disturbances, can give information on its elastic constants and on its internal structure.
ACKNOWLEDGEMENTS We would like to thank Dr. P. Favali, G. Smriglio and all the GEOSTAR staff, for helpful discussions and active collaboration. This work was supported by the ING (Istituto Nazionale di Geofisica e Vulcanologia).
REFERENCES
1. V. Iafolla, S. Nozzoli, A. Mandiello, 2nd Joint Meeting of the International Gravity Commission and the International Geoid Commission, Trieste, September 1998. 2. F. Sansò, A. Albertella, G. Bianco, A. Della Torre, M. Fermi, V. Iafolla, S. Nozzoli, A. Lenti, F. Migliaccio, A. Milani, A. Rossi, Towards an Integrated Global Geodetic Observing System (IGGOS), Munich, October 1998. 3. V. Iafolla, E. C. Lorenzini, V. Milyukov and S. Nozzoli, Gravitation & Cosmology, 3 (1997) 151. 4. V. Iafolla, S. Nozzoli, E. C. Lorenzini and V. Milyukov, Rev. Sci. Instrum., 69 (1998) 4146. 5. F. Fuligni, V. Iafolla, V. Milyukov and S. Nozzoli, Il Nuovo Cimento, 20C (1997) 637. 6. F. Fuligni and V. Iafolla, Il Nuovo Cimento, 20C (1997) 619.
Part IV Marine environmental and risk assessment
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
The Deep-Sea as an Area for Geotechnical Intervention

H. U. Oebius a and H. W. Gerber b

a Technische Universität Berlin, FB 6/ITU-WUM, Sekr. VWS, Mueller-Breslau-Str. (Schleuseninsel), D 10623 Berlin, Germany

b Technische Fachhochschule Berlin, FB 9 (Maschinenbau), Luxemburgische Str. 10, D 13353 Berlin, Germany
Together with German joint projects assessing environmental impact in connection with the recovery of raw materials in the Peru Basin, geotechnical investigations were also carried out. The results of mineralogical, sediment-petrographical and soil mechanical investigations of the upper part of deep-sea sediments were used to assist geology, geochemistry and benthology in the interpretation and evaluation of the early diagenesis of deep-sea sediments; of resuspension, resettlement, and transport phenomena; of slope stability and bioturbation influence studies; of the appearance and significance of lebensspuren in the sediments; as well as of consolidation, geochemical, and microbiological processes. This contribution will illustrate the background for and the usage of geotechnics in deep-sea research; it will demonstrate the difficulties involved in deep-sea geotechnics, give some examples of the useful application of geotechnics, and propose the development of an in-situ soil mechanical laboratory.
1. INTRODUCTION Geotechnics, as the name itself explains, is a civil engineering discipline which deals with the mechanical stability of soils under the influence of external and internal forces. It is well known and accepted in terrestrial, coastal and even in offshore engineering, where it is concerned with natural soil mechanical failures (landslides, soil subsidences, sediment transport), and in onshore, shoreside and offshore construction works of all kinds. But deep-sea geotechnics? What is that good for? The deep sea, covering 50 % of our globe's surface, is still the last frontier for scientific and engineering activities. The deeper we go the less explored and understood becomes that extremely hostile marine environment with high salt content, high pressure, low temperatures (or high temperature gradients), high corrosion impacts, low visibility, mostly very soft soils, and a biotope of extremely high diversity but low number of individuals. Despite very
ambitious and multinational research projects like WOCE, JGOFS, or CLIVAR, which have remarkably increased our understanding of the oceanic cycles and of the far-reaching interaction between the deep-sea area and climate change, the connection to the benthic environment is not yet entirely understood, mostly due to a lack of appropriate means to assess the processes and changes and to the enormous expenditures connected with ocean research [1]. However, something we do know, or have to anticipate, is that the deep sea is a place where natural events of extraordinary dimensions, kinetic energy, and impact on the climatic and biological environment happen, like El Niño/La Niña, which affects the abyssal and benthic particle flux and life forms in a still unknown quality and quantity. Turbidity currents and landslides attract special attention since possible associations with gas hydrates have been proposed and since the enormous distances which they can travel as well as the damage which they can cause have become apparent (Fig. 1). The layer of recently-deposited deep-sea sediments, largely uninteresting for marine geology and geophysics, on the other hand serves not only as recipient, storage and exchange area for all kinds of chemical, biological, and sedimentological compounds, including the early diagenetic products which finally generate basin sediments, but also as a habitat for a variety of benthic life forms, some of which we do not even know yet. The deep sea, the abyssal plains, the hot spot and cold vent areas as well as the metalliferous deeps are a potential living and non-living resource. The exploitation of the partly very specialized biotope for pharmaceutical and medical purposes and the commercial recovery of manganese nodules, manganese crusts, phosphorites, metalliferous muds, or massive sulphides will also largely affect the deep-sea sediments.
The deep sea is also the world's greatest dumping site for the residues of a number of hydrological, geological and anthropogenic processes. These will finally reach the deep regions of the oceans and will be deposited there. Also, the artificial deposition of all kinds of hazardous and non-hazardous wastes in subduction areas and/or deep-sea basins is still being discussed by politicians and engineers, a process which would interfere with the hydrolithosphere interface, covering, exposing, generating, disturbing and/or eroding parts of it. Most of the above-mentioned events and activities interfere with the mechanical structure of the sediment layer and, therefore, can be explained by, or will require, geotechnical study. This refers not only to standard problems such as the stability of subsea slopes, the foundation of scientific and commercial structures on the seafloor, its trafficability, or the removal of raw materials and parts of the recent sediment, but also to early diagenetic questions of deep-sea sediments. It also refers to the role of consolidation processes in the exchange of chemical compounds from interstitial to abyssal waters, to the depth demarcation of the living space of burrowing animals, and to the importance of different strata for settling animals. For geotechnics to be of assistance in providing reliable solutions to some of the above-mentioned problems, it needs appropriate technical means and scientifically based standards.
[Map of the Grand Banks region: piston core stations, submarine telegraph cables, the area of slides and slumps near the epicenter, the area travelled by the destructive turbidity current (broken cables), the marginal area of weaker current (cables buried, not broken), the 100-fathom contour, the abyssal plain, and the hills and mountains northeast of Bermuda.]
Figure 1. Grand Banks landslide.
2. DEEP-SEA GEOTECHNICS
Marine geotechnics is a multidisciplinary science. At least three disciplines are required to fully understand the mechanical behaviour of deep-sea sediments and to develop appropriate standards and predictive models. The first is a geological, geochemical and geohydrological approach to the determination of the composition and structure of deep-sea soils. The second is an engineering approach applying terrestrial geotechnical experience to marine conditions, including the development of appropriate sampling and testing equipment and methods, as well as of satisfactory standards. The third and highly important one addresses biology, recognizing its major role in deep-sea sediments, where plankton detritus and micro-biological effects play a dominant part, as well as an input from geotechnics to the understanding of the behaviour of, for example, burrowing animals. There are many characteristics which make deep-sea soils different from terrestrial or coastal sediments; therefore the application of what we know about geotechnics to deep-sea conditions is not that simple. The main differences can be pointed out as follows [2]. The deep-sea floor is predominantly composed of depositional material. Diagenesis and lithification of biogenic material have a strong influence on its structure. Deep-sea sediments are 100 % saturated, with the exception of isolated inclusions of gas and gas hydrates in continental slope areas. The interstitial water is primarily salt water, which contains different chemical compounds in solution that may have cementing effects when precipitating. Deep-sea sediments normally have very high void ratios, especially in the Holocene layer [3]. Typical components are carbonate sands, which occur predominantly in marine areas and whose behaviour differs remarkably from other sandy materials. Biogenic fossils are an important factor in the composition of deep-sea sediments.
They have a strong influence on the structure of the sediments as well as on the reactions to induced stresses [3]. Biogenic activities of meiofauna (bioturbation) and especially microfauna (bacteria carpets) may have a cementing rather than a destructive effect [1]. The more detailed investigation of the importance of these differences alone opens a large field for research.
3. EXAMPLES OF THE APPLICATION OF GEOTECHNICS IN THE DEEP SEA

3.1. Baseline Studies
Terrestrial geotechnical baseline studies include the physical characteristics of the sediments (grain size, soil classification, water content, density, packing, frost criteria, porosity), geotechnical sampling, geohydrology, mechanical fundamentals of the sediment (structural analysis, fractures, tensions, frictions, elasticity), pressure distribution, pressure/settling behaviour, consolidation, shear strengths, and soil subsidence. The same baseline studies are essential on the ocean floor and have to provide a satisfactory geotechnical description of the structure and behaviour of the deep-sea sediments under study. Since the necessary sophisticated geotechnical laboratory equipment is far too
complex to be deployed on the seafloor, the baseline studies have to be accomplished on board research vessels, under the provision, however, that large-volume, undisturbed sediment samples are made available [4]. This was not possible until recently. The two sediment samplers in use unfortunately did not provide sufficiently undisturbed samples. The boxcorer, which provides large-volume samples, has the disadvantage that normally between 2 and 4 cm of the top layer is missing, whilst the multicorer, which provides samples with an undisturbed top layer, suffers from very small diameter (90 mm) sampling tubes with too high wall friction, as well as from the lack of an in-situ bottom closure, which provokes an elongation of the sample and therefore a structural disturbance [5]. Consequently a new sediment sampler had to be developed and tested. This resulted in the design and construction of the Maxicorer, which, for the first time, provided undisturbed sediment samples from the deep sea on board a research vessel with sufficiently high quality for geotechnical investigations (Fig. 2).
3.1.1. Sediment Transport
Sediment transport is not just a phenomenon of inland waters and coastal areas. Currents which are strong enough to erode the top sediment layer also appear in the deep sea. Canyons and seamounts can provide narrow channels through which the tidal currents pass at high velocities [6]. There are also natural drift currents, vortices, or subsea storm currents, the velocities of which are high enough to cause erosion of the seabed and sediment transport. The presence of hiatuses in basin sediments cannot be explained without the existence of such erosional events [7]. The possible erosion depends on the composition of the top-layer sediment, the shear stress of the current and the frequency of eroding events. Determining these parameters and forecasting erosion causes great problems in inland, coastal and deep-sea waters. The second parameter is especially difficult to obtain by measurements, since the near-bottom currents, which are responsible for the creation of the shear velocity, as well as the velocity distribution in the boundary layer, do not lend themselves to easy access. In the deep sea the first parameter also causes problems, since the influence of micro-organisms on the stability of the recent top layer is hard to assess, as we also know from the Wadden seas [8]. To overcome these difficulties an instrument was developed, capable not only of directly measuring the shear stress at the seafloor but also of detecting visually the moment of incipient erosion. This shear stress meter was designed and developed for inland and coastal waters down to 40 m, but it can be adapted for deep-sea conditions.
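The difficulty of inferring bed shear stress from near-bottom currents can be illustrated with the classical law-of-the-wall estimate (a textbook approximation, not the direct measurement made by the shear stress meter described above); the roughness length, measurement height and water density below are assumed illustrative values:

```python
import math

VON_KARMAN = 0.41  # von Karman constant

def shear_velocity(u, z, z0):
    """Friction velocity u* [m/s] from a single near-bottom current
    measurement u at height z above the bed, assuming a logarithmic
    boundary layer with roughness length z0: u = (u*/kappa) * ln(z/z0)."""
    return VON_KARMAN * u / math.log(z / z0)

def bed_shear_stress(u, z, z0, rho=1027.0):
    """Bed shear stress tau = rho * u*^2 [Pa]; rho is an assumed
    deep-sea water density."""
    return rho * shear_velocity(u, z, z0) ** 2

# e.g. a 10 cm/s current measured 1 m above a smooth abyssal bed
tau = bed_shear_stress(0.10, 1.0, 1e-4)  # roughness length assumed
```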
3.1.2. Resuspension/Resettlement
"Deep-sea sediment" as a general description does not exist. It differs locally as much as sediment does on land. Consequently each location has to be treated individually.
Figure 2. Maxicorer sample.

Resuspension of deep-sea sediment appears in conjunction with any natural or anthropogenic activity which interferes with the top layer of the seafloor. Since the sediment is mostly composed of very fine particles with extremely low settling velocities, there is a risk of resuspended material being transported over long distances [3, 8, 9]. However, two factors act in the opposite direction:
- the semi-liquid top layer of deep-sea sediments is apparently not resuspended completely in the form of individual fine particles, but mostly broken from the sediment layer in the form of chunks (Fig. 3) [1, 10, 11];
- aggregation of clay-like particles starts instantaneously after resuspension, forming greater particles which resettle fast [9].
These first general results from investigations in the Peru Basin have been counter-checked and verified by further measurements in the Central SW Pacific [8, 11]. Another subject which can be put under this topic is that of slope instability. This phenomenon occurs naturally when the inner friction of a sediment body is reduced along a sliding surface, for example by increased pore pressure or too high a slope inclination. In this case the sediment body will break off along this surface and glide down the slope, forming a turbidity current. Since the reasons for such instabilities are soil mechanical failures, geotechnics can assist in the understanding of the trigger functions for their

Figure 3. Behaviour of deep-sea sediment after impact by water jets.
occurrence, e.g., whether gas hydrates can produce such slide faces or just trigger such failures.

3.1.3. Consolidation
Immediately after settling, the particles start to consolidate under the influence of gravity. However, in contrast to terrestrial soil mechanics, the upper part of deep-sea sediments seems not to follow this rule. Its composition of aggregates with mostly great volumes and low densities, as well as the complete saturation of the soil (Fig. 4), result in a buoyancy which is supported by the electrical charging of the clay-like particles and which provokes the particles to form a card-house-like structure. Without influences from bioturbation, this non-consolidated top layer can be clearly distinguished from the consolidating sublayer [1, 3]. This buoyancy of the completely submerged particles also has other consequences. If upward fluid flow occurs in the sediment, it can support the buoyancy of the aggregates and prevent them from consolidating down to even greater sediment depths. Such "quick sands" in the deep sea (or "mud volcanoes"), reaching down to several meters, have been detected all over the oceans and even in the Wadden seas. They appear mostly in cold seep areas, areas of gas hydrate deposits, or in fracture zones. Since their extension is mostly relatively limited, they are difficult to detect [12]. There is also another puzzling effect in top-layer geotechnics. Although the shear strength does not show remarkable changes, the water content normally drops by 20 %. The reason for this behaviour is not yet completely understood. One explanation could be that bioturbation plays a dominant role, destroying the aggregates and transporting the fine clay-like particles inside the shells of plankton detritus. This would displace water and simultaneously have no remarkable effect on the shear strength [3].

3.1.4. Bioturbation
Bioturbation starts as soon as the particles have settled on the seafloor.
A number of species digest the particles, or live and seek protection in the sediment. The crawling animals looking for shelter or nourishment and the traces of their activities can be seen on the sediment surface. In nearly every sediment sample there also occur traces of burrowers down to several meters (Fig. 4). One of the questions not yet solved is how deep these creatures really burrow. There are two factors which make one hesitate to accept that the deeper traces are newly produced:
- the availability of foodstuff decreases rapidly below the semi-liquid top layer, which gives the animals no motive to burrow deeper;
- the shear strength of the sediment increases rapidly with depth, so that the burrowers would really have to work hard to dig themselves through.
Shelter could be a plausible reason for the creatures to ignore these inconveniences; but until we examine the traces biologically, chemically, and geotechnically, there can be no reliable answer.
Figure 4. Typical shear strength distribution in deep-sea sediments.

On the other hand, shear strength measurements are a relatively simple means to quickly assess the degree of bioturbation. The lebensspuren of the burrowers can be detected, provided a shear strength meter can be settled on the seafloor, it can penetrate the sediment down to a depth of about 50 cm, and its sensor is small enough to distinguish between the ducts and the undisturbed sediment. The result will be either a signal of the integrated effects of animals burrowing in the semi-liquid consistency of the top layer, or a clearly distinguishable individual signal. Such measurements have also revealed areas where bioturbation is very limited [1].
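As a sketch of how such an individual duct signal might be separated from the undisturbed profile, one could flag depths where the measured shear strength falls well below the local median; the window size, threshold and synthetic profile below are illustrative assumptions, not a calibrated procedure:

```python
def detect_ducts(profile, window=5, drop_fraction=0.5):
    """Flag depths where the shear strength falls below a given fraction
    of the local median, a crude way to distinguish an individual burrow
    duct from the undisturbed sediment.
    profile: list of (depth_cm, shear_strength_kPa) tuples."""
    flagged = []
    values = [s for _, s in profile]
    half = window // 2
    for i, (depth, s) in enumerate(profile):
        lo = max(0, i - half)
        hi = min(len(values), i + half + 1)
        neighbourhood = sorted(values[lo:hi])
        median = neighbourhood[len(neighbourhood) // 2]
        if median > 0 and s < drop_fraction * median:
            flagged.append(depth)
    return flagged

# Synthetic profile: strength rising with depth, one burrow duct at 30 cm
profile = [(d, 0.5 + 0.1 * d) for d in range(0, 50, 5)]
profile[6] = (30, 1.0)  # duct: strength well below the local trend
ducts = detect_ducts(profile)
```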
3.2. Trafficability/Raw Material Recovery
The trafficability of deep-sea sediments is of great importance for deep-sea mining. Here, not only the bearing capacity has to be secured but also the manoeuvrability of vehicles on the seafloor [13, 14, 15, 16]. In terrestrial geotechnics, plate bearing tests have become an accepted technique to determine the specific bearing capacities of different types of sediments. This
technique therefore was also applied to deep-sea sediment during cruises into the Peru Basin, i.e., circular plates of different sizes (20 cm², 50 cm², 100 cm²) were pressed slowly, at constant penetration velocity, into the smooth clay-like and silty sediments of Maxicorer samples, and the penetration-dependent force was constantly measured. The results of these measurements are shown in Fig. 5. This figure reveals another difficulty in transferring geotechnics from land to the deep sea: if too small a plate diameter is chosen, the plate acts like a cone, displacing rather than pressing on the sediment. Reliable and transferable measurements need a minimum diameter.
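The plate bearing measurement reduces to a force-over-area conversion; a minimal sketch, with the plate areas taken from the text and the force value purely illustrative:

```python
def bearing_pressure(force_n, area_cm2):
    """Convert a measured plate force [N] and plate area [cm^2]
    into a bearing pressure [kPa]."""
    return force_n / (area_cm2 * 1e-4) / 1000.0

# Plate areas from the text; the same (illustrative) force on a smaller
# plate gives a higher pressure, part of why too small a plate
# acts like a cone rather than a bearing surface.
PLATE_AREAS_CM2 = {"I": 20.0, "II": 50.0, "III": 100.0}

p_small = bearing_pressure(100.0, PLATE_AREAS_CM2["I"])    # 50 kPa
p_large = bearing_pressure(100.0, PLATE_AREAS_CM2["III"])  # 10 kPa
```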
3.3. Geochemistry
Interdisciplinary investigations in the Peru Basin have clearly demonstrated a close connection between geochemistry and geotechnics. This is understandable, since both disciplines have to rely very much on the flux of interstitial water. Its displacement from the sediment body by consolidation effects also influences the chemistry of the upper layer. The profiles show a distinct connection between the geotechnical behaviour of the semi-liquid top layer and the geochemical composition. This interrelationship is now being investigated in more detail under the auspices of a special Sino-German research project.
4. DEEP-SEA GEOTECHNICAL INSTRUMENTATION
4.1. Undisturbed Sediment Samples (Maxicorer)
Assessment of the undisturbed situation of a natural deep-sea environment is an essential prerequisite if changes due to natural or anthropogenic impacts on the sea floor in marginal and deep seas are to be determined and forecast. Many of these factors cannot be assessed except by in-situ experiments (which explains the desire for bottom landers), due to the required operational conditions and the insufficiency of sampling procedures. This also affects geotechnical measurements, which have to rely on undisturbed sediment conditions. Unfortunately, no appropriate in-situ instrumentation is available since, as mentioned above, the existing samplers cause unacceptable structural changes of the sediment and of the benthic milieu. The Maxicorer was designed to unite the advantages of both classic sediment samplers while avoiding their disadvantages (Fig. 6). It provides three sampling tubes, each 300 mm in diameter and 600 mm in length, constructed from acrylic glass to enable direct optical observation. They are assembled around a damped central deployment device which guarantees a slow penetration of the deep-sea sediment without displacing it before coring. They are attached by a special suspension which leaves the sampling tubes hanging freely. A metal cutting blade at the lower end prevents the sampling tubes from being destroyed and simultaneously facilitates penetration. The weight of a filled sampling tube is about 50 kg [5].
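A quick plausibility check of the quoted ~50 kg filled-tube weight; the tube dimensions are from the text, while the wet bulk density of soft deep-sea sediment and the empty-tube mass are assumed illustrative values:

```python
import math

def filled_tube_mass(diameter_m=0.30, length_m=0.60,
                     bulk_density=1200.0, tube_mass_kg=3.0):
    """Rough mass of a filled Maxicorer sampling tube [kg]. Tube
    dimensions (300 mm x 600 mm) are from the text; bulk_density and
    tube_mass_kg are assumptions for illustration."""
    volume_m3 = math.pi * (diameter_m / 2.0) ** 2 * length_m
    return volume_m3 * bulk_density + tube_mass_kg

mass = filled_tube_mass()  # in the vicinity of the ~50 kg quoted above
```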
211
,ooo ~'
~
/
/
9 Plate III (lOOcm2)
o 5o0o .,--.~
.
~_) 4000
.
Platell
(50cm 2)
A Plate I .
.
/
/
(2 .
u_
3000
2000
.~.~
,ooo
o!
~"
"
0
10
20
30 40 50 PENETRATION DEPTH
60 [mm]
70
6000
9 Plate III (lOOcm2) 5OOO
'--"Z 4000 LU n" :D O0 3000 CO UJ n,"
m Platell .
.
.
.
(50cm 2) .
/
.
13_ 2000
1000
10
20
30
40
PENETRATION
50
DEPTH
60
[mm]
Figure 5. Plate bearing test.
Figure 6. Maxicorer.

The deployment unit is installed within a frame which carries, protects and positions the central unit. It is constructed from six pipes which are based on a triangular fundamental structure, also manufactured from pipes. The diameter of the base is about 3.2 m. It has to support the Maxicorer on the deep-sea sediment. In the case of very soft sediment there is the possibility to add special feet to the frame. The deployment unit is suspended from this frame by a Cardan joint, which allows a maximum inclination of the central unit of about +/- 5°. The sampling tubes are closed at both ends in-situ by a special mechanism to which the deep-sea cable is attached. When the coring has been completed and the Maxicorer is being lifted, the cable starts to move three levers with cutting blades at their ends, similar to that of the boxcorer. These blades reach under the sampling tubes, closing them tightly. Simultaneously, three top locks are released which fall into place at the upper end of the tubes by their own weight, sealing them watertight. Tests with the Maxicorer have revealed that the increase of temperature of samples taken in the tropical central Pacific remained less than 4 °C. On board the research vessel the base of the frame can be opened at either edge of the triangle, giving access to the sampling tubes, which can be easily removed from the suspension. A processing unit, enabling the extrusion of the sediment in the sampling tubes in small steps and/or the cutting of the tubes into two halves, is used with the Maxicorer.
4.2. Long-term Geotechnical Bottom Station
Currents that are supercritical with respect to sediment stability provoke erosion, which can produce scours around natural and artificial structures in the aquatic environment, large-scale sediment transport changing the morphology, or the complete removal of sediment layers (geological hiatus). During the EU MAST I and II programmes, therefore, an in-situ shear stress meter was designed and successfully deployed in coastal waters down to 40 m water depth and in the River Rhine (Fig. 7). This station not only measured currents in the boundary layer and directly above the seabed as well as the induced shear stress, but also vertical forces on the seafloor, the thickness of sediment ripples migrating across the instrument, and the wave heights and periods at the sea surface. The shear stress meter consists of a shear plate of about 275 mm in diameter mounted on an artificial circular "dune" of about 800 mm in diameter, suspending the shear plate, providing constant height above the seafloor, and, by its contour, guaranteeing representative forces from arbitrary directions. The circular shear plate is constructed from buoyant resin material covered with sediment of uniform diameter, thus providing a defined roughness. It is suspended from cantilever beams in horizontal and vertical planes and equipped with strain gauges connected in a fully compensated Wheatstone bridge.
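To first order, the shear plate converts a measured horizontal force into a bed shear stress by dividing by the plate area; a sketch under that simplification, using the 275 mm plate diameter from the text and an illustrative force value:

```python
import math

PLATE_DIAMETER_M = 0.275  # shear plate diameter from the text

def plate_shear_stress(horizontal_force_n):
    """Bed shear stress [Pa] inferred from the horizontal force measured
    by the strain-gauge bridge: tau = F / A over the plate area.
    Calibration and dune-edge effects are ignored in this sketch."""
    area_m2 = math.pi * (PLATE_DIAMETER_M / 2.0) ** 2
    return horizontal_force_n / area_m2

# An illustrative drag force of 0.03 N corresponds to roughly 0.5 Pa.
tau = plate_shear_stress(0.03)
```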
Figure 7. Image of a geotechnical bottom station.
The overall height of the shear stress meter is about 54 mm. The station moreover contains two current meters, at 200 mm and 500 mm above the shear plate respectively, a compass, two inclinometers, a pressure sensor, an adjusted high-resolution pressure sensor for wave measurements, and an underwater video recorder. Energy supply, data collection and storage happen on board the station in a pressure case. It is now planned to use this in-situ shear stress meter as the basic structure for a deep-sea geotechnical bottom station. Since the shear plate, because of its large buried diameter, will prevent free-fall/pop-up deployment (the adhesion forces being too high to be overcome by the buoyant components), means will have to be found which allow deployment and recovery from aboard a research vessel. The mobile docker, designed and constructed during the EU GEOSTAR project, seems to be an appropriate means for this purpose. This device has been tested successfully during the deployment and recovery of a geophysical station, but will have to be adjusted to stations smaller than GEOSTAR. The deep-sea geotechnical bottom station will contain the shear plate as designed for MAST I and II, which has to be adapted to deep-sea conditions. This means that two different cantilever beam systems will be constructed, one for current velocities between 0 and 50 cm/s and one for velocities between 10 and 250 cm/s. The shear plate roughness will be adjusted accordingly. Compass, pressure meter, and inclinometers will be installed as before. The station will be equipped with two ADCPs, a short-range and a long-range type, with a sediment shear strength meter for soil mechanical measurements, with plate-bearing test equipment, and with a scouring device. It will also be equipped with video recording and a time-lapse camera. Connection of this station to oceanographic tomographic chains can be foreseen.
5. CONCLUSIONS

We started this paper with the question: "What is deep-sea geotechnics good for?". This contribution has tried to give some answers, showing how geotechnical analysis can illuminate some of the mechanics of natural and artificial events by adding complementary perspectives. The spectrum is relatively broad and includes all questions that deal with changes at the hydrosphere-lithosphere interface. Some of these changes cannot even be explained without the application of geotechnics. Geotechnics is not a new discipline in the deep sea, but some problems could not be tackled until now because appropriate sediment sampling devices and geotechnical equipment were lacking. Now that these deficiencies have been overcome, geotechnics can contribute to the ocean sciences on a broader basis.
ACKNOWLEDGEMENT
The authors want to thank the German Federal Ministry for Education, Research and Technology for funding the research projects which made this contribution possible. They also want to thank their colleagues from various marine disciplines for valuable discussions and fruitful interaction.
Science-Technology Synergy for Research in the Marine Environment: Challenges for the XXI Century, L. Beranzoli, P. Favali and G. Smriglio, editors.
© 2002 Elsevier Science B.V. All rights reserved.
Offshore hydrocarbon leakage: hazards and monitoring

G. Etiope^a, P. Carnevale^b, F. Gasparoni^b, M. Calcara^a, P. Favali^a,c and G. Smriglio^a

^a Istituto Nazionale di Geofisica e Vulcanologia, Via Vigna Murata 605, 00143 Rome, Italy
^b Tecnomare S.p.a., S. Marco 3584, 30124 Venice, Italy
^c University "G. D'Annunzio", Via dei Vestini, Campus Universitario, Chieti, Italy
Natural leakage to the seafloor of gaseous hydrocarbons from source beds or reservoirs represents an important hazard for offshore oil exploration, construction and exploitation activities. Evidence for, and the impact of, seafloor gas occurrence are examined, and the potential of benthic observatories for continuous, long-term, multiparametric monitoring is discussed.
1. INTRODUCTION

The natural occurrence and emission of gas on the seafloor have profound implications for different tasks of the offshore oil and gas industry, including hydrocarbon exploration, the construction of facilities, and exploitation activity. A number of case histories have documented catastrophic "blow-outs" during drilling operations, or damage to offshore structures, due to gas leakage phenomena and shallow gas pockets [1]. Safety practice in the offshore drilling industry recommends the execution of shallow gas surveys before any drilling activity. Gas at high pressure may "blow out" if the pressure is suddenly released, endangering the drilling rig and any support vessels. Pipeline and cable laying activities can be affected by the presence on the seafloor of pockmarks caused by the eruption of gas from the sediment. Underwater slides may occur if the entrained gas causes a sufficient decrease in the shear strength of the sediment. The presence of gas is also known to be one of the factors causing cementation of the sediment: sulfate-reducing microbes in the shallow sediment utilise leaking hydrocarbons as a source of carbon, oxidising them and producing dissolved bicarbonate, which results in the precipitation of calcium carbonate.

In the last decade, submarine geophysical and geochemical investigations have allowed the discovery of new, important, seafloor gas-bearing structures (e.g., pockmarks), and of diffuse or localised gas leakages in many hydrocarbon-prone areas throughout the world [2,3,4,5]. In this work, we briefly review the main evidence for seafloor gas leakage and examine the potential impact or implications for the offshore oil industry. Then, a new approach for monitoring seafloor gas occurrence, based on benthic observatories, is proposed. In particular, the potential of the recently developed GEOSTAR observatory for geophysical, geochemical and geotechnical measurements is described.
2. EVIDENCE FOR OFFSHORE GAS LEAKAGES
Offshore investigations have frequently indicated the widespread occurrence throughout the world of gas in and on the seafloor (Fig. 1), either in the form of visible gas emissions (gas vents), as pockets (bubbles) of undissolved gas trapped in the seabed sediments, or as hydrates (methane clathrates). Evidence of fluid escape includes geomorphic features such as seep precipitates (carbonates), pockmarks, piping/rills, brine pools, and mud volcanoes. These geomorphic signatures have been well described at scales ranging from kilometers to meters. The gas is typically methane and carbon dioxide derived from biogenic production, from thermal cracking of organic compounds at great depths, and from migration from source beds or fluid reservoirs to the seafloor along preferential routes of secondary permeability, i.e., faults and fracture networks induced by tectonic processes (Fig. 2). These gas manifestations are an expression of lithosphere degassing, whose geological and global-change implications have recently been discussed in [6].
Figure 1. Global distribution of fluid seeps. Hydrocarbon seep and pockmark distribution (from [5]).
Figure 2. Schematic cross-section of seepage system and the role of tectonics in focusing fluid flow. Seafloor manifestations of seepage include direct expulsions of oil and gas, gas hydrates, biological communities and precipitates of various types. Continuous and long-term data related to seepages can be monitored by multidisciplinary benthic observatories (redrawn from [5] with additions).
2.1. Free gas and pockmarks

Pockmarks represent the main evidence of free fluid release from the sub-seafloor [7]. They are cone-shaped depressions, usually due to "blow-outs" of gas, occurring worldwide on the floors of all the oceans, in shallow waters and at depths of thousands of meters, within clays, silts and sands. Typical pockmarks range in size from less than 1 m to 0.5 km across, and to more than 20-30 m in depth below the seafloor. Giant pockmarks (diameter above 100-200 m) are reported in Belfast Bay, Maine [9] and the Barents Sea, Norway [10]. Other pockmarks have been found on the eastern Canadian continental shelf [2], and in the Black Sea [11], offshore north-western Spain [12], the Patras Gulf [13], the Skagerrak [14,15], the Scotian Shelf [8], the Norwegian Channel [16], the Stockholm archipelago [17,18], the Corinth-Patras rift [19], the Arabian Gulf [20] and the Arctic Ocean [21].

Although fluid release is common to all pockmarks, the mechanism of their formation and growth is not always clear, as it may differ from site to site. They may form suddenly, during catastrophic outgassing events related to earthquakes or tsunami [9,13,19], or may evolve slowly through complex processes of erosion and sediment redistribution [9]. Pockmarks have also been attributed to decomposition of gas hydrates [10]. Recent investigations on the U.K. Atlantic margin suggested the occurrence of water-bearing structures caused by seawater circulation in the sub-seafloor, with channelled upflow through vents (Huggett, personal communication). It seems that gas-bearing pockmarks are preferentially distributed in lines along gas-conducting faults, whilst water-generated pockmarks are not linked with any apparent tectonic structure but are randomly distributed.
Gas venting, sometimes apparently unrelated to pockmarks, occurs in many hydrocarbon-prone basins along the Tethyan belt, in northern Europe and in the Gulf of Mexico. Gas emissions in the European and Mid-East shelf seas are described by [3,4,13,17]. The widespread existence of gas bubbles (dominated by CH4) in seabed sediments is documented by [8,22,23]. Fields of gassy sediments are known on the continental shelf of western India [24], in the Mississippi Delta and Gulf of Mexico [25], on the Grand Banks and Scotian Shelf [26], in the North Sea and on the Scotian Shelf [27], in the Norwegian Trench [7], the Adriatic Sea [28], the Arabian Gulf [20] and the South China Sea [29]. These areas are often characterised by neotectonic activity and gas-bearing vertical faults allowing relatively rapid migration of hydrocarbon fluids to the surface and their accumulation in the seafloor sediments.

Wheeler [22] modelled the formation and movement of gas "macrobubbles" (sizes of the order of 1-1000 mm) in shallow seafloor sediments, whilst Etiope [30] and Etiope and Lombardi [31] developed the concept of the migration of "microbubbles" (sizes of the order of 0.01-1 mm) through the geologic column. The vertical ascent of microbubbles and continuous gas flow through fractures (mainly CO2 and C1 to C5 hydrocarbons, carrying dissolved or adsorbed C6+ compounds) are the most effective microseepage migration mechanisms, in agreement with observational data [32,33]. Sediment faulting and stress-strain variations are thus an important mechanism by which carbon dioxide, methane and other gases are transported from deep sedimentary rocks to the seafloor. In particular, some studies envisage a close relationship between gas output and release in pockmark fields, faulting and seismicity [13,19,34].
Therefore, seismic recording on the seabed, especially within pockmark fields, (e.g., by ocean bottom seismometers or by moored or benthic observatories) is an important task to be included in a general programme of marine gas leakage or geo-hazard monitoring.
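To give a feel for the bubble size classes mentioned above, the free-water rise speed of a small bubble can be sketched with Stokes' law. This is our illustration under stated assumptions (rigid-sphere drag, low Reynolds number, nominal water properties); actual microbubble migration through fracture networks in sediment is far more complex.

```python
# Illustrative Stokes-law sketch for the rise velocity of a very small
# gas bubble in water. Assumptions: rigid-sphere drag, low Reynolds
# number, nominal seawater properties. An order-of-magnitude aid, not
# a model from the papers cited above.
G = 9.81             # m/s^2, gravitational acceleration
MU_WATER = 1.0e-3    # Pa*s, dynamic viscosity of water (~20 degC)
RHO_WATER = 1025.0   # kg/m^3, nominal seawater density
RHO_GAS = 1.2        # kg/m^3, negligible next to water

def stokes_rise_velocity(radius_m):
    """Terminal rise velocity (m/s) of a small spherical bubble."""
    return 2.0 * (RHO_WATER - RHO_GAS) * G * radius_m ** 2 / (9.0 * MU_WATER)
```

A 0.1 mm diameter microbubble (radius 5e-5 m) rises at only a few millimetres per second even in open water, consistent with slow, continuous microseepage; a centimetre-scale macrobubble lies well outside the low-Reynolds-number range of this formula.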
2.2. Leaking hydrates

A large amount of literature is now available on methane hydrates, members of the clathrate structural family. Details on their origins and chemico-physical behaviour can be found, for example, in [35,36,37,38]. Seabed sediments containing natural gas hydrate occur worldwide on the slopes and rises of active and passive continental margins, on the continental shelves of polar regions, and in deeper (>300 m) marine environments. Deep ocean drilling discovered hydrates along both the eastern and western American continental shelves (Peru, Costa Rica, Guatemala, Mexico, United States, Canada), offshore Japan and New Zealand, off Norway, and in the Black and Caspian Seas [36]. The quantity of methane hydrates in many oceanic environments is substantial [39,40]. Globally, current estimates are around 10^19 g of methane carbon, a factor of 2 larger than the carbon present in all known fossil fuel (coal, oil and natural gas) deposits [35].

Hydrates also represent an important hazard when they decompose or melt, releasing free gas, potentially becoming a source of "greenhouse" gas and altering the engineering properties of the seabed. Ascending gas plumes and bubble columns may sometimes originate from hydrate-bearing seafloor [41], leading to significant output of CH4 into the seawater and the atmosphere. Hydrates generally occur within the upper few hundred meters of seafloor sediments, and often overlie deeper zones containing bubbles of free gas [39,42]. Possible relationships between hydrates, methane fluxes and pockmarks have recently been discovered [10,21,43], suggesting a quite complex picture of seafloor gas phenomenology. In particular, the amounts and temporal variations of gas stored and emitted, as well as the evolution of seabed structures and sediment engineering properties, remain poorly understood.
3. IMPLICATIONS FOR THE OFFSHORE INDUSTRY

3.1. Hydrocarbon exploration

Detecting hydrocarbons offshore has been a standard part of exploration programmes for some time. Hydrocarbon seeps at the seafloor provide an indirect view of deep petroleum generation, migration and accumulation processes. These seeps have in fact been used in petroleum prospecting and have led to the discovery of major oil and gas fields. Hydrocarbon microseepage can create anomalies in near-bottom seawater chemistry which outline hydrocarbon pools, and can therefore be used as a viable exploration tool. Pockmarks and other geomorphic structures can be important indicators of subsurface petroleum deposits. Also, biogenic production of methane by methanogenesis in sediments below the zone of sulfate reduction can generate sufficient methane to produce gas blisters and pockmarks.

However, even if at the global scale the volumes leaking offshore are significantly greater than those detected onshore, many of the offshore fields now being discovered do not necessarily have visible seeping hydrocarbons. On the other hand, offshore gas leakages are not always an indication of productive reservoirs, but may be biogenic. Many wells drilled within areas of gas leakage were unproductive, with consequent substantial economic losses for the oil companies. Large sums of money were lost simply because the gas seeps were not subjected to detailed investigation in terms of fluid origin (biogenic vs. thermogenic), mixing processes and relation to brittle tectonics. It is important to be able to differentiate methane produced by shallow biological processes from that produced by thermogenic processes at depth. This can be done by two geochemical methods, based on the determination of the isotopic composition of the methane and/or the presence or absence of ethane and heavier hydrocarbons.
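The two criteria just mentioned are commonly combined in a "Bernard-type" screen that compares the molecular ratio C1/(C2+C3) with the carbon isotopic composition of the methane. The sketch below uses rough threshold values conventional in the geochemical literature; the exact cut-offs are our assumption for illustration, not values given in this paper.

```python
def classify_methane(delta13c_ch4_permil, c1_over_c2_plus_c3):
    """Rough biogenic/thermogenic screen for seep gas.

    delta13c_ch4_permil: carbon isotopic composition of CH4 (per mil);
    c1_over_c2_plus_c3: molecular ratio CH4/(C2H6 + C3H8).
    The thresholds (-60 and -50 per mil; 1000 and 100) are approximate
    literature conventions, assumed here for illustration only.
    """
    if delta13c_ch4_permil < -60.0 and c1_over_c2_plus_c3 > 1000.0:
        return "biogenic"
    if delta13c_ch4_permil > -50.0 and c1_over_c2_plus_c3 < 100.0:
        return "thermogenic"
    return "mixed/indeterminate"
```

Dry, isotopically light gas screens as biogenic; wetter, isotopically heavier gas screens as thermogenic; intermediate samples need further investigation of mixing processes.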
Only then, through continuous temporal monitoring, can important information on the subsurface fluid potential, such as the amount of gas discharged, its origin, migration episodicity and relation to seismic activity, be acquired. A possible optimal use of temporal monitoring of gas leakages is in field definition after a wildcat discovery and in the extension of older producing fields. Basically, a programme of gas monitoring can help to minimise the percentage of failures and maximise the probability of economic success.

3.2. Offshore engineering and exploitation activity

The presence of free gas within shallow sediments may have important consequences for the offshore industry [44]. In particular, high gas concentrations and pressures introduce the risk of potentially catastrophic "blow-outs" during drilling operations, so that shallow over-pressured gas zones may cause significant delays in the exploitation schedule and could even result in field abandonment [1]. Moreover, the presence of shallow gas could affect the performance of foundations of offshore structures in at least three different ways:
1. the presence of undissolved gas could alter the engineering properties of sediments, such as stiffness and strength, thus influencing the possible displacement and ultimate load capacity of a foundation;
2. the interface between foundation and sediments could act as a preferential migration pathway for gas, reducing the skin friction of piles and skirts and causing a reduction in the load capacity of a foundation;
3. gas flux from the seabed could affect foundation performance: a build-up of gas inside the skirts of a gravity platform could cause instability (as well as potential hazards from fire and explosion), and a sudden emission of large volumes of gas into the sea would reduce the buoyancy of a structure and result in the formation of pockmarks threatening the stability of a foundation [45,46].

Basically, free gas in the seafloor increases the compressibility and reduces the undrained shear strength and the speed of sound in sediments [44]. These parameters can evolve and change with time, depending on variations in gas concentration in the sediments or episodic venting. Seismic activity, in particular, can be responsible for sudden gas outputs perturbing the sediment performance, analogous to sand liquefaction on land.

In the presence of gas hydrates, operations that alter the in situ temperature and pressure conditions on the seafloor or within the sediments may cause hydrates to decompose, releasing methane and water into the sediment [47,48]. Potential problems include:
- sediment instability;
- seabed erosion;
- corrosion and dissolution of materials;
- formation of seafloor biological communities that subsist on methane.
Furthermore, these problems may not arise immediately, but may take years to develop as fluid flow from the producing formation warms the surrounding sediments, heating ambient gas hydrates and causing their breakdown (Figs. 3 and 4).

Shallow gas in the seabed may lead to mud volcanoes, which are recognised to be the main threat to mobile offshore drilling units (MODUs), pipelines and fixed platforms [49].
In some locations, such as the Caspian Sea, mud volcanoes are capable of erupting without warning, blowing mud, water, fire, boulders and explosive gas into the air and devastating any structure in the area. A large amount of methane gas is involved in these explosions, along with formation water and unconsolidated rocks, including boulders. The threats of gas seepage, seafloor fracturing and shallow gas pockets are very real hazards for drillers operating from bottom-mounted rigs. Falling mud can endanger the foundation of a rig. Extruding mud creates serious cracking, disrupting the surface with faults or reactivating pre-existing faults. Operators need to consider the presence of mud volcanoes when placing rigs on the bottom. The weight transfer involved in jacking up a rig could disturb the seafloor, or punch through in an area with an unstable surface.

As the mud is compressed, it rises like a fluid, being less dense than the surrounding rock. As the mud, with petroleum gases suspended under pressure in its pore water, reaches the surface, volume expansion occurs, releasing this gas violently under explosive pressures. The over-pressured slurry can escape at very high rates, producing not only flowing mud but also a great amount of gas. The mixture moves so quickly that it can spontaneously ignite at the surface. There can also be a localisation of gas around the foundations of fixed platforms or MODUs, or gas contained in pockets below the seafloor. The gas in these pockets is under high pressure and is released quickly if the pocket is disturbed by drilling or ground breakage from a drilling mat. Another problem is the spontaneous emission of gas which has collected under a rig. Pipelines are also subject to the flow of mud off the slopes of these volcanoes. The mud flows can sweep a pipeline out of position, or buckle or crush the pipelines.
Figure 3. Schematic diagram illustrating the potential effects of decomposing gas hydrate on a subsea production system. At initial conditions a subsea well head produces hydrocarbons from a reservoir. At later conditions, producing hydrocarbons from formations below causes gradual heating of the sediments surrounding the well bore. Any gas hydrates within the sediment break down and release methane and water. Methane can thus move up toward the seafloor, creating a new methane source and/or re-forming as gas hydrate. Methane, in contact with sulphate, causes H2S to form, creating a corrosive environment. Furthermore, both methane and H2S at the seafloor can attract chemosynthetic biota and stimulate biologic community growth. With gas hydrate at the seafloor, rafts of hydrate and sediment have the potential to lift off the bottom and eventually create excavations, exposing additional infrastructure to the corrosive action of H2S-charged waters (from [47] with permission of Offshore Technology Conference).
Figure 4. Schematic diagram showing gas hydrate dissociation causing sediment instability. At initial conditions, a pipeline is buried in deep-water sediments containing gas hydrates. Over time, heat from hydrocarbons flowing in the pipeline warms the surrounding sediment, changing ambient temperature conditions. Heating causes gas hydrates to dissociate, releasing methane and water into the pore spaces of the sediment. Adding these fluids causes sediment instability and slumping on the seafloor. The pipeline ruptures when the sediment founders and shifts (from [47] with permission of Offshore Technology Conference).
In areas hosting production wells, the occurrence of gas in the seafloor and its emission into the sea may vary over time as a result of changes in the subsurface pressure regimes. During drilling and exploitation activity, sedimentary rocks and fluid reservoirs are subjected to mechanical stresses and to variations in loading, pressure and temperature, all leading to changes in the gas migration and/or oil recovery potential. It is evident, therefore, that simultaneously monitoring gas occurrence and seismic activity at offshore structures would represent an important surveillance action. At the same time, in the case of production wells, monitoring provides a useful approach for controlling the fluid extraction rate and for acquiring information on how fluid extraction can influence the circulation of fluids in the surrounding sediments, possibly inducing seismicity, with negative impacts on submarine structures and systems.
4. OFFSHORE GAS DETECTION

4.1. Background
The search for and detection of offshore gas and related structures can be based on geochemical and geophysical techniques. In geochemical surveys it is assumed that the hydrocarbons leaking from the ocean floor form a plume that is either directly above the accumulation, in the absence of currents, or is shifted down-current. In some cases, a hydrocarbon anomaly is not detected in the water unless sampling is done close to the seabed. Since the 1960s, offshore hydrocarbon leakages have been investigated through areal surveys based on grab sampling and analyses of water, but these have rarely been repeated over time. Such surveys are commonly labelled the "sniffer" technique. The sniffer consists of a metal "fish", containing an electric pump and a sonar system, which is towed behind a ship. Water is continuously pumped to the surface, degassed and analysed for hydrocarbon gases by gas chromatography. This approach, useful for reconnaissance areal surveys, has two main limitations:
- the technique is limited to water depths of approximately 300-400 m due to drag on the cable supporting the "fish", making depth control difficult;
- the results, being discrete samples, do not allow a detailed examination of the nature and time variation of the gas leakage.

Geophysical surveys are based on high-resolution reflection profiles, side-scan sonar, and echosounder recordings [9,12,13,19]. Gas is recognised either as enhanced reflectors or as acoustically obscured zones. A particular feature is that of dome-shaped reflectors: low-relief intrasedimentary doming formed by high-pressure gas build-up [7]. Pockmarks are observed as conical, dish-shaped incisions in the seismic records. The activity of pockmarks, i.e., a continuous supply of gas, can be recognised by the existence of a columnar, acoustically turbid zone underneath the pockmarks themselves [13].
The integration of these geochemical and geophysical methods is the most effective approach for exploring and defining offshore gas occurrence in the space domain, but they cannot provide fundamental information in the time domain.

4.2. A new approach for long-term monitoring
The key information related to the subsurface fluid potential and seafloor evolution, such as the amount of gas discharged, migration episodicity, and relation to seismic activity, can only be acquired through continuous and long-term data recording. Autonomous benthic stations are the most recent technological development for this purpose. The main scientific and technological requirements for a benthic station capable of performing gas leakage monitoring include:
- versatile and cost-effective procedures for station deployment and recovery at shallow and deep-sea sites;
- continuous and long-term multiparametric measurements;
- precise positioning of the sensors within a stable platform;
- good sensor performance for long-term (several months) measurements;
- real-time or near real-time data communication;
- management of the instrumentation by means of an intelligent unit (data acquisition and control system), allowing periodic and automatic adjustments of the sensors to maintain data quality.
These requirements are presently fulfilled by the GEOSTAR station (Geophysical and Oceanographic Station for Abyssal Research), the first observatory available for monitoring geophysical, geochemical and oceanographic parameters from coastal areas to the deep seafloor [50,51]. GEOSTAR can host a series of instruments, including seismometers, magnetometers, current meters, conductivity/salinity and temperature sensors, water samplers and chemical probes. In particular, seafloor degassing can be monitored through a set of specific sensors. In this respect, GEOSTAR is presently equipped with a time-series water sampler, a CTD (a sensor for water conductivity, temperature and pressure, and derived parameters such as salinity and density), a 0.25-m pathlength transmissometer for water turbidity determination, a chemical package for pH and H2S measurement, and an ADCP (Acoustic Doppler Current Profiler) for monitoring water current direction and velocity.

The water sampling unit can operate over 18 months with 48 collection events of 500 ml each. Water can be sampled at specific points and heights above the seabed through sampling tubes fixed onto the station frame. After GEOSTAR recovery, the water samples can be analysed in an onshore laboratory for gases (e.g., hydrocarbons, carbon dioxide, oxygen), chemical constituents, biological products and isotopes. The pH and H2S electrode system includes a self-calibration device and has recently been tested successfully in a hyperbaric chamber at 350 bar pressure. Upgrades and the addition of new sensors to the system (e.g., for NH4+ and total iron by colorimetry) are planned. Thanks to the availability of additional interfaces, new instruments for further chemical parameters of seawater and indirect tracers of hydrocarbon leakage (e.g., CO2, CH4, Eh) can also be added to GEOSTAR. All sensors are managed by an intelligent unit (DACS, Data Acquisition and Control System) able to perform a first level of data quality control.
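A first level of data quality control of the kind the DACS performs can be as simple as plausibility-range flagging of each channel. The sketch below is a generic illustration; the parameter names and ranges are our assumptions, not the actual GEOSTAR configuration.

```python
# Generic sketch of first-level quality control for a multiparametric
# benthic station. The channel names and plausibility ranges below are
# illustrative assumptions, not GEOSTAR's real configuration.
PLAUSIBLE_RANGES = {
    "temperature_C": (-2.0, 40.0),  # seawater temperature
    "salinity_psu": (2.0, 42.0),
    "pH": (6.5, 9.0),
}

def first_level_qc(sample):
    """Flag each configured channel as 'ok', 'out_of_range' or 'missing'."""
    flags = {}
    for name, (lo, hi) in PLAUSIBLE_RANGES.items():
        value = sample.get(name)
        if value is None:
            flags[name] = "missing"
        elif lo <= value <= hi:
            flags[name] = "ok"
        else:
            flags[name] = "out_of_range"
    return flags
```

A check of this kind can run autonomously on board, so that sensors drifting out of their plausible ranges are noticed and adjusted between servicing visits rather than after recovery.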
A high-accuracy clock, included in the seismometer, acts as the time reference for the whole station, allowing precise comparison in the time domain among all the scientific measurements in order to evaluate possible correlations: the simultaneous collection of different data will allow possible causal links among natural phenomena to be studied and, hence, lead to a better understanding of the phenomena observed at the sea bottom. The accurate positioning of the station with respect to seepage manifestations or gas hydrate occurrences is a further advantage of the system. The GEOSTAR Mobile Docker, the deployment/recovery vehicle, can transport the station to the seafloor and perform
accurate landings thanks to specific devices including thrusters, telemetry, video, lighting, sonar and an altimeter. The solution adopted overcomes typical limitations of other systems, such as the need for neutral buoyancy in water, free-falling deployment (very imprecise and rough), short autonomy and limited payload.

Besides the environmental monitoring described above, the station can also host instrumented packages devoted to the acquisition of measurements related to cathodic protection design parameters and geotechnical parameters. With reference to geotechnical surveys, the benthic station should be upgraded in order to investigate the seabed with good resolution in the top layers, enabling the proper design of a future gravity structure to be installed in the surrounding area. In particular, the integrated station shall be able to provide an exhaustive set of soil mechanics data useful for mud mat and skirt design, pipeline design and laying, trenching operations and the anchoring of vessels. The geotechnical instrumentation necessary to perform the requested task should typically comprise a penetrometer, a shear strength meter, a plate pressure meter and, possibly, a core sampler. This instrumentation could be remotely controlled from the surface by means of the GEOSTAR Mobile Docker telemetry line (see also Oebius and Gerber, in this volume).

The possibility of managing long-term collection of seismological data represents a further important feature, as discussed in Sections 2.1 and 3.2. A significant example in this regard is the PROCAP-2000 programme launched by the Brazilian oil company Petrobras, aimed at characterising the Brazilian continental shelf in the Campos Basin area (2000 m water depth). It should include a specific activity devoted to seismic monitoring, for which the use of instrumented benthic stations is foreseen.
5. CONCLUSIONS

Natural leakage of gaseous hydrocarbons at the seafloor (gas bubbles in the sediments, leaking hydrates, vents in pockmarks) is a widespread and relevant phenomenon in hydrocarbon-prone basins, as increasingly recognised in the recent literature. Nevertheless, offshore gas leakage has not been adequately explored, although it is an important hazard for offshore oil exploration, exploitation and the construction of related facilities. The possibility of studying and monitoring gas occurrence and evolution at the seafloor is now offered by autonomous benthic observatories, which can be positioned with great precision on the seafloor and can collect long-term, continuous and multiparametric (geophysical, geochemical, oceanographic and geotechnical) data. Benthic observatories such as the existing GEOSTAR can be used for the reliable characterisation and monitoring of a specific site where, for example, an offshore structure or subsea system is to be installed. The advantages would include the evaluation of parameters related to the subsurface fluid potential and to seafloor evolution. Such parameters, e.g. the amount of gas discharged, gas output episodes and seismic activity, are essential for the management of both exploration and construction activities. The data will also aid in finding the best solutions to engineering problems associated with hydrocarbon production and gas occurrence in seafloor sediments, and would enable the evaluation of the best commercial hydrocarbon prospects. As occurred in the realisation of GEOSTAR, close collaboration among experts of different disciplines from scientific and industrial groups (engineers, geochemists, geophysicists) is now demanded for the effective development, testing and operation of advanced systems for seafloor monitoring.
ACKNOWLEDGEMENTS
Thanks are due to R.W. Klusman and T. Francis for their helpful review and comments. The GEOSTAR project has been funded by the European Commission (EC contracts MAS3-CT95-0007 and CT98-0183). The GEOSTAR partnership includes Istituto Nazionale di Geofisica, Tecnomare S.p.A., Technische Universität Berlin, Technische Fachhochschule Berlin, IFREMER, ORCA Instrumentation, Laboratoire d'Océanographie et de Biogéochimie of Marseille, and Institut de Physique du Globe de Paris. The geochemical package was developed by Systea s.r.l., in collaboration with Tecnomare and Istituto Nazionale di Geofisica.
The use of a Coastal HF Radar system for determining the vector field of surface currents

G. Budillon (a), G. Dallaporta (b) and A. Mazzoldi (b)

(a) Istituto Universitario Navale, Istituto di Meteorologia e Oceanografia, Via Acton 38, 80133 Naples, Italy
(b) C.N.R., Istituto Studio Dinamica Grandi Masse, S. Polo 1364, 30125 Venice, Italy
The HF coastal radar, exploiting the backscatter from sea waves of the electromagnetic waves emitted by land-based coastal stations, allows sea surface currents along the coast to be mapped over a range of several tens of kilometers. This paper describes the use of the HF radar system installed within the framework of the MURST PRISMA 2 Project to measure the surface current field in the near-shore area in front of the city of Ancona (Northern Adriatic Sea). The system operated continuously from September 1997 to spring 2000. Its use is valuable because it can monitor a wide coastal area that would otherwise be difficult to cover, demonstrating its capability as a useful supplement to standard oceanographic field measurements. Preliminary results show the well-known southeastward (i.e. alongshore) coastal jet, whose influence extends as far as 10-15 km from the coast; further offshore a cyclonic eddy with a diameter of 10-20 km typically occurs. This circulation can occasionally be reversed, showing an anticyclonic eddy that forces a northward coastal current with a far less energetic signature. The spectral analysis reveals a propagation of the semidiurnal frequencies from NW to SE, consistent with Kelvin wave propagation; conversely, an exponential increase of energy from the coast to offshore has been identified for the near-inertial frequencies. An analysis of the mean kinetic energy during a Bora wind episode describes the typical dynamics of a surface layer driven by an impulsive atmospheric forcing.
1. INTRODUCTION

Currents within 1 m of the sea surface are highly variable, being driven by geostrophic forces and tides, and are strongly influenced by the local wind and wave fields at different temporal and spatial scales. How near-surface current patterns respond locally to the prevailing forces is a crucial element for the effective management of operations in coastal waters, and an increasingly important input for global resource monitoring and weather prediction. Nearly all available techniques for measuring this thin surface layer are Lagrangian, meaning that they measure the trajectory of a parcel of water near the surface, thus obtaining one or more current streamlines as a function of time. The method used in our
case is an Eulerian one: it measures surface currents at points spaced approximately 1.5 km apart on a fixed grid. The HF radar experiment was carried out in the framework of the national project "PRISMA-2" (acronym of the project for the safeguard of the Adriatic Sea, second phase), financed by the Ministry of University and Scientific Research (MURST) under the coordination of the National Research Council (CNR). The aims of the experiment were: 1. to obtain maps of the coastal surface current, giving oceanographers a tool for defining and following frontal lines (e.g., the line dividing coastal fresh water from offshore salt water); 2. to experiment with this recent technology in order to investigate its potential for other applications in the future.
2. HF RADAR
The mechanism by which an HF radar system (wavelength λ ≈ 6-30 m) measures sea surface currents involves the physics of reflection and dispersion of electromagnetic waves by sea waves. Under a given wind field, sea waves of many different lengths are generated; of this multiplicity of waves, those travelling radially to the receiving antenna reflect back a strong signal when their wavelength is L = λ/2, λ being the HF wavelength. Observing the sea from the coast with the HF radar and analysing the spectrum of the received echoes, one finds two overlapping peaks, symmetrically spaced about the transmission frequency (Fig. 1). They are the HF signals returned by sea-wave trains moving radially toward and away from the radar (Bragg effect), with a spatial period of exactly one half of the transmitted radar wavelength. The spectrum of the continuous-wave transmitted signal is a narrow peak at the carrier frequency f0 = c/λ (taken as the zero reference value), where c is the speed of light in vacuum. In the absence of current, the received first-order sea echo appears as two peaks symmetrically spaced about the carrier (Doppler shifts), corresponding to the sea wavelength L = λ/2. In the presence of a surface current these peaks are shifted by a small additional amount with respect to their original positions. This shift arises because the "target" seen by the radar is in motion, so the backscattered signal returns at a frequency different from the transmitted one, in proportion to the radial velocity of the "target". This velocity is the linear combination of two components. The first is the phase velocity V_ph = (gλ/4π)^1/2 of the Bragg-resonant surface waves (g = 9.81 m s^-2 is the gravity acceleration); the corresponding Doppler shift is f_ph = 2V_ph/λ [2]. The second component is due to the surface current, so the two peaks produced by the waves are shifted by a further amount proportional to the radial component of the current velocity.
This quantity is f_cr = 2V_cr/λ, where V_cr is the radial component of the current in the radar direction. The total Doppler shift is thus Δf = f_ph + f_cr, and the velocity is obtained from the total shift as V_cr = λΔf/2 - V_ph. It has been demonstrated that the thickness of surface water that influences the Doppler shift, when a surface current is present, is about 1/25 of the radar wavelength [3]; the approximate relation is d = λ/8π, where d is this surface-layer thickness. Because a single radar station can measure only the component of the flow along a radial beam emanating from the site, radial currents from two (or more) sites must be combined to form the two-dimensional surface current field (Figs. 2 and 3).
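As a minimal numerical sketch of the relations above (the function names and the 25 MHz example are ours, not from the paper), the radial current can be recovered from the measured first-order Doppler shift as:

```python
import math

def bragg_phase_speed(tx_wavelength_m, g=9.81):
    """Phase speed V_ph = sqrt(g*lambda/(4*pi)) of the Bragg-resonant
    sea waves of length L = lambda/2 (deep-water dispersion)."""
    return math.sqrt(g * tx_wavelength_m / (4 * math.pi))

def radial_current(tx_wavelength_m, total_doppler_shift_hz):
    """V_cr = lambda*df/2 - V_ph: radial current from the total shift."""
    return (tx_wavelength_m * total_doppler_shift_hz / 2.0
            - bragg_phase_speed(tx_wavelength_m))

# Example for a 25 MHz system: lambda = c/f ~ 12 m, so the effective
# sampling depth is d = lambda/(8*pi) ~ 0.48 m, i.e. about lambda/25.
lam = 3.0e8 / 25.0e6
depth = lam / (8.0 * math.pi)
```

With no current present, the measured shift equals 2V_ph/λ and the function correctly returns a radial velocity of zero.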
Figure 1. Generation mechanism of the Doppler shift [1]: spectra of the transmitted signal, of the first-order sea echo with no current (advancing and receding wave echoes symmetric about the carrier), and of the first-order sea echo with an advancing current (peaks shifted by Δf = 2V_cr/λ).
Figure 2. Radial current maps at the remote sites.
The working range of an HF radar depends on the attenuation of the electromagnetic waves travelling from the transmitter to the "target" and back, on the energy back-scattered by the sea-wave roughness, on atmospheric noise and on noise from radio interference. The attenuation of the HF wave is directly proportional to the working frequency and inversely proportional to the conductivity, which in turn increases with salinity and temperature. HF radar systems are built for a frequency range between 5 MHz and 50 MHz. The maximum absolute range at each frequency depends on radar parameters such as the power, the bandwidth, the antenna model, etc. The range resolution is limited by radio-interference disturbances, which are minimised by transmitting narrow-bandwidth signals. This problem is severe for HF radars working at low frequencies, near those of radio broadcasts, while there is no interference from long-range radio sources at VHF frequencies higher than 50 MHz. In the absence of very near radio sources it is possible to employ broad bandwidths and so obtain high spatial resolution: using a 150 MHz system one can reach a resolution below 100 m. It is necessary, then, to find a compromise between working range and spatial resolution; for example, working at 20-30 MHz, the working range is 40-60 km and the spatial resolution is 0.3-1.5 km, well suited to many coastal studies.
3. RADAR STATIONS
Within the framework of the MURST PRISMA 2 Project described above, the CODAR SeaSonde (by COS, LTD) was installed by the I.S.D.G.M. Institute of the C.N.R.
The SeaSonde is a coastal HF radar that radiates less than 80 W average power in the frequency band near 25 MHz. Two radars operate at sites 25 km apart, each measuring radial currents to a distance of about 45 km. Each radar station produces radial velocities versus range and bearing. The range-cell spacing, which depends on the signal bandwidth, has been set to 1.5 km, but this resolution may be increased to 500 m if necessary; radial speeds are produced for each of 32 range cells at 5° bearing increments over the annular sector falling over the sea. At the remote sites the radar unit runs continually, acquiring 512 samples in 256 seconds. The extracted data are averaged over 15 complete acquisitions and, every hour, files of these radial speeds, in polar coordinates, are processed and stored at each remote site and/or sent by modem to the central site in Venice. The choice of the remote sites is dictated by the sea coverage required. Some important conditions on the installation of the antennas (principally the distance between the antennas and the water, the distance between the buildings housing transmitter and receiver, and the absence of obstacles around the antennas) must be observed to obtain the best sea-surface coverage. In our case, the chosen sites are a building on the sea front of Senigallia and the north wharf of Ancona's harbour. The total coverage area is approximately 1000-1200 km². The instrumentation at each remote site comprises: one transmitting antenna with an omnidirectional pattern; one receiving antenna, capable of distinguishing the direction of the incoming echoes; a transmitter and receiver; and computing hardware, modem, power supply and telephone line. At the central site a Power PC computer is connected to the remote sites by modem and telephone lines.
4. MAP OF SURFACE CURRENT

Radial speed maps from each radar site alone are not a complete depiction of the surface current flow, which is two-dimensional. This is why at least two radar systems observing the same spot on the sea from different directions are used to construct a total vector from each site's radial components. Many programs have been developed to process the radial files [4]. A simple program, developed in the MatLab language, gives the results in Figs. 3 and 4. Figure 3 is the map of total current vectors over a grid spanning the overlapping coverage areas of the remote sites. Note that the radial vectors in the common coverage area of the two remote sites are not always associated with exactly the same point. Therefore, each cell into which the total sea area is divided yields a vector combination of radial vectors, i.e. the mean of many vectors, depending on the size of the area over which the means are computed. A map of errors can then be produced using the vector values whose least-squares mean exceeds a fixed threshold. To show the errors more clearly, the vector scale in Fig. 4 is 2 cm/s instead of the 50 cm/s of Fig. 3. The vector map of Fig. 4 gives a representation of the errors, which can be read as follows: (a) the velocity-vector error is an absolute value, in both direction and intensity; (b) only velocity errors over 1 cm/s (or another chosen threshold) are considered; (c) the direction error is represented as a deviation with respect to North, in degrees; (d) the intensity error is proportional to the length of the error vector.
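The combination of radial components into a total vector can be sketched as a small least-squares problem. This is a generic illustration, not the manufacturer's processing code, and the bearing/sign convention is our assumption:

```python
import numpy as np

def total_vector(bearings_deg, radial_speeds):
    """Least-squares (u_east, v_north) from two or more radial components.

    Each radial measures r_i = u*sin(b_i) + v*cos(b_i), where b_i is the
    geographic bearing (degrees clockwise from North) of the radar beam
    at the grid cell, with positive r_i meaning flow away from the site.
    """
    b = np.radians(np.asarray(bearings_deg, dtype=float))
    # Design matrix: projection of (u, v) onto each beam direction.
    A = np.column_stack([np.sin(b), np.cos(b)])
    uv, *_ = np.linalg.lstsq(A, np.asarray(radial_speeds, dtype=float),
                             rcond=None)
    return uv
```

When more than two radials fall in a cell, the same call returns the least-squares mean vector, which corresponds to the averaging over the cell described above.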
Figure 3. Map of surface current vectors.

Figure 4. Map of errors greater than 1 cm/s, relative to the vector map in Fig. 3.
What is shown is just one portion of the total error in any HF radar measurement: the portion proportional to the geometric sampling grid [5]. In general, the central area is the best covered by the two radars, with an error below 1 cm/s. Errors over 1 cm/s are systematic at the extremes of the coverage area, where the number of available radial vectors per unit area is smaller, while the direction error seems to increase moving from south to north. This is compatible with an asymmetrical coverage by the two radars. With increasing distance of the observation point from the antennas, the measurement reliability decreases because of distortion errors; additional errors can be due to a too narrow view angle. Lipa and Barrick [4] consequently suggest taking into consideration only data relative to a matrix of 20x20 km for a setting analogous to ours. Keeping in mind that this footprint was based on the performance of an older and less powerful system, after a first analysis aimed at assessing the consistency of data outside this core area with those inside it, we decided to take into account measurements spanning a 30x30 km area [6]. Besides distortion, another problem which may arise in radar data is the presence of spatial and/or temporal gaps; these can be due to obvious hardware reasons, but also to problems with the direction-finding algorithms or with fluctuations of the signal's maximum range [7].
5. FIRST RESULTS
In the adopted configuration, the system provides data averaged over 1-hour periods. In order to filter out the semi-diurnal and diurnal tides as well as the inertial oscillations, the data were low-pass filtered using a Chebyshev filter with a 50-hour cutoff. Spatial and/or temporal gaps were filled by linear interpolation with neighbouring data (in space and time). In this work we analyse the hourly data acquired during the months of August and September 1997; the whole dataset starts in July 1997 and ends in March 2000. It must be stressed that in this study, which is a preliminary contribution to the analysis of this important dataset, we have used the HF radar velocity data as processed by the manufacturer. In Fig. 5 we present weekly surface current maps of the studied area for the month of September 1997. The surface circulation is dominated by a southward coastal jet and by a single cyclonic eddy further offshore. This dynamical feature is very frequent in our dataset and is often detectable also in the hourly data; its typical dimension is of the order of twice the Rossby radius of deformation, which was estimated in the same area and period at around 6.5 km [8]. The coastal current shows typical velocities of the order of 10-20 cm/s and influences the coastal zone up to 10-15 km seaward. The cyclonic (anticlockwise) circulation shows a symmetric field located approximately in the middle of the studied area. We expect the topography to be important in setting the location and persistence of the eddy in the region, while a major forcing setting up the eddy is obviously related to the local wind-stress field. This observed pattern is consistent with previous knowledge of the general circulation of the Adriatic Sea.
This quasi-permanent circulation pattern does not occur in several situations, as detected, for example, at the beginning of September, when the coastal jet direction is almost completely reversed and the investigated area is dominated by an anticyclonic (clockwise) local circulation, most likely due to meteorological conditions [9]. For a complete presentation of the monthly averaged circulation in this area see Budillon et al. [10].
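The 50-hour low-pass step can be illustrated schematically; for simplicity this sketch uses a sharp FFT cutoff rather than the Chebyshev filter actually employed by the authors:

```python
import numpy as np

def lowpass_50h(series, dt_hours=1.0, cutoff_hours=50.0):
    """Remove the tidal and inertial bands from an hourly series by
    zeroing all Fourier components with periods shorter than the cutoff.
    (Illustrative stand-in for the authors' Chebyshev filter.)"""
    x = np.asarray(series, dtype=float)
    spec = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(x.size, d=dt_hours)  # cycles per hour
    spec[freqs > 1.0 / cutoff_hours] = 0.0
    return np.fft.irfft(spec, n=x.size)
```

Applied to an hourly velocity series, this removes the diurnal (~24 h), semidiurnal (~12 h) and near-inertial (~17 h) oscillations while preserving the slowly varying circulation.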
Figure 5. Weekly averaged circulation in September 1997 (adapted from [9]); the four panels show weeks I-IV.
Figure 6. Spectral analysis of the longshore component measured by the CODAR in a nearshore (upper right panel) and offshore (bottom right panel) location in August 1997. See the left panel for the position of the analyzed points.
The energy spectra (Fig. 6) of the longshore velocity component estimated at two points clearly show the tidal periods (frequencies of approximately 0.04 c/h and 0.08 c/h for the diurnal and semidiurnal components, respectively), with the latter losing energy from the coast to offshore. This agrees with what is known about the tide in the northern Adriatic: the semidiurnal constituent is composed of a Kelvin wave that decreases exponentially towards offshore, while the diurnal constituent is composed of shallow-water waves that propagate along the axis of the Adriatic basin. Moreover, the inertial frequency, which at this latitude corresponds to a period of around 17 hours, is identified by the energy peak at about 0.058 c/h in the offshore location (Fig. 6, bottom right panel). This peak disappears close to the coast (Fig. 6, upper right panel), consistent with the greater influence of the bottom stress in shallow-water areas. The observation that near-inertial currents decay toward the coast is not new, but it is very interesting to describe the scale over which the decay occurs in this area. The maximum of the inertial-frequency energy has been calculated, from the coast seaward, along three different transects perpendicular to the coastline, and the results are shown in Fig. 7. For all the transects there is an exponential increase of the energy associated with the inertial currents moving from the coast to offshore (from 2 to 6 times greater over 30 km). Moreover, observing the exponential fitting lines, a general decrease of the energy is observed moving from the northern (dashed line) to the southern section (solid line).
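A periodogram of the kind shown in Fig. 6 can be sketched as follows (a generic illustration, not the authors' processing chain); the inertial frequency at the latitude of the study area (~43.8°N) indeed falls near 0.058 cycles/hour:

```python
import numpy as np

def energy_spectrum(u, dt_hours=1.0):
    """One-sided periodogram of a velocity component sampled hourly;
    returns (frequency in cycles/hour, spectral energy)."""
    u = np.asarray(u, dtype=float)
    u = u - u.mean()                       # remove the mean flow
    spec = np.fft.rfft(u)
    freqs = np.fft.rfftfreq(u.size, d=dt_hours)
    return freqs, np.abs(spec) ** 2 / u.size

# Inertial frequency f = 2*Omega*sin(latitude), converted to cycles/hour:
OMEGA = 7.2921e-5  # Earth's rotation rate, rad/s
f_inertial = 2 * OMEGA * np.sin(np.radians(43.8)) / (2 * np.pi) * 3600.0
```

Here f_inertial evaluates to about 0.0578 c/h, i.e. a period of about 17.3 hours, matching the offshore peak discussed above.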
Figure 7. Maximum inertial energy versus offshore distance along transects perpendicular to the coast: squares for the northern one (Senigallia, dashed line), dots for the southern one (Ancona, solid line), and crosses for the central one (between Senigallia and Ancona, dash-dotted line).
Figure 8. Mean kinetic energy of the surface layer over the area covered by the CODAR during the Bora forcing of September 1997: (a) hourly data and (b) low-pass filtered data (cutoff = 50 hours).

Like all closed or semi-enclosed seas, the Adriatic Sea is perturbed by free oscillations, or seiches (mainly induced by interactions with the atmosphere), whose characteristic modes depend on the geometry of the basin. Two main seiches are known for the Adriatic Sea, with periods of about 22 and 11 hours [11]. The first represents the fundamental longitudinal free oscillation of the basin, with a nodal line at the southern boundary (Otranto Channel); the 11-hour oscillation may be interpreted as the bimodal seiche. Both seiches are present in the spectrum obtained for the offshore location (Fig. 6, bottom right panel), at 0.045 and 0.09 c/h. Besides these main oscillations, other shorter seiches have been observed and described by numerical models. Treating the Adriatic Sea as a non-rotating one-dimensional barotropic basin, Bajc [12] also calculated a period of about 12 hours, supporting Defant's hypothesis of the so-called "closed Adriatic". Therefore, in order to investigate the response of the studied area to the atmospheric forcing, an isolated meteorological event of strong NE wind (Bora) has been analysed. The Bora set in, after a long period of breeze (low wind stress), on September 14, 1997, and disappeared two days later. Figure 8 shows the mean kinetic energy (m.k.e. = 0.5·(u² + v²)) obtained by averaging the hourly velocity data (upper panel) and the low-pass filtered data with a 50-hour cutoff (bottom panel). The m.k.e. oscillates with values below 500 cm²/s² until the afternoon of September 14, 1997, then quickly increases at night to a value of 2400 cm²/s². Over the five following days, the m.k.e. gradually decreases until it resumes its previous magnitude as the weather forcing relaxes.
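The areal mean kinetic energy used in Fig. 8 is a one-line computation; the helper below is our own sketch, with u, v in cm/s over the CODAR grid points available at each hour:

```python
import numpy as np

def mean_kinetic_energy(u, v):
    """m.k.e. = 0.5*(u**2 + v**2), averaged over the grid points
    available at each time step (NaNs, i.e. gaps, are ignored);
    with u, v in cm/s the result is in cm^2/s^2."""
    u = np.asarray(u, dtype=float)
    v = np.asarray(v, dtype=float)
    return np.nanmean(0.5 * (u ** 2 + v ** 2), axis=-1)
```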
This is a classic behaviour of a coastal transient under the influence of an impulsive wind-stress forcing; it can therefore be asserted that the CODAR has correctly captured the energetic input into this coastal area. The spectral analysis shown in Fig. 9 was obtained using the offshore and longshore hourly data of Fig. 8, in order to investigate the origin of the large oscillation in the m.k.e. observed during the Bora forcing. The absence of any peak at the inertial or near-inertial frequency excludes the presence of inertial currents forced by the wind stress produced by the Bora blowing over the Northern Adriatic (it is also unusual to see inertial oscillations in the energy signal, because the velocity vector rotates with the inertial period, so the energy would not present inertial variations unless only one component is analysed). On the other hand, the presence of two energy peaks of similar magnitude at the semi-diurnal frequencies (centred at periods of 12.5 and 12.0 hours) for the alongshore component supports the results of Bajc and Defant's hypothesis of the existence of a seiche with an oscillation period of about 12 hours (determined by supposing the Adriatic basin closed at the Otranto Channel). Owing to the coincidence with the semidiurnal tidal constituents, the effect of this particular seiche is to amplify (with decreasing amplitude) the tidal oscillation at the semidiurnal period. The decay time of this seiche (about 5 days) is not very long in comparison with that of the other main seiches of the Adriatic Sea (about 10-15 days). The bottom panel of Fig. 9 shows the rotary coefficient (CR). The rotary coefficient gives the partition of the total energy at the different frequencies: its magnitude is one for pure rotary motion and zero for unidirectional motion; its sign is related to the polarisation of the ellipses, with positive values for clockwise rotation and negative for anticlockwise [13].
As expected, the maximum positive peaks (anticyclonic, i.e. clockwise, rotation) of the rotary coefficient appear at the diurnal and inertial frequencies. At the semidiurnal periodicity, CR shows different values in the two corresponding major peaks: an essentially unidirectional motion is detected at approximately 12.5 hours, while the oscillation with a period of 12.0 hours clearly has an anticyclonic behaviour. The most important peaks corresponding to anticlockwise (cyclonic) motion are detected in frequency ranges where the total energy of the surface current is essentially negligible. This indicates that the mean velocity measured by the CODAR at a fixed point in space rotates in time along a clockwise path, whereas the spatial picture of the averaged velocity field is directed anticlockwise (cyclonic gyre). In order to better visualise the rotary nature of the semidiurnal currents, they are presented in terms of their elliptic properties, specifically the semi-major and semi-minor axes and the orientation. The semi-diurnal tidal ellipses show a smooth, even sweep of the currents, with maximum amplitudes of the semi-major axis in a band extending 10-15 km offshore, well correlated with the presence of the coastal jet (Fig. 10). The orientation of the semi-diurnal tidal-current ellipses shows a preferential behaviour parallel to the coast. In the central sector the direction veers clockwise from the coast to offshore; elsewhere the ellipse directions appear inclined toward the coast in shallow water and turn anticlockwise further offshore (Fig. 11). The semi-minor axes show a generally constant behaviour from the coast to offshore, so the ellipse eccentricities are mainly controlled by the semi-major axis; in fact, the eccentricity decreases from the coast to offshore. It must be stressed, once more, that these results may be biased because they were obtained from a limited data set.
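The rotary coefficient can be computed from the complex velocity series w = u + iv. The sketch below is a generic implementation of the standard rotary decomposition, with the sign convention of the text (positive CR for clockwise motion):

```python
import numpy as np

def rotary_coefficient(u, v):
    """CR(f) = (S_cw - S_acw) / (S_cw + S_acw) at each positive frequency:
    +1 for pure clockwise rotation, -1 for pure anticlockwise,
    0 for unidirectional motion."""
    w = np.asarray(u, dtype=float) + 1j * np.asarray(v, dtype=float)
    spec = np.fft.fft(w - w.mean())
    power = np.abs(spec) ** 2 / w.size
    n = w.size
    s_acw = power[1:n // 2]            # positive frequencies of w: anticlockwise
    s_cw = power[-1:-(n // 2):-1]      # matching negative frequencies: clockwise
    return (s_cw - s_acw) / (s_cw + s_acw + 1e-30)
```

A purely clockwise-rotating current (u = cos, v = -sin) maps onto the negative-frequency half of the spectrum of w, giving CR = +1 at its rotation frequency.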
Figure 9. Spectral analysis of the hourly time series of Fig. 8 (top panels: offshore and longshore components; note the different magnitudes of the two spectra). In the bottom panel the rotary coefficient (bars) and the normalised total energy (line) are reported.
Figure 10. Semi-major axis versus offshore distance, September 1997.
Figure 11. Semidiurnal tidal ellipses, September 1997; axes are longshore and offshore distance (km).
6. CONCLUSIONS

Sea surface currents play an important role in coastal processes and in the exchange between coastal and offshore waters [14]. In particular, surface currents strongly affect the transport of planktonic larvae and their retention in areas suitable for hatching. This is fundamental to reproduction in marine population dynamics and, therefore, to the evolution of a coastal system. Also very important is the observation and study of the surface current flow, whose description is difficult using conventional Eulerian instruments (e.g., ADCPs, moorings), while it is easier using Lagrangian systems (e.g., drifters), even if the spatial resolution is lower. Experimentation with the radar technique began about 25 years ago, but only in recent years has the available system become reliable. The radar, with respect to other standard techniques, provides good spatial resolution and long time series of data, on an hourly basis, permitting an analysis of current fields over many tidal cycles and the identification of possible seasonal evolutions of the local coastal circulation over a wide area. The SeaSonde, installed in the northern Adriatic Sea in April 1997, tested for 4 months and operating since August 1997, has proven to be a reliable system for defining the coastal circulation in this area at different temporal scales.
The surface circulation offshore of Ancona described by the radar has shown some interesting features of this coastal area of the northern Adriatic Sea: (1) the presence of a strong coastal jet extending its influence 10-15 km seaward; (2) the decrease of the tidal energy moving offshore (as for a Kelvin wave); (3) the exponential decrease of the near-inertial current approaching shallower depths; (4) the analysis of the kinetic energy during a strong atmospheric forcing event has been used in this paper to demonstrate the adequacy of the system; in particular, the presence of a minor seiche (with a period of about 12 hours) has been described on the basis of the current oscillation and decay time; (5) the rotary analysis has shown a predominantly clockwise rotation of the current at the most energetic frequencies; (6) the eccentricity at the semidiurnal tide frequencies is generally controlled by the semi-major axis amplitude, which decreases from the coast to offshore. We conclude that this analysis has provided a useful preliminary examination of the spatial and temporal structure in the data; similar analyses could of course be done with different approaches and techniques. For this reason this work must be viewed as exploratory and our interpretations should be tested in additional studies.
ACKNOWLEDGEMENTS

This work was carried out in the framework of the activities of Task 1 of the PRISMA-2 Project, and of Cofin2000 funded by the Italian Ministry for Scientific Research. We wish to thank Prof. M. Moretti for helpful discussions. The constructive comments and suggestions of the two anonymous referees also greatly improved the manuscript.
APPENDIX: Specifications of the SeaSonde

Briefly, we describe how the SeaSonde operates to accomplish the measurements.
(a) Range of the target: the distance to the patch of scatterers in any radar depends on the time delay of the scattered signal after transmission. The SeaSonde employs a unique method of determining the range from this time delay. By modulating the transmitted signal with a swept-frequency signal and demodulating it properly in the receiver, the time delay is converted to a large-scale frequency shift in the echo signal. The first digital analysis of the signal therefore extracts the range (distance) to the sea-surface scatterers and sorts it into 32 range "bins".
(b) Velocity from the Doppler shift of the target: information about the velocity of the scattering sea waves (which includes speed contributions due to both current and wave motions) is obtained by a second spectral processing of the signal from each range bin, giving the Doppler frequency shifts due to these motions. The length of the time series used for this spectral processing dictates the velocity resolution; at 25 MHz, for a 256 s time series sample, this corresponds to a velocity resolution of about 2.5 cm/s (the velocity accuracy can be better or worse depending on environmental factors).
(c) Bearing of the target: after the range to the scatterers and their radial speeds have been determined by the two spectral processing steps outlined above, the final step involves extraction of the bearing angle to the patch of scatterers. This is done for the echo at each spectral point (range and speed) by using simultaneous data collected from three directional receive antennas. The complex voltages from these three antennas are put through a "direction finding" algorithm to obtain the bearing [15]. At the end of these three signal-processing steps, surface current speed maps for each remote site are available in polar coordinates.
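The velocity-resolution figure quoted in (b) follows directly from the record length: a Doppler resolution of 1/T maps to a radial-velocity resolution of c/(2 f0 T), i.e. half a radar wavelength per 1/T of Doppler shift. A small sketch (function and variable names are ours):

```python
C = 299_792_458.0   # speed of light, m/s

def velocity_resolution(f0_hz, t_seconds):
    """Radial-velocity resolution (m/s): the Doppler resolution 1/T
    times half the radar wavelength (v = f_D * lambda / 2)."""
    wavelength = C / f0_hz
    return (1.0 / t_seconds) * wavelength / 2.0

dv = velocity_resolution(25e6, 256.0)   # 25 MHz carrier, 256 s record
print(f"{dv * 100:.2f} cm/s")           # prints 2.34 cm/s
```

This reproduces the roughly 2.5 cm/s resolution stated in the text for the 25 MHz SeaSonde with a 256 s sample.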
REFERENCES
1. D.E. Barrick, M.W. Evans and B.L. Weber, Science, 198 (1977) 138.
2. D.E. Barrick, Journal of Oceanic Engineering, OE-11 (1986) 286.
3. K.W. Gurgel, Oceanology (1998) 54.
4. B.J. Lipa and D.E. Barrick, Journal of Oceanic Engineering, OE-8 (1983) 226.
5. R.D. Chapman, L.K. Shay, H.C. Graber, J.B. Edson, A. Karachintsev, C.L. Trump and D.B. Ross, J. Geophys. Res., 102 (1997) 18873.
6. G. Budillon, E. Paschini, A. Russo and A. Simioli, Proceedings of the XIII Congresso dell'Associazione Italiana di Oceanografia e Limnologia (in Italian, 2000a) 23.
7. J.D. Paduan and L.K. Rosenfeld, J. Geophys. Res., 101 (1996) 20669.
8. G. Budillon, M. Grotti, P. Rivaro, G. Spezie and S. Tucci, in: F.M. Faranda, L. Guglielmo and G. Spezie (Eds.), Structures and Processes in the Mediterranean Ecosystems, Springer-Verlag, 2001a, p. 9.
9. G. Budillon, E. Paschini, A. Simioli and E. Zambianchi, in: F.M. Faranda, L. Guglielmo and G. Spezie (Eds.), Structures and Processes in the Mediterranean Ecosystems, Springer-Verlag, 2001b, p. 19.
10. G. Budillon, E. Sansone and G. Spezie, IAME Proceedings, 2000b.
11. P. Franco, L. Jeftic, P. Malanotte-Rizzoli, A. Michelato and M. Orlic, Oceanologica Acta, 5 (1982) 379.
12. C. Bajc, Boll. Geofis. Teor. Appl., 15 (1972) 57.
13. J. Gonella, Deep-Sea Res., 19 (1972) 833.
14. E. Bjorkstedt and J. Roughgarden, Oceanography, 10 (1997) 64.
15. D.E. Barrick and B. Lipa, Oceanography, 10 (1997) 72.
16. R.O. Schmidt, IEEE Trans. Antennas Propag., AP-34 (1986) 276.
The Italian Tsunami Warning System: state of the art

A. Maramai, A. Piscini, G. D'Anna and L. Graziani
Istituto Nazionale di Geofisica e Vulcanologia, Via di Vigna Murata 605, 00143 Rome, Italy
From June 1996 to December 1998 a European Union project called GITEC-TWO (Genesis and Impact of Tsunamis on the European Coasts-Tsunami Warning and Observations) was carried out. One of the main goals of the project was the design and realization of the pilot station of the first Italian Tsunami Warning System (TWS), to be installed on the eastern Sicily coast, which experienced large tsunamis in the past. The Italian TWS is conceived for those coastlines potentially affected by tsunamis generated by local earthquakes, and it is designed for immediate detection and early warning, taking into account the characteristics of the tsunamis occurring in this area. At present the pilot station has been installed and some preliminary analysis of the recorded data has been carried out.
1. INTRODUCTION

Over the years, tsunami research in the world has evolved in steps, alternating periods of great interest with others of moderate or even no attention, and it is worth noting that each renewal of interest has generally been triggered by the occurrence of big events. For example, the extraordinary 1956 Aegean Sea tsunami, which occurred at Santorini island, prompted tsunami research in Europe, and after the 1960s the following period of peak interest in tsunamis in the Mediterranean region flourished around 1980-85, when new tsunami catalogues [1,2] came to light. Although it was well known that tsunamis constitute an important hazard factor for the European coastlines, especially for Greece and Italy, interest in this problem diminished again until 1992, when a first EU project called GITEC (Genesis and Impact of Tsunamis on the European Coasts) started, with the objectives of studying tsunami generation mechanisms and of evaluating tsunami hazard in European seas, in order to reduce tsunami risk in Europe. One of the most relevant results of this project was the production of the first unified catalogue of European tsunamis, in database form [3]. The realization of this catalogue was the starting point for a second EU project, named GITEC-TWO (Tsunami Warning and Observations), which is, in a sense, the continuation and enlargement of the previous project, but mainly focused on the problems of tsunami modeling and warning. In this project the Istituto Nazionale di Geofisica (ING) was given the task of realizing the pilot station of the first Italian tsunami warning system (TWS), in collaboration with the University of Bologna. In fact, the conception of the Italian TWS is the result of this co-operation, and some papers have been produced in recent years [4,5].
A detailed knowledge of the tsunamigenic potential of the coastal areas is a fundamental tool for the realization of an efficient TWS and the availability of a complete and reliable
catalogue is the basis for the identification of the most suitable coastal site for the system installation. In this context, the realization of the new Italian tsunami catalogue [6], an extraction of the European catalogue as far as Italy is concerned, put in evidence the coastal regions particularly exposed to tsunami attacks. Along the Italian coasts a relevant number of destructive tsunami waves occurred in historical times as well as in the present century, and a number of minor tsunamis are also well documented in the literature. The analysis of the catalogue showed that Calabria and Sicily are the Italian regions where the most relevant tsunamigenic earthquakes took place in the past. These regions are characterized by a high seismicity level with a consequent high seismic risk. The first relevant group of seismic events is located in the Messina Straits, the most active and dangerous tsunamigenic source, where the catastrophic December 1908 event occurred, followed by a huge tsunami with waves exceeding 10 meters which caused severe damage in many places along the eastern Sicily coasts and a large number of victims. The western Ionian Sea is another interesting and dangerous tsunamigenic source, and a group of relevant seismic events is located in the Augusta-Siracusa area, where the destructive January 1693 earthquake took place, affecting especially Augusta, Catania and Messina (Fig. 1). In this area the last significant seismic event occurred on December 13, 1990, located in the sea just a few kilometers offshore of Augusta; it caused some anomalous sea behaviour and some measured changes in the bathymetry.
Taking into account the historical information on the characteristics of the Italian tsunamis, the relevant social, economic and environmental value of the exposed coastlines, and also the facilities for the TWS station installation and maintenance, the coastal area selected for the pilot station installation is the coastal segment facing the Ionian Sea and, in particular, the area of Augusta. Since this area was chosen as the most suitable place for the installation, in the frame of GITEC-TWO some additional research has been done, especially on the historical tsunamigenic events that have occurred in this part of the coast. In this context, Piatanesi et al. [7,8] studied the 1693 event to define the position of the tsunamigenic fault consistent with the known earthquake and tsunami data. Taking into account the bathymetry, the authors performed some numerical simulations of the tsunami, considering the initial sea withdrawal in the different localities involved. They concluded that the fault named S5 in Fig. 1 is the most probable candidate responsible for the effects. It should be underlined that, in spite of the limitations of the method, the offshore fault system located in front of Augusta on the megastructure known as the Hyblean-Maltese escarpment is responsible for generating the earthquake. Only by analyzing the tsunami data, and in particular the first wave polarity, was it possible to draw this kind of conclusion.
2. THE ITALIAN TSUNAMI WARNING SYSTEM

In the Mediterranean region tsunamis are usually generated by local earthquakes with quite short travel times from the source to the coast, generally not exceeding 20 minutes. This behaviour is typical, of course, also of the Italian tsunamis, where the waves attack the coast within 10-15 minutes after the shock; this is the case, in particular, for the tsunamis which occurred on the eastern Sicily coasts. Historical information points out that they are generated by earthquakes with focal regions very close to the coast, with epicenters either in the sea or on land. In addition, for all the tsunamis in the area of Augusta the first movement of the sea was a withdrawal, followed, within a few minutes, by a flooding. Considering that, usually, for strong shocks the
Figure 1. Isoseismal lines (dashed) for the January 11, 1693 event. The sign indicates strong tsunami effects.
focal mechanisms remain the same for many cycles, it is reasonable to hypothesize that a possible future earthquake in this area could cause the same kind of sea behaviour, with an initial withdrawal. The hypothesis of a delay between the tsunami generation and the wave attack is fundamental for a suitable utilization of the Italian TWS. In fact, the system differs from those at present operating all over the world, which are efficient for tsunamis generated by distant earthquakes, being based on a real-time seismic network detecting and locating the shock: if the shock is in the sea and has a relevant magnitude, a message is issued to the tide-gauge networks for a possible tsunami verification. This kind of system cannot work for local earthquakes, because the time elapsing between the tsunami generation and the wave attack is quite short. On the contrary, the Italian pilot TWS is a very local system conceived for coastlines attacked by the waves just a few minutes after the shock. In such cases what is needed is an immediate detection of the generated tsunami. This is why our system is an integrated system designed for the simultaneous acquisition of both the seismic signal and the sea-surface displacement, for processing both signals, for detecting dangerous conditions and for issuing an alarm message to specific local units of the Civil Protection service within 4-5 minutes after the shock. Two cases must be distinguished, according to the first sea movement: if the first movement is positive (a flooding of the coast), even the above-mentioned 4-5 minute delay will be too long and the alarm may not be effective; if, instead, the first movement is a withdrawal (the usual behaviour in the Augusta area), the inundation will occur about 10-15 minutes later and the system will be able to send a suitable alarm message, so that proper actions can be taken.

2.1. Architecture and Realization

As already mentioned, the integration of the system, with the installation of a tide gauge station and the use of seismic data, is viewed as the right strategy to detect anomalous sea movements in a timely fashion and to send a suitable alarm. The pilot TWS station was installed in July 1998 on a suitable wharf, managed by the EniChem Oil Company, just outside the Augusta harbour. The wharf (see Fig. 2) has all the necessary characteristics for a good installation: (a) it is located a few km in front of the seismogenic structure that generated the 1693 earthquake; (b) it is a safe place, controlled 24 hours a day; (c) it is only rarely used by the oil company; (d) it extends about 1 km into the sea, with the water depth ranging from 2.5 m (at the beach) to more than 10 m; (e) it has some platforms with basic facilities for the station installation and maintenance. For the monitoring of the sea conditions, a water-level station, including a pressure gauge and a barometric unit, is used. The sensor is a pressure gauge, with a range of 0-10 m and an accuracy of 1 cm, that has been installed in the sea at a depth of about 4 m. The chosen sample rate is 1 Hz, averaged over a 15 s time window. We have to emphasize that the choice of this rate is particularly relevant, because the sea level data available up to now in Italy, from both national and local networks, are low-frequency data, only useful to study long-period oscillations. On the contrary, our choice allows us to obtain information on the events of interest to us and to have a much larger data set. The pressure gauge is connected to an automatic station (SM3840 by SIAP), located on the wharf, able to carry out some diagnostics and to collect and process the sea level data from the sensor and the atmospheric pressure data from the barometric unit included in the station itself.
The station, fed either by a battery or by a photovoltaic panel (used in emergency conditions), is also able to process many other parameters such as temperature, wind velocity and direction, solar radiation, etc. The atmospheric pressure is an additional collected parameter,
sampled every 15 s, necessary for comparison with the sea level data in case of anomalous oscillations. Both the sea level and the atmospheric pressure data are sent, via a dedicated radio link, from the automatic station on the wharf to the TWS control center, located inland about 4 km from the coast. Concerning the seismic data, in this experimental stage the already available three-component seismic station (S-13 seismometers) of Augusta is used, which is an operating station of the Italian seismic network. During the GITEC-TWO project a study of the background seismic noise was carried out, analyzing some months of continuous recordings in both the time and frequency domains. Fig. 3 shows the Power Spectral Density (PSD) acceleration diagrams for the vertical and horizontal components. The main contribution of noise is given by human activity (frequencies greater than 1 Hz). By filtering out all frequencies greater than 2 Hz we obtain the diagram of Fig. 4.
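A PSD estimate like those of Figs. 3 and 8 can be produced in outline with a simple averaged periodogram; this sketch (Hann-windowed, non-overlapping segments) is illustrative and not the exact processing used for the figures:

```python
import numpy as np

def psd_periodogram(x, fs, nseg=8):
    """Averaged-periodogram (Welch-style) PSD estimate from nseg
    non-overlapping, Hann-windowed segments.  Returns frequencies (Hz)
    and power spectral density (units^2 / Hz)."""
    x = np.asarray(x, float)
    seg_len = len(x) // nseg
    win = np.hanning(seg_len)
    norm = fs * np.sum(win ** 2)               # density normalisation
    spectra = []
    for k in range(nseg):
        seg = x[k * seg_len:(k + 1) * seg_len]
        seg = (seg - seg.mean()) * win          # remove mean, apply taper
        spectra.append(np.abs(np.fft.rfft(seg)) ** 2 / norm)
    f = np.fft.rfftfreq(seg_len, d=1.0 / fs)
    return f, np.mean(spectra, axis=0)
```

Averaging over segments trades frequency resolution for a less noisy spectral estimate, which is why long continuous records are valuable for characterizing the background noise.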
Figure 2. General view of the wharf.
Figure 3. PSD diagrams (acceleration vs frequency) for the AU9Z, AU9N and AU9E components, respectively.
Figure 4. Filtered recording from the three-component seismic station (velocity vs time).
As mentioned before, both the sea level and the seismic data are sent to a dedicated TWS control center, which collects and processes the data in real time and is able to detect the occurrence of a significant seismic event and to identify anomalous sea level oscillations. If both signals show anomalous behaviour suggesting the possible occurrence of a tsunamigenic event, an alarm message is issued to local units of the Civil Protection service. The control center is equipped with two PC units on a LAN, one for the seismic signal acquisition and processing and the second for the sea level data processing and for the alarm message procedures. The PC responsible for the seismic signal acquisition, programmed in the LabView environment, contains specific routines for the detection of the earthquake; after the detection, the seismic data are sent to the second PC unit, which also receives the sea level data, continuously transmitted through the radio link. This PC is equipped with the CM5000 software, which has the following main tasks: (a) automatic acquisition of the sea level data; (b) real-time data validation; (c) data collection in a standard SQL relational database (DBMS); (d) graphic display of the data; (e) station activity monitoring; (f) configuration of new stations. On the PC unit equipped with the CM5000 software there also resides, for routine use, a predictive model (developed by means of a neural network) to define the tide gauge trend, continuously compared with the real data, in order to detect anomalous sea level oscillations and to single out an alarm condition. The LAN at the control center is connected to the ING center in Rome by means of a dedicated telephone line, with PPP/TCP access on a SCO UNIX PC, in order to perform some remote tasks (e.g. data downloading, system remote control, etc.). At present only the PC with the CM5000 software is installed at the control center, dedicated to the acquisition and processing of the sea level and pressure data.
However, before the end of this year the second PC unit, in the LabView environment, will be installed, also containing the algorithm for the alarm message transmission. In Fig. 5 the general scheme of the complete system is shown, and Fig. 6 shows a general view of the automatic station installed on the wharf (including the protection box, the antenna and the photovoltaic panel).
Figure 5. General scheme of the Italian pilot TWS.
Figure 6. View of the automatic station installed on the wharf.
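The dual-condition decision rule described above, a seismic trigger combined with an anomalous departure of the sea level from the tide prediction, can be sketched as follows; the threshold value and function names are illustrative, not those of the operational CM5000/LabView software:

```python
def tws_alarm(seismic_trigger: bool,
              sea_level_mm: float,
              predicted_tide_mm: float,
              threshold_mm: float = 100.0) -> bool:
    """Issue an alarm only when BOTH channels are anomalous: a seismic
    detection AND a sea-level departure from the predicted tide larger
    than the threshold.  Requiring both strongly reduces false alarms.
    The 100 mm threshold is an illustrative value only."""
    sea_level_anomalous = abs(sea_level_mm - predicted_tide_mm) > threshold_mm
    return seismic_trigger and sea_level_anomalous
```

For example, a 250 mm departure with a seismic trigger raises the alarm; the same departure without a trigger, or a trigger with the sea level tracking the prediction, does not.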
2.2. Data collection and analysis

Since in the Mediterranean basin it is more probable to record weak tsunami waves than strong events, it becomes very important to know the noise characteristics that can interfere with and mask the real phenomena. So, in the case of this project, in order to determine the main characteristics of the local noise and their dependence on the harbour geometry, we performed a systematic collection and analysis of tide-gauge records, also considering the different existing atmospheric conditions. Since installation the pilot system has been working, although with discontinuous activity due to some problems in both the software and the technical apparatus, allowing the collection of quite a large amount of data. At present, the availability of different sets of sea level data for different periods of time, sampled every second, allows us to make some preliminary fundamental analyses. Starting from the longest continuous available dataset, we first observed the tidal trend by applying a low-pass filtering procedure with a 0.01 Hz corner frequency, to cut the oscillations with periods shorter than 100 s. The pure tide trace is clearly visible only by filtering with a 0.00027 Hz corner frequency, in order to cut oscillations with periods shorter than 1 hour. In Fig. 7 an example of 6 days of recording is shown, and the peaks of the tidal waves, diurnal and semidiurnal, can be observed.
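The two filtering steps described above (a 0.01 Hz corner to remove oscillations shorter than 100 s, and a 0.00027 Hz corner to isolate the tide) can be mimicked with a crude zero-phase FFT filter; this is an illustrative sketch, not the procedure actually used:

```python
import numpy as np

def lowpass_fft(x, fs, corner_hz):
    """Zero-phase low-pass: transform, zero all bins above the corner
    frequency, transform back.  fs in Hz; returns a series the same
    length as x."""
    x = np.asarray(x, float)
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    X[f > corner_hz] = 0.0
    return np.fft.irfft(X, n=len(x))
```

For example, `lowpass_fft(sea_level, fs=1.0, corner_hz=0.00027)` on the 1 Hz record leaves only oscillations with periods longer than about one hour, i.e. the tidal trace.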
Figure 7. Filtered (0.01 Hz) sea level data, from June 13 to 18, 1999 (displacement in meters vs time in seconds).
For a better understanding of the phenomenon we carried out the spectral analysis of the signal, obtaining some Power Spectral Density (PSD) diagrams. In Fig. 8 the PSD diagrams for the data of Fig. 7 and for the July data are shown. For the period from June 13 to 18, 1999 (black line) four peaks, around 1250, 800, 500 and 5 s, are evident. The same kind of behavior was observed by analyzing the data in different time intervals, for example for the period July 11-16, 1999 (grey line). Taking into account this information, we can hypothesize that the Augusta harbour, and in particular the basin where the tide gauge is located, is characterized, from a spectral point of view, by the oscillation periods mentioned above, in particular 1250 s (oscillations with amplitude one order of magnitude lower than the tidal waves). In order to support this hypothesis, we considered the available studies describing the relations between the oscillation periods and the geometrical characteristics of basins in "shallow water" conditions, to compute the dimensions of the Augusta basin. The approximate (Merian) formula from [9]:

Tn = 2L / [n (g h)^1/2]    (a)

was used, where L is the dimension of the basin along the wave propagation axis, g is the gravity acceleration and h the sea depth in the basin. The parameter n is inversely proportional to the wavelength: the oscillation wavelength is 2L (if n = 1), L (if n = 2), 2L/3 (if n = 3), etc. The period with n = 1, T1, is the main oscillation period:

T1 = 2L / (g h)^1/2    (b)
Substituting the experimental period found, 1250 s, into formula (b), and considering an average depth of 6 meters, we obtain an L value of about 4800 m, which is of the same order of magnitude as the Augusta harbour. This simple model (considering only the mean length and depth of the basin) is therefore able to explain the 1250 s peak; to explain the other observed oscillation frequencies, more complex numerical models, taking into account the real geometry of the basin, are needed. The comparison between the sea level and atmospheric pressure spectra (Fig. 9) does not indicate any correlation in the range of frequencies studied. In fact, the atmospheric phenomena can contribute to changing the wave amplitudes, but not their oscillation periods. During the period of activity of the system no significant event has occurred to generate an alarm condition. At the Augusta seismic station the largest seismic events which occurred in the Mediterranean area, in Turkey and Greece respectively (Fig. 10), have been recorded but, obviously, no alarm message was issued because of the absence of anomalous conditions in the sea level data. On the other hand, the Italian coasts have never in the past experienced tsunamis generated by distant sources. Likewise, in the case of anomalies in the sea level data, if the system does not find a trigger of a possible seismic event, no alarm message will be sent. The necessity of having simultaneous anomalies in both the seismic and sea level signals in order to identify a warning condition strongly reduces the possibility of false alarms.
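The substitution into formula (b) is easy to check numerically; with T1 = 1250 s and h = 6 m, the Merian formula indeed returns a basin length of the order of the Augusta harbour:

```python
import math

G = 9.81   # gravity acceleration, m/s^2

def merian_period(L, h, n=1):
    """Merian formula (a): T_n = 2L / (n * sqrt(g*h))."""
    return 2.0 * L / (n * math.sqrt(G * h))

def basin_length(T1, h):
    """Formula (b) inverted for L: L = T1 * sqrt(g*h) / 2."""
    return T1 * math.sqrt(G * h) / 2.0

L = basin_length(1250.0, 6.0)
print(f"{L:.0f} m")   # about 4800 m, the order of the Augusta harbour size
```

The higher modes follow from (a) with n = 2, 3, ..., although, as noted in the text, the real basin geometry must be modelled to explain the other observed peaks.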
Figure 8. Power Spectral Density diagrams (displacement vs. frequency). The black line shows unfiltered data from June 13 to 18, 1999; the grey line shows the data for the period July 11 to 16, 1999.
Figure 9. Comparison between the PSD (amplitude vs. frequency) of the sea level data (black line) and the atmospheric pressure data (grey dashed line) for July 1999.
Figure 10. The August 17, 1999 Turkish event (upper trace) and the September 9, 1999 Greek event (lower trace) recorded at the three-component S-13 Augusta seismic station (AU9). Only the vertical component is shown for both events.
3. CONCLUSIONS

A TWS for the Calabrian-Sicilian tsunamis has been realized and installed close to the Augusta harbour. The system works for local tsunamis, taking advantage of the delay between the tsunami generation and the wave attack. In spite of the limitations due to the very short time (10-15 min) available for all the operations, this is the only kind of system suitable for the characteristics of the Italian tsunamis, and in particular for those occurring on the eastern Sicily coast. Although some technical problems occurred during the operating period, preliminary analyses of both the seismic and sea level data have been carried out. In the future the installation of a strong-motion seismometer is foreseen, to support the S-13 seismometer, in order to reduce the possibility of saturation and thus improve the quality of the available data.
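The 10-15 minute window that the whole design relies on follows from the shallow-water wave speed c = (gh)^1/2: the travel time is simply the source-coast distance divided by that speed. The distance and depth below are purely illustrative, not measured parameters of the Augusta source region:

```python
import math

G = 9.81   # gravity acceleration, m/s^2

def travel_time_minutes(distance_m, mean_depth_m):
    """Shallow-water tsunami travel time t = d / sqrt(g*h), in minutes."""
    return distance_m / math.sqrt(G * mean_depth_m) / 60.0

# Hypothetical numbers: a source 20 km offshore over ~150 m mean depth.
print(f"{travel_time_minutes(20_000, 150.0):.1f} min")   # prints 8.7 min
```

A local source a few tens of kilometres offshore thus gives arrival times of the order of ten minutes, which is why the system must issue its alarm within 4-5 minutes of the shock.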
ACKNOWLEDGMENTS
The authors wish to thank Prof. Stefano Tinti of the University of Bologna for his constant contribution to the realization of the TWS. This work was supported by the GITEC-TWO project, contract ENV4-CT96-0297.
REFERENCES
1. J. Antonopoulos, Ann. Geofis., 33 (1980) 141.
2. G.A. Papadopoulos and B.J. Chalkis, Mar. Geol., 56 (1984) 309.
3. S. Tinti, M.A. Baptista, C.B. Harbitz and A. Maramai, Proceedings of the International Conference on Tsunamis, Paris, 1998, p. 84.
4. A. Maramai and S. Tinti, Phys. Chem. Earth, 21 (1996) 39.
5. A. Piscini, A. Maramai and S. Tinti, International Conference on Tsunamis, Paris, 1998, p. 137.
6. S. Tinti and A. Maramai, Annali di Geofisica, 34 (1996) 1253.
7. A. Piatanesi, S. Tinti, A. Maramai and E. Bortolucci, Atti XV GNGTS, Rome, 1996, p. 337.
8. A. Piatanesi and S. Tinti, J. Geophys. Res., 103 (1998) 2749.
9. F. Mosetti, Oceanografia, Del Bianco Editore, Udine, 1964.
Advanced technologies: equipment for environmental monitoring in coastal areas

G. Zappalà
Istituto Sperimentale Talassografico CNR, Spianata San Ranieri 86, 98122 Messina, Italy
Traditional equipment and techniques for offshore oceanography hardly meet the requirements of coastal monitoring (committed to the acquisition of long-term time series of characterizing parameters), creating the need for special-purpose instrumentation able to work safely in all weathers. The lack of such instrumentation in the 1980s and the ecological interest in environmental preservation led in 1988 to the organization of the CNR "Strategical Project for Automatic Monitoring of Marine Pollution in Southern Italy Seas". During the ten years the Project lasted, new instruments and systems for automatic, real-time monitoring of marine coastal waters were developed for the first time in Italy. The availability of new microprocessors and semiconductors made it possible to design and build a whole new generation of data acquisition and transmission equipment, combining low cost with good performance in terms of power consumption, size and reliability. The know-how obtained was also used in the "Venice Lagoon System" programme and in the PRISMA 2 - Sub-Project 5 "Remote Sensing and Monitoring" research of the Italian Ministry for University and Scientific and Technological Research (MURST), aimed at the study and safeguard of the Adriatic Sea. Systems for real-time automatic monitoring of water characterizing parameters (e.g. temperature, salinity, dissolved oxygen, nutrient content) were installed in Augusta (SR), Messina and Venice. The paper presents the equipment developed and some examples of its use.
1. MAIN DESIGN CONSIDERATIONS
The main restrictions the systems must comply with concern the size, weight and power consumption of the equipment, which must be minimized to fit even on little buoys. The systems are usually powered by rechargeable batteries backed up by solar cells or wind generators: this rules out the instrumentation that could fit in a laboratory or on a ship, and requires ensuring the proper supply voltage in all battery conditions, avoiding wastage of limited and precious energy. The ordinary maintenance required by a station (sensor cleaning and reagent refilling) should be reduced to a minimum; its interval usually varies, according to the trophic conditions, between 2 and 4 weeks. The measuring systems vary according to the kind of monitoring required: physical parameters (e.g. water temperature and salinity) can be easily measured and also used as tracking parameters (e.g. to estimate the dispersion of pollutants carried by fresh water bodies),
while chemical parameters can be measured using sensors (e.g. dissolved oxygen and pH) or automatic instruments (e.g. nitric nitrogen, phosphates). The communication system (redundant, if necessary) must be chosen according to the site conditions: VHF and UHF private radio links are usually the most reliable solutions, but require base stations for the antenna equipment. The use of cellular phones is simpler, but (more for GSM than for E-TACS) their applicability and reliability depend on the telecom company; satellite communications are usually too expensive for long-term monitoring. The last, but not least, requirement to be met is cost: building a network of n monitoring stations multiplies the cost of a single one by a factor of at least n+1 (the +1 accounts for the need for a spare system). The number of stations depends on the characteristics of the site and on the expected results: a preliminary study of the hydrodynamic conditions and of the presence of sewers or river mouths is necessary to define the locations and number of the monitoring stations.
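As a rough illustration of the power-budget reasoning above, the following sketch computes the daily energy drawn by a duty-cycled station against the solar input; all figures are invented for illustration, not values from the systems described here.

```python
# Back-of-the-envelope daily energy budget for a duty-cycled buoy station.
# All figures are illustrative assumptions, not measured values.

def daily_energy_wh(active_w, sleep_w, active_min_per_cycle, cycles_per_day):
    """Energy drawn in 24 h by a system that wakes periodically to measure."""
    active_h = active_min_per_cycle * cycles_per_day / 60.0
    sleep_h = 24.0 - active_h
    return active_w * active_h + sleep_w * sleep_h

# e.g. 6 W while measuring/transmitting for 5 min every hour, 0.2 W asleep
load_wh = daily_energy_wh(6.0, 0.2, 5, 24)
solar_wh = 20.0 * 4.0 * 0.7   # 20 W panel, 4 equivalent sun-hours, 70% efficiency
print(f"load {load_wh:.1f} Wh/day, solar input {solar_wh:.1f} Wh/day")
```

A positive margin between solar input and load, under the worst expected insolation, is what allows the battery-backed station to run unattended between maintenance visits.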
2. THE FIRST SYSTEMS: THE FLEXIBLE MONITORING NETWORK
The first systems were obtained with the cooperation of some firms taking their first steps in this new business. Experiences in environmental monitoring performed in Germany [1,2] and France [3] encouraged us to set up a network in Augusta Bay (SR), of which a very short description follows.
2.1. The buoys
Five small buoys were installed in Augusta Bay for about 3 years in the early 1990s, at mooring depths of around 11 m. The measuring subsystem was based on an industrial-grade 8088 PC-compatible computer, equipped with 12-bit A/D converters with serial output (one per sensor). All the buoys communicated on the same frequency, using 10 W UHF transceivers with analog modems; carrier detection was used to switch the measuring equipment on and off. Operations were controlled by an automatic base station (hosted by the harbour authority), which sequentially activated data acquisition and transmission from each buoy; the buoys had no local data storage. Remote control software was used to manage the network and data transfer from the Institute in Messina. Measurements included water temperature, salinity, dissolved oxygen, turbidity and fluorescence [4].
2.2. The continuous measuring system
The network was supplemented by a continuous measuring system hosted on a little boat cruising on a prefixed course. Sea water was continuously pumped into a little tank housing a multiparametric CTD probe (with sensors for temperature, conductivity, dissolved oxygen and pH), a turbidimeter and a fluorimeter for chlorophyll a. The tank also supplied water to a continuous-flow analyzer, able to measure nitrates in water with a response time of about 300 s. Acquired data, integrated with the GPS fix, were displayed in real time and stored in a database for subsequent elaboration.
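The base station's sequential activation of the buoys (Section 2.1) can be sketched as below. The buoy objects are simulated stand-ins with no real radio I/O, and the class and method names are illustrative assumptions; only the scheme (carrier detect wakes the electronics, acquisition and transmission follow, then power-down, strictly one buoy at a time on the shared frequency) comes from the text.

```python
# Simulated sketch of the carrier-detect polling scheme: the base station
# addresses one buoy at a time; the detected carrier switches the measuring
# equipment on, data are acquired and transmitted, then the buoy powers down.

class Buoy:
    def __init__(self, station_id):
        self.station_id = station_id
        self.powered = False

    def carrier_detected(self):          # carrier detect switches electronics on
        self.powered = True

    def acquire_and_transmit(self):
        assert self.powered, "measuring equipment is off"
        # invented readings; the buoys had no local storage, so data go out now
        return {"id": self.station_id, "temp_C": 22.4, "sal_psu": 36.5}

    def carrier_lost(self):              # ...and off again
        self.powered = False

def poll_network(buoys):
    """Sequentially poll each buoy on the single shared frequency."""
    records = []
    for buoy in buoys:
        buoy.carrier_detected()
        records.append(buoy.acquire_and_transmit())
        buoy.carrier_lost()
    return records

data = poll_network([Buoy(i) for i in range(1, 6)])   # the five Augusta buoys
```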
3. THE NEW SYSTEMS
The experience obtained with the first systems showed their limits and boosted the custom design of a new series of equipment, all developed and built in Messina.
3.1. The data acquisition and transmission system
The core of the new generation of devices is the data acquisition and transmission system, whose general architecture is shown in Fig. 1. It offers great modularity and is conceptually similar to that described in 1999 by Lessing et al. [5] for the American NDBC. It is able to control almost any instrument communicating via standard serial or parallel digital interfaces, or to read measurements presented as an analogue voltage or current.
3.1.1. The computer subsystem
The basic architecture of the computer subsystem is an IBM PC-like one, adapted over the years to different needs and to the availability of new components. An x86 CPU is the heart of the system, dialoguing with external devices using standard RS232 serial communication ports and I/O-mapped bidirectional ports controlled on a single-bit basis; a 12-bit resolution, 16-input A/D converter and solid-state storage devices complete the system. Two different kinds of boards were adopted, both with satisfactory results. The first system, built in 1996 to equip the "Advanced Technology Coastal Monitoring Platform", used a 486 CPU with industrial-grade boards and had no local data storage. Housed in an IP65 box, it was the natural development of a 286 system designed in the early 1990s in the framework of the "Venice Lagoon System" Strategical Project.
[Fig. 1 block diagram: a PC-like computer with digital I/O and an analogue interface; a communication subsystem (packet switching modem, cellular/GSM modem); a main power supply fed by solar cells and batteries; connected devices: subsurface CTD, nutrient analyzers, meteo station, radiometer and analog sensors, remote controls, future expansions.]
Figure 1. The basic architecture of the data acquisition and transmission system.
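The 12-bit A/D conversion step of Section 3.1.1 can be sketched as follows; the 0-5 V input span and the linear sensor calibration are assumptions made for illustration, not specifications of the actual converter.

```python
# Minimal sketch of converting raw 12-bit A/D counts into engineering units.
# The 0-5 V span and the linear temperature calibration are invented.

FULL_SCALE_V = 5.0
ADC_COUNTS = 2 ** 12          # 12-bit resolution -> 4096 levels

def counts_to_volts(raw):
    """Map a raw count (0..4095) onto the assumed 0-5 V input span."""
    if not 0 <= raw < ADC_COUNTS:
        raise ValueError("raw value outside 12-bit range")
    return raw * FULL_SCALE_V / (ADC_COUNTS - 1)

def volts_to_temp_c(v, offset_c=-5.0, span_c=40.0):
    # hypothetical linear sensor: -5 degC at 0 V, +35 degC at 5 V
    return offset_c + span_c * v / FULL_SCALE_V

raw = 2048                     # mid-scale reading from one of the 16 channels
print(round(volts_to_temp_c(counts_to_volts(raw)), 2))
```

The same pattern, with a per-sensor calibration in place of the linear one, applies to each of the 16 analogue inputs.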
The computer subsystem was switched on only during the measurements and then switched off again by sending commands to the packet switching modem. A new implementation, always switched on and with local data storage, using a 386 CPU and PC104 standard boards, was developed in 1997 in the framework of the "PRISMA 2" Project and installed on the "Acqua Alta" platform in the Venice Lagoon.
3.1.2. The communication subsystem
The first communication subsystem, used in Messina, basically consisted of a 2 W UHF radio transceiver, a switchable 10 W RF amplifier/antenna preamplifier and a 2400 baud packet switching radio modem, equipped with 4 remotely controlled digital I/O ports and capable of acting as a transponder towards stations not directly reachable from the base. The system in Venice has been equipped with a Nokia 2110 GSM cellular phone and a 9600 baud Nokia Cellular Data Card PCMCIA modem. A new GSM communication module, expressly designed for this kind of application, has recently been integrated with new PC104 electronics to be used in the Messina system and in the new generation of buoys.
3.1.3. The software
On the remote systems the onboard firmware controls all the functions, implementing a set of macro-commands designed to best perform all the measuring and maintenance operations; it can also act, if requested, as a stand-alone monitor, or as a "repeater" towards other stations, making a buoy the "master" of a series, performing like an "intermediate base station". Software running on the main base station performs timed data acquisition with real-time transmission and/or timed data transfer. The operating mode can be chosen according to the communication device and the scientific needs: usually the UHF link is chosen for real-time data acquisition and transmission, while the GSM link is preferred for delayed data transmission.
3.2. The Advanced Technology Coastal Monitoring Platform
The platform (about 8 m²) has the shape of an equilateral triangle, composed of four little buoys (three at the corners and one in the center). A steel grid lies over the structure and is used as a floor for maintenance operators; pyramid-shaped towers are bolted over the corner buoys, while a higher, triangular prism-shaped tower carrying the solar panels overhangs the center one. The platform (Fig. 2, left) was equipped with: the data acquisition and transmission system previously described (Fig. 2, middle), a CTDO probe, a meteo station, colorimetric analyzers for ammonia and phosphates, and a remotely controlled water multi-sampler for bacteriological analysis (Fig. 2, right). The water sampler features eight 250-ml bottles, preconditioned with 10 cc of formalin for sample preservation, cyclically filled by a peristaltic pump via a circular water distributor with position control, operated by a stepping motor. The bottle frame is a circular slice of teflon.
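The multi-sampler logic described above (a stepping motor turning the circular distributor to the next of eight preconditioned bottles, then the peristaltic pump filling it) can be sketched as below. The hardware calls are stubbed, and the step counts, class name and method names are illustrative assumptions.

```python
# Sketch of the eight-bottle water sampler sequencing; hardware is stubbed.
# STEPS_PER_REV and the API are invented for illustration.

STEPS_PER_REV = 200
BOTTLES = 8
STEPS_PER_BOTTLE = STEPS_PER_REV // BOTTLES

class Sampler:
    def __init__(self):
        self.position = 0            # bottle index under the distributor
        self.filled = []

    def _step_motor(self, steps):
        pass                         # would pulse the stepping motor here

    def take_sample(self):
        """Rotate to the next bottle and fill it (250 ml, formalin-fixed)."""
        if len(self.filled) >= BOTTLES:
            raise RuntimeError("all eight bottles already filled")
        self._step_motor(STEPS_PER_BOTTLE)
        self.position = (self.position + 1) % BOTTLES
        self.filled.append(self.position)
        return self.position
```

Refusing further samples once all eight bottles are used mirrors the physical constraint: the bottles can only be replaced during a maintenance visit.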
Figure 2. The Platform moored near Messina (left), the Computer subsystem (1996 version) (middle) and a close-up of the water sampler (right).
4. SOME APPLICATIONS
4.1. Upwelling detection in the Strait of Messina
The tidal movements cause the alternate presence in the Strait of Messina of Tyrrhenian (warmer, nutrient-poor) and Ionian (colder, nutrient-rich) waters. A complete substitution does not occur, but springs of Ionian water rise up among the Tyrrhenian waters. A study performed in the mid-1990s, using a little boat equipped with the continuous measuring system, cruising on a fixed course and measuring some tracking parameters (mainly temperature and salinity) in the few hours during which the tidal movements reach their highest and lowest intensity, enabled us to find and map the most important upwelling areas, also near the southern Sicilian and Calabrian coasts, and to define the zones affected by the dispersion of urban waste waters under the various hydrodynamic conditions [4].
4.2. Physical, chemical and bacteriological measurements using the "Advanced Technology Coastal Monitoring Platform" in Messina
The platform was moored in September 1996 at a depth of 20 m, near the mouth of a torrent used as an urban sewer, at a site where periodic tidal streams produce an alternate and variable mixing of sea and waste waters [6]. The experience demonstrated the practicability of automatic water sampling and in situ measurement of nutrients. Statistics of the temperature (T) and salinity (S) values measured hourly in two sample weeks, in September 1996 and March 1997, are reported in Table 1. The lower salinity values in March, with a high coefficient of variation (C.V. = standard deviation/average), confirm the strong influence of rain water on the sewer flux, and a preponderance of the plume over the tidal movements, while the lower flux in September (confirmed also by the higher salinity levels) shows the tidal variations; the behaviour of the Pearson correlation coefficient r is probably caused by the less diluted sewer flux arriving with the Tyrrhenian waters.
The statistics of ortho-phosphate (12 October 1996) and ammonia (19 October 1996) concentrations, measured every 2 hours, and their correlation r with temperature, are reported in Table 2; they confirm the tidal influence on the dilution of the sewer.
Table 1

                    T (°C)                       S (psu)
                min    max    Aver.  C.V.    min    max    Aver.  C.V.      r
10-16 Sept. '96 19.63  24.03  22.39  0.85    34.33  37.09  36.50  0.17   0.59
21-27 Mar. '97  15.28  16.66  15.96  0.04    32.56  36.57  35.00  0.60  -0.21
Table 2

                     min    max    Av.    C.V.     r
PO4 (12 Oct. '96)    0.56   5.38   2.98   2.92   0.11
NH4+ (19 Oct. '96)   2.20   4.89   3.14   0.71   0.35
The positive Pearson coefficients show that the measured values increase with temperature, the warmer Tyrrhenian waters bringing the plume closer to the platform. The values of r depend on the different dispersion of ortho-phosphates and ammonia in the skin and subsurface layers. Water samples automatically collected and fixed were analysed in the laboratory to detect the presence of Escherichia coli, a micro-organism indicator of faecal pollution. The data always showed high concentrations of Escherichia coli cells, ranging from 1.36×10³ to 3.78×10⁴ cells/100 ml (Fig. 3). All the measurements showed the existence of a base level of chemical and bacteriological pollution, with increases related to the presence of warmer Tyrrhenian waters drawing the waste waters closer to the platform.
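The two statistics used in Tables 1 and 2 (the coefficient of variation, C.V. = standard deviation/average, and Pearson's r against temperature) can be reproduced with the standard library; the short hourly series below are invented for illustration and are not the data behind the tables.

```python
# C.V. and Pearson's r as used in Tables 1 and 2; the series are invented.
from statistics import mean, pstdev

def pearson_r(x, y):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

temp = [22.1, 22.4, 22.9, 23.5, 23.1, 22.6]   # hourly T (degC), invented
nh4 = [2.3, 2.5, 3.0, 4.6, 3.9, 2.8]          # NH4+ concentration, invented

cv = pstdev(nh4) / mean(nh4)                  # coefficient of variation
print(f"C.V. = {cv:.2f}, r = {pearson_r(temp, nh4):.2f}")
```

A positive r, as in the example series, corresponds to the situation described in the text: concentrations rising together with temperature as the warmer Tyrrhenian waters push the plume towards the platform.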
4.3. Physical and chemical measurements in the Venice Lagoon
The system was installed in 1997 in a laboratory on the second deck of the "Acqua Alta" platform in the Venice Lagoon. It is equipped with a CTDO probe, a modified Barnes PRT-5 TIR radiometer and a µMAC ammonia colorimetric analyzer. The combined use of the radiometer and the submersed CTD probe allows us to observe the variability (due to solar radiation and wind) of the surface temperature with respect to the water mass underneath, and the presence of "anomalies" (like the peak in the hourly measurements shown in Fig. 4) that can be related to the presence of materials and fluxes affecting only the very skin layer [7]. The observation of water temperature and ammonia content in Fig. 5 displays the arrival of warmer offshore waters, having a lower ammonia content than the continental ones; by measuring some physical tracking parameters it is possible to monitor the arrival of fresh waters and to hypothesize the sea's capability of diluting pollutants of anthropic origin [8].
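The radiometer-versus-probe comparison above amounts to flagging hours where the skin temperature departs from the bulk temperature by more than some threshold, a minimal sketch of which follows; the series, the threshold and the function name are illustrative assumptions.

```python
# Flag "anomalies" where the skin temperature (radiometer) departs from the
# bulk temperature (submersed CTD) by more than a threshold. All values
# below are invented for illustration.

def skin_anomalies(skin, bulk, threshold_c=0.5):
    """Return the indices where |skin - bulk| exceeds the threshold."""
    return [i for i, (s, b) in enumerate(zip(skin, bulk))
            if abs(s - b) > threshold_c]

skin = [18.2, 18.3, 19.6, 18.4, 18.2]   # hourly radiometer readings (invented)
bulk = [18.1, 18.2, 18.3, 18.3, 18.2]   # submersed probe readings (invented)
print(skin_anomalies(skin, bulk))        # the 19.6 reading stands out
```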
Figure 3. Escherichia coli and temperature values at the Messina platform, 20-27 March 1997.
Figure 4. Temperatures measured by the radiometer and the submersed probe, 10-29 September 1997.
Figure 5. Temperature and ammonia values measured at the Venice platform, 23-24 February 1999.
5. CONCLUSIONS
The performance of the systems is well worth their cost. Thanks to the modular structure of the acquisition systems, new instruments and sensors can be easily added. The systems have proven to be a useful, reliable basis for building networks for automatic environmental monitoring, and they are still open to further development.
REFERENCES
1. K. Grisard, Proc. Oceans '94 (I) (1994) 38.
2. D. Knauth, F. Schroeder, R. Menzel, S. Thurow, S. Marx, E. Gebhart, D. Kohnke and F. Holzkamm, Proc. Oceanology International '96 (III) (1996) 21.
3. A.H. Carof, D. Sauzade and Y. Henocque, Proc. Oceans '94 (III) (1994) 298.
4. E. Crisafi, F. Azzaro, G. Zappalà and G. Magazzù, Proc. Oceans '94 (I) (1994) 455.
5. P. Lessing, D. Henderson and B. Edwards, Proc. Oceans '99 (II) (1999) 785.
6. G. Zappalà, E. Crisafi, G. Caruso, F. Azzaro and G. Magazzù, Proc. Oceanology International '98 (I) (1998) 69.
7. G. Zappalà, L. Alberotanza and E. Crisafi, Proc. Ocean Community Conference '98 (I) (1998) 585.
8. G. Zappalà, L. Alberotanza and E. Crisafi, Proc. Oceans '99 (II) (1999) 796.