In order to archive, quality control, and disseminate a large variety of marine data in a marine data exchange platform, a Marine XML has been developed to encapsulate marine data, providing an efficient means to store, transfer, and display them. This paper first presents the details of the main Marine XML elements and then gives an example showing how to transform CTD-observed data into the Marine XML format, illustrating the XML encapsulation process for marine observational data.
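As a toy illustration of such encapsulation, the sketch below wraps a few CTD records in an XML envelope. The element names (`MarineData`, `Cast`, `Record`, and so on) are invented for illustration and are not the paper's actual Marine XML schema:

```python
import xml.etree.ElementTree as ET

def ctd_to_xml(station, records):
    """Wrap CTD records (depth, temperature, salinity) in a simple XML envelope.
    Element and attribute names are hypothetical, not the paper's schema."""
    root = ET.Element("MarineData", {"type": "CTD"})
    cast = ET.SubElement(root, "Cast", {"station": station})
    for depth, temp, sal in records:
        rec = ET.SubElement(cast, "Record")
        ET.SubElement(rec, "Depth").text = str(depth)
        ET.SubElement(rec, "Temperature").text = str(temp)
        ET.SubElement(rec, "Salinity").text = str(sal)
    return ET.tostring(root, encoding="unicode")

print(ctd_to_xml("ST01", [(0, 18.2, 34.5), (10, 17.9, 34.6)]))
```

Because the envelope is plain XML, the same document can feed storage, transfer, and display pipelines without format-specific parsers.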
The study of marine data visualization is of great value. Marine data, owing to their large scale, random variation, and multiresolution nature, are difficult to visualize and analyze. Constructing ocean models and visualizing model results have become central research topics of the 'Digital Ocean'. In this paper, a spherical ray-casting method is developed that improves the traditional ray-casting algorithm and makes efficient use of GPUs. For ocean current data, a 3D view-dependent line integral convolution method is used, in which the spatial frequency is adapted according to the distance from the camera. The study is based on a 3D virtual reality and visualization engine, the VV-Ocean. Interactive operations are also provided to highlight interesting structures and characteristics of the volumetric data. Finally, marine data gathered in the East China Sea are displayed and analyzed. The results show that the method meets the requirements of real-time, interactive rendering.
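The line integral convolution (LIC) idea mentioned above can be sketched in two dimensions: smear a noise texture along short streamlines of the vector field so that the output texture reveals flow direction. This toy version omits the paper's view-dependent frequency adaptation and GPU acceleration:

```python
import numpy as np

def lic(u, v, noise, length=8, step=0.5):
    """Minimal 2-D line integral convolution: for each pixel, average the
    noise texture along a short streamline of the field (u, v), traced
    forward and backward with periodic boundary wrapping."""
    h, w = noise.shape
    out = np.zeros_like(noise)
    for i in range(h):
        for j in range(w):
            total, count = 0.0, 0
            for sign in (1.0, -1.0):          # trace both directions
                x, y = float(j), float(i)
                for _ in range(length):
                    yi, xi = int(round(y)) % h, int(round(x)) % w
                    total += noise[yi, xi]
                    count += 1
                    mag = np.hypot(u[yi, xi], v[yi, xi]) or 1.0
                    x += sign * step * u[yi, xi] / mag
                    y += sign * step * v[yi, xi] / mag
            out[i, j] = total / count
    return out

rng = np.random.default_rng(0)
tex = lic(np.ones((32, 32)), np.zeros((32, 32)), rng.random((32, 32)))
```

With a uniform eastward field, as here, each output pixel averages noise along its row, producing the horizontal streaks characteristic of LIC imagery.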
With long-term marine surveys and research, and especially with the development of new marine environment monitoring technologies, prodigious amounts of complex marine environmental data are generated and continue to increase rapidly. These data are massive in volume, widely distributed, multi-source, heterogeneous, multi-dimensional, and dynamic in structure and time. The present study recommends an integrative visualization solution for these data, to enhance the visual display of data and data archives and to develop joint use of data distributed among different organizations or communities. It also analyzes web services technologies, defines the concept of the marine information grid, and then focuses on spatiotemporal visualization, proposing a process-oriented spatiotemporal visualization method. We discuss how marine environmental data can be organized based on this method, and how the organized data are represented for use with web services and stored in a reusable fashion. In addition, we provide an original, integrative visualization architecture based on these technologies. Finally, we present a prototype system for marine environmental data of the South China Sea that visualizes Argo floats, sea surface temperature fields, sea current fields, salinity, in-situ investigation data, and ocean stations. The integrated visualization architecture is illustrated on the prototype system, which highlights the process-oriented temporal visualization method and demonstrates the benefits of the architecture and methods described in this study.
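The process-oriented organization described above can be caricatured as a time-ordered list of frames that a web service would deliver one by one, here for a drifting float. The field names are illustrative, not the paper's schema:

```python
import json

def process_frames(track):
    """Organize observations of one evolving phenomenon (e.g., a drifting
    Argo float) as an ordered list of time frames: the unit a
    process-oriented visualization service would stream frame by frame.
    `track` is an iterable of (time, lon, lat, value) tuples."""
    frames = []
    for t, lon, lat, value in sorted(track):   # order by time
        frames.append({
            "time": t,
            "geometry": {"type": "Point", "coordinates": [lon, lat]},
            "value": value,
        })
    return json.dumps(frames)
```

Serializing each frame as JSON keeps the representation reusable: the same payload can back a map animation, a time-series plot, or an archive record.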
To meet the requirements of efficient management and web publishing of marine remote sensing data, a spatial database engine named MRSSDE was designed. Its logical model, physical model, and optimization methods are discussed in detail. Compared with ArcSDE, the leading commercial spatial database engine, MRSSDE proved more effective.
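The physical-model idea behind such an engine, storing imagery as fixed-size blocks keyed by grid position so that window queries touch only a few blocks, can be sketched as follows. MRSSDE's actual storage layout is not described here, so this shows only the general principle:

```python
import numpy as np

def tile_raster(arr, tile=256):
    """Split a remote-sensing raster into fixed-size blocks keyed by
    (block_row, block_col). A window query then needs to fetch only the
    blocks overlapping the window, rather than the whole image."""
    h, w = arr.shape
    blocks = {}
    for r in range(0, h, tile):
        for c in range(0, w, tile):
            blocks[(r // tile, c // tile)] = arr[r:r + tile, c:c + tile]
    return blocks
```

In a real engine the blocks would be stored as database rows (often compressed), with a pyramid of reduced-resolution levels on top for fast web display.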
At present, data security has become the most urgent issue of the digital economy era, and marine scientific data security is the core issue of marine data resource management and sharing services. This paper analyzes the requirements of marine scientific data security governance, develops governance approaches and methods in depth, and applies them in the national marine scientific data center, optimizing the existing data management model, ensuring the safety of marine scientific data, and fully releasing the value of the data.
Blended acquisition technology in marine seismic exploration offers high acquisition efficiency and low exploration costs. During acquisition, however, the primary source may be disturbed by adjacent sources, producing blended noise that adversely affects data processing and interpretation. De-blending methods are therefore needed to suppress blended noise and improve the quality of subsequent processing. Conventional de-blending methods, such as denoising and inversion methods, face challenges in parameter selection and entail high computational costs. In contrast, deep learning-based de-blending methods require less manual intervention and compute rapidly once trained. In this study, we propose a Uformer network using a non-overlapping window multi-head attention mechanism, designed for de-blending data in the common shot domain. We add depthwise convolution to the feed-forward network to improve the Uformer's ability to capture local context. The loss function combines SSIM and L1 losses. Our tests indicate that the Uformer outperforms convolutional neural networks and traditional denoising methods across various evaluation metrics, highlighting its effectiveness and advantages in de-blending blended data.
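The combined loss mentioned above can be sketched as a weighted sum of (1 - SSIM) and mean absolute error. The weighting and the global (non-windowed) SSIM below are simplifications for illustration, not the paper's exact formulation:

```python
import numpy as np

def ssim_l1_loss(pred, target, alpha=0.85, c1=1e-4, c2=9e-4):
    """Weighted sum of (1 - SSIM) and L1 error between two arrays.
    A global (single-window) SSIM is used here for brevity; practical
    implementations compute SSIM over local sliding windows."""
    mu_p, mu_t = pred.mean(), target.mean()
    var_p, var_t = pred.var(), target.var()
    cov = ((pred - mu_p) * (target - mu_t)).mean()
    ssim = ((2 * mu_p * mu_t + c1) * (2 * cov + c2)) / (
           (mu_p ** 2 + mu_t ** 2 + c1) * (var_p + var_t + c2))
    l1 = np.abs(pred - target).mean()
    return alpha * (1.0 - ssim) + (1.0 - alpha) * l1
```

The SSIM term rewards structural similarity of the de-blended gather, while the L1 term penalizes amplitude errors; `alpha` trades the two off.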
Typhoon Chaba was the most intense typhoon to strike western Guangdong since Typhoon Mujigae in 2015. According to the National Disaster Reduction Center of China, by the morning of July 7, 2022, over 1.5 million people in Guangdong, Guangxi, and Hainan had been affected by Typhoon Chaba. The typhoon also left the "Fukui 001" ship in distress in the waters near Yangjiang, Guangdong, on July 2, resulting in heavy casualties. Studies have indicated that the wind field forecasts for Typhoon Chaba were not accurate. To better simulate typhoon events and assess their impacts, we propose combining a model wind field (Fujita-Takahashi) with Copernicus Marine Environment Monitoring Service (CMEMS) data to reconstruct the overall wind field of Typhoon Chaba effectively. The simulation agrees well with observations, particularly at the Dashu Island Station, showing consistent trends in wind speed changes. Certain limitations were noted, however: the modeled attenuation of wind speed as the typhoon neared land is slower than observed, indicating that the model simulates the open-ocean wind field accurately but may deviate near coastal areas, where land friction matters. We therefore recommend adjusting the model to improve its accuracy near coasts.
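For orientation, one common textbook form of the Fujita pressure profile and the gradient wind it implies can be sketched as follows. The formula variant and all parameter values (pressure deficit, typhoon radius, Coriolis parameter) are illustrative assumptions, not the paper's fitted model for Chaba:

```python
import numpy as np

def fujita_gradient_wind(r, dp=4000.0, R=60e3, rho=1.2, f=5.8e-5):
    """Gradient wind speed (m/s) at radius r (m) from the storm center,
    derived from one common form of the Fujita pressure profile,
        P(r) = P_inf - dp / sqrt(1 + 2 (r/R)^2),
    via gradient wind balance: V = -f r/2 + sqrt((f r/2)^2 + (r/rho) dP/dr).
    dp is the central pressure deficit (Pa), R a scale radius (m)."""
    dpdr = 2.0 * dp * r / R ** 2 * (1.0 + 2.0 * (r / R) ** 2) ** -1.5
    return -f * r / 2.0 + np.sqrt((f * r / 2.0) ** 2 + (r / rho) * dpdr)
```

The profile gives zero wind at the center, a maximum near the scale radius, and a slow decay outward; blending such a parametric field with reanalysis winds (here, CMEMS data) fills in the far-field structure the parametric model lacks.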
The ocean bottom seismograph (OBS) is a powerful device deployed on the seafloor for acquiring marine seismic data, capable of probing the Earth's interior at multiple scales, from submarine sediments to the mantle. Because OBSs are usually deployed by free fall, it is challenging to track their precise positions. In addition, the internal crystal-oscillator clock of an OBS has limited accuracy, resulting in clock drift during long-term operation on the seabed. To improve the reliability of OBS observations, it is therefore crucial to determine a precise OBS location and time correction. This study develops a positioning method that integrates time correction, based on the Markov Chain Monte Carlo (MCMC) algorithm, utilizing travel times of direct water waves triggered by two-dimensional (2-D) shot lines or three-dimensional (3-D) airgun arrays. The method simultaneously estimates the OBS location and time correction, incorporating bathymetric data into the inversion to improve sampling efficiency and enhance the reliability of the final results. Synthetic tests with appropriate noise levels confirm that the method is robust enough to determine OBS location and time correction precisely. Finally, we use travel-time data recorded at three OBSs deployed on the Southwest Indian Ridge to relocate the instruments and calculate time corrections. The results are highly consistent between the 2-D and 3-D shot data, indicating that high-resolution bathymetric data play a key role in the inversion for precise OBS location and time correction.
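The core inversion idea, sampling OBS position and clock correction jointly from direct-water-wave travel times with a Metropolis-type MCMC, can be sketched on synthetic data. Everything here (shot geometry, flat seabed, noise level, proposal step sizes) is an illustrative assumption, and the paper's bathymetry-aided sampling is omitted:

```python
import numpy as np

rng = np.random.default_rng(1)

def travel_time(p, shots, depth=2000.0, v=1500.0):
    """Direct-water-wave travel time from each shot (x, y at the surface)
    to an OBS at horizontal position (p[0], p[1]) on a flat seabed at
    `depth`, plus a clock correction p[2] (seconds)."""
    x, y, dt = p
    d = np.sqrt((shots[:, 0] - x) ** 2 + (shots[:, 1] - y) ** 2 + depth ** 2)
    return d / v + dt

# Synthetic 3-D airgun grid; the "true" OBS drifted to (150, -80) m
# with a 0.2 s clock error. All numbers are made up for the sketch.
gx, gy = np.meshgrid(np.linspace(-3000, 3000, 7), np.linspace(-3000, 3000, 7))
shots = np.column_stack([gx.ravel(), gy.ravel()])
true_p = np.array([150.0, -80.0, 0.2])
obs_t = travel_time(true_p, shots) + rng.normal(0.0, 0.002, len(shots))

def metropolis(n=20000, sigma=0.002):
    """Plain Metropolis sampling over (x, y, dt): propose a Gaussian step,
    accept with probability min(1, exp(delta log-likelihood))."""
    def logl(q):
        r = obs_t - travel_time(q, shots)
        return -0.5 * np.sum((r / sigma) ** 2)
    p, lp, samples = np.zeros(3), None, []
    lp = logl(p)
    for _ in range(n):
        q = p + rng.normal(0.0, [5.0, 5.0, 0.001])
        lq = logl(q)
        if np.log(rng.random()) < lq - lp:
            p, lp = q, lq
        samples.append(p)
    return np.array(samples)

est = metropolis()[5000:].mean(axis=0)   # posterior mean after burn-in
```

Because location and clock error trade off against each other in the travel times, estimating them jointly, as the chain does here, avoids the bias that fixing one while solving for the other would introduce.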
Data availability is of vital importance for marine and oceanographic research, but most European data are fragmented, not always validated, and not easily accessible. In the countries bordering the European seas, more than 1000 scientific laboratories from governmental organisations and private industry collect data using various sensors on board research vessels, submarines, fixed and drifting platforms, aeroplanes, and satellites to measure physical, geophysical, geological, biological, and chemical parameters, biological species, and more. SeaDataNet is an Integrated Research Infrastructure Initiative (I3) (2006-2011) in the EU FP6 framework programme. It is developing an efficient, distributed pan-European marine data management infrastructure for these large and diverse data sets, interconnecting the existing professional data centres of 35 countries active in data collection and providing integrated databases of standardised quality on-line. This article describes the architecture and features of the SeaDataNet infrastructure, in particular how interoperability is achieved between all the contributing data centres. Finally, it highlights ongoing developments and challenges.
This paper presents a framework containing ten components to deliver a data management process for the storage and management of data used for Marine Spatial Planning (MSP) in Ireland. The work includes a data process flow and a recommended solution architecture, comprising a central data catalogue and a spatial storage system. The components of the process are presented to maximise the reuse potential of any dataset within an MSP context. The terms 'Suitability' and 'Readiness' in the MSP context are offered as formal, considered assessments of data, as is the applicability of a data stewardship maturity matrix. How data contained in such a storage system can be published externally to potential consumers is also explored. The process provides a means of managing data and metadata so that data lineage is optimised by carrying information about the origin of, and the processing applied to, the data, and so that the quality and relevance of geospatial datasets for MSP decisions in Ireland can be evaluated. The process was piloted in the National Marine Planning Framework for Ireland in the development of draft map products; feedback from the public consultation is ongoing and not presented here.
Marine big data are characterized by large volume and complex structure, which pose great challenges to data management and retrieval. Based on the GeoSOT grid code and the composite index structure of the MongoDB database, this paper proposes a spatio-temporal grid index model (STGI) for efficient, optimized querying of marine big data. A spatio-temporal secondary index is created on the spatial-code and time-code columns to build a composite index in the MongoDB database used to store massive marine data. Comparative experiments demonstrate that retrieval with the STGI approach is two to three times faster than with other index models. Through theoretical analysis and experimental verification, we conclude that the STGI model is well suited to retrieving large-scale spatial data with low time frequency, such as marine big data.
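The spirit of a GeoSOT-style composite key (a space-filling-curve spatial code paired with a time code, so that spatially and temporally close records sort near each other in an index) can be sketched with a plain Morton interleave. The real GeoSOT coding and the MongoDB compound-index definition are more elaborate than this:

```python
def morton2(x, y, bits=16):
    """Bit-interleave two non-negative grid indices into one Z-order
    (Morton) code: nearby cells receive numerically close codes, which
    is the idea behind space-filling-curve spatial keys like GeoSOT's."""
    code = 0
    for i in range(bits):
        code |= ((x >> i) & 1) << (2 * i) | ((y >> i) & 1) << (2 * i + 1)
    return code

def spacetime_key(lon, lat, t_hours, cell=0.01):
    """Composite (time code, space code) key, mirroring a secondary index
    built on time and space columns in a document store. `cell` is the
    grid resolution in degrees; all choices here are illustrative."""
    x = int((lon + 180.0) / cell)
    y = int((lat + 90.0) / cell)
    return (int(t_hours), morton2(x, y))
```

A compound index on such a key answers "this region, this time window" queries with a small number of contiguous range scans instead of a full collection scan.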
The ocean is a critical part of the global ecosystem, and the balance of marine ecosystems is crucial for human survival and sustainable development. However, under the impacts of global climate change and human activities, the ocean is changing rapidly, posing an enormous threat to human health and the economy. "Conserve and sustainably use the oceans, seas and marine resources" is one of the 17 Sustainable Development Goals (SDGs). It is therefore urgent to construct transformative marine scientific solutions to promote sustainable development. Marine data are the basis of ocean cognition and governance, and marine science has ushered in the era of big data with continuous advances in modern marine data acquisition. While big data provides abundant material for SDG research, it also brings unprecedented challenges. This study introduces the overall framework of a system for solving the current problems of marine data serving the SDGs, from the perspective of marine data management and application. It also illustrates how the system supports the SDGs through two application cases: managing fragmented marine data and developing global climate change data products.
Funding (Marine XML study): Ocean University of China Research Initiation Grant; the National 908 Project 'Marine Information Exchange and Integration Technology Based on XML' (No. 908-03-01-07).
Funding (marine data visualization study): the National Natural Science Foundation of China (Project 41076115); the Global Change Research Program of China (Project 2012CB955603); the Public Science and Technology Research Funds of the Ocean (Project 201005019).
Funding (integrative visualization study): the Knowledge Innovation Program of the Chinese Academy of Sciences (No. KZCX1-YW-12-04); the National High Technology Research and Development Program of China (863 Program) (Nos. 2009AA12Z148, 2007AA092202); additional support from the Institute of Geographic Sciences and Natural Resources Research, Chinese Academy of Sciences (IGSNRR, CAS) and the Institute of Oceanology, CAS.
Funding (MRSSDE study): the National 863 High-Tech Program of China (No. 2007AA12Z237); the Natural Science Foundation of China (No. 40571123).
Funding (Uformer de-blending study): the National Natural Science Foundation of China (Research on Dynamic Location of Receiving Points and Wave Field Separation Technology Based on Deep Learning in OBN Seismic Exploration, No. 42074140); the Sinopec Geophysical Corporation, Project of OBC/OBN Seismic Data Wave Field Characteristics Analysis and Ghost Wave Suppression (No. SGC-202206).
Funding (Typhoon Chaba study): the National Key Research and Development Program of China (Nos. 2021YFC3101801, 2023YFC3008200); the National Natural Science Foundation of China (Nos. 42476219, 41976200); the National Foreign Experts Program (No. S20240134); the Innovative Team Plan of the Department of Education of Guangdong Province (No. 2023KCXTD015); the Tropical Ocean Environment in Western Coastal Waters Observation and Research Station of Guangdong Province (No. 2024B1212040008); the Independent Research Project of the Southern Ocean Laboratory (No. SML2022SP301); the Shandong Innovation and Development Research Institute Think Tank Project; the Guangdong Ocean University Scientific Research Program (No. 060302032106); the Start-up Fund for PhD Researchers (No. 060302032104).
Funding (OBS positioning study): the National Key Research and Development Program of China (No. 2021YFC3101404); the National Natural Science Foundation of China (Nos. 42106068, 42376052, 42276064, 42276075); the Foundation of the State Key Laboratory of Submarine Geoscience (No. sglkfkt2025-2); the Zhejiang Provincial Natural Science Foundation of China (No. LZ23D060004).
Funding (Irish MSP study): the Irish Government and the European Maritime & Fisheries Fund, as part of the EMFF Operational Programme for 2014-2020.
Funding (STGI study): the National Key Research and Development Plan (2018YFB0505300); the Guangxi Science and Technology Major Project (AA18118025); the Opening Foundation of the Key Laboratory of Environment Change and Resources Use in Beibu Gulf, Ministry of Education (Nanning Normal University); the Guangxi Key Laboratory of Earth Surface Processes and Intelligent Simulation (Nanning Normal University) (No. NNNU-KLOP-K1905).
Funding (SDG study): the Strategic Priority Research Program of the Chinese Academy of Sciences (XDA19060101, XDA19060104, XDB42040401); the National Key R&D Program of China (2017YFA0603201); the Youth Innovation Promotion Association of the Chinese Academy of Sciences; the Key R&D Project of Shandong Province (2019JZZY010102); the Key Deployment Project of the Center for Ocean Mega-Science, CAS (COMS2019R02); the Chinese Academy of Sciences (Y9KY04101L); the National Natural Science Foundation of China (U2006211).