Journal Articles
41 articles found
1. ARCHITECTURE OF DYNAMIC DATA DRIVEN SIMULATION FOR WILDFIRE AND ITS REALIZATION
Authors: 燕雪峰, 胡小林, 古锋, 郭松. Transactions of Nanjing University of Aeronautics and Astronautics, EI, 2010, Issue 2, pp. 190-197 (8 pages)
Dynamic data driven simulation (DDDS) is proposed to improve a model by incorporating real data from the practical system into the model. Instead of giving a static input, multiple possible sets of inputs are fed into the model, and the computational errors are corrected using statistical approaches. DDDS involves a variety of aspects, including uncertainty modeling, measurement evaluation, coupling of the system model and the measurement model, computational complexity, and performance issues. The authors set up the architecture of DDDS for the wildfire spread model DEVS-FIRE, based on the discrete event specification (DEVS) formalism. The experimental results show that the framework can track the dynamically changing fire front based on fire sensor data and thus provides more accurate predictions.
Keywords: state estimation; dynamic systems; DEVS-FIRE; dynamic data driven application system (DDDAS)
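The assimilation loop described in this abstract — feeding multiple candidate input sets into the model and correcting errors statistically as sensor data arrive — is essentially sequential Monte Carlo state estimation. Below is a minimal, generic sketch of one such step; the scalar fire-front transition model, the Gaussian likelihood, and all parameter values are illustrative stand-ins, not the DEVS-FIRE model itself.

```python
import numpy as np

def particle_filter_step(particles, weights, sensor_obs, obs_std=1.0, proc_std=0.5):
    """One assimilation step: propagate an ensemble of candidate states,
    reweight by how well each explains the sensor reading, then resample."""
    # Propagate each particle through a (toy) fire-spread transition model.
    particles = particles + 1.0 + np.random.normal(0.0, proc_std, size=particles.shape)
    # Reweight by the measurement likelihood of the new sensor observation.
    likelihood = np.exp(-0.5 * ((sensor_obs - particles) / obs_std) ** 2)
    weights = weights * likelihood
    weights /= weights.sum()
    # Resample to avoid weight degeneracy.
    idx = np.random.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

if __name__ == "__main__":
    particles = np.random.normal(0.0, 2.0, size=500)   # candidate fire-front positions
    weights = np.full(500, 1.0 / 500)
    for t, obs in enumerate([1.2, 2.1, 3.3, 3.9]):      # simulated sensor readings
        particles, weights = particle_filter_step(particles, weights, obs)
        print(f"t={t}: estimated front position = {particles.mean():.2f}")
```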
2. A Novel Robust Nonlinear Dynamic Data Reconciliation (Cited by: 4)
Authors: 高倩, 阎威武, 邵惠鹤. Chinese Journal of Chemical Engineering, SCIE EI CAS CSCD, 2007, Issue 5, pp. 698-702 (5 pages)
An outlier in one variable will smear the estimation of other measurements in data reconciliation (DR). In this article, a novel robust method is proposed for nonlinear dynamic data reconciliation to reduce the influence of outliers on the result of DR. The method introduces a penalty function matrix into a conventional least-squares objective function to assign small weights to outliers and large weights to normal measurements. To avoid the loss of data information, an element-wise Mahalanobis distance is proposed, as an improvement on the vector-wise distance, to construct the penalty function matrix. The correlation of measurement errors is also considered. By constructing the penalty weight matrix, the method introduces robust statistical theory into the conventional least-squares estimator and achieves not only good robustness but also simple calculation. Simulation of a continuous stirred tank reactor verifies the effectiveness of the proposed algorithm.
Keywords: nonlinear dynamic data reconciliation; robust M-estimator; outlier; optimization
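A minimal sketch of the general idea — down-weighting outliers in a least-squares estimate via M-estimator penalty weights — is given below. It uses a Huber-style weight function and a single redundant measurement set for illustration; the paper's element-wise Mahalanobis penalty matrix and the nonlinear dynamic process constraints are not reproduced.

```python
import numpy as np

def huber_weights(residuals, sigma, c=1.345):
    """M-estimator weights: ~1 for normal residuals, shrinking for outliers."""
    r = np.abs(residuals) / sigma
    w = np.ones_like(r)
    mask = r > c
    w[mask] = c / r[mask]
    return w

def robust_reconcile(y, sigma, n_iter=10):
    """Iteratively reweighted estimate of a common true value from redundant
    measurements y; the penalty weights suppress the influence of outliers."""
    x = np.median(y)                      # robust starting point
    for _ in range(n_iter):
        w = huber_weights(y - x, sigma)
        x = np.sum(w * y / sigma**2) / np.sum(w / sigma**2)
    return x, w

if __name__ == "__main__":
    measurements = np.array([10.1, 9.9, 10.2, 10.0, 14.5])   # last value is an outlier
    sigma = np.full(5, 0.2)
    est, weights = robust_reconcile(measurements, sigma)
    print("reconciled value:", round(est, 3), "weights:", np.round(weights, 2))
```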
3. Differential Privacy Preserving Dynamic Data Release Scheme Based on Jensen-Shannon Divergence (Cited by: 3)
Authors: Ying Cai, Yu Zhang, Jingjing Qu, Wenjin Li. China Communications, SCIE CSCD, 2022, Issue 6, pp. 11-21 (11 pages)
Health monitoring data or data about infectious diseases such as COVID-19 may need to be constantly updated and dynamically released, but they may contain users' sensitive information. Thus, how to preserve users' privacy before release is critically important yet challenging. Differential Privacy (DP) is well known to provide effective privacy protection, and dynamic DP-preserving data release schemes were designed to publish a histogram meeting the DP guarantee. Unfortunately, such schemes may result in high cumulative errors and lower data availability. To address this problem, in this paper we apply Jensen-Shannon (JS) divergence to design an OPTICS (Ordering Points To Identify The Clustering Structure)-based scheme. It uses JS divergence to measure the difference between the updated data set at the current release time and the private data set at the previous release time. The difference is compared with a threshold, and only when the difference is greater than the threshold do we apply OPTICS to publish a DP-protected data set. Our experimental results show that the absolute errors and average relative errors are significantly lower than those of existing works.
Keywords: differential privacy; dynamic data release; Jensen-Shannon divergence
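A minimal sketch of the release-gating step described above: compute the JS divergence between the previous and current histograms and publish a Laplace-noised histogram only when the divergence exceeds a threshold. The function names, threshold value, and plain Laplace mechanism are illustrative assumptions; the paper's OPTICS-based clustering step is not reproduced here.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    """Jensen-Shannon divergence between two normalized histograms (base e)."""
    p = p / p.sum()
    q = q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log((a + eps) / (b + eps)))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def maybe_release(prev_hist, curr_hist, epsilon=1.0, threshold=0.05, rng=None):
    """Publish a Laplace-noised histogram only if the data changed enough."""
    rng = rng or np.random.default_rng()
    if js_divergence(prev_hist, curr_hist) <= threshold:
        return None                                  # re-use the previous release
    noise = rng.laplace(scale=1.0 / epsilon, size=curr_hist.shape)
    return np.clip(curr_hist + noise, 0, None)       # DP histogram release

if __name__ == "__main__":
    prev = np.array([120.0, 80.0, 40.0, 10.0])
    curr = np.array([122.0, 79.0, 41.0, 10.0])       # barely changed -> suppressed
    print(maybe_release(prev, curr))
    curr2 = np.array([60.0, 150.0, 70.0, 25.0])      # changed a lot -> released
    print(maybe_release(prev, curr2))
```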
4. Dynamic data packing towards the optimization of QoC and QoS in networked control systems (Cited by: 5)
Authors: KANG Yu, ZHAO YunBo. Science China (Technological Sciences), SCIE EI CAS CSCD, 2016, Issue 1, pp. 72-80 (9 pages)
A class of networked control systems whose communication network is shared with other applications is investigated. The design objective for such a system setting is not only the optimization of the control performance but also the efficient utilization of the communication resources. We observe that at a large time scale the data packet delay in the communication network is roughly piecewise constant, which is typically true for data networks like the Internet. Based on this observation, a dynamic data packing scheme is proposed within the recently developed packet-based control framework for networked control systems. The proposed approach achieves a fine balance between control performance and communication utilization: similar control performance can be obtained at a dramatically reduced cost in communication resources. Simulations illustrate the effectiveness of the proposed approach.
Keywords: networked control systems; packet delay variation; dynamic data packing; quality of control; quality of service
5. A Statistical Comparison Method of the Differences among Single Points for Linear Dynamic Experimental Data
Authors: XU Peng-yun, XU Chun-tao. Journal of Northeast Agricultural University (English Edition), CAS, 2000, Issue 2, pp. 109-112 (4 pages)
The experimental random error and expected values of non-observed points in dynamic indexes were estimated by establishing linear regression equations for the variation of the dynamic indexes. Methods for significance testing of the differences among treatments, using dynamic points as indexes, are presented without requiring replication at each observed dynamic point.
Keywords: linear dynamic data; dynamic point; non-replication observation
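As a minimal sketch of the underlying idea — fit a linear trend to the observed dynamic points, use the residuals to estimate the experimental random error, and predict the expected value at a non-observed point — the data and the simple least-squares fit below are illustrative, not the paper's test procedure.

```python
import numpy as np

# Observed dynamic index at a few time points (illustrative data).
t = np.array([1.0, 2.0, 4.0, 5.0, 7.0])
y = np.array([2.1, 3.9, 8.2, 9.8, 14.1])

# Fit a linear regression y ~ a*t + b.
a, b = np.polyfit(t, y, deg=1)

# Residual standard error estimates the experimental random error.
residuals = y - (a * t + b)
s = np.sqrt(np.sum(residuals**2) / (len(t) - 2))

# Predict the expected value at a non-observed dynamic point, e.g. t = 3.
t_new = 3.0
print(f"fit: y = {a:.3f}*t + {b:.3f}, residual std = {s:.3f}")
print(f"estimated value at t={t_new}: {a * t_new + b:.3f}")
```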
6. Improved Fair and Dynamic Provable Data Possession Supporting Public Verification (Cited by: 2)
Authors: REN Zhengwei, WANG Lina, DENG Ruyi, YU Rongwei. Wuhan University Journal of Natural Sciences, CAS, 2013, Issue 4, pp. 348-354 (7 pages)
A number of proposals have been suggested to tackle data integrity and privacy concerns in cloud storage, and some existing schemes suffer from vulnerabilities in data dynamics. In this paper, we propose an improved fair and dynamic provable data possession scheme that supports public verification and batch auditing while preserving data privacy. The rb23Tree is utilized to facilitate data dynamics. Moreover, fairness is considered to prevent a dishonest user from accusing the cloud service provider of manipulating the data. The scheme allows a third-party auditor (TPA) to verify data integrity without learning any information about the data content during the auditing process. Furthermore, our scheme also supports batch auditing, which greatly accelerates the auditing process when there are multiple auditing requests. Security analysis and extensive experimental evaluations show that our scheme is secure and efficient.
Keywords: cloud storage; data integrity; public audit; data dynamics; privacy protection
7. Reducing Computational and Communication Complexity for Dynamic Provable Data Possession
Authors: 刘妃妃, 谷大武, 陆海宁, 龙斌, 李晓晖. China Communications, SCIE CSCD, 2011, Issue 6, pp. 67-75 (9 pages)
Nowadays, an increasing number of people choose to outsource their computing and storage demands to the Cloud. In order to ensure the integrity of data in the untrusted Cloud, especially dynamic files that can be updated online, we propose an improved dynamic provable data possession model. We use homomorphic tags to verify the integrity of the file and use hash values generated from secret values and tags to prevent replay and forgery attacks. Compared with previous works, our proposal reduces the computational and communication complexity from O(log n) to O(1). We performed experiments to confirm this improvement and extended the model to the file-sharing setting.
Keywords: cloud computing; proofs of storage; dynamic provable data possession; file sharing
8. Evaluating the resource management and profitability efficiencies of US commercial banks from a dynamic network perspective
Authors: Qian Long Kweh, Wen-Min Lu, Kaoru Tone, Hsian-Ming Liu. Financial Innovation, 2024, Issue 1, pp. 4081-4100 (20 pages)
The central concept of strategic benchmarking is resource management efficiency, which ultimately results in profitability. However, little is known about performance measurement from resource-based perspectives. This study uses a data envelopment analysis (DEA) model with a dynamic network structure to measure the resource management and profitability efficiencies of 287 US commercial banks from 2010 to 2020. Furthermore, we provide frontier projections and incorporate five variables, namely capital adequacy, asset quality, management quality, earning ability, and liquidity (i.e., the CAMEL ratings). The results reveal that the room for improvement in bank performance is 55.4%. In addition, we find that the CAMEL ratings of efficient banks are generally higher than those of inefficient banks, and that management quality, earnings quality, and liquidity ratios contribute positively to bank performance. Moreover, big banks are generally more efficient than small banks. Overall, this study contributes to the current debate on performance measurement in the banking industry, with a particular focus on applying DEA to answer the fundamental question of why resource management efficiency reflects benchmark firms, and it provides insights into how efficient management of the CAMEL ratings would help improve bank performance.
Keywords: performance evaluation; dynamic network data envelopment analysis; CAMEL ratings; resource management efficiency; profitability efficiency
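For readers unfamiliar with DEA, the sketch below computes a basic input-oriented CCR efficiency score for each unit with a linear program. The bank data are made up, and this is the plain single-stage CCR model, not the dynamic network DEA variant used in the study.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(inputs, outputs, k):
    """Input-oriented CCR DEA efficiency of unit k.
    inputs: (m, n) array, outputs: (s, n) array, n = number of units."""
    m, n = inputs.shape
    s = outputs.shape[0]
    c = np.zeros(n + 1)
    c[0] = 1.0                                  # minimize theta
    A_ub, b_ub = [], []
    for i in range(m):                          # sum_j lam_j * x_ij <= theta * x_ik
        A_ub.append(np.concatenate(([-inputs[i, k]], inputs[i, :])))
        b_ub.append(0.0)
    for r in range(s):                          # sum_j lam_j * y_rj >= y_rk
        A_ub.append(np.concatenate(([0.0], -outputs[r, :])))
        b_ub.append(-outputs[r, k])
    bounds = [(0, None)] * (n + 1)
    res = linprog(c, A_ub=np.array(A_ub), b_ub=np.array(b_ub),
                  bounds=bounds, method="highs")
    return res.x[0]

if __name__ == "__main__":
    # Toy data: 4 banks, inputs = (staff, capital), outputs = (loans, profit).
    X = np.array([[20.0, 30.0, 40.0, 25.0],
                  [300.0, 200.0, 500.0, 350.0]])
    Y = np.array([[100.0, 90.0, 160.0, 110.0],
                  [12.0, 10.0, 20.0, 9.0]])
    for k in range(4):
        print(f"bank {k}: CCR efficiency = {ccr_efficiency(X, Y, k):.3f}")
```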
9. Mode of Operation for Modification, Insertion, and Deletion of Encrypted Data
Authors: Taek-Young Youn, Nam-Su Jho. Computers, Materials & Continua, SCIE EI, 2022, Issue 10, pp. 151-164 (14 pages)
Due to the development of 5G communication, many aspects of information technology (IT) services are changing. With the development of communication technologies such as 5G, it has become possible to provide IT services that were difficult to provide in the past. One of the services made possible through this change is cloud-based collaboration. In order to support secure collaboration over the cloud, encryption technology that can securely manage dynamic data is essential. However, since existing encryption technology is not suitable for encrypting dynamic data, a new technology that can encrypt dynamic data is required for secure cloud-based collaboration. In this paper, we propose a new encryption technology to support secure collaboration on dynamic data in the cloud. Specifically, we propose an encryption mode of operation that supports data updates such as modification, insertion, and deletion of encrypted data while it remains encrypted. To support dynamic updates of encrypted data, we devise a new mode-of-operation technique named linked-block cipher (LBC). The basic idea of our work is to use an updatable random value, a so-called link, to link two encrypted blocks. Owing to the use of updatable random link values, we can modify, insert, and delete encrypted data without decrypting it.
Keywords: data encryption; cloud-based collaboration; dynamic data update
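To make the "link" idea concrete, here is a toy sketch of one plausible reading of linked-block encryption: each block carries a fresh random link value, and each ciphertext binds its own link to the next block's link, so inserting or deleting a block only touches its immediate neighbor. This is an illustrative interpretation built on AES-GCM from the `cryptography` package, not the LBC construction defined in the paper.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

KEY = AESGCM.generate_key(bit_length=128)
aead = AESGCM(KEY)

def encrypt_block(block: bytes, own_link: bytes, next_link: bytes) -> dict:
    """Encrypt one block, binding it to its own link and the next block's link."""
    nonce = os.urandom(12)
    ct = aead.encrypt(nonce, block, own_link + next_link)   # links as associated data
    return {"nonce": nonce, "ct": ct, "own_link": own_link, "next_link": next_link}

def encrypt_blocks(blocks):
    links = [os.urandom(16) for _ in range(len(blocks) + 1)]
    return [encrypt_block(b, links[i], links[i + 1]) for i, b in enumerate(blocks)]

def decrypt_block(rec: dict) -> bytes:
    return aead.decrypt(rec["nonce"], rec["ct"], rec["own_link"] + rec["next_link"])

def insert_block(chain, index, new_block):
    """Insert a block: only the predecessor must be re-encrypted to point at it."""
    new_link = os.urandom(16)
    nxt = chain[index]["own_link"] if index < len(chain) else os.urandom(16)
    chain.insert(index, encrypt_block(new_block, new_link, nxt))
    if index > 0:
        prev = decrypt_block(chain[index - 1])
        chain[index - 1] = encrypt_block(prev, chain[index - 1]["own_link"], new_link)
    return chain

if __name__ == "__main__":
    chain = encrypt_blocks([b"block-A", b"block-C"])
    chain = insert_block(chain, 1, b"block-B")
    print([decrypt_block(r) for r in chain])   # [b'block-A', b'block-B', b'block-C']
```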
10. A Data Assured Deletion Scheme in Cloud Storage (Cited by: 7)
Authors: LI Chaoling, CHEN Yue, ZHOU Yanzhou. China Communications, SCIE CSCD, 2014, Issue 4, pp. 98-110 (13 pages)
In order to provide a practicable solution to data confidentiality in cloud storage services, a data assured deletion scheme is proposed that achieves fine-grained access control, resistance to hopping and sniffing attacks, data dynamics, and deduplication. In our scheme, data blocks are encrypted by a two-level encryption approach, in which the control keys are generated from a key derivation tree, encrypted by an All-Or-Nothing algorithm, and then distributed into a DHT network after being partitioned by secret sharing. This guarantees that only authorized users can recover the control keys and then decrypt the outsourced data within an owner-specified data lifetime. Besides confidentiality, data dynamics and deduplication are also achieved, respectively, by adjustment of the key derivation tree and by convergent encryption. The analysis and experimental results show that our scheme satisfies its security goals and performs assured deletion at low cost.
Keywords: cloud storage; data confidentiality; secure data assured deletion; data dynamics
11. Towards Comprehensive Provable Data Possession in Cloud Computing (Cited by: 1)
Authors: LI Chaoling, CHEN Yue, TAN Pengxu, YANG Gang. Wuhan University Journal of Natural Sciences, CAS, 2013, Issue 3, pp. 265-271 (7 pages)
To check remote data integrity in cloud computing, we have proposed an efficient, fully dynamic provable data possession (PDP) scheme that uses an SN (serial number)-BN (block number) table to support data block updates. In this article, we first analyze and test its performance in detail. The results show that our scheme is efficient, with low computation, storage, and communication costs. Then, we discuss how to extend the dynamic scheme to support other features, including public auditability, privacy preservation, fairness, and multiple-replica checking. With these extensions, a comprehensive PDP scheme that has high efficiency and satisfies all main requirements is provided.
Keywords: cloud computing; provable data possession; data dynamics; SN-BN table
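A rough sketch of how an index table in the spirit of an SN-BN table can support block-level updates without renumbering: the serial number is simply the position in the table, and each entry records the physical block number. The version counter and all method names below are assumptions for illustration, not necessarily part of the paper's table design.

```python
class SnBnTable:
    """Maps logical serial numbers (positions) to physical block numbers."""
    def __init__(self, n_blocks: int):
        self.entries = [{"bn": i, "version": 0} for i in range(n_blocks)]
        self.next_bn = n_blocks                      # next unused physical block number

    def modify(self, sn: int):
        """Block content changed: keep its BN, bump the version."""
        self.entries[sn]["version"] += 1

    def insert(self, sn: int):
        """Insert a new block before serial number sn; later SNs shift implicitly."""
        self.entries.insert(sn, {"bn": self.next_bn, "version": 0})
        self.next_bn += 1

    def delete(self, sn: int):
        self.entries.pop(sn)

    def lookup(self, sn: int) -> dict:
        return self.entries[sn]

if __name__ == "__main__":
    table = SnBnTable(4)          # blocks 0..3
    table.insert(2)               # insert a new block as the third block
    table.modify(0)               # first block was rewritten
    table.delete(4)               # delete what is now the fifth block
    print(table.entries)
```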
12. Decision Model of Knowledge Transfer in Big Data Environment (Cited by: 7)
Authors: Chuanrong Wu, Yingwu Chen, Feng Li. China Communications, SCIE CSCD, 2016, Issue 7, pp. 100-107 (8 pages)
A decision model of knowledge transfer is presented on the basis of the characteristics of knowledge transfer in a big data environment. This model can determine the weight of knowledge transferred from another enterprise or from a big data provider. Numerous simulation experiments are implemented to test the efficiency of the optimization model. The simulation results show that as the weight of knowledge from the big data provider increases, the total discounted expected profit increases and the transfer cost is reduced. The calculated results are in accordance with the actual economic situation. The optimization model can provide useful decision support for enterprises in a big data environment.
Keywords: big data; knowledge transfer; optimization; simulation; dynamic network
13. Agglomeration and Employment Density: Test Based on Panel Data of Prefecture-Level Cities of China
Authors: Gang WU, Xiuchuan XU. Asian Agricultural Research, 2016, Issue 8, pp. 8-11, 17 (5 pages)
Related factors for measuring the urban agglomeration effect were studied first. Then, panel data of 283 prefecture-level cities of China were collected to analyze the effect of agglomeration on employment density. A fixed-effects model was applied to analyze the static panel data, and a two-step generalized method of moments (GMM) estimator was employed to analyze the dynamic panel data. The results reveal that per capita regional GDP, public medical care level, and population mobility have a significant effect on employment density. Therefore, an agglomeration economy effect exists in prefecture-level cities of China at the current stage.
Keywords: agglomeration effect; employment density; dynamic panel data
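A minimal sketch of the static fixed-effects step — the within transformation (demeaning by city) followed by OLS — is shown below. The city panel is random toy data, and the dynamic two-step GMM estimator used in the paper is not reproduced.

```python
import numpy as np

def within_fixed_effects(y, X, groups):
    """Fixed-effects (within) estimator: demean y and X by group, then OLS."""
    y_d = y.astype(float).copy()
    X_d = X.astype(float).copy()
    for g in np.unique(groups):
        idx = groups == g
        y_d[idx] -= y_d[idx].mean()
        X_d[idx] -= X_d[idx].mean(axis=0)
    beta, *_ = np.linalg.lstsq(X_d, y_d, rcond=None)
    return beta

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_cities, n_years = 50, 10
    groups = np.repeat(np.arange(n_cities), n_years)
    city_effect = np.repeat(rng.normal(0, 1, n_cities), n_years)
    X = rng.normal(0, 1, (n_cities * n_years, 2))   # e.g. log GDP per capita, mobility
    y = 0.8 * X[:, 0] + 0.3 * X[:, 1] + city_effect + rng.normal(0, 0.1, n_cities * n_years)
    print("within-estimator coefficients:", np.round(within_fixed_effects(y, X, groups), 3))
```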
14. Data Fusion about Serviceability Reliability Prediction for the Long-Span Bridge Girder Based on MBDLM and Gaussian Copula Technique
Authors: Xueping Fan, Guanghong Yang, Zhipeng Shang, Xiaoxiong Zhao, Yuefei Liu. Structural Durability & Health Monitoring, EI, 2021, Issue 1, pp. 69-83 (15 pages)
This article presents a new data fusion approach for reasonably predicting the dynamic serviceability reliability of a long-span bridge girder. Firstly, a multivariate Bayesian dynamic linear model (MBDLM) considering the dynamic correlation among multiple variables is provided to predict dynamic extreme deflections; secondly, with the proposed MBDLM, the dynamic correlation coefficients between any two performance functions can be predicted; finally, based on the MBDLM and the Gaussian copula technique, a new data fusion method is given to predict the serviceability reliability of the long-span bridge girder, and monitoring extreme deflection data from an actual bridge are provided to illustrate the feasibility and application of the proposed method.
Keywords: dynamic extreme deflection data; serviceability reliability prediction; structural health monitoring; multivariate Bayesian dynamic linear models; Gaussian copula technique
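To illustrate the copula-fusion step only: given two marginal serviceability probabilities and a correlation coefficient between the two performance functions, a Gaussian copula yields the joint probability that both limit states are satisfied. The numbers and the two-component system below are illustrative assumptions; the MBDLM prediction step is not shown.

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def joint_serviceability(p1, p2, rho):
    """P(both performance functions satisfied) under a Gaussian copula with
    correlation rho between the two standard-normal-transformed margins."""
    z = [norm.ppf(p1), norm.ppf(p2)]
    cov = [[1.0, rho], [rho, 1.0]]
    return multivariate_normal.cdf(z, mean=[0.0, 0.0], cov=cov)

if __name__ == "__main__":
    p1, p2 = 0.995, 0.990          # marginal serviceability probabilities (illustrative)
    for rho in (0.0, 0.5, 0.9):    # predicted dynamic correlation coefficient
        print(f"rho={rho}: joint serviceability = {joint_serviceability(p1, p2, rho):.5f}")
```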
15. Simulation and experiment of starting transient flow field of the hydrostatic bearing based on dynamic mesh method
Authors: 张艳芹, Kong Xiangbin, Guo Lili, Yu Xiaodong, Dai Chunxi, Shao Junpeng. High Technology Letters, EI CAS, 2017, Issue 3, pp. 298-305 (8 pages)
A new method is developed to assess and analyze the dynamic performance of the hydrostatic bearing oil film by using a multi-layer dynamic mesh technique. It is implemented in the C language by compiling a UDF program for a single oil film of the hydrostatic bearing. The effects of key lubrication parameters of the hydrostatic bearing are evaluated and analyzed under various working conditions, i.e., under no load, a load of 40 t, and a full load of 160 t, and at rotation speeds of 1 r/min, 2 r/min, 4 r/min, 8 r/min, 16 r/min, and 32 r/min. Transient data of the oil film bearing capacity under different loads and rotation speeds are acquired for a total of 18 working conditions as the oil film thickness changes. This allows effective prediction of the dynamic performance of large hydrostatic bearings. Experiments on the hydrostatic bearing oil film have been performed, and the results were used to define the boundary conditions for the numerical simulations and to validate the developed numerical model. The results show that the oil film thickness becomes thinner with increasing operating time of the hydrostatic bearing, both the oil film rigidity and the oil cavity pressure increase significantly, and the increase of the bearing capacity is inversely proportional to the cube of the change in film thickness. Meanwhile, the effect of the load condition on the carrying capacity of a large hydrostatic bearing is more important than that of the speed condition. The error between the simulated and experimental values was 4.25%.
Keywords: hydrostatic bearing; dynamic mesh; transient data; oil pad; UDF
16. Special Issue for “AI+BT for Big Clinical Omics Data”
Genomics, Proteomics & Bioinformatics, 2025, Issue 1, p. I0018 (1 page)
The journal Genomics, Proteomics & Bioinformatics (GPB) invites leading scholars to contribute high-quality manuscripts for a special issue on “AI+BT for Big Clinical Omics Data”, scheduled for publication in the autumn of 2026. This special issue seeks submissions that focus on integrating artificial intelligence (AI) and biotechnologies (BT) to substantially improve the collection, modelling, analysis, and application of large-scale clinical omics data. The goal is to address the challenges posed by the high-dimensional and dynamic nature of big clinical omics data and to explore their potential to advance the diagnosis and treatment of complex diseases.
Keywords: dynamic data; diagnosis; artificial intelligence (AI); complex diseases; big clinical omics data; biotechnology; high-dimensional data
17. Tailored Partitioning for Healthcare Big Data: A Novel Technique for Efficient Data Management and Hash Retrieval in RDBMS Relational Architectures
Authors: Ehsan Soltanmohammadi, Neset Hikmet, Dilek Akgun. Journal of Data Analysis and Information Processing, 2025, Issue 1, pp. 46-65 (20 pages)
Efficient data management in healthcare is essential for providing timely and accurate patient care, yet traditional partitioning methods in relational databases often struggle with the high volume, heterogeneity, and regulatory complexity of healthcare data. This research introduces a tailored partitioning strategy leveraging the MD5 hashing algorithm to enhance data insertion, query performance, and load balancing in healthcare systems. By applying a consistent hash function to patient IDs, our approach achieves uniform distribution of records across partitions, optimizing retrieval paths and reducing access latency while ensuring data integrity and compliance. We evaluated the method through experiments focusing on partitioning efficiency, scalability, and fault tolerance. The partitioning efficiency analysis compared our MD5-based approach with standard round-robin methods, measuring insertion times, query latency, and data distribution balance. Scalability tests assessed system performance across increasing dataset sizes and varying partition counts, while fault tolerance experiments examined data integrity and retrieval performance under simulated partition failures. The experimental results demonstrate that the MD5-based partitioning strategy significantly reduces query retrieval times by optimizing data access patterns, achieving up to X% better performance compared to round-robin methods. It also scales effectively with larger datasets, maintaining low latency and ensuring robust resilience under failure scenarios. This novel approach offers a scalable, efficient, and fault-tolerant solution for healthcare systems, facilitating faster clinical decision-making and improved patient care in complex data environments.
Keywords: healthcare data partitioning; relational database management systems (RDBMS); big data management; load balance; query performance improvement; data integrity and fault tolerance; efficient big data in healthcare; dynamic data distribution; healthcare information systems; partitioning algorithms; performance evaluation in databases
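The core routing rule described above is easy to make concrete: hash the patient ID with MD5 and take the digest modulo the partition count. The snippet below is a minimal sketch of that mapping only — the partition count and ID format are placeholders, not the paper's full RDBMS setup.

```python
import hashlib
from collections import Counter

NUM_PARTITIONS = 8   # placeholder partition count

def partition_for(patient_id: str) -> int:
    """Deterministically map a patient ID to a partition via MD5."""
    digest = hashlib.md5(patient_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % NUM_PARTITIONS

if __name__ == "__main__":
    ids = [f"patient-{i:06d}" for i in range(10_000)]
    counts = Counter(partition_for(pid) for pid in ids)
    print(sorted(counts.items()))   # roughly uniform record counts per partition
```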
18. An integrated virtual geographic environmental simulation framework: a case study of flood disaster simulation (Cited by: 10)
Authors: Yulin DING, Qing ZHU, Hui LIN. Geo-Spatial Information Science, SCIE EI, 2014, Issue 4, pp. 190-200 (11 pages)
Dynamic flood disaster simulation is an emerging and promising technology that is significantly useful in urban planning, risk assessment, and integrated decision support systems. Integrating large assets such as dynamic observational data, numerical flood simulation models, geographic information technologies, and computing resources into a unified framework remains an important issue. For the intended end user, it is also a holistic solution for creating computer-interpretable representations and gaining insightful understanding of the dynamic disaster processes and the complex impacts and interactions of disaster factors. In particular, it is still difficult to access and join harmonized data, processing algorithms, and models that are provided by different environmental information infrastructures. In this paper, we demonstrate a virtual geographic environments-based integrated environmental simulation framework for flood disaster management, based on the notion of interlinked resources, which is capable of automatically accumulating and manipulating sensor data, creating dynamic geo-analysis and three-dimensional visualizations of ongoing geo-processes, and updating the contents of simulation models representing the real environment. The prototype system is evaluated by applying it, as a proof of concept, to integrate in situ weather observations, numerical weather and flood disaster simulation models, and visualization and analysis of a real-time flood event. Case applications indicate that the developed framework can be adopted by decision-makers for short-term planning and control, since the resulting simulation and visualization are completely based on the latest status of the environment.
Keywords: flood; virtual geographic environments (VGE); dynamic data driven active simulation; geo-model and geo-data integration
19. Extending Auditing Models to Correspond with Clients’ Needs in Cloud Environments
Authors: Rizik M. H. Al-Sayyed, Esam Y. Al-Nsour, Laith M. Al-Omari. International Journal of Communications, Network and System Sciences, 2016, Issue 9, pp. 347-360 (14 pages)
User control over the life cycle of data is of extreme importance in clouds in order to determine whether the service provider adheres to the client’s pre-specified needs in the contract between them; significant client concerns arise over aspects such as the social context, the location, and the laws to which the data are subject. The problem is magnified further by the lack of transparency of Cloud Service Providers (CSPs). Auditing and compliance enforcement introduce a different set of challenges in cloud computing that are not yet resolved. In this paper, a conducted questionnaire showed that data owners have real concerns not just about the secrecy and integrity of their data in cloud environments, but also about spatial, temporal, and legal issues related to their data, especially sensitive or personal data. The questionnaire results show the importance for data owners of addressing three major issues: their ability to continue their work; the secrecy and integrity of their data; and the spatial, legal, and temporal constraints related to their data. Although a good volume of work has been dedicated to auditing in the literature, only little work has been dedicated to the fulfillment of the contractual obligations of CSPs. The paper contributes to knowledge by proposing an extension to auditing models to include the fulfillment of contractual obligations alongside the important aspects of secrecy and integrity of the client's data.
Keywords: auditing; public auditability; dynamic data auditing; spatial control; temporal control; logging data; contractual obligations
20. A high-resolution Asia-Pacific regional coupled prediction system with dynamically downscaling coupled data assimilation (Cited by: 4)
Authors: Mingkui Li, Shaoqing Zhang, Lixin Wu, Xiaopei Lin, Ping Chang, Gohkan Danabasoglu, Zhiqiang Wei, Xiaolin Yu, Huiqin Hu, Xiaohui Ma, Weiwei Ma, Dongning Jia, Xin Liu, Haoran Zhao, Kai Mao, Youwei Ma, Yingjing Jiang, Xue Wang, Guangliang Liu, Yuhu Chen. Science Bulletin, SCIE EI CAS CSCD, 2020, Issue 21, pp. 1849-1858, M0004 (11 pages)
A regional coupled prediction system for the Asia-Pacific (AP-RCP) area (38°E-180°, 20°S-60°N) has been established. The AP-RCP system consists of coupled WRF-ROMS (Weather Research and Forecasting, and Regional Ocean Modeling System) models combined with local observational information through dynamically downscaling coupled data assimilation (CDA). The system generates 18-day forecasts for the atmosphere and ocean environment on a daily quasi-operational schedule at the Pilot National Laboratory for Marine Science and Technology (Qingdao) (QNLM), using two coupled models of different resolutions: 27 km WRF coupled with 9 km ROMS, and 9 km WRF coupled with 3 km ROMS, while a version of 3 km WRF coupled with 3 km ROMS is in test mode. This study is a first step toward evaluating the impact of a high-resolution coupled model with dynamically downscaling CDA on extended-range predictions, focusing on forecasts of typhoon onset, improved precipitation and typhoon intensity forecasts, and simulation of Kuroshio current variability associated with mesoscale oceanic activities. The results show that, to realize the extended-range predictability of the atmospheric and oceanic environment characterized by the statistics of mesoscale activities, a fine-resolution coupled model that resolves local mesoscale phenomena with balanced and coherent coupled initialization is a necessary first step. The next challenges include improving the planetary boundary layer physics and the representation of air-sea and air-land interactions to enable the model to resolve kilometer or sub-kilometer processes.
Keywords: high-resolution coupled models; dynamically downscaling coupled data assimilation; improved weather and typhoon forecasting; extended-range predictability