The digital development rights in developing countries are based on establishing a new international economic order and ensuring equal participation in the digital globalization process to achieve people's well-rounded development in the digital society. The relationship between cross-border data flows and the realization of digital development rights in developing countries is quite complex. Currently, developing countries seek to safeguard their existing digital interests through unilateral regulation to protect data sovereignty and multilateral regulation for cross-border data cooperation. However, developing countries still have to face internal conflicts between national digital development rights and individual and corporate digital development rights during the process of realizing digital development rights. They also encounter external contradictions such as developed countries interfering with developing countries' data sovereignty, developed countries squeezing the policy space of developing countries through dominant rules, and developing countries having conflicts between domestic and international rules. This article argues that balancing openness and security on digital trade platforms is the optimal solution for developing countries to realize their digital development rights. The establishment of WTO digital trade rules should inherently reflect the fundamental demands of developing countries in cross-border data flows. At the same time, given China's dual role as a digital powerhouse and a developing country, it should actively promote the realization of digital development rights in developing countries.
The Fourth Industrial Revolution has endowed the concept of state sovereignty with new era-specific connotations, leading to the emergence of the theory of data sovereignty. While countries refine their domestic legislation to establish their data sovereignty, they are also actively engaging in the negotiation of cross-border data flow rules within international trade agreements to construct data sovereignty. During these negotiations, countries express differing regulatory claims, with some focusing on safeguarding sovereignty and protecting human rights, some prioritizing economic promotion and security assurance, and others targeting traditional and innovative digital trade barriers. These varied approaches reflect the tension between three pairs of values: collectivism and individualism, freedom and security, and tradition and innovation. Based on their distinct value pursuits, three representative models of data sovereignty construction have emerged globally. At the current juncture, when international rules for digital trade are still in their nascent stages, China should establish its data sovereignty rules in a timely manner, actively participate in global data sovereignty competition, and balance its sovereignty interests with other interests. Specifically, China should explore the scope of system-acceptable digital trade barriers through free trade zones; integrate domestic and international legal frameworks to ensure the alignment of China's data governance legislation with its obligations under international trade agreements; and use the development of the "Digital Silk Road" as a starting point to prioritize the formation of digital trade rules with countries participating in the Belt and Road Initiative, promoting Chinese solutions internationally.
Cross-border data flows not only involve cross-border trade issues, but also severely challenge personal information protection, national data security, and the jurisdiction of justice and enforcement. As the current digital trade negotiations could not accommodate these challenges, China has initiated the concept of secure cross-border data flow and has launched a dual-track, multi-level regulatory system, including a control system for the overseas transfer of important data, a system for the cross-border provision of personal information, and a system for cross-border data requests for justice and enforcement. To explore a global regulatory framework for cross-border data flows, legitimate and controllable cross-border data flows should be promoted, supervision should be categorized based on the risk concerned, and the rule of law should be coordinated at home and abroad to promote system compatibility. To this end, the key is to build a compatible regulatory framework, which includes clarifying the scope of important data to define the "Negative List" for preventing national security risks, improving cross-border accountability for protecting personal information rights and interests to ease pre-supervision pressure, and focusing on data access rights instead of data localization to uphold the jurisdiction of justice and enforcement.
The regulation of cross-border data flows is a growing challenge for the international community. International trade agreements, however, appear to be pioneering legal methods to cope, as they have grappled with this issue since the 1990s. The World Trade Organization (WTO) rules system offers a partial solution under the General Agreement on Trade in Services (GATS), which covers aspects related to cross-border data flows. The Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP) and the United States-Mexico-Canada Agreement (USMCA) have also been perceived to provide forward-looking resolutions. In this context, this article analyzes why a resolution to this issue may be illusory. While they regulate cross-border data flows in various ways, the structure and wording of the exception articles of both the CPTPP and USMCA have the potential to pose significant challenges to the international legal system. The new system, attempting to weigh societal values and economic development, is imbalanced, often valuing free trade more than individual online privacy and cybersecurity. Furthermore, the inclusion of poison-pill clauses is, by nature, antithetical to cooperation. Thus, for the international community generally, and China in particular, cross-border data flows would best be regulated under the WTO-centered multilateral trade law system.
Characteristics of Internet traffic data flows are studied based on chaos theory. A phase space that is isometric with the network dynamic system is reconstructed by using a single-variable time series of a network flow. Some parameters, such as the correlation dimension and the Lyapunov exponent, are calculated, and the chaos characteristic is proved to exist in Internet traffic data flows. A neural network model based on radial basis functions (RBF) is constructed to forecast actual Internet traffic data flow. Simulation results show that, compared with the forecasts of a feed-forward neural network, the forecast of the RBF neural network based on chaos theory has faster learning capacity and higher forecasting accuracy.
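The phase-space reconstruction step above can be sketched with a standard delay-coordinate embedding. This is a minimal illustration rather than the paper's code; the embedding dimension, delay, and toy signal below are arbitrary choices for demonstration.

```python
import numpy as np

def delay_embed(series, dim, tau):
    """Reconstruct a phase space from a scalar time series via delay
    coordinates: each row is [x(t), x(t+tau), ..., x(t+(dim-1)*tau)]."""
    n = len(series) - (dim - 1) * tau
    return np.array([series[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])

# Toy "traffic" series: a smooth periodic signal standing in for real flow data
t = np.arange(200)
x = np.sin(0.3 * t)

states = delay_embed(x, dim=3, tau=2)
print(states.shape)  # (196, 3)
```

Quantities such as the correlation dimension and the largest Lyapunov exponent are then estimated from these reconstructed state vectors.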
The lifetime of a wireless sensor network (WSN) is crucial for determining the maximum duration of data collection in Internet of Things applications. To extend the WSN's lifetime, we propose deploying an unmanned ground vehicle (UGV) within the energy-hungry WSN. This allows nodes, including sensors and the UGV, to share their energy using wireless power transfer techniques. To optimize the UGV's trajectory, we have developed a tabu search-based method for global optimality, followed by a clustering-based method suitable for real-world applications. When the UGV reaches a stopping point, it functions as a regular sensor with ample battery. Accordingly, we have designed optimal data and energy allocation algorithms for both centralized and distributed deployment. Simulation results demonstrate that the UGV and energy sharing significantly extend the WSN's lifetime. This effect is especially prominent in sparsely connected WSNs compared to highly connected ones, and energy sharing has a more pronounced impact on network lifetime extension than UGV mobility.
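The clustering-based trajectory method is not detailed in the abstract; as a hedged sketch, a plain k-means over sensor coordinates can yield candidate UGV stopping points near groups of sensors. The deterministic initialization and the toy coordinates below are illustrative assumptions, not the paper's algorithm.

```python
import math

def kmeans(points, k, iters=20):
    """Plain k-means (deterministic init on the first k points): cluster
    sensor locations; centroids serve as candidate UGV stopping points."""
    centroids = list(points[:k])
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda c: math.dist(p, centroids[c]))
            clusters[nearest].append(p)
        for j, cl in enumerate(clusters):
            if cl:  # an empty cluster keeps its previous centroid
                centroids[j] = (sum(p[0] for p in cl) / len(cl),
                                sum(p[1] for p in cl) / len(cl))
    return centroids

# Two spatial groups of sensors -> two stopping points, one near each group
sensors = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
stops = sorted(kmeans(sensors, k=2))
print(stops)
```

Stopping at cluster centers keeps the UGV's average distance to the sensors it charges small, which is the intuition behind the clustering variant.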
The utilization of computation resources and the reconfiguration time have a large impact on the performance of a reconfigurable system. In order to improve performance, a dynamic self-reconfigurable mechanism for a data-driven cell array is proposed. Cells are fired only when the needed data arrives, and the cell array can work in two modes: fixed execution and reconfiguration. In reconfiguration mode, cell function and data flow direction are changed automatically at run time according to contexts. By simultaneously using an H-tree interconnection network and pre-storing multiple application mapping contexts in a reconfiguration buffer, multiple applications can execute concurrently and context-switching time is minimal. To verify system performance, several algorithms are selected and mapped onto the proposed structure, and the number of configuration contexts and the execution time are recorded for statistical analysis. The results show that the proposed self-reconfigurable mechanism can reduce the number of contexts efficiently and has a low computing time.
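The "fire only when the needed data arrives" rule can be illustrated in software with a tiny data-flow interpreter: a node executes as soon as all of its operand tokens are present. This is a conceptual sketch of data-driven firing, not the proposed hardware mechanism; the node names and graph are hypothetical.

```python
def run_dataflow(nodes, inputs):
    """Data-driven firing: a node executes as soon as all of its operand
    tokens are present, mirroring the 'fire when data arrives' rule."""
    values = dict(inputs)
    pending = dict(nodes)                 # name -> (fn, operand names)
    while pending:
        ready = [n for n, (fn, ops) in pending.items()
                 if all(o in values for o in ops)]
        if not ready:
            raise RuntimeError("deadlock: no node can fire")
        for n in ready:
            fn, ops = pending.pop(n)
            values[n] = fn(*[values[o] for o in ops])
    return values

# (a + b) * (a - b) expressed as a data-flow graph
nodes = {
    "sum":  (lambda x, y: x + y, ("a", "b")),
    "diff": (lambda x, y: x - y, ("a", "b")),
    "prod": (lambda x, y: x * y, ("sum", "diff")),
}
print(run_dataflow(nodes, {"a": 5, "b": 3})["prod"])  # 25 - 9 = 16
```

Note how "sum" and "diff" become ready in the same step and could fire concurrently, which is the parallelism the cell array exploits in hardware.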
This paper proposes a method of data-flow testing for Web services composition. Firstly, to facilitate data flow analysis and constraint collection, the existing model representation of the business process execution language (BPEL) is modified in accordance with an analysis of data dependency, and an exact representation of dead path elimination (DPE) is proposed, which overcomes the difficulties DPE poses to data flow analysis. Then, def-use information based on data flow rules is collected by parsing BPEL and Web services description language (WSDL) documents, and a def-use annotated control flow graph is created. Based on this model, data-flow anomalies which indicate potential errors can be discovered by traversing the paths of the graph, and the all-du-paths used in dynamic data flow testing for Web services composition are automatically generated; testers can then design test cases according to the constraints collected for each selected path.
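A minimal sketch of the def-use analysis underlying this method: given a control flow graph annotated with definitions and uses, a definition-use pair connects a def to a use reachable along a def-clear path. The toy graph below is hypothetical and is not derived from BPEL.

```python
def du_pairs(cfg, defs, uses, var):
    """Collect definition-use pairs for `var`: a def at node d pairs with a
    use at node u reachable from d along a path with no redefinition of var."""
    pairs = set()
    for d in cfg:
        if var not in defs[d]:
            continue
        frontier, visited = list(cfg[d]), set()
        while frontier:
            n = frontier.pop()
            if n in visited:
                continue
            visited.add(n)
            if var in uses[n]:
                pairs.add((d, n))
            if var not in defs[n]:      # path is still def-clear: go deeper
                frontier.extend(cfg[n])
    return pairs

# Toy control-flow graph: 1 defines x, 2 uses it, 3 redefines it, 4 uses it
cfg  = {1: [2], 2: [3], 3: [4], 4: []}
defs = {1: {"x"}, 2: set(), 3: {"x"}, 4: set()}
uses = {1: set(), 2: {"x"}, 3: set(), 4: {"x"}}
print(sorted(du_pairs(cfg, defs, uses, "x")))  # [(1, 2), (3, 4)]
```

All-du-paths coverage then requires exercising every def-clear path between each such pair, which is what the generated test cases target.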
The present study aims to improve the efficiency of typical procedures used for post-processing flow field data by applying neural-network technology. Taking a problem of aircraft design as the workhorse, a regression model for processing aircraft flow data, FCN-VGG19, is elaborated based on VGGNet (Visual Geometry Group Net) and FCN (Fully Convolutional Network) techniques. As shown by the results, the model displays a strong fitting ability, and there is almost no over-fitting in training. Moreover, the model has good accuracy and convergence. For different input data and different grids, the model basically achieves convergence, showing good performance. It is shown that the proposed simulation regression model based on FCN has great potential for typical problems of computational fluid dynamics (CFD) and related data processing.
With its high repeatability, the airgun source has been used to monitor temporal variations of subsurface structures. However, under different working conditions, there will be subtle differences in the airgun source signals. To some extent, deconvolution can eliminate changes of the recorded signals due to source variations. Generally speaking, in order to remove the airgun source wavelet signal and obtain the Green's functions between the airgun source and stations, we need to select an appropriate method to perform the deconvolution of the seismic waveform data. Frequency-domain water-level deconvolution and time-domain iterative deconvolution are two kinds of deconvolution methods widely used in the field of receiver functions, among others. We use the Binchuan (Yunnan Province, China) airgun data as an example to compare the performance of these two deconvolution methods in airgun source data processing. The results indicate that frequency-domain water-level deconvolution is better in terms of computational efficiency, while time-domain iterative deconvolution is better in terms of the signal-to-noise ratio (SNR), and the initial motion of the P-wave is also clearer. We further discuss the order in which deconvolution and stacking should be applied for multiple-shot airgun data processing. Finally, we propose a general processing flow for airgun source data to extract the Green's functions between the airgun source and stations.
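Frequency-domain water-level deconvolution can be sketched in a few lines: divide the record's spectrum by the wavelet's, after clamping the denominator's magnitude at a fraction (the "water level") of its maximum to avoid dividing by near-zero values. The wavelet shape, water level, and test signal below are illustrative assumptions, not the Binchuan processing parameters.

```python
import numpy as np

def waterlevel_deconv(record, wavelet, level=0.01):
    """Frequency-domain water-level deconvolution: divide the spectra, but
    never let the denominator's magnitude drop below level * max|W|."""
    n = len(record)
    R = np.fft.rfft(record, n)
    W = np.fft.rfft(wavelet, n)
    floor = level * np.abs(W).max()
    W_stab = np.maximum(np.abs(W), floor) * np.exp(1j * np.angle(W))
    return np.fft.irfft(R / W_stab, n)

# Toy check: the "record" is the wavelet circularly convolved with a spike,
# so deconvolution should recover the spike (a stand-in Green's function).
g = np.zeros(64)
g[5] = 1.0
w = np.exp(-np.arange(64) / 4.0)      # decaying, airgun-like source wavelet
rec = np.fft.irfft(np.fft.rfft(w) * np.fft.rfft(g), 64)
est = waterlevel_deconv(rec, w)
print(int(np.argmax(est)))  # 5
```

The single division per frequency bin is why the water-level method is computationally cheap; iterative time-domain deconvolution instead subtracts scaled, shifted wavelets over many passes.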
Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) carcinogens in humans. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and side-stream smoke. Our laboratory monitors six urinary VNAs—N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR)—using isotope dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to more efficiently quantitate these VNAs. Automation is done using Hamilton STAR and Caliper Staccato workstations. This new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, this method increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receiving to final LIMS output with minimal human intervention, further minimizing human error in the sample preparation process. This new automated method and the sample data flow are currently applied in bio-monitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle.
SOZL (structured methodology + object-oriented methodology + Z language) is a language that attempts to integrate the structured method, the object-oriented method and the formal method. The core of this language is the predicate data flow diagram (PDFD). In order to eliminate the ambiguity of predicate data flow diagrams and their associated textual specifications, a formalization of the syntax and semantics of predicate data flow diagrams is necessary. In this paper we use Z notation to define an abstract syntax and the related structural constraints for the PDFD notation, and provide it with an axiomatic semantics based on the concepts of data availability and the functionality of predicate operations. Finally, an example is given to establish functionality-consistent decomposition on hierarchical PDFDs (HPDFD).
Architectures based on the data flow computing model provide an alternative to the conventional Von Neumann architecture that is widely used for general-purpose computing. Processors based on the data flow architecture employ fine-grain data-driven parallelism. These architectures have the potential to exploit the inherent parallelism in compute-intensive applications like signal processing and image and video processing, and can thus achieve faster throughputs and higher power efficiency. In this paper, several data flow computing architectures are explored, and their main architectural features are studied. Furthermore, a classification of the processors is presented based on whether they employ the data flow execution model exclusively or in combination with the control flow model, and they are accordingly grouped as exclusive data flow or hybrid architectures. The hybrid category is further subdivided into conjoint or accelerator-style architectures depending on how they deploy and separate the data flow and control flow execution models within their execution blocks. Lastly, a brief comparison and discussion of their advantages and drawbacks is also provided. From this study we conclude that although data flow architectures have matured significantly, issues like data-structure handling and the lack of efficient placement and scheduling algorithms have prevented them from becoming commercially viable.
A new synthetical knowledge representation model that integrates the attribute grammar model with the semantic network model is presented. The model mainly uses symbols of attribute grammar to establish a set of syntax and semantic rules suitable for a semantic network. Based on the model, the paper introduces a formal method for defining data flow diagrams (DFDs) and briefly explains how to use the method.
We consider the problem of data flow fuzzy control of discrete queuing systems with three different service-rate servers. The objective is to dynamically assign customers to idle servers based on the state of the system so as to minimize the mean sojourn time of customers. Simulation shows the validity of the fuzzy controller.
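The fuzzy controller itself is not reproduced in the abstract; as a stand-in, the sketch below simulates a queue with three different service-rate servers and compares two simple dispatch policies, showing why state-aware assignment of customers to idle servers matters for mean sojourn time. The arrival rate, seed, and policies are illustrative assumptions.

```python
import random

def simulate(rates, policy, n=2000, lam=1.0, seed=7):
    """Single-queue system with heterogeneous servers: each arrival takes an
    idle server chosen by `policy`; if none is idle it waits for the server
    that frees up first. Returns the mean sojourn (wait + service) time."""
    rng = random.Random(seed)
    free_at = [0.0] * len(rates)        # when each server next becomes idle
    t, total = 0.0, 0.0
    for _ in range(n):
        t += rng.expovariate(lam)       # Poisson arrivals
        idle = [i for i in range(len(rates)) if free_at[i] <= t]
        i = policy(idle, rates) if idle else min(
            range(len(rates)), key=lambda j: free_at[j])
        start = max(t, free_at[i])
        free_at[i] = start + rng.expovariate(rates[i])  # exponential service
        total += free_at[i] - t
    return total / n

fastest = lambda idle, rates: max(idle, key=lambda i: rates[i])
slowest = lambda idle, rates: min(idle, key=lambda i: rates[i])

rates = [2.0, 1.0, 0.5]   # three different service-rate servers
print(simulate(rates, fastest) < simulate(rates, slowest))  # True
```

A fuzzy controller refines this idea by grading the assignment decision on the observed system state rather than applying a fixed rule.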
In this paper, on the basis of experimental data from two kinds of chemical explosions, the piston-pushing model of spherical blast waves and a second-order Godunov-type finite difference scheme with high resolution of discontinuities are used for the numerical reconstruction of part of an actual hemispherical blast-wave flow field by properly adjusting the moving boundary conditions of a piston. This method is simple and reliable. It is suitable for evaluating the effects of the blast-wave flow field away from the explosion center.
The tidal current duration (TCD) and velocity (TCV) and the suspended sediment concentration (SSC) were measured in the dry season in December 2011 and in the flood season in June 2012 at the upper part of the North Channel of the Changjiang Estuary. They were assimilated with data measured in 2003, 2004, 2006 and 2007, using the tidal range's proportion conversion. Variations in TCD and TCV, preferential flow and SSC have been calculated. Influences of typical engineering projects such as the Qingcaosha fresh water reservoir, the Yangtze River Bridge, and land reclamation on the ebb and flood TCD, TCV and SSC in the North Channel over the last 10 years are discussed. The results show that: (1) currently, in the upper part of the North Channel, the ebb tide dominates; after the construction of the typical projects, ebb TCD and TCV tend to be larger, and the vertical average ebb and flood SSC decreases during the flood season while SSC increases during the dry season; (2) changes in the vertical average TCV are mainly contributed by seasonal runoff variation during the flood season, which is larger in the flood season than in the dry season; the controlling factors behind the increasing ebb TCD and TCV are the large-scale engineering projects in the North Channel; variation in SSC may result mainly from the reduction of basin annual sediment loads, large-scale nearshore projects and so on.
Funding (article on digital development rights in developing countries): a preliminary result of the Chinese Government Scholarship High-level Graduate Program sponsored by the China Scholarship Council (Program No. CSC202206310052).
Funding (article on data sovereignty and digital trade rules): a phased result of "Research on the Issue of China's Data Export System" (24SFB3035), a 2024 ministerial-level research project of the Ministry of Justice of China on the construction of the rule of law and the study of legal theories.
Funding (article on secure cross-border data flow): the National Social Science Foundation general project "Theoretical and Practical Research on International Criminal Judicial Assistance in Combating Cybercrime" (Project No. 19BFX073) and the National Social Science Foundation major project "Translation, Research and Database Construction of Cyberspace Policies and Regulations" (Project No. 20&ZD179).
Funding (article on cross-border data flows in trade agreements): the National Social Science Fund project "China's Non-Market Economy Status in WTO Trade Remedies" (Project No. 15XFX023) and the Human Rights Institute of Southwest University of Political Science and Law (SWUPL HRI) 2015 research project "Global Human Rights Governance under the TPP." All mistakes and omissions are the author's responsibility.
Funding (article on WSN lifetime extension): the National Natural Science Foundation of China (No. 62171486 and No. U2001213) and the Guangdong Basic and Applied Basic Research Project (2022A1515140166).
Funding (article on the self-reconfigurable cell array): the National Natural Science Foundation of China (Nos. 61802304, 61834005, 61772417, 61634004, and 61602377) and the Shaanxi Provincial Co-ordination Innovation Project of Science and Technology (No. 2016KTZDGY02-04-02).
Funding: National Natural Science Foundation of China (Nos. 60425206 and 60503033); National Basic Research Program of China (973 Program, No. 2002CB312000); Opening Foundation of the State Key Laboratory of Software Engineering, Wuhan University.
Abstract: This paper proposes a data-flow testing method for Web services composition. First, to facilitate data-flow analysis and constraint collection, the existing model representation of the business process execution language (BPEL) is modified in step with an analysis of data dependency, and an exact representation of dead path elimination (DPE) is proposed, which overcomes the difficulties DPE poses for data-flow analysis. Definition-use information based on data-flow rules is then collected by parsing the BPEL and Web services description language (WSDL) documents, and a def-use-annotated control flow graph is created. Based on this model, data-flow anomalies that indicate potential errors can be discovered by traversing the paths of the graph, and the all-du-paths used in dynamic data-flow testing of Web services composition are generated automatically; testers can then design test cases according to the constraints collected for each selected path.
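The core of the def-use analysis described above can be sketched on a toy control-flow graph. The node names, variable set, and graph shape below are invented for illustration and are not taken from BPEL or the paper; the sketch only shows how a "use before any definition" anomaly is detected by walking annotated paths.

```python
# Toy control-flow graph: each node lists its successors and the
# variables it defines and uses (a def-use-annotated CFG).
cfg = {
    "receive": {"next": ["assign"], "def": {"order"}, "use": set()},
    "assign":  {"next": ["invoke"], "def": {"price"}, "use": {"order"}},
    "invoke":  {"next": ["reply"],  "def": set(),     "use": {"price", "qty"}},
    "reply":   {"next": [],         "def": set(),     "use": {"price"}},
}

def find_anomalies(cfg, entry):
    """DFS from entry; report variables used before any definition,
    together with the node and path where the anomaly occurs."""
    anomalies = []

    def walk(node, defined, path):
        info = cfg[node]
        for v in sorted(info["use"]):
            if v not in defined:
                anomalies.append((v, node, path + [node]))
        for succ in info["next"]:
            walk(succ, defined | info["def"], path + [node])

    walk(entry, set(), [])
    return anomalies

print(find_anomalies(cfg, "receive"))
# "qty" is used in "invoke" but never defined on the path.
```

A real tool would additionally enumerate all-du-paths (definition-clear paths from each def to each use) on the same annotated graph; the anomaly check above is the simplest building block of that analysis.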
Abstract: The present study aims to improve the efficiency of typical procedures used for post-processing flow-field data by applying neural-network technology. Taking an aircraft design problem as the workhorse, a regression model for processing aircraft flow data, FCN-VGG19, is elaborated based on VGGNet (Visual Geometry Group Net) and FCN (Fully Convolutional Network) techniques. The results show that the model has a strong fitting ability, with almost no over-fitting during training, as well as good accuracy and convergence. For different input data and different grids, the model essentially converges and performs well. The proposed FCN-based regression model therefore has great potential for typical problems in computational fluid dynamics (CFD) and related data processing.
Funding: jointly sponsored by the Special Fund for Earthquake Scientific Research in the Public Welfare of the China Earthquake Administration (No. 201508008); the Fundamental Research Funds for the Central Universities (No. WK2080000053); and the Academician Chen Yong Workstation Project in Yunnan Province.
Abstract: With its high repeatability, the airgun source has been used to monitor temporal variations of subsurface structures. However, under different working conditions there are subtle differences in the airgun source signals. To some extent, deconvolution can eliminate changes in the recorded signals caused by source variations. To remove the airgun source wavelet and obtain the Green's functions between the airgun source and the stations, an appropriate deconvolution method must be selected for the seismic waveform data. Frequency-domain water-level deconvolution and time-domain iterative deconvolution are two methods widely used in fields such as receiver-function analysis. Using the Binchuan (Yunnan Province, China) airgun data as an example, we compare the performance of these two methods in airgun source data processing. The results indicate that frequency-domain water-level deconvolution is better in terms of computational efficiency, while time-domain iterative deconvolution yields a higher signal-to-noise ratio (SNR) and a clearer initial P-wave motion. We further discuss the order of deconvolution and stacking for multiple-shot airgun data processing. Finally, we propose a general processing flow for airgun source data to extract the Green's functions between the airgun source and the stations.
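Water-level deconvolution, one of the two methods compared above, is standard enough to sketch: divide the spectra of the record and the source wavelet, but clamp the source power spectrum at a fraction of its maximum (the "water level") so near-zero frequencies do not blow up. The function name and the water-level value of 0.01 below are illustrative choices, not the paper's parameters.

```python
import numpy as np

def waterlevel_deconv(record, source, water_level=0.01):
    """Frequency-domain water-level deconvolution.

    Divides the record spectrum by the source spectrum, replacing
    source power below `water_level * max(power)` with that floor
    to stabilize the division at weak frequencies."""
    n = len(record)
    R = np.fft.rfft(record, n)
    S = np.fft.rfft(source, n)
    power = np.abs(S) ** 2
    floor = water_level * power.max()
    G = R * np.conj(S) / np.maximum(power, floor)
    return np.fft.irfft(G, n)

# Sanity check: deconvolving a signal by itself should recover an
# approximate spike at zero lag (the Green's function of an impulse).
t = np.linspace(0, 5, 64)
sig = np.exp(-t) * np.sin(4 * t)
g = waterlevel_deconv(sig, sig)
print(np.argmax(np.abs(g)))  # peak at lag 0
```

Time-domain iterative deconvolution would instead subtract scaled, shifted copies of the source wavelet from the record one spike at a time, which is slower but tends to give the cleaner onsets reported in the abstract.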
Abstract: Volatile nitrosamines (VNAs) are a group of compounds classified as probable (group 2A) and possible (group 2B) human carcinogens. Along with certain foods and contaminated drinking water, VNAs are detected at high levels in tobacco products and in both mainstream and side-stream smoke. Our laboratory monitors six urinary VNAs, namely N-nitrosodimethylamine (NDMA), N-nitrosomethylethylamine (NMEA), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPIP), N-nitrosopyrrolidine (NPYR), and N-nitrosomorpholine (NMOR), using isotope-dilution GC-MS/MS (QQQ) for large population studies such as the National Health and Nutrition Examination Survey (NHANES). In this paper, we report for the first time a new automated sample preparation method to quantitate these VNAs more efficiently. Automation is done using Hamilton STAR™ and Caliper Staccato™ workstations. The new automated method reduces sample preparation time from 4 hours to 2.5 hours while maintaining precision (inter-run CV < 10%) and accuracy (85% - 111%). More importantly, it increases sample throughput while maintaining a low limit of detection (<10 pg/mL) for all analytes. A streamlined sample data flow was created in parallel to the automated method, in which samples can be tracked from receipt to final LIMS output with minimal human intervention, further reducing human error in the sample preparation process. The new automated method and sample data flow are currently applied in biomonitoring of VNAs in the US non-institutionalized population in the NHANES 2013-2014 cycle.
Abstract: SOZL (structured methodology + object-oriented methodology + Z language) is a language that attempts to integrate the structured, object-oriented, and formal methods. The core of this language is the predicate data flow diagram (PDFD). To eliminate the ambiguity of predicate data flow diagrams and their associated textual specifications, a formalization of the syntax and semantics of PDFDs is necessary. In this paper we use Z notation to define an abstract syntax and the related structural constraints for the PDFD notation, and provide it with an axiomatic semantics based on the concepts of data availability and the functionality of predicate operations. Finally, an example is given to establish functionality-consistent decomposition on hierarchical PDFDs (HPDFDs).
Abstract: Architectures based on the data flow computing model provide an alternative to the conventional von Neumann architecture that is widely used for general-purpose computing. Processors based on the data flow architecture employ fine-grained data-driven parallelism. These architectures have the potential to exploit the inherent parallelism in compute-intensive applications such as signal processing and image and video processing, and can thus achieve higher throughput and power efficiency. In this paper, several data flow computing architectures are explored and their main architectural features are studied. Furthermore, a classification of the processors is presented based on whether they employ the data flow execution model exclusively or in combination with the control flow model; they are accordingly grouped as exclusive data flow or hybrid architectures. The hybrid category is further subdivided into conjoint or accelerator-style architectures, depending on how they deploy and separate the data flow and control flow execution models within their execution blocks. Lastly, a brief comparison and discussion of their advantages and drawbacks is given. From this study we conclude that although data flow architectures have matured significantly, issues such as data-structure handling and the lack of efficient placement and scheduling algorithms have prevented them from becoming commercially viable.
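The data-driven execution model surveyed above can be made concrete with a tiny interpreter: an instruction "fires" as soon as all of its input tokens have arrived, regardless of program order, and its result token is forwarded to its consumers. The three-node graph and the class names below are invented for the sketch and do not correspond to any surveyed machine.

```python
import operator

class Node:
    """One dataflow instruction: an operator, input token slots,
    and the (node, slot) pairs its result token is sent to."""
    def __init__(self, op, n_inputs):
        self.op = op
        self.slots = [None] * n_inputs
        self.consumers = []

def run(initial_tokens):
    """Fire every node whose operand slots are all full; forwarding a
    result may enable further nodes. Returns results in firing order."""
    ready, results = [], []
    for node, slot, value in initial_tokens:
        node.slots[slot] = value
        if all(s is not None for s in node.slots):
            ready.append(node)
    while ready:
        node = ready.pop()
        out = node.op(*node.slots)
        results.append(out)
        for consumer, slot in node.consumers:
            consumer.slots[slot] = out
            if all(s is not None for s in consumer.slots):
                ready.append(consumer)
    return results

# (a + b) * (a - b): the add and sub nodes may fire in either order;
# mul fires only once both of its operand tokens have arrived.
add, sub, mul = Node(operator.add, 2), Node(operator.sub, 2), Node(operator.mul, 2)
add.consumers = [(mul, 0)]
sub.consumers = [(mul, 1)]
out = run([(add, 0, 7), (add, 1, 3), (sub, 0, 7), (sub, 1, 3)])
print(out[-1])  # -> 40
```

Note there is no program counter anywhere: the order of `results` is determined purely by token availability, which is exactly the property the exclusive data flow machines in the survey exploit for parallelism.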
Abstract: A new synthetic knowledge representation model that integrates the attribute grammar model with the semantic network model is presented. The model mainly uses the symbols of an attribute grammar to establish a set of syntax and semantic rules suitable for a semantic network. Based on the model, the paper introduces a formal method for defining data flow diagrams (DFDs) and briefly explains how to use the method.
Abstract: We consider the problem of fuzzy data-flow control for discrete queuing systems with three servers of different service rates. The objective is to dynamically assign customers to idle servers based on the state of the system so as to minimize the mean sojourn time of customers. Simulation shows the validity of the fuzzy controller.
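The abstract does not give the fuzzy rule base, so the sketch below is not the authors' controller; it is a toy discrete-event simulation of the same setting (three servers with different rates) showing why a state-aware dispatch rule matters: preferring the fastest idle server yields a lower mean sojourn time than preferring the slowest. All rates and the arrival intensity are made-up values.

```python
import heapq
import random

def simulate(arrival_rate, service_rates, n_customers, policy, seed=0):
    """Event-driven queue with heterogeneous exponential servers.
    `policy` receives [(server_index, rate), ...] for the idle servers
    and returns the index of the server to use."""
    rng = random.Random(seed)
    t, idle = 0.0, set(range(len(service_rates)))
    events, waiting, sojourn = [], [], []
    for _ in range(n_customers):                 # pre-draw all arrivals
        t += rng.expovariate(arrival_rate)
        heapq.heappush(events, (t, "arrive", t))
    while events:
        now, kind, payload = heapq.heappop(events)
        if kind == "arrive":
            waiting.append(payload)              # payload = arrival time
        else:
            idle.add(payload)                    # payload = server index
        while waiting and idle:                  # dispatch FIFO customers
            arrived = waiting.pop(0)
            k = policy([(i, service_rates[i]) for i in sorted(idle)])
            idle.remove(k)
            done = now + rng.expovariate(service_rates[k])
            heapq.heappush(events, (done, "depart", k))
            sojourn.append(done - arrived)
    return sum(sojourn) / len(sojourn)

fastest = lambda idle: max(idle, key=lambda p: p[1])[0]
slowest = lambda idle: min(idle, key=lambda p: p[1])[0]
rates = [3.0, 1.0, 0.5]
print(simulate(2.0, rates, 5000, fastest), simulate(2.0, rates, 5000, slowest))
```

A fuzzy controller would interpolate between such crisp rules using queue length and server state as fuzzy inputs, rather than always picking one extreme.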
Abstract: In this paper, on the basis of experimental data from two kinds of chemical explosions, the piston-pushing model of spherical blast waves and a second-order Godunov-type finite difference scheme with high resolution of discontinuities are used for the numerical reconstruction of part of an actual hemispherical blast-wave flow field by properly adjusting the moving boundary conditions of the piston. This method is simple and reliable, and is suitable for evaluating the effects of the blast-wave flow field away from the explosion center.
Abstract: The tidal current duration (TCD), tidal current velocity (TCV), and suspended sediment concentration (SSC) were measured in the dry season in December 2011 and in the flood season in June 2012 in the upper part of the North Channel of the Changjiang Estuary. These observations were assimilated with data measured in 2003, 2004, 2006, and 2007 using a tidal-range proportion conversion. Variations in TCD and TCV, preferential flow, and SSC were calculated. The influences of typical engineering projects, such as the Qingcaosha freshwater reservoir, the Yangtze River Bridge, and land reclamation, on the ebb and flood TCD, TCV, and SSC in the North Channel over the last 10 years are discussed. The results show that: (1) currently, the ebb tide dominates in the upper part of the North Channel; after the construction of the typical projects, ebb TCD and TCV tend to be larger, and the vertically averaged ebb and flood SSC decrease during the flood season while SSC increases during the dry season; (2) changes in the vertically averaged TCV are mainly caused by seasonal runoff variation, which is larger in the flood season than in the dry season; the main drivers of the increasing ebb TCD and TCV are the large-scale engineering projects in the North Channel, while the variation in SSC may result mainly from the reduction of annual basin sediment loads, large-scale nearshore projects, and other factors.