Funding: Supported by the Aeronautical Science Foundation of China (No. 20100251006) and the Technological Foundation Project of China (No. J132012C001).
Abstract: For the random vibration of an airborne platform, accurate evaluation is a key indicator for ensuring normal operation of airborne equipment in flight. However, only limited power spectral density (PSD) data can be obtained at the flight-test stage, so conventional evaluation methods cannot be employed when the distribution characteristics and prior information are unknown. In this paper, the fuzzy norm method (FNM) is proposed, which combines the advantages of fuzzy theory and norm theory. The proposed method can mine deep system information from limited data whose probability distribution is unknown. Firstly, the FNM is employed to evaluate the variable interval and expanded uncertainty from limited PSD data, and the performance of the FNM is demonstrated in terms of confidence level, reliability, and computational accuracy of the expanded uncertainty. In addition, the optimal fuzzy parameters are discussed to meet the requirements of aviation standards and metrological practice. Finally, computer simulation is used to demonstrate the adaptability of the FNM. Compared with statistical methods, the FNM is superior for evaluating expanded uncertainty from limited data. The results show that the reliability of the calculation and evaluation exceeds 95%.
Funding: Partially supported by the Young Scientists Fund of the National Natural Science Foundation of China (Grant No. 60904004) and the Key Youth Science and Technology Foundation of the University of Electronic Science and Technology of China (Grant No. L08010201JX0720).
Abstract: This paper studies the problem of robust H∞ control of piecewise-linear chaotic systems with random data loss. The communication links between the plant and the controller are assumed to be imperfect; that is, data loss occurs intermittently, as is typical in a network environment. The data loss is modelled as a random process obeying a Bernoulli distribution. In the face of random data loss, a piecewise controller is designed, based on a piecewise-quadratic Lyapunov function, to robustly stabilize the networked system in the mean-square sense and to achieve a prescribed H∞ disturbance attenuation performance. The required H∞ controllers can be designed by solving a set of linear matrix inequalities (LMIs). Chua's system is used to illustrate the usefulness and applicability of the developed theoretical results.
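The Bernoulli data-loss model in the abstract above can be illustrated with a toy sketch: a scalar networked plant whose control packet arrives with probability 1 − p_loss, and whose input is zeroed on a loss. The plant parameters, gain, and loss rate below are hypothetical stand-ins; the paper itself treats piecewise-linear chaotic dynamics with an LMI-designed H∞ controller.

```python
import random

# Toy scalar plant x[k+1] = a*x[k] + b*u[k] over a lossy link.
a, b = 1.2, 1.0          # open-loop unstable: |a| > 1
K = -0.9                 # closed-loop factor a + b*K = 0.3 when the packet arrives
p_loss = 0.3             # probability that a control packet is dropped

random.seed(1)
x = 1.0
for k in range(200):
    arrived = random.random() > p_loss   # Bernoulli(1 - p_loss) link
    u = K * x if arrived else 0.0        # on loss, no control is applied
    x = a * x + b * u
# Despite intermittent loss, the state contracts on average,
# which is the mean-square stability notion used in the abstract.
```

On each step the state is multiplied by 0.3 when the packet arrives and by 1.2 when it is lost; since losses are rare enough here, the product shrinks toward zero almost surely.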
Funding: Supported by the National Natural Science Foundation of China (12001236) and the Natural Science Foundation of Guangdong Province (2020A1515110494).
Abstract: We consider the fourth-order nonlinear Schrödinger equation (4NLS) (i∂_t + εΔ + Δ²)u = c₁u^m + c₂(∇u)u^(m−1) + c₃(∇u)²u^(m−2), and establish conditional almost sure global well-posedness for random initial data in H^s(R^d) for s ∈ (s_c − 1/2, s_c], when d ≥ 3 and m ≥ 5, where s_c := d/2 − 2/(m−1) is the scaling-critical regularity of the 4NLS with second-order derivative nonlinearities. Our proof relies on nonlinear estimates in a new M-norm and on stability theory in the probabilistic setting. Similar supercritical global well-posedness results also hold for d = 2, m ≥ 4 and for d ≥ 3, 3 ≤ m < 5.
Abstract: The advent of the digital era and computer-based remote communications has significantly enhanced the applicability of various sciences over the past two decades, notably data science (DS) and cryptography (CG). Data science involves clustering and categorizing unstructured data, while cryptography ensures security and privacy. Despite certain CG laws and requirements mandating fully randomized or pseudonoise outputs from CG primitives and schemes, it appears that CG policies might impede data scientists from working on ciphers or analyzing information systems supporting security and privacy services. However, this study posits that CG does not entirely preclude data scientists from operating in the presence of ciphers, as there are several examples of successful collaborations, including homomorphic encryption schemes, searchable encryption algorithms, secret-sharing protocols, and protocols offering conditional privacy. These instances, along with others, indicate numerous potential solutions for fostering collaboration between DS and CG. Therefore, this study classifies the challenges faced by DS and CG into three distinct groups: challenging problems (which can be conditionally solved and are currently available to use, e.g., secret-sharing protocols, zero-knowledge proofs, and partially homomorphic encryption algorithms), open problems (for which solutions are believed possible but remain unsolved, e.g., proposing an efficient functional encryption algorithm or a fully homomorphic encryption scheme), and hard problems (infeasible to solve with current knowledge and tools). Ultimately, the paper addresses specific solutions and outlines future directions for tackling the challenges arising at the intersection of DS and CG, such as providing specific access for DS experts in secret-sharing algorithms, assigning data index dimensions to DS experts in ultra-dimension encryption algorithms, defining certain functional keys for DS experts in functional encryption schemes, and giving them limited shares of data for analytics.
Funding: Supported by the National Key Technology Research and Development Program of China (Grant No. 2014BAF08B01), the Natural Science Foundation of China (Grant No. 51335003), and the Collaborative Innovation Center of Major Machine Manufacturing in Liaoning Province of China.
Abstract: Planetary gear systems have been widely used in transportation, construction, metallurgy, petroleum, aviation, and other industrial fields. Under the same power-transmission conditions, they have a more compact structure than an ordinary gear train. However, critical parts such as the sun gear, planet gears, and ring gear often suffer from fatigue and wear under high-speed, heavy-load conditions. To predict the probabilistic fatigue life of a planetary gear system for reliability research, a detailed kinematic and mechanical analysis of the planetary gear system is first completed. Meanwhile, a gear bending fatigue test is carried out at a given stress level to obtain the strength information of the specific gears. Then, a life-distribution transformation model is established according to order-statistics theory: the life distribution of the test gear is transformed to that of a single tooth, and the life distribution of a single tooth is in turn transformed to that of the planetary gear system. Finally, the effectiveness of the transformation model is verified by a processing method for randomly censored data.
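The core order-statistics step above, transforming a single-tooth life distribution to a system life distribution, can be sketched as a weakest-link model: the system fails at the life of its weakest tooth, i.e. the minimum of n i.i.d. tooth lives. The Weibull form and the numbers below are hypothetical illustrations, not the paper's fitted values.

```python
import math

def weibull_cdf(t, beta, eta):
    """Failure probability of a single tooth by t cycles (two-parameter Weibull)."""
    return 1.0 - math.exp(-((t / eta) ** beta))

def system_cdf(t, beta, eta, n):
    """Weakest-link transformation: the system life is the minimum of n
    i.i.d. single-tooth lives, so F_sys(t) = 1 - (1 - F_tooth(t))**n."""
    return 1.0 - (1.0 - weibull_cdf(t, beta, eta)) ** n

# Hypothetical numbers: tooth life ~ Weibull(beta=2.5, eta=1e6 cycles),
# with 60 meshing teeth treated as a series system.
p_tooth = weibull_cdf(5e5, 2.5, 1e6)
p_sys = system_cdf(5e5, 2.5, 1e6, 60)
```

The same identity applied in reverse (taking the n-th root of the survival probability) recovers the single-tooth distribution from test-gear data, which is the direction the paper's transformation model uses first.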
Funding: Supported by the National Natural Science Foundation of China under Grant No. 61161001.
Abstract: Double data rate synchronous dynamic random access memory (DDR3) has become a mainstream memory technology in current server and computer systems. In order to quickly set up a system-level signal integrity (SI) simulation flow for the DDR3 interface, two system-level SI simulation methodologies are introduced in this paper: board-level S-parameter extraction in the frequency domain and system-level simulation assumptions in the time domain. After comparing the flows of Speed2000 and PowerSI/Hspice, PowerSI is chosen for printed circuit board (PCB) level S-parameter extraction, while a Tektronix oscilloscope (TDS7404) is used for DDR3 waveform measurement. Lab results show good agreement between simulation and measurement. The study concludes that the combination of PowerSI and Hspice is recommended for quick system-level DDR3 SI simulation.
Funding: Supported by ARO W911NF-13-1-0421 (MURI), NSF CNS-1422594, and NSF CNS-1505664.
Abstract: In the wake of the research community's deepening understanding of control-hijacking attacks, data-oriented attacks have emerged. Among them, the data structure manipulation attack (DSMA) is a major category. Pioneering research has shown that DSMA can circumvent the most effective defenses against control-hijacking attacks: DEP, ASLR, and CFI. To date, only two defense techniques have demonstrated their effectiveness: Data Flow Integrity (DFI) and Data Structure Layout Randomization (DSLR). However, DFI has high performance overhead, and dynamic DSLR has two main limitations. L-1: randomizing a large set of data structures significantly affects performance. L-2: to be practical, only a fixed subset of data structures is randomized; when the data structures targeted by an attack are not covered, dynamic DSLR is essentially ineffective. To address these two limitations, we propose a novel technique, feedback-control-based adaptive DSLR, and build a system named SALADSPlus. SALADSPlus seeks to optimize the trade-off between security and cost through feedback control. Using a novel adaptive algorithm extended from the Upper Confidence Bound (UCB) algorithm, the defender (controller) uses the feedback (cost-effectiveness) from previous randomization cycles to adaptively choose the set of data structures to randomize (the next action). Unlike dynamic DSLR, the set of randomized data structures is adaptively changed based on this feedback. To obtain the feedback, SALADSPlus inserts a canary into each data structure at compile time. We have implemented SALADSPlus based on gcc-4.5.0. Experimental results show that the runtime overheads are 1.8%, 3.7%, and 5.3% when the randomization cycle is set to 10 s, 5 s, and 1 s, respectively.
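The abstract's controller extends the Upper Confidence Bound algorithm; as a point of reference, plain UCB1 on a toy bandit can be sketched as below. Each "arm" stands for a candidate set of data structures to randomize, and the rewards are made-up cost-effectiveness scores — the paper's actual extended algorithm, reward signal, and candidate sets are not reproduced here.

```python
import math
import random

def ucb1_select(counts, totals, t):
    """Standard UCB1 arm choice: try every arm once, then pick the arm
    maximizing empirical mean reward plus an exploration bonus."""
    for i, c in enumerate(counts):
        if c == 0:
            return i
    return max(range(len(counts)),
               key=lambda i: totals[i] / counts[i]
                             + math.sqrt(2.0 * math.log(t) / counts[i]))

# Hypothetical per-set effectiveness, hidden from the controller.
true_effectiveness = [0.2, 0.8, 0.5]
counts = [0] * 3      # times each candidate set was chosen
totals = [0.0] * 3    # accumulated reward per candidate set

random.seed(0)
for t in range(1, 501):
    arm = ucb1_select(counts, totals, t)
    reward = true_effectiveness[arm] + random.gauss(0.0, 0.1)
    counts[arm] += 1
    totals[arm] += reward
# Over the cycles, the controller concentrates its randomization
# budget on the most cost-effective candidate set.
```

The trade-off the abstract describes shows up directly in the bonus term: a rarely tried candidate set keeps a large exploration bonus, so coverage is not frozen to a fixed subset as in dynamic DSLR.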
Abstract: Publication bias and collection limitations are the main disadvantages of a traditional meta-analysis based on aggregate patient data (APD) from published articles. Individual patient data (IPD) meta-analysis, the gold standard of systematic review, is a possible alternative in this context. However, publications on IPD meta-analyses are still rare compared with traditional ones, especially in research oriented to Chinese medicine (CM). In this article, the strengths and detailed workings of IPD meta-analysis are described, and the need for IPD meta-analysis to assess CM-based treatments is discussed. Compared with the traditional APD meta-analysis, an IPD meta-analysis may give a more accurate and unbiased assessment, and it is worth recommending to CM researchers.
Funding: Supported by the NNSF of China (No. 11271347) and the Fundamental Research Funds for the Central Universities.
Abstract: Missing data and time-dependent covariates often arise simultaneously in longitudinal studies, and directly applying classical approaches may result in a loss of efficiency and biased estimates. To deal with this problem, we propose weighted corrected estimating equations under the missing-at-random mechanism, and then develop a shrinkage empirical likelihood estimation approach for the parameters of interest when time-dependent covariates are present. This procedure improves efficiency over the generalized estimating equations approach under the working-independence assumption by combining the independent estimating equations with the additional information extracted from the estimating equations excluded by the independence assumption. The contribution of each remaining estimating equation is weighted according to the likelihood of its being a consistent estimating equation and the information it carries. We show that the estimators are asymptotically normally distributed and that the empirical likelihood ratio statistic and its profile counterpart asymptotically follow central chi-square distributions when evaluated at the true parameter. The practical performance of our approach is demonstrated through numerical simulations and data analysis.
Abstract: Because traditional methods have difficulty uncovering the internal relationships and association rules of massive data, a fuzzy clustering method is proposed to analyze such data. Firstly, the sample matrix is normalized. Secondly, a fuzzy equivalence matrix is constructed from the normalized matrix using the fuzzy clustering method, and this equivalence matrix then serves as the basis for dynamic clustering. Finally, a series of classifications is carried out on the massive data at successive cut-set levels, and a dynamic cluster diagram is generated. The experimental results show that the fuzzy clustering method can effectively identify association rules in data sets through multiple iterations over massive data, with short running time and good robustness in the clustering process. Therefore, it can be widely applied to the identification and classification of association rules in massive data such as sound, images, and natural resources.
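The pipeline described above (normalize → fuzzy similarity → fuzzy equivalence matrix via transitive closure → cut-set clustering) can be sketched minimally as follows. The distance-based similarity measure and the tiny data set are my own choices for illustration; the abstract does not specify which similarity the authors use.

```python
import numpy as np

def normalize(X):
    """Min-max normalize each column of the sample matrix to [0, 1]."""
    mn, mx = X.min(axis=0), X.max(axis=0)
    return (X - mn) / np.where(mx > mn, mx - mn, 1.0)

def similarity(X):
    """Fuzzy similarity matrix r_ij = 1 - d_ij / d_max (one common choice)."""
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    return 1.0 - d / max(d.max(), 1e-12)

def transitive_closure(R):
    """Iterate the max-min composition R∘R until R is a fuzzy
    equivalence matrix (reflexive, symmetric, and transitive)."""
    while True:
        # (R∘R)[i, j] = max_k min(R[i, k], R[k, j])
        R2 = np.max(np.minimum(R[:, :, None], R[None, :, :]), axis=1)
        if np.allclose(R2, R):
            return R
        R = R2

def lambda_cut(R, lam):
    """Cut-set clustering: i and j share a cluster iff R[i, j] >= lam."""
    labels, nxt = [-1] * len(R), 0
    for i in range(len(R)):
        if labels[i] == -1:
            for j in range(len(R)):
                if R[i, j] >= lam:
                    labels[j] = nxt
            nxt += 1
    return labels

# Hypothetical mini data set with two tight groups of samples.
X = np.array([[1.0, 2.0], [1.1, 2.1], [8.0, 9.0], [8.2, 9.1]])
R = transitive_closure(similarity(normalize(X)))
labels = lambda_cut(R, 0.9)
```

Sweeping the cut level λ from 1 down to 0 merges clusters step by step, which is exactly the dynamic cluster diagram the abstract refers to.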
Abstract: Researchers have devised a system to recover targeted files from 200 megabytes of data encoded in DNA. Random access is key to a practical DNA-based memory, but until now researchers had been able to achieve it with only up to 0.15 megabytes of data. DNA data storage involves translating the binary 0s and 1s of digital data into sequences of the four bases A, C, G, and T that make up DNA.
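The binary-to-base translation mentioned above can be sketched with the simplest possible codec: two bits per base under one common (but arbitrary) mapping. Real DNA-storage systems add run-length constraints, addressing primers for random access, and error correction on top of this.

```python
# One common 2-bits-per-base convention; the assignment itself is arbitrary.
ENCODE = {'00': 'A', '01': 'C', '10': 'G', '11': 'T'}
DECODE = {base: bits for bits, base in ENCODE.items()}

def to_dna(data: bytes) -> str:
    """Translate bytes into a strand of A/C/G/T, two bits per base."""
    bits = ''.join(f'{byte:08b}' for byte in data)
    return ''.join(ENCODE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def from_dna(strand: str) -> bytes:
    """Invert to_dna: recover the original bytes from a strand."""
    bits = ''.join(DECODE[base] for base in strand)
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))

strand = to_dna(b'Hi')   # 'H' = 01001000 -> CAGA, 'i' = 01101001 -> CGGC
assert strand == 'CAGACGGC'
assert from_dna(strand) == b'Hi'
```

At this density, one byte costs four bases, so 200 megabytes corresponds to roughly 800 million bases before any addressing or redundancy overhead.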
Funding: Supported by the National Natural Science Foundation of China (No. 10871146).
Abstract: In this paper, we discuss the asymptotic normality of the wavelet estimator of the density function based on censored data, when the survival and censoring times form a stationary α-mixing sequence. To approximate the distribution of the estimator so that statistical inference for the density function is easy to perform, a random weighted estimator of the density function is also constructed and investigated. The finite-sample behavior of the estimator is likewise investigated via simulation.