Investing in cryptocurrencies is progressively becoming the norm; however, these assets are excessively volatile and can lose or gain value almost instantly. Thus, rational investors holding cryptocurrencies for extended periods actively search for assets that can diversify their risk, preferably assets other than cryptocurrencies. In this study, we consider the two most studied cryptocurrencies with the highest capitalization and trading volume/value, namely Bitcoin and Ethereum. Specifically, we examine whether high-performing leading US tech stocks (Facebook, Amazon, Apple, Netflix, Google [FAANG]) can provide any diversification benefits to cryptocurrency investors. To do so, we employ dynamic conditional correlation (DCC), asymmetric DCC, time-varying parameter vector autoregression-based connectedness measures, dynamic correlation-based hedge and safe-haven regression analyses, portfolio optimization and hedging strategies, time- and frequency-based wavelet coherence, and high-frequency 10-min intraday data from January 1, 2018 to January 31, 2023. We find that FAANG stocks can be considered (at least weak) safe havens for Bitcoin and Ethereum during the sample period. Our subperiod analyses reveal that the safe-haven role of FAANG stocks, specifically for Bitcoin, has noticeably increased. While the safe-haven property of Facebook is the most promising, that of Netflix is blurred between a weak safe haven and a hedge. Our findings may help investors, policymakers, and academicians to invest in cryptocurrencies, formulate relevant investment guidelines, and extend the literature on cryptocurrencies, respectively.
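For readers unfamiliar with the safe-haven regressions the abstract refers to, below is a minimal Python sketch in the spirit of a Baur-Lucey-style specification: a stock's returns are regressed on crypto returns and on interactions with extreme-quantile dummies. The simulated data, column names, and quantile thresholds are illustrative assumptions, not the study's exact specification.

```python
# Sketch of a safe-haven regression, assuming Baur-Lucey-style dummies.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
# Stand-ins for 10-min log returns of a cryptocurrency and a FAANG stock.
df = pd.DataFrame({
    "btc": rng.normal(0, 0.01, 5000),
    "faang": rng.normal(0, 0.005, 5000),
})

# Dummies marking extreme negative crypto returns (10% and 5% tails).
q10, q05 = df["btc"].quantile([0.10, 0.05])
d10 = (df["btc"] <= q10).astype(float)
d05 = (df["btc"] <= q05).astype(float)

# faang_t = a + b0*btc_t + b1*btc_t*D10 + b2*btc_t*D05 + e_t
X = sm.add_constant(pd.DataFrame({
    "btc": df["btc"],
    "btc_x_d10": df["btc"] * d10,
    "btc_x_d05": df["btc"] * d05,
}))
res = sm.OLS(df["faang"], X).fit()
print(res.params)
# Interpretation: b0 <= 0 suggests a hedge; b0 + b1 (or + b2) <= 0 in the
# tails suggests a (weak or strong) safe haven during crypto crashes.
```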
This work focuses on enhancing low frequency seismic data using a convolutional neural network trained on synthetic data. Traditional seismic data often lack both high and low frequencies, which are essential for detailed geological interpretation and various geophysical applications. Low frequency data are particularly valuable for reducing wavelet sidelobes and improving full waveform inversion (FWI). Conventional methods for bandwidth extension include seismic deconvolution and sparse inversion, which have limitations in recovering low frequencies. The study explores the potential of the U-net, which has been successful in other geophysical applications such as noise attenuation and seismic resolution enhancement. The novelty of our approach is that we do not rely on computationally expensive finite-difference modelling to create training data. Instead, our synthetic training data are created from individual randomly perturbed events with variations in bandwidth, making the method more adaptable to different data sets than previous deep learning methods. The method was tested on both synthetic and real seismic data, demonstrating effective low frequency reconstruction and sidelobe reduction. With a synthetic full waveform inversion to recover a velocity model and a seismic amplitude inversion to estimate acoustic impedance, we demonstrate the validity and benefit of the proposed method. Overall, the study presents a robust approach to seismic bandwidth extension using deep learning, emphasizing the importance of diverse, well-designed, yet computationally inexpensive synthetic training data.
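A minimal sketch of the event-based training data generation described above: random reflectivity spikes stand in for events, and band-limiting the same trace in two different ways yields a network input with missing low frequencies and a broadband target. All frequencies, lengths, and event counts are illustrative assumptions.

```python
# Sketch: input/target pairs for low-frequency extrapolation training.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def bandpass(trace, lo, hi, fs, order=4):
    sos = butter(order, [lo, hi], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, trace)

def make_pair(n=1024, fs=250.0, n_events=20, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    refl = np.zeros(n)
    idx = rng.integers(0, n, n_events)
    refl[idx] = rng.normal(0, 1, n_events)  # random event positions/amplitudes
    target = bandpass(refl, 2.0, 80.0, fs)  # broadband: low frequencies kept
    inp = bandpass(refl, 8.0, 80.0, fs)     # band-limited: low frequencies missing
    return inp, target

x, y = make_pair()
```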
Area Sampling Frames (ASFs) are the basis of many statistical programs around the world. To improve the accuracy, objectivity, and efficiency of crop survey estimates, an automated stratification method based on geospatial crop planting frequency and cultivation data is proposed. This paper investigates using 2008-2013 geospatial corn, soybean, and wheat planting frequency data layers to create three corresponding single-crop-specific and one multi-crop-specific South Dakota (SD), U.S. ASF stratifications. Corn, soybeans, and wheat are three major crops in South Dakota. The crop-specific ASF stratifications are developed from crop frequency statistics derived at the primary sampling unit (PSU) level from the Crop Frequency Data Layers. The SD corn, soybean, and wheat mean planting frequency strata of the single-crop stratifications are substratified by percent cultivation based on the 2013 Cultivation Layer. The three newly derived ASF stratifications provide more crop-specific information than the current National Agricultural Statistics Service (NASS) ASF, which is based on percent cultivation alone. Further, a multi-crop stratification is developed from the individual corn, soybean, and wheat planting frequency data layers. All four crop-frequency-based ASF stratifications consistently predict corn, soybean, and wheat planting patterns well, as verified against 2014 Farm Service Agency (FSA) Common Land Unit (CLU) and 578 administrative data. This demonstrates that the new stratifications based on crop planting frequency and cultivation are crop-type independent and applicable to all major crops. These results indicate that the new crop-specific ASF stratifications have great potential to improve ASF accuracy, efficiency, and crop estimates.
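A minimal sketch of the stratification logic: PSU-level mean planting frequency is binned into strata and then sub-stratified by percent cultivation. The bin edges and labels are illustrative assumptions, not NASS's operational strata boundaries.

```python
# Sketch: frequency strata with cultivation substrata, assumed bin edges.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
psu = pd.DataFrame({
    "mean_corn_freq": rng.uniform(0, 6, 1000),    # mean years planted, 2008-2013
    "pct_cultivated": rng.uniform(0, 100, 1000),  # from the Cultivation Layer
})

psu["freq_stratum"] = pd.cut(psu["mean_corn_freq"], bins=[0, 1, 2, 4, 6],
                             labels=["rare", "low", "medium", "high"],
                             include_lowest=True)
psu["cult_substratum"] = pd.cut(psu["pct_cultivated"], bins=[0, 15, 50, 75, 100],
                                labels=["<15", "15-50", "50-75", ">75"],
                                include_lowest=True)
print(psu.groupby(["freq_stratum", "cult_substratum"], observed=True).size())
```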
The quality of low frequency electromagnetic data is affected by spike and trend noises. Failure to remove the spikes and trends reduces the credibility of data interpretation. Based on analyses of the causes and characteristics of these noises, this paper presents a preset statistics stacking method (PSSM) and a piecewise linear fitting method (PLFM) for de-noising the spikes and trends, respectively. The magnitudes of the spikes are either higher or lower than the normal values, which distorts the useful signal. Comparisons in spike removal among the averaging method, the statistics method, and the PSSM indicate that only the PSSM removes the spikes successfully. On the other hand, the spectra of the linear and nonlinear trends lie mainly in the low frequency band and can change the calculated resistivity significantly; no influence of the trends is observed when the frequency is higher than a certain threshold. The PLFM effectively removes both linear and nonlinear trends, with errors around 1% in the power spectrum. The proposed methods offer an effective way of de-noising spike and trend noises in low frequency electromagnetic data and establish a basis for further research on de-noising low frequency noises.
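A minimal sketch, under assumed parameters, of the two ideas the paper names: a preset-statistics stack that rejects samples outside fixed bounds before averaging repeated records, and piecewise linear detrending. The actual PSSM and PLFM formulations may differ in their details.

```python
# Sketch of spike-robust stacking and piecewise linear detrending.
import numpy as np

def preset_statistics_stack(records, lo, hi):
    """Stack repeated records, ignoring samples outside preset bounds [lo, hi]."""
    records = np.asarray(records, dtype=float)
    mask = (records >= lo) & (records <= hi)   # spikes fall outside the bounds
    records = np.where(mask, records, np.nan)
    return np.nanmean(records, axis=0)         # average only accepted samples

def piecewise_linear_detrend(x, n_segments=8):
    """Remove a linear trend fitted independently on each segment."""
    out = np.asarray(x, dtype=float).copy()
    for seg in np.array_split(np.arange(out.size), n_segments):
        t = seg - seg[0]
        slope, intercept = np.polyfit(t, out[seg], 1)
        out[seg] -= slope * t + intercept
    return out
```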
This paper describes a data transmission method using a cyclic redundancy check and inaudible frequencies. The proposed method uses inaudible high frequencies from 18 kHz to 22 kHz generated via the inner speaker of smart devices. The performance of the proposed method is evaluated through data transmission tests between a smart book and a smartphone. The test results confirm that the proposed method can send 32 bits of data in an average of 235 ms, that the transmission success rate reaches 99.47%, and that the error detection rate of the cyclic redundancy check is 0.53%.
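A minimal sketch of the two ingredients the method combines: a CRC over a 32-bit payload and data-carrying tones in the 18-22 kHz band. The CRC width, tone mapping, and symbol duration below are illustrative assumptions, not the paper's protocol.

```python
# Sketch: CRC-protected payload encoded as near-ultrasonic tones.
import binascii
import numpy as np

payload = (0xDEADBEEF).to_bytes(4, "big")   # 32-bit payload
crc = binascii.crc32(payload) & 0xFFFF      # 16-bit check value (assumed width)

fs = 44100
sym_len = int(0.005 * fs)                   # 5 ms per symbol (assumed)
t = np.arange(sym_len) / fs

def tone_for_bit(bit):
    # Map 0 -> 18 kHz, 1 -> 20 kHz; both inaudible to most listeners.
    f = 18000 if bit == 0 else 20000
    return np.sin(2 * np.pi * f * t)

bits = [(int.from_bytes(payload, "big") >> i) & 1 for i in range(31, -1, -1)]
bits += [(crc >> i) & 1 for i in range(15, -1, -1)]
signal = np.concatenate([tone_for_bit(b) for b in bits])
```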
Following Bessembinder and Seguin, trading volume is separated into expected and unexpected components. Meanwhile, realized volatility is divided into continuous and discontinuous jump components. We conduct an empirical study of the relationship between the trading volume components and various realized volatility measures using 1-min high frequency data on Shanghai copper and aluminum futures, and we also investigate the asymmetry of the volatility-volume relationship. The results show a strong positive correlation between volatility and trading volume when realized volatility and its continuous component are considered, whereas the relationship between trading volume and the discontinuous jump component is ambiguous. Both expected and unexpected trading volume have a positive influence on volatility. Furthermore, unexpected trading volume, which is caused by the arrival of new information, has a larger influence on price volatility. The findings also show that an asymmetric volatility-volume relationship indeed exists, which can be interpreted by the fact that trading volume has more explanatory power for positive realized semivariance than for negative realized semivariance. The influence of a positive trading volume shock on volatility is larger than that of a negative shock, which reflects strong arbitrage in the Chinese copper and aluminum futures markets.
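A minimal sketch of the quantities involved: realized variance and bipower variation computed from 1-min returns (their difference proxies the jump component), and an ARMA fit that splits volume into expected (fitted) and unexpected (residual) parts. The model order and simulated data are assumptions.

```python
# Sketch: realized-volatility decomposition and volume decomposition.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(2)
r = rng.normal(0, 0.001, 240)            # 1-min log returns, one trading day
volume = rng.lognormal(10, 0.3, 250)     # daily trading volume series

rv = np.sum(r**2)                                           # realized variance
bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))   # bipower variation
jump = max(rv - bv, 0.0)                 # discontinuous jump component

fit = ARIMA(np.log(volume), order=(1, 0, 1)).fit()
expected = fit.fittedvalues              # expected (predictable) volume
unexpected = np.log(volume) - expected   # surprise component (new information)
```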
Background: Accurate mapping of tree species is highly desired in the management and research of plantation forests, whose ecosystem services are currently under threat. Time-series multispectral satellite images, e.g., from Landsat-8 (L8) and Sentinel-2 (S2), have proven useful in mapping general forest types, yet we do not know quantitatively how their spectral features (e.g., red-edge) and the temporal frequency of data acquisition (e.g., 16-day vs. 5-day) contribute to mapping plantation forests to the species level. Moreover, it is unclear to what extent the fusion of L8 and S2 improves tree species mapping of northern plantation forests in China. Methods: We designed three sets of classification experiments (i.e., single-date, multi-date, and spectral-temporal) to evaluate the performance of L8 and S2 data for mapping keystone timber tree species in northern China. We first used seven pairs of L8 and S2 images to evaluate how well L8 and S2 key spectral features separate these tree species across key growing stages. We then extracted spectral-temporal features from all available images at different temporal frequencies of data acquisition (i.e., L8 time series, S2 time series, and the fusion of L8 and S2) to assess the contribution of image temporal frequency to the accuracy of tree species mapping in the study area. Results: 1) S2 outperformed L8 images in all classification experiments, with or without the red-edge bands (0.4%-3.4% and 0.2%-4.4% higher for overall accuracy and macro-F1, respectively); 2) NDTI (the ratio of SWIR1 minus SWIR2 to SWIR1 plus SWIR2) and Tasseled Cap coefficients were the most important features in all classifications, and for the time-series experiments the spectral-temporal features of red-band-related vegetation indices were most useful; 3) increasing the temporal frequency of data acquisition improved the overall accuracy of tree species mapping by up to 3.2% (from 90.1% using single-date imagery to 93.3% using the S2 time series), yet similar overall accuracies were achieved using the S2 time series (93.3%) and the fusion of S2 and L8 (93.2%). Conclusions: This study quantifies the contributions of L8 and S2 spectral and temporal features to mapping keystone tree species of northern plantation forests in China and suggests that, for mapping tree species in China's northern plantation forests, the benefit of increasing the temporal frequency of data acquisition can saturate quickly after using only two images from key phenological stages.
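As a small illustration, the NDTI feature highlighted in the results can be computed directly from the two SWIR bands (for S2, bands 11 and 12, a common band assignment); the reflectance values below are made up.

```python
# Sketch: NDTI = (SWIR1 - SWIR2) / (SWIR1 + SWIR2), per the abstract.
import numpy as np

swir1 = np.array([0.21, 0.25, 0.30])   # S2 band 11 reflectance (illustrative)
swir2 = np.array([0.15, 0.20, 0.28])   # S2 band 12 reflectance (illustrative)
ndti = (swir1 - swir2) / (swir1 + swir2)
print(ndti)
```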
In this paper we consider the problem of testing for long memory in a continuous time process based on high frequency data. We provide two test statistics to distinguish between a semimartingale and a fractional integral process with jumps, where the integral is driven by a fractional Brownian motion with long memory. The small-sample performance of the statistics is demonstrated by means of simulation studies. A real data analysis shows that the fractional integral process with jumps can capture the long memory of some financial data.
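The paper's two test statistics are not reproduced here; as a rough, assumption-laden illustration of detecting long memory, the sketch below estimates a Hurst-type exponent by the aggregated-variance method, where H near 0.5 is consistent with a semimartingale-like process and H well above 0.5 suggests long memory.

```python
# Sketch: aggregated-variance Hurst estimate (illustrative, not the paper's test).
import numpy as np

def aggregated_variance_hurst(x, scales=(1, 2, 4, 8, 16, 32)):
    vars_ = []
    for m in scales:
        blocks = x[: x.size // m * m].reshape(-1, m).mean(axis=1)
        vars_.append(blocks.var())
    # Var of block means scales as m^(2H-2), so the log-log slope is 2H - 2.
    slope, _ = np.polyfit(np.log(scales), np.log(vars_), 1)
    return 1 + slope / 2

rng = np.random.default_rng(3)
print(aggregated_variance_hurst(rng.normal(size=10000)))  # ~0.5 for iid noise
```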
This study examines the use of high frequency data in finance, including volatility estimation and jump tests. High frequency data allow the construction of model-free volatility measures for asset returns. Realized variance is a consistent estimator of quadratic variation under mild regularity conditions. Other variation concepts, such as power variation and bipower variation, are useful and important for analyzing high frequency data when jumps are present. High frequency data can also be used to test for jumps in asset prices; we discuss three jump tests: the bipower variation test, the power variation test, and the variance swap test. The presence of market microstructure noise complicates the analysis of high frequency data, so the survey introduces several robust methods of volatility estimation and jump testing in the presence of such noise. Finally, some applications of jump tests in asset pricing are discussed.
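A minimal sketch of the bipower-variation jump test idea (in the spirit of Barndorff-Nielsen and Shephard): realized variance is compared with jump-robust bipower variation, studentized by a tripower-quarticity estimate. The scaling below is simplified and the constants should be treated as illustrative.

```python
# Sketch of an RV-vs-BV jump statistic; large positive values indicate jumps.
import math
import numpy as np

def rv_bv_jump_stat(r):
    n = r.size
    rv = np.sum(r**2)
    bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))
    # Quarticity estimated via tripower variation (one common choice).
    mu43 = 2 ** (2 / 3) * math.gamma(7 / 6) / math.gamma(1 / 2)
    tq = n * mu43 ** -3 * np.sum(
        (np.abs(r[2:]) * np.abs(r[1:-1]) * np.abs(r[:-2])) ** (4 / 3))
    theta = (np.pi / 2) ** 2 + np.pi - 5
    return (rv - bv) / np.sqrt(theta * max(tq, 1e-18) / n)

rng = np.random.default_rng(4)
print(rv_bv_jump_stat(rng.normal(0, 0.001, 390)))  # 1-min returns, one day
```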
In this paper, we construct and implement a new architecture and learning method for a customized hybrid RBF neural network for forecasting high frequency time series data. The hybridization is carried out using two approaches. In the first, the ARCH (autoregressive conditional heteroscedasticity)-GARCH (generalized ARCH) methodology is applied. The second modeling approach is based on an RBF (radial basis function) neural network using a Gaussian activation function with the cloud concept. Using both methods is useful because there is no prior knowledge about the relationship between the inputs to the system and its output. Both approaches are merged into one framework to predict the final forecast values. The question arises whether non-linear methods such as neural networks can help model any non-linearities inherent in the estimated statistical model. We also test a customized version of the RBF combined with a machine learning method based on the SVM learning system. The proposed novel approach is applied to high frequency data of the BUX stock index time series. Our results show that the proposed approach achieves better forecast accuracy on the validation dataset than most available techniques.
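A minimal sketch of the Gaussian RBF component of such a hybrid: k-means selects the centres, Gaussian activations form the hidden layer, and the output weights are fitted by least squares. The GARCH and cloud-concept parts are omitted; the width, centre count, and data are assumptions.

```python
# Sketch: Gaussian RBF network fitted by k-means centres + linear least squares.
import numpy as np
from sklearn.cluster import KMeans

def fit_rbf(X, y, n_centres=10, width=1.0):
    centres = KMeans(n_clusters=n_centres, n_init=10).fit(X).cluster_centers_
    d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    Phi = np.exp(-(d / width) ** 2)              # Gaussian activations
    w, *_ = np.linalg.lstsq(Phi, y, rcond=None)  # linear output weights
    return centres, w

def predict_rbf(X, centres, w, width=1.0):
    d = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2)
    return np.exp(-(d / width) ** 2) @ w

rng = np.random.default_rng(5)
X = rng.normal(size=(500, 3))                    # e.g., lagged returns as features
y = np.sin(X[:, 0]) + 0.1 * rng.normal(size=500)
centres, w = fit_rbf(X, y)
print(predict_rbf(X[:5], centres, w))
```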
Increasing attention has been focused on the analysis of realized volatility, which can be treated as a proxy for the true volatility. In this paper, we study the potential use of realized volatility as a proxy in stochastic volatility model estimation. We estimate the leveraged stochastic volatility model using realized volatility computed from five popular methods across six sampling frequencies of transaction data (from 1 min to 60 min) based on the trust region method. The availability of realized volatility allows us to estimate the model parameters via maximum likelihood and thus avoids the computational challenge of high dimensional integration. Six stock indices are considered in the empirical investigation. We discover some consistent findings and interesting patterns in the empirical results. In general, a significant leverage effect is consistently detected at each sampling frequency, and volatility persistence becomes weaker at lower sampling frequencies.
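A simplified, assumption-heavy stand-in for the estimation idea: treat log realized volatility as an observable AR(1) proxy for the latent log-volatility and maximize a Gaussian likelihood with SciPy's trust-region optimizer. The paper's leveraged SV model is richer than this sketch.

```python
# Sketch: trust-region MLE for an AR(1) model of log realized volatility.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(6)
h = np.zeros(1000)
for t in range(1, 1000):                     # simulate persistent log-volatility
    h[t] = -0.5 + 0.95 * h[t - 1] + 0.2 * rng.normal()
log_rv = h + 0.1 * rng.normal(size=1000)     # noisy realized-volatility proxy

def neg_loglik(params):
    mu, phi, sigma = params
    e = log_rv[1:] - mu - phi * log_rv[:-1]
    return 0.5 * np.sum(e**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

res = minimize(neg_loglik, x0=[0.0, 0.5, 0.5], method="trust-constr",
               bounds=[(-5, 5), (-0.999, 0.999), (1e-4, 5)])
print(res.x)  # phi is the volatility-persistence estimate
```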
This paper presents an approach to designing proportional-integral-derivative controllers for induction machines using measurements. Most control methods developed for induction machines are based on mathematical models. Due to the complex dynamics of induction machines, identified models are often unable to describe their behaviour perfectly, so system performance is limited by the quality of the identified model. Developing control methods that do not require a system model is therefore advantageous. Here, we propose an approach that uses frequency response data to design controllers directly. The main idea is to find controller parameters such that the closed-loop frequency response fits a desired frequency response. Its main advantage is that errors associated with the modelling process are avoided; moreover, the control design process does not depend on the order and complexity of the plant. A practical application to induction machines illustrates the efficacy of the proposed approach.
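A minimal sketch of the data-driven idea: pick PID gains so that the closed loop formed with the measured frequency response matches a desired reference model. The plant response and reference model below are synthetic stand-ins for measured induction-machine data.

```python
# Sketch: fit PID gains to match a desired closed-loop frequency response.
import numpy as np
from scipy.optimize import least_squares

w = np.logspace(-1, 2, 200)          # frequency grid (rad/s)
s = 1j * w
G = 1.0 / (s**2 + 2 * s + 1)         # stand-in for the measured plant response
T_des = 1.0 / (0.2 * s + 1)          # desired closed-loop response (assumed)

def residual(p):
    kp, ki, kd = p
    C = kp + ki / s + kd * s         # ideal PID frequency response
    T = C * G / (1 + C * G)          # resulting closed-loop response
    err = T - T_des
    return np.concatenate([err.real, err.imag])  # fit real and imaginary parts

sol = least_squares(residual, x0=[1.0, 1.0, 0.1])
print(sol.x)  # kp, ki, kd
```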
The chlorophyll-a concentration data obtained through remote sensing are important for a wide range of scientific concerns. However, cloud cover and limitations of chlorophyll-a inversion algorithms lead to data loss, which critically limits studies of the spatial-temporal patterns of chlorophyll-a concentration in response to marine environment changes. Whether the commonly used operational chlorophyll-a concentration products offer the best data coverage frequency, highest accuracy, best applicability, and greatest robustness at different scales remains debatable. Therefore, in the present study, four commonly used operational multi-sensor, multi-algorithm fusion products were compared and validated through statistical analysis of measured data available at multiple spatial and temporal scales. The results revealed that, in terms of spatial distribution, the products generated by the averaging method (Chl1-AV/AVW) and the GSM model (Chl1-GSM) presented relatively high data coverage frequency in Case I water regions and extremely low or no coverage in estuarine coastal zones and inland water regions; the product generated by the neural network algorithm (Chl2) presented high data coverage frequency in estuarine coastal Case II water regions; and the product generated by the OC5 algorithm (ChlOC5) presented high data coverage frequency in both Case I water regions and turbid Case II water regions. In terms of absolute precision, the Chl1-AV/AVW and Chl1-GSM products performed better in Case I water regions, and the Chl2 product performed well only in Case II estuarine coastal zones while showing large errors in Case I water regions. The ChlOC5 product presented higher precision in both Case I and Case II water regions, with better and more stable performance in both compared to the other products.
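A minimal sketch of the two evaluation ideas used in such comparisons: per-pixel data coverage frequency over a product's time series, and match-up validation statistics (here, log-space bias and RMSE) against in-situ measurements. The arrays are synthetic stand-ins for the gridded products.

```python
# Sketch: coverage frequency and match-up validation for a chl-a product.
import numpy as np

rng = np.random.default_rng(7)
product = rng.lognormal(0, 0.5, size=(365, 50, 50))   # daily chl-a grid
product[rng.random(product.shape) < 0.6] = np.nan     # cloud/algorithm gaps

coverage = np.mean(~np.isnan(product), axis=0)        # valid-data frequency per pixel

insitu = rng.lognormal(0, 0.5, 200)                   # matched in-situ values
retrieved = insitu * rng.lognormal(0, 0.2, 200)       # matched product values
log_err = np.log10(retrieved) - np.log10(insitu)
bias = np.mean(log_err)
rmse = np.sqrt(np.mean(log_err**2))
print(coverage.mean(), bias, rmse)
```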