In this paper, I describe the methods I used to create Xlets, which are Java applets developed for the IDTV environment, and the methods for online data retrieval and processing that I employed in these Xlets. The themes I chose for the IDTV Xlet applications are Earthquake and Tsunami Early Warning; Recent Seismic Activity Report; and Emergency Services. The online data for the Recent Seismic Activity Report application are provided by the Kandilli Observatory and Earthquake Research Institute (KOERI) of Bogazici University in Istanbul, while the online data for the Earthquake and Tsunami Early Warning and Emergency Services applications are provided by the Godoro website, which I used for storing (and retrieving from the Xlets) the earthquake and tsunami early warning simulation data and the DVB network subscriber data (such as name and address information) used in the Emergency Services (Police, Ambulance and Fire Department) application. I have focused on methodologies for using digital television as an efficient medium to convey timely and useful seismic warning information to the public, which forms the main research topic of this paper.
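The abstract does not include the Xlet code itself, but the retrieval-and-filtering logic it describes can be sketched. The semicolon-separated feed format below is invented for illustration; the real KOERI feed and the Godoro storage layout differ. A minimal Python sketch:

```python
from dataclasses import dataclass

@dataclass
class Quake:
    time: str
    latitude: float
    longitude: float
    depth_km: float
    magnitude: float
    region: str

def parse_feed(text):
    """Parse a simple semicolon-separated quake feed (hypothetical format)."""
    quakes = []
    for line in text.strip().splitlines():
        time, lat, lon, depth, mag, region = [f.strip() for f in line.split(";")]
        quakes.append(Quake(time, float(lat), float(lon),
                            float(depth), float(mag), region))
    return quakes

def significant(quakes, threshold=4.0):
    """Keep only events at or above the magnitude threshold, strongest first."""
    return sorted((q for q in quakes if q.magnitude >= threshold),
                  key=lambda q: q.magnitude, reverse=True)

feed = """\
2024-01-05 03:12;40.71;29.12;7.2;4.6;Marmara Sea
2024-01-05 02:40;38.20;26.80;10.0;2.9;Aegean Sea
2024-01-04 23:55;37.05;28.55;5.4;5.1;Mugla
"""
events = significant(parse_feed(feed))   # strongest qualifying events first
```

An Xlet on the Set-Top Box would run this kind of filter on each refresh and render the surviving events on screen.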
This paper analyzes the advantages of legal digital currencies and explores their impact on bank big data practices. Drawing on bank big data collection and processing, it clarifies that legal digital currencies can enhance the efficiency of bank data processing, enrich data types, and strengthen data analysis and application capabilities. To meet future development needs, it is necessary to strengthen data collection management, enhance data processing capabilities, and innovate big data application models, providing references for bank big data practices and promoting the transformation and upgrading of the banking industry in the context of legal digital currencies.
The processing of measuring data plays an important role in reverse engineering. Based on grey system theory, we first propose some methods for processing measuring data in reverse engineering. The measured data usually contain some abnormalities. When the abnormal data are eliminated by filtering, blanks are created. Grey generation and the GM(1,1) model are used to create new data for these blanks. For the uneven data sequence created by measuring error, mean generation is used to smooth it, and then stepwise and smooth generations are used to improve the data sequence.
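The blank-filling step can be sketched in Python. This is a standard GM(1,1) implementation, not necessarily the authors' exact procedure; the sample sequence is invented, and numpy is assumed available:

```python
import numpy as np

def gm11_forecast(x0, steps=1):
    """Fit a GM(1,1) grey model to sequence x0 and forecast `steps` values,
    e.g. to fill a blank left after an abnormal point was filtered out."""
    x0 = np.asarray(x0, dtype=float)
    x1 = np.cumsum(x0)                    # accumulated generating operation
    z1 = 0.5 * (x1[1:] + x1[:-1])         # mean generation of adjacent x1 values
    B = np.column_stack([-z1, np.ones_like(z1)])
    a, b = np.linalg.lstsq(B, x0[1:], rcond=None)[0]

    def x1_hat(k):                        # fitted cumulative curve, 0-based index k
        return (x0[0] - b / a) * np.exp(-a * k) + b / a

    n = len(x0)
    return [x1_hat(k) - x1_hat(k - 1) for k in range(n, n + steps)]

# Points measured before a filtered-out blank (invented data, growth ratio ~1.1)
known = [2.0, 2.2, 2.42, 2.662]
fill = gm11_forecast(known)[0]            # candidate value for the blank
```

Because GM(1,1) fits an exponential to the accumulated sequence, it tracks smooth trends well while ignoring the removed outlier.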
There are multiple processes and corresponding parameters in steel production, and combinations of these comprise various process routes. Different steel products require distinct process routes due to variations in performance targets. Thus, how to accurately set each key process parameter in a given process route is an ongoing conundrum, because it not only requires a wealth of expert experience but also incurs additional costs from trial productions. In this paper, a new production design system for plate steels is proposed. The proposed system consists of methodology and function development. For the methodology, multi-task Elastic Net, clustering, classification, and other methods are used to design process routes. Furthermore, the results are expressed in the form of parameter confidence intervals, which are close to practical application scenarios. For function development, the steel plate process route design function is developed on the Process Intelligent Data Application System (PIDAS) intelligent big data platform. The results demonstrate the method's practical application value.
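The multi-task Elastic Net and clustering stages are beyond an abstract-level sketch, but the final step, expressing a recommended parameter as a confidence interval, is simple to illustrate. The rolling-temperature values below are invented, and the normal-approximation interval is an assumption about how such intervals might be formed:

```python
import math
import statistics

def parameter_interval(samples, z=1.96):
    """Recommend a process parameter as a confidence interval for its mean,
    computed from historical values of similar products (same cluster)."""
    mean = statistics.fmean(samples)
    sd = statistics.stdev(samples)             # sample standard deviation (n-1)
    half = z * sd / math.sqrt(len(samples))    # normal-approximation half-width
    return (mean - half, mean + half)

# Hypothetical rolling temperatures (deg C) from plates with similar targets
temps = [1148, 1152, 1150, 1149, 1151, 1150, 1153, 1147]
lo, hi = parameter_interval(temps)
```

An operator would then be shown the band (lo, hi) rather than a single setpoint, matching the paper's confidence-interval presentation.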
Digital broadcasting is a novel paradigm for next-generation broadcasting. Its goal is to provide not only better picture quality but also a variety of services that are impossible in traditional airwave broadcasting. One important factor in this new broadcasting environment is interoperability among broadcasting applications, since the environment is distributed. Therefore, broadcasting metadata becomes increasingly important, and one of the metadata standards for digital broadcasting is TV-Anytime metadata. TV-Anytime metadata is defined using XML Schema, so its instances are XML data. To achieve interoperability, a standard query language is also required, and XQuery is a natural choice. There has been some research on handling broadcasting metadata. In our previous study, we proposed a method for efficiently managing broadcasting metadata at a service provider. However, the environment of a Set-Top Box for digital broadcasting is constrained, with low cost and low specifications, so several considerations arise when applying general metadata-management approaches to the Set-Top Box. This paper proposes a method for efficiently managing broadcasting metadata on the Set-Top Box, along with a prototype metadata management system for evaluating our method. Our system consists of a storage engine to store the metadata and an XQuery engine to search the stored metadata, and it uses a special index for storing and searching. The two engines are designed independently of the hardware platform, so they can be used in any low-cost application that manages broadcasting metadata.
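TV-Anytime instances are XML, so a metadata query reduces to filtering element trees. The snippet below uses a simplified, TV-Anytime-flavoured document (the real schema uses namespaces and richer structure) and Python's standard ElementTree in place of a true XQuery engine:

```python
import xml.etree.ElementTree as ET

# Simplified, TV-Anytime-like program metadata (not the full standard schema)
doc = """
<ProgramInformationTable>
  <ProgramInformation programId="crid://bbc.co.uk/1">
    <Title>Evening News</Title><Genre>News</Genre>
  </ProgramInformation>
  <ProgramInformation programId="crid://bbc.co.uk/2">
    <Title>Ocean Life</Title><Genre>Documentary</Genre>
  </ProgramInformation>
</ProgramInformationTable>
"""

def titles_by_genre(xml_text, genre):
    """Emulate a simple XQuery-style filter: titles of programs in a genre."""
    root = ET.fromstring(xml_text)
    return [p.findtext("Title")
            for p in root.iter("ProgramInformation")
            if p.findtext("Genre") == genre]
```

On a real Set-Top Box the index described in the paper would replace this linear scan, but the query semantics are the same.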
With the advent of Industry 4.0, more and more investment casting enterprises are implementing production manufacturing systems, especially in the last two years. This paper summarizes three new common requirements of digital management in precision casting enterprises and puts forward three corresponding techniques: production process tracking card technology based on the main-sub card mode; workshop site production process handling technology based on barcodes; and equipment data integration technology. The paper then discusses in detail the principle, application, and effect of these technologies, to provide a reference for enterprises moving toward digital and intelligent casting.
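The abstract gives no data model for the main-sub tracking cards, so the following is a hypothetical minimal sketch: one main card per production order, sub-cards for individual mould trees, and statuses reported as barcodes are scanned at workshop stations:

```python
from dataclasses import dataclass, field

@dataclass
class SubCard:
    """Tracks one mould tree (or batch) through the workshop steps."""
    sub_id: str
    status: str = "created"

@dataclass
class MainCard:
    """Main production tracking card; sub-cards follow individual batches."""
    order_id: str
    subs: dict = field(default_factory=dict)

    def split(self, sub_id):
        """Issue a sub-card under this main card."""
        self.subs[sub_id] = SubCard(sub_id)
        return self.subs[sub_id]

    def report(self, sub_id, status):
        """Record a status, e.g. scanned from a barcode at a station."""
        self.subs[sub_id].status = status

    def complete(self):
        """The main card closes only when every sub-card is finished."""
        return all(s.status == "finished" for s in self.subs.values())

card = MainCard("PO-1001")
card.split("T1")
card.split("T2")
card.report("T1", "finished")
partial = card.complete()      # still open: T2 unfinished
card.report("T2", "finished")
done = card.complete()
```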
To improve the human-physical-virtual coordination and integration of the digital twin workshop, 3D visual monitoring and human-computer interaction for digital twin workshops were studied. First, a novel 6D model of the 3D visualization interactive system for digital twin workshops is proposed. Because the traditional 5D digital twin model ignores the importance of human-computer interaction, a new dimension, the user terminal, was added. A hierarchical real-time data-driven mapping model for the workshop production process is then proposed. Moreover, a real-time data acquisition method for the industrial Internet of Things based on OPC UA (object linking and embedding for process control unified architecture) is proposed. Based on the 6D model of the system, the process of creating a 3D visualization virtual environment based on virtual reality is introduced, in addition to a data-driven process based on the data management cloud platform. Finally, the 6D model was validated using the blade rotor test workshop as the object, and a 3D visualization interactive system was developed. The results show that the system is more transparent, real-time, data-driven, and efficient, and that it promotes human-physical-virtual coordination and integration, which has practical significance for developing digital twin workshops.
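The OPC UA transport itself is hardware- and server-specific, but the hierarchical data-driven mapping can be modelled as a tree in which device readings propagate up to workshop-level views. This is a hypothetical sketch of the idea, not the paper's implementation:

```python
class Node:
    """One level of the workshop hierarchy: workshop, cell, device, ..."""
    def __init__(self, name):
        self.name, self.children, self.values = name, {}, {}

    def child(self, name):
        """Get or create a child level."""
        return self.children.setdefault(name, Node(name))

    def update(self, point, value):
        """Record a data point reported at this level (e.g. via OPC UA)."""
        self.values[point] = value

    def snapshot(self):
        """Flatten the tree into path-keyed readings for the virtual workshop."""
        out = {f"{self.name}/{k}": v for k, v in self.values.items()}
        for c in self.children.values():
            out.update({f"{self.name}/{p}": v for p, v in c.snapshot().items()})
        return out

workshop = Node("workshop")
lathe = workshop.child("cell1").child("lathe")
lathe.update("spindle_rpm", 1200)
view = workshop.snapshot()     # what the 3D scene would bind against
```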
As a GIS tool, visibility analysis is used in many areas to evaluate both visible and non-visible places. Visibility analysis builds on a digital surface model describing the terrain morphology, including the position and shapes of all objects that can act as visibility barriers. However, some barriers, for example vegetation, may be permeable to a certain degree. Despite extensive research and use of visibility analysis in different areas, standard GIS tools do not take permeability into account. This article presents a new method to calculate visibility through partly permeable obstacles. The method is based on a quasi-Monte Carlo simulation with 100 iterations of the visibility calculation. Each iteration result represents 1% of vegetation permeability, so visibility behind vegetation obstacles can range from 1% to 100%. The main advantages of the method are greater accuracy of visibility results and easy implementation in any GIS software. Incorporating the proposed method into GIS software would facilitate work in many fields, such as architecture, archaeology, radio communication, and the military.
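The 100-iteration idea can be sketched for a single sight line. The model below is an assumption for illustration: the line of sight crosses some number of vegetation cells, each cell independently blocks the view with probability (1 − permeability), and the visibility estimate is the share of iterations in which the target stayed visible (plain pseudo-random draws stand in for the paper's quasi-Monte Carlo sequence):

```python
import random

def visibility_fraction(cells, permeability, iterations=100, seed=1):
    """Estimate visibility through partly permeable vegetation on one sight line.

    cells        -- number of vegetation cells the line of sight crosses
    permeability -- probability that a single cell lets the view through
    """
    rng = random.Random(seed)
    visible = 0
    for _ in range(iterations):
        # the target is seen only if every crossed cell is permeable this time
        if all(rng.random() < permeability for _ in range(cells)):
            visible += 1
    return visible / iterations

open_view = visibility_fraction(0, 0.5)    # no vegetation: always visible
dense = visibility_fraction(5, 0.2)        # thick vegetation: rarely visible
sparse = visibility_fraction(5, 0.95)      # light vegetation: usually visible
```

A raster implementation would repeat this per target cell, which is why the method ports easily to any GIS that already offers a binary viewshed.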
This paper reports an effort to develop an intelligent integration framework for digital progressive die design and manufacturing. Both data- and process-centric integration functions are provided by the framework, as if a special lightweight PDM/PLM (Product Data Management/Product Lifecycle Management) and WM (Workflow Management) system were embedded in the integrated engineering environment. A flexible integration approach based on the CAD (Computer-Aided Design) framework tenet is employed to rapidly build up the system, while the intrinsic characteristics of the process are comprehensively taken into account. Introduction of this integration framework would greatly improve the dynamic performance of the overall progressive die design and manufacturing process.
Propelled by the rise of artificial intelligence, cloud services, and data center applications, next-generation, low-power, local-oscillator-less, digital-signal-processing-free (DSP-free), short-reach coherent optical communication has become an increasingly prominent area of research in recent years. Here, we demonstrate DSP-free coherent optical transmission by analog signal processing in a frequency-synchronous optical network (FSON) architecture, which supports polarization multiplexing and higher-order modulation formats. The FSON architecture allows the numerous laser sources of optical transceivers within a data center to be quasi-synchronized by means of a tree-distributed homology architecture. In conjunction with our proposed pilot-tone-assisted Costas loop for an analog coherent receiver, we achieve a record dual-polarization 224-Gb/s 16-QAM 5-km mismatch transmission with reset-free carrier phase recovery in the optical domain. Our proposed DSP-free analog coherent detection system based on the FSON is a promising solution for next-generation, low-power, high-capacity coherent data center interconnects.
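The pilot-tone-assisted analog Costas loop itself is implemented in hardware; as a rough illustration of the underlying phase-recovery principle only, here is a minimal digital simulation of a classic BPSK Costas loop (noiseless and first-order, far simpler than the paper's 16-QAM receiver):

```python
import numpy as np

rng = np.random.default_rng(0)
symbols = rng.integers(0, 2, 2000) * 2 - 1     # BPSK symbols (+1/-1)
true_phase = 0.7                               # unknown carrier phase offset (rad)
rx = symbols * np.exp(1j * true_phase)         # received samples, noiseless

est, mu = 0.0, 0.1                             # loop phase estimate and gain
for s in rx:
    v = s * np.exp(-1j * est)                  # de-rotate by current estimate
    err = v.real * v.imag                      # Costas error ~ 0.5*sin(2*residual)
    est += mu * err                            # first-order loop update
```

The product of in-phase and quadrature components cancels the data sign, so the loop locks onto the carrier phase without a training sequence, which is the property an analog Costas loop exploits.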
The increasing demand for unconventional oil and gas resources, especially oil shale, has highlighted the urgent need to develop rapid and accurate strata characterization methods. This paper presents the first case study examining the drilling process monitoring (DPM) method as a digital, accurate, cost-effective means of characterizing oil shale reservoirs in the Ordos Basin, China. The digital DPM method provides real-time in situ measurement of the relative variation in rock mechanical strength along the drill bit depth. Furthermore, it can give a refined rock quality designation based on the DPM zoning result (RQD(V_DPM)) and a strength-grade characterization at the site. Oil shale has high heterogeneity and low strata strength. The digital results are further compared and verified against manual logging, cored samples, and digital panoramic borehole cameras. The findings highlight the innovative potential of the DPM method in identifying oil shale reservoir zones along the drill bit depth. The digital results provide a better understanding of the oil shale in Tongchuan and of the potential for future oil shale exploration in other regions.
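The exact formula behind RQD(V_DPM) is not given in the abstract; as a baseline sketch of what the DPM zoning feeds into, here is the classic rock quality designation and the usual descriptive grades (Deere's thresholds):

```python
def rqd(piece_lengths_m, run_length_m, threshold_m=0.1):
    """Classic rock quality designation: percentage of a core run made up
    of intact pieces at least 10 cm long."""
    sound = sum(p for p in piece_lengths_m if p >= threshold_m)
    return 100.0 * sound / run_length_m

# Deere's descriptive grades, checked from best to worst
GRADES = [(90, "excellent"), (75, "good"), (50, "fair"),
          (25, "poor"), (0, "very poor")]

def grade(rqd_percent):
    """Map an RQD percentage to its descriptive strength grade."""
    return next(label for cutoff, label in GRADES if rqd_percent >= cutoff)

# Piece lengths (m) over a 1.0 m run: only pieces >= 0.1 m count
value = rqd([0.25, 0.08, 0.15, 0.05, 0.3], 1.0)
```

In the DPM workflow the "pieces" are inferred from drilling-rate zoning rather than physical core, but the designation and grading logic is of this form.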
Increasing volumes of non-traditional stable isotope data have brought new opportunities to gain important insights into geochemical and planetary processes. However, there is a worrisome trend of isotopic data being interpreted with subjectively chosen statistical approaches. This communication summarizes the rules for calculating the mean, standard deviation, and relative standard deviation of a population, as well as error propagation and significant digits. These rules should be followed when reporting geochemical data, especially isotope ratios. Using two examples, I show that statistics matters in isotopic data interpretation.
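The rules the communication summarizes can be made concrete. Below, the sample standard deviation uses n−1 degrees of freedom, and the ratio uncertainty follows standard propagation for independent errors; the example numbers are invented:

```python
import math
import statistics

def summarize(values):
    """Mean, sample standard deviation (n-1), and relative standard
    deviation (percent) of replicate measurements."""
    m = statistics.fmean(values)
    sd = statistics.stdev(values)
    return m, sd, 100.0 * sd / m

def ratio_error(a, sa, b, sb):
    """Uncertainty of r = a/b for independent errors:
    sr = |r| * sqrt((sa/a)**2 + (sb/b)**2)."""
    r = a / b
    return r, abs(r) * math.sqrt((sa / a) ** 2 + (sb / b) ** 2)

m, sd, rsd = summarize([1.002, 0.998, 1.000, 1.004, 0.996])
r, sr = ratio_error(10.0, 0.1, 5.0, 0.05)
```

Reporting sd and rsd with the right degrees of freedom, and propagating errors through every ratio, removes the subjectivity the communication warns about.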
A data processing method was proposed for eliminating the end restraint in triaxial tests of soil. A digital image processing method was used to calculate the local deformations and local stresses for any region on the surface of triaxial soil specimens. The principle and implementation of this digital image processing method were introduced, as well as the calculation method for the local mechanical properties of soil specimens. Comparisons were made between test results calculated from data for the entire specimen and for local regions, and it was found that deformations were more uniform in the middle region than in the specimen as a whole. To quantify the non-uniform character of the deformation, non-uniformity coefficients of strain were defined and calculated. Traditional and end-lubricated triaxial tests were conducted under the same conditions to investigate the effect of using local-region data in deformation calculations on eliminating the end restraint of specimens. Statistical analysis of all test results showed that, for the tested soil specimens of 39.1 mm × 80 mm, using the middle 35 mm region of traditional specimens in data processing eliminated end restraint more effectively than end lubrication did. Furthermore, the local data analysis in this paper was validated through comparisons with test results from other researchers.
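The paper's exact definition of the non-uniformity coefficient is not in the abstract; the sketch below assumes a coefficient of variation of the image-derived local strains, and all marker values are invented:

```python
import statistics

def axial_strain(initial_len, current_len):
    """Local axial strain of a gauge region from image-tracked marker
    spacing (compression taken as positive)."""
    return (initial_len - current_len) / initial_len

def non_uniformity(local_strains):
    """Assumed non-uniformity coefficient: coefficient of variation
    (population std / mean) of local strains over the specimen surface."""
    m = statistics.fmean(local_strains)
    return statistics.pstdev(local_strains) / m

e_mid = axial_strain(80.0, 76.0)                   # one local gauge region
nu = non_uniformity([0.050, 0.052, 0.048, 0.060])  # strains over several regions
```

Under this reading, a smaller coefficient for the middle 35 mm region than for the whole specimen would express exactly the "more uniform in the middle" observation.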
A sixteen-tree method for data compression of bilevel images is described. The method has high efficiency, loses no information during compression, and is easy to implement.
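The abstract gives no details of the sixteen-tree scheme; the sketch below assumes the natural reading, recursive subdivision of a square bilevel image into 4 × 4 = 16 sub-blocks, with uniform blocks emitted as single symbols, which is lossless by construction:

```python
def encode(block):
    """Recursively encode a square bilevel block whose side is a power of 4."""
    flat = [p for row in block for p in row]
    if all(v == flat[0] for v in flat):
        return ["W" if flat[0] == 0 else "B"]   # uniform block: one symbol
    out, s = ["M"], len(block) // 4             # mixed block: descend into 16
    for i in range(4):
        for j in range(4):
            out += encode([row[j*s:(j+1)*s] for row in block[i*s:(i+1)*s]])
    return out

def decode(stream, size):
    """Rebuild the image from the symbol stream produced by encode()."""
    it = iter(stream)
    def build(s):
        sym = next(it)
        if sym != "M":
            return [[0 if sym == "W" else 1] * s for _ in range(s)]
        q = s // 4
        rows = [[0] * s for _ in range(s)]
        for i in range(4):
            for j in range(4):
                sub = build(q)
                for r in range(q):
                    rows[i*q + r][j*q:(j+1)*q] = sub[r]
        return rows
    return build(size)

img = [[0] * 4 for _ in range(4)]
img[1][2] = 1                 # a single black pixel forces full subdivision here,
code = encode(img)            # but large uniform regions collapse to one symbol
restored = decode(code, 4)
```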
In this paper, a holographic storage scheme for multimedia data storage and retrieval based on digital signal processing (DSP) is designed. A communication model for the holographic storage system is obtained by analogy with a traditional communication system, and many characteristics of holographic storage are embodied in this model. Several new DSP methods, including two-dimensional (2-D) shifting interleaving, encoding and decoding of modulation-array (MA) codes, and soft-decision decoding, are then proposed and employed in the system. The experimental results show that these measures can effectively reduce the influence of noise. A segment of multimedia data, including video and audio, was successfully retrieved after holographic storage using these techniques.
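Of the DSP measures listed, 2-D shifting interleaving is the easiest to sketch. The exact scheme is not specified in the abstract; a common choice, assumed here, cyclically shifts each row of a data page by its row index so that a 2-D burst of errors lands on widely separated positions after deinterleaving:

```python
def interleave(page):
    """Cyclically shift row r of the data page left by r positions."""
    out = []
    for r, row in enumerate(page):
        k = r % len(row)
        out.append(row[k:] + row[:k])
    return out

def deinterleave(page):
    """Inverse transform: shift row r right by r positions."""
    out = []
    for r, row in enumerate(page):
        k = r % len(row)
        out.append(row[-k:] + row[:-k] if k else row[:])
    return out

page = [[1, 2, 3], [4, 5, 6], [7, 8, 9]]
roundtrip = deinterleave(interleave(page))
```

A localized blob of noise on the recorded hologram therefore corrupts at most one symbol per codeword after deinterleaving, which is what makes the MA code and soft decision effective.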
A key requirement of today's fast-changing business outcome and innovation environment is the ability of organizations to adapt dynamically in an effective and efficient manner. Becoming a data-driven decision-making organization plays a crucially important role in addressing such adaptation requirements. The notion of "data democratization" has emerged as a mechanism with which organizations can address data-driven decision-making process issues and cross-pollinate data in ways that uncover actionable insights. We define data democratization as an attitude focused on curiosity, learning, and experimentation for delivering trusted data for trusted insights to a broad range of authorized stakeholders. In this paper, we propose a general indicator framework for data democratization by highlighting success factors that should not be overlooked in today's data-driven economy. In this practice-based research, these enablers are grouped into six broad building blocks: 1) ethical guidelines, business context, and value; 2) data leadership and data culture; 3) data literacy and business knowledge; 4) data wrangling, trustworthiness, and standardization; 5) sustainable data platform, access, and analytical tools; 6) intelligent data governance and privacy. As an attitude, once planned and built, data democratization must be maintained. The utility of the approach is demonstrated through a case study of a Cameroon-based start-up company with ongoing data analytics projects. Our findings advance the concept of data democratization and contribute to the free flow of data with trust.
Advances in technology require upgrades in the law. One such area involves data brokers, which have thus far gone unregulated. Data brokers use artificial intelligence to aggregate information into data profiles about individual Americans, derived from consumer use of the internet and connected devices. These data profiles are then sold for profit. Government investigators use a legal loophole to purchase this data instead of obtaining a search warrant, which the Fourth Amendment would otherwise require. Consumers have lacked a reasonable means to fight or correct the information data brokers collect. Americans may not even be aware of the risks of data aggregation, which upends the test of reasonable expectations used in a search-warrant analysis. Data aggregation should be controlled and regulated, which is the direction some privacy laws take. Legislatures must step forward to safeguard against shadowy data-profiling practices, whether abroad or at home. In the meantime, courts can modify their search-warrant analysis by including data privacy principles.
Funding for the investment casting study: financially supported by the National Science & Technology Key Projects of Numerical Control (2012ZX04012-011) and the National High-tech R&D Program (863 Program) (2013031003).
Funding for the digital twin workshop study: the National Natural Science Foundation of China (No. 51875332) and the Capacity Building Projects of Some Local Universities of Shanghai Science and Technology Commission (No. 18040501600).
Funding for the visibility analysis study: this work was financially supported by project 133/2016/RPP-TO-1/b, "Teaching of advanced techniques for geodata processing for follow-up study of geoinformatics".
Funding for the DSP-free coherent optical transmission study: supported by the National Natural Science Foundation of China (Grant Nos. 62405250 and 62471404), the China Postdoctoral Science Foundation (Grant No. 2024M762955), the Key Project of Westlake Institute for Optoelectronics (Grant No. 2023GD003), and the Optical Communication and Sensing Laboratory, School of Engineering, Westlake University.
Funding: Supported by grants from the Research Grants Council of the Hong Kong Special Administrative Region, China (Grant No. HKU 7137/03E), the National Natural Science Foundation of China (Grant No. 41977248), and the Strategic Priority Research Program of the Chinese Academy of Sciences (Grant No. XDB10030100).
Abstract: The increasing demand for unconventional oil and gas resources, especially oil shale, has highlighted the urgent need to develop rapid and accurate strata characterization methods. This paper presents the first case applying the drilling process monitoring (DPM) method as a digital, accurate, cost-effective way to characterize oil shale reservoirs in the Ordos Basin, China. The digital DPM method provides real-time in situ testing of the relative variation in rock mechanical strength along the drill bit depth. Furthermore, it can give a refined rock quality designation based on the DPM zoning result (RQD(V_DPM)) and a strength-grade characterization at the site. Oil shale has high heterogeneity and low strata strength. The digital results are further compared and verified against manual logging, cored samples, and digital panoramic borehole cameras. The findings highlight the innovative potential of the DPM method in identifying the zones of an oil shale reservoir along the drill bit depth. The digital results provide a better understanding of the oil shale in Tongchuan and the potential for future oil shale exploration in other regions.
Abstract: Increasing volumes of non-traditional stable isotope data have brought new opportunities to gain important insights into geochemical and planetary processes. However, there is a worrisome trend of interpreting isotopic data with statistical approaches chosen subjectively. This communication summarizes the rules for calculating the mean, standard deviation, and relative standard deviation of a population, as well as for error propagation and significant digits. These rules should be followed when reporting geochemical data, especially isotope ratios. Using two examples, I show that statistics matters in isotopic data interpretation.
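The reporting rules summarized above can be made concrete with a short sketch. The replicate ratio values are hypothetical, and the quotient formula is the standard first-order propagation for independent uncertainties:

```python
import math

# hypothetical replicate isotope-ratio measurements
ratios = [0.51262, 0.51259, 0.51263, 0.51260, 0.51261]

n = len(ratios)
mean = sum(ratios) / n
# sample standard deviation (n - 1 in the denominator)
sd = math.sqrt(sum((x - mean) ** 2 for x in ratios) / (n - 1))
# 2SE of the mean, and relative standard deviation in percent
se2 = 2 * sd / math.sqrt(n)
rsd_percent = 100 * sd / mean

# error propagation for a quotient r = a / b with independent uncertainties:
# (sigma_r / r)^2 = (sigma_a / a)^2 + (sigma_b / b)^2
a, sa = 10.0, 0.1   # hypothetical measured value and 1-sigma uncertainty
b, sb = 2.0, 0.05
r = a / b
sr = r * math.sqrt((sa / a) ** 2 + (sb / b) ** 2)
```

Rounding the reported uncertainty to one or two significant digits, and the mean to the same decimal place, then follows the significant-digit rules the communication describes.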
Funding: Supported by the Major State Basic Research Development Program of China ("973" Program, No. 2010CB731502).
Abstract: A data processing method was proposed for eliminating the end restraint in triaxial tests of soil. A digital image processing method was used to calculate the local deformations and local stresses for any region on the surface of triaxial soil specimens. The principle and implementation of this digital image processing method were introduced, as well as the calculation method for local mechanical properties of soil specimens. Comparisons were made between the test results calculated from the data of both the entire specimen and local regions, and it was found that the deformations were more uniform in the middle region than over the entire specimen. In order to quantify the non-uniform characteristic of deformation, non-uniformity coefficients of strain were defined and calculated. Traditional and end-lubricated triaxial tests were conducted under the same conditions to investigate the effect of using local-region data for deformation calculation on eliminating the end restraint of specimens. After statistical analysis of all test results, it was concluded that for the tested soil specimens with a size of 39.1 mm × 80 mm, using the middle 35 mm region of traditional specimens in data processing was more effective at eliminating end restraint than end lubrication. Furthermore, the local data analysis in this paper was validated through comparisons with the test results of other researchers.
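The abstract does not give the definition of the non-uniformity coefficient of strain; one plausible choice (an assumption here, not the paper's stated formula) is the coefficient of variation of the local strains measured along the specimen:

```python
import numpy as np

# hypothetical local axial strains (%) measured in bands along a specimen
# via image-based displacement tracking
local_strains = np.array([4.8, 5.1, 5.0, 5.2, 6.0, 6.4])

# assumed definition: non-uniformity coefficient = std / mean
# (coefficient of variation of the local strains)
cu = local_strains.std(ddof=0) / local_strains.mean()
```

Under this definition a perfectly uniform deformation gives a coefficient of zero, and end restraint, which concentrates deformation away from the specimen ends, raises it.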
Abstract: A sixteen-tree method for data compression of bilevel images is described. This method has high efficiency, incurs no information loss during compression, and is easy to implement.
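The abstract does not give the coding details; a plausible lossless reconstruction of a "sixteen tree" (an assumption here) is a recursive decomposition in which a uniform block collapses to one symbol and a mixed block splits into a 4 × 4 grid of sixteen sub-blocks:

```python
import numpy as np

def sixteen_tree_encode(block, out):
    """Recursively encode a bilevel block: uniform blocks collapse to one
    symbol; mixed blocks split into sixteen sub-blocks; small mixed blocks
    are stored raw. Lossless by construction."""
    if block.min() == block.max():
        out.append(('U', int(block.flat[0])))   # uniform: a single value
        return
    n = block.shape[0]
    if n <= 4:
        out.append(('R', block.copy()))          # leaf: raw bits
        return
    out.append(('S', None))                      # split marker
    s = n // 4
    for i in range(4):
        for j in range(4):
            sixteen_tree_encode(block[i*s:(i+1)*s, j*s:(j+1)*s], out)

def sixteen_tree_decode(symbols, n):
    """Rebuild the n-by-n bilevel image from the symbol stream."""
    img = np.zeros((n, n), dtype=np.uint8)
    it = iter(symbols)
    def fill(top, left, size):
        tag, val = next(it)
        if tag == 'S':
            s = size // 4
            for i in range(4):
                for j in range(4):
                    fill(top + i*s, left + j*s, s)
        else:  # 'U' scalar or 'R' raw block both assign directly
            img[top:top+size, left:left+size] = val
    fill(0, 0, n)
    return img
```

On typical bilevel images (large uniform white regions), most blocks collapse to single 'U' symbols, which is where the compression comes from.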
Funding: This work was supported by the National Fund for Fundamental Key Projects (No. 973G19990330) and the National Natural Science Foundation of China (No. 69977005).
Abstract: In this paper, a holographic storage scheme for multimedia data storage and retrieval based on digital signal processing (DSP) is designed. A communication model for the holographic storage system is obtained by analogy with a traditional communication system, and many characteristics of holographic storage are embodied in this model. Several new DSP methods, including two-dimensional (2-D) shifting interleaving, encoding and decoding of a modulation-array (MA) code, and a soft-decision method, are proposed and employed in the system. The experimental results show that these measures can effectively reduce the influence of noise. A segment of multimedia data, including video and audio, is retrieved successfully after holographic storage by using these techniques.
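The 2-D shifting interleaving idea can be sketched as follows; the row-dependent cyclic shift below is a minimal illustration, not the paper's exact interleaver:

```python
import numpy as np

def shift_interleave(page, step=1):
    """Cyclically shift row r of a 2-D data page by r * step columns,
    so that a localized noise blob is spread across many codewords."""
    return np.stack([np.roll(row, r * step) for r, row in enumerate(page)])

def shift_deinterleave(page, step=1):
    """Undo the interleaving by applying the opposite shifts."""
    return np.stack([np.roll(row, -r * step) for r, row in enumerate(page)])
```

Deinterleaving before error-correction decoding disperses a burst of corrupted pixels on the stored hologram page across many rows, turning one unrecoverable burst into several correctable single errors.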
Abstract: A key requirement of today's fast-changing business outcome and innovation environment is the ability of organizations to adapt dynamically in an effective and efficient manner. Becoming a data-driven decision-making organization plays a crucial role in addressing such adaptation requirements. The notion of "data democratization" has emerged as a mechanism with which organizations can address data-driven decision-making process issues and cross-pollinate data in ways that uncover actionable insights. We define data democratization as an attitude focused on curiosity, learning, and experimentation for delivering trusted data for trusted insights to a broad range of authorized stakeholders. In this paper, we propose a general indicator framework for data democratization by highlighting success factors that should not be overlooked in today's data-driven economy. In this practice-based research, these enablers are grouped into six broad building blocks: 1) ethical guidelines, business context, and value; 2) data leadership and data culture; 3) data literacy and business knowledge; 4) data wrangling, trustworthiness, and standardization; 5) sustainable data platform, access, and analytical tools; 6) intelligent data governance and privacy. As an attitude, once it is planned and built, data democratization will need to be maintained. The utility of the approach is demonstrated through a case study of a Cameroon-based start-up company with ongoing data analytics projects. Our findings advance the concept of data democratization and contribute to data free flow with trust.
Abstract: Advances in technology require upgrades in the law. One such area involves data brokers, which have thus far gone unregulated. Data brokers use artificial intelligence to aggregate information into data profiles about individual Americans, derived from consumer use of the internet and connected devices. These data profiles are then sold for profit. Government investigators use a legal loophole to purchase this data instead of obtaining a search warrant, which the Fourth Amendment would otherwise require. Consumers have lacked a reasonable means to fight or correct the information data brokers collect. Americans may not even be aware of the risks of data aggregation, which upends the test of reasonable expectations used in a search-warrant analysis. Data aggregation should be controlled and regulated, which is the direction some privacy laws take. Legislatures must step forward to safeguard against shadowy data-profiling practices, whether abroad or at home. In the meantime, courts can modify their search-warrant analysis by including data privacy principles.