Abstract: Based on 1:50,000 basic geographic information data, digital elevation, longitude, latitude, slope, aspect, and other terrain databases of Benxi City (China) were established with ArcGIS 9.2. A township boundary database was established by digitizing maps, and the township information was corrected against field reality. Taking a detailed simulation of the spatial distribution of annual mean temperature as an example, the paper reports the application of the geographic information database to the analysis of climatic resources.
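The slope and aspect layers mentioned above are standard derivatives of the elevation grid. As a hedged illustration of that step, the sketch below computes both from a toy DEM with NumPy; the 3x3 elevation values and the 50 m cell size are assumptions for demonstration, and plain central differences stand in for the Horn method that ArcGIS's Surface tools actually use.

```python
# Sketch: deriving slope and aspect grids from a DEM array.
# DEM values and cell size are illustrative, not data from the study.
import numpy as np

def slope_aspect(dem, cellsize):
    # Gradients along rows (north->south) and columns (west->east).
    dz_dy, dz_dx = np.gradient(dem, cellsize)
    # Slope: angle of steepest descent, in degrees.
    slope = np.degrees(np.arctan(np.hypot(dz_dx, dz_dy)))
    # Aspect: downslope direction, degrees clockwise from north.
    aspect = np.degrees(np.arctan2(-dz_dx, dz_dy)) % 360.0
    return slope, aspect

dem = np.array([[100.0, 100.0, 100.0],
                [110.0, 110.0, 110.0],
                [120.0, 120.0, 120.0]])  # elevation rises southward
slope, aspect = slope_aspect(dem, 50.0)  # 50 m cells, assumed
```

With this uniform south-rising surface, every cell has a slope of arctan(10/50) ≈ 11.3 degrees and a north-facing aspect of 0 degrees; sign and orientation conventions differ between GIS packages, so they should be checked against the target tool.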
Abstract: In contemporary society, information asymmetry in talent markets has become increasingly prominent. On one hand, companies and candidates act strategically on the information available to them, and both may commit fraud, progressively degrading the market. On the other hand, previous scholars have studied the problem from the enterprise's perspective and proposed remedies based on improving technology and standardized mechanisms, which cannot resolve information asymmetry thoroughly. Consequently, this research proposes that the market can reduce information asymmetry by establishing a personnel information database and related platforms, which has great practical significance for achieving optimal market allocation and saving cost. At the same time, this study examines the problem of information asymmetry at a fundamental level, which is important for enriching the related theory. Specific models were constructed from two perspectives, the enterprise's and the candidate's, and the two models were then integrated into one larger system. Finally, this research consolidated all related information into a single system, which benefits the optimal allocation of human resources under the constraints of the market environment.
Abstract: The article seeks to identify the major authors in the field of information seeking behavior via social network analysis. This is accomplished through a literature review and a graphic map showing the seven most productive coauthors in the field. Based on these seven authors' work, five probable research directions for information seeking behavior are discerned and presented.
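The core of the social-network-analysis step described above is a co-authorship graph. As a minimal sketch under assumed inputs, the following builds such a graph from hypothetical paper author lists (the names and papers are illustrative, not the article's data) and ranks authors by degree, i.e. by the number of distinct coauthors.

```python
# Sketch: co-authorship network from paper author lists, ranked by degree.
# Papers and author names below are hypothetical examples.
from collections import defaultdict
from itertools import combinations

papers = [
    ["Wilson", "Ford"],
    ["Wilson", "Ellis"],
    ["Kuhlthau", "Vakkari"],
    ["Savolainen", "Vakkari"],
    ["Wilson", "Vakkari"],
]

coauthors = defaultdict(set)  # author -> set of distinct coauthors
for authors in papers:
    for a, b in combinations(authors, 2):  # every coauthor pair on a paper
        coauthors[a].add(b)
        coauthors[b].add(a)

# Rank authors by degree centrality (number of distinct coauthors).
ranking = sorted(coauthors, key=lambda a: len(coauthors[a]), reverse=True)
```

Degree is the simplest centrality; a fuller analysis of the kind the article performs would also consider tie strength (repeated collaborations) and betweenness to find authors who bridge subcommunities.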
Abstract: Quantum private query (QPQ) protocols have emerged as a pivotal innovation in quantum cryptography, offering users the ability to retrieve specific database information while preserving privacy. However, the practical implementation of these protocols faces significant security challenges, particularly from joint-measurement attacks in multi-round query scenarios. A recent study by Liu et al. addresses these vulnerabilities through a comprehensive analysis and proposes innovative solutions, marking a critical advancement in the field [1].
Funding: the National Key Research and Development Project (2019YFA0708300), the Strategic Cooperation Technology Projects of CNPC and CUPB (ZLZX 2020-03), the CNPC Science and Technology Innovation Fund (No. 2022DO02-0308), and the Distinguished Young Foundation of the National Natural Science Foundation of China (No. 52125401) for their financial support.
Abstract: As global oil exploration ventures into deeper and more complex territories, drilling bit wear and damage have emerged as significant constraints on drilling efficiency and safety. Despite the publication of official bit wear evaluation standards by the International Association of Drilling Contractors (IADC), the lack of quantitative, scientific evaluation techniques means that bit wear assessment still relies heavily on engineers' experience. Consequently, building a standardized database of drilling bit information to underpin the study of bit wear mechanisms and support optimal design remains challenging. Therefore, an efficient, quantitative evaluation of bit wear is crucial for optimizing bit performance and improving penetration efficiency. This paper introduces an automatic, standardized workflow for the quantitative evaluation of bit wear and the design of a comprehensive bit information database. First, a method for acquiring images of worn bits at the drilling site was developed. Next, computer-vision-based wear classification and grading models were established to determine bit status: the classification model locates and classifies the bit cutters, while the grading model quantifies the extent of wear, enabling fully automatic evaluation. Additionally, bit wear evaluation software was designed, integrating all functions needed to assess bit wear in accordance with IADC standards. Finally, a drilling bit database was created by integrating bit wear data, well-logging data, mud-logging data, and basic bit data. This workflow represents a novel approach to collecting and analyzing drilling bit information at drilling sites. It can facilitate the creation of a large-scale information database covering the entire lifecycle of drilling bits, marking the inception of intelligent analysis, design, and manufacture of drilling bits and thereby enhancing performance in challenging drilling conditions.
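The grading model described above maps measured wear onto the IADC dull-grade scale, which runs from 0 (no wear) to 8 (cutting structure fully worn), reported separately for the inner rows (inner two-thirds of the bit radius) and outer rows. The sketch below illustrates that mapping for per-cutter height measurements; the 13 mm cutter height and the measured values are assumptions for demonstration, and the simple linear rounding is a common reading of the scale, not the paper's exact model.

```python
# Sketch: mapping measured cutter wear to the IADC 0-8 dull-grade scale.
# Cutter heights below are illustrative assumptions, not data from the paper.

def iadc_wear_grade(original_height_mm, remaining_height_mm):
    """Grade one cutter: 8 * worn fraction, rounded to an integer."""
    worn = 1.0 - remaining_height_mm / original_height_mm
    worn = min(max(worn, 0.0), 1.0)  # clamp measurement noise
    return round(8 * worn)

# Inner and outer rows are graded separately, each as the rounded
# average of its cutters' grades.
inner = [iadc_wear_grade(13.0, h) for h in (11.0, 10.5, 12.0)]
outer = [iadc_wear_grade(13.0, h) for h in (6.5, 7.0, 6.0)]
inner_grade = round(sum(inner) / len(inner))
outer_grade = round(sum(outer) / len(outer))
```

In this assumed example the lightly worn inner rows grade 1 while the outer rows, having lost roughly half their cutter height, grade 4, which matches the typical pattern of heavier gauge-side wear.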
Funding: the Deanship of Scientific Research at Majmaah University for funding this work under Project No. RGP-2019-26.
Abstract: Software development is transitioning from centralized version control systems (CVCSs) such as Subversion to decentralized version control systems (DVCSs) such as Git, owing to the former's lower efficiency in branching, merging, time and space usage, offline commits and builds, repository handling, etc. Git holds a 77% share of total VCS usage, followed by Subversion with 13.5%. Most software companies are migrating from Subversion to Git, yet only a few migration tools are available, and these lack many features: identifying empty directories as a pre-migration check, failover capability when migration is interrupted by network failure or disk-space exhaustion, and detailed report generation as a post-migration step. In this work, a holistic, proactive, and novel approach is presented for pre-, during-, and post-migration validation from Subversion to Git, with many scripts developed and executed at run time over various projects to overcome the limitations of existing migration tools. During pre-migration, none of the available migration tools can fetch the empty directories of Subversion, which results in an incomplete migration; scripts were developed for pre-migration validation and migration preparation that overcome this problem. Experimentation was conducted in the SRLC Software Research Lab, Chicago, USA. During migration, if the process stops or breaks because of a lost network connection or any other reason, available tools cannot resume from the point where they left off; scripts were developed that keep the migration revision history in an elastic cache so that migration can resume from that point after a failure. During post-migration, none of the available version-control migration tools generates a detailed report covering the total size of the source Subversion repositories, the total volume of data migrated to the destination Git repositories, the total number of pools migrated, the time taken for migration, the number of Subversion users notified by email, etc.; scripts were developed for this purpose as part of the post-migration process.
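The empty-directory problem above arises because Git tracks only files, so empty Subversion directories simply vanish on migration. As a hedged sketch of the pre-migration check, not the paper's actual scripts, the function below walks a working copy, finds empty directories (skipping `.svn` metadata), and plants the conventional `.gitkeep` placeholder so Git will preserve them; the placeholder name and the demo layout are assumptions.

```python
# Sketch: pre-migration check that makes empty SVN directories
# survivable in Git by planting placeholder files.
import os
import tempfile

def keep_empty_dirs(working_copy, placeholder=".gitkeep"):
    """Plant a placeholder in each empty directory; return their paths."""
    planted = []
    for root, dirs, files in os.walk(working_copy):
        dirs[:] = [d for d in dirs if d != ".svn"]  # skip SVN metadata
        if not dirs and not files:  # an empty leaf directory
            open(os.path.join(root, placeholder), "w").close()
            planted.append(root)
    return planted

# Demo on an assumed layout: an empty "branches" dir and a non-empty "trunk".
demo = tempfile.mkdtemp()
os.makedirs(os.path.join(demo, "branches"))
os.makedirs(os.path.join(demo, "trunk"))
open(os.path.join(demo, "trunk", "README"), "w").close()
planted = keep_empty_dirs(demo)
```

In a real pipeline this would run against `svn ls -R` output or a fresh checkout before `git svn` conversion, and the report of planted paths would feed the pre-migration validation log.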