This study analyzes the User Interface (UI) and User Experience (UX) of information systems that provide local government information. The systems analyzed are the Local Administrative Comprehensive Information Disclosure System (Zheripan), the Integrated Local Financial Disclosure System (Qinching Online), and the Local Regulations Information System (12348 Zhejiang Legal Network). The Local Administrative Comprehensive Information Disclosure System offers public service and personnel information, the Integrated Local Financial Disclosure System provides financial information, and the Local Regulations Information System offers legal information as its main content. The analysis framework used three elements: objective data, psychological factors, and heuristic evaluation. The objective data analysis shows that approximately 70% of visits to Zheripan and Qinching Online arrive through search, and time spent on the homepage is short. In contrast, about 70% of visits to the 12348 Zhejiang Legal Network are direct visits, with users browsing multiple pages with a clear purpose. In terms of data provision, Zheripan provides two types of data in three formats, Qinching Online offers 28 types of data in five formats, and the 12348 Zhejiang Legal Network provides one type of information in a single format. The psychological factor analysis found that all three websites had a number of menus suited to short-term cognitive capacity; however, only one of the sites had a layout that considered the user's eye movement. Finally, the heuristic evaluation revealed that most of the evaluation criteria were not met. While the designs are relatively simple and follow standards, feedback for users, error prevention, and help options were lacking. Moreover, user-specific usability was low, and the systems remain at the information-providing level. Based on these findings, both short-term and long-term improvement measures for creating an interactive system beyond simple information disclosure are proposed.
In Japanese e-government policy, called 'e-Japan', the administrative document management system functions as an information searching system. However, this system has also generated a problem: it does not fully function as a means of information sharing within a governmental agency. The purpose of this research is therefore to find how the administrative document management system can support information sharing in administrative organizations. To this end, this paper first considers the current status and its problems, and secondly proposes an idea and constructs information systems using the administrative official Website. This is the method and approach of this research. In conclusion, the proposed information system functions as an information sharing support system.
Fraudulent websites are an important carrier tool for telecom fraud. At present, criminals can use artificial intelligence generative content technology to quickly generate fraudulent website templates and build fraudulent websites in batches. Accurate identification of fraudulent websites will effectively reduce the risk of public victimization. Therefore, this study developed a fraudulent website template identification method based on DOM-structure extraction of website fingerprint features, which addresses the problems of single-dimension identification, low accuracy, and insufficient generalization ability in current fraudulent website template identification. The method uses an improved SimHash algorithm to traverse the DOM tree of a webpage, extract website node features, calculate the weight of each node, and obtain the fingerprint feature vector of the website through dimensionality reduction. Finally, the random forest algorithm is used to optimize the training features for the best combination of parameters. The method automatically extracts fingerprint features from websites and identifies website template ownership based on these features. Experimental analysis showed that the method achieves a classification accuracy of 89.8% and demonstrates superior recognition.
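The SimHash fingerprinting step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the DOM-node feature names, their weights, and the use of MD5 as the per-feature hash are all assumptions for the example.

```python
import hashlib

def simhash(features, bits=64):
    """Compute a SimHash fingerprint from (feature, weight) pairs.

    Each feature is hashed to a `bits`-bit value; for every bit position,
    the feature's weight is added if that bit is 1 and subtracted if it
    is 0. The sign of each accumulated position gives the fingerprint bit.
    """
    v = [0] * bits
    for feat, weight in features:
        h = int(hashlib.md5(feat.encode("utf-8")).hexdigest(), 16) & ((1 << bits) - 1)
        for i in range(bits):
            v[i] += weight if (h >> i) & 1 else -weight
    fp = 0
    for i in range(bits):
        if v[i] > 0:
            fp |= 1 << i
    return fp

def hamming(a, b):
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Hypothetical weighted DOM-node features of three pages; pages sharing
# most heavy features should typically land at nearby fingerprints.
page_a = [("div.header", 3), ("ul.nav", 2), ("form.login", 5), ("div.footer", 1)]
page_b = [("div.header", 3), ("ul.nav", 2), ("form.login", 5), ("div.banner", 1)]
page_c = [("table.news", 4), ("img.logo", 1), ("div.article", 5)]

print(hamming(simhash(page_a), simhash(page_b)))  # similar templates: usually small
print(hamming(simhash(page_a), simhash(page_c)))  # unrelated templates: usually larger
```

In the full method, such fingerprints would become the feature vectors fed to the random forest classifier.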
Website fingerprinting (WF) attacks can reveal information about the websites users browse by de-anonymizing encrypted traffic. Traditional website fingerprinting attack models, which focus solely on a single spatial feature, are inefficient in training time. When confronted with concept drift, they suffer a sharp drop in attack accuracy within a short period because of their reliance on extensive, outdated training data. To address these problems, this paper proposes a parallel website fingerprinting attack (APWF) that incorporates an attention mechanism and consists of an attack model and a fine-tuning method. The APWF model adopts a parallel structure, fusing temporal features from both the front and the back of the fingerprint sequence with spatial features captured through channel attention enhancement, to improve attack accuracy. The fine-tuning method introduces isomorphic transfer learning and adjusts the model by freezing the optimal model weights and fine-tuning the parameters, so that only a small number of target samples are needed to adapt to web page changes. A series of experiments shows that the attack model achieves 83% accuracy with only 10 samples per category, a 30% improvement over the traditional attack model. Compared with baseline models, APWF improves accuracy while reducing time cost. After further fine-tuning of the frozen model, the method maintains 92.4% accuracy when 56 days separate the training data from the target data, only a 4% loss compared with an immediate attack, significantly improving the robustness and accuracy of the model in coping with concept drift.
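The channel-attention enhancement mentioned above can be illustrated with a small squeeze-and-excitation style sketch in NumPy. The feature-map shape, the random weights, and the two-layer gating network are assumptions for illustration; this is not the APWF architecture itself.

```python
import numpy as np

def channel_attention(x, w1, w2):
    """Squeeze-and-excitation style channel attention over a feature map
    shaped (channels, length): global-average "squeeze" per channel, a
    small two-layer "excitation" network, then per-channel reweighting."""
    squeeze = x.mean(axis=1)                       # (C,) average per channel
    hidden = np.maximum(0.0, w1 @ squeeze)         # ReLU bottleneck
    scores = 1.0 / (1.0 + np.exp(-(w2 @ hidden)))  # sigmoid gates in (0, 1)
    return x * scores[:, None]                     # reweight each channel

rng = np.random.default_rng(0)
C, L, r = 8, 32, 2             # channels, sequence length, reduction ratio
x = rng.normal(size=(C, L))    # stand-in for a traffic-trace feature map
w1 = rng.normal(size=(C // r, C)) * 0.1
w2 = rng.normal(size=(C, C // r)) * 0.1
y = channel_attention(x, w1, w2)
print(y.shape)  # (8, 32)
```

In a trained network the gates learn to amplify channels that carry discriminative traffic features and suppress the rest; here the weights are random, so only the mechanism is shown.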
In order to improve the accuracy and completeness of mining data records from the web, the concepts of isomorphic page and directory page and three algorithms are proposed. An isomorphic web page is a set of web pages that have a uniform structure, differing only in their main information. A web page that contains many links to isomorphic web pages is called a directory page. Algorithm 1 finds directory pages on a website using an adjacent-link similarity analysis method: it first sorts the links, then counts the links in each directory; if the count exceeds a given threshold, it finds the similar sub-page links in the directory and returns the results. A function for judging isomorphic web pages is also proposed. Algorithm 2 mines data records from an isomorphic page using a noise-information filter, based on the fact that the noise information is the same in two isomorphic pages while only the main information differs. Algorithm 3 mines data records from an entire website using spider technology. Experiments show that the proposed algorithms mine data records more completely than existing algorithms. Mining data records from isomorphic pages is an efficient method.
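Algorithm 2's core idea — content shared verbatim by two isomorphic pages is template noise, while the differing content is the main information — can be sketched as follows. The sample page lines are hypothetical.

```python
def filter_noise(page_a_lines, page_b_lines):
    """Given the text lines of two isomorphic pages, treat every line that
    appears verbatim in both as template noise and keep only the lines
    unique to each page (the main information)."""
    noise = set(page_a_lines) & set(page_b_lines)
    main_a = [ln for ln in page_a_lines if ln not in noise]
    main_b = [ln for ln in page_b_lines if ln not in noise]
    return main_a, main_b

# Two hypothetical isomorphic pages: same navigation and footer, different records.
page_a = ["Site News", "Home | About | Contact",
          "Widget X launched today", "Price: $19", "Copyright 2024"]
page_b = ["Site News", "Home | About | Contact",
          "Gadget Y recalled", "Price: $35", "Copyright 2024"]
print(filter_noise(page_a, page_b))
# (['Widget X launched today', 'Price: $19'], ['Gadget Y recalled', 'Price: $35'])
```

A real implementation would compare DOM subtrees rather than raw lines, but the filtering principle is the same.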
Submission: Before submitting the manuscript, authors should carefully read the "Instructions for Authors" and "Submission Walkthrough" available at the journal's official website http://www.keaipublishing.com/dcmed under the "Submission" menu. The manuscript should be accompanied by a cover letter from the author who will be responsible for correspondence. Peer review and refereeing are conducted online and anonymously.
People read many things online these days. It's an easy way to get a lot of information fast. They look at news, see posts, and watch videos. But how much of that information is true? Some things online are fake, so it's important to check the facts before you believe or share anything. You can ask people or look at other sources first. Check newspapers or official websites. Always think carefully before you believe something online.
To improve the efficiency of the management of large farms, a digitalized system of economic statistics is designed based on the Internet platform of the digitalized agricultural integrated system of Friendship Farm, the largest farm in the world. The system realizes data storage using Access database technology. A dynamic website system based on ASP technology implements the on-line inquiry of the statistical indexes of the agricultural economy and the yearly diagrams of those indexes. Furthermore, using principal component analysis, it provides the value of the comprehensive indicators of the farm's economic profits for every year and a trend chart of the comprehensive appraisal of economic development. An early-warning indicator boundary is decided based on the majority principle. The system realizes the farm's terminal data input with effective data-collecting channels and a normative gathering scope and system. This system breaks through the stand-alone database systems of earlier agricultural digitalization research to realize a database system in the Internet environment by integrating existing technologies in China. The system lays a foundation for further integrated research on the network platform of the digitalized agricultural integrated system at Friendship Farm.
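The principal-component appraisal step can be sketched roughly as follows. The yearly indicator names and values are invented for illustration, and weighting component scores by explained variance is one common convention, not necessarily the system's own.

```python
import numpy as np

def composite_index(x):
    """Principal-component composite score: standardize the indicator
    matrix (years x indicators), take the eigenvectors of the correlation
    matrix, and weight the component scores by explained variance."""
    z = (x - x.mean(axis=0)) / x.std(axis=0)
    corr = np.corrcoef(z, rowvar=False)
    eigval, eigvec = np.linalg.eigh(corr)        # returned in ascending order
    order = np.argsort(eigval)[::-1]             # reorder descending
    eigval, eigvec = eigval[order], eigvec[:, order]
    weights = eigval / eigval.sum()              # share of variance explained
    scores = z @ eigvec                          # component scores per year
    return scores @ weights                      # one composite score per year

# Hypothetical yearly indicators: output value, profit, yield (5 years x 3).
x = np.array([[100.0, 12.0, 3.1],
              [110.0, 15.0, 3.3],
              [121.0, 14.0, 3.4],
              [133.0, 18.0, 3.6],
              [150.0, 22.0, 3.9]])
idx = composite_index(x)
print(idx.shape)  # (5,)
```

Plotting `idx` against the years would give the kind of comprehensive-appraisal trend chart the abstract describes.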
With the rapid development of the Web, more and more Web databases are available for users to access. At the same time, job seekers often have difficulty first finding the right sources and then querying over them, so an integrated job search system over Web databases has become a Web application in high demand. Based on this consideration, we build a deep Web data integration system that supports unified access to multiple job Web sites as a job meta-search engine. In this paper, the architecture of the system is given first, and then the key components of the system are introduced.
This article explores China's urban employment dynamics with a particular focus on the city size effect. Big data derived from the largest recruitment website were used to examine the direct and indirect impacts of city size on employment demand using mediating and moderating models. We also investigated the roles of government and location factors, which have seldom been considered in the literature. Results showed that the concentration of new jobs across cities is higher than that of stock employment and population, implying a path-dependency mechanism of job creation and employment expansion. Meanwhile, the numerous job posts in inland central cities probably signal a more even distribution of employment in future China. Econometric models further verified the significant correlation between city size and job creation. Moreover, industrial diversity, fixed asset investment, and spatial location have heterogeneous effects on employment demand in cities of different sizes and different administrative levels. These results deepen our understanding of the crucial role of city size in urban employment growth and indicate the future trend of labor and population geography in China. Policy implications are proposed for job creation in cities in China and other developing countries.
Nowadays, an increasing number of web applications require identity registration. However, website registration behavior has not been thoroughly studied. We use the database provided by the Chinese Software Developer Network (CSDN) to provide a complete perspective on this question. We concentrate on three aspects: complexity, correlation, and preference. From these analyses, we draw the following conclusions. Firstly, a considerable number of users have not realized the importance of identification and use very simple identifications that can be attacked very easily. Secondly, there is a strong complexity correlation among the three parts of identification. Thirdly, the top three passwords that users like are 123456789, 12345678, and 11111111, and the top three email providers that they prefer are NETEASE, qq, and sina. We further provide some suggestions for improving the quality of user passwords.
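The preference analysis described above reduces to a counting exercise over the registration records. The records below are hypothetical stand-ins, not CSDN data.

```python
from collections import Counter

def top_passwords(records, n=3):
    """Rank the most common passwords in a list of
    (username, password, email) registration records."""
    return Counter(pw for _, pw, _ in records).most_common(n)

# Hypothetical registration records for illustration only.
records = [
    ("alice", "123456789", "alice@163.com"),
    ("bob",   "123456789", "bob@qq.com"),
    ("carol", "12345678",  "carol@sina.com"),
    ("dave",  "123456789", "dave@163.com"),
    ("erin",  "11111111",  "erin@qq.com"),
    ("frank", "12345678",  "frank@163.com"),
]
print(top_passwords(records))
# [('123456789', 3), ('12345678', 2), ('11111111', 1)]
```

The same `Counter` pattern applied to the email-domain field would yield the provider ranking the study reports.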
Fraudulent website images are a vital information carrier for telecom fraud. Efficient and precise recognition of fraudulent website images is critical to combating and dealing with fraudulent websites. Current research on image recognition of fraudulent websites works mainly at the level of image feature extraction and similarity study, and has such disadvantages as difficulty in obtaining image data, insufficient image analysis, and single identification types. This study develops a model based on the entropy method for image-leader decision and Inception-v3 transfer learning to address these disadvantages. The data-processing part of the model uses a breadth-first search crawler to capture the image data. The information in the images is then evaluated with the entropy method, image weights are assigned, and the image leader is selected. In model training and prediction, transfer learning of the Inception-v3 model is introduced into image recognition of fraudulent websites. Using the selected image leaders to train the model, multiple types of fraudulent websites are identified with high accuracy. Experiments show that this model has superior accuracy in recognizing images on fraudulent websites compared with other current models.
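The entropy-method weighting used to pick an image leader can be sketched with the standard entropy-weight method: criteria whose scores vary more across candidates carry more information and receive larger weights. The criteria and scores below are invented for illustration.

```python
import numpy as np

def entropy_weights(x):
    """Entropy-weight method over a (candidates x criteria) score matrix:
    criteria with more variation across candidates get larger weights."""
    p = x / x.sum(axis=0)                        # column-wise proportions
    n = x.shape[0]
    # Convention 0 * log(0) = 0, handled by masking zero proportions.
    plogp = np.where(p > 0, p * np.log(np.where(p > 0, p, 1.0)), 0.0)
    e = -plogp.sum(axis=0) / np.log(n)           # entropy of each criterion
    d = 1.0 - e                                  # degree of diversification
    return d / d.sum()

# Hypothetical scores of 4 candidate images on 3 criteria
# (e.g. resolution, text density, color richness) -- larger is better.
x = np.array([[0.9, 0.5, 0.52],
              [0.2, 0.5, 0.48],
              [0.8, 0.5, 0.50],
              [0.1, 0.5, 0.50]])
w = entropy_weights(x)
leader = int(np.argmax(x @ w))   # image with the highest weighted score
print(w.round(3), leader)
```

Note how the second criterion, identical for every candidate, receives zero weight: a constant criterion carries no information for choosing a leader.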
Feature analysis of fraudulent websites is of great significance to the combat, prevention, and control of telecom fraud crimes. To address the shortcomings of existing analytical approaches, i.e., their single dimension and vulnerability to anti-reconnaissance, this paper adopts Stacking, an ensemble learning algorithm, combines multiple modalities such as text, image, and URL, and proposes a multimodal fraudulent website identification method that ensembles heterogeneous models. Cross-validation is first used to train multiple, largely different base classifiers that are strong learners, such as a BERT model, a residual neural network (ResNet), and a logistic regression model. Classification of the text, image, and URL features is then performed respectively. The results of the base classifiers are taken as the input of the meta-classifier, whose output is eventually used as the final identification. The study indicates that the fusion method is more effective in identifying fraudulent websites than single-modal methods, increasing recall by at least 1%. In addition, deployment of the algorithm in a real Internet environment shows an improvement in identification accuracy of at least 1.9% compared with other fusion methods.
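The stacking arrangement — base classifiers per modality whose outputs feed a meta-classifier — can be sketched as follows. For brevity this sketch uses nearest-centroid base classifiers and scores the meta-model on its own training data, rather than the paper's BERT/ResNet bases with cross-validated out-of-fold predictions; all data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)

def centroid_scores(train_x, train_y, x):
    """Per-modality base classifier: score = difference of distances to
    the two class centroids (positive favours class 1, 'fraudulent')."""
    c0 = train_x[train_y == 0].mean(axis=0)
    c1 = train_x[train_y == 1].mean(axis=0)
    return np.linalg.norm(x - c0, axis=1) - np.linalg.norm(x - c1, axis=1)

def train_meta(z, y, lr=0.1, epochs=500):
    """Logistic-regression meta-classifier over base-classifier scores,
    trained by plain gradient descent."""
    zb = np.hstack([z, np.ones((len(z), 1))])     # add bias column
    w = np.zeros(zb.shape[1])
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(zb @ w)))
        w -= lr * zb.T @ (p - y) / len(y)
    return w

def meta_predict(z, w):
    zb = np.hstack([z, np.ones((len(z), 1))])
    return (zb @ w > 0).astype(int)

# Synthetic "text" and "URL" feature blocks for 200 sites (half fraudulent).
y = np.repeat([0, 1], 100)
text_f = rng.normal(size=(200, 5)) + y[:, None] * 1.5
url_f = rng.normal(size=(200, 5)) + y[:, None] * 1.0

# Level 0: one base classifier per modality; level 1: meta on their scores.
z = np.column_stack([centroid_scores(text_f, y, text_f),
                     centroid_scores(url_f, y, url_f)])
w = train_meta(z, y.astype(float))
acc = (meta_predict(z, w) == y).mean()
print(round(acc, 2))
```

The meta-classifier learns how much to trust each modality; in the paper's setting, cross-validated out-of-fold base predictions replace the in-sample scores used here.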
A high-quality website is crucial for a company to run a successful e-business. Technical maintainers constantly face the problem of locating the prime factors that affect the quality of their websites. In view of the complexity and fuzziness of BtoC websites, a quality diagnosis method based on a multi-attribute, multi-layer fuzzy comprehensive evaluation model covering all the quality factors is proposed. A simple example of diagnosing a famous domestic BtoC website shows the specific steps of this method and proves its validity. The process of the quality evaluation and diagnosis system is illustrated, and the computer program for diagnosis is given.
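A multi-layer fuzzy comprehensive evaluation reduces to composing weight vectors with membership matrices layer by layer; the sketch below uses the common weighted-average composition operator. The factors, grades, and numbers are hypothetical.

```python
import numpy as np

# First-level factor "page design", judged on 3 sub-factors; each row of R
# gives a sub-factor's membership in the grades (good, fair, poor).
R_design = np.array([[0.6, 0.3, 0.1],
                     [0.4, 0.4, 0.2],
                     [0.5, 0.3, 0.2]])
w_design = np.array([0.5, 0.3, 0.2])      # sub-factor weights (sum to 1)
B_design = w_design @ R_design            # weighted-average composition

# First-level factor "service", with 2 sub-factors.
R_service = np.array([[0.3, 0.4, 0.3],
                      [0.5, 0.4, 0.1]])
w_service = np.array([0.6, 0.4])
B_service = w_service @ R_service

# Top layer: compose the first-level results into the site-level verdict.
W = np.array([0.55, 0.45])                # layer weights for design, service
B = W @ np.vstack([B_design, B_service])
grade = ["good", "fair", "poor"][int(np.argmax(B))]
print(B.round(3), grade)
```

Diagnosis then works backwards: the sub-factor contributing most to a weak grade is the prime factor the maintainers should fix first.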
Agricultural product trading websites are not only an important way to realize agricultural informatization but also its main manifestation. Based on a preliminary understanding of the content and characteristics of China's agricultural product trading websites, this paper builds a scientific evaluation indicator system and objectively evaluates 50 typical agricultural product trading websites using a classification and grading method. The results show that the overall construction level of China's agricultural product trading websites is average, with obvious differences between regions; the lack of commercial website functions and the lag of informatization are the main factors restricting the development of agricultural product trading websites.
Phishing attacks are security attacks that affect not only individuals' or organizations' websites but also Internet of Things (IoT) devices and networks. The IoT environment is exposed to such attacks. Attackers may use thingbot software to disperse hidden junk emails that users do not notice. Machine learning, deep learning, and other methods have been used to design detection methods for these attacks; however, detection accuracy still needs to be enhanced. Optimization of an ensemble classification method for phishing website (PW) detection is proposed in this study. A Genetic Algorithm (GA) was used to optimize the proposed method by tuning the parameters of several ensemble Machine Learning (ML) methods, including Random Forest (RF), AdaBoost (AB), XGBoost (XGB), Bagging (BA), GradientBoost (GB), and LightGBM (LGBM). The optimized classifiers were then ranked to pick the best classifiers as the base of the proposed method. A PW dataset consisting of 4898 PWs and 6157 legitimate websites (LWs) was used for the experiments. As a result, detection accuracy was enhanced, reaching 97.16 percent.
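A GA-based parameter search of the kind described can be sketched as follows. The fitness function here is a smooth surrogate standing in for cross-validated ensemble accuracy (training real RF/XGBoost models is omitted for brevity), and the two parameters, their ranges, and the surrogate's peak near (300, 12) are all assumptions for illustration.

```python
import random

random.seed(42)

def fitness(n_estimators, max_depth):
    """Surrogate for cross-validated accuracy as a function of two
    hyperparameters; a real run would train and score the ensemble here."""
    return 1.0 - ((n_estimators - 300) / 500) ** 2 - ((max_depth - 12) / 20) ** 2

def mutate(ind):
    """Randomly perturb each gene, clamped to its allowed range."""
    n, d = ind
    return (min(500, max(10, n + random.randint(-50, 50))),
            min(30, max(2, d + random.randint(-3, 3))))

def crossover(a, b):
    return (a[0], b[1])  # swap one gene between the two parents

pop = [(random.randint(10, 500), random.randint(2, 30)) for _ in range(20)]
for _ in range(30):                                    # generations
    pop.sort(key=lambda ind: fitness(*ind), reverse=True)
    parents = pop[:10]                                 # truncation selection
    children = [mutate(crossover(random.choice(parents),
                                 random.choice(parents)))
                for _ in range(10)]
    pop = parents + children                           # elitism + offspring
best = max(pop, key=lambda ind: fitness(*ind))
print(best, round(fitness(*best), 3))
```

Repeating the search per ensemble method and ranking the tuned classifiers by fitness mirrors the selection step the abstract describes.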
The advent of the Internet has witnessed a revolution in the business world. One typical example is the emergence of the B2B website. The present paper looks at the B2B website, a conventionalized digital text, in terms of its communicative purposes, move features, and linguistic specialties, with the aim of presenting the generic structure of the B2B website and the principal linguistic features contributing to the realization of its communicative purposes. It is demonstrated that the B2B website is an instance of the promotional genres and has much in common with advertising English and "netspeak" in its lexico-grammatical features.
In this paper, we conduct research on big data and artificial-intelligence-aided decision-making mechanisms, with applications to the innovation of video websites' homemade (self-produced) programs. Homemade video programs give new-media platforms new possibilities for content production and give traditional media a breakthrough point in the Internet age. Producing homemade video programs helps reduce the demand for copyright purchases, lowers costs, avoids homogeneous competition, enriches advertising marketing, improves the profit model, organically combines content production and operation, and completes the strategic transformation. Building on these advantages, a website's homemade video programs can form a brand with greater brand influence. Our later research provides a literature survey for the related issues.
Funding (fraudulent website template identification study): This research is a phased achievement of the National Social Science Fund of China (23BGL272).
Funding (parallel website fingerprinting attack study): Supported by the National Defense Basic Scientific Research Program of China (No. JCKY2023602C026) and the Key Laboratory of Mobile Application Innovation and Governance Technology, Ministry of Industry and Information Technology (2023IFS080601-K).
Funding (farm economic statistics system study): The Key Technologies R&D Program of Heilongjiang Province (No. GB06B601).
Fund: Supported by the Natural Science Foundation of China (60573091, 60273018), the National Basic Research and Development Program of China (2003CB317000), and the Key Project of the Ministry of Education of China (03044).
Abstract: With the rapid development of the Web, more and more Web databases are available for users to access. At the same time, job seekers often have difficulty first finding the right sources and then querying over them, so an integrated job search system over Web databases has become a Web application in high demand. Based on this consideration, we built a deep Web data integration system that supports unified access to multiple job websites as a job meta-search engine. In this paper, the architecture of the system is given first, and then the key components of the system are introduced.
Fund: Major Project of the National Social Science Foundation of China, No. 20&ZD173.
Abstract: This article explored China's urban employment dynamics with a particular focus on the city size effect. Big data derived from the largest recruitment website were used to examine the direct and indirect impacts of city size on employment demand using mediating and moderating models. We also investigated the roles of government and location factors, which have seldom been considered in the literature. Results showed that the concentration degree of new jobs is higher than that of stock employment and population across cities, implying a path-dependency mechanism of job creation and employment expansion. Meanwhile, the numerous job posts in inland central cities are probably a symptom of a more even distribution of employment in future China. Econometric models further verified the significant correlation between city size and job creation. Moreover, industrial diversity, fixed-asset investment, and spatial location have heterogeneous effects on employment demand in cities of different sizes and different levels of administration. These results not only deepen our understanding of the crucial role of city size in urban employment growth but also demonstrate the future trend of labor and population geography in China. Policy implications are then proposed for job creation in cities of China and other developing countries.
Fund: Supported by the Foundation for the Key Program of the Ministry of Education, China under Grant No. 311007; the National Science Foundation Projects of China under Grants No. 61202079, No. 61170225, and No. 61271199; the Fundamental Research Funds for the Central Universities under Grant No. FRF-TP-09-015A; and the Fundamental Research Funds of Beijing Jiaotong University under Grant No. W11JB00630.
Abstract: Nowadays, an increasing number of web applications require identity registration. However, the behavior of website registration has never been thoroughly studied. We use the database provided by the Chinese Software Developer Network (CSDN) to provide a complete perspective on this research point. We concentrate on the following three aspects: complexity, correlation, and preference. From these analyses, we draw the following conclusions. Firstly, a considerable number of users have not realized the importance of identification and are using very simple identifications that can be attacked very easily. Secondly, there is a strong complexity correlation among the three parts of identification. Thirdly, the top three passwords that users favor are 123456789, 12345678, and 11111111, and the top three email providers they prefer are NetEase, QQ, and Sina. Further, we provide some suggestions to improve the quality of user passwords.
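The preference analysis above amounts to frequency counting over leaked credential records. A minimal sketch, with a tiny invented sample standing in for the CSDN dump (all records hypothetical):

```python
from collections import Counter

# Hypothetical (username, password, email) records standing in for the
# CSDN database analyzed in the paper.
records = [
    ("alice", "123456789", "alice@163.com"),
    ("bob",   "123456789", "bob@qq.com"),
    ("carol", "12345678",  "carol@sina.com"),
    ("dave",  "11111111",  "dave@163.com"),
    ("erin",  "123456789", "erin@qq.com"),
]

# Tally password choices and email providers across all records.
password_counts = Counter(pw for _, pw, _ in records)
provider_counts = Counter(email.split("@")[1] for _, _, email in records)

top_passwords = password_counts.most_common(3)
print(top_passwords)  # most frequent passwords first
```

On the real dataset the same two counters would directly reproduce the paper's top-three password and provider rankings.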
Fund: Supported by the National Social Science Fund of China (23BGL272).
Abstract: The fraudulent website image is a vital information carrier for telecom fraud. Efficient and precise recognition of fraudulent website images is critical to combating and dealing with fraudulent websites. Current research on image recognition of fraudulent websites is mainly carried out at the level of image feature extraction and similarity study, which has such disadvantages as difficulty in obtaining image data, insufficient image analysis, and a single identification type. This study develops a model based on the entropy method for image leader decision and Inception-v3 transfer learning to address these disadvantages. The data processing part of the model uses a breadth-search crawler to capture the image data. Then, the information in the images is evaluated with the entropy method, image weights are assigned, and the image leader is selected. In model training and prediction, transfer learning of the Inception-v3 model is introduced into image recognition of fraudulent websites. Using the selected image leaders to train the model, multiple types of fraudulent websites are identified with high accuracy. The experiments show that this model has superior accuracy in recognizing images on fraudulent websites compared with other current models.
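A minimal sketch of the entropy-method weighting and image-leader selection step, assuming each candidate image is described by a few positive feature values; the feature names and matrix are hypothetical, not the paper's actual features.

```python
import numpy as np

# Hypothetical feature matrix: rows are candidate images, columns are
# extracted features (e.g. colorfulness, text density, edge count).
F = np.array([
    [0.9, 0.2, 0.7],
    [0.4, 0.8, 0.6],
    [0.6, 0.5, 0.9],
], dtype=float)

# Entropy method: normalize each column to proportions, compute its
# information entropy, then weight columns by their (1 - entropy)
# "degree of divergence" — low-entropy features discriminate more.
P = F / F.sum(axis=0)
k = 1.0 / np.log(F.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)
weights = (1 - entropy) / (1 - entropy).sum()

# The "image leader" is the image with the highest weighted score;
# leaders then form the Inception-v3 transfer-learning training set.
scores = F @ weights
leader = int(np.argmax(scores))
```

Zero feature values would need a small epsilon before the logarithm; the toy matrix avoids that case.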
Fund: Supported by the Zhejiang Provincial Natural Science Foundation of China (Grant No. LGF20G030001), the Ministry of Public Security Science and Technology Plan Project (2022LL16), and the Key Scientific Research Projects of Agricultural and Social Development in Hangzhou in 2020 (202004A06).
Abstract: The feature analysis of fraudulent websites is of great significance to the combat, prevention, and control of telecom fraud crimes. Aiming to address the shortcomings of existing analytical approaches, i.e., their single dimension and vulnerability to anti-reconnaissance, this paper adopts Stacking, an ensemble learning algorithm, combines multiple modalities such as text, image, and URL, and proposes a multimodal fraudulent website identification method that ensembles heterogeneous models. Cross-validation is first used to train multiple largely different base classifiers that are strong learners, such as a BERT model, a residual neural network (ResNet), and a logistic regression model. Classification of the text, image, and URL features is then performed respectively. The results of the base classifiers are taken as the input of the meta-classifier, whose output is eventually used as the final identification. The study indicates that the fusion method is more effective in identifying fraudulent websites than single-modal methods, with recall increased by at least 1%. In addition, deployment of the algorithm in a real Internet environment shows an improvement in identification accuracy of at least 1.9% compared with other fusion methods.
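A minimal sketch of the stacking flow just described. The three base classifiers are toy logistic stand-ins (the paper uses BERT, ResNet, and logistic regression), and the meta-classifier's weights are fixed here rather than trained on cross-validated base predictions; all numbers are hypothetical.

```python
import numpy as np

# Toy stand-ins for the three modality-specific base classifiers: each
# maps its modality's (scalar) feature to a fraud probability.
def text_clf(x):  return 1 / (1 + np.exp(-(2 * x - 1)))
def image_clf(x): return 1 / (1 + np.exp(-(3 * x - 1.5)))
def url_clf(x):   return 1 / (1 + np.exp(-(x - 0.5)))

def stack_features(text_x, image_x, url_x):
    """Base-classifier outputs become the meta-classifier's input vector."""
    return np.array([text_clf(text_x), image_clf(image_x), url_clf(url_x)])

# Meta-classifier: a logistic model over the three base scores. In the
# paper this layer is trained on cross-validated base predictions.
meta_w = np.array([1.0, 1.5, 0.8])
def meta_predict(z, threshold=0.5):
    p = 1 / (1 + np.exp(-(meta_w @ z - 1.5)))
    return int(p >= threshold), p

# A website whose text, image, and URL features all look suspicious.
label, prob = meta_predict(stack_features(0.9, 0.8, 0.7))
```

The key design point carried over from the paper is that the meta-classifier sees only base-classifier outputs, never the raw modalities, so heterogeneous models can be fused without a shared feature space.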
Fund: Supported by the Key Discipline Project from the Science and Technology Committee of Shanghai (No. 04JC14009) and the Research Fund of Donghua University (No. 108 10 0044934).
Abstract: A high-quality website is crucial for a company to run a successful e-business. Technical maintainers are always faced with the problem of how to locate the prime factors that affect the quality of their websites. In view of the complexity and fuzziness of BtoC websites, a quality diagnosis method is proposed based on a multi-attribute, multi-layer fuzzy comprehensive evaluation model that includes all the quality factors. A simple example of diagnosing a famous domestic BtoC website shows the specific steps of this method and proves its validity. The process of the quality evaluation and diagnosis system is illustrated, and the computer program for diagnosis is given.
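A minimal sketch of one layer of such a fuzzy comprehensive evaluation, assuming three quality factors and four quality grades; the factor names, weights, and membership matrix are hypothetical, not the paper's.

```python
import numpy as np

# Hypothetical first-level factors for a BtoC website (content quality,
# ease of use, security). Each row of R gives that factor's membership
# degrees over the grades (excellent, good, fair, poor).
R = np.array([
    [0.5, 0.3, 0.1, 0.1],
    [0.2, 0.4, 0.3, 0.1],
    [0.1, 0.2, 0.4, 0.3],
])
W = np.array([0.5, 0.3, 0.2])   # factor weights, summing to 1

# Fuzzy comprehensive evaluation with the weighted-average operator:
# B_j = sum_i W_i * R_ij; the grade with the largest degree wins.
B = W @ R
grade = ["excellent", "good", "fair", "poor"][int(np.argmax(B))]
```

A multi-layer model repeats this step bottom-up: each sub-factor group's result vector B becomes one row of the next layer's membership matrix, which is how the diagnosis traces low overall quality back to specific factors.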
Fund: Supported by the Shandong Provincial Natural Science Foundation (ZR2011DM008).
Abstract: The agricultural product trading website is not only an important way to realize agricultural informatization but also its main manifestation. Based on a preliminary understanding of the content and characteristics of China's agricultural product trading websites, this paper builds a scientific evaluation indicator system and objectively evaluates 50 typical agricultural product trading websites using a classification and grading method. The results show that the overall construction level of China's agricultural product trading websites is middling, with obvious differences between regions; the lack of commercial functions on the websites and the lag in informatization are the main factors restricting the development of agricultural product trading websites.
Fund: This research was funded by the Scientific Research Deanship at the University of Ha'il, Saudi Arabia, through Project Number RG-20023.
Abstract: Phishing attacks are security attacks that affect not only individuals' or organizations' websites but may also affect Internet of Things (IoT) devices and networks. The IoT environment is particularly exposed to such attacks. Attackers may use thingbot software to disperse hidden junk emails that go unnoticed by users. Machine learning, deep learning, and other methods have been used to design detection methods for these attacks; however, there is still a need to enhance detection accuracy. Optimization of an ensemble classification method for phishing website (PW) detection is proposed in this study. A Genetic Algorithm (GA) was used to optimize the proposed method by tuning the parameters of several ensemble Machine Learning (ML) methods, including Random Forest (RF), AdaBoost (AB), XGBoost (XGB), Bagging (BA), GradientBoost (GB), and LightGBM (LGBM). This was accomplished by ranking the optimized classifiers to pick out the best classifiers as a base for the proposed method. A PW dataset made up of 4898 PWs and 6157 legitimate websites (LWs) was used for this study's experiments. As a result, detection accuracy was enhanced, reaching 97.16 percent.
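A minimal sketch of GA-based hyperparameter tuning of the kind described above, with a made-up fitness function standing in for cross-validated classifier accuracy; the parameter, its range, and the fitness peak are all hypothetical.

```python
import random

random.seed(0)

# Toy fitness standing in for cross-validated accuracy of an ensemble
# classifier as a function of one hyperparameter (e.g. tree count);
# it peaks at 200 in this made-up example.
def fitness(n_estimators):
    return 1.0 - abs(n_estimators - 200) / 500

def genetic_search(pop_size=10, generations=20):
    # Initial population: random hyperparameter values in the search range.
    pop = [random.randint(10, 500) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]            # selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = (a + b) // 2                  # crossover (average)
            if random.random() < 0.3:             # mutation
                child = max(10, min(500, child + random.randint(-50, 50)))
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = genetic_search()
```

In the study the same loop would wrap each ensemble method's full parameter vector, with fitness evaluated by actual cross-validation, and the tuned classifiers then ranked to select the base of the final method.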
Abstract: The advent of the Internet has witnessed a revolution in the business world. One typical example is the emergence of the B2B website. The present paper looks at the B2B website, a conventionalized digital text, in terms of its communicative purposes, move features, and linguistic specialties, with the aim of presenting the generic structure of the B2B website and the principal linguistic features contributing to the realization of its communicative purposes. It is demonstrated that the B2B website is an instance of the promotional genres and that it has much in common with advertising English and "netspeak" in its lexico-grammatical features.
Abstract: In this paper, we conduct research on big data and an artificial-intelligence-aided decision-making mechanism, with applications to the innovation of video websites' self-produced programs. Self-produced video programs give new-media video platforms new possibilities for content production and offer traditional media a breakthrough point in the Internet age. Producing programs in-house reduces the demand for copyright purchases, lowers costs, avoids homogeneous competition, enriches advertising marketing, improves the profit model, organically combines content production with operations, and completes the strategic transformation. Building on these advantages, a website's self-produced video programs can form a brand with strong influence. Our later research provides a literature survey of the related issues.