Abstract: Low visibility conditions, particularly those caused by fog, significantly affect road safety and reduce drivers' ability to see ahead clearly. Conventional approaches to this problem rely primarily on instrument-based and fixed-threshold theoretical frameworks, which lack adaptability and perform poorly under varying environmental conditions. To overcome these challenges, we propose a real-time visibility estimation model that leverages roadside CCTV cameras to monitor and identify visibility levels under different weather conditions. The proposed method begins by identifying specific regions of interest (ROI) in the CCTV images and extracts features such as the number of lines and contours detected within these regions. These features are then provided as input to the proposed hierarchical clustering model, which classifies them into different visibility levels without the need for predefined rules and threshold values. In the proposed approach, we used two distance metrics, namely dynamic time warping (DTW) and Euclidean distance, alongside the proposed hierarchical clustering model, and evaluated its performance in terms of numerous evaluation measures. The proposed model achieved an average accuracy of 97.81%, precision of 91.31%, recall of 91.25%, and F1-score of 91.27% using the DTW distance metric. We also conducted experiments with other deep learning (DL)-based models used in the literature and compared their performance with the proposed model. The experimental results demonstrate that the proposed model is more adaptable and consistent than the methods used in the literature. The proposed method provides drivers with real-time, accurate visibility information and enhances road safety during low visibility conditions.
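The abstract includes no code, but the clustering step it describes can be illustrated. Below is a minimal sketch of agglomerative clustering over per-ROI feature sequences with a hand-rolled DTW distance, assuming SciPy is available; the feature sequences and the three-cluster cut are hypothetical stand-ins for the paper's extracted line/contour counts.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

def dtw(a, b):
    """Dynamic time warping distance between two 1-D feature sequences."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]

# Hypothetical per-ROI feature sequences: counts of detected lines/contours over time.
sequences = [np.array(s, dtype=float) for s in
             [[42, 40, 39, 41], [44, 43, 41, 42],   # clear
              [15, 14, 16, 13], [12, 14, 13, 15],   # moderate fog
              [3, 2, 4, 3]]]                        # dense fog

# Pairwise DTW distances in condensed form, then average-linkage hierarchical clustering.
n = len(sequences)
dist = np.zeros((n, n))
for i in range(n):
    for j in range(i + 1, n):
        dist[i, j] = dist[j, i] = dtw(sequences[i], sequences[j])

labels = fcluster(linkage(squareform(dist), method="average"), t=3, criterion="maxclust")
print(labels)  # three visibility-level clusters, no hand-tuned thresholds
```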
Abstract: The performance of Wireless Sensor Networks (WSNs) is an important element of the Internet of Things (IoT), where the sensor nodes of current WSN-based IoT networks are attractive owing to their critical, limited resources. By grouping nodes into clusters, a clustering protocol offers a useful solution for ensuring the energy saving of nodes and Hybrid Media Access Control (HMAC) during network operation. Nevertheless, current clustering schemes suffer from issues with the cluster structure that negatively impact the performance of these protocols. In this study, we propose an Improved Energy-Proficient Algorithm (IEPA) for HMAC throughout the lifetime of the WSN-based IoT. Three consecutive phases are proposed. First, an optimal number of clusters is determined to cover the network with balanced clusters. Then, fair static clusters are formed, based on an updated fuzzy computation for cluster heads, to reduce and balance the energy use of the sensor nodes. Finally, cluster heads (CHs) are selected at optimal locations, with the CH role rotating among cluster members. Specifically, the proposed protocol reduces and balances the energy consumption of nodes by improving the clustering structure, making the IEPA suitable for systems that must operate for a long time. The evaluation results demonstrate that the IEPA outperforms existing protocols.
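As an illustration of the two clustering ideas the abstract names (an optimal cluster count and energy-aware cluster-head rotation), here is a toy sketch modeled on classic LEACH-style formulas rather than the IEPA itself; the Node class, field dimensions, and radio-energy constants are hypothetical.

```python
import math
import random

class Node:
    """Hypothetical sensor node with a position and residual energy (joules)."""
    def __init__(self, x, y, energy=1.0):
        self.x, self.y, self.energy = x, y, energy

def optimal_cluster_count(n_nodes, field_side, d_to_sink):
    """Classic LEACH-style estimate of the optimal number of clusters."""
    eps_fs, eps_mp = 10e-12, 0.0013e-12  # free-space / multipath amplifier energies
    return max(1, round((math.sqrt(n_nodes / (2 * math.pi)))
                        * math.sqrt(eps_fs / eps_mp) * field_side / d_to_sink**2))

def rotate_cluster_head(cluster):
    """Pick the member with the highest residual energy as the next CH."""
    return max(cluster, key=lambda node: node.energy)

nodes = [Node(random.uniform(0, 100), random.uniform(0, 100)) for _ in range(100)]
k = optimal_cluster_count(len(nodes), field_side=100, d_to_sink=75)
print(k, rotate_cluster_head(nodes[:10]).energy)
```

Rotating the CH role toward the highest-energy member is what spreads the relaying burden and balances energy drain across a cluster.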
Abstract: Decision-making of investors at the stock exchange can be based on the fundamental indicators of stocks, on technical indicators, or on a combination of the two. This paper emphasizes the domain of technical analysis. In the broader sense, technical analysis enables estimation of the dynamics of the expected future values of shares. This can be performed on the basis of historical trends in revenues, profits, and other balance-sheet indicators, but also on the basis of historical data on changes in share values. Companies generally belong to different sectors that have different development prospects resulting from global market trends, technology, and other characteristics. This research originates from the processing of historical price data of shares outstanding on the Zagreb Stock Exchange (ZSE). Investors are interested in estimates of future returns for stocks as well as the size of the risk associated with the expected returns. The research task in this paper is finding the optimal portfolio at the ZSE based on the concept of the dominant portfolio in the Markowitz approach. The portfolio is created by solving a non-linear programming problem using common software tools. The obtained optimal portfolios yield relevant conclusions about the specifics of the shares and the characteristics of the industrial sectors, and provide further insight into how diverse sectors were treated at the stock exchange over a multi-year period.
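The non-linear program behind the Markowitz dominant-portfolio search can be sketched directly. The minimal example below minimizes portfolio variance subject to a target expected return using scipy.optimize; the randomly generated returns and the target value are hypothetical stand-ins for the ZSE price histories.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical daily returns for four stocks (rows = days); real inputs would
# come from historical ZSE price series.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, size=(250, 4))
mu, cov = returns.mean(axis=0), np.cov(returns.T)

target = 0.0004  # required expected daily portfolio return (hypothetical)

# Minimize portfolio variance w'Σw subject to w'μ >= target, sum(w) = 1, w >= 0.
res = minimize(
    lambda w: w @ cov @ w,
    x0=np.full(4, 0.25),
    method="SLSQP",
    bounds=[(0.0, 1.0)] * 4,
    constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0},
                 {"type": "ineq", "fun": lambda w: w @ mu - target}],
)
print(res.x.round(3))  # optimal weights on the efficient frontier
```

Sweeping `target` over a grid of return levels traces out the efficient frontier from which the dominant portfolio is read off.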
Abstract: Planning and implementation of, as well as increased control over, Business Continuity Management (BCM) is a complex task in a company, requiring adequate resources. BCM aims to reduce risks and to develop plans for restoring business activities if they are interrupted by a disaster. The purpose of the paper is to analyze and describe two standards, the Information Technology Infrastructure Library (ITIL) and Control Objectives for Information and Related Technology (COBIT), and especially their mapping, for improved planning and implementation of BCM as well as increased control over BCM activities. COBIT is used more as a management framework, providing management tools such as control objectives, metrics, and maturity models to complement the control framework. ITIL includes process steps and tasks because it is more oriented towards IT processes (a process framework), defining best practice for IT service management. Within this mapping, ITIL processes may be used to achieve and demonstrate compliance with COBIT control objectives for the BCM process.
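For illustration only, such a mapping could be kept as a simple data structure driving a compliance report; the ITIL-activity-to-COBIT-objective pairings below are illustrative examples, not the paper's actual mapping tables.

```python
# Toy sketch of an ITIL-to-COBIT mapping held as data; the pairings are
# illustrative examples, not the paper's mapping.
itil_to_cobit = {
    "IT Service Continuity Management: define continuity strategy":
        ["DS4.1 IT Continuity Framework"],
    "IT Service Continuity Management: develop continuity plans":
        ["DS4.2 IT Continuity Plans"],
    "IT Service Continuity Management: test continuity plans":
        ["DS4.5 Testing of the IT Continuity Plan"],
}

def coverage_report(evidence: dict[str, bool]) -> None:
    """Print which COBIT control objectives are evidenced by completed ITIL activities."""
    for activity, objectives in itil_to_cobit.items():
        status = "covered" if evidence.get(activity, False) else "GAP"
        print(f"{status:7} -> {', '.join(objectives)} (via: {activity})")

coverage_report({"IT Service Continuity Management: develop continuity plans": True})
```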
Abstract: In all phases of a forensic investigation, digital evidence is exposed to external influences and comes into contact with many factors. Legal admissibility of digital evidence is the ability of that evidence to be accepted as evidence in a court of law. The life cycle of digital evidence is very complex: in each stage there are influences that can violate the chain of custody and the evidence's integrity. Contact with different variables occurs throughout the life cycle of digital evidence and can disrupt its integrity. For the evidence to be accepted by the court as valid, the chain of custody for digital evidence must be maintained; that is, it must be known exactly who came into contact with the evidence in each stage of the investigation. This paper presents the dynamics and life cycle of digital evidence. Petri nets are proposed and used for modeling and simulation of this process.
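A Petri net of a single custody hand-off can be sketched in a few lines. The minimal place/transition model below uses hypothetical stage names, not the paper's model, but shows how an out-of-order firing flags a break in the chain.

```python
# A minimal place/transition Petri net for one custody hand-off;
# the stage names are hypothetical, not the paper's model.
marking = {"collected": 1, "in_transport": 0, "in_lab": 0}

# Each transition consumes tokens from input places and produces them in outputs.
transitions = {
    "hand_to_courier": ({"collected": 1}, {"in_transport": 1}),
    "receive_at_lab":  ({"in_transport": 1}, {"in_lab": 1}),
}

def fire(name: str) -> bool:
    """Fire a transition if enabled; return False (custody-violation risk) otherwise."""
    inputs, outputs = transitions[name]
    if any(marking[p] < n for p, n in inputs.items()):
        return False
    for p, n in inputs.items():
        marking[p] -= n
    for p, n in outputs.items():
        marking[p] += n
    return True

print(fire("receive_at_lab"))  # False: evidence cannot reach the lab before transport
print(fire("hand_to_courier"), fire("receive_at_lab"), marking)
```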
Abstract: In this paper, the authors outline a formal system for reasoning about agents' knowledge in knowledge games, a special type of multi-agent system. Knowledge games are card games in which the agents' actions involve an exchange of information with other agents in the game. The authors' system is modeled using Coq, a formal proof management system. To the best of the authors' knowledge, there are no papers in which knowledge games are considered using the Coq proof assistant. The authors use the dynamic logic of common knowledge, focusing in particular on the epistemic consequences of epistemic actions carried out by agents, and observe the changes in the system that result from such actions. The changes of interest take the form of agents' knowledge about the state of the system, knowledge about other agents' knowledge, higher-order agents' knowledge, and so on, up to common knowledge. Besides an axiomatization of epistemic logic, the authors use a known axiomatization of card games that is extended with some new axioms required for their approach. Due to a deficit of theory-grounded implementations that enable players to compute their knowledge in any state of the game, the authors show how their approach can be used for this purpose.
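The authors' development is in Coq, but the underlying idea, computing an agent's knowledge as truth in all worlds it cannot distinguish, can be shown with a toy possible-worlds sketch; the three-card deal below is a hypothetical example, not the paper's axiomatization.

```python
from itertools import permutations

# Toy possible-worlds model (not the authors' Coq development): three cards
# dealt to three agents; an agent considers possible every deal that matches
# its own card, and "knows" a fact if it holds in all such worlds.
cards, agents = ("a", "b", "c"), (0, 1, 2)
worlds = list(permutations(cards))          # every possible deal

def indistinguishable(agent, w1, w2):
    """An agent sees only its own card, so deals agreeing on it look alike."""
    return w1[agent] == w2[agent]

def knows(agent, actual, fact):
    """Knowledge = truth of the fact in all worlds the agent cannot rule out."""
    return all(fact(w) for w in worlds if indistinguishable(agent, actual, w))

deal = ("a", "b", "c")
print(knows(0, deal, lambda w: w[0] == "a"))   # True: agent 0 sees its own card
print(knows(0, deal, lambda w: w[1] == "b"))   # False: agent 1's card is unknown
```

Epistemic actions (announcements, card exchanges) then correspond to shrinking the set of worlds, after which the same `knows` check yields the updated knowledge state.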
Abstract: Sound recording quality, whether digital or analogue, presupposes true-to-life recording of the original audio signal. Contemporary audio media do not meet the requirements of high-fidelity recording, since they utilize neither sufficient bandwidth for D/A conversion nor sufficient word length. This paper deals with the values of these parameters in high-fidelity recording. The paper presents the results of research into the characteristics of faithful, high-fidelity audio recording, considering the frequency range, signal-to-noise ratio, and dynamic range.
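The two parameters can be made concrete with the standard ideal-quantizer formula (SNR ≈ 6.02·N + 1.76 dB for an N-bit word) and the Nyquist limit (usable bandwidth is half the sampling rate); the small sketch below applies both to common format choices.

```python
# Worked illustration of word length vs. dynamic range and sampling rate
# vs. bandwidth, using the ideal-quantizer formula and the Nyquist limit.
def dynamic_range_db(bits: int) -> float:
    """Theoretical SNR of an ideal N-bit quantizer for a full-scale sine wave."""
    return 6.02 * bits + 1.76

for fs, bits in [(44_100, 16), (96_000, 24), (192_000, 24)]:
    print(f"{fs / 1000:g} kHz / {bits}-bit: "
          f"bandwidth up to {fs / 2 / 1000:g} kHz, "
          f"dynamic range ~{dynamic_range_db(bits):.1f} dB")
# 16-bit CD audio: ~98 dB; 24-bit: ~146 dB in theory (real converters achieve less).
```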