To improve the performance of bulk data transfer over ultra-long TCP (transmission control protocol) connections in high-energy physics experiments, a series of experiments was conducted to explore ways to enhance transmission efficiency. This paper introduces the overall structure of RC@SEU (regional center @ Southeast University) in the AMS-02 (alpha magnetic spectrometer) ground data transfer system, as well as the experiments conducted in CERNET (China Education and Research Network)/CERNET2 and the global academic Internet. The effects of the number of parallel streams and of TCP buffer size are tested. The tests confirm that, under the current conditions of CERNET, finding the right number of parallel TCP connections is the main method to improve throughput. TCP buffer size tuning has little effect now, but may become effective when the available bandwidth increases.
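The parallel-streams idea can be given a minimal sketch (not the RC@SEU tool itself): stripe one payload across N TCP connections over loopback and reassemble on the receiving side. The port numbers, the 4-byte stream-index header, and the byte-striping scheme are illustrative assumptions.

```python
# Minimal sketch of parallel-stream TCP transfer (not the paper's tool).
import socket
import threading

def parallel_send(payload: bytes, n_streams: int, port: int) -> bytes:
    """Send payload over n_streams parallel TCP connections to a local
    collector thread and return the reassembled bytes."""
    slices = [payload[i::n_streams] for i in range(n_streams)]  # byte striping
    received = [b""] * n_streams

    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind(("127.0.0.1", port))
    srv.listen(n_streams)

    def collect():
        for _ in range(n_streams):
            conn, _ = srv.accept()
            chunks = []
            while True:
                buf = conn.recv(65536)
                if not buf:
                    break
                chunks.append(buf)
            conn.close()
            data = b"".join(chunks)
            idx = int.from_bytes(data[:4], "big")  # 4-byte stream-index header
            received[idx] = data[4:]

    collector = threading.Thread(target=collect)
    collector.start()

    def send_slice(idx: int):
        with socket.create_connection(("127.0.0.1", port)) as cli:
            cli.sendall(idx.to_bytes(4, "big") + slices[idx])

    senders = [threading.Thread(target=send_slice, args=(i,))
               for i in range(n_streams)]
    for t in senders:
        t.start()
    for t in senders:
        t.join()
    collector.join()
    srv.close()

    out = bytearray(len(payload))
    for i, data in enumerate(received):
        out[i::n_streams] = data  # inverse of the striping above
    return bytes(out)
```

Over loopback the stream count makes no throughput difference; on a long fat pipe, each extra connection gets its own congestion window, which is why the experiments vary this knob.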
Considering the increasing use of information technology with established standards, such as TCP/IP and XML, in modern industrial automation, we present a cost-effective FPGA (field programmable gate array) implementation of a novel reliable real-time data transfer system based on the EPA (Ethernet for plant automation) protocol and the IEEE 1588 standard. This combination provides more predictable real-time communication between automation equipment and precise synchronization between devices. The designed EPA system has been verified on a Xilinx Spartan3 XC3S1500, consuming 75% of the total slices. The experimental results show that the novel industrial control system achieves high synchronization precision, with a 1.59-μs standard deviation between the master device and the slaves. Such a real-time data transfer system is an excellent candidate for automation equipment that requires precise Ethernet-based synchronization at a comparatively low price.
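The IEEE 1588 side of such a design rests on a simple timestamp exchange. As a reference point (the standard PTP delay request-response formula, not the paper's FPGA logic), the slave's clock offset and the path delay are recovered from four timestamps:

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """Classic IEEE 1588 two-step calculation.
    t1: master sends Sync, t2: slave receives it,
    t3: slave sends Delay_Req, t4: master receives it.
    Assumes a symmetric path; all times in the same unit."""
    offset = ((t2 - t1) - (t4 - t3)) / 2.0  # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2.0   # one-way path delay
    return offset, delay
```

With a slave running 100 units ahead over a 10-unit path, `ptp_offset_and_delay(0, 110, 200, 110)` returns `(100.0, 10.0)`; hardware timestamping in the FPGA is what keeps the residual error, as reported here, in the microsecond range.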
Efficient real-time data exchange over the Internet plays a crucial role in the successful application of web-based systems. In this paper, a data transfer mechanism over the Internet is proposed for real-time web-based applications. The mechanism incorporates the eXtensible Markup Language (XML) and the Hierarchical Data Format (HDF) to provide a flexible and efficient data format. Heterogeneous transfer data are classified into light and heavy data, stored in XML and HDF respectively; the HDF data are then mapped to Java Document Object Model (JDOM) objects in the Java environment. These JDOM data objects are sent across computer networks using the Java Remote Method Invocation (RMI) data transfer infrastructure. Client-defined data priority levels are implemented in RMI, which guide the server to transfer data objects at different priorities. A remote monitoring system for an industrial reactor process simulator is used as a case study to illustrate the proposed data transfer mechanism.
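The client-defined priority idea is language-agnostic; a heap-ordered queue sketches how a server can drain higher-priority data objects first (an illustration of the concept, not the paper's Java RMI code — the class and method names here are invented):

```python
import heapq
import itertools

class PriorityTransferQueue:
    """Server-side queue: objects tagged with a client-defined priority are
    sent highest-priority first, FIFO among equal priorities."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # tie-breaker preserves insertion order

    def enqueue(self, obj, priority: int):
        # negate priority so the largest value pops first from the min-heap
        heapq.heappush(self._heap, (-priority, next(self._seq), obj))

    def next_to_send(self):
        return heapq.heappop(self._heap)[2]
```

Under this scheme a small XML status update tagged with a high priority overtakes a bulky HDF block already waiting in the queue.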
Background: Omission of patient information in perioperative communication is closely linked to adverse events. Use of checklists to standardize the handoff in the post-anesthesia care unit (PACU) has been shown to effectively reduce medical errors. Objective: Our study investigates the use of a checklist to improve the quantity of data transferred during handoffs in the PACU. Design: A cross-sectional observational study. Setting: PACU at Memorial Sloan Kettering Cancer Center (MSKCC); June 13, 2016 through July 15, 2016. Patients, other participants: We observed the handoff reports between the nurses, PACU midlevel providers, anesthesia staff, and surgical staff. Intervention: A physical checklist was provided to all anesthesia staff, who were asked to adhere to it at all observed PACU handoffs. Main outcome measure: Quantity of reported handoff items during 60 handoffs pre- and 60 post-implementation of the checklist. Results: The composite value from both surgical and anesthesia reports showed an increase in the mean number of reported items from 8.7 pre-implementation to 10.9 post-implementation. Given that surgical staff, who received no intervention, reported a mean of 5.9 items pre-implementation and 5.5 items post-implementation, the improvement in anesthesia staff reporting with the intervention drove the overall gain in handoff data transfer. Conclusions: Using a physical 12-item checklist for PACU handoff increased overall data transfer.
Ultrasonic power and data transfer is a promising technology for implantable medical devices because of its non-invasiveness, deep penetration depth, and potential for a high power transmission rate with a low specific absorption rate. However, ultrasound-powered implantable devices still suffer from low power transfer efficiency due to beam misalignment, and are limited to short-term use by the bulkiness of the transmitting transducers. Here, we report the first proof of concept for adaptive positioning and targeting of ultrasound-based implantable devices through ultrasound image guidance. A lightweight patch-type ultrasonic transducer array is fabricated to enable ultrasound imaging and beamforming during long-term operation. The uniform performance of the array is established through a silicon micromachining process. We demonstrate the complete scheme of imaging, positioning, and targeted power transfer in an ex vivo environment, achieving precise targeting of moving implanted devices through real-time ultrasound imaging. The enhanced power transfer efficiency afforded by patch-type ultrasonic transducers can improve patient comfort and minimize invasive procedures, opening new applications for ultrasonically powered implantable devices.
Integrated data and energy transfer (IDET) enables electromagnetic waves to transmit wireless energy while delivering data to low-power devices. In this paper, an energy harvesting modulation (EHM) assisted multi-user IDET system is studied, where all the received signals at the users are exploited for energy harvesting without degrading wireless data transfer (WDT) performance. The joint IDET performance is then analysed theoretically by conceiving a practical time-dependent wireless channel. With the aid of an AO-based algorithm, the average effective data rate among users is maximized while ensuring the BER and wireless energy transfer (WET) performance. Simulation results validate and evaluate the IDET performance of the EHM-assisted system, and demonstrate that the optimal number of user clusters and IDET time slots should be allocated in order to improve the WET and WDT performance.
The dataflow architecture, which is characterized by a lack of redundant unified control logic, has been shown to have an advantage over the control-flow architecture, as it improves computational performance and power efficiency, especially for applications used in high-performance computing (HPC). Importantly, the high computational efficiency of systems using the dataflow architecture is achieved by allowing program kernels to be activated simultaneously. Therefore, a proper acknowledgment mechanism is required to distinguish data that logically belongs to different contexts. Possible solutions include the tagged-token matching mechanism, in which data is sent before acknowledgments are received but retried after rejection, or a handshake mechanism, in which data is only sent after acknowledgments are received. However, these mechanisms suffer from both inefficient data transfer and increased area cost. Good performance of the dataflow architecture depends on the efficiency of data transfer. In order to optimize the efficiency of data transfer in existing dataflow architectures with a minimal increase in area and power cost, we propose a Look-Ahead Acknowledgment (LAA) mechanism. LAA accelerates the execution flow by speculatively acknowledging ahead without penalties. Our simulation analysis, using a handshake mechanism as the baseline, shows that LAA increases the average utilization of computational units by 23.9%, reduces the average execution time by 17.4%, and increases the average power efficiency of dataflow processors by 22.4%. Crucially, our approach increases the area and power consumption of the on-chip logic by less than 0.9%. In conclusion, the evaluation results suggest that Look-Ahead Acknowledgment is an effective improvement for data transfer in existing dataflow architectures.
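The intuition behind look-ahead acknowledgment can be conveyed with a toy latency model (invented here purely for illustration; the paper's figures come from a detailed dataflow-processor simulation, not from this arithmetic):

```python
def handshake_cycles(n_transfers: int, wire_latency: int) -> int:
    """Strict handshake: every datum first pays a request/ack round trip,
    then the data hop itself, fully serialized."""
    return n_transfers * 3 * wire_latency

def laa_cycles(n_transfers: int, wire_latency: int) -> int:
    """Look-ahead: acknowledgments are issued speculatively, so after one
    initial round trip the data hops proceed back to back."""
    return 2 * wire_latency + n_transfers * wire_latency
```

In this toy model the speculative scheme approaches one data hop per transfer as the stream lengthens, which is the effect LAA exploits.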
In mobile computing environments, most IoT devices connected to networks experience variable error rates and possess limited bandwidth. The conventional method of retransmitting lost information, commonly used in data transmission protocols, increases transmission delay and consumes excessive bandwidth. To overcome this issue, forward error correction techniques, e.g., Random Linear Network Coding (RLNC), can be used in data transmission. The primary challenge in RLNC-based methodologies is that sustaining a fixed coding ratio during data transmission leads to notable bandwidth usage and transmission delay under dynamic network conditions. Therefore, this study proposes a new block-based RLNC strategy known as Adjustable RLNC (ARLNC), which dynamically adjusts the coding ratio and transmission window at runtime based on the network error rate estimated from receiver feedback. The calculations in this approach are performed over a Galois field of order 256. We assessed ARLNC's performance under various error models, such as Gilbert-Elliott, exponential, and constant-rate models, and compared it with standard RLNC. The results show that dynamically adjusting the coding ratio and transmission window size based on network conditions significantly enhances network throughput and reduces total transmission delay in most scenarios. In contrast to the conventional RLNC method employing a fixed coding ratio, the presented approach demonstrates significant enhancements, achieving a 73% decrease in transmission delay and a fourfold increase in throughput. However, ARLNC generally incurs higher computational costs than standard RLNC in dynamic computational environments, while excelling in high-performance networks.
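The fixed-ratio RLNC baseline over GF(256) can be sketched compactly (this is a generic textbook construction, not the authors' ARLNC implementation; the adaptive scheme would vary the number of coded packets with the estimated loss rate):

```python
import random

# GF(256) log/antilog tables, primitive polynomial x^8+x^4+x^3+x^2+1 (0x11d)
EXP = [0] * 512
LOG = [0] * 256
_x = 1
for _i in range(255):
    EXP[_i] = _x
    LOG[_x] = _i
    _x <<= 1
    if _x & 0x100:
        _x ^= 0x11d
for _i in range(255, 512):
    EXP[_i] = EXP[_i - 255]

def gf_mul(a, b):
    if a == 0 or b == 0:
        return 0
    return EXP[LOG[a] + LOG[b]]

def gf_inv(a):
    return EXP[255 - LOG[a]]

def encode(packets, n_coded, rng):
    """Mix k equal-length source packets (lists of byte values) into
    n_coded coded packets, each carrying its GF(256) coefficient vector."""
    k, size = len(packets), len(packets[0])
    coded = []
    for _ in range(n_coded):
        coeffs = [rng.randrange(256) for _ in range(k)]
        payload = [0] * size
        for c, pkt in zip(coeffs, packets):
            for j in range(size):
                payload[j] ^= gf_mul(c, pkt[j])
        coded.append((coeffs, payload))
    return coded

def decode(coded, k):
    """Gaussian elimination over GF(256); needs >= k independent packets."""
    rows = [list(c) + list(p) for c, p in coded]
    for col in range(k):
        # find a row with a nonzero entry in this column and pivot on it
        pivot = next(r for r in range(col, len(rows)) if rows[r][col] != 0)
        rows[col], rows[pivot] = rows[pivot], rows[col]
        inv = gf_inv(rows[col][col])
        rows[col] = [gf_mul(inv, v) for v in rows[col]]
        for r in range(len(rows)):
            if r != col and rows[r][col] != 0:
                f = rows[r][col]
                rows[r] = [v ^ gf_mul(f, p) for v, p in zip(rows[r], rows[col])]
    return [rows[i][k:] for i in range(k)]
```

Any k linearly independent coded packets suffice to recover the block, which is why the receiver never asks for a specific retransmission, only for "more" packets.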
Integrated data and energy transfer (IDET) is capable of simultaneously delivering on-demand data and energy to low-power Internet of Everything (IoE) devices. We propose a multi-carrier IDET transceiver relying on superposition waveforms consisting of multi-sinusoidal signals for wireless energy transfer (WET) and orthogonal frequency-division multiplexing (OFDM) signals for wireless data transfer (WDT). The outdated channel state information (CSI) in aging channels is employed by the transmitter to shape IDET waveforms. Under transmission power and WDT constraints, the amplitudes and phases of the IDET waveform at the transmitter and the power splitter at the receiver are jointly optimised to maximise the average direct current (DC) over a limited number of transmission frames in the presence of carrier frequency offset (CFO). For the amplitude optimisation, the original non-convex problem can be transformed into a reversed geometric programming problem, which can then be solved effectively with existing tools. For the phase optimisation, the artificial bee colony (ABC) algorithm is invoked to deal with the non-convexity. Iterating between the amplitude and phase optimisations yields our joint design. Numerical results demonstrate the advantage of our joint design for IDET waveform shaping in the presence of CFO and outdated CSI.
In computational physics, proton transfer phenomena can be viewed as pattern classification problems, based on a set of input features allowing classification of the proton motion into two categories: transfer 'occurred' and transfer 'not occurred'. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on a feed-forward back-propagation neural network used as a classifier to distinguish between the two transfer cases. In this paper, we use a newly developed data mining and pattern recognition tool for automating, controlling, and charting the output data of an existing Empirical Valence Bond code. The study analyzes the need for pattern recognition in aqueous proton transfer processes and how the error back-propagation learning approach (multilayer perceptron algorithms) can be satisfactorily employed in the present case. We present a tool for pattern recognition and validate the code on a real physical case study. The results of applying the artificial neural network methodology to patterns based upon selected physical properties (e.g., temperature, density) show the ability of the network to learn proton transfer patterns corresponding to properties of the aqueous environment, which in turn proves to be fully compatible with previous proton transfer studies.
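A feed-forward network trained by error back-propagation, as named above, can be sketched in a few dozen lines. This is a generic one-hidden-layer MLP for binary classification, not the authors' tool, and the hyperparameters are illustrative assumptions:

```python
import math
import random

def train_mlp(samples, labels, hidden=4, epochs=2000, lr=0.5, seed=1):
    """Train a tiny sigmoid MLP by back-propagation (squared-error loss)
    and return a 0/1 predictor. Generic sketch, not the paper's code."""
    rng = random.Random(seed)
    n_in = len(samples[0])
    w1 = [[rng.uniform(-1, 1) for _ in range(n_in)] for _ in range(hidden)]
    b1 = [0.0] * hidden
    w2 = [rng.uniform(-1, 1) for _ in range(hidden)]
    b2 = 0.0
    sig = lambda z: 1.0 / (1.0 + math.exp(-z))

    def forward(x):
        h = [sig(sum(wi * xi for wi, xi in zip(row, x)) + b)
             for row, b in zip(w1, b1)]
        return h, sig(sum(wo * hi for wo, hi in zip(w2, h)) + b2)

    for _ in range(epochs):
        for x, y in zip(samples, labels):
            h, out = forward(x)
            d_out = (out - y) * out * (1 - out)            # output-layer delta
            d_h = [d_out * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            for j in range(hidden):                        # gradient-descent step
                w2[j] -= lr * d_out * h[j]
                for i in range(n_in):
                    w1[j][i] -= lr * d_h[j] * x[i]
                b1[j] -= lr * d_h[j]
            b2 -= lr * d_out

    def predict(x):
        return 1 if forward(x)[1] >= 0.5 else 0
    return predict
```

In the paper's setting the inputs would be physical features such as temperature and density, and the label whether the proton transfer occurred.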
Deep learning algorithms, which are increasingly applied in the field of petroleum geophysical prospecting, have achieved good results in improving efficiency and accuracy in test applications. To play a greater role in actual production, these algorithm modules must be integrated into software systems and used more often in production projects. Deep learning frameworks such as TensorFlow and PyTorch essentially take Python as their core architecture, while application programs mainly use Java, C#, and other programming languages. During integration, the seismic data read by the Java and C# data interfaces must be transferred to the Python main program module. The data exchange methods between Java, C#, and Python include shared memory, shared directories, and so on. However, these methods have the disadvantages of low transmission efficiency and unsuitability for asynchronous networks. Considering the large volume of seismic data and the need for network support in deep learning, this paper proposes a method of transmitting seismic data based on sockets. By exploiting the socket's cross-network and efficient long-distance transmission, this approach solves the problem of inefficient transmission of underlying data when integrating a deep learning algorithm module into a software system. Furthermore, actual production applications show that this method effectively overcomes the shortcomings of data transmission via shared memory, shared directories, and other modes, while improving the transmission efficiency of massive seismic data across modules at the bottom of the software.
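Cross-language socket transfer of this kind reduces to agreeing on a framing convention. The paper does not publish its wire format, so the header below (4-byte trace id plus 4-byte payload length, big-endian) is an assumption chosen because a Java `DataOutputStream` or C# `BinaryWriter` peer can produce it directly:

```python
import socket
import struct

def send_trace(sock: socket.socket, trace_id: int, samples: bytes) -> None:
    """Frame: 4-byte big-endian trace id, 4-byte payload length, payload."""
    sock.sendall(struct.pack(">II", trace_id, len(samples)) + samples)

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes; TCP may deliver a frame in several pieces."""
    buf = bytearray()
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed mid-frame")
        buf += chunk
    return bytes(buf)

def recv_trace(sock: socket.socket):
    trace_id, length = struct.unpack(">II", recv_exact(sock, 8))
    return trace_id, recv_exact(sock, length)
```

The explicit length prefix is what lets the Python side receive arbitrarily large seismic traces without a shared directory or shared memory segment.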
In this paper, the problem of increasing information transfer authenticity is formulated, and control methods and algorithms based on the use of statistical and structural information redundancy are presented. It is assumed that the controlled information is submitted as text element images and contains redundancy caused by statistical relations and the non-uniform probability distribution of the transmitted data. The use of statistical redundancy allows the development of adaptive authenticity control rules that take into account the non-stationarity of image data during information transfer. The structural redundancy peculiar to the image container in a data transfer package is used to develop new rules for controlling information authenticity on the basis of pattern recognition mechanisms. The techniques offered in this work are used to estimate authenticity in the structure of data transfer packages. A comparative analysis of the developed methods and algorithms shows that their efficiency is improved by the criteria of undetected-error probability, labour input, and cost of realization.
This paper proposes a solution to the set of tasks required for optimizing the behavior of an autonomous robotic group during a mission over a distributed area in cluttered, hazardous terrain. The navigation scheme uses the benefits of an original real-time technical vision system (TVS) based on a dynamic triangulation principle. The method processes TVS output data with fuzzy logic rules for resolution stabilization. Based on previous research, the dynamic communication network model is modified to implement information propagation with a feedback method for more stable data exchange inside the robotic group. Following a comparative analysis of approximation methods, the authors propose a two-step post-processing path planning aimed at obtaining a smooth and energy-saving trajectory. The article provides a wide range of studies and computational experiment results for different scenarios, evaluating the influence of a common cloud of points on robotic motion planning.
When a workflow task needs several datasets from different locations in the cloud, data transfer becomes a challenge. To avoid unnecessary data transfer, a graph-based data placement algorithm for cloud workflows is proposed. The algorithm uses an affinity graph to group datasets while keeping a polynomial time complexity. By integrating the algorithm, the workflow engine can intelligently select the locations in which data will reside, avoiding unnecessary data transfer during both the initial stage and the runtime stage. Simulations show that the proposed algorithm can effectively reduce data transfer during the workflow's execution.
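The affinity-graph idea can be illustrated with a small sketch: weight an edge between two datasets by how many tasks use both, then greedily merge the highest-affinity pairs into co-located groups. This is an invented simplification for intuition, not the paper's algorithm, and the capacity constraint is an assumption:

```python
from collections import defaultdict
from itertools import combinations

def affinity_graph(tasks):
    """Edge weight between two datasets = number of tasks that use both."""
    w = defaultdict(int)
    for used in tasks:
        for a, b in combinations(sorted(used), 2):
            w[(a, b)] += 1
    return w

def greedy_placement(tasks, capacity):
    """Greedily merge the highest-affinity dataset pairs into groups of at
    most `capacity` datasets; each group maps to one storage location."""
    group_of = {d: frozenset([d]) for used in tasks for d in used}
    for (a, b), _ in sorted(affinity_graph(tasks).items(),
                            key=lambda kv: -kv[1]):
        ga, gb = group_of[a], group_of[b]
        if ga is not gb and len(ga) + len(gb) <= capacity:
            merged = ga | gb
            for d in merged:
                group_of[d] = merged
    return set(group_of.values())
```

Datasets that are frequently consumed together end up at one location, so a task scheduled there touches the network for few, ideally none, of its inputs.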
Although the existing legal norms and judicial practices can provide basic guidance for the right to personal data portability, empirical research on the privacy policies of 66 mobile apps shows that there are obstacles to the realization of this right: whether the policies have stipulations on the right to personal data portability, whether copies of personal information can be derived automatically, whether there are textual examples, whether ID verification is required, whether the copied documents are encrypted, and whether the scope of personal information involved is consistent. This gap in practice, on the one hand, reflects a misunderstanding of the right to personal data portability, and on the other hand, results from the negative externalities, practical costs, and technical limitations of the right. Based on rethinking the right to data portability, these practical problems can be addressed through multiple measures such as promoting the fulfillment of this right by legislation, optimizing technology-oriented operations, refining response process mechanisms, and enhancing system interoperability.
Funding: The National Basic Research Program of China (973 Program) (No. 2003CB314803).
Funding: Supported by the Korea Medical Device Development Fund grant funded by the Korea government (the Ministry of Science and ICT; the Ministry of Trade, Industry and Energy; the Ministry of Health & Welfare; the Ministry of Food and Drug Safety) (202011B01, RS-2020-KD000007); by the K-Brain Project of the National Research Foundation (NRF) funded by the Korean government (MSIT) (RS-2023-00262568); by a grant of the Korea Dementia Research Project through the Korea Dementia Research Center (KDRC) funded by the Ministry of Health & Welfare and Ministry of Science and ICT, Republic of Korea (RS-2024-00355871); by the Nanomedical Devices Development Project of NNFC (1711197701); and by Samsung Electronics.
Funding: Supported in part by the MOST Major Research and Development Project (Grant No. 2021YFB2900204); the National Natural Science Foundation of China (NSFC) (Grant Nos. 62201123, 62132004, 61971102); the China Postdoctoral Science Foundation (Grant No. 2022TQ0056); the Sichuan Science and Technology Program (Grant No. 2022YFH0022); the Sichuan Major R&D Project (Grant No. 22QYCX0168); and the Municipal Government of Quzhou (Grant No. 2022D031).
Funding: Supported by the Project of the State Grid Corporation of China in 2020, "Integration Technology Research and Prototype Development for High End Controller Chip", under Grant No. 5700-202041264A-0-0-00.
Funding: Supported by the Natural Science Foundation of China (No. 61971102, 62132004), the MOST Major Research and Development Project (No. 2021YFB2900204), the Sichuan Science and Technology Program (No. 2022YFH0022), and the Key Research and Development Program of Zhejiang Province (No. 2022C01093).
Abstract: Integrated data and energy transfer (IDET) is capable of simultaneously delivering on-demand data and energy to low-power Internet of Everything (IoE) devices. We propose a multi-carrier IDET transceiver relying on superposition waveforms consisting of multi-sinusoidal signals for wireless energy transfer (WET) and orthogonal frequency-division multiplexing (OFDM) signals for wireless data transfer (WDT). The outdated channel state information (CSI) in aging channels is employed by the transmitter to shape IDET waveforms. Under transmission power and WDT constraints, the amplitudes and phases of the IDET waveform at the transmitter and the power splitter at the receiver are jointly optimised to maximise the average direct current (DC) over a limited number of transmission frames in the presence of carrier frequency offset (CFO). For the amplitude optimisation, the original non-convex problem can be transformed into a reversed geometric programming problem, which can then be solved effectively with existing tools. As for the phase optimisation, the artificial bee colony (ABC) algorithm is invoked to deal with the non-convexity. Iterating between the amplitude and phase optimisations yields our joint design. Numerical results demonstrate the advantage of our joint design for IDET waveform shaping in the presence of CFO and outdated CSI.
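The transceiver's two ingredients can be sketched without the optimisation machinery: superpose a multi-sinusoidal WET component with an OFDM WDT symbol, then split the received power between the energy harvester and the data decoder. Subcarrier counts, amplitudes, and the splitting ratio below are illustrative assumptions, not the paper's optimised values.

```python
import cmath
import math

# Toy construction of a superposition IDET waveform: a few sinusoids for
# energy transfer plus one OFDM symbol for data transfer, followed by a
# receiver-side power splitter. All parameters are illustrative.

def ofdm_symbol(symbols, n_fft):
    # Inverse DFT maps frequency-domain data symbols to time samples.
    return [sum(s * cmath.exp(2j * math.pi * k * n / n_fft)
                for k, s in enumerate(symbols)) / n_fft
            for n in range(n_fft)]

def idet_waveform(amps, data_symbols, n_fft):
    # Multi-sinusoidal WET component on the same subcarrier grid.
    wet = [sum(a * cmath.exp(2j * math.pi * k * n / n_fft)
               for k, a in enumerate(amps))
           for n in range(n_fft)]
    wdt = ofdm_symbol(data_symbols, n_fft)
    return [e + d for e, d in zip(wet, wdt)]

def power_split(samples, rho):
    # Fraction rho of the received power feeds the rectifier (energy
    # harvesting); the remainder goes to the information decoder.
    p = sum(abs(s) ** 2 for s in samples) / len(samples)
    return rho * p, (1 - rho) * p
```

In the paper's joint design, the amplitudes, phases, and `rho` would be the optimisation variables; here they are simply fixed inputs.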
Funding: The Canon Foundation for Scientific Research (Dr. Steve Jones, Scientific Advisor; 7200 The Quorum, Oxford Business Park, Oxford OX4 2JZ, England) funded the corresponding author's UPC tuition fees in 2013 while she was writing this article.
Abstract: In computational physics, proton transfer phenomena can be viewed as pattern classification problems based on a set of input features allowing classification of the proton motion into two categories: transfer 'occurred' and transfer 'not occurred'. The goal of this paper is to evaluate the use of artificial neural networks in the classification of proton transfer events, based on the feed-forward back-propagation neural network used as a classifier to distinguish between the two transfer cases. In this paper, we use a newly developed data mining and pattern recognition tool for automating, controlling, and charting the output data of an existing Empirical Valence Bond code. The study analyzes the need for pattern recognition in aqueous proton transfer processes and how the error back-propagation learning approach (multilayer perceptron algorithms) can be satisfactorily employed in the present case. We present a tool for pattern recognition and validate the code on a real physical case study. The results of applying the artificial neural network methodology to patterns based upon selected physical properties (e.g., temperature, density) show the ability of the network to learn proton transfer patterns corresponding to properties of the aqueous environments, which in turn proves to be fully compatible with previous proton transfer studies.
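The classifier described above, a feed-forward network trained with error back-propagation, can be sketched in a few dozen lines. The architecture (two input features, one hidden layer, one sigmoid output labelling "occurred"/"not occurred"), the learning rate, and the training data are illustrative assumptions, not the paper's actual setup.

```python
import math
import random

# Minimal multilayer perceptron with error back-propagation (squared
# error, sigmoid activations). Sizes and hyperparameters are assumed.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class MLP:
    def __init__(self, n_in=2, n_hid=3, seed=0):
        rng = random.Random(seed)
        # Each row holds [bias, weights...] for one unit.
        self.w1 = [[rng.uniform(-1, 1) for _ in range(n_in + 1)]
                   for _ in range(n_hid)]
        self.w2 = [rng.uniform(-1, 1) for _ in range(n_hid + 1)]

    def forward(self, x):
        h = [sigmoid(w[0] + sum(wi * xi for wi, xi in zip(w[1:], x)))
             for w in self.w1]
        y = sigmoid(self.w2[0] + sum(w * hi
                                     for w, hi in zip(self.w2[1:], h)))
        return h, y

    def train_step(self, x, t, lr=0.5):
        h, y = self.forward(x)
        dy = (y - t) * y * (1 - y)  # output-layer delta
        for j, hj in enumerate(h):
            dh = dy * self.w2[j + 1] * hj * (1 - hj)  # hidden delta
            self.w2[j + 1] -= lr * dy * hj
            self.w1[j][0] -= lr * dh
            for i, xi in enumerate(x):
                self.w1[j][i + 1] -= lr * dh * xi
        self.w2[0] -= lr * dy
        return 0.5 * (y - t) ** 2  # loss before this update
```

Trained on feature vectors such as (temperature, density) with 0/1 transfer labels, the loss falls steadily on separable toy data.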
Funding: Supported by the PetroChina Prospective, Basic, and Strategic Technology Research Project (No. 2021ZG03-02 and No. 2023DJ8402).
Abstract: The deep learning algorithm, which has been increasingly applied in the field of petroleum geophysical prospecting, has achieved good results in improving efficiency and accuracy in test applications. To play a greater role in actual production, these algorithm modules must be integrated into software systems and used more often in actual production projects. Deep learning frameworks, such as TensorFlow and PyTorch, basically take Python as the core architecture, while the application program mainly uses Java, C#, and other programming languages. During integration, the seismic data read by the Java and C# data interfaces must be transferred to the Python main program module. The data exchange methods between Java, C#, and Python include shared memory, shared directories, and so on. However, these methods have the disadvantages of low transmission efficiency and unsuitability for asynchronous networks. Considering the large volume of seismic data and the need for network support in deep learning, this paper proposes a method of transmitting seismic data based on Socket. By exploiting Socket's efficient cross-network, long-distance transmission, this approach solves the problem of inefficient transmission of underlying data when integrating a deep learning algorithm module into a software system. Furthermore, actual production applications show that this method effectively overcomes the shortcomings of data transmission via shared memory, shared directories, and other modes, while improving the transmission efficiency of massive seismic data across modules at the bottom of the software.
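On the Python (receiving) side, such a socket-based exchange reduces to framing: the Java/C# producer writes a length header followed by the raw trace bytes, and the Python module reads exactly that many bytes. The 8-byte big-endian length header below is an illustrative framing assumption, not the paper's actual wire format.

```python
import socket
import struct

# Length-prefixed framing for bulk binary transfer over a stream socket.
# Header: 8-byte big-endian payload length, then the payload itself.

def send_block(sock, payload: bytes):
    sock.sendall(struct.pack(">Q", len(payload)) + payload)

def recv_exact(sock, n):
    # TCP delivers a byte stream, not messages: loop until n bytes arrive.
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(min(65536, n - len(buf)))
        if not chunk:
            raise ConnectionError("peer closed mid-transfer")
        buf += chunk
    return buf

def recv_block(sock):
    (length,) = struct.unpack(">Q", recv_exact(sock, 8))
    return recv_exact(sock, length)
```

The Java/C# side only has to write the same header and payload to a connected TCP socket; the `recv_exact` loop is what makes the transfer robust to the stream being delivered in arbitrary chunks across a long-distance network.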
Abstract: This paper formulates the problem of increasing the authenticity of information transfer and presents control methods and algorithms based on statistical and structural information redundancy. It is assumed that the controlled information is submitted as text element images and contains redundancy caused by statistical relations and the non-uniform probability distribution of the transmitted data. The use of statistical redundancy allows the development of adaptive authenticity control rules that take into account the non-stationary properties of image data during information transfer. The structural redundancy peculiar to the image container in a data transfer package is used to develop new rules for controlling information authenticity on the basis of pattern recognition mechanisms. The techniques offered in this work are used to estimate the authenticity of the structure of data transfer packages. The results of a comparative analysis of the developed methods and algorithms show improved efficiency in terms of undetected-error probability, labour input, and implementation cost.
Abstract: This paper proposes the solution of the set of tasks required to optimize the behavior of an autonomous robotic group during a mission over a distributed area in cluttered, hazardous terrain. The navigation scheme uses the benefits of an original real-time technical vision system (TVS) based on a dynamic triangulation principle. The method processes TVS output data with fuzzy logic rules for resolution stabilization. Building on previous research, the dynamic communication network model is modified to implement the propagation of information with a feedback method for more stable data exchange inside the robotic group. Based on a comparative analysis of approximation methods, the authors propose two-step post-processing path planning aimed at obtaining a smooth, energy-saving trajectory. The article provides a wide range of studies and computational experiment results for different scenarios, evaluating the influence of the common cloud of points on robotic motion planning.
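A two-step path post-processing scheme of the kind described can be sketched simply: first prune waypoints that add no geometric information, then relax the remainder toward a smoother curve. The collinearity test, tolerance, and Laplacian-style smoothing below are illustrative stand-ins for the paper's actual approximation methods.

```python
# Two-step path post-processing sketch: (1) drop collinear waypoints,
# (2) Laplacian-style smoothing of the survivors. Parameters are assumed.

def prune_collinear(path, eps=1e-9):
    # Keep a waypoint only if it bends the path: the cross product of the
    # vectors to its neighbours is nonzero.
    out = [path[0]]
    for prev, cur, nxt in zip(path, path[1:], path[2:]):
        cross = ((cur[0] - prev[0]) * (nxt[1] - prev[1])
                 - (cur[1] - prev[1]) * (nxt[0] - prev[0]))
        if abs(cross) > eps:
            out.append(cur)
    out.append(path[-1])
    return out

def smooth(path, passes=2, w=0.25):
    # Pull each interior point toward the midpoint of its neighbours;
    # endpoints (start and goal) stay fixed.
    pts = [list(p) for p in path]
    for _ in range(passes):
        for i in range(1, len(pts) - 1):
            for d in (0, 1):
                pts[i][d] += w * (pts[i - 1][d] + pts[i + 1][d]
                                  - 2 * pts[i][d])
    return [tuple(p) for p in pts]
```

Pruning shortens the list the smoother has to process, and smoothing removes the sharp turns that cost a mobile robot energy, which matches the stated goal of a smooth, energy-saving trajectory.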
Funding: Supported by the National Natural Science Foundation of China (No. 60903137, 60970132).
Abstract: When a workflow task needs several datasets from different locations in the cloud, data transfer becomes a challenge. To avoid unnecessary data transfer, a graph-based data placement algorithm for cloud workflows is proposed. The algorithm uses an affinity graph to group datasets while keeping a polynomial time complexity. By integrating the algorithm, the workflow engine can intelligently select the locations in which data will reside, avoiding unnecessary data transfer during both the initial stage and the runtime stage. Simulations show that the proposed algorithm can effectively reduce data transfer during the workflow's execution.
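The affinity-graph idea can be sketched concretely: datasets co-used by a task get an edge weighted by how often they appear together, and a greedy pass merges the most affine datasets onto the same location subject to a capacity limit. The greedy union-find merge below is an illustrative stand-in for the paper's polynomial-time grouping algorithm; the capacity unit is an assumption.

```python
from collections import defaultdict

# Affinity graph: edge weight = number of tasks that use both datasets.
def affinity_graph(tasks):
    w = defaultdict(int)
    for datasets in tasks:
        ds = sorted(set(datasets))
        for i in range(len(ds)):
            for j in range(i + 1, len(ds)):
                w[(ds[i], ds[j])] += 1
    return w

def greedy_placement(tasks, capacity=2):
    # Merge dataset pairs in descending affinity order while the merged
    # group still fits within one location's capacity (in dataset units).
    w = affinity_graph(tasks)
    parent = {}

    def find(x):
        parent.setdefault(x, x)
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    size = defaultdict(lambda: 1)
    for (a, b), _ in sorted(w.items(), key=lambda kv: (-kv[1], kv[0])):
        ra, rb = find(a), find(b)
        if ra != rb and size[ra] + size[rb] <= capacity:
            parent[rb] = ra
            size[ra] += size[rb]
    groups = defaultdict(set)
    for d in {x for t in tasks for x in t}:
        groups[find(d)].add(d)
    return sorted(map(sorted, groups.values()))
```

Datasets placed in the same group would reside at the same location, so tasks that use them together need no cross-site transfer; a real placement algorithm would weight edges by dataset size and use actual storage capacities.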
Funding: This paper is a result of "Research on the Basic Category System of Contemporary Chinese Digital Law" (23&ZD154), a major project of the National Social Science Fund of China.
Abstract: Although the existing legal norms and judicial practices can provide basic guidance for the right to personal data portability, empirical research on the privacy policies of 66 mobile apps shows that there are obstacles to the realization of this right: whether they have stipulations on the right to personal data portability, whether they are able to derive copies of personal information automatically, whether there are textual examples, whether ID verification is required, whether the copied documents are encrypted, and whether the scope of personal information involved is consistent. This gap in practice, on the one hand, reflects a misunderstanding of the right to personal data portability, and on the other hand, is a result of the negative externalities, practical costs, and technical limitations of the right. Based on rethinking the right to data portability, practical problems concerning it can be addressed through multiple measures, such as promoting the fulfillment of this right by legislation, optimizing technology-oriented operations, refining response process mechanisms, and enhancing system interoperability.