Over the past few decades, numerous optimization-based methods have been proposed for solving the classification problem in data mining. Classic optimization-based methods do not consider attribute interactions toward classification. Thus, a novel learning machine is needed to provide a better understanding of the nature of classification when the interaction among contributions from various attributes cannot be ignored. The interactions can be described by a non-additive measure, while the Choquet integral can serve as the mathematical tool to aggregate the values of attributes and the corresponding values of a non-additive measure. As a main part of this research, a new nonlinear classification method with non-additive measures is proposed. Experimental results show that applying non-additive measures to the classic optimization-based models improves classification robustness and accuracy compared with some popular classification methods. In addition, motivated by the well-known Support Vector Machine approach, we transform the primal optimization-based nonlinear classification model with the signed non-additive measure into its dual form by applying Lagrangian optimization theory and Wolfe's dual programming theory. As a result, the 2^n − 1 parameters of the signed non-additive measure can now be approximated with m (the number of records) Lagrangian multipliers by applying the necessary conditions for the primal classification problem to be optimal. This method of parameter approximation is a breakthrough for solving a non-additive measure practically when there is only a relatively small number of training cases available (m ≪ 2^n − 1). Furthermore, the kernel-based learning method engages the nonlinear classifiers to achieve better classification accuracy.
The research produces practically deliverable nonlinear models with the non-additive measure for the classification problem in data mining when interactions among attributes are considered.
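The Choquet-integral aggregation described above can be sketched as follows. This is a minimal illustration of the standard Choquet integral with respect to a set function; the two-attribute example and its measure values are assumptions for demonstration, not data from the study.

```python
def choquet_integral(values, mu):
    """Aggregate attribute values with a non-additive measure mu.

    values: dict mapping attribute name -> observed (nonnegative) value.
    mu: dict mapping frozenset of attribute names -> measure value.
    """
    # Sort attributes by ascending value: x_(1) <= ... <= x_(n)
    items = sorted(values.items(), key=lambda kv: kv[1])
    names = [k for k, _ in items]
    vals = [v for _, v in items]
    total, prev = 0.0, 0.0
    for i, v in enumerate(vals):
        # A_(i) = {x_(i), ..., x_(n)}: attributes whose value is >= v
        coalition = frozenset(names[i:])
        total += (v - prev) * mu[coalition]
        prev = v
    return total

# Interaction between attributes "a" and "b": mu({a,b}) > mu({a}) + mu({b})
mu = {frozenset({"a"}): 0.3, frozenset({"b"}): 0.5,
      frozenset({"a", "b"}): 1.0}
```

When the measure is additive (mu({a,b}) = mu({a}) + mu({b})), the integral reduces to the ordinary weighted sum, which is why classic linear models are a special case.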
The critical role of patient-reported outcome measures (PROMs) in enhancing clinical decision-making and promoting patient-centered care has gained profound significance in scientific research. PROMs encapsulate a patient's health status directly from their perspective, encompassing domains such as symptom severity, functional status, and overall quality of life. By integrating PROMs into routine clinical practice and research, healthcare providers can achieve a more nuanced understanding of patient experiences and tailor treatments accordingly. The deployment of PROMs supports dynamic patient-provider interactions, fostering better patient engagement and adherence to treatment plans. Moreover, PROMs are pivotal in clinical settings for monitoring disease progression and treatment efficacy, particularly in chronic and mental health conditions. However, challenges in implementing PROMs include data collection and management, integration into existing health systems, and acceptance by patients and providers. Overcoming these barriers necessitates technological advancements, policy development, and continuous education to enhance the acceptability and effectiveness of PROMs. The paper concludes with recommendations for future research and policy-making aimed at optimizing the use and impact of PROMs across healthcare settings.
Coal measures are significant hydrocarbon source rocks and reservoirs in petroliferous basins. Many large gas fields and coalbed methane (CBM) fields globally originate from coal-measure source rocks or accumulate in coal rocks. Inspired by the discovery of shale oil and gas, and guided by the overall exploration concept of considering coal rock as a reservoir, breakthroughs in the exploration and development of coal-rock gas have been achieved in deep coal seams with favorable preservation conditions, opening up a new development frontier for unconventional gas in coal-rock reservoirs. Based on data from exploration and development practices, a systematic study of the accumulation mechanism of coal-rock gas has been conducted. The mechanisms by which "three fields" control coal-rock gas accumulation are revealed, and it is confirmed that coal-rock gas differs from CBM in its accumulation process. The whole petroleum systems in the Carboniferous–Permian transitional-facies coal measures of the eastern margin of the Ordos Basin and in the Jurassic continental-facies coal measures of the Junggar Basin are characterized, and key research directions for further developing the whole petroleum system theory of coal measures are proposed. Compared to shale, coal rocks possess strong hydrocarbon generation potential, strong adsorption capacity, dual-medium reservoir properties, and partial or weak oil and gas self-sealing capacity. Additionally, unlike other unconventional gases such as shale gas and tight gas, coal-rock gas exhibits more complex accumulation characteristics, and its accumulation requires a certain coal-rock play to form lithological and structural traps. Coal-rock gas also has the characteristics of conventional fractured gas reservoirs. Compared with the basic theory and model of the whole petroleum system established for detrital rock formations, coal measures show distinct characteristics and differences in coal-rock reservoirs and source-reservoir coupling. The whole petroleum system of coal measures is composed of various types of coal-measure hydrocarbon plays, with coal (and dark shale) in the coal measures as source rock and reservoir, and with adjacent tight layers as reservoirs, cap layers, or transport layers. Under source-reservoir coupling, coal-rock gas accumulates in coal-rock reservoirs with good preservation conditions, tight oil/gas accumulates in tight layers, conventional oil/gas accumulates in traps far from sources, and coalbed methane accumulates in coal-rock reservoirs damaged by later geological processes. The proposed whole petroleum system of coal measures represents a novel type of whole petroleum system.
In the practice of healthcare, patient-reported outcomes (PROs) and PRO measures (PROMs) are used as an attempt to observe changes in complex clinical situations. They guide us in making evidence-based decisions regarding patient care by recording the change in outcomes for a particular treatment of a given condition, ultimately helping us understand whether a patient will benefit from a particular treatment and quantify the treatment effect. For any PROM to be usable in healthcare, it must be reliable, encapsulating the points of interest with the potential to detect any real change. Using structured outcome measures routinely in clinical practice helps the physician understand a patient's functional limitations that would otherwise not be clear in an office interview, and this allows the physician and patient to have a meaningful conversation as well as a customized plan for each patient. Having mentioned the rationale and benefits of PROMs, understanding the quantification process is crucial before embarking on management decisions. A better interpretation of change requires identifying the treatment effect based on clinical relevance for a given condition. There are multiple measurement indices serving this purpose, and most of them are used interchangeably without clear demarcation of their differences. This article details the various quantification metrics used to evaluate the treatment effect using PROMs, their limitations, and the scope of their usage and implementation in clinical practice.
Robustness against measurement uncertainties is crucial for gas turbine engine diagnosis. While current research focuses mainly on measurement noise, measurement bias remains challenging. This study proposes a novel performance-based fault detection and identification (FDI) strategy for twin-shaft turbofan gas turbine engines and addresses these uncertainties through a first-order Takagi-Sugeno-Kang (TSK) fuzzy inference system. To handle ambient condition changes, we use parameter correction to preprocess the raw measurement data, which reduces the FDI system's complexity. Additionally, the power-lever angle is set as a scheduling parameter to reduce the number of rules in the TSK-based FDI system. The data for designing, training, and testing the proposed FDI strategy are generated using a component-level turbofan engine model. The antecedent and consequent parameters of the TSK-based FDI system are optimized using the particle swarm optimization algorithm and ridge regression. A robust structure combining a specialized fuzzy inference system with the TSK-based FDI system is proposed to handle measurement biases. The performance of the first-order TSK-based FDI system and the robust FDI structure is evaluated through comprehensive simulation studies. Comparative studies confirm the superior accuracy of the first-order TSK-based FDI system in fault detection, isolation, and identification. The robust structure demonstrates a 2%-8% improvement in the success rate index under relatively large measurement bias conditions, indicating excellent robustness. Accuracy against significant bias values and computation time are also evaluated, suggesting that the proposed robust structure has desirable online performance. This study thus proposes a novel FDI strategy that effectively addresses measurement uncertainties.
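A first-order TSK system of the kind used above computes a firing-strength-weighted average of linear rule consequents. The sketch below shows only that core inference mechanism, with made-up membership functions and coefficients; it is not the paper's PSO/ridge-trained FDI system.

```python
def tsk_first_order(x, rules):
    """First-order TSK inference for a scalar input x.

    rules: list of (membership_fn, (a, b)) pairs. Each rule fires with
    strength w_i = membership_fn_i(x) and contributes the linear
    consequent a_i * x + b_i. The output is the weighted average:
        y = sum_i w_i * (a_i * x + b_i) / sum_i w_i
    """
    weights = [mf(x) for mf, _ in rules]
    total = sum(weights)
    return sum(w * (a * x + b)
               for w, (_, (a, b)) in zip(weights, rules)) / total

# Hypothetical two-rule system with constant (always-firing) memberships
rules = [(lambda x: 1.0, (1.0, 0.0)),
         (lambda x: 1.0, (3.0, 0.0))]
```

In the paper's setting, the membership functions (antecedents) would be tuned by particle swarm optimization and the linear coefficients (consequents) by ridge regression.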
Due to the numerous variables to take into account, as well as the inherent ambiguity and uncertainty, evaluating educational institutions can be difficult. The concept of a possibility Pythagorean fuzzy hypersoft set (pPyFHSS) is more flexible in this regard than other theoretical fuzzy set-like models, even though some attempts have been made in the literature to address such uncertainties. This study investigates the elementary notions of pPyFHSS, including its set-theoretic operations: union, intersection, complement, and the OR- and AND-operations. Some results related to these operations are also modified for pPyFHSS. Additionally, similarity measures between pPyFHSSs are formulated with the assistance of numerical examples and results. Lastly, an intelligent decision-assisted mechanism is developed through the proposal of a robust algorithm based on similarity measures for solving multi-attribute decision-making (MADM) problems. A case study that helps decision-makers assess the best educational institution is discussed to validate the suggested system. The algorithmic results are compared with the most pertinent model to evaluate the adaptability of pPyFHSS, as it generalizes the classical possibility fuzzy set-like theoretical models. Similarly, while considering significant evaluating factors, the flexibility of pPyFHSS is observed through structural comparison.
Understanding the mechanical properties of lithologies is crucial for accurately determining the horizontal stress magnitude. To investigate the correlation between rock mass properties and the maximum horizontal stress, the three-dimensional (3D) stress tensors at 89 measuring points, determined using an improved overcoring technique in nine mines in China, were adopted; a newly defined characteristic parameter C_(ERP) was proposed as an indicator for evaluating the structural properties of rock masses; and a fuzzy relation matrix was established using the information distribution method. The results indicate that both the vertical stress and the horizontal stress exhibit a good linear growth relationship with depth. There is no remarkable correlation between the elastic modulus, Poisson's ratio, and depth, and the distribution of data points is scattered. Moreover, there is no obvious relationship between the rock quality designation (RQD) and depth. The maximum horizontal stress σ_(H) is a function of rock properties, showing a certain linear relationship with C_(ERP) at the same depth. In addition, the overall trend of σ_(H) determined by the established fuzzy identification method is to increase with increasing C_(ERP). The fuzzy identification method also reveals a relatively detailed local relationship between σ_(H) and C_(ERP), and the predicted curve rises in a fluctuating way, which accords well with the measured stress data.
The centroid coordinate serves as a critical control parameter in motion systems, including aircraft, missiles, rockets, and drones, directly influencing their motion dynamics and control performance. Traditional methods for centroid measurement often necessitate custom equipment and specialized positioning devices, leading to high costs and limited accuracy. Here, we present a centroid measurement method that integrates 3D scanning technology, enabling accurate measurement of the centroid across various types of objects without the need for specialized positioning fixtures. A theoretical framework for centroid measurement was established, combining the principle of the multi-point weighing method with 3D scanning technology. The measurement accuracy was evaluated using a designed standard component. Experimental results demonstrate that the discrepancies between the theoretical and measured centroids of a standard component with various materials and complex shapes in the X, Y, and Z directions are 0.003 mm, 0.009 mm, and 0.105 mm, respectively, yielding a spatial deviation of 0.106 mm. Qualitative verification was conducted through experimental validation on three distinct types of objects, confirming the reliability of the proposed method, which allows accurate centroid measurement of various products without requiring positioning fixtures. This advancement significantly broadens the applicability and scope of centroid measurement devices, offering new theoretical insights and methodologies for the measurement of complex parts and systems.
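For the planar case, the multi-point weighing principle mentioned above reduces to a force-weighted average of the known support positions; the three-support layout and readings below are a hypothetical illustration, not data from the study.

```python
def centroid_xy(support_xy, forces):
    """Planar centroid from load-cell readings at known support positions.

    support_xy: list of (x, y) load-cell locations.
    forces: corresponding force readings F_i.
    Moment balance about each axis gives
        x_c = sum(F_i * x_i) / sum(F_i),  y_c = sum(F_i * y_i) / sum(F_i).
    """
    W = sum(forces)
    x_c = sum(f * x for (x, _), f in zip(support_xy, forces)) / W
    y_c = sum(f * y for (_, y), f in zip(support_xy, forces)) / W
    return x_c, y_c

# Three supports; the heavier reading at the origin pulls the centroid there
supports = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
readings = [2.0, 1.0, 1.0]
```

In the paper's method, 3D scanning supplies the support positions in the object's own coordinate frame, removing the need for positioning fixtures.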
The accurate characterization of thermoelectric properties at low temperatures is crucial for the development of high-performance thermoelectric cooling devices. While measurement errors of thermoelectric properties at temperatures above room temperature have been extensively discussed, there is a lack of standard measurement protocols and error analyses for low-temperature transport properties. In this study, we present a measurement system capable of characterizing all three key thermoelectric parameters, i.e., Seebeck coefficient, electrical conductivity, and thermal conductivity, for a single sample across a temperature range of 10 K to 300 K. We investigated six representative commercial Bi_(2)Te_(3)-based samples (three N-type and three P-type). Using an error propagation model, we systematically analyzed the measurement uncertainties of the three intrinsic parameters and the resulting thermoelectric figure of merit. Our findings reveal that measurement uncertainties for both N-type and P-type Bi_(2)Te_(3)-based materials can be effectively maintained below 5% in the temperature range of 40 K to 300 K. However, the uncertainties increase to over 10% at lower temperatures, primarily due to the relatively smaller values of electrical resistivity and Seebeck coefficients in this regime. This work establishes foundational data for Bi_(2)Te_(3)-based thermoelectric materials and provides a framework for broader investigations of advanced low-temperature thermoelectrics.
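An error-propagation model of the kind used above combines the relative uncertainties of the measured quantities in quadrature, with the Seebeck term counted twice because zT = S^2 T / (ρκ) depends on S squared. The input uncertainties below are assumed for illustration and are not the paper's values.

```python
from math import sqrt

def zt_relative_uncertainty(u_S, u_rho, u_kappa, u_T):
    """Relative uncertainty of zT = S^2 * T / (rho * kappa).

    For independent errors, first-order propagation gives
        u_zT = sqrt((2*u_S)^2 + u_rho^2 + u_kappa^2 + u_T^2),
    the factor 2 coming from the S^2 dependence.
    Inputs are relative uncertainties (e.g. 0.015 for 1.5%).
    """
    return sqrt((2 * u_S) ** 2 + u_rho ** 2 + u_kappa ** 2 + u_T ** 2)
```

With assumed uncertainties of 1.5% (S), 1.5% (ρ), 3% (κ), and 0.5% (T), the figure-of-merit uncertainty comes out near 4.5%, consistent in magnitude with the sub-5% range reported above.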
The concept of emissivity has been with the scientific and engineering world since Planck formulated his blackbody radiation law more than a century ago. Nevertheless, emissivity is an elusive concept even for experts, and a vague and fuzzy one for the wider community of engineers. The importance of remote sensing of temperature by measuring IR radiation has been recognized in a wide range of industrial, medical, and environmental uses. One of the major sources of error in IR radiometry is the emissivity of the surface being measured. In real experiments, emissivity may be influenced by many factors: surface texture, spectral properties, oxidation, and aging of surfaces. While commercial blackbodies are prevalent, the much-needed grey bodies with a known emissivity are unavailable. This study describes how to achieve a calibrated and stable emissivity with a blackbody, a perforated screen, and a reliable and linear novel IR thermal sensor, dubbed TMOS. The Digital TMOS is now a low-cost commercial product that requires low power and has a small form factor. The methodology is based on two-color measurements with two different optical filters, with wavelengths selected to conform to the grey body definition of the use case under study. With a photochemically etched perforated screen, the effective emissivity of the screen is simply the hole-area fraction of the surface, since only the holes emit according to the blackbody temperature radiation. The concept is illustrated with ray tracing simulations, which demonstrate the approach, and measured results are reported.
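The hole-area-fraction idea can be made concrete for a common screen geometry. Assuming circular holes of diameter d on a square grid of pitch p (an illustrative layout, not necessarily the one used in the study), the open-area fraction, and hence the nominal effective emissivity, is π d² / (4 p²).

```python
from math import pi

def effective_emissivity(d, p):
    """Nominal effective emissivity of a perforated screen.

    Circular holes of diameter d on a square grid of pitch p: each
    p x p cell contains one hole of area pi*d^2/4, so the open-area
    fraction is pi*d^2 / (4*p^2). The holes emit as the blackbody
    behind the screen; the opaque web is assumed non-emitting.
    """
    return pi * d ** 2 / (4 * p ** 2)
```

For example, 1 mm holes on a 2 mm pitch give an effective emissivity of about 0.196, a stable grey-body value set purely by etched geometry.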
A pseudo-cone in ℝ^(n) is a nonempty closed convex set K not containing the origin and such that λK ⊆ K for all λ ≥ 1. It is called a C-pseudo-cone if C is its recession cone, where C is a pointed closed convex cone with interior points. The cone-volume measure of a pseudo-cone can be defined similarly as for convex bodies, but it may be infinite. After proving a necessary condition for cone-volume measures of C-pseudo-cones, we introduce suitable weights for cone-volume measures, yielding finite measures. Then we provide a necessary and sufficient condition for a Borel measure on the unit sphere to be the weighted cone-volume measure of some C-pseudo-cone.
This study presents a new approach that advances the algorithm of similarity measures between generalized fuzzy numbers. Following a brief introduction to some properties of the proposed method, a comparative analysis based on 36 sets of generalized fuzzy numbers was performed, in which the degree of similarity of the fuzzy numbers was calculated with the proposed method and seven methods established by previous studies in the literature. The results of the comparison show that the proposed similarity measure outperforms the existing methods by overcoming their drawbacks and yielding accurate outcomes in all calculations of similarity measures under consideration. Finally, in a numerical example that involves recommending cars to customers based on a nine-member linguistic term set, the proposed similarity measure proves competent in addressing fuzzy number recommendation problems.
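As a concrete baseline of the kind such comparisons typically include, Chen's classical similarity measure for trapezoidal fuzzy numbers averages the coordinate-wise differences. The study's proposed measure refines baselines of this sort; whether this particular one is among its seven comparison methods is an assumption, and the numbers below are illustrative.

```python
def chen_similarity(a, b):
    """Chen's (1996) similarity between trapezoidal fuzzy numbers.

    a and b are 4-tuples (a1, a2, a3, a4) with coordinates in [0, 1]:
        S(A, B) = 1 - (sum of |a_i - b_i|) / 4
    S = 1 means identical fuzzy numbers; S decreases as they diverge.
    """
    return 1 - sum(abs(x - y) for x, y in zip(a, b)) / 4
```

A known drawback of this simple measure, which motivates more advanced ones, is that it ignores shape information such as height and center of gravity of the fuzzy numbers.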
In modern industrial design trends featuring integration, miniaturization, and versatility, there is a growing demand for microstructural array devices. The measurement of such microstructural array components often encounters challenges due to their reduced scale and complex structures, whether by contact or non-contact optical approaches. Among these microstructural arrays, there are still no optical measurement methods for micro corner-cube reflector arrays. To solve this problem, this study introduces a method for effectively eliminating coherent noise and achieving surface profile reconstruction in interference measurements of microstructural arrays. The proposed denoising method allows the calibration and inverse solving of system errors in the frequency domain by employing standard components with known surface types. This enables the effective compensation of the complex amplitude of non-sample coherent light within the interferometer optical path. The proposed surface reconstruction method enables profile calculation even when there is complex multi-reflection during the propagation of rays in microstructural arrays. Based on the measurement results, two novel metrics are defined to estimate diffraction errors at array junctions and comprehensive errors across multiple array elements, offering insights into other types of microstructure devices. This research not only addresses the challenges of coherent noise and multi-reflection, but also represents a breakthrough for quantitative optical interference measurement of microstructural array devices.
Developing highly active and stable oxygen evolution reaction (OER) catalysts necessitates the establishment of a comprehensive OER catalyst database. However, the absence of a standardized benchmarking protocol has hindered this progress. In this work, we present a systematic protocol for electrochemical measurements to thoroughly evaluate the activity and stability of OER electrocatalysts. We begin with a detailed introduction to constructing the electrochemical system, encompassing the experimental setup and the selection criteria for electrodes and electrolytes. Potential contaminants originating from electrolytes, cells, and electrodes are identified and their impacts are discussed. We also examine the effects of external factors, such as temperature, magnetic fields, and natural light, on OER measurements. The protocol outlines operational mechanisms and recommended settings for various electrochemical techniques, including cyclic voltammetry (CV), potentiostatic electrochemical impedance spectroscopy (PEIS), Tafel slope analysis, and pulse voltammetry (PV). We summarize existing evaluation methodologies for assessing intrinsic activities and long-term stabilities of catalysts. Based on these discussions, we propose a comprehensive protocol for evaluating OER electrocatalysts' performance. Finally, we offer perspectives on advancing OER catalysts from laboratory research to industrial applications.
Real-time and accurate drogue pose measurement during docking is basic and critical for Autonomous Aerial Refueling (AAR). Vision measurement is the most practicable technique, but its measurement accuracy and robustness are easily affected by the limited computing power of airborne equipment, complex aerial scenes, and partial occlusion. To address these challenges, we propose a novel drogue keypoint detection and pose measurement algorithm based on monocular vision and realize real-time processing on airborne embedded devices. First, a lightweight network is designed with structural re-parameterization to reduce computational cost and improve inference speed, and a sub-pixel-level keypoint prediction head and loss functions are adopted to improve keypoint detection accuracy. Second, a closed-form solution of the drogue pose is computed based on double spatial circles, followed by nonlinear refinement based on Levenberg-Marquardt optimization. Both virtual and physical simulation experiments were used to test the proposed method. In the virtual simulation, the mean pixel error of the proposed method is 0.787 pixels, significantly superior to that of other methods. In the physical simulation, the mean relative measurement error is 0.788%, and the mean processing time is 13.65 ms on embedded devices.
This study investigates a consistent fusion algorithm for distributed multi-rate multi-sensor systems operating in feedback-memory configurations, where each sensor's sampling period is uniform and an integer multiple of the state update period. The focus is on scenarios where the correlations among Measurement Noises (MNs) from different sensors are unknown. First, a non-augmented local estimator applicable to such sampling cases is designed to provide unbiased Local Estimates (LEs) at the fusion points. Subsequently, a measurement-equivalent approach is developed to parameterize the correlation structure between LEs and reformulate LEs into a unified form, thereby constraining the correlations arising from MNs to an admissible range. Simultaneously, a family of upper bounds on the joint error covariance matrix of LEs is derived based on the constrained correlations, avoiding the need to calculate the exact error cross-covariance matrix of LEs. Finally, a sequential fusion estimator is proposed in the sense of Weighted Minimum Mean Square Error (WMMSE), and it is proven to be unbiased, consistent, and more accurate than the well-known covariance intersection method. Simulation results illustrate the effectiveness of the proposed algorithm by highlighting improvements in consistency and accuracy.
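The covariance intersection method used as the comparison baseline above fuses two estimates whose cross-covariance is unknown by taking a convex combination in information (inverse-covariance) space, which guarantees consistency for any mixing weight. A minimal sketch:

```python
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega):
    """Fuse estimates (x1, P1) and (x2, P2) with unknown cross-covariance.

    Information-space convex combination, omega in [0, 1]:
        P^-1 = omega * P1^-1 + (1 - omega) * P2^-1
        x    = P @ (omega * P1^-1 @ x1 + (1 - omega) * P2^-1 @ x2)
    The fused P is a consistent (conservative) covariance bound
    regardless of the true correlation between the two estimates.
    """
    I1, I2 = np.linalg.inv(P1), np.linalg.inv(P2)
    P = np.linalg.inv(omega * I1 + (1 - omega) * I2)
    x = P @ (omega * I1 @ x1 + (1 - omega) * I2 @ x2)
    return x, P
```

In practice omega is chosen to minimize, e.g., the trace of the fused covariance; the conservatism of this bound is exactly what the proposed WMMSE sequential estimator improves upon.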
Investment remains a cornerstone of China's economic resilience amid global headwinds and structural transitions. In the short term, it drives sectoral growth, creates jobs, and stabilizes economic momentum. Over the long term, a well-structured investment allocation fuels industrial innovation and upgrading, laying a solid foundation for sustainable economic development.
To address the standardization demands of the railway industry under the new circumstances, the National Railway Administration revised current regulations and issued the Measures for Management of Railway Technical Standards to implement relevant deployments and mechanism construction, based on the current state of the railway standards system and standards management.
The estimation of quantum phase differences plays an important role in quantum simulation and quantum computation, yet existing quantum phase estimation algorithms face critical limitations on noisy intermediate-scale quantum (NISQ) devices due to their excessive depth and circuit complexity. We demonstrate a high-precision phase difference estimation protocol based on the Bayesian phase difference estimation algorithm and single-photon projective measurement. The iterative framework of the algorithm, combined with its independence from controlled unitary operations, inherently mitigates the circuit depth and complexity limitations. Through an experimental realization on a photonic system, we demonstrate high-precision estimation of diverse phase differences, showing root-mean-square errors (RMSE) below the standard quantum limit 𝒪(1/√N) and reaching the Heisenberg scaling 𝒪(1/N) after a certain number of iterations. Our scheme provides a critical advantage in quantum resource-constrained scenarios and advances practical implementations of quantum information tasks under realistic hardware constraints.
Two-dimensional (2D) materials are promising for next-generation electronic devices and systems due to their unique physical properties. Interfacial adhesion plays a vital role not only in the synthesis, transfer, and manipulation of 2D materials but also in the manufacture, integration, and performance of functional devices. However, the atomic thickness and limited lateral dimensions of 2D materials make the accurate measurement and modulation of their interfacial adhesion energy challenging. In this review, recent advances in the measurement and modulation of the interfacial adhesion properties of 2D materials are systematically surveyed. Experimental methods and related theoretical models for the adhesion measurement of 2D materials are summarized, with their scope of application and limitations discussed. The measured adhesion energies between 2D materials and various substrates are described by category, and typical adhesion modulation strategies for 2D materials are introduced. Finally, the remaining challenges and opportunities for the interfacial adhesion measurement and modulation of 2D materials are presented. This paper provides guidance for addressing adhesion issues in devices and systems involving 2D materials.
Abstract: The critical role of patient-reported outcome measures (PROMs) in enhancing clinical decision-making and promoting patient-centered care has gained profound significance in scientific research. PROMs encapsulate a patient's health status directly from their perspective, encompassing domains such as symptom severity, functional status, and overall quality of life. By integrating PROMs into routine clinical practice and research, healthcare providers can achieve a more nuanced understanding of patient experiences and tailor treatments accordingly. The deployment of PROMs supports dynamic patient-provider interactions, fostering better patient engagement and adherence to treatment plans. Moreover, PROMs are pivotal in clinical settings for monitoring disease progression and treatment efficacy, particularly in chronic and mental health conditions. However, challenges in implementing PROMs include data collection and management, integration into existing health systems, and acceptance by patients and providers. Overcoming these barriers necessitates technological advancements, policy development, and continuous education to enhance the acceptability and effectiveness of PROMs. The paper concludes with recommendations for future research and policy-making aimed at optimizing the use and impact of PROMs across healthcare settings.
Funding: Supported by the PetroChina Basic Project (2024DJ23) and the CNPC Science Research and Technology Development Project (2021DJ0101).
Abstract: Coal measures are significant hydrocarbon source rocks and reservoirs in petroliferous basins. Many large gas fields and coalbed methane fields globally originate from coal-measure source rocks or accumulate in coal rocks. Inspired by the discovery of shale oil and gas, and guided by the overall exploration concept of treating coal rock as a reservoir, breakthroughs in the exploration and development of coal-rock gas have been achieved in deep coal seams with favorable preservation conditions, opening up a new frontier for unconventional gas in coal-rock reservoirs. Based on data from exploration and development practices, a systematic study of the accumulation mechanism of coal-rock gas has been conducted. The mechanisms by which the "three fields" control coal-rock gas accumulation are revealed, confirming that coal-rock gas differs from coalbed methane (CBM) in its accumulation process. The whole petroleum systems in the Carboniferous–Permian transitional-facies coal measures of the eastern margin of the Ordos Basin and in the Jurassic continental-facies coal measures of the Junggar Basin are characterized, and key research directions for further developing the whole petroleum system theory of coal measures are proposed. Compared to shale, coal rocks possess strong hydrocarbon generation potential, strong adsorption capacity, dual-medium reservoir properties, and partial or weak oil and gas self-sealing capacity. Additionally, unlike other unconventional gases such as shale gas and tight gas, coal-rock gas exhibits more complex accumulation characteristics, and its accumulation requires coal-rock plays that form lithological and structural traps. Coal-rock gas also shares characteristics with conventional fractured gas reservoirs. Compared with the basic theory and model of the whole petroleum system established for detrital rock formations, coal measures show distinct characteristics and differences in coal-rock reservoirs and source-reservoir coupling. The whole petroleum system of coal measures is composed of various types of coal-measure hydrocarbon plays, with coal (and dark shale) in coal measures as source rock and reservoir, and with adjacent tight layers as reservoirs, cap rocks, or transport layers. Under source-reservoir coupling, coal-rock gas accumulates in coal-rock reservoirs with good preservation conditions, tight oil/gas accumulates in tight layers, conventional oil/gas accumulates in traps far from sources, and coalbed methane accumulates in coal-rock reservoirs damaged by later geological processes. The proposed whole petroleum system of coal measures represents a novel type of whole petroleum system.
Abstract: In healthcare practice, patient-reported outcomes (PROs) and PRO measures (PROMs) are used to observe changes in complex clinical situations. They guide evidence-based decisions regarding patient care by recording the change in outcomes for a particular treatment of a given condition, and ultimately help us understand whether a patient will benefit from a particular treatment and quantify the treatment effect. For any PROM to be usable in health care, it needs to be reliable, capture the points of interest, and have the potential to detect real change. Using structured outcome measures routinely in clinical practice helps the physician understand functional limitations of a patient that would otherwise not be clear in an office interview, allowing the physician and patient to have a meaningful conversation and a customized plan for each patient. Having mentioned the rationale and benefits of PROMs, understanding the quantification process is crucial before embarking on management decisions. A better interpretation of change needs to identify the treatment effect based on clinical relevance for a given condition. Multiple measurement indices serve this purpose, and most of them are used interchangeably without clear demarcation of their differences. This article details the various quantification metrics used to evaluate the treatment effect using PROMs, their limitations, and the scope of their usage and implementation in clinical practice.
Abstract: Robustness against measurement uncertainties is crucial for gas turbine engine diagnosis. While current research focuses mainly on measurement noise, measurement bias remains challenging. This study proposes a novel performance-based fault detection and identification (FDI) strategy for twin-shaft turbofan gas turbine engines and addresses these uncertainties through a first-order Takagi-Sugeno-Kang (TSK) fuzzy inference system. To handle ambient condition changes, we use parameter correction to preprocess the raw measurement data, which reduces the FDI system's complexity. Additionally, the power-lever angle is set as a scheduling parameter to reduce the number of rules in the TSK-based FDI system. The data for designing, training, and testing the proposed FDI strategy are generated using a component-level turbofan engine model. The antecedent and consequent parameters of the TSK-based FDI system are optimized using the particle swarm optimization algorithm and ridge regression. A robust structure combining a specialized fuzzy inference system with the TSK-based FDI system is proposed to handle measurement biases. The performance of the first-order TSK-based FDI system and the robust FDI structure is evaluated through comprehensive simulation studies. Comparative studies confirm the superior accuracy of the first-order TSK-based FDI system in fault detection, isolation, and identification. The robust structure demonstrates a 2%-8% improvement in the success rate index under relatively large measurement bias conditions, indicating excellent robustness. Accuracy against significant bias values and computation time are also evaluated, suggesting that the proposed robust structure has desirable online performance. Overall, this study proposes a novel FDI strategy that effectively addresses measurement uncertainties.
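As a rough sketch of how a first-order TSK system maps an input to an output: each rule has a fuzzy antecedent and a consequent that is linear in the input, and the output is the firing-strength-weighted average of the consequents. The single-input Gaussian antecedents and linear consequents below are hypothetical, not the paper's PSO-tuned multi-input FDI system.

```python
import math

def tsk_first_order(x, rules):
    """Minimal first-order TSK inference for a single input x.
    Each rule is (c, s, a, b): a Gaussian antecedent with center c and
    width s, and a linear consequent y_i = a*x + b."""
    weights = [math.exp(-((x - c) ** 2) / (2 * s ** 2)) for c, s, _, _ in rules]
    ys = [a * x + b for _, _, a, b in rules]
    return sum(w * y for w, y in zip(weights, ys)) / sum(weights)

# Two hypothetical rules; at x = 1.0 both fire equally, so the output is
# the plain average of the two consequents (1.0 and 1.5).
rules = [(0.0, 1.0, 1.0, 0.0), (2.0, 1.0, 0.5, 1.0)]
y = tsk_first_order(1.0, rules)  # -> 1.25
```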
Funding: Supported by the Deanship of Graduate Studies and Scientific Research at Qassim University (QU-APC-2024-9/1).
Abstract: Due to the numerous variables to take into account, as well as the inherent ambiguity and uncertainty, evaluating educational institutions can be difficult. The concept of a possibility Pythagorean fuzzy hypersoft set (pPyFHSS) is more flexible in this regard than other theoretical fuzzy set-like models, even though some attempts have been made in the literature to address such uncertainties. This study investigates the elementary notions of pPyFHSS, including its set-theoretic operations: union, intersection, complement, and the OR- and AND-operations. Some results related to these operations are also modified for pPyFHSS. Additionally, similarity measures between pPyFHSSs are formulated with the assistance of numerical examples and results. Lastly, an intelligent decision-assisted mechanism is developed with the proposal of a robust algorithm based on similarity measures for solving multi-attribute decision-making (MADM) problems. A case study that helps decision-makers assess the best educational institution is discussed to validate the suggested system. The algorithmic results are compared with the most pertinent model to evaluate the adaptability of pPyFHSS, as it generalizes the classical possibility fuzzy set-like theoretical models. Similarly, while considering significant evaluating factors, the flexibility of pPyFHSS is observed through structural comparison.
Funding: Financially supported by the National Natural Science Foundation of China (No. 52204084); the Open Research Fund of the State Key Laboratory of Coal Resources and Safe Mining, CUMT, China (No. SKLCRSM23KF004); the Interdisciplinary Research Project for Young Teachers of USTB (Fundamental Research Funds for the Central Universities), China (No. FRF-IDRY-GD22-002); the Fundamental Research Funds for the Central Universities and the Youth Teacher International Exchange and Growth Program, China (No. QNXM20220009); the National Key R&D Program of China (Nos. 2022YFC2905600 and 2022YFC3004601); and the Science, Technology & Innovation Project of Xiongan New Area, China (No. 2023XAGG0061).
Abstract: Understanding the mechanical properties of lithologies is crucial to accurately determining the horizontal stress magnitude. To investigate the correlation between rock mass properties and the maximum horizontal stress, the three-dimensional (3D) stress tensors at 89 measuring points, determined using an improved overcoring technique in nine mines in China, were adopted; a newly defined characteristic parameter C_ERP was proposed as an indicator for evaluating the structural properties of rock masses; and a fuzzy relation matrix was established using the information distribution method. The results indicate that both the vertical stress and the horizontal stress exhibit a good linear growth relationship with depth. There is no remarkable correlation between the elastic modulus, Poisson's ratio, and depth, and the distribution of data points is scattered. Moreover, there is no obvious relationship between the rock quality designation (RQD) and depth. The maximum horizontal stress σ_H is a function of rock properties, showing a certain linear relationship with C_ERP at the same depth. In addition, the overall trend of σ_H determined by the established fuzzy identification method is to increase with increasing C_ERP. The fuzzy identification method also reveals a relatively detailed local relationship between σ_H and C_ERP, with the predicted curve rising in a fluctuating way, which accords well with the measured stress data.
Funding: Supported by the National Natural Science Foundation of China (No. 52176122).
Abstract: The centroid coordinate serves as a critical control parameter in motion systems, including aircraft, missiles, rockets, and drones, directly influencing their motion dynamics and control performance. Traditional methods for centroid measurement often necessitate custom equipment and specialized positioning devices, leading to high costs and limited accuracy. Here, we present a centroid measurement method that integrates 3D scanning technology, enabling accurate measurement of the centroid of various types of objects without the need for specialized positioning fixtures. A theoretical framework for centroid measurement was established, combining the principle of the multi-point weighing method with 3D scanning technology. The measurement accuracy was evaluated using a designed standard component. Experimental results demonstrate that the discrepancies between the theoretical and measured centroids of a standard component with various materials and complex shapes in the X, Y, and Z directions are 0.003 mm, 0.009 mm, and 0.105 mm, respectively, yielding a spatial deviation of 0.106 mm. Qualitative verification was conducted through experimental validation on three distinct types of objects, confirming the reliability of the proposed method, which allows accurate centroid measurement of various products without requiring positioning fixtures. This advancement significantly broadens the applicability and scope of centroid measurement devices, offering new theoretical insights and methodologies for the measurement of complex parts and systems.
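The multi-point weighing principle underlying the method can be sketched simply: the planar centroid is the force-weighted average of the support positions. This is a minimal illustration, not the authors' 3D-scanning implementation; the sensor layout and load readings are hypothetical.

```python
def centroid_xy(positions, forces):
    """Planar centroid from multi-point weighing.
    positions: list of (x, y) support points; forces: load-cell readings."""
    total = sum(forces)
    x = sum(f * p[0] for p, f in zip(positions, forces)) / total
    y = sum(f * p[1] for p, f in zip(positions, forces)) / total
    return x, y

# Three supports of a plate under an uneven load distribution (N).
supports = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
loads = [10.0, 20.0, 10.0]
cx, cy = centroid_xy(supports, loads)  # -> (0.5, 0.25)
```

In the paper's method, the support coordinates would come from the 3D scan rather than from a positioning fixture.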
Funding: Supported by the National Natural Science Foundation of China (Grant No. 52172259); the National Key Research and Development Program of China (Grant Nos. 2021YFA0718700 and 2022YFB3803900); and the Fundamental Research Funds for Inner Mongolia Normal University (Grant No. 2022JBTD008).
Abstract: The accurate characterization of thermoelectric properties at low temperatures is crucial for the development of high-performance thermoelectric cooling devices. While measurement errors of thermoelectric properties at temperatures above room temperature have been extensively discussed, there is a lack of standard measurement protocols and error analyses for low-temperature transport properties. In this study, we present a measurement system capable of characterizing all three key thermoelectric parameters, i.e., the Seebeck coefficient, electrical conductivity, and thermal conductivity, for a single sample across a temperature range of 10 K to 300 K. We investigated six representative commercial Bi₂Te₃-based samples (three N-type and three P-type). Using an error propagation model, we systematically analyzed the measurement uncertainties of the three intrinsic parameters and the resulting thermoelectric figure of merit. Our findings reveal that measurement uncertainties for both N-type and P-type Bi₂Te₃-based materials can be effectively maintained below 5% in the temperature range of 40 K to 300 K. However, the uncertainties increase to over 10% at lower temperatures, primarily due to the relatively smaller values of electrical resistivity and Seebeck coefficients in this regime. This work establishes foundational data for Bi₂Te₃-based thermoelectric materials and provides a framework for broader investigations of advanced low-temperature thermoelectrics.
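The error-propagation reasoning can be illustrated with the figure of merit zT = S²σT/κ: under standard first-order propagation of independent relative uncertainties, the Seebeck term contributes twice (it enters squared). The uncertainty values below are illustrative, not the paper's measured ones.

```python
import math

def zT_relative_uncertainty(rel_S, rel_sigma, rel_kappa, rel_T=0.0):
    """First-order error propagation for zT = S^2 * sigma * T / kappa,
    assuming independent errors: the Seebeck relative error is doubled
    because S enters squared."""
    return math.sqrt((2 * rel_S) ** 2 + rel_sigma ** 2
                     + rel_kappa ** 2 + rel_T ** 2)

# Hypothetical per-parameter relative uncertainties (2%, 2%, 3%).
u = zT_relative_uncertainty(rel_S=0.02, rel_sigma=0.02, rel_kappa=0.03)
# u = sqrt(0.0016 + 0.0004 + 0.0009) ≈ 0.054, i.e. ~5.4% in zT
```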
Abstract: The concept of emissivity has been with the scientific and engineering world since Planck formulated his blackbody radiation law more than a century ago. Nevertheless, emissivity is an elusive concept even for experts, and a vague and fuzzy one for the wider community of engineers. The importance of remote sensing of temperature by measuring IR radiation has been recognized in a wide range of industrial, medical, and environmental uses. One of the major sources of error in IR radiometry is the emissivity of the surface being measured. In real experiments, emissivity may be influenced by many factors: surface texture, spectral properties, oxidation, and aging of surfaces. While commercial blackbodies are prevalent, the much-needed grey bodies with a known emissivity are unavailable. This study describes how to achieve a calibrated and stable emissivity with a blackbody, a perforated screen, and a reliable and linear novel IR thermal sensor dubbed TMOS. The Digital TMOS is now a low-cost commercial product; it requires low power and has a small form factor. The methodology is based on two-color measurements with two different optical filters, with selected wavelengths conforming to the grey body definition of the use case under study. With a photochemically etched perforated screen, the effective emissivity of the screen is simply the fractional hole area of the surface that emits according to the blackbody temperature radiation. The concept is illustrated with ray tracing simulations, which demonstrate the approach, and measured results are reported.
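The geometric claim above — that the screen's effective emissivity equals its fractional hole area — can be sketched for circular holes on a square grid. The hole diameter and pitch below are hypothetical, not the screen actually etched in the study.

```python
import math

def effective_emissivity(hole_diameter, pitch):
    """Open-area fraction of circular holes on a square grid with the
    given pitch; per the text, this fraction is the effective emissivity
    of a perforated screen backed by a blackbody."""
    return math.pi * hole_diameter ** 2 / (4 * pitch ** 2)

# Hypothetical screen: 0.5 mm holes on a 1.0 mm square grid.
eps = effective_emissivity(hole_diameter=0.5, pitch=1.0)  # pi/16 ≈ 0.196
```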
Abstract: A pseudo-cone in ℝⁿ is a nonempty closed convex set K not containing the origin and such that λK ⊆ K for all λ ≥ 1. It is called a C-pseudo-cone if C is its recession cone, where C is a pointed closed convex cone with interior points. The cone-volume measure of a pseudo-cone can be defined similarly as for convex bodies, but it may be infinite. After proving a necessary condition for cone-volume measures of C-pseudo-cones, we introduce suitable weights for cone-volume measures, yielding finite measures. We then provide a necessary and sufficient condition for a Borel measure on the unit sphere to be the weighted cone-volume measure of some C-pseudo-cone.
Abstract: This study presents a new approach that advances the algorithm of similarity measures between generalized fuzzy numbers. Following a brief introduction to some properties of the proposed method, a comparative analysis based on 36 sets of generalized fuzzy numbers was performed, in which the degree of similarity of the fuzzy numbers was calculated with the proposed method and seven methods established by previous studies in the literature. The results of the analytical comparison show that the proposed similarity outperforms the existing methods by overcoming their drawbacks and yielding accurate outcomes in all calculations of similarity measures under consideration. Finally, in a numerical example that involves recommending cars to customers based on a nine-member linguistic term set, the proposed similarity measure proves to be competent in addressing fuzzy number recommendation problems.
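For context, one classic baseline of the kind such proposals are compared against is Chen's similarity for trapezoidal fuzzy numbers defined on [0, 1]. This is a sketch of that baseline, not the measure proposed in the study; the two fuzzy numbers are hypothetical.

```python
def chen_similarity(a, b):
    """Chen's similarity between two trapezoidal fuzzy numbers
    a = (a1, a2, a3, a4) and b = (b1, b2, b3, b4) on [0, 1]:
    one minus the mean absolute difference of corresponding points."""
    return 1.0 - sum(abs(x - y) for x, y in zip(a, b)) / 4.0

# Two hypothetical trapezoidal fuzzy numbers shifted by 0.1.
s = chen_similarity((0.1, 0.2, 0.3, 0.4), (0.2, 0.3, 0.4, 0.5))  # -> 0.9
```

Measures of this simple form are known to give counterintuitive results for some fuzzy-number pairs, which is the kind of drawback the proposed method aims to overcome.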
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 52375414 and 52075100) and the Shanghai Science and Technology Committee Innovation Grant of China (Grant No. 23ZR1404200).
Abstract: In modern industrial design trends featuring integration, miniaturization, and versatility, there is a growing demand for microstructural array devices. The measurement of such microstructural array components often encounters challenges due to their reduced scale and complex structures, whether by contact or non-contact optical approaches. Among these microstructural arrays, there are still no optical measurement methods for micro corner-cube reflector arrays. To solve this problem, this study introduces a method for effectively eliminating coherent noise and achieving surface profile reconstruction in interference measurements of microstructural arrays. The proposed denoising method allows the calibration and inverse solving of system errors in the frequency domain by employing standard components with known surface types. This enables effective compensation of the complex amplitude of non-sample coherent light within the interferometer optical path. The proposed surface reconstruction method enables profile calculation even when there is complex multi-reflection during the propagation of rays in microstructural arrays. Based on the measurement results, two novel metrics are defined to estimate diffraction errors at array junctions and comprehensive errors across multiple array elements, offering insights for other types of microstructure devices. This research not only addresses the challenges of coherent noise and multi-reflection, but also makes a breakthrough in the quantitative optical interference measurement of microstructural array devices.
Funding: Supported by the Fundamental Research Funds for the Central Universities (20822041H4082).
Abstract: Developing highly active and stable oxygen evolution reaction (OER) catalysts necessitates the establishment of a comprehensive OER catalyst database. However, the absence of a standardized benchmarking protocol has hindered this progress. In this work, we present a systematic protocol for electrochemical measurements to thoroughly evaluate the activity and stability of OER electrocatalysts. We begin with a detailed introduction to constructing the electrochemical system, encompassing the experimental setup and the selection criteria for electrodes and electrolytes. Potential contaminants originating from electrolytes, cells, and electrodes are identified and their impacts discussed. We also examine the effects of external factors, such as temperature, magnetic fields, and natural light, on OER measurements. The protocol outlines operational mechanisms and recommended settings for various electrochemical techniques, including cyclic voltammetry (CV), potentiostatic electrochemical impedance spectroscopy (PEIS), Tafel slope analysis, and pulse voltammetry (PV). We summarize existing evaluation methodologies for assessing the intrinsic activities and long-term stabilities of catalysts. Based on these discussions, we propose a comprehensive protocol for evaluating OER electrocatalysts' performance. Finally, we offer perspectives on advancing OER catalysts from laboratory research to industrial applications.
Funding: Supported by the National Science Fund for Distinguished Young Scholars, China (No. 51625501); the Aeronautical Science Foundation of China (No. 20240046051002); and the National Natural Science Foundation of China (No. 52005028).
Abstract: Real-time and accurate drogue pose measurement during docking is basic and critical for Autonomous Aerial Refueling (AAR). Vision measurement is the most practicable technique, but its measurement accuracy and robustness are easily affected by the limited computing power of airborne equipment, complex aerial scenes, and partial occlusion. To address these challenges, we propose a novel drogue keypoint detection and pose measurement algorithm based on monocular vision, and realize real-time processing on airborne embedded devices. First, a lightweight network is designed with structural re-parameterization to reduce computational cost and improve inference speed, and a sub-pixel keypoint prediction head and corresponding loss functions are adopted to improve keypoint detection accuracy. Second, a closed-form solution of the drogue pose is computed based on double spatial circles, followed by a nonlinear refinement based on Levenberg-Marquardt optimization. Both virtual and physical simulation experiments have been used to test the proposed method. In the virtual simulation, the mean pixel error of the proposed method is 0.787 pixels, significantly superior to that of other methods. In the physical simulation, the mean relative measurement error is 0.788%, and the mean processing time is 13.65 ms on embedded devices.
Funding: Supported by the National Natural Science Foundation of China (Nos. 62276204 and 62203343).
Abstract: This study investigates a consistent fusion algorithm for distributed multi-rate multi-sensor systems operating in feedback-memory configurations, where each sensor's sampling period is uniform and an integer multiple of the state update period. The focus is on scenarios where the correlations among Measurement Noises (MNs) from different sensors are unknown. First, a non-augmented local estimator that applies to such sampling cases is designed to provide unbiased Local Estimates (LEs) at the fusion points. Subsequently, a measurement-equivalent approach is developed to parameterize the correlation structure between LEs and reformulate LEs into a unified form, thereby constraining the correlations arising from MNs to an admissible range. Simultaneously, a family of upper bounds on the joint error covariance matrix of LEs is derived based on the constrained correlations, avoiding the need to calculate the exact error cross-covariance matrix of LEs. Finally, a sequential fusion estimator is proposed in the sense of Weighted Minimum Mean Square Error (WMMSE), and it is proven to be unbiased, consistent, and more accurate than the well-known covariance intersection method. Simulation results illustrate the effectiveness of the proposed algorithm by highlighting improvements in consistency and accuracy.
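The covariance intersection baseline mentioned above fuses two estimates whose cross-correlation is unknown by mixing their information (inverse covariances). A one-dimensional sketch follows; the mixing weight here is fixed rather than optimized, and the numbers are hypothetical.

```python
def covariance_intersection(x1, p1, x2, p2, w):
    """Scalar covariance intersection: fuse estimates (x1, p1) and
    (x2, p2) with unknown cross-correlation. w in [0, 1] is the mixing
    weight; in practice it is chosen to minimize the fused covariance."""
    info = w / p1 + (1 - w) / p2          # mixed information
    p = 1.0 / info                         # fused (conservative) covariance
    x = p * (w * x1 / p1 + (1 - w) * x2 / p2)
    return x, p

# Two equally uncertain estimates of the same state, fused with w = 0.5.
xf, pf = covariance_intersection(x1=1.0, p1=2.0, x2=3.0, p2=2.0, w=0.5)
# -> xf = 2.0; pf = 2.0 (no covariance reduction: CI is conservative)
```

The conservatism visible here (the fused covariance does not shrink) is exactly what the proposed WMMSE sequential fusion estimator is reported to improve upon.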
Abstract: Investment remains a cornerstone of China's economic resilience amid global headwinds and structural transitions. In the short term, it drives sectoral growth, creates jobs, and stabilizes economic momentum. Over the long term, a well-structured investment allocation fuels industrial innovation and upgrading, laying a solid foundation for sustainable economic development.
Abstract: To address the standardization demands of the railway industry under the new circumstances, the National Railway Administration revised current regulations and issued the Measures for Management of Railway Technical Standards to implement relevant deployments and mechanism construction, based on the current situation of the railway standards system and standards management.
Funding: Project supported by the Natural Science Foundation of Jiangsu Province (Grant Nos. BK20233001 and BK20243060) and the National Natural Science Foundation of China (Grant No. 62288101).
Abstract: The estimation of quantum phase differences plays an important role in quantum simulation and quantum computation, yet existing quantum phase estimation algorithms face critical limitations on noisy intermediate-scale quantum (NISQ) devices due to their excessive depth and circuit complexity. We demonstrate a high-precision phase difference estimation protocol based on the Bayesian phase difference estimation algorithm and single-photon projective measurement. The iterative framework of the algorithm, combined with its independence from controlled unitary operations, inherently mitigates circuit depth and complexity limitations. Through an experimental realization on a photonic system, we demonstrate high-precision estimation of diverse phase differences, showing root-mean-square errors (RMSE) below the standard quantum limit 𝒪(1/√N) and reaching the Heisenberg scaling 𝒪(1/N) after a certain number of iterations. Our scheme provides a critical advantage in quantum resource-constrained scenarios and advances practical implementations of quantum information tasks under realistic hardware constraints.
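The two RMSE scalings referenced above can be compared directly. This is just the textbook scaling relationship, not the experiment's data; N stands for the number of quantum resources (e.g., repetitions or iterations).

```python
def standard_quantum_limit(n):
    """RMSE scaling O(1/sqrt(N)) achievable with N independent,
    uncorrelated measurements."""
    return 1.0 / n ** 0.5

def heisenberg_limit(n):
    """RMSE scaling O(1/N), the fundamental lower bound."""
    return 1.0 / n

# At N = 100 resources, the Heisenberg scaling is 10x tighter than the SQL;
# the gap widens as sqrt(N) with increasing N.
ratio = standard_quantum_limit(100) / heisenberg_limit(100)
```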
Funding: Supported by the National Natural Science Foundation of China (Grant Nos. 12002133, 12372109, and 11972171); the Natural Science Foundation of Jiangsu Province (Grant Nos. BK20200590 and BK20180031); the Fundamental Research Funds for the Central Universities (Grant No. JUSRP121040); the National Key R&D Program of China (Grant No. 2023YFB4605101); the 111 Project (Grant No. B18027); the Open Fund of the Key Laboratory for Intelligent Nano Materials and Devices of the Ministry of Education (Grant No. NJ2020003); and the Sixth Phase of the Jiangsu Province "333 High Level Talent Training Project" (Second Level Talents).
Abstract: Two-dimensional (2D) materials are promising for next-generation electronic devices and systems due to their unique physical properties. Interfacial adhesion plays a vital role not only in the synthesis, transfer, and manipulation of 2D materials but also in the manufacture, integration, and performance of the functional devices. However, the atomic thickness and limited lateral dimensions of 2D materials make the accurate measurement and modulation of their interfacial adhesion energy challenging. In this review, recent advances in the measurement and modulation of the interfacial adhesion properties of 2D materials are systematically surveyed. Experimental methods and related theoretical models for the adhesion measurement of 2D materials are summarized, with their scope of application and limitations discussed. The measured adhesion energies between 2D materials and various substrates are described by category, and typical adhesion modulation strategies for 2D materials are also introduced. Finally, the remaining challenges and opportunities for the interfacial adhesion measurement and modulation of 2D materials are presented. This paper provides guidance for addressing adhesion issues in devices and systems involving 2D materials.