Given the growing number of vehicle accidents caused by unintended acceleration and braking failure, verifying Sudden Unintended Acceleration (SUA) incidents has become a persistent challenge. A central issue of debate is whether such events stem from mechanical malfunctions or driver pedal misapplications. However, existing verification procedures implemented by vehicle manufacturers often involve closed tests after vehicle recalls, raising ongoing concerns about reliability and transparency. Consequently, there is a growing need for a user-driven framework that enables independent data acquisition and verification. Although previous studies have addressed SUA detection using deep learning, few have explored how class granularity optimization affects power efficiency and inference performance in real-time Edge AI systems. To address this problem, this work presents a cloud-assisted artificial intelligence (AI) solution for the reliable verification of SUA occurrences. The proposed system integrates multimodal sensor streams, including camera-based foot images, On-Board Diagnostics II (OBD-II) signals, and six-axis measurements, to determine whether the brake pedal was actually engaged at the moment of a suspected SUA. Beyond image acquisition, convolutional neural network (CNN) models perform real-time inference to classify the driver's pedal operation states, with the resulting outputs transmitted and archived in the cloud. A dedicated dataset of brake and accelerator pedal images was collected from 15 vehicles produced by 6 domestic and international manufacturers. Using this dataset, transfer learning techniques were applied to compare and analyze model performance and generalization as the CNN class granularity varied from coarse to fine levels. Furthermore, classification performance was evaluated in terms of latency and power efficiency under different class configurations. The experimental results demonstrated that the proposed solution identified the driver's pedal behavior accurately and promptly, with the two-class model achieving the highest F1-score and accuracy among all granularity settings.
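Class granularity in the study above amounts to merging fine pedal-state labels into coarser ones and re-scoring. A minimal sketch of that comparison, with hypothetical label names (the paper's exact class sets are not given here) and a hand-rolled macro-F1:

```python
# Hypothetical fine-grained pedal states; the 4 -> 2 merge below is an
# illustrative granularity reduction, not the paper's exact label set.
MERGE_TO_COARSE = {
    "brake_full": "brake", "brake_partial": "brake",
    "accel_full": "accel", "accel_partial": "accel",
}

def macro_f1(y_true, y_pred):
    """Macro-averaged F1 over the label set present in y_true."""
    f1s = []
    for c in sorted(set(y_true)):
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Toy predictions that confuse sub-states within the same pedal.
y_true_fine = ["brake_full", "brake_partial", "accel_full", "accel_partial"] * 2
y_pred_fine = ["brake_partial", "brake_full", "accel_partial", "accel_full"] * 2

fine_f1 = macro_f1(y_true_fine, y_pred_fine)
coarse_f1 = macro_f1([MERGE_TO_COARSE[y] for y in y_true_fine],
                     [MERGE_TO_COARSE[y] for y in y_pred_fine])
print(fine_f1, coarse_f1)  # → 0.0 1.0: the coarse merge recovers the brake/accel decision
```

The toy numbers only illustrate the mechanism: errors that cross sub-state boundaries but stay within the same pedal vanish once classes are merged, which is one way a two-class model can dominate finer granularities.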
In order to investigate the influence of processing parameters on the granularity distribution of superalloy powders during the atomization of plasma rotating electrode processing (PREP), FGH95 superalloy powders were prepared under different processing conditions by PREP, and the influence of the PREP processing parameters on the granularity distribution of the FGH95 powders is discussed based on fractal geometry theory. The results show that with increasing rotating velocity of the self-consuming electrode, the fractal dimension of the granularity distribution increases linearly, which results in a larger proportion of smaller powders. Changing the interval between the plasma gun and the self-consuming electrode has little effect on the granularity distribution, and correspondingly the fractal dimension changes only slightly.
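The fractal dimension of a powder size distribution is commonly estimated from the power law N(>d) ∝ d^(−D) by a log-log least-squares fit. A small sketch on synthetic sieve data (the sizes and prefactor are illustrative, not PREP measurements):

```python
import math

def fractal_dimension(sizes, counts):
    """Estimate D from cumulative counts N(>d) ~ d^-D via a log-log
    least-squares fit; D is the magnitude of the (negative) slope."""
    xs = [math.log(d) for d in sizes]
    ys = [math.log(n) for n in counts]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return -slope

# Synthetic sieve data following N(>d) = 1e6 * d^-2.5 exactly.
sizes = [10, 20, 40, 80, 160]               # particle diameters (μm), illustrative
counts = [1e6 * d ** -2.5 for d in sizes]   # cumulative counts above each size
D = fractal_dimension(sizes, counts)
print(round(D, 3))  # → 2.5
```

A higher D means the log-log line falls more steeply, i.e. fine particles make up a larger share, matching the rotating-velocity trend reported above.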
Person re-identification (Re-ID) has achieved great progress in recent years. However, person Re-ID methods still suffer from body-part missing and occlusion problems, which make the learned representations less reliable. In this paper, we propose a robust coarse granularity part-level network (CGPN) for person Re-ID, which extracts robust regional features and integrates supervised global features for pedestrian images. CGPN gains a two-fold benefit toward higher accuracy for person Re-ID. On one hand, CGPN learns to extract effective regional features for pedestrian images. On the other hand, compared with extracting global features directly by the backbone network, CGPN learns to extract more accurate global features with a supervision strategy. The single model trained on three Re-ID datasets achieves state-of-the-art performance. Especially on CUHK03, the most challenging Re-ID dataset, we obtain a top result of Rank-1/mean average precision (mAP) = 87.1%/83.6% without re-ranking.
Precise integration methods solve structural dynamic responses, and the corresponding time-integration formula is composed of two parts: the multiplication of an exponential matrix with a vector, and the integration term. The second term can be solved by a series solution. Two hybrid-granularity parallel algorithms are designed: the exponential matrix and the first term are computed by a fine-grained parallel algorithm, and the second term is computed by a coarse-grained parallel algorithm. Numerical examples show that these two hybrid-granularity parallel algorithms obtain higher speedup and parallel efficiency than two existing parallel algorithms.
The dynamic distribution model is one of the best schemes for parallel volume rendering. However, in a homogeneous cluster system, since the granularity is traditionally identical, all processors communicate almost simultaneously and the computation load may lose balance. To address these problems, a dynamic distribution model with prime granularity for parallel computing is presented. The granularities of the processors are pairwise relatively prime, and the related theory is introduced. High parallel performance can be achieved by minimizing network competition and using a load-balancing strategy that ensures all processors finish almost simultaneously. Based on the Master-Slave-Gleaner (MSG) scheme, the parallel splatting algorithm for volume rendering is used to test the model on an IBM Cluster 1350 system. The experimental results show that the model brings a considerable improvement in performance, including computation efficiency, total execution time, speed, and load balancing.
This paper proposes an optimal solution for choosing the number of enhancement layers in the fine granularity scalability (FGS) scheme under the constraint of minimum transmission energy, in which FGS is combined with transmission-energy control so that the FGS enhancement-layer transmission energy is minimized while the distortion is guaranteed. By changing the bit-plane level and packet loss rate, the minimum transmission energy of the enhancement layer is obtained while the expected distortion is satisfied.
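The selection problem above can be viewed as a small constrained search: among candidate enhancement-layer configurations, pick the one with minimum transmission energy whose expected distortion meets the bound. A hedged sketch with made-up operating points (not measurements from the paper):

```python
def pick_enhancement_layers(candidates, max_distortion):
    """Among (layers, energy, expected_distortion) candidates, choose the
    configuration with minimum transmission energy whose distortion meets
    the bound; returns None if no configuration is feasible."""
    feasible = [c for c in candidates if c[2] <= max_distortion]
    return min(feasible, key=lambda c: c[1]) if feasible else None

# (num_layers, energy_mJ, distortion_MSE) -- hypothetical operating points.
points = [(1, 10.0, 48.0), (2, 18.0, 30.0), (3, 29.0, 22.0), (4, 43.0, 20.5)]
best = pick_enhancement_layers(points, max_distortion=31.0)
print(best)  # → (2, 18.0, 30.0): cheapest point meeting the distortion bound
```

In the actual scheme each operating point would be derived from the bit-plane level and packet loss rate rather than tabulated by hand.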
Based on the content of radioactive elements (U, Th, K) in the strata of two drill holes in the Fuzhou basin, combined with the results of spore-pollen analysis, the relationship between radioactivity and both lithology and depositional environment is discussed. The results show that the content of radioactive substances is related to the granularity and lithology of the sediment: it is higher in argillaceous sediment (e.g., silt and clay), lower in sandy sediment, and intermediate in gravels. The content of radioactive substances is also related to paleoclimate: a warm and humid environment is propitious to the deposition of radioactive substances, while a cool and dry climate is just the reverse.
Rough set philosophy hinges on the granularity of data, which is used to build all its basic concepts, such as approximations, dependencies, and reduction. Genetic Algorithms provide a general framework for optimizing problem solutions in complex systems without depending on the problem domain, and they are robust across many kinds of problems. This paper combines Genetic Algorithms with rough set theory to compute knowledge granules, illustrated through an example information table. The combination enables us to compute knowledge granules effectively, and it is also useful for automatic computing and information processing.
In this paper, some important issues of granularity are discussed, mainly in information systems (ISs) based on binary relations. Firstly, the vector representation method of knowledge granules is proposed in an information system based on a binary relation, to eliminate the limitations of the set representation method. Secondly, operators among knowledge granularities are introduced and some important properties of them are studied carefully. Thirdly, the distance between two knowledge granules is established, and a granular space is constructed based on it. Fourthly, an axiomatic definition of knowledge granularity is investigated, and one can find that some existing knowledge granularities are special cases under this definition. In addition, as an application of the knowledge granular space, an example is employed to validate some results in our work.
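One standard measure consistent with the axiomatic treatments above is GK(R) = Σ|Xᵢ|²/|U|², where the Xᵢ are the equivalence blocks induced by an attribute set. A sketch on a toy information table (the table and attribute names are invented for illustration):

```python
def partition_by(universe, rows, attrs):
    """Group objects whose attribute values agree on `attrs`
    (the equivalence relation induced by the attribute set)."""
    blocks = {}
    for obj in universe:
        key = tuple(rows[obj][a] for a in attrs)
        blocks.setdefault(key, set()).add(obj)
    return list(blocks.values())

def knowledge_granularity(universe, blocks):
    """GK(R) = sum(|X_i|^2) / |U|^2 -- equals 1/|U| for the finest
    partition (all singletons) and 1 for the coarsest (one block)."""
    n = len(universe)
    return sum(len(b) ** 2 for b in blocks) / n ** 2

# Toy information table: four objects, two condition attributes.
rows = {
    "x1": {"color": "red",  "size": "big"},
    "x2": {"color": "red",  "size": "big"},
    "x3": {"color": "blue", "size": "big"},
    "x4": {"color": "blue", "size": "small"},
}
U = list(rows)
blocks = partition_by(U, rows, ["color", "size"])   # {x1,x2}, {x3}, {x4}
print(knowledge_granularity(U, blocks))  # → 0.375  ((4+1+1)/16)
```

Coarsening the attribute set can only merge blocks, so GK never decreases, which is the monotonicity such axiomatic definitions typically require.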
In this paper, we study the development trends and general applications of fuzzy rough granular computing theory. Granular computing is a new concept and computing paradigm for general information processing, covering the theory, methods, techniques, and tools related to granularity. The basic ideas of granular computing appear in many areas, such as interval analysis, rough set theory, clustering analysis, information retrieval, machine learning, and databases. With its domain-partition-based characterization of target concepts and rule acquisition, it is widely used in knowledge discovery, data mining, and pattern recognition. On this basis, we propose a fuzzy rough theory-based computing paradigm that achieves ideal performance.
Background: Systemic lupus erythematosus (SLE) is a complex autoimmune disease characterized by fluctuating activity. The Systemic Lupus Erythematosus Disease Activity Index (SLEDAI-2K) is commonly used to assess disease activity, but its complexity and limited diagnostic availability pose challenges. Alternative biomarkers, such as extended inflammatory parameters (EIP), can be measured using hematology analyzers. This study examined the correlation between changes in SLEDAI-2K scores and EIP values in SLE patients. Methods: A retrospective cohort study was conducted using secondary data from the Hasan Sadikin Lupus Registry. SLEDAI-2K scores and EIP values were recorded at baseline and during follow-up (1-6 months). EIP included neutrophil granularity intensity (Neut-GI), neutrophil reactivity intensity, reactive lymphocytes (Re-Lymp), and antibody-synthesizing lymphocytes. Correlations were analyzed using Spearman rank analysis. Results: Among 53 patients (median age 32.2 [Q1-Q3: 27.9-43.8] years), Spearman rank analysis revealed a moderate negative correlation between Neut-GI changes and SLEDAI-2K scores (r = -0.4, p = 0.002). A weak negative correlation was found between Re-Lymp changes and SLEDAI-2K scores (r = -0.3, p = 0.016), while neutrophil reactivity intensity and antibody-synthesizing lymphocytes showed no significant correlation. Conclusions: Changes in Neut-GI and Re-Lymp correlated with SLEDAI-2K score fluctuations, suggesting that decreasing Neut-GI and Re-Lymp levels accompany increasing disease activity. These parameters may serve as simpler, cost-effective alternatives for monitoring SLE activity.
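The correlations above come from Spearman rank analysis, which is simply the Pearson correlation of the rank vectors. A self-contained sketch with invented (non-registry) numbers, using average ranks for ties:

```python
def ranks(xs):
    """1-based average ranks; tied values share the mean of their ranks."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the rank vectors."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Illustrative (not registry) data: a biomarker falling as activity rises.
sledai = [2, 4, 6, 8, 10]
neut_gi = [152.0, 151.2, 150.9, 149.7, 148.3]
print(round(spearman(sledai, neut_gi), 2))  # → -1.0 (perfectly monotone decrease)
```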
Fine-grained visual parsing, including fine-grained part segmentation and fine-grained object recognition, has attracted considerable critical attention due to its importance in many real-world applications, e.g., agriculture, remote sensing, and space technologies. Predominant research efforts tackle these fine-grained sub-tasks following different paradigms, while the inherent relations between these tasks are neglected. Moreover, given that most of the research remains fragmented, we conduct an in-depth study of the advanced work from a new perspective of learning the part relationship. From this perspective, we first consolidate recent research and benchmark syntheses with new taxonomies. Based on this consolidation, we revisit the universal challenges in fine-grained part segmentation and recognition tasks and propose new solutions by part-relationship learning for these important challenges. Furthermore, we conclude with several promising lines of research in fine-grained visual parsing for future work.
Declarative Programming Languages (DPLs) apply a process model of Horn clauses, such as PARLOG [8], or a reduction model of the λ-calculus, such as SML [7], and are, in principle, well suited to multiprocessor implementation. However, the performance of a parallel declarative program can be impaired by a mismatch between the parallelism available in an application and the parallelism available in the architecture. A particularly attractive solution is to automatically match the parallelism of the program to the parallelism of the target hardware as a compilation step. In this paper, we present an optimizing compilation technique called granularity analysis, which identifies and removes excess parallelism that would degrade performance. The main steps are: an analysis of the flow of data to form an attributed call graph between function (or predicate) arguments, and an asymptotic estimation of the granularity of a function (or predicate) to generate an approximate grain size. Compiled procedure calls can be annotated with grain size, and a task scheduler can make scheduling decisions with the classification scheme of grains to control parallelism at runtime. The resulting granularity analysis scheme is suitable for exploiting adaptive parallelism of declarative programming languages on multiprocessors.
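The two main steps, grain-size estimation over a call graph and threshold-based scheduling, can be caricatured in a few lines. Everything below (the graph, the unit costs, the threshold) is a hypothetical toy, not the paper's analysis:

```python
def grain_size(call_graph, costs, fn, memo=None):
    """Toy grain estimate: a function's grain is its own cost plus the
    grains of everything it calls (a crude stand-in for the paper's
    asymptotic estimation over an attributed call graph)."""
    memo = {} if memo is None else memo
    if fn not in memo:
        memo[fn] = costs[fn] + sum(grain_size(call_graph, costs, g, memo)
                                   for g in call_graph.get(fn, []))
    return memo[fn]

def schedule(call_graph, costs, threshold):
    """Classify grains: spawn coarse grains in parallel, run fine grains
    sequentially (inline) to avoid task-creation overhead."""
    return {fn: ("parallel" if grain_size(call_graph, costs, fn) >= threshold
                 else "sequential") for fn in costs}

# Hypothetical attributed call graph with per-function unit costs.
graph = {"main": ["solve", "log"], "solve": ["step", "step2"]}
costs = {"main": 1, "solve": 5, "step": 20, "step2": 30, "log": 2}
print(schedule(graph, costs, threshold=25))
```

The real analysis produces symbolic, asymptotic grain sizes from argument flow rather than fixed constants, but the scheduling decision it feeds has this shape.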
The combination of the spatial distribution, semantic characteristics, and sometimes temporal dynamics of POIs inside a geographic region can capture its unique land use characteristics. Most previous studies on POI-based land use modeling focused on one geographic region and selected one spatial scale and semantic granularity for land use characterization. There is a lack of understanding of the impact of spatial scale, semantic granularity, and geographic context on POI-based land use modeling, particularly large-scale land use modeling. In this study, we developed a scalable POI-based land use modeling framework and examined the impact of these three factors on POI-based land use characterization using data from three geographic regions. We developed a unified semantic representation framework for POI semantics that can help fuse heterogeneous POI data sources. Then, by combining POIs with a neural network language model, we developed a spatially explicit approach to learn the embedding representations of POIs and AOIs. We trained multiple supervised classifiers using AOI embeddings as input features to predict AOI land use at different semantic granularities. The classification performance of different land use classes was analyzed and compared across the three geographic regions to identify the semantic representativeness of POI-based AOI embeddings and the impact of geographic context.
Pores among particles provide the main space for the storage and migration of deep underground fluids (such as oil, gas, groundwater, and unconventional natural gas). The pores form a pore structure with complex morphology that is mainly dominated by the shape and distribution of the particles. Therefore, the reconstruction of the pore structure of granular porous media and the evaluation of particle roundness have become an important foundation for the study of fluid flow through deep underground rock mass. This research proposes a novel approach for a multi-scale model with angular vertexes. Fractal topology theory and Voronoi space-segmentation technology are combined for the reconstruction of fractal granular porous media. The angular shapes are smoothed using a modified B-spline technique, and particles with varying degrees of roundness are generated. To validate the superiority of our approach, the roundness based on the Wadell roundness calculation method is calculated and compared with the roundness obtained from particles smoothed using the vertex rounding substitution method. Results show that the roundness of particles smoothed with the modified B-spline technique closely aligns with the corresponding set rounded level (a nondimensional variable). Conversely, the vertex rounding substitution method is limited to a single dimensionally rounded radius. This innovative approach offers a new method for constructing granular porous media for fluid-flow studies in deep underground rock mass.
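Wadell's roundness, used above as the reference measure, is the mean of the corner curvature radii divided by the radius of the largest inscribed circle. A minimal sketch (the inputs are assumed to be already extracted from a particle outline):

```python
def wadell_roundness(corner_radii, max_inscribed_radius):
    """Wadell's 2-D roundness: mean corner curvature radius divided by
    the radius of the largest inscribed circle. Equals 1.0 for a circle;
    approaches 0 for a very angular grain."""
    if not corner_radii:
        raise ValueError("need at least one corner radius")
    return sum(corner_radii) / len(corner_radii) / max_inscribed_radius

# A perfect circle: every 'corner' radius equals the inscribed radius.
print(wadell_roundness([2.0, 2.0, 2.0], 2.0))                # → 1.0
# An angular grain: sharp corners with small curvature radii.
print(round(wadell_roundness([0.2, 0.3, 0.25, 0.25], 1.0), 3))  # → 0.25
```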
Granular materials exhibit complex macroscopic mechanical behaviors closely related to their microscale microstructural features. Traditional macroscopic phenomenological elasto-plastic models, however, usually have complex formulations and lack explicit relations to these microstructural features. To avoid these limitations, this study proposes a micromechanics-based softening hyperelastic model for granular materials, integrating softening hyperelasticity with microstructural insights to capture strain softening, critical state, and strain localization behaviors. The model has two key advantages: (1) a clear conceptualization, straightforward formulation, and ease of numerical implementation (via an Abaqus UMAT subroutine in this study); (2) explicit incorporation of micro-scale features (e.g., contact stiffness, particle size, porosity) to reveal their influences on macroscopic responses. An isotropic directional distribution density of contacts and three specific microstructures are considered, and their softening hyperelastic constitutive modulus tensors are explicitly derived. By introducing a softening factor and a critical failure energy density, the model can describe geomaterial behaviors, simulating residual strength, X-shaped shear bands, and strain localization evolution. Numerical validations in comparison with the macro-scale hyperelastic model, the Abaqus Drucker-Prager model, and experiment confirm its accuracy. Parametric studies reveal critical dependencies: a normal-to-tangential contact stiffness ratio of 2-8 (depending on stiffness magnitude), an internal length of 2-4 mm to ensure shear band formation, and a critical failure energy density (≤10 kJ/m³) to trigger strain softening and localization. The influences of the specific microstructures on strain localization and softening are investigated. The model also shows mesh independence due to the introduction of an internal length. The model's applicability is further demonstrated by slope stability analysis, capturing slip-surface evolution and load-displacement characteristics. This study develops a robust microstructure-aware hyperelastic framework to describe the mechanical behaviors of granular materials, providing multiscale insights for geotechnical engineering applications.
In this paper, a novel method for investigating the particle-crushing behavior of breeding particles in a fusion blanket is proposed. Fractal theory and the Weibull distribution are combined to establish a theoretical model, and its validity is verified using a simple impact test. A crushable discrete element method (DEM) framework is built based on the established theoretical model. A tensile strength that accounts for fractal theory, the size effect, and Weibull variation is assigned to each generated particle. The assigned strength is then used for crush detection by comparing it with the particle's maximum tensile stress. Mass conservation is ensured by inserting a series of sub-particles whose total mass equals the mass lost. Based on the crushable DEM framework, a numerical simulation of the crushing behavior of a pebble bed with hollow cylindrical geometry under a uniaxial compression test was performed. The results show that the particles withstand the external load by contact and sliding at the beginning of the compression process, and they confirm that crushing can be considered an important mechanism for resisting the increasing external load. A relatively regular particle arrangement aids in resisting the load and reduces the occurrence of particle crushing. However, there is a limit to this promotion of resistance: when the strain increases beyond it, the distribution of the crushing positions tends to become isotropic over the entire pebble bed. The theoretical model and crushable DEM framework provide a new method for exploring the pebble bed in a fusion reactor while considering particle crushing.
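Assigning each particle a Weibull-distributed tensile strength with a size effect can be done by inverse-transform sampling; a common weakest-link form scales the characteristic strength as (d0/d)^(3/m) with particle diameter d. All parameter values below are illustrative, not the paper's calibration:

```python
import math
import random

def weibull_strength(diameter, d0=1.0, sigma0=100.0, m=5.0, rng=random):
    """Sample a tensile strength (MPa) with Weibull scatter (modulus m)
    and a volume-based weakest-link size effect: the characteristic
    strength scales as (d0/d)^(3/m). Inverse-transform sampling:
    sigma = scale * (-ln(1-U))^(1/m). Parameter values are illustrative."""
    u = rng.random()
    scale = sigma0 * (d0 / diameter) ** (3.0 / m)
    return scale * (-math.log(1.0 - u)) ** (1.0 / m)

random.seed(42)
strengths_small = [weibull_strength(0.5) for _ in range(2000)]
strengths_large = [weibull_strength(2.0) for _ in range(2000)]
mean_small = sum(strengths_small) / len(strengths_small)
mean_large = sum(strengths_large) / len(strengths_large)
print(mean_small > mean_large)  # → True: smaller pebbles are statistically stronger
```

In a crushable DEM loop, the sampled strength would be stored per particle at generation time and compared against the particle's maximum tensile stress at each step for crush detection.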
Granular flow, such as hopper discharge and debris flows, involves complex multi-scale, multi-phase, and multi-physics coupling, posing significant challenges for numerical simulation. Over the past two decades, methods such as the Discrete Element Method (DEM), Smoothed Particle Hydrodynamics (SPH), and the Depth-Averaging Method (DAM) have been developed to address these problems. However, their applicability across different scales remains unclear due to differences in physical assumptions and numerical algorithms. Therefore, a comprehensive evaluation is critically needed. This study selects three typical methods (DEM, SPH, and DAM) to examine their convergence behavior, boundary condition implementation, and limitations in physical and numerical modeling. We numerically studied three extreme-deformation flow cases with the three chosen methods: granular column collapse at the particle scale, flow-structure interaction at the laboratory scale, and reconstruction of the 2015 Shenzhen Guangming landslide at the field scale. By comparing the granular flow dynamics, deposition morphology, and structure interactions, as well as the simulation accuracy and computational efficiency, we show the applicability of the three models across different scales. Further, we provide practical guidance for model selection in large-deformation flow problems in granular systems of different scales.
Purpose: Three-way decision (3WD) and probabilistic rough sets (PRSs) are theoretical tools capable of simulating humans' multi-level and multi-perspective thinking modes in the field of decision-making. They are proposed to assist decision-makers in better managing incomplete or imprecise information under conditions of uncertainty or fuzziness. However, they can easily cause decision losses, and the personal thresholds of decision-makers cannot be taken into account. To solve this problem, this paper combines picture fuzzy (PF) multi-granularity (MG) with 3WD and establishes the notion of PF MG 3WD. Design/methodology/approach: An effective incomplete model based on PF MG 3WD is designed in this paper. First, the form of PF MG incomplete information systems (IISs) is established to reasonably record uncertain information. On this basis, the PF conditional probability is established using PF similarity relations, and the concept of adjustable PF MG PRSs is proposed, using the PF conditional probability to fuse data. Then, a comprehensive PF multi-attribute group decision-making (MAGDM) scheme is formed by the adjustable PF MG PRSs and the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method. Finally, an actual breast cancer data set is used to demonstrate the validity of the constructed method. Findings: The experimental results confirm the effectiveness of PF MG 3WD in predicting breast cancer. Compared with existing models, PF MG 3WD has better robustness and generalization performance. This is mainly due to the incomplete PF MG 3WD proposed in this paper, which effectively reduces the influence of unreasonable outliers and threshold settings. Originality/value: The model employs the VIKOR method for optimal granularity selection, which takes into account both group utility maximization and individual regret minimization, while also incorporating decision-makers' subjective preferences. This ensures that the experiment maintains higher exclusion stability and reliability, enhancing the robustness of the decision results.
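VIKOR, used above for optimal granularity selection, ranks alternatives by a compromise index Q that blends group utility S (the weighted sum of normalized gaps to the best value) against individual regret R (the worst single gap). A hedged sketch for benefit criteria with invented scores:

```python
def vikor(matrix, weights, v=0.5):
    """VIKOR over a benefit-criteria decision matrix (rows = alternatives,
    columns = criteria): S = group utility, R = worst individual regret,
    Q = v*norm(S) + (1-v)*norm(R); lower Q ranks higher. v weighs group
    utility against individual regret (v = 0.5 is the usual compromise)."""
    m = len(matrix[0])
    best = [max(row[j] for row in matrix) for j in range(m)]
    worst = [min(row[j] for row in matrix) for j in range(m)]
    S, R = [], []
    for row in matrix:
        gaps = [weights[j] * (best[j] - row[j]) / ((best[j] - worst[j]) or 1)
                for j in range(m)]
        S.append(sum(gaps))
        R.append(max(gaps))
    s_min, s_max, r_min, r_max = min(S), max(S), min(R), max(R)
    return [v * (s - s_min) / ((s_max - s_min) or 1)
            + (1 - v) * (r - r_min) / ((r_max - r_min) or 1)
            for s, r in zip(S, R)]

# Hypothetical granularity alternatives scored on three benefit criteria.
scores = [[0.9, 0.7, 0.8], [0.6, 0.9, 0.5], [0.4, 0.5, 0.6]]
q = vikor(scores, weights=[0.5, 0.3, 0.2])
print(q.index(min(q)))  # → 0: alternative 0 has the lowest (best) Q
```

Full VIKOR additionally checks "acceptable advantage" and "acceptable stability" conditions before declaring a single compromise solution; those checks are omitted here for brevity.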
Granular information has emerged as a potent tool for data representation and processing across various domains. However, existing time series data granulation techniques often overlook the influence of external factors. In this study, a multisource time series data granularity conversion model is proposed that achieves granularity conversion effectively while maintaining result consistency and stability. The model incorporates the impact of external source data using a multivariate linear regression model, and the entropy weighting method is employed to allocate weights and finalize the granularity conversion. Through experimental analysis using Beijing's 2022 air quality dataset, our proposed method outperforms traditional information granulation approaches, providing valuable decision-making insights for industrial system optimization and research.
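The entropy weighting step above assigns larger weights to indicators that vary more (and thus carry more information) across samples. A minimal sketch on toy multisource readings:

```python
import math

def entropy_weights(matrix):
    """Entropy weight method over a matrix with rows = samples and
    columns = indicators: compute each column's normalized Shannon
    entropy e_j, take the divergence 1 - e_j as its information content,
    and normalize the divergences into weights. Assumes positive values."""
    n, m = len(matrix), len(matrix[0])
    divergences = []
    for j in range(m):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [c / total for c in col]
        e = -sum(pi * math.log(pi) for pi in p if pi > 0) / math.log(n)
        divergences.append(1.0 - e)
    s = sum(divergences)
    return [d / s for d in divergences]

# Toy multisource readings: indicator 0 is constant, indicator 1 varies.
data = [[5.0, 1.0], [5.0, 4.0], [5.0, 9.0]]
w = entropy_weights(data)
print(w[1] > w[0])  # → True: the varying indicator gets (almost) all the weight
```

A constant column has maximal entropy (e = 1), hence zero divergence and zero weight, which is exactly the behavior wanted when fusing external sources of unequal informativeness.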
Funding (SUA verification study): Supported by the Basic Science Research Program to the Research Institute for Basic Sciences (RIBS) of Jeju National University through the National Research Foundation of Korea (NRF), funded by the Ministry of Education (RS-2019-NR040080). This research was also carried out with the support of the Jeju RISE Center, funded by the Ministry of Education and Jeju Special Self-Governing Province in 2025, as part of the "Regional Innovation System & Education (RISE): Glocal University 30" initiative.
Abstract: Person re-identification (Re-ID) has achieved great progress in recent years. However, person Re-ID methods still suffer from body-part missing and occlusion problems, which make the learned representations less reliable. In this paper, we propose a robust coarse granularity part-level network (CGPN) for person Re-ID, which extracts robust regional features and integrates supervised global features for pedestrian images. CGPN gains a two-fold benefit toward higher accuracy for person Re-ID. On one hand, CGPN learns to extract effective regional features for pedestrian images. On the other hand, compared with extracting global features directly by the backbone network, CGPN learns to extract more accurate global features with a supervision strategy. The single model trained on three Re-ID datasets achieves state-of-the-art performance. Especially on CUHK03, the most challenging Re-ID dataset, we obtain a top result of Rank-1/mean average precision (mAP) = 87.1%/83.6% without re-ranking.
Funding: the National Natural Science Foundation of China (No. 60273048).
Abstract: Precise integration methods for structural dynamic responses and the corresponding time-integration formula consist of two parts: the multiplication of an exponential matrix with a vector, and an integration term, which can be solved by a series solution. Two hybrid granularity parallel algorithms are designed: the exponential matrix and the first term are computed by a fine-grained parallel algorithm, while the second term is computed by a coarse-grained parallel algorithm. Numerical examples show that these two hybrid granularity parallel algorithms achieve higher speedup and parallel efficiency than two existing parallel algorithms.
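The exponential-matrix part is typically computed in precise integration by the 2^N scaling-and-squaring device, accumulating T − I rather than T to limit round-off. A small serial sketch of that device (not the paper's parallel implementation; names are ours):

```python
from typing import List

Matrix = List[List[float]]

def matmul(A: Matrix, B: Matrix) -> Matrix:
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def madd(A: Matrix, B: Matrix) -> Matrix:
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def expm_precise(H: Matrix, dt: float, N: int = 20) -> Matrix:
    """exp(H*dt) via precise integration: Taylor-expand over tau = dt/2**N,
    then square N times, accumulating Ta = T - I to limit round-off."""
    n = len(H)
    tau = dt / (2 ** N)
    Ht = [[h * tau for h in row] for row in H]
    Ta, term = Ht, Ht                     # 4-term Taylor of exp(H*tau) - I
    for k in range(2, 5):
        term = [[v / k for v in row] for row in matmul(term, Ht)]
        Ta = madd(Ta, term)
    for _ in range(N):                    # (I + Ta)^2 = I + (2*Ta + Ta*Ta)
        Ta = madd(madd(Ta, Ta), matmul(Ta, Ta))
    I = [[float(i == j) for j in range(n)] for i in range(n)]
    return madd(I, Ta)
```

In the fine-grained stage described above, the matrix products inside each squaring step are what get distributed across processors.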
Funding: Supported by the Natural Science Foundation of China (No. 60373061).
Abstract: The dynamic distribution model is one of the best schemes for parallel volume rendering. However, in a homogeneous cluster system, since the granularity is traditionally identical, all processors communicate almost simultaneously and the computation load may lose balance. To address these problems, a dynamic distribution model with prime granularity for parallel computing is presented. The granularities of the processors are relatively prime, and the related theory is introduced. High parallel performance is achieved by minimizing network contention and using a load-balancing strategy that ensures all processors finish almost simultaneously. Based on the Master-Slave-Gleaner (MSG) scheme, the parallel splatting algorithm for volume rendering is used to test the model on an IBM Cluster 1350 system. The experimental results show that the model brings a considerable improvement in performance, including computation efficiency, total execution time, speed, and load balancing.
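The key idea — pairwise relatively prime granularities, so that processors' communication points rarely coincide — can be sketched as a greedy selection (an illustration of the principle, not the paper's algorithm):

```python
from math import gcd
from typing import List

def coprime_granularities(n_procs: int, base: int) -> List[int]:
    """Greedily pick n_procs pairwise relatively prime granularities >= base.
    With distinct coprime chunk sizes, processors reach their communication
    points at different times, reducing network contention."""
    chosen: List[int] = []
    g = base
    while len(chosen) < n_procs:
        if all(gcd(g, c) == 1 for c in chosen):
            chosen.append(g)
        g += 1
    return chosen
```

Two processors with granularities a and b only synchronize their communication every lcm(a, b) tasks, which is maximized when gcd(a, b) = 1.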
Abstract: This paper proposes an optimal method for choosing the number of enhancement layers in the fine granularity scalability (FGS) scheme under a minimum-transmission-energy constraint, in which FGS is combined with transmission energy control so that the FGS enhancement-layer transmission energy is minimized while the distortion guarantee is maintained. By varying the bit-plane level and packet loss rate, the minimum transmission energy of the enhancement layer is obtained while the expected distortion is satisfied.
Funding: This project was granted by the National Development and Reform Commission (Item Number: 20041138).
Abstract: Based on the content of radioactive elements (U, Th, K) in the strata of two drill holes in the Fuzhou basin, combined with the results of spore-pollen analysis, the relationship between radioactivity, lithology, and depositional environments is discussed. The results show that the content of radioactive substances is related to the granularity and lithology of the sediment: it is higher in argillaceous sediment (e.g., silt and clay), lower in sandy sediment, and intermediate in gravels. The content of radioactive substances is also related to paleoclimate: a warm and humid environment is propitious to the deposition of radioactive substances, while a cool and dry climate is just the reverse.
Abstract: The philosophy of rough sets hinges on the granularity of data, which is used to build all of its basic concepts, such as approximations, dependencies, and reduction. Genetic Algorithms provide a general framework for optimizing solutions for complex systems without depending on the problem domain, and they are robust across many kinds of problems. This paper combines Genetic Algorithms with rough set theory to compute the granularity of knowledge, illustrated through an example information table. The combination enables the granularity of knowledge to be computed effectively, and it is also useful for automated computing and information processing.
Abstract: In this paper, some important issues of granularity are discussed, mainly in information systems (ISs) based on a binary relation. Firstly, a vector representation method for knowledge granules is proposed in an information system based on a binary relation, eliminating the limitations of the set representation method. Secondly, operators on knowledge granularity are introduced and some of their important properties are studied carefully. Thirdly, a distance between two knowledge granules is established and a granular space is constructed on it. Fourthly, an axiomatic definition of knowledge granularity is investigated, under which several existing knowledge granularities turn out to be special cases. In addition, as an application of the knowledge granular space, an example is employed to validate some results of this work.
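One plausible reading of the construction above: the granule of an object x under a binary relation R is the set R(x) = {y : (x, y) ∈ R}, represented as a characteristic vector over the universe, with a normalized Hamming distance between granules. A sketch under that reading (names and the toy relation are ours):

```python
from typing import List, Set, Tuple

def granule_vector(x: str, relation: Set[Tuple[str, str]],
                   universe: List[str]) -> List[int]:
    """Characteristic vector of the granule R(x) = {y in U : (x, y) in R}."""
    return [1 if (x, y) in relation else 0 for y in universe]

def granule_distance(u: List[int], v: List[int]) -> float:
    """Normalized Hamming distance between two granule vectors."""
    return sum(a != b for a, b in zip(u, v)) / len(u)
```

Unlike the set representation, the vector form works uniformly for arbitrary binary relations (not only equivalence relations) and makes the distance a simple componentwise computation.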
Abstract: In this paper, we study the development trends and general applications of fuzzy rough granular computing theory. Granular computing is a new concept and general paradigm of information processing that covers the study of theories, methods, techniques, and tools at all levels of granularity. Its basic ideas appear in many areas, such as interval analysis, rough set theory, clustering analysis, information retrieval, machine learning, and databases. Its theory of partitioning the domain of a target concept and acquiring rules is widely used in knowledge discovery, data mining, and pattern recognition. On this basis, we propose a computing paradigm based on fuzzy rough theory that achieves strong performance.
Abstract: Background: Systemic lupus erythematosus (SLE) is a complex autoimmune disease characterized by fluctuating activity. The Systemic Lupus Erythematosus Disease Activity Index (SLEDAI-2K) is commonly used to assess disease activity, but its complexity and limited diagnostic availability pose challenges. Alternative biomarkers, such as extended inflammatory parameters (EIP), can be measured using hematology analyzers. This study examined the correlation between changes in SLEDAI-2K scores and EIP values in SLE patients. Methods: A retrospective cohort study was conducted using secondary data from the Hasan Sadikin Lupus Registry. SLEDAI-2K scores and EIP values were recorded at baseline and during follow-up (1-6 months). EIP included neutrophil granularity intensity (Neut-GI), neutrophil reactivity intensity, reactive lymphocytes (Re-Lymp), and antibody-synthesizing lymphocytes. Correlations were analyzed using Spearman rank analysis. Results: Among 53 patients (median age 32.2 [Q1-Q3: 27.9-43.8] years), Spearman rank analysis revealed a moderate negative correlation between Neut-GI changes and SLEDAI-2K scores (r = -0.4, p = 0.002). A weak negative correlation was found between Re-Lymp changes and SLEDAI-2K scores (r = -0.3, p = 0.016), while neutrophil reactivity intensity and antibody-synthesizing lymphocytes showed no significant correlation. Conclusions: Changes in Neut-GI and Re-Lymp correlated with SLEDAI-2K score fluctuations, suggesting that decreasing Neut-GI and Re-Lymp levels accompany increasing disease activity. These parameters may serve as simpler, cost-effective alternatives for monitoring SLE activity.
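The Spearman statistic used here ranks both series (ties receive the average of their rank positions) and then takes the Pearson correlation of the ranks. A stdlib-only sketch of that computation (the clinical data themselves are not reproduced):

```python
from typing import List

def average_ranks(xs: List[float]) -> List[float]:
    """1-based ranks; tied values share the mean of their rank positions."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    ranks = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        mean_rank = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = mean_rank
        i = j + 1
    return ranks

def spearman(xs: List[float], ys: List[float]) -> float:
    """Spearman's rho: Pearson correlation of the two rank sequences."""
    rx, ry = average_ranks(xs), average_ranks(ys)
    n = len(rx)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

A value of r = −0.4, as reported for Neut-GI, means higher SLEDAI-2K ranks tend to pair with lower Neut-GI ranks, with no assumption about the shape of either distribution.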
Funding: supported in part by the National Natural Science Foundation of China (Nos. 62132002, 61825101 and 62202010), the Key-Area Research and Development Program of Guangdong Province, China (No. 2021B0101400002), and the China Postdoctoral Science Foundation (No. 2022M710212).
Abstract: Fine-grained visual parsing, including fine-grained part segmentation and fine-grained object recognition, has attracted considerable attention due to its importance in many real-world applications, e.g., agriculture, remote sensing, and space technologies. Predominant research efforts tackle these fine-grained sub-tasks following different paradigms, while the inherent relations between these tasks are neglected. Moreover, given that most of the research remains fragmented, we conduct an in-depth study of the advanced work from a new perspective of learning the part relationship. From this perspective, we first consolidate recent research and benchmark syntheses with new taxonomies. Based on this consolidation, we revisit the universal challenges in fine-grained part segmentation and recognition tasks and propose new solutions by part relationship learning for these important challenges. Furthermore, we conclude with several promising lines of research in fine-grained visual parsing for future research.
Abstract: Declarative Programming Languages (DPLs) apply a process model of Horn clauses, such as PARLOG [8], or a reduction model of λ-calculus, such as SML [7], and are, in principle, well suited to multiprocessor implementation. However, the performance of a parallel declarative program can be impaired by a mismatch between the parallelism available in an application and the parallelism available in the architecture. A particularly attractive solution is to automatically match the parallelism of the program to the parallelism of the target hardware as a compilation step. In this paper, we present an optimizing compilation technique called granularity analysis, which identifies and removes excess parallelism that would degrade performance. The main steps are: an analysis of the flow of data to form an attributed call graph between function (or predicate) arguments, and an asymptotic estimation of the granularity of a function (or predicate) to generate an approximate grain size. Compiled procedure calls can be annotated with grain size, and a task scheduler can make scheduling decisions with the classification scheme of grains to control parallelism at runtime. The resulting granularity analysis scheme is suitable for exploiting the adaptive parallelism of declarative programming languages on multiprocessors.
Abstract: The combination of the spatial distribution, semantic characteristics, and sometimes temporal dynamics of POIs inside a geographic region can capture its unique land use characteristics. Most previous studies on POI-based land use modeling focused on one geographic region and selected one spatial scale and one semantic granularity for land use characterization. There is a lack of understanding of the impact of spatial scale, semantic granularity, and geographic context on POI-based land use modeling, particularly large-scale land use modeling. In this study, we developed a scalable POI-based land use modeling framework and examined the impact of these three factors on POI-based land use characterization using data from three geographic regions. We developed a unified semantic representation framework for POI semantics that can help fuse heterogeneous POI data sources. Then, by combining POIs with a neural network language model, we developed a spatially explicit approach to learn the embedding representations of POIs and AOIs. We trained multiple supervised classifiers using AOI embeddings as input features to predict AOI land use at different semantic granularities. The classification performance of different land use classes was analyzed and compared across the three geographic regions to identify the semantic representativeness of the POI-based AOI embeddings and the impact of geographic context.
Funding: Natural Science Foundation of Henan Province of China (Grant No. 232300420438); Fundamental Research Funds for the Universities of Henan Province (Grant Nos. NSFRF220427, NSFRF220204); National Natural Science Foundation of China (Grant No. 41972175); Excellent Youth Foundation of Henan Scientific Committee (Grant No. 232300421025); Doctoral Foundation of Henan Polytechnic University (Grant No. B2021-78).
Abstract: Pores among particles provide the main space for the storage and migration of deep underground fluids (such as oil, gas, groundwater, and unconventional natural gas). The pores form a pore structure with complex morphology that is mainly governed by the shape and distribution of the particles. Therefore, the reconstruction of the pore structure of granular porous media and the evaluation of particle roundness have become an important foundation for studying fluid flow through deep underground rock mass. This research proposes a novel approach for a multi-scale model with angular vertices. Fractal topology theory and Voronoi space segmentation are used in combination to reconstruct fractal granular porous media. The angular shapes are smoothed using a modified B-spline technique, and particles with varying degrees of roundness are generated. To validate the superiority of this approach, the roundness based on the Wadell roundness calculation method is computed and compared with the roundness obtained from particles smoothed using the vertex rounding substitution method. Results show that the roundness of particles smoothed with the modified B-spline technique closely aligns with the corresponding set rounded level (a nondimensional variable). Conversely, the vertex rounding substitution method is limited to a single dimensional rounded radius. This approach offers a new method for constructing granular porous media for fluid-flow studies in deep underground rock mass.
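Wadell roundness is conventionally defined as the mean radius of curvature of a particle's corners divided by the radius of the largest inscribed circle. A one-line sketch under that convention (the input measurements are hypothetical):

```python
from typing import List

def wadell_roundness(corner_radii: List[float], r_max_inscribed: float) -> float:
    """Wadell (1932) roundness: mean radius of curvature of the particle's
    corners divided by the radius of the largest inscribed circle.
    Values lie in (0, 1]; 1 corresponds to a perfect circle."""
    return (sum(corner_radii) / len(corner_radii)) / r_max_inscribed
```

In 2D this reduces the roundness comparison above to extracting corner radii from the smoothed outlines and the inscribed-circle radius from the particle geometry.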
Funding: supported by the National Natural Science Foundation of China through grant numbers 12002245 and 12172263; the Science and Technology Research Program of Chongqing Municipal Education Commission through grant number KJQN202300742; the Natural Science Foundation of Chongqing Municipality through grant number CSTB2025NSCQ-GPX0841; and Chongqing Jiaotong University through grant number F1220038.
Abstract: Granular materials exhibit complex macroscopic mechanical behaviors closely related to their microscale microstructural features. Traditional macroscopic phenomenological elasto-plastic models, however, usually have complex formulations and lack explicit relations to these microstructural features. To avoid these limitations, this study proposes a micromechanics-based softening hyperelastic model for granular materials, integrating softening hyperelasticity with microstructural insights to capture strain softening, critical state, and strain localization behaviors. The model has two key advantages: (1) a clear conceptualization, straightforward formulation, and ease of numerical implementation (via an Abaqus UMAT subroutine in this study); (2) explicit incorporation of micro-scale features (e.g., contact stiffness, particle size, porosity) to reveal their influences on macroscopic responses. An isotropic directional distribution density of contacts and three specific microstructures are considered, and their softening hyperelastic constitutive modulus tensors are explicitly derived. By introducing a softening factor and a critical failure energy density, the model can describe geomaterial behaviors, simulating residual strength, X-shaped shear bands, and strain localization evolution. Numerical validations against the macro-scale hyperelastic model, the Abaqus Drucker-Prager model, and experiment confirm its accuracy. Parametric studies reveal critical dependencies: a normal-to-tangential contact stiffness ratio of 2-8 (depending on stiffness magnitude), an internal length of 2-4 mm to ensure shear band formation, and a critical failure energy density (≤10 kJ/m^(3)) to trigger strain softening and localization. The influences of the specific microstructures on strain localization and softening are investigated. The model also shows mesh independence due to the introduction of an internal length. The model's applicability is further demonstrated by slope stability analysis, capturing slip surface evolution and load-displacement characteristics. This study develops a robust microstructure-aware hyperelastic framework to describe the mechanical behaviors of granular materials, providing multiscale insights for geotechnical engineering applications.
Funding: supported by the Anhui Provincial Natural Science Foundation (2408085QA030); the Natural Science Research Project of Anhui Educational Committee, China (2022AH050825); the Medical Special Cultivation Project of Anhui University of Science and Technology (YZ2023H2C008); the Excellent Research and Innovation Team of Anhui Province, China (2022AH010052); the Scientific Research Foundation for High-level Talents of Anhui University of Science and Technology, China (2021yjrc51); and the Collaborative Innovation Program of Hefei Science Center, CAS, China (2019HSC-CIP006).
Abstract: In this paper, a novel method for investigating the particle-crushing behavior of breeding particles in a fusion blanket is proposed. Fractal theory and the Weibull distribution are combined to establish a theoretical model, whose validity was verified by a simple impact test. A crushable discrete element method (DEM) framework is built on the theoretical model. A tensile strength that accounts for fractal theory, the size effect, and Weibull variation is assigned to each generated particle. The assigned strength is then used for crush detection by comparing it with the particle's maximum tensile stress. Mass conservation is ensured by inserting a series of sub-particles whose total mass equals the mass lost. Based on the crushable DEM framework, a numerical simulation of the crushing behavior of a pebble bed with hollow cylindrical geometry under a uniaxial compression test was performed. The results show that particles withstand the external load by contact and sliding at the beginning of the compression process, and they confirm that crushing can be considered an important mechanism for resisting the increasing external load. A relatively regular particle arrangement aids in resisting the load and reduces the occurrence of particle crushing. However, there is a limit to this resistance: when the strain increases beyond it, the distribution of crushing positions tends to become isotropic over the entire pebble bed. The theoretical model and crushable DEM framework provide a new method for exploring pebble beds in fusion reactors with particle crushing taken into account.
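One common way to assign a Weibull-distributed tensile strength with a size effect samples σ = σ₀ (d₀/d)^(3/m) (−ln(1−U))^(1/m), where U is uniform on (0, 1). This is an illustrative form, not necessarily the paper's exact expression; all names and parameter values below are ours:

```python
import math
import random

def weibull_strength(d: float, d0: float, sigma0: float, m: float,
                     rng: random.Random) -> float:
    """Sample a particle tensile strength with Weibull scatter and a size
    effect: sigma = sigma0 * (d0/d)**(3/m) * (-ln(1 - U))**(1/m).
    d: particle diameter; d0: reference diameter; sigma0: characteristic
    strength at d0; m: Weibull modulus; rng: seeded random source."""
    u = rng.random()
    return sigma0 * (d0 / d) ** (3.0 / m) * (-math.log(1.0 - u)) ** (1.0 / m)
```

In a crushable DEM loop, a particle whose maximum tensile stress exceeds its sampled strength would be replaced by sub-particles of equal total mass, as the abstract describes.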
Funding: supported by the National Natural Science Foundation of China (Nos. 12572465, 12032005).
Abstract: Granular flow, such as hopper discharge and debris flows, involves complex multi-scale, multi-phase, and multi-physics coupling, posing significant challenges for numerical simulation. Over the past two decades, methods like the Discrete Element Method (DEM), Smoothed Particle Hydrodynamics (SPH), and the Depth-Averaging Method (DAM) have been developed to address these problems. However, their applicability across different scales remains unclear due to differences in physical assumptions and numerical algorithms. Therefore, a comprehensive evaluation is critically needed. This study selects three typical methods (DEM, SPH, and DAM) and examines their convergence behavior, boundary condition implementation, and limitations in physical and numerical modeling. We numerically studied three extreme-deformation flow cases with the three chosen methods: granular column collapse at the particle scale, flow-structure interaction at the laboratory scale, and reconstruction of the 2015 Shenzhen Guangming landslide at the field scale. By comparing the granular flow dynamics, deposition morphology, and structure interactions, as well as the simulation accuracy and computational efficiency, we show the applicability of the three models across different scales. Further, we provide practical guidance for model selection in large-deformation flow problems in granular systems of different scales.
Funding: funded by the National Natural Science Foundation of China (Nos. 62272284, 61972238 and 62072294); the Special Fund for Science and Technology Innovation Teams of Shanxi Province (No. 202204051001015); and the Cultivate Scientific Research Excellence Programs of Higher Education Institutions in Shanxi (CSREP) (No. 2019SK036).
Abstract: Purpose: Three-way decision (3WD) and probabilistic rough sets (PRSs) are theoretical tools capable of simulating humans' multi-level and multi-perspective thinking modes in the field of decision-making. They are proposed to assist decision-makers in better managing incomplete or imprecise information under conditions of uncertainty or fuzziness. However, they can easily incur decision losses and cannot take the personal thresholds of decision-makers into account. To solve this problem, this paper combines picture fuzzy (PF) multi-granularity (MG) with 3WD and establishes the notion of PF MG 3WD. Design/methodology/approach: An effective incomplete model based on PF MG 3WD is designed in this paper. First, the form of PF MG incomplete information systems (IISs) is established to reasonably record the uncertain information. On this basis, the PF conditional probability is established using PF similarity relations, and the concept of adjustable PF MG PRSs is proposed, using the PF conditional probability to fuse data. Then, a comprehensive PF multi-attribute group decision-making (MAGDM) scheme is formed by the adjustable PF MG PRSs and the VlseKriterijumska Optimizacija I Kompromisno Resenje (VIKOR) method. Finally, an actual breast cancer data set is used to demonstrate the validity of the constructed method. Findings: The experimental results confirm the effectiveness of PF MG 3WD in predicting breast cancer. Compared with existing models, PF MG 3WD has better robustness and generalization performance. This is mainly due to the incomplete PF MG 3WD proposed in this paper, which effectively reduces the influence of unreasonable outliers and threshold settings. Originality/value: The model employs the VIKOR method for optimal granularity selection, which takes into account both group utility maximization and individual regret minimization, while incorporating decision-makers' subjective preferences as well. This ensures that the experiment maintains higher exclusion stability and reliability, enhancing the robustness of the decision results.
Funding: This work was supported by the National Key R&D Program of China under Grant No. 2020YFB1710200.
Abstract: Granular information has emerged as a potent tool for data representation and processing across various domains. However, existing time-series data granulation techniques often overlook the influence of external factors. In this study, a multisource time-series data granularity conversion model is proposed that performs granularity conversion effectively while maintaining the consistency and stability of the results. The model incorporates the impact of external source data using a multivariate linear regression model, and the entropy weighting method is employed to allocate weights and finalize the granularity conversion. Through experimental analysis on Beijing's 2022 air quality dataset, the proposed method outperforms traditional information granulation approaches, providing valuable decision-making insights for industrial system optimization and research.
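The entropy weighting method named above assigns larger weights to criteria whose values are more dispersed across samples (more dispersion, more information). A minimal sketch of the standard formulation (the air-quality data themselves are not reproduced):

```python
import math
from typing import List

def entropy_weights(X: List[List[float]]) -> List[float]:
    """Entropy weighting: criteria whose values are more dispersed across
    samples carry more information and receive larger weights.
    X[i][j] > 0 is the value of sample i on criterion j; at least one
    criterion must vary across samples."""
    n, m = len(X), len(X[0])
    k = 1.0 / math.log(n)
    info = []
    for j in range(m):
        col_sum = sum(X[i][j] for i in range(n))
        p = [X[i][j] / col_sum for i in range(n)]
        entropy = -k * sum(pi * math.log(pi) for pi in p if pi > 0)
        info.append(1.0 - entropy)        # divergence degree of criterion j
    total = sum(info)
    return [v / total for v in info]
```

A criterion that is identical across all samples has maximum entropy and receives weight zero; in the granularity conversion above, such weights would decide how much each external source contributes.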