Density-based approaches in content extraction, whose task is to extract contents from Web pages, are commonly used to obtain page contents that are critical to many Web mining applications. However, traditional density-based approaches cannot effectively manage pages that contain short contents and long noises. To overcome this problem, in this paper, we propose a content extraction approach for obtaining content from news pages that combines a segmentation-like approach and a density-based approach. A tool called BlockExtractor was developed based on this approach. BlockExtractor identifies contents in three steps. First, it looks for all Block-Level Elements (BLE) & Inline Elements (IE) blocks, which are designed to roughly segment pages into blocks. Second, it computes the densities of each BLE&IE block and its elements to eliminate noises. Third, it removes all redundant BLE&IE blocks that have emerged in other pages from the same site. Compared with three other density-based approaches, our approach shows significant advantages in both precision and recall.
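The density idea behind the second step can be illustrated with a minimal sketch: for each block-level element, compute the ratio of visible text to contained markup and keep the densest blocks. The `DensityParser` class, its tag set, and the density formula below are illustrative assumptions, not the authors' BlockExtractor implementation:

```python
from html.parser import HTMLParser

class DensityParser(HTMLParser):
    """Collect (text_chars, nested_tag_count) for each block-level element."""
    BLOCK_TAGS = {"div", "p", "td", "article", "section"}   # assumed block-level set

    def __init__(self):
        super().__init__()
        self.blocks = []   # finished blocks as (text_chars, tag_count)
        self.stack = []    # open blocks as [text_chars, tag_count]

    def handle_starttag(self, tag, attrs):
        if self.stack:                       # count nested markup toward the enclosing block
            self.stack[-1][1] += 1
        if tag in self.BLOCK_TAGS:
            self.stack.append([0, 0])

    def handle_data(self, data):
        if self.stack:
            self.stack[-1][0] += len(data.strip())

    def handle_endtag(self, tag):
        if tag in self.BLOCK_TAGS and self.stack:
            self.blocks.append(tuple(self.stack.pop()))

def text_density(block):
    """Characters of text per tag: high for article bodies, low for link-heavy noise."""
    chars, tags = block
    return chars / max(tags, 1)
```

A content paragraph then scores far higher than a navigation `div` full of anchors, which is the property a density-based noise-elimination step exploits.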
In recent years, there has been a concerted effort to improve anomaly detection techniques, particularly in the context of high-dimensional, distributed clinical data. Analysing patient data within clinical settings reveals a pronounced focus on refining diagnostic accuracy, personalising treatment plans, and optimising resource allocation to enhance clinical outcomes. Nonetheless, this domain faces unique challenges, such as irregular data collection, inconsistent data quality, and patient-specific structural variations. This paper proposes a novel hybrid approach that integrates heuristic and stochastic methods for anomaly detection in patient clinical data to address these challenges. The strategy uses hyperparameter-optimised (HPO) Density-Based Spatial Clustering of Applications with Noise for clustering patient exercise data, facilitating efficient anomaly identification. Subsequently, a stochastic method based on the Interquartile Range filters unreliable data points, ensuring that medical tools and professionals receive only the most pertinent and accurate information. The primary objective of this study is to equip healthcare professionals and researchers with a robust tool for managing extensive, high-dimensional clinical datasets, enabling effective isolation and removal of aberrant data points. Furthermore, a sophisticated regression model has been developed using Automated Machine Learning (AutoML) to assess the impact of the ensemble abnormal-pattern detection approach. Various statistical error estimation techniques validate the efficacy of the hybrid approach alongside AutoML. Experimental results show that implementing this innovative hybrid model on patient rehabilitation data leads to a notable enhancement in AutoML performance, with an average improvement of 0.041 in the R² score, surpassing the effectiveness of traditional regression models.
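The two filtering stages can be sketched in miniature: a density-based clustering pass that marks sparse points as noise, followed by an Interquartile Range fence on the surviving values. The minimal DBSCAN and the 1.5×IQR (Tukey) fence below are textbook formulations under assumed parameters, not the paper's HPO-tuned pipeline:

```python
import math

def _neighbors(points, i, eps):
    return [j for j, q in enumerate(points) if math.dist(points[i], q) <= eps]

def dbscan(points, eps, min_pts):
    """Classic DBSCAN: label each point with a cluster id, or -1 for noise."""
    labels = [None] * len(points)
    cid = 0
    for i in range(len(points)):
        if labels[i] is not None:
            continue
        seeds = _neighbors(points, i, eps)
        if len(seeds) < min_pts:
            labels[i] = -1                 # sparse point: provisional noise
            continue
        labels[i] = cid
        while seeds:
            j = seeds.pop()
            if labels[j] == -1:
                labels[j] = cid            # noise reachable from a core point becomes a border point
            if labels[j] is not None:
                continue
            labels[j] = cid
            nb = _neighbors(points, j, eps)
            if len(nb) >= min_pts:
                seeds.extend(nb)           # j is a core point: keep expanding
        cid += 1
    return labels

def iqr_filter(values, k=1.5):
    """Keep values inside the Tukey fences [Q1 - k*IQR, Q3 + k*IQR]."""
    s = sorted(values)
    q1, q3 = s[len(s) // 4], s[(3 * len(s)) // 4]
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return [v for v in values if lo <= v <= hi]
```

Points labelled -1 by the clustering pass, and values outside the fences, are the candidates a pipeline like this would flag for removal.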
BACKGROUND Due to the increasing rate of thyroid nodule diagnosis, and the desire to avoid an unsightly cervical scar, remote thyroidectomies were invented and are increasingly performed. The transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach (TAA) are the two most commonly used remote approaches. No previous meta-analysis has compared postoperative infections and swallowing difficulties between the two procedures. AIM To compare postoperative infections and swallowing difficulties among patients undergoing lobectomy for unilateral thyroid carcinoma or a benign thyroid nodule. METHODS We searched PubMed MEDLINE, Google Scholar, and the Cochrane Library from the date of the first published article up to August 2025. The terms used were transoral thyroidectomy vestibular approach, trans-areolar thyroidectomy, scarless thyroidectomy, remote thyroidectomy, infections, postoperative, inflammation, dysphagia, and swallowing difficulties. We identified 130 studies; of them, 30 full texts were screened and only six studies were included in the final meta-analysis. RESULTS Postoperative infections were not different between the two approaches: odds ratio = 1.33, 95% confidence interval: 0.50-3.53; the χ² was 1.92 and the P-value for the overall effect was 0.57. Similarly, transient swallowing difficulty was not different between the two forms of surgery, with odds ratio = 0.91, 95% confidence interval: 0.35-2.40; the χ² was 1.32, and the P-value for the overall effect was 0.85. CONCLUSION No significant statistical differences were evident between the transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach regarding postoperative infection and transient swallowing difficulties. Further, longer randomized trials are needed. (Mirghani H. Infections and swallowing difficulty in scarless thyroidectomy. WJCC, January 6, 2026; Volume 14, Issue 1.)
With the development of the Global Positioning System (GPS), wireless technology, and location-aware services, it is possible to collect a large quantity of trajectory data. In the field of data mining for moving objects, the problem of anomaly detection is a hot topic. Building on the development of anomalous trajectory detection for moving objects, this paper introduces the classical trajectory outlier detection (TRAOD) algorithm and then proposes a density-based trajectory outlier detection (DBTOD) algorithm, which compensates for the TRAOD algorithm's inability to detect anomalies when trajectories are locally dense. The results of applying the proposed algorithm to the Elk1993 and Deer1995 datasets are also presented and show the effectiveness of the algorithm.
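The underlying notion of a trajectory outlier — one with too few close neighbours among the other trajectories — can be sketched as follows. The symmetric Hausdorff distance and the neighbour-fraction threshold are illustrative simplifications, not the segment-wise partition-and-detect distance that TRAOD or DBTOD actually uses:

```python
import math

def hausdorff(t1, t2):
    """Symmetric Hausdorff distance between two point sequences."""
    def directed(a, b):
        return max(min(math.dist(p, q) for q in b) for p in a)
    return max(directed(t1, t2), directed(t2, t1))

def trajectory_outliers(trajs, dist_thresh, frac_thresh):
    """Flag trajectories whose share of close neighbours falls below frac_thresh."""
    flags = []
    for i, t in enumerate(trajs):
        close = sum(1 for j, u in enumerate(trajs)
                    if j != i and hausdorff(t, u) <= dist_thresh)
        flags.append(close / (len(trajs) - 1) < frac_thresh)
    return flags
```

A density-refined variant would additionally weight each neighbour count by the local density of trajectories, so that sparse regions are not flagged wholesale.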
Finding clusters based on density represents a significant class of clustering algorithms. These methods can discover clusters of various shapes and sizes. The most studied algorithm in this class is the Density-Based Spatial Clustering of Applications with Noise (DBSCAN). It identifies clusters by grouping the densely connected objects into one group and discarding the noise objects. It requires two input parameters: epsilon (a fixed neighborhood radius) and MinPts (the lowest number of objects within epsilon). However, it cannot handle clusters of various densities since it uses a global value for epsilon. This article proposes an adaptation of the DBSCAN method so that it can discover clusters of varied densities while reducing the required number of input parameters to only one: the only user input in the proposed method is MinPts, and epsilon is computed automatically from statistical information of the dataset. The proposed method finds the core distance for each object in the dataset, takes the average of these distances as the first value of epsilon, and finds the clusters satisfying this density level. The remaining unclustered objects are then clustered using a new value of epsilon that equals the average core distance of the unclustered objects. This process continues until all objects have been clustered or the remaining unclustered objects number less than 0.006 of the dataset's size. Benchmark datasets were used to evaluate the effectiveness of the proposed method, which produced promising results. Practical experiments demonstrate the outstanding ability of the proposed method to detect clusters of different densities even when there is no separation between them. The accuracy of the method ranges from 92% to 100% for the experimented datasets.
Overlapping community detection in a network is a challenging issue which has attracted much attention in recent years. A notion of the hesitant node (HN) is proposed. An HN is in contact with multiple communities, but the communications are not strong or are even accidental; thus, the HN holds an implicit community structure. However, HNs are not rare in real-world networks. It is important to identify them because they can be efficient hubs which form the overlapping portions of communities, or simple attached nodes to some communities. Current approaches have difficulties in identifying and clustering HNs. A density-based rough set model (DBRSM) is proposed by combining the virtues of density-based algorithms and rough set models. It incorporates the macro perspective of the community structure of the whole network and the micro perspective of the local information held by HNs, which facilitates the further "growth" of HNs in communities. We offer theoretical support for this model from the point of view of the strength of the trust path. Experiments on real-world and synthetic datasets show the practical significance of analyzing and clustering HNs based on DBRSM. Besides, clustering based on DBRSM promotes modularity optimization.
We propose a new clustering algorithm that helps researchers quickly and accurately analyze data. We call this algorithm the Combined Density-based and Constraint-based Algorithm (CDC). CDC consists of two phases. In the first phase, CDC employs the idea of density-based clustering to split the original data into a number of fragmented clusters; at the same time, CDC cuts off the noises and outliers. In the second phase, CDC employs the concept of the K-means clustering algorithm to select a larger cluster to be the center. Then, the center cluster merges the smaller clusters that satisfy the constraint rules. Because the smaller clusters are merged around the center cluster, the clustering results show high accuracy. Moreover, CDC reduces the calculations and speeds up the clustering process. In this paper, the accuracy of CDC is evaluated and compared with those of K-means, hierarchical clustering, and the genetic clustering algorithm (GCA) proposed in 2004. Experimental results show that CDC has better performance.
Cluster analysis is a crucial technique in unsupervised machine learning, pattern recognition, and data analysis. However, current clustering algorithms suffer from the need for manual determination of parameter values, low accuracy, and inconsistent performance with respect to data size and structure. To address these challenges, a novel clustering algorithm called the fully automated density-based clustering method (FADBC) is proposed. The FADBC method consists of two stages: parameter selection and cluster extraction. In the first stage, a proposed method extracts optimal parameters for the dataset, namely the epsilon size and a minimum-number-of-points threshold. These parameters are then used in a density-based technique to scan each point in the dataset and evaluate neighborhood densities to find clusters. The proposed method was evaluated on different benchmark datasets and metrics, and the experimental results demonstrate its competitive performance without requiring manual inputs. The results show that the FADBC method outperforms well-known clustering methods such as the agglomerative hierarchical method, k-means, spectral clustering, DBSCAN, FCDCSD, Gaussian mixtures, and density-based spatial clustering methods, handling data sets of many kinds and performing consistently well.
Since data services are rapidly penetrating our daily life, the mobile network is becoming more complicated and the amount of data transmission is steadily increasing. In this situation, the traditional statistical methods for anomalous cell detection cannot adapt to the evolution of networks, and data mining has become the mainstream. In this paper, we propose a novel kernel density-based local outlier factor (KLOF) to assign a degree of outlierness to each object. Firstly, the notion of KLOF is introduced, which captures exactly the relative degree of isolation. Then, by analyzing its properties, including the tightness of its upper and lower bounds and its sensitivity to density perturbation, we find that KLOF is much greater than 1 for outliers. Lastly, KLOF is applied to a real-world dataset to detect anomalous cells with abnormal key performance indicators (KPIs) to verify its reliability. The experiment shows that KLOF can find outliers efficiently. It can be a guideline for operators to perform faster and more efficient troubleshooting.
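The core ratio — the neighbours' average density over a point's own kernel density estimate — can be sketched as below. The Gaussian kernel, the k-nearest-neighbour set, and the `bandwidth` parameter are illustrative assumptions; the paper's exact KLOF definition may differ:

```python
import math

def kernel_density(points, i, bandwidth):
    """Gaussian-kernel density estimate at point i from all other points."""
    return sum(math.exp(-math.dist(points[i], q) ** 2 / (2 * bandwidth ** 2))
               for j, q in enumerate(points) if j != i)

def klof(points, i, k, bandwidth):
    """Neighbours' average density divided by own density; values well above 1 flag outliers."""
    dens = [kernel_density(points, j, bandwidth) for j in range(len(points))]
    nbrs = sorted((j for j in range(len(points)) if j != i),
                  key=lambda j: math.dist(points[i], points[j]))[:k]
    return (sum(dens[j] for j in nbrs) / k) / dens[i]
```

A point inside a dense cluster has roughly the same density as its neighbours, so its score sits near 1; an isolated point has a far lower density than its nearest neighbours, pushing the ratio far above 1.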
Clustering evolving data streams must be performed in limited time with reasonable quality. The existing micro-clustering-based methods do not consider the distribution of data points inside a micro-cluster. We propose LeaDen-Stream (Leader Density-based clustering algorithm over evolving data Stream), a density-based clustering algorithm using leader clustering. The algorithm is based on two-phase clustering. The online phase selects the proper mini-micro- or micro-cluster leaders based on the distribution of data points in the micro-clusters. Then, the leader centers are sent to the offline phase to form the final clusters. In LeaDen-Stream, by carefully choosing between the two kinds of micro leaders, we decrease the time complexity of clustering while maintaining cluster quality. A pruning strategy is also used to separate real data from noise by introducing dense and sparse mini-micro- and micro-cluster leaders. Our performance study over a number of real and synthetic data sets demonstrates the effectiveness and efficiency of our method.
Cardiovascular diseases (CVDs) remain the leading cause of morbidity and mortality worldwide, necessitating innovative diagnostic and prognostic strategies. Traditional biomarkers like C-reactive protein, uric acid, troponin, and natriuretic peptides play crucial roles in CVD management, yet they are often limited by sensitivity and specificity constraints. This narrative review critically examines the emerging landscape of cardiac biomarkers and advocates for a multiple-marker approach to enhance early detection, prognosis, and risk stratification of CVD. In recent years, several novel biomarkers have shown promise in revolutionizing CVD diagnostics. Gamma-glutamyltransferase, microRNAs, endothelial microparticles, placental growth factor, trimethylamine N-oxide, retinol-binding protein 4, copeptin, heart-type fatty acid-binding protein, galectin-3, growth differentiation factor-15, soluble suppression of tumorigenicity 2, fibroblast growth factor 23, and adrenomedullin have emerged as significant indicators of CV health. These biomarkers provide insights into various pathophysiological processes, such as oxidative stress, endothelial dysfunction, inflammation, metabolic disturbances, and myocardial injury. The integration of these novel biomarkers with traditional ones offers a more comprehensive understanding of CVD mechanisms. This multiple-marker approach can improve diagnostic accuracy, allowing for better risk stratification and more personalized treatment strategies. This review underscores the need for continued research to validate the clinical utility of these biomarkers and their potential incorporation into routine clinical practice. By leveraging the strengths of both traditional and novel biomarkers, precise therapeutic plans can be developed, thereby improving the management and prognosis of patients with CVDs. The ongoing exploration and validation of these biomarkers are crucial for advancing CV care and addressing the limitations of current diagnostic tools.
For large-scale heterogeneous multi-agent systems (MASs) with characteristics of dense-sparse mixed distribution, this paper investigates the practical finite-time deployment problem by establishing a novel cross-species bionic analytical framework based on the partial differential equation-ordinary differential equation (PDE-ODE) approach. Specifically, by designing a specialized network communication protocol and employing the spatial continuum method for densely distributed agents, this paper models the tracking errors of densely distributed agents as a PDE equivalent to a human disease transmission model, and those of sparsely distributed agents as several ODEs equivalent to predator population models. The coupling relationship between the PDE and ODE models is established through the boundary conditions of the PDE, thereby forming a PDE-ODE-based tracking error model for the considered MASs. Furthermore, by integrating an adaptive neural control scheme with the aforementioned biological models, a "Flexible Neural Network" endowed with adaptive and self-stabilizing capabilities is constructed, which acts upon the considered MASs, enabling their practical finite-time deployment. Finally, the effectiveness of the developed approach is illustrated through a numerical example.
The progressive loss of dopaminergic neurons in affected patients' brains is one of the pathological features of Parkinson's disease, the second most common human neurodegenerative disease. Although the detailed pathogenesis accounting for dopaminergic neuron degeneration in Parkinson's disease is still unclear, the advancement of stem cell approaches has shown promise for Parkinson's disease research and therapy. Induced pluripotent stem cells have been commonly used to generate dopaminergic neurons, which has provided valuable insights that improve our understanding of Parkinson's disease pathogenesis and contribute to anti-Parkinson's disease therapies. The current review discusses the practical approaches and potential applications of induced pluripotent stem cell techniques for generating and differentiating dopaminergic neurons. The benefits of induced pluripotent stem cell-based research are highlighted. Various dopaminergic neuron differentiation protocols from induced pluripotent stem cells are compared. The emerging three-dimensional brain organoid models are evaluated against conventional two-dimensional cell culture. Finally, the limitations, challenges, and future directions of induced pluripotent stem cell-based approaches are analyzed and proposed, which will be significant for the future application of induced pluripotent stem cell-related techniques in Parkinson's disease.
Deep learning algorithms have been rapidly incorporated into many different applications due to the increase in computational power and the availability of massive amounts of data. Recently, both deep learning and ensemble learning have been used to recognize underlying structures and patterns from high-level features to make predictions/decisions. With the growth in popularity of deep learning and ensemble learning algorithms, they have received significant attention from both scientists and the industrial community due to their superior ability to learn features from big data. Ensemble deep learning has exhibited significant performance in enhancing learning generalization through the use of multiple deep learning algorithms. Although ensemble deep learning has large quantities of training parameters, which results in time and space overheads, it performs much better than traditional ensemble learning. Ensemble deep learning has been successfully used in several areas, such as bioinformatics, finance, and health care. In this paper, we review and investigate recent ensemble deep learning algorithms and techniques in health care domains: medical imaging, health care data analytics, genomics, diagnosis, disease prevention, and drug discovery. We cover several widely used deep learning algorithms along with their architectures, including deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), and generative adversarial networks (GANs). Common health care tasks, such as medical imaging, electronic health records, and genomics, are also demonstrated. Furthermore, in this review, the challenges inherent in reducing the burden on the health care system are discussed and explored. Finally, future directions and opportunities for enhancing health care model performance are discussed.
This article provides a comprehensive analysis of the study by Hou et al, focusing on the complex interplay between psychological and physical factors in the postoperative recovery (POR) of patients with perianal diseases. The study sheds light on how illness perception, anxiety, and depression significantly influence recovery outcomes. Hou et al developed a predictive model that demonstrated high accuracy in identifying patients at risk of poor recovery. The article explores the critical role of preoperative psychological assessment, highlighting the need for mental health support and personalized recovery plans in enhancing POR quality. A multidisciplinary approach, integrating mental health professionals with surgeons, anesthesiologists, and other specialists, is emphasized to ensure comprehensive care for patients. The study's findings serve as a call to integrate psychological care into surgical practice to optimize outcomes for patients with perianal diseases.
Objective: To explore the effect of Health Action Process Approach (HAPA) theory in patients with type D personality psoriasis. Methods: A total of 66 patients with type D personality psoriasis admitted to the dermatology department of a tertiary Grade-A hospital in Jingzhou City from November 2022 to July 2023 were selected and divided into a control group and a test group, with 33 cases in each group, by the random number table method. The control group received routine health education, and the experimental group received health education based on HAPA theory. The chronic disease self-efficacy scale, hospital anxiety and depression scale, and skin disease quality of life scale were used to evaluate the effect of the intervention. Results: After 3 months of intervention, the self-efficacy scores in the experimental group were higher than those in the control group (P …; P …). Conclusion: Health education based on HAPA theory can enhance the self-efficacy of patients with type D personality psoriasis, relieve negative emotions, and improve their quality of life.
BACKGROUND Dissection of the root of the mesentery is one of the critical maneuvers, especially in borderline resectable pancreatic head cancer. Intra-abdominal chyle leak (CL), including chylous ascites, may ensue in up to 10% of patients after pancreatic resections. Globally recognized superior mesenteric artery (SMA)-first approaches are invariably performed. The mesenteric dissection through the inferior infracolic approach is discussed in this study, emphasizing its postoperative impact on CL, which is the cornerstone of this study. AIM To assess the incidence, risk factors, and clinical impact of CL following root-of-mesentery dissection, and the different treatment modalities. METHODS This is a retrospective study incorporating the patients who underwent dissection of the root of the mesentery with the inferior infracolic SMA-first approach pancreatoduodenectomy for ventral body and uncinate masses of the pancreas in the Department of Gastrointestinal and General Surgery of Kathmandu Medical College and Teaching Hospital from January 1, 2021 to February 28, 2024. Intraoperative findings and postoperative outcomes were analyzed. RESULTS In three years, ten patients underwent root-of-mesentery dissection with the inferior infracolic SMA-first approach pancreatoduodenectomy. The mean age was 67.6 years, with a male-to-female ratio of 4:5. CL was seen in four patients. Owing to CL, Clavien-Dindo grade II or higher morbidity was observed in four patients. Two patients had a hospital stay of more than 20 days, the former having delayed gastric emptying and the latter a long-term total parenteral nutrition requirement. The mean operative time was 330 minutes. Curative resection was achieved in 100% of the patients. The mean durations of intensive care unit and hospital stay were 2.55 ± 1.45 days and 15.7 ± 5.32 days, respectively. CONCLUSION Root-of-mesentery dissection with lymphadenectomy and vascular resection correlated with the occurrence of CL. After complete curative resection, these cases were managed with total parenteral nutrition without adversely impacting outcomes.
BACKGROUND Laparoscopic hepatectomy has been widely accepted for the treatment of liver tumors. Compared with open surgery, it provides a reduced hospital stay, less intraoperative blood loss, less trauma, and fewer incisional infections, without affecting tumor outcomes. However, lesions in the right lobe of the liver are deep and obstructed by the ribs, making exposure difficult and increasing the degree of surgical difficulty; thus, liver tumors in the deep right lobe pose technical challenges in standard laparoscopic surgery. AIM To investigate the safety and efficacy of laparoscopic retroperitoneal partial hepatectomy for liver tumors. METHODS The clinical data of 72 patients who underwent laparoscopic retroperitoneal partial hepatectomy for liver tumors between January 2018 and December 2024 at the First People's Hospital of Yunnan Province were analyzed. Of the 72 patients included, 34 were male and 38 were female, with ages ranging from 34 years to 72 years (median age, 45 years). The tumors were all located in the right lobe of the liver, with 30 cases in segment S6, 27 cases in segment S7, and 15 cases in segment S8; the mean tumor diameter was 7.5 ± 3.4 cm. The postoperative tumor indices, liver function, and postoperative complications were analyzed to evaluate the clinical efficacy of laparoscopic partial hepatectomy via the retroperitoneal approach. RESULTS The surgeries were completed in all patients; conversion to open surgery was required in 10 patients. The mean operative time, blood loss, drain retention time, and length of postoperative hospital stay were 140 ± 30 minutes, 150 ± 46 mL, 3.8 ± 1.2 days, and 8.3 ± 5.3 days, respectively. Liver function tests returned to normal in all patients within two weeks of surgery. Fifteen patients developed atelectasis and pleural effusion and were managed with incision and drainage and antibiotics. Two patients developed uncomplicated minimal ascites, and the remaining patients had no perioperative complications such as abdominal hemorrhage, infection, liver failure, bile leakage, or other adverse events. All patients were successfully treated. CONCLUSION Laparoscopic retroperitoneal partial hepatectomy is a safe and effective approach for right hepatic space-occupying lesions, particularly in segments S6, S7, and S8, with fewer postoperative complications, less trauma, and faster recovery times. This procedure provides a new surgical access route for the resection of deep tumors in the right lobe of the liver and has clear clinical implications.
In response to common problems in college English writing teaching, such as the separation of learning and application, students' low interest in writing, and difficulties in expression, this paper, based on the theoretical framework of the production-oriented approach (POA) proposed by Professor Wen Qiufang, designed and implemented an IELTS writing teaching plan. This plan takes "motivating, enabling, and assessing" as the core teaching process and selects typical IELTS argumentative essay topics (such as food diversity) to create real communication scenarios. In the motivating stage, diverse inputs are used to stimulate students' interest and expose their language weaknesses; in the enabling stage, language knowledge, viewpoint generation, and text structure are the focus of targeted input and training; in the assessing stage, a combination of teacher-student cooperation and peer evaluation is adopted to guide students to identify and correct deficiencies in language use. The research results show that the POA model can effectively enhance students' writing interest, active learning awareness, and writing ability, particularly in overcoming vocabulary poverty and material shortages, as well as improving language accuracy and expression richness. This provides an operational theoretical basis and a practical path for improving the teaching effect of IELTS writing.
Background: The dorsal approach is a potentially effective strategy for minimally invasive liver resection. This study aimed to compare the outcomes between robot-assisted and laparoscopic hemihepatectomy through the dorsal approach. Methods: We compared patients who underwent robot-assisted hemihepatectomy (Rob-HH) and those who had laparooscopic hemihepatectomy (Lap-HH) through the dorsal approach between January 2020 and December 2022. A 1:1 propensity score-matching (PSM) analysis was performed to minimize bias and confounding factors. Results: Ninety-six patients were included, 41 with Rob-HH and 55 with Lap-HH. Among them, 58 underwent left hemihepatectomy (LHH) and 38 underwent right hemihepatectomy (RHH). Compared with the Lap-HH group, patients with Rob-HH had less estimated blood loss (median: 100.0 vs. 300.0 mL, P = 0.016), lower blood transfusion rates (4.9% vs. 29.1%, P = 0.003), and lower postoperative complication rates (26.8% vs. 54.5%, P = 0.016). These significant differences consistently existed after PSM and in the LHH subgroups. Furthermore, robot-assisted LHH was associated with a decreased Pringle duration (45 vs. 60 min, P = 0.047). RHH subgroup analysis showed that, compared with Lap-RHH, Rob-RHH was associated with less estimated blood loss (200.0 vs. 400.0 mL, P = 0.013). No significant differences were found in other perioperative outcomes among the pre- and post-PSM cohorts, such as Pringle duration, operative time, and hospital stay. Conclusions: The dorsal approach was a safe and feasible strategy for hemihepatectomy, with favorable outcomes under the robot-assisted system in reducing intraoperative blood loss, transfusion, and postoperative complications.
Funding: Supported by the Program for Beijing Municipal Commission of Education (No. 1320037010601), the 111 Project of Beijing Institute of Technology, and the National Key Basic Research and Development (973) Program of China (No. 2012CB7207002).
Abstract: Density-based approaches in content extraction, whose task is to extract content from Web pages, are commonly used to obtain page content that is critical to many Web mining applications. However, traditional density-based approaches cannot effectively handle pages that contain short content and long noise blocks. To overcome this problem, we propose a content extraction approach for news pages that combines a segmentation-like approach with a density-based approach. A tool called BlockExtractor was developed based on this approach. BlockExtractor identifies content in three steps. First, it looks for all Block-Level Element (BLE) & Inline Element (IE) blocks, which are designed to roughly segment pages into blocks. Second, it computes the density of each BLE&IE block and its elements to eliminate noise. Third, it removes all redundant BLE&IE blocks that have appeared in other pages from the same site. Compared with three other density-based approaches, our approach shows significant advantages in both precision and recall.
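The density computation in the second step can be sketched roughly as follows. This is a minimal, hedged illustration, not the paper's BlockExtractor: the density formula (text length divided by tag count) and the example markup are assumptions chosen only to show why content blocks score high and link-heavy noise scores low.

```python
# Minimal text-density sketch: density = text length / (tag count + 1).
# Content paragraphs have much text per tag; navigation/noise blocks have
# many tags and little text, so they score low. Illustrative only.
from html.parser import HTMLParser

class DensityParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tag_count = 0
        self.text_len = 0

    def handle_starttag(self, tag, attrs):
        self.tag_count += 1

    def handle_data(self, data):
        self.text_len += len(data.strip())

def text_density(html_fragment: str) -> float:
    """Text length per tag: high for content blocks, low for link lists."""
    p = DensityParser()
    p.feed(html_fragment)
    return p.text_len / (p.tag_count + 1)

content = "<p>" + "A long news paragraph with many words. " * 5 + "</p>"
noise = "".join(f'<li><a href="#">link{i}</a></li>' for i in range(10))
assert text_density(content) > text_density(noise)
```

In a full pipeline, blocks whose density falls below a threshold would be discarded as noise before the cross-page redundancy check.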
Abstract: In recent years, there has been a concerted effort to improve anomaly detection techniques, particularly in the context of high-dimensional, distributed clinical data. Analysing patient data within clinical settings reveals a pronounced focus on refining diagnostic accuracy, personalising treatment plans, and optimising resource allocation to enhance clinical outcomes. Nonetheless, this domain faces unique challenges, such as irregular data collection, inconsistent data quality, and patient-specific structural variations. This paper proposes a novel hybrid approach that integrates heuristic and stochastic methods for anomaly detection in patient clinical data to address these challenges. The strategy combines HPO-optimised Density-Based Spatial Clustering of Applications with Noise (DBSCAN) for clustering patient exercise data, facilitating efficient anomaly identification. Subsequently, a stochastic method based on the interquartile range (IQR) filters unreliable data points, ensuring that medical tools and professionals receive only the most pertinent and accurate information. The primary objective of this study is to equip healthcare professionals and researchers with a robust tool for managing extensive, high-dimensional clinical datasets, enabling effective isolation and removal of aberrant data points. Furthermore, a regression model has been developed using Automated Machine Learning (AutoML) to assess the impact of the ensemble abnormal-pattern detection approach. Various statistical error estimation techniques validate the efficacy of the hybrid approach alongside AutoML. Experimental results show that applying this hybrid model to patient rehabilitation data leads to a notable enhancement in AutoML performance, with an average improvement of 0.041 in the R2 score, surpassing the effectiveness of traditional regression models.
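The stochastic filtering stage described above can be sketched as a plain IQR fence. This is a hedged simplification: the HPO-tuned DBSCAN stage is omitted, the multiplier k = 1.5 is the conventional default rather than the paper's setting, and the sample values are invented.

```python
# IQR-based filter: keep points within [Q1 - k*IQR, Q3 + k*IQR],
# flag everything outside as anomalous. Pure-Python sketch.
def iqr_filter(values, k=1.5):
    xs = sorted(values)

    def quantile(q):
        # Linear interpolation between the two nearest order statistics.
        pos = q * (len(xs) - 1)
        lo, hi = int(pos), min(int(pos) + 1, len(xs) - 1)
        return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    iqr = q3 - q1
    lo_b, hi_b = q1 - k * iqr, q3 + k * iqr
    kept = [v for v in values if lo_b <= v <= hi_b]
    outliers = [v for v in values if v < lo_b or v > hi_b]
    return kept, outliers

data = [50, 52, 51, 49, 53, 48, 250]   # one implausible exercise reading
kept, outliers = iqr_filter(data)
assert outliers == [250]
```

Only the `kept` values would then be passed downstream to the AutoML regression stage.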
Abstract: BACKGROUND Due to the increasing rate of thyroid nodule diagnosis and the desire to avoid an unsightly cervical scar, remote thyroidectomies were developed and are increasingly performed. The transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach (TAA) are the two most commonly used remote approaches. No previous meta-analysis has compared postoperative infections and swallowing difficulties between the two procedures. AIM To compare these outcomes among patients undergoing lobectomy for unilateral thyroid carcinoma or a benign thyroid nodule. METHODS We searched PubMed MEDLINE, Google Scholar, and the Cochrane Library from the date of the first published article up to August 2025. The terms used were transoral thyroidectomy vestibular approach, trans-areolar thyroidectomy, scarless thyroidectomy, remote thyroidectomy, infections, postoperative, inflammation, dysphagia, and swallowing difficulties. We identified 130 studies; of these, 30 full texts were screened and only six studies were included in the final meta-analysis. RESULTS Postoperative infections did not differ between the two approaches: odds ratio = 1.33, 95% confidence interval: 0.50-3.53; the χ2 was 1.92 and the P-value for the overall effect was 0.57. Similarly, transient swallowing difficulty did not differ between the two forms of surgery: odds ratio = 0.91, 95% confidence interval: 0.35-2.40; the χ2 was 1.32, and the P-value for the overall effect was 0.85. CONCLUSION No statistically significant differences were evident between the transoral endoscopic thyroidectomy vestibular approach and the trans-areolar approach regarding postoperative infection and transient swallowing difficulties. Further, longer randomized trials are needed.
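The odds ratios and confidence intervals reported above follow the standard 2x2-table calculation, sketched below with hypothetical counts (not the review's pooled data): OR = ad/bc, with a Wald 95% CI of exp(ln OR ± 1.96·SE), where SE = sqrt(1/a + 1/b + 1/c + 1/d).

```python
# Odds ratio with a Wald-type 95% CI from a single 2x2 table.
# Counts below are illustrative, not taken from the meta-analysis.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a, b = events/non-events in group 1; c, d = same in group 2."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, (lo, hi)

or_, (lo, hi) = odds_ratio_ci(4, 96, 3, 97)
assert lo < or_ < hi
# A CI that crosses 1.0, as here, corresponds to "no significant difference".
```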
Funding: Supported by the Aeronautical Science Foundation of China (20111052010), the Jiangsu Graduates Innovation Project (CXZZ120163), the "333" Project of Jiangsu Province, and the Qing Lan Project of Jiangsu Province.
Abstract: With the development of the Global Positioning System (GPS), wireless technology, and location-aware services, it is possible to collect a large quantity of trajectory data. In the field of data mining for moving objects, anomaly detection is a hot topic. Building on the development of anomalous trajectory detection for moving objects, this paper introduces the classical trajectory outlier detection (TRAOD) algorithm and then proposes a density-based trajectory outlier detection (DBTOD) algorithm, which compensates for the TRAOD algorithm's inability to detect anomalies when the trajectory is local and dense. The results of applying the proposed algorithm to the Elk1993 and Deer1995 datasets are also presented, which show the effectiveness of the algorithm.
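The density intuition behind such detectors can be sketched very simply: score each trajectory point by how many other points fall within a radius eps, and flag zero-neighbour points as outliers. This is a minimal illustration of the general idea, not the DBTOD algorithm itself; the radius and sample trajectory are invented.

```python
# Density-style outlier sketch for trajectory points: a point with no
# neighbours within eps is flagged. Illustrative only, not DBTOD.
import math

def density_scores(points, eps):
    scores = []
    for i, (x1, y1) in enumerate(points):
        n = sum(1 for j, (x2, y2) in enumerate(points)
                if i != j and math.hypot(x1 - x2, y1 - y2) <= eps)
        scores.append(n)
    return scores

traj = [(0, 0), (1, 0), (2, 0), (3, 0), (10, 10)]  # last point strays far off
scores = density_scores(traj, eps=1.5)
outliers = [p for p, s in zip(traj, scores) if s == 0]
assert outliers == [(10, 10)]
```

Real trajectory detectors compare trajectory segments rather than raw points, but the local-density principle is the same.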
Funding: The author extends his appreciation to the Deputyship for Research & Innovation, Ministry of Education in Saudi Arabia for funding this research work through project number IFPSAU-2021/01/17758.
Abstract: Finding clusters based on density represents a significant class of clustering algorithms. These methods can discover clusters of various shapes and sizes. The most studied algorithm in this class is Density-Based Spatial Clustering of Applications with Noise (DBSCAN). It identifies clusters by grouping densely connected objects into one group and discarding noise objects. It requires two input parameters: epsilon (a fixed neighborhood radius) and MinPts (the lowest number of objects within epsilon). However, it cannot handle clusters of varying densities, since it uses a global value for epsilon. This article proposes an adaptation of DBSCAN that can discover clusters of varied densities while reducing the required number of input parameters to one: the only user input is MinPts. Epsilon, on the other hand, is computed automatically from statistical information about the dataset. The proposed method finds the core distance for each object in the dataset, takes the average of these distances as the first value of epsilon, and finds the clusters satisfying this density level. The remaining unclustered objects are then clustered using a new value of epsilon that equals the average core distance of the unclustered objects. This process continues until all objects have been clustered or the remaining unclustered objects number less than 0.006 of the dataset's size. Benchmark datasets were used to evaluate the effectiveness of the proposed method, which produced promising results. Practical experiments demonstrate the method's outstanding ability to detect clusters of different densities even when there is no separation between them. The accuracy of the method ranges from 92% to 100% on the experimental datasets.
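The epsilon-estimation step described above can be sketched as follows: epsilon is the average "core distance", i.e. the average distance from each object to its MinPts-th nearest neighbour. This is a hedged simplification of the full iterative method (the re-estimation loop over unclustered objects is omitted), and the sample points are invented.

```python
# First-pass epsilon estimate as the mean core distance (distance to the
# MinPts-th nearest neighbour). Sketch of the idea, not the full algorithm.
import math

def core_distance(points, i, min_pts):
    dists = sorted(math.dist(points[i], p)
                   for j, p in enumerate(points) if j != i)
    return dists[min_pts - 1]

def estimate_epsilon(points, min_pts=3):
    return sum(core_distance(points, i, min_pts)
               for i in range(len(points))) / len(points)

# Two clusters of different density; epsilon adapts to the data.
pts = [(0, 0), (0, 1), (1, 0), (1, 1), (10, 10), (10, 11), (11, 10)]
eps = estimate_epsilon(pts, min_pts=2)
assert 0 < eps < 2
```

A subsequent pass would recompute the average over the still-unclustered objects to pick up sparser density levels.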
Funding: Supported by the National Natural Science Foundation of China (71271018).
Abstract: Overlapping community detection in a network is a challenging issue that has attracted much attention in recent years. A notion of the hesitant node (HN) is proposed. An HN contacts multiple communities, but the communications are weak or even accidental; thus the HN holds an implicit community structure. However, HNs are not rare in real-world networks. It is important to identify them because they can be efficient hubs that form the overlapping portions of communities, or simple attached nodes of some communities. Current approaches have difficulty identifying and clustering HNs. A density-based rough set model (DBRSM) is proposed by combining the virtues of density-based algorithms and rough set models. It incorporates the macro perspective of the community structure of the whole network and the micro perspective of the local information held by HNs, which facilitates the further "growth" of HNs in communities. We offer theoretical support for this model from the standpoint of the strength of the trust path. Experiments on real-world and synthetic datasets show the practical significance of analyzing and clustering HNs based on DBRSM. Besides, clustering based on DBRSM promotes modularity optimization.
Abstract: We propose a new clustering algorithm that helps researchers quickly and accurately analyze data. We call this algorithm the Combined Density-based and Constraint-based Algorithm (CDC). CDC consists of two phases. In the first phase, CDC employs the idea of density-based clustering to split the original data into a number of fragmented clusters; at the same time, CDC cuts off noise and outliers. In the second phase, CDC employs the concept of the K-means clustering algorithm to select a larger cluster as the center. The larger cluster then merges smaller clusters that satisfy certain constraint rules. Because the merged clusters lie around the center cluster, the clustering results show high accuracy. Moreover, CDC reduces the computations and speeds up the clustering process. In this paper, the accuracy of CDC is evaluated and compared with those of K-means, hierarchical clustering, and the genetic clustering algorithm (GCA) proposed in 2004. Experimental results show that CDC has better performance.
Funding: The Deanship of Scientific Research at Umm Al-Qura University, Grant Code: 23UQU4361009DSR001.
Abstract: Cluster analysis is a crucial technique in unsupervised machine learning, pattern recognition, and data analysis. However, current clustering algorithms suffer from the need for manual determination of parameter values, low accuracy, and inconsistent performance with respect to data size and structure. To address these challenges, a novel clustering algorithm called the fully automated density-based clustering method (FADBC) is proposed. The FADBC method consists of two stages: parameter selection and cluster extraction. In the first stage, a proposed method extracts optimal parameters for the dataset, including the epsilon size and the minimum-number-of-points threshold. These parameters are then used in a density-based technique that scans each point in the dataset and evaluates neighborhood densities to find clusters. The proposed method was evaluated on different benchmark datasets and metrics, and the experimental results demonstrate its competitive performance without requiring manual inputs. The results show that the FADBC method outperforms well-known clustering methods such as agglomerative hierarchical clustering, k-means, spectral clustering, DBSCAN, FCDCSD, Gaussian mixtures, and density-based spatial clustering methods, handling datasets of many kinds well.
Funding: Supported by the National Basic Research Program of China (973 Program: 2013CB329004).
Abstract: Since data services are rapidly penetrating our daily life, the mobile network is becoming more complicated, and the amount of data transmitted keeps increasing. Under these conditions, traditional statistical methods for anomalous cell detection cannot adapt to the evolution of networks, and data mining has become the mainstream. In this paper, we propose a novel kernel density-based local outlier factor (KLOF) to assign each object a degree of being an outlier. First, the notion of KLOF is introduced, which captures exactly the relative degree of isolation. Then, by analyzing its properties, including the tightness of its upper and lower bounds and its sensitivity to density perturbation, we find that KLOF is much greater than 1 for outliers. Finally, KLOF is applied to a real-world dataset to detect anomalous cells with abnormal key performance indicators (KPIs) to verify its reliability. The experiment shows that KLOF can find outliers efficiently and can guide operators toward faster and more efficient troubleshooting.
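The "much greater than 1 for outliers" property can be illustrated with a simple kernel-density local outlier score. This is a hedged sketch in the spirit of KLOF, not the paper's exact definition: the Gaussian kernel, bandwidth h, neighbour count k, and sample KPI points are all assumptions.

```python
# Kernel-density local outlier sketch: each point's Gaussian-kernel density
# is compared with the average density of its k nearest neighbours, so an
# isolated point scores well above 1. Illustrative only, not the exact KLOF.
import math

def kde(points, x, h=1.0):
    return sum(math.exp(-math.dist(x, p) ** 2 / (2 * h * h))
               for p in points) / len(points)

def klof_scores(points, k=3, h=1.0):
    dens = [kde(points, p, h) for p in points]
    scores = []
    for i, p in enumerate(points):
        # Sorted by distance; index 0 is the point itself, so skip it.
        nbrs = sorted(range(len(points)),
                      key=lambda j: math.dist(p, points[j]))[1:k + 1]
        scores.append(sum(dens[j] for j in nbrs) / (k * dens[i]))
    return scores

kpis = [(0, 0), (0.5, 0), (0, 0.5), (0.5, 0.5), (8, 8)]  # last cell anomalous
scores = klof_scores(kpis)
assert scores[-1] > max(scores[:-1])  # outlier scores well above the cluster's
```

Points inside a dense group score near 1, because their density matches their neighbours'; the isolated point's neighbour densities dwarf its own, pushing its ratio well above 1.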
Abstract: Clustering evolving data streams must be performed in limited time and with reasonable quality. Existing micro-clustering-based methods do not consider the distribution of data points inside the micro-cluster. We propose LeaDen-Stream (Leader Density-based clustering algorithm over evolving data Stream), a density-based clustering algorithm using leader clustering. The algorithm is based on two-phase clustering. The online phase selects the proper mini-micro or micro-cluster leaders based on the distribution of data points in the micro-clusters. Then, the leader centers are sent to the offline phase to form final clusters. In LeaDen-Stream, by carefully choosing between two kinds of micro leaders, we decrease the time complexity of clustering while maintaining cluster quality. A pruning strategy is also used to separate real data from noise by introducing dense and sparse mini-micro and micro-cluster leaders. Our performance study over a number of real and synthetic datasets demonstrates the effectiveness and efficiency of our method.
Abstract: Cardiovascular diseases (CVDs) remain the leading cause of morbidity and mortality worldwide, necessitating innovative diagnostic and prognostic strategies. Traditional biomarkers like C-reactive protein, uric acid, troponin, and natriuretic peptides play crucial roles in CVD management, yet they are often limited by sensitivity and specificity constraints. This narrative review critically examines the emerging landscape of cardiac biomarkers and advocates a multiple-marker approach to enhance early detection, prognosis, and risk stratification of CVD. In recent years, several novel biomarkers have shown promise in revolutionizing CVD diagnostics. Gamma-glutamyltransferase, microRNAs, endothelial microparticles, placental growth factor, trimethylamine N-oxide, retinol-binding protein 4, copeptin, heart-type fatty acid-binding protein, galectin-3, growth differentiation factor-15, soluble suppression of tumorigenicity 2, fibroblast growth factor 23, and adrenomedullin have emerged as significant indicators of cardiovascular health. These biomarkers provide insights into various pathophysiological processes, such as oxidative stress, endothelial dysfunction, inflammation, metabolic disturbances, and myocardial injury. The integration of these novel biomarkers with traditional ones offers a more comprehensive understanding of CVD mechanisms. This multiple-marker approach can improve diagnostic accuracy, allowing for better risk stratification and more personalized treatment strategies. This review underscores the need for continued research to validate the clinical utility of these biomarkers and their potential incorporation into routine clinical practice. By leveraging the strengths of both traditional and novel biomarkers, precise therapeutic plans can be developed, thereby improving the management and prognosis of patients with CVDs. The ongoing exploration and validation of these biomarkers are crucial for advancing cardiovascular care and addressing the limitations of current diagnostic tools.
Funding: The National Key R&D Program of China (2021ZD0201300); the National Natural Science Foundation of China (624B2058, U1913602, and 61936004); the Innovation Group Project of the National Natural Science Foundation of China (61821003); and the 111 Project on Computational Intelligence and Intelligent Control (B18024).
Abstract: For large-scale heterogeneous multi-agent systems (MASs) with characteristics of dense-sparse mixed distribution, this paper investigates the practical finite-time deployment problem by establishing a novel cross-species bionic analytical framework based on the partial differential equation-ordinary differential equation (PDE-ODE) approach. Specifically, by designing a specialized network communication protocol and employing the spatial continuum method for densely distributed agents, this paper models the tracking errors of densely distributed agents as a PDE equivalent to a human disease-transmission model, and those of sparsely distributed agents as several ODEs equivalent to predator population models. The coupling relationship between the PDE and ODE models is established through the boundary conditions of the PDE, thereby forming a PDE-ODE-based tracking-error model for the considered MASs. Furthermore, by integrating an adaptive neural control scheme with the aforementioned biological models, a "Flexible Neural Network" endowed with adaptive and self-stabilizing capabilities is constructed, which acts upon the considered MASs, enabling their practical finite-time deployment. Finally, the effectiveness of the developed approach is illustrated through a numerical example.
Funding: Supported by Singapore National Medical Research Council (NMRC) grants, including CS-IRG, HLCA2022 (to ZDZ), STaR, OF LCG 000207 (to EKT), a Clinical Translational Research Programme in Parkinson's Disease, and a Duke-Duke-NUS collaboration pilot grant (to ZDZ).
Abstract: The progressive loss of dopaminergic neurons in affected patients' brains is one of the pathological features of Parkinson's disease, the second most common human neurodegenerative disease. Although the detailed pathogenesis accounting for dopaminergic neuron degeneration in Parkinson's disease is still unclear, the advancement of stem cell approaches has shown promise for Parkinson's disease research and therapy. Induced pluripotent stem cells have been commonly used to generate dopaminergic neurons, which has provided valuable insights that improve our understanding of Parkinson's disease pathogenesis and contribute to anti-Parkinson's disease therapies. The current review discusses the practical approaches and potential applications of induced pluripotent stem cell techniques for generating and differentiating dopaminergic neurons from induced pluripotent stem cells. The benefits of induced pluripotent stem cell-based research are highlighted. Various dopaminergic neuron differentiation protocols from induced pluripotent stem cells are compared. The emerging three-dimensional brain organoid models are evaluated against conventional two-dimensional cell culture. Finally, the limitations, challenges, and future directions of induced pluripotent stem cell-based approaches are analyzed and proposed, which will be significant to the future application of induced pluripotent stem cell-related techniques for Parkinson's disease.
Funding: Funded by Taif University, Saudi Arabia, project No. TU-DSPP-2024-263.
Abstract: Deep learning algorithms have been rapidly incorporated into many different applications due to the increase in computational power and the availability of massive amounts of data. Recently, both deep learning and ensemble learning have been used to recognize underlying structures and patterns from high-level features to make predictions/decisions. With the growth in popularity of deep learning and ensemble learning algorithms, they have received significant attention from both scientists and the industrial community due to their superior ability to learn features from big data. Ensemble deep learning has exhibited significant performance in enhancing learning generalization through the use of multiple deep learning algorithms. Although ensemble deep learning has large quantities of training parameters, which result in time and space overheads, it performs much better than traditional ensemble learning. Ensemble deep learning has been successfully used in several areas, such as bioinformatics, finance, and health care. In this paper, we review and investigate recent ensemble deep learning algorithms and techniques in health care domains: medical imaging, health care data analytics, genomics, diagnosis, disease prevention, and drug discovery. We cover several widely used deep learning algorithms along with their architectures, including deep neural networks (DNNs), convolutional neural networks (CNNs), recurrent neural networks (RNNs), and generative adversarial networks (GANs). Common healthcare tasks, such as medical imaging, electronic health records, and genomics, are also demonstrated. Furthermore, this review discusses and explores the challenges inherent in reducing the burden on the healthcare system. Finally, future directions and opportunities for enhancing healthcare model performance are discussed.
Funding: Supported by the National Research Foundation of Korea, No. NRF-2021S1A5A8062526.
Abstract: This article provides a comprehensive analysis of the study by Hou et al, focusing on the complex interplay between psychological and physical factors in the postoperative recovery (POR) of patients with perianal diseases. The study sheds light on how illness perception, anxiety, and depression significantly influence recovery outcomes. Hou et al developed a predictive model that demonstrated high accuracy in identifying patients at risk of poor recovery. The article explores the critical role of preoperative psychological assessment, highlighting the need for mental health support and personalized recovery plans in enhancing POR quality. A multidisciplinary approach, integrating mental health professionals with surgeons, anesthesiologists, and other specialists, is emphasized to ensure comprehensive care for patients. The study's findings serve as a call to integrate psychological care into surgical practice to optimize outcomes for patients with perianal diseases.
Abstract: Objective: To explore the effect of Health Action Process Approach (HAPA) theory in patients with type D personality psoriasis. Methods: A total of 66 patients with type D personality psoriasis admitted to the dermatology department of a top-three hospital in Jingzhou City from November 2022 to July 2023 were selected and divided into a control group and a test group, with 33 cases in each group, by the random number table method. The control group received routine health education, and the experimental group received health education based on HAPA theory. The chronic disease self-efficacy scale, the hospital anxiety and depression scale, and the skin disease quality of life scale were used to evaluate the effect of the intervention. Results: After 3 months of intervention, the self-efficacy scores in the experimental group were higher than those in the control group (P …). Conclusion: Health education based on HAPA theory can enhance the self-efficacy of patients with type D personality psoriasis, relieve negative emotions, and improve their quality of life.
Abstract: BACKGROUND Dissection of the root of the mesentery is one of the critical maneuvers, especially in borderline resectable pancreatic head cancer. Intra-abdominal chyle leak (CL), including chylous ascites, may ensue in up to 10% of patients after pancreatic resections. Globally recognized superior mesenteric artery (SMA)-first approaches are widely performed. Mesenteric dissection through the inferior infracolic approach is discussed in this study, with emphasis on its postoperative impact on CL, which is the cornerstone of this study. AIM To assess the incidence, risk factors, and clinical impact of CL following dissection of the root of the mesentery, and the different treatment modalities. METHODS This is a retrospective study incorporating the patients who underwent dissection of the root of the mesentery with inferior infracolic SMA-first approach pancreatoduodenectomy for ventral-body and uncinate masses of the pancreas in the Department of Gastrointestinal and General Surgery of Kathmandu Medical College and Teaching Hospital from January 1, 2021 to February 28, 2024. Intraoperative findings and postoperative outcomes were analyzed. RESULTS In three years, ten patients underwent dissection of the root of the mesentery with inferior infracolic SMA-first approach pancreatoduodenectomy. The mean age was 67.6 years with a male-to-female ratio of 4:5. CL was seen in four patients. By virtue of CL, Clavien-Dindo grade II or higher morbidity was observed in four patients. Two patients had a hospital stay of more than 20 days, the former having delayed gastric emptying and the latter a long-term total parenteral nutrition requirement. The mean operative time was 330 minutes. Curative resection was achieved in 100% of the patients. The mean durations of intensive care unit and hospital stay were 2.55 ± 1.45 days and 15.7 ± 5.32 days, respectively. CONCLUSION Dissection of the root of the mesentery with lymphadenectomy and vascular resection correlated with the occurrence of CL. After complete curative resection, these were managed with
total parenteral nutrition without adversely impacting outcome.
Funding: Supported by the Yunnan Provincial Clinical Medicine Center for Digestive System Diseases, No. 2024YNLCYXZX0132.
Abstract: BACKGROUND Laparoscopic hepatectomy has been widely accepted for the treatment of liver tumors. Compared with open surgery, it provides a reduced hospital stay, less intraoperative blood loss, less trauma, and fewer incisional infections, without affecting tumor outcomes. However, lesions in the right lobe of the liver are deep and obstructed by the ribs, making exposure difficult and increasing the degree of surgical difficulty; thus, liver tumors in the deep right lobe pose technical challenges in standard laparoscopic surgery. AIM To investigate the safety and efficacy of laparoscopic retroperitoneal partial hepatectomy for liver tumors. METHODS The clinical data of 72 patients who underwent laparoscopic retroperitoneal partial hepatectomy for liver tumors between January 2018 and December 2024 at the First People's Hospital of Yunnan Province were analyzed. Of the 72 patients included, 34 were male and 38 were female, with ages ranging from 34 to 72 years (median age, 45 years). The tumors were all located in the right lobe of the liver, with 30 cases in segment S6, 27 cases in segment S7, and 15 cases in segment S8; the mean tumor diameter was 7.5 ± 3.4 cm. Postoperative tumor indices, liver function, and postoperative complications were analyzed to evaluate the clinical efficacy of laparoscopic partial hepatectomy via the retroperitoneal approach. RESULTS The surgeries were completed in all patients, with conversion to open surgery required in 10 patients. The mean operative time, blood loss, drain retention time, and length of postoperative hospital stay were 140 ± 30 minutes, 150 ± 46 mL, 3.8 ± 1.2 days, and 8.3 ± 5.3 days, respectively. Liver function tests returned to normal in all patients within two weeks of surgery. Fifteen patients developed atelectasis and pleural effusion and were managed with incision and drainage and antibiotics. Two patients developed uncomplicated minimal ascites, and the remaining patients had no perioperative complications, such as abdominal
hemorrhage, infection, liver failure, bile leakage, and other adverse events. All patients were successfully treated. CONCLUSION Laparoscopic retroperitoneal partial hepatectomy is a safe and effective approach for right hepatic space-occupying lesions, particularly in segments S6, S7, and S8, with fewer postoperative complications, less trauma, and faster recovery. This procedure provides new surgical access for resection of deep tumors in the right lobe of the liver and has clear clinical implications.
Abstract: In response to common problems in college English writing teaching, such as the separation of learning and application, students' low interest in writing, and difficulties in expression, this paper, based on the theoretical framework of the production-oriented approach (POA) proposed by Professor Wen Qiufang, designed and implemented an IELTS writing teaching plan. The plan takes "motivating, enabling, and assessing" as the core teaching process and selects typical IELTS argumentative essay topics (such as food diversity) to create real communication scenarios. In the motivating stage, diverse inputs are used to stimulate students' interest and expose their language weaknesses; in the enabling stage, language knowledge, viewpoint generation, and text structure are the focus of targeted input and training; in the assessing stage, a combination of teacher-student cooperation and peer evaluation is adopted to guide students to identify and correct deficiencies in language use. The research results show that the POA model can effectively enhance students' writing interest, active learning awareness, and writing ability, particularly in overcoming vocabulary poverty and material shortages and in improving language accuracy and richness of expression. This provides an operational theoretical basis and practical path for improving the teaching effect of IELTS writing.
Funding: Supported by grants from the National Natural Science Foundation of China (82173129) and the Innovative and Entrepreneurial Talent Doctor of Jiangsu Province, China (JSSCBS20221872).
Abstract: Background: The dorsal approach is a potentially effective strategy for minimally invasive liver resection. This study aimed to compare outcomes between robot-assisted and laparoscopic hemihepatectomy through the dorsal approach. Methods: We compared patients who underwent robot-assisted hemihepatectomy (Rob-HH) with those who had laparooscopic hemihepatectomy (Lap-HH) through the dorsal approach between January 2020 and December 2022. A 1:1 propensity score-matching (PSM) analysis was performed to minimize bias and confounding factors. Results: Ninety-six patients were included, 41 with Rob-HH and 55 with Lap-HH. Among them, 58 underwent left hemihepatectomy (LHH) and 38 underwent right hemihepatectomy (RHH). Compared with the Lap-HH group, patients with Rob-HH had less estimated blood loss (median: 100.0 vs. 300.0 mL, P = 0.016), lower blood transfusion rates (4.9% vs. 29.1%, P = 0.003), and lower postoperative complication rates (26.8% vs. 54.5%, P = 0.016). These significant differences persisted after PSM and in the LHH subgroups. Furthermore, robot-assisted LHH was associated with a shorter Pringle duration (45 vs. 60 min, P = 0.047). RHH subgroup analysis showed that, compared with Lap-RHH, Rob-RHH was associated with less estimated blood loss (200.0 vs. 400.0 mL, P = 0.013). No significant differences were found in other perioperative outcomes among the pre- and post-PSM cohorts, such as Pringle duration, operative time, and hospital stay. Conclusions: The dorsal approach was a safe and feasible strategy for hemihepatectomy, with favorable outcomes under the robot-assisted system in reducing intraoperative blood loss, transfusion, and postoperative complications.