Funding: supported by the Shanghai Sailing Program (22YF1416300), the Youth Fund Project of the National Natural Science Foundation of China (32202117), the National Key Research and Development Program of China (2022YFD2100104), and the China Agriculture Research System (CARS-47).
Abstract: Bigeye tuna is a protein-rich fish that is susceptible to spoilage during cold storage; however, there is limited information on untargeted metabolomic profiling of bigeye tuna with respect to spoilage-associated enzymes and metabolites. This study aimed to investigate how cold storage affects the enzyme activities, nutrient composition, tissue microstructure, and spoilage metabolites of bigeye tuna. The activities of cathepsins B, H, and L increased, Na⁺/K⁺-ATPase and Mg²⁺-ATPase activities decreased, and α-glucosidase, lipase, and lipoxygenase activities first increased and then decreased during cold storage, suggesting that proteins undergo degradation and ATP metabolism proceeds at a faster rate during cold storage. Nutrient composition (moisture and lipid content) and total amino acids decreased, suggesting that the nutritional value of bigeye tuna was reduced. In addition, a logistic regression equation was established as a food-analysis tool to assess the dynamics and correlations of the enzymes of bigeye tuna during cold storage. Based on untargeted metabolomic profiling, a total of 524 metabolites were identified in bigeye tuna, including several spoilage metabolites involved in lipid metabolism (glycerophosphocholine and choline phosphate), amino acid metabolism (L-histidine, 5-deoxy-5′-(methylthio)adenosine, and 5-methylthioadenosine), and carbohydrate metabolism (D-gluconic acid, α-D-fructose 1,6-bisphosphate, and D-glyceraldehyde 3-phosphate). Tissue microstructure showed a looser network and visible deterioration of tissue fibers during cold storage. Therefore, metabolomic analysis and tissue microstructure provide insight into the spoilage mechanism of bigeye tuna during cold storage.
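The abstract mentions a logistic regression equation established as a food-analysis tool for enzyme dynamics during storage. As a hedged illustration only (the paper's actual equation, data, and fitting procedure are not given here), the sketch below fits a logistic curve to hypothetical enzyme-activity measurements over storage days via a coarse grid search, in plain Python:

```python
import math

def logistic(t, L, k, t0):
    """Logistic curve: activity rising toward plateau L during storage."""
    return L / (1.0 + math.exp(-k * (t - t0)))

def fit_logistic(days, activity):
    """Coarse grid search for (L, k, t0) minimising squared error.
    A stand-in for the paper's regression tooling, not its actual fit."""
    best, best_err = None, float("inf")
    for L in [max(activity) * s / 10.0 for s in range(10, 16)]:
        for k in [0.2 * i for i in range(1, 11)]:
            for t0 in range(0, 15):
                err = sum((logistic(t, L, k, t0) - a) ** 2
                          for t, a in zip(days, activity))
                if err < best_err:
                    best, best_err = (L, k, t0), err
    return best

# Hypothetical cathepsin-B activity over days of cold storage.
days = [0, 2, 4, 6, 8, 10, 12]
activity = [1.1, 1.8, 3.2, 5.9, 8.1, 9.0, 9.4]
L, k, t0 = fit_logistic(days, activity)
```

In practice, a nonlinear least-squares routine would replace the grid search; the point is only the sigmoidal shape assumed for enzyme-activity dynamics.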
Funding: National Key R&D Program of China under Grant Nos. 2018YFC1504504 and 2018YFC0809404.
Abstract: Damage to electrical equipment in an earthquake can lead to power outages in power systems. Seismic fragility analysis is a common method for assessing the seismic reliability of electrical equipment. To further guarantee the efficiency of the analysis, multi-source uncertainties, including the structure itself and the seismic excitation, need to be considered. A method for seismic fragility analysis that reflects structural and seismic parameter uncertainty was developed in this study. The proposed method uses random sampling based on Latin hypercube sampling (LHS) to account for structural parameter uncertainty and the group structure characteristics of electrical equipment. Then, logistic Lasso regression (LLR) is used to find the seismic fragility surface based on double ground motion intensity measures (IMs). The seismic fragility of a finite element model of a ±1000 kV main transformer (UHVMT) was analyzed using the proposed method. The results show that the seismic fragility function obtained by this method can be used to construct the relationship between the uncertainty parameters and the failure probability. The seismic fragility surface not only provided the probabilities of seismic damage states under different IMs, but also had better stability than the fragility curve. Furthermore, a sensitivity analysis of the structural parameters revealed that the elastic modulus of the bushing and the height of the high-voltage bushing may have a greater influence.
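Latin hypercube sampling, the first step of the method above, stratifies each parameter axis into n equal-width bins so that n samples cover every bin in every dimension. A minimal stdlib-Python sketch of unit-cube LHS (scaling to physical parameter ranges and the subsequent logistic Lasso fit are omitted):

```python
import random

def latin_hypercube(n, dims, rng=random.Random(0)):
    """Latin hypercube sample of n points in [0, 1)^dims: each of the n
    equal-width strata along every axis contains exactly one point."""
    sample = []
    for _ in range(dims):
        # one uniform draw per stratum, then shuffle strata across draws
        col = [(i + rng.random()) / n for i in range(n)]
        rng.shuffle(col)
        sample.append(col)
    return list(zip(*sample))  # n points, each of length dims

pts = latin_hypercube(10, 2)
```

Each unit-cube point would then be mapped linearly to the range of one uncertain structural parameter (e.g., a material modulus) before running the finite element analyses.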
Abstract: Purpose: The purpose of this study is to develop and compare model choice strategies in the context of logistic regression. Model choice means the choice of the covariates to be included in the model. Design/methodology/approach: The study is based on Monte Carlo simulations. The methods are compared in terms of three measures of accuracy: specificity and two kinds of sensitivity. A loss function combining sensitivity and specificity is introduced and used for a final comparison. Findings: The choice of method depends on how much the user emphasizes sensitivity against specificity. It also depends on the sample size. For a typical logistic regression setting with a moderate sample size and a small to moderate effect size, either BIC, BICc, or Lasso seems to be optimal. Research limitations: Numerical simulations cannot cover the whole range of data-generating processes occurring with real-world data; thus, more simulations are needed. Practical implications: Researchers can refer to these results if they believe that their data-generating process is somewhat similar to one of the scenarios presented in this paper. Alternatively, they could run their own simulations and calculate the loss function. Originality/value: This is a systematic comparison of model choice algorithms and heuristics in the context of logistic regression. The distinction between two types of sensitivity and a comparison based on a loss function are methodological novelties.
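A loss combining sensitivity and specificity can be written as a weighted sum of the two error rates. The weight w and the exact functional form below are assumptions for illustration, not necessarily the paper's definition:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Sensitivity: fraction of true covariates selected.
    Specificity: fraction of noise covariates excluded."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec

def loss(sens, spec, w=0.5):
    """Weighted loss: w weights missed true covariates (1 - sensitivity),
    1 - w weights falsely included ones (1 - specificity)."""
    return w * (1.0 - sens) + (1.0 - w) * (1.0 - spec)

# Hypothetical selection outcome over simulation replications:
# 8 of 10 true covariates kept, 85 of 90 noise covariates dropped.
sens, spec = sensitivity_specificity(tp=8, fn=2, tn=85, fp=5)
```

Sweeping w from 0 to 1 reproduces the trade-off the Findings section describes: methods that favor sparse models win at low w, liberal selectors win at high w.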
Abstract: For the composition analysis and identification of ancient glass products, L1 regularization, K-Means cluster analysis, the elbow rule, and other methods were used comprehensively to build logistic regression, cluster analysis, and hyper-parameter test models, and SPSS, Python, and other tools were used to obtain the classification rules of glass products under different fluxes, sub-classification under different chemical compositions, a hyper-parameter K value test, and a rationality analysis. This research can provide theoretical support for the protection and restoration of ancient glass relics.
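The elbow rule mentioned above picks the cluster count K at which the within-cluster sum of squares (WCSS) stops dropping sharply. A toy stand-in with hypothetical one-dimensional composition data (not the study's SPSS/Python pipeline):

```python
def kmeans_1d(xs, k, iters=50):
    """Plain Lloyd's algorithm on scalars, initialised at block medians."""
    srt = sorted(xs)
    n = len(srt)
    centers = [srt[(2 * i + 1) * n // (2 * k)] for i in range(k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for x in xs:
            nearest = min(range(k), key=lambda j: (x - centers[j]) ** 2)
            groups[nearest].append(x)
        centers = [sum(g) / len(g) if g else centers[j]
                   for j, g in enumerate(groups)]
    return centers

def wcss(xs, centers):
    """Within-cluster sum of squares for the elbow plot."""
    return sum(min((x - c) ** 2 for c in centers) for x in xs)

# Hypothetical flux-related oxide percentages forming two clear groups;
# the WCSS curve should "elbow" at k = 2.
data = [1.0, 1.2, 0.9, 1.1, 8.0, 8.3, 7.9, 8.2]
curve = [wcss(data, kmeans_1d(data, k)) for k in (1, 2, 3)]
```

On this data the drop from k=1 to k=2 is two orders of magnitude, while k=2 to k=3 changes little, which is exactly the elbow the rule looks for.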
Funding: Under the auspices of the National Natural Science Foundation of China (No. 42101414) and the Natural Science Foundation for Outstanding Young Scholars in Jilin Province (No. 20230508106RC).
Abstract: The burning of crop residues in fields is a significant global biomass burning activity; it is a key element of the terrestrial carbon cycle and an important source of atmospheric trace gases and aerosols. Accurate estimation of cropland burned area is both crucial and challenging, especially for the small and fragmented burn scars in China. Here we developed an automated burned area mapping algorithm implemented with Sentinel-2 MultiSpectral Instrument (MSI) data and tested its effectiveness on the Songnen Plain, Northeast China, using satellite imagery from 2020. We employed a logistic regression method to integrate multiple spectral data into a synthetic indicator, and compared the results with manually interpreted burned area reference maps and the Moderate-Resolution Imaging Spectroradiometer (MODIS) MCD64A1 burned area product. The overall accuracy of single-variable logistic regression was 77.38% to 86.90% and 73.47% to 97.14% for the 52TCQ and 51TYM cases, respectively. In comparison, multiple-variable logistic regression on Sentinel-2 images improved the accuracy of the burned area map to 87.14% and 98.33% for the 52TCQ and 51TYM cases, respectively. The balance of omission error and commission error was also improved. The integration of multiple spectral data with a logistic regression method proves to be effective for burned area detection, offering a highly automated process with an automatic threshold determination mechanism. This method exhibits excellent extensibility and flexibility, taking the image tile as the operating unit. It is suitable for burned area detection at a regional scale and can also be implemented with other satellite data.
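The core of the mapping step, a logistic regression fusing several spectral variables into one burned-probability indicator, can be sketched in plain Python on hypothetical two-index pixel data (the paper's actual predictors and its automatic threshold mechanism may differ from the fixed 0.5 used here):

```python
import math

def train_logistic(X, y, lr=0.5, epochs=2000):
    """Batch gradient descent on the logistic log-loss; returns weights
    with the bias first. A minimal stand-in for the regression step."""
    w = [0.0] * (len(X[0]) + 1)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / len(X) for wj, g in zip(w, grad)]
    return w

def predict(w, xi, threshold=0.5):
    """Label a pixel burned (1) when the synthetic indicator >= threshold."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1 if 1.0 / (1.0 + math.exp(-z)) >= threshold else 0

# Hypothetical per-pixel spectral indices: burned pixels (label 1) show
# low values on both illustrative indices, unburned pixels high values.
X = [[0.1, 0.2], [0.15, 0.1], [0.2, 0.15], [0.8, 0.7], [0.7, 0.9], [0.9, 0.8]]
y = [1, 1, 1, 0, 0, 0]
w = train_logistic(X, y)
```

At tile scale, the same fitted weights would be applied to every pixel, giving the fully automated, threshold-driven classification the abstract describes.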
Funding: supported by the National Natural Science Foundation of China [grant numbers 42101449, 42090012 and 61825103], the Natural Science Foundation of Hubei Province, China [grant numbers 2022CFB773 and 2020CFA001], the Key Research and Development Program of Hubei Province, China [grant number 2022BAA048], the Chutian Scholar Program of Hubei Province, and the Yellow Crane Talent Scheme.
Abstract: Road traffic crashes are a thorny issue faced worldwide. Traffic crashes are spatiotemporal events, and research on the spatiotemporal patterns and variation trends of traffic crashes has been carried out. However, the impact of the built environment on spatiotemporal trends in traffic crashes has not received much attention. Moreover, the spatial non-stationarity between the variation trends of traffic crashes and their influencing factors is usually neglected. To make up for the lack of analysis of built environment factors influencing spatiotemporal hotspot trends in traffic crashes, this paper proposes a method, "ST-GWLR", for analyzing the influence of built environment factors on spatiotemporal hotspot trends of traffic crashes by combining spatiotemporal hotspot trend analysis and Geographically Weighted Logistic Regression (GWLR) modeling. First, the traffic crash spatiotemporal hotspot trends were explored using the space-time cube model, hotspot analysis, and the Mann-Kendall trend test. Then, GWLR was introduced to capture the spatial non-stationarity neglected by the classic Global Logistic Regression (GLR) model, improving the accuracy of the model estimation. The GWLR model is used for the first time to analyze the significant local correlations between traffic crash spatiotemporal hotspot trends and built environment factors, so as to accurately and effectively identify the built environment factors that significantly influence hotspot trends in traffic crashes. The performance of the GWLR and GLR models was examined and compared. The results showed that the proposed ST-GWLR, which captured spatial non-stationarity, performed better than the classic GLR combined with spatiotemporal analysis, improving the prediction accuracy of the models by 14.9%, 13.9%, and 15.1%, respectively. There were significant local correlations between intensifying hotspots and persistent hotspots of traffic crashes and built environment factors. The findings of this paper have positive implications for traffic safety management and urban built environment planning.
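GWLR fits a separate logistic regression at each location, down-weighting distant observations so the coefficients can vary over space. The distance-decay kernel, the piece that captures spatial non-stationarity, might look like the Gaussian kernel below (the fixed bandwidth is an illustrative assumption; adaptive bandwidths are also common in geographically weighted models):

```python
import math

def gaussian_weights(points, target, bandwidth):
    """Geographic weights for a local regression at `target`: observations
    near the target get weight close to 1, distant ones close to 0."""
    ws = []
    for (x, y) in points:
        d = math.hypot(x - target[0], y - target[1])
        ws.append(math.exp(-(d / bandwidth) ** 2))
    return ws

# Hypothetical crash-analysis units (km coordinates) weighted for a
# local fit centred on the first unit.
pts = [(0.0, 0.0), (1.0, 0.0), (5.0, 5.0)]
w = gaussian_weights(pts, target=(0.0, 0.0), bandwidth=2.0)
```

These weights would multiply each observation's contribution to the logistic log-likelihood at the target location, yielding location-specific coefficients for the built environment factors.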
Abstract: [Objectives] To analyze the influencing factors of fixation defects in patients with catheter fixation in clinical nursing work, in order to provide the best catheter fixation nursing plan for patients. [Methods] 176 inpatients with indwelling catheters from the surgical departments of Taihe Hospital in Shiyan City from August 2022 to March 2023 were selected. Using a retrospective analysis method, the influencing factors of catheter fixation defects in the study subjects were divided into two categories based on objective characteristics: type I, non-modifiable influencing factors, and type II, modifiable influencing factors. Using the standard for catheter fixation defects, whether each patient had a catheter fixation defect was determined. After item-by-item classification and statistical analysis, binary multiple logistic regression analysis was used to identify the influencing factors. [Results] The occurrence of catheter fixation defects was related to factors such as whether the patient was evaluated before fixation, whether the fixation method was standardized and systematic, whether there was sufficient communication between nurses and patients, and the patient's knowledge of catheter fixation. It was also influenced by factors such as the patient's age, catheterization site, number of catheters, catheterization duration, whether there was a consciousness disorder, educational level, and external environmental temperature. [Conclusions] Early attention to the key factors affecting catheter fixation defects can effectively prevent adverse factors and provide patients with the best catheter fixation nursing plan to improve nursing quality.
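Binary logistic regression reports its coefficients on the log-odds scale; exponentiating a coefficient gives the odds ratio for that influencing factor. A small helper for reading such output (the coefficient and confidence bounds below are hypothetical, not values from this study):

```python
import math

def odds_ratio(beta, ci_low=None, ci_high=None):
    """Convert a logistic-regression coefficient (log-odds) to an odds
    ratio; optionally transform its confidence bounds the same way."""
    or_ = math.exp(beta)
    if ci_low is None:
        return or_
    return or_, math.exp(ci_low), math.exp(ci_high)

# Hypothetical coefficient for "no assessment before fixation": it would
# multiply the odds of a fixation defect by about exp(0.9), roughly 2.46.
r, lo, hi = odds_ratio(0.9, ci_low=0.4, ci_high=1.4)
```

An odds ratio above 1 with a confidence interval excluding 1 is what marks a factor as a significant risk in analyses like the one described.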
Abstract: In this paper, a logistic regression (LR) statistical analysis is presented for a set of variables used in experimental measurements in reversed field pinch (RFP) machines of the phenomenon commonly known as the "slinky mode" (SM), observed to travel around the torus in the Madison Symmetric Torus (MST). The LR analysis is used with the modified Sine-Gordon dynamic equation model to predict with high confidence whether the slinky mode will lock or not, compared against the experimentally measured motion of the slinky mode. It is observed that under certain conditions the slinky mode "locks" at or near the intersection of poloidal and/or toroidal gaps in MST. A locked mode ceases to travel around the torus, while an unlocked mode keeps traveling without a change in energy, making it hard to determine an exact set of conditions for predicting locking/unlocking behaviour. The significant key model parameters determined by the LR analysis are shown to improve the Sine-Gordon model's ability to determine the locking/unlocking of magnetohydrodynamic (MHD) modes. The LR analysis of measured variables provides high confidence in anticipating locking versus unlocking of the slinky mode, as demonstrated by comparisons between simulations and the experimentally measured motion of the slinky mode in MST.
Abstract: Objective: To construct and validate predictive models for delayed recovery from general anesthesia based on logistic regression and the random forest algorithm. Methods: 1177 patients under general anesthesia admitted to the recovery room of a tertiary Grade-A hospital in Zhejiang Province from 2021 to 2023 were selected as study subjects and randomly divided into a training group and a validation group at a ratio of 7:3. Univariate and multivariate logistic regression analyses were used to construct a predictive model for delayed recovery from general anesthesia, presented as a nomogram. The random forest algorithm was used to screen the influencing factors of delayed recovery and rank them by importance. The area under the receiver operating characteristic curve (AUC) was used to test the predictive performance of the models, and calibration curves and decision curves were used for comprehensive evaluation. Results: Delayed recovery occurred in 99 of the 1177 patients, an incidence of 8.41%. Logistic regression showed that sex, ASA classification, age, operation time, operation type, and infusion volume were independent risk factors for delayed recovery in patients under general anesthesia. The random forest ranking of variable importance for delayed recovery was operation type, age, operation time, infusion volume, ASA classification, and sex. The AUC of the logistic regression model was 0.87 (95% CI 0.83-0.91) in the training group and 0.86 (95% CI 0.81-0.91) in the validation group; the AUC of the random forest model was 0.85 (95% CI 0.49-1.00) in the training group and 0.76 (95% CI 0.26-1.00) in the validation group, indicating good discrimination, high predictive ability, and clinical value. Conclusion: Operation type, age, operation time, infusion volume, ASA classification, and sex are independent risk factors for delayed recovery in patients under general anesthesia. The predictive model constructed from them shows high discrimination and calibration, can help predict delayed recovery from general anesthesia, and can inform the formulation and implementation of clinical nursing interventions.
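The AUC values compared above can be computed without plotting an ROC curve, via the rank (Mann-Whitney) formulation: AUC is the probability that a randomly chosen positive case is scored above a randomly chosen negative one, with ties counting half. A stdlib-Python sketch on hypothetical scores (not the study's data):

```python
def auc(labels, scores):
    """AUC via the Mann-Whitney formulation: average over all
    positive/negative pairs of [score_pos > score_neg], ties count 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = 0.0
    for p in pos:
        for n in neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(pos) * len(neg))

# Hypothetical predicted risks: 1 = delayed recovery, 0 = normal recovery.
labels = [1, 1, 1, 0, 0, 0]
scores = [0.9, 0.8, 0.4, 0.5, 0.3, 0.1]
```

An AUC of 0.5 means the model ranks cases no better than chance; values in the high 0.8s, as reported for the logistic model above, indicate good discrimination.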