Fund: Supported by the National Key R&D Program of China (No. 2021YFB0301200) and the National Natural Science Foundation of China (No. 62025208).
Abstract: Large-scale Language Models (LLMs) have achieved significant breakthroughs in Natural Language Processing (NLP), driven by the pre-training and fine-tuning paradigm. While this approach allows models to specialize in specific tasks with reduced training costs, the substantial memory requirements during fine-tuning present a barrier to broader deployment. Parameter-Efficient Fine-Tuning (PEFT) techniques, such as Low-Rank Adaptation (LoRA), and parameter quantization methods have emerged as solutions to these challenges by optimizing memory usage and computational efficiency. Among these, QLoRA, which combines PEFT and quantization, has demonstrated notable success in reducing the memory footprint of fine-tuning, prompting the development of various QLoRA variants. Despite these advancements, the quantitative impact of key variables on the fine-tuning performance of quantized LLMs remains underexplored. This study presents a comprehensive analysis of these key variables, focusing on their influence across different layer types and depths within LLM architectures. Our investigation uncovers several critical findings: (1) larger layers, such as MLP layers, can maintain performance despite reductions in adapter rank, while smaller layers, like self-attention layers, are more sensitive to such changes; (2) the effectiveness of balancing factors depends more on their specific values than on layer type or depth; (3) in quantization-aware fine-tuning, larger layers can effectively utilize smaller adapters, whereas smaller layers struggle to do so. These insights suggest that layer type is a more significant determinant of fine-tuning success than layer depth when optimizing quantized LLMs. Moreover, for the same reduction in trainable parameters, shrinking the adapter of a larger layer preserves fine-tuning accuracy better than shrinking that of a smaller one. This study provides valuable guidance for more efficient fine-tuning strategies and opens avenues for further research into optimizing LLM fine-tuning in resource-constrained environments.
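The key variables named in this abstract, adapter rank and the balancing factor, can be made concrete with a minimal sketch. The snippet below is plain PyTorch, not the authors' code: it shows how rank r and factor alpha enter a LoRA update around a frozen base weight, and why an MLP-sized projection keeps far more trainable parameters than an attention-sized one at the same rank. The 4096 and 11008 dimensions are illustrative assumptions (LLaMA-7B-like), not taken from the paper.

```python
# Minimal LoRA sketch (assumption: plain PyTorch, not the paper's implementation).
# Update rule:  y = W x + (alpha / r) * B A x,  with W frozen and A, B trainable.
import torch
import torch.nn as nn


class LoRALinear(nn.Module):
    def __init__(self, in_features, out_features, r=8, alpha=16):
        super().__init__()
        # Frozen base weight, standing in for a quantized/pretrained matrix.
        self.base = nn.Linear(in_features, out_features, bias=False)
        self.base.weight.requires_grad_(False)
        # Trainable low-rank factors: A is (r x in), B is (out x r), B starts at zero.
        self.lora_A = nn.Parameter(torch.randn(r, in_features) * 0.01)
        self.lora_B = nn.Parameter(torch.zeros(out_features, r))
        self.scaling = alpha / r  # the balancing factor studied in the paper

    def forward(self, x):
        return self.base(x) + self.scaling * (x @ self.lora_A.T @ self.lora_B.T)


def trainable_params(module):
    return sum(p.numel() for p in module.parameters() if p.requires_grad)


# Hypothetical sizes: a "small" attention projection vs. a "large" MLP projection.
attn_proj = LoRALinear(4096, 4096, r=8)
mlp_proj = LoRALinear(4096, 11008, r=8)
print(trainable_params(attn_proj))  # 8 * (4096 + 4096)  = 65,536
print(trainable_params(mlp_proj))   # 8 * (4096 + 11008) = 120,832
```

At equal rank the MLP adapter already carries roughly twice the trainable parameters of the attention adapter, which is consistent with the abstract's observation that rank reductions cost less in larger layers.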
Fund: This work was supported by the National Natural Science Foundation of China (69875008) and the 863 National High Technology Project.
Abstract: Putonghua prosody exhibits a hierarchical structure shaped by its linguistic environment. Based on this, a neural network with specially weighted factors and optimized outputs is described and applied to construct the Putonghua prosodic model in a Text-to-Speech (TTS) system. Extensive tests show that this network structure characterizes Putonghua prosody more accurately than traditional models; learning is faster and computational precision is improved, making the whole prosodic model more efficient. Furthermore, the paper stylizes Putonghua syllable pitch contours with SPiS parameters (Syllable Pitch Stylized Parameters) and analyzes their role in adjusting syllable pitch. The results show that the SPiS parameters effectively characterize Putonghua syllable pitch contours and facilitate both the construction of the network model and prosodic control.
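The abstract does not spell out how SPiS parameters are computed, so the sketch below only illustrates the general idea of pitch-contour stylization it refers to: a syllable's F0 samples are compressed into a handful of coefficients from which a smooth contour can be regenerated. The polynomial form, the log-F0 domain, and the falling-tone example values are assumptions for illustration, not the paper's method.

```python
# Hypothetical stand-in for syllable pitch stylization (not the paper's SPiS definition):
# fit a low-order polynomial over normalized syllable time to the log-F0 samples.
import numpy as np


def stylize_pitch(f0_hz, order=3):
    """Compress a syllable's F0 samples (Hz) into a small coefficient vector."""
    t = np.linspace(0.0, 1.0, num=len(f0_hz))
    return np.polyfit(t, np.log(f0_hz), deg=order)  # log-F0 is closer to perceptual pitch


def reconstruct_pitch(params, num_points=20):
    """Rebuild a smooth contour from the stylized parameters."""
    t = np.linspace(0.0, 1.0, num=num_points)
    return np.exp(np.polyval(params, t))


# Example: a noisy falling contour, roughly 220 Hz down to 150 Hz.
f0 = np.linspace(220.0, 150.0, num=15) + np.random.normal(0.0, 2.0, size=15)
params = stylize_pitch(f0)
contour = reconstruct_pitch(params)
print(params.shape, contour[:3])
```

With four coefficients per syllable, such a representation is compact enough to serve as a prediction target for a prosodic network while still reproducing the overall contour shape.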
Abstract: Hypothyroidism is not uncommon in dogs, but it is most often diagnosed in elderly animals. When and how does the disease start? What are the first recognizable signs? The first symptoms are usually changes in behavior. At first these changes are quite subtle, but as the illness progresses they can become very grave. We often hear from worried owners that their report of a behavioral change to their vet is ignored, not taken seriously, or simply attributed to unsteady or insufficient dog training. This dismissal of the first signs is very concerning and a problem in many ways: it delays the correct diagnosis and treatment, which prolongs the suffering of the animal and the owner, and in some cases it leads to the dog being given up as an unbearable danger to the family. The dog, who is only ill and could return to normal with the right medical treatment, then ends up in a shelter or a new family. The common understanding is that hypothyroidism is an illness occurring solely in elderly dogs. In contrast, the authors found that thyroid problems already occur at relatively young ages. This is an important finding, considering that many practising veterinarians expect hypothyroidism only in aged or elderly dogs and will not run any diagnostics on relatively young or middle-aged animals. The authors also found significant differences in the personality traits of emotional stability and extraversion. We would therefore like to expand the existing studies so that this widely underestimated topic finally comes to the fore and, in the future, the right diagnostic steps can be taken at an early stage of the disease.