English reading proficiency is essential for university students in a globalized academic environment, yet many L2 learners encounter challenges, such as limited vocabulary, complex syntax, and unclear text organization, leading to cognitive overload. Grounded in Cognitive Load Theory (CLT), this study examines the role of the Chunking Reading Processing Strategy, which integrates fragmented linguistic information into meaningful units at the lexical, syntactic, and discourse levels, in alleviating cognitive load and improving reading comprehension. Through a mixed-methods approach, the research investigates how learners at different proficiency levels perceive and apply the chunking strategy, and how such application relates to cognitive load management. The results indicate that higher-proficiency learners employ chunking more frequently and report greater benefits, whereas lower-proficiency learners depend more on instructional support. The study confirms the theoretical and pedagogical value of chunk-based reading instruction and suggests that differentiated, cognitively informed teaching of the chunking strategy can enhance both reading efficiency and strategic awareness among L2 learners.
The batch splitting scheduling problem has recently become a major target in manufacturing systems, and researchers have made notable progress; however, most existing studies focus on equal-sized and consistent-sized batch splitting scheduling problems, solving them by fixing the number of sub-batches, the sub-batch sizes, or both. Under such circumstances, and to provide a practical method for production scheduling in batch production mode, a study was made of the batch splitting scheduling problem on alternative machines, with the objective of minimizing the makespan. A scheduling approach was presented to address the variable-sized batch splitting scheduling problem in job shops, optimizing both the number of sub-batches and the sub-batch sizes, based on differential evolution (DE) and making full use of the finding that the sum of the values of the genes in one chromosome remains the same before and after mutation in DE. Considering before-arrival set-up time and processing time separately, a variable-sized batch splitting scheduling model was established, and a new hybrid algorithm was proposed to solve both the batch splitting problem and the batch scheduling problem. A new parallel chromosome representation was adopted, and the batch scheduling chromosome and the batch splitting chromosome were treated separately during the global search procedure, based on self-adaptive DE and a genetic crossover operator, respectively. A new local search method was further designed to gain better performance. A solution consists of the optimum number of sub-batches for each operation per job, the optimum size of each sub-batch, and the optimum sequence of sub-batches.
Computational experiments on four test instances and a realistic problem from a speaker workshop were performed to verify the effectiveness of the proposed scheduling method. The study takes advantage of DE's distinctive features, employing the algorithm as a solution approach, and thereby deepens and enriches research on batch splitting scheduling.
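The DE property the approach exploits, that mutation preserves the sum of a chromosome's genes, can be sketched in a few lines (a minimal illustration with assumed values for the batch size, scale factor, and population; none of these come from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
BATCH = 100   # total batch size to be split (assumed for illustration)
F = 0.5       # DE scale factor (assumed)

# Population: each chromosome holds four sub-batch sizes summing to BATCH.
pop = rng.dirichlet(np.ones(4), size=6) * BATCH

# Classic DE/rand/1 mutation on three distinct parents.
r1, r2, r3 = pop[0], pop[1], pop[2]
mutant = r1 + F * (r2 - r3)

# sum(mutant) = BATCH + F * (BATCH - BATCH) = BATCH, so the mutant is
# still a valid split of the batch (up to floating-point error).
print(round(mutant.sum(), 6))  # → 100.0
```

This invariance is why the sub-batch sizes can be evolved directly, without a repair step to restore the total-batch-size constraint after every mutation.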
Given a list of items and a sequence of variable-sized bins arriving one by one, it is NP-hard to pack the items into the bin list so as to minimize the total size of the bins from the earliest one to the last used. In this paper, a set of approximation algorithms is presented for cases in which the ability to preview at most k (k ≥ 2) arriving bins is given. Under the essential assumption that all bin sizes are no less than the largest item size, analytical results show that the asymptotic worst-case ratios of all k-bounded-space and offline algorithms are 2. Based on experiments applying the algorithms to instances in which item sizes and bin sizes are drawn independently from the continuous uniform distribution on the intervals [0, u] and [u, l], respectively, average-case experimental results show that, with fixed k, algorithms using the Best Fit packing (closing) rule are statistically better than those using the First Fit packing (closing) rule.
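The Best Fit versus First Fit contrast can be illustrated with a simplified greedy packer (a sketch in which bins are opened strictly in arrival order; the paper's k-bin lookahead and closing rules are not modeled, and the instance below is made up):

```python
def pack(items, bins, rule="best"):
    """Pack items into a sequence of variable-sized bins, opening bins in
    arrival order; returns the total size of the bins used (the objective)."""
    open_bins = []                 # (remaining_capacity, arrival_index)
    used = 0                       # how many bins have been opened so far
    for item in items:
        fits = [b for b in open_bins if b[0] >= item]
        if fits:
            # Best Fit: tightest remaining space; First Fit: earliest-opened bin.
            chosen = min(fits) if rule == "best" else min(fits, key=lambda b: b[1])
            open_bins.remove(chosen)
            open_bins.append((chosen[0] - item, chosen[1]))
        else:
            open_bins.append((bins[used] - item, used))  # open the next bin
            used += 1
    return sum(bins[:used])

items = [0.5, 0.6, 0.4, 0.5]
bins = [1.0, 1.0, 0.8, 1.0]
print(round(pack(items, bins, "best"), 2),   # → 2.0 (two bins suffice)
      round(pack(items, bins, "first"), 2))  # → 2.8 (a third bin is opened)
```

On this instance Best Fit keeps the looser first bin free for the final item, while First Fit fragments it early, consistent with the abstract's average-case finding that Best Fit tends to outperform First Fit.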
This paper suggests a chunk approach to solving the plateau problem among advanced English learners. The paper first discusses the extant problems and then provides a definition of the chunk approach. Based on research results in cognitive psychology, it analyses the important role that chunks play in language acquisition and production, and thus provides a cognitive foundation for implementing the chunk approach in English teaching. The paper also offers a set of classroom activities which can be easily adopted or adapted by other teachers.
Fluency in oral English has always been a goal of Chinese English learners. Language corpora offer great convenience to language research, and prefabricated chunks are a great help for learners seeking oral English fluency. With the help of computer software, chunks in SECCL are categorized. The conclusion is that, in the process of acquiring chunks, emphasis should be placed on content-related chunks, especially those related to specific topics. One effective way to gain topic-related chunks is to build a topic-related English corpus of native speakers.
This paper aims to demonstrate the pervasiveness of metaphor chunks in News English and to introduce, from the perspective of cognitive linguistics, effective ways of understanding them correctly. Considering the difficulty of making out the accurate meaning of metaphor chunks in News English, some translation strategies are also proposed, in the hope that they will benefit readers in their understanding and appreciation of News English.
Language is the most important tool for human beings to communicate with the outside world. To improve the efficiency of communication, people need to maximize the efficiency of language processing to ensure the smooth production and understanding of meaning, although this is a subtle and complex process in human communication. As Widdowson proposed in the 1980s, language knowledge is largely chunk knowledge. The process of language output is a process of retrieving prefabricated chunk knowledge and transferring it to language output. Based on previously collected data, this paper studies the explicit reproduction and implicit output of English from the perspective of prefabricated chunks, in order to play a guiding role in optimizing the output ability of EFL learners.
Based on the concepts of Lexical Chunks and Multimodal Teaching, this paper focuses on the input source of English vocabulary learning, integrating the advantages of the Lexical Approach with Multimodal Teaching. As a new exploration of English vocabulary teaching, classroom practice has shown that teachers should make full and reasonable use of various teaching means and resources to achieve multimodal teaching of lexical chunks, which helps students learn vocabulary quickly and effectively and improves their English language competence and performance.
Short-term memory allows individuals to recall stimuli, such as numbers or words, for several seconds to several minutes without rehearsal. Although the capacity of short-term memory is considered to be 7 ± 2 items, this can be increased through a process called chunking. For example, in Japan, 11-digit cellular phone numbers and 10-digit toll-free numbers are chunked into three groups of three or four digits: 090-XXXX-XXXX and 0120-XXX-XXX, respectively. We use probability theory to predict that the most effective chunking involves groups of three or four items, as in phone numbers. However, a 16-digit credit card number exceeds the capacity of short-term memory even when chunked into groups of four digits, such as XXXX-XXXX-XXXX-XXXX. Based on these data, 16-digit credit card numbers should be sufficient for security purposes.
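The phone-number groupings described above are easy to reproduce with a small helper (a hypothetical `chunk` function written for this illustration, not taken from the study):

```python
def chunk(digits, sizes):
    """Split a digit string into the given group sizes, e.g. 3-4-4."""
    groups, i = [], 0
    for size in sizes:
        groups.append(digits[i:i + size])
        i += size
    assert i == len(digits), "group sizes must cover all digits"
    return "-".join(groups)

print(chunk("09012345678", (3, 4, 4)))  # → 090-1234-5678 (3 chunks to hold)
print(chunk("0120123456", (4, 3, 3)))   # → 0120-123-456  (3 chunks to hold)
# A 16-digit card number still leaves 4 four-digit chunks to hold:
print(chunk("1234567890123456", (4, 4, 4, 4)))  # → 1234-5678-9012-3456
```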
Currently, large amounts of information exist on Web sites and in various digital media. Most of it is in natural language: easy to browse, but difficult for computers to understand. Chunk parsing and entity relation extraction are important for understanding semantics in natural language processing. Chunk analysis is a shallow parsing method, and entity relation extraction is used to establish relationships between entities. Because full syntactic parsing of Chinese text is complex, many researchers are more interested in chunk analysis and relation extraction. The conditional random field (CRF) model is a valid probabilistic model for segmenting and labeling sequence data. This paper models chunk and entity relation problems in Chinese text; by transforming them into labeling problems, CRFs can be used to realize chunk analysis and entity relation extraction.
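Once a CRF tagger has labeled a sentence with BIO chunk tags, the chunks themselves are recovered by a single pass over the labels. The sketch below shows that post-processing step on an English CoNLL-style example (the tokens, tags, and function are illustrative; the paper's CRF training on Chinese text is not shown):

```python
def extract_chunks(tokens, tags):
    """Collapse BIO labels (as emitted by a CRF sequence labeler)
    into (chunk_type, chunk_text) pairs."""
    chunks, current, current_type = [], [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):                     # a new chunk begins
            if current:
                chunks.append((current_type, " ".join(current)))
            current, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            current.append(token)                    # chunk continues
        else:                                        # "O" or inconsistent I- tag
            if current:
                chunks.append((current_type, " ".join(current)))
            current, current_type = [], None
    if current:
        chunks.append((current_type, " ".join(current)))
    return chunks

tokens = ["He", "reckons", "the", "current", "account", "deficit"]
tags = ["B-NP", "B-VP", "B-NP", "I-NP", "I-NP", "I-NP"]
print(extract_chunks(tokens, tags))
# → [('NP', 'He'), ('VP', 'reckons'), ('NP', 'the current account deficit')]
```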
Network processing in the current Internet operates on the data packet in its entirety, which is problematic when network congestion is encountered. The newly proposed Internet service named Qualitative Communication changes the network processing paradigm to an even finer granularity, namely the chunk level, which renders obsolete many existing networking policies and schemes, especially the caching algorithms and cache replacement policies that have been extensively explored in Web caching, Content Delivery Networks (CDN), and Information-Centric Networks (ICN). This paper outlines the new factors introduced by random linear network coding-based Qualitative Communication and demonstrates the importance and necessity of considering them. A novel metric is proposed that takes these new factors into account. An optimization problem is formulated to maximize the metric value of all retained chunks in the local storage of network nodes under a storage limit, and a cache replacement scheme that obtains the optimal result in a recursive manner is proposed accordingly. With the help of the introduced intelligent cache replacement algorithm, performance evaluations show remarkably reduced end-to-end latency compared to existing schemes in various network scenarios.
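With integral chunk sizes, the retention problem described above (maximize the total metric value of retained chunks under a storage limit) has the shape of a 0/1 knapsack and can be solved by dynamic programming. The sketch below is one such formulation under that assumption; it is not the paper's recursive scheme, and the chunk sizes and metric values are made up:

```python
def best_retention(chunks, capacity):
    """0/1-knapsack DP: pick chunks to retain so that the total metric
    value is maximal without exceeding the storage limit.
    `chunks` is a list of (size, value) pairs with integer sizes."""
    best = [0] * (capacity + 1)
    for size, value in chunks:
        # Iterate capacities downward so each chunk is used at most once.
        for cap in range(capacity, size - 1, -1):
            best[cap] = max(best[cap], best[cap - size] + value)
    return best[capacity]

# (size, metric value) for each cached chunk; storage limit of 8 units.
print(best_retention([(4, 10), (3, 7), (2, 6), (5, 12)], 8))  # → 19
```

Here the optimum retains the size-3 and size-5 chunks (value 7 + 12 = 19), exactly filling the 8-unit budget.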
This letter presents a new chunking method based on a Maximum Entropy (ME) model with an N-fold template correction model. First, two types of machine learning models are described. Based on the analysis of the two models, a chunking model that combines the benefits of a conditional probability model and a rule-based model is proposed. The selection of features and rule templates in the chunking model is discussed. Experimental results on the CoNLL-2000 corpus show that this approach achieves an impressive F-score of 92.93%. Compared with the ME model and the ME Markov model, the new chunking model achieves better performance.
Lexical chunks minimize language learners' burden of memorization and play a very important role in saving language processing effort, thereby improving learners' fluency, appropriacy, and idiomaticity. Lexical chunks are taken as "scaffolding" in college English teaching to effectively enhance learners' language proficiency.
Funding: funded by the 2024 Shanghai Social Science Planning Annual Project "A Study on the Chunk Processing Mechanisms and Cognitive Motivations of ESL Reading" (Fund No. 2024BYY012).
Funding: supported by the National Hi-tech Research and Development Program of China (863 Program, Grant No. 2007AA04Z155), the National Natural Science Foundation of China (Grant No. 60970021), and the Zhejiang Provincial Natural Science Foundation of China (Grant No. Y1090592).
Funding: supported by the National Natural Science Foundation of China (Grant No. 60504021).