Journal Articles — 1 article found
An overview of large AI models and their applications (Cited by: 3)
Authors: Xiaoguang Tu, Zhi He, Yi Huang, Zhi-Hao Zhang, Ming Yang, Jian Zhao. Visual Intelligence, 2024, Issue 1, pp. 419-440 (22 pages).
In recent years, large-scale artificial intelligence (AI) models have become a focal point in technology, attracting widespread attention and acclaim. Notable examples include Google's BERT and OpenAI's GPT, which have scaled their parameter sizes to hundreds of billions or even tens of trillions. This growth has been accompanied by a significant increase in the amount of training data, markedly improving the capabilities and performance of these models. Unlike previous reviews, this paper provides a comprehensive discussion of the algorithmic principles of large-scale AI models and their industrial applications from multiple perspectives. We first outline the evolutionary history of these models, highlighting milestone algorithms while exploring their underlying principles and core technologies. We then evaluate the challenges and limitations of large-scale AI models, including computational resource requirements, model parameter inflation, data privacy concerns, and specific issues related to multi-modal AI models, such as reliance on text-image pairs, inconsistencies in understanding and generation capabilities, and the lack of true "multi-modality". Various industrial applications of these models are also presented. Finally, we discuss future trends, predicting further expansion of model scale and the development of cross-modal fusion. This study provides valuable insights to inform and inspire future research and practice.
Keywords: Artificial intelligence; Large AI models; Large language models; GPT