jiangfeibo/ComLAM
A Comprehensive Survey of Large AI Models for Future Communications: Foundations, Applications and Challenges

Authors

Feibo Jiang, Cunhua Pan, Li Dong, Kezhi Wang, Merouane Debbah, Dusit Niyato, Zhu Han

Paper

Code

Abstract

6G wireless communications aim to establish an intelligent world of ubiquitous connectivity, providing an unprecedented communication experience. Large artificial intelligence models (LAMs) are characterized by significantly larger scales (e.g., billions or trillions of parameters) compared to typical artificial intelligence (AI) models. LAMs exhibit outstanding cognitive abilities, including strong generalization for fine-tuning to downstream tasks and emergent capabilities to handle tasks unseen during training. LAMs can therefore efficiently provide AI services for diverse communication applications, making them crucial tools for addressing complex challenges in future wireless communication systems. This study provides a comprehensive review of the foundations, applications, and challenges of LAMs in communication. First, we introduce the current state of AI-based communication systems, emphasizing the motivation behind integrating LAMs into communications and summarizing the key contributions. We then present an overview of the essential concepts of LAMs in communication, including the main architectures of LAMs, such as the transformer, diffusion models, and Mamba. We also explore the classification of LAMs, including large language models (LLMs), large vision models (LVMs), large multimodal models (LMMs), and world models, and examine their potential applications in communication. Additionally, we cover the training methods and evaluation techniques for LAMs in communication systems, and introduce optimization strategies such as chain of thought (CoT), retrieval-augmented generation (RAG), and agentic systems. Following this, we discuss the research advancements of LAMs across various communication scenarios, including physical layer design, resource allocation and optimization, network design and management, edge intelligence, semantic communication, agentic systems, and emerging applications. Finally, we analyze the challenges in the current research and provide insights into potential future research directions.
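Among the optimization strategies mentioned above, retrieval-augmented generation (RAG) is perhaps the easiest to illustrate: retrieve the most relevant documents for a query, then prepend them as context for the model. The sketch below is a minimal, illustrative toy using a bag-of-words retriever with cosine similarity; all names and the two-document corpus are hypothetical. A real telecom RAG system would use a learned embedding model, a vector index, and an LLM for the generation step.

```python
# Toy RAG sketch: bag-of-words retrieval + prompt augmentation.
# Everything here is illustrative, not a real system's API.
from collections import Counter
from math import sqrt


def embed(text: str) -> Counter:
    """Toy 'embedding': lower-cased bag-of-words counts."""
    return Counter(text.lower().split())


def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0


def retrieve(query: str, corpus: list[str], k: int = 1) -> list[str]:
    """Return the k corpus documents most similar to the query."""
    q = embed(query)
    ranked = sorted(corpus, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]


def build_prompt(query: str, corpus: list[str]) -> str:
    """Augment the query with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"


# Hypothetical two-document telecom corpus.
corpus = [
    "5G NR uses OFDM with flexible numerology in the physical layer.",
    "Network slicing allocates virtual resources per service class.",
]
print(build_prompt("Which waveform does 5G NR use?", corpus))
```

The same retrieve-then-prompt pattern underlies the telecom-specific RAG work collected in the tables below; only the retriever and generator are swapped for learned components.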

Contents

Fig. 1: The development history of LAMs.

Fig. 2: The role of LAMs in AI.

Fig. 3: Overall organization of the survey.

Fig. 4: Applications of LAMs in Communication. LAMs can be applied across various domains in communication, including physical layer design, resource allocation and optimization, network design and management, edge intelligence, semantic communication, agentic systems, and emerging applications.

Communication datasets for LAMs

| Category | Dataset | Release Time | Links |
| --- | --- | --- | --- |
| General datasets | Common Crawl | 2020 | Code |
| | Pile | 2023 | Code |
| | Dolma | 2024 | Code |
| | RedPajama-Data | 2024 | Code |
| Communication content filtering | Common Crawl | 2024 | Code |
| | RedPajama | 2024 | Code |
| Communication pre-training datasets | TSpec-LLM | 2023 | Paper, Code |
| | OpenTelecom | 2024 | Paper, Code |
| | CommData-PT | 2025 | Paper |
| | TeleQnA | 2024 | Paper, Code |
| | Tele-Data | 2024 | Paper, Code |
| Communication fine-tuning datasets | TelecomInstruct | 2024 | Paper |
| | CSI dataset compliant with 3GPP standards | 2024 | Paper |
| | CommData-FT | 2025 | Paper |
| Communication alignment datasets | TelecomAlign | 2024 | Paper |
| | Multi-server multi-user offloading problem dataset | 2024 | Code |

Classification of LAMs

| LAM Category | Series | Model | Release Time | Links |
| --- | --- | --- | --- | --- |
| Large Language Model | GPT series | GPT-1 | 2018 | Paper, Code |
| | | GPT-2 | 2019 | Paper, Code |
| | | GPT-3 | 2020 | Paper |
| | | GPT-4 | 2023 | Paper |
| | | OpenAI o1 | 2024 | Paper, Code |
| | Gemma series | Gemma 1 | 2024 | Paper |
| | | Gemma 2 | 2024 | Paper |
| | LLaMA series | LLaMA-1 | 2023 | Paper, Code |
| | | LLaMA-2 | 2023 | Paper, Code |
| | | LLaMA-3 | 2024 | Paper, Code |
| Large Vision Model | SAM series | SAM-1 | 2023 | Paper, Code |
| | | SAM-2 | 2024 | Paper, Code |
| | DINO series | DINO V1 | 2021 | Paper, Code |
| | | DINO V2 | 2023 | Paper, Code |
| | Stable Diffusion series | Stable Diffusion V1 | 2022 | Paper, Code |
| | | Stable Diffusion V2 | 2022 | Paper, Code |
| | | Stable Diffusion V3 | 2024 | Paper |
| Vision Language Model | LLaVA | LLaVA | 2024 | Paper, Code |
| | Qwen-VL | Qwen-VL | 2023 | Paper, Code |
| | | Qwen-VL-Chat | 2023 | Paper, Code |
| | MiniGPT-4 | MiniGPT-4 | 2023 | Paper, Code |
| Large Multimodal Model | CoDi series | CoDi-1 | 2024 | Paper, Code |
| | | CoDi-2 | 2024 | Paper, Code |
| | Meta-Transformer | Meta-Transformer | 2023 | Paper, Code |
| | ImageBind | ImageBind | 2023 | Paper, Code |
| World Model | Sora | Sora | 2024 | Paper |
| | JEPA | JEPA | 2022 | Paper |
| | Vista | Vista | 2024 | Paper, Code |
| Lightweight Large AI Model | TinyLlama | TinyLlama | 2024 | Paper, Code |
| | MobileVLM | MobileVLM | 2024 | Paper, Code |
| | Mini-Gemini | Mini-Gemini | 2024 | Paper, Code |
| Large Reasoning Model | OpenAI o3-mini | OpenAI o3-mini | 2025 | Paper |
| | DeepSeek | DeepSeek-R1 | 2025 | Paper, Code |

Paper with code

| Category | Title | Links |
| --- | --- | --- |
| Variational autoencoder | Joint coding-modulation for digital semantic communications via variational autoencoder | Paper, Code |
| Diffusion models | Beyond deep reinforcement learning: A tutorial on generative diffusion models in network optimization | Paper, Code |
| Large language model | Large language model enhanced multi-agent systems for 6G communications | Paper, Code |
| Large vision model | Large AI model-based semantic communications | Paper, Code |
| In-context learning | In-context learning for MIMO equalization using transformer-based sequence models | Paper, Code |
| Retrieval-augmented generation | Telco-RAG: Navigating the challenges of retrieval-augmented language models for telecommunications | Paper, Code |
| Multi-agent system | Large language model enhanced multi-agent systems for 6G communications | Paper, Code |
| LLM-assisted physical layer design | LLM4CP: Adapting large language models for channel prediction | Paper, Code |
| LLM-assisted physical layer design | Generative AI agent for next-generation MIMO design: Fundamentals, challenges, and vision | Paper, Code |
| GAI model-assisted physical layer design | MIMO channel estimation using score-based generative models | Paper, Code |
| Computing resource allocation | Diffusion-based reinforcement learning for edge-enabled AI-generated content services | Paper, Code |
| Edge training and application of LAMs | Edge-LLM: Enabling efficient large language model adaptation on edge devices via layerwise unified compression and adaptive layer tuning and voting | Paper, Code |
| Edge training and application of LAMs | Federated fine-tuning of billion-sized language models across mobile devices | Paper, Code |
| Federated fine-tuning for LAMs | FwdLLM: Efficient FedLLM using forward gradient | Paper, Code |
| Agent systems based on LLMs | Large language model enhanced multi-agent systems for 6G communications | Paper, Code |
| Agent systems based on LLMs | WirelessAgent: Large language model agents for intelligent wireless networks | Paper, Code |
| Agent systems based on LLMs | Generative AI agent for next-generation MIMO design: Fundamentals, challenges, and vision | Paper, Code |
| LAMs for digital twin | Towards autonomous system: Flexible modular production system enhanced with large language model agents | Paper, Code |
| Smart healthcare | Conversational health agents: A personalized LLM-powered agent framework | Paper, Code |
| Carbon emissions | Generative AI for low-carbon artificial intelligence of things | Paper, Code |

The Team

Here is the list of our student contributors in each section.

| Section | Student Contributors |
| --- | --- |
| The whole paper | Zhengyu Du, Yuhan Zhang |
| Literature Search | Jian Zou, Dandan Qi |
| Project Maintenance | Xitao Pan |

Contact Information for Source Code Submission or Update

If you intend to add or update the source code in the repository, please contact the following email addresses: jiangfb@hunnu.edu.cn, Dlj2017@hunnu.edu.cn, 240620854087@stu.hutb.edu.cn and 240620854065@stu.hutb.edu.cn.

Update Log

| Version | Time | Update Content |
| --- | --- | --- |
| v1 | 2024/12/09 | The initial version. |
| v2 | 2024/12/18 | Improved the writing; corrected some minor errors. |
| v3 | 2025/05/07 | Improved the writing; corrected some minor errors. |

Citation

```bibtex
@article{2025arXiv250503556J,
  title   = {A Comprehensive Survey of Large AI Models for Future Communications: Foundations, Applications and Challenges},
  author  = {Feibo Jiang and Cunhua Pan and Li Dong and Kezhi Wang and Merouane Debbah and Dusit Niyato and Zhu Han},
  journal = {arXiv preprint arXiv:2505.03556},
  year    = {2025}
}
```


About

The code repository for the paper "A Comprehensive Survey of Large AI Models for Future Communications: Foundations, Applications and Challenges"
