Use Case of Computing-Aware AI large model

Document type: Expired Internet-Draft (individual), expired & archived
Author: Qing An
Last updated: 2024-02-12 (latest revision 2023-08-11)
RFC stream: (None)
Intended RFC status: (None)
Stream state: (No stream defined)
Consensus boilerplate: Unknown
RFC Editor note: (None)
IESG state: Expired
Telechat date: (None)
Responsible AD: (None)
Send notices to: (None)

This Internet-Draft is no longer active. A copy of the expired Internet-Draft remains available in archived formats.


AI models, especially AI large models, have developed rapidly and are widely deployed to serve the needs of users and multiple industries. Because AI large models involve mega-scale data and parameters and consume substantial computing and network resources, distributed computing becomes a natural choice for deploying them. This document describes the key concepts and deployment scenarios of AI large models, to demonstrate the necessity of considering both computing and network resources to meet the requirements of AI tasks.


Qing An

(Note: The e-mail addresses provided for the authors of this Internet-Draft may no longer be valid.)