"Network and Power Requirements for AI Data Centers: A Ten-Year Market Forecast and Technology Forecast" is an up-to-the-minute market study forecasting business opportunities flowing from the new breed of AI data centers.
- Report embraces a realistic take on AI: The report begins with a careful assessment of the data center requirements of today's AI products (LLMs and Virtual Assistants) and how these requirements will evolve as, for example, video AI makes its impact. New revenues for the IT, Networking and Energy sectors will be vast, but also threatened by AI hype. In this report, we take a realistic look at how the rise of AI inference and training will impact the data center
- How AI data centers will deal with latency issues: Chapter Two focuses on the dominant Hyperscale Data Centers and shows how AI data centers will change as AI leads to demands for lower latency. CIR claims this trend will drive the industry to AI edge networks and higher data rate networking (see the illustrative sketch after this list), as well as to growing attention to the location of data centers. This Chapter also examines who will benefit from the rise of AI data centers in the real estate industry
- Novel products for networking, servers and storage for the AI era. Chapter Three looks at the major re-think in the design, layout and equipment choices for data centers to meet the special needs of AI. Thus, the Chapter focuses on how Ethernet is being adapted to match the emerging requirements of the AI data center, taking on the form of Ultra Ethernet with the prospect of growing to 1.6T. It also covers how servers and storage boxes are being rethought for the AI era
- How optical integration and novel processors will be AI enablers. Chapter Three also examines how optical integration enables high-data-rate, low-latency data centers. The chapter accomplishes this goal in part by looking at key integration platforms such as chiplets, silicon photonics and co-packaged optics (CPO). It also takes a close look at AI processors as a business opportunity and the future role that will be played by CPUs, GPUs, FPGAs, ASICs and specialized inference and training engines in the data center
- New power and cooling sources are vital for AI. Chapter Four concludes that only nuclear can "save" AI. This Chapter discusses how AI is creating a market opportunity for Small Modular (Nuclear) Reactors, although wind power may have a small share of the AI market. Meanwhile, the report points out that effective cooling in the AI data center has many paths leading to it, although most can be characterized as liquid cooling strategies. CIR predicts that whichever approach creates the new standard for data center cooling will do well
- Report contains detailed ten-year market forecasts. The report contains ten-year projections for the number of AI data centers, AI servers and port speeds, AI servers by networking technology/protocol, data storage for AI data centers, power consumption by AI data centers, and cooling technologies in AI data centers
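To make the latency point above concrete, here is a minimal back-of-envelope sketch (our illustration, not material from the report); the 32 MB message size and the link rates are assumptions chosen purely for the example. It shows how serialization delay, one component of end-to-end latency in an AI cluster, falls as port speeds move from 100G toward the 1.6T Ethernet target.

# Illustrative only: serialization delay vs. link rate (all figures assumed)
def serialization_delay_us(message_bytes: float, link_gbps: float) -> float:
    # Time in microseconds to clock a message onto a link of the given rate
    bits = message_bytes * 8
    return bits / (link_gbps * 1e9) * 1e6

shard_bytes = 32 * 1024 * 1024  # hypothetical 32 MB gradient shard in an all-reduce
for gbps in (100, 400, 800, 1600):  # 1600 Gb/s corresponds to the 1.6T Ethernet target
    print(f"{gbps:>5} Gb/s link: {serialization_delay_us(shard_bytes, gbps):8.1f} us per shard")

Under these assumed figures, per-shard serialization time drops from roughly 2.7 ms at 100G to under 0.2 ms at 1.6T, which is one reason the report links lower latency to higher data rate networking.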
The strategic analysis provided throughout the report is illustrated with case studies from the recent history of major equipment companies and service providers. This report will be essential reading for networking vendors, service providers, AI software firms, computer companies and investors.
Table of Contents
Executive Summary: AI Data Centers and Opportunities to Come
- E.1. Summary of AI Data Center Evolution: Ten-year Market Forecasts
- E.2. Chip Development Opportunities for AI Data Center
- E.3. PICs, Interconnects, Optical Integration and AI
- E.4. Connectivity Solutions in the AI Data Center
- E.4.1. The Future of Co-Packaged Optics in the AI Data Center
- E.5. Rethinking Servers
- E.6. Storage Requirements for AI Data Centers
- E.7. Power Consumption by AI Data Centers
- E.7.1. Liquid Cooling: The Future of Cool AI Data Centers
Chapter One: The Unstoppable Rise of the AI Data Center: AI Real
- 1.1. Objective of this Report
- 1.1.1. Sources of Information
- 1.2. AI: The State of Play
- 1.2.1. How AI Data Throughput Creates Opportunities in Data Centers
- 1.3. LLMs: Future Customer Needs, Technical Needs and Opportunities
- 1.3.1. Inference Requirements
- 1.3.2. Training Requirements
- 1.3.3. LLM Infrastructure Opportunities
- 1.4. Virtual Assistants and the AI Infrastructure
- 1.5. A Growing Role for Video AI
- 1.5.1. Machine Perception (MP)
- 1.5.2. Comparing Video Services and AI: Cautionary Tales
- 1.6. Notes on Machine Learning
- 1.6.1. Neural Networks and Deep Learning
- 1.6.2. Consumer vs. Enterprise AI
- 1.6.3. Importance of Consumer AI to Traffic Growth
- 1.6.4. Impact of Enterprise AI
- 1.7. AI Software Services and AIaaS
- 1.8. What Can Possibly Go Wrong?
- 1.8.1. AI Hallucinations
- 1.8.2. AI Underperforms
- 1.8.3. A Future with Too Many Features
Chapter Two: Restructuring the Data Center for The AI Revolution: Emerging Opportunities
- 2.1. AI Data Centers Begin
- 2.1.1. The Critical Role of AI Clusters: How They Are Being Built Today
- 2.2. The Rise of "East-West" Traffic in the AI Data Center
- 2.3. How AI Drives the Need for Low Latency in Data Centers
- 2.3.1. High Data Rate Interfaces as a Solution to the AI Data Center Latency Problem
- 2.4. The Changing Geography of AI Data Centers: Location, Location, Location!
- 2.4.1. Who is Playing the AI Data Center Game in the Real Estate Industry?
- 2.4.2. Hyperscalers: Dominant Players in the AI Data Center Space
- 2.5. AI and Edge Networks
- 2.5.1. Some Notes on Edge Hardware
- 2.6. Some Notes on Data Center Interconnection
Chapter Three: Supply: AI Networking: Hardware and the Available Technologies
- 3.1. A Preamble to Data Center Hardware
- 3.2. AI, Data Centers and the Semiconductor Sector
- 3.2.1. CPUs in the AI Data Center
- 3.2.2. GPUs in the AI Data Center
- 3.2.3. Inference and Training Engines: The Hyperscaler Response
- 3.2.4. FPGAs in the AI Data Center
- 3.2.5. ASICs
- 3.3. PICs, Interconnects and Optical Integration in the AI Data Center
- 3.3.1. Silicon Photonics in the AI Data Center
- 3.3.2. Other Platforms for Interconnects in the AI Data Center
- 3.3.3. Some Notes on Chiplets and Interconnects
- 3.4. Optical Networking Infrastructure for AI Data Centers
- 3.5. Ultra Ethernet in the AI Data Center: IEEE P802.3dj
- 3.5.1. FEC and Latency
- 3.5.2. Ultra Ethernet Consortium (UEC)
- 3.6. The Future of Co-Packaged Optics in the AI Data Center
- 3.6.1. Uncertainties about when CPO will happen in the AI Data Center
- 3.7. Rethinking Servers
- 3.7.1. Scale-out Networks for AI: Horizontal Scaling
- 3.7.2. Scale-up Networks
- 3.8. Storage Requirements for AI Data Centers
- 3.9. Notes Toward High-Performance AI Data Centers
Chapter Four: Power and Cooling Requirements for AI Data Centers
- 4.1. Power and Cooling Requirements for AI Data Centers
- 4.2. Power Consumption by AI Data Centers
- 4.2.1. Conventional and "Green" Power Solutions for Data Centers
- 4.3. Nuclear Option: Nuclear Miniaturized
- 4.3.1. Current Plans for Using Nuclear Power in the AI Sector
- 4.4. Liquid Cooling: The Future of Cool AI Data Centers
- 4.4.1. Evolution of Liquid Cooling
- 4.4.2. Liquid Immersion Cooling
- 4.4.3. Microconvective Cooling
- 4.4.4. Direct Chip-chip Cooling
- 4.4.5. Microchannel Cooling
- 4.4.6. Oil Cooling
Chapter Five: Ten-year Market Forecasts
- 5.1. Preamble to the Market Forecasts
- 5.1.1. Do We Have Hard Market Data for AI?
- 5.2. How Many AI Data Centers are there?
- 5.2.1. Worldwide AI Data Centers in Operation
- 5.3. Ten-year Forecast of AI Data Center Connectivity: Servers and Port Speeds
- 5.3.1. Ten-year Forecast of AI Servers
- 5.3.2. Ten-year Forecast of AI Server Ports by Speed
- 5.3.3. Ten-year Forecast of AI Server Ports by Technology Type/Protocol
- 5.4. Ten-year Forecast of Data Storage for AI Data Centers
- 5.5. Ten-year Forecast of Cooling and Power for AI Data Centers
About the Author
Acronyms and Abbreviations Used in this Report
List of Exhibits
- Exhibit E-1: Opportunities from AI Data Centers at a Glance ($ Millions, Except Data Centers)
- Exhibit 1-1: Enterprise Applications for Virtual Assistants
- Exhibit 1-2: Uses of Enterprise AI
- Exhibit 2-1: Selected Opportunities Stemming from Rebuilding Data Centers
- Exhibit 2-2: Solutions to the Latency Problem
- Exhibit 3-1: Connectivity Technologies for AI Data Centers
- Exhibit 3-2: A CPO Future for AI?
- Exhibit 4-1: Power and Cooling Solutions for AI Data Centers
- Exhibit 5-1: Ten-Year Forecasts of the AI Market ($ Billions)
- Exhibit 5-2: Worldwide AI Data Centers in Operation
- Exhibit 5-3: Worldwide AI Server Markets
- Exhibit 5-4: Distribution of Ports Shipped by Speed ($ Millions)
- Exhibit 5-5: Distribution of Ports Shipped by Protocol ($ Millions)
- Exhibit 5-6: Forecast of Data Storage for AI Data Centers
- Exhibit 5-7: Ten-year Forecast of Power Consumption
- Exhibit 5-8: Ten-year Forecast of Cooling Technology in AI Data Centers