GPU / DGX
As an NVIDIA partner, XIILAB provides packaged HW and SW services.

Product selection

Check the GPU / DGX specifications below.

NVIDIA GB200 NVL72

Powering a new era of generative AI.

| Specification | GB200 NVL72 | GB200 Grace Blackwell Superchip |
| --- | --- | --- |
| Configuration | 36 Grace CPUs : 72 Blackwell GPUs | 1 Grace CPU : 2 Blackwell GPUs |
| FP4 Tensor Core² | 1,440 PFLOPS | 40 PFLOPS |
| FP8/FP6 Tensor Core² | 720 PFLOPS | 20 PFLOPS |
| INT8 Tensor Core² | 720 POPS | 20 POPS |
| FP16/BF16 Tensor Core² | 360 PFLOPS | 10 PFLOPS |
| TF32 Tensor Core² | 180 PFLOPS | 5 PFLOPS |
| FP32 | 6,480 TFLOPS | 180 TFLOPS |
| FP64 / FP64 Tensor Core | 3,240 TFLOPS | 90 TFLOPS |
| GPU memory | Up to 13.5TB HBM3e | Up to 384GB HBM3e |
| GPU memory bandwidth | 576TB/s | 16TB/s |
| NVLink bandwidth | 130TB/s | 3.6TB/s |
| CPU cores | 2,592 Arm® Neoverse V2 cores | 72 Arm® Neoverse V2 cores |
| CPU memory | Up to 17TB LPDDR5X | Up to 480GB LPDDR5X |
| CPU memory bandwidth | Up to 18.4TB/s | Up to 512GB/s |
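As a rough consistency check on the table above, the NVL72 column appears to be the per-superchip column scaled by the 36 Grace Blackwell Superchips given in the Configuration row. A minimal sketch:

```python
# Per-superchip figures from the table's right-hand column.
SUPERCHIPS_PER_RACK = 36  # from the Configuration row: 36 Grace CPUs : 72 GPUs

per_superchip = {
    "FP4 Tensor Core (PFLOPS)": 40,
    "FP8/FP6 Tensor Core (PFLOPS)": 20,
    "FP16/BF16 Tensor Core (PFLOPS)": 10,
    "FP32 (TFLOPS)": 180,
    "FP64 (TFLOPS)": 90,
    "CPU cores": 72,
}

# Scaling by 36 superchips reproduces the NVL72 column.
nvl72 = {spec: SUPERCHIPS_PER_RACK * value for spec, value in per_superchip.items()}
print(nvl72["FP4 Tensor Core (PFLOPS)"])  # 1440, i.e. the table's 1,440 PFLOPS
```

The same scaling reproduces every throughput row and the 2,592 CPU cores, which is a handy way to sanity-check transcribed figures.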

NVIDIA GB200 NVL2

Ushering in a new era of computing
for every data center.

Form factor: 2 Grace CPUs, 2 Blackwell GPUs
FP4 Tensor Core²: 40 PFLOPS
FP8/FP6 Tensor Core²: 20 PFLOPS
INT8 Tensor Core²: 20 POPS
FP16/BF16 Tensor Core²: 10 PFLOPS
TF32 Tensor Core²: 5 PFLOPS
FP32: 180 TFLOPS
FP64 / FP64 Tensor Core: 90 TFLOPS
GPU memory | bandwidth: Up to 384GB | 16TB/s
CPU cores: 144 Arm® Neoverse V2 cores
LPDDR5X memory | bandwidth: Up to 960GB | up to 1,024GB/s
Interconnect: NVLink: 1.8TB/s; NVLink-C2C: 2x 900GB/s; PCIe Gen 6: 2x 256GB/s
Server options: Various NVIDIA GB200 NVL2 configuration options using NVIDIA MGX

NVIDIA H200
Tensor Core GPU

Supercharging AI and HPC workloads.

| Specification | H200 SXM¹ | H200 NVL¹ |
| --- | --- | --- |
| FP64 | 34 teraFLOPS | 34 teraFLOPS |
| FP64 Tensor Core | 67 teraFLOPS | 67 teraFLOPS |
| FP32 | 67 teraFLOPS | 67 teraFLOPS |
| TF32 Tensor Core | 989 teraFLOPS² | 989 teraFLOPS² |
| BFLOAT16 Tensor Core | 1,979 teraFLOPS² | 1,979 teraFLOPS² |
| FP16 Tensor Core | 1,979 teraFLOPS² | 1,979 teraFLOPS² |
| FP8 Tensor Core | 3,958 teraFLOPS² | 3,958 teraFLOPS² |
| INT8 Tensor Core | 3,958 TOPS² | 3,958 TOPS² |
| GPU memory | 141GB | 141GB |
| GPU memory bandwidth | 4.8TB/s | 4.8TB/s |
| Decoders | 7 NVDEC, 7 JPEG | 7 NVDEC, 7 JPEG |
| Max thermal design power (TDP) | Up to 700W (configurable) | Up to 600W (configurable) |
| Multi-Instance GPUs | Up to 7 MIGs @ 18GB each | Up to 7 MIGs @ 18GB each |
| Form factor | SXM | PCIe |
| Interconnect | NVLink: 900GB/s; PCIe Gen5: 128GB/s | 2- or 4-way NVIDIA NVLink bridge: 900GB/s; PCIe Gen5: 128GB/s |
| Server options | NVIDIA HGX H200 partner and NVIDIA-Certified Systems™ with 4 or 8 GPUs | NVIDIA MGX™ H200 NVL partner and NVIDIA-Certified Systems with 8 GPUs |
| NVIDIA AI Enterprise | Add-on | Included |
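To put the H200's memory row in context, here is a quick sketch comparing it against the H100 SXM figures quoted later on this page (141GB vs. 80GB capacity, 4.8TB/s vs. 3.35TB/s bandwidth):

```python
# Memory specs for the two SXM parts, taken from the tables on this page.
h100_sxm = {"memory_gb": 80, "bandwidth_tb_s": 3.35}
h200_sxm = {"memory_gb": 141, "bandwidth_tb_s": 4.8}

capacity_gain = h200_sxm["memory_gb"] / h100_sxm["memory_gb"]
bandwidth_gain = h200_sxm["bandwidth_tb_s"] / h100_sxm["bandwidth_tb_s"]

print(f"capacity: {capacity_gain:.2f}x, bandwidth: {bandwidth_gain:.2f}x")
# capacity: 1.76x, bandwidth: 1.43x
```

Capacity grows faster than bandwidth here, which matters most for memory-bound inference workloads with large KV caches.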

NVIDIA H100
Tensor Core GPU

Unprecedented performance, scalability, and security
for every data center.

| Specification | H100 SXM | H100 PCIe | H100 NVL² |
| --- | --- | --- | --- |
| FP64 | 34 teraFLOPS | 26 teraFLOPS | 68 teraFLOPS |
| FP64 Tensor Core | 67 teraFLOPS | 51 teraFLOPS | 134 teraFLOPS |
| FP32 | 67 teraFLOPS | 51 teraFLOPS | 134 teraFLOPS |
| TF32 Tensor Core | 989 teraFLOPS² | 756 teraFLOPS² | 1,979 teraFLOPS² |
| BFLOAT16 Tensor Core | 1,979 teraFLOPS² | 1,513 teraFLOPS² | 3,958 teraFLOPS² |
| FP16 Tensor Core | 1,979 teraFLOPS² | 1,513 teraFLOPS² | 3,958 teraFLOPS² |
| FP8 Tensor Core | 3,958 teraFLOPS² | 3,026 teraFLOPS² | 7,916 teraFLOPS² |
| INT8 Tensor Core | 3,958 TOPS² | 3,026 TOPS² | 7,916 TOPS² |
| GPU memory | 80GB | 80GB | 188GB |
| GPU memory bandwidth | 3.35TB/s | 2TB/s | 7.8TB/s³ |
| Decoders | 7 NVDEC, 7 JPEG | 7 NVDEC, 7 JPEG | 14 NVDEC, 14 JPEG |
| Max thermal design power (TDP) | Up to 700W (configurable) | 300-350W (configurable) | 2x 350-400W (configurable) |
| Multi-Instance GPUs | Up to 7 MIGs @ 10GB each | Up to 7 MIGs @ 10GB each | Up to 14 MIGs @ 12GB each |
| Form factor | SXM | PCIe, dual-slot air-cooled | 2x PCIe, dual-slot air-cooled |
| Interconnect | NVLink: 900GB/s; PCIe Gen5: 128GB/s | NVLink: 600GB/s; PCIe Gen5: 128GB/s | NVLink: 600GB/s; PCIe Gen5: 128GB/s |
| Server options | NVIDIA HGX H100 partner and NVIDIA-Certified Systems™ with 4 or 8 GPUs; NVIDIA DGX H100 with 8 GPUs | Partner and NVIDIA-Certified Systems with 1-8 GPUs | Partner and NVIDIA-Certified Systems with 2-4 pairs |
| NVIDIA AI Enterprise | Add-on | Included | Included |
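One way to read the table is FP8 throughput per watt at the quoted TDP ceilings. A rough sketch, using the upper TDP bound for each variant (an assumption, since the table gives ranges for PCIe and NVL, and real efficiency depends on workload):

```python
# FP8 Tensor Core throughput (teraFLOPS, table values) and the upper
# TDP bound (W) for each H100 variant in the table.
variants = {
    "H100 SXM": (3958, 700),          # up to 700W, configurable
    "H100 PCIe": (3026, 350),         # 300-350W, upper bound
    "H100 NVL (pair)": (7916, 2 * 400),  # 2x 350-400W, upper bound
}

efficiency = {name: tflops / watts for name, (tflops, watts) in variants.items()}

for name, value in efficiency.items():
    print(f"{name}: {value:.1f} teraFLOPS/W")
```

On these nominal numbers the PCIe and NVL parts deliver more FP8 per watt than the SXM part, which trades efficiency for the highest absolute throughput and NVLink bandwidth.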

NVIDIA A16

An unprecedented VDI user experience.

GPU memory: 4x 16GB GDDR6 with error-correcting code (ECC)
GPU memory bandwidth: 4x 200GB/s
Max power consumption: 250W
Interconnect: PCI Express Gen 4 x16
Form factor: Full height, full length (FHFL), dual slot
Thermal: Passive
vGPU software support: NVIDIA Virtual PC (vPC), NVIDIA Virtual Applications (vApps), NVIDIA RTX Virtual Workstation (vWS), NVIDIA Virtual Compute Server (vCS), and NVIDIA AI Enterprise
vGPU profiles supported: See the Virtual GPU Licensing Guide and the NVIDIA AI Enterprise Licensing Guide
NVENC | NVDEC: 4x | 8x (includes AV1 decode)
Secure and measured boot with hardware root of trust: Yes (optional)
NEBS ready: Level 3
Power connector: 8-pin CPU

NVIDIA A10

AI-accelerated graphics and video
for mainstream enterprise servers.

FP32: 31.2 teraFLOPS
TF32 Tensor Core: 62.5 teraFLOPS | 125 teraFLOPS*
BFLOAT16 Tensor Core: 125 teraFLOPS | 250 teraFLOPS*
FP16 Tensor Core: 125 teraFLOPS | 250 teraFLOPS*
INT8 Tensor Core: 250 TOPS | 500 TOPS*
INT4 Tensor Core: 500 TOPS | 1,000 TOPS*
RT Cores: 72
Encode/decode: 1 encoder, 2 decoders (+ AV1 decode)
GPU memory: 24GB GDDR6
GPU memory bandwidth: 600GB/s
Interconnect: PCIe Gen4, 64GB/s
Form factor: Single-slot, full-height, full-length (FHFL)
Max thermal design power (TDP): 150W
vGPU software support: NVIDIA Virtual PC, NVIDIA Virtual Applications, NVIDIA RTX Virtual Workstation, NVIDIA Virtual Compute Server, NVIDIA AI Enterprise

NVIDIA A2

An entry-level GPU that brings NVIDIA AI to any server.

Peak FP32: 4.5 TF
TF32 Tensor Core: 9 TF | 18 TF¹
BFLOAT16 Tensor Core: 18 TF | 36 TF¹
Peak FP16 Tensor Core: 18 TF | 36 TF¹
Peak INT8 Tensor Core: 36 TOPS | 72 TOPS¹
Peak INT4 Tensor Core: 72 TOPS | 144 TOPS¹
RT Cores: 10
Media engines: 1 video encoder, 2 video decoders (includes AV1 decode)
GPU memory: 16GB GDDR6
GPU memory bandwidth: 200GB/s
Interconnect: PCIe Gen4 x8
Form factor: 1-slot, low-profile PCIe
Max thermal design power (TDP): 40-60W (configurable)
Virtual GPU (vGPU) software support²: NVIDIA Virtual PC (vPC), NVIDIA Virtual Applications (vApps), NVIDIA RTX Virtual Workstation (vWS), NVIDIA AI Enterprise, NVIDIA Virtual Compute Server (vCS)
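In the A2 rows above, each ¹-marked figure is exactly double its unmarked counterpart; on NVIDIA datasheets this doubled column is typically the structured-sparsity number (that reading is an assumption here, since the footnote text did not survive extraction). A small sketch of the relationship:

```python
# Unmarked (base) A2 Tensor Core figures from the table.
# TF32/BFLOAT16/FP16 are in teraFLOPS; INT8/INT4 are in TOPS.
base = {"TF32": 9, "BFLOAT16": 18, "FP16": 18, "INT8": 36, "INT4": 72}

# Each footnote-marked figure in the table is exactly 2x the base figure
# (on NVIDIA datasheets this usually denotes 2:4 structured sparsity).
marked = {k: 2 * v for k, v in base.items()}
print(marked)  # {'TF32': 18, 'BFLOAT16': 36, 'FP16': 36, 'INT8': 72, 'INT4': 144}
```

The same 2x pattern holds for the starred figures in the A10, L40S, and L4 tables on this page.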

NVIDIA L40S

Unmatched AI and graphics performance
for the data center.

GPU architecture: NVIDIA Ada Lovelace architecture
GPU memory: 48GB GDDR6 with ECC
Memory bandwidth: 864GB/s
Interconnect interface: PCIe Gen4 x16: 64GB/s bidirectional
NVIDIA Ada Lovelace architecture-based CUDA® Cores: 18,176
NVIDIA third-generation RT Cores: 142
NVIDIA fourth-generation Tensor Cores: 568
RT Core performance: 212 TFLOPS
FP32: 91.6 TFLOPS
TF32 Tensor Core: 183 TFLOPS | 366 TFLOPS*
BFLOAT16 Tensor Core: 362.05 TFLOPS | 733 TFLOPS*
FP16 Tensor Core: 362.05 TFLOPS | 733 TFLOPS*
FP8 Tensor Core: 733 TFLOPS | 1,466 TFLOPS*
Peak INT8 Tensor: 733 TOPS | 1,466 TOPS*
Peak INT4 Tensor: 733 TOPS | 1,466 TOPS*
Form factor: 4.4" (H) x 10.5" (L), dual slot
Display ports: 4x DisplayPort 1.4a
Max power consumption: 350W
Power connector: 16-pin

NVIDIA L4

A breakthrough universal accelerator
for video, AI, and graphics.

FP32: 30.3 teraFLOPS
TF32 Tensor Core: 120 teraFLOPS*
FP16 Tensor Core: 242 teraFLOPS*
BFLOAT16 Tensor Core: 242 teraFLOPS*
FP8 Tensor Core: 485 teraFLOPS*
INT8 Tensor Core: 485 TOPS*
GPU memory: 24GB
GPU memory bandwidth: 300GB/s
NVENC | NVDEC | JPEG decoders: 2 | 4 | 4
Max thermal design power (TDP): 72W
Form factor: 1-slot low-profile, PCIe
Interconnect: PCIe Gen4 x16, 64GB/s
Server options: Partner and NVIDIA-Certified Systems with 1-8 GPUs

NVIDIA DGX H100

Enterprise AI, proven worldwide.

GPUs: 8x NVIDIA H100 Tensor Core GPUs
GPU memory: 640GB total
Performance: 32 petaFLOPS FP8
NVIDIA® NVSwitch™: 4x
System power usage: 10.2kW max
CPU: Dual Intel® Xeon® Platinum 8480C processors, 112 cores total, 2.00 GHz (base), 3.80 GHz (max boost)
System memory: 2TB
Networking: 4x OSFP ports serving 8x single-port NVIDIA ConnectX-7 VPI, up to 400Gb/s InfiniBand/Ethernet; 2x dual-port QSFP112 NVIDIA ConnectX-7 VPI, up to 400Gb/s InfiniBand/Ethernet
Management networking: 10Gb/s onboard NIC with RJ45; 100Gb/s Ethernet NIC; host baseboard management controller (BMC) with RJ45
Storage: OS: 2x 1.92TB NVMe M.2; internal storage: 8x 3.84TB NVMe U.2
Software: NVIDIA AI Enterprise (optimized AI software); NVIDIA Base Command (orchestration, scheduling, and cluster management); DGX OS / Ubuntu / Red Hat Enterprise Linux / Rocky (operating system)
Support: Comes with 3-year business-standard hardware and software support
System weight: 287.6lbs (130.45kgs)
Packaged system weight: 376lbs (170.45kgs)
System dimensions: Height 14.0in (356mm); width 19.0in (482.2mm); length 35.3in (897.1mm)
Operating temperature range: 5-30°C (41-86°F)
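The headline DGX H100 numbers follow from the per-GPU H100 SXM figures quoted earlier on this page; a quick sketch:

```python
# Per-GPU H100 SXM figures (from the H100 table above) scaled to the
# 8-GPU DGX H100 system.
GPUS = 8
h100_sxm_fp8_tflops = 3958   # FP8 Tensor Core, with the table's footnote caveats
h100_sxm_memory_gb = 80

total_fp8_pflops = GPUS * h100_sxm_fp8_tflops / 1000
total_memory_gb = GPUS * h100_sxm_memory_gb

print(f"{total_fp8_pflops:.1f} PFLOPS FP8, {total_memory_gb}GB GPU memory")
# 31.7 PFLOPS FP8, 640GB GPU memory
```

This is consistent with the system-level "32 petaFLOPS FP8" (rounded) and "640GB total" entries above.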