Nvidia B200 Low Power Usage for Inference AI Workload
Posted 4 months ago · Active 4 months ago
lightly.ai · Tech · story
Key topics
Nvidia
AI Inference
GPU Performance
Comparison of NVIDIA B200 and H100 GPUs for AI inference workloads, highlighting B200's low power usage.
Snapshot generated from the HN discussion
Discussion Activity
Light discussion
First comment: 51m after posting
Peak period: 1 comment in 0-1h
Avg / period: 1
Key moments
- Story posted: Sep 13, 2025 at 12:02 PM EDT (4 months ago)
- First comment: Sep 13, 2025 at 12:53 PM EDT (51m after posting)
- Peak activity: 1 comment in 0-1h (the hottest window of the conversation)
- Latest activity: Sep 13, 2025 at 12:53 PM EDT (4 months ago)
Discussion (1 comment)
viyops (Author)
4 months ago
Full Cluster Power: Our entire 8xB200 node (GPUs only) drew approximately 4.8 kW under heavy training load. Including CPUs, RAM, storage, etc., the total system power draw was roughly 6.5–7 kW at the wall.
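A quick back-of-envelope check of the figures quoted in the comment: per-GPU draw, the implied non-GPU overhead, and utilization against the B200's nominal 1,000 W TDP (the TDP is a vendor spec assumed here, not stated in the thread).

```python
# Sanity-check the reported power numbers for an 8xB200 node.
gpus = 8
gpu_total_kw = 4.8           # reported GPU-only draw under heavy load
wall_kw = (6.5, 7.0)         # reported total system draw at the wall

per_gpu_w = gpu_total_kw * 1000 / gpus
print(f"per-GPU draw: {per_gpu_w:.0f} W")            # 600 W

# Assumed ~1,000 W TDP per B200 (vendor spec, not from the thread):
tdp_w = 1000
print(f"fraction of TDP: {per_gpu_w / tdp_w:.0%}")   # 60%

# Non-GPU overhead (CPUs, RAM, storage, fans, PSU losses):
low, high = (w - gpu_total_kw for w in wall_kw)
print(f"non-GPU overhead: {low:.1f}-{high:.1f} kW")  # 1.7-2.2 kW
```

At roughly 60% of TDP under heavy load, the numbers are consistent with the "low power usage" framing in the title.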
View full discussion on Hacker News
ID: 45233146 · Type: story · Last synced: 11/17/2025, 2:02:35 PM