
Building a16z’s Personal AI Workstation with four NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs

Marco Mascorro Posted August 22, 2025


a16z Personal AI Workstation with 4x NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs

In the era of foundation models, multimodal AI, LLMs, and ever-larger datasets, access to raw compute is still one of the biggest bottlenecks for researchers, founders, developers, and engineers. While the cloud offers scalability, building a personal AI workstation delivers complete control over your environment, reduced latency, custom configurations and setups, and the privacy of running all workloads locally.

This post covers our version of a four-GPU workstation powered by the new NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs. This build pushes the limits of desktop AI computing with 384GB of VRAM (96GB per GPU), all in a chassis that fits under your desk.
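For a rough sense of scale, here is a back-of-envelope sketch (our assumptions, not a benchmark) of how many parameters fit in the combined 384GB of VRAM at common precisions, reserving an assumed 20% of memory for KV cache, activations, and framework overhead:

```python
# Back-of-envelope parameter budget for 4 x 96GB of VRAM.
# The 20% overhead reserve is an assumed figure, not a measurement.

TOTAL_VRAM_GB = 4 * 96  # four RTX 6000 Pro Blackwell Max-Q cards, 96GB each

def max_params_billions(total_vram_gb: float, bytes_per_param: float,
                        overhead_fraction: float = 0.2) -> float:
    """Rough parameter count that fits after reserving overhead_fraction."""
    usable_bytes = total_vram_gb * 1e9 * (1 - overhead_fraction)
    return usable_bytes / bytes_per_param / 1e9

for label, bytes_per_param in [("FP16/BF16", 2), ("FP8", 1)]:
    print(f"{label}: ~{max_params_billions(TOTAL_VRAM_GB, bytes_per_param):.0f}B params")
```

Under these assumptions, the box comfortably holds a model in the low-hundreds-of-billions of parameters at FP16, before any quantization.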


Why Build This Workstation?

Training, fine-tuning, and running inference on modern AI models require massive VRAM bandwidth, high CPU throughput, and ultra-fast storage. Running these workloads in the cloud can introduce latency, setup overhead, slower data transfer speeds, and privacy tradeoffs.

By building a workstation around enterprise-grade GPUs with full PCIe 5.0 x16 connectivity, we get:

  • Maximum GPU-to-CPU bandwidth: No bottlenecks from PCIe switches or shared lanes.
  • Enterprise-class VRAM: Each RTX 6000 Pro Blackwell Max-Q provides 96GB of VRAM, enabling dense training runs and large model inference without quantization. Each card consumes only 300W of power at peak (Max-Q version).
  • 8TB of NVMe 5.0 storage: four 2TB PCIe 5.0 x4 NVMe modules.
  • 256 GB of total ECC DDR5 RAM.
  • Surprising efficiency: Despite its scale, the workstation pulls 1650W at peak, low enough to run on a standard 15-amp / 120V household circuit.
  • Next-gen GDS data streaming: While we are still testing this support, the setup should be compatible with NVIDIA GPUDirect Storage (GDS), which streams datasets or models directly from the PCIe 5.0 NVMe SSDs into GPU VRAM, bypassing CPU memory to reduce latency and maximize throughput.
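The household-circuit claim checks out on nominal figures; here is the quick arithmetic (nominal circuit capacity, stated peak draw):

```python
# Sanity check on running the workstation from a 15A / 120V circuit,
# using the post's stated figures.

PEAK_SYSTEM_W = 1650     # stated peak draw of the full workstation
GPU_W = 4 * 300          # four Max-Q cards at 300W each
CIRCUIT_W = 15 * 120     # nominal capacity of a 15A / 120V circuit

headroom_w = CIRCUIT_W - PEAK_SYSTEM_W
print(f"GPUs alone: {GPU_W}W")
print(f"Peak system draw: {PEAK_SYSTEM_W}W vs {CIRCUIT_W}W circuit "
      f"({headroom_w}W headroom at peak)")
```

Note that sustained loads are commonly derated to 80% of circuit capacity (1440W on this circuit), which is one reason the spec below calls for a dedicated outlet.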

We plan to test and build a limited number of these custom a16z Founders Edition AI Workstations.

Detail of a16z Personal AI Workstation with 4x NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs

Core Specifications

Let’s break down the hardware:

  • GPUs:
    • 4 × NVIDIA RTX 6000 Pro Blackwell Max-Q
    • 96GB VRAM per GPU (384GB total VRAM)
    • Each card on a dedicated PCIe 5.0 x16 lane
    • 300W per GPU
  • CPU:
    • AMD Ryzen Threadripper PRO 7975WX (liquid cooled with Silverstone XE360-TR5)
    • 32 cores / 64 threads
    • Base clock: 4.0 GHz, Boost up to 5.3 GHz
    • 8-channel DDR5 memory controller
  • Memory:
    • 256GB ECC DDR5 RAM
    • Running across 8 channels (32GB each)
    • Expandable up to 2TB
  • Storage:
    • 8TB total: 4 × 2TB PCIe 5.0 NVMe SSDs, x4 lanes each (up to 14,900 MB/s theoretical read speed per module)
    • Configurable in RAID 0 for ~59.6GB/s aggregate theoretical read throughput (we are still testing real-world throughput).
  • Power Supply:
    • Thermaltake Toughpower GF3 1650W 80 PLUS Gold
    • System-wide max draw: 1650W, operable on a standard, dedicated 15A 120V outlet
  • Motherboard:
    • GIGABYTE MH53-G40 (AMD WRX90 Chipset)
  • Case:
    • Off-the-shelf Extended ATX case with some custom modifications.
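The storage figures in the list above multiply out as follows (theoretical, perfect-scaling numbers; real RAID 0 throughput is still being measured):

```python
# Aggregate theoretical read throughput of the NVMe array in RAID 0,
# assuming perfect scaling across drives.

PER_DRIVE_MBPS = 14_900  # theoretical sequential read per PCIe 5.0 x4 NVMe
N_DRIVES = 4

aggregate_gbps = PER_DRIVE_MBPS * N_DRIVES / 1000  # MB/s -> GB/s
print(f"RAID 0 theoretical read: ~{aggregate_gbps:.1f} GB/s")
```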


Design Highlights

Full PCIe 5.0 Bandwidth

Each GPU connects via its own dedicated PCIe 5.0 x16 link, ensuring maximum data-transfer rates between CPU and GPU. Unlike multi-GPU setups that rely on bifurcated lanes, multiplexers, or external bridges, this build guarantees full lane allocation with no fallback to lower PCIe generations.

Storage for High-Speed Datasets

The four PCIe 5.0 NVMe SSDs provide theoretical read speeds of up to ~14.9 GB/s each, scaling to ~59.6 GB/s theoretical in RAID 0. While we are still testing full NVIDIA GPUDirect Storage (GDS) compatibility, GDS would allow the GPUs to fetch data directly from the NVMe drives via direct memory access (DMA), bypassing CPU memory.
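To illustrate why that bandwidth matters, here is a hedged estimate of checkpoint-load time at those theoretical speeds (the 140GB checkpoint size is our assumption, roughly a 70B-parameter model in FP16):

```python
# Illustrative only: time to stream a large checkpoint off the NVMe array
# at theoretical read speeds. Real GDS/RAID throughput is untested.

def load_time_s(checkpoint_gb: float, read_gbps: float) -> float:
    """Seconds to read checkpoint_gb at read_gbps (GB/s)."""
    return checkpoint_gb / read_gbps

CHECKPOINT_GB = 140  # assumed size: ~70B params in FP16

for label, gbps in [("single NVMe (~14.9 GB/s)", 14.9),
                    ("RAID 0 theoretical (~59.6 GB/s)", 59.6)]:
    print(f"{label}: ~{load_time_s(CHECKPOINT_GB, gbps):.1f} s")
```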

Power Efficiency & Practicality

The overall system consumes 1650W peak and fits comfortably into a home or office environment without requiring dedicated circuits or 220V wiring. With built-in wheels, it is designed for effortless transport between locations.

Baseboard Management Controller (BMC)

The motherboard integrates an ASPEED AST2600 Baseboard Management Controller (BMC): a dedicated processor for remote, out-of-band management that operates independently of the host CPU and OS to handle critical monitoring and control tasks.

CPU connection diagram of a16z Personal AI Workstation with 4x NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs

Use Cases

  • Training and fine-tuning LLMs with up to tens of billions of parameters in full precision.
  • Running dense multimodal inference across image, text, audio, and video models simultaneously.
  • Experimenting with model parallelism (tensor, pipeline, or expert-based sharding) across four GPUs.
  • Streaming high-throughput datasets directly from SSD RAID into GPU memory for reinforcement learning or diffusion-based workloads.

With libraries like vLLM, DeepSpeed, and SGLang, this machine serves as a foundation for training and serving custom LLMs, RL training pipelines, multimodal models, and autonomous agents, all without cloud dependency and with a fully custom setup and environment.
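As a concrete sketch of the memory math behind serving across all four cards, the snippet below estimates per-GPU weight memory under tensor parallelism of degree 4 (as used by, e.g., vLLM's tensor-parallel-size setting). Our assumptions: FP16 weights, perfectly even sharding, and no accounting for KV cache or activations.

```python
# Per-GPU weight memory under tensor parallelism across the four cards.
# Assumptions: FP16 weights, even sharding, KV cache/activations ignored.

VRAM_PER_GPU_GB = 96
TP_DEGREE = 4

def weights_per_gpu_gb(params_billions: float, bytes_per_param: int = 2,
                       tp: int = TP_DEGREE) -> float:
    """Weights are sharded roughly evenly across the tensor-parallel group."""
    return params_billions * 1e9 * bytes_per_param / tp / 1e9

for model_b in (70, 120):
    per_gpu = weights_per_gpu_gb(model_b)
    print(f"{model_b}B FP16: ~{per_gpu:.0f} GB/GPU of {VRAM_PER_GPU_GB} GB")
```

Even a 120B-parameter FP16 model leaves tens of gigabytes per card for KV cache under these assumptions, which is what makes long-context serving practical on this box.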

This RTX 6000 Pro Blackwell workstation hits a sweet spot between datacenter power and desktop accessibility, all while staying within the footprint and power draw of a high-end workstation under your desk.

Whether you’re a researcher exploring new architectures, a startup prototyping private LLM deployments, or simply an enthusiast, this build demonstrates an efficient, powerful AI workstation that fits under your desk.

Some temperature tests:

Full Utilization statistics of a16z Personal AI Workstation with 4x NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs

Idle statistics for a16z Personal AI Workstation with 4x NVIDIA RTX 6000 Pro Blackwell Max-Q GPUs

