Elon Musk’s xAI made quite a splash when it built a data center housing 200,000 GPUs that consumes approximately 250 MW of power. However, OpenAI appears to operate an even larger data center in Texas, one that consumes 300 MW and houses hundreds of thousands of AI GPUs whose exact models and count have not been disclosed. Furthermore, the company is expanding the site and aims to reach gigawatt scale by mid-2026, according to SemiAnalysis. Such gargantuan AI clusters pose challenges for power companies not only in generation capacity but also in power grid stability.
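As a rough sanity check on those figures, dividing total facility power by GPU count gives the all-in power budget per accelerator, including cooling and networking overhead. Note the per-GPU number is derived here, not reported by either company:

```python
# Back-of-envelope estimate (assumption: only the article's headline figures
# are used; the per-GPU result is derived, not an official specification).
total_power_w = 250e6   # ~250 MW for xAI's cluster
gpu_count = 200_000     # ~200,000 GPUs

per_gpu_w = total_power_w / gpu_count
print(f"{per_gpu_w:.0f} W per GPU (all-in)")  # → 1250 W per GPU (all-in)
```

At roughly 1.25 kW per GPU all-in, the same arithmetic implies OpenAI’s 300 MW building could plausibly host on the order of a couple hundred thousand accelerators, consistent with the “hundreds of thousands” described above.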

OpenAI appears to operate what is described as the world’s largest single data center building, with an IT load capacity of around 300 MW and a maximum power capacity of approximately 500 MW. The facility includes 210 air-cooled substations and a massive on-site electrical substation, underscoring its immense scale. As of January 2025, a second, identical building is already under construction on the same site. Once completed, the expansion will bring the campus’s total capacity to around a gigawatt, a record.

  • fckreddit@lemmy.ml · 2 days ago

    For the amount of energy they consume, LLMs sure suck ass. Scaling has not improved their accuracy or ‘intelligence’. In fact, they seem to be performing worse. Not to mention, scaling requires exponentially more training data, which they have run out of, apparently.
