> These unnamed companies have collectively pre-purchased several exabytes of storage capacity
So... what's the point of this? Even the OpenAI RAM purchase was bizarre to me, since they've managed to train their LLMs just fine without gobbling up all the RAM; then the SSDs, and now hard disks? And it's not even just the write caches that sold out; the hard disks themselves did.
I don't think there's any untapped source of data for LLM training, is there? Datasets can't get larger, so I fail to see why this would be necessary.
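For scale, here's a rough back-of-envelope (the token count and bytes-per-token figures are my own assumptions, ballpark for a Llama-3-class run, not numbers from the article):

    # Back-of-envelope: even a huge text dataset is tiny next to an exabyte.
    # Assumed figures: ~15 trillion training tokens (roughly what a
    # Llama-3-scale run reportedly used) at ~4 bytes of raw text per token.
    tokens = 15e12
    bytes_per_token = 4
    dataset_bytes = tokens * bytes_per_token   # ~6e13 bytes

    exabyte = 1e18                             # 1 EB = 10^18 bytes

    print(f"dataset size: ~{dataset_bytes / 1e12:.0f} TB")   # ~60 TB
    print(f"fraction of 1 EB: {dataset_bytes / exabyte:.5f}") # ~0.00006

Even rounding up generously, a text training corpus is four to five orders of magnitude short of a single exabyte, which is exactly why these purchases confuse me.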
Does anyone know the reason they're buying up all this storage? No cynical doomerism, please.