The artificial intelligence data center landscape is undergoing a significant transformation, as developers increasingly focus on facilities closer to major population centers rather than remote locations. This shift is exemplified by DataBank’s recent $2 billion construction financing for a substantial project in the Dallas suburbs, one of the largest data center investments in the region.
DataBank, headquartered in Dallas, secured the construction loan from a banking consortium led by Mitsubishi UFJ Financial Group (MUFG). The financing will support the development of three data center buildings in Red Oak, Texas, approximately 20 miles outside Dallas. The company is also pursuing around $600 million through private placement markets for a fourth building at the same location.
The Red Oak campus represents a strategic investment in what industry experts call “inference computing” facilities. Unlike the massive, remote data centers primarily used for training AI models, inference facilities handle the real-time processing of user requests and AI interactions. These facilities require proximity to users to minimize latency and ensure seamless AI service delivery.
According to Raul Martynek, DataBank’s CEO, the entire four-building complex will provide 240 megawatts of computing capacity and has already been leased to an unnamed hyperscaler – one of the major technology giants such as Amazon, Google, Microsoft, Oracle, or Meta. The strategic location near Dallas offers crucial advantages: abundant fiber optic infrastructure and proximity to a major metropolitan area.
The project timeline anticipates the first building becoming operational in the third quarter of 2026, with the remaining structures completed by the end of 2027. These four buildings represent just the first phase of DataBank’s ambitious plans for Red Oak, which ultimately envisions an eight-building campus with 480 megawatts of total capacity.
Industry data from JLL reveals the rapid growth trajectory of inference computing. In 2025, inference computing accounts for 9% of data center workloads, compared to 14% for AI training and 77% for traditional cloud computing. By 2030, JLL projects inference computing will surge to 37% of all data center workloads, while training will remain relatively stable at 13%.
“The next phase is once your models have learned and trained up, to convert that over to inference and be closer to the population center,” explained Carl Beardsley, head of data centers for JLL Capital Markets. This evolution reflects the maturation of the AI industry from development-focused infrastructure to deployment-ready systems serving end users.
The financing process faced notable challenges, reflecting broader caution in the lending market despite surging demand for AI infrastructure. The syndication market, where multiple banks share the risks and costs of large loans, has shown signs of strain. Earlier in January, major banks including JPMorgan Chase and MUFG encountered difficulties selling portions of $38 billion in debt tied to Oracle’s data center projects.
Martynek acknowledged these market conditions affected DataBank’s financing timeline. The decision to separate the fourth building’s financing into a private placement reflected the need to navigate a more cautious lending environment. “If a major money center bank says, ‘Hey, we’re going to slow down the amount of credit we’re going to provide to this sector just because there’s so much of it,’ we’re going to be impacted by that,” Martynek noted.
Texas has emerged as a premier destination for data center development, attracting both training-focused mega-campuses in remote areas and inference facilities near cities. The state offers favorable business conditions, available land, and robust power infrastructure – critical factors for energy-intensive data center operations.
The DataBank project underscores a fundamental shift in how AI infrastructure is being deployed. While the initial AI boom focused on building massive training facilities in locations with cheap land and power, the emphasis is now moving toward delivering AI services to users with minimal delay. This transition requires data centers positioned strategically near population centers with strong network connectivity.
For the broader technology industry, this evolution signals the maturation of AI from an experimental technology to a consumer-ready service. As AI applications become more integrated into daily life – from chatbots to image generation to complex analytics – the physical infrastructure supporting these services must evolve accordingly.
The successful financing of DataBank’s Red Oak project, despite market headwinds, demonstrates continued investor confidence in well-positioned AI infrastructure. As the distinction between training and inference facilities becomes more pronounced, developers who can secure prime locations near major cities while navigating complex financing landscapes will likely emerge as winners in the next phase of the AI revolution.