
The Data Center Frontier Show
Location: United States
Genres: Technology Podcasts
Description: Data Center Frontier’s editors are your guide to how next-generation technologies are changing our world, and the critical role the data center industry plays in creating our extraordinary future.
Twitter: @dcfrontier
Language: English
Episodes
CoreSite Expands in Denver with Strategic Acquisition of Iconic Carrier Hotel
7/10/2025
In this episode of the Data Center Frontier Show, we explore CoreSite’s strategic acquisition of the Denver Gas and Electric Building, widely regarded as the most network-dense facility in the Rocky Mountain region.
Now the sole owner and operator of the DE1 data center housed within the historic building, CoreSite is doubling down on its interconnection strategy and reshaping the future of Denver’s cloud and network ecosystem.
Podcast guests Yvonne Ng, CoreSite’s Central Region General Manager, and Adam Post, SVP of Finance and Corporate Development, discuss how the acquisition enables CoreSite to simplify access to the Google Cloud Platform onramp and supercharge the Any2Denver peering exchange.
The deal also adds over 100 interconnection-rich customers to CoreSite’s portfolio and sets the stage for a broader Denver campus strategy including the under-construction DE3 facility built for AI-scale workloads.
The conversation explores key themes around modernizing legacy carrier hotels for high-density computing, integrating newly acquired customers, and how CoreSite, backed by parent company American Tower, is evaluating similar interconnection-focused acquisitions in other metro markets.
This is a timely deep dive into how legacy infrastructure is being reimagined to meet AI, multicloud, and edge computing demands. Denver is now positioned as a cloud peering hotspot, and CoreSite is at the center of the story.
Duration:00:22:50
Hunter Newby and Connected Nation: Kansas Breaks Ground on First IXP
7/1/2025
The digital geography of America is shifting, and in Wichita, Kansas, that shift just became tangible.
In a groundbreaking ceremony this spring, Connected Nation and Wichita State University launched construction on the state’s first carrier-neutral Internet Exchange Point (IXP), a modular facility designed to serve as the heart of regional interconnection. When completed, the site will create the lowest-latency, highest-resilience internet hub in Kansas, a future-forward interconnection point positioned to drive down costs, enhance performance, and unlock critical capabilities for cloud and AI services across the Midwest.
In this episode of The Data Center Frontier Show podcast, I sat down with two of the leaders behind this transformative project: Tom Ferree, Chairman and CEO of Connected Nation (CN), and Hunter Newby, co-founder of CNIXP and a veteran pioneer of neutral interconnection infrastructure. Together, they outlined how this facility in Wichita is more than a local improvement; it’s a national proof-of-concept.
“This is a foundation,” Ferree said. “We are literally bringing the internet to Wichita, and that has profound implications for performance, equity, and future participation in the digital economy.”
A Marriage of Mission and Know-How
The Wichita IXP is being developed by Connected Nation Internet Exchange Points, LLC (CNIXP), a joint venture between the nonprofit Connected Nation and Hunter Newby’s Newby Ventures. The project is supported by a $5 million state grant from Governor Laura Kelly’s broadband infrastructure package, with Wichita State providing a 40-year ground lease adjacent to its Innovation Campus.
For Ferree, this partnership represents a synthesis of purpose.
“Connected Nation has always been about closing the digital divide in all its forms: geographic, economic, and educational,” he explained. “What Hunter brings is two decades of experience in building and owning carrier-neutral interconnection facilities, from New York to Atlanta and beyond. Together, we’ve formed something that’s not only technically rigorous, but mission-aligned.”
“This isn’t just a building,” Ferree added. “It’s a gateway to economic empowerment for communities that have historically been left behind.”
Closing the Infrastructure Gap
Newby, who’s built and acquired more than two dozen interconnection facilities over the years, including 60 Hudson Street in New York and 56 Marietta Street in Atlanta, said Wichita represents a different kind of challenge: starting from scratch in a region with no existing IXP.
“There are still 14 states in the U.S. without an in-state Internet exchange,” he said. “Kansas was one of them. And Wichita, despite being the state’s largest city, had no neutral meetpoint. All their IP traffic was backhauled out to Kansas City, Missouri. That’s an architectural flaw, and it adds cost and latency.”
Newby described how his discovery process, poring over long-haul fiber maps and researching where neutral infrastructure did not exist, ultimately led him to connect with Ferree and the Connected Nation team.
“What Connected Nation was missing was neutral real estate for networks to meet,” he said. “What I was looking for was a way to apply what I know to rural and underserved areas. That’s how we came together.”
The AI Imperative: Localizing Latency
While IXPs have long played a key role in optimizing traffic exchange, their relevance has surged in the age of AI, particularly for inference workloads, which require sub-3-millisecond round-trip delays to operate in real time.
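For a sense of scale, here is a minimal back-of-envelope sketch of that latency budget. It assumes roughly 5 microseconds of one-way propagation per kilometer of fiber (light in glass travels at about 200,000 km/s) and ignores switching, queuing, and serialization delays; the 300 km figure is only an illustrative stand-in for an out-of-state backhaul.

```python
# Rough latency-budget sketch: how far can traffic travel and still return
# within a 3 ms round-trip budget? Assumes ~5 us/km one-way propagation in
# fiber and ignores queuing, switching, and serialization delays.

PROPAGATION_US_PER_KM = 5.0      # one-way, approximate
RTT_BUDGET_MS = 3.0              # sub-3 ms inference requirement cited above

def max_one_way_km(rtt_budget_ms: float) -> float:
    """Maximum one-way fiber distance that fits inside the RTT budget."""
    one_way_budget_us = (rtt_budget_ms * 1000.0) / 2.0
    return one_way_budget_us / PROPAGATION_US_PER_KM

def rtt_ms_for_distance(km: float) -> float:
    """Best-case round-trip time for a given one-way fiber distance."""
    return 2.0 * km * PROPAGATION_US_PER_KM / 1000.0

if __name__ == "__main__":
    print(f"Max one-way reach on a {RTT_BUDGET_MS} ms budget: "
          f"~{max_one_way_km(RTT_BUDGET_MS):.0f} km")
    # Illustrative: a ~300 km backhaul already consumes the whole budget
    # before any processing happens.
    print(f"Best-case RTT for a 300 km backhaul: "
          f"~{rtt_ms_for_distance(300):.1f} ms")
```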
Newby illustrated this with a high-stakes use case: fraud detection at major banks using AI models running on Nvidia Blackwell chips.
“These systems need to validate a transaction at the keystroke. If the latency is too high, if you’re routing traffic out of state to validate it, it doesn’t work. The fraud gets through. You can’t protect people.”
“It’s not just about faster Netflix anymore,” he said. “It’s about...
Duration:00:30:18
Engineering a Cool Revolution: Shumate’s HDAC Design Tackles AI-Era Density
6/26/2025
As artificial intelligence surges across the digital infrastructure landscape, its impacts are increasingly physical. Higher densities, hotter chips, and exponentially rising energy demands are pressuring data center operators to rethink the fundamentals, especially cooling.
That’s where Shumate Engineering steps in, with a patent-pending system called Hybrid Dry Adiabatic Cooling (HDAC) that reimagines how chilled water loops are deployed in high-density environments.
In this episode of The Data Center Frontier Show, Shumate founder Daren Shumate and Director of Mission Critical Services Stephen Spinazzola detailed the journey behind HDAC, from conceptual spark to real-world validation, and laid out why this system could become a cornerstone for sustainable AI infrastructure.
“Shumate Engineering is really my project to design the kind of firm I always wanted to work for: where engineers take responsibility early and are empowered to innovate,” said Shumate. “HDAC was born from that mindset.”
Two Temperatures, One Loop: Rethinking the Cooling Stack
The challenge HDAC aims to solve is deceptively simple: how do you cool legacy air-cooled equipment and next-gen liquid-cooled racks, simultaneously and efficiently?
Shumate’s answer is a closed-loop system with two distinct temperature taps, one serving the legacy air-cooled equipment and the other serving the liquid-cooled racks.
Both flows draw from a single loop fed by a hybrid adiabatic cooler, a dry cooler with “trim” evaporative functionality when conditions demand it. During cooler months or off-peak hours, the system economizes fully; during warmer conditions, it modulates to maintain optimal output.
“This isn’t magic; it’s just applying known products in a smarter sequence,” said Spinazzola. “One loop, two outputs, no waste.”
The system is fully modular, relies on conventional chillers and pumps, and is compatible with heat exchangers for immersion or CDU-style deployment. And according to Spinazzola, “we can make 90°F water just about anywhere” as long as the local wet bulb temperature stays below 83°F, a threshold met in most of North America.
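To make those figures concrete, here is a small illustrative sketch, not Shumate's sizing tool, that checks a site's design wet bulb against the 83°F threshold quoted above and reports the resulting approach to 90°F supply water (the sample wet bulb values are hypothetical):

```python
# Illustrative check of the HDAC figures quoted above: 90°F supply water is
# said to be achievable where the design wet bulb stays below 83°F.
# A simple sketch, not Shumate Engineering's sizing methodology.

SUPPLY_WATER_F = 90.0        # target warm-water supply temperature
WET_BULB_LIMIT_F = 83.0      # threshold cited in the episode

def can_make_supply_water(design_wet_bulb_f: float) -> bool:
    """True if the site's design wet bulb sits below the cited 83°F limit."""
    return design_wet_bulb_f < WET_BULB_LIMIT_F

def approach_f(design_wet_bulb_f: float) -> float:
    """Approach between target supply water and the site's design wet bulb."""
    return SUPPLY_WATER_F - design_wet_bulb_f

if __name__ == "__main__":
    # Hypothetical design wet bulb values, for illustration only.
    for site, wb in [("Site A", 78.0), ("Site B", 84.5)]:
        print(f"{site}: wet bulb {wb}°F -> 90°F water feasible: "
              f"{can_make_supply_water(wb)}, approach {approach_f(wb):.1f}°F")
```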
Duration:00:30:32
Safe, Scalable, Sustainable: Enabling AI’s Future with Two-Phase Direct-to-Chip Liquid Cooling
6/23/2025
The future of AI isn’t coming; it’s already here. With NVIDIA’s recent announcement of forthcoming 600kW+ racks, alongside the skyrocketing power costs of inference-based AI workloads, now’s the time to assess whether your data center is equipped to meet these demands.
Fortunately, two-phase direct-to-chip liquid cooling is prepared to empower today’s AI boom—and accommodate the next few generations of high-powered CPUs and GPUs. Join Accelsius CEO Josh Claman and CTO Dr. Richard Bonner as they walk through the ways in which their NeuCool™ 2P D2C technology can safely and sustainably cool your data center.
During the webinar, Accelsius leadership will illustrate how NeuCool can deliver energy savings of up to 50% vs. traditional air cooling, drastically slash operational overhead vs. single-phase direct-to-chip, and protect your critical infrastructure from any leak-related risks. While other popular liquid cooling methods require constant oversight or designer fluids to maintain peak performance, two-phase direct-to-chip technologies require less maintenance and lower flow rates to achieve better results.
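As a rough illustration of what savings on that order could mean at the facility level, the sketch below assumes the 50% figure applies to cooling energy and uses hypothetical baseline loads; none of these numbers come from Accelsius.

```python
# Hypothetical facility-level impact of cutting cooling energy in half.
# The baseline loads below are illustrative assumptions, not vendor figures.

IT_LOAD_KW = 1000.0          # assumed IT load
BASELINE_COOLING_KW = 400.0  # assumed air-cooling overhead (illustrative)
OTHER_OVERHEAD_KW = 100.0    # assumed power distribution, lighting, etc.

def pue(it_kw: float, cooling_kw: float, other_kw: float) -> float:
    """PUE = total facility power / IT power."""
    return (it_kw + cooling_kw + other_kw) / it_kw

baseline = pue(IT_LOAD_KW, BASELINE_COOLING_KW, OTHER_OVERHEAD_KW)
# Apply the episode's headline claim: up to 50% less cooling energy.
improved = pue(IT_LOAD_KW, BASELINE_COOLING_KW * 0.5, OTHER_OVERHEAD_KW)

print(f"Baseline PUE (assumed): {baseline:.2f}")
print(f"PUE with 50% cooling reduction: {improved:.2f}")
```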
Beyond a thorough overview of NeuCool, viewers will take away critical insights into two-phase direct-to-chip cooling for AI-era infrastructure.
Be sure to join us to discover how two-phase direct-to-chip cooling is enabling the next era of AI.
Duration:00:16:06
Why MOOG is focused on Liquid Cooling and Motion Control for Data Centers
6/18/2025
During the 14-minute interview, Walsh discusses MOOG’s legacy in designing and manufacturing high-performance motion control products and how the company’s experience with mission critical solutions translates into the data center space. He outlines how intelligent cooling controls and maintenance services contribute to overall data center sustainability and explains what sets MOOG’s purpose-built data center products apart from the competition.
Walsh also discusses recent advancements in motion control and cooling systems for data centers, including a new ultrasonic sensor that measures cavitation in liquid cooling fluids.
During the interview, Walsh shares his thoughts on the rise of liquid cooling across the data center industry and the role MOOG plans to play in this transformation.
Duration:00:15:11
Leading with People, Process, and Performance in Digital Transformation
6/17/2025
Join us for an insightful conversation with Jenny Zhan, the newly appointed EdgeConneX Chief Transformation Officer, as she shares her unique perspective on leading organizational change in today’s fast-paced, competitive environment. Transitioning from her previous role as Chief Accounting Officer to spearheading digital transformation efforts, Zhan brings a wealth of expertise and a fresh approach to the role.
Duration:00:31:49
Open Source, AMD GPUs, and the Future of Edge Inference: Vultr’s Big AI Bet
6/12/2025
In this episode of the Data Center Frontier Show, we sit down with Kevin Cochrane, Chief Marketing Officer of Vultr, to explore how the company is positioning itself at the forefront of AI-native cloud infrastructure, and why they’re all-in on AMD’s GPUs, open-source software, and a globally distributed strategy for the future of inference.
Cochrane begins by outlining the evolution of the GPU market, moving from a scarcity-driven, centralized training era to a new chapter focused on global inference workloads. With enterprises now seeking to embed AI across every application and workflow, Vultr is preparing for what Cochrane calls a “10-year rebuild cycle” of enterprise infrastructure—one that will layer GPUs alongside CPUs across every corner of the cloud.
Vultr’s recent partnership with AMD plays a critical role in that strategy. The company is deploying both the MI300X and MI325X GPUs across its 32 data center regions, offering customers optimized options for inference workloads. Cochrane explains the advantages of AMD’s chips, such as higher VRAM and power efficiency, which allow large models to run with fewer GPUs—boosting both performance and cost-effectiveness. These deployments are backed by Vultr’s close integration with Supermicro, which delivers the rack-scale servers needed to bring new GPU capacity online quickly and reliably.
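The "more VRAM means fewer GPUs" point comes down to simple memory arithmetic. The sketch below is a weights-only estimate with illustrative model sizes and per-GPU memory capacities; real deployments also budget for the KV cache, activations, and framework overhead, and exact capacities should be checked against current AMD specifications.

```python
# Illustrative memory arithmetic behind "more VRAM per GPU -> fewer GPUs
# needed to hold a model". Weights-only estimate; illustrative values.
import math

def weights_gb(params_billion: float, bytes_per_param: int = 2) -> float:
    """Approximate model weight footprint in GB (FP16/BF16 by default)."""
    return params_billion * 1e9 * bytes_per_param / 1e9

def gpus_needed(params_billion: float, vram_gb_per_gpu: float,
                usable_fraction: float = 0.9) -> int:
    """GPUs needed to hold the weights, leaving some headroom."""
    return math.ceil(weights_gb(params_billion) /
                     (vram_gb_per_gpu * usable_fraction))

if __name__ == "__main__":
    # Hypothetical comparison: a 70B-parameter model at 16-bit precision
    # across GPUs with different memory capacities (illustrative values).
    for vram in (80, 192, 256):
        print(f"{vram} GB GPUs -> {gpus_needed(70, vram)} needed "
              f"for ~{weights_gb(70):.0f} GB of weights")
```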
Another key focus of the episode is ROCm (Radeon Open Compute), AMD’s open-source software ecosystem for AI and HPC workloads. Cochrane emphasizes that Vultr is not just deploying AMD hardware; it’s fully aligned with the open-source movement underpinning it. He highlights Vultr’s ongoing global ROCm hackathons and points to zero-day ROCm support on platforms like Hugging Face as proof of how open standards can catalyze rapid innovation and developer adoption.
“Open source and open standards always win in the long run,” Cochrane says. “The future of AI infrastructure depends on a global, community-driven ecosystem, just like the early days of cloud.”
The conversation wraps with a look at Vultr’s growth strategy following its $3.5 billion valuation and recent funding round. Cochrane envisions a world where inference workloads become ubiquitous and deeply embedded into everyday life—from transportation to customer service to enterprise operations. That, he says, will require a global fabric of low-latency, GPU-powered infrastructure.
“The world is going to become one giant inference engine,” Cochrane concludes. “And we’re building the foundation for that today.”
Tune in to hear how Vultr’s bold moves in open-source AI infrastructure and its partnership with AMD may shape the next decade of cloud computing, one GPU cluster at a time.
Duration:00:25:00
DCIM (Data Center Infrastructure Management) and its Role in Data Center Security
6/9/2025
Explore the critical intersection of Data Center Infrastructure Management (DCIM), common data center security issues, and Zero Trust Architecture (ZTA), with a special focus on how our innovative OpenData solution can help.
As data centers face increasing security threats and regulatory pressures, understanding how to effectively integrate DCIM into a Zero Trust framework is essential for safeguarding operations and ensuring compliance.
Duration:00:17:25
Reliving International Data Center Day 2025 with 7x24 Exchange Leaders
6/5/2025
As the digital economy accelerates on the back of AI and hyperscale infrastructure, the question of who will build and run tomorrow’s data centers has never been more urgent. Since its inception in 2015, International Data Center Day (IDCD), organized by 7x24 Exchange International, has steadily grown into a global campaign to answer that question—by inspiring the next generation of mission-critical talent.
This year’s IDCD, observed in March but increasingly seen as a year-round initiative, was the subject of a recent Data Center Frontier Show podcast conversation with 7x24 Exchange International Chairman and CEO Bob Cassiliano and Aheli Purkayastha, Chief Product Officer of Purkay Labs and President of the New England Chapter. The two industry leaders outlined how 7x24 Exchange is advancing the mission of IDCD through grassroots engagement, structured resources, and a growing constellation of strategic partnerships.
A Response to the Talent Shortage
The origin of IDCD traces back to 7x24 Exchange’s recognition—at a 2015 leadership event—that there was not only a lack of awareness of data center careers among students, but also a vacuum of visibility in the educational system. In response, the organization launched IDCD to build a long-term pipeline by introducing the industry to students early, consistently, and accessibly.
Today, that mission is more critical than ever. As generative AI workloads surge and new builds stretch power and land capacity, the need for skilled, motivated professionals to support design, operations, and innovation across the lifecycle of data centers has intensified.
Turning Awareness Into Action
In 2025, IDCD expanded its reach through a broad range of local chapter events and partner activations. These included data center tours, educational presentations, interactive demos, 5K runs, and a hackathon hosted by the New England Chapter. The hackathon stood out as a model for applied learning, pairing 50 high school students with industry professionals in a challenge to design a data center in space—all in just five hours. The result: heightened student interest, deeper industry engagement, and a clear illustration of the educational value these events can offer.
While university students remain a key audience, organizers have recognized the need to reach even younger learners. Initiatives are increasingly targeting elementary and middle school students through age-appropriate programming, with a special emphasis on encouraging young women to consider careers in mission-critical infrastructure.
Resources, Reach, and Real Outcomes
The IDCD campaign is more than a collection of events—it is supported by a robust infrastructure of tools, templates, and thought leadership. At the core is InternationalDataCenterDay.org, a centralized hub offering educational content tailored to different age groups, a career path “tree,” and a library of interviews with professionals across the ecosystem. These resources empower volunteers, educators, and sponsors to create consistent, high-impact programming.
The outcomes speak for themselves. IDCD has helped catalyze the development of data center curricula at both the secondary and postsecondary levels. The Carolinas Chapter, for instance, played a key role in helping Cleveland Community College secure a $23 million grant to develop a full-fledged data center program. Elsewhere, scholarships are on the rise, and growing numbers of students and faculty are attending industry conferences.
Supporting these gains are complementary 7x24 Exchange programs such as WIMCO (Women in Mission Critical Operations), STEM mentoring, and Data Center 101 sessions—designed to provide clear entry points for newcomers while reinforcing the industry's inclusive, interdisciplinary nature.
Duration:00:29:33
Navigating the Future of Data Center Project Management
6/2/2025
The data center industry is undergoing rapid transformation, driven by technological advancements, sustainability concerns, and evolving market demands. This conversation with JLL data center expert Sean Farney explores the world of data center project management, offering insights into current challenges and opportunities.
One of the most significant trends in the industry is the growing need for liquid cooling retrofits. With only 4.6% of global data center critical load currently supporting liquid cooling, there's a substantial opportunity for upgrading existing facilities to meet the demands of high-density computing. This shift is driven by rapid advancements in chip technology, forcing data centers to adapt quickly to maintain efficiency and performance.
Adaptive reuse has emerged as another key strategy in the data center sector. This approach involves converting non-traditional spaces into data centers or updating existing facilities for new technologies. Beyond addressing capacity demands, adaptive reuse offers significant sustainability benefits, aligning with the industry's growing focus on environmental responsibility.
Energy efficiency and sustainability are critical considerations in modern data center design and operations. Often driven by cost savings, these initiatives are reshaping the industry. For instance, some estimates suggest that liquid cooling can reduce carbon impact by up to 40% in new facilities, highlighting the potential for both operational and environmental improvements.
The global nature of data center operations presents unique challenges for project managers. Navigating complex regulatory environments across different markets requires a deep understanding of local codes and standards while meeting global corporate objectives. This complexity underscores the need for project management teams with both global reach and local expertise.
As the industry grapples with a significant talent shortage, innovative approaches to attracting, training, and retaining skilled professionals are crucial. Comprehensive training programs and strategies for bridging the skills gap are becoming increasingly important in this rapidly evolving field.
Emerging technologies continue to shape the future of data center project management. The integration of AI and machine learning in facility management is becoming more common, while the potential impact of quantum computing looms on the horizon. Project managers must stay ahead of these technological shifts to deliver future-ready facilities.
As the data center industry continues to evolve, project management will play a crucial role in delivering cost-effective, efficient, and future-ready facilities. By addressing key challenges such as energy efficiency, technological adaptation, global operations, and talent management, project managers can help transform data center portfolios into strategic assets that support critical business objectives.
Duration:00:26:02
Powering the Future with Fuel Cells: A Deep Dive into On-Site Power Solutions for Data Centers
5/29/2025
In today’s podcast, Matt Vincent, Editor in Chief of Data Center Frontier, is joined by Bala Naidu, Vice President – Energy Transition Solutions at Bloom Energy, to discuss how the exponential growth of data centers in the United States is putting immense pressure on the power infrastructure. With traditional power sources struggling to keep up, data centers are facing a critical challenge: how to secure timely access to affordable power while adhering to sustainability and permitting regulations.
Duration:00:17:03
Solving the Power Problem for Data Centers
5/26/2025
The data center industry is experiencing substantial growth, placing increasing pressure on the power grid to meet the rising demand. These facilities necessitate continuous power supply with zero interruptions and demand highly reliable backup power to minimize downtime. The expansion of data centers is contributing to a disparity between the demand for power and the capacity of the grid to supply it, which may result in gaps ranging from several months to multiple years. Consequently, numerous developers are exploring alternative power supply options to address these challenges. Solutions that act as a bridge to grid power, commonly referred to as bridge power, are becoming increasingly essential. Reliable bridge power solutions are critical for enabling stakeholders to expedite revenue generation and enhance the resilience of these mission-critical developments. Users may also decide to forgo the utility entirely and procure a permanent, self-generated behind-the-meter solution.
When considering a bridge power or self-generation behind-the-meter solution, one of the first factors to examine is the length of time from power need to utility availability. A key question arises: when can we expect the utility power to be available? Accurately assessing the length of time for which the bridge solution is required is vital in determining various other components of the power system. A bridge power solution acts as a temporary or permanent on-site power plant for a data center, providing not only immediate energy needs but also the potential for long-term flexibility and scalability. This adaptability in both duration and equipment selection significantly accelerates the ability to respond to market demands, ensuring that the data center capacity can continue to expand to meet data storage needs.
The next critical consideration in the development of bridge or behind-the-meter power energy solutions is fuel, as it represents one of the most significant ongoing expenses for projects that operate continuously, 24/7. Natural-gas-fueled reciprocating engine generators have been proven to be highly effective in distributed generation applications. They offer reliable power supply, straightforward maintenance procedures, and low life-cycle costs, making them an attractive option for many operators. Additionally, natural gas is widely available across most regions in the country, and its comparatively low market prices in various areas enhance the appeal of reciprocating engines, making them a cost-effective solution. As projects extend into longer timeframes, the option to incorporate gas turbines becomes increasingly relevant. These turbines are particularly well-suited for long-term applications and can be effectively combined with reciprocating engines to optimize capacity and ensure an uninterrupted power supply. This combination allows operators to leverage the strengths of both technologies, ensuring efficiency and reliability in energy production.
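Because fuel dominates the operating cost of a plant running 24/7, a first-pass fuel-burn estimate is usually one of the earliest calculations. The sketch below uses an assumed heat rate and delivered gas price purely for illustration; actual values vary by engine model, load profile, and market.

```python
# Rough natural-gas fuel-burn estimate for a continuously running
# bridge-power plant. Heat rate and gas price are illustrative
# assumptions, not vendor or project figures.

LOAD_MW = 50.0                  # assumed continuous electrical load
HEAT_RATE_BTU_PER_KWH = 8500.0  # assumed reciprocating-engine heat rate
GAS_PRICE_PER_MMBTU = 4.0       # assumed delivered gas price, USD

def daily_fuel_mmbtu(load_mw: float, heat_rate_btu_per_kwh: float) -> float:
    """Daily fuel energy in MMBtu for a 24/7 load at the given heat rate."""
    kwh_per_day = load_mw * 1000.0 * 24.0
    return kwh_per_day * heat_rate_btu_per_kwh / 1e6

fuel = daily_fuel_mmbtu(LOAD_MW, HEAT_RATE_BTU_PER_KWH)
print(f"Fuel burn: ~{fuel:,.0f} MMBtu/day")
print(f"Fuel cost: ~${fuel * GAS_PRICE_PER_MMBTU:,.0f}/day")
```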
In situations where natural gas is not accessible, but the project's duration justifies the use of natural gas solutions, a virtual pipeline system can be deployed. A virtual pipeline consists of a modular approach utilizing either Compressed Natural Gas (CNG) or Liquefied Natural Gas (LNG). These gases can be transported through various modes, effectively bridging the gap in areas lacking direct natural gas infrastructure. The flexibility of virtual pipelines enables efficient delivery of fuel to remote sites well before a conventional pipeline is constructed.
A bridge or behind-the-meter power solution represents a substantial investment, and like any significant financial commitment, it comes with various inherent risks to the project. These risks can be categorized into several areas, including technology risks, environmental permitting risks, construction risks, and financial risks. To streamline the complexities of the project, it is advisable to collaborate with an experienced partner...
Duration:00:19:00
Tony Grayson Breaks Down Northstar–Compass Quantum Deal for AI Data Centers
5/20/2025
For this episode of the Data Center Frontier Show podcast, DCF Editor-in-Chief Matt Vincent and Senior Editor David Chernicoff sat down with Tony Grayson, President and General Manager of Northstar Technology Group's Enterprise and Defense unit, to unpack a strategic acquisition that’s shaking up the edge and modular data center space.
The conversation centered on Northstar’s acquisition of Compass Quantum, a company known for its rapidly deployable, composite-based modular infrastructure tailored for both enterprise and defense applications.
From Compass to Northstar: A Strategic Realignment
“We were developing a modular brand at Compass,” said Grayson. “Where Compass was building the gigawatt-scale campuses, I was building the smaller campuses using building blocks of modules—versus, you know, kind of a stick build.” That smaller-scale focus gained traction with enterprise clients, including several Fortune 50 companies, but new opportunities in the defense sector introduced regulatory friction.
“Compass is Canadian-owned, and that goes against some of the rules that the U.S. government has,” Grayson explained. “Chris Crosby was a huge supporter… he wanted to sell us so he wouldn’t hinder us from growing the company or servicing U.S. defense needs.”
Enter Northstar Technology Group, which brings a strategic partnership with Owens Corning—the manufacturer and IP holder behind Compass Quantum’s composite materials. With engineering, manufacturing, and construction capabilities now integrated under one roof, Grayson sees the acquisition as a natural fit. “Everything is now in-house instead of trying to go outside to other consultants,” he said.
AI-Ready Modulars in 5MW Increments
As hyperscale demands evolve, Grayson noted growing customer appetite for 5 megawatt modular units—mirroring the scale at which Nvidia and others are now building AI infrastructure. “You’ve seen Wade Vinson talk about it at Data Center World, and you see Jensen [Huang] talking about 5 megawatts being the line where you cross between the L2 and L3 network,” he said. “We can build in 5 megawatt increments and drop that stuff in parking lots—either as an operating lease or as a sale.”
That flexibility extends to Northstar’s channel partners, who are increasingly seeking a variety of procurement models. “Some want sales, not just leases. It gives us more freedom to do that kind of stuff,” said Grayson. “Sometimes it’s better to be lucky than good, and I feel like the timing of this couldn’t have been better for where the industry’s at right now.”
Veteran-Led Advisory Team Strengthens Defense Strategy
In addition to the materials and platform innovations, Northstar’s defense ambitions are underpinned by what Grayson describes as a “dream team” of senior military advisors. “We basically have every outgoing ‘six’—the people in charge of IT and comms for the Air Force, Marine Corps, Army, and Navy—as advisors,” he said. “Some will be coming on full time.”
These high-level advisors, many of whom retired as three-star generals, are instrumental in helping Northstar align its solutions with evolving defense requirements, particularly in distributed compute and real-time data processing.
“We’re making huge progress on the enterprise side, but the defense side is where we need to catch up,” Grayson added. “Defense globally needs distributed compute… they’re ahead of enterprise when it comes to inference platforms.”
He also highlighted Northstar’s engagement with the Navy, particularly around airborne systems. “That’s why we have the old air boss, Admiral Weitzel. He helps us with aircraft systems. These planes generate so much data, and we need advice on how best to internalize and analyze it.”
Material Advantage: Why FRP Composites Are a Game-Changer: Durability, Customization—and No Tariffs
A key differentiator for Northstar’s modular approach is its use of fiber-reinforced polymer (FRP) composites instead of traditional steel or concrete enclosures....
Duration:00:27:55
Meeting Increasing Cooling Demands in the Data Center Market with LG
5/14/2025
Global demand for data center capacity is expected to grow between 19 and 22 percent annually through 2030, according to McKinsey & Company. As data center capacity expands, so does the challenge of managing the heat generated by high-performance chips. This includes heat at the chip, as well as external heat rejection and room cooling. LG, a global HVAC technology leader, discusses the evolving landscape and the latest technology to ensure efficient, reliable cooling for data centers. This includes the full suite of data center cooling solutions that LG debuted at Data Center World 2025. The cutting-edge cooling technologies, including direct-to-chip, room, and chiller plant cooling capabilities, are intended to meet the challenge of increasing data center capacity head-on, helping provide reliable, energy-efficient solutions.
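As a quick sense check of what 19 to 22 percent annual growth implies, the arithmetic below compounds those rates over five years from an assumed 2025 baseline (the baseline year is an assumption for illustration):

```python
# What does 19-22% annual growth through 2030 imply in absolute terms?
# Baseline year is an assumption for illustration; the cited McKinsey
# figure does not fix it here.

BASE_YEAR, END_YEAR = 2025, 2030
for cagr in (0.19, 0.22):
    multiple = (1 + cagr) ** (END_YEAR - BASE_YEAR)
    print(f"{cagr:.0%} CAGR over {END_YEAR - BASE_YEAR} years -> "
          f"~{multiple:.1f}x capacity by {END_YEAR}")
```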
Duration:00:14:00
Digital Hub on Fortaleza: The role of Tecto and V.tal
5/13/2025
This episode will explore how Tecto Data Centers is shaping the future of digital infrastructure in Latin America through its operations in Fortaleza, Brazil. André Busnardo, Head of Data Center Sales at Tecto, discusses why the region is considered one of the most important connectivity hubs in LATAM and how the company’s investment strategy is helping address the growing demand for reliable, neutral, and scalable infrastructure.
Duration:00:18:57
Nomads at the Frontier: Nabeel Mahmood on the Future of Data Centers and Disruptive Sustainability
5/1/2025
WASHINGTON, D.C.— At this year’s Data Center World 2025, held earlier this month at the Walter E. Washington Convention Center, the halls were buzzing with what could only be described as industry sensory overload. As hyperscalers, hardware vendors, and infrastructure specialists converged on D.C., the sheer density of innovation underscored a central truth: the data center sector is in the midst of rapid, almost disorienting, expansion.
That made it the perfect setting for the latest episode in our ongoing podcast miniseries with Nomad Futurist, aptly titled Nomads at the Frontier. This time, I sat down in person with Nabeel Mahmood, co-founder and board director of the Nomad Futurist Foundation—a rare face-to-face meeting after years of remote collaboration.
“Lovely seeing you in person,” Mahmood said. “It’s brilliant to get to spend some quality time at an event that’s really started to hit its stride—especially in terms of content.”
Mahmood noted a welcome evolution in conference programming: a shift away from vendor-heavy pitches and toward deeper, mission-driven dialogue about the sector’s true challenges and future trajectory. “Events like these were getting overloaded by vendor speak,” he said. “We need to talk about core challenges, advancements, and what we’re doing to improve and move forward.”
A standout example of this renewed focus was a panel on disruptive sustainability, in which Mahmood joined representatives from Microsoft, AWS, and a former longtime lieutenant of Elon Musk’s sustainability operations. “It’s not just about e-cycling or carbon,” Mahmood emphasized. “We have to build muscle memory. We’ve got to do things for the right reasons—and start early.”
That starting point, he argued, is education—but not in the traditional sense. Instead, Mahmood called for a multi-layered approach that spans K–12, higher education, and workforce reskilling. “We’ve come out from behind the Wizard of Oz curtain,” he said. “Now we’re in the boardroom. We need to teach people not just how technology works, but why we use it—and how to design platforms with real intention.”
Mahmood’s remarks highlighted a growing consensus among forward-thinking leaders: data is no longer a support function. It is foundational. “There is no business, no government, no economy that can operate today—or in the future—without data,” he said. “So let’s measure what we do. That’s the KPI. That’s the minimum threshold.”
Drawing a memorable parallel, Mahmood compared this kind of education to swimming lessons. “Sure, you might not swim for 20 years,” he said. “But if you learned as a kid, you’ll still be able to make it back to shore.”
Inside-Out Sustainability and Building the Data Center Workforce of Tomorrow
As our conversation continued, we circled back to Mahmood’s earlier analogy of swimming as a foundational skill—like technology fluency, it stays with you for life. I joked that I could relate, recalling long-forgotten golf lessons from middle school. “I'm a terrible golfer,” I said. “But I still go out and do it. It’s muscle memory.”
“Exactly,” Mahmood replied. “There’s a social element. You’re able to enjoy it. But you still know your handicap—and that’s part of it too. You know your limits.”
Limits and possibilities are central to today’s discourse around sustainability, especially as the industry’s most powerful players—the hyperscalers—increasingly self-regulate in the absence of comprehensive mandates. I asked Mahmood whether sustainability had truly become “chapter and verse” for major cloud operators, or if it remained largely aspirational, despite high-profile initiatives.
His answer was candid.
“Yes and no,” he said. “No one's following a perfect process. There are some who use it for market optics—buying carbon credits and doing carbon accounting to claim carbon neutrality. But there are others genuinely trying to meet their own internal expectations.”
The real challenge, Mahmood noted, lies in the absence of...
Duration:00:28:15
From Concept to Reality: The Future of Hydrogen Fuel Cells in Data Centers
4/28/2025
As the data center industry continues to expand, two powerful forces are reshaping the search for next-generation power solutions. First, the rapid growth of AI, IoT, and digital transformation is significantly increasing global power demand, placing added pressure on traditional grid systems to meet these energy needs. The International Energy Agency forecasts that electricity consumption by data centers and AI could double by 2026, adding an amount equal to the entire current electricity usage of Japan. The second force is the urgent need for a smaller environmental footprint. As energy consumption rises, the drive for decarbonization becomes more critical, making it harder for data centers to balance environmental sustainability with performance reliability.
In response to these challenges, data center leaders are looking beyond conventional solutions and exploring innovative alternatives that can meet the demands of a rapidly evolving industry. This podcast will focus on hydrogen fuel cell technology as a potential fuel source. This emerging technology has the potential to transform how data centers power their operations, providing a sustainable solution that not only helps reduce carbon emissions but also ensures reliable and scalable energy for the future.
Hydrogen fuel cells present an opportunity for data centers. Unlike traditional fossil fuel-based systems, hydrogen fuel cells generate power through an electrochemical reaction between hydrogen and oxygen, with water and heat as the only byproducts. This makes them a virtually emission-free, environmentally friendly power solution. Moreover, hydrogen fuel cells can reduce data center emissions by up to 99%, providing one of the most effective means of decarbonizing the industry. The environmental benefits are matched by their impressive efficiency, as fuel cells operate with fewer energy losses compared to traditional combustion-based systems.
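For reference, the electrochemistry behind that description is the standard hydrogen fuel cell reaction, shown here for a PEM-style cell as an assumption, since the episode does not specify the cell chemistry:

```latex
% Standard hydrogen fuel cell half-reactions and overall reaction
% (PEM-style cell assumed for illustration).
\begin{align*}
\text{Anode:}   &\quad \mathrm{H_2 \;\rightarrow\; 2H^+ + 2e^-} \\
\text{Cathode:} &\quad \mathrm{\tfrac{1}{2}O_2 + 2H^+ + 2e^- \;\rightarrow\; H_2O} \\
\text{Overall:} &\quad \mathrm{H_2 + \tfrac{1}{2}O_2 \;\rightarrow\; H_2O} + \text{electricity} + \text{heat}
\end{align*}
```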
In this episode, Ben Rapp, Strategic Product Development Manager at Rehlko, will explore the science behind hydrogen fuel cells, offering an overview of the key components that make them a viable power solution for data centers. He will also highlight the practical advantages of hydrogen fuel cells, particularly their ability to deliver reliable, on-demand power with minimal disruption. This episode also addresses the challenges of adopting hydrogen fuel cells, including infrastructure development, cost, and the need for a robust hydrogen distribution network.
Additionally, we talk with Ben about Rehlko’s hydrogen fuel cell project and the partnerships involved. As part of this initiative, Rehlko has collaborated with companies like Toyota to develop a 100-kilowatt hydrogen fuel cell solution aimed at reducing the carbon footprint of data centers. We’ll go over the progress of this partnership and the practical steps being taken to make hydrogen fuel cells a viable and scalable power solution.
Finally, Ben will talk about his perspective on the future role of hydrogen fuel cells in data centers worldwide. With the industry facing increasing pressure to meet sustainability targets while ensuring performance reliability, hydrogen fuel cells are poised to play a critical role in the evolution of data center power systems. They offer both environmental and operational benefits that are essential for the industry’s future.
Whether used as a primary power source, backup system, or for grid stabilization, hydrogen fuel cells are poised to become a key player in the future of data center energy management.
Duration:00:21:36
Are we coming up short? Navigating the Global Power Deficit
4/22/2025
Global power deficit and solutions
The discussion will address the power deficit we are experiencing and how new demands for power are navigated across different regions.
Duration:00:35:40
Beyond the Wires — Packet Power & the Future of Data Center Monitoring
4/21/2025
In this episode of the Data Center Frontier Show podcast, we explore how Packet Power is transforming data center monitoring. As the demand for energy efficiency and operational transparency grows, organizations need solutions that provide real-time insights without adding complexity. Packet Power’s wireless, scalable, and secure technology offers an easy, streamlined approach to power and environmental monitoring.
Monitoring Made Easy®
Traditional monitoring solutions can be difficult to install, configure, and scale. Packet Power’s wireless, out-of-band technology removes these hurdles, offering a plug-and-play system that allows organizations to start with a few monitoring nodes and expand as needed. With built-in fleet management, remote diagnostics, and broad compatibility with existing systems, Packet Power helps data centers gain visibility into their power and environmental conditions with minimal effort.
Fast, Flexible Deployment
Deploying monitoring solutions can be time-consuming and resource-intensive, particularly in large-scale facilities. Many systems require extensive cabling, specialized personnel, and lengthy configuration processes. Packet Power eliminates these roadblocks by offering a vendor-agnostic, rapidly deployable system that works seamlessly with existing infrastructure. Designed and manufactured in the USA, Packet Power products ship in just 2-3 weeks, avoiding the delays often associated with global supply chain issues and ensuring data centers can implement monitoring solutions without unnecessary downtime.
Security Built from the Ground Up
Security is a critical concern in mission-critical environments. Unlike traditional monitoring solutions that focus primarily on encryption, Packet Power integrates security at every level—from hardware to networking and software. Their read-only architecture ensures that failed hardware won’t disrupt power delivery, while out-of-band monitoring prevents exposure to network vulnerabilities. One-way communication protocols and optional physical data isolation further enhance security, ensuring that critical infrastructure remains protected from cyber threats and misconfigurations.
Adapting to Industry Changes
The data center landscape is rapidly evolving, with increasing demands for efficiency, flexibility, and sustainability. Packet Power’s solutions are designed to keep pace with these changes, offering a non-intrusive way to enhance monitoring capabilities without modifying existing infrastructure. Their technology is easily embedded into power and cooling systems, enabling organizations to implement real-time monitoring across a wide range of devices while maintaining operational agility.
Why Wireless Wins
Traditional wired monitoring solutions often require extensive installation efforts and ongoing maintenance, while common consumer wireless options—such as WiFi, Bluetooth, and Zigbee—are not designed for industrial environments. These protocols pose security risks and struggle in settings with high electromagnetic interference. Packet Power’s proprietary wireless system is optimized for reliability in data centers, eliminating IP-based vulnerabilities while supporting secure, large-scale deployments.
Cost Savings & Efficiency
Monitoring solutions should provide a return on investment, not create additional overhead. Packet Power reduces costs by minimizing IT infrastructure needs, eliminating the expense of network switches, dedicated cabling, and IP address management. Their wireless monitoring approach streamlines deployment, allowing organizations to instantly gain actionable insights into their energy usage and environmental conditions. This improves cost allocation, supports sustainability initiatives, and enhances operational efficiency.
Versatile Applications
Energy monitoring is crucial across multiple aspects of data center management. Packet Power’s solutions support a wide range of applications, including tracking energy use in...
Duration:00:39:42
Vaire Computing Bets on Reversible Logic for 'Near Zero Energy' AI Data Centers
4/15/2025
The AI revolution is charging ahead—but powering it shouldn't cost us the planet. That tension lies at the heart of Vaire Computing’s bold proposition: rethinking the very logic that underpins silicon to make chips radically more energy efficient.
Speaking on the Data Center Frontier Show podcast, Vaire CEO Rodolfo Rossini laid out a compelling case for why the next era of compute won't just be about scaling transistors—but reinventing the way they work.
“Moore's Law is coming to an end, at least for classical CMOS,” Rossini said. “There are a number of potential architectures out there—quantum and photonics are the most well known. Our bet is that the future will look a lot like existing CMOS, but the logic will look very, very, very different.”
That bet is reversible computing—a largely untapped architecture that promises major gains in energy efficiency by recovering energy lost during computation.
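The physics argument usually cited for reversible logic, though not spelled out in the episode, is the Landauer limit: every irreversibly erased bit dissipates at least kT ln 2 of energy, a floor that reversible circuits can in principle sidestep. A quick calculation of that floor at room temperature:

```python
# Landauer limit: minimum energy dissipated per irreversibly erased bit,
# E = k * T * ln(2). Reversible logic aims to avoid this erasure cost.
# Context for the episode above, not a figure from Vaire Computing.
import math

BOLTZMANN_J_PER_K = 1.380649e-23
T_KELVIN = 300.0  # roughly room temperature

e_per_bit = BOLTZMANN_J_PER_K * T_KELVIN * math.log(2)
print(f"Landauer limit at {T_KELVIN:.0f} K: ~{e_per_bit:.2e} J per erased bit")

# For scale: erasing 1e18 bits per second at this theoretical floor
# dissipates only a few milliwatts.
print(f"1e18 bit erasures/s at the limit: ~{e_per_bit * 1e18 * 1e3:.1f} mW")
```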
Product, Not IP
Unlike some chip startups focused on licensing intellectual property, Vaire is playing to win with full-stack product development.
“Right now we’re not really planning to license. We really want to build product,” Rossini emphasized. “It’s very important today, especially from the point of view of the customer. It’s not just the hardware—it’s the hardware and software.”
Rossini points to Nvidia’s CUDA ecosystem as the gold standard for integrated hardware/software development.
“The reason why Nvidia is so great is because they spent a decade perfecting their CUDA stack,” he said. “You can’t really think of a chip company being purely a hardware company anymore. Better hardware is the ticket to the ball—and the software is how you get to dance.”
A great metaphor for a company aiming to rewrite the playbook on compute logic.
The Long Game: Reimagining Chips Without Breaking the System
In an industry where even incremental change can take years to implement, Vaire Computing is taking a pragmatic approach to a deeply ambitious goal: reimagining chip architecture through reversible computing — but without forcing the rest of the computing stack to start over.
“We call it the Near-Zero Energy Chip,” said Rossini. “And by that we mean a chip that operates at the lowest possible energy point compared to classical chips—one that dissipates the least amount of energy, and where you can reuse the software and the manufacturing supply chain.”
That last point is crucial. Vaire isn’t trying to uproot the hyperscale data center ecosystem — it's aiming to integrate into it. The company’s XPU architecture is designed to deliver breakthrough efficiency while remaining compatible with existing tooling, manufacturing processes, and software paradigms.
Duration:00:31:00