
The POWER Podcast

Business & Economics Podcasts

The POWER Podcast provides listeners with insight into the latest news and technology that is poised to affect the power industry. POWER’s Executive Editor Aaron Larson conducts interviews with leading industry experts and gets updates from insiders at power-related conferences and events held around the world.

Location:

United States


Language:

English

Contact:

320-305-2657


Episodes

192. Grid Enhancing Technologies Do Exactly What They Say

6/2/2025
The world’s electricity grids are facing unprecedented strain as demand surges from electrification, data centers, and renewable energy integration, while aging infrastructure struggles to keep pace. Traditional approaches to grid expansion—building new transmission lines and substations—face mounting challenges, including permitting processes that can stretch a decade, costs that can reach billions of dollars per project, and growing public resistance to new infrastructure. This pressure has created an urgent need for innovative solutions that can unlock the hidden capacity already embedded within existing transmission networks.

What Are GETs and What Do They Do?

Grid enhancing technologies (GETs) represent a transformative approach to this challenge, offering utilities the ability to safely increase power flows on existing transmission lines—by up to 40% in some cases—without the need for new construction. These advanced technologies work by maximizing the utilization of current infrastructure. They include dynamic line ratings (DLR), which adjust capacity based on real-time weather conditions; high-temperature advanced conductors, which can carry significantly more current; and sophisticated power flow controllers, which optimize electricity routing. Rather than building around bottlenecks, GETs eliminate them through smarter, more responsive grid management.

On an episode of The POWER Podcast, Anna Lafoyiannis, program lead for the integration of renewables and co-lead of the GET SET (Grid Enhancing Technologies for a Smart Energy Transition) initiative with EPRI, explained that GETs can be either hardware or software solutions. “Their purpose is to increase the capacity, efficiency, reliability, or safety of transmission lines. So, think of these as adders to your transmission lines to make them even better,” Lafoyiannis said. “Typically, they reduce congestion costs. They improve the integration of renewables. They increase capacity. They can provide grid service applications. So, they’re really multifaceted—very helpful for the grid,” she said. “At EPRI, we think of them as kind of like a tool in a toolbox.”

The economic and environmental implications are profound. Deploying GETs can defer or eliminate the need for costly new transmission projects while accelerating the integration of renewable energy resources that are often stranded by transmission constraints. As utilities worldwide grapple with the dual pressures of modernizing their grids and meeting ambitious clean energy targets, GETs offer a compelling path forward that leverages innovation over infrastructure expansion to create a more resilient, efficient, and sustainable electricity system.
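The dynamic line rating idea described above can be illustrated with a toy calculation. The adjustment coefficients below are hypothetical, chosen only to show the concept; real DLR systems use detailed conductor thermal models such as IEEE Std 738, not simple multipliers.

```python
# Illustrative (not standards-based) dynamic line rating sketch.
# Coefficients are made up for demonstration only.

def dynamic_rating(static_rating_amps, ambient_c, wind_mps):
    """Scale a static line rating using simple weather adjustments.

    Cooler air and stronger wind both improve conductor cooling,
    so the line can safely carry more current than its static
    (worst-case weather) rating assumes.
    """
    # Assume the static rating was set for a hot, still day (40 C, 0.6 m/s).
    temp_factor = 1.0 + 0.005 * (40.0 - ambient_c)       # ~0.5% per degree cooler
    wind_factor = 1.0 + 0.10 * max(0.0, wind_mps - 0.6)  # convective cooling bonus
    uplift = min(temp_factor * wind_factor, 1.40)        # cap at the ~40% cited for GETs
    return static_rating_amps * uplift

# A 1,000 A static line on a cool, breezy day gets the full 40% uplift:
print(round(dynamic_rating(1000, 15.0, 5.0)))  # -> 1400
```

The point of the sketch is the shape of the logic, not the numbers: favorable weather yields headroom above the static rating, and the operator caps the uplift at a safe limit.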

Duration:00:43:47


191. Modular Geothermal Power: Gradient’s Scalable Solution for Oil and Gas Sites

5/20/2025
As the world transitions toward renewable energy sources, geothermal power has emerged as one of the most promising, yet underutilized, options in the clean energy portfolio. Unlike solar and wind, geothermal offers consistent baseload generating capacity without intermittency challenges, making it an increasingly attractive component of the renewable energy mix. The geothermal sector has shown increasing potential in recent years, with technological innovations expanding its possible applications beyond traditional volcanic regions. These advances are creating opportunities to tap moderate-temperature resources that were previously considered uneconomical, potentially unlocking gigawatts of clean, renewable power across the globe.

It’s within this expanding landscape that companies like Gradient Geothermal are pioneering new approaches. As a guest on The POWER Podcast, Ben Burke, CEO of Gradient Geothermal, outlined his company’s innovative approach to geothermal energy extraction, which could transform how we think about energy recovery from oil and gas operations.

Modular and Mobile Geothermal Solutions

Gradient Geothermal differentiates itself in the geothermal marketplace through its focus on modular, portable equipment designed specifically for oil field operations, geothermal operators, and potentially data centers. Unlike traditional geothermal installations that require permanent infrastructure, Gradient’s equipment can be moved every six to 18 months as needed, allowing clients to adjust their thermal capacity by adding or removing units as requirements change. “The advantage of mobility and modularity is really important to oil and gas operators,” Burke said.

The company’s solution consists of two main components: an off-the-shelf organic Rankine cycle (ORC) unit and a primary heat exchanger loop. This system can handle various ratios of oil, gas, and water—even “dirty” water containing sand, brines, and minerals—and convert that heat into usable power.

One of the most compelling aspects of Gradient’s technology is its ease of installation. “Installation takes one day,” Burke explained. “It’s two pipes and three wires, and it’s able to sit on a gravel pad or sit on trailers.” This quick setup contrasts sharply with traditional geothermal plants, which can take years to construct. The units come in three sizes: 75 kW, 150 kW, and 300 kW. Their modular nature allows for flexible configurations, with units able to be connected in series or parallel to handle varying water volumes and temperatures.
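The modularity described above can be sketched as a simple sizing exercise using the three unit sizes mentioned in the episode. The greedy selection strategy here is an illustration of how capacity can be adjusted in discrete steps, not Gradient’s actual sizing method.

```python
# Hypothetical sizing sketch with Gradient's three unit sizes.
# Greedy largest-first selection is an illustrative assumption.

UNIT_SIZES_KW = [300, 150, 75]  # largest first

def unit_mix(target_kw):
    """Pick a set of modular units whose total covers target_kw."""
    mix = []
    remaining = target_kw
    for size in UNIT_SIZES_KW:
        while remaining >= size:
            mix.append(size)
            remaining -= size
    if remaining > 0:            # round up with one smallest unit
        mix.append(UNIT_SIZES_KW[-1])
    return mix

print(unit_mix(525))  # -> [300, 150, 75]
print(unit_mix(400))  # -> [300, 75, 75] (450 kW installed)
```

Because capacity comes in discrete modules, a site’s thermal output can be ratcheted up or down by trucking units in or out, which is the operational flexibility Burke emphasizes.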

Duration:00:22:16


190. What Trump’s First 100 Days Have Meant to the Power Industry

4/30/2025
U.S. President Donald Trump was sworn into office for the second time on Jan. 20, 2025. That means April 30 marks his 100th day back in office. A lot has happened during that relatively short period of time.

The Trump administration has implemented sweeping changes to U.S. energy policy, primarily focused on promoting fossil fuels while curtailing renewable energy development. The administration declared a “national energy emergency” to expedite approvals for fossil fuel infrastructure and lifted regulations on coal plants, exempting nearly 70 facilities from toxic pollutant rules. Coal was officially designated a “critical mineral,” with the Department of Justice directed to investigate regulatory bias against the industry. Additionally, the administration ended the Biden-era pause on approvals for new liquefied natural gas (LNG) export facilities, signaling strong support for natural gas expansion.

On the environmental front, U.S. Environmental Protection Agency (EPA) Administrator Lee Zeldin announced 31 deregulatory actions designed in part to “unleash American energy.” The administration is also challenging the 2009 EPA finding that greenhouse gases endanger public health—a foundational element of climate regulation. President Trump announced the U.S.’s withdrawal from the Paris Climate Agreement, effective in early 2026, and terminated involvement in all climate-related international agreements, effectively eliminating previous emissions reduction commitments.

Renewable energy has faced significant obstacles under the new administration. A six-month pause was imposed on offshore wind lease sales and permitting in federal waters, with specific projects targeted for cancellation. The administration issued a temporary freeze on certain Inflation Reduction Act (IRA) and Bipartisan Infrastructure Law (BIL) funds designated for clean energy projects. Policies were implemented to weaken federal clean car standards, potentially eliminate electric vehicle (EV) tax credits, and halt funding for EV charging networks—indirectly affecting power generation by potentially reducing electricity demand from EVs.

Yet, the administration’s tariff policy may end up impacting the power industry more than anything else it has done. “One thing in particular that I think would be hard to argue is not the most impactful, and that’s the current status of tariffs and a potential trade war,” Greg Lavigne, a partner with the global law firm Sidley Austin, said as a guest on The POWER Podcast. In April, President Trump declared a national emergency to address trade deficits, imposing a 10% tariff on all countries and higher tariffs on nations with large trade deficits with the U.S. These tariffs particularly affect solar panels and components from China, potentially increasing costs for renewable energy projects and disrupting supply chains.

Meanwhile, the offshore wind energy industry has also taken a hard hit under the Trump administration. “My second-biggest impact in the first 100 days would certainly be the proclamations pausing evaluation of permitting of renewable projects, but particularly wind projects, on federal lands,” said Lavigne. “That is having real-world impacts today on the offshore wind market off the eastern seaboard of the United States.”

Despite the focus on traditional energy sources, the Trump administration has expressed support for nuclear energy as a tool for energy dominance and global competitiveness against Russian and Chinese nuclear exports. Key appointees, including Energy Secretary Chris Wright, have signaled a favorable stance toward nuclear power development, including small modular reactors.

All these actions remain subject to ongoing legal and political developments, with their full impact on the power generation industry yet to unfold.

Duration:00:38:47


189. Optimizing Supply Chain Processes to Ensure a Reliable Electric Power System

4/24/2025
The power industry supply chain is facing unprecedented strain as utilities race to upgrade aging infrastructure against a backdrop of lengthening lead times and increasing project complexity. This supply chain gridlock arrives precisely when utilities face mounting pressure to modernize systems. As the industry confronts this growing crisis, innovations in procurement, manufacturing, and strategic planning are essential.

“Utilities can optimize their supply chain for grid modernization projects by taking a collaborative approach between the services themselves and how they can support the projects, as well as having a partner to be able to leverage their sourcing capabilities and have the relationships with the right manufacturers,” Ian Rice, senior director of Programs and Services for Grid Services at Wesco, explained as a guest on The POWER Podcast. “At the end of the day, it’s how can the logistical needs be accounted for and taken care of by the partnered firm to minimize the overall delays that are going to naturally come and mitigate the risks,” he said.

Headquartered in Pittsburgh, Pennsylvania, Wesco is a leading global supply chain solutions provider. Rice explained that through Wesco, utilities gain access to a one-stop solution for program services, project site services, and asset management. The company claims its tailored approach “ensures cost reduction, risk mitigation, and operational efficiencies, allowing utilities to deliver better outcomes for their customers.”

“We take a really comprehensive approach to this,” said Rice. “In the utility market, we believe pricing should be very transparent.” To promote a high level of transparency, Wesco builds out special recovery models for its clients. “What this looks like is: we take a complete cradle-to-grave approach on the lifecycle of the said project or program, and typically, it could be up to nine figures—very, very large programs,” Rice explained. “It all starts with building that model and understanding the complexity. What are the inputs, what are the outputs, and what constraints are there in the short term as well as the long term? And, really, what’s the goal of that overall program?”

The answers to those questions are accounted for in the construction of the model. “It all starts with demand management, which closely leads to a sourcing and procurement strategy,” Rice said. “From there, we can incorporate inventory control, and set up SOPs [standard operating procedures] of how we want to deal with the contractors and all the other stakeholders within that program or project. And that really ties into what’s going to be the project management approach, as well in setting up all the different processes, or even the returns and reclamation program. We’re really covering everything minute to minute, day to day, the entire duration of that project, and tying that into a singular model.”

But that’s not all. Rice said another thing that sets Wesco apart from others in the market is that, when it takes this program or project approach, the company remains supplier-agnostic regardless of scale. “We’re doing procurement on behalf of our customers,” he said. “So, if they have direct relationships, we can facilitate that. If they’re working with other distributors, we can also manage that. The whole idea here is: what’s in the best interest of the customer to provide the most value.”

Duration:00:19:40


188. DOE’s Loan Programs Office Offers Game-Changing Possibilities

4/10/2025
As the presidential inauguration loomed in January this year, the U.S. Department of Energy’s (DOE’s) Loan Programs Office (LPO) published a “year-in-review” article, highlighting accomplishments from 2024 and looking ahead to the future. It noted that the previous four years had been the most productive in the LPO’s history. “Under the Biden-Harris Administration, the Office has announced 53 deals totaling approximately $107.57 billion in committed project investment––approximately $46.95 billion for 28 active conditional commitments and approximately $60.62 billion for 25 closed loans and loan guarantees,” it said.

Much of the funding for these investments came through the passage of the Bipartisan Infrastructure Law (BIL) and the Inflation Reduction Act (IRA). The LPO reported that U.S. clean energy investment more than doubled from $111 billion in 2020 to $236 billion in 2023, creating more than 400,000 clean energy jobs. The private sector notably led the way, enabled by U.S. government policy and partnerships.

“There were 55 deals that we got across the finish line,” Jigar Shah, director of the LPO from March 2021 to January 2025, said as a guest on The POWER Podcast, while noting there were possibly 200 more projects that were nearly supported. “They needed to do more work on their end to improve their business,” he explained. That might have meant de-risking a feedstock agreement or an off-take agreement, for example, or getting better-quality contractors for the construction of a project. “It was a lot of education work,” Shah said, “but I’m really proud of that work, because I think a lot of those companies, regardless of whether they used our office or not, were better for the interactions that they had with us.”

A Framework for Success

When asked about doling out funds, Shah viewed the term somewhat negatively. “As somebody who’s been an investor in my career, you don’t dole out money, because that’s how you lose money,” he explained. “What you do is you create a framework. And you tell people, ‘Hey, if you meet this framework, then we’ve got a loan for you, and if you don’t meet this framework, then we don’t have a loan for you.’”

Shah noted that the vast majority of the 400 to 500 companies the LPO worked closely with during his tenure didn’t quite meet the framework. Still, most of those that did have progressed smoothly. “Everything that started construction is still under construction, and so, they’re all going to be completed,” said Shah. “I think, all in all, the thesis worked. Certainly, there are many people who had a hard time raising equity or had a hard time getting to the finish line and final investment decision, but for those folks who got to final investment decision and started construction, I think they’re doing very well.”

Notable Projects

When asked which projects he was most excited about, Shah said, “All of them are equally exciting to me. I mean, that’s the beauty of the work I do.” He did, however, go on to mention several that stood out to him, specifically pointing to the Wabash, Montana Renewables, EVgo, and Holtec Palisades projects, all supported under the LPO’s Title 17 Clean Energy Financing Program, as particularly noteworthy.

Perhaps the most important of these from a power industry perspective was the Holtec Palisades endeavor. Valued at $1.52 billion, the loan guarantee will allow the upgrading and repowering of the Palisades nuclear plant in Covert, Michigan—a first in U.S. history that has spurred others to bring retired nuclear plants back online. “[It’s] super exciting to see our first nuclear plant being restarted, and as a result, the Constellation folks have decided to restart a nuclear reactor in Pennsylvania, and NextEra has decided to restart a nuclear reactor in Iowa. So, it’s great to have that catalytic impact,” said Shah.

Duration:00:25:06


187. TVA’s Clinch River Nuclear Power Project: Where Things Stand Today

4/1/2025
The Tennessee Valley Authority (TVA) has for many years been evaluating emerging nuclear technologies, including small modular reactors (SMRs), as part of technology innovation efforts aimed at developing the energy system of the future. TVA—the largest public power provider in the U.S., serving more than 10 million people in parts of seven states—currently operates seven reactors at three nuclear power plants: Browns Ferry, Sequoyah, and Watts Bar. Meanwhile, it has also been investing in the exploration of new nuclear technology by pursuing SMRs at the Clinch River Nuclear (CRN) site in Tennessee.

“TVA does have a very diverse energy portfolio, including the third-largest nuclear fleet [in the U.S.],” Greg Boerschig, TVA’s vice president for the Clinch River project, said as a guest on The POWER Podcast. “Our nuclear power plants provide about 40% of our electricity generated at TVA. So, this Clinch River project and our new nuclear program is building on a long history of excellence in nuclear at the Tennessee Valley.”

TVA completed an extensive site selection process before choosing the CRN site as the preferred location for its first SMR. The site was originally home to the Clinch River Breeder Reactor project in the early 1980s. Extensive grading and excavation disturbed approximately 240 acres before that project was terminated, after which the site was redressed and returned to an environmentally acceptable condition. The CRN property is approximately 1,200 acres of land located on the northern bank of the Clinch River arm of the Watts Bar Reservoir in Oak Ridge, Roane County, Tennessee.

The CRN site has a number of significant advantages, including two existing power lines that cross the site, easy access off Tennessee State Route 58, and the fact that it is a brownfield site previously disturbed and characterized as part of the Clinch River Breeder Reactor project. The Oak Ridge area also has a skilled local workforce, including many people familiar with the complexities of nuclear work. “The community acceptance here is really just phenomenal,” said Boerschig. “The community is very educated and very well informed.”

TVA began exploring advanced nuclear technologies in 2010. In 2016, it submitted an application to the Nuclear Regulatory Commission (NRC) for an Early Site Permit for one or more SMRs with a total combined generating capacity not to exceed 800 MW of electricity at the CRN site. In December 2019, TVA became the first utility in the nation to obtain approval of an Early Site Permit from the NRC to potentially construct and operate SMRs at the site. While the decision to build SMRs remains an ongoing discussion within the asset strategy for TVA’s future generation portfolio, significant investments have been made in the Clinch River project with the goal of moving it forward.

Ontario Power Generation (OPG) has a BWRX-300 project well underway at its Darlington New Nuclear Project site in Clarington, Ontario, with construction expected to be complete by the end of 2028. While OPG is developing its project in parallel with the design process, TVA expects to wait for more design maturity before launching its CRN project. “As far as the standard design is concerned, we’re at the same pace, but overall, their project is about two years in front of ours,” said Boerschig. “And that’s by design—they are the lead plant for this effort.”

In the meantime, two primary items top TVA’s to-do list. “Right now, the two biggest things that we have on our list are completing the standard design work, and then the construction permit application,” Boerschig said, noting the standard design is “somewhere north of 75% complete” and that TVA plans to submit the construction permit application “sometime around mid-year of this year.”

Duration:00:23:09


186. How Virtual Power Plants Enhance Grid Operations and Resilience

3/18/2025
A virtual power plant (VPP) is a network of decentralized, small- to medium-scale power generating units, flexible power consumers, and storage systems that are aggregated and operated as a single entity through sophisticated software and control systems. Unlike a traditional power plant that exists in a single physical location, a VPP is distributed across multiple locations but functions as a unified resource.

VPPs are important to power grid operations because they provide grid flexibility. VPPs help balance supply and demand on the grid by coordinating many smaller assets to respond quickly to fluctuations. This becomes increasingly important as more intermittent renewable energy sources—wind and solar—are added to the grid.

“A virtual power plant is essentially an aggregation of lots of different resources or assets from the grid,” Sally Jacquemin, vice president and general manager of Power & Utilities with AspenTech, said as a guest on The POWER Podcast. “As a whole, they have a bigger impact on the grid than any individual asset would have on its own. And so, you aggregate all these distributed energy resources and assets together to create a virtual power plant that can be dispatched to help balance the overall system supply to demand.”

VPPs provide a way to effectively integrate and manage distributed energy resources such as rooftop solar, small wind turbines, battery storage systems, electric vehicles, and demand response programs. VPPs can reduce strain on the grid during peak demand periods by strategically reducing consumption or increasing generation from distributed sources, helping to avoid blackouts and reducing the need for expensive peaker plants. Other benefits provided by VPPs include enhancing grid resilience, enabling smaller energy resources to participate in electricity markets that would otherwise be inaccessible to them individually, and reducing infrastructure costs by making better use of existing assets and reducing peak demand.

VPPs also enable consumers to become “prosumers”—both producers and consumers of energy—giving them more control over their energy use and potentially reducing their costs. “Virtual power plants are becoming important, not only for utilities, but also in the private sector,” Jacquemin explained. “Because of the commercial value of electricity rising and the market system rates, it’s now profitable for these virtual power plants in many markets due to the value of power that they can supply during these periods of low supply.”

AspenTech is a leading industrial software partner, with more than 60 locations worldwide. The company’s solutions address complex environments where it is critical to optimize the asset design, operation, and maintenance lifecycle. AspenTech says its Digital Grid Management solutions “enable the resilient, sustainable, and intelligent utility of the future.”

“At AspenTech Digital Grid Management, our software is in control rooms of utilities around the world,” said Jacquemin. “All utilities know they need to be investing in their digital solutions and modernizing their control room technology in order to meet the demands of the energy transition. So, utilities need to be focusing more time and more money to ensure that their software and their systems are capable of enabling that utility of the future.”
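The aggregation-and-dispatch idea Jacquemin describes can be sketched in a few lines: the VPP pools many small assets and answers a grid request as one resource. The asset names, capacities, and largest-first allocation below are hypothetical illustrations, not any vendor’s dispatch algorithm.

```python
# Minimal sketch of VPP dispatch: pool small assets, respond as one.
# Fleet composition and the greedy allocation rule are assumptions.

def dispatch(assets, request_kw):
    """Allocate a grid dispatch request across available assets.

    assets: list of (name, available_kw) tuples.
    Returns per-asset setpoints; largest assets are used first.
    """
    setpoints = {}
    remaining = request_kw
    for name, available_kw in sorted(assets, key=lambda a: -a[1]):
        take = min(available_kw, remaining)
        if take > 0:
            setpoints[name] = take
        remaining -= take
    return setpoints

fleet = [("rooftop_solar_A", 40.0), ("battery_B", 120.0), ("demand_response_C", 60.0)]
print(dispatch(fleet, 150.0))  # -> {'battery_B': 120.0, 'demand_response_C': 30.0}
```

To the grid operator, the 150 kW response looks like one plant ramping; internally it is a battery discharging and a demand-response program curtailing load, which is the essence of the “single entity” behavior described above.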

Duration:00:27:35


185. AI-Powered Energy Forecasting: How Accurate Predictions Could Save Your Power Company

3/11/2025
Net-demand energy forecasts are critical for competitive market participants, such as those in the Electric Reliability Council of Texas (ERCOT) and similar markets, for several key reasons. Accurate forecasting helps predict when supply-demand imbalances will create price spikes or crashes, allowing traders and generators to optimize their bidding strategies. It’s also important for asset optimization: power generators need to know when to commit resources to the market and at what price levels. Poor forecasting can lead to missed profit opportunities, or to operating assets when prices don’t cover costs.

Fortunately, artificial intelligence (AI) is now capable of producing highly accurate forecasts from the growing amount of meter and weather data that is available. The complex and robust calculations performed by these machine-learning algorithms are well beyond what human analysts are capable of, making advanced forecasting systems essential to utilities. They are also increasingly valuable to independent power producers (IPPs) and other energy traders making decisions about their positions in the wholesale markets.

Sean Kelly, co-founder and CEO of Amperon, a company that provides AI-powered forecasting solutions, said using an Excel spreadsheet as a forecasting tool was fine back in 2005, when he got started in the business as a power trader, but that type of system no longer works adequately today. “Now, we’re literally running at Amperon four to six models behind the scenes, with five different weather vendors that are running an ensemble each time,” Kelly said as a guest on The POWER Podcast. “So, as it gets more confusing, we’ve got to stay on top of that, and that’s where machine learning really kicks in.”

The consequences of being ill-prepared can be dire. Having early and accurate forecasts can mean the difference between a business surviving or failing. Effects from Winter Storm Uri offer a case in point. Normally, ERCOT wholesale prices fluctuate from about $20/MWh to $50/MWh. During Winter Storm Uri (Feb. 13–17, 2021), ERCOT set the wholesale electricity price at its cap of $9,000/MWh due to extreme demand and widespread generation failures caused by the storm. That price remained in effect for approximately 4.5 days (108 hours). This 180-fold price increase had devastating financial impacts across the Texas electricity market.

The financial fallout was severe. Several retail electricity providers went bankrupt, most notably Griddy Energy, which passed wholesale prices directly to customers, resulting in some receiving bills of more than $10,000 for just a few days of power.

“Our clients were very appreciative of the work we had at Amperon,” Kelly recalled. “We probably had a dozen or so clients at that time, and we told them on February 2 that this was coming,” he said. With that early warning, Kelly said Amperon’s clients were able to get out in front of the price swing and buy power at much lower rates. “Our forecasts go out 15 days; ERCOT’s forecasts only go out seven,” Kelly explained. “So, we told everyone, ‘Alert! Alert! This is coming!’ Dr. Mark Shipham, our in-house meteorologist, was screaming it from the rooftops. So, we had a lot of clients who bought $60 power per megawatt. So, think about buying 60s, and then your opportunity is 9,000. So, a lot of traders made money,” he said.

“All LSEs—load serving entities—still got hit extremely bad, but they got hit a lot less bad,” Kelly continued. “I remember one client saying: ‘I bought power at 60, then I bought it at 90, then I bought it at 130, then I bought it at 250, because you kept telling me that load was going up and that this was getting bad.’ And they’re like, ‘That is the best expensive power I’ve ever bought. I was able to keep my company as a retail energy provider.’ And, so, those are just some of the ways that these forecasts are extremely helpful.”
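The scale of the swing Kelly describes is easy to verify with back-of-envelope arithmetic using the figures above. The constant 1 MW of load is an assumption made purely for illustration.

```python
# Back-of-envelope arithmetic from the episode figures,
# assuming a constant 1 MW of load for simplicity.

cap_price = 9000      # $/MWh, ERCOT price cap during Winter Storm Uri
normal_price = 50     # $/MWh, upper end of typical ERCOT prices
hedge_price = 60      # $/MWh, what some forewarned clients locked in
hours = 108           # ~4.5 days at the cap

print(cap_price // normal_price)          # -> 180 (the "180-fold" increase)
print(cap_price * hours)                  # -> 972000, cost of riding the spot price, per MW
print((cap_price - hedge_price) * hours)  # -> 965520, savings per MW from hedging early
```

Nearly a million dollars of exposure per megawatt of load is why an extra week of forecast horizon (15 days versus ERCOT’s seven) was worth so much to Amperon’s clients.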

Duration:00:29:31


184. Nuclear Power Renaissance Underway in West Texas

3/3/2025
When you think of innovative advancements in nuclear power technology, places like the Idaho National Laboratory and the Massachusetts Institute of Technology probably come to mind. But today, some very exciting nuclear power development work is being done in West Texas—specifically, at Abilene Christian University (ACU). That’s where Natura Resources is working to construct a molten salt–cooled, liquid-fueled reactor (MSR). “We are in the process of building, most likely, the country’s first advanced nuclear reactor,” Doug Robison, founder and CEO of Natura Resources, said as a guest on The POWER Podcast.

Natura has taken an iterative, milestone-based approach to advanced reactor development and deployment, focused on efficiency and performance. This started in 2020, when the company brought together ACU’s NEXT Lab with Texas A&M University, the University of Texas at Austin, and the Georgia Institute of Technology to form the Natura Resources Research Alliance. In only four years, Natura and its partners developed a unique nuclear power system and successfully licensed the design. The U.S. Nuclear Regulatory Commission (NRC) issued a construction permit for deployment of the system at ACU last September.

Called the MSR-1, ACU’s unit will be a 1-MWth molten salt research reactor (MSRR). It is expected to provide valuable operational data to support Natura’s 100-MWe systems. It will also serve as a “world-class research tool” to train advanced reactor operators and educate students, the company said.

Natura is not only focused on its ACU project; it is also moving forward on commercial reactor projects. In February, the company announced the deployment of two advanced nuclear projects, also in Texas. These deployments, located in the Permian Basin and at Texas A&M University’s RELLIS Campus, represent significant strides in addressing energy and water needs in the state.

“Our first was a deployment of a Natura commercial reactor in the Permian Basin, which is where I spent my career. We’re partnering with a Texas produced-water consortium that was created by the legislature in 2021,” said Robison. One of the things that can be done with the high process heat from an MSR is desalination. “So, we’re going to be desalinating produced water and providing power—clean power—to the oil and gas industry for their operations in the Permian Basin,” said Robison.

Meanwhile, at Texas A&M’s RELLIS Campus, located about eight miles northwest of the university’s main campus in College Station, Texas, a Natura MSR-100 reactor will be deployed. The initiative is part of a broader project known as “The Energy Proving Ground,” which involves multiple nuclear reactor companies. The project aims to bring commercial-ready small modular reactors (SMRs) to the site, providing a reliable source of clean energy for the Electric Reliability Council of Texas (ERCOT).

Duration:00:35:13


183. Geothermal Energy Storage: The Clean Power Solution You Haven’t Heard Of

2/23/2025
Geothermal energy has been utilized by humans for millennia. While the first-ever use may be a mystery, we do know the Romans tapped into it in the first century for hot baths at Aquae Sulis (modern-day Bath, England). Since then, many other people and cultures have found ways to use the Earth’s underground heat to their benefit. Geothermal resources were used for district heating in France as far back as 1332. In 1904, Larderello, Italy, was home to the world’s first experiment in geothermal electricity generation, when five lightbulbs were lit. By 1913, the first commercial geothermal power plant was built there, which expanded to power the local railway system and nearby villages. However, one perhaps lesser-known geothermal concept revolves around energy storage. “It’s very much like pumped-storage hydropower, where you pump a lake up a mountain, but instead of going up a mountain, we’re putting that lake deep in the earth,” Cindy Taff, CEO of Sage Geosystems, explained as a guest on The POWER Podcast. Sage Geosystems’ technology utilizes knowledge gleaned from the oil and gas industry, where Taff spent more than 35 years as a Shell employee. “What we do is we drill a well. We’re targeting a very low-permeability formation, which is the opposite of what oil and gas is looking for, and quite frankly, it’s the opposite of what most geothermal technologies are looking for. That low permeability then allows you to place a fracture in that formation, and then operate that fracture like a balloon or like your lungs,” Taff explained. “When the demand is low, we use electricity to power an electric pump. We pump water into the fracture. We balloon that fracture open and store the water under pressure until a time of day that power demand peaks. Then, you open a valve at surface. That fracture is naturally going to close. It drives the water to surface. You put it through a Pelton turbine, which looks like a kid’s pinwheel. 
You spin the turbine, which spins the generator, and you generate electricity.” Unlike more traditional geothermal power generation systems that use hot water or steam extracted from underground geothermal reservoirs, Sage’s design uses what’s known as hot dry rock technology. To reach hot dry rock, drillers may have to go deeper to find desired formations, but these formations are much more common and less difficult to identify, which greatly reduces exploration risks. Taff said traditional geothermal energy developers face difficulties because they need to find three things underground: heat, water, and high-permeability formations. “The challenge is the exploration risk, or in other words, finding the resource where you’ve got the heat, the large body of water deep in the earth, as well as the permeability,” she said. “In hot dry rock geothermal, which is what we’re targeting, you’re looking only for that heat. We want a low-permeability formation, but again, that’s very prevalent.” Sage is now in the process of commissioning its first commercial energy storage project in Texas. “We’re testing the piping, and we’re function testing the generator and the Pelton turbine, so we’ll be operating that facility here in the next few weeks,” Taff said. Meanwhile, the company has also signed an agreement with the California Resources Corporation to establish a collaborative framework for pursuing commercial projects and joint funding opportunities related to subsurface energy storage and geothermal power generation in California. It also has ongoing district heating projects in Lithuania and Romania, and Taff said the U.S. Department of Defense has shown a lot of interest in the company’s geothermal technology. Additionally, Meta signed a contract for a 150-MW geothermal power generation system to supply one of its data centers.
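The discharge Taff describes can be sketched with the standard hydraulic power relation: the electric output of a turbine fed by pressurized water is roughly flow rate times pressure times efficiency. The flow rate, wellhead pressure, efficiency, and discharge window below are illustrative assumptions, not Sage Geosystems figures.

```python
# Hydraulic power recovered when the fracture closes and drives water
# through the Pelton turbine: P = Q * dp * eta. All numbers here are
# illustrative assumptions, not figures from the company.

def hydraulic_power_mw(flow_m3_s: float, pressure_pa: float,
                       efficiency: float) -> float:
    """Electric output of a turbine fed by pressurized water, in MW."""
    return flow_m3_s * pressure_pa * efficiency / 1e6

power_mw = hydraulic_power_mw(0.15, 20e6, 0.80)  # ~2.4 MW at assumed conditions
energy_mwh = power_mw * 6.0                      # over an assumed 6-hour discharge
print(f"{power_mw:.1f} MW for 6 h = {energy_mwh:.1f} MWh")
```

Even at these modest assumed numbers, a single well pair delivers grid-scale peaking energy, which is why the comparison to pumped-storage hydropower is apt.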

Duration:00:22:54


182. Space-Based Solar Power: The Future of 24/7 Clean Energy Generation

2/17/2025
Imagine a field of solar panels floating silently in the endless day of Earth’s orbit. Unlike their terrestrial cousins, these space-based solar arrays never face nighttime, clouds, or atmospheric interference. Instead, they bathe in constant, intense sunlight, converting this endless stream of energy into electricity with remarkable efficiency. But the true innovation lies in how this power is transmitted to power grids on Earth. The electricity generated in space is converted into invisible beams of microwaves or laser light that pierce through the atmosphere with minimal losses. These beams are precisely aimed at receiving stations on Earth—collections of antennas or receivers known as “rectennas” that capture and reconvert the energy back into electricity that can be supplied to the power grid. This isn’t science fiction—it’s space-based solar power (SBSP), a technology that could revolutionize how clean energy is generated and distributed. While conventional solar panels on Earth can only produce power during daylight hours and are at the mercy of weather conditions, orbital solar arrays could beam massive amounts of clean energy to Earth 24 hours a day, 365 days a year, potentially transforming the global energy landscape.

Duration:00:43:58


181. A New Paradigm for Power Grid Operation

2/9/2025
Power grids operate like an intricate ballet of energy generation and consumption that must remain perfectly balanced at all times. The grid maintains a steady frequency (60 Hz in North America and 50 Hz in many other regions) by matching power generation to demand in real time. Traditional power plants with large rotating turbines and generators play a crucial role in this balance through their mechanical inertia—the natural tendency of these massive spinning machines to resist changes in their rotational speed. This inertia acts as a natural stabilizer for the grid. When there’s a sudden change in power demand or generation, such as a large factory turning on or a generator failing, the rotational energy stored in these spinning masses automatically helps cushion the impact. The machines momentarily speed up or slow down slightly, giving grid operators precious seconds to respond and adjust other power sources. However, as we transition to renewable energy sources like solar and wind that don’t have this natural mechanical inertia, maintaining grid stability becomes more challenging. This is why grid operators are increasingly focusing on technologies like synthetic inertia from wind turbines, battery storage systems, and advanced control systems to replicate the stabilizing effects traditionally provided by conventional power plants. Alex Boyd, CEO of PSC, a global specialist consulting firm working in the areas of power systems and control systems engineering, believes the importance of inertia will lessen, and probably sooner than most people think. In fact, he suggested stability based on physical inertia will soon be the least-preferred approach. Boyd recognizes that his view, which was expressed while he was a guest on The POWER Podcast, is potentially controversial, but there is a sound basis behind his prediction. Power electronics-based systems utilize inverter-based resources, such as wind, solar, and batteries. 
These systems can detect and respond to frequency deviations almost instantaneously using fast frequency response mechanisms. This actually allows for much faster stabilization compared to mechanical inertia. Power electronics reduce the need for traditional inertia by enabling precise control of grid parameters like frequency and voltage. While they decrease the available physical inertia, they also decrease the amount of inertia required for stability through advanced control strategies. Virtual synchronous generators and advanced inverters can emulate inertia dynamically, offering tunable responses that adapt to grid conditions. For example, adaptive inertia schemes provide high initial inertia to absorb faults but reduce it over time to prevent oscillations. Power electronic systems address stability issues across a wide range of frequencies and timescales, including harmonic stability and voltage regulation. This is achieved through multi-timescale modeling and control techniques that are not possible with purely mechanical systems. Inverter-based resources allow for distributed coordination of grid services, such as frequency regulation and voltage support, enabling more decentralized grid operation compared to centralized inertia-centric systems. Power electronic systems are essential for grids with a high penetration of renewable energy sources, which lack inherent mechanical inertia. These systems ensure stability while facilitating the transition to low-carbon energy by emulating or replacing traditional generator functions. “I do foresee a time in the not-too-distant future where we’ll be thinking about how do we actually design a system so that we don’t need to be impacted so much by the physical inertia, because it’s preventing us from doing what we want to do,” said Boyd. “I think that time is coming. There will be a lot of challenges to overcome, and there’ll be a lot of learning that needs to be done, but I do think the time is coming.”
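The interplay between physical inertia and fast frequency response described above is governed by the per-unit swing equation, 2H/f0 · df/dt = ΔP. The sketch below integrates it for a sudden loss of generation, comparing an inertia-only grid against one where inverters inject power in proportion to the frequency deviation. The inertia constant, loss size, and droop-style gain are illustrative assumptions, not values from PSC.

```python
# Minimal swing-equation sketch: 2H/f0 * df/dt = delta_P (per unit).
# Compares frequency nadir with and without inverter fast frequency
# response (FFR). All parameter values are illustrative assumptions.

F0 = 60.0  # nominal grid frequency, Hz

def simulate(h_seconds: float, ffr_gain: float, loss_pu: float = 0.05,
             dt: float = 0.01, t_end: float = 5.0) -> float:
    """Euler-integrate frequency after a sudden generation loss and
    return the frequency nadir (lowest point reached), in Hz."""
    f = F0
    nadir = F0
    for _ in range(int(t_end / dt)):
        dev_pu = (f - F0) / F0
        # Inverter FFR: injection proportional to the frequency
        # deviation, a droop-like control acting almost instantly.
        ffr_pu = -ffr_gain * dev_pu
        imbalance_pu = -loss_pu + ffr_pu
        dfdt = imbalance_pu * F0 / (2.0 * h_seconds)
        f += dfdt * dt
        nadir = min(nadir, f)
    return nadir

print(simulate(h_seconds=5.0, ffr_gain=0.0))   # inertia only: deep sag
print(simulate(h_seconds=5.0, ffr_gain=20.0))  # with FFR: shallow sag
```

With the response gain set to zero the frequency falls steadily until operators intervene, while the FFR case settles within a fraction of a hertz—illustrating Boyd’s point that fast-acting power electronics can substitute for spinning mass.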

Duration:00:41:18


180. Data Centers Consume 3% of Energy in Europe: Understand Geographic Hotspots and How AI Is Reshaping Demand

1/30/2025
The rapid rise of data centers has put many power industry demand forecasters on edge. Some predict the power-hungry nature of the facilities will quickly create problems for utilities and the grid. ICIS, a data analytics provider, calculates that in 2024, demand from data centers in Europe accounted for 96 TWh, or 3.1% of total power demand. “Now, you could say it’s not a lot—3%—it’s just a marginal size, but I’m going to spice it up a bit with two additional layers,” Matteo Mazzoni, director of Energy Analytics at ICIS, said as a guest on The POWER Podcast. “One is: that power demand is very consolidated in just a small subset of countries. So, five countries account for over 60% of that European power demand. And within those five countries, which are the usual suspects in terms of Germany, France, the UK, Ireland, and the Netherlands, half of that consumption is located in the FLAP-D market, which sounds like a fancy new coffee, but in reality is just five big cities: Frankfurt, London, Amsterdam, Paris, and Dublin.” Predicting where and how data center demand will grow in the future is challenging, however, especially when looking out more than a few years. “What we’ve tried to do with our research is to divide it into two main time frames,” Mazzoni explained. “The next three to five years, where we see our forecast being relatively accurate because we looked at the development of new data centers, where they are being built, and all the information that is currently available. And, then, what might happen past 2030, which is a little bit more uncertain given how fast technology is developing and all that is happening on the AI [artificial intelligence] front.” Based on its research, ICIS expects European data center power demand to grow 75% by 2030, to 168 TWh. “It’s going to be a lot of the same,” Mazzoni predicted. 
“So, those big centers—those big cities—are still set to attract most of the additional data center consumption, but we see the emergence of also new interesting markets, like the Nordics and to a certain extent also southern Europe with Iberia [especially Spain] being an interesting market.” Yet, there is still a fair amount of uncertainty around demand projections. Advances in liquid cooling methods will likely reduce data center power usage. That’s because liquid cooling offers more efficient heat dissipation, which translates directly into lower electricity consumption. Additionally, there are opportunities for further improvement in power usage effectiveness (PUE), which is a widely used data center energy efficiency metric. At the global level, the average PUE has decreased from 2.5 in 2007 to a current average of 1.56, according to the ICIS report. However, new facilities consistently achieve a PUE of 1.3 and sometimes much better. Google, which has many state-of-the-art and highly efficient data centers, reported a global average PUE of 1.09 for its facilities over the last year. Said Mazzoni, “An expert in the field told us when we were doing our research, when tech moves out of the equation and you have energy engineers stepping in, you start to see that a lot of efficiency improvements will come, and demand will inevitably fall.” Thus, data center load growth projections should be taken with a grain of salt. “The forecast that we have beyond 2030 will need to be revised,” Mazzoni predicted. “If we look at the history of the past 20 years—all analysts and all forecasts around load growth—they all overshoot what eventually happened. The first time it happened when the internet arrived—there was obviously great expectations—and then EVs, electric vehicles, and then heat pumps. But if we look at, for example, last year—2024—European power demand was up by 1.3%, U.S. power demand was up by 1.8%, and probably weather was the main driver behind that growth.”
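PUE divides a facility’s total energy draw by the energy delivered to the IT equipment itself, so the averages quoted above translate directly into overhead energy. A minimal sketch using the figures in the report and Google’s disclosed fleet average (the 10-TWh IT load is an illustrative assumption used only to show the overhead each PUE implies):

```python
# PUE (power usage effectiveness) = total facility energy / IT energy.
# PUE values are those quoted above; the IT load is an assumption.

def facility_energy_twh(it_energy_twh: float, pue: float) -> float:
    """Total energy a facility draws for a given IT load at a given PUE."""
    return it_energy_twh * pue

IT_LOAD_TWH = 10.0  # assumed IT load, for illustration only

for label, pue in [("2007 average", 2.50), ("current average", 1.56),
                   ("new builds", 1.30), ("Google fleet", 1.09)]:
    total = facility_energy_twh(IT_LOAD_TWH, pue)
    print(f"{label}: PUE {pue:.2f} -> {total:.1f} TWh total, "
          f"{total - IT_LOAD_TWH:.1f} TWh overhead")
```

The same IT workload at a PUE of 1.09 instead of 2.50 cuts the non-IT overhead by more than 90%, which is the efficiency headroom Mazzoni’s expert was pointing to.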

Duration:00:30:59


179. District Energy Systems: The Invisible Giant of Urban Efficiency

1/21/2025
District energy systems employ a centralized facility to supply heating, cooling, and sometimes electricity for multiple buildings in an area through a largely underground, mostly unseen network of pipes. When district energy systems are utilized, individual buildings do not need their own boilers, chillers, and cooling towers. This offers a number of benefits to building owners and tenants. Among them are:

• Energy Efficiency. Centralized heating/cooling is more efficient than individual building systems, reducing energy use by 30% to 50% in some cases.
• Cost Savings. Lower operations and maintenance costs through economies of scale and reduced equipment needs per building.
• Reduced Environmental Impacts. Emissions are lessened and renewable energy resources can often be more easily integrated.
• Reliability. A more resilient energy supply is often provided, with redundant systems and professional operation.
• Space Optimization. Buildings need less mechanical equipment, freeing up valuable space.

The concept is far from new. In fact, Birdsill Holly is credited with deploying the U.S.’s first district energy system in Lockport, New York, in 1877, and many other cities incorporated district systems into their infrastructure soon thereafter. While district energy systems are particularly effective in dense urban areas, they’re also widely used at hospitals and at other large campuses around the world. “There’s over 600 operating district energy systems in the U.S., and that’s in cities, also on college and university campuses, healthcare, military bases, airports, pharma, even our sort of newer industries like Meta, Apple, Google, their campuses are utilizing district energy, because, frankly, there’s economies of scale,” Rob Thornton, president and CEO of the International District Energy Association (IDEA), said as a guest on The POWER Podcast. 
“District energy is actually quite ubiquitous,” said Thornton, noting that systems are common in Canada, throughout Europe, in the Middle East, and many other parts of the world. “But, you know, not that well-known. We’re not visible. Basically, the assets are largely underground, and so we don’t necessarily have the visibility opportunity of like wind turbines or solar panels,” he said. “So, we quietly do our work. But, I would guess that for the listeners of this podcast, if they went to a college or university in North America, I bet, eight out of 10 lived in a dorm that was supplied by a district heating system. So, it’s really a lot more common than people realize,” said Thornton.

Duration:00:51:20


178. Why LVOE May Be a Better Decision-Making Tool Than LCOE for Power Companies

12/18/2024
Most POWER readers are probably familiar with levelized cost of energy (LCOE) and levelized value of energy (LVOE) as metrics used to help evaluate potential power plant investment options. LCOE measures the average net present cost of electricity generation over a facility’s lifetime. It includes capital costs, fuel costs, operation and maintenance (O&M) costs, financing costs, expected capacity factor, and project lifetime. Meanwhile, LVOE goes beyond LCOE by considering the actual value the power provides to the grid, including time of generation (peak vs. off-peak), location value, grid integration costs and benefits, contributions to system reliability, environmental attributes, and capacity value. Some of the key differences stem from the perspective and market context each provides. LCOE, for example, focuses on pure cost comparison between technologies, while LVOE evaluates actual worth to the power system. Notably, LCOE ignores when and where power is generated; whereas, LVOE accounts for temporal and locational value variations. Concerning system integration, LCOE treats all generation as equally valuable, while LVOE considers grid integration costs and system needs. “Things like levelized cost of energy or capacity factors are probably the wrong measure to use in many of these markets,” Karl Meeusen, director of Markets, Legislative, and Regulatory Policy with Wärtsilä North America, said as a guest on The POWER Podcast. “Instead, I think one of the better metrics to start looking at and using more deeply is what we call the levelized value of energy, and that’s really looking at what the prices at the location where you’re trying to build that resource are going to be.” Wärtsilä is a company headquartered in Finland that provides innovative technologies and lifecycle solutions for the marine and energy markets. Among its main offerings are reciprocating engines that can operate on a variety of fuels for use in electric power generating plants. 
Wärtsilä has modeled different power systems in almost 200 markets around the world. It says the data consistently shows that a small number of grid-balancing gas engines in a system can provide the balancing and flexibility to enable renewables to flourish—all while maintaining reliable, resilient, and affordable electricity. Meeusen noted that a lot of the models find engines offer greater value than other technologies on the system because of their flexibility, even though they may operate at lower capacity factors. Having the ability to turn on and off allows owners to capture high price intervals, where prices spike because of scarcity or ramp shortages, while avoiding negative prices by turning off as prices start to dip and drop lower. “That levelized value is one of the things that we think is really important going forward,” he said. “I think what a lot of models and planning scenarios miss when they look at something like LCOE—and they’re looking at a single resource added into the system—is how it fits within the system, and what does it do to the value of the rest of their portfolio?” Meeusen explained. “I call this: thinking about the cannibalistic costs. If I look at an LCOE with a capacity factor for a combined cycle resource, and don’t consider how that might impact or increase the curtailment of renewable energy—no cost renewable energy—I don’t really necessarily see the true cost of some of those larger, inflexible generators on the system. And, so, when we think about that, we really want to make sure that what we’re covering and capturing is the true value that a generator has in a portfolio, not just as a standalone resource.”
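The distinction Meeusen draws can be made concrete with a toy calculation: LCOE divides discounted lifetime costs by discounted lifetime generation, while LVOE weights each unit of generation by the price captured at the time (and place) it is produced. All inputs below are illustrative assumptions, not Wärtsilä model outputs.

```python
# Toy LCOE vs. LVOE comparison. LCOE is cost per MWh regardless of when
# power is made; LVOE is the generation-weighted captured price.
# All numbers are illustrative assumptions.

def lcoe(capex: float, annual_cost: float, annual_mwh: float,
         years: int, rate: float) -> float:
    """Levelized cost of energy, $/MWh (discounted costs / discounted MWh)."""
    disc_costs = capex + sum(annual_cost / (1 + rate) ** t
                             for t in range(1, years + 1))
    disc_energy = sum(annual_mwh / (1 + rate) ** t
                      for t in range(1, years + 1))
    return disc_costs / disc_energy

def lvoe(gen_mwh: list, price: list) -> float:
    """Levelized value of energy, $/MWh: revenue divided by generation."""
    revenue = sum(g * p for g, p in zip(gen_mwh, price))
    return revenue / sum(gen_mwh)

prices   = [20, 20, 20, 20, 200, 250, 250, 200]  # $/MWh across 8 periods
baseload = [10] * 8                              # runs flat out
peaker   = [0, 0, 0, 0, 10, 10, 10, 10]          # runs only at the peaks

print(lvoe(baseload, prices))  # 122.5 -- equals the simple price average
print(lvoe(peaker, prices))    # 225.0 -- captures only the price spikes
```

The flexible unit earns nearly twice the value per MWh despite a 50% capacity factor, which is exactly why a low-LCOE, always-on resource can look worse once its effect on prices and curtailment is counted.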

Duration:00:33:33


177. How Nuclear Power Could Help Decarbonize Industrial Steam Needs

12/11/2024
Steam is used for a wide variety of critical processes across many industrial sectors. For example, pulp and paper facilities use steam to power paper machines, dry paper and wood products, and provide heat for chemical recovery processes. Steam is used by metal and mining companies, as well as in the food and beverage industry, petroleum refining, pharmaceutical manufacturing, textile production, and many other industrial processes. “About 20% of global carbon emissions come from the industrial heat sector, and virtually all of that industrial heat today is produced by burning hydrocarbons—coal and natural gas—and emitting carbon into the atmosphere,” Clay Sell, CEO of X-energy, said as a guest on The POWER Podcast. “With our technology, we have the opportunity to replace hydrocarbons and use nuclear-generated carbon-free steam to dramatically decarbonize these so-called hard-to-decarbonize sectors.” X-energy is a nuclear reactor and fuel design engineering company. It is developing Generation-IV high-temperature gas-cooled nuclear reactors and what’s known as TRISO-X fuel to power them. The company’s Xe-100 small modular reactor (SMR) is an 80-MWe reactor that can be scaled into a four-pack (320-MWe power plant) that can grow even larger as needed. “The most significant advantages that we have over large-scale traditional nuclear power plants is the evolution of our technology, our safety case, and the smaller, more simplified designs that can be built with much less time and much less money,” Sell said. “We’re a high-temperature gas-cooled reactor using a TRISO fuel form—that’s ceramic, encapsulated fuel in a round pebble that flows through the reactor like gumballs through a gumball machine.” The Xe-100 design’s intrinsic safety makes it especially unique. “This is a plant that cannot melt down under any scenario that one could imagine affecting the plant. So, that extraordinary safety case allows us to operate on a very small footprint,” said Sell. 
The simplified design has fewer subsystems and components, less concrete, less steel, and less equipment than traditional nuclear power plants. As noted previously, X-energy’s SMR is capable of producing high-quality steam, which is especially attractive for use in industrial processes. As such, Dow Inc., one of the world’s leading materials science companies, has agreed to deploy the first Xe-100 unit at its Union Carbide Corp. Seadrift Operations, a sprawling chemical materials manufacturing site in Seadrift, Calhoun County, Texas. “Our first project is going to be deployed in a public-private partnership with the U.S. government and Dow Inc., the large chemical manufacturer, at a site southwest of Houston, Texas, that will come online around the end of this decade,” Sell reported. Currently, X-energy is in the final stages of its design effort. Once complete, the next step will be to submit a construction permit application to the Nuclear Regulatory Commission (NRC). If all goes according to plan, the application should be approved by the NRC in early 2027, which would allow construction to start around that time. “We anticipate construction on the plant to be about a three- to three-and-a-half-year process, which will then bring it online in the early 2030s,” Sell explained. Beyond that, X-energy has an agreement to supply Amazon with 5 GW of new SMR projects (64 units) by 2039, starting with an initial four-unit 320-MWe Xe-100 plant with regional utility Energy Northwest in central Washington. Sell believes the deal positions X-energy to quickly apply lessons learned from its first-of-a-kind project with Dow, replicate and repeat the effort to achieve scale, and reach a favorable nth-of-a-kind cost structure faster than anyone else in the SMR market today. Said Sell, “When we imagine a future of a decarbonized economy with reliable power supporting dramatic growth at a reasonable cost, I believe X-energy is going to be a central technology to that future.”

Duration:00:32:29


176. Hydrogen Use Cases for the Power Industry

12/3/2024
Hydrogen is becoming increasingly important to the electric power generation industry for several reasons. One is that hydrogen offers a promising pathway to decarbonize the power sector. When used in fuel cells or burned for electricity generation, hydrogen produces only water vapor as a byproduct, making it a zero-emission energy source. This is crucial for meeting global climate change mitigation goals and reducing greenhouse gas emissions from power generation. Hydrogen also provides a potential energy storage solution, which is critical for integrating solar and wind energy into the power grid. These renewable resources are intermittent—sometimes they produce more energy than the grid needs, while at other times, their output may disappear almost entirely. Hydrogen can be produced through electrolysis during periods of excess renewable energy production, then stored and used to generate electricity when needed. This helps address the challenge of matching energy supply with demand. Hydrogen is a flexible and versatile fuel that can be used in fuel cells, gas turbines, or internal combustion engines. It can also be blended with natural gas to accommodate existing equipment limitations. This wide range of options makes hydrogen a great backup fuel for microgrids and other systems that require excellent reliability. “We’ve actually seen quite a bit of interest in that,” Tim Lebrecht, industry manager for Energy Transition and the Chemicals Process Industries with Air Products, said as a guest on The POWER Podcast. Lebrecht noted that hydrogen can serve as a primary fuel in microgrids, or as a backup or supplemental source. “Think of a peaking unit that as temperature goes up during the day, your pricing for power could also be going up,” Lebrecht explained. 
“At a point, hydrogen may be a peak shave–type situation, where you then maximize the power from the grid, but then you’re using hydrogen as a supplement during that time period.” Another hydrogen use case revolves around data centers. “Data centers, specifically, have been really interested in: ‘How do we use hydrogen as a backup type material?’ ” Lebrecht said. Air Products is the world’s leading supplier of hydrogen with more than 65 years of experience in hydrogen production, storage, distribution, and dispensing. Lebrecht noted that his team regularly works with original equipment manufacturers (OEMs); engineering, procurement, and construction (EPC) companies; and other firms to collaborate on solutions involving hydrogen. “We’ve got a great history,” he said. “My team has a great amount of experience.”
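The storage pathway described above—surplus electricity to electrolysis, storage, then reconversion to power—is a chain of efficiencies, and multiplying them gives the round-trip figure that determines how much surplus energy actually comes back. The efficiency values below are generic illustrative assumptions, not Air Products data.

```python
# Back-of-envelope round-trip sketch for power-to-hydrogen-to-power.
# All efficiency figures are illustrative assumptions.

ELECTROLYZER_EFF = 0.70   # electricity -> hydrogen (assumed)
STORAGE_EFF      = 0.95   # compression/storage losses (assumed)
RECONVERSION_EFF = 0.55   # hydrogen -> electricity (assumed)

def round_trip_mwh(surplus_mwh: float) -> float:
    """Electricity recovered from surplus power routed through hydrogen."""
    return surplus_mwh * ELECTROLYZER_EFF * STORAGE_EFF * RECONVERSION_EFF

recovered = round_trip_mwh(100.0)
print(f"100 MWh surplus -> {recovered:.1f} MWh back to the grid")
```

At these assumed values only about a third of the surplus energy returns as electricity, which is why hydrogen competes best where long-duration storage, fuel flexibility, or backup reliability matters more than round-trip efficiency.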

Duration:00:31:19


175. Communication Is Key to Successful Power Projects

11/20/2024
Power plant construction and retrofit projects come in all shapes and sizes, but they all generally have at least one thing in common: complexity. There are usually a lot of moving pieces that must be managed. This can include sourcing the right materials and components, getting equipment delivered to the site at the right time, finding qualified contractors, and overseeing handoffs between working groups. Getting a job done on time and on budget is not as easy as some people might think. “It absolutely can be difficult and a lot of things to consider,” Kevin Slepicka, vice president of Sales for Heat Recovery Boilers at Rentech Boiler Systems, said as a guest on The POWER Podcast. “You’ve got to make sure that communication is ongoing between your suppliers and the end user.” Rentech is a leading manufacturer of boiler systems including package boilers, waste heat boilers, and heat recovery steam generators (HRSGs). Rentech’s fabrication facilities are in Abilene, Texas. “We have three shops,” Slepicka explained. “There’s 197,000 square feet of manufacturing space under roof. We’ve got over 100 tons of lift capability with cranes, and we can bring in other cranes for our heavier lifts. Our properties are located on 72 acres, so we have a lot of room for staging equipment, storing equipment, if customers aren’t ready to take delivery at the time the units are done.” Moving large boilers from Texas to sites around the country and other parts of the world can be difficult, which is another reason why good communication is imperative. “Shipping is a major consideration on how the unit is constructed, how much is going to be built in the facility, and how large we can ship. So, it really goes hand in hand with the design of the boiler,” Slepicka said. “It really is important that we work with our logistics people and work with our partner companies that do our transportation for us.” Communication with customers on potential future needs is also important. 
Slepicka said knowing that a retrofit may be required down the road to account for a new environmental regulation, for example, could allow a boiler system to be designed with space to accommodate changes. This could save a lot of money and headaches in the long run. “That’s where you’ve got to be able to work with the customer—make sure you understand the space available and make sure that the unit’s going to work properly,” he said. Slepicka said Rentech had a customer recently that faced new formaldehyde restrictions and needed its HRSG system modified. “Luckily, we had the space in the unit where that catalyst could be installed in the right location to address the concern they had, so it was a relatively easy retrofit for them to make.” If the prospect had not been considered up front, the cost and complexity could have been much greater.

Duration:00:20:07


174. Kingston Coal Ash Spill: Cleanup Workers Were the Unfortunate Losers

11/5/2024
On Dec. 22, 2008, a major dike failure occurred on the north slopes of the ash pond at the Tennessee Valley Authority’s (TVA’s) Kingston Fossil Plant. The failure released approximately 5.4 million cubic yards of coal ash onto adjacent land and into the Emory River. The Kingston spill is considered one of the most significant and costly events in TVA history. According to a project completion fact sheet issued jointly by the U.S. Environmental Protection Agency (EPA) and the TVA in December 2014, the cleanup took about six years, required a total of 6.7 million man-hours, and cost $1.178 billion. TVA hired various contractors to perform the post-spill cleanup, removal, and recovery of fly ash at the Kingston site. Perhaps most notable among them was Jacobs Engineering. TVA hired Jacobs in 2009 specifically to provide program management services to assist with the cleanup. Jacobs claims to have “a strong track record of safely managing some of the world’s most complex engineering and environmental challenges.” It has noted that TVA and the EPA’s on-scene coordinator oversaw the worker safety programs for the Kingston cleanup, approving all actions in consultation with the Tennessee Department of Environment and Conservation. Jacobs said TVA maintained rigorous safety standards throughout the cleanup, and that it worked closely with TVA in following and supporting those standards. Jared Sullivan, author of Valley So Low: One Lawyer’s Fight for Justice in the Wake of America’s Great Coal Catastrophe, studied the Kingston cleanup and followed some of the plaintiffs for more than five years while writing his book. As a guest on The POWER Podcast, Sullivan suggested many of the workers felt fortunate to be employed on the Kingston cleanup. The U.S. economy was not thriving at the time; housing and stock markets were in a funk, and unemployment was relatively high. 
“These workers—these 900 men and women—this disaster is kind of a godsend for them as far as their employment goes, you know. A lot of them needed work. Many of them were very, very pleased to get this call,” Sullivan explained. “The trouble is that after a year or so of working on this job site—of scooping up and hauling off this coal ash muck from the landscape, also from the river—they start feeling really, really terribly,” he said. “At first they kind of write off their symptoms as overworking themselves. In many cases, these workers were working 14-hour shifts and just pushing themselves really, really hard because there’s a lot of overtime opportunities. So, that was good for them—that they could work so much, that this mess was so big,” Sullivan continued. But after a while, some workers start blacking out in their cars, having nosebleeds, and coughing up black mucus, and it becomes clear to them that the coal ash is the cause. Jacobs reports several contractors’ workers at the Kingston site filed workers’ compensation claims against their employers in 2013. These workers alleged that conditions at the site caused them to experience various health issues that were a result of excessive exposure to coal ash. Jacobs said many of these claims were found to be unsubstantiated and were rejected. Then, many of the same workers filed lawsuits against Jacobs, even though they may not have been Jacobs employees. Jacobs says it stands by its safety record, and that it did not cause any injuries to the workers. “The case resolved early last year, after almost 10 years of litigation,” Sullivan said. “Jacobs Engineering and the plaintiffs—230 of them—finally settled the case. $77.5 million for 230 plaintiffs. 
So, it works out to a couple hundred thousand dollars each for the plaintiffs after the lawyers take their fees—so, not tons of money.” In a statement, Jacobs said, “To avoid further litigation, the parties chose to enter into an agreement to resolve the cases.”

Duration:00:33:36


173. Why Data Center Developers Should Think ‘Power First’

10/29/2024
You don’t need me to tell you how artificial intelligence (AI) is impacting the power grid; you can just ask AI. Claude, an AI assistant created by Anthropic, told POWER, “AI training and inference are driving unprecedented demand for data center capacity, particularly due to large language models and other compute-intensive AI workloads.” It also said, “AI servers, especially those with multiple GPUs [graphics processing units], require significantly more power per rack than traditional servers—often 2–4x higher power density.”

So, what does that mean for power grid operators and electricity suppliers? Claude said there could be several effects, including local grid strain in AI hub regions, the need for upgraded transmission infrastructure, higher baseline power consumption, and potential grid stability issues during peak usage periods. Notably, it said AI data centers tend to cluster in specific regions with favorable power costs and regulations, creating “hotspots” of extreme power demand.

Sheldon Kimber, founder and CEO of Intersect Power, a clean energy company that develops, owns, and operates a portfolio of 2.2 GW of solar PV and 2.4 GWh of storage in operation or under construction, understands the challenges data centers present for the grid. As a guest on The POWER Podcast, Kimber suggested the only way to meet the massive increase in power demand coming from data centers is with scalable behind-the-meter solutions. “These assets may still touch the grid—they may still have some reliance on the grid—but they’re going to have to bring with them an enormous amount of behind-the-meter generation and storage and other things to make sure that they are flexible enough that the grid can integrate them without creating such a strain on the grid, on ratepayers, and on the utilities that service them,” Kimber said. Yet, data center developers have not traditionally kept power top of mind.
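To put the “2–4x higher power density” claim in grid terms, here is a rough back-of-envelope sketch. The rack wattage, rack count, and PUE (power usage effectiveness, the overhead multiplier for cooling and power distribution) are illustrative assumptions, not figures from the episode:

```python
# Back-of-envelope estimate of total facility power demand.
# Assumed inputs (not from the episode): a ~10 kW traditional rack,
# 1,000 racks, and a PUE of 1.3 for cooling/distribution overhead.
TRADITIONAL_RACK_KW = 10.0
AI_DENSITY_MULTIPLIERS = (2, 4)   # the quoted "2-4x higher power density"

def facility_demand_mw(racks: int, kw_per_rack: float, pue: float = 1.3) -> float:
    """Total facility demand in MW: IT load scaled by the PUE overhead factor."""
    return racks * kw_per_rack * pue / 1000.0

racks = 1000
traditional = facility_demand_mw(racks, TRADITIONAL_RACK_KW)
ai_low = facility_demand_mw(racks, TRADITIONAL_RACK_KW * AI_DENSITY_MULTIPLIERS[0])
ai_high = facility_demand_mw(racks, TRADITIONAL_RACK_KW * AI_DENSITY_MULTIPLIERS[1])

print(f"Traditional data center: {traditional:.0f} MW")
print(f"AI data center (2x-4x density): {ai_low:.0f}-{ai_high:.0f} MW")
```

Under these assumed numbers, the same rack footprint jumps from roughly 13 MW to somewhere in the 26–52 MW range, which is why a single AI campus can strain a local utility that planned for conventional loads.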
“The data center market to date has been more of a real estate development game,” Kimber explained. “How close to a labor pool are you? What does it look like on the fiber side? What does the land look like?” He said electric power service was certainly part of the equation, but it was more like part of a “balanced breakfast of real estate criteria,” rather than a top priority for siting a data center. In today’s environment, that needs to change.

Kimber said Intersect Power has been talking to data center companies for at least three years, pitching them on the idea of siting data centers behind the meter at some of his projects. The response has been lukewarm at best. Most of the companies want to keep their data centers in already well-established hubs, such as northern Virginia; Santa Clara, California; or the Columbia River Gorge region in Oregon. Kimber’s comeback has been, “Tell us when you’re ready to site for ‘Power First.’ ”

What “Power First” means is simple: start with power, and the availability of power, as the first criterion, and screen out all the sites that don’t have it. “To date, data center development that was not ‘Power First’ has really been focused on: ‘What does the plug look like?’ ” Kimber said. In other words: How is the developer connecting the data center to the power grid—or plugging in? Developers basically assumed that if they could get connected to the grid, the local utility would find a way to supply the electricity needed. However, it is getting harder and harder for utilities to provide what developers are asking for.

“The realization that the grid just isn’t going to be able to provide power in most of the places that people want it is now causing a lot of data center customers to re-evaluate the need to move from where they are.
And when they’re making those moves, obviously, the first thing that’s coming to mind is: ‘Well, if I’m going to have to move anyway, I might as well move to where the binding constraint, which is power, is no longer a constraint,’ ” he said.

Duration:00:42:10