
The POWER Podcast

Business & Economics Podcasts

The POWER Podcast provides listeners with insight into the latest news and technology that is poised to affect the power industry. POWER’s Executive Editor Aaron Larson conducts interviews with leading industry experts and gets updates from insiders at power-related conferences and events held around the world.










167. Shifting from Coal to Gas: One Co-op’s Award-Winning Journey

In 2018, Cooperative Energy, a generation and transmission co-op headquartered in Hattiesburg, Mississippi, had an issue to deal with. Several years earlier, it had joined the Midcontinent Independent System Operator (MISO), giving the power provider access to a competitive market. However, Cooperative Energy’s R.D. Morrow Sr. Generating Station, a 400-MW two-unit coal-fired facility that had opened about 40 years earlier, was not being dispatched as the co-op would have liked. In fact, the facility’s capacity factor in those days was running at only about 3%.

“We could not compete in the MISO market due to the cost of the unit, the lack of flexibility, [and] startup time—when you’re bidding the unit into a day-ahead market, a 42-hour startup time is not a good place to be,” Mark Smith, senior vice president of Power Generation with Cooperative Energy, explained as a guest on The POWER Podcast.

Smith continued: “We had high transportation costs. Our coal came in by rail and the route from the mine to the plant was roughly 440 miles one way. So, the transportation cost was excessive. Environmental regulations—the goal post seems to keep moving and things keep ratcheting down—we didn’t know where we were heading. At the point that we did decommission, we were well within compliance, but the future was uncertain. It was going to require a lot of capital investment in the coal unit.”

With that as a backdrop, Cooperative Energy made the decision to build a new gas-fired unit to take the place of the coal units. Cooperative Energy took a somewhat unconventional approach for the project, utilizing many of its own people to manage the job, rather than opting for a turnkey EPC (engineering, procurement, and construction) contractor. “There were several reasons for us to choose what we call the multi-contract approach, as opposed to utilizing an EPC contractor,” Trey Cannon, director of Generation Projects with Cooperative Energy, said on the podcast.

“Probably the one that was most important to us is just having that full transparency and full control of the entire project, including technology selections and equipment procurement, selection of construction contractors, and things of that nature,” Cannon explained. There was also a cost savings involved. “We estimated that we probably saved at least 15% on the total budget by utilizing the self-build self-manage approach,” said Cannon.

The results were phenomenal. The project finished well ahead of schedule and well under budget. Yet, Cannon admitted that a lot of the savings was due to circumstances. “The market conditions and the timing of the project couldn’t have been better,” he said. The market for power plants in 2018 was down, so Cooperative Energy was able to get very competitive pricing on the gas turbine and a lot of other equipment. As construction work kicked into full swing in 2020, the market took another dip with COVID and other factors pushing projects to the back burner. Cooperative Energy, however, pressed on and was able to cherry-pick the best contractors and the best workers. To underscore how the project benefited from the quality of personnel it was able to attract, Smith noted, “The weld rejection rate for our mechanical contractor was 0.41%, which was remarkable.”

Today, the repowered Morrow plant is the heavy-load-carrying unit in Cooperative Energy’s fleet. “Since we went commercial, I think we’re carrying a 90-plus-percent capacity factor on the unit,” said Cannon. “If it’s not the most-efficient plant in MISO South, it’s very close,” added Smith. “And, needless to say, if the unit is available—we’re not in a planned outage—it’s operating and it’s typically baseloaded. In MISO, the name of the game is flexibility, efficiency, and reliability. The Morrow repower has checked all of those boxes for us and has Cooperative Energy in a great position for many years to come.”
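Capacity factor, the metric behind both the 3% and 90-plus-percent figures quoted above, is simply the energy a unit actually generates divided by what it would produce running at nameplate rating around the clock. A minimal sketch in Python, using the 400-MW rating from the story (the annual energy quantities below are derived from the quoted percentages, not reported figures):

```python
def capacity_factor(mwh_generated, nameplate_mw, hours=8760):
    """Energy delivered as a fraction of running flat out all year."""
    return mwh_generated / (nameplate_mw * hours)

# At the quoted ~3% capacity factor, the 400-MW coal plant was producing
# roughly this much energy per year (a derived figure, not from the story):
coal_mwh = 0.03 * 400 * 8760        # about 105,000 MWh

# The repowered unit at the quoted 90% capacity factor:
gas_mwh = 0.90 * 400 * 8760         # about 3,150,000 MWh
```

The thirty-fold difference in annual output from the same nameplate rating is what separates a unit that “could not compete” from the heavy-load-carrying unit in a fleet.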



166. Analyst Says Nuclear Industry Is ‘Totally Irrelevant’ in the Market for New Power Capacity

Nuclear power has consistently provided about 19% to 20% of total annual U.S. electricity generation since 1990. It provides significant amounts of electricity in many other countries as well. According to data from The World Nuclear Industry Status Report (WNISR), a total of 414 reactors were operating in 32 countries, as of July 1, 2024. Preliminary data says China generated the second-most electricity from nuclear power in 2023 (behind the U.S.), while France came in third and had the highest percentage share of national power generation from nuclear power at 65%.

Many power industry experts and environmental activists consider nuclear power an important component in the world’s transition to carbon-free energy. Yet, Mycle Schneider, an independent international analyst on energy and nuclear policy, and coordinator, editor, and publisher of the annual WNISR, said, “in [new] capacity terms, the nuclear industry, from what is going on, on the ground, is totally irrelevant.”

Schneider was speaking as a guest on The POWER Podcast and prefaced his statement by comparing nuclear power additions to solar power additions in recent years. “Let’s look at China, because China is the only country that has been massively building nuclear power plants over the past 20 years,” he said. “China connected one reactor to the grid in 2023—one gigawatt. In the same year, they connected, and the numbers vary, but over 200 gigawatts of solar alone. Solar power generates more electricity in China than nuclear power since 2022. And, of course, wind power generates more than nuclear power in China for a decade already,” Schneider said.

Furthermore, he noted, the disparity has gone “completely unnoticed by the general public or even within the energy professionals that are in Europe or often also in North America.” Schneider said the media often gives the impression that the nuclear industry is booming, but the facts suggest otherwise.

“Over the past 20 years—2004 to 2023—104 reactors were closed down and 102 started up,” Schneider said. “But here [it] is important that almost half, 49 of those new reactors started, were in China [where none closed], so the balance outside China is minus 51.”

Some nuclear advocates might suggest that things are changing. They might argue that small modular reactors (SMRs) or other advanced designs are poised to reinvigorate the industry. But Schneider disagrees. He noted that since the construction start of the second unit at Hinkley Point C in the UK in 2019—almost five years ago—there have been 35 nuclear project construction starts in the world. Twenty-two of those were in China and the other 13 were all implemented by the Russian nuclear industry in a few different countries. “Nothing else. Not an SMR here or an SMR there, or a large reactor here or a large reactor there by any other player,” reported Schneider.

Schneider noted that the vast majority of new capacity being added to the grid is from solar and wind energy. “These guys are building tens of thousands of wind turbines, and literally hundreds of millions of solar cells, so the learning effect is just absolutely stunning,” he said. “On the nuclear side, we’re talking about a handful. That’s very difficult. Very, very difficult—very challenging—to have a learning effect with so few units.”

Schneider said the nuclear discussion in general needs a “really thorough reality check.” He suggested the possibilities and feasibilities must be investigated. “Then, choices can be made on a solid basis,” he said.



165. How to Improve U.S. Power Distribution System Reliability

The U.S. Energy Information Administration (EIA) reports SAIDI and SAIFI values in its Electric Power Annual report, which is regularly released in October each year. In the most recent report, the U.S. distribution system’s average SAIDI value including all events was 335.5 minutes per customer in 2022. If major event days were excluded, the figure dropped to 125.7 minutes per customer. (Excluding major events is often a worthwhile exercise for spotting long-term trends, because hurricanes and severe winter storms, for example, can skew the numbers quite dramatically in a given year.)

Notably, this was the highest SAIDI value tallied in the past decade, and it continued what has effectively been a steady year-over-year decline in performance from 2013 through 2022. (2017 saw a brief improvement over 2016, but every year before and since has been worse than the previous year during the timespan covered by the report.) For comparison, in 2013, the SAIDI value was 106.1 minutes per customer.

SAIFI values do not vary as noticeably as SAIDI, but they too have been worsening. In 2022, the U.S. distribution system’s average SAIFI value including all events was 1.4 power interruptions per customer. With major events excluded, SAIFI was 1.1 interruptions per customer in the U.S. While this was not substantially worse than values reported in other years over the past decade (every year from 2013 onward has been 1.0, except for 2016, when the value was also 1.1), it seems to confirm that the system hasn’t been improving.

Yet, Mike Edmonds, Chief Operating Officer for S&C Electric Company, said several things can be done to improve the reliability and resiliency of the power distribution system. “The grid looks different depending on what state you’re in,” Edmonds said as a guest on The POWER Podcast. “We’ve got great experience with Florida Power & Light [FPL],” he said. “We’ve helped them create a resilient grid. So, that’s not only a grid that is reliable, but a grid that can actually weather the storms and all the challenges thrown at the grid.”

Notably, FPL reported in March that it had provided “the most reliable electric service in company history in 2023.” Over the past two decades, FPL said its customers have realized a remarkable 45% improvement in reliability. In NextEra Energy’s (the parent company of FPL) Sustainability Report 2023, the company reported FPL’s SAIDI was 47.1 and SAIFI was 0.85, confirming markedly better results than the U.S. averages noted earlier. Furthermore, FPL said this is the ninth time in the past 10 years that it achieved “its best-ever reliability rating.”

To better understand some of the innovative new equipment S&C Electric Company offers, Edmonds provided an example. “We have some technology that does something called ‘pulse finding,’ and what Florida Power & Light does, it just lets our equipment do what it does best. If there’s a problem, it’ll pulse to see if the problem is there or not on the grid, if it’s not, it reenergizes,” he said. “This technology is available to really change how the grid operates.”

Edmonds said S&C Electric Company invented the fuse 115 years ago, and he noted fuses have served the industry well since that time. However, today there is better technology available that doesn’t require a lineworker to respond to an outage to replace a fuse. “Let’s take fuses off the grid and have a fuseless grid, and have much more intelligent devices that can actually re-energize,” Edmonds declared.
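For readers new to the two indices, SAIDI and SAIFI fall out directly from outage records: SAIDI is total customer-minutes of interruption divided by customers served, and SAIFI is total customer interruptions divided by customers served. A short Python sketch with made-up outage data (the EIA figures above come from utility filings, not this calculation):

```python
# Each outage event: (customers_interrupted, minutes_to_restore). Hypothetical data.
outages = [
    (1200, 90),
    (300, 45),
    (5000, 240),
]
total_customers = 20000

# SAIDI: customer-minutes of interruption per customer served.
saidi = sum(c * m for c, m in outages) / total_customers

# SAIFI: interruptions per customer served.
saifi = sum(c for c, _ in outages) / total_customers

# CAIDI, the average restoration time per interruption, falls out as the ratio.
caidi = saidi / saifi
```

Major event days are typically flagged with the IEEE 1366 “beta method” and set aside before trending, which is the kind of adjustment behind the major-events-excluded figures above.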



164. Why the U.S. Government Should Fund Cybersecurity Efforts to Protect Power Grid

FBI Director Christopher Wray, while speaking at the Vanderbilt Summit on Modern Conflict and Emerging Threats in Nashville, Tennessee, in April, warned that U.S. critical infrastructure is a prime target of the Chinese government. “The fact is, the PRC’s [People’s Republic of China’s] targeting of our critical infrastructure is both broad and unrelenting,” he said.

Wray also noted that the immense size and expanding nature of the Chinese Communist Party’s hacking program isn’t just aimed at stealing American intellectual property. “It’s using that mass, those numbers, to give itself the ability to physically wreak havoc on our critical infrastructure at a time of its choosing,” he said. Wray noted that during the FBI’s recent Volt Typhoon investigation, the Bureau found that the Chinese government had gained illicit access to networks within America’s “critical telecommunications, energy, water, and other infrastructure sectors.”

Some cybersecurity experts have likened this activity to an act of war, although NATO hasn’t defined it as such just yet. In any case, it is a serious threat to national security. “In this country, critical infrastructure is operated by the private sector, most of which are publicly traded companies,” said Alex Santos, CEO of Fortress Information Security, a company that specializes in cyber supply chain security for organizations that operate critical infrastructure including utilities and government agencies. Santos was speaking as a guest on The POWER Podcast.

“Somehow, the private sector has taken on the responsibility to defend these acts of war, which I was always taught is the responsibility of the government,” he said. “I think what’s really the point here is that the government is asking us to do more. We’re being attacked more by the adversaries. Regulations are coming in. It’s becoming more and more complicated with technology change. And, our budgets are being cut,” said Santos.

Thus, while Wray can be commended for pointing out the national security problem Chinese hackers present to critical infrastructure, his words fall flat if the government doesn’t put its money where its mouth is, Santos suggested. That’s not to say money isn’t being spent by the U.S. government. “The government is spending a lot on cybersecurity to help companies, but it’s going to research and universities,” Santos said. “How many research studies do we need to tell us that cybersecurity is a problem? How many research studies do we need to tell us that we don’t have enough cybersecurity workers? How much research do we need to give us 10 recommendations for how to increase the capability of our cybersecurity workforce? At some point, we need to actually do the work.”

Santos suggested money could be better spent helping companies repair vulnerabilities or by getting small businesses to install basic security precautions like endpoint protection and network monitoring. “Does the government study how to build a tank or do they build tanks?” Santos asked rhetorically. “The government builds tanks and they buy bullets,” he answered. “So, think of it that way. We need to buy more tanks and bullets, and less research studies on which tanks, how many tanks, what kind of tanks—tanks with wheels, tanks with tracks—you know, let’s buy some tanks,” he said.



163. Effective Training and Mentoring Programs Are Critical to Power Project Success

The power industry has long been lamenting its aging workforce. While turnover has been happening for years, there remains a large percentage of power professionals on the verge of retirement. Furthermore, the U.S. Bureau of Labor Statistics predicts faster than average job growth for engineering occupations. That means experienced workers with the skills needed by the power industry are in high demand and can be choosy when looking for new opportunities. They can also demand higher compensation to make a change. Meanwhile, relative youngsters coming out of college and trade schools, while often having the fundamental knowledge to do power jobs, don’t usually have the experience needed to add immediate value to an organization. The situation is forcing companies to implement workforce development strategies.

Mechanical Dynamics & Analysis (MD&A) is a company that offers a full-service alternative to original equipment manufacturer services, parts, and repairs for steam, gas, and industrial turbines and generators. Like other power industry companies, MD&A has found it challenging to recruit experienced engineers.

“When we started out back in the early 80s, we started out as a company who tended to hire engineers who were very experienced. And back around 2009, we started to realize that those people were becoming a little harder to find,” Charles Monestere, general manager for Technical Services with MD&A, said as a guest on The POWER Podcast. “So, we started hiring a few engineers a year—some years one person, some years two or three people, maybe even a little bit more—and we developed an in-house program where we would bring in generally recent graduates, within a year or two or three out of school, and put them through some classroom training, but then a structured on-the-job training where we would have weekly meetings reviewing the activities on the job sites,” he explained. “And we’d put the young engineers with very experienced project managers and technical directors that are at the sites—the field engineers who have been doing this for many years.”

Called the Engineers in Training (EIT) program, the instruction tasked learners with becoming proficient at and gaining knowledge on many different technical aspects of the job. “A good part of the work is on the job sites; however, there is some structured classroom training, which is integrated into it,” Monestere said.

In recent years, finding experienced people has become even more difficult, leading MD&A to increase its hiring into the EIT program. “We’re actually targeting about 10 people a year now,” said Monestere. “We’re just hiring in five more this summer, and then, probably another five or so at the end of the year. So, that’s the direction we’re heading.”

Colin Baker, one of MD&A’s newest field engineers, participated in the program and found it very worthwhile. “Working with all these really great and really smart engineers, you get all of their experience firsthand, and you learn what’s right and what’s wrong,” he said. “Also, with all these classes that you’re put through, you use all of that knowledge and you learn where to apply it when you’re actually out in the field.”

Meanwhile, Baker said the program also offered him an opportunity to network within the industry and in the company. Baker said he now has multiple experts he can contact when he runs into problems. “Especially with MD&A, you can always reach out to anyone for help. Everyone is pretty much readily available for any kind of questions or something of that matter,” he said. “I’m still very new in the industry and I’m not going to know everything. I know people who do know most things, so it’s good to get these kinds of resources.”



162. How PG&E Is Reducing Wildfire Risks Using Satellite Imagery

Wildfires have had a devastating impact on California and on the state’s largest utility company, Pacific Gas and Electric (PG&E). Potential wildfire liabilities exceeding $30 billion led PG&E to file for bankruptcy in January 2019. The company emerged from bankruptcy on July 1, 2020, with a renewed focus on mitigating wildfires within its 70,000-square-mile service territory in northern and central California.

“A lot has changed,” Andy Abranches, senior director of Wildfire Preparedness and Operations with PG&E, said as a guest on The POWER Podcast. “We really saw the devastation that could occur from these wildfires, and so, that was the point that PG&E started really making a big pivot to addressing the wildfire risk. The way we address the wildfire risk is really through what we consider our layers of protection. We started initially learning as much as we could from San Diego Gas and Electric [SDG&E], and put in place the public safety power shutoff program.”

High-fire-threat district maps were important in understanding risks. About half of PG&E’s service territory falls in high-fire-threat areas. “We have 25,000 distribution miles that run through the high-fire-threat districts and 5,000 transmission miles,” said Abranches. Vegetation plays a critical role in the risk, and while precisely quantifying the number of trees in and around those risky transmission and distribution lines is difficult, Abranches estimated it’s in the range of eight to 10 million.

With such a large area and so many trees to monitor, PG&E turned to Planet Labs, a San Francisco-based provider of global, daily satellite imagery and geospatial solutions, for help. Planet’s satellite-derived data on vegetation, including canopy height, cover, and proximity to electric-system infrastructure, is used by PG&E to prioritize the mitigation of vegetation-associated risks.

Quantifying Threats and Consequences

Abranches explained PG&E’s risk characterization process by likening it to a bowtie. “The first part of your risk bowtie is: ‘How do you quantify and in a probabilistic way build a risk model to predict ignitions are going to happen?’ ” He noted that the biggest source of ignitions is through contact with vegetation, such as a tree falling on a line or a branch coming into contact with a line on a windy day, but birds and other animals can also cause ignitions.

“The second half of the bowtie is the consequence,” said Abranches. “If an ignition occurs at a particular location, if the vegetation around it is just not there, that ignition will never spread.” The fire triangle requires heat (or a spark), oxygen, and fuel. The fuel is the vegetation bed around the line where the ignition event occurs. If there happens to be a lot of dry fuel, that’s when an ignition becomes a wildfire. Depending on the oxygen, which can be heavily influenced by wind conditions, it could become a catastrophic fire, Abranches explained.

“As we built our risk models, you needed to understand the vegetation dimension on two levels. One level is for probability of ignitions: ‘How do we get better at predicting where we expect vegetation ignitions to occur?’ And the data that we’re able to get from Planet every year helps improve and keeps those models updated,” said Abranches. “The second piece of it is the consequence of the ignition—understanding the fuel layer. That also—data from Planet—helps inform and continually refreshes that information to make sure it’s most current. So, the risk model actually uses the Planet data on both sides of the bowtie, because it’s probability of ignition times the consequence of ignition gives you the risk event.”
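The bowtie Abranches describes reduces to one line of arithmetic: risk = probability of ignition × consequence of ignition. The toy model below is not PG&E’s actual model; every input is illustrative, but it shows how vegetation data can feed both sides of the multiplication:

```python
# Toy bowtie risk score for one line segment. All inputs are illustrative.
def ignition_risk(p_veg_contact, p_ignition_given_contact, fuel_load, wind_factor):
    p_ignition = p_veg_contact * p_ignition_given_contact  # left side of the bowtie
    consequence = fuel_load * wind_factor                  # right side: spread potential
    return p_ignition * consequence

segments = {
    "canyon_span":  ignition_risk(0.02, 0.5, 8.0, 1.5),
    "cleared_span": ignition_risk(0.02, 0.5, 0.5, 1.5),  # same contact odds, little fuel
}

# Sorting by score prioritizes vegetation work where it removes the most risk.
ranked = sorted(segments, key=segments.get, reverse=True)
```

A span with little surrounding fuel scores far lower than a canyon span with identical ignition odds, which is why consequence-side data can redirect trimming crews even where contact probabilities are the same.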



161. How Regulatory Burdens and Misguided Incentives Are Degrading Power System Reliability

It’s no secret that the U.S. electric power system has undergone a remarkable transition that continues today. Coal-fired generation, which was the leading source of power generation during the 20th century, often providing more than half of the country’s electricity supply, fell to about 16.2% of the mix in 2023. Meanwhile, the U.S. solar market installed 32.4 GWdc of electricity-generation capacity last year, a 51% increase from 2022, and the industry’s biggest year by far, exceeding the 30-GWdc threshold for the first time. Solar accounted for 53% of all new electricity-generating capacity added to the U.S. grid in 2023, far greater than natural gas and wind, which were second and third on the list, accounting for 18% and 13% of new additions, respectively.

But, how is the shift in resources affecting power system reliability? Some experts say it’s not good. “We’ve got a lot of warning lights that appear to be flashing today,” Todd Snitchler, president and CEO of the Electric Power Supply Association (EPSA), said as a guest on The POWER Podcast. “I say that not just from our perspective, but from NERC [the North American Electric Reliability Corp.]—the reliability coordinator—or from FERC [the Federal Energy Regulatory Commission], who has also expressed concerns, and all of the grid operators around the country have raised concerns about the pace of the energy transition.”

EPSA is the national trade association representing America’s competitive power suppliers. It believes strongly in the value of competition and the benefits competitive markets provide to power customers. “Our members have every incentive to be the least-cost, most-reliable option that’s available, because if you are that resource, you’re going to be the resource that’s selected to run,” said Snitchler. Yet, not all markets are providing a level playing field, according to Snitchler.

“The challenge we’re seeing is that there are a number of resources that are either having regulatory burdens that are placed on them that make them less competitive in comparison to resources that are not facing the same challenges, or there are resources that are highly subsidized, and as a result of those subsidies, it creates an economic disadvantage to unsubsidized resources, and that puts economic pressure on units that would otherwise be able to run and would earn a sufficient amount of revenue to remain on the system,” he explained.

“We’re also seeing a pretty significant acceleration in retirements off of the system of dispatchable resources,” Snitchler continued. “What does that mean? So, of course, it means the coal plants that have been on the system for decades, as a result of economics and environmental policies, are retiring and moving off of the system. You’re seeing some of the older gas units experience the same kind of financial and regulatory pressures, and that is forcing some of them off of the system. And we’re seeing a large penetration of new renewable resources come onto the system that, frankly, are good energy resources, but don’t have the same performance characteristics that the dispatchable resources have.

“And so, we’re having to fill a gap, or as I call it, the delta between aspirational policy goals and operational realities of the system, because too much retirement of dispatchable resources without sufficient resources that can replicate or deliver the same types of services that those dispatchable resources can provide, creates reliability concerns,” said Snitchler.



160. How Grid Enhancing Technologies Are Expanding Electric Power Transmission System Capabilities

It’s no secret that power grids around the world need to expand to accommodate more renewable energy and the so-called “electrification of everything.” The latter, of course, refers to the growing trend of using electricity to power various sectors and applications that have traditionally relied on fossil fuels, such as natural gas or petroleum-based products. The electrification of everything includes the push toward electric vehicles; the transition from fossil fuel–based heating and cooling systems to electric alternatives, as well as the adoption of electric appliances; and the shift to more electric motors, furnaces, and other electric-powered equipment in manufacturing processes.

Add to that the expected power needed to supply data centers and the growth of artificial intelligence-related computing, and current estimates of 50% load growth by 2050 could be vastly understated. Yet, getting new transmission lines planned, approved, and constructed is a daunting task, often taking a decade or longer to complete. So, how can the world more quickly add transmission capacity to the system without investing enormous time and money in the process? The answer: grid enhancing technologies, or GETs.

“GETs are exciting to us because they are technologies that help us unlock quickly the additional headroom or additional capability of the grid to carry energy across the system,” Alexina Jackson, vice president of Strategic Development with AES Corp., said as a guest on The POWER Podcast. “This is something that is very important, because today, we are not making the fullest use of the electricity system as it’s built.” The system is operated below its maximum capacity for very good reasons, specifically, to maintain reliability, but by implementing GETs, it can be operated closer to its true limits without risk of failure.

“Once we have these technologies, such as dynamic line rating, which helps us visualize the dynamic and full headroom of the electrical grid, and then technologies like storage as transmission, advanced power flow control, topology optimization—they all allow us to operate the grid in its dynamic capability. By doing both these things—visualization and operation dynamically—we’re able to start making fuller use of that carrying capacity for energy, which will allow us to add additional energy more quickly, serve our customer needs more efficiently, and ultimately decarbonize faster,” Jackson said.

To read AES's white paper, visit:
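Dynamic line rating, the first GET Jackson names, is the easiest to illustrate with numbers. Production systems follow the IEEE 738 conductor heat balance; the toy model below only captures the intuition that allowable current grows roughly with the square root of the available temperature headroom plus any extra wind cooling. Every value here is illustrative, not from the episode:

```python
import math

def dynamic_rating_amps(static_rating, t_max, t_amb_static, t_amb_now, wind_gain):
    """Toy dynamic line rating (NOT IEEE 738).

    Static ratings assume worst-case weather (t_amb_static). When the actual
    ambient temperature is lower and wind improves cooling, more current can
    flow before the conductor reaches its thermal limit t_max.
    """
    headroom = (t_max - t_amb_now) / (t_max - t_amb_static)
    return static_rating * math.sqrt(headroom * wind_gain)

static = 1000.0  # amps, conservative all-weather rating (illustrative)

# A cool, breezy day: 25 C ambient instead of the assumed 40 C, plus 30% more
# convective cooling from wind.
dlr = dynamic_rating_amps(static, t_max=75.0, t_amb_static=40.0,
                          t_amb_now=25.0, wind_gain=1.3)
```

In this toy case the same conductor can carry roughly a third more current than its conservative static rating assumes, which is the kind of unused “headroom” Jackson refers to.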



159. Navigating the Interconnection Queue Is One of Many Challenges Clean-Energy Projects Face

There are several obstacles to overcome when building a clean-energy project, but perhaps the biggest is getting through the generator interconnection queue (GIQ). Every regional transmission organization (RTO) and independent system operator (ISO) in the U.S. has a significant backlog in its GIQ, and processing interconnection requests can take years to complete. This has created a significant barrier to deploying renewable energy, as companies often face long wait times, and high costs for new transmission lines and other upgrades when the local grid is near or at capacity.

Part of the problem is the complexity of the interconnection process, which involves multiple studies. The Midcontinent Independent System Operator (MISO) reports that historically about 70% of projects submitted to its queue ultimately withdraw, resulting in extensive rework and delays, as studies must be redone when projects withdraw.

MISO recognizes change is necessary and has implemented some reforms. On Jan. 19, 2024, the Federal Energy Regulatory Commission (FERC) accepted MISO’s filing (ER24-340) to increase milestone payments, adopt an automatic withdrawal penalty, revise withdrawal penalty provisions, and expand site control requirements. These provisions were designed to help expedite the GIQ process, and maximize transparency and certainty. MISO said the filing was developed through extensive collaboration in the stakeholder process, including multiple discussions in the Planning Advisory Committee and Interconnection Process Working Group. MISO expects these reforms to reduce the number of queue requests withdrawing from the process. It said the fewer projects in studies, the quicker the evaluations can be completed, and the fewer projects that withdraw, the more certain phase 1 and 2 study results are.

Still, it’s likely that more needs to be done to improve the GIQ process. The Clean Grid Alliance (CGA), a nonprofit organization that works to advance renewable energy in the Midwest, conducted a survey of 14 clean energy developers who’ve had solar, wind, hybrid, and battery storage projects in the MISO interconnection queue over the last five years to better understand the challenges they’ve faced.

Aside from interconnection queue challenges, the CGA survey also identified other hindrances to clean-energy project development. Beth Soholt, executive director of CGA, explained that a lot of development work is done face to face. COVID prevented that, which was a big problem that had a ripple effect. Some leases that developers had negotiated began to expire, so they had to go back out to communities and renegotiate. “Siting in general is getting more difficult, as we do more volume, as we do transmission in the MISO footprint,” said Soholt. “We need new generation to be sited, we need new transmission, and we have to find a pathway forward on that community acceptance piece,” she said.

Among other challenges, Soholt said some projects saw generator interconnection agreements (GIAs) timing out and needing MISO extensions. Meanwhile, transmission upgrade delays also presented problems, not only the large backbone transmission upgrades, but also the transmission owners building interconnections for individual projects to connect breakers, transformers, and other equipment. Soholt said longer and longer component lead times presented timing challenges, which were also problematic for developers.

These were all important takeaways from the CGA survey, and items the group will work to resolve. Yet, for all the difficulties, Soholt seemed optimistic that MISO would continue to find ways to improve the process. “When we get overwhelmed, we really step back and say, ‘What’s going to be the best thing to work on to really make a difference?’ So far, that really has been the big things like transmission planning. We feel good about where that’s at in MISO—they are doing good long-range planning,” Soholt said.



158. Molten Salt Reactor Technology Solves Several Nuclear Industry Problems

Today, molten salt reactors (MSRs) are experiencing a resurgence of interest worldwide, with numerous companies and research institutions actively developing various designs. MSRs offer several potential advantages, including enhanced safety, reduced waste generation, and the ability to utilize thorium as a fuel source. “There are several molten salt reactor companies that are in the process of cutting deals and getting MOIs [memorandums of intent] with foreign countries,” Mike Conley, author of the book Earth Is a Nuclear Planet: The Environmental Case for Nuclear Power, said as a guest on The POWER Podcast. Conley is a nuclear energy advocate and strong believer in MSR technology. He called MSRs “a far superior reactor technology” compared to light-water reactors (LWRs). The thorium fuel cycle is a key component in at least some MSR designs; it is the path by which fertile thorium transmutes into uranium fuel ready for fission. Thorium-232 (Th-232) absorbs a neutron, becoming Th-233. Th-233 beta decays to protactinium-233 (Pa-233), which then undergoes a second beta decay to become uranium-233 (U-233). This is one way of turning natural and abundant Th-232 into something fissionable. Because U-233 is not found in nature but makes an ideal nuclear reactor fuel, the thorium cycle is much sought after. “The best way to do this is in a molten salt reactor, which is an incredible advance in reactor design. And the big thing is, whether you’re fueling a molten salt reactor with uranium or thorium or plutonium or whatever, it’s a far superior reactor technology. It absolutely cannot melt down under any circumstances whatsoever period,” said Conley. Conley suggested that most of the concern people have about nuclear power revolves around the spread of radioactive material. 
Specifically, no matter how unlikely it is, if an accident occurred and contamination went airborne, the fact that it could spread beyond the plant boundary is worrisome to many people who oppose nuclear power. “The nice thing about a molten salt reactor is: if a molten salt reactor just goes belly up and breaks or gets destroyed or gets sabotaged, you’ll have a messed-up reactor room with a pancake of rock salt on the floor, but not a cloud of radioactive steam that’s going to go 100 miles downwind,” Conley explained. And the price for an MSR could be much more attractive than the cost of currently available GW-scale LWR units. “The ThorCon company is predicting that they will be able to build for $1 a watt,” said Conley. “That’s one-fourteenth of what Vogtle was,” he added, referring to Southern Company’s nuclear expansion project in Georgia, which includes two Westinghouse AP1000 units. Of course, projections do not always align with reality, so MSR pilot projects will be keenly watched to validate claims. There is progress being made on MSR projects. For example, in February 2022, TerraPower and Southern Company announced an agreement to design, construct, and operate the Molten Chloride Reactor Experiment (MCRE)—the world’s first critical fast-spectrum salt reactor—at Idaho National Laboratory (INL). Since then, Southern Company reported successfully commencing pumped-salt operations in the Integrated Effects Test (IET), signifying a major achievement for the project. The IET is a non-nuclear, externally heated, 1-MW multiloop system, located at TerraPower’s laboratory in Everett, Washington. “The IET will inform the design, licensing, and operation of an approximately 180-MW MCFR [Molten Chloride Fast Reactor] demonstration planned for the early 2030s timeframe,” Southern Company said.
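The transmutation chain Conley alludes to can be put in concrete terms with simple decay arithmetic. The sketch below is illustrative only: it assumes approximate published half-lives (Th-233 about 21.8 minutes, Pa-233 about 27 days) and shows why the Pa-233 step, not the Th-233 step, sets the timescale for breeding U-233 from thorium.

```python
import math

# Approximate half-lives of the intermediate isotopes in the thorium fuel
# cycle (published values, used here as illustrative assumptions).
TH233_HALF_LIFE_MIN = 21.8    # Th-233 -> Pa-233 (beta decay), ~21.8 minutes
PA233_HALF_LIFE_DAYS = 27.0   # Pa-233 -> U-233 (beta decay), ~27 days

def fraction_decayed(t, half_life):
    """Fraction of a sample that has decayed after time t (same units as half_life)."""
    return 1.0 - math.exp(-math.log(2.0) * t / half_life)

# Th-233 is essentially gone within a couple of hours of neutron capture...
print(f"Th-233 decayed after 2 hours: {fraction_decayed(120, TH233_HALF_LIFE_MIN):.1%}")

# ...but the bred material must sit for months while Pa-233 becomes U-233.
for days in (27, 54, 135):
    print(f"Pa-233 -> U-233 after {days:3d} days: {fraction_decayed(days, PA233_HALF_LIFE_DAYS):.1%}")
```

After roughly five Pa-233 half-lives (about 135 days), more than 96% of the bred protactinium has become U-233.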



157. How Utilities Are Planning for Extreme Weather Events and Mitigating Risks

In mid-January, scientists who maintain the world’s temperature records announced that 2023 was the hottest year on record. NASA researchers say extreme weather across the planet, including heat extremes, wildfires, droughts, tropical cyclones, heavy precipitation, floods, high-tide flooding, and marine heat waves, will become more common and severe as the planet warms. That’s a big problem for power grids, because extreme weather often causes outages and damage to grid assets. Michael Levy, U.S. Networks lead and Global Head of Asset Resilience at Baringa Partners, a global management consulting firm, is highly focused on extreme weather risks and developing plans to help mitigate the threats. He suggested accurately forecasting dollars of risk at the asset level from extreme weather events is very important to his clients. “Every facility all across the U.S. is having a heightened awareness of some of these extreme weather events, and more importantly, how they can protect themselves and their customers against those in the future,” Levy said as a guest on The POWER Podcast. “Utilities have always been really good, generally, at keeping the lights on and maintaining a fair level of reliability,” said Levy. “In general, they’re making the right investments—they have the right ambitions—but what’s challenging about these extreme weather events is that because they’re so infrequent at individual locations, and the impacts are so severe, what we find is that utility clients often are really challenged to estimate those high-impact, low-frequency events, and integrate them into their investment plans.” However, Levy said advances in attribution climate science are helping utilities overcome some of the challenges. “Scientists are now able to associate, with reasonable level of accuracy, what increasing warming means physically for the rest of the world in terms of how the frequency and severity of these extreme weather events may change,” he explained. 
“One of the big things that we focus on with our utility clients is converting those climate forecasts into dollars of risk, and that way, it gives them an adjustable baseline that they can substantiate spend against,” said Levy. “If you’re undergrounding lines to protect them against wildfire, elevating substations to protect them against flooding, all of those things cost money, and we’re increasingly seeing regulators—they want to see the benefits, they want to see that the money is being spent prudently. So, that’s what we’re talking to our clients about today,” he said. And utilities have proven that sound planning does pay off. Levy pointed to actions taken in Florida following particularly active and intense hurricane seasons in 2004 and 2005. Soon thereafter, the Florida Public Service Commission adopted extensive storm hardening initiatives. Wooden pole inspection and replacement programs were adopted, and vegetative remediation solutions were implemented, vastly improving grid reliability. Additionally, investor-owned electric utilities were ordered to file updated storm hardening plans for the commission to review every three years. However, the proof is in the pudding, and for Florida, grid hardening has tasted very good. Levy compared the effects experienced from Hurricane Michael in 2018 to those of Hurricane Ian in 2022. “When Ian came, despite being a bigger and stronger hurricane, they had no transmission lines down, which, of course, are very costly and time intensive to replace, and they were able to restore customers three times as fast, despite having more customers out. So, they’re experiencing what we like to call at Baringa ‘the rewards of resilience,’ because investing in resilience is a fraction of restoration costs,” said Levy.
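The “dollars of risk” conversion Levy describes reduces to an expected annual loss: each hazard’s annual probability times its damage, summed over hazards. The sketch below uses invented numbers (the probabilities, damage figures, and hardening effect are hypothetical, not Baringa’s actual model) to show how a hardening investment can be weighed against risk reduction.

```python
# Minimal expected-annual-loss (EAL) sketch: sum of probability x damage over
# hazards. All figures below are hypothetical illustrations.

def expected_annual_loss(events):
    """events: iterable of (annual_probability, damage_in_dollars) pairs."""
    return sum(p * damage for p, damage in events)

# Hypothetical substation exposed to flood and wildfire.
baseline = [
    (0.02, 50_000_000),  # 1-in-50-year flood destroys the substation
    (0.01, 20_000_000),  # 1-in-100-year wildfire damages feeders
]

# Assume elevating the substation sharply reduces flood damage.
hardened = [
    (0.02, 5_000_000),
    (0.01, 20_000_000),
]

reduction = expected_annual_loss(baseline) - expected_annual_loss(hardened)
print(f"Annual risk reduction: ${reduction:,.0f}")
```

An annualized risk reduction like this gives a utility a defensible baseline against which to justify hardening spend to regulators.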



156. Community Solar Projects Bring Renewable Energy to the Masses

The National Renewable Energy Laboratory (NREL) explains that community solar, also known as shared solar or solar gardens, is a distributed solar energy deployment model that allows customers to buy or lease part of a larger, off-site shared solar photovoltaic (PV) system. It says community solar arrangements allow customers to enjoy advantages of solar energy without having to install their own solar energy system. The U.S. Department of Energy says community solar customers typically subscribe to—or in some cases own—a portion of the energy generated by a solar array, and receive an electric bill credit for electricity generated by their share of the community solar system. It suggests community solar can be a great option for people who are unable to install solar panels on their roofs because they are renters, or because their roofs or electrical systems aren’t suited to solar. The Solar Energy Industries Association (SEIA) reports 6.5 GW of community solar capacity has been installed in the U.S. through the 1st quarter of 2024. Furthermore, SEIA predicts more than 6 GW of community solar capacity will be added over the next five years. It says 41 states, plus the District of Columbia, have at least one community solar project online. “These programs are very attractive and provide a lot of benefit to a whole range of consumers,” Nate Owen, CEO and founder of Ampion, said as a guest on The POWER Podcast. Ampion currently manages distributed generation projects for developers in nine states, with new states being added as more programs become active. “It’s fundamentally a different way of developing energy assets,” Owen said. “These things [community solar farms] are their own asset class. They produce a very significant value because they are generally located closer to load, and so, they fortify and strengthen local distribution networks quite a bit. 
And right now, they are very popular—there’s quite a bit of development going on in states across the country that have put programs in place.” Owen specifically mentioned Colorado, Illinois, Maine, Maryland, Massachusetts, Minnesota, New Jersey, and New York as states with active community solar programs. “There’s a lot of activity going on in a lot of states right now,” he said. According to Owen, community solar saves customers money. “The contract structure of community solar means that, ultimately, everybody’s guaranteed savings,” he said. “Nearly every community solar contract we’ve ever done has been provided at a percent off the value of the utility bill credit. So, at its essence, we are selling dollars’ worth of utility bill credits for 90 cents, and so, you automatically save money.” Contract terms often vary from project to project and state to state. “I think residential customers these days are generally signing contracts that are at least a year, if not three or five in some cases,” explained Owen. He noted that some states, such as Maine and New York, have a statutory 90-day termination notice clause for residential customers, so it doesn’t really matter how long the term is because subscribers have the right to terminate deals when they choose. In such cases, Owen said the “replaceability feature” of community solar is vital to success. “We can drop a customer and replace them—and we do,” he said.
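Owen’s “selling dollars’ worth of utility bill credits for 90 cents” is simple enough to sketch; the annual credit value below is a hypothetical example.

```python
# Community solar savings sketch: a subscriber buys utility bill credits at a
# discount from their face value. The $1,200 annual credit value and 10%
# discount are hypothetical.

def annual_savings(credit_value, discount):
    """Savings from buying bill credits at (1 - discount) of their face value."""
    subscription_cost = credit_value * (1 - discount)
    return credit_value - subscription_cost

print(f"Annual savings: ${annual_savings(1200, 0.10):.2f}")
```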



155. Improving Nuclear Plant Construction Processes: How to Build Projects More Efficiently

If you have paid any attention to nuclear power plant construction projects over the years, you know that there is a long history of cost overruns and schedule delays on many of them. In fact, many nuclear power plants that were planned in the 1960s and 1970s were never completed, even after millions (or billions) of dollars were spent on development. As POWER previously reported, by 1983, several factors, including project management deficiencies, prompted the delay or cancellation of more than 100 nuclear units planned in the U.S.—nearly 45% of total commercial capacity previously ordered. Yet, at least one construction expert believes nuclear power plants can be built on time and on budget. “To me, nuclear should be far, far more competitive than it is,” Todd Zabelle, a 30-plus-year veteran of the construction industry and author of the book Built to Fail: Why Construction Projects Take So Long, Cost Too Much, and How to Fix It, said as a guest on The POWER Podcast. Owners have a big role to play in the process. “The owner has to get educated on how to deliver these projects, because the owner gets the value out of any decisions that are made,” Zabelle said. “You cannot just hand it over to a construction management firm and hope for the best, or EPCM [engineering, procurement, construction, and management firm]. It’s just not going to work.” “What it boils down to is a lot of people doing a lot of administrative work—people watching the people doing the technical work or the craft work—and we become an industry of bureaucracy and administration,” said Zabelle. “Everyone’s forgot about ‘How do we actually do the work?’ That has huge implications because of the disconnect between those two.” According to Zabelle, the problem can be solved by implementing a production operations mentality. “My proposal in all this is: we need way more thinking about operations management, specifically operations science,” he said. 
“Not that it’s what happens after the asset’s delivered, but it’s actually a field of knowledge that assists with how to take inputs and make their outputs. The construction industry doesn’t understand anything about operations—they don’t understand the fundamentals.” In Zabelle’s book, he provides a more thorough explanation of the concept. “Operations science is the study of how to improve and optimize processes and systems to achieve the desired objectives. It involves the use of mathematical models and other techniques to analyze and optimize systems,” he wrote. “It is used to improve efficiency and reduce costs, while ensuring that the quality of the output remains high. Operations science is used to improve the effectiveness of operations, while also reducing waste and improving customer satisfaction.” Near the end of his book, Zabelle noted that the time for business as usual is rapidly closing. “The pain of the status quo in construction is going to increase exponentially as our capacity to develop and execute projects falls short of expectations,” he wrote. “Until we recognize projects as production systems and use operations science to drive project results, we are doomed to failure. We need to free ourselves from the prior eras and instead focus on a new era of project delivery, one in which projects will be highly efficient production systems that utilize the bounty of the technology (AI [artificial intelligence], robotics, data analytics, etc.) we are privileged to have access to.” Zabelle sounded hopeful about the future of nuclear power construction. “I truly believe—I would actually throw down the gauntlet—we can make the Westinghouse AP1000 financially viable,” he said. “I’m happy to work with anybody on how to make nuclear competitive because I think it should be and could be.”
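The operations science Zabelle points to rests on a handful of fundamentals; Little’s Law, relating work-in-process to throughput and cycle time, is the canonical one. The construction figures below are hypothetical.

```python
# Little's Law: average work-in-process (WIP) = throughput x cycle time.
# A cornerstone relationship in operations science; numbers are hypothetical.

def wip(throughput_per_week, cycle_time_weeks):
    """Average work-in-process implied by a given throughput and cycle time."""
    return throughput_per_week * cycle_time_weeks

# A project completing 5 work packages per week, each taking 4 weeks end to
# end, carries 20 packages in flight at any moment:
print(wip(5, 4))  # 20
```

Cutting cycle time in half at the same throughput halves the work-in-process, which is the kind of lever this way of thinking makes visible on a construction site.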



154. Hydrogen: ‘The Swiss Army Knife of Decarbonization’

It seems everywhere you go, both inside and outside of the power industry, people are talking about hydrogen. Last October, the U.S. Department of Energy (DOE) announced an investment of $7 billion to launch seven Regional Clean Hydrogen Hubs (H2Hubs) across the nation and accelerate the commercial-scale deployment of “low-cost, clean hydrogen.” Hydrogen is undoubtedly a valuable energy product that can be produced with zero or near-zero carbon emissions using renewable energy and electrolyzers. The Biden administration says it “is crucial to meeting the President’s climate and energy security goals.” “Hydrogen is one of the hottest topics in the energy transition conversation right now, and that’s because it really is a super versatile energy carrier. A lot of folks refer to it as ‘the Swiss Army knife of decarbonization,’ including our founder, Mr. Gates,” Robin Millican, senior director of U.S. Policy and Advocacy at Breakthrough Energy, said as a guest on The POWER Podcast. Breakthrough Energy is a network of entities and initiatives founded by Bill Gates, which include investment funds, philanthropic programs, and policy efforts linked by a common commitment to scale the technologies needed to achieve a path to net-zero emissions by 2050. “If you think about the ways that you can use hydrogen, you can use it as a feedstock for industrial materials, you can combine it with CO2 to make electrofuels [also known as e-fuels], you can use it for grid balancing if you’re storing it and then deploying that hydrogen when it’s needed, so it can be used a lot of different ways, which is great,” Millican said. “But actually, to us, the more salient question that we should be asking ourselves is: you can use hydrogen in a lot of these different ways, but should you be using hydrogen in all of those different applications?” Millican said there’s a simple framework that she uses to answer that question. 
“If there’s a way that you can electrify a process, in almost all cases, that’s going to be cheaper and more efficient from an energy conversion standpoint than using hydrogen,” she said. Millican suggested electrification is a better option than hydrogen for most building and light-duty transportation applications. While noting that hydrogen could be a suitable option for aviation e-fuels, she said biofuels might be an even better fit. However, when it comes to fertilizers and ammonia, clean hydrogen is very likely the best pathway to reducing emissions in that particular sector, she said. Breakthrough Energy isn’t the first group to think about hydrogen in this way. Millican noted that Michael Liebreich’s “Hydrogen Ladder” has been focusing on the best possible uses for hydrogen for years. According to Liebreich, hydrogen shouldn’t routinely be used in power systems to generate power because the cycle losses—going from power to green hydrogen, storing it, moving it around, and then using it to generate electricity—are too large. However, he says, “The standout use for clean hydrogen here is for long-term storage.” Yet, Millican said there is a scenario where hydrogen could be extremely affordable at scale. She said “geologic hydrogen” is something Breakthrough Energy is very interested in. “There are companies out there that are working on identifying where hydrogen exists naturally in the subsurface, and then trying to extract that hydrogen, which could be super affordable, because again, it’s abundant in some areas,” she explained. “If we’re thinking about hydrogen in that scenario, we might want to use it a lot more ubiquitously.”
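Liebreich’s “cycle losses” argument can be sketched by chaining stage efficiencies. The roughly 70% electrolysis, 90% storage and transport, and 50% generation figures below are commonly cited approximations, not values from the podcast.

```python
# Power-to-hydrogen-to-power round-trip efficiency: the product of the stage
# efficiencies. Stage values are rough, commonly cited approximations.

def round_trip_efficiency(stages):
    """Multiply per-stage efficiencies into an overall round-trip figure."""
    eff = 1.0
    for stage in stages:
        eff *= stage
    return eff

stages = [0.70, 0.90, 0.50]  # electrolysis, storage/transport, generation
print(f"Round-trip efficiency: {round_trip_efficiency(stages):.0%}")
```

Roughly two-thirds of the input electricity is lost along the way, which is why electrification usually beats hydrogen wherever both are feasible.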



153. PGE Leans into an All-of-the-Above Strategy to Decarbonize Its Power System

Climate change has led many states and countries to set targets for reducing greenhouse gas (GHG) emissions from power systems. Oregon, for example, has set targets for all power sold to retail customers in the state to have GHG emissions cut by 80% by 2030, 90% by 2035, and 100% by 2040. It’s a challenging task, but Portland General Electric (PGE), a fully integrated energy company that generates, transmits, and distributes electricity to roughly half of Oregon’s population, and for about 75% of its commercial and industrial activity, is working hard to achieve those objectives. As the first U.S. utility to sign The Climate Pledge, an initiative co-founded by Amazon and Global Optimism in 2019 whose 464 signatories have committed to reaching net-zero carbon emissions by 2040, PGE is leading the way toward a cleaner energy future. Kristen Sheeran, senior director of sustainability, strategy, and resources planning at PGE, said the process is pretty straightforward in some ways. “In order to reduce carbon on our system, we have to back out fossil fuels that we currently rely on to generate power for our customers, and we have to replace that with non-emitting alternatives,” she said as a guest on The POWER Podcast. Up to this point in time, that has primarily been done with wind, solar, and batteries, and it’s not a new thing for PGE. The company’s first wind farm—the Biglow Canyon site—began operation in 2007. Meanwhile, in 2012, PGE opened the Camino del Sol Solar Station, an interstate highway solar project. Since then, the company has partnered with schools, government agencies, and corporations to grow solar energy throughout Oregon. In partnership with NextEra Energy Resources, it also opened North America’s first major renewable energy facility to combine wind, solar, and battery storage in one location—the Wheatridge Renewable Energy Facility in Morrow County. 
Today, PGE boasts having more than 1 GW of wind power capacity in service in the Northwest, and it aims to procure between 3.5 GW and 4.5 GW of new non-emitting resources and storage between now and 2030. Perhaps more difficult than decarbonizing the system, however, is doing so while also maintaining reliability, affordability, and an equitable system for all its customers. “It’s a very interesting point in time—an inflection point for the industry,” Sheeran said. “How do you balance affordability? How do you balance reliability with emissions reduction?” she asked. PGE closed its last Oregon-based coal-fired power plant in October 2020, 20 years ahead of schedule, as part of an agreement with stakeholders, customer groups, and regulators to significantly reduce air emissions from power production in Oregon. PGE still receives a small amount of coal-fired power from the Colstrip plant, which is located near Billings, Montana. The company has an ownership stake in the facility, but it plans to exit its ownership in Colstrip no later than 2029. Brett Greene, PGE’s senior director of clean energy origination and structuring, suggested striking the right energy balance will take more than just wind and solar, however. “We are supportive of all technology. We really think it takes a lot of innovation and creativity to hit that net-zero goal in 2040,” he said. Greene noted that resources such as hydro, pumped storage, offshore wind, and even nuclear, hydrogen, and carbon capture technologies may ultimately be needed to fully decarbonize PGE’s power mix.



152. A Boiler for Any Occasion

Boilers obviously play an important role in the power generation industry, providing the mechanism to convert heat produced by burning fuel into steam that can be used to drive a turbine to generate electricity. But many other industries also use boilers to produce steam for a variety of purposes. Boilers are commonly used for space heating in industrial facilities, including in factories, warehouses, and office buildings, as well as on university campuses and in large medical complexes. Boilers often provide hot water or steam, which is then distributed throughout buildings using radiators, convectors, or underfloor heating systems, to heat the air. Many industrial processes utilize high-temperature steam for manufacturing operations. Boilers are regularly used for processes such as chemical manufacturing, food processing, paper production, and textile manufacturing. Boilers are also essential in petroleum refineries for processes like distillation, cracking, and reforming. Steam can also be used as a source of energy for industrial processes such as sterilization, cleaning, and drying. In some cases, cogeneration (also called combined heat and power) systems are utilized to first generate electricity, and then, extraction steam is diverted for other purposes. This can greatly improve the overall system efficiency, saving money and reducing emissions. Rentech Boiler Systems Inc. is one of the leading manufacturers of custom water tube and waste heat recovery boilers. The company is headquartered in Abilene, Texas, but sells its boilers around the world. “We have shipped boilers to about 35 countries in the world. So, we’re a company known globally,” Gerardo Lara, vice president of Fired Boiler Sales with Rentech, said as a guest on The POWER Podcast. “I think our best feature at Rentech is that we build only custom solutions,” Jon Backlund, senior sales engineer with Rentech, said on the podcast. “We don’t have a catalog of standard sizes or standard designs. 
So, we will basically custom fit the application, and that means, we will read the specifications carefully, talk to the client about special needs, special fuels, any kind of space constraints, delivery issues, and design our system to fit exactly what they require.” Rentech typically manufactures boilers with capacities ranging from about 40,000 lb/hr to 600,000 lb/hr of steam. Moving boiler systems of that size—which can weigh up to half a million pounds—from a manufacturing facility to a site can be challenging, but Lara suggested Rentech is very proficient at the task. “There is a wide range of logistics that have to be studied, and yes, we live in the middle of Texas, but we certainly are very well versed on how to get a big boiler to Australia, if need be,” he said. “If we can do that, we certainly can get one to any state here within the U.S., or even Canada or Mexico.” The fuel used to fire boilers can vary widely. Natural gas is very common in the U.S. because it is highly available and relatively inexpensive, but many other fuels are also suitable for industrial boilers. Backlund said there are a lot of “opportunity fuels” available in different locations. For example, landfill gas can be captured and utilized at many landfills. Likewise, biogas from brewing or sewage treatment processes is also usable. Many experts believe hydrogen will be an important fuel as the world transitions to greater carbon-free energy resources. Backlund said hydrogen has been burned in boilers for decades. “There’s a lot of talk about equipping our boilers to burn hydrogen in the future, but this is not a new technology in the boiler business,” he said. “Those kinds of plants have been around for generations.” Where the hydrogen comes from and how it is produced may change, but today’s boilers are already capable of utilizing hydrogen efficiently.



151. Microgrids a Win for Both Owners and Grid Operators

According to a guidebook issued by Sandia National Laboratories, a U.S. Department of Energy (DOE) multi-mission laboratory, microgrids are defined as a group of interconnected loads and distributed energy resources (DERs) that act as a single controllable entity. A microgrid can operate in either grid-connected or island mode, which includes some entirely off-grid applications. A microgrid can span multiple properties, generating and storing power at a dedicated/shared location, or it can be contained on one privately owned site. The latter condition, where all generation, storage, and consumption occur on one site, is commonly referred to as “behind-the-meter.” Microgrids come in a wide variety of sizes. Behind-the-meter installations are growing, especially as entities like hospitals and college campuses are installing their own systems. Where some once served a single residence or building, many now power entire commercial complexes and large housing communities. “Today, there’s a whole new way to do DER management, which is a significant component of microgrids,” Nick Tumilowicz, director of Product Management for Distributed Energy Management with Itron, said as a guest on The POWER Podcast. “There is a way now to do that in a very local, automated, and cost-effective way just by leveraging what utilities have already deployed—hundreds of thousands of meters and the mesh networks that are communicating with those meters.” Tumilowicz said a variety of factors can influence whether and when a microgrid gets deployed. Sometimes, a company is focused on running cleaner and greener operations. Other times, the grid a company is connected to may have reliability challenges that are affecting business adversely, or the company may just want to be energy independent, so the decision is frequently case specific. “The customer has this motivation to have this backup concept known as resiliency—if the grid’s not there for me, I’ll be there for me,” he said. 
“Generally speaking, nationally, we’re well above 99.9% grid reliability,” Tumilowicz noted. Yet, even when power outages are rare, a microgrid can still provide value. “It can provide flexible services, such as capacity or resource adequacy, or energy services back to the distribution and the transmission up to the market operator level,” explained Tumilowicz. “So, this is a whole other way to be able to start thinking about how we participate with microgrids when 99-plus percent of the time they’re grid connected, but they’re also there for when the grid is not connected—in that very low probability of time.” However, the return on investment for microgrid systems is highly affected by location. “If you’re in Australia, the equation is different than if you’re in Hawaii, versus if you’re in the northeast U.S.—one of the better-known accelerated paybacks to do this,” said Tumilowicz. For example, in areas where the market operator, such as an independent system operator or regional transmission organization, places a high value on peak power reductions within its system, the economics for microgrid owners can be greatly improved. But regardless of what may have driven the initial decision to create a microgrid, Tumilowicz said being flexible is important. “You might deploy your microgrid to satisfy three use cases and market mechanisms that exist in the beginning of 2024, but you need to be open and receptive—and this is where the innovation comes in—to add use cases over time, because the system is going through a significant energy transition, and you need to be dynamic and accommodating to do that,” he said.
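Tumilowicz’s “well above 99.9%” reliability figure translates directly into expected outage hours, which is a quick way to see why backup resiliency still has value:

```python
# Convert an availability percentage into expected outage hours per year.

HOURS_PER_YEAR = 8760

def outage_hours(availability):
    """Expected hours of outage per year at a given availability (0 to 1)."""
    return (1.0 - availability) * HOURS_PER_YEAR

for availability in (0.999, 0.9999):
    print(f"{availability:.2%} available -> {outage_hours(availability):.2f} outage hours/year")
```

Even at "three nines," customers can expect nearly nine hours without power in a typical year.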



150. How Coal Fly Ash Is Reducing CO2 Emissions and Improving Concrete

Concrete is the most widely used construction material in the world. One of the key ingredients in concrete is Portland cement. The American Concrete Institute explains that Portland cement is a product obtained by pulverizing material consisting of hydraulic calcium silicates to which some calcium sulfate has usually been provided as an interground addition. When first made and used in the early 19th century in England, it was termed Portland cement because its hydration product resembled a building stone from the Isle of Portland off the British coast. Without going into detail, it suffices to say that a great deal of energy is required to produce Portland cement. The chemical and thermal combustion processes involved in its production are a large source of carbon dioxide (CO2) emissions. According to Chatham House, a UK-based think tank, more than 4 billion tonnes of cement are produced each year, accounting for about 8% of global CO2 emissions. However, fly ash from coal-fired power plants is a suitable substitute for a portion of the Portland cement used in most concrete mixtures. In fact, substituting fly ash for 20% to 25% of the Portland cement used in concrete mixtures has been proven to enhance the strength, impermeability, and durability of the final product. Therefore, using fly ash for this purpose rather than placing it in landfills or impoundments near coal power plants not only reduces waste management at sites, but also reduces CO2 emissions and improves concrete performance. Rob McNally, Chief Growth Officer and executive vice president with Eco Material Technologies, explained as a guest on The POWER Podcast that the ready-mix concrete industry has been reaping the benefits of using fly ash for years. “In terms of economics, fly ash was typically cheaper than Portland cement. 
It also has beneficial properties that typically make it stronger long term and reduce permeability, which keeps water out of the concrete mixture and helps concrete to last longer. And, then, it’s also environmentally friendly, because they’re using what is a waste product as opposed to more Portland cement—and Portland cement is highly CO2 intensive. For every tonne of Portland cement produced, it’s almost a tonne of CO2 that’s introduced into the atmosphere. So, they have seen those benefits for years with the use of fresh fly ash,” McNally said. However, as climate change concerns have grown, many power companies have come under pressure to retire coal-fired power plants. As plants are retired, fresh fly ash has become less and less available. “The availability of fresh fly ash is declining,” said McNally. “In some places—many places actually—around the country, replacement rates where 20% of Portland cement used to be replaced by fly ash are now down in the single digits. But that’s a reflection of fly ash availability.” Eco Material Technologies, which claims to be the leading producer of sustainable cementitious materials in the U.S., has a solution, however. It has developed a fly ash harvesting process and has nine fly ash harvesting plants in operation or under development to harvest millions of tons of landfilled ash from coal power plants. Locations include sites in Arizona, Georgia, North Dakota, Oregon, and Texas. “There are billions—with a b—of tons of impounded fly ash around the country, so we have many, many years of supply,” McNally said. Still, Eco Material is not resting its business solely on fly ash harvesting, or marketing fresh fly ash, which it has also done for years. “The other piece where we will fill the gap that fresh fly ash leaves behind is with the green cement products. Because with those, we’re able to use natural pozzolans, like volcanic ash, and process those and replace 50% plus of Portland cement in concrete mixes. 
So, we think there’s an answer for the decline in fly ash and that’s where the next leg of our business is taking [us].”
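As a rough back-of-the-envelope illustration of the figures quoted above (roughly one tonne of CO2 per tonne of Portland cement, and a 20% to 25% fly ash substitution rate), the sketch below estimates the CO2 avoided for a hypothetical project; the function name and the 1,000-tonne example quantity are illustrative assumptions, not figures from the episode:

```python
# Back-of-envelope estimate of CO2 avoided by substituting fly ash
# for a share of the Portland cement in a concrete mix.
# Assumption (per the episode): ~1 tonne CO2 per tonne of Portland cement.

CO2_PER_TONNE_CEMENT = 1.0  # tonnes CO2 per tonne of Portland cement (approx.)

def co2_avoided(cement_tonnes: float, substitution_rate: float) -> float:
    """Tonnes of CO2 avoided when `substitution_rate` of the Portland
    cement in a mix is replaced by fly ash."""
    return cement_tonnes * substitution_rate * CO2_PER_TONNE_CEMENT

# Hypothetical project specifying 1,000 tonnes of cementitious material:
low = co2_avoided(1000, 0.20)   # 20% substitution -> 200 tonnes avoided
high = co2_avoided(1000, 0.25)  # 25% substitution -> 250 tonnes avoided
print(f"CO2 avoided: {low:.0f} to {high:.0f} tonnes")
```

At the substitution rates cited in the episode, roughly a fifth to a quarter of the cement-related emissions of a given mix are avoided, which is what makes the material attractive to ready-mix producers beyond its price advantage.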



149. DOE Competition Helps College Students Prepare for Cyber Jobs in the Energy Industry

There is growing demand for cybersecurity professionals all around the world. According to the “2023 Official Cybersecurity Jobs Report,” sponsored by eSentire and released by Cybersecurity Ventures, there will be 3.5 million unfilled jobs in the cybersecurity industry through 2025. Furthermore, leaving these positions unfilled can be costly. The researchers said damages resulting from cybercrime are expected to reach $10.5 trillion annually by 2025. In response to the escalating demand for adept cybersecurity professionals in the U.S., the Department of Energy (DOE) has tried to foster a well-equipped energy cybersecurity workforce through a hands-on operational technology cybersecurity competition with real-world challenges. On Nov. 4, the DOE hosted the ninth edition of its CyberForce Competition. The all-day event, led by DOE’s Argonne National Laboratory (ANL), drew 95 teams—with nearly 550 students total—from universities and colleges across the nation. This year, the focus was on distributed energy resources, including solar panels and wind turbines. “The CyberForce Competition comes out of the Department of Energy’s Office of Cybersecurity, Energy Security, and Emergency Response, which is CESER for short,” Amanda Theel, group leader for workforce development at ANL, said as a guest on The POWER Podcast. “Their main goal for this is really to help develop the pipeline of qualified cybersecurity applicants for the energy sector. And I say that meaning, we really dive heavily on the competition and looking at the operational technology side, along with the information technology side.” Theel said each team gets about six or seven virtual machines (VMs) that they have to harden and defend to the best of their ability. Besides monitoring and protecting the VMs, which include normal business systems such as email and file servers, the teams also have to defend grid operations and other energy resources. 
“We have a Red Team that’s constantly trying to either come into the system from your regular attack-defend penetration. We also have a portion of our Red Team that we like to call our ‘assumed breach,’ so we assume that adversary is already in the system,” Theel explained. “The Blue Team, which is what we call our college students, their job is to work to try to get those Red Team members out.” She said they also have what they call “our whack-a-mole,” which are vulnerabilities built into the system for the Blue Team members to identify and patch. Besides the college students, ANL brings in volunteers—high school students, parents, grandparents, people from the lab, and people from the general public—to test websites and try to pay pretend bills by logging in and out of the simulated systems. Theel said this helps students understand that while security is important, they must also ensure that owners, operators, and end-users can still get in and use the systems as intended. “So, you have to kind of play the balance of that,” she said. Other distractions are also incorporated into the competition, such as routine meetings and requests from supervisors, for example, to review a forensics file and check the last time a person in question logged into the system. The intention is to overload the teams with tasks so evaluators can see if the most critical items are prioritized and remedied. For the second year in a row, a team from the University of Central Florida (UCF) won first place in the competition (Figure 1). They received a score of 8,538 out of 10,000. Theel said the scores do vary quite significantly from the top-performing teams to lower-ranked groups. “What we’ve found is obviously teams that have returned year after year already have that—I’ll use the word expectation—of already knowing what to expect in the competition,” explained Theel. “Once they come to year two, we’ve definitely seen massive improvements with teams.”



148. Advanced Nuclear Fuel Approved for Installation at Plant Vogtle

Southern Nuclear, Southern Company’s nuclear power plant operations business, announced in late September that it had received “first-of-a-kind approval” from the Nuclear Regulatory Commission (NRC) to use advanced fuel—accident tolerant fuel (ATF)—exceeding 5% enrichment of uranium-235 (U-235) in Plant Vogtle Unit 2. The fuel is expected to be loaded in 2025 and will have enrichments up to 6 weight % U-235. The company said this milestone “underscores the industry’s effort to optimize fuel, enabling increased fuel efficiency and long-term affordability for nuclear power plants.” “5 weight % was deeply ingrained in all of our regulatory basis, licensing basis for shipment containers, licensing basis for the operation of the plants—it was somewhat of a line drawn in the sand,” Johnathan Chavers, Southern Nuclear’s director of Nuclear Fuels and Analysis, explained as a guest on The POWER Podcast. “Testing of the increased enrichment component has been a licensing and regulatory exercise to see how we would move forward with existing licensing infrastructure to install weight percents above that legacy 5 weight %,” Chavers told POWER. Chavers said ATF became a focal point for the industry in March 2011 following the magnitude 9.0 Tohoku-Oki earthquake and resulting tsunami, which caused a crisis at the Fukushima nuclear power plant. “In 2012, Congress used the term ‘accident tolerant fuel’ for the first time in an Appropriations Act, and that’s where it all began,” Chavers explained. “It was really for the labs and the DOE [Department of Energy] to incentivize enhanced safety for our fuel in response to the Fukushima incident.” In 2015, the DOE issued a report to Congress outlining details of its accident tolerant fuel program. The report, titled “Development of Light Water Reactor Fuels with Enhanced Accident Tolerance,” set a target for inserting a lead fuel assembly into a commercial light water reactor by the end of fiscal year 2022. 
Notably, Southern Company achieved the goal four years early. “We were the first in the world to install fueled accident tolerant fuel assemblies of different technologies that were developed by GE at our Hatch unit in 2018,” Chavers noted. The following year, Southern Nuclear installed four Framatome-developed GAIA lead fuel assemblies containing enhanced accident-tolerant features applied to full-length fuel rods in Unit 2 at Plant Vogtle. “This is the third set that we’re actually installing that is a Westinghouse-developed accident tolerant fuel, which also includes enrichments that exceed the historical limits of 5 weight %,” Chavers explained. While enhanced safety is perhaps the most significant benefit provided by ATF, advanced nuclear fuel is also important in lowering the cost of electricity. “Our ultimate goal is to enable 24-month [refueling] cycles for all U.S. nuclear power plants, to improve the quality of life for our workers, to lower the cost of electricity,” said Chavers. “Fundamentally, [nuclear power] is a clean green power source—carbon-free. The more we can keep it running—that’s something we’re trying to go after,” noted Chavers. “We see a lot of positives in this program in that not only are we improving safety, lowering the cost, but we’re also increasing the amount of megawatts electric we can get out of the nuclear assets.”