
For this episode of the Data Center Frontier Show podcast, DCF's editors sat down with James Walker, BEng, MSc, CEng, PEng, CEO and board member of Nano Nuclear Energy Inc., and Jay Jiang Yu, Nano Nuclear Energy's founder, executive chairman and president, for a discussion regarding industry news and technology updates surrounding small modular reactor (SMR) and microreactor nuclear onsite power generation systems for data centers.
James Walker is a nuclear physicist who was the project lead and manager for constructing the new Rolls-Royce Nuclear Chemical Plant, the UK Subject Matter Expert for the UK Nuclear Material Recovery Capabilities, and the technical project manager for constructing the UK reactor core manufacturing facilities. Walker has extensive experience in engineering and project management, particularly within nuclear engineering, mining engineering, mechanical engineering, construction, manufacturing, engineering design, infrastructure, and safety management. He has executive experience in several public companies, as well as in acquiring and re-developing the only fluorspar mine in the U.S.
Jay Jiang Yu is a serial entrepreneur with over 16 years of capital markets experience on Wall Street. He is a private investor in a multitude of companies and has provided numerous private and public company executives with corporate advisory services such as capital funding, mergers and acquisitions, structured financing, IPO listings, and other business development services. He is a self-taught private investor whose relentless passion for international business has helped him develop key, strategic and valuable relationships throughout the world. Yu leads the corporate structuring, capital financings, executive-level recruitment, governmental relationships and international brand growth of Nano Nuclear Energy Inc. Previously, he worked as an analyst in the Corporate & Investment Banking Division at Deutsche Bank in New York City.
Here's a timeline of key points discussed during the podcast:
0:22 - Nano Nuclear Energy Expert Introductions
1:38 - Topic Set-up Re: DCF Senior Editor David Chernicoff's recent data center microreactor and SMR explorations.
1:59 - How microreactors might impact the data center industry. (Can time-to-market hurdles be shrunk?)
2:20 - Chernicoff begins the interview with James and Jay. How the NuScale project difficulties in the SMR segment resulted in the DoD pulling back on preliminary microreactor contracts in Alaska due to market uncertainties directly related to NuScale.
3:23 - Perspectives on NuScale and nuclear power.
4:21 - James Walker on NuScale vs. microreactor prospects:
"They have a very good technology. They're still the only licensed company out there, and they probably will bounce back from this. It's not good optics when people are expecting product to come out of the market. And NuScale was to be the first, but market conditions and the structure of SPACs and the lack of us infrastructure can all complicate what they want to do. Half the reason for them taking so long is because the infrastructure was not in place to support what they wanted to do.
But even hypothetically, even if the SMR market, as an example, were to collapse, microreactors are really targeting a very different area of the market. SMRs are looking to power cities and big things like that. With microreactors, you're looking at mine sites, charging stations for EVs, disaster relief areas, military bases, remote habitation, places that principally fuel all their energy needs with diesel. It's hitting a different market. So even if the SMR market goes away, there's still a huge, tremendous upside: a potential untapped market in the microreactor space."
5:39 - DCF Editor in Chief Matt Vincent asks, "What are the pros and cons of the prospects for microreactors versus what we're commonly thinking about in terms of SMRs for data centers?"
5:51 - Nano Nuclear's James Walker
Dec 19, 2023
35 min

For this episode of the Data Center Frontier Show podcast, we sit down with Brian Kennedy, Director of Business Development and Marketing at Natron Energy.
As recounted by Kennedy in the course of our talk, Colin Wessells founded Natron Energy as a Stanford PhD student in 2012. His vision in building the company, which started in a garage in Palo Alto, was to deliver ultra-safe, high-power batteries.
As stated on the company's website, "After countless hours of development with an ever expanding team of scientists and engineers, Natron now operates a state of the art pilot production line for sodium-ion batteries in Santa Clara, California."
The company notes that most industrial power utilizes decades-old, more environmentally hazardous battery technology such as lead-acid and lithium-ion.
In contrast, Natron says its "revolutionary sodium-ion battery leverages Prussian Blue electrode materials to deliver a high power, high cycle life, completely fire-safe battery solution without toxic materials, rare earth elements, or conflict minerals."
In 2020, Natron's sodium-ion battery became the world's first to achieve a UL 1973 listing, and commercial shipments to customers in the data center, forklift, and EV fast-charging markets soon began.
Natron notes that its technology leverages standard, existing li-ion manufacturing techniques, allowing the company to scale quickly. With U.S. and Western-based supply chain and factory agreements in place, Natron says it saw its manufacturing capacity increase 200x in 2022.
In the course of the podcast discussion, Natron's Kennedy provides an update on Natron's data center industry doings this year and into next year. Here's a timeline of key points discussed:
:29 - 7x24 Fall Conference Memories
:51 - Teeing Up Sodium Ion
1:18 - Talking Pros and Cons, Sustainability
2:15 - Handing It Over to Brian
2:30 - Background on Natron Energy and founder/CEO Colin Wessells
2:55 - Background on Sodium Ion Technology
3:11 - Perfecting a New Sodium Ion Chemistry and Manufacturing with 34 International Patents In Play
3:28 - The Prominent Feature of Sodium-Ion Technology Is Its Inherent Safety; Eliminates Risk of Thermal Runaway
3:51 - U.S. Government ARPA-E Advanced Technology Grants Have Been Pivotal Funding for Natron
4:13 - Sodium Ion Battery Technology Comparison and Value Proposition
5:28 - How Often Is a Data Center's Battery Punctured? Ever Seen a Forklift Driven Through One?
6:10 - On the Science of the Natron Cell's Extremely High Power Density, Fast Discharge and Recharge
6:55 - Comparing Sodium-Ion to Most of the Lithium Chemistries
7:25 - The Meaning of UL Tests
8:00 - Natron Has Published Unredacted UL Test Results
8:35 - On the Longevity of Sodium Ion Batteries
9:51 - "There's No Maintenance Involved."
10:18 - Natron Blue Rack: Applications
10:52 - How Natron Is in the Process of Launching Three Standard Battery Cabinets
11:20 - Performance Enhancements Will Take Standard Data Center Cabinets "Well North" of 250 kW
11:45 - Though Data Centers Are Its Largest Market, Natron Also Serves the Oil and Gas Peak Load Shaving and Industrial Spaces
12:21 - Sustainability Advantages
12:51 - ESG Is About More Than Just Direct Emissions
13:15 - The Importance of Considering the Sourcing and Mining of Battery Elements
14:09 - "The Fact That You May Be Pushing [Certain] Atrocities Up the Supply Chain Where You Can't See Them Doesn't Make It OK"
14:34 - Notes on Supply Chain Security with Secure, U.S.-Based Manufacturing
15:45 - Wrapping Up: Global UPS Manufacturer Selects Natron Battery Cabinet; Looking Ahead to 2024.
Here are links to some related DCF articles:
Will Battery Storage Solutions Replace Generators?
New NFPA Battery Standard Could Impact Data Center UPS Designs
Microsoft Taps UPS Batteries to Help Add Wind Power to Ireland’s Grid
Data Center of the Future: Equinix Test-Drives New Power, Cooling Solutions
Corscale Will Use Nickel-Zinc Batteries in New Data Center Campus
Dec 5, 2023
17 min

In this episode of the Data Center Frontier Show podcast, Matt Vincent, Editor-in-Chief of Data Center Frontier, and Steven Carlini, Vice President of Innovation and Data Centers for Schneider Electric, break down the challenges of AI for each physical infrastructure category including power, cooling, racks, and software management.
Dec 5, 2023
17 min

For this episode of the Data Center Frontier Show podcast, DCF's Editor in Chief Matt Vincent chats with Brian Green, EVP Operations, Engineering and Project Management, for EdgeConneX. The discussion touches on data center operations, sustainable implementations/deployments, renewable power strategies, and ways to operationalize renewables in the data center.
Under Brian’s leadership, the EdgeConneX Houston data center completed a year-long project measuring the viability of 24/7 carbon-free energy utilizing AI-enabled technology. With this approach, EdgeConneX ensured the data center is powered with 100% renewable electricity, and proved that carbon-free energy matching in real-time hourly increments can be applied to new and existing data centers even when the local power grid relies on fossil-fueled electricity generation.
As a result, for every hour of the year, EdgeConneX and its customers can operate without emitting any CO2 and with zero reliance on fossil-fueled standby generation during dark or cloudy periods. This innovative program will be duplicated at other EdgeConneX facilities globally.
Another real-world example discussed involves a facility where the local community complained about the noise of the fans. Brian's team worked to improve the noise level by changing fan speeds, and as a result, the data center and the local community realized multiple benefits, including enhanced community relations by removing the noise disturbance, increased efficiencies, and reduced power consumption, a big cost-saver for the data center.
Along the way, Brian explains how he and the EdgeConneX team are big believers in the company's motto: Together, we can innovate for good.
Nov 29, 2023
12 min

For this special episode of the DCF Show podcast, Data Center Frontier's founder and present Editor at Large, Rich Miller, returns for a visit. Tune in to hear Rich engage with the site's daily editors, Matt Vincent and David Chernicoff, in a discussion covering a range of current data center industry news and views.
Topics include: Dominion Energy's transmission line expansion in Virginia; Aligned Data Centers' market exit in Maryland over a rejected plan for backup diesel generators; an update on issues surrounding Virginia's proposed Prince William Digital Gateway project; Rich's take on the recent Flexential/Cloudflare outages in Hillsboro, Oregon; and more.
Here's a timeline of key points discussed on the podcast:
:10 - For those concerned that the inmates might be running the asylum, the doctor is now in: Rich discusses his latest beat as DCF Editor at Large.
1:30 - We look at the power situation in Northern Virginia as explained in one of Rich's latest articles, vis-à-vis what's going to be required to support growth already in the pipeline, in the form of contracts that Dominion Energy has for power. "Of course, the big issue there is transmission lines," adds Miller. "That's the real constraint on data center power delivery right now. You can build local lines and even substations much more quickly than you can transmission at the regional level. That's really where the bottlenecks are right now."
3:00 - Senior Editor David Chernicoff asks for Rich's take on Aligned Data Centers' recent market exit in Maryland, related to its rejected plan for backup diesel generators. "Is this really going to be the future of how large-scale data center projects are going to have to be approached, with more focus put on dealing with permission to build?" wonders Chernicoff, adding, "And are we going to see a more structured data center lobbying effort on the local level beyond what, say, the DCC [Data Center Coalition] currently does?"
5:19 - In the course of his response, Rich says he thinks we'll see just about every data center company realizing the importance of doing their research on the full range of permissions required to build these megascale campuses, which are only getting bigger.
6:12 - Rich adds that he thinks the situation in Maryland illustrates how it's important for data center developers to step back for a strategic discussion regarding depth of planning. "The first thing to know," he points out, "is that Maryland was eager to have the data center industry. They specifically passed incentives that would make them more competitive with Virginia. They saw that Northern Virginia was getting super crowded...and they thought, we've got lots of resources up here in Frederick County, let's see if we can bring some of these folks across the river. And based on that, the Quantum Loophole team found this site."
8:20 - Rich goes on to note how "the key element for a lot of data centers is fiber, and a key component, both strategically and from an investment perspective [in Maryland] is that Quantum Loophole needed to have a connection to the Northern Virginia data center cluster in Ashburn, in Data Center Alley - which is not that far as the crow flies, but to get fiber there, they wound up boring a tunnel underneath the Potomac River, an expensive and time-consuming project that they're in the late stages of now. That's a big investment, and all that was done with the expectation that Maryland wanted data centers."
10:26 - Rich summarizes how the final ruling for Aligned in Maryland "was, effectively, that you can have up to 70 MW but beyond that, you have to follow this other process [where] you're more like a power plant than a data center with backup energy." He adds, "I think one of the issues was [in determining], will all of this capacity ever be turned on all at once? Obviously with diesel generators, that's a lot of emissions. So the air quality boards are wrestling with, on the one hand, having a large company
Nov 21, 2023
36 min

Ten years into the fourth industrial revolution, we now live in a “datacentered” world where data has become the currency of both business and personal value. In fact, the value proposition for every Fortune 500 company involves data. And now, seemingly out of nowhere, artificial intelligence has come along and is looking to be one of the most disruptive changes to digital infrastructure that we’ve ever seen.
In this episode of the Data Center Frontier Show podcast, Matt Vincent, Editor-in-Chief of Data Center Frontier, talks to Sean Farney, Vice President for Data Center Strategy for JLL Americas, about how AI will impact data centers.
Nov 16, 2023
34 min

The Legend Energy Advisors (Legend EA) vision of energy usage is one in which all companies have real-time visibility into related processes and factors such as equipment efficiency, labor intensity, and consumption of power and other energy resources across their operations.
During this episode of the Data Center Frontier Show podcast, the company's CEO and founder, Dan Crosby, and his associate, Ralph Rodriguez, RCDD, discussed the firm's commodity risk assessment and infrastructure services, along with the Legend Analytics platform for real-time metering of energy usage and efficiency.
The firm contends that only through such "total transparency" will their clients be able to "radically impact" energy and resource consumption intensity at every stage of their businesses.
"My background was in construction and energy brokerage for a number of years before founding Legend," said Crosby. "The basis of it was helping customers understand how they're using energy, and how to use it better so that they can actually interact with markets more proactively and intelligently."
"That helps reduce your carbon footprint in the process," he added. "Our mantra is: it doesn't matter whether you're trying to save money or save the environment, you're going to do both of those things through efficiency -- which will also let you navigate markets more efficiently."
Legend EA's technology empowers the firm's clients to integrate all interrelated energy components of their businesses, while enabling clear, coherent communication across them.
This process drives transparency and accountability on “both sides of the meter,” as reckoned by the company, the better to eliminate physical and financial waste.
As stated on the firm's website, "This transparency drives change from the bottom up, enabling legitimate and demonstrable changes in enterprises’ environmental and financial sustainability."
Legend Analytics is offered as a software-as-a-service (SaaS) platform, with consulting services tailored to the needs of individual customers, including industrial firms and data center operators navigating the power market.
Additionally, Legend EA recently introduced the Ledge device, a network interface card (NIC), as a way to securely gather energy consumption data from any system in an organization and bring it to the cloud in real time.
Here's a timeline of key points discussed on the podcast:
1:15 - Crosby details the three interconnected parts of his firm's service: commodity risk assessment, infrastructure services, and the Legend Analytics platform for understanding energy usage and efficiency.
2:39 - Crosby explains how the Legend Analytics platform works in the case of data center customers, by providing capabilities such as real-time metering at various levels of a facility, as well as automated carbon reporting.
4:46 - The discussion unpacks how the platform is offered as a SaaS, and includes consulting services tailored to each customer's needs.
7:49 - Notes on how the Legend Analytics platform can gather data from disparate systems and consolidate it into one dashboard, allowing for AI analysis and identification of previously unknown issues.
10:25 - Crosby reviews the importance of accurate and real-time emissions tracking for ESG reporting, and provides examples of how the Legend Analytics platform has helped identify errors and save costs for clients.
12:23 - Crosby explains how the company's new, proprietary NIC device, dubbed the Ledge, can securely gather data from any system and bring it to their cloud in real time, lowering costs and improving efficiency.
23:54 - Crosby touches on issues including challenges with power availability; trends in building fiber to power; utilizing power capacity from industrial plants; and on-site generation for enabling stable voltage.
Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn, following us on X/Twitter and F
Nov 7, 2023
30 min

For this episode of the Data Center Frontier Show Podcast, we sat down for a chat with Andy Pernsteiner, Field CTO of VAST Data.
The VAST Data Platform embodies a revolutionary approach to data-intensive AI computing which the company says serves as "the comprehensive software infrastructure required to capture, catalog, refine, enrich, and preserve data" through real-time deep data analysis and deep learning.
In September, VAST Data announced a strategic partnership with CoreWeave, whereby CoreWeave will employ the VAST Data Platform to build a global, NVIDIA-powered accelerated computing cloud for deploying, managing and securing hundreds of petabytes of data for generative AI, high performance computing (HPC) and visual effects (VFX) workloads.
That announcement followed news in August that Core42 (formerly G42 Cloud), a leading cloud provider in the UAE, and VAST Data had joined forces in an ambitious strategic partnership to build a central data foundation for a global network of AI supercomputers that will store and learn from hundreds of petabytes of data.
This week, VAST Data announced another strategic partnership, this time with Lambda, an Infrastructure-as-a-Service and compute provider for public and private NVIDIA GPU infrastructure, that will enable a hybrid cloud dedicated to AI and deep learning workloads. The partners will build an NVIDIA GPU-powered accelerated computing platform for generative AI across both public and private clouds. Lambda selected the VAST Data Platform to power its On-Demand GPU Cloud, providing customer GPU deployments for LLM training and inference workloads.
The Lambda, CoreWeave and Core42 announcements represent three burgeoning AI cloud providers who, within the short space of three months, have chosen to standardize on VAST Data as the scalable data platform behind their respective clouds. Such key partnerships position VAST Data to innovate through a new category of data infrastructure that will build the next-generation public cloud, the company contends.
As Field CTO at VAST Data, Andy Pernsteiner is helping the company's customers to build, deploy, and scale some of the world’s largest and most demanding computing environments. Andy spent the past 15 years focused on supporting and building large scale, high performance data platform solutions.
As recounted by his biographical statement, from his humble beginnings as an escalations engineer at pre-IPO Isilon, to leading a team of technical ninjas at MapR, Andy has consistently been on the frontlines of solving some of the toughest challenges that customers face when implementing big data analytics and new-generation AI technologies.
Here's a timeline of key points discussed on the podcast:
0:00 - 4:12 - Introducing the VAST Data Platform; recapping VAST Data's latest news announcements; and introducing VAST Data's Field CTO, Andy Pernsteiner.
4:45 - History of the VAST Data Platform. Observations on the growing "stratification" of AI computing practices.
5:34 - Notes on implementing the evolving VAST Data managed platform, both now and in the future.
6:32 - Andy Pernsteiner: "It won't be for everybody...but we're trying to build something that the vast majority of customers and enterprises can use for AI/ML and deep learning."
07:13 - Reading the room, when very few inside it have heard of a GPU or know what its purpose and role is inside AI/ML infrastructure.
07:56 - Andy Pernsteiner: "The fact that CoreWeave exists at all is proof that the market doesn't yet have a way of solving for this big gap between where we are right now, and where we need to get to, in terms of generative AI and in terms of deep learning."
08:17 - How VAST started as a data storage platform, and was extended to include an ambitious database geared for large-scale AI training and inference.
09:02 - How another aspect of VAST is consolidation, "considering what you'd have to do to stitch together a generative AI practice in the cloud."
09:57 - O
Oct 24, 2023
37 min

For the latest episode of the Data Center Frontier Show Podcast, editors Matt Vincent and David Chernicoff sat down with Mike Jackson, Global Director of Product, Data Center and Distributed IT Software for Eaton.
The purpose of the talk was to learn about the company's newly launched BrightLayer Data Centers suite, and how it covers the traditional DCIM use case - and a lot more.
According to Eaton, the BrightLayer Data Centers suite's digital toolset enables facilities to efficiently manage an increasingly complex ecosystem of IT and OT assets, while providing full system visibility into data center white space, grey space and/or distributed infrastructure environments.
"We're looking at a holistic view of the data center and understanding the concepts of space, power, cooling, network fiber," said Jackson. "It starts with the assets and capacity, and understanding: what do you have, and how is it used?"
Here's a timeline of points discussed on the podcast:
0:39 - Inquiring about the BrightLayer platform and its relevance to facets of energy, sustainability, and design in data centers.
7:57 - Explaining the platform's "three legs of the stool": Data center performance management, electrical power monitoring, and distributed IT performance management. Jackson describes how all three elements are part of one code base.
10:42 - Jackson recounts the BrightLayer Data Centers suite's beta launch in June and the product's official commercial launch in September; out of the gate, more than 30 customers are already actively using the platform across different use cases.
13:02 - Jackson explains how the BrightLayer Data Centers suite's focus on performance management and sustainability is meant to differentiate the platform from other DCIM systems, in attracting both existing and new Eaton customers.
17:16 - Jackson observes that many customers are being regulated or pushed into sustainability goals, and how the first step for facilities in this situation is measuring and tracking data center consumption. He further contends that the BrightLayer tools can help reduce data center cooling challenges while optimizing workload placement for sustainability, and cost savings.
20:11 - Jackson talks about the importance of integration with other software and data center processes, and the finer points of open API layers and out-of-the-box integrations.
22:26 - In terms of associated hardware, Jackson reviews the Eaton EnergyAware UPS series' ability to proactively manage a data center's power drop via handling utility and battery sources at the same time. He further notes that many customers are now expressing interest in microgrid technology and use of alternative energy sources.
27:21 - Jackson discusses the potential for multitenant data centers to use smart hardware and software to offset costs and improve efficiency, while offering new services to customers and managed service providers.
Keep pace with the fast-moving world of data centers and cloud computing by connecting with Data Center Frontier on LinkedIn.
Oct 10, 2023
35 min

For this episode of the Data Center Frontier Show Podcast, DCF editors Matt Vincent and David Chernicoff chat with Tiffany Osias, VP of Colocation for Equinix.
Osias begins by discussing the company's investment in a range of data center innovations to help its customers enter new markets and gain competitive advantages through burgeoning AI and machine learning tools.
In the course of the discussion, we also learn about Equinix's deployment of closed loop liquid cooling technologies in six data centers in 2023, and where the company stands on offering increased rack densities for powering AI workloads.
Osias also discusses developments along the course of Equinix helping its customers to optimize their hybrid cloud and multi-cloud architectures and strategies.
Data center sustainability also factors into the conversation, as Osias touches on how Equinix aims to achieve 100% renewable energy coverage by 2030.
Here's a timeline of key discussion points in Data Center Frontier's podcast interview with Equinix VP of Colocation, Tiffany Osias:
1:09 - Osias explains how Equinix invests in data center innovation to help its customers enter new markets, contain costs, and gain a competitive advantage, especially as AI and machine learning become more prevalent in decision-making processes.
1:50 - The discussion turns to how Equinix enables its customers' use of AI by providing secure, reliable service and efficient cooling options, including advanced liquid cooling technologies.
4:07 - Osias remarks on how Equinix plans to deploy closed loop liquid cooling in six data centers in 2023 to meet increasing demand from customers for full liquid-to-liquid environments.
5:49 - We learn how Equinix offers high-density racks for customers running 10-50+ kW per rack, and provides a bespoke footprint for each customer based on their power consumption needs and cooling capabilities.
7:14 - Osias remarks on how liquid cooling can have a positive impact on data center sustainability by reducing physical footprint, carbon emissions from manufacturing, and improving cooling efficiency. The company's use of renewable energy is also examined.
10:19 - Osias describes how AI impacts the Equinix approach to data center infrastructure, and the importance of partnerships and interconnection strategies.
12:09 - Osias discusses how Equinix aims to achieve 100% renewable energy coverage by 2030 and has made progress towards that goal.
13:21 - Notes on how Equinix helps customers optimize their hybrid multi-cloud architecture and interconnect with cloud and storage providers.
Read the full article about the podcast for interview transcript highlights, plus a recent video from Equinix regarding data center sustainability.
Sep 26, 2023
24 min