
Go slow to go fast: How to improve hosting and loading capacity maps for interconnection

June 06, 2025

By Peter Nearing and Glenn Peters

Planning and data validation are essential for utilities looking to develop reliable hosting and loading capacity maps

As a distribution utility, you manage customer interconnection requests daily. Sometimes, it's from a large commercial building owner or an independent power producer. And, nowadays, perhaps it's even from a data center facility. Today, building owners may be using all-electric heat. Power producers may be connecting a combined solar and energy storage installation. And data centers may have very high load requirements as well as stringent reliability requirements.

All of these requirements complicate the interconnection process. So, even though you may have distribution infrastructure in the area, there may or may not be grid capacity to serve these requests.

How would you proceed to find out? It's a multistep process.

First, the interconnection requestor would need to develop a scoping study. This will help you understand what load or generation they would like to connect. Then, they would need to apply to you, the distribution utility. The utility would then conduct a study using the interconnection information as the basis. This process could be lengthy and entail significant costs. Plus, in the end, the requestor may find out that the selected location, or their generation or load profile, is not ideal. At this point, they are back to the drawing board: try a new location, change the generation or load profile, or contribute to grid upgrades.

One way to streamline the process? The local distribution utility can publicly share hosting and loading capacity maps. Those maps provide information about a distribution grid's ability to host new loads or generation facilities. A hosting and loading capacity map is a detailed map of a geographic region. It usually contains background data of roads and geographic features, similar to an interactive map like those you have on your phone. Drawn on top of this map are details of the distribution system, showing distribution lines and facilities. When a user clicks on any one of these distribution lines, an information box displays details specific to that feeder or line. This may include:

  • The feeder's tag
  • System voltage
  • Analysis timestamps
  • Load capacity in kVA
  • Hosting capacity in megawatts or kilowatts
  • Queued generation or load

New developments may even calculate dynamic capacities as opposed to static capacity. The information is readily available in an externally facing, web-based format. That way, customers can easily find this information and make plans based on real-world data.
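
To make this concrete, here is a minimal sketch of the kind of per-feeder record that could sit behind such an info box, assuming a GeoJSON-style feature. The field names and values are illustrative only; they are not a standard schema, and each utility publishes its own attributes.

```python
# Minimal sketch of a per-feeder record behind a hosting/loading capacity map.
# Field names and values are illustrative only; real map schemas vary by utility.
feeder_feature = {
    "type": "Feature",
    "geometry": {
        "type": "LineString",
        "coordinates": [[-73.95, 42.65], [-73.94, 42.66]],  # simplified feeder route
    },
    "properties": {
        "feeder_tag": "FDR-1234",
        "system_voltage_kv": 13.2,
        "analysis_timestamp": "2025-05-01T00:00:00Z",
        "load_capacity_kva": 4500,
        "hosting_capacity_mw": 2.8,
        "queued_generation_mw": 1.1,
        "queued_load_mw": 0.4,
    },
}
```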

Granted, a more detailed analysis would still be needed before a project is finally connected. But having this information upfront greatly speeds up and streamlines many aspects of grid interconnection. This is true for both customers and utilities. The customers benefit because they can direct their time and efforts to locations that are more likely to succeed. The utilities benefit because customers are now self-selecting away from unfavorable locations. The result? A reduction in the number of fruitless applications. It's a win-win.

A shared interest

Utility planners, regulators, and end users all have a shared interest in hosting and loading capacity visibility. It will be an input toward enabling the future of the utility grid.

An expected question: If we've managed without hosting and loading capacity maps, why do we need them now?

Because we are seeing more electrification, more grid-responsive loads, and the integration of distributed energy resources. These trends will likely complicate the grid and create bi-directional flows of electricity. Historically, the grid has carried power in only one direction: out from the power source.

Today, utilities, regulators, and end users alike see the value that capacity maps can provide in grid planning. The New York Public Service Commission mandated that investor-owned utilities have hosting capacity maps available by October 1, 2017. Since then, a utility in New York said: "Now we can refer developers to the map, which may indicate that one feeder has the potential to handle six megawatts while another may be able to handle only one megawatt. This gives developers guidance on which locations may be better suited to proceed with an interconnection project."

To meet regulatory requirements and customer demands, the utility's hosting and loading capacity maps will need to be accurate. If the underlying data is flawed, the capacity results will be, too. The old saying of "garbage in, garbage out" applies here. Improving data quality to enable these maps is one of the toughest tasks, and it takes a lot of time and effort for utilities. A major hurdle to adopting the maps is the cost they place on the utility's existing budget. That's why a carefully planned, multiphased approach is crucial. It should solicit and include input from stakeholders, especially the utilities, who know their systems best, as well as end users.

Data is the foundation of the analysis

High-quality data is key to an accurate hosting capacity analysis and map. Bad data from source systems will have a negative impact on the hosting and loading capacity results. If these inaccuracies go uncorrected, they can lead to erroneous results such as large sections of the grid showing zero capacity. This can impact the reliability and accuracy of the capacity maps. Ultimately, the public won't trust them.

Challenges arise if thorough checks are not conducted at critical "gates" of the analysis and mapping process. Utilities may be unaware of problems until customers, developers, or internal detailed studies identify conflicts between the actual capacity of a segment and what the map reports. By this point, the cost of fixing these errors has increased. Worse yet, rework is likely needed to address the issues. And it could impact the credibility and brand reputation of the utilities.

So, it's critical to check the accuracy and reliability of data at every stage of the process. Doing so also helps maintain trust with stakeholders. This is why it's important to have deliberate checks and validation procedures throughout the process.

Go slow to go fast

The "go slow to go fast" approach is an iterative process. The key? The first two phases allow for adjustments when the cost and effort for utilities are lowest, early in the development cycle. During these initial phases, it's critical to answer key questions before jumping to technology solutions.

First, identify the immediate use cases for the hosting capacity analysis and map, and envision how it will function in the future. Then, determine the data sources, assess the data quality, and establish metrics for measuring data quality. Next, define the analysis process for the hosting capacity map, pinpointing where data validation gates are needed and how data will be validated.

One of the most overlooked steps is setting up these gates in the analysis process and using data validation checks to catch inaccuracies. This step can save a lot of rework during both the proof-of-concept and deployment phases. To avoid costly changes later, utilities should address these questions and set up a robust validation process early. Ultimately, it makes the procedure smoother and more efficient.
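
As an illustration only, here is a minimal sketch in Python of how validation gates could be chained so that bad data stops at the first failing gate rather than flowing into the published map. The gate names and check functions are hypothetical and would map to a utility's own source systems and tools.

```python
# A minimal, illustrative sketch of chained validation gates for a hosting
# capacity workflow. Gate names and checks are hypothetical placeholders.
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Gate:
    name: str
    checks: list[Callable[[dict], list[str]]] = field(default_factory=list)

    def run(self, data: dict) -> list[str]:
        """Run every check and collect human-readable issue descriptions."""
        issues: list[str] = []
        for check in self.checks:
            issues.extend(check(data))
        return issues

def run_pipeline(gates: list[Gate], data: dict) -> bool:
    """Stop at the first failing gate instead of carrying bad data forward."""
    for gate in gates:
        issues = gate.run(data)
        if issues:
            print(f"Gate '{gate.name}' failed:")
            for issue in issues:
                print(f"  - {issue}")
            return False
        print(f"Gate '{gate.name}' passed.")
    return True

# Example wiring: three gates matching the checks discussed later in this article.
gates = [
    Gate("Input data from source systems"),
    Gate("Circuit model validations"),
    Gate("Simulation results"),
]
```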

This methodical approach underscores the core principle of go slow to go fast. Careful planning and validation in the early stages lead to faster, more successful outcomes in the long run.

Common data validations during the analysis

In many places, we have seen capacity analyses and maps undergo major development. Now, other jurisdictions are pinpointing what's worked well, and that allows utilities to move more quickly. Below, we share several common data validation gates and what they entail.

Input data from source systems: The focus here is on data quality in the underlying source systems. This can include a utility's GIS or advanced metering infrastructure data. Common data quality issues include:

  • Topology errors
  • Incorrect equipment nameplate values
  • Missing loading/generation data sets
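
A hedged sketch of what checks at this first gate might look like, assuming the GIS and metering extracts have been loaded into pandas DataFrames. The table and column names are placeholders, not a real utility schema.

```python
import pandas as pd

# Illustrative checks on GIS and AMI extracts. Table and column names are
# placeholders, not a real utility schema.
def check_source_data(buses: pd.DataFrame,
                      lines: pd.DataFrame,
                      transformers: pd.DataFrame,
                      meters: pd.DataFrame) -> list[str]:
    issues = []

    # Topology errors: every line must reference buses that exist in the bus table.
    known = set(buses["bus_id"])
    bad_lines = lines[~lines["from_bus"].isin(known) | ~lines["to_bus"].isin(known)]
    if not bad_lines.empty:
        issues.append(f"{len(bad_lines)} line segment(s) reference unknown buses")

    # Incorrect nameplate values: zero, negative, or implausibly large transformer ratings.
    bad_xfmr = transformers[(transformers["rated_kva"] <= 0) | (transformers["rated_kva"] > 50_000)]
    if not bad_xfmr.empty:
        issues.append(f"{len(bad_xfmr)} transformer(s) with implausible kVA ratings")

    # Missing loading/generation data: customer meters with no usage history mapped to them.
    missing = meters[meters["annual_kwh"].isna()]
    if not missing.empty:
        issues.append(f"{len(missing)} meter(s) missing load data")

    return issues
```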

Circuit model validations: This involves issues in a utility's existing or developed circuit models within its power system modeling software. Some common errors include existing violations in the models and convergence issues.
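
A sketch of a baseline check at this gate, assuming pandapower as the modeling tool purely for illustration; any power flow engine works similarly. The voltage and loading thresholds are placeholders a utility would set from its own planning criteria.

```python
import pandapower as pp  # assumed modeling tool, used here only for illustration

# Illustrative circuit-model gate: flag baseline violations and convergence
# problems before any hosting capacity scenarios are run. Thresholds are placeholders.
def check_circuit_model(net) -> list[str]:
    issues = []
    try:
        pp.runpp(net)  # baseline power flow
    except Exception as exc:  # pandapower raises if the power flow does not converge
        return [f"Power flow did not converge: {exc}"]

    voltage_violations = net.res_bus[(net.res_bus["vm_pu"] < 0.95) | (net.res_bus["vm_pu"] > 1.05)]
    overloads = net.res_line[net.res_line["loading_percent"] > 100.0]

    if not voltage_violations.empty:
        issues.append(f"{len(voltage_violations)} bus(es) outside 0.95-1.05 p.u. at baseline")
    if not overloads.empty:
        issues.append(f"{len(overloads)} line(s) loaded above 100% at baseline")
    return issues
```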

Simulation results: This gate looks at the quality of the simulation results and at when a utility planner should investigate further. Common signs of issues in the underlying analysis include widespread zero-capacity results and large deviations from previous simulations. These require more investigation to see if we can trust the results or if there are errors.
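
A simple sketch of how those two red flags could be screened for automatically, assuming results are tabulated per feeder in pandas. The 25% and 50% thresholds are arbitrary placeholders each utility would tune to its own system.

```python
import pandas as pd

# Illustrative sanity checks on per-feeder hosting capacity results.
def check_results(current: pd.DataFrame, previous: pd.DataFrame) -> list[str]:
    issues = []

    # Widespread zero-capacity results often point to model or data errors, not real constraints.
    zero_share = (current["hosting_capacity_mw"] == 0).mean()
    if zero_share > 0.25:
        issues.append(f"{zero_share:.0%} of feeders report zero hosting capacity")

    # Large deviations from the previous simulation deserve a closer look before publishing.
    merged = current.merge(previous, on="feeder_tag", suffixes=("_now", "_prev"))
    delta = (merged["hosting_capacity_mw_now"] - merged["hosting_capacity_mw_prev"]).abs()
    big_moves = merged[delta > 0.5 * merged["hosting_capacity_mw_prev"].clip(lower=0.1)]
    if not big_moves.empty:
        issues.append(f"{len(big_moves)} feeder(s) changed by more than 50% since the last run")

    return issues
```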

Good data quality is the foundation of an accurate capacity analysis and its corresponding maps.

As utilities develop hosting and loading capacity analyses, we suggest they identify and assess data gaps and errors. This will help them understand the extent of the issues. Where possible, they should have a standard corrective process. The process should address chronic issues in the source database as part of a data improvement plan, rather than just fixing them at specific points in time.
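
One lightweight way to separate chronic issues from one-off fixes is simply to tally validation findings by source system over time. The sketch below is illustrative only; the findings list and its categories are placeholders.

```python
from collections import Counter

# Illustrative tally of validation findings by source system, to separate chronic
# issues (candidates for a data improvement plan) from one-off fixes.
findings = [
    {"source": "GIS", "issue": "topology error"},
    {"source": "GIS", "issue": "topology error"},
    {"source": "AMI", "issue": "missing load data"},
    {"source": "GIS", "issue": "incorrect nameplate"},
]

by_source = Counter((f["source"], f["issue"]) for f in findings)
for (source, issue), count in by_source.most_common():
    print(f"{source}: {issue} x{count}")
```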

Data validation and the energy transition

Both electricity generation and load are going to rise as we press forward with the energy transition. It's great to add more localized energy and electrified systems onto our existing grids. But it will require a careful balance. It will also demand a robust understanding of our existing infrastructure's data, specifically the distribution system. And that doesn't even take into account the future grid infrastructure we'll need to build in the coming years.

The go slow to go fast approach can help both utilities and users. As we've shared above, planning and data validation are essential. They help us avoid errors, unwanted costs, and delays in the grid visibility that hosting and loading capacity maps provide. By following the process we've outlined, we can be more confident that our understanding of the grid, specifically its hosting and loading capacities, is accurate.

  • Peter Nearing

    A principal and senior management consultant, Peter provides strategic advisory, analytics, and technology solutions for energy-intensive clients across the firm's global network.

  • Glenn Peters

    As an engineering manager and power systems technical practice leader, Glenn has over 20 years' experience in industrial, distribution, substation, and renewable energy fields. He leads a multi-geography team of power engineers.
