Microgrid technology, the utility environment, and customer sites are rapidly evolving with the changing energy system, and microgrid controls must evolve to play their part in the transition. Amanda Kabak, chief technology officer and principal architect at CleanSpark, charts this evolution in an interview with Microgrid Knowledge. She discusses the ability of CleanSpark’s adaptive controls to process large amounts of data, using the results to manage energy resources and squeeze value out of microgrids.
How have microgrid controls evolved over time?
Amanda Kabak: Historically, control systems have been implemented with custom code for specific pieces of equipment at a given site, designed to meet the site's goals at that particular moment in time. The problem with this approach is that, as the system runs, something within or around it will inevitably change, and any meaningful change requires the control code to be manually adjusted. Take, for example, a solar-only system set to satisfy on-site load first before pushing any excess over the meter. This approach stops being the most effective when utility time-of-use periods change, or when the customer expands the site and increases load at hours when solar isn't available. With today's pace of change, a hard-coded, custom approach is no longer viable.
As a result, we are seeing the rise of intelligent control systems. These systems automatically adjust their behavior to meet the customer's preset objectives, often based on analysis from a microgrid modeling tool. Intelligent controllers can evaluate weather forecasts, load conditions, and operational constraints to provide more effective control over time. They also open microgrids up to continuously evolving over-the-meter opportunities. In general, intelligent control considers far more data and manages energy in a more sophisticated way, squeezing the most value out of the available resources.
How do CleanSpark’s controls differ from others?
AK: CleanSpark’s controls are not just intelligent, but adaptive. Patented forecasting and machine learning tools build up an accurate picture of what is going to happen on a site over the coming 24 hours. How much solar will be produced? What will the load profile look like? This information is combined with the site’s objectives, which could be as diverse as cost reduction, resiliency, and carbon neutrality. The optimal path for using the assets to meet these objectives is plotted at high granularity, multiple times during each 15-minute interval.
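The forecast-then-plan loop described here can be illustrated with a deliberately simple dispatch heuristic. This is a sketch only, not CleanSpark's patented algorithm: it assumes a single battery, 15-minute intervals, and hypothetical forecast arrays for solar, load, and price.

```python
def plan_dispatch(solar_kwh, load_kwh, price, capacity_kwh, soc_kwh=0.0):
    """Greedy dispatch over 15-minute intervals: charge the battery from
    excess solar, discharge it to cover load, and buy any remainder from
    the grid. Returns (grid imports per interval, total energy cost)."""
    schedule, total_cost = [], 0.0
    for solar, load, p in zip(solar_kwh, load_kwh, price):
        net = load - solar  # positive: deficit to cover; negative: surplus
        if net < 0:
            charge = min(-net, capacity_kwh - soc_kwh)  # store the surplus
            soc_kwh += charge
            grid = 0.0
        else:
            discharge = min(net, soc_kwh)  # cover the deficit from storage
            soc_kwh -= discharge
            grid = net - discharge  # remainder bought from the grid
        total_cost += grid * p
        schedule.append(grid)
    return schedule, total_cost
```

An adaptive controller would go far beyond this single greedy pass: it would re-plan the schedule continuously as forecasts update and optimize against tariff structure and site objectives, but the basic shape — turn forecasts into a dispatch plan — is the same.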
CleanSpark’s commissioning process is also highly automated, allowing us to avoid site-specific custom coding. Our vendor-agnostic equipment library, combined with our configuration-first approach to controls, takes site-specific information and automatically sets up the data and algorithms through a single API call. This eliminates the potential problems of custom code and reduces setup costs.
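Configuration-first commissioning might look something like the following sketch: a declarative description of the site replaces custom code. The field names, equipment types, and `commission_site` helper are all assumptions for illustration, not CleanSpark's actual API.

```python
import json

# Hypothetical site description: declarative data instead of custom code.
SITE_CONFIG = {
    "site_id": "demo-001",
    "objectives": ["cost_reduction", "resiliency"],
    "equipment": [
        {"type": "solar_pv", "capacity_kw": 250},
        {"type": "battery", "capacity_kwh": 500, "power_kw": 125},
    ],
    "utility_rate": "demo-tou-rate",
}

def commission_site(config):
    """Validate the declarative config and build the JSON payload that a
    single commissioning API call might accept (illustrative only)."""
    required = {"site_id", "objectives", "equipment", "utility_rate"}
    missing = required - config.keys()
    if missing:
        raise ValueError(f"missing config fields: {sorted(missing)}")
    return json.dumps(config)
```

The design point is that everything site-specific lives in data that can be validated and versioned, so a new site is onboarded by writing configuration rather than control code.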
Can the controls adapt if the microgrid or conditions change?
AK: Yes, site changes, such as the addition of more solar generation, can be easily incorporated into the model. Utility rate updates are automatically pushed out to the sites, so they always operate on the most recent information. We can also integrate new over-the-meter opportunities, creating additional revenue streams.
The initial insight from site modeling sets the control logic, and machine learning and optimization then improve performance as more operational data becomes available. Due to COVID-19, one site was commissioned with a load profile significantly different from the data initially used to set up its model. With about six weeks of new operational data, the model was retrained, producing more accurate forecasts and enabling the microgrid to deliver more value.
Our patented forecasting model is constantly ingesting new data and adapting to improve our outcomes. We could even take this a step further and look at the efficacy of the weather forecasts themselves. Do the weather forecasts used as the source for our models consistently under- or over-estimate certain characteristics such as temperature or cloud cover? If we could answer that, we could adjust the forecast itself to improve results. The amount of information consumed and generated over time makes this field a data science playground.
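The forecast-efficacy idea raised here — checking whether a weather feed consistently runs hot or cold — can be sketched as a simple mean-bias correction. This is an illustrative sketch, not CleanSpark's patented model, and the sample values below are made up.

```python
def mean_bias(forecasts, actuals):
    """Average signed error of a forecast series: a positive result means
    the feed consistently over-estimates the quantity (e.g. temperature)."""
    errors = [f - a for f, a in zip(forecasts, actuals)]
    return sum(errors) / len(errors)

def correct(forecast, bias):
    """Debias a new forecast value by subtracting the learned offset."""
    return forecast - bias
```

A production system would estimate bias per variable, per season, and per hour of day, but even this one-number version shows how historical forecast-versus-actual data can be fed back to improve the inputs themselves.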
What does the future hold for controls?
AK: From a technology perspective, cloud-based control aggregation and real-time adjustment of algorithms will be very important for providing value to customers. Using the cloud for machine learning and artificial intelligence means huge amounts of data can be processed cost-effectively, without heavy hardware at each site.
Looking ahead at the energy market, over-the-meter opportunities will continue to expand. The existing day-ahead and day-of peak pricing tools will evolve, moving closer to real-time energy markets, and lead times between price signals and delivery will continue to shorten. There is a huge opportunity to provide customers with additional revenue streams by taking advantage of these changes; CleanSpark’s control algorithms are already set up for this future market.
Amanda Kabak is the chief technology officer and principal architect at CleanSpark. Join CleanSpark at the Software and Technology Innovation Showcase, 2 pm EST, November 19 at Microgrid 2020 Global, a free conference hosted by Microgrid Knowledge.