
This past summer a peculiar event took place in Germany.  On a sunny, breezy day, generation from alternative energy sources provided more than half of total demand, forcing companies to rapidly shed output from traditional generation.  Because of rate regulations, power had to be taken from these alternative sources first, leaving coal and gas plants running at less than 10% of capacity.  Nuclear still provided base-load power to the grid.  To keep the grid stable, exports of energy to other connected countries spiked.

As a manager of traditional power generation, you need to understand market dynamics, including how to price your product so that a grid operator wants to buy it from you.  When too much energy is flowing into the grid and you are second in line, your ask price drops drastically.  In this case, prices went negative to encourage shedding from the grid.
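The dispatch order described above can be sketched as a toy merit-order auction: offers are accepted cheapest first, and the last accepted offer sets the clearing price. When must-run renewables bid negative prices and demand is low, the clearing price itself goes negative. All sources, prices, and capacities below are invented for illustration, not actual German market data.

```python
def clear_market(offers, demand_mw):
    """Accept (source, price, capacity) offers in ascending price order
    until demand is met; the marginal offer sets the clearing price."""
    dispatch = {}
    clearing_price = None
    remaining = demand_mw
    for source, price, capacity in sorted(offers, key=lambda o: o[1]):
        if remaining <= 0:
            break
        taken = min(capacity, remaining)
        dispatch[source] = taken
        clearing_price = price
        remaining -= taken
    if remaining > 0:
        raise ValueError("demand exceeds total offered capacity")
    return dispatch, clearing_price

offers = [
    ("wind",    -20.0, 400),   # negative bids: renewables must be taken first
    ("solar",   -20.0, 300),
    ("nuclear",   5.0, 200),   # inflexible base load bids low
    ("coal",     35.0, 500),
    ("gas",      60.0, 400),
]

# On a sunny, breezy, low-demand day, renewables alone cover the load:
dispatch, price = clear_market(offers, demand_mw=650)
print(dispatch)  # only wind and solar are dispatched
print(price)     # clearing price is negative
```

With demand at 650 MW the renewables cover everything, coal and gas sit idle, and the market clears at the negative renewable bid, which is exactly the incentive to shed output that the article describes.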

Events like this will become more prevalent as the world incorporates alternative sources while still trying to manage an ever-increasing thirst for power driven by more homes with energy-hungry devices.  Managing this transition, however long it takes, requires significant insight drawn from many sources that are in constant flux.

So how can one manage a highly variable source of energy in a way that keeps the grid stable and provides a reliable service to constituents?  Certainly there are monitors on all the lines: one could watch power from alternative sources rise or fall and adjust accordingly.  But that is like driving a car by looking in the rear-view mirror: it can be done, but it can also be fatal.  You can look at historical data on usage, generation from multiple sources, and so on, but you still end up hoping you are close to matching demand.  Decisions like this need to be made in real time, with real-time data from more sources than one person, or even a group of people, can reliably decipher.

A relatively new technology called Predictive Analytics can help here.  It has been used widely in retail department stores for everything from product placement to discounts and new product development.  A small number of utilities are beginning to pilot the technology to better understand operations in their organizations.  It can take data from weather sources, down to specific GPS coordinates, to forecast output from alternative energy sources and match it against current traditional generation capacity, providing a more complete picture before the event occurs.  That would allow a more structured reduction in generation in cases like the one above.  In a retail market, it would allow better pricing on an open market, and potentially higher margins.
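A minimal sketch of the forecasting step described above: turn a site-level weather forecast into estimated renewable output, then compute the residual load that conventional plants must still supply. The power-curve shapes and every number here are simplified placeholders, not a production forecast model.

```python
def solar_output_mw(capacity_mw, irradiance_frac):
    """Estimate PV output as installed capacity scaled by forecast
    irradiance (0.0 = dark, 1.0 = clear-sky noon)."""
    return capacity_mw * max(0.0, min(1.0, irradiance_frac))

def wind_output_mw(capacity_mw, wind_speed_ms,
                   cut_in=3.0, rated=12.0, cut_out=25.0):
    """Piecewise wind power curve: zero below cut-in and above cut-out,
    cubic ramp between cut-in and rated speed, flat at rated output."""
    if wind_speed_ms < cut_in or wind_speed_ms > cut_out:
        return 0.0
    if wind_speed_ms >= rated:
        return capacity_mw
    frac = (wind_speed_ms - cut_in) / (rated - cut_in)
    return capacity_mw * frac ** 3

def conventional_need_mw(demand_mw, solar_mw, wind_mw):
    """Residual load: what coal, gas and nuclear must still supply."""
    return max(0.0, demand_mw - solar_mw - wind_mw)

# A sunny, breezy day forecast for hypothetical 300 MW solar / 400 MW wind fleets:
solar = solar_output_mw(300, irradiance_frac=0.9)
wind = wind_output_mw(400, wind_speed_ms=12.5)
print(conventional_need_mw(850, solar, wind))  # residual load in MW
```

Running this hour by hour against a weather forecast gives the grid operator the "picture before the event" the article refers to: how far conventional generation can be ramped down before renewables fall short.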

Getting the analytics right will take time: algorithms need to be tuned and re-tuned, additional data sources added, the variability of weather factored in, and more.  It is a never-ending process as power generation shifts toward alternative sources, but starting now could reduce the impact of situations like the one that happened this past summer.

Follow us on Twitter @sap4utilities


2 Comments


  1. Rory Shaffer

    David, great article and solution – keep in mind that the problem lies in the new grid models, where no one any longer wants to pay for the spinning reserve that will keep the “fuses from blowing” when the renewables go “dim or dark”.

    The day-night cycle, clouds and other natural phenomena have a profound, variable impact on predictive models that expect steady-state output from these greener sources.

    Until we establish sufficient stored energy reserves in the form of pumped reserves, underground compressed air, battery banks or … we are going to face the same issue.

    The critical need for reliable base-load supplies of energy is being jeopardized by a mad rush to be yet greener.

  2. Roger Fong

    Looking forward to the time when we have a critical mass of internal (or partner) experts, data scientists with statistical and mathematical knowledge, to deliver projects with customers from use-case prioritization to solution rollout.

    Recently, in various discussions with China customers on the topic, all agreed that data scientists will be critical. To have a seat at the table we have to bridge the gap somehow… unfortunately the requirements for such expertise are quite high, and it will be a tough one for the rest of us (without the Math / Stat MS & PhD degrees) to learn enough to at least carry the conversation far enough with the customer in the engagement!

