Saturday, July 11, 2015

The next chapter: solar energy

On May 13th we had our solar PV panels installed: 12 monocrystalline black panels with a total system size of 3720 watt-peak. The inverter is a StecaGrid 3600 Coolcept, a German brand that is not commonly seen.

The array of panels fits nicely below the flat-plate thermal collector that came with the house, which had deliberately been placed all the way at the top.

Our roof's azimuth is 146° (south-east) and its tilt 45°. There's no shadow, except for a short moment at the end of summer afternoons when the sun is just about to move to the front of the house. Then the chimney casts a shadow across the entire surface; we'll actually see this effect in our graphs later.


In the weeks before installation I had already been monitoring the values of the electricity meter, visualizing them in separate "power consumption" and "power generation" gauges; the latter - of course - had always shown 0 watt. As mentioned in an earlier post, this interpretation worked fine until the moment the inverter was activated on May 13th.

Let's begin with the basic situation:



We use electricity in our home appliances, and their requested power is pulled from the grid. What we measure at the smart meter is the amount of energy flowing into the house and in this scenario this happens to be exactly equal to our energy consumption. So far so good.

Now when the inverter is activated and it starts pumping amps into our house, we suddenly get the following situation:



I remember seeing my virgin "power generation" gauge go up to 2500 watt, while the "power consumption" value dropped down to 0... Hmmm... In this scenario a portion of the generated power is directly consumed while the rest is pushed out into the power grid.

Now imagine - and this could be just several seconds later - that we turn on the microwave while a cloud blocks the sun:



Our generated power is no longer enough to fulfil the request and the remaining power is drawn from the grid.

It becomes clear that with our P1 telegram measurements we only measure the net energy that passes through the meter, and as we've seen, this can be in either direction throughout the day. A sad result is that we have lost all insight into our actual consumption (or generation, for that matter).
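
Just to make this concrete, here's a tiny sketch with made-up numbers (not part of the actual script) of what the meter sees at any instant:

      # Made-up instantaneous values in watts; the meter only ever sees their difference.
      CONSUMPTION=1600   # what the appliances draw right now
      GENERATION=600     # what the inverter produces right now

      NET=$(( CONSUMPTION - GENERATION ))
      if [ "$NET" -ge 0 ]; then
          echo "importing ${NET} W from the grid"
      else
          echo "exporting $(( -NET )) W to the grid"
      fi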

Fortunately our electricity meter does have separate counters for the two directions in which energy can flow through it, so we can at least distinguish those; let's call these export and import from now on.

Note that at any moment in time you can either import or export energy, but never both simultaneously. Now you may think that our P1 monitoring will therefore always show one of the two values as 0, but remember that we're actually measuring averages over 5-minute periods, and in one such period both import and export could have occurred.

We can toss our variables in a formula to see how they relate to each other:

      Import + Generation = Export + Consumption

This formula makes sense if we assume that we don't store any energy in batteries for later use: all the energy that comes into the house (either generated or imported) must have gone somewhere (either consumed or exported).

Since we're already measuring import and export periodically, we only need the value of one additional variable for each period to have a complete picture again. An obvious candidate is generation; both the inverter itself and a separate kWh meter are already keeping track of this.
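
To make the bookkeeping concrete, here's the formula rearranged for consumption, with made-up numbers for a single 5-minute interval (in the real setup these values would come from the meter's counters and the kWh meter):

      # Energy balance for one interval, all values in Wh (example numbers).
      IMPORT=35          # energy drawn from the grid during this interval
      EXPORT=110         # energy pushed into the grid during this interval
      GENERATION=180     # energy produced by the inverter during this interval

      # Rearranged: Consumption = Import + Generation - Export
      CONSUMPTION=$(( IMPORT + GENERATION - EXPORT ))
      echo "Consumed ${CONSUMPTION} Wh in this interval"   # prints 105

Note that in this little example both import and export are non-zero within the same interval, which is exactly the situation described above.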

We'll have to figure out a way to obtain this additional measurement automatically and get it to the right place in order to apply the formula. Well, this turned out to be a proper challenge, but a fun adventure...

Sunday, July 5, 2015

Bash and Python to make it happen

Before anything else, I should give credit to Marco Bakker for sharing his scripts, which got me started straight away. Within one day after I got my Raspberry Pi, I had it capturing P1 telegrams, generating graphs and even uploading data to external logging sites. In the weeks that followed I polished the script a lot, and the result of that is what this post is about.

An overview of the most important changes:
  • Calculations are now based on the actual time between measurements (there's a little sketch of the idea below this list). Even though the script is triggered every 5 minutes, the exact time between measurements can vary, because we may have to wait up to 10 seconds for the telegram to arrive. It's also possible that the Raspberry Pi has had some downtime and resumes after an unknown period of time.
  • The script was getting so big that it was hard to navigate around, so I split it into smaller parts; one that holds all the configuration, one that contains the long rrdtool commands, another one that outputs data into various file formats, and a last one that contains interfaces to external websites.
  • Clean output, so we can capture it to a useful log.
  • Various readability, robustness and flow tweaks.
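
To illustrate that first point, here's a rough sketch of the idea; the variable names and the last_run.tmp state file are made up for this example and are not part of the actual script:

      # Sketch: compute the average power from the import-counter delta and the
      # actual elapsed time, instead of assuming that exactly 300 seconds have passed.
      # last_run.tmp is a hypothetical state file holding "<epoch seconds> <counter in Wh>".

      NOW=$(date +%s)
      CUR_WH=1234567                    # pretend this came from the latest P1 telegram

      read PREV_TIME PREV_WH < last_run.tmp
      ELAPSED=$(( NOW - PREV_TIME ))

      # average power in watts = delta energy (Wh) * 3600 / delta time (s)
      POWER=$(awk -v dwh=$(( CUR_WH - PREV_WH )) -v s="$ELAPSED" 'BEGIN { printf "%.0f", dwh * 3600 / s }')
      echo "Average import power over the last ${ELAPSED}s: ${POWER} W"

      echo "$NOW $CUR_WH" > last_run.tmp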

Also a little disclaimer: the original script provides support for several features that I don't use myself and that are therefore untested (but perhaps fully functional) in my version of the script:
  • Upload to Xively.
  • P1 DSMR v4 format. In principle, you could easily support any P1 format by adding an appropriate awk script (see the example below this list).
  • Output to csv format.
  • Output to html page. The generated html is awfully retro, but I haven't bothered changing it.
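
For reference, a P1 telegram consists of lines such as 1-0:1.8.1(000123.456*kWh), where 1-0:1.8.1 is the OBIS code for the tariff-1 import counter. Extracting a value rarely takes more than a one-liner along these lines (the file name and the exact number format here are just an example):

      # Pull the tariff-1 import counter out of a captured telegram
      awk -F'[(*]' '/^1-0:1\.8\.1/ { print $2 }' log/p1-20150711-1200.log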

This script is distributed under the GNU General Public License v3, and by downloading the script you agree to the terms of this license. Furthermore, usage is at your own risk and without guarantees.

Download: mbsolget_svg.tar.gz (10kb) or mbsolget_svg.zip (12kb)


Optional dependencies: rrdtool, ncftp, sendemail, mysql

After extraction into its own directory, the only thing you need to do is edit config.sh to set P1PORT to the name of the serial device that connects to the P1 port, and also set the correct WORKDIR (and actually anything else you find in there - but let's keep it simple for now).
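
For example (the device name and path below are only placeholders; yours will differ):

      # config.sh - example values only
      P1PORT=/dev/ttyUSB0              # the serial device of your P1 cable
      WORKDIR=/home/pi/mbsolget        # the directory you extracted the scripts into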

Finally, make the main script executable with: chmod +x mbsolget_p1.sh

You're ready to go, so give it a test run: ./mbsolget_p1.sh

No errors? Good, then you'll find a bunch of new files and directories. Note that most of the interesting stuff is configured to happen at every 5-minute interval of the hour.

  • The raw captured P1 telegrams are stored as log/p1-YYYYMMDD-hhmm.log
  • The extracted values of the last P1 telegram are stored in p1_gas.tmp and p1_value.tmp
  • All calculated statistics are written to debug.tmp; their meaning is explained in comments inside the main script. These statistics are used to build the json, csv, sql or xml data output.

In order to initialize your rrd storage file, you should run the script once with the 'create' argument: ./mbsolget_p1.sh create

When you have it configured the way you want, schedule the script to run automatically with crontab -e, and add the following line (but with the proper path of course):
*/5 * * * * /home/pi/path/to/mbsolget_p1.sh >> ~/log/mbsolget.log 2>&1

At this point - and for me this was early April - the data flow shown in the last post is fully functional (except for the backup part, but that's just another cron job). I'll leave it to you to get creative with the json or xml that is pushed to your webserver every 5 minutes. You could even write a smartphone app around it.

On the right I've included a screenshot of the dashboard I created. It keeps itself up to date by periodically requesting the json and graphs via ajax.

Next chapter: solar PV panels!