Logging temperature using a USR-HTW, forecast.io, Grafana, InfluxDB and of course Docker
I've previously posted about running InfluxDB and Grafana using Docker. This post covers getting data from a USR-HTW temperature sensor and weather data from forecast.io.
USR-HTW
The USR-HTW is an inexpensive, Wi-Fi enabled temperature and humidity sensor. Although it doesn't have a documented API, it has already been reverse engineered.
Data is retrieved by establishing a TCP connection to port 8899; the sensor then sends a packet roughly every 15 seconds that can be decoded into a temperature and humidity reading.
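To give a feel for that, here's a minimal sketch of the read loop in Python. The names are mine and decode_packet is only a placeholder for the reverse-engineered packet format, not the real decoding:

import socket

SENSOR_PORT = 8899  # the USR-HTW accepts plain TCP connections on this port

def decode_packet(data):
    # Placeholder: the actual byte offsets for temperature and humidity
    # come from the reverse-engineered protocol mentioned above.
    raise NotImplementedError

def read_sensor(address):
    sock = socket.create_connection((address, SENSOR_PORT))
    try:
        while True:
            data = sock.recv(1024)   # blocks; a packet arrives roughly every 15 seconds
            if not data:
                break                # connection lost; exit and let Docker restart us
            yield decode_packet(data)
    finally:
        sock.close()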
I've written a Dockerfile which builds a minimal Alpine Linux image with Python and curl installed, plus the scripts for retrieving data from the USR-HTW sensor and from forecast.io.
Running the container
docker run --name usr-htw --restart=always -d --link influxdb:influxdb tomdee/usr-htw usr-htw.py <ADDRESS OF USR_HTW>
The container is configured to always restart (because it will die if it loses contact with the sensor).
It's also linked to the InfluxDB container so it can store its data there.
Script details
The script itself is just a simple "while True" loop that blocks on data coming over the TCP connection, then pumps it into InfluxDB using curl. Not super efficient, but it doesn't need to be.
subprocess.call("curl -sS -i -XPOST 'http://influxdb:8086/write?db=tomdee' --data-binary 'temperature,room=XXX,location=YYY value=%s'" % temp, shell=True)
Forecast.io
Forecast.io provides both forecast and historical data for a wide range of weather measurements. They have an excellent API that's simple to use and, crucially, free (for up to 1,000 requests/day).
Running the container
The usr-htw image has the forecast.io script in it too, so running it is very similar (just give the container a different name so it doesn't clash with the sensor one).
docker run --name forecast --restart=always -d --link influxdb:influxdb tomdee/usr-htw forecast.py <FORECAST API KEY>
You can get an API key from https://developer.forecast.io/register
Script details
Another simple script: it grabs the JSON from forecast.io, parses out the values it needs, then curls the result to InfluxDB.
The forecast.io API call explicitly sets the units to be "si" and excludes all the unwanted forecast data.
https://api.forecast.io/forecast/%s?units=si&exclude=minutely,hourly,daily,alerts,flags
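For a rough idea of how forecast.py might hang together, here's a sketch. The latitude/longitude arguments (which the API expects in the URL path), the "outside" tag, and the five-minute polling interval are my placeholders; the database name and URL parameters are the ones above:

import json
import subprocess
import time

API_URL = ("https://api.forecast.io/forecast/%s/%s,%s"
           "?units=si&exclude=minutely,hourly,daily,alerts,flags")

def poll_forecast(api_key, lat, lng, interval=300):
    while True:
        # Fetch only the current conditions; everything else is excluded.
        raw = subprocess.check_output(
            ["curl", "-sS", API_URL % (api_key, lat, lng)])
        temp = json.loads(raw.decode("utf-8"))["currently"]["temperature"]
        # Write the outdoor reading to InfluxDB, mirroring the sensor script.
        subprocess.call(
            "curl -sS -i -XPOST 'http://influxdb:8086/write?db=tomdee' "
            "--data-binary 'temperature,location=outside value=%s'" % temp,
            shell=True)
        time.sleep(interval)  # stay comfortably under the 1,000 requests/day limit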
Putting it together
I've been collecting data for a number of weeks now and have created two dashboards in Grafana: one that doesn't aggregate the data and one that aggregates it daily (since the weather follows a diurnal cycle).