Make my freezer smarter (Part 1)!

About a month ago, I went out to the garage to grab some hamburger meat for dinner. As I opened the deep freezer, I was immediately slammed with a horrific odor. That’s right: worst-case scenario. For some reason my garage freezer was at room temperature and everything was ruined. This is the second time in six months this has happened. Both times the GFCI in the garage tripped (for reasons unknown at the time), which cut power to the deep freezer. Total loss: about $350 worth of meat and other frozen foods.

Thus begins my journey to make my deep freezer smarter.

Step one was to determine the root cause: why did the GFCI trip?

Well, here’s a simple note for anyone moving into an older home with GFCI outlets: they can wear out over time after tripping repeatedly, and a worn outlet trips more and more easily. The easiest first fix is to replace the outlet.

I wanted to go with a Z-Wave GFCI outlet so I could monitor it with SmartThings, but to my surprise, that product doesn’t exist! Sure, I could add a smart plug-in outlet and maybe monitor power, but you really need something with a backup battery to report an outage. And even if I installed that, I still wouldn’t know whether the freezer was actually getting cold.

So the first step was to buy a new GFCI outlet. I ended up going with this one, which has an audible alarm. I figured an audible beep is better than nothing, even though we wouldn’t be able to hear it from inside the house.

Next, I decided to go an entirely different route.  Let’s monitor temperature.  Easy enough: find a Z-Wave/ZigBee sensor that works in a freezer, drop it in there, set up SmartThings, and have it notify me when the temp rises above a threshold.

After reading some success stories of people using a SmartThings ZigBee MultiSensor in a deep freezer for this exact purpose, I decided to give it a shot!  Why not, right?  So I picked one up from Amazon, dropped it in a moisture-tight freezer bag with a desiccant pack, put that in the freezer, and set myself up to monitor freezer temps.

EASY PEASY, until day 2….

Now, I have around 5 of these MultiSensors doing various things in my house.  Battery life is terrific, lasting roughly 6 months or longer depending on the application.  I immediately noticed battery life plummeting on this sensor.  Fast forward 2 weeks, and I’ve now gone through three CR2450 coin cell batteries.

Some background on deep freezers and compressor applications.  Unlike ambient indoor or outdoor temperature, a deep freezer’s temperature actually swings quite dramatically: mine climbs from -15F to -9F before the compressor kicks back on and pulls it back down to -15F.  Your temperatures may vary, but you get the idea.  The MultiSensor is designed to capture 1F changes in temperature and report each one.
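Back-of-the-envelope, that reporting behavior adds up fast. Here’s a quick sketch of the math; the 30-minute compressor cycle is an assumption purely for illustration, while the 6F swing and 1F report threshold come from the description above:

```shell
# Rough estimate of radio reports per day from the MultiSensor in the freezer.
# cycle_min is an ASSUMED compressor cycle length, not a measurement.
swing_f=6                             # -15F to -9F is a 6F swing each way
cycle_min=30                          # assumed full warm-up/cool-down cycle
reports_per_cycle=$((swing_f * 2))    # one report per 1F change, up and down
cycles_per_day=$((24 * 60 / cycle_min))
reports_per_day=$((reports_per_cycle * cycles_per_day))
echo "$reports_per_day"               # ~576 radio wakeups a day
```

Compare that to a room sensor that might report a handful of times a day, and the battery drain starts to make sense even before factoring in the cold.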

My immediate thought was – mmkay, so it’s reporting too much, killing my battery.  I rewrote the stock Groovy device handler for the MultiSensor to focus on temperature only and report just every couple of hours.  I was going to link that here, but who cares, it doesn’t work.  Battery still dies in 2-3 days.

I even did research on CR2450 batteries and chose a brand specced down to an operational range of -30F.  Even with batteries specced for this temperature range, these devices simply eat them at temperatures below -10F.

The sensor works great, but I can’t replace batteries weekly in this thing.  So I thought – maybe hardwire the sensor to power, solder in a 3V PSU and go for it.  I almost immediately changed my mind.  The only thing worse than a moist deep freezer is having live voltage running into it from a DC power supply in a GFCI outlet.  Shot this idea down before I started.

Okay, easy enough – I’ll find a Z-Wave, ZigBee, or Wi-Fi device with an external waterproof temperature sensor.  No dice, can’t find one.

So now I’m back to the drawing board.  Here’s my game plan: I’m going to buy a NodeMCU ESP8266 and a DS18B20 waterproof sensor.

Shopping List:

Total Cost: $78.29

Now, there are a lot of parts on the list above you might already have, or might be able to buy in smaller quantities. I actually wanted extras of everything for upcoming projects (stay tuned!). The BOM cost for this project alone is around $21.65, which isn’t too bad IMHO.

Other stuff (if you don’t already have these), just to make life easier:

Total Cost: $89.81

You may also want some standard electronics equipment like a multimeter, pliers, ESD protection, oscilloscope, etc. These aren’t totally required for this project.

The next post (Part 2) will detail the DIY build, with code examples.  I plan to integrate this directly with my SmartThings Hub to report temperatures in real time!

New Host for my Lab!

I recently picked up a new host to add to my home lab.  For years, I’ve been running VMware ESXi/vSphere on a single node. Recently, RAM utilization started creeping up to the point where I couldn’t squeeze any more VMs onto it. So I started a month-long deal hunt to find something that would work for me.

My requirements were:

  • Cheap (like real cheap!) <$300 preferred
  • Relatively modern CPU architecture (at least 4-cores, 8 logical)
  • Power Bill Friendly (Energy Efficiency for performance)
  • Preferably rack mounted
  • At least 32GB of RAM
  • Quiet if possible!
  • Some type of remote management
  • Support for remote monitoring (SNMP/iDRAC/etc).
  • Dual GB NIC with room to add a NIC card

I ended up scoring a used Dell R210ii.  This ticked off nearly everything on the list.  Here are the specs!

  • Intel Xeon Quad Core E3-1240 v2 3.4GHz
  • 32GB DDR3 1600MHz (Maxed out)
  • 240GB SSD
  • Dell iDRAC 6
  • Dual GB NIC + Management port

All-in shipped price: $285.


The easy way to pull UPS statistics using CyberPower Panel

Since setting up my Grafana panel, one thing that has been bugging me is that I had to modify the “vanilla” CyberPower Panel software to pull the statistics I wanted with pwrstat. Not only was I concerned about conflicts with the actual shutdown software, but I also didn’t want to force a post-processing step after every future panel upgrade. A redditor recommended that I check out init_status.js on the CyberPower Panel to see if I could pull data from there instead of from pwrstat, as my previous script had been doing.

http://<your CPP IP>:3052/agent/ppbe.js/init_status.js

You get output like this:

var ppbeJsObj={"status":{"communicationAvaiable":true,"onlyPhaseArch":false,"utility":{"state":"Normal","stateWarning":false,"voltage":"122.0","frequency":null,"voltages":null,"currents":null,"frequencies":null,"powerFactors":null},"bypass":{"state":"Normal","stateWarning":false,"voltage":null,"current":null,"frequency":null,"voltages":null,"currents":null,"frequencies":null,"powerFactors":null},"output":{"state":"Normal","stateWarning":false,"voltage":"122.0","frequency":null,"load":10,"watt":90,"current":null,"outputLoadWarning":false,"outlet1":null,"outlet2":null,"activePower":null,"apparentPower":null,"reactivePower":null,"voltages":null,"currents":null,"frequencies":null,"powerFactors":null,"loads":null,"activePowers":null,"apparentPowers":null,"reactivePowers":null,"emergencyOff":null,"batteryExhausted":null},"battery":{"state":"Normal, Fully Charged","stateWarning":false,"voltage":"24.0","capacity":100,"runtimeFormat":0,"runtimeFormatWarning":false,"runtimeHour":1,"runtimeMinute":1,"chargetimeFormat":null,"chargetimeHour":null,"chargetimeMinute":null,"temperatureCelsius":null,"highVoltage":null,"lowVoltage":null,"highCurrent":null,"lowCurrent":null},"upsSystem":{"state":"Normal","stateWarning":false,"temperatureCelsius":null,"temperatureFahrenheit":null,"maintenanceBreak":null,"systemFaultDueBypass":null,"systemFaultDueBypassFan":null},"modules":null,"deviceId":1}};

Well, look at that! There’s pretty much everything I ever wanted, in a completely open format that doesn’t require ANY login. Guess it’s time to dust off vi and get to bashing this. One COULD go the route of installing a JSON parser, but if you look closely, the output isn’t exactly valid JSON (it’s a JavaScript variable assignment). Either way, why introduce the overhead when you can simply grep what you want? So without further ado, here’s how I did just that.

First, you want to pull the status page into a variable:

cpp_json_data=$(curl -s "http://<your CPP IP>:3052/agent/ppbe.js/init_status.js")

Then you’ll want to parse what you want with grep like this:

#input voltage
cpp_involts=$(echo "$cpp_json_data" |\
grep -oP '(?<="voltage":")[^."]*' | head -1)
#battery voltage
cpp_battvolts=$(echo "$cpp_json_data" |\
grep -oP '(?<="voltage":")[^."]*' | tail -1)
#Load (Watts)
cpp_loadwatt=$(echo "$cpp_json_data" |\
grep -oP '(?<="watt":)[^,]*' | head -1)
#Capacity %
cpp_capacity=$(echo "$cpp_json_data" |\
grep -oP '(?<="capacity":)[^,]*' | head -1)
#Runtime (converted to seconds)
runtimeHour=$(echo "$cpp_json_data" |\
grep -oP '(?<="runtimeHour":)[^,]*' | head -1)
runtimeMinute=$(echo "$cpp_json_data" |\
grep -oP '(?<="runtimeMinute":)[^,]*' | head -1)
cpp_runtime=$((runtimeHour*60*60+runtimeMinute*60))
#Load %
cpp_loadpercent=$(echo "$cpp_json_data" |\
grep -oP '(?<="load":)[^,]*' | head -1)

What you see here is that we request the CPP page with the “JSON”-like data, and the rest of the commands simply parse out the numbers I wanted to plug into my InfluxDB. This turned out to be infinitely easier than the other method, and it’s easy to maintain going forward. I’ve since updated my upsmon.sh script to incorporate this, which completely eliminates the need for the cpupssmon.sh script I previously had running on the CPP instance.
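For reference, here’s a sketch of how those parsed values can be packed into InfluxDB’s line protocol for the write. The measurement name “ups” and the field names are my own choices for this example, and the values are hard-coded samples rather than live data:

```shell
# Pack sample UPS values into InfluxDB line protocol.
# Values are hard-coded for illustration; normally they come from the
# grep commands above.
cpp_involts=122; cpp_battvolts=24; cpp_loadwatt=90
cpp_capacity=100; cpp_runtime=3660; cpp_loadpercent=10
line="ups involts=$cpp_involts,battvolts=$cpp_battvolts,watts=$cpp_loadwatt,capacity=$cpp_capacity,runtime=$cpp_runtime,load=$cpp_loadpercent"
echo "$line"
# Then POST it to the database (host and db name are placeholders):
# curl -XPOST "http://<your influx IP>:8086/write?db=ups" --data-binary "$line"
```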

Note: As with my other scripts, you’ll need a version of grep that supports Perl-compatible regular expressions (the -P flag). The default grep in Ubuntu supports this. The one on OSX does not, but a simple brew install of GNU grep will get you the grep you’re looking for :). Alternatively you could use awk, or perl if you’re fancy!
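For the awk route, here’s a sketch of pulling one field without -P support, shown against a trimmed-down sample of the ppbe.js output (the sample string is abbreviated for illustration):

```shell
# awk alternative to grep -P for one field; works with stock BSD awk on macOS.
# "sample" is an abbreviated stand-in for the real init_status.js output.
sample='{"battery":{"state":"Normal","capacity":100,"runtimeHour":1}}'
cpp_capacity=$(echo "$sample" | awk -F'"capacity":' '{print $2}' | awk -F',' '{print $1}')
echo "$cpp_capacity"   # 100
```

The first awk splits on the literal key, the second trims everything after the value’s trailing comma.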

There you have it – a completely transparent modification.

[Screenshot: the UPS power stats graphed in Grafana]

Here is the new upsmon.sh script
Here are the changes made to upsmon.sh

Let me know what you think, or if you have any suggestions for improvement! Happy labbin.

Setup a wicked Grafana Dashboard to monitor practically anything

I recently made a post on Reddit showcasing my Grafana dashboard. I wasn’t expecting it to really get any clicks, but as a labor of love I thought I’d spark the interest of a few people. I got a significant amount of feedback requesting that I make a blog post to show how I setup the individual parts to make my Grafana dashboard sparkle.

Let’s start with the basics.  What the heck is Grafana?  Well, this image should give you an idea of what you can make, or make better, with Grafana.

[Screenshot: my Grafana dashboard]

I use this single page to graph all the statistics I care about glancing at in a moment’s notice.  It allows me to see a quick overview of how my server is doing without having to access five or six different hosts to see where things are at.  Furthermore, it graphs these over time, so you can quickly see how your resources are handling the workload on the server at any given point.  So if you’re sold – let’s get started!  There is a lot to cover, so I’ll start by laying out the basics to help new users understand how it all ties together.

Let’s start with terminology and applications that will be used in this tutorial.

  • Grafana – The front end used to graph data from a database. What you see in the image above, and by far the most fun part of the whole setup.
  • Graphite – A backend database supported by Grafana. It has a lot of neat custom features that make it an attractive option for handling all of the incoming data.
  • InfluxDB – Another backend database supported by Grafana. I prefer this one for its speed to implement, my own prior knowledge, and as a byproduct of a few tutorials I dug up online. This tutorial will show you how to set up services using InfluxDB, but I’m sure Graphite would work equally well if you want to color outside the box.
  • SNMP – Simple Network Management Protocol. I use this protocol as a standard query tool that most network devices natively support, or can have support added. SNMP uses OIDs to query data, but don’t worry, you don’t have to have any special addons if you don’t want them. I recommend you look up the specific SNMP datasheet for your device, as some devices have custom OIDs that give you very interesting graphable information! I’ll explain this more later.
  • IPMI – Intelligent Platform Management Interface. This is used to pull CPU temperatures and fan speeds from my Supermicro motherboard. Most server grade motherboards have a management port with SNMP support. Look it up, you’ll be surprised the information you can get!
  • Telegraf – During the course of this article you’ll see that I use a lot of custom scripts to get SNMP/IPMI data. Another option would be to use Telegraf. I eventually will move most of my polling to Telegraf, but for right now I’m using Telegraf purely for docker statistics. I’ll explain how to set it up here.
  • Collectd – CollectD is an old, popular favorite. It’s an agent that runs on the bare-metal server or in a VM and automatically writes data into your InfluxDB database. Very cool – but I don’t use it, because I prefer not to install extra tools on every server just to monitor them.

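Since I mentioned using Telegraf purely for Docker statistics, here’s a minimal telegraf.conf sketch of what that pairing looks like. The InfluxDB URL and database name below are placeholder assumptions, not my actual setup:

```toml
# Where Telegraf writes its metrics (URL/db are placeholders)
[[outputs.influxdb]]
  urls = ["http://localhost:8086"]
  database = "telegraf"

# Pull per-container CPU/RAM stats straight from the Docker socket
[[inputs.docker]]
  endpoint = "unix:///var/run/docker.sock"
```

The Telegraf user needs read access to the Docker socket (membership in the docker group is the usual way).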
I’ll walk you through how I setup the following monitoring applications:

  • ESXi CPU and RAM Monitoring via SNMP and a custom script for RAM
  • Supermicro IPMI for temperature and fan speed monitoring
  • Sophos SNMP for traffic load monitoring
  • UPS/Load monitoring with custom scripts and SNMP through a Synology NAS and CyberPower Panel
  • Docker Statistics for CPU and RAM monitoring via Telegraf
  • Synology Temperature and Storage information using SNMP
  • Plex Current Transcodes using a simple PHP script


Setting up a Dockerized GitLab at Home

There are few things I love more than git. It’s part of my daily workflow, and I’m not even a developer by profession (any more). I frequently will git init folders just to have history, and to transfer things between servers. One thing I do often is create git repositories in my configuration folders on my servers so I can see what I changed, and roll back in case I royally mucked something up.

This isn’t a git primer; instead I want to share how I set up an instance of GitLab on my DMZ Docker host, which serves a few external services for me. Compared to the vanilla installation guide, this is MILES easier to stand up in Docker. What does it give you? Future upgrades are easy; the whole database, configuration, and history live in a convenient, easy-to-backup folder structure; and you gain the ability to move this server around as needed.

To get started, this tutorial assumes a few things.

  1. You have an Ubuntu Linux server with Docker installed.
  2. You’re already familiar with the basics of Docker (this isn’t a tutorial for that, either).
  3. You have a basic understanding of Linux operations, moving files around, and what these commands mean.
  4. Your Docker server/VM has 2 CPU cores and 2GB of RAM available.

So let’s get started!


Docker @ Home? Why yes!

For years I’ve run a personal home server.  Well, scratch that – maybe I should call it a lab.  I’ve been a long-term user of vSphere at home, and over the years I’ve slowly but surely expanded my environment to be more akin to a small business setup.  It’s a hobby, and I enjoy it.  Why not?

Recently, while visiting r/homelab, I ran across a post about a guy who set up a Linux host running all of his home media applications in Docker.  Docker?  What the heck is Docker?  Why do I need that when I can just spin up VMs?  Or an even better question… why do I need that when I can just install all of these apps in a single VM?

In short – here are the advantages:

  • Less resource intensive than separate VMs.  (Duh!)
  • Complete environment isolation, meaning no more mono libs or Java libs cluttering up your host server.
  • Speaking of libs – having the RIGHT environment for the app you are running.  What, I need an OLD version of PHP to run this?  No big deal!
  • Separation of configuration data from application data.
  • EASY upgrades: docker pull / docker run / docker rm.  Or my personal favorite… just script it!
  • Quick and easy deployment of new tools/toys to play with without causing harm to the rest of your system.
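That pull/run/rm upgrade flow scripts down to just a few lines. Here’s a sketch; the image name, container name, and run flags are made-up examples, and the commands are echoed so you can eyeball the sequence before dropping the echos to run it for real:

```shell
# Upgrade a container in place: pull the new image, then replace the
# running container. Commands are ECHOED (dry run) for illustration;
# remove "echo" to actually execute them.
upgrade() {
  local img="$1" name="$2"
  echo docker pull "$img"
  echo docker stop "$name"
  echo docker rm "$name"
  echo docker run -d --name "$name" --restart=unless-stopped "$img"
}

upgrade linuxserver/plex plex
```

Any per-container volumes, ports, and env vars would need to be repeated in the run line, which is exactly why scripting it beats retyping it.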

There are MANY other reasons to use Docker, especially when it comes to development (both Linux and web development), but I won’t get into that here – mostly I’m logging this setup for myself. After the break, I’ll share how I set up a brand new Ubuntu 16.04 VM with Docker and migrated my entire home media server to it in one night.

Before continuing – I always give credit where it’s due. Much of my inspiration came from this post [zackreed.me]. Check it out for more (likely better written) posts similar to this one!


Finally a place for things…

I finally got around to setting up a blog for my tech hobbies.

What can you expect here?  Well, mostly just things I discover as I’m playing around with my home lab: product reviews (rare), programming, and open-source tools.  I’m primarily going to use this to document neat things I’ve set up for myself.

Stay Tuned!