
HASS_NUC

Configuration for Home Assistant running on an Intel NUC in a three-room apartment, providing convenience automations for lights and climate along with multiple intuitive user controls.


| Device | Quantity | Connection | HA Component | Notes |
| --- | --- | --- | --- | --- |
| Hue Hub v2 | 1 | Ethernet | Philips Hue | Controls all Philips Hue smart lights |
| Conbee II | 1 | USB 2.0 | deCONZ | Controls Zigbee motion sensors and various other sensors |
| Xiaomi Aqara Gateway v2 | 1 | Wi-Fi | Xiaomi Aqara | Deprecated; everything switched to the Conbee II |

The config uses packages to split up the configuration and keep it manageable. The bulk of the automations/configuration can be found here.
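A minimal sketch of how the packages split is wired up in configuration.yaml; the packages/ directory name is an assumption, so check the repo for the actual layout:

homeassistant:
  # Load every YAML file in packages/ as its own package,
  # so each area/function lives in its own file
  packages: !include_dir_named packages/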

The config uses the Home Assistant Community Store (HACS) for integrations that are not yet supported natively. The add-ons can be found here.

I use Hyperion to provide Ambilight for my TV. Hyperion forwards the data to the three LED strips connected to a Lego Millennium Falcon and a Lego Super Star Destroyer. The configs can be found under esphome.

  • Component list with links for AtmoOrbs
  • Component list with links for Falconlights
  • Component list with links for SSDlights
  • Document Hyperion configuration
  • Document used protocol
  • Link to scripts in ./areas/mediacenter.yaml

AtmoOrbs:

Each AtmoOrb consists of a NodeMCU board connected to an LED ring with 35 integrated WS2812 LEDs, built into an Ikea Fado lamp. The lamp requires a bit of modification to fit the LED ring, and I have designed a holder to fix the ring and the NodeMCU to the lamp base. The color data is sent from Hyperion to the NodeMCU via the E1.31 (sACN over UDP) protocol, and there is no noticeable delay.
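Roughly, the ESPHome config for one AtmoOrb looks like the sketch below; the node name, pin, and secret names are assumptions, and the actual configs live in the esphome directory:

esphome:
  name: atmoorb-1

esp8266:
  board: nodemcuv2

wifi:
  ssid: !secret wifi_ssid
  password: !secret wifi_password

# Listen for E1.31 (sACN over UDP) packets from Hyperion
e131:
  method: multicast

light:
  - platform: neopixelbus
    type: GRB
    variant: WS2812X
    pin: GPIO3
    num_leds: 35
    name: "AtmoOrb"
    effects:
      # Map one E1.31 universe onto the 35 LEDs as RGB channels
      - e131:
          universe: 1
          channels: RGB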

Falcon:

Super Star Destroyer:

I use the MariaDB add-on for Home Assistant. I keep the last 7 days of data and found that this was using over 8 GB of storage. That was too much for me, especially since the backup directory started growing to humongous sizes. I found this topic on the Home Assistant forums (thanks to the author) and used it to tame the database.
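The relevant recorder settings boil down to something like this sketch; the secret name is an assumption, and the actual connection URL lives in secrets.yaml:

recorder:
  # Point the recorder at the MariaDB add-on instead of the default SQLite file
  db_url: !secret recorder_db_url
  # Keep 7 days of history before auto-purging
  purge_keep_days: 7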

Since I didn't really know how to connect to my database, I wanted a tool with a decent GUI. For this, I used dbForge.

Stepwise:

  • Download and install dbForge.
  • Go to the MariaDB add-on configuration page and, in the Network section, enter a host port number. In my case the Container column reads 3306/tcp and I entered 3306 under Host.
  • Open dbForge and use this tutorial to enter your database details:
    • Use connection type TCP/IP
    • As Host, enter your Home Assistant IP address, and as Port, the port you just exposed (3306 in my case)
    • As User and Password, use the entries under logins in your MariaDB add-on config (see the sketch after this list)
    • As Database, use the database name set in your MariaDB add-on config (default is homeassistant)
    • Press Test Connection, or Connect if you're feeling lucky!
  • Press Ctrl+N to start a new SQL query
  • Query your database using SQL
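For reference, the MariaDB add-on options look roughly like this; the password is a placeholder and the database/username are the add-on defaults, so check your own add-on configuration for the actual values:

databases:
  - homeassistant
logins:
  - username: homeassistant
    password: REPLACE_ME
rights:
  - username: homeassistant
    database: homeassistant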

Some useful queries I have used. To find out which entities use the most data:

SELECT entity_id, COUNT(*) as count FROM states GROUP BY entity_id ORDER BY count DESC LIMIT 100;

To remove entities from the database directly using LIKE patterns (% matches any sequence of characters):

-- first test that the LIKE pattern matches the entities you expect to remove
SELECT entity_id, COUNT(*) as count FROM states WHERE entity_id LIKE 'sensor.blitzwolf%status' GROUP BY entity_id ORDER BY count DESC LIMIT 10;
-- then remove the matching entities. This is final!
DELETE FROM states WHERE entity_id LIKE 'sensor.blitzwolf%energy\_voltage';
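Deleting rows only cleans up history that is already stored; to keep the same entities from filling the database again, they can also be excluded from the recorder. A sketch extending the recorder config above, reusing the blitzwolf plugs as an example (the exact globs are assumptions, adjust to your own entities):

recorder:
  exclude:
    entity_globs:
      # Stop recording the chatty plug sensors entirely
      - sensor.blitzwolf*status
      - sensor.blitzwolf*energy_voltage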

To find out how much data each table uses (credit goes to mbuscher):

SELECT
    table_name AS `Table Name`,
    table_rows AS `Row Count`,
    ROUND(SUM(data_length)/(1024*1024*1024), 3) AS `Table Size [GB]`,
    ROUND(SUM(index_length)/(1024*1024*1024), 3) AS `Index Size [GB]`,
    ROUND(SUM(data_length+index_length)/(1024*1024*1024), 3) AS `Total Size [GB]`
FROM information_schema.TABLES
WHERE table_schema = 'homeassistant'
GROUP BY table_name
ORDER BY table_name;

Another thing I found useful was to plot the counts of the first 1000 entities from the first query in Excel and calculate the running sum of the counts up to each entity. That showed I could shrink the database by roughly a factor of 10 simply by removing the 100 entities with the most rows.

License

MIT

References

Geekofweek's README.md was used as a source of inspiration for this document
