0.21: Improved Web and support for EnOcean, LIRC and Osram Lightify

two minutes reading time
  • Release-Notes

It’s time for release 0.21 and it contains a massive core improvement: the replacement of our home-grown HTTP stack with a standardized WSGI stack. This will improve performance and security, and make future development of advanced HTTP features a breeze.

This work was driven by the amazing Josh Wright. His knowledge, high standards and drive for security have helped improve Home Assistant a lot ever since he started helping out. Hip hip hooray for Josh!

Alright, time for the changes:

Breaking Changes

  • Our work on the WSGI stack is not fully done yet. We still have a minor issue where retrieving the error log from the about screen can raise an encoding error.
  • The API used to incorrectly accept a JSON body sent with form-urlencoded headers. The cURL examples on our website used to be wrong and have been updated.
  • Make sure your configuration.yaml file contains frontend: to serve the frontend
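To illustrate the corrected API behavior, a JSON request to the REST API should carry a matching JSON Content-Type header. A minimal Python sketch (entity id hypothetical):

```python
import json
import urllib.request

# Build a request with a JSON body *and* a JSON Content-Type header;
# a form-urlencoded header paired with a JSON body is no longer accepted.
body = json.dumps({"state": "on"}).encode("utf-8")
req = urllib.request.Request(
    "http://localhost:8123/api/states/light.kitchen",
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)
print(req.get_header("Content-type"))  # application/json
```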

Hotfixes 0.21.1 and 0.21.2

We released two hotfixes to address some issues that couldn’t wait until the next release.

0.21.1 - June 12
  • Add eventlet to base requirements to resolve some installation issues (@balloob)
  • GTFS will filter out routes in the wrong direction (@imrehg)
  • Recover from rare error condition from LIRC (@partofthething)
  • Z-Wave autoheal will no longer raise exception (@balloob)
  • Alexa will now execute the script before making a reply (@balloob)
  • Fix MJPEG camera streaming (@stjohnjohnson)
  • Fix frontend in older browsers (@balloob)
  • Fix history in more info dialog being cut off (@balloob)
0.21.2 - June 15
  • Fix input_select calling the set_option service again when changed (@persandstrom)
  • Fix more info dialog not being able to open on Safari (@balloob)
  • Add support for OPTIONS HTTP command to get CORS working (@JshWright)

Community Highlights

1 minute reading time
  • Community
  • Video

Our community is amazingly helpful and creative. If you haven’t been there yet, make sure to stop by our chat room and come hang out with us. In this blog post I want to highlight a few recent awesome projects and videos from the community.

SceneGen - CLI for making scenes

SceneGen is a new command line utility developed by Andrew Cockburn that helps with creating scene configurations for Home Assistant. To use it, you put your house in the desired state, run SceneGen, and it prints the scene configuration for your current states.
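The output SceneGen prints is ordinary Home Assistant scene configuration. For a living room set up for a movie, it might look something like this (entity ids and values hypothetical):

```yaml
scene:
  - name: Movie Night
    entities:
      light.living_room:
        state: on
        brightness: 120
      light.kitchen: off
```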

Videos

Nick Touran has been working on integrating IR remotes with Home Assistant. He made it into a component that should land in the next release, due in a couple of days. In the meantime, he has written a blog post and put out a video showing the new integration. Very cool!

Ben from BRUH Automation has put out another great video on how to get started tracking your location in Home Assistant using MQTT and OwnTracks.

Muhammed Kilic has created a video on how to make your Home Assistant instance accessible from the internet using the free dynamic DNS service DuckDNS.


iBeacons: How to track things that can’t track themselves (part II)

eight minutes reading time
  • iBeacons
  • Device-Tracking
  • OwnTracks

This post is by Home Assistant contributor Greg Dowling.

In Part 1 I talked about using iBeacons to improve presence tracking. In part 2 I’ll talk about how to track things like keys that can’t track themselves by using iBeacons.

Tracking things using iBeacons

In the first part I mentioned that iBeacons just send out “I’m here” packets, and we used this to trigger an update when your phone came close to a fixed beacon.

But beacons don’t have to be fixed.

Your phone knows roughly where it is located (based on mobile phone masts, Wi-Fi networks or GPS). If your phone sees an “I’m here” message then it knows the beacon is close.

If your phone can remember (or tell a server) where it was when it last saw the iBeacon, then it knows where the beacon was. The result is that you can track where an iBeacon was, even though the iBeacon has no tracking technology itself.

So if you put an iBeacon on your keys or in your car, you can track them.
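The idea is simple enough to sketch in a few lines of Python (purely illustrative, not Home Assistant code): keep the phone’s last known position, and stamp each beacon with that position whenever the beacon is heard.

```python
# Sketch of "last seen" tracking for beacons that cannot locate themselves.
# The phone (or a server) keeps its own last known position and records it
# against a beacon whenever that beacon's advertisement is heard.

class BeaconTracker:
    def __init__(self):
        self.phone_location = None   # latest (lat, lon) fix for the phone
        self.last_seen = {}          # beacon id -> (lat, lon) where it was heard

    def update_phone_location(self, lat, lon):
        """Called when the phone gets a new location fix."""
        self.phone_location = (lat, lon)

    def beacon_heard(self, beacon_id):
        """Called when the phone hears a beacon's 'I'm here' packet."""
        if self.phone_location is not None:
            self.last_seen[beacon_id] = self.phone_location

    def locate(self, beacon_id):
        """Best guess of where the beacon is: where it was last heard."""
        return self.last_seen.get(beacon_id)


tracker = BeaconTracker()
tracker.update_phone_location(51.5074, -0.1278)  # phone gets a fix at home
tracker.beacon_heard("keys")                     # keys beacon heard here
tracker.update_phone_location(48.8566, 2.3522)   # phone moves away
print(tracker.locate("keys"))                    # still the earlier fix
```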

Here are my keys - with an Estimote Nearable iBeacon stuck to them. Ugly but effective!

Read on →

Raspberry Pi all-in-one installer

Less than one minute reading time
  • Video

We are always hard at work at the virtual Home Assistant headquarters to make it easier for you to get started with Home Assistant. That’s why @jbags81 recently introduced the all-in-one installer. It allows you to get up and running with a complete Home Assistant setup by running a single command on your Raspberry Pi running Raspbian Jessie:

wget -Nnv https://raw.githubusercontent.com/home-assistant/fabric-home-assistant/master/hass_rpi_installer.sh && bash hass_rpi_installer.sh;

This feature wouldn’t be complete if it wasn’t accompanied by a new video by Ben from BRUH Automation. The video shows how to install Raspbian Jessie on your Raspberry Pi and use the new installation script to get a full Home Assistant system up and running.


0.20: Roku, Last.fm, AWS, Twilio

two minutes reading time
  • Release-Notes

Tons of new supported things in 0.20.

Breaking changes

  • Asus WRT will now default to SSH, with Telnet available as an option. To keep using Telnet, set the protocol option:
device_tracker:
  platform: asuswrt
  protocol: telnet

Why we use web components and Polymer

three minutes reading time
  • Technology

I’ve been planning to write this post for a while now, as we get questions like this a lot: “Why does Home Assistant use Polymer? Why not React, Redux and whatnot?”

It’s understandable, Polymer is quite the underdog in the world of web frameworks. A corporate backer does not guarantee popularity or an active community and this shows in the number of projects using Polymer.

Still, we use Polymer and it’s awesome. To explain why, I’ll be referencing the React workflow quite a bit, as they do a lot of things right, and show how it is done in Polymer.

Polymer gives us components for the web, just like React, but based on web standards: web components and CSS variables. These standards don’t have wide browser support yet, but they are being implemented by every major browser: they are the future. For now they are polyfilled, which works just fine, but in the future the Home Assistant web app will be able to run natively in the browser == fast.

Read on →

Video: How To Configure Home Assistant

Less than one minute reading time
  • Video

Ben from BRUH Automation authors a lot of great videos about how he is using Home Assistant and how you can get started with it too. The video below will walk you through how to configure Home Assistant. Enjoy!

Make sure to subscribe to his YouTube channel for more Home Assistant videos.


0.19: Empowering scripts and Alexa

three minutes reading time
  • Release-Notes

This release is big. Until now, our automations and scripts have been very static. Starting today it should all be a bit more dynamic.

Scripts are now available in automations and when responding to Alexa/Amazon Echo. Both of these components will now expose data to be used in script templates (including from_state!). You can pass data to script entities by passing it along to the script services.

automation:
  trigger:
    platform: mqtt
    topic: some/notify/topic
  action:
    service: notify.notify
    data_template:
      message: '{{ trigger.payload }}'

automation 2:
  trigger:
    platform: state
    entity_id: light.hue
  action:
    service: notify.notify
    data_template:
      message: '{{ trigger.entity_id }} is now {{ trigger.to_state.state }}'
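Passing data to a script works along the same lines. A sketch (script name and variable are hypothetical, assuming the data passed to the script service is exposed to its templates):

```yaml
script:
  wake_up:
    sequence:
      - service: notify.notify
        data_template:
          message: 'Good morning, {{ wakeup_name }}!'
```

Calling the script.wake_up service with service data such as {"wakeup_name": "Paulus"} would then render the greeting.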

Entity Namespaces allow you to influence the entity ids for a specific platform. For example, you can turn light.living_room into light.holiday_home_living_room with the following config:

light:
  platform: hue
  entity_namespace: holiday_home

Deprecations

  • Conditions in automations should now specify which condition to use with condition: instead of platform:. For example condition: state.
  • RFXtrx has a new config format.

Old RFXtrx config format:

  devices:
    123efab1:
      name: My DI.0 light device
      packetid: 1b2200000890efab1213f60

New RFXtrx config format:

  devices:
    1b2200000890efab1213f60:
      name: My DI.0 light device
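For the condition deprecation above, a state condition in an automation would now be written like this (entity ids hypothetical):

```yaml
automation:
  trigger:
    platform: state
    entity_id: light.hue
  condition:
    condition: state
    entity_id: device_tracker.paulus
    state: home
  action:
    service: notify.notify
```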


iBeacons: Making presence detection work better (part I)

nine minutes reading time
  • iBeacons
  • Presence-Detection
  • OwnTracks

This post is by Home Assistant contributor Greg Dowling.

In 2013 Apple introduced iBeacons: a class of Bluetooth low energy (LE) devices that broadcast their identifier to nearby devices, including most smartphones. At first glance it’s hard to imagine why they might be useful. In this two part blog I’ll try and explain why they are useful and how you can use them with Home Assistant.

The reason I started using iBeacons was to improve presence detection (and I think that’s the case with most people) so that’s what I’ll discuss in part 1. In part 2 I’ll talk about using iBeacons to track devices that can’t track themselves.

Using beacons to improve OwnTracks location data

When you use OwnTracks in standard major move mode (which is kind to your phone battery) it sometimes fails to update when you’d like it to. In my case I found that it would often send a location update as I was on my way home, but then not update when I got home. The result would be that Home Assistant would think I was 500 m away from home and take quite a while to notice I was home. It would also mean that the automation that should turn on my lights when I got home didn’t work very well! There were a few times when my phone location updated at 2am and turned the lights on for me. Fortunately my wife is very patient!

Luckily, OwnTracks supports iBeacons so I could use them to make presence detection more reliable. When OwnTracks sees a beacon it recognises, it will send an update. This means that if you put a beacon at your front door - OwnTracks will see it within a few seconds of you arriving home - and send an update saying it has seen this iBeacon.
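For reference, enabling OwnTracks tracking in Home Assistant is a one-platform config. A minimal sketch (OwnTracks reports over MQTT, so an mqtt: section is needed too; the broker address is an assumption):

```yaml
mqtt:
  broker: localhost

device_tracker:
  platform: owntracks
```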

Read on →