Smart Home Solution with Data Visualization

A smart home solution monitors a locker door's status using a data pipeline built on the ELK stack. A Raspberry Pi (RPI) based IoT system reads the door status through a magnetic switch. It sends 'tweets' whenever the door status changes, and also periodically. An ELK stack (Elasticsearch/Logstash/Kibana) processes these tweets and displays the door status on a dashboard. The RPI is also a node in a Blockchain network consisting of GCP VMs and laptop-based Blockchain nodes. An AI-based Android face recognition app opens the locker door when the user shows their face and closes it when the user hides their face. Altogether, this mini-project involves nine digital transformation technologies: AI, Data Science, Blockchain, Android, IoT, Cloud Computing, Web 3.0, Big Data Analytics and Social Media.
The following development environments are required:
1. Web application to train human faces (HTML/JavaScript)
2. Google Colab to convert web model to TFLite model (Python)
3. Android Studio to build Android application with Web3J (Java)
4. Google Cloud VMs (2 nos.) for running miners (Geth)
5. Raspberry Pi as a Blockchain node (Geth)
6. Truffle environment to build and deploy the Smart Contract (Solidity)
7. IoT client application running on the RPI (JavaScript/NodeJS)
Prerequisites
The following Medium articles need to be completed as prerequisites:
I. Smart Home Solution using Smart Contract.
https://anbunathanramaiah.medium.com/smart-home-solution-using-smart-contract-7eb7cd3407d3
II. Smart Home Solution using Android and Blockchain.
https://anbunathanramaiah.medium.com/smart-home-solution-using-android-and-blockchain-c5d6f23122b0
III. Smart Home Solution using Gesture Recognition.
https://anbunathanramaiah.medium.com/smart-home-solution-using-gesture-recognition-3be30071a78f
IV. Smart Home Solution using Face Recognition.
https://anbunathanramaiah.medium.com/smart-home-solution-using-face-recognition-8284644d49a
In this article, the focus is on building the data pipeline and getting a real-time stream from Twitter.
The steps are broadly classified as follows:
1. Configure ELK VM
2. Kibana Visualization Setup
3. Hardware Connection
4. Start Blockchain Nodes
5. Start RPI Application
6. Start Kibana Visualization
7. Check Project Setup
8. Run Face Recognition App
9. Demo
1. Configure ELK VM
1.1 Installation
Create GCP VM as given in gcp vm.txt (External IP = 35.185.251.251)
sudo apt-get update
sudo apt install openjdk-11-jre-headless
wget -qO - https://artifacts.elastic.co/GPG-KEY-elasticsearch | sudo apt-key add -
sudo apt-get install apt-transport-https
echo "deb https://artifacts.elastic.co/packages/7.x/apt stable main" | sudo tee -a /etc/apt/sources.list.d/elastic-7.x.list
sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys D27D666CD88E42B4
sudo apt-get update && sudo apt-get install elasticsearch logstash kibana
1.2 Configure files
sudo nano /etc/elasticsearch/elasticsearch.yml
-----------
network.host: 0.0.0.0
discovery.type: single-node
-----------
sudo nano /etc/apt/sources.list.d/elastic.list
sudo nano /etc/kibana/kibana.yml
-----------
server.port: 5601
server.host: "0.0.0.0"
-----------
1.3 Start ELK Stack
sudo systemctl start elasticsearch
sudo systemctl start logstash
sudo systemctl start kibana
sudo systemctl enable elasticsearch
sudo systemctl enable logstash
sudo systemctl enable kibana
sudo systemctl status elasticsearch
sudo systemctl status logstash
sudo systemctl status kibana
journalctl -xe
1.4 Setting Up Nginx
sudo apt-get update
sudo apt install nginx
echo "admin:$(openssl passwd -apr1 admin)" | sudo tee -a /etc/nginx/htpasswd.kibana
sudo nano /etc/nginx/sites-available/kibana
-----------
server {
    listen 80;
    location / {
        auth_basic "Restricted Access";
        auth_basic_user_file /etc/nginx/htpasswd.kibana;
        proxy_pass http://10.138.0.2:5601;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection 'upgrade';
        proxy_set_header Host $host;
        proxy_cache_bypass $http_upgrade;
    }
}
-----------
1.5 Run Nginx
cd /etc/nginx/sites-available
sudo rm default
sudo ln -s /etc/nginx/sites-available/kibana /etc/nginx/sites-enabled/kibana
sudo rm /etc/nginx/sites-enabled/default
sudo systemctl stop nginx
sudo systemctl start nginx
1.6 Check ELK Status
Open Kibana in a browser (http://35.185.251.251)
Observe Kibana opens
Check Elasticsearch in the VM:
curl -XGET 'http://localhost:9200/?pretty'
Observe JSON data is displayed
curl -XGET 'http://localhost:9200/_cluster/health?pretty=true'
Observe status=green
curl -XGET 'http://localhost:9200/_cat/indices'
---------
green open .kibana_7.13.2_001 V3UCxO1zQ-uT76VKxOOumg 1 0 23 11 2.1mb 2.1mb
green open .apm-custom-link q9M9fkD2SkCAWrzhoBR-jw 1 0 0 0 208b 208b
green open .kibana-event-log-7.13.2-000001 h0eKMtzISgm-pOxNRuMdsw 1 0 6 0 27.4kb 27.4kb
green open .apm-agent-configuration a3RvR_MLQT2SxYG5tgEQsA 1 0 0 0 208b 208b
green open .kibana_task_manager_7.13.2_001 aQXHUcmCTdm3e3CTdQrHcg 1 0 10 638 208.5kb 208.5kb
green open .tasks yGqlEIW6RSqxkMDbsn98PQ 1 0 6 0 34.8kb 34.8kb
---------
1.7 Install Plugin
sudo /usr/share/logstash/bin/logstash-plugin install --no-verify logstash-input-twitter
1.8 Add twitter-template
sudo nano /etc/logstash/conf.d/twitter-template.json
Take the source code from:
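As a rough sketch of what twitter-template.json might contain (the field list and settings here are assumptions; the actual file comes with the project sources), a minimal template that keeps entities.hashtags.text queryable as a keyword could look like:

```json
{
  "index_patterns": ["twitter"],
  "settings": { "number_of_shards": 1 },
  "mappings": {
    "properties": {
      "text": { "type": "text" },
      "entities": {
        "properties": {
          "hashtags": {
            "properties": {
              "text": {
                "type": "text",
                "fields": { "keyword": { "type": "keyword", "ignore_above": 256 } }
              }
            }
          }
        }
      }
    }
  }
}
```

The keyword sub-field is what the Kibana filter entities.hashtags.text.keyword relies on later.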
1.9 Add twitter.conf
sudo nano /etc/logstash/conf.d/twitter.conf
Take the source code from:
1.10 Start Twitter Pipeline
sudo /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/twitter.conf
//observe: twitter — Starting twitter tracking {:track=>"#doorstatus"}
//observe: Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
//observe: twitter messages are getting displayed
curl -XGET 'http://localhost:9200/_cat/indices'
//Observe a new index 'twitter' is added
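The exact twitter.conf comes from the project sources; as a minimal sketch (the credentials are placeholders, and the option names assume the standard logstash-input-twitter and elasticsearch output plugins), it would look something like:

```
input {
  twitter {
    consumer_key       => "<consumer key>"
    consumer_secret    => "<consumer secret>"
    oauth_token        => "<access token>"
    oauth_token_secret => "<access token secret>"
    keywords           => ["#doorstatus"]
    full_tweet         => true
  }
}
output {
  elasticsearch {
    hosts         => ["localhost:9200"]
    index         => "twitter"
    template      => "/etc/logstash/conf.d/twitter-template.json"
    template_name => "twitter"
  }
  stdout { }
}
```

The keywords entry matches the "Starting twitter tracking {:track=>"#doorstatus"}" log line observed above.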

2. Kibana Visualization Setup
2.1 Index Creation
Open Kibana in a browser:
Left top navigation -> Discover -> Index patterns -> Create index pattern
index-name = twitter
Click on 'Next step'
Time field = @timestamp
Click on 'Create index pattern'
Left navigation -> Discover
Observe the twitter index is selected by default
Also, the 'Last 15 minutes' window is selected by default
Set refresh = 10 seconds and click 'Start'
Under 'Auto', select 'Day'
2.2 Visualization Setup
Left Navigation -> Visualize library -> Create new visualization
Aggregation based -> Explore options -> Metric -> Select twitter
Observe the count is now displayed
On the right side, under Metric, select Count
Custom label = DOOR OPENED
Under Buckets,
Select 'Split group'
Aggregation = Date Range
Field = @timestamp
Accepted date formats: From = now-30s, To = now
Click on 'Update'
In the filter (KQL) section,
filter = entities.hashtags.text.keyword : "dooropened"
Observe the count for the respective hashtag is displayed
Save visualization -> New dashboard = twitter-dashboard
Similarly, create a visualization for "doorclosed"
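The metric above boils down to a hashtag term filter combined with the now-30s date range. As a sketch (a hand-written equivalent, not the query Kibana literally emits), the Elasticsearch query body would be:

```javascript
// Sketch of the Elasticsearch query behind the "DOOR OPENED" metric:
// a term filter on the hashtag keyword field plus the now-30s -> now
// date range configured in the visualization above.
const doorOpenedQuery = {
  query: {
    bool: {
      filter: [
        { term: { 'entities.hashtags.text.keyword': 'dooropened' } },
        { range: { '@timestamp': { gte: 'now-30s', lte: 'now' } } },
      ],
    },
  },
};

console.log(JSON.stringify(doorOpenedQuery));
```

POSTing this body to the twitter index's _search endpoint (or running it in Kibana Dev Tools) should return the same count the dashboard shows.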

3. Hardware Connection
RPI connection with the servo motor:
Connect the brown wire to RPI->GND pin 6
Connect the orange wire to RPI->GPIO17 pin 11
Connect the red wire to the battery +5 V
Connect the battery GND to the RPI GND
RPI connection with the door sensor:
Magnetic switch terminal 1 to RPI->GND pin 14
Magnetic switch terminal 2 to RPI->GPIO18 pin 12
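The door sensor is just a reed switch between GPIO18 and ground, so the client only has to map a pin level to a status. A sketch of that mapping (the polarity is an assumption: with an internal pull-up on GPIO18, the closed switch holds the pin low while the door is closed):

```javascript
// Map the GPIO18 level to a door status string (polarity assumed:
// pull-up enabled, reed switch closed -> pin low -> door closed).
function doorStatus(gpioLevel) {
  return gpioLevel === 0 ? 'doorclosed' : 'dooropened';
}

console.log(doorStatus(0)); // 'doorclosed'
console.log(doorStatus(1)); // 'dooropened'
```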

4. Start Blockchain Nodes
4.1 Start Blockchain Nodes on Lenovo Laptop
Log into the Lenovo laptop through PuTTY:
user:anbu
pw:
1st putty window:
cd RPI
sh startminer1.sh
2nd putty window:
cd RPI
sh startminer2.sh
3rd putty window:
sudo geth attach "http://localhost:8042"
miner.start()
net.peerCount
4th putty window:
sudo geth attach "http://localhost:8043"
miner.start()
net.peerCount
4.2 Start Blockchain Nodes on GCP VMs
Log into the GCP Console:
user: digitransolutions11@gmail.com
pw:
Launch at least 2 VMs (34.83.61.45, 34.83.39.253)
1st VM SSH window:
cd RPI
sh startminer1.sh
2nd VM SSH window:
cd RPI
sh startminer2.sh
1st VM (2nd)SSH window:
sudo geth attach "http://localhost:8042"
miner.start()
net.peerCount
2nd VM (2nd)SSH window:
sudo geth attach "http://localhost:8043"
miner.start()
net.peerCount
4.3 Start Blockchain Nodes on RPI
Log into the RPI through PuTTY:
user: anbunathanr
pw: ubuntu
In 1st PuTTY window:
sudo classic
cd rpi-node
sh startrpinode.sh
In 2nd putty window:
sudo classic
sudo geth attach "http://localhost:8042"
net.peerCount
//observe: 4 (the two laptop miners plus the two GCP VM miners)
In 3rd putty window:
sudo classic
sudo killall pigpiod
//in classic window:
sudo pigpiod -p 8888

5. Start RPI Application
cd /home/anbunathanr/RPI/SmartToken
sudo nano smart_token_servo_visual.js
Take the source code of 'smart_token_servo_visual.js' from
https://github.com/anbunathan/bda_examples/tree/master/data-visualization
node smart_token_servo_visual.js
//Now RPI listens for any token change
//Now tweets are received on Twitter
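As a sketch of the logic smart_token_servo_visual.js presumably implements (the function name is hypothetical; the 90/0 degree angles and the hashtags come from the checks in the following sections):

```javascript
// Hypothetical helper: derive the servo angle and tweet hashtag from the
// on-chain token balance (deposit -> door opened at 90 degrees,
// withdrawal back to zero -> door closed at 0 degrees).
function doorStateFor(tokenBalance) {
  const opened = tokenBalance > 0;
  return {
    angle: opened ? 90 : 0,
    hashtag: opened ? '#dooropened' : '#doorclosed',
  };
}

console.log(doorStateFor(10)); // { angle: 90, hashtag: '#dooropened' }
console.log(doorStateFor(0));  // { angle: 0, hashtag: '#doorclosed' }
```

On a token change the real client would drive the servo to the returned angle (via pigpiod) and tweet the returned hashtag.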

6. Start Kibana Visualization
Open twitter-dashboard
Visualize -> Dashboard -> twitter-dashboard
Refresh = 10 seconds
Time window = last 95 seconds
(since the tweet loop period is 90 seconds, this window always contains at least one periodic tweet)

7. Check Project Setup
//In geth console attached to GCP VM miner1 (digitransolutions17@gmail.com):
//Create instance of Smart Contract
var abi=[{"anonymous":false,"inputs":[{"indexed":true,"name":"_from","type":"address"},{"indexed":false,"name":"_value","type":"uint256"}],"name":"OnValueChanged","type":"event"},{"constant":false,"inputs":[{"name":"recipient","type":"address"},{"name":"value","type":"uint256"}],"name":"depositToken","outputs":[{"name":"success","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":false,"inputs":[{"name":"recipient","type":"address"},{"name":"value","type":"uint256"}],"name":"withdrawToken","outputs":[{"name":"success","type":"bool"}],"payable":false,"stateMutability":"nonpayable","type":"function"},{"constant":true,"inputs":[{"name":"recipient","type":"address"}],"name":"getTokens","outputs":[{"name":"value","type":"uint256"}],"payable":false,"stateMutability":"view","type":"function"}]
var MyContract = web3.eth.contract(abi);
var MyContractInstance = MyContract.at('0xa5c53be143769d625c1185cc67c705f039a1f876');
//transfer 10 tokens to RPI
MyContractInstance.depositToken("0x8f87dbf765be30e1bc65361c99252cd33b67bd0d", 10, {from: eth.accounts[0], gas:3000000});
//check the transaction after 10 seconds
MyContractInstance.getTokens("0x8f87dbf765be30e1bc65361c99252cd33b67bd0d", {from: eth.accounts[0], gas:3000000});
//it should display 10
//Observe the RPI servo turns to 90 degrees
//withdraw 10 tokens from the RPI account
MyContractInstance.withdrawToken("0x8f87dbf765be30e1bc65361c99252cd33b67bd0d", 10, {from: eth.accounts[0], gas:3000000});
//Observe the RPI servo turns back to 0 degrees
8. Run Face Recognition App
Open the face recognition app on the Android mobile
Show the face
Observe the door is opened
Tweets are sent with the hashtag "#dooropened"
In the Kibana dashboard, observe counts are displayed under 'DOOR OPENED'
Hide the face
Observe the door is closed
Tweets are sent with the hashtag "#doorclosed"
In the Kibana dashboard, observe counts are displayed under 'DOOR CLOSED'
9. Demo
The demo of this project is given in the link:
Source code:
References
1. Raspberry Pi — Door Sensor
https://www.ryansouthgate.com/2015/08/10/raspberry-pi-door-sensor/
2. Realtime Big Data analysis using Elasticsearch/Logstash/Kibana (demonstration in Tamil)
https://www.youtube.com/watch?v=OT0uCdLuwVI
http://www.kaniyam.com/elk-stack-%e0%ae%aa%e0%ae%95%e0%af%81%e0%ae%a4%e0%ae%bf-1/
http://www.kaniyam.com/elk-stack-%e0%ae%aa%e0%ae%95%e0%af%81%e0%ae%a4%e0%ae%bf-2/
http://www.kaniyam.com/elk-stack-part-3/
http://www.kaniyam.com/elk-stack-part-4/
3. Indexing Twitter With Logstash and Elasticsearch
https://david.pilato.fr/blog/2015/06/01/indexing-twitter-with-logstash-and-elasticsearch/
4. GCP for ELK Stack
https://logz.io/blog/elk-stack-google-cloud/
https://linuxconfig.org/install-elk-on-ubuntu-20-04-focal-fossa-linux
https://ubuntu.com/tutorials/install-and-configure-nginx#2-installing-nginx
5. Create filter with time query for the last 3 days
Contact details:
Mail: anbunathan.r@gmail.com