Making PostgreSQL database realtime with Pusher and Node
You will need a basic understanding of Node.js, SQL, PostgreSQL and JavaScript. You should already have PostgreSQL, Node and npm installed.
Introduction
Sometimes, we want to watch over specific tables in our database and make new records or entries available to users as soon as they come in. While doing this, we do not want to lose the flexibility and advantages we get from using PostgreSQL databases. In this tutorial, we will learn how to make a PostgreSQL database realtime using Pusher. But first, for those of us who are not yet familiar with PostgreSQL, let’s take a quick look at it.
PostgreSQL is a powerful, open source object-relational database system that uses and extends the SQL language combined with many features that safely store and scale the most complicated data workloads.
PostgreSQL comes with many features aimed at helping developers build applications, helping administrators protect data integrity and build fault-tolerant environments, and helping you manage your data no matter how big or small the dataset. Besides being free and open source, PostgreSQL is highly extensible: for example, you can define your own data types, build custom functions, and even write code in different programming languages without recompiling your database!
In this tutorial, we will build a realtime table that shows a list of movies from our database.
Prerequisites
- A basic understanding of Node.js, SQL, and client-side JavaScript.
- PostgreSQL, Node.js and npm installed.
- Some familiarity with PostgreSQL (this tutorial does not cover absolute basics like installation and setup).
Setting up a Pusher account and app
Pusher is a hosted service that makes it super-easy to add realtime data and functionality to web and mobile applications.
Pusher acts as a realtime layer between your servers and clients. Pusher maintains persistent connections to the clients - over WebSocket if possible, falling back to HTTP-based connectivity - so that as soon as your servers have new data they want to push to the clients, they can, via Pusher.
If you do not already have one, head over to Pusher and create a free account.
We will register a new app on the dashboard. The only compulsory options are the app name and cluster. A cluster represents the physical location of the Pusher server that will handle your app’s requests. Also, copy your App ID, Key, and Secret from the App Keys section, as we will need them later on.
Setting up the project
Let’s create a new node project by running:
#create directory
mkdir realtime_postgres
#move into the new directory
cd realtime_postgres
#initialize a node project
npm init -y
Next, let’s move ahead by installing the required libraries:
npm install express body-parser ejs pusher db-migrate db-migrate-pg dotenv pg --save
In the command above, we have installed the following:
- Express: fast, unopinionated, minimalist web framework for Node.js.
- Body-parser: parses incoming request bodies in a middleware before your handlers, making them available under the req.body property.
- Ejs: templating engine for Node.js.
- Pusher: the official Node.js library for Pusher.
- Db-migrate: database migration framework for Node.js.
- Db-migrate-pg: a PostgreSQL driver for db-migrate.
- Dotenv: a zero-dependency module that loads environment variables from a .env file.
- Pg: non-blocking PostgreSQL client for Node.js, in pure JavaScript with optional native libpq bindings.
Setting up the app
Setting up environment variables
First, we need to define a file called .env in the root of our project to hold our environment variables.
The POSTGRES_CONNECTION_URL is a combination of our PostgreSQL username, password, host, port and database name. For example, if our username is user, our password is securePassword, our host (for a local installation) is 127.0.0.1, the default port is 5432 and the name of our database is realtime_postgres, our POSTGRES_CONNECTION_URL would look like:
postgres://user:securePassword@127.0.0.1:5432/realtime_postgres
For this tutorial, we will need our POSTGRES_CONNECTION_URL and the Pusher credentials we obtained earlier, so our file looks like:
POSTGRES_CONNECTION_URL=postgres://user:securePassword@127.0.0.1:5432/realtime_postgres
PUSHER_APP_ID=XXX_APP_ID
PUSHER_APP_KEY=XXX_APP_KEY
PUSHER_APP_SECRET=XXX_APP_SECRET
PUSHER_APP_CLUSTER=XXX_APP_CLUSTER
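To confirm these variables load correctly before wiring up the app, you can run a quick throwaway script; the check_env.js name and the script itself are just illustrative, not part of the project:

// check_env.js — prints whether each required variable is set
require('dotenv').config();

['POSTGRES_CONNECTION_URL', 'PUSHER_APP_ID', 'PUSHER_APP_KEY',
 'PUSHER_APP_SECRET', 'PUSHER_APP_CLUSTER'].forEach((name) => {
  // A variable that prints MISSING was not picked up from .env
  console.log(name + ': ' + (process.env[name] ? 'set' : 'MISSING'));
});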
Setting up db-migrate
Next, we need to create a file called database.json
in the root of our app, which comprises a JSON structure, telling db-migrate
which database connection we will be using.
In the snippet below, we set the connection URL to be retrieved from the POSTGRES_CONNECTION_URL
variable which we had declared in our .env
when our environment is dev
.
{
"dev": {
"ENV": "POSTGRES_CONNECTION_URL"
}
}
Now that we have our db-migrate config set up, we want to generate the SQL file that will hold the migration for the table we will make realtime. To generate the migration, run:
node node_modules/db-migrate/bin/db-migrate create realtime_table --sql-file
The command above will create a folder called migrations. Inside it, another folder called sqls will be created. In that folder, you will notice two files with the current timestamp as the prefix, followed by the name of the migration, and a suffix of up and down.
In my case, the files were named 20180423192559-realtime-table-up.sql and 20180423192559-realtime-table-down.sql.
Open the 20180423192559-realtime-table-up.sql file and replace its content with:
/* Create the table we want to make realtime */
CREATE TABLE realtime_table (id serial primary key, title varchar, year varchar, producer varchar);
/* Function that publishes each new row as JSON on the watch_realtime_table channel */
CREATE FUNCTION notify_trigger() RETURNS trigger AS $$
DECLARE
BEGIN
PERFORM pg_notify('watch_realtime_table', row_to_json(NEW)::text);
RETURN NEW;
END;
$$ LANGUAGE plpgsql;
/* Fire the function after every INSERT on realtime_table */
CREATE TRIGGER watch_realtime_table_trigger AFTER INSERT ON realtime_table
FOR EACH ROW EXECUTE PROCEDURE notify_trigger();
What happens in the code above? We have some SQL statements that will run once we migrate our database. The first statement creates a table called realtime_table with id, title, year and producer columns.
The next thing we do is create a function that returns a trigger. Hold on, what is a trigger? Triggers are stored programs that are automatically executed, or fired, when certain events occur. Triggers are typically written to execute in response to any of the following events:
- a data manipulation (DML) statement (DELETE, INSERT, or UPDATE)
- a data definition (DDL) statement (CREATE, ALTER, or DROP)
- a database operation (in some databases, such as Oracle: SERVERERROR, LOGON, LOGOFF, STARTUP, or SHUTDOWN)
Here, we use a row-level trigger that fires on a DML statement, INSERT.
In the function returning a trigger that we defined above, we perform a pg_notify call on our table, asking it to publish the new row as JSON, cast to text. Now, what exactly is this pg_notify function?
The pg_notify command sends a notification event, together with an optional payload string, to each client application that has previously executed LISTEN channel for the specified channel name in the current database. Its arguments are the channel name to send the notification to and the message to send. This means that on the server side, we can listen to a channel called watch_realtime_table once we attach a trigger to it. Cool stuff.
The last part of our script creates our trigger, watch_realtime_table_trigger, which is set to fire after an INSERT on realtime_table. We tell the trigger to execute the notify_trigger function for each new row added.
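Once the migration below has run, you can watch these notifications in isolation with a minimal sketch like this one (listen_demo.js is an illustrative name, separate from our app):

// listen_demo.js — logs every notification published on watch_realtime_table
const pg = require('pg');
require('dotenv').config();

const client = new pg.Client({ connectionString: process.env.POSTGRES_CONNECTION_URL });

client.connect().then(() => {
  client.on('notification', (msg) => {
    console.log('channel:', msg.channel, 'payload:', msg.payload);
  });
  // Subscribe to the channel our trigger publishes on
  return client.query('LISTEN watch_realtime_table');
});

With this running, any INSERT into realtime_table will print the new row as JSON.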
Let’s migrate our database by running:
node node_modules/db-migrate/bin/db-migrate up
Setting up the entry point
We now have a channel that is broadcast to on every new INSERT into our database, but we are not yet listening for those notifications in our app or forwarding them to connected clients. Here is where Pusher comes in: it lets us notify clients in realtime when a new row has been inserted.
Create a file called index.js
in the root folder and paste in:
const express = require('express');
const bodyParser = require('body-parser');
const Pusher = require('pusher');
const pg = require('pg');
require('dotenv').config();
const app = express();
// declare variable to hold database connection
let pgClient;
// set the view engine to ejs
app.set('view engine', 'ejs');
const pool = new pg.Pool({
connectionString: process.env.POSTGRES_CONNECTION_URL,
});
// Body parser middleware
app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: true }));
// Create an instance of Pusher
const pusher = new Pusher({
appId: process.env.PUSHER_APP_ID,
key: process.env.PUSHER_APP_KEY,
secret: process.env.PUSHER_APP_SECRET,
cluster: process.env.PUSHER_APP_CLUSTER,
encrypted: true
});
pool.connect((err, client) => {
if(err) {
return console.error(err);
}
pgClient = client;
});
//listen on the app
app.listen(3000, () => {
return console.log('Server is up on 3000')
});
Here, what we have is a basic Express setup: we require the needed libraries, use the body-parser middleware, and create an instance of Pusher, passing in the app ID, key, secret, and cluster from our environment variables. We set ejs as our templating engine, which we will use later on in this tutorial. We also connect to our PostgreSQL pool, then assign the connected client to a global variable.
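If you want to confirm your Pusher credentials work at this point, you can fire a throwaway event and watch it arrive in the Debug Console on your Pusher dashboard (the channel and event names here are arbitrary):

// temporary credential check — remove after verifying
pusher.trigger('test_channel', 'test_event', { message: 'hello world' });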
Listening to the PostgreSQL notification
Let’s update our PostgreSQL connection function to include our listener and publish the new data in realtime. Replace the pool.connect function with this:
pool.connect((err, client) => {
if(err) {
return console.error(err);
}
pgClient = client;
// forward every database notification to Pusher
client.on('notification', function(msg) {
pusher.trigger('watch_realtime_table', 'new_record', JSON.parse(msg.payload));
});
// subscribe to the channel our trigger publishes on
client.query('LISTEN watch_realtime_table');
});
What has changed in this case? First, we added a handler for notification events from our database. Each notification has two attributes that are useful to us, namely channel and payload. Since we are listening on only one channel here, all notifications will come from our watch_realtime_table channel. If you watch more than one channel, be sure to use the channel attribute as a condition so you send the right data to the right place (a sketch of this follows below). Once the data arrives, we make a Pusher trigger call to send the record to our Pusher channel and event for realtime notification.
Next, we executed a query to listen for notifications on the channel by running LISTEN watch_realtime_table.
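As a minimal sketch of that multi-channel case (the second channel name is hypothetical), the handler could branch on msg.channel:

client.on('notification', function(msg) {
  // Route each notification by the channel it arrived on
  if (msg.channel === 'watch_realtime_table') {
    pusher.trigger('watch_realtime_table', 'new_record', JSON.parse(msg.payload));
  } else if (msg.channel === 'watch_other_table') {
    // hypothetical second table being watched
    pusher.trigger('watch_other_table', 'new_record', JSON.parse(msg.payload));
  }
});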
Receiving data on the frontend
Now we are set up to receive data in realtime once a record is inserted into our table. Let us set up a new route to show how to receive this data on the frontend.
Let us add a new route to our index.js
file as seen:
app.get('/', async(req, res) => {
const data = await pgClient.query('SELECT * FROM realtime_table');
return res.render('index', {table: data.rows});
});
Here, we define the route as the root of our application. We grab all the current data in our database and pass it on to a view named index.
Note: we are using ejs as our templating engine, and the index passed to the render function refers to views/index.ejs. Express looks for templates in the views folder by default if we do not set a different one in our application.
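If you ever need your templates somewhere else, Express lets you point the view engine at a different folder; the templates directory below is hypothetical, and for this tutorial we stick with the default views:

const path = require('path');
// Look for templates in ./templates instead of the default ./views
app.set('views', path.join(__dirname, 'templates'));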
Next, create the file views/index.ejs
and add:
<link href="//maxcdn.bootstrapcdn.com/bootstrap/3.3.0/css/bootstrap.min.css" rel="stylesheet" id="bootstrap-css">
<script src="https://js.pusher.com/4.0/pusher.min.js"></script>
<div class="container">
<div class="row">
<div class="col-md-12">
<h4>Realtime Movie Database</h4>
<div class="table-responsive">
<table id="mytable" class="table table-bordred table-striped">
<thead>
<tr>
<th>Movie Title</th>
<th>Year Released</th>
<th>Producer</th>
</tr>
</thead>
<tbody id="tableBody">
<% table.forEach(function(row) { %>
<tr>
<td>
<%= row.title %>
</td>
<td>
<%= row.year %>
</td>
<td>
<%= row.producer %>
</td>
</tr>
<% }); %>
</tbody>
</table>
<div class="clearfix"></div>
</div>
</div>
</div>
</div>
<script>
var pusher = new Pusher('XXX_APP_KEY', {
cluster: 'XXX_APP_CLUSTER',
encrypted: true
});
var channel = pusher.subscribe('watch_realtime_table');
channel.bind('new_record', function(data) {
var table = document.getElementById('mytable');
var row = table.insertRow(-1);
var title = row.insertCell(0);
var year = row.insertCell(1);
var producer = row.insertCell(2);
title.innerHTML = data.title;
year.innerHTML = data.year;
producer.innerHTML = data.producer;
});
</script>
In the code above, we include the Bootstrap stylesheet and the Pusher client library, then declare a table with an ID of mytable and loop over the current table data to display it.
Next, in our script, we instantiate Pusher with our app key and cluster, then subscribe to the watch_realtime_table channel which we push new data to. Immediately after subscribing, we bind to the event called new_record, which is triggered every time a record is added.
In the handler, we grab the current table, append a new row to the end of it, create three new cells to hold the data, then set the data on the innerHTML attributes of those cells.
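One caution worth noting: writing database values into innerHTML means any HTML inside a movie title would be interpreted as markup. If that is a concern, textContent is the safer drop-in assignment for the handler above:

// safer alternative to the innerHTML assignments
title.textContent = data.title;
year.textContent = data.year;
producer.textContent = data.producer;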
After setting all of this up, we can use any PostgreSQL client, such as pgAdmin or Adminer, to add records to the table and watch the changes show up on the frontend; a small script alternative follows below.
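If you would rather insert a test row from Node than from a GUI client, a short throwaway script like this one (a hypothetical helper, not part of the app) fires the trigger just as well:

// insert_test.js — inserts a sample row to fire the notify trigger
const pg = require('pg');
require('dotenv').config();

const pool = new pg.Pool({ connectionString: process.env.POSTGRES_CONNECTION_URL });

pool.query(
  'INSERT INTO realtime_table (title, year, producer) VALUES ($1, $2, $3)',
  ['Black Panther', '2018', 'Kevin Feige'] // sample values
).then(() => pool.end());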
To view changes on the frontend, run node index.js
and visit http://localhost:3000 in your browser to view the magic.
In my case, I already had Adminer on my system, so I used it. Here is a demo of this functionality:
Conclusion
In this tutorial, you learned how to make a PostgreSQL database realtime using Pusher. We covered creating stored functions, triggers, watchers, and listeners.
From here, you can take things further and explore more complex functionality, such as watching over more than one table at once, or making triggers respond to update and delete operations.
The codebase for this tutorial is hosted in a public GitHub repository. Experiment with the code.
30 April 2018
by Samuel Ogundipe