41 How to Integrate R with PostgreSQL

Julia Wang

41.1 Motivation

This guide will teach you how to connect RStudio to a local instance of PostgreSQL, a popular open-source object-relational database management system that is free to download. Once we are comfortable connecting to the PostgreSQL instance, we will fit a quick linear regression on some sample data, store both the data and the fitted model in the PostgreSQL database, and then reuse that stored model to make a prediction.

Note: the terms instance and database are used interchangeably in this guide.

41.2 How to Integrate R with PostgreSQL

If you haven’t already, you will have to [download PostgreSQL](https://www.postgresql.org/download/){target="_blank"} and set up pgAdmin 4. Note down the user and password that you create in this process, as well as the host and port you used to set up PostgreSQL (the defaults are localhost and 5432). You will need this information later to establish a connection between RStudio and your PostgreSQL instance.
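
One optional habit, not required by this guide, is to keep these credentials in your ~/.Renviron file instead of typing them directly into scripts; the variable names below are arbitrary examples.

# ~/.Renviron might contain lines such as PG_USER=postgres and PG_PASSWORD=yourpassword
# (restart R after editing the file so the values are picked up)
Sys.getenv("PG_USER")     # read the stored user name
Sys.getenv("PG_PASSWORD") # read the stored password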

Once you have set up PostgreSQL and are able to connect to pgAdmin 4, open RStudio and install the relevant R packages by running the following commands.

install.packages("RPostgres")
install.packages("devtools")
install.packages("DBI")
install.packages("remotes")

41.2.1 Create your connection

In case you want to use a separate dummy/admin user, you can create one in pgAdmin by going to Servers > PostgreSQL > Login/Group Roles, right clicking, and choosing Create > Login/Group Role. The name you enter becomes the user name, and you set the password under the Definition tab; at a bare minimum, the role should have the privilege to log in.

In the following R block, we are going to set up a connection to our PostgreSQL instance. One question you might have reading it is what the drv (driver) argument does. A driver in this context is a tool that allows applications to access data in your database from other systems, similar to how a printer driver lets your computer talk to your printer. We pass a driver, along with the relevant information about your database and a database user, to create a connection, which we can then use to run commands such as dbListTables, which lists all of the tables in your database.

db <- '<your db name here>'  # provide the name of your db

host_db <- '<your db server here>' # e.g. 'ec2-54-83-201-96.compute-1.amazonaws.com', or 'localhost' if you are connecting to an instance on your own laptop
db_port <- '5432'  # or any other port specified by the DBA

db_user <- '<your db user here>' # user that has access to your SQL instance

db_password <- '<your db user pwd here>' # password of the user who has access to your SQL instance.

drv <- RPostgres::Postgres()

con <- dbConnect(drv, dbname = db, host=host_db, port=db_port, user=db_user, password=db_password)  

dbListTables(con) # returns a list of tables in your database
dbExistsTable(con, "<table name>") # checks if the table exists in your database
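
Before moving on, it can help to confirm that the connection actually works. A minimal optional check (not part of the original walkthrough) is:

dbIsValid(con)                       # returns TRUE while the connection is open
dbGetQuery(con, 'SELECT version()')  # ask the server which PostgreSQL version it is running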

41.2.2 Basic SQL commands

In addition to the two commands shown above, there are a few more basic commands provided by DBI and RPostgres that allow you to read from and write to your database. The following block shows how to write a data frame into a table in your instance, as well as how to query it afterwards.

# Create a table
library(tidyverse)

cars <- mtcars %>% rownames_to_column("carname") 
  
dbWriteTable(con, 'cars', cars) # add the mtcars data set to your database as a table called "cars"

cars <- cars %>% mutate(id = row_number()) # if you want to change your data, such as adding an id column

# To write the modified data you need additional arguments: overwrite = TRUE drops and
# recreates the table with your new data, while append = TRUE adds rows (the data frame
# must have the same columns and variable names as the existing table for append to work).
dbWriteTable(con, 'cars', cars, overwrite = TRUE, append = FALSE)

If you don’t see your new table in pgAdmin right away, right click the Tables node in the sidebar and hit Refresh.

Read your table:

dbReadTable(con, "cars") # read your newly created table

result <- dbReadTable(con, "cars") # can also be stored as variable

In case you prefer writing raw SQL, you can use the dbGetQuery method. There is also a dbSendQuery method, which is a bit more involved, but for our purposes dbGetQuery will be sufficient.

library(ggplot2)

dbGetQuery(con, 'ALTER TABLE cars ADD CONSTRAINT cars_pk PRIMARY KEY (id)') # add primary key to the id column

db_cars <- dbGetQuery(con, 'SELECT * FROM cars WHERE id <= 20') # filter down our data set 

ggplot(db_cars, aes(x=disp, y=mpg)) + 
  geom_point() +
  ggtitle("Miles Per Gallon vs. Displacement (cu.in.)") +
  ylab("Miles Per Gallon") +
  xlab("Displacement (cu.in.)")

The dbSendQuery method only submits and synchronously executes the SQL query on the database engine. It does not extract any records; for that you need to use the dbFetch method, and you must then call dbClearResult once you have finished fetching the records you need. If you are going to run SQL queries through RStudio, you should most likely be using dbGetQuery.
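
For completeness, here is a small sketch of that dbSendQuery / dbFetch / dbClearResult pattern, run against the cars table we created above (the query itself is just an illustrative example):

res <- dbSendQuery(con, 'SELECT carname, mpg, disp FROM cars WHERE mpg > 25')
first_rows <- dbFetch(res, n = 5)   # fetch the first five matching rows
remaining <- dbFetch(res, n = -1)   # fetch whatever is left
dbClearResult(res)                  # always clear the result when you are finished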

Now that we are able to connect to and talk to our PostgreSQL instance, we can run analyses using the data stored there and store our results there as well. We can also pull a saved model back out to make predictions, which saves us the time of retraining it.

model <- lm(mpg ~ disp, db_cars)

serialized_model <- rawToChar(serialize(model, NULL, ascii=TRUE)) # serialize the model (convert it to a text string) so it can be stored in the database

# create table for model to be stored in
dbGetQuery(con, 'CREATE TABLE models (
    id SERIAL PRIMARY KEY,
    model TEXT NOT NULL
);') 

# insert model into models table
insert_query <- 'INSERT INTO models (model) VALUES ($1)'
rs <- dbSendQuery(con, insert_query, params = list(serialized_model)) # $1 is bound to the serialized model
dbClearResult(rs)

# read the model from postgreSQL
result <- dbGetQuery(con, "SELECT model FROM models WHERE id = 1")

# revert serialization, and we have a working model again!
db_model <- unserialize(charToRaw(as.character(result[,c('model')])))
summary(db_model)

# use the model to make a prediction
disp <- 128
X_test <- data.frame(disp)
predict(db_model, X_test)
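
If you end up storing more than one model in the models table, one possible pattern (an illustrative sketch, not part of the original example) is to pull back the most recently inserted row and use it the same way:

latest <- dbGetQuery(con, 'SELECT model FROM models ORDER BY id DESC LIMIT 1')
latest_model <- unserialize(charToRaw(latest$model)) # most recently stored model
predict(latest_model, data.frame(disp = 200))        # disp = 200 is an arbitrary test value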

Make sure to clean up your connections once you are finished working with your database.

# disconnect once you are finished
dbDisconnect(con)
dbUnloadDriver(drv)