Postgres in the Cloud for Django Apps
Several weeks ago, we tried our hand at building a basic Dungeons and Dragons player application powered by Django. Last time, we hooked up the app to a containerized Postgres instance. In this post I'll walk through the steps to connect a Django app to PostgreSQL in the cloud via Crunchy Bridge.
Note that in this scenario we're not actually taking the original database and moving it to Crunchy Bridge. We may have a follow-up blog post on migrating this existing database but for now we'll show how we're plugging in a new web app to a fresh instance of Postgres.
Prepare the database
Django supports PostgreSQL 9.5 and higher as a database backend. We used Python 3/Django 3.0 in a virtual environment, and with any Django-based app, we also need psycopg2 to work with the Postgres backend.
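If you're starting from scratch, the environment prep might look something like this (a sketch only; package names and version pins here are illustrative, and `psycopg2-binary` is a convenient stand-in for building `psycopg2` from source):

```shell
python3 -m venv venv
source venv/bin/activate
pip install "Django>=3.0,<3.1" psycopg2-binary
```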
Spin up a new database cluster with Crunchy Bridge
To start, I'm provisioning a new Postgres database cluster in my Crunchy Bridge account (you can sign up for an account on the Crunchy Bridge homepage). You'll be able to select the Postgres version, storage capacity, set up an initial password for postgres user, etc. Since I'm using this with a demo app, I went with one of the Hobby tiers. More details on tiers and other provisioning options are available in the docs.
After provisioning a new database cluster (which takes a few minutes), we get the host address and port plus other details.
Create a new database for the app
I can now quickly create the new database for our app on the command line like so:
createdb -U postgres -h p.s7d2ocnyg5bkvbllzckli7mwi4.db.postgresbridge.com -p 5432 ddapp
In running createdb, I'm naming the new database ddapp, which is the same name as our original database. You'll see why in the next step.
Set up the database schema using a script
We used pgModeler to design our original database; it also allows you to export your database model to a SQL script, which I'm sharing in this GitHub gist. Like I mentioned earlier, the old database was named ddapp, which means that our generated script contains references to ddapp specifically. (I certainly could've gone with a different name, but I would have had to update the script accordingly.) We also assigned a devuser role for the app connection, so this script will recreate that role in this new database as well.
In Crunchy Bridge, the connection string comprises the postgres username, the password assigned when provisioning the cluster (my_password as an example), and the postgres database. I already have the new ddapp database created, so I'll run the dd_database.sql script with psql, modifying the connection string like so:
psql "postgres://postgres:my_password@p.s7d2ocnyg5bkvbllzckli7mwi4.db.postgresbridge.com:5432/ddapp" -f dd_database.sql
Now if I log in via psql (using the same connection string as above) to check, I get the tables that I wanted:
ddapp=# \dt
             List of relations
 Schema |      Name       | Type  |  Owner
--------+-----------------+-------+---------
 public | alignment       | table | devuser
 public | campaign        | table | devuser
 public | pc              | table | devuser
 public | pc_class        | table | devuser
 public | person          | table | devuser
 public | person_campaign | table | devuser
 public | race            | table | devuser
(7 rows)
Then, I run the following statements so our app has all the database permissions it needs:
GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO devuser;
GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO devuser;
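The grants can also be applied non-interactively with psql's -c flag, using the same connection string as before (my_password is still the placeholder from earlier):

```shell
psql "postgres://postgres:my_password@p.s7d2ocnyg5bkvbllzckli7mwi4.db.postgresbridge.com:5432/ddapp" \
  -c "GRANT ALL PRIVILEGES ON ALL TABLES IN SCHEMA public TO devuser;" \
  -c "GRANT ALL PRIVILEGES ON ALL SEQUENCES IN SCHEMA public TO devuser;"
```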
We're all set with Postgres, so we'll just need to finish up in Django and we'll be up and running.
Set up the Django data layer
We've already created our new Django project, so we can go ahead and update our database connection parameters in the project's settings.py. Following best practices for security, we'll store our database connection information in a .env file in our project:
DATABASE_URL=postgres://devuser:devuserpassword@p.s7d2ocnyg5bkvbllzckli7mwi4.db.postgresbridge.com:5432/ddapp
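If you'd rather not pull in a package like python-dotenv, here's a minimal sketch of reading a .env file's KEY=VALUE lines by hand (load_env_text is an illustrative helper, not part of Django):

```python
import os

def load_env_text(text):
    """Parse KEY=VALUE lines, skipping blanks and # comments."""
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")  # split on the first '=' only
        env[key.strip()] = value.strip()
    return env

# In practice you'd read the file; using the contents directly here:
env = load_env_text(
    "DATABASE_URL=postgres://devuser:devuserpassword"
    "@p.s7d2ocnyg5bkvbllzckli7mwi4.db.postgresbridge.com:5432/ddapp"
)
os.environ.update(env)
```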
And in settings.py we can do the following, as an example (this may look different for you depending on how you want to manage your Django settings and how you deploy your app):
import os

import dj_database_url

if os.environ.get('DATABASE_URL'):
    DATABASES['default'] = dj_database_url.config(default=os.environ['DATABASE_URL'])
We run an initial migration at this point which will create a few more database tables that Django needs:
(venv)> py manage.py migrate
Then, we can use inspectdb to auto-generate our models based on the ddapp database schema:
(venv)> py manage.py inspectdb > managerapp/models.py
Note from the official Django docs that inspectdb isn't meant to be a definitive way of creating models. We'll still need to do some cleanup so that the database layer works as expected. (Steve wrote about our experience doing exactly this in another blog post.)
If we had gotten far enough in the project to have already written our models, we'd run a migration instead, and Django would do the work of setting up the schema in the new database for us. We're a Postgres shop, so we usually design the database first; your workflow might instead follow what's outlined in the Django tutorial, where you start by creating models and then populate the database.
At this point, we're good to go. We can now do things like issue queries with the Django database API, use forms, and be able to update and sync with the database.
Python and Postgres in the cloud
For the Pythonistas out there: Crunchy Bridge is the only cloud PostgreSQL service that includes PL/Python and libraries like SciPy and NumPy out of the box, with more to come. It's easy to get started by creating an account; if you have questions, feel free to contact us or tweet at us.
You may be interested in our PL/Python content:
- Getting Started with Postgres Functions in PL/Python
- Postgres Data Types in PL/Python
- Exploring PL/Python: Turn Postgres Table Data Into a NumPy Array
- Building a recommendation engine inside Postgres with Python and Pandas
And here are the other posts so far in our Django series: