- Docker, Postgres, Node, Typescript Setup
- Docker
- Connecting To Postgres
- Conclusion
Docker, Postgres, Node, Typescript Setup
Next, create an .env file at the root so that we use the same variables when configuring both Docker Compose and the server. It also lets us hide sensitive values: docker-compose.yml is committed to GitHub, whereas the .env file is not.
For now, add a PORT variable to set the port the server will run at:
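For example, assuming the server should run on port 5000 (the port used later in this tutorial):

```
PORT=5000
```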
Create an app.ts in a new src folder with the following content:
```typescript
import express, { NextFunction, Request, Response } from "express";
import dotenv from "dotenv";

const app = express();
dotenv.config(); // Reads .env file and makes it accessible via process.env

app.get("/test", (req: Request, res: Response, next: NextFunction) => {
  res.send("hi");
});

app.listen(process.env.PORT, () => {
  console.log(`Server is running at ${process.env.PORT}`);
});
```
To verify everything is set up correctly so far, start the server:
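Assuming the npm scripts match the Dockerfile shown later (whose CMD runs the dev script to build & start the server), that command would be:

```
npm run dev
```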
Now, make a GET request to localhost:5000/test . The response should be hi . Also, notice there should be a dist folder containing the JavaScript compiled from the .ts files.
Docker
Now, we will run the server & Postgres in a Docker container.
Before that, you might ask: why use Docker at all?
Docker allows your app to run in isolated environments known as containers. Consequently, this solves the age-old problem of "the code works on my machine".
It also lets you use all the tools you want without installing them locally, by using images.
Docker images can be installed from Docker Hub or created using a Dockerfile .
Create a file named Dockerfile at the root:
```dockerfile
# Installs Node.js image
FROM node:16.13.1-alpine3.14

# Sets the working directory for any RUN, CMD, COPY command.
# All files we put in the Docker container running the server
# will be in /usr/src/app (e.g. /usr/src/app/package.json)
WORKDIR /usr/src/app

# Copies package.json, package-lock.json, tsconfig.json, .env to the root of WORKDIR
COPY ["package.json", "package-lock.json", "tsconfig.json", ".env", "./"]

# Copies everything in the src directory to WORKDIR/src
COPY ./src ./src

# Installs all packages
RUN npm install

# Runs the dev npm script to build & start the server
CMD npm run dev
```
The Dockerfile will build our Express Server as an image, which we can then run in a container.
When creating applications that use multiple containers, it is best to use Docker Compose to configure them.
But before Docker Compose, let’s add some more variables to the .env file as we will require them shortly.
```
DB_USER='postgres'
DB_HOST='db'
DB_NAME='db_name'
DB_PASSWORD='password'
DB_PORT=5432
```
- DB_HOST corresponds to the name of the DB service below. This is because each Docker container has its own definition of localhost . You can think of db as the container’s localhost.
- DB_PORT is the default port Postgres uses
- DB_PASSWORD & DB_USER are the default auth credentials Postgres uses
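Note that process.env values are always string | undefined at runtime, so numeric settings such as DB_PORT need parsing on the server side. A small helper like this (hypothetical, not part of the tutorial's code) can do that with a safe fallback:

```typescript
// Parses an environment variable as an integer, falling back to a default
// when the variable is unset or not a valid number.
function envInt(value: string | undefined, fallback: number): number {
  const parsed = parseInt(value ?? "", 10);
  return Number.isNaN(parsed) ? fallback : parsed;
}

// e.g. envInt(process.env.DB_PORT, 5432)
```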
Create a docker-compose.yml file at the root:
```yaml
version: '3.8'
services:
  api:
    container_name: api
    restart: always
    build: .
    ports:
      - ${PORT}:${PORT}
    depends_on:
      - db
    volumes:
      - .:/usr/src/app
  db:
    container_name: postgres
    image: postgres
    ports:
      - '5433:${DB_PORT}'
    volumes:
      - data:/data/db
    environment:
      - POSTGRES_PASSWORD=${DB_PASSWORD}
      - POSTGRES_DB=${DB_NAME}
volumes:
  data: {}
```
Note: The ${VARIABLE} syntax lets us use variables from the .env file. Docker Compose automatically picks up variables from the root .env file.
For the api service, we are:
- using the Dockerfile to build the container
- exposing ${PORT} (which is 5000 from the .env file). Exposing a port allows us to access the server via localhost:${PORT}
- only starting the container once the db service finishes starting up
- mapping all the files in the project directory to WORKDIR of the container using volumes
For the db service, we are:
- using the postgres image from Docker Hub
- using volumes so that our DB data is not erased when we shut down the container
- mapping port 5432 of the container to port 5433 of our localhost
- using env variables from the .env file and passing them to the postgres image. The image requires at least POSTGRES_PASSWORD, per the documentation on Docker Hub. We also included POSTGRES_DB, which specifies a different name for the default database created when the image first starts
Connecting To Postgres
To connect the server to the Postgres container, add the following to app.ts :
```typescript
import { Pool } from "pg";

const pool = new Pool({
  host: process.env.DB_HOST,
  user: process.env.DB_USER,
  database: process.env.DB_NAME,
  password: process.env.DB_PASSWORD,
  port: parseInt(process.env.DB_PORT || "5432"),
});

const connectToDB = async () => {
  try {
    await pool.connect();
  } catch (err) {
    console.log(err);
  }
};

connectToDB();
```
Now, we can start up the server & DB with the following command:
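Presumably this is the standard Docker Compose invocation (the exact command was not preserved here):

```
docker-compose up --build
```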
This will build & start the containers ( api & db ). Remember, db starts first and then api , since api depends on db .
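One caveat: depends_on only waits for the db container to start, not for Postgres inside it to accept connections, so the server's first connection attempt can still fail. A minimal sketch of a retry safeguard (connectWithRetry is a hypothetical helper, not part of the tutorial's code):

```typescript
// Retries an async connect function with a fixed delay between attempts.
// `connect` is any function that resolves once the database is reachable,
// e.g. () => pool.connect().then(() => {}).
async function connectWithRetry(
  connect: () => Promise<void>,
  retries = 5,
  delayMs = 1000
): Promise<boolean> {
  for (let attempt = 1; attempt <= retries; attempt++) {
    try {
      await connect();
      return true; // connected successfully
    } catch {
      if (attempt === retries) return false; // out of attempts, give up
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  return false;
}
```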
Try making the same GET request we did earlier and you should get the same response.
Before we end the tutorial, you might be wondering: how do I view the DB and its contents? There are two ways:
- You can add a new service to the docker-compose.yml file that uses the pgadmin4 image
- If you have PgAdmin installed locally:
- Use localhost as the host and 5433 as the port when adding a new server. Why 5433 and not 5432, the default Postgres port? Earlier, we mapped port 5432 of the container to port 5433 on localhost. The host port could have been almost any port, just not 5432: if you already have Postgres installed locally, it is already using port 5432, so the Postgres container cannot map to the same port.
Conclusion
I hope my explanation was clear & helped you in some way. If you want the source code, you can find the full code here.
An API Boilerplate for Node.js, Express.js & PostgreSQL
Here are a few of the challenges we faced while working on enterprise applications:
- Scaling the relational database as the application grows becomes difficult,
- Adding an extra layer of ORM costs your application in performance,
- Documenting the evolution of your application & its APIs has always been difficult,
- Reading a table's or a relation's structure without opening pgAdmin or actually writing a "DESCRIBE table" SQL query was impossible.
We came up with express-typescript-postgres; this repository solves all the problems mentioned above, and more! Here's a list of a few things we've accomplished:
- Express JWT for API Authentication,
- Events & Listeners for sending emails & other background works,
- Swagger / OpenAPI for API documentation,
- ES Lint with Prettier for finding & fixing common code problems,
- Migration logic base to maintain the database changes,
- Snapshots logic base to maintain all the table structure available for documentation,
- Middleware to manage RBAC (role based access control),
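As an illustration of the RBAC middleware idea: the real implementation lives in src/middlewares/check_auth.ts and may differ, but a minimal hypothetical role check in Express style might look like this (req.user is assumed to be populated by an earlier authentication middleware, e.g. JWT verification):

```typescript
// Express-style middleware signature, loosely typed for illustration.
type Handler = (req: any, res: any, next: () => void) => void;

// Returns a middleware that only lets requests through when the
// authenticated user's role is in the allowed list.
function requireRole(...allowed: string[]): Handler {
  return (req, res, next) => {
    const role = req.user?.role;
    if (role !== undefined && allowed.includes(role)) {
      next(); // role permitted, continue to the route handler
    } else {
      res.status(403).send("Forbidden"); // role missing or not permitted
    }
  };
}

// e.g. app.get("/user/roles", requireRole("admin"), handler);
```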
What are the Pre-requisites?
```
├── config
│   └── index.sample.ts
├── database
│   ├── migrations
│   ├── seeders
│   └── snapshots
├── src
│   ├── controllers
│   │   ├── app.ts
│   │   ├── auth.ts
│   │   ├── index.ts
│   │   └── user.ts
│   ├── db_pool
│   │   ├── auto_schema.ts
│   │   ├── helper.ts
│   │   ├── pg_pool.ts
│   │   └── schema.ts
│   ├── events
│   │   ├── listeners
│   │   │   └── auth_listener.ts
│   │   └── index.ts
│   ├── helpers
│   │   ├── exception_wrapper.ts
│   │   ├── file.ts
│   │   ├── index.ts
│   │   ├── notfound_handler.ts
│   │   ├── random_string.ts
│   │   └── upload.ts
│   ├── middlewares
│   │   ├── check_auth.ts
│   │   └── schema.ts
│   ├── models
│   │   ├── auth.ts
│   │   ├── common.ts
│   │   ├── index.ts
│   │   ├── log.ts
│   │   ├── user_login.ts
│   │   └── user.ts
│   ├── providers
│   │   ├── cors.ts
│   │   ├── logger.ts
│   │   └── version.ts
│   ├── routes
│   │   ├── app
│   │   │   ├── index.ts
│   │   │   └── schema.ts
│   │   ├── auth
│   │   │   ├── index.ts
│   │   │   └── schema.ts
│   │   ├── user
│   │   │   ├── index.ts
│   │   │   └── schema.ts
│   │   └── index.ts
│   ├── services
│   │   ├── common_service.ts
│   │   ├── email_service.ts
│   │   ├── index.ts
│   │   ├── log_service.ts
│   │   └── user_service.ts
│   ├── swagger
│   │   ├── backend_api.yaml
│   │   └── index.ts
│   ├── typings
│   │   ├── interface.ts
│   │   └── types.ts
│   ├── validators
│   │   ├── auth.ts
│   │   └── user.ts
│   ├── extractOpenAPI.ts
│   ├── index.ts
│   ├── package.json
│   ├── prettier.config.js
│   └── tsconfig.json
├── .gitignore
└── README.md
```
```shell
# Clone the repository
git clone

# Create the config file from the sample-config file
cp config/index.sample.ts config/index.ts;

# Add your database details
user: 'db_username',
password: 'db_password',
database: 'db_dbname',
host: 'db_host',

# Goto the source code
cd src;

# Install NPM dependencies
npm install;

# Map new-migration command
sudo npm link;
```
- You should have a "postgres" user available in your Postgres ecosystem.
- Create a database with a name of your choice & assign the "postgres" user to it.
- Run the initial seed file in your DB's Query Tool, or let the application run it for you the first time it starts.
- Define your migrations inside /database/migrations with the format yyyymmdd-001_(schemas/data/functions)_description.sql
To ensure consistency of the database across every system that uses this boilerplate, we use in-house logic and auto-update functions to maintain our migrations and snapshots.
- Create a new file inside the database/migrations folder with a .sql extension defining any SQL operation (UPDATE/CREATE/DROP/INSERT).
- The file name should follow the convention yyyymmdd-001_(schemas/data/functions)_description.sql: schemas for any changes to database tables or design, data for new data added to tables, and functions for operations on SQL functions.
- Every time the application runs, it checks for any newly added SQL scripts inside the migrations folder and adds them to the database with one of the statuses pending, successful, or failed.
- Pending: script execution has not occurred yet. Successful: script execution completed without error. Failed: script execution was unsuccessful.
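As a sketch, the naming convention above could be checked with a regular expression like this (a hypothetical helper, not part of the repository):

```typescript
// Matches the convention yyyymmdd-NNN_(schemas|data|functions)_description.sql,
// e.g. "20230115-001_schemas_create_users.sql".
const MIGRATION_NAME = /^\d{8}-\d{3}_(schemas|data|functions)_[\w-]+\.sql$/;

function isValidMigrationName(filename: string): boolean {
  return MIGRATION_NAME.test(filename);
}
```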
How to create new Migration File
- Inside the database/snapshots folder, you will find all the tables and views of your connected database with their defined structure.
- Every time the application runs, the snapshots are updated with any new changes to the database design.
- If you are working in a large dev team, having knowledge of any new changes can help reduce errors.
Explanation of custom Postgres functions
Go to the file PostgresFunctions.md for a detailed explanation of the various functions we have used.
Go to the file Inheritance.md for an explanation of the use of inheritance.
| Auth | User | App |
| --- | --- | --- |
| /auth/login | /user | /app/version |
| /auth/forgot-password | /user/add | |
| /auth/change-password | /user/roles | |
| /auth/whoami | /user/ | |
| /auth/refresh-token | /user/ | |