1 of 33

Writing a Go application with PostgreSQL using pgx

Henrique Vicente

2 of 33

Trajectory to PostgreSQL

Migration from a document-oriented database to PostgreSQL (RDS) + OpenSearch (Amazon's fork of Elasticsearch)

3 of 33

Problems with previous database

  • Poor developer experience
  • Reliability and performance issues
  • High cost: support, licensing, and operational
  • Vendor lock-in
  • Steep learning curve

4 of 33

Why PostgreSQL

  • Object-Relational Database Management System
  • Almost 100% SQL standards-compliant
  • Excellent for client/server architecture
  • JSON types
  • Custom data types
  • Inheritance between tables

5 of 33

6 of 33

PostgreSQL advantages

  • Great developer experience
  • Reliability
  • Performance
  • Open-source software without vendor lock-in
  • Community and multiple support providers
  • Works everywhere: even on a new M1 MacBook

7 of 33

PostgreSQL drawbacks

  • Plan schema up-front (wait, isn't this an advantage?)
  • Full-text search capabilities are limited (but we've got it covered).

8 of 33

Environment variables for configuring PostgreSQL
https://www.postgresql.org/docs/current/libpq-envars.html
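
As a minimal sketch (assuming pgx v5), an empty connection string makes pgx fall back to the libpq PG* environment variables:

// Minimal sketch, assuming pgx v5: with an empty connection string,
// pgx falls back to libpq environment variables such as PGHOST,
// PGPORT, PGDATABASE, PGUSER, PGPASSWORD, and PGSSLMODE.
package main

import (
	"context"
	"log"

	"github.com/jackc/pgx/v5"
)

func main() {
	ctx := context.Background()

	// Anything missing from the connection string is read from PG* variables.
	conn, err := pgx.Connect(ctx, "")
	if err != nil {
		log.Fatalf("unable to connect: %v", err)
	}
	defer conn.Close(ctx)

	if err := conn.Ping(ctx); err != nil {
		log.Fatalf("unable to reach the database: %v", err)
	}
}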

9 of 33

Environment variables might be problematic due to security concerns, so be careful.

  • Globals
  • Easy to leak to child processes or metrics probes

Also:

  • Process --flags are just as bad!
  • Can you point to secrets, rather than expose them?

Tip: direnv makes it convenient to use them in a development environment.

10 of 33

database/sql vs. pgx interface

To talk to SQL databases in Go, you almost always want to go through database/sql.

It provides a nice common interface, and makes supporting different databases easier.

PostgreSQL drivers for database/sql:

  • github.com/jackc/pgx
  • github.com/lib/pq (in maintenance mode; its maintainers recommend pgx)
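
If you do go through database/sql, a minimal sketch of using pgx as the driver looks like this (assuming pgx v5, whose stdlib package registers the "pgx" driver name; the connection string is a placeholder):

package main

import (
	"database/sql"
	"log"

	_ "github.com/jackc/pgx/v5/stdlib" // registers the "pgx" driver name
)

func main() {
	// Placeholder DSN; adjust to your environment.
	db, err := sql.Open("pgx", "postgres://user:pass@localhost:5432/app")
	if err != nil {
		log.Fatal(err)
	}
	defer db.Close()

	var version string
	if err := db.QueryRow("SELECT version()").Scan(&version); err != nil {
		log.Fatal(err)
	}
	log.Println(version)
}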

11 of 33

database/sql vs. pgx interface

Besides better performance, the pgx interface gives you:

  • Support for approximately 70 different PostgreSQL types
  • Automatic statement preparation and caching
  • Batch queries
  • Mapping of PostgreSQL arrays to Go slices
  • JSON and JSONB support
  • COPY protocol support

database/sql is limited to int64, float64, bool, []byte, string, time.Time, or nil, unless you implement sql.Scanner and driver.Valuer yourself.
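
For example, here is a rough sketch (assuming pgx v5; the users table and its columns are made up) of scanning values the standard interface can't handle out of the box:

package app

import (
	"context"
	"fmt"

	"github.com/jackc/pgx/v5/pgxpool"
)

// Profile is decoded straight from a JSONB column.
type Profile struct {
	Theme    string `json:"theme"`
	Language string `json:"language"`
}

// loadUser scans a text[] column into a []string and a jsonb column into a struct.
func loadUser(ctx context.Context, pool *pgxpool.Pool, id int) ([]string, Profile, error) {
	var (
		tags    []string
		profile Profile
	)
	err := pool.QueryRow(ctx,
		"SELECT tags, profile FROM users WHERE id = $1", id,
	).Scan(&tags, &profile)
	if err != nil {
		return nil, Profile{}, fmt.Errorf("loading user: %w", err)
	}
	return tags, profile, nil
}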

12 of 33

Concurrency: pgxpool

  • pgx.Conn is a low-level implementation and is not safe for concurrent use
  • Use a pgxpool.Pool to access the database concurrently

Default:

conf.MaxConns = int32(runtime.NumCPU()) // pgxpool's actual default is the greater of 4 and runtime.NumCPU()
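
A rough sketch of wiring the pool up explicitly (assuming pgx v5; the connection string is a placeholder):

package main

import (
	"context"
	"log"
	"runtime"

	"github.com/jackc/pgx/v5/pgxpool"
)

func main() {
	ctx := context.Background()

	conf, err := pgxpool.ParseConfig("postgres://user:pass@localhost:5432/app")
	if err != nil {
		log.Fatal(err)
	}
	conf.MaxConns = int32(runtime.NumCPU()) // override the default if needed

	pool, err := pgxpool.NewWithConfig(ctx, conf)
	if err != nil {
		log.Fatal(err)
	}
	defer pool.Close()

	// *pgxpool.Pool is safe for concurrent use by multiple goroutines.
	if err := pool.Ping(ctx); err != nil {
		log.Fatal(err)
	}
}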

13 of 33

Database migrations: 001_initial_schema.sql, 002_…

14 of 33

tern: standalone migration tool
https://asciinema.org/a/450576

15 of 33

Database layer

It might be useful to define an interface, even if bothersome.
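
A hypothetical sketch of what such an interface and its PostgreSQL implementation could look like (entity and method names are made up for illustration):

package app

import (
	"context"

	"github.com/jackc/pgx/v5/pgxpool"
)

type User struct {
	ID   string
	Name string
}

// UserStore is the database layer the rest of the application depends on.
type UserStore interface {
	GetUser(ctx context.Context, id string) (User, error)
	CreateUser(ctx context.Context, u User) error
}

// Postgres implements UserStore on top of a pgxpool.Pool.
type Postgres struct {
	Pool *pgxpool.Pool
}

func (p *Postgres) GetUser(ctx context.Context, id string) (User, error) {
	var u User
	err := p.Pool.QueryRow(ctx,
		"SELECT id, name FROM users WHERE id = $1", id,
	).Scan(&u.ID, &u.Name)
	return u, err
}

func (p *Postgres) CreateUser(ctx context.Context, u User) error {
	_, err := p.Pool.Exec(ctx,
		"INSERT INTO users (id, name) VALUES ($1, $2)", u.ID, u.Name)
	return err
}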

What about mocks?

16 of 33

Testing strategy

Get superior returns from your tests:

  • Favor writing integration tests against a real implementation
  • Use test doubles to test database failure scenarios.

Inspiration: Software Engineering at Google: Lessons Learned from Programming Over Time
https://amzn.to/3uqZWP7

What to test with the real implementation:

  • SELECT returning data or not found.
  • INSERT, UPDATE, etc.
  • Constraint checks.

What to test with test doubles (mocks, fakes…):

  • Database connection issue.
  • Simulation of unexpected DB error.
  • Expensive operation (ask why first).

17 of 33

Just like Theranos' blood tests…

Testing interfaces for the sake of it is overrated.

18 of 33

Some tools

19 of 33

GoMock
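
GoMock (originally github.com/golang/mock, now maintained at go.uber.org/mock) generates test doubles from interfaces. A rough sketch against the hypothetical UserStore from earlier, assuming a MockUserStore generated by mockgen:

package app

import (
	"context"
	"errors"
	"testing"

	"go.uber.org/mock/gomock"
)

func TestGetUserConnectionFailure(t *testing.T) {
	ctrl := gomock.NewController(t)

	// MockUserStore is assumed to be generated by mockgen from the
	// hypothetical UserStore interface sketched earlier.
	store := NewMockUserStore(ctrl)

	// Program the double to simulate a database failure scenario.
	errConn := errors.New("connection refused")
	store.EXPECT().
		GetUser(gomock.Any(), "42").
		Return(User{}, errConn)

	// In a real test you would hand the double to the code under test;
	// here we just exercise the expectation to keep the sketch short.
	if _, err := store.GetUser(context.Background(), "42"); !errors.Is(err, errConn) {
		t.Errorf("unexpected error: %v", err)
	}
}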

20 of 33

Comparing values with go-cmp

github.com/google/go-cmp

  • A more powerful and safer alternative to reflect.DeepEqual for comparing whether two values are semantically equal.
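
For instance, a minimal sketch in a test (User is the hypothetical entity from earlier and newTestStore is a made-up helper):

package app

import (
	"context"
	"testing"

	"github.com/google/go-cmp/cmp"
)

func TestGetUser(t *testing.T) {
	ctx := context.Background()
	store := newTestStore(t) // hypothetical helper returning the UserStore from earlier

	want := User{ID: "42", Name: "Henrique"}
	got, err := store.GetUser(ctx, "42")
	if err != nil {
		t.Fatal(err)
	}

	// cmp.Diff reports a readable, semantic diff instead of a bare "not equal".
	if diff := cmp.Diff(want, got); diff != "" {
		t.Errorf("GetUser() mismatch (-want +got):\n%s", diff)
	}
}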

21 of 33

22 of 33

scany
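
scany (github.com/georgysavva/scany) scans query results directly into structs and slices. A rough sketch using its pgxscan package, assuming scany v2 with pgx v5 and the hypothetical User from earlier:

package app

import (
	"context"

	"github.com/georgysavva/scany/v2/pgxscan"
	"github.com/jackc/pgx/v5/pgxpool"
)

// listUsers fills a slice of structs directly from the query result.
func listUsers(ctx context.Context, pool *pgxpool.Pool) ([]User, error) {
	var users []User
	err := pgxscan.Select(ctx, pool, &users,
		"SELECT id, name FROM users")
	return users, err
}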

23 of 33

pgtools

github.com/hatch-studio/pgtools contains:

  • A package that makes building SELECT queries easier.
  • Package sqltest, which makes writing integration tests very efficient.
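
The SELECT helper works roughly as sketched below; pgtools.Wildcard and its behavior are my reading of the package documentation, so treat the exact API as an assumption:

package app

import (
	"github.com/hatch-studio/pgtools"
)

// Post is a hypothetical entity; the db tags drive the generated column list.
type Post struct {
	ID    string `db:"id"`
	Title string `db:"title"`
}

// postByID builds the column list from Post's db tags instead of SELECT *.
func postByID() string {
	// Assumed: pgtools.Wildcard(Post{}) expands to something like "id, title".
	return "SELECT " + pgtools.Wildcard(Post{}) + " FROM post WHERE id = $1"
}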

24 of 33

Tests with sqltest

  • Auto-discovers your PostgreSQL configuration.
  • Creates databases named with a test_ prefix (for safety) plus the normalized test name.
  • Your tests can run in parallel just fine.
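
A rough sketch of such a test, reusing the hypothetical Postgres store from earlier; the sqltest option names (Force, Files) and the Setup call are my reading of the package docs, so treat them as assumptions:

package app

import (
	"context"
	"os"
	"testing"

	"github.com/hatch-studio/pgtools/sqltest"
)

func TestUserStoreIntegration(t *testing.T) {
	t.Parallel()

	// Assumed API: sqltest creates a throwaway test_* database and runs the
	// tern migrations found in the given directory before the test starts.
	migration := sqltest.New(t, sqltest.Options{
		Force: true,
		Files: os.DirFS("migrations"),
	})
	pool := migration.Setup(context.Background(), "")

	store := &Postgres{Pool: pool}
	if err := store.CreateUser(context.Background(), User{ID: "42", Name: "Henrique"}); err != nil {
		t.Fatal(err)
	}
}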


25 of 33

sqltest + tern = easy tests https://asciinema.org/a/450576

26 of 33

27 of 33

Then you discover running its integration tests is hard.

28 of 33

$ go test ./...

From zero to running integration tests in a few seconds. Try it:

  1. Install or configure PostgreSQL environment variables so that psql works.
  2. $ export INTEGRATION_TESTDB=true
     $ go test -v ./...

29 of 33

Continuous Integration

Run your DB on any CI/CD system.

For example, in a stateless pull-request workflow on GitHub Actions.

30 of 33

31 of 33

32 of 33

Takeaways

  • PostgreSQL is good
  • There are many tools to help you
  • Balance between integration and unit tests
  • Continuous Integration is easier than you think

33 of 33

Thanks. Questions?

Henrique Vicente, HATCH Studio